Channel: Integration Cloud Service – ATeam Chronicles

Fusion Applications WebCenter Content Integration – Automating File Import/Export


Introduction

Oracle WebCenter Content, a component of Fusion Middleware, is a strategic solution for the management, security, and distribution of unstructured content such as documents, spreadsheets, presentations, and video. Oracle Fusion Applications leverages Oracle WebCenter Content to store all marketing collateral as well as all attachments. The import flow also uses it to stage the CSV files that users upload.

WebCenter Content replaces SSH File Transfer Protocol (SFTP) as a content repository in Fusion Applications starting with Release 7. For example, it is implemented in HCM for File Based Loader (FBL) and Extracts integration. There are several ways of importing and exporting content to and from Fusion Applications such as:

  • Upload using “File Import and Export” UI from home page navigation: Navigator > Tools
  • Upload using WebCenter Content Document Transfer Utility
  • Upload programmatically via Java Code or Web Service API

This post provides an introduction, with working sample code, on how to programmatically export content from Fusion Applications to automate the outbound integration process to other applications in the cloud or on-premise. A Service Oriented Architecture (SOA) composite is implemented to demonstrate the concept.

Main Article

Fusion Applications Security in WebCenter Content

The content in WebCenter Content is secured through users, roles, privileges, and accounts. The user could be any valid user with a role such as “Integration Specialist”. The role may have privileges such as read, write, and delete. The accounts are predefined by each application. For example, HCM uses the /hcm/dataloader/import and /hcm/dataloader/export accounts for import and export, respectively.

Let’s review the inbound and outbound batch integration flows.

Inbound Flow

This is a typical Inbound FBL process flow:

 

[Figure: Inbound FBL process flow]

The uploaded file is registered by invoking the Loader Integration Service – http://{Host}/hcmCommonBatchLoader/LoaderIntegrationService.

You specify the following in the payload:

  • Content id of the file to be loaded
  • Business objects that you are loading
  • Batch name
  • Load type (FBL)
  • Whether the imported file should be loaded automatically
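The items above map onto a SOAP payload for the Loader Integration Service. The following is a hedged sketch: the operation name (submitBatch) and element values are assumptions modeled on the service's types namespace and may differ in your release:

```xml
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <!-- operation name is an assumption; verify against the service WSDL -->
    <ns1:submitBatch
        xmlns:ns1="http://xmlns.oracle.com/apps/hcm/common/batchLoader/core/loaderIntegrationService/types/">
      <ns1:ZipFileName>WORKERDATA.ZIP</ns1:ZipFileName>           <!-- content id of the uploaded file -->
      <ns1:BusinessObjectList>Worker</ns1:BusinessObjectList>      <!-- business objects being loaded -->
      <ns1:BatchName>WORKERDATA.ZIP</ns1:BatchName>
      <ns1:LoadType>FBL</ns1:LoadType>
      <ns1:AutoLoad>Y</ns1:AutoLoad>                               <!-- load automatically after import -->
    </ns1:submitBatch>
  </soap:Body>
</soap:Envelope>
```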

Fusion Applications UI also allows the end user to register and initiate the data load process.

 

Outbound Flow

This is a typical Outbound batch Integration flow using tools such as Business Intelligence (BI) Publisher, Answers, and HCM Extracts:

[Figure: Outbound extract process flow]

The extracted file could be delivered to the WebCenter Content server.

Programmatic Approach to Export Files from WebCenter Content

In Fusion Applications, the WebCenter Content Managed server is installed in the Common domain WebLogic Server. The WebCenter Content server provides two types of web services:

Generic JAX-WS based web service

This is a generic web service for general access to the Content Server. The context root for this service is “/idcws”. For details of the format, see the published WSDL at https://<hostname>:<port>/idcws/GenericSoapPort?WSDL. This service is protected through Oracle Web Services Security Manager (OWSM). As a result of allowing WS-Security policies to be applied to this service, streaming Message Transmission Optimization Mechanism (MTOM) is not available for use with this service. Very large files (greater than the memory of the client or the server) cannot be uploaded or downloaded.

Native SOAP based web service

This is the general WebCenter Content service. Essentially, it is a normal socket request to Content Server, wrapped in a SOAP request. Requests are sent to the Content Server using streaming Message Transmission Optimization Mechanism (MTOM) in order to support large files. The context root for this service is “/idcnativews”. The main web service is IdcWebRequestPort and it requires JSESSIONID, which can be retrieved from IdcWebLoginPort service.

The Remote Intradoc Client (RIDC) uses the native web services. Oracle recommends that you do not develop a custom client against these services.

For more information, please refer to “Developing with WebCenter Content Web Services for Integration”.

Generic Web Service Implementation

This post provides a sample of implementing generic web service /idcws/GenericSoapPort. In order to implement this web service, it is critical to review the following definitions to generate the request message and parse the response message:

IdcService

IdcService is an attribute of the <service> node that names the predefined service to be executed, for example, CHECKIN_UNIVERSAL, GET_SEARCH_RESULTS, GET_FILE, CHECKOUT_BY_NAME, etc.

User

User is a subnode within a <service> and contains all user information.

Document

Document is a collection of all the content-item information and is the parent node of all the data.

ResultSet

ResultSet is a typical row/column based schema. The name attribute specifies the name of the ResultSet. It contains a set of <row> subnodes.

Row

Row represents one row within a ResultSet, which can have multiple <row> subnodes. Each row contains a set of Field objects.

Field

Field is a subnode of either <document> or <row>. It represents document or user metadata such as content Id, Name, Version, etc.

File

File is a file object that is either being uploaded or downloaded.

For more information, please refer to Configuring Web Services with WSDL, SOAP, and the WSDL Generator.

Web Service Security

The genericSoapPort web service is protected by Oracle Web Services Manager (OWSM). In Oracle Fusion Applications cloud, the OWSM policy is: “oracle/wss11_saml_or_username_token_with_message_protection_service_policy”.

In your SOAP envelope, you will need the appropriate “wsse” security headers. This is a sample:

<soapenv:Header>
<wsse:Security xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd" soapenv:mustUnderstand="1">
<saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:1.0:assertion" MajorVersion="1" MinorVersion="1" AssertionID="SAML-iiYLE6rlHjI2j9AUZXrXmg22" IssueInstant="2014-10-20T13:52:25Z" Issuer="www.oracle.com">
<saml:Conditions NotBefore="2014-10-20T13:52:25Z" NotOnOrAfter="2015-11-22T13:57:25Z"/>
<saml:AuthenticationStatement AuthenticationInstant="2014-10-20T14:52:25Z" AuthenticationMethod="urn:oasis:names:tc:SAML:1.0:am:password">
<saml:Subject>
<saml:NameIdentifier Format="urn:oasis:names:tc:SAML:1.1:nameid-format:unspecified">FAAdmin</saml:NameIdentifier>
<saml:SubjectConfirmation>
<saml:ConfirmationMethod>urn:oasis:names:tc:SAML:1.0:cm:sender-vouches</saml:ConfirmationMethod>
</saml:SubjectConfirmation>
</saml:Subject>
</saml:AuthenticationStatement>
</saml:Assertion>
</wsse:Security>
</soapenv:Header>

Sample SOA Composite

The SOA code provides a sample on how to search for a document in WebCenter Content, extract a file name from the search result, and get the file and save it in your local directory. The file could be processed immediately based on your requirements. Since this is a generic web service with a generic request message, you can use the same interface to invoke various IdcServices, such as GET_FILE, GET_SEARCH_RESULTS, etc.

In the SOA composite sample, two external services are created: GenericSoapPort and FileAdapter. If the service is GET_FILE, then it will save a copy of the retrieved file in your local machine.

Export File

The GET_FILE service returns a specific rendition of a content item, the latest revision, or the latest released revision. A copy of the file is retrieved without performing a check out. It requires either dID (content item revision ID) for the revision, or dDocName (content item name) along with a RevisionSelectionMethod parameter. The RevisionSelectionMethod could be either “Latest” (latest revision of the content) or “LatestReleased” (latest released revision of the content). For example, to retrieve file:

<ucm:GenericRequest webKey="cs">
<ucm:Service IdcService="GET_FILE">
<ucm:Document>
<ucm:Field name="dID">401</ucm:Field>
</ucm:Document>
</ucm:Service>
</ucm:GenericRequest>

Search File

The dID of the content could be retrieved using the service GET_SEARCH_RESULTS. It uses a QueryText attribute in the <Field> node. The QueryText attribute defines the query and must be XML encoded. You can append values for title, content Id, and so on, in the QueryText, to refine the search. The syntax for QueryText can be challenging, but once you understand the special character formats, it is straightforward. For example, to search for content by its original name:

<ucm:Service IdcService="GET_SEARCH_RESULTS">
<ucm:Document>
<ucm:Field name="QueryText">dOriginalName &lt;starts&gt; `Test`</ucm:Field>
</ucm:Document>
</ucm:Service>

In plain text, the query is dOriginalName <starts> `Test`. Operators such as <starts> and <substring> must appear in this angle-bracket format, with literal values enclosed in backquotes. You can further refine the query by adding more parameters.
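Since QueryText must be XML encoded, a small helper can take care of the escaping when building the request; this is a sketch (helper name hypothetical):

```python
from xml.sax.saxutils import escape

def query_text_field(query: str) -> str:
    """Render a QueryText <Field> node, XML-encoding the native query
    syntax (operators like <starts> become &lt;starts&gt;)."""
    return f'<ucm:Field name="QueryText">{escape(query)}</ucm:Field>'
```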

This is a sample SOA composite with two external references, genericSoapPort and FileAdapter.

[Figure: SOA composite with genericSoapPort and FileAdapter references]

This is a sample BPEL process flow that demonstrates how to retrieve the file and save a copy to a local directory using the File Adapter. If the idcService is GET_SEARCH_RESULTS, the file is not saved. In a real scenario, you would search for, check out, and then process the file.

 

[Figure: Sample BPEL process flow]

The original file name is preserved when copying it to a local directory by passing a header property to the FileAdapter. For example, create a variable fileName and assign it as follows:

1. Get the file name from the response message in your <assign> activity:

<from expression="bpws:getVariableData('InvokeGenericSoapPort_GenericSoapOperation_OutputVariable','GenericResponse','/ns2:GenericResponse/ns2:Service/ns2:Document/ns2:ResultSet/ns2:Row/ns2:Field[@name=&quot;dOriginalName&quot;]')"/>
<to variable="fileName"/>

Please make note of the XPath expression, as it will help you retrieve other metadata.

2. Pass this fileName variable to the <invoke> of the FileAdapter as follows:

<bpelx:inputProperty name="jca.file.FileName" variable="fileName"/>

For the native QueryText syntax to work, add the following property manually to the ../CommonDomain/ucm/cs/config/config.cfg file: AllowNativeQueryFormat=true
Then restart the managed server.
Without this setting, the typical error is: “Unable to retrieve search results. Parsing error at character xx in query….”

Testing SOA Composite:

After the composite is deployed in your SOA server, you can test it either from Enterprise Manager (EM) or using SoapUI. These are the sample request messages for GET_SEARCH_RESULTS and GET_FILE.

The following screens show the SOA composites for “GET_SEARCH_RESULTS” and “GET_FILE”:

[Screenshot: GET_SEARCH_RESULTS test]

[Screenshot: GET_FILE test]

Get_File Response snippet with critical objects:

<ns2:GenericResponse xmlns:ns2="http://www.oracle.com/UCM">
<ns2:Service IdcService="GET_FILE">
<ns2:Document>
<ns2:Field name="dID">401</ns2:Field>
<ns2:Field name="IdcService">GET_FILE</ns2:Field>
....
<ns2:ResultSet name="FILE_DOC_INFO">
<ns2:Row>
<ns2:Field name="dID">401</ns2:Field>
<ns2:Field name="dDocName">UCMFA000401</ns2:Field>
<ns2:Field name="dDocType">Document</ns2:Field>
<ns2:Field name="dDocTitle">JRD Test</ns2:Field>
<ns2:Field name="dDocAuthor">FAAdmin</ns2:Field>
<ns2:Field name="dRevClassID">401</ns2:Field>
<ns2:Field name="dOriginalName">Readme.html</ns2:Field>
</ns2:Row>
</ns2:ResultSet>
<ns2:File name="" href="/u01/app/fa/config/domains/fusionhost.mycompany.com/CommonDomain/ucm/cs/vault/document/bwzh/mdaw/401.html">
<ns2:Contents>
<xop:Include href="cid:7405676a-11f8-442d-b13c-f8f6c2b682e4" xmlns:xop="http://www.w3.org/2004/08/xop/include"/>
</ns2:Contents>
</ns2:File>
</ns2:Document>
</ns2:Service>
</ns2:GenericResponse>
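Outside of BPEL, the same metadata can be read from a GenericResponse with a namespace-aware XML parse. This is a minimal sketch (function name hypothetical) assuming the http://www.oracle.com/UCM namespace shown above:

```python
import xml.etree.ElementTree as ET

UCM_NS = "http://www.oracle.com/UCM"

def field_value(response_xml: str, name: str) -> str:
    """Return the text of the first <Field> with the given name attribute,
    whether it appears among the document fields or inside a ResultSet row."""
    root = ET.fromstring(response_xml)
    for field in root.iter(f"{{{UCM_NS}}}Field"):
        if field.get("name") == name:
            return field.text or ""
    raise KeyError(name)
```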

Import File

The above sample can also be used to import files into the WebCenter Content repository for inbound integration or other use cases. The service name is CHECKIN_UNIVERSAL.

Summary

This post demonstrates how to automate the export and import of content in the WebCenter Content server implemented by Fusion Applications. It further demonstrates how integration tools such as SOA can automate, extend, and orchestrate integration between Fusion Applications and Oracle or non-Oracle applications, whether in the cloud or on-premise.

The SOA sample code is here.


Fusion HCM Cloud Bulk Integration Automation


Introduction

Fusion HCM Cloud provides a comprehensive set of tools, templates, and pre-packaged integrations to cover various scenarios using modern and efficient technologies. One of these patterns is bulk integration to load and extract data to and from the cloud. The inbound tool is the File Based data Loader (FBL), which is evolving into HCM Data Loader (HDL). HDL supports data migration for full HR and incremental loads to support coexistence with Oracle applications such as E-Business Suite (EBS) and PeopleSoft (PSFT). It also provides the ability to bulk-load data into configured flexfields. HCM Extracts is an outbound integration tool that lets you choose data, then gathers and archives it. This archived raw data is converted into a desired format and delivered to supported channels and recipients.

HCM cloud implements Oracle WebCenter Content, a component of Fusion Middleware, to store and secure data files for both inbound and outbound bulk integration patterns. This post focuses on how to automate data file transfer with WebCenter Content to initiate the loader. The same APIs will be used to download data file from the WebCenter Content delivered through the extract process.

WebCenter Content replaces the SSH File Transfer Protocol (SFTP) server in the cloud as the content repository in Fusion HCM starting with Release 7. There are several ways of importing and exporting content to and from Fusion Applications, such as:

  • Upload using “File Import and Export” UI from home page navigation: Navigator > Tools
  • Upload using WebCenter Content Document Transfer Utility
  • Upload programmatically via Java Code or Web Service API

This post provides an introduction, with working sample code, on how to programmatically export content from Fusion Applications to automate the outbound integration process to other applications in the cloud or on-premise. A Service Oriented Architecture (SOA) composite is implemented to demonstrate the concept.

Main Article

Fusion Applications Security in WebCenter Content

The content in WebCenter Content is secured through users, roles, privileges, and accounts. The user could be any valid user with a role such as “Integration Specialist.” The role may have privileges such as read, write, and delete. The accounts are predefined by each application. For example, HCM uses the /hcm/dataloader/import and /hcm/dataloader/export accounts for import and export, respectively.

Let’s review the inbound and outbound batch integration flows.

Inbound Flow

This is a typical Inbound FBL process flow:

 

[Figure: HDL loader process flow]

The data file is uploaded to WebCenter Content Server either using Fusion HCM UI or programmatically in /hcm/dataloader/import account. This uploaded file is registered by invoking the Loader Integration Service – http://{Host}/hcmCommonBatchLoader/LoaderIntegrationService.

You must specify the following in the payload:

  • Content id of the file to be loaded
  • Business objects that you are loading
  • Batch name
  • Load type (FBL)
  • Whether the imported file should be loaded automatically

Fusion Applications UI also allows the end user to register and initiate the data load process.

 

Encryption of Data File using Pretty Good Privacy (PGP)

All data files transit over a network via SSL. In addition, HCM Cloud supports encryption of data files at rest using PGP.
Fusion supports the following types of encryption:

  • PGP Signed
  • PGP Unsigned
  • PGPX509 Signed
  • PGPX509 Unsigned

To use this PGP encryption capability, a customer must exchange encryption keys with Fusion so that:

  • Fusion can decrypt inbound files
  • Fusion can encrypt outbound files
  • Customer can encrypt files sent to Fusion
  • Customer can decrypt files received from Fusion

Steps to Implement PGP

  1. You provide your PGP Public Key.
  2. Oracle’s Cloud Operations team provides you with the Fusion PGP Public Key.

Steps to Implement PGP X.509

  1. Self-signed Fusion key pair (default option):
    • You provide the public X.509 certificate
  2. Fusion key pair provided by you:
    • Public X.509 certificate uploaded via an Oracle Support Service Request (SR)
    • Fusion key pair for Fusion’s X.509 certificate in a keystore, with the keystore password

Steps for Certificate Authority (CA) signed Fusion certificate

  1. Obtain a Certificate Authority (CA) signed Fusion certificate
  2. Public X.509 certificate uploaded via SR
  3. Oracle’s Cloud Operations exports the Fusion public X.509 CSR certificate and uploads it to the SR
  4. Using the Fusion public X.509 CSR certificate, the customer provides a signed CA certificate and uploads it to the SR
  5. Oracle’s Cloud Operations provides the Fusion PGP Public Certificate to you via an SR

 

Modification to Loader Integration Service Payload to support PGP

The loaderIntegrationService has a new method called “submitEncryptedBatch”, which has an additional parameter named “encryptType”. The valid values for the “encryptType” parameter are taken from the ORA_HRC_FILE_ENCRYPT_TYPE lookup:

  • NONE
  • PGPSIGNED
  • PGPUNSIGNED
  • PGPX509SIGNED
  • PGPX509UNSIGNED

Sample Payload

<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
<soap:Body>
<ns1:submitEncryptedBatch
xmlns:ns1="http://xmlns.oracle.com/apps/hcm/common/batchLoader/core/loaderIntegrationService/types/">
<ns1:ZipFileName>LOCATIONTEST622.ZIP</ns1:ZipFileName>
<ns1:BusinessObjectList>Location</ns1:BusinessObjectList>
<ns1:BatchName>LOCATIONTEST622.ZIP</ns1:BatchName>
<ns1:LoadType>FBL</ns1:LoadType>
<ns1:AutoLoad>Y</ns1:AutoLoad>
<ns1:encryptType>PGPX509SIGNED</ns1:encryptType>
</ns1:submitEncryptedBatch>
</soap:Body>
</soap:Envelope>

 

Outbound Flow

This is a typical Outbound batch Integration flow using HCM Extracts:

[Figure: Outbound extract process flow]

The extracted file could be delivered to the WebCenter Content server. HCM Extracts can generate an encrypted output file. In the Extract delivery options, ensure the following are correctly configured:

  1. Set the HCM Delivery Type to “HCM Connect”
  2. Select an Encryption Mode from the 4 supported encryption types, or select None
  3. Specify the Integration Name – this value is used to build the title of the entry in WebCenter Content

 

Extracted File Naming Convention in WebCenter Content

The file will have the following properties:

  • Author: FUSION_APPSHCM_ESS_APPID
  • Security Group: FAFusionImportExport
  • Account: hcm/dataloader/export
  • Title: HEXTV1CON_{IntegrationName}_{EncryptionType}_{DateTimeStamp}
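If a downstream process needs to route extract files by these properties, the Title can be split back into its parts. The following is a sketch in which the regular expression and the timestamp character set are assumptions; the encryption values come from the ORA_HRC_FILE_ENCRYPT_TYPE lookup described earlier:

```python
import re

# Title pattern from the delivery options:
# HEXTV1CON_{IntegrationName}_{EncryptionType}_{DateTimeStamp}
TITLE_RE = re.compile(
    r"^HEXTV1CON_(?P<integration>.+)_"
    r"(?P<encryption>NONE|PGPSIGNED|PGPUNSIGNED|PGPX509SIGNED|PGPX509UNSIGNED)_"
    r"(?P<stamp>[\d\-:T ]+)$"
)

def parse_extract_title(title: str) -> dict:
    """Split a WebCenter Content extract title into its parts."""
    m = TITLE_RE.match(title)
    if not m:
        raise ValueError(f"unexpected title: {title!r}")
    return m.groupdict()
```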

 

Programmatic Approach to export/import files from/to WebCenter Content

In Fusion Applications, the WebCenter Content Managed server is installed in the Common domain WebLogic Server. The WebCenter Content server provides two types of web services:

Generic JAX-WS based web service

This is a generic web service for general access to the Content Server. The context root for this service is “/idcws”. For details of the format, see the published WSDL at https://<hostname>:<port>/idcws/GenericSoapPort?WSDL. This service is protected through Oracle Web Services Security Manager (OWSM). As a result of allowing WS-Security policies to be applied to this service, streaming Message Transmission Optimization Mechanism (MTOM) is not available for use with this service. Very large files (greater than the memory of the client or the server) cannot be uploaded or downloaded.

Native SOAP based web service

This is the general WebCenter Content service. Essentially, it is a normal socket request to Content Server, wrapped in a SOAP request. Requests are sent to the Content Server using streaming Message Transmission Optimization Mechanism (MTOM) in order to support large files. The context root for this service is “/idcnativews”. The main web service is IdcWebRequestPort and it requires JSESSIONID, which can be retrieved from IdcWebLoginPort service.

The Remote Intradoc Client (RIDC) uses the native web services. Oracle recommends that you do not develop a custom client against these services.

For more information, please refer to “Developing with WebCenter Content Web Services for Integration.”

Generic Web Service Implementation

This post provides a sample of implementing generic web service /idcws/GenericSoapPort. In order to implement this web service, it is critical to review the following definitions to generate the request message and parse the response message:

IdcService

IdcService is an attribute of the <service> node that names the predefined service to be executed, for example, CHECKIN_UNIVERSAL, GET_SEARCH_RESULTS, GET_FILE, CHECKOUT_BY_NAME, etc.

User

User is a subnode within a <service> and contains all user information.

Document

Document is a collection of all the content-item information and is the parent node of all the data.

ResultSet

ResultSet is a typical row/column based schema. The name attribute specifies the name of the ResultSet. It contains a set of <row> subnodes.

Row

Row represents one row within a ResultSet, which can have multiple <row> subnodes. Each row contains a set of Field objects.

Field

Field is a subnode of either <document> or <row>. It represents document or user metadata such as content Id, Name, Version, etc.

File

File is a file object that is either being uploaded or downloaded.

For more information, please refer to Configuring Web Services with WSDL, SOAP, and the WSDL Generator.

Web Service Security

The genericSoapPort web service is protected by Oracle Web Services Manager (OWSM). In Oracle Fusion Applications cloud, the OWSM policy is: “oracle/wss11_saml_or_username_token_with_message_protection_service_policy”.

In your SOAP envelope, you will need the appropriate “wsse” security headers. This is a sample:

<soapenv:Header>
<wsse:Security xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd" soapenv:mustUnderstand="1">
<saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:1.0:assertion" MajorVersion="1" MinorVersion="1" AssertionID="SAML-iiYLE6rlHjI2j9AUZXrXmg22" IssueInstant="2014-10-20T13:52:25Z" Issuer="www.oracle.com">
<saml:Conditions NotBefore="2014-10-20T13:52:25Z" NotOnOrAfter="2015-11-22T13:57:25Z"/>
<saml:AuthenticationStatement AuthenticationInstant="2014-10-20T14:52:25Z" AuthenticationMethod="urn:oasis:names:tc:SAML:1.0:am:password">
<saml:Subject>
<saml:NameIdentifier Format="urn:oasis:names:tc:SAML:1.1:nameid-format:unspecified">FAAdmin</saml:NameIdentifier>
<saml:SubjectConfirmation>
<saml:ConfirmationMethod>urn:oasis:names:tc:SAML:1.0:cm:sender-vouches</saml:ConfirmationMethod>
</saml:SubjectConfirmation>
</saml:Subject>
</saml:AuthenticationStatement>
</saml:Assertion>
</wsse:Security>
</soapenv:Header>

Sample SOA Composite

The SOA code provides a sample on how to search for a document in WebCenter Content, extract a file name from the search result, and get the file and save it in your local directory. The file could be processed immediately based on your requirements. Since this is a generic web service with a generic request message, you can use the same interface to invoke various IdcServices, such as GET_FILE, GET_SEARCH_RESULTS, etc.

In the SOA composite sample, two external services are created: GenericSoapPort and FileAdapter. If the service is GET_FILE, then it will save a copy of the retrieved file in your local machine.

Export File

The GET_FILE service returns a specific rendition of a content item, the latest revision, or the latest released revision. A copy of the file is retrieved without performing a check out. It requires either dID (content item revision ID) for the revision, or dDocName (content item name) along with a RevisionSelectionMethod parameter. The RevisionSelectionMethod could be either “Latest” (latest revision of the content) or “LatestReleased” (latest released revision of the content). For example, to retrieve file:

<ucm:GenericRequest webKey="cs">
<ucm:Service IdcService="GET_FILE">
<ucm:Document>
<ucm:Field name="dID">401</ucm:Field>
</ucm:Document>
</ucm:Service>
</ucm:GenericRequest>

Search File

The dID of the content could be retrieved using the service GET_SEARCH_RESULTS. It uses a QueryText attribute in the <Field> node. The QueryText attribute defines the query and must be XML encoded. You can append values for title, content Id, and so on, in the QueryText, to refine the search. The syntax for QueryText can be challenging, but once you understand the special character formats, it is straightforward. For example, to search for content by its original name:

<ucm:Service IdcService="GET_SEARCH_RESULTS">
<ucm:Document>
<ucm:Field name="QueryText">dOriginalName &lt;starts&gt; `Test`</ucm:Field>
</ucm:Document>
</ucm:Service>

In plain text, the query is dOriginalName <starts> `Test`. Operators such as <starts> and <substring> must appear in this angle-bracket format, with literal values enclosed in backquotes. You can further refine the query by adding more parameters.

This is a sample SOA composite with two external references, genericSoapPort and FileAdapter.

[Figure: SOA composite with genericSoapPort and FileAdapter references]

This is a sample BPEL process flow that demonstrates how to retrieve the file and save a copy to a local directory using the File Adapter. If the idcService is GET_SEARCH_RESULTS, the file is not saved. In a real scenario, you would search for, check out, and then process the file.

 

[Figure: Sample BPEL process flow]

The original file name is preserved when copying it to a local directory by passing a header property to the FileAdapter. For example, create a variable fileName and assign it as follows:

1. Get the file name from the response message in your <assign> activity:

<from expression="bpws:getVariableData('InvokeGenericSoapPort_GenericSoapOperation_OutputVariable','GenericResponse','/ns2:GenericResponse/ns2:Service/ns2:Document/ns2:ResultSet/ns2:Row/ns2:Field[@name=&quot;dOriginalName&quot;]')"/>
<to variable="fileName"/>

Please make note of the XPath expression, as it will help you retrieve other metadata.

2. Pass this fileName variable to the <invoke> of the FileAdapter as follows:

<bpelx:inputProperty name="jca.file.FileName" variable="fileName"/>

For the native QueryText syntax to work, add the following property manually to the ../CommonDomain/ucm/cs/config/config.cfg file: AllowNativeQueryFormat=true
Then restart the managed server.
Without this setting, the typical error is: “Unable to retrieve search results. Parsing error at character xx in query….”

Testing SOA Composite:

After the composite is deployed in your SOA server, you can test it either from Enterprise Manager (EM) or using SoapUI. These are the sample request messages for GET_SEARCH_RESULTS and GET_FILE.

The following screens show the SOA composites for “GET_SEARCH_RESULTS” and “GET_FILE”:

[Screenshot: GET_SEARCH_RESULTS test]

[Screenshot: GET_FILE test]

Get_File Response snippet with critical objects:

<ns2:GenericResponse xmlns:ns2="http://www.oracle.com/UCM">
<ns2:Service IdcService="GET_FILE">
<ns2:Document>
<ns2:Field name="dID">401</ns2:Field>
<ns2:Field name="IdcService">GET_FILE</ns2:Field>
....
<ns2:ResultSet name="FILE_DOC_INFO">
<ns2:Row>
<ns2:Field name="dID">401</ns2:Field>
<ns2:Field name="dDocName">UCMFA000401</ns2:Field>
<ns2:Field name="dDocType">Document</ns2:Field>
<ns2:Field name="dDocTitle">JRD Test</ns2:Field>
<ns2:Field name="dDocAuthor">FAAdmin</ns2:Field>
<ns2:Field name="dRevClassID">401</ns2:Field>
<ns2:Field name="dOriginalName">Readme.html</ns2:Field>
</ns2:Row>
</ns2:ResultSet>
<ns2:File name="" href="/u01/app/fa/config/domains/fusionhost.mycompany.com/CommonDomain/ucm/cs/vault/document/bwzh/mdaw/401.html">
<ns2:Contents>
<xop:Include href="cid:7405676a-11f8-442d-b13c-f8f6c2b682e4" xmlns:xop="http://www.w3.org/2004/08/xop/include"/>
</ns2:Contents>
</ns2:File>
</ns2:Document>
</ns2:Service>
</ns2:GenericResponse>

Import (Upload) File for HDL

The above sample can also be used to import files into the WebCenter Content repository for inbound integration or other use cases. The service name is CHECKIN_UNIVERSAL.
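A CHECKIN_UNIVERSAL request follows the same GenericRequest shape as GET_FILE. The following is a hedged sketch: the metadata values are placeholders (required fields can vary by configuration), reusing the security group and account conventions noted earlier in this post:

```xml
<ucm:GenericRequest webKey="cs">
  <ucm:Service IdcService="CHECKIN_UNIVERSAL">
    <ucm:Document>
      <!-- placeholder metadata; adjust to your environment -->
      <ucm:Field name="dDocTitle">Worker Data Load</ucm:Field>
      <ucm:Field name="dDocType">Document</ucm:Field>
      <ucm:Field name="dSecurityGroup">FAFusionImportExport</ucm:Field>
      <ucm:Field name="dDocAccount">hcm/dataloader/import</ucm:Field>
      <ucm:File name="primaryFile" href="WorkerData.zip">
        <ucm:Contents><!-- file content, typically sent as an MTOM attachment --></ucm:Contents>
      </ucm:File>
    </ucm:Document>
  </ucm:Service>
</ucm:GenericRequest>
```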

Summary

This post demonstrates how to secure and automate the export and import of data files in the WebCenter Content server implemented by Fusion HCM Cloud. It further demonstrates how integration tools such as SOA can automate, extend, and orchestrate integration between HCM in the cloud and Oracle or non-Oracle applications, whether in the cloud or on-premise.

The SOA sample code is here.

Oracle HCM Cloud – Bulk Integration Automation Using SOA Cloud Service


Introduction

Oracle Human Capital Management (HCM) Cloud provides a comprehensive set of tools, templates, and pre-packaged integrations to cover various scenarios using modern and efficient technologies. One of these patterns is batch integration to load and extract data to and from the HCM cloud. HCM provides the following bulk integration interfaces and tools:

HCM Data Loader (HDL)

HDL is a powerful tool for bulk-loading data from any source to Oracle Fusion HCM. It supports important business objects belonging to key Oracle Fusion HCM products, including Oracle Fusion Global Human Resources, Compensation, Absence Management, Performance Management, Profile Management, Global Payroll, Talent and Workforce Management. For detailed information on HDL, please refer to this.

HCM Extracts

HCM Extracts is an outbound integration tool that lets you select HCM data elements, extract them from the HCM database, and archive them as XML. This archived raw XML data can be converted into a desired format and delivered to supported channels and recipients.

Oracle Fusion HCM provides the above tools with comprehensive user interfaces for initiating data uploads, monitoring upload progress, and reviewing errors, with real-time information provided for both the import and load stages of upload processing. Fusion HCM provides the tools, but additional orchestration is required, such as generating the FBL or HDL file, uploading it to WebCenter Content, and initiating the FBL or HDL web services. This post describes how to design and automate these steps leveraging Oracle Service Oriented Architecture (SOA) Cloud Service deployed on Oracle’s cloud Platform as a Service (PaaS) infrastructure. For more information on SOA Cloud Service, please refer to this.

Oracle SOA is the industry’s most complete and unified application integration and SOA solution. It transforms complex application integration into agile and re-usable service-based components to speed time to market, respond faster to business requirements, and lower costs. SOA facilitates the development of enterprise applications as modular business web services that can be easily integrated and reused, creating a truly flexible, adaptable IT infrastructure. For more information on getting started with Oracle SOA, please refer to this. For developing SOA applications using SOA Suite, please refer to this.

These bulk integration interfaces and patterns are not applicable to Oracle Taleo.

Main Article

 

HCM Inbound Flow (HDL)

Oracle WebCenter Content (WCC) acts as the staging repository for files to be loaded and processed by HDL. WCC is part of the Fusion HCM infrastructure.

The loading process for FBL and HDL consists of the following steps:

  • Upload the data file to WCC/UCM using WCC GenericSoapPort web service
  • Invoke the “LoaderIntegrationService” or the “HCMDataLoader” service to initiate the loading process.

However, the above steps assume the existence of an HDL file and do not provide a mechanism to generate an HDL file for the respective objects. In this post we use a sample use case in which we receive a data file from the customer, transform the data to generate an HDL file, and then initiate the loading process.

The following diagram illustrates the typical orchestration of the end-to-end HDL process using SOA cloud service:

 

hcm_inbound_v1

HCM Outbound Flow (Extract)

The “Extract” process for HCM has the following steps:

  • An Extract report is generated in HCM either by user or through Enterprise Scheduler Service (ESS)
  • Report is stored in WCC under the hcm/dataloader/export account.

 

However, the report must then be delivered to its destination depending on the use cases. The following diagram illustrates the typical end-to-end orchestration after the Extract report is generated:

hcm_outbound_v1

 

For HCM bulk integration introduction including security, roles and privileges, please refer to my blog Fusion HCM Cloud – Bulk Integration Automation using Managed File Transfer (MFT) and Node.js. For an introduction to WebCenter Content integration services using SOA, please refer to my blog Fusion HCM Cloud Bulk Automation.

 

Sample Use Case

Assume that a customer periodically receives benefits data from their partner in a comma-separated value (CSV) file. This data must be converted into HDL format for the “ElementEntry” object, and the loading process must then be initiated in Fusion HCM Cloud.

This is a sample source data:

E138_ASG,2015/01/01,2015/12/31,4,UK LDG,CRP_UK_MNTH,E,H,Amount,23,Reason,Corrected all entry value,Date,2013-01-10
E139_ASG,2015/01/01,2015/12/31,4,UK LDG,CRP_UK_MNTH,E,H,Amount,33,Reason,Corrected one entry value,Date,2013-01-11

This is the HDL format of the ElementEntry object that needs to be generated based on the above sample file:

METADATA|ElementEntry|EffectiveStartDate|EffectiveEndDate|AssignmentNumber|MultipleEntryCount|LegislativeDataGroupName|ElementName|EntryType|CreatorType
MERGE|ElementEntry|2015/01/01|2015/12/31|E138_ASG|4|UK LDG|CRP_UK_MNTH|E|H
MERGE|ElementEntry|2015/01/01|2015/12/31|E139_ASG|4|UK LDG|CRP_UK_MNTH|E|H
METADATA|ElementEntryValue|EffectiveStartDate|EffectiveEndDate|AssignmentNumber|MultipleEntryCount|LegislativeDataGroupName|ElementName|InputValueName|ScreenEntryValue
MERGE|ElementEntryValue|2015/01/01|2015/12/31|E138_ASG|4|UK LDG|CRP_UK_MNTH|Amount|23
MERGE|ElementEntryValue|2015/01/01|2015/12/31|E138_ASG|4|UK LDG|CRP_UK_MNTH|Reason|Corrected all entry value
MERGE|ElementEntryValue|2015/01/01|2015/12/31|E138_ASG|4|UK LDG|CRP_UK_MNTH|Date|2013-01-10
MERGE|ElementEntryValue|2015/01/01|2015/12/31|E139_ASG|4|UK LDG|CRP_UK_MNTH|Amount|33
MERGE|ElementEntryValue|2015/01/01|2015/12/31|E139_ASG|4|UK LDG|CRP_UK_MNTH|Reason|Corrected one entry value
MERGE|ElementEntryValue|2015/01/01|2015/12/31|E139_ASG|4|UK LDG|CRP_UK_MNTH|Date|2013-01-11
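Before diving into the SOA design, the target mapping can be sketched in plain Python. This is only an illustration of the row-splitting logic; in this post the actual conversion is done with Native Format Builder and XSLT inside the composite.

```python
import csv
import io

ENTRY_HDR = ("METADATA|ElementEntry|EffectiveStartDate|EffectiveEndDate|"
             "AssignmentNumber|MultipleEntryCount|LegislativeDataGroupName|"
             "ElementName|EntryType|CreatorType")
VALUE_HDR = ("METADATA|ElementEntryValue|EffectiveStartDate|EffectiveEndDate|"
             "AssignmentNumber|MultipleEntryCount|LegislativeDataGroupName|"
             "ElementName|InputValueName|ScreenEntryValue")

def csv_to_hdl(csv_text: str) -> str:
    """Convert the sample CSV layout into HDL: one ElementEntry row plus one
    ElementEntryValue row per trailing (name, value) pair."""
    entries, values = [ENTRY_HDR], [VALUE_HDR]
    for row in csv.reader(io.StringIO(csv_text)):
        asg, start, end, count, ldg, elem, etype, ctype = row[:8]
        key = "|".join([start, end, asg, count, ldg, elem])
        entries.append(f"MERGE|ElementEntry|{key}|{etype}|{ctype}")
        # each trailing (name, value) pair becomes one ElementEntryValue row
        for name, value in zip(row[8::2], row[9::2]):
            values.append(f"MERGE|ElementEntryValue|{key}|{name}|{value}")
    return "\n".join(entries + values)
```

Feeding the two sample source rows into `csv_to_hdl` reproduces the HDL output shown above.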

SOA Cloud Service Design and Implementation

A canonical schema pattern has been implemented to design the end-to-end inbound bulk integration process – from the source data file to generating the HDL file and initiating the loading process in HCM Cloud. The XML schema of the HDL object “ElementEntry” is created. The source data is mapped to this HDL schema, and SOA activities generate the HDL file.

Having a canonical pattern automates the generation of the HDL file and makes it a reusable asset for various interfaces. The developer or business user only needs to focus on mapping the source data to this canonical schema. All other activities, such as generating the HDL file, compressing and encrypting it, uploading it to WebCenter Content and invoking web services, need to be developed only once and then become reusable assets themselves.

Please refer to Wikipedia for the definition of Canonical Schema Pattern

The design consists of the following steps:

1. Convert source data file from delimited format to XML

2. Generate Canonical Schema of ElementEntry HDL Object

3. Transform source XML data to HDL canonical schema

4. Generate and compress HDL file

5. Upload a file to WebCenter Content and invoke HDL web service

 

Please refer to SOA Cloud Service Develop and Deploy for introduction and creating SOA applications.

SOA Composite Design

This is a composite based on above implementation principles:

hdl_composite

Convert Source Data to XML

“GetEntryData” in the above composite is a File Adapter service. It is configured to use native format builder to convert CSV data to XML format. For more information on File Adapter, refer to this. For more information on Native Format Builder, refer to this.

The following provides detailed steps on how to use Native Format Builder in JDeveloper:

In Native Format Builder, select the delimited format type and use the source data as a sample to generate an XML schema. Please see the following diagrams:

FileAdapterConfig

nxsd1

nxsd2_v1 nxsd3_v1 nxsd4_v1 nxsd5_v1 nxsd6_v1 nxsd7_v1

Generate XML Schema of ElementEntry HDL Object

A similar approach is used to generate ElementEntry schema. It has two main objects: ElementEntry and ElementEntryValue.

ElementEntry Schema generated using Native Format Builder

<?xml version = '1.0' encoding = 'UTF-8'?>
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd" xmlns:tns="http://TargetNamespace.com/GetEntryHdlData" targetNamespace="http://TargetNamespace.com/GetEntryHdlData" elementFormDefault="qualified" attributeFormDefault="unqualified" nxsd:version="NXSD" nxsd:stream="chars" nxsd:encoding="UTF-8">
<xsd:element name="Root-Element">
<xsd:complexType>
<xsd:sequence>
<xsd:element name="Entry" minOccurs="1" maxOccurs="unbounded">
<xsd:complexType>
<xsd:sequence>
<xsd:element name="METADATA" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="ElementEntry" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="EffectiveStartDate" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="EffectiveEndDate" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="AssignmentNumber" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="MultipleEntryCount" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="LegislativeDataGroupName" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="ElementName" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="EntryType" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="CreatorType" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="${eol}" nxsd:quotedBy="&quot;"/>
</xsd:sequence>
</xsd:complexType>
</xsd:element>
</xsd:sequence>
</xsd:complexType>
</xsd:element>
<xsd:annotation>
<xsd:appinfo>NXSDSAMPLE=/ElementEntryAllSrc.dat</xsd:appinfo>
<xsd:appinfo>USEHEADER=false</xsd:appinfo>
</xsd:annotation>
</xsd:schema>

ElementEntryValue Schema generated using Native Format Builder

<?xml version = '1.0' encoding = 'UTF-8'?>
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd" xmlns:tns="http://TargetNamespace.com/GetEntryValueHdlData" targetNamespace="http://TargetNamespace.com/GetEntryValueHdlData" elementFormDefault="qualified" attributeFormDefault="unqualified" nxsd:version="NXSD" nxsd:stream="chars" nxsd:encoding="UTF-8">
<xsd:element name="Root-Element">
<xsd:complexType>
<xsd:sequence>
<xsd:element name="EntryValue" minOccurs="1" maxOccurs="unbounded">
<xsd:complexType>
<xsd:sequence>
<xsd:element name="METADATA" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="ElementEntryValue" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="EffectiveStartDate" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="EffectiveEndDate" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="AssignmentNumber" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="MultipleEntryCount" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="LegislativeDataGroupName" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="ElementName" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="InputValueName" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="ScreenEntryValue" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="${eol}" nxsd:quotedBy="&quot;"/>
</xsd:sequence>
</xsd:complexType>
</xsd:element>
</xsd:sequence>
</xsd:complexType>
</xsd:element>
<xsd:annotation>
<xsd:appinfo>NXSDSAMPLE=/ElementEntryAllSrc.dat</xsd:appinfo>
<xsd:appinfo>USEHEADER=false</xsd:appinfo>
</xsd:annotation>
</xsd:schema>

In Native Format Builder, change the “|” separator to “,” in the sample file, then change the delimiter back to “|” for each element in the generated schema.

Transform Source XML Data to HDL Canonical Schema

Since we are using a canonical schema, all we need to do is map the source data appropriately, and Native Format Builder will convert each object into the HDL output file. The transformation could be complex depending on the source data format and the organization of data values. In our sample use case, each row contains one ElementEntry object and 3 ElementEntryValue sub-objects.

The following provides the organization of the data elements in a single row of the source:

Entry_Desc_v1

The main ElementEntry attributes are mapped from each respective row, but the ElementEntryValue attributes are located at the end of each row. In this sample, this results in 3 ElementEntryValue entries per row. This can be achieved easily by splitting and transforming each row with different mappings as follows:

<xsl:for-each select="/ns0:Root-Element/ns0:Entry"> – map pair of columns “1” from above diagram

<xsl:for-each select="/ns0:Root-Element/ns0:Entry"> – map pair of columns “2” from above diagram

<xsl:for-each select="/ns0:Root-Element/ns0:Entry"> – map pair of columns “3” from above diagram

 

Metadata Attribute

The most common use case is the “merge” action for creating and updating objects. In this use case it is hard-coded to “merge”, but the action could be made dynamic if the source data row carries this information. The “delete” action removes the entire record and must not be combined with a “merge” instruction for the same record, as HDL cannot guarantee the order in which the instructions will be processed. It is highly recommended to correct the data rather than delete and recreate it using the “delete” action; deleted data cannot be recovered.

 

This is the sample XSLT transformation developed in JDeveloper to split each row into 3 rows for the ElementEntryValue object:

<xsl:template match="/">
<tns:Root-Element>
<xsl:for-each select="/ns0:Root-Element/ns0:Entry">
<tns:Entry>
<tns:METADATA>
<xsl:value-of select="'MERGE'"/>
</tns:METADATA>
<tns:ElementEntry>
<xsl:value-of select="'ElementEntryValue'"/>
</tns:ElementEntry>
<tns:EffectiveStartDate>
<xsl:value-of select="ns0:C2"/>
</tns:EffectiveStartDate>
<tns:EffectiveEndDate>
<xsl:value-of select="ns0:C3"/>
</tns:EffectiveEndDate>
<tns:AssignmentNumber>
<xsl:value-of select="ns0:C1"/>
</tns:AssignmentNumber>
<tns:MultipleEntryCount>
<xsl:value-of select="ns0:C4"/>
</tns:MultipleEntryCount>
<tns:LegislativeDataGroupName>
<xsl:value-of select="ns0:C5"/>
</tns:LegislativeDataGroupName>
<tns:ElementName>
<xsl:value-of select="ns0:C6"/>
</tns:ElementName>
<tns:EntryType>
<xsl:value-of select="ns0:C9"/>
</tns:EntryType>
<tns:CreatorType>
<xsl:value-of select="ns0:C10"/>
</tns:CreatorType>
</tns:Entry>
</xsl:for-each>
<xsl:for-each select="/ns0:Root-Element/ns0:Entry">
<tns:Entry>
<tns:METADATA>
<xsl:value-of select="'MERGE'"/>
</tns:METADATA>
<tns:ElementEntry>
<xsl:value-of select="'ElementEntryValue'"/>
</tns:ElementEntry>
<tns:EffectiveStartDate>
<xsl:value-of select="ns0:C2"/>
</tns:EffectiveStartDate>
<tns:EffectiveEndDate>
<xsl:value-of select="ns0:C3"/>
</tns:EffectiveEndDate>
<tns:AssignmentNumber>
<xsl:value-of select="ns0:C1"/>
</tns:AssignmentNumber>
<tns:MultipleEntryCount>
<xsl:value-of select="ns0:C4"/>
</tns:MultipleEntryCount>
<tns:LegislativeDataGroupName>
<xsl:value-of select="ns0:C5"/>
</tns:LegislativeDataGroupName>
<tns:ElementName>
<xsl:value-of select="ns0:C6"/>
</tns:ElementName>
<tns:EntryType>
<xsl:value-of select="ns0:C11"/>
</tns:EntryType>
<tns:CreatorType>
<xsl:value-of select="ns0:C12"/>
</tns:CreatorType>
</tns:Entry>
</xsl:for-each>
<xsl:for-each select="/ns0:Root-Element/ns0:Entry">
<tns:Entry>
<tns:METADATA>
<xsl:value-of select="'MERGE'"/>
</tns:METADATA>
<tns:ElementEntry>
<xsl:value-of select="'ElementEntryValue'"/>
</tns:ElementEntry>
<tns:EffectiveStartDate>
<xsl:value-of select="ns0:C2"/>
</tns:EffectiveStartDate>
<tns:EffectiveEndDate>
<xsl:value-of select="ns0:C3"/>
</tns:EffectiveEndDate>
<tns:AssignmentNumber>
<xsl:value-of select="ns0:C1"/>
</tns:AssignmentNumber>
<tns:MultipleEntryCount>
<xsl:value-of select="ns0:C4"/>
</tns:MultipleEntryCount>
<tns:LegislativeDataGroupName>
<xsl:value-of select="ns0:C5"/>
</tns:LegislativeDataGroupName>
<tns:ElementName>
<xsl:value-of select="ns0:C6"/>
</tns:ElementName>
<tns:EntryType>
<xsl:value-of select="ns0:C13"/>
</tns:EntryType>
<tns:CreatorType>
<xsl:value-of select="ns0:C14"/>
</tns:CreatorType>
</tns:Entry>
</xsl:for-each>
</tns:Root-Element>
</xsl:template>

BPEL Design – “ElementEntryPro…”

This is a BPEL component where all the major orchestration activities are defined. In this sample, all the activities after the transformation are reusable and can be moved to a separate composite. A separate composite could handle only the transformation and data enrichment, and at the end invoke the reusable composite to complete the loading process.

 

hdl_bpel_v2

 

 

SOA Cloud Service Instance Flows

The following diagram shows an instance flow:

ElementEntry Composite Instance

instance1

BPEL Instance Flow

audit_1

Receive Input Activity – receives the delimited data, converted to XML through Native Format Builder, using the File Adapter

audit_2

Transformation to Canonical ElementEntry data

Canonical_entry

Transformation to Canonical ElementEntryValue data

Canonical_entryvalue

Conclusion

This post demonstrates how to automate HCM inbound and outbound patterns using SOA Cloud Service. It shows how to convert a customer’s data to HDL format and then initiate the loading process. The approach can also be replicated for other Fusion Applications pillars such as Oracle Enterprise Resource Planning (ERP).

Common WSDL Issues in ICS and How to Solve Them


Introduction

When using SOAP Web Services, WSDL documents play a very important role, so it is important to know how to handle them. A good grasp of WSDL basics and troubleshooting is very helpful when working with Oracle ICS (Integration Cloud Service), because most of the built-in SaaS adapters available in ICS connect with their applications using SOAP. Salesforce.com, Oracle CPQ and Oracle Sales Cloud are good examples, not to mention the generic SOAP adapter. As their first setup step, most of these adapters require a WSDL document that describes the structural elements necessary to perform SOAP-based message exchanges, such as message types, port types and bindings.

Properly parsing a WSDL in ICS is a critical step for three reasons:

1) It describes how ICS will connect with the application, leveraging the bindings and SOAP addresses within it.

2) It allows ICS to discover the business objects and operations, which are eventually used in the mapping phase.

3) For adapters that provide automatic mapping recommendations, the adapter must correctly parse all complex types available in the types section of the WSDL document.

Failing to parse the WSDL document of an application pretty much invalidates any further work in ICS. This blog presents common issues found while handling WSDL documents, and what can be done to solve them.

Rule Of Thumb: Correctly Inspect the WSDL

Regardless of which issue you are having with WSDL documents, a best practice is to always inspect the WSDL content. Most people wrongly assume that if the WSDL is accessible via its URL, then it is valid: they enter the URL in the browser and check whether any content is displayed. If content is displayed, it means the WSDL is accessible and no network restrictions are in place. However, the content shown in the browser can differ significantly from the raw content returned by the server, so what you see is not what ICS gets.

Tip: From the ICS perspective, the raw content of the WSDL is what the adapters rely on to generate and build the runtime artifacts. Keep this in mind if you are working with any SOAP-based adapter.

This happens because most modern browsers have built-in formatting features that are applied to the content received by the servers. These features present the content in a much better view for end users, such as removing empty lines, coloring the text or breaking down structured contents (such as XML derived documents) into a tree view. For instance, figure 1 shows a WSDL document opened in Google Chrome, where formatting took place while accessing the content.

fig1

Figure 1: Formatted WSDL content being shown in Google Chrome.

Do not rely on what the browser displays; this is a huge mistake, since the browser may obfuscate some issues in the WSDL. A better way to inspect the WSDL is getting access to its raw content. This can be accomplished using several techniques; but from the browser, you can access an option called “View Page Source” that displays the content in raw format, which allows you to copy-and-paste the content into a text editor. Save the content AS-IS in a file with a .wsdl extension. That file must be your starting point to troubleshoot any WSDL issue.

#1 Common Issue: Bad Generated WSDL Documents

Although SOAP-based Web Services are regulated by a W3C specification, which technology to use and how the Web Services are implemented is entirely up to the developer. Thus, there are many ways to implement them, and their WSDL can also be created using different approaches. A common practice is having the WSDL automatically generated on demand, meaning the WSDL is created when its URL is invoked. While this is good practice, since it ensures that the WSDL is always up-to-date with the Web Service implementation, it can also introduce issues on the consumer side.

For example, there are cases where the WSDL is generated with empty lines at the beginning of the document. Issues like this generate parsing errors because, according to the XML specification, nothing can appear before the XML declaration (i.e., <?xml). If that happens, you have to make sure that those empty lines are removed from the WSDL before using it in the ICS connections page. Figure 2 shows an example of a badly generated WSDL document.

fig2

Figure 2: Bad generated WSDL document, with illegal empty lines before the XML declaration.

While on ICS’s connections page, if you use the WSDL shown in figure 2 and hit the “Test” button, ICS will throw a parsing error. This pretty much invalidates the connection, because a connection must be 100% complete before it can be used in integrations. Figure 3 shows the error thrown by ICS.

fig3

Figure 3: Error thrown by ICS after testing the connection.

To solve this issue, make sure that the generated WSDL has no empty lines before the XML declaration. While this is really simple, it can be hard to accomplish if the people responsible for the Web Service have no control over the WSDL generation. It is not uncommon for the exposed Web Service to be part of a product that cannot be easily changed. If that happens, an alternative is hosting a modified version of the WSDL on an HTTP Web server and pointing ICS to that server instead. As long as the port types don’t have their SOAP addresses changed, this should work. The downside of this approach is the additional overhead it introduces: another layer to implement, patch and monitor.
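When preparing such a modified copy, the cleanup itself is trivial. A minimal sketch in Python, assuming you have saved the raw WSDL content to a file or string:

```python
def strip_leading_junk(wsdl_text: str) -> str:
    """Remove anything (blank lines, whitespace, a BOM) that appears before
    the XML declaration, which the XML spec requires to come first."""
    idx = wsdl_text.find("<?xml")
    # keep the document as-is if no declaration is present
    return wsdl_text[idx:] if idx >= 0 else wsdl_text
```

Run the saved .wsdl file through this function before uploading it to the hosting Web server or using it in ICS.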

#2 Common Issue: Non-Well-Formed WSDL Documents

Whether the WSDL is automatically generated or statically defined, it is the responsibility of the service provider to make sure that the document is well formed. If the WSDL is not well formed, the ICS parser will not be able to validate the document and an error will be thrown. Just like the first common issue, this invalidates the connection, which then cannot be used when building integrations.

This site contains the set of rules that XML documents must adhere to in order to be considered well formed, as well as a validator tool that you can use to check whether a WSDL document is valid.
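A quick local check is also possible with any XML parser; for example, a small Python sketch that mimics what the ICS parser does first:

```python
import xml.etree.ElementTree as ET

def is_well_formed(doc: str) -> bool:
    """Return True if the document parses as well-formed XML,
    False on any syntax error (unclosed tags, stray characters, etc.)."""
    try:
        ET.fromstring(doc)
        return True
    except ET.ParseError:
        return False
```

Running a saved WSDL through this function before using it in the ICS connections page catches well-formedness problems early.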

#3 Common Issue: Types in Separated Documents

Some Web Services that have their WSDL automatically generated define the types used within the WSDL in a separate document. This means that the WSDL only mentions the types by their names; the types themselves are defined someplace else, typically in an XML schema document that the WSDL points to using the import clause. This practice improves the reusability of the element types and allows them to be used in more than one Web Service definition.

While this practice is great from the service provider's point of view, it might cause issues for the service consumer. If for some reason ICS is not able to completely retrieve the types used in the WSDL, then it will not be able to create the business objects and operations for the integration. This might happen if ICS cannot access the URL mentioned due to network connectivity issues such as firewalls or proxies. Figure 4 shows an example of a WSDL that accesses its types using the import clause.

fig4

Figure 4: WSDL document using the import clause for the types.

This situation can be tricky to foresee, because any error related to this practice will only occur when you start building the integration. The connections page will inform the user that the connection is “complete”, because the test performed there does not establish any physical connection; it only checks whether the WSDL document is valid. But when you start building your integration, an error might be thrown when the wizard tries to retrieve the business objects for a given operation. If that happens, make sure that every URL used in an import clause is reachable. If the error still persists, you will have no choice but to include all types directly in the WSDL, manually.
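To find out in advance which external documents ICS must be able to reach, you can list the schemaLocation URLs referenced by the WSDL. A small sketch (it only covers xsd:import/xsd:include; wsdl:import clauses would need the same treatment with the WSDL namespace):

```python
import xml.etree.ElementTree as ET

XSD_NS = "http://www.w3.org/2001/XMLSchema"

def imported_schema_locations(wsdl_text: str) -> list:
    """Return the schemaLocation URLs of all xsd:import / xsd:include
    elements -- the external documents the consumer must also reach."""
    root = ET.fromstring(wsdl_text)
    locations = []
    for tag in ("import", "include"):
        for element in root.iter(f"{{{XSD_NS}}}{tag}"):
            location = element.get("schemaLocation")
            if location:
                locations.append(location)
    return locations
```

Each returned URL can then be tested for reachability from a host with network access comparable to ICS.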

Conclusion

Most built-in adapters found in ICS natively connect with SaaS applications using SOAP Web Services. Because of this, being familiar with WSDL documents is essential to obtain the maximum benefit from them. However, there are WSDL-related issues that many ICS users might face. This blog explored a few issues discovered while using WSDL documents in ICS and presented how to solve them.

EDI Processing with B2B in hybrid SOA Cloud Cluster integrating On-Premise Endpoints


Executive Overview

SOA Cloud Service (SOACS) can be used to support the B2B commerce requirements of many large corporations. This article discusses a common use case of EDI processing with Oracle B2B within SOA Cloud Service in a hybrid cloud architecture. The documents are received from, and sent to, on-premise endpoints using SFTP channels configured over SSH tunnels.

Solution Approach

Overview

The overall solution is described in the diagram shown here.

B2BCloudFlow

An XML file with PurchaseOrder content is sent to a SOACS instance running in Oracle Public Cloud (OPC) from an on-premise SFTP server.

The XML file is received by an FTP Adapter in a simple composite for hand-off to B2B. The B2B engine within SOACS then generates the actual EDI file and transmits it over an SFTP delivery channel back to an on-premise endpoint.

In reality, the endpoint can be any endpoint inside or outside the corporate firewall. Communication with an external endpoint is trivial and hence left out of the discussion here. Using the techniques of SSH tunnels, the objective here is to demonstrate the ease by which any on-premises endpoint can be seamlessly integrated into the SOA Cloud Service hybrid solution architecture.

Our environment involves a SOACS domain on OPC with 2 managed servers. Hence, the communication with an on-premise endpoint is configured using SSH tunnels as described in my team-mate, Christian Weeks’ blog on SSH tunnel for on-premises connectivity in SOA Cloud clusters[1].

If the SOACS domain contains only a single SOACS node, then a simpler approach can also be used to establish the on-premise connectivity via SSH tunneling, as described in my blog on simple SSH tunnel connectivity for on-premises databases from SOA Cloud instance[2].

The following sections walk through the details of setting up the flow for a PurchaseOrder XML document from an on-premise back-end application, like eBusiness Suite to the 850 X12 EDI generated for transmission to an external trading partner.

Summary of Steps

  • Copy the private key of SOACS instance to the on-premise SFTP server
  • Update the whitelist for SOACS compute nodes to allow traffic between the SOACS compute nodes and the on-premise endpoints via the intermediate gateway compute node, referred to as CloudGatewayforOnPremTunnel in the rest of this post. This topic has also been extensively discussed in Christian’s blog[1].
  • Establish an SSH tunnel from the on-premise SFTP server (OnPremSFTPServer) to the Cloud Gateway listener host identified within the SOA Cloud Service compute nodes (CloudGatewayforOnPremTunnel). The role of this host in establishing the SSH tunnel for a cluster has been extensively discussed in Christian’s blog[1]. This SSH tunnel, as described, specifies a local port and a remote port. The local port is the listening port of the SFTP server (default 22), and the remote port can be any port that is available within the SOACS instance (e.g. 2522).
  • Update FTP Adapter’s outbound connection pool configuration to include the new endpoint and redeploy. Since we have a cluster within the SOA Cloud service, the standard JNDI entries for eis/ftp/HAFtpAdapter should be used.
  • Define a new B2B delivery channel for the OnPremise SFTP server using the redirected ports for SFTP transmission.
  • Develop a simple SOA composite to receive the XML  payload via FTP adapter and hand-off to B2B using B2B Adapter.
  • Deploy the B2B agreement and the SOA composite.
  • Test the entire round-trip flow for generation of an 850 X12 EDI from a PurchaseOrder XML file.

sftpTunnel

Task and Activity Details

The following sections will walk through the details of individual steps. The environment consists of the following key machines:

  • SOACS cluster with 2 managed servers and all the dependent cloud services within OPC.
  • A compute node within SOACS instance is identified to be the gateway listener for the SSH tunnel from on-premise hosts (CloudGatewayforOnPremTunnel)
  • Linux machine inside the corporate firewall, used for hosting the On-Premise SFTP Server (myOnPremSFTPServer)

I. Copy the private key of SOACS instance to the on-premise SFTP server

When a SOACS instance is created, a public key file is uploaded for establishing SSH sessions. The corresponding private key has to be copied to the SFTP server. The private key can then be used to start the SSH tunnel from the SFTP server to the SOACS instance.

Alternatively, a private/public key can be generated in the SFTP server and the public key can be copied into the authorized_keys file of the SOACS instance. In the example here, the private key for the SOACS instance has been copied to the SFTP server. A transcript of a typical session is shown below.

slahiri@slahiri-lnx:~/stage/cloud$ ls -l shubsoa_key*
-rw------- 1 slahiri slahiri 1679 Dec 29 18:05 shubsoa_key
-rw-r--r-- 1 slahiri slahiri 397 Dec 29 18:05 shubsoa_key.pub
slahiri@slahiri-lnx:~/stage/cloud$ scp shubsoa_key myOnPremSFTPServer:/home/slahiri/.ssh
slahiri@myOnPremSFTPServer's password:
shubsoa_key                                                                                100% 1679        1.6KB/s     00:00
slahiri@slahiri-lnx:~/stage/cloud$

On the on-premise SFTP server, login and confirm that the private key for SOACS instance has been copied in the $HOME/.ssh directory.

[slahiri@myOnPremSFTPServer ~/.ssh]$ pwd
/home/slahiri/.ssh
[slahiri@myOnPremSFTPServer ~/.ssh]$ ls -l shubsoa_key
-rw-------+ 1 slahiri g900 1679 Jan  9 06:39 shubsoa_key
[slahiri@myOnPremSFTPServer ~/.ssh]$

II. Create whitelist entries to allow communications between different SOACS compute nodes and on-premise SFTP server

The details about creation of a new security application and rule have been discussed extensively in Christian’s blog[1]. For the sake of brevity, just the relevant parameters for the definition are shown here. These entries are created from the Compute Node Service Console under Network tab.

Security Application
  • Name: OnPremSFTPServer_sshtunnel_sftp
  • Port Type: tcp
  • Port Range Start: 2522
  • Port Range End: 2522
  • Description: SSH Tunnel for On-Premises SFTP Server
Security Rule
  • Name: OnPremSFTPServer_ssh_sftp
  • Status: Enabled
  • Security Application: OnPremSFTPServer_sshtunnel_sftp (as created in last step)
  • Source: Security Lists – ShubSOACS-jcs/wls/ora-ms (select entry that refers to all the managed servers in the cluster)
  • Destination: ShubSOACS-jcs/lb/ora_otd (select the host designated to be CloudGatewayforOnPremTunnel, which could be either the DB or LBR VM)
  • Description: ssh tunnel for On-Premises SFTP Server

III. Create an SSH Tunnel from On-Premise SFTP Server to the CloudGatewayforOnPremTunnel VM’s public IP

Using the private key from Step I, start an SSH session from the on-premise SFTP server host to the CloudGatewayforOnPremTunnel, specifying the local and remote ports. As mentioned earlier, the local port is the standard port for SFTP daemon, e.g. 22. The remote port is any suitable port that is available in the SOACS instance. The syntax of the ssh command used is shown here.

ssh -R :<remote-port>:<host>:<local port> -i <private keyfile> opc@<CloudGatewayforOnPremTunnel VM IP>

The session transcript is shown below.

[slahiri@myOnPremSFTPServer ~/.ssh]$ ssh -v -R :2522:localhost:22 -i ./shubsoa_key opc@CloudGatewayforOnPremTunnel
[opc@CloudGatewayforOnPremTunnel ~]$ netstat -an | grep 2522
tcp        0      0 127.0.0.1:2522              0.0.0.0:*                   LISTEN
tcp        0      0 ::1:2522                         :::*                            LISTEN
[opc@CloudGatewayforOnPremTunnel ~]$

After establishing the SSH tunnel, the netstat utility confirms that remote port 2522 is in listening mode within the Cloud Gateway VM. This remote port (2522) and localhost, along with the other on-premise SFTP parameters, can now be used to define an endpoint in the FTP Adapter’s outbound connection pool in the WebLogic Administration Server (WLS) console.
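The same listener check can be scripted from any host with Python available, which is handy when netstat output is ambiguous. This is a generic TCP probe sketch; the host and port values (localhost, 2522) are the ones from the example above.

```python
import socket

def port_is_listening(host: str, port: int, timeout: float = 2.0) -> bool:
    """Programmatic equivalent of the netstat check: return True if a TCP
    connection to the forwarded tunnel port can be opened."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

For example, `port_is_listening("localhost", 2522)` on the Cloud Gateway VM should return True while the tunnel is up.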

IV. Define a new JNDI entry for FTP Adapter that uses the on-premise SFTP server via the SSH  tunnel

From the WLS console, under Deployments, update the FtpAdapter application by defining parameters for the outbound connection pool JNDI entry for clusters, i.e., eis/Ftp/HAFtpAdapter.

The remote port from Step III is used when defining the port within the JNDI entry for the FTP Adapter. Note that the host specified will be CloudGatewayforOnPremTunnel instead of the actual on-premise hostname or address of the SFTP server, since port forwarding over the SSH tunnel was enabled locally within the SOACS instance in Step III.

It should be noted that SOA Cloud instances do not use any shared storage. So, the deployment plan must be copied to the file systems for each node before deployment of the FTP Adapter application.

The process to update the FtpAdapter deployment is fairly straightforward and follows the standard methodology. So, only the primary field values that are used in the JNDI definition are provided below.

  • JNDI under Outbound Connection Pools: eis/Ftp/HAFtpAdapter
  • Host: CloudGatewayforOnPremTunnel
  • Username: <SFTP User>
  • Password: <SFTP User Password>
  • Port: 2522
  • UseSftp: true

V. Configure B2B Metadata

Standard B2B configuration will be required to set up the trading partners, document definitions and agreements. The unique configuration pertaining to this test case involves setting up the SFTP delivery channel to send the EDI document to SFTP server residing on premises inside the corporate firewall. Again, the remote port from Step III is used in defining the port for the delivery channel. The screen-shot for channel definition is shown below.

edicloud6

After definition of the metadata, the agreement for outbound 850 EDI is deployed for runtime processing.

VI. Verification of SFTP connectivity

After the deployment of the FTP Adapter, another quick check of netstat for port 2522 may show additional entries indicating an established session corresponding to the newly created FTP Adapter. Connections are established and disconnected based on the polling interval of the FTP Adapter. Another way to verify SFTP connectivity is to manually launch an SFTP session from the command line, as shown here.

[opc@shubsoacs-jcs-wls-1 ~]$ sftp -oPort=2522 slahiri@CloudGatewayforOnPremTunnel
Connecting to CloudGatewayforOnPremTunnel...
The authenticity of host '[cloudgatewayforonpremtunnel]:2522 ([10.196.240.130]:2522)' can't be established.
RSA key fingerprint is 93:c3:5c:8f:61:c6:60:ac:12:31:06:13:58:00:50:eb.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '[cloudgatewayforonpremtunnel]:2522' (RSA) to the list of known hosts.
slahiri@cloudgatewayforonpremtunnel's password:
sftp> quit
[opc@shubsoacs-jcs-wls-1 ~]$

While this SFTP session is connected, a quick netstat check on the CloudGatewayforOnPremTunnel host will confirm the established session for port 2522 from the SOACS compute node.

[opc@CloudGatewayforOnPremTunnel ~]$ netstat -an | grep 2522
tcp        0       0 0.0.0.0:2522                       0.0.0.0:*                               LISTEN
tcp        0      0 10.196.240.130:2522         10.196.246.186:14059        ESTABLISHED
tcp        0       0 :::2522                                 :::*                                       LISTEN
[opc@CloudGatewayforOnPremTunnel ~]$

VII. Use the newly created JNDI to develop a SOA composite containing the FTP Adapter and B2B Adapter to hand off the XML payload from the SFTP server to the B2B engine

The simple SOA composite diagram built in JDeveloper for this test case is shown below.

The JNDI entry created in Step IV (eis/Ftp/HAFtpAdapter) is used in the FTP Adapter Wizard session within JDeveloper to set up a receiving endpoint from the on-premises SFTP server. A simple BPEL process is included to transfer the input XML payload to B2B. The B2B Adapter then hands off the XML payload to the B2B engine for generation of the X12 EDI in native format.

edicloud4

Deploy the composite via EM console to complete the design-time activities. We are now ready for testing.

VIII. Test the end-to-end EDI processing flow

After deployment, the entire flow can be tested by copying a PurchaseOrder XML file into the polling directory for incoming files on the on-premise SFTP server. An excerpt from the sample XML file used as the input to trigger the process is shown below.

[slahiri@myOnPremSFTPServer cloud]$ more po_850.xml
<Transaction-850 xmlns="http://www.edifecs.com/xdata/200" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" XDataVersion="1.0" Standard="X12" Version="V4010" CreatedDate="2007-04-10T17:16:24" CreatedBy="ECXEngine_837">
     <Segment-ST>
           <Element-143>850</Element-143>
           <Element-329>16950001</Element-329>
      </Segment-ST>
      <Segment-BEG>
           <Element-353>00</Element-353>
           <Element-92>SA</Element-92>
           <Element-324>815455</Element-324>
           <Element-328 xsi:nil="true"/>
           <Element-373>20041216</Element-373>
        </Segment-BEG>
–More–(7%)

The FTP Adapter of the SOA composite in the SOACS instance will pick up the XML file via the SSH tunnel and process it in Oracle Public Cloud, where the Oracle B2B engine generates the EDI. The EDI file will then be transmitted back to the on-premise SFTP server via the same SSH tunnel.

Results from the completed composite instance should be visible in the Enterprise Manager, as shown below.

edicloud2

Content of the EDI file along with the SFTP URL used to transmit the file can be seen in the B2B console, under Wire Message Reports section.

edicloud1

Summary

The test case described here is a quick way to demonstrate that SOA Cloud Service can easily be used in a hybrid architecture to model common B2B use cases that require access to on-premise endpoints. The EDI generation process and all the business-layer orchestration can be done in Oracle Public Cloud (OPC) with SOA Suite. Most importantly, integration with on-premise server endpoints can be enabled as needed via SSH tunnels to provide a hybrid cloud solution.

Acknowledgements

The SOACS Product Management and Engineering teams have been actively involved in the development of this solution for many months. It would not have been possible to deliver such a solution to customers without their valuable contribution.

References

  1. Setting up SSH tunnels for cloud to on-premise with SOA Cloud Service clusters – Christian Weeks, A-Team
  2. SOA Cloud Service – Quick and Simple Setup of an SSH Tunnel for On-Premises Database Connectivity – Shub Lahiri, A-Team

Integration Cloud Service (ICS) On-Premise Agent Installation


The Oracle On-Premises Agent (a.k.a. the Connectivity Agent) is necessary for Oracle ICS to communicate with on-premise resources without the need for firewall configurations or VPN. Additional details about the Agent can be found under New Agent Simplifies Cloud to On-premises Integration. The purpose of this A-Team blog is to give a consolidated and simplified flow of what is needed to install the agent, and to provide a foundation for other blogs (e.g., E-Business Suite Integration with Integration Cloud Service and DB Adapter). For the detailed online documentation for the On-Premises Agent, see Managing Agent Groups and the On-Premises Agent.

On-Premises Agent Installation

The high-level steps for getting the On-Premises Agent installed on your production POD consist of two activities: 1. Create an Agent Group in the ICS console, and 2. Run the On-Premises Agent installer. Step 2 is done on an on-premise Linux machine, and the end result is a lightweight WebLogic server instance running on port 7001.

Create an Agent Group

1. Login to the production ICS console and view landing page.
 ICSConnectivityAgent_001
2. Verify that the ICS version is 15.4.5 or greater.
ICSConnectivityAgent_002
ICSConnectivityAgent_003
3. Scroll down on ICS Home page and select Create Agents. Notice this brings you to the Agents page of the Designer section.
ICSConnectivityAgent_004
ICSConnectivityAgent_005
4. On the Agents page click on Create New Agent Group.
5. Provide a name for your agent group (e.g., AGENT_GROUP).
ICSConnectivityAgent_006
6. Review the Agent page containing new group.
ICSConnectivityAgent_007

Run the On-Premises Agent Installer

1. Click the Download Agent Installer drop-down on the Agent page, select Connectivity Agent, and save the file to the on-premise Linux machine where the agent will be installed and run.
ICSConnectivityAgent_008
ICSConnectivityAgent_009
2. Extract the contents of the zip file to obtain cloud-connectivity-agent-installer.bsx. This .bsx is the installation script that will be executed on the on-premise machine where the agent will reside; a .bsx is a self-extracting Linux bash script:
ICSConnectivityAgent_010
3. Make sure the cloud-connectivity-agent-installer.bsx file is executable (e.g., chmod +x cloud-connectivity-agent-installer.bsx) and execute the shell script.  NOTE: It is important to specify the SSL port (443) as part of the host URL.  For example:
./cloud-connectivity-agent-installer.bsx -h=https://<ICS_HOST>:443 -u=[username] -p=[password] -ad=AGENT_GROUP
ICSConnectivityAgent_011
4. Return to the ICS console and the Agents configuration page.
ICSConnectivityAgent_012
5. Review the Agent Group.
ICSConnectivityAgent_013
ICSConnectivityAgent_014
6. Click on Monitoring and select the Agents icon on the left-hand side.
ICSConnectivityAgent_0151
7. Review the Agent monitoring landing page.
ICSConnectivityAgent_016
8. Review the directory structure for the agent installation.
ICSConnectivityAgent_017
As you can see, this is a standard WLS installation.  The agent server is a single-server configuration where everything is targeted to the Admin server, listening on port 7001.  Simply use the scripts in the ${agent_domain}/bin directory to start and stop the server.
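One detail from step 3 above worth automating is the NOTE about spelling out the SSL port in the -h argument. A small pre-flight check of the host URL before invoking the installer might look like the sketch below (illustrative only; not part of the product):

```python
from urllib.parse import urlparse

def valid_ics_url(url: str) -> bool:
    """The installer expects the SSL port spelled out, e.g. https://host:443."""
    parsed = urlparse(url)
    return parsed.scheme == "https" and parsed.port == 443

print(valid_ics_url("https://myics.example.com:443"))  # True
print(valid_ics_url("https://myics.example.com"))      # False: port not specified
```

A wrapper script could run this check and refuse to launch the .bsx installer when the URL is missing the port.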

We are now ready to leverage the agent for things like the Database or EBS Cloud Adapter.

E-Business Suite Integration with Integration Cloud Service and DB Adapter


Introduction

Integration Cloud Service (ICS) is an Oracle Platform-as-a-Service (PaaS) offering for implementing message-driven integration scenarios. This article introduces the use of ICS for integrating an on-premise E-Business Suite (EBS) instance via the Database Adapter. While EBS in recent releases offers a broad set of integration features like SOAP and REST support (e.g., via the Integrated SOA Gateway), these interfaces are not available in older versions like 11.5.x. In the past it has been a proven approach to use Oracle Fusion Middleware integration products (SOA, OSB, etc.) running on-premise in a customer data center to connect to an EBS database via the DB Adapter. As we will discuss in this article, this feature will shortly also be available in a cloud-based integration solution.

Although we focus on EBS integration here, the DB Adapter in ICS works similarly against any other custom database. The main reason to use an EBS context is the business case shown below, where ICS is connected to Mobile Cloud Service (MCS) to provide a mobile device solution.

Business Case and Architecture

It is not hard to imagine that Oracle customers running EBS 11.5.x might want to add a mobile channel for their end users. One option could be an upgrade to a recent release of EBS. As this will in most cases be a bigger project, an alternative could be the creation of a custom mobile solution via Oracle JET and MCS, as shown below. MCS is a PaaS offering and requires access to an underlying database via REST/JSON. This is where ICS appears in this architecture.

01_Architecture

In the absence of native SOAP or REST capabilities in the EBS 11.5.x tech stack, integration via ICS closes that gap. Any database access activity (retrieving data, CRUD operations, etc.) can run via an ICS/DB Adapter connection to an EBS on-premise database. ICS itself provides a REST/JSON interface for the external interaction with EBS. This external interface is generic and not restricted to MCS as the caller. However, in our business case the ICS with DB Adapter fulfills the role of a data access layer for a mobile solution.

As shown in the architecture figure above the following components are involved in this end-to-end mobile solution:

  • The DB Adapter uses a local component named the ICS Agent, installed on-premise in the EBS data center. This agent communicates with the database via JCA and passes data between the DB Adapter in ICS and the database
  • Communication between the ICS Agent and the DB Adapter is set up via Oracle Messaging Service, tunneled through HTTPS
  • The DB Adapter provides a standard SQL interface for database access
  • Data mapping and transformation capabilities are part of the embedded features in ICS
  • The external REST endpoint in ICS is made public through the REST Adapter in ICS

The ICS configuration and communication in the architecture figure represent a generic approach. In this sample, the mobile solution for EBS 11.5.x makes use of the described data access capabilities as follows (the mobile components and JET are not in scope of this document, as we focus on the ICS part here):

  • MCS connects to ICS via a connector or generic REST interface
  • EBS data will be processed and cached in MCS
  • Mobile devices communicate with MCS via REST to render the EBS data for visualization and user interaction

In the remainder of this article we will focus purely on the ICS and DB Adapter integration and leave the mobile features out of scope. The technical details of the ICS and DB Adapter implementation itself won't be handled here either, as they will become the main content of another blog. Instead we will show how the implementation features can be used from an Application Integration Developer's perspective.

ICS Configuration Overview

At the beginning of an ICS-based integration there are some configuration activities to be done, such as the creation of connections. This is a one-time (or rather first-time) task to make ICS ready for the creation of integration flows. It is probably not an Application Developer's activity; in most cases a dedicated ICS Administrator will perform the following actions.

02_ICS_Connections

At least two connections must be set up for this EBS integration via database communication:

  • A Database Adapter pointing to the EBS database – the database connection parameters are used by the ICS Agent running in-house in the customer's data center
  • A REST Adapter to provide a REST interface for external communication

The screenshot below shows a sample configuration page for the DB Adapter connected to an EBS instance. The main parameters can be seen as a local connection from the ICS Agent to the database: hostname, port, SID.

On this configuration page, a local ICS Agent must also be assigned to this DB Adapter.

03_1_ICSDBAdapterEBiz

03_2_ICSDBAdapterEbiz

In most cases it will make sense to use the EBS database user APPS for this connection, as this credential provides the most universal and context-sensitive access to the EBS data model.

04_ICSDBAdapterEBizCredentials

The other connection to set up is a REST interface (referred to as ICS LocalRest in this article) used for inbound requests and outbound responses. As shown in the screenshot below, this is a quite straightforward task without extensive configuration in our case. Variances are possible – especially for Security Policies, Username, etc.:

  • Connection Type: REST API Base URL
  • Connection URL: https://<hostname>:<port>/ics
  • Security Policy: Basic Authentication
  • Username: <Weblogic_User>
  • Password: <password>
  • Confirm Password: <password>

05_ICSLocalRestAdapterConfig

After setting up the two connections, we are ready to create an integration between the EBS database and any other system connected via REST.

DB Adapter based Integration with EBS

During our activities we established some good practices that are worth sharing. In general we had good experience with a top-down approach for the creation of an integration flow, which looks as follows:

  • Identify the parameters in the REST call that become part of the JSON payload (the functionality of this integration point) for the external interface
  • Identify the EBS database objects involved (tables, views, packages, procedures, etc.)
  • Create a JSON sample message for inbound and another one for outbound
  • Design the data mapping between inbound/outbound parameters and the SQL statement or PLSQL call
  • Create an EBS DB integration endpoint; enter the SQL statement or the call to the PLSQL procedure/function dedicated to performing the database activity
  • Create a local REST integration endpoint to manage the external communication
  • Assign the previously created inbound and outbound sample JSON messages to the request and response actions
  • Create a message mapping from inbound parameters to SQL/PLSQL parameters
  • Do the same for the outbound parameters
  • Add a tracking activity, save the integration, and activate it for external usage

The DB Adapter is able to handle complex database types for a mapping to record and array structures in JSON. This means there are no obvious limitations to passing nested data structures to PLSQL packages via JSON.

Here is a sample. In PLSQL we define a data type as follows:

TYPE timeCard IS RECORD (
startTime VARCHAR2(20),
stopTime VARCHAR2(20),
tcComment VARCHAR2(100),
tcCategoryID VARCHAR2(40));
TYPE timeCardRec IS VARRAY(20) OF timeCard;

The parameter list of the procedure embeds this datatype in addition to plain type parameters:

procedure createTimecard(
userName   in varchar2,
tcRecord   in timeCardRec,
timecardID out NUMBER,
status     out varchar2,
message     out varchar2 );

The JSON sample payload for the IN parameters would look like this:

{
"EBSTimecardCreationCollection": {
   "EBSTimecardCreationInput": {
       "userName": "GEVANS",
       "timeEntries" : [
           {
             "startTime": "2015-08-17 07:30:00",
             "stopTime": "2015-08-17 16:00:00",
             "timecardComment": "Regular work",
             "timecardCategoryID": "31"
           },{
             "startTime": "2015-08-18 09:00:00",
             "stopTime": "2015-08-18 17:30:00",
             "timecardComment": "",
             "timecardCategoryID": "31"
           },{
             "startTime": "2015-08-19 08:00:00",
             "stopTime": "2015-08-19 16:00:00",
             "timecardComment": "Product Bugs Fixing",
             "timecardCategoryID": "31"
           },{
             "startTime": "2015-08-20 08:30:00",
             "stopTime": "2015-08-20 17:30:00",
             "timecardComment": "Customers Demo Preparation",
             "timecardCategoryID": "31"
           },{
             "startTime": "2015-08-21 09:00:00",
             "stopTime": "2015-08-21 17:00:00",
             "timecardComment": "Holiday taken",
             "timecardCategoryID": "33"
           }
           ] }
     }
}

The JSON sample below carries the output information from the PLSQL package back inside the response message:

{
   "EBSTimecardCreationOutput":
   {
       "timecardID": "6232",
       "status": "Success",
       "message": "Timecard with ID 6232 created for User GEVANS"
   }
}

As shown, we can use complex types in the EBS database and create a corresponding JSON structure that can be mapped 1:1 for request and response parameters.
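Before wiring up the mappings it can help to sanity-check that every JSON entry carries a value for each field of the PLSQL record. The sketch below is purely illustrative; the field-name mapping restates the timeCard record and the sample payload above:

```python
import json

# How the timeCard record fields surface in the JSON payload (from the samples above)
RECORD_TO_JSON = {
    "startTime": "startTime",
    "stopTime": "stopTime",
    "tcComment": "timecardComment",
    "tcCategoryID": "timecardCategoryID",
}

def entries_complete(payload: str) -> bool:
    """Return True if every time entry carries all mapped record fields."""
    doc = json.loads(payload)
    entries = (doc["EBSTimecardCreationCollection"]
                  ["EBSTimecardCreationInput"]["timeEntries"])
    required = set(RECORD_TO_JSON.values())
    return all(required <= set(entry) for entry in entries)

sample = json.dumps({
    "EBSTimecardCreationCollection": {
        "EBSTimecardCreationInput": {
            "userName": "GEVANS",
            "timeEntries": [{
                "startTime": "2015-08-17 07:30:00",
                "stopTime": "2015-08-17 16:00:00",
                "timecardComment": "Regular work",
                "timecardCategoryID": "31",
            }],
        }
    }
})
print(entries_complete(sample))  # True
```

A check like this can be run against sample payloads before they are uploaded to the wizard, catching misspelled keys early.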

Creating an EBS Integration

To start with the creation of an EBS integration, an Application Developer must log in to the assigned Integration Cloud Service instance with the username and password as provided.

06_Login_ICS

The entry screen after login shows the available activities:

  • Connections
  • Integrations
  • Dashboard

As an Application Developer we choose Integrations to create, modify, or activate integration flows. Connections handling has been shown earlier in this article, and Dashboard is usually the option to monitor runtime information.

07_MainScreenICS

To create a new integration flow, choose Create New Integration and Map My Data. This will create an empty integration where you have the opportunity to connect to adapters/endpoints and to create data mappings.

08_1_NewIntegration

Enter the following information:

  • Integration Name : Visible Integration name, can be changed
  • Identifier : Internal Identifier, not changeable once created
  • Version :  Version number to start with
  • Package Name (optional) : Enter name if integration belongs to a package
  • Description (optional) : Additional explanatory information about integration

08_2_NewIntegration_Capabilities

The screenshot below shows an integration that is 100% complete and ready for activation. When creating a new integration, both the source and target sides will be empty. The suggestion is to start by creating a source, as marked on the left side of the figure below.

09_LocalRestAdapterIntegrationConfig

As mentioned before, it is good practice to follow a top-down approach. In this case the payload for the REST service is defined and exists in the form of a JSON sample.

The following information will be requested when running the Oracle REST Endpoint configuration wizard:

  • Name of the endpoint (what do you want to call your endpoint?)
  • Description of this endpoint
  • Relative path of this endpoint like /employee/timecard/create in our sample
  • Action for this endpoint like GET, POST, PUT, DELETE
  • Options to be configured like
    • Add and review parameters for this endpoint
    • Configuration of a request payload
    • Configure this endpoint to receive the response

The sample screenshot below shows a configuration where a POST operation will be handled by this REST endpoint, including the request and response.

10_LocalRestAdapterIntegrationConfig

The next dialog window configures the request parameters, and the JSON sample is taken as the payload file. The payload content will appear later in the mapping dialog as the input structure.

11_LocalRestAdapterIntegrationRequestParam

The response payload is configured similarly to the request payload. As mentioned, the input/output parameters are supposed to be defined in a top-down approach for this endpoint. In the response payload dialog we assign the sample JSON payload structure as defined for the output payload of this REST service.

12_LocalRestAdapterIntegrationResponseParam

Finally the summary dialog window appears, and we can confirm and close this configuration wizard.

13_LocalRestAdapterIntegrationSummary

The next action is a similar configuration for the target – in our sample the DB Adapter connected to the EBS database.

14_EBSDbAdapterPackageConfig

The DB Adapter configuration wizard starts with a Basic Information page, where the name of this endpoint is requested and the general decision has to be made whether the service will use a SQL statement or make a PLSQL procedure/function call.

As shown in the screenshot below, the further dialog for a PLSQL-based database access basically starts by choosing the schema, package, and procedure/function to be used. For EBS databases the schema name for PLSQL packages and procedures is usually APPS.

15_EBSDbAdapterPackageConfig

After making this choice the configuration is done. Any in/out parameters and return values of a specific function become part of the request/response payload and appear in the message mapping dialog later.

16_EBSDbAdapterPackageConfig

In case the endpoint will run a plain SQL statement, just choose Run a SQL Statement in the Basic Information dialog window.

A different dialog window will appear, which allows entering a SQL statement that might be a query or even a DML operation. Parameters must be passed in JCA notation with a preceding hash mark (#).
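For example, a query over the standard FND_USER table with one bind parameter would look like the hypothetical statement below; the small helper just illustrates how the #-prefixed JCA parameters can be listed from such a statement (statement and parameter name are made up for illustration):

```python
import re

# Hypothetical SQL in the wizard's JCA notation: bind parameters carry a '#' prefix
SQL = "SELECT user_name, description FROM fnd_user WHERE user_name = #userName"

def jca_parameters(sql: str) -> list:
    """Return the #-prefixed bind parameter names from a JCA-style SQL statement."""
    return re.findall(r"#(\w+)", sql)

print(jca_parameters(SQL))  # ['userName']
```

Each parameter found this way surfaces later as a mappable element in the generated schema.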

17_EBSDbAdapterSQLValidation

After entering the SQL statement, it must be validated by clicking the Validate SQL Query button. As long as any validation error messages appear, those must be corrected in order to finalize this configuration step. Once the statement has been successfully validated, a schema file will be generated.

18_EBSDbAdapterSQLSummary

Clicking on the schema file URL opens a dialog window that shows the generated structure, as shown below. The elements of this structure have to be mapped in the transformation component later, once the endpoint configuration is finished.

19_EBSDbAdapterSQLXSDGenerated

The newly created integration contains two transformations after the endpoint configuration has been finished – one for request/inbound and another for response/outbound mappings.

20_MessageMapping

The mapping component itself follows the same principles as the comparable XSLT mapping tools in Fusion Middleware's integration products. As shown in the screenshot below, the mapped fields are marked with a green check mark. The sample shows an input structure with a single field (here: userName) and a collection of records.

21_MessageMappingInput

The sample below shows the outbound message mapping. In the corresponding PLSQL procedure three parameters are marked as type OUT and carry the return information in the JSON output message.

22_MessageMappingOutParams

Once finished with the message mappings, the final step for integration flow completion is the addition of at least one piece of tracking information (see the link at the top of the page). This means one field in the message payload has to be identified for monitoring purposes. The completion level will change to 100% afterwards. The integration must be saved, and the Application Developer can return to the integration overview page.

23_IntegrationOverview

The last step is the activation of an integration flow – supposed to be a straightforward task. Once the completion level of 100% has been reached, the integration flow is ready to be activated.

24_Activate_Timecard

After clicking the Activate button, a confirmation dialog appears asking whether this flow should be traced or not.

25_Activate_Timecard

Once activated, the REST endpoint for this integration is enabled and ready for invocation.

26_IntegrationsOverview

Entering the following URL in a browser window will test the REST interface and return its metadata:

  • https://<hostname>:<port>/integration/flowapi/rest/<Integration Identifier>/v<version>/metadata

Testing the REST integration workflow requires a tool like SoapUI to post a JSON message to the REST service. In this case the URL above changes by adding the integration access path as configured in the REST connection wizard:

  • https://<hostname>:<port>/integration/flowapi/rest/<Integration Identifier>/v<version>/employee/timecard/create
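Instead of SoapUI, a short script can also drive the test. The sketch below only assembles the POST request with the standard library; hostname, credentials, and the integration identifier/version are placeholders you would replace with your own values:

```python
import base64
import json
import urllib.request

def build_request(host, user, password, payload):
    """Assemble (but do not send) the POST for the timecard integration endpoint."""
    url = (f"https://{host}/integration/flowapi/rest/"
           "EBSTIMECARD/v01/employee/timecard/create")   # placeholder id/version
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Basic {token}"},
        method="POST",
    )

req = build_request("ics.example.com", "weblogic", "welcome1",
                    {"EBSTimecardCreationCollection": {}})
print(req.get_method(), req.full_url)
# Sending it would then be: urllib.request.urlopen(req)
```

Basic Authentication matches the security policy configured for the ICS LocalRest connection earlier in this article.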

Security Considerations

Earlier in this document we discussed the creation of a DB endpoint in EBS and authentication as the APPS user. In general it is possible to use other DB users instead. The use of a higher-privileged user like SYSTEM is not required and also not recommended, due to the impact if this connection were compromised.

There are multiple factors having an influence on the security setup tasks to be done:

  • What are the security requirements in terms of accessed data via this connection?
    • Gathering of non-sensitive information vs running business-critical transactions
    • Common data access like reading various EBS configuration information vs user specific and classified data
  • Does this connection have to provide access to all EBS objects in database (packages, views across all modules) or can it be restricted to a minimum of objects being accessed?
  • Is the session running in a specific user context or is it sufficient to load data as a feeder user into interface tables?

Depending on the identified integration purpose above, the security requirements might range from moderate to extremely high. To restrict user access as much as possible, it would be feasible to create a user with limited access to only a few objects, like APPLSYSPUB. Access to PLSQL packages would be granted on demand.

If database access is required to run in a specific context, the existing EBS DB features to put a session into a dedicated user or business org context via FND_GLOBAL.APPS_INITIALIZE or MO_GLOBAL.INIT (R12 onward) must be used. That will probably have an impact on the choice between running a plain SQL statement and a PLSQL procedure: with the requirement to perform a preceding call of FND_GLOBAL, even a SELECT statement has to run inside a procedure, and the result values must be declared as OUT parameters as shown previously.

In general, performing user authentication is outside the scope of this (EBS) database adapter. In practice, the layer on top of ICS must ensure that no unsolicited user access is granted. While connection encryption via SSL is supposed to be the standard, there could obviously be a need to create full logical session management for end-user access, including user identification, authentication, and session expiration.

Such a deep-dive security discussion is out of scope for this blog and should be handled in another article.

For non-EBS databases similar considerations will obviously apply.

Conclusion

This blog post was dedicated to giving an overview of the quite new DB Adapter in ICS. While recent EBS releases benefit from integration via the EBS Adapter or built-in tools, the older versions probably won't. Using the DB Adapter will possibly be the preferred method to create cloud-based access to a legacy on-premise EBS database.

More blog posts are planned to give a deeper technical look behind the scenes of ICS and the DB Adapter. So it might be worth staying tuned.

Implementing an SFDC Upsert Operation in ICS


Introduction

While designing SOA services, especially those that represent operations around a business object, a common implementation pattern is upsert. Upsert is a portmanteau of "update" and "insert". The idea behind it is having a single operation that decides which action to take – either update the existing record or insert a new one – based on information available in the message. Having one operation instead of two makes the SOA service interface definition clearer and simpler.

Some SaaS applications offer upsert capabilities in their exposed services, and leveraging these capabilities can considerably decrease the amount of effort required when designing SOA services in an integration platform such as ICS. For instance, if you need to develop an upsert operation and the SaaS application does not have this functionality, you will have to implement that logic using some sort of conditional routing (see Content-Based Router in ICS) or via separate update and insert operations.

ics_cbr_sample

Figure 1: Implementing upsert using CBR in ICS.

Salesforce.com (or SFDC for short) is one of those SaaS applications that offers built-in support for the upsert operation. This post will show how to leverage this support with ICS.

Setting up External Identifiers in SFDC

Every business object in SFDC can have custom fields. This allows business objects from SFDC to be customized to meet specific customer requirements regarding data models. As part of this feature, SFDC allows any custom field to act as a record identifier for systems outside of SFDC. These systems can identify any record through this custom field instead of using the SFDC internal primary key, which for security reasons is unknown. Therefore, if you need to perform transactions against business objects in SFDC from ICS, you need to make sure that the business object carries a custom field with the External ID attribute set. This is a requirement if you want to make the upsert operation work in SFDC.
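For comparison, SFDC's own REST API expresses upsert as a PATCH addressed by the external ID field, so the same custom field does double duty there. The sketch below only builds such a URL (instance, API version, and ID value are placeholders; ICS performs the equivalent call through the SOAP Enterprise API for you):

```python
def upsert_url(instance, sobject, ext_field, ext_value, api_version="v35.0"):
    """Build the SFDC REST upsert URL; PATCH it with a JSON body of field values."""
    return (f"https://{instance}/services/data/{api_version}"
            f"/sobjects/{sobject}/{ext_field}/{ext_value}")

# Custom field API names carry a '__c' suffix: ICS_Ext_Field becomes ICS_Ext_Field__c
print(upsert_url("na1.salesforce.com", "Contact", "ICS_Ext_Field__c", "12345"))
```

If a record with that external ID value exists it is updated; otherwise a new one is created – exactly the decision logic the upsert pattern encapsulates.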

In order to create a custom field with the External ID attribute, access your SFDC account and click the Setup link in the upper right corner of the screen. Once there, navigate to the left-side menu and look for the Build section, which is below the Administer section. Within that section, expand the Customize option, and SFDC will list all the business objects that can be customized. Locate the business object that you want to perform the upsert operation on. This blog will use the Contact business object as an example.

Go ahead and expand the business object. From the options listed, choose Fields. That will bring you to the page that allows field personalization for the selected business object. Navigate to the bottom of this page to reach the section in which you can create custom fields, as shown in figure 2.

creating_custom_field_in_sfdc_1

Figure 2: Creating custom fields for the contact business object in SFDC.

To create a new custom field, click the New button. This will invoke the custom field creation wizard. The first step of the wizard will ask which field type you want to use. In this example we are going to use Text. Figure 3 shows the wizard’s step one. After setting the field type, click Next.

creating_custom_field_in_sfdc_2

Figure 3: Creating a custom field in SFDC, step one.

The second step is entering the field details. In this step you will need to define the field label, name, length, and any special attributes. Set the field name to “ICS_Ext_Field”. The most important attribute is External ID; make sure that this option is selected. Also select Required and Unique, since this is a record identifier. Figure 4 shows the wizard’s step two. Click Next twice and then save the changes.

creating_custom_field_in_sfdc_3

Figure 4: Creating a custom field in SFDC, step two.

After the custom field creation, the next step is generating the SFDC Enterprise WSDL. This is the WSDL that must be used in ICS to connect to SFDC. The generated WSDL will include the information about the new custom field and ICS will be able to rely on that information to perform the upsert operation.

Creating a REST-Enabled Upsert Integration

In this section, we are going to develop an ICS REST-enabled source endpoint that will perform insertion and updates on the target contact business object, leveraging the upsert operation available in SFDC. Make sure to have two connections configured in ICS; one for the integration source which is REST-based and another for the integration target, which should be SFDC-based. You must have an SFDC account to properly set the connection up in ICS.

Create a new integration and select the Map My Data pattern. From the connections palette, drag the REST-based connection onto the source icon. This will bring up the new REST endpoint wizard. Fill in the fields according to what is shown in figure 5 and click Next.

source_wizard_1

Figure 5: New REST endpoint wizard, step one.

Step two of the wizard will ask for the request payload file. Choose JSON Sample and upload a JSON file that contains the following payload:

request_payload_sample

Figure 6: Sample JSON payload for the request.

Click next. Step three of the wizard will ask for the response payload file. Again, choose JSON Sample and upload a JSON file that contains the following payload:

response_payload_sample

Figure 7: Sample JSON payload for the response.
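In case the sample figures above do not render, illustrative payloads might look like the following. The field names here are assumptions for this walkthrough, not necessarily those in the original samples; note that the response carries a result field, which the response mapping later populates from SFDC’s success flag. Request:

```json
{ "id" : "12345", "firstName" : "John", "lastName" : "Doe", "email" : "john.doe@example.com" }
```

Response:

```json
{ "result" : "true" }
```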

Click next. The wizard will summarize the options chosen and display them for confirmation. Click the Done button to finish the wizard.

source_wizard_4

Figure 8: New REST endpoint wizard, final step.

Moving further, from the connections palette, drag the SFDC-based connection onto the target icon. That will bring up the new SFDC endpoint wizard. Fill in the fields according to what is shown in figure 9 and click Next.

target_wizard_1

Figure 9: New Salesforce endpoint wizard, step one.

Step two of the wizard will ask which operation must be performed in SFDC. You need to choose the upsert operation. To accomplish that, first select the option Core in the operation type field and then select the upsert option in the list of operations field. Finally, select the business object on which you would like to perform upserts, as shown in figure 10.

target_wizard_2

Figure 10: New Salesforce endpoint wizard, step two.

Click Next twice; the wizard will summarize the options chosen and display them for confirmation. Click the Done button to finish the wizard.

target_wizard_4

Figure 11: New Salesforce endpoint wizard, final step.

Save all the changes made so far in the integration. With the source and target properly configured, we can now start the mapping phase, in which we will configure how the integration will handle the request and response payloads. Figure 12 shows what we have done so far.

integration_before_mapping

Figure 12: Integration before the mapping implementation.

Create a new request mapping in the integration. This will bring the mapping editor, in which you will perform the upsert implementation. Figure 13 shows how this mapping should be implemented.

request_mapping

Figure 13: Request mapping implementation.

Let’s understand the mapping implementation details. The first thing that needs to be done is to set the externalIDFieldName element to the name of the business-object field that will be used to identify the record. You must use a valid custom field that has the External ID attribute set; any other field will not work here. To set the value into the field, click the field link to open the expression editor.

setting_external_field_value

Figure 14: Setting the “externalIDFieldName” using the expression editor.

The best way to set the value is using the concat() XSLT function. Set the first parameter of the concat() function to the custom field name and the second parameter to an empty string.

Keep in mind that the field name in ICS can be different from what you set in SFDC. When the SFDC Enterprise WSDL is generated, a suffix is appended to custom field names to make them unique. In most cases this suffix is “__c”, but the safest way to confirm is to review the WSDL for the field.
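Putting the two previous points together, the expression set in the mapping resolves to the suffixed field name. Conceptually, the generated XSLT carries something along these lines (the namespace prefix and surrounding structure are illustrative assumptions):

```xml
<!-- Illustrative sketch: "tns" prefix and element nesting are assumptions -->
<tns:externalIDFieldName>
  <xsl:value-of select="concat('ICS_Ext_Field__c', '')"/>
</tns:externalIDFieldName>
```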

The next step is making sure that the custom field named in the externalIDFieldName element has a value set. This is necessary because that value is what SFDC uses to decide which action to take: if no existing record carries that value in the external ID field, SFDC will create a new record for that business object; otherwise, SFDC will locate the matching record and update it with the data set in the other fields. In this example, we populate the custom field with the identifier value from the request payload, as shown in figure 13. Map the remaining fields accordingly. Once you finish the mapping, save the changes and click the Exit Mapper button to come back to the integration.

Now create a new response mapping in the integration. This will bring the mapping editor, in which you will perform the mapping implementation for the response. Figure 15 shows how this mapping should be implemented.

response_mapping

Figure 15: Response mapping implementation.

Simply map the success field from the source to the result field on the target. According to the SFDC documentation, the success field is set to true if the operation is successfully performed on the record, and it is set to false if any issues happen during the operation. Once you finish the mapping, save the changes and click the Exit Mapper button to come back to the integration. Figure 16 shows the integration after the mapping.

integration_after_mapping

Figure 16: Integration after the mapping implementation.

Finish the integration implementation by setting the tracking information and optionally mapping any faults from the SFDC connection. Save all the changes, then go ahead and activate the integration in ICS. Once activated, you should be able to get the endpoint information for the REST endpoint exposed by ICS. Just access the integrations page and click on the exclamation link situated in the upper right corner of the integration entry.

checking_endpoint_details

Figure 17: Getting the information from REST endpoint exposed by ICS.

Before testing the endpoint, keep in mind that the URL of the REST endpoint does not contain the “metadata” suffix present at the end of the URL shown in figure 17. Remove that suffix before using the URL to avoid any HTTP 403 errors.
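As a small sketch of that adjustment, the host and integration path below are placeholders (not values from this blog); the point is stripping the trailing "/metadata" segment before invoking the endpoint:

```shell
# Hypothetical metadata URL copied from the ICS integration details page
METADATA_URL="https://myics.example.com/integration/flowapi/rest/UPSERT_CONTACT/v01/metadata"

# Strip the trailing "/metadata" segment to get the invocable endpoint
ENDPOINT_URL="${METADATA_URL%/metadata}"
echo "$ENDPOINT_URL"

# The endpoint could then be invoked with, for example:
# curl -u user:password -H "Content-Type: application/json" \
#      -d '{"id":"12345"}' "$ENDPOINT_URL"
```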

Conclusion

The upsert operation is a very handy way to handle insert and update operations within a single API, and it is a feature present in many SaaS applications that expose services for external consumption. SFDC is one of those applications. This blog showed how to leverage the upsert support found in SFDC and the steps required to invoke the upsert operation from ICS using the externalIDFieldName element.


Integration Cloud Service – Promote Integrations from Test to Production (T2P)


The purpose of this blog is to provide simple steps to move Oracle Integration Cloud Service (ICS) integrations between different ICS environments. Oracle ICS provides export and import utilities to achieve integration promotion.

A typical use-case is to promote tested integrations from Test ICS Environment to Production ICS Environment, in preparation for a project go-live. Usually the Connection endpoints used by the integrations will be different on Test and Production Environments.

The main steps involved in code promotion for this typical use-case are as follows:

  • Export an integration from Test ICS
  • Import the integration archive on Prod ICS
  • Update Connection details and activate the integration on Prod ICS Environment

Export an integration from Test ICS

  • Login to Test ICS
  • Search and locate the integration on Test ICS
  • Select ‘Export’ and save the integration archive to the file system

Step2-BrowseAndExport-Integration-TestICS

 

The integration is saved with a “.iar” extension.

Step3-Save-IAR_new

During export, basic information about the connections, like identifier and connection type, is persisted.

 

Import the integration archive on Prod ICS Environment

  • Login to Prod ICS
  • Navigate to ‘Integrations’
  • Select ‘Import Integration’ and choose the integration archive file that was saved in the previous step

Step4-Import-Saved-IAR-ProdICS

 

Since connection properties and security credentials are not part of the archive, the imported integration is typically not ready for activation.
An attempt to activate it will error out, and the error message indicates the connection(s) with missing information.

Step6-Incomplete-Connections-Warning

Note that if the connections used by the archive are already present and complete in Prod ICS, then the imported integration is ready for activation.

 

Update Connection details and activate the integration on Prod ICS Environment

After importing the archive, the user needs to update any incomplete connections before activating the flow.
Navigate to “Connections” and locate the connection to be updated

Step7-Find_incompleConn_Edit

 

Select ‘Edit’ and update the connection properties, security credentials, and other required fields as needed for the Prod ICS environment.
‘Test’ the connection and ensure that the connection status shows 100%.

Step8-Review-And-Complete-Conn-ICSProd

Note that the connection identifier and connection type were preserved during import and cannot be changed.

 

Once the connection is complete, the imported integration is ready to be activated and used on the Prod ICS environment.

Intgn-Ready

 

We have seen above the steps for promoting a completed integration for the T2P use-case.
Note that even incomplete integrations can be moved between ICS environments using the same steps outlined above. This can be useful during development to move integration code reliably between environments.

Also, multiple integrations can be moved between environments using the ‘package’ export and import. This requires the integrations to be organized within ICS packages.

Export-Import-Packages
Finally, Oracle ICS provides a rich REST API which can be used to automate code promotion between ICS environments.

 

Retaining Mappings in ICS when Endpoints Change


ICS (Integration Cloud Service) is a PaaS cloud offering from Oracle that provides capabilities for integrating applications both on-cloud and on-premise. With any integration, there is a requirement to version the integration flows/services, because endpoint schemas change from time to time as applications evolve to meet business needs. When the endpoints change, the integration flow also needs to change to make use of the new schema fields. ICS provides capabilities to version integration flows. This is a great way to add new versions of services/integration flows while keeping the old ones intact.

When a new version of Integration flow is created there is a need to retain the old mappings and create new mappings for new fields. ICS provides the functionality of retaining the mapping through the “Regenerate” endpoint functionality.

This blog demonstrates the functionality to regenerate endpoint.

Sample Integration Flow

Below is a simple ICS integration flow to demonstrate the regenerate endpoint functionality. The integration flow has a SOAP source endpoint and REST destination endpoint. The integration flow was created on version 16.1.5 of ICS.

Flow Overview

figure1

Mapping

The integration flow has a simple Request mapping as shown below.

figure2

figure3

WSDL

The source SOAP endpoint has 2 input fields input1 and input2 as shown below.

figure4

Regenerate Endpoint WSDL

Change WSDL

Assume that the SOAP endpoint schema changes and two more fields, input3 and input4, are added as shown below.

figure5
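The updated schema would then declare all four elements. A minimal sketch of what the WSDL’s embedded XSD might contain (element and type structure are assumptions; only the field names come from this post):

```xml
<!-- Illustrative sketch: the wrapper element name and types are assumptions -->
<xsd:element name="process">
  <xsd:complexType>
    <xsd:sequence>
      <xsd:element name="input1" type="xsd:string"/>
      <xsd:element name="input2" type="xsd:string"/>
      <xsd:element name="input3" type="xsd:string"/>
      <xsd:element name="input4" type="xsd:string"/>
    </xsd:sequence>
  </xsd:complexType>
</xsd:element>
```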

Import New WSDL

After the above fields are added, a new WSDL is created. Once the new WSDL is created, one may choose to clone the existing integration, or edit the existing integration if versioning is not required. After this, the connection needs to be updated with the new WSDL.

figure7

Updating the WSDL and trying to save the connection pops up a warning that the connection being modified is used by one or more integrations and that those integrations need to be reactivated for the changes to take effect. Click Yes.

figure8

Regenerate Endpoint

To regenerate the endpoint WSDL, go to the integration flow and click on the source endpoint (the endpoint whose schema was updated). To retain the mappings, click on the regenerate icon as shown below.

figure9

This will pop up a confirmation message asking whether you would like to proceed with the regeneration. Click Yes.

figure10

After confirming, a message stating that the regeneration was successful is shown, as below.

figure11

By clicking on the mappings, one can see that the old mappings have been retained and the new fields have been added.

figure12

Edit Endpoint Vs Regenerate Endpoint

Editing the endpoint instead of regenerating it does not retain the mappings, as shown below.

figure13

After clicking on the edit icon the following screens pop up.

figure14

figure15

As seen below, the color of the mapping activities changes from green to grey. This indicates that both request and response mappings are lost on edit.

figure17

Summary

The regeneration of endpoints is supported for all endpoint types. This blog demonstrated how to regenerate endpoints when the schema of an application changes. This functionality comes in handy when there are incremental schema changes: the regenerate endpoint functionality retains the existing mappings, thereby saving the effort of redoing all mappings on every schema change.

Uploading files to Oracle Document Cloud Service using SOA


This blog provides a quick tip for implementing file upload into Oracle Document Cloud Service (DOCS) using Java in Oracle SOA and Oracle SOA Cloud Service (SOACS).

The DOCS upload REST service requires POSTing a multipart form, a feature that is currently unavailable in the REST cloud adapter. The POST request to upload a file contains two body parts: the first is a JSON payload and the second contains the actual file content.

 

The request format looks as shown here in the Oracle Documents Cloud Service REST API Reference.

Content-Type: multipart/form-data; boundary=---1234567890
-----1234567890
Content-Disposition: form-data; name="parameters"
Content-Type: application/json
{
"parentID":"FB4CD874EF94CD2CC1B60B72T0000000000100000001"
}
-----1234567890
Content-Disposition: form-data; name="primaryFile"; filename="example.txt"
Content-Type: text/plain
 
<File Content>
-----1234567890--

 

The section below shows a Java embedded block of code that can be used within a BPEL process to achieve the file upload. This can be used in Oracle SOA and SOACS BPEL composites. A valid DOCS endpoint, credentials for authorization, and the GUID of the folder location for the file upload are required to execute this REST call.
In this sample, a PDF document is uploaded into DOCS. The media type should be changed appropriately for other content formats.
Also, it is recommended to access the authorization credentials from a credential store when developing for production deployments; this section is only intended as a demo.

 

// Create a Jersey client and point it at the DOCS file upload resource
com.sun.jersey.api.client.Client client = com.sun.jersey.api.client.Client.create();
com.sun.jersey.api.client.WebResource webResource = client.resource("https://xxxx-yyyy.documents.zome.oraclecloud.com/documents/api/1.1/files/data");

// Attach HTTP basic authentication credentials to every request
com.sun.jersey.api.client.filter.HTTPBasicAuthFilter basicAuth = new com.sun.jersey.api.client.filter.HTTPBasicAuthFilter("username", "password");
client.addFilter(basicAuth);

// Build the two-part form: a JSON "parameters" part carrying the target folder
// GUID, and a "primaryFile" part carrying the file content
com.sun.jersey.multipart.FormDataMultiPart multiform = new com.sun.jersey.multipart.FormDataMultiPart();
String DocCSFolderID = "{'parentID' : 'F1B2DDE55E4606D2B4718FDE2C1A41A800FD957B38C9'}";
com.sun.jersey.multipart.FormDataBodyPart formPart = new com.sun.jersey.multipart.FormDataBodyPart("parameters", DocCSFolderID, javax.ws.rs.core.MediaType.APPLICATION_JSON_TYPE);
com.sun.jersey.multipart.file.FileDataBodyPart filePart = new com.sun.jersey.multipart.file.FileDataBodyPart("primaryFile", new java.io.File("C:\\temp\\SampleDoc.pdf"), javax.ws.rs.core.MediaType.APPLICATION_OCTET_STREAM_TYPE);
multiform.bodyPart(formPart);
multiform.bodyPart(filePart);

// POST the multipart form and capture the JSON response as a string
String response = webResource.type(javax.ws.rs.core.MediaType.MULTIPART_FORM_DATA_TYPE).accept(javax.ws.rs.core.MediaType.APPLICATION_JSON_TYPE).post(String.class, multiform);
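For quick testing outside SOA, the same two-part request can be issued with curl. The host, folder GUID, credentials, and file name below are placeholders, not values from a real DOCS instance; the ";type=" suffix sets each part's Content-Type, mirroring the Java sample:

```shell
# Placeholder endpoint and folder GUID (substitute your own values)
DOCS_URL="https://myhost.documents.oraclecloud.com/documents/api/1.1/files/data"
PARENT_ID="F1B2DDE55E4606D2B4718FDE2C1A41A800FD957B38C9"

# The command is echoed here for illustration; drop the echo to actually upload.
echo curl -u username:password \
  -F "parameters={\"parentID\":\"$PARENT_ID\"};type=application/json" \
  -F "primaryFile=@SampleDoc.pdf;type=application/octet-stream" \
  "$DOCS_URL"
```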

Note that the REST cloud adapter can be used to interact with DOCS for most of the REST API defined here. The above workaround is only needed for file upload and a few other operations that require multipart forms. The REST cloud adapter is being enhanced to add multipart/form-data support in the near future.
Once that is available, file upload into Oracle Document Cloud Service can also be achieved using the adapter within Oracle Integration Cloud Service (ICS), Oracle SOA, and SOACS.

 

Round Trip On-Premise Integration (Part 1) – ICS to EBS


One of the big challenges with adopting Cloud Services Architecture is how to integrate the on-premise applications when the applications are behind the firewall. A very common scenario that falls within this pattern is cloud integration with Oracle E-Business Suite (EBS). To address this cloud-to-ground pattern without complex firewall configurations, DMZs, etc., Oracle offers a feature with the Integration Cloud Service (ICS) called Connectivity Agent (additional details about the Agent can be found under New Agent Simplifies Cloud to On-premises Integration). Couple this feature with the EBS Cloud Adapter in ICS and now we have a viable option for doing ICS on-premise integration with EBS. The purpose of this A-Team blog is to detail the prerequisites for using the EBS Cloud Adapter and walk through a working ICS integration to EBS via the Connectivity Agent where ICS is calling EBS (EBS is the target application). The blog is also meant to be an additional resource for the Oracle documentation for Using Oracle E-Business Suite Adapter.

The technologies at work for this integration include ICS (inbound REST adapter, outbound EBS Cloud Adapter), Oracle Messaging Cloud Service (OMCS), the ICS Connectivity Agent (on-premise), and Oracle EBS R12. The integration is a synchronous (request/response) call to EBS where a new employee is created via the EBS HR_EMPLOYEE_API. The flow consists of a REST call to ICS with a JSON payload containing the employee details. These details are then transformed in ICS from JSON to XML for the EBS Cloud Adapter. The EBS adapter sends the request to the on-premise connectivity agent via OMCS. The agent then makes the call to EBS, and the results are passed back to ICS via OMCS. The EBS response is transformed to JSON and returned to the invoking client. The following is a high-level view of the integration:

ICSEBSCloudAdapter-Overview
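A request payload for such a flow might look like the following sketch. The field names are illustrative assumptions, not the blog's actual contract; the real contract is whatever the ICS REST endpoint defines, and the values ultimately map onto HR_EMPLOYEE_API parameters such as hire date, last name, and business group:

```json
{
  "CreateEmployee" : {
    "HireDate" : "2016-01-01",
    "LastName" : "Doe",
    "FirstName" : "John",
    "Sex" : "M",
    "BusinessGroupId" : "202"
  }
}
```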

Prerequisites

1. Oracle E-Business Suite 12.1.3* or higher.
2. EBS Configured for the EBS Cloud Adapter per the on-line document: Setting Up Oracle E-Business Suite Adapter from Integration Cloud Service.
a. ISG is configured for the EBS R12 Environment.
b. EBS REST services are configured in ISG.
c. Required REST services are deployed in EBS.
d. Required user privileges granted for the deployed REST services in EBS.
3. Install the on-premise Connectivity Agent (see Integration Cloud Service (ICS) On-Premise Agent Installation).

* For EBS 11 integrations, see another A-Team Blog E-Business Suite Integration with Integration Cloud Service and DB Adapter.

Create Connections

1. Inbound Endpoint Configuration.
a. Start the connection configuration by clicking on Create New Connection in the ICS console:
ICSEBSCloudAdapter-Connections_1-001
b. For this blog, we will be using the REST connection for the inbound endpoint. Locate and Select the REST Adapter in the Create Connection – Select Adapter dialog:
ICSEBSCloudAdapter-Connections_1-002
c. Provide a Connection Name in the New Connection – Information dialog:
ICSEBSCloudAdapter-Connections_1-003
d. The shell of the REST Connection has now been created. The first set of properties that needs to be configured is the Connection Properties. Click on the Configure Connectivity button and select REST API Base URL for the Connection Type. For the Connection URL, provide the ICS POD host since this is an incoming connection for the POD. A simple way to get the URL is to copy it from the browser location of the ICS console being used to configure the connection:
ICSEBSCloudAdapter-Connections_1-004
e. The last set of properties that need to be configured are the Credentials. Click on the Configure Credentials button and select Basic Authentication for the Security Policy. The Username and Password for the basic authentication will be a user configured on the ICS POD:
ICSEBSCloudAdapter-Connections_1-005
f. Now that we have all the properties configured, we can test the connection. This is done by clicking on the Test icon at the top of the window. If everything is configured correctly, the message “The connection test was successful!” is displayed:
ICSEBSCloudAdapter-Connections_1-006
2. EBS Endpoint Connection
a. Create another connection, but this time select Oracle E-Business Suite from the Create Connection – Select Adapter dialog:
ICSEBSCloudAdapter-Connections_2-001
b. Provide a Connection Name in the New Connection – Information dialog:
ICSEBSCloudAdapter-Connections_2-002
c. Click on the Configure Connectivity button; for the EBS Cloud Adapter there is only one property, the Connection URL. This URL will be the hostname and port where the EBS metadata has been deployed for EBS. This metadata is provided by Oracle’s E-Business Suite Integrated SOA Gateway (ISG), and the setup/configuration of ISG can be found under the Prerequisites for this blog (item #2). The best way to see if the metadata provider has been deployed is to access the WADL using a URL like the following: http://ebs.example.com:8000/webservices/rest/provider?WADL where ebs.example.com is the hostname of your EBS metadata provider machine. The URL should provide something like the following:
<?xml version = '1.0' encoding = 'UTF-8'?>
<application name="EbsMetadataProvider" targetNamespace="http://xmlns.oracle.com/apps/fnd/soaprovider/pojo/ebsmetadataprovider/" xmlns:tns="http://xmlns.oracle.com/apps/fnd/soaprovider/pojo/ebsmetadataprovider/" xmlns="http://wadl.dev.java.net/2009/02" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:tns1="http://xmlns.oracle.com/apps/fnd/rest/provider/getinterfaces/" xmlns:tns2="http://xmlns.oracle.com/apps/fnd/rest/provider/getmethods/" xmlns:tns3="http://xmlns.oracle.com/apps/fnd/rest/provider/getproductfamilies/" xmlns:tns4="http://xmlns.oracle.com/apps/fnd/rest/provider/isactive/">
   <grammars>
      <include href="http://ebs.example.com:8000/webservices/rest/provider/?XSD=getinterfaces_post.xsd" xmlns="http://www.w3.org/2001/XMLSchema"/>
      <include href="http://ebs.example.com:8000/webservices/rest/provider/?XSD=getmethods_post.xsd" xmlns="http://www.w3.org/2001/XMLSchema"/>
      <include href="http://ebs.example.com:8000/webservices/rest/provider/?XSD=getproductfamilies_post.xsd" xmlns="http://www.w3.org/2001/XMLSchema"/>
      <include href="http://ebs.example.com:8000/webservices/rest/provider/?XSD=isactive_post.xsd" xmlns="http://www.w3.org/2001/XMLSchema"/>
   <include href="http://ebs.example.com:8000/webservices/rest/provider/?XSD=getinterfaces_get.xsd" xmlns="http://www.w3.org/2001/XMLSchema"/>
      <include href="http://ebs.example.com:8000/webservices/rest/provider/?XSD=getmethods_get.xsd" xmlns="http://www.w3.org/2001/XMLSchema"/>
      <include href="http://ebs.example.com:8000/webservices/rest/provider/?XSD=getproductfamilies_get.xsd" xmlns="http://www.w3.org/2001/XMLSchema"/>
      <include href="http://ebs.example.com:8000/webservices/rest/provider/?XSD=isactive_get.xsd" xmlns="http://www.w3.org/2001/XMLSchema"/>
   </grammars>
   <resources base="http://ebs.example.com:8000/webservices/rest/provider/">
      <resource path="getInterfaces/{product}/">
         <param name="product" style="template" required="true" type="xsd:string"/>
         <method id="getInterfaces" name="GET">
            <request>
               <param name="ctx_responsibility" type="xsd:string" style="query" required="false"/>
               <param name="ctx_respapplication" type="xsd:string" style="query" required="false"/>
               <param name="ctx_securitygroup" type="xsd:string" style="query" required="false"/>
               <param name="ctx_nlslanguage" type="xsd:string" style="query" required="false"/>
               <param name="ctx_language" type="xsd:string" style="query" required="false"/>
               <param name="ctx_orgid" type="xsd:int" style="query" required="false"/>
               <param name="scopeFilter" type="xsd:string" style="query" required="true"/>
               <param name="classFilter" type="xsd:string" style="query" required="true"/>
            </request>
            <response>
               <representation mediaType="application/xml" type="tns1:getInterfaces_Output"/>
               <representation mediaType="application/json" type="tns1:getInterfaces_Output"/>
            </response>
         </method>
      </resource>
      <resource path="getInterfaces/">
         <method id="getInterfaces" name="POST">
            <request>
               <representation mediaType="application/xml" type="tns1:getInterfaces_Input"/>
               <representation mediaType="application/json" type="tns1:getInterfaces_Input"/>
            </request>
            <response>
               <representation mediaType="application/xml" type="tns1:getInterfaces_Output"/>
               <representation mediaType="application/json" type="tns1:getInterfaces_Output"/>
            </response>
         </method>
      </resource>
      <resource path="getMethods/{api}/">
         <param name="api" style="template" required="true" type="xsd:string"/>
         <method id="getMethods" name="GET">
            <request>
               <param name="ctx_responsibility" type="xsd:string" style="query" required="false"/>
               <param name="ctx_respapplication" type="xsd:string" style="query" required="false"/>
               <param name="ctx_securitygroup" type="xsd:string" style="query" required="false"/>
               <param name="ctx_nlslanguage" type="xsd:string" style="query" required="false"/>
               <param name="ctx_language" type="xsd:string" style="query" required="false"/>
               <param name="ctx_orgid" type="xsd:int" style="query" required="false"/>
               <param name="scopeFilter" type="xsd:string" style="query" required="true"/>
               <param name="classFilter" type="xsd:string" style="query" required="true"/>
            </request>
            <response>
               <representation mediaType="application/xml" type="tns2:getMethods_Output"/>
               <representation mediaType="application/json" type="tns2:getMethods_Output"/>
            </response>
         </method>
      </resource>
      <resource path="getMethods/">
         <method id="getMethods" name="POST">
            <request>
               <representation mediaType="application/xml" type="tns2:getMethods_Input"/>
               <representation mediaType="application/json" type="tns2:getMethods_Input"/>
            </request>
            <response>
               <representation mediaType="application/xml" type="tns2:getMethods_Output"/>
               <representation mediaType="application/json" type="tns2:getMethods_Output"/>
            </response>
         </method>
      </resource>
      <resource path="getProductFamilies/">
         <method id="getProductFamilies" name="GET">
            <request>
               <param name="ctx_responsibility" type="xsd:string" style="query" required="false"/>
               <param name="ctx_respapplication" type="xsd:string" style="query" required="false"/>
               <param name="ctx_securitygroup" type="xsd:string" style="query" required="false"/>
               <param name="ctx_nlslanguage" type="xsd:string" style="query" required="false"/>
               <param name="ctx_language" type="xsd:string" style="query" required="false"/>
               <param name="ctx_orgid" type="xsd:int" style="query" required="false"/>
               <param name="scopeFilter" type="xsd:string" style="query" required="true"/>
               <param name="classFilter" type="xsd:string" style="query" required="true"/>
            </request>
            <response>
               <representation mediaType="application/xml" type="tns3:getProductFamilies_Output"/>
               <representation mediaType="application/json" type="tns3:getProductFamilies_Output"/>
            </response>
         </method>
      </resource>
      <resource path="getProductFamilies/">
         <method id="getProductFamilies" name="POST">
            <request>
               <representation mediaType="application/xml" type="tns3:getProductFamilies_Input"/>
               <representation mediaType="application/json" type="tns3:getProductFamilies_Input"/>
            </request>
            <response>
               <representation mediaType="application/xml" type="tns3:getProductFamilies_Output"/>
               <representation mediaType="application/json" type="tns3:getProductFamilies_Output"/>
            </response>
         </method>
      </resource>
      <resource path="isActive/">
         <method id="isActive" name="GET">
            <request>
               <param name="ctx_responsibility" type="xsd:string" style="query" required="false"/>
               <param name="ctx_respapplication" type="xsd:string" style="query" required="false"/>
               <param name="ctx_securitygroup" type="xsd:string" style="query" required="false"/>
               <param name="ctx_nlslanguage" type="xsd:string" style="query" required="false"/>
               <param name="ctx_language" type="xsd:string" style="query" required="false"/>
               <param name="ctx_orgid" type="xsd:int" style="query" required="false"/>
            </request>
            <response>
               <representation mediaType="application/xml" type="tns4:isActive_Output"/>
               <representation mediaType="application/json" type="tns4:isActive_Output"/>
            </response>
         </method>
      </resource>
      <resource path="isActive/">
         <method id="isActive" name="POST">
            <request>
               <representation mediaType="application/xml" type="tns4:isActive_Input"/>
               <representation mediaType="application/json" type="tns4:isActive_Input"/>
            </request>
            <response>
               <representation mediaType="application/xml" type="tns4:isActive_Output"/>
               <representation mediaType="application/json" type="tns4:isActive_Output"/>
            </response>
         </method>
      </resource>
   </resources>
</application>

 

If you don’t get something like the above XML, here are some general troubleshooting steps:
1. Login to the EBS console and navigate to Integrated SOA Gateway > Integration Repository.
2. Click the "Search" button on the right.
3. Enter "oracle.apps.fnd.rep.ws.service.EbsMetadataProvider" in the "Internal Name" field and click "Go". (If this doesn't list anything, you are missing a patch on the EBS instance. Please follow the Note 1311068.1.)
4. Click on "Metadata Provider".
5. Click on the "REST Web Service" tab.
6. Enter "provider" as is in the "Service Alias" field and click the "Deploy" button.
7. Navigate to the "Grants" tab and give grants on all methods.
If the WADL shows that the metadata provider is deployed and ready, the Connection URL is simply the host name and port where the metadata provider is deployed. For example, http://ebs.example.com:8000
ICSEBSCloudAdapter-Connections_2-003
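With the alias deployed, you can also sanity-check it from the command line. The sketch below uses placeholder host and credentials, and assumes the ISG convention of serving a deployed service's WADL under /webservices/rest/&lt;alias&gt;/. The command is echoed as a dry run; remove the echo to execute it.

```shell
# Placeholder values -- substitute your EBS host, port and user.
EBS_HOST="ebs.example.com:8000"
EBS_USER="sysadmin"
EBS_PASS="sysadmin"

# "provider" is the Service Alias entered when deploying the Metadata Provider.
WADL_URL="http://${EBS_HOST}/webservices/rest/provider/?WADL"

# Dry run: print the command; remove 'echo' to fetch the WADL.
echo curl -u "${EBS_USER}:${EBS_PASS}" "${WADL_URL}"
```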
d. The next set of properties that need to be configured are the Credentials. Click on the Configure Credentials button and select Basic Authentication for the Security Policy. The Username and Password for the basic authentication will be those of a user configured on the on-premise EBS environment who has been granted privileges to access the EBS REST services:
ICSEBSCloudAdapter-Connections_2-004
NOTE: The Property Value for Username in the screen shot above shows the EBS sysadmin user. This will most likely “not” be the user that has grants on the EBS REST service. If you use the sysadmin user here and your integration (created later) “fails at runtime” with a “Responsibility is not assigned to user” error from EBS, either the grants on the EBS REST service are not created or a different EBS user needs to be specified for this connection. Here is an example error you might get:
<ISGServiceFault>
    <Code>ISG_USER_RESP_MISMATCH</Code>
    <Message>Responsibility is not assigned to user</Message>
    <Resolution>Please assign the responsibility to the user.</Resolution>
    <ServiceDetails>
        <ServiceName>HREmployeeAPISrvc</ServiceName>
        <OperationName>CREATE_EMPLOYEE</OperationName>
        <InstanceId>0</InstanceId>
    </ServiceDetails>
</ISGServiceFault>
e. Finally, we need to associate this connection with the on-premise Connectivity Agent that was configured as a Prerequisite. To do this, click on the Configure Agents button and select the agent group that contains the running on-premise Connectivity Agent:
ICSEBSCloudAdapter-Connections_2-005
f. Now that we have all the properties configured, we can test the connection. This is done by clicking on the Test icon at the top of the window. If everything is configured correctly, you will see the message The connection test was successful!:
ICSEBSCloudAdapter-Connections_2-006
3. We are now ready to construct our cloud-to-ground integration using ICS and the connections that were just created.

Create Integration

1. Create New Integration.
a. Navigate to the Integrations page of the Designer section. Then click on Create New Integration:
ICSEBSCloudAdapter-CreateIntegration_1-001
b. In the Create Integration – Select a Pattern dialog, locate the Map My Data pattern and select it:
ICSEBSCloudAdapter-CreateIntegration_1-002
c. Give the new integration a name and click on Create:
ICSEBSCloudAdapter-CreateIntegration_1-003
2. Configure Inbound Endpoint.
a. The first thing we will do is to create our inbound endpoint (the entry point to the ICS integration). In the Integration page that opened from the previous step, locate the Connections section and find the REST connection configured earlier. Drag-and-drop that connection onto the inbound (left-hand side) of the integration labeled "Drag and Drop a Trigger":
ICSEBSCloudAdapter-CreateIntegration_2-001
b. Since the focus of this blog is on the EBS Adapter, we will not go into the details of setting up this endpoint. The important detail for this integration is that the REST service will define both the request and the response in JSON format:

Example Request:

{
  "CREATE_EMPLOYEE_Input": {
    "RESTHeader": {
      "Responsibility": "US_SHRMS_MANAGER",
      "RespApplication": "PER",
      "SecurityGroup": "STANDARD",
      "NLSLanguage": "AMERICAN",
      "Org_Id": "204"
    },
    "InputParameters": {
      "HireDate": "2016-01-01T09:00:00",
      "BusinessGroupID": "202",
      "LastName": "Sled",
      "Sex": "M",
      "Comments": "Create From ICS Integration",
      "DateOfBirth": "1991-07-03T09:00:00",
      "EMailAddress": "bob.sled@example.com",
      "FirstName": "Robert",
      "Nickname": "Bob",
      "MaritalStatus": "S",
      "MiddleName": "Rocket",
      "Nationality": "AM",
      "SocialSSN": "555-44-3333",
      "RegisteredDisabled": "N",
      "CountryOfBirth": "US",
      "RegionOfBirth": "Montana",
      "TownOfBirth": "Missoula"
    }
  }
}

Example Response:

{
  "CreateEmployeeResponse": {
    "EmployeeNumber": 2402,
    "PersonID": 32871,
    "AssignmentID": 34095,
    "ObjectVersionNumber": 2,
    "AsgObjectVersionNumber": 1,
    "EffectiveStartDate": "2016-01-01T00:00:00.000-05:00",
    "EffectiveEndDate": "4712-12-31T00:00:00.000-05:00",
    "FullName": "Sled, Robert Rocket (Bob)",
    "CommentID": 1304,
    "AssignmentSequence": null,
    "AssignmentNumber": 2402,
    "NameCombinationWarning": 0,
    "AssignPayrollWarning": 0,
    "OrigHireWarning": 0
  }
}
ICSEBSCloudAdapter-CreateIntegration_2-002
3. Configure Outbound Endpoint.
a. Now we will configure the endpoint to EBS. In the Integration page, locate the Connections section and find the E-Business Suite adapter connection configured earlier. Drag-and-drop that connection onto the outbound (right-hand side) of the integration labeled "Drag and Drop an Invoke":
ICSEBSCloudAdapter-CreateIntegration_3-001
b. The Configure Oracle E-Business Suite Adapter Endpoint configuration window should now be open. Provide a meaningful name for the endpoint and press Next >. If the window hangs or errors out, check to make sure the connectivity agent is running and ready. This endpoint is dependent on the communication between ICS and EBS via the connectivity agent.
ICSEBSCloudAdapter-CreateIntegration_3-002
c. At this point, the adapter has populated the Web Services section of the wizard with Product Family and Product metadata from EBS. For this example, the Product Family will be Human Resources Suite and the Product will be Human Resources. Once those are selected, the window will be populated with API details.
ICSEBSCloudAdapter-CreateIntegration_3-003
d. Next to the API label is a text entry field where the list of APIs can be searched by typing values in that field. This demo uses the HR_EMPLOYEE_API, which can be found by typing Employee in the text field and selecting Employee from the list:
ICSEBSCloudAdapter-CreateIntegration_3-004
e. The next section of the configuration wizard is Operations. This will contain a list of all operations for the API, including operations that have not yet been deployed in the EBS Integration Repository. If you select an operation and see a warning message indicating that the operation has not been deployed, you must go to the EBS console, deploy that operation in the Integration Repository, and provide the appropriate grants.
ICSEBSCloudAdapter-CreateIntegration_3-005
f. This demo will use the CREATE_EMPLOYEE method of the HR_EMPLOYEE_API. Notice that there is no warning when this method is selected:
ICSEBSCloudAdapter-CreateIntegration_3-006
g. The Summary section of the configuration wizard shows all the details from the previous steps. Click on Done to complete the endpoint configuration.
ICSEBSCloudAdapter-CreateIntegration_3-007
h. Check point – the ICS integration should look something like the following:
ICSEBSCloudAdapter-CreateIntegration_3-008
4. Request/Response Mappings.
a. The mappings for this example are very straightforward in that the JSON was derived from the EBS input/output parameters, so the relationships are fairly intuitive. Also, the number of data elements has been minimized to simplify the mapping process. It is also a good idea to provide a Fault mapping:

Request Mapping:

ICSEBSCloudAdapter-CreateIntegration_4-001

Response Mapping:

ICSEBSCloudAdapter-CreateIntegration_4-002

Fault Mapping:

ICSEBSCloudAdapter-CreateIntegration_4-003
5. Set Tracking.
a. The final step to getting the ICS Integration to 100% is to Add Tracking. This is done by clicking on the Tracking icon at the top right-hand side of the Integration window.
ICSEBSCloudAdapter-CreateIntegration_5-001
b. In the Business Identifiers For Tracking window, drag-and-drop fields that will be used for tracking purposes. These fields show up in the ICS console in the Monitoring section for the integration.
ICSEBSCloudAdapter-CreateIntegration_5-002
c. There can be up to 3 fields used for the tracking, but only one is considered the Primary.
ICSEBSCloudAdapter-CreateIntegration_5-003
6. Save (100%).
a. Once the Tracking is configured, the integration should now be at 100% and ready for activation. This is a good time to Save all the work that has been done thus far.
ICSEBSCloudAdapter-CreateIntegration_6-001

Test Integration

1. Make sure the integration is activated, then open the endpoint URL, which can be located by clicking on the "i" (Information) icon.
ICSEBSCloudAdapter-Test-001
2. Review the details of this page since it contains everything needed for the REST client that will be used for testing the integration.
ICSEBSCloudAdapter-Test-002
3. Open a REST test client and provide all the necessary details from the endpoint URL. The important details from the page include:
Base URL: https://[ICS POD Host Name]/integration/flowapi/rest/HR_CREATE_EMPLOYEE/v01
REST Suffix: /hr/employee/create
URL For Test Client: https://[ICS POD Host Name]/integration/flowapi/rest/HR_CREATE_EMPLOYEE/v01/hr/employee/create
REST Method: POST
Content-Type: application/json
JSON Payload:
{
  "CREATE_EMPLOYEE_Input": {
    "RESTHeader": {
      "Responsibility": "US_SHRMS_MANAGER",
      "RespApplication": "PER",
      "SecurityGroup": "STANDARD",
      "NLSLanguage": "AMERICAN",
      "Org_Id": "204"
    },
    "InputParameters": {
      "HireDate": "2016-01-01T09:00:00",
      "BusinessGroupID": "202",
      "LastName": "Demo",
      "Sex": "M",
      "Comments": "Create From ICS Integration",
      "DateOfBirth": "1991-07-03T09:00:00",
      "EMailAddress": "joe.demo@example.com",
      "FirstName": "Joseph",
      "Nickname": "Demo",
      "MaritalStatus": "S",
      "MiddleName": "EBS",
      "Nationality": "AM",
      "SocialSSN": "444-33-2222",
      "RegisteredDisabled": "N",
      "CountryOfBirth": "US",
      "RegionOfBirth": "Montana",
      "TownOfBirth": "Missoula"
    }
  }
}
The last piece that is needed for the REST test client is authentication information. Add Basic Authentication to the header with a user name and password for an authorized ICS user. (The user for the on-premise EBS operation is the one specified earlier in the EBS connection configured in ICS.) The following shows what all this information looks like using the Firefox RESTClient add-on:
ICSEBSCloudAdapter-Test-003
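The same request can be sent with curl instead of a browser extension. Below is a sketch: the host name, user, and password are placeholders, and it assumes the JSON payload above has been saved to a file named create_employee.json. The command is echoed as a dry run; remove the echo to actually send it.

```shell
# Placeholder values -- replace with your ICS POD host and an authorized ICS user.
ICS_HOST="ics-pod.example.com"
ICS_USER="ics.user@example.com"
ICS_PASS="Welcome1"

# Endpoint = base URL + REST suffix, as shown on the integration's information page.
URL="https://${ICS_HOST}/integration/flowapi/rest/HR_CREATE_EMPLOYEE/v01/hr/employee/create"

# Dry run: print the command; remove 'echo' to issue the POST.
echo curl -u "${ICS_USER}:${ICS_PASS}" \
  -H "Content-Type: application/json" \
  -X POST "${URL}" \
  -d @create_employee.json
```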
4. Before we test the integration, we can log in to the EBS console as the HRMS user. Then, navigating to Maintaining Employees, we can search for our user Joseph Demo by his last name. Notice that nothing comes up for the search:
ICSEBSCloudAdapter-Test-004
5. Now we send the POST from the RESTClient and review the response:
ICSEBSCloudAdapter-Test-005
6. We can compare what was returned from EBS to ICS with what appears in the EBS application. Here are the search results for the employee Joseph Demo:
ICSEBSCloudAdapter-Test-006
7. Here are the details for Joseph Demo:
ICSEBSCloudAdapter-Test-007
8. Now we return to the ICS console and navigate to the Tracking page of the Monitoring section. The integration instance shows up with the primary tracking field of Last Name: Demo
ICSEBSCloudAdapter-Test-008
9. Finally, by clicking on the tracking field for the instance, we can view the details:
ICSEBSCloudAdapter-Test-009

Hopefully this walkthrough of how to do an ICS integration to an on-premise EBS environment has been useful. I am looking forward to any comments and/or feedback you may have. Also, keep an eye out for the “Part 2” A-Team Blog that will detail EBS business events surfacing in ICS to complete the ICS/EBS on-premise round trip integration scenarios.

Using Event Handling Framework for Outbound Integration of Oracle Sales Cloud using Integration Cloud Service


Introduction:

Oracle’s iPaaS solution is the most comprehensive cloud-based integration platform in the market today. Integration Cloud Service (ICS) gives customers an elevated user experience that makes complex integration simple to implement.

Oracle Sales Cloud (OSC) is a SaaS application and part of the comprehensive CX suite of applications. Since OSC is usually the customer master and the center for all sales-related activities, integration with OSC is a requirement in most use cases.

Although OSC provides useful tools for outbound as well as inbound integration, it is a common practice to use ICS as a tool to integrate OSC and other SaaS as well as on-premises applications. In this article, I will explore this topic in detail and also demonstrate the use of Event Handling Framework (EHF) in OSC to achieve the same.

Main Article:

Within ICS you can leverage the OSC adapter to create an integration flow. OSC can act as source (inbound) or as target (outbound) for integration with other SaaS or on-premises applications, with ICS in the middle acting as the integration agent. While the inbound integration flow is triggered by the source application, invoking the outbound flow is the responsibility of OSC.

InboundIntegration OurboundIntegration

In this article, I will discuss the outbound flow, where OSC acts as the source and other applications serve as the target. There are essentially 2 ways of triggering this integration:

  • Invoking the ICS integration every time the object which needs to be integrated is created or updated. This can be achieved by writing groovy code inside create/update triggers of the object and invoking the flow web service by passing in the payload.
  • Using the Event Handling Framework (EHF) to generate an update or create event on the object and notify the subscribers. In this case, ICS registers itself with OSC and gets notified when the event gets fired along with the payload

 

OSC supports events for the most important business objects such as Contact, Opportunity, and Partner. More objects are being enabled with EHF support on a continuous basis.

In this article, I will demonstrate how to use EHF to achieve an outbound integration. We will create a flow in ICS which subscribes to the “Contact Created” event and, on being notified of the event, updates the newly created contact object. While this integration is quite basic, it demonstrates the concept. Although we use Update Contact as the target of our integration, you could instead use another SaaS application (for example Siebel or Service Cloud) as the target and create a Contact there.

Integration

 

Detailed steps:

Before starting, let’s identify some URLs. For this example, we will need two URLs – one for the CommonDomain and one for the CRMDomain. You can find these from Review Topology under Setup and Maintenance.

CRM_URL FS_URL

The URLs will be of the following form:

CommonDomain: https://<instance_name>.fs.us2.oraclecloud.com

CRMDomain: https://<instance_name>.crm.us2.oraclecloud.com

I will refer to these URLs as COMMON_DOMAIN_URL and CRM_DOMAIN_URL in the rest of the article.

Let’s now move on to configuring our environment and creating an example integration based on events.

The first step is to create a CSF key so that Sales Cloud can connect to ICS and invoke the subscriptions. In R11, this can be achieved through SOA Composer. To access SOA Composer, navigate to <CRM_DOMAIN_URL>/soa/composer

Inside SOA Composer, click on “Manage Security” to open the “Manage Credentials” dialog. The name of the csf-key should be the same as the identity domain of the ICS instance. Provide the username and password of the user that OSC should use to invoke ICS subscriptions.

Note: Customers didn’t have this ability in R10 and it had to be done by the operations team.

001_CSF_Key

Log in to the ICS Console and, on the home page, click on Create Connections followed by Create New Connection.

01_ICS_Home_Page

02_Connections

Click Select under Oracle Sales Cloud

 

03_Create_Connection

Provide a unique name and identifier for the connection. Optionally, provide a detailed description. Click Create

04_New_Connection

 

You will see the prompt that the connection was created successfully and will automatically go to the connection details page. It tracks your progress as well. Click Configure Connectivity

05_Connection_Created

In the Connection Properties page, provide details as follows:

OSC Services Catalog WSDL URL: <COMMON_DOMAIN_URL>/fndAppCoreServices/ServiceCatalogService?wsdl

OSC Events Catalog URL: <CRM_DOMAIN_URL>/soa-infra

06_Connection_Properties

Click Configure Credentials

07_Configure_Credential

Provide credentials of the service user that will be used for the integration and click OK.

08_Credentials

The Connection details page shows the connection is 85% complete. The only step remaining at this point is to test the connection to make sure all the details provided are correct. Click on Test.

09_Test

If all the provided details are correct, you will see a message confirming the test was successful. The progress indicator also shows 100%. At this point, click Save and then Exit Integration.

10_Test_Successful

You see a confirmation that the connection was saved successfully. You can also see the new connection in the list.

11_Connections

The next step is to use this connection to create an integration. Click on Integrations followed by Create New Integration.

12_Create_Integration

In the Create Integration – Select a Pattern dialog, click Select under Map My Data. You may choose a different pattern based on your integration requirements but for this example, we will use Map My Data pattern.

13_Select_Pattern

In the New Integration – Information dialog provide the unique name and identifier for this integration, an appropriate version number, and optionally a package name and description.

14_Integration_Information

Drag and drop the connection that we created on the source. This opens the Configure Sales Cloud Endpoint wizard.

15_Integration_Created

In the Configure Sales Cloud Endpoint wizard, provide the name, and optionally a description of the endpoint. Click Next.

16_Configure_Sales_Cloud_EP

In the section titled Configure a Request, choose With Business Events to create this integration using Business Events in OSC. For this example, we will use the Contact Created event, which fires when a contact is created in OSC. Click Next.

17_Pick_Event

In the next screen under section titled Response Type, choose None and click Next.

18_Response

The wizard shows the endpoint summary. Review the details and click Done.

19_EP_Summary

Now we have to create a target endpoint. Usually this target will be another application that we are integrating with OSC. For our example, we will simply use OSC as a target application itself. Drag and drop the OSC connection we created earlier into the target.

20_EP1_Done

In the Configure Sales Cloud Endpoint wizard, provide the name, and optionally a description of the endpoint. Click Next.

21_Configure_Sales_Cloud_EP

Under the section titled Select a Business Object, find the Contact object and click on it. The drop-down below lists the operations this object supports. For this example, choose updateContact and click Next.

22_Pick_Business_Object

The wizard shows the endpoint summary. Review the details and click Done.

23_EP2_Summary

Now we need to map the source payload to the target payload. Click on the Map icon followed by the “+” icon to create a mapping.

24_EP1_Done

In the mapping wizard, you can specify the appropriate mapping. For our example, we will use a very simple mapping to update the PreviousLastName with the value of LastName we received in the payload. This doesn’t add a lot of value, but serves the purpose of illustrating an end-to-end integration. Drag and drop PartyId to PartyId from source to target and LastName to PreviousLastName from source to target. Click Save and Exit Mapper.

25_Map1

The integration details page shows our integration is 77% complete. One final step is to add tracking fields, which allow us to identify the various instances of the integration. Click on Tracking.

26_Tracking

Drag and drop appropriate fields from Source into tracking fields and click Done.

27_Tracking_Identifiers

Now our integration is 100% complete. We can optionally choose an action for the response and fault. For our example, we will skip this step. Click on Save followed by Exit Integration.

28_integration_Complete

The ICS console shows the integration was saved successfully. The newly created integration also shows up in the list of integrations. Click Activate to activate this integration.

29_Integration_Saved

In the confirmation dialog, click Yes.

30_Activation_Confirmation

Once the integration is active, a subscription for it is created in OSC. You can review this subscription, as well as all the other subscriptions by invoking the following URL from your browser:

<CRM_DOMAIN_URL>/soa-infra/PublicEvent/subscriptions

31_Subscriptions

You can now create a Contact in Sales Cloud and it will almost instantaneously be updated with the new value of Previous Last Name.

 

Enhancing ICS Mappings with Custom Java Classes


Introduction

One of the most common tasks performed during the creation of integrations in ICS (Integration Cloud Service) is the implementation of mappings. In a nutshell, mappings are the resources that ICS uses to allow messages coming from the configured source application to be sent to the configured target application. Failure in properly defining and configuring these mappings directly impacts how integrations are going to behave while sending messages downstream.

In order to build mappings in ICS, users make use of the mapping editor. The mapping editor allows for the creation of complex XPath expressions via an intuitive drag-and-drop interface. Besides the support for XPath expressions, it is also possible to use built-in XSLT functions available within the Mapping Components section of the mapping editor, as shown in figure 1.

fig00-creating-mappings-editor

Figure 1: ICS mapping editor with the functions palette expanded.

However, it is not uncommon to find situations in which the set of built-in functions is not adequate to perform a specific data handling operation. When that happens, most people using ICS feel they’ve hit a roadblock due to the fact that there is no way to simply add a custom function. While there is always the possibility to open an SR (Service Request) within Oracle and request an enhancement, sometimes this is not possible because the ongoing project requires at least a workaround in order to be able to finish the use case in a timely manner.

This blog is going to show how classes from ICS’s Fusion Middleware foundation can be leveraged to provide custom data handling in mappings. To illustrate this, the following sections will show how to perform Base64 data decoding, using a utility class from the Oracle WebLogic API.

Programming in XLST Directly

In contrast to what many people think, ICS is not a black box. You can access pretty much everything that is generated by ICS when you export the integration, as shown in figure 2. Once you have access to the integration archive file, you can see what ICS generated for you and, in the case of mappings, even change it.

fig01-exporting-integration

Figure 2: Generating an integration archive.

With this option in mind, most people who are familiar with programming in XSLT feel more comfortable handling each mapping directly, using XSLT's own programming constructs, provided of course that they are valid under the XSLT specification. In order to be able to write XSLT code for the mappings, the first thing you need to do is locate the .XSL file that handles the mapping under the integration archive structure.

Be aware that .XSL files are only generated if you perform at least one initial mapping. Therefore, you must create a mapping using the visual editor to generate the file. Once generated, you can change it to suit your needs.

Typically, the .XSL files within the integration archive follow this path pattern:

$ROOT/icspackage/project/$INTEGRATION_NAME_VERSION/resources/processor_XXX/resourcegroup_XXX
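Since the exported archive is an ordinary ZIP file, the generated mapping files can be located from the command line. A sketch, with a hypothetical archive name:

```shell
# Hypothetical archive name -- use the file you exported from the ICS console.
IAR="MY_INTEGRATION_01.00.0000.iar"

if [ -f "$IAR" ]; then
  unzip -q "$IAR" -d iar_contents
  # Each mapping lives under its own processor_XXX/resourcegroup_XXX folder.
  find iar_contents/icspackage/project -name '*.xsl'
fi
```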

Each mapping in the integration generates a processor_XXX folder. For example, in a typical SOAP-based request-reply integration, there will be at least three mappings: one for the request flow, one for the response flow and one for the fault flow. In this particular case, you can expect that there will be three processor_XXX folders under the resources folder. “XXX” in this context will be a system-generated identifier for each processor, which has the responsibility to uniquely identify that component within the integration. Figure 3 shows an example of a generated .XSL file. We are going to change that file in order to include a Java native function that performs Base64 decoding.

fig02-mapping-before-change

Figure 3: Example of .XSL generated file.

First, you should notice that there is a comment in the .XSL file that states where you should make any code changes. That can be observed in the “User Editing allowed BELOW this line” code comment. Unless you know exactly what you are doing please do not modify other portions of the code or you will end up breaking the code and will probably experience runtime errors.

Second, before calling any custom function in XSLT, you need to provide the namespace that defines that function. Specifically, you need to provide a namespace declaration that specifies which Java class is being used for the custom function. Figure 4 shows how to declare a custom function for the Base64 decoding use case, and how to use that function within the XSLT code.

fig03-mapping-after-change

Figure 4: Defining and using custom functions in XSLT.

The namespace declaration should specify a fully qualified Java class and associate this class with a prefix. In the example shown in figure 4, the weblogic.apache.xerces.impl.dv.util.Base64 class was associated with the “b64” prefix, so it can be used across the XSLT script by referencing this prefix. Keep in mind that not all Java classes can be used as a custom function. In order to work, the class should expose only static methods, and the data types used must be simple types present in the XML/XSLT specification. Be aware of data type conversion as well: the values passed as arguments to the functions must match the types defined in the method, just as the value returned by the method must match the enclosing tag that will receive the value.
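As an illustration, such a declaration and call could look like the sketch below. The element names and payload structure are made up for this example; the namespace URI follows the Oracle XSLT convention of http://www.oracle.com/XSL/Transform/java/ followed by the fully qualified class name.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch only: element names and source paths are illustrative. -->
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:b64="http://www.oracle.com/XSL/Transform/java/weblogic.apache.xerces.impl.dv.util.Base64">
  <xsl:template match="/">
    <DecodedPayload>
      <!-- Calls the static decode() method of the Base64 class bound above. -->
      <xsl:value-of select="b64:decode(/Request/EncodedPayload)"/>
    </DecodedPayload>
  </xsl:template>
</xsl:stylesheet>
```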

Third, you have to save all changes back to the integration archive, and re-import the archive into ICS. When you do this, you can continue your development work using the ICS UI, or just activate it to start testing it.

fig04-importing-integration

Figure 5: Importing the integration archive.

While applying this technique in your integrations, please do so with care. Be particularly careful about which Java classes you choose to use. Oracle provides no guarantee that a specific class will always be available in ICS's Fusion Middleware foundation. For this reason, you may want to prefer classes that belong to the JDK and/or classes that you are relatively certain are available under the ICS runtime architecture.

Using eBS Adapter in Integration Cloud Service – Part 1: Installing eBusiness Suite Integrated SOA Gateway for REST Services


Introduction

Integration Cloud Service (ICS) enables connecting applications in the cloud or on-premise. It also provides an adapter for eBusiness Suite. This eBS adapter is different from the eBS adapter in SOA Suite: it does not use a database connection. Instead it uses the REST services provided by eBS as part of the Integrated SOA Gateway (ISG).

This article describes the steps needed to get eBusiness Suite, including ISG REST services, ready, not only for ICS (these instructions also apply if you want to use the REST services without ICS):

ISG requires some additional patches on top of eBS 12.2.4 – this is shown in this first part.

In a second part which you can find here, we show how to enable the REST metadata provider for ICS and test eBS REST services – both from a native REST client and from ICS using the adapter.

Prerequisites

As a starting point, we assume eBusiness Suite version 12.2.4. We will show the steps needed using the eBusiness Suite Virtual Appliance which is available from Oracle Software Delivery Cloud (http://edelivery.oracle.com). Search for product “Oracle VM Virtual Appliances for Oracle E-Business Suite” under category “Linux/OVM/VMs“. Be sure to select the 12.2.4 version: “Oracle VM Virtual Appliances for Oracle E-Business Suite 12.2.4.0.0 for x86 64 bit, 41 files“. You only need to download all files marked with “Oracle E-Business Suite Release 12.2.4 Single Node Vision Install X86 (64 bit)” for this exercise.

Using this VM, we will focus on the basic steps needed to get REST services running – this will not include steps required for a production type setup, for example setup of SSL etc.

See also blog https://blogs.oracle.com/stevenChan/entry/e_business_suite_12_2

ISG REST services are part of core eBS 12.2 without the need for additional licenses. (ISG SOAP services, which are not needed for ICS, would require an additional license of SOA Suite with eBS 12.2.)

For running the VM, we will use Oracle Virtualbox 5.0.16. All steps are executed for Oracle Linux 6 (x86).

Now, download all patches needed later from MOS – see Appendix.

General Patching Procedure

The overall procedure for patching is described in MOS Note 1617461.1 – Applying the Latest AD and TXK Release Update Packs to Oracle E-Business Suite Release 12.2

The overall procedure for installing ISG is described in MOS Note 1311068.1 – Installing Oracle E-Business Suite Integrated SOA Gateway, Release 12.2. Only steps for REST services in sections B and C are relevant.

Step 1 – Extract and Test the downloaded VM Image

After downloading, extract the VM using the following script – and import the resulting OVA file in Virtualbox.

for i in *.zip
do
  unzip "$i"
done

# The part suffixes .00 through .15 sort lexicographically,
# so a glob concatenates them in the correct order.
cat Oracle-E-Business-Suite-12.2.4_VISION_INSTALL.ova.?? \
    > Oracle-E-Business-Suite-12.2.4_VISION_INSTALL.ova

After startup of the VM, log in as root. You will be prompted for new passwords for the users root, oracle and applmgr.

Next, check the network configuration: ebs.example.com must be resolvable in the VM. Then start the DB and App tier using the scripts startvisiondb.sh and startvisionapps.sh

The scripts to manage the Oracle E-Business Suite single node Vision installation are:

SCRIPTS BASE_DIR                      : /u01/install/VISION/scripts/
START SCRIPT FOR DB                   : /u01/install/VISION/scripts/startvisiondb.sh
STOP SCRIPT FOR DB                    : /u01/install/VISION/scripts/stopvisiondb.sh
START SCRIPT FOR APPS                 : /u01/install/VISION/scripts/startvisionapps.sh
STOP SCRIPT FOR APPS                  : /u01/install/VISION/scripts/stopvisionapps.sh
DB RE-CONFIG SCRIPT                   : /u01/install/VISION/scripts/visiondbconfig.sh
APPS RE-CONFIG SCRIPT                 : /u01/install/VISION/scripts/visionappsconfig.sh
DB CLEANUP SCRIPT                     : /u01/install/VISION/scripts/visiondbcleanup.sh
APPS CLEANUP SCRIPT                   : /u01/install/VISION/scripts/visionappscleanup.sh
CONFIGURE A NEW WEB ENTRY POINT       : /u01/install/scripts/configwebentry.sh

You should now be able to log in to the eBS home page at http://ebs.example.com:8000, for example as user SYSADMIN/sysadmin.

Step 1a – Change to graphical desktop (optional)

If you prefer to work with a graphical desktop rather than the default command-line shell provided by the VM, log into the VM as root and execute:

yum install oraclelinux-release

yum groupinstall Desktop

Change runlevel in /etc/inittab from 3 to 5.

Step 1b – Change tnsnames.ora

Edit /u01/install/VISION/11.2.0/network/admin/tnsnames.ora

EBSDB =
(DESCRIPTION =
(ADDRESS = (PROTOCOL = TCP)(HOST = slc01ozg.us.oracle.com)(PORT = 1521))
(CONNECT_DATA =
(SERVER = DEDICATED)
(SERVICE_NAME = EBSDB)
)
)

Change slc01ozg.us.oracle.com to ebs.example.com and verify with tnsping EBSDB that the change works.

Step 2 – Upgrade FMW – Apply Patch 20642039

Verify FMW Version: WLS is 10.3.6.0.7 (using WLS Console http://localhost:7001/console)

Download patch 20642039 and execute

export ORACLE_HOME=/u01/install/VISION/fs2/FMW_Home/oracle_common

opatch apply

If you encounter the following error, make sure ORACLE_HOME is set to /u01/install/VISION/fs2/FMW_Home/oracle_common before running opatch:

OPatch could not find OUI based inventory in the Oracle Home

Step 3 – Attach DB Home

cd /u01/install/VISION/11.2.0/oui/bin

./attachHome.sh

Make sure the output looks like

The inventory is located at /u01/install/oraInventory
‘AttachHome’ was successful.

Step 3a – Check DB connectivity

Execute

cd /u01/install/VISION/11.2.0/bin

export ORACLE_HOME=/u01/install/VISION/11.2.0

export ORACLE_SID=EBSDB

./sqlplus /NOLOG

connect / as sysdba

This must work – otherwise adop apply will fail. If “connect / as sysdba” does not work, check whether the $TWO_TASK parameter is set:

echo $TWO_TASK

If this displays a value, execute

export TWO_TASK=

See note: UNIX: Checklist for Resolving Connect AS SYSDBA Issues (Doc ID 69642.1)

Step 4 – Run adgrants.sql

Before we can apply patches to eBS, we need to execute adgrants.sql – otherwise we will get the following error:

AutoPatch error: Please apply adgrants.sql file in database tier before applying this patch

See note: E-Business Suite 12.2 R12.AD.C.DELTA.6 Patch 19197270 Fails With Error ‘Please Apply adgrants.sql File In Database Tier Before Applying This Patch’ (Doc ID 2039459.1)

Note: This issue can occur on other patches also.  You need to verify the version of adgrants.sql in all patches and run the highest version prior to attempting to apply the patches.
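Comparing adgrants.sql versions by hand is error-prone: eBS file versions such as 120.67.12020000.37 (the version shown later in Step 7b) must be compared field by field, numerically. A minimal sketch of such a comparison – the candidate version strings below are made up for illustration:

```python
# Compare eBS file versions such as "120.67.12020000.37" field by field.
# The candidate versions here are hypothetical examples.
def parse_version(v):
    return tuple(int(part) for part in v.split("."))

def highest_version(versions):
    # max() on tuples compares element by element, so numeric fields
    # like 67 vs 9 are ordered correctly (unlike a plain string comparison).
    return max(versions, key=parse_version)

candidates = ["120.67.12020000.30", "120.67.12020000.37", "120.9.12020000.5"]
print(highest_version(candidates))  # 120.67.12020000.37
```

A plain lexical comparison would wrongly rank “120.9.…” above “120.67.…”, which is why the fields are converted to integers first.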

The EBS context file is located under /u01/install/VISION/fs2/inst/apps/EBSDB_ebs/appl/admin/EBSDB_ebs.xml

(ORACLE_HOME=/u01/install/VISION/11.2.0)

Follow these steps:

Copy adgrants.sql from patch 22123818 (in subdirectory admin) to $ORACLE_HOME/appsutil/admin on the DB server (back up the existing file first).

export ORACLE_SID=EBSDB

cd $ORACLE_HOME/bin

./sqlplus /NOLOG

@$ORACLE_HOME/appsutil/admin/adgrants.sql APPS

(Parameter 1: APPS)

Output should look like

End of Creating PL/SQL Package AD_ZD_SYS.

Start of giving grants. This may take few minutes.

PL/SQL procedure successfully completed.

Start of PURGE DBA_RECYCLEBIN.

PL/SQL procedure successfully completed.

End of PURGE DBA_RECYCLEBIN.

Commit complete.

Step 5 Run ETCC DB Checker

Install ETCC DB Checker via patch 17537119: Create a new directory /u01/install/VISION/11.2.0/appsutil/etcc, copy the zip file there and extract it.

Run DB-ETCC:

cd  /u01/install/VISION/11.2.0/appsutil/etcc

./checkDBpatch.sh /u01/install/VISION/11.2.0/appsutil/EBSDB_ebs.xml

If the utility asks for the Database context file, enter:

Enter full path to Database context file: /u01/install/VISION/11.2.0/appsutil/EBSDB_ebs.xml

The tool ran successfully if you see output similar to:

Apply the missing bugfixes and then rerun the script.

Stored Technology Codelevel Checker results in the database successfully.

Finished prerequisite patch testing : Tue Apr 19 11:08:54 EDT 2016

Log file for this session: ./checkDBpatch_2810.log

Do not apply any of the recommendations made by the utility – we will skip those for this exercise. In a production environment, however, they should be installed.

Troubleshooting:

If running the ETCC DB Checker failed with the following error, then the previous step (attach DB home) was not executed successfully:

/u01/install/VISION/11.2.0/OPatch

./opatch lsinventory returns

Inventory load failed… OPatch cannot load inventory for the given Oracle Home.

Possible causes are:

   Oracle Home dir. path does not exist in Central Inventory

   Oracle Home is a symbolic link

   Oracle Home inventory is corrupted

LsInventorySession failed: OracleHomeInventory gets null oracleHomeInfo

OPatch failed with error code 73

See OPatch Fails With “LsInventorySession failed: OracleHomeInventory gets null oracleHomeInfo” (Doc ID 728417.1)

Step 6 – Run auto config

Source the run environment:

cd /u01/install/VISION

. ./EBSapps.env

and select the Run file system.

Then run

adautocfg.sh

Provide APPS password.

Check that the output ends with

AutoConfig completed successfully.

Step 7 – Install eBS patches

Download all patches listed in the appendix and unzip them into the $PATCH_TOP directory – in our case

/u01/install/VISION/fs_ne/EBSapps/patch:

. <EBS_ROOT>/EBSapps.env run

cd $PATCH_TOP

Make sure you have started DB and Apps tier before the next steps.

After each execution, the output should be

adop exiting with status = 0 (Success)

Step 7a – Run prepare phase

After sourcing the run environment, execute

adop phase=prepare

This should take a while. The result should look like:

adop phase=prepare – Completed Successfully

Log file: /u01/install/VISION/fs_ne/EBSapps/log/adop/12/adop_20160420_074702.log

adop exiting with status = 0 (Success)

If you receive the following error, then you have not executed auto config (step 6):

Worker count determination…

Validation successful. All expected nodes are listed in ADOP_VALID_NODES table.
[UNEXPECTED]adop is not able to detect any application tier nodes in FND_NODES table.
[UNEXPECTED]Ensure ICM is running and run autoconfig on all nodes
[UNEXPECTED]Error while checking if this is a multi node instance
Log file: /u01/install/VISION/fs_ne/EBSapps/log/adop/adop_20160420_073426.log

Step 7b – Install patches 20745242 and 22123818

Execute

adop phase=apply patches=20745242,22123818 merge=yes

If you see this error

AutoPatch error:
Please apply adgrants.sql file in database tier before applying this patch

then an older adgrants.sql than the one from patch 22123818 was applied earlier. In our case, the right version is:

adgrants.sql 120.67.12020000.37 2015/12/18

Step 7c – Install patches 20784380, 22363475, and 22495069

Execute

adop phase=apply patches=20784380,22363475,22495069 merge=yes

Step 7d – Install patch 19259764

Execute

adop phase=apply patches=19259764

Step 7e – Install ISG consolidated patch 22328483:R12.OWF.C

Execute

adop phase=apply patches=22328483

If this patch is not applied, you will get the following error when restarting eBS Apps Tier:

./startvisionapps.sh
Starting the Oracle E-Business Suite Application Tier Servicessh: /u01/install/VISION/fs2/inst/apps/EBSDB_ebs/admin/scripts/adstrta: No such file or directory
./startvisionapps.sh: line 75: l.sh: command not found

Step 7f – Run finalize and cutover

Execute

adop phase=finalize

adop phase=cutover

Step 7g –  Run adop cleanup

After cutover, the following message is displayed:

EBSDB environment has changed.
All users must re-source the environment using below command:
source /u01/install/VISION/EBSapps.env run|patch

Then execute

adop phase=cleanup

Step 7h – Run adop fs_clone

Execute

adop phase=fs_clone

This was the last patching step necessary before we proceed in Part 2 with configuration and testing.

Step 8 – Verify eBS after Restart

Before we proceed with Part 2, verify that the eBS instance is working as expected after patching:

Restart DB and App tier:

cd /u01/install/VISION/scripts

./stopvisionapps.sh

./stopvisiondb.sh

./startvisiondb.sh

./startvisionapps.sh

Login to eBS in a browser using SYSADMIN/sysadmin.

Navigate to Integrated SOA Gateway, Integration Repository, and search for services using “Employee” for “Business Entity” as filter.

You should see results similar to the following picture:

ISG-search-result

 

Appendix

List of patches to be applied

Patch No. Component Title
20642039 WLS/FMW MERGE REQUEST ON TOP OF 11.1.1.6.0 FOR BUGS 20361466 20484781
20745242 eBS R12.AD.C.delta.7: R12.AD.C.DELTA.7 PATCH
22123818 eBS BUNDLE FIXES II FOR R12.AD.C.DELTA.7 (20745242)
20784380 eBS R12.TXK.C.delta.7: R12.TXK.C.DELTA.7
22363475 eBS BUNDLE FIXES II FOR R12.TXK.C.DELTA.7 (20784380)
22495069 eBS TXK CONSOLIDATED PATCH FOR STARTCD 12.2.0.51
19259764 eBS ERROR WHEN OPENING FORMS IN IE8 ON MULTI-NODE EBS 12.2.3
22328483 eBS ISG Rest Services Consolidated Patch for 12.2.3+
17537119 eBS EBS Technology Codelevel Checker

References

Document  Title
Part 2 http://www.ateam-oracle.com/using-ebs-adapter-in-integration-cloud-service-part-2-configure-and-test-isg-rest-services/
1928303.1 Section 1.2.4: Using the Oracle E-Business Suite Oracle VM-based Installation
1311068.1 Installing Oracle E-Business Suite Integrated SOA Gateway, Release 12.2
1355068.1 Oracle E-Business Suite 12.2 Patching Technology Components Guide
2008451.1 How To Run The 12.2 EBS Technology Code Level Checker (ETCC) ?
728417.1 OPatch Fails With “LsInventorySession failed: OracleHomeInventory gets null oracleHomeInfo”
2039459.1 E-Business Suite 12.2 R12.AD.C.DELTA.6 Patch 19197270 Fails With Error ‘Please Apply adgrants.sql File In Database Tier Before Applying This Patch’

Using eBS Adapter in Integration Cloud Service – Part 2: Configure and Test ISG REST Services


Introduction

Integration Cloud Service (ICS) enables connecting applications in the cloud or on-premise. It also provides an adapter for Oracle eBusiness Suite. This eBS adapter is different from the eBS adapter in SOA Suite – it does not use a database connection. Instead it uses the REST services provided by eBS as part of the Integrated SOA Gateway (ISG).

This article describes the steps needed to get eBusiness Suite, including ISG REST services, ready – either for use with any REST client or with ICS. ISG requires some additional patches on top of eBS 12.2.4 – this was shown in the first part of this article.

In this second part, we will show how to enable the REST services, how to enable the metadata provider for ICS and test eBS REST services, first from a native REST client (SOAPUI) and then from ICS. All steps except chapter 4 are also relevant if you want to use Oracle eBusiness Suite ISG REST services without ICS.

Chapter 1 – Configure Integrated SOA Gateway (ISG) in eBS 12.2.4

Enabling ASADMIN User with the Integration Administrator Role

We will execute the steps in section 3 of the MOS note:

Log in to Oracle E-Business Suite as a SYSADMIN user and enter the associated password.
Expand the User Management responsibility from the main menu of the Oracle E-Business Suite Home Page.

Click the Users link to open the User Maintenance page (under “Vision Enterprises”)
Enter ‘ASADMIN’ in the User Name field and click Go to retrieve the ‘ASADMIN’ user.

Click the Update icon next to the ASADMIN user to open the Update User window.
Remove the Active To date field and click Apply.

Click the Reset Password icon next to ASADMIN user to open the Reset Password window. Make sure that ASADMIN’s password is at least eight characters long.
Enter new password twice and click Submit.

In the Update User window, click Assign Roles.
In the search window, select Code from the Search By drop-down list and enter “UMX|FND_IREP_ADMIN” in the value text box.
Click Select.
Enter a justification in the Justification field and click Apply. You will see a confirmation message indicating you have successfully assigned the role.

In my case, a warning is displayed (which can be ignored because the server is restarted later anyway):

Updates to Role data will not be visible in the application until the following processes are started : Workflow Background Engine

Change ISG agent properties

Execute the following as user oracle (as described in step 3 of the MOS note):

mkdir /u01/install/VISION/isg_temp

Create /u01/install/VISION/fs2/inst/apps/EBSDB_ebs/soa/isgagent.properties and add the following line:

EBSDB.ISG_TEMP_DIRECTORY_LOCATION=/u01/install/VISION/isg_temp/

Edit /u01/install/VISION/fs1/inst/apps/EBSDB_ebs/soa/isgagent.properties to update the following line:

EBSDB.ISG_TEMP_DIRECTORY_LOCATION=/u01/install/VISION/isg_temp/

Configure ISG

Run

cd /u01/install/VISION

. ./EBSapps.env run

ant -f $JAVA_TOP/oracle/apps/fnd/txk/util/txkISGConfigurator.xml ebsSetup -DforceStop=yes

as described in step 4 of the MOS note. Enter the passwords as required – including the password of ASADMIN as chosen in the previous step.  If asked

The script will forcefully stop the Weblogic Servers now. Do you want to proceed (yes/no)? (yes, no)

then select “yes” to restart the servers.

Result should be after a couple of minutes “BUILD SUCCESSFUL”.

Execute fs_clone

Run

adop phase=fs_clone

Verify that the result is

adop exiting with status = 0 (Success)

Restart the Apps tier.

Chapter 2 – Test eBS REST Services using a PL/SQL package

We will demonstrate deploying a REST service using the HR_EMPLOYEE_API package.

Deploy CreateEmployee REST Service

Login to eBS in a browser using SYSADMIN. Navigate to Integrated SOA Gateway, Integration Repository and search for “Internal Name”  “HR_EMPLOYEE_API”:

ISG-employee-search

Select Employee and move to the “REST Web Service” tab. Deploy it, selecting “Create_Employee” and entering “employee” as Service Alias:

ISG-employee-before-deploy

This should show status “Deployed”:

ISG-employee-after-deploy

Click on “WADL” to save the definition file for later.

Then grant the execution to Group “US SuperXXX”: Click on “Grant”, then select “Group of Users” and click on the search icon:

ISG-employee-before-create-grant2

Search for “US Super%” and select the first result:

ISG-employee-before-create-grant4

The result after the grant should report success and show the icon in the Grant column:

ISG-employee-after-create-grant

Now we can proceed with testing this REST service from a client.

Send Request to CreateEmployee REST Service

Create a REST POST request in SOAPUI by importing the WADL file. Add HRMS/welcome as HTTP basic authentication. Don’t forget to include the HTTP headers Content-Language, Content-Type and Accept – and paste all lines after “User-Agent” below as the payload.

A sample of a correct request would look in Raw mode like:

POST http://ebs.example.com:8000/webservices/rest/employee/create_employee/ HTTP/1.1
Accept-Encoding: gzip,deflate
Authorization: Basic SFJNUzp3ZWxjb21l
Content-Language: en-US
Content-Type: application/json
Accept: application/json
Content-Length: 886
Host: ebs.example.com:8000
Connection: Keep-Alive
User-Agent: Apache-HttpClient/4.1.1 (java 1.5)

{
  "CREATE_EMPLOYEE_Input": {
    "@xmlns": "http://xmlns.oracle.com/apps/per/rest/createEmployee/create_employee/",
    "RESTHeader": {
      "xmlns": "http://xmlns.oracle.com/apps/per/rest/createEmployee/header",
      "Responsibility": "US_SHRMS_MANAGER",
      "RespApplication": "PER",
      "SecurityGroup": "STANDARD",
      "NLSLanguage": "AMERICAN",
      "Org_Id": "241"
    },
    "InputParameters": {
      "P_HIRE_DATE": "2015-06-28T09:00:00",
      "P_BUSINESS_GROUP_ID": "202",
      "P_LAST_NAME": "Mueller",
      "P_SEX": "M",
      "P_PER_COMMENTS": "Create From REST Service",
      "P_DATE_OF_BIRTH": "1979-01-04T09:00:00",
      "P_EMAIL_ADDRESS": "michael.mueller@oracle.com",
      "P_FIRST_NAME": "Michael",
      "P_KNOWN_AS": "Michael",
      "P_MARITAL_STATUS": "S",
      "P_MIDDLE_NAMES": "Mueller",
      "P_NATIONALITY": "AM",
      "P_NATIONAL_IDENTIFIER": "183-25-2523",
      "P_REGISTERED_DISABLED_FLAG": "N",
      "P_COUNTRY_OF_BIRTH": "US",
      "P_REGION_OF_BIRTH": "West",
      "P_TOWN_OF_BIRTH": "San Francisco"
    }
  }
}
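The article uses SOAPUI as the client, but the same request can just as well be scripted. Below is a minimal Python sketch; the host, port and HRMS/welcome credentials are the values used in this article, and the payload is abbreviated to a few of the InputParameters shown above. It only builds the request – calling urllib.request.urlopen(req) would actually send it:

```python
import base64
import json
import urllib.request

def build_create_employee_request(host, user, password, payload):
    """Build (but do not send) the POST request for the create_employee service."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    headers = {
        "Authorization": "Basic " + token,
        "Content-Language": "en-US",
        "Content-Type": "application/json",
        "Accept": "application/json",
    }
    url = f"http://{host}/webservices/rest/employee/create_employee/"
    return urllib.request.Request(url, data=json.dumps(payload).encode(),
                                  headers=headers, method="POST")

# Payload abbreviated to a few of the InputParameters from the sample request.
payload = {
    "CREATE_EMPLOYEE_Input": {
        "@xmlns": "http://xmlns.oracle.com/apps/per/rest/createEmployee/create_employee/",
        "RESTHeader": {
            "xmlns": "http://xmlns.oracle.com/apps/per/rest/createEmployee/header",
            "Responsibility": "US_SHRMS_MANAGER",
            "RespApplication": "PER",
            "SecurityGroup": "STANDARD",
            "NLSLanguage": "AMERICAN",
            "Org_Id": "241",
        },
        "InputParameters": {
            "P_HIRE_DATE": "2015-06-28T09:00:00",
            "P_BUSINESS_GROUP_ID": "202",
            "P_LAST_NAME": "Mueller",
            "P_FIRST_NAME": "Michael",
            "P_SEX": "M",
        },
    },
}

req = build_create_employee_request("ebs.example.com:8000", "HRMS", "welcome", payload)
# The computed header matches the Authorization line of the raw request above:
print(req.get_header("Authorization"))  # Basic SFJNUzp3ZWxjb21l
```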

Execute the request in SOAPUI:

ISG_CreateEmployee_SOAPUI

A sample of a correct response would be

HTTP/1.1 200 OK
Date: Thu, 21 Apr 2016 14:25:36 GMT
Server:
Content-Length: 791
X-ORACLE-DMS-ECID: 005CFxvqUBGDkZWFLzUKOA00004x00008d
X-Frame-Options: SAMEORIGIN
Keep-Alive: timeout=15
Connection: Keep-Alive
Content-Type: application/json
Content-Language: en

{
  "OutputParameters": {
    "@xmlns:xsi": "http://www.w3.org/2001/XMLSchema-instance",
    "@xmlns": "http://xmlns.oracle.com/apps/per/rest/employee/create_employee/",
    "P_EMPLOYEE_NUMBER": "2401",
    "P_PERSON_ID": "32853",
    "P_ASSIGNMENT_ID": "34077",
    "P_PER_OBJECT_VERSION_NUMBER": "2",
    "P_ASG_OBJECT_VERSION_NUMBER": "1",
    "P_PER_EFFECTIVE_START_DATE": "2015-06-28T00:00:00.000-04:00",
    "P_PER_EFFECTIVE_END_DATE": "4712-12-31T00:00:00.000-05:00",
    "P_FULL_NAME": "Mueller, Michael Mueller (Michael)",
    "P_PER_COMMENT_ID": "306",
    "P_ASSIGNMENT_SEQUENCE": {
      "@xsi:nil": "true"
    },
    "P_ASSIGNMENT_NUMBER": "2401",
    "P_NAME_COMBINATION_WARNING": "0",
    "P_ASSIGN_PAYROLL_WARNING": "0",
    "P_ORIG_HIRE_WARNING": "0"
  }
}
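When the call is scripted rather than tested from SOAPUI, the fields of interest can be pulled out of this JSON response directly. A small sketch, using the sample response above (abbreviated to the fields actually used):

```python
import json

# Sample response from the create_employee service, abbreviated.
response_body = """
{
  "OutputParameters": {
    "@xmlns": "http://xmlns.oracle.com/apps/per/rest/employee/create_employee/",
    "P_EMPLOYEE_NUMBER": "2401",
    "P_PERSON_ID": "32853",
    "P_FULL_NAME": "Mueller, Michael Mueller (Michael)"
  }
}
"""

# All values come back as strings, including the generated employee number.
out = json.loads(response_body)["OutputParameters"]
employee_number = out["P_EMPLOYEE_NUMBER"]
print(employee_number)  # 2401
```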

Congratulations – you have executed your first REST service on eBusiness Suite!

Troubleshooting

HTTP 500 Internal Server Error

If you use the wrong value for P_BUSINESS_GROUP_ID (or for any other LOV-based element) in the request for CREATE_EMPLOYEE, the server returns an HTTP 500:

{
  "ISGServiceFault": {
    "Code": "ISG_SERVICE_EXECUTION_ERROR",
    "Message": "Error occurred while executing the web service request",
    "Resolution": "System error, please see service log trace for details.",
    "ServiceDetails": {
      "ServiceName": "employee",
      "OperationName": "create_employee",
      "InstanceId": "0"
    }
  }
}

The same is the case if you have a typo somewhere in the JSON element names.
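Because such faults come back with their own JSON structure, a scripted client should check for the ISGServiceFault element rather than rely on the HTTP status alone. A sketch of such a check – the function name is made up for illustration, and the fault body is the abbreviated HTTP 500 example from above:

```python
import json

def classify_isg_response(status, body):
    """Return ('ok', OutputParameters) for a successful call, or
    ('fault', code, message) when the body carries an ISGServiceFault."""
    doc = json.loads(body)
    if status == 200 and "OutputParameters" in doc:
        return ("ok", doc["OutputParameters"])
    fault = doc.get("ISGServiceFault", {})
    return ("fault", fault.get("Code"), fault.get("Message"))

# The HTTP 500 fault body shown above, abbreviated:
fault_body = """
{
  "ISGServiceFault": {
    "Code": "ISG_SERVICE_EXECUTION_ERROR",
    "Message": "Error occurred while executing the web service request"
  }
}
"""

print(classify_isg_response(500, fault_body))
```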

Chapter 3 – Configure Metadata Provider for using ISG REST with ICS

Open the eBS home page in a browser and log in as user SYSADMIN. Navigate to Integrated SOA Gateway, Integration Repository. Click the “Search” button on the right, enter “oracle.apps.fnd.rep.ws.service.EbsMetadataProvider” in the field “Internal Name” and click “Go”. (If this doesn’t list anything, you are still missing a patch on the eBS instance – please check the first part of this article.)

Then click on “Metadata Provider”:
ISG-Metadata-Provider

Click on the “REST Web Service” tab, enter “provider” in the “Service Alias” field and click the “Deploy” button.
Navigate to “Grants” tab and give grants on all methods to “All users”.

Now you are ready to use the ISG REST services from ICS.

Troubleshooting

If you don’t get any result when searching for “oracle.apps.fnd.rep.ws.service.EbsMetadataProvider”, you have missed installing one or more of the patches listed in Part 1 of this article.

Chapter 4 – Test eBS REST Service from ICS

A previous post by Greg Mally explains how to setup ICS to eBS using the ICS Connectivity Agent:

http://www.ateam-oracle.com/round-trip-on-premise-integration-part1_ics-to-ebs/

Additionally, we will show here how to connect ICS directly with eBS without an agent. This works only if you have exposed the eBS REST services over the public internet.

For this purpose, I have reconfigured the eBS VM to use a public DNS hostname and changed the port to 80.

In ICS, create a new connection with Oracle eBusiness Suite – name it, for example, EBS_TEST:

Enter the connection URL for the server and use HRMS/welcome as credentials. Test and Save the connection.

We want to create a new integration using SOAP inbound and this eBS connection outbound: Create a new SOAP connection, upload the WSDL (from here) and select “No Security”. Test and save it. (For simplicity, I have left out any SOAP faults.) Create a new integration named “Create_Employee”. Drop the EBS connection on the target side:

Enter a name for the endpoint, for example “EBS_CreateEmployee”.

ISG2-ICS-DropConnection1

Select Product Family “Human Resources Suite”, Product “Human Resources” and API “Employee”. The screen should look like:

ISG2-ICS-select-apipng

Select Next.

Select the operation Create_Employee:

ISG2-ICS-select-createEmployee-method

Select Next, then Done.

Your integration should look similar to:

 

ISG2-ICS-integration-after-drop-ebs

Drop the SOAP Connection as source and save the integration:

ISG2-ICS-after-drop-SOAP

Add the request mapping:

ISG2_ICS-create-mapping-for-SOAP-request

Add the response mapping: map “P_EMPLOYEE_NUMBER” to “result”:

ISG2_ICS-create-mapping-for-SOAP-response

Add a tracking field:

ISG2_ICS-tracking-for-SOAP-test

The result should look like:

ISG-ICS-soap-ebs-integration-final

Activate the integration. Then copy the WSDL URL and use that to create a new SOAPUI project.

 

In SOAPUI, add the payload like below and add http basic auth, WS Username Token and WS-Timestamp:

<soapenv:Body>
<cre:process>
<cre:lastname>Mueller</cre:lastname>
<cre:middlenames>Edwin</cre:middlenames>
<cre:firstname>Michael</cre:firstname>
<cre:knownas>Michael</cre:knownas>
<cre:email>michael.mueller@oracle.com</cre:email>
<cre:comments>Create From SOAP Service</cre:comments>
<cre:sex>M</cre:sex>
<cre:martialstatus>S</cre:martialstatus>
<cre:businessgroup>202</cre:businessgroup>
<cre:dateofbirth>1979-01-04T09:00:00</cre:dateofbirth>
<cre:nationality>AM</cre:nationality>
<cre:nationalid>183-25-2523</cre:nationalid>
<cre:countryofbirth>US</cre:countryofbirth>
<cre:regionofbirth>West</cre:regionofbirth>
<cre:townofbirth>San Francisco</cre:townofbirth>
<cre:hiredate>2015-06-28T09:00:00</cre:hiredate>
</cre:process>
</soapenv:Body>

 

After executing the request you should get the created employee id as response:

<nstrgmpr:processResponse xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/" xmlns:nstrgmpr="http://xmlns.oracle.com/CreateEmployee/CreateEmployeeSOAP/CreateEmployeeProcess" xmlns:plnk="http://docs.oasis-open.org/wsbpel/2.0/plnktype" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
    <nstrgmpr:result>2411</nstrgmpr:result>
</nstrgmpr:processResponse>

Congratulations – you have executed the ICS integration using eBusiness Suite Adapter successfully!

Transport Layer Security (TLS) and Java


Know Which Versions of TLS are Supported in Recent Java Versions

In the twenty-plus years of the Internet’s interaction with the Secure Sockets Layer (SSL) and Transport Layer Security (TLS) protocols, there have been some rough patches.  Over the years, various vulnerabilities, some of them exposed in a laboratory setting and others discovered and exploited by hackers, have made it necessary to revamp the protocols and to augment specifications.  Major changes to specifications, obviously, make it more difficult to support backwards compatibility, especially in a network as large and as decentralized as the Internet.  After all, it just isn’t realistic to declare by decree that every web site on the Internet upgrade its older, insecure version of SSL or TLS in favor of the new and improved security framework du jour.

Another problem, somewhat related to backwards compatibility, is that the Internet, due to its sheer size and diversity, cannot react instantaneously to an upgraded security protocol: client agents and programming languages need time to assimilate security protocol changes into their own upgrade and version release schedules.  A major case in point is the Java language, which is perhaps the most widely used programming language for developing applications that are designed to communicate with clients and/or with other servers over both private and public networks.  Occasionally, special handling in some versions of Java is required to allow applications to communicate with each other using the latest and greatest versions of these security protocols.  How to work with these Java versions, given certain versions of TLS, is covered here.

A Little History:  SSL Under Siege

In the Internet’s formative years, when it was still the exclusive domain of scientific researchers and developers, before commerce and entertainment became Internet channels, and before it became a part of everyday life and business for millions, few users would have put a very high priority on the need to add a security layer to communications, even though it was a public network.  Of course, priorities changed quickly as the Internet’s usage patterns expanded, as more and more people started using the Internet for more and more things.  With increased popular usage, and with the expanded potential for usage in commerce, the incentive for hijacking network communications became more lucrative potentially.  It became obvious that if the Internet were to take on an expanded role, and if privacy was going to be respected, something would have to be done to protect sensitive data transmissions from unauthorized snoopers and others with criminal intentions.

And thus the Secure Sockets Layer, or SSL, protocol was born.  The original authors and developers of the specification identified three core functional areas:

  • Privacy: Use of encryption
  • Identity authentication: System Identification using certificates
  • Reliability: Maintaining a secure connection with message integrity checking

Attempts to implement these features securely in the first few major versions of the SSL protocol went down a path fraught with false starts and wrong turns.  Indeed, SSL version 1.0, developed by Netscape in the early 1990’s, was never even released publicly due to a large number of security flaws.  (There are claims that SSL v1.0 was cracked ten minutes after Netscape introduced it to the developer community.)  SSL v2.0, released in 1995, also had numerous security gaps, which forced a ground-up redesign and rework of the protocol.  This work culminated in the release of SSL v3.0 in 1996 (IETF RFC 6101).

Up until version 3.0 of SSL, drafting the specifications and doing the actual work largely fell under the purview of one vendor, Netscape.  Technically speaking, therefore, the first few versions of the protocol were proprietary solutions.  Due to security’s increasing significance, the IETF took over management of SSL after version 3.0 was released.  One of the body’s first actions was to rename SSL to Transport Layer Security (TLS); after that there were a number of incremental revisions, resulting in TLS 1.0, 1.1, and TLS 1.2, which is the most recent iteration of the specification and protocol.

Historically, vulnerabilities tended to cluster around two key components of the SSL/TLS protocols:  exploiting holes in the initial handshake process, or finding gaps in either the encryption process or the encrypted data.  In SSL v2.0, for example, one of the early exploits manipulated the list of available cipher suites presented to a communications partner during the handshake process, thereby forcing a weaker cipher, smoothing the way for Man-In-The-Middle attacks to be successful.  SSL v3.0 filled this hole, along with incorporating other improvements, but vulnerabilities continued to be uncovered, resulting eventually in TLS 1.0.  Despite improvements in TLS over SSL, this family of cipher suite downgrade attacks continues to be relevant for even newer versions of TLS.

Researchers have been (and continue to be) major contributors in identifying weaknesses and security holes in SSL and TLS.  The BEAST, CRIME, and BREACH attacks were identified in 2011 and 2012, resulting in a number of supplied fixes from browser and O/S vendors.  Some browsers were more open to these types of attacks than others.

Even after fixes and improvements were incorporated into TLS 1.0, Google Security Team researchers were able to uncover a means of accomplishing a Man-In-The-Middle exploit by taking advantage of TLS 1.0’s built-in behavior of reverting back to SSL 3.0 to support backwards compatibility for communications partners.  Exposed in late 2014, this came to be known as the POODLE vulnerability, and it set off a flurry of corrective activity from browser vendors and language providers alike.  The net result was a wholesale movement to remove support for SSL 3.0 in browsers and networked applications.

In addition to the defects uncovered in the various SSL and TLS specifications, bugs in library implementations have also been problematic for secure network communications.  In early 2014, the Heartbleed bug/vulnerability in the extremely popular OpenSSL software stack opened up websites to the potential for compromising their secret private keys, thereby making it possible for attackers to eavesdrop on communications, impersonate identities, and steal valuable data.

SSL/TLS and Java Support

This brief historical overview of SSL and TLS demonstrates that the protocol specifications have been extremely fluid and have never been in a state that other technologies can just take as a constant.  The protocols are in many ways moving targets that other computing technologies, such as Java, need to take into account when patches and major upgrades are released.  There have been occasions when TLS has incorporated a fix for an identified security gap (take the somewhat recent POODLE exploit as an example), and other technologies that interact with SSL/TLS have little choice but to “catch up” in their support of the new additions.  This is especially true when the solution involves removing support for an older protocol, such as what happened with POODLE and SSL 3.0.  The time lags are unfortunate, but due to the “late-breaking” nature of security holes when they are discovered, there does not seem to be much of an alternative.

Over time, Java releases have reacted to the evolution of SSL and TLS by building in support for newer releases of the security protocols.

Java Version SSL/TLS Default Other Supported Versions
Java 6 TLS 1.0 TLS 1.1 (update 111 and later), SSLv3.0*
Java 7 TLS 1.0 TLS 1.2, TLS 1.1, SSLv3.0*
Java 8 TLS 1.2 TLS 1.1, TLS 1.0, SSLv3.0*

* SSLv3 support disabled in January, 2015 patch releases

Up until January, 2015, all of the above-listed Java releases had fallback support for SSL 3.0.  As a response to the late-2014 POODLE exploit, Oracle issued CPU releases in early 2015 (JDK 8u31, JDK 7u75, and JDK 6u91), to disable SSL v3 by default.  Publicly available Java 6 releases do not have built-in support for TLS 1.2.  Java 8, with its default support for TLS 1.2, has caught up with the latest released specification of the protocol.

Due to POODLE, the vast majority of web sites have disabled support for SSL 3.0 on their servers.  There has been related momentum building up in the Internet security community to recommend disabling “early” versions of TLS as well, as there are known security issues both in TLS 1.0 and TLS 1.1.  Running only TLS 1.2, however, may block older browsers and other clients from connecting successfully.  This represents a tradeoff between tightly clamping down on security versus being a bit more flexible about supporting older browsers and other clients.

Clients (and servers connecting to other servers as clients) that depend on the Java 7 JDK and JRE may be affected negatively by sites allowing only newer TLS versions.  By default, Java 7 does not successfully negotiate with servers that accept only TLS 1.1 or TLS 1.2.  If attempts to connect with a Java 7 client result in a stacktrace with an SSLHandshakeException, it is likely that the default client behavior needs to be modified.  The expanded stacktrace (taken from a SOAPUI session) makes it fairly clear that the server is shutting down the client connection before the handshake sequence can complete:

javax.net.ssl.SSLException: Connection has been shutdown: javax.net.ssl.SSLHandshakeException: Remote host closed connection during handshake
        at sun.security.ssl.SSLSocketImpl.checkEOF(Unknown Source)
        at sun.security.ssl.SSLSocketImpl.checkWrite(Unknown Source)
        at sun.security.ssl.AppOutputStream.write(Unknown Source)
        at org.apache.http.impl.io.AbstractSessionOutputBuffer.flushBuffer(AbstractSessionOutputBuffer.java:131)
        at org.apache.http.impl.io.AbstractSessionOutputBuffer.flush(AbstractSessionOutputBuffer.java:138)
        at org.apache.http.impl.conn.LoggingSessionOutputBuffer.flush(LoggingSessionOutputBuffer.java:95)
        at org.apache.http.impl.AbstractHttpClientConnection.doFlush(AbstractHttpClientConnection.java:270)
        at org.apache.http.impl.SocketHttpClientConnection.close(SocketHttpClientConnection.java:245)
        at org.apache.http.impl.conn.DefaultClientConnection.close(DefaultClientConnection.java:164)
        at org.apache.http.impl.conn.AbstractPooledConnAdapter.close(AbstractPooledConnAdapter.java:152)
        at org.apache.http.protocol.HttpRequestExecutor.closeConnection(HttpRequestExecutor.java:142)
        at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:129)
        at org.apache.http.impl.client.DefaultRequestDirector.tryExecute(DefaultRequestDirector.java:633)
        at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:454)
        at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:820)
        at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:754)
        at com.eviware.soapui.impl.wsdl.support.http.HttpClientSupport$Helper.execute(HttpClientSupport.java:233)
        at com.eviware.soapui.impl.wsdl.support.http.HttpClientSupport.execute(HttpClientSupport.java:323)
        at com.eviware.soapui.impl.wsdl.submit.transports.http.HttpClientRequestTransport.submitRequest(HttpClientRequestTransport.java:290)
        at com.eviware.soapui.impl.wsdl.submit.transports.http.HttpClientRequestTransport.sendRequest(HttpClientRequestTransport.java:220)
        at com.eviware.soapui.impl.wsdl.WsdlSubmit.run(WsdlSubmit.java:119)
        at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
        at java.util.concurrent.FutureTask.run(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
        at java.lang.Thread.run(Unknown Source)
Caused by: javax.net.ssl.SSLHandshakeException: Remote host closed connection during handshake
        at sun.security.ssl.SSLSocketImpl.readRecord(Unknown Source)
        at sun.security.ssl.SSLSocketImpl.performInitialHandshake(Unknown Source)
        at sun.security.ssl.SSLSocketImpl.writeRecord(Unknown Source)
        at sun.security.ssl.AppOutputStream.write(Unknown Source)
        at org.apache.http.impl.io.AbstractSessionOutputBuffer.flushBuffer(AbstractSessionOutputBuffer.java:131)
        at org.apache.http.impl.io.AbstractSessionOutputBuffer.write(AbstractSessionOutputBuffer.java:151)
        at org.apache.http.impl.conn.LoggingSessionOutputBuffer.write(LoggingSessionOutputBuffer.java:74)
        at org.apache.http.impl.io.ContentLengthOutputStream.write(ContentLengthOutputStream.java:114)
        at org.apache.http.impl.io.ContentLengthOutputStream.write(ContentLengthOutputStream.java:120)
        at org.apache.http.entity.ByteArrayEntity.writeTo(ByteArrayEntity.java:68)
        at org.apache.http.entity.HttpEntityWrapper.writeTo(HttpEntityWrapper.java:96)
        at org.apache.http.impl.client.EntityEnclosingRequestWrapper$EntityWrapper.writeTo(EntityEnclosingRequestWrapper.java:108)
        at org.apache.http.impl.entity.EntitySerializer.serialize(EntitySerializer.java:120)
        at org.apache.http.impl.AbstractHttpClientConnection.sendRequestEntity(AbstractHttpClientConnection.java:263)
        at org.apache.http.impl.conn.AbstractClientConnAdapter.sendRequestEntity(AbstractClientConnAdapter.java:227)
        at org.apache.http.protocol.HttpRequestExecutor.doSendRequest(HttpRequestExecutor.java:255)
        at com.eviware.soapui.impl.wsdl.support.http.HttpClientSupport$SoapUIHttpRequestExecutor.doSendRequest(HttpClientSupport.java:119)
        at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:123)
        ... 14 more
Caused by: java.io.EOFException: SSL peer shut down incorrectly
        at sun.security.ssl.InputRecord.read(Unknown Source)
        ... 32 more		

SSL handshake failures typically occur because the client and the server cannot agree on which version of the protocol to use.  In the case of a default Java 7 client and a server that supports TLSv1.2 and possibly TLSv1.1, there is no common ground for agreement because there are no shared, supported protocols.  At the beginning of the handshake process, the Java 7 client sends a “ClientHello” message with an indication that it is ready to support the TLSv1 protocol (or anything older such as SSLv3.0).  The server sees the request but has no choice other than to close the connection because it knows the client cannot support TLSv1.2 or TLSv1.1.

To work around this issue, supply the command line directive -Dhttps.protocols=TLSv1.2,TLSv1.1,TLSv1 when starting the Java VM; the equivalent system property directive in code is System.setProperty("https.protocols", "TLSv1.2,TLSv1.1,TLSv1").  If connecting to a TLS 1.1 or TLS 1.2 site from a JEE server environment, it will be necessary to add the command line directive to the server's startup scripts.  With WebLogic, for example, editing the Java startup properties in the setDomainEnv.sh script should get the job done.

With these directives in place the handshake process succeeds.  Initially, the Java 7 client sends a “ClientHello” message, but now it indicates that it can support the TLSv1.2 protocol.  The server responds, certificate information is sent, a cipher suite is agreed upon, and protected communications between the two parties can begin.
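As an illustrative sketch (the class and method names are invented), the same protocol selection can be made programmatically, either globally through the https.protocols system property or per socket through the JSSE API; no network connection is opened here:

```java
import javax.net.ssl.SSLSocket;
import javax.net.ssl.SSLSocketFactory;

public class EnableTls {

    // Restricts a freshly created (unconnected) SSL socket to TLSv1.2 and
    // returns the protocols that end up enabled on it.
    public static String[] enabledProtocols() throws Exception {
        // Equivalent of -Dhttps.protocols=TLSv1.2,TLSv1.1,TLSv1 for
        // HttpsURLConnection-based clients (TLSv1/TLSv1.1 listed for Java 7
        // parity; newer JDKs may disable those versions by default)
        System.setProperty("https.protocols", "TLSv1.2,TLSv1.1,TLSv1");

        // Code that works with raw sockets can enable protocols per socket instead
        SSLSocketFactory factory = (SSLSocketFactory) SSLSocketFactory.getDefault();
        try (SSLSocket socket = (SSLSocket) factory.createSocket()) {
            socket.setEnabledProtocols(new String[] { "TLSv1.2" });
            return socket.getEnabledProtocols();
        }
    }

    public static void main(String[] args) throws Exception {
        for (String protocol : enabledProtocols()) {
            System.out.println(protocol);
        }
    }
}
```

Note that the system property only affects HttpsURLConnection-based clients; libraries such as Apache HttpClient manage their own socket factories and may need the per-socket approach.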

To determine with certainty which version(s) of the TLS protocol are supported by a target web site, Qualys SSL Labs provides a free online service (https://www.ssllabs.com) that provides audit reports of SSL URL endpoints.  In addition to certificate details, supported cipher suite listings, and simulated handshake sequences with a variety of user agents (including Java 6, Java 7, and Java 8), the report has a section on enabled protocols for the site.
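The client-side counterpart of such a report — which protocol versions the local JVM can offer — can be read directly from the default SSLContext. A small sketch (the class name is invented):

```java
import javax.net.ssl.SSLContext;

public class LocalProtocols {

    // Lists the protocol versions the default SSLContext of this JVM supports.
    // "Supported" is a superset of "enabled by default", so this shows the
    // ceiling of what the JVM could negotiate if configured to do so.
    public static String[] supportedProtocols() throws Exception {
        return SSLContext.getDefault().getSupportedSSLParameters().getProtocols();
    }

    public static void main(String[] args) throws Exception {
        for (String protocol : supportedProtocols()) {
            System.out.println(protocol);
        }
    }
}
```

Comparing this list against the SSL Labs report for a target site quickly shows whether a handshake failure is a protocol mismatch.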

Not so coincidentally, repeated unsuccessful attempts to connect to a security-upgraded web site with a default Java 7 client were the catalyst for this research and writeup.  Here is the supported protocol summary from the SSL Labs online report for the problematic (at least for Java 7 configured with the defaults) site:

[Figure: SSL Labs report, supported protocols section]

Many endpoints protected by SSL/TLS are serving SOAP or REST web services.  The current release of Smart Bear Software's popular testing tool, SOAPUI, is built with Java 7, so it too needs special configuration when connecting to a TLS 1.1 or TLS 1.2 SOAP or REST endpoint.  One may think that adding the Java command line directive detailed above to the SOAPUI startup sequence would produce a successful handshake with the server.  Although seemingly a sound strategy, this does not lead to success, because SOAPUI has put in place its own command line configuration modifiers.  Adding -Dsoapui.https.protocols=TLSv1.2,TLSv1.1,TLSv1.0 to the JAVA_OPTS environment variable in the soapui.bat file will work, however.

Looking Ahead

With default support in Java 8 for TLS 1.2, the current release of the security protocol, it makes the most sense to run Java client applications with Java 8 when there is flexibility in choosing the Java version.  If running Java 7 is the only option, it will be necessary to modify Java startup parameters for any application whose communications target has both SSLv3 and TLS 1.0 disabled.  This non-default setup may become more of a necessity as more secure endpoints disable TLS 1.0 along with SSLv3.0.  At the same time, applications should steadily adopt Java 8, so the non-default configuration requirement for Java 7 should be short-lived.

Additional Resources

The following on-line resources were helpful in compiling this document:

Java SE 7 Security Enhancements Documentation:  http://docs.oracle.com/javase/7/docs/technotes/guides/security/enhancements-7.html

Java SE 8 Security Enhancements Documentation:  https://docs.oracle.com/javase/8/docs/technotes/guides/security/enhancements-8.html

Java Platform Group Blog: Diagnosing TLS, SSL, and HTTPS : https://blogs.oracle.com/java-platform-group/entry/diagnosing_tls_ssl_and_https

 

Integration Cloud Service (ICS) Security & Compliance


The attached white paper is the product of a joint A-Team effort that included Deepak Arora, Mike Muller, and Greg Mally.  Oracle Integration Cloud Service (ICS) runs within the Oracle Cloud where the architecture is designed to provide customers with a unified suite of Cloud Services with best-in-class performance, scalability, availability, and security. The Cloud Services are designed to run on a unified data center, hardware, software, and network architecture. This document is based on the Cloud Security Assessment section of the Security for Cloud Computing: 10 Steps to Ensure Success V2.0 document, which is produced by the Cloud Standards Customer Council where Oracle is a member.

For more details, see attached:

ICS Security and Compliance_v1.0

Java API for Integration Cloud Service


Introduction

Oracle ICS (Integration Cloud Service) provides a set of handy REST APIs that allow users to manage and monitor related artifacts such as connections, integrations, lookups and packages. They also allow the retrieval of monitoring metrics for further analysis. More details can be found in the product documentation.

The primary use case for these REST APIs is to allow command-line interactions to perform tasks such as gathering data about ICS integrations or backing up a set of integrations by exporting their contents. In order to interface with these REST APIs, users may adopt command-line utilities such as cURL. For instance, the command below shows how to retrieve information about a connection:

curl -u userName:password -H "Accept:application/json" -X GET https://your-ics-pod.integration.us2.oraclecloud.com/icsapis/v1/connections/connectionId

If the command above executes successfully, the output should be a JSON payload like the following:

{
   "links":{
      "@rel":"self",
      "@href":"https:// your-ics-pod.integration.us2.oraclecloud.com:443/icsapis/v1/connections/connectionId"
   },
   "connectionproperties":{
      "displayName":"WSDL URL",
      "hasAttachment":"false",
      "length":"0",
      "propertyGroup":"CONNECTION_PROPS",
      "propertyName":"targetWSDLURL",
      "propertyType":"URL_OR_FILE",
      "required":"true"
   },
   "securityproperties":[
      {
         "displayName":"Username",
         "hasAttachment":"false",
         "length":"0",
         "propertyDescription":"A username credential",
         "propertyGroup":"CREDENTIALS",
         "propertyName":"username",
         "propertyType":"STRING",
         "required":"true"
      },
      {
         "displayName":"Password",
         "hasAttachment":"false",
         "length":"0",
         "propertyDescription":"A password credential",
         "propertyGroup":"CREDENTIALS",
         "propertyName":"password",
         "propertyType":"PASSWORD",
         "required":"true"
      }
   ],
   "adaptertype":{
      "appTypeConnProperties":{
         "displayName":"WSDL URL",
         "hasAttachment":"false",
         "length":"0",
         "propertyGroup":"CONNECTION_PROPS",
         "propertyName":"targetWSDLURL",
         "propertyType":"URL_OR_FILE",
         "required":"true"
      },
      "appTypeCredProperties":[
         {
            "displayName":"Username",
            "hasAttachment":"false",
            "length":"0",
            "propertyDescription":"A username credential",
            "propertyGroup":"CREDENTIALS",
            "propertyName":"username",
            "propertyType":"STRING",
            "required":"true"
         },
         {
            "displayName":"Password",
            "hasAttachment":"false",
            "length":"0",
            "propertyDescription":"A password credential",
            "propertyGroup":"CREDENTIALS",
            "propertyName":"password",
            "propertyType":"PASSWORD",
            "required":"true"
         }
      ],
      "appTypeLargeIconUrl":"/images/soap/wssoap_92.png",
      "appTypeMediumGrayIconUrl":"/images/soap/wssoap_g_46.png",
      "appTypeMediumIconUrl":"/images/soap/wssoap_46.png",
      "appTypeMediumWhiteIconUrl":"/images/soap/wssoap_w_46.png",
      "appTypeName":"soap",
      "appTypeSmallIconUrl":"/images/soap/wssoap_32.png",
      "displayName":"SOAP",
      "features":"",
      "source":"PREINSTALLED",
      "supportedSecurityPolicies":"Basic Authentication, Username Password Token, No Security Policy"
   },
   "code":"connectionCode",
   "imageURL":"/images/soap/wssoap_w_46.png",
   "name":"Connection Name",
   "percentageComplete":"100",
   "securityPolicy":"USERNAME_PASSWORD_TOKEN",
   "status":"CONFIGURED",
   "supportsCache":"true"
}

These APIs were designed to return JSON payloads in most cases. However, some operations allow the result to be returned in XML. Users can control this by setting the "Accept" HTTP header on the request.
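As an illustration, the request above could be prepared in plain Java as follows; the pod URL, credentials, and class name are placeholders, and this sketch only builds the request without sending it:

```java
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Base64;

public class IcsRequestSetup {

    // Prepares (but does not send) a GET request for a connection resource,
    // asking for an XML response via the Accept header.
    public static HttpURLConnection prepare(String baseUrl, String user, String password)
            throws Exception {
        URL url = new URL(baseUrl + "/icsapis/v1/connections/connectionId");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        conn.setRequestProperty("Accept", "application/xml");

        // HTTP Basic authentication, the equivalent of curl's -u option
        String credentials = user + ":" + password;
        String encoded = Base64.getEncoder().encodeToString(credentials.getBytes("UTF-8"));
        conn.setRequestProperty("Authorization", "Basic " + encoded);
        return conn;
    }

    public static void main(String[] args) throws Exception {
        HttpURLConnection conn = prepare(
                "https://your-ics-pod.integration.us2.oraclecloud.com",
                "userName", "password");
        System.out.println(conn.getRequestProperty("Accept"));
    }
}
```

No network traffic occurs until an input stream is requested, so the request can be inspected before sending.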

Regardless of which format is chosen, users must still handle the payload to read the data: they need to write a program that retrieves the payload, parses it, and then works with the data. The same applies to invoking REST endpoints with path parameters or posting payloads to them. The end result is that a considerable amount of boilerplate code must be written, tested and maintained, no matter which programming language is chosen.

Aiming to make things easier, the Oracle A-Team developed a Java API to abstract the technical details about how to interact with the REST APIs. The result is a simple-to-use, very small JAR file that contains all you need to rapidly create applications that interact with ICS. Because the API is written in Java, it can be reused across a wide set of programming languages that can run on a JVM including Clojure, JavaScript, Groovy, Scala, Ruby, Python, and of course Java.

The Java API for ICS is provided free of charge to use "AS-IS", but without any official support from Oracle. Bugs, feedback and enhancement requests are welcome, but need to be submitted through the comments section of this blog, and the A-Team reserves the right to help on a best-effort basis only.

This blog will walk you through the steps required to use this Java API, providing code samples that demonstrate how to implement a number of common use cases.

Getting Started with the Java API for ICS

The first thing you need to do to start playing with the Java API for ICS is to download a copy of the library. You can get a free copy here. This library also depends on a few Open-Source libraries so you will need to download these as well. The necessary libraries are:

* FasterXML Jackson 2.0: The library uses this framework to handle JSON transformations back and forth to Java objects. You can download the libraries here. The necessary JAR files are: jackson-core, jackson-annotations and jackson-databind.

* Apache HTTP Components: Used to handle any HTTP interaction with the REST APIs for ICS. You can download the libraries here. The necessary JAR files are: http-core, http-client, http-mime and commons-codec and commons-logging.

It is important to remember that you must use JDK 1.6 or higher; older JDK versions won't work. Once all the libraries are on the classpath, you will be ready to get started.

Excuse me Sir – May I Have a Token?

The Java API for ICS was designed for the highest possible level of simplicity, so pretty much all operations are executed through a single object, called Token. In simple terms, a token gives you access to execute operations against your ICS pod. As you may expect, however, tokens are not freely accessible: in order to create one, your code needs to authenticate against ICS. The example below shows how to create a token.

import com.oracle.ateam.cloud.ics.javaapi.Token;
import com.oracle.ateam.cloud.ics.javaapi.TokenManager;

public class CreatingTokens {

   public static void main(String[] args) throws Exception {

      String serviceUrl = "https://your-ics-pod.integration.us2.oraclecloud.com";
      String userName = "yourUserName";
      String password = "yourPassword";

      Token token = TokenManager.createToken(serviceUrl, userName, password);
      System.out.println("Yeah... I was able to create a token: " + token);

   }

}

The parameters used to create a token are pretty straightforward, and you should be familiar with them for your ICS pod. When the createToken() method is executed, it tries to authenticate against the ICS pod named in the service URL. If for some reason the authentication fails, an exception is raised with the details. Otherwise, the token is created and returned to the caller.

A token is a very lightweight object that can be reused across your application. Thus, it is a good idea to cache it after its creation. Another important aspect of the token is that it is thread-safe. That means that multiple threads can simultaneously invoke its methods without concerns about locks of any kind.
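Since tokens are lightweight and thread-safe, a simple way to cache one is a lazily initialized, thread-safe holder. The sketch below is generic (it memoizes any factory call, shown here with a plain string) so that it compiles without the ICS library; with the library on the classpath, the supplier would wrap TokenManager.createToken(...), converting its checked exception as needed:

```java
import java.util.concurrent.atomic.AtomicReference;
import java.util.function.Supplier;

// Caches the result of an expensive factory call (e.g. creating an ICS token)
// so that all threads share a single instance after first use.
public class CachedHolder<T> {

    private final AtomicReference<T> cached = new AtomicReference<T>();
    private final Supplier<T> factory;

    public CachedHolder(Supplier<T> factory) {
        this.factory = factory;
    }

    public T get() {
        T value = cached.get();
        if (value == null) {
            value = factory.get();                    // may run twice under contention...
            if (!cached.compareAndSet(null, value)) { // ...but only one result is published
                value = cached.get();
            }
        }
        return value;
    }

    public static void main(String[] args) {
        CachedHolder<String> holder = new CachedHolder<String>(() -> "sample-token");
        System.out.println(holder.get() == holder.get()); // prints true: one shared instance
    }
}
```

Because the token itself is thread-safe, the holder can simply hand the same instance to every caller.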

Using a Token to Perform Operations against ICS

Once you have properly created a token, you can start writing code to retrieve data from ICS and/or invoke operations against it. The examples below will show various ways in which the Java API for ICS can be used.

Listing all the integrations; who created them and their status

Token token = TokenManager.createToken(serviceUrl, userName, password);
List<Integration> integrations = token.retrieveIntegrations();

for (Integration integration : integrations) {

   System.out.println(integration.getName() + ": Created by '" +
   integration.getCreatedBy() + "' and it is currently " +
   integration.getStatus());

}

Showing source and target connections of one specific integration

Token token = TokenManager.createToken(serviceUrl, userName, password);
Integration integration = token.retrieveIntegration(integrationId, integrationVersion);
		
System.out.println("Integration: " + integration.getName());
Connection sourceConnection = integration.getSourceConnection();
Connection targetConnection = integration.getTargetConnection();
		
System.out.println("   Source Connection: " + sourceConnection.getName() +
         " (" + sourceConnection.getAdapterType().getDisplayName() + ")");
System.out.println("   Target Connection: " + targetConnection.getName() +
         " (" + targetConnection.getAdapterType().getDisplayName() + ")");

Exporting all integrations that are currently active

final String BACKUP_FOLDER = "/home/rferreira/ics/backup/";
Token token = TokenManager.createToken(serviceUrl, userName, password);
List<Integration> integrations = token.retrieveIntegrations();
		
for (Integration integration : integrations) {
						
   if (integration.getStatus().equals("ACTIVATED")) {
				
      Status status = integration.export(BACKUP_FOLDER + integration.getCode());
      System.out.println(integration.getCode() + " = " + status.getStatusInfo());
				
   }
			
}

Alternatively, if you are using JDK 1.8 then you could rewrite the entire for-each code using Lambdas:

integrations.parallelStream()
   .filter(i -> i.getStatus().equals("ACTIVATED"))
   .forEach((i) -> {
				
      String fileName = BACKUP_FOLDER + i.getCode();
      System.out.println(i.export(fileName).getStatusInfo());
			
   });

Printing monitoring metrics of one specific integration

Token token = TokenManager.createToken(serviceUrl, userName, password);
MonitoringMetrics monitoringMetrics = token.retrieveMonitoringMetrics(integrationId, integrationVersion);
		
System.out.println("Flow Name: " + monitoringMetrics.getFlowName());
System.out.println("   Messages Received...: " + monitoringMetrics.getNoOfMsgsReceived());
System.out.println("   Messages Processed..: " + monitoringMetrics.getNoOfMsgsProcessed());
System.out.println("   Number Of Errors....: " + monitoringMetrics.getNoOfErrors());
System.out.println("   Errors in Queues....: " +
   monitoringMetrics.getErrorsInQueues().getErrorObjects().size());
System.out.println("   Success Rate........: " + monitoringMetrics.getSuccessRate());
System.out.println("   Avg Response Time...: " + monitoringMetrics.getAvgRespTime());
System.out.println("   Last Updated By.....: " + monitoringMetrics.getLastUpdatedBy());

Deleting all connections that are currently incomplete

Token token = TokenManager.createToken(serviceUrl, userName, password);
List<Connection> connections = token.retrieveConnections();
		
for (Connection connection : connections) {
			
   if (Integer.parseInt(connection.getPercentageComplete()) < 100) {
				
      connection.delete();
				
   }
			
}

Deactivating all integrations whose name begins with “POC_”

Token token = TokenManager.createToken(serviceUrl, userName, password);
List<Integration> integrations = token.retrieveIntegrations();
		
for (Integration integration : integrations) {
			
   if (integration.getName().startsWith("POC_")) {
				
      System.out.println(integration.deactivate());
				
   }
			
}

Listing all packages and their integrations (using JDK 1.8 lambdas)

Token token = TokenManager.createToken(serviceUrl, userName, password);
List<com.oracle.ateam.cloud.ics.javaapi.types.Package> pkgs = token.retrievePackages();
		
pkgs.forEach((p) -> {
			
   System.out.println("Package Name: " + p.getPackageName());
   p.getPackageContent().forEach(pc -> System.out.println(pc.getName()));
		
});

Importing integrations into ICS from the previously exported archive

Token token = TokenManager.createToken(serviceUrl, userName, password);
Status status = token.importIntegration("/home/rferreira/ics/backup/myInteg.iar", false);
System.out.println(status.getStatusInfo());

Tip: the boolean parameter in the importIntegration() method controls whether the integration should be replaced. If the parameter is set to true, the import overrides any existing integration that has the same name. If it is set to false, the import assumes that the integration does not yet exist in ICS and creates it.

Alternatively, you can import a complete set of integrations at once by importing a package:

Status status = token.importPackage("/home/rferreira/ics/backup/samplePackage.par");
System.out.println(status.getStatusInfo());

Conclusion

ICS is a powerful iPaaS solution offered by Oracle that provides a robust set of management capabilities. Along with its development console, it also provides a set of REST APIs that enable the creation of custom apps that can fetch data from ICS. Although those REST APIs are useful, developers often need more productivity while writing their code. This blog introduced the Java API for ICS, a simple-to-use library that abstracts the technical details of the REST APIs. We provided details on how to download, configure, and use the library, and code samples were provided to demonstrate how to use the Java API to access a range of common ICS management functions.

Oracle Service Cloud – Outbound Integration Approaches


Introduction

This blog is part of the series the A-Team has been running on Oracle Service Cloud (Rightnow).

In the previous blogs we went through various options for importing data into Service Cloud. In this article I will first describe two main ways of subscribing to outbound events as data is created, updated, or deleted in Rightnow. These notifications are delivered in real time and are meant only for real-time or online use cases.
Secondly, I will briefly discuss a few options for bulk data export.

This blog is organized as follows:

  1. Event Notification Service (ENS) – The recently introduced mechanism for receiving outbound events
    • a. Common Setup Required – for using ENS
    • b. Registering a Generic Subscriber with ENS
    • c. Using Integration Cloud Service – the automated way of subscribing to ENS
  2. Rightnow Custom Process Model(CPM) – The more generic, PHP-cURL based outbound invocation mechanism
  3. Bulk Export
    • a. Rightnow Object Query Language (ROQL) and ROQL based providers
    • b. Rightnow Analytics Reports
    • c. Third-party providers

1. The Event Notification Service

Since the May 2015 release, Rightnow has offered a feature called the Event Notification Service, documented here.
This service currently allows any external application to subscribe to Create/Update/Delete events for Contact, Incident and Organization objects in Service Cloud. More objects and features may be added in upcoming releases.

I will now demonstrate how to make use of this service to receive events. Essentially there are two ways: using the Notification Service as is (the generic approach), or via Integration Cloud Service (ICS).

a. Common Setup

In order to receive event notifications the following steps have to be completed in the Rightnow Agent Desktop. These steps need to be completed for both generic as well as the ICS approaches below.

  1. In the Agent Desktop go to Configuration -> Site Configuration -> Configuration Settings. On the Search page that comes up, in the 'Configuration Base' section select 'Site' and click Search.
  2. In the ‘Key’ field enter ‘EVENT%’ and click Search.
  3. Set the following keys:
    • EVENT_NOTIFICATION_ENABLED – Set it to ‘Yes’ for the Site. This is the global setting that enables ENS.
    • EVENT_NOTIFICATION_MAPI_USERNAME – Enter a valid Service Cloud username.
    • EVENT_NOTIFICATION_MAPI_PASSWORD – Enter the corresponding password.
    • EVENT_NOTIFICATION_MAPI_SEC_IP_RANGE – This can be used for specifying whitelisted subscriber IP Addresses. All IPs are accepted if kept blank.
    • EVENT_NOTIFICATION_SUBSCRIBER_USERNAME – Enter the Subscriber service’s username. ENS sends these credentials as part of the outgoing notification, in the form of a WS-Security Username-Password token.
    • EVENT_NOTIFICATION_SUBSCRIBER_PASSWORD – Enter the password.

(Screenshot: Event Notification configuration settings in the Agent Desktop)

b. Registering a Generic Subscriber

Now that event notifications have been enabled, we need to create a subscriber and register it. The subscriber endpoint should be reachable from Rightnow; in most cases any publicly available endpoint will do.

For the purpose of this blog I defined a generic subscriber by creating a Node.js based Cloud9 endpoint accessible at https://test2-ashishksingh.c9users.io/api/test . It’s a dummy endpoint that accepts any HTTP POST and prints the body on the Cloud9 terminal. It doesn’t require any authentication either.

In order to register this endpoint, the following steps must be followed:

  1. Rightnow manages subscriptions by using an object called ‘EventSubscription’. By instantiating this object an ‘endpoint’ can be registered as a subscriber, to listen to an object(Contact/Organization/Incident) for a particular operation(Create/Update/Delete). The object also tracks username/password to be sent out to the endpoint as part of the notification.
  2. In order to create an EventSubscription object the usual Connect Web Services Create operation can be used. Below is a sample XML request payload for the Create operation, that registers a Contact Update event to the Cloud9 endpoint.
  3. <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:v1="urn:messages.ws.rightnow.com/v1_3" xmlns:v11="urn:base.ws.rightnow.com/v1_3">
       <soapenv:Body>
          <v1:Create>
             <v1:RNObjects xmlns:ns4="urn:objects.ws.rightnow.com/v1_3" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:type="ns4:EventSubscription"> <!--specify the subscription object-->
    			<ns4:EndPoint>https://test2-ashishksingh.c9users.io/api/test</ns4:EndPoint> <!--endpoint info-->
    			<ns4:EventType>
    				<ID id="2" xmlns="urn:base.ws.rightnow.com/v1_3" /> <!--1=Create,2=Update,3=Delete-->
    			</ns4:EventType>
    			<ns4:IntegrationUser>
    				<ID id="1" xmlns="urn:base.ws.rightnow.com/v1_3" /> <!--1 = the seeded SUSCRIBER_USERNAME and PWD above-->
    			</ns4:IntegrationUser>
    			<ns4:Name>TestContactSubscription</ns4:Name>  <!--Name of the subscription-->
    			<ns4:ObjectShape xsi:type="Contact"/>   <!--Name of the object to subscribe-->
    			<ns4:Status>
    				<ID id="1" xmlns="urn:base.ws.rightnow.com/v1_3" /> <!--1=Active,2=Paused,3=Inactive-->
    			</ns4:Status>
             </v1:RNObjects>
          </v1:Create>
       </soapenv:Body>
    </soapenv:Envelope>

     
    Note: The OWSM security policy username_token_over_ssl_client_policy can be used to invoke the web service, passing valid Rightnow credentials. However, the SOAP Security header must not contain a Timestamp element; Rightnow will discard any request whose SOAP header contains one.

  4. That’s it. The endpoint is now registered, and whenever a contact is updated, Rightnow will invoke the registered endpoint with the details. The message sent out is an XML SOAP message that contains object/event details and conforms to the Rightnow Event WSDL available at https:///cgi-bin/.cfg/services/soap?wsdl=event . This message also contains the SUBSCRIBER_USERNAME/PWD in the SOAP header, in the form of a WS-Security UsernameToken. For now our Cloud9 endpoint doesn’t validate the UsernameToken.
  5. In order to test, let’s update a Contact in Agent Desktop
  6. (Screenshot: updating a Contact in the Agent Desktop)

  7. Voila! We see the corresponding EventNotification XML message in the Cloud9 console.
  8. (Screenshot: EventNotification XML message in the Cloud9 console)

    For reference I have attached the formatted XML message here.
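For subscribers that do want to validate the credentials, the WS-Security UsernameToken can be pulled out of the SOAP header with the JDK's built-in XML APIs. The sketch below uses an XPath that matches on local names so no namespace context is needed; the sample envelope in main is an invented minimal example, not the exact Rightnow payload:

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.xml.sax.InputSource;

public class UsernameTokenReader {

    // Extracts the wsse:Username value from a SOAP envelope string.
    public static String extractUsername(String soapEnvelope) throws Exception {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        dbf.setNamespaceAware(true);
        Document doc = dbf.newDocumentBuilder()
                .parse(new InputSource(new StringReader(soapEnvelope)));
        // Match on local name so the wsse prefix doesn't need to be bound
        return XPathFactory.newInstance().newXPath()
                .evaluate("//*[local-name()='UsernameToken']/*[local-name()='Username']", doc);
    }

    public static void main(String[] args) throws Exception {
        // Invented minimal envelope for illustration only
        String sample =
            "<soap:Envelope xmlns:soap='http://schemas.xmlsoap.org/soap/envelope/'>"
          + "<soap:Header><wsse:Security xmlns:wsse='http://docs.oasis-open.org/wss/"
          + "2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd'>"
          + "<wsse:UsernameToken><wsse:Username>icsSubscriber</wsse:Username>"
          + "<wsse:Password>secret</wsse:Password></wsse:UsernameToken>"
          + "</wsse:Security></soap:Header><soap:Body/></soap:Envelope>";
        System.out.println(extractUsername(sample));
    }
}
```

The extracted username and password would then be compared against the EVENT_NOTIFICATION_SUBSCRIBER_USERNAME/PWD values configured above before the notification is accepted.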

c. Using ICS Service Cloud Adapter

The Oracle Integration Cloud Service (ICS), the tool of choice for SaaS integrations, automates all of the steps in section 1.b above into a simple GUI-based integration definition.
Below are the steps for receiving Rightnow events in ICS. It is assumed that the reader is familiar with ICS and knows how to use it.
Please note that the steps in section 1.a still need to be followed, and this time the SUBSCRIBER_USERNAME/PWD configuration settings should be the ICS account’s username/password.

  1. Create and save an Oracle Rightnow connection in ICS.
  2. Create an Integration by the name ‘receive_contacts’. For this blog I chose the ‘Publish to ICS’ integration type.
  3. Open the integration and drag the Rightnow connection onto the source side. Name the endpoint and click ‘Next’.
  4. On the ‘Request’ page select ‘Event Subscription’, and select the desired event. Click ‘Next’.
  5. On the ‘Response’ page select ‘None’, although you could select a callback response if the use case required one. Click ‘Next’.
  6. Click ‘Done’. Complete the rest of the integration and activate it.
  7. During activation ICS creates an endpoint and registers it as an EventSubscription object, as described in 1.2 above. All of that happens in the background, providing a seamless experience to the user.
  8. If a Contact is now updated in Agent Desktop, we receive it in ICS.

2. Rightnow Custom Process Model

As discussed above, the Event Notification Service supports only Contact, Organization and Incident objects. But sometimes use-cases may require Custom Objects or other Connect Common Objects. In such cases Service Cloud’s Custom Process Model feature can be used for outbound events. I will now describe how to use them.

First, a few key terms:

  • Object Event Handler : A PHP code snippet that is executed whenever Create/Update/Delete events occur in the specified Rightnow objects. The snippet is used to invoke external endpoints using the cURL library.
  • Process Designer / Custom Process Model (CPM) : A component of the Rightnow Agent Desktop that is used to configure Object Event Handlers.

Below are the steps :

  1. Using any text editor, create a file called ContactHandler.php (or any other name) with the following code. The code basically defines a Contact create/update handler, loads the PHP cURL module and invokes a web service I wrote using Oracle BPEL. I have provided explanation at various places in the code as ‘[Note] :’
  2. <?php
    /**
     * CPMObjectEventHandler: ContactHandler // [Note] : Name of the file.
     * Package: RN
     * Objects: Contact // [Note] : Name of the object.
     * Actions: Create, Update // [Note] : Name of the operations on the object above for which the PHP code will be executed
     * Version: 1.2 // [Note] : Version of the Rightnow PHP API
     * Purpose: CPM handler for contact create and update. It invokes a web service.
     */
    use \RightNow\Connect\v1_2 as RNCPHP;
    use \RightNow\CPM\v1 as RNCPM; 
    /**
     * [Note] : Below is the main code, defining the handler class for the CPM . Like java, the class name should match the file name, and it implements the ObjectEventHandler class. The 'use' statements above define aliases for the \RightNow\Connect\v1_2 'package' .
     */
    class ContactHandler implements RNCPM\ObjectEventHandler
    {
        /**
         * Apply CPM logic to object.
         * @param int $runMode
         * @param int $action
         * @param object $contact
         * @param int $cycle
         */
    // [Note] : Below is the actual function that gets executed on Contact Create/Update.
        public static function apply($runMode, $action, $contact, $cycle)
        {
            if($cycle !== 0) return ;
    		// [Note] : The snippet below declares the URL and the XML Payload to be invoked
                $url = "http://10.245.56.67:10613/soa-infra/services/default/RnContact/bpelprocess1_client_ep?WSDL" ;
                $xml = '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
            <soap:Header>
                    <wsse:Security xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd" mustUnderstand="1">
                <wsse:UsernameToken>
                    <wsse:Username>HIDDEN</wsse:Username>
                    <wsse:Password Type="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-username-token-profile-1.0#PasswordText">HIDDEN</wsse:Password>
                </wsse:UsernameToken>
            </wsse:Security>
            </soap:Header>
            <soap:Body>
                    <ns1:process xmlns:ns1="http://xmlns.oracle.com/Application6/RnContact/BPELProcess1">
                            <ns1:input>'.$contact->Name->First.' '.$contact->Name->Last .'</ns1:input>
            </ns1:process>
        </soap:Body>
    </soap:Envelope>' ;
      
    
                $header[0]= "Content-Type: text/xml;charset=UTF-8";
                $header[1]= 'SOAPAction: "process"';
    			
    			// [Note] :The invocation requires and makes use of the cURL module.
                load_curl();
                $curl = curl_init();
                curl_setopt_array($curl,array(
                  CURLOPT_URL => $url,            
                  CURLOPT_HEADER => 0,
                  CURLOPT_HTTPHEADER => $header,  
                  CURLOPT_FOLLOWLOCATION => 1, 
                  CURLOPT_RETURNTRANSFER => 1,
                  CURLOPT_CONNECTTIMEOUT => 20,
                  CURLOPT_SSL_VERIFYPEER => 0,
                  CURLOPT_SSL_VERIFYHOST => 0,
     
                ));
                curl_setopt($curl,CURLOPT_POST,TRUE);
                curl_setopt($curl,CURLOPT_POSTFIELDS, $xml);
                $content = curl_exec($curl);
                curl_close($curl); // [Note] : Release the cURL handle.
        }
    }
    /**
     * CPM test harness
     */
    // [Note] : These are unit test functions, needed by the RN PHP framework.
    class ContactHandler_TestHarness
            implements RNCPM\ObjectEventHandler_TestHarness
    {
        static $contactOneId = null;
        static $contactTwoId = null;
        /**
         * Set up test cases.
         */
        public static function setup()
        {
            // First test
            $contactOne = new RNCPHP\Contact;
            $contactOne->Name->First = "First";
            $contactOne->save();
            self::$contactOneId = $contactOne->ID;
            // Second test
            $contactTwo = new RNCPHP\Contact;
            $contactTwo->Name->First = "Second";
            $contactTwo->save();
            self::$contactTwoId = $contactTwo->ID;
        }
        /**
         * Return the object that we want to test with. You could also return
         * an array of objects to test more than one variation of an object.
         * @param int $action
         * @param class $object_type
         * @return object | array
         */
        public static function fetchObject($action, $object_type)
        {
            $contactOne = $object_type::fetch(self::$contactOneId);
            $contactTwo = $object_type::fetch(self::$contactTwoId);
            return array($contactOne, $contactTwo);
        }
        /**
         * Validate test cases
         * @param int $action
         * @param object $contact
         * @return bool
         */
        public static function validate($action, $contact)
        {
            echo "Test Passed!!";
            return true;
        }
        /**
         * Destroy every object created by this test. Not necessary since in
         * test mode and nothing is committed, but good practice if only to
         * document the side effects of this test.
         */
        public static function cleanup()
        {
            if (self::$contactOneId)
            {
                $contactOne = RNCPHP\Contact::fetch(self::$contactOneId);
                $contactOne->destroy();
                self::$contactOneId = null;
            }
            if (self::$contactTwoId)
            {
                $contactTwo = RNCPHP\Contact::fetch(self::$contactTwoId);
                $contactTwo->destroy();
                self::$contactTwoId = null;
            }
        }
    }
    ?>
  3. Log on to Agent Desktop. Click on Configuration -> Site Configuration -> Process Designer, and click ‘New’.
  4. Upload the ContactHandler.php file and check the ‘Execute Asynchronously’ checkbox; the cURL module (loaded via load_curl()) is available for asynchronous CPMs only.
  5. Click ‘Save’ on the Home Ribbon, and then click the ‘Test’ button. Clicking ‘Test’ executes the test-harness code, including the ‘validate’ function. Make sure it executes fine and that the output looks OK.
  6. Click ‘OK’, then ‘Yes’, and then ‘Save’ again. Now go to the Contact object under OracleServiceCloud and assign the newly created ContactHandler to the Create and Update events. Then ‘Save’ again.
  7. Now click ‘Deploy’ on the Ribbon to upload and activate all the changes to the RN server.
  8. In order to test, create a new contact called ‘John Doe’ in Service Cloud, and the BPEL process gets instantiated.

This ends our discussion on configuring and consuming outbound real-time events. Before moving on to bulk data export, note that Rightnow event subscribers and CPMs are inherently transient: durable subscriptions are not available, although for error scenarios Rightnow does have a retry mechanism with exponential back-off.
If durability is a key requirement, the subscriber must be made highly available and durability must be built into the subscriber design, for example by persisting messages in a queue immediately upon receiving them.
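The "persist first, process later" idea above can be sketched as follows. This is an illustrative pattern, not Rightnow-specific code; the journal file name and record format are assumptions, and a production design would typically use a real message queue rather than a local file:

```python
# Sketch: durably journal each incoming event before acknowledging it,
# so a crash after the ack cannot lose the message.
import json
import os
import time

JOURNAL = "events.journal"  # illustrative; a real system would use a queue

def persist_event(payload: str) -> dict:
    """Append the raw event to a durable journal and return the stored record."""
    record = {"received_at": time.time(), "payload": payload}
    with open(JOURNAL, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
        f.flush()
        os.fsync(f.fileno())  # force the write to disk before the subscriber acks
    return record

def drain_journal() -> list:
    """Read back journaled events for downstream processing."""
    if not os.path.exists(JOURNAL):
        return []
    with open(JOURNAL, encoding="utf-8") as f:
        return [json.loads(line) for line in f if line.strip()]
```

The subscriber would call persist_event() on the raw SOAP body and only then return HTTP 200, decoupling receipt from processing.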

3. Bulk Export

So far we have discussed various ways of receiving real-time events/notifications from Rightnow. These can be used for online integration scenarios, but not for bulk-export use-cases.
We’ll now discuss a few options for bulk export:

a. ROQL

ROQL, or Rightnow Object Query Language, is the simplest tool for extracting data, using SQL-like queries against Rightnow.
ROQL can be executed using Connect Web Services (SOAP), Connect REST Services, and Connect PHP Services.

ROQL comes in two flavors, Object Query and Tabular Query:

  • Object Query : A query that returns Rightnow objects in the response. This is the simpler form of query, available in the SOAP API as the QueryObjects operation, or in the REST API via the ?q= URL parameter.
  • Tabular Query : Tabular queries are more advanced queries, which support clauses such as ‘ORDER BY’ and USE, aggregate functions, a limit on returned items, pagination, etc. They are available in the SOAP API as the queryCSV operation, or in the REST API as the queryResults resource.

Between the two, Tabular Query is the more efficient way of extracting data, as it returns the required dataset in a single database query. Two great resources to get started on tabular queries are the A-Team blogs here and here. They explain how to use SOAP- and REST-based tabular queries to extract data from Service Cloud and import it into Oracle BI Cloud Service.
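As a rough sketch of the REST flavor, a tabular query can be issued against the queryResults resource over HTTP Basic authentication. The site URL, credentials, and API version below are placeholders, and the HTTP opener is injectable so the function can be exercised without a live site:

```python
# Sketch: run a ROQL tabular query via the Connect REST queryResults resource.
import base64
import json
import urllib.parse
import urllib.request

def run_tabular_query(site, user, password, roql, opener=urllib.request.urlopen):
    """Execute a ROQL tabular query; 'opener' is injectable for testing."""
    url = (f"{site}/services/rest/connect/v1.3/queryResults?"
           + urllib.parse.urlencode({"query": roql}))
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(url, headers={
        "Authorization": f"Basic {token}",
        "OSvC-CREST-Application-Context": "Bulk contact export",
    })
    with opener(req) as resp:
        data = json.load(resp)
    # Each item is one result table: its column names plus a list of rows.
    return [(t["columnNames"], t["rows"]) for t in data.get("items", [])]

# Example call (placeholders):
# tables = run_tabular_query("https://yoursite.custhelp.com",
#                            "integration_user", "password",
#                            "SELECT C.ID, C.Name.First FROM Contact C LIMIT 100")
```

For large extracts, pagination is driven from the ROQL itself (LIMIT/OFFSET), looping until fewer rows than the limit are returned.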

b. Analytics Report

For more advanced querying needs, Rightnow Analytics Reports can be defined in the Agent Desktop and executed using the SOAP RunAnalyticsReport operation or the REST analyticsReportResults resource.
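A hedged sketch of the REST route: POST the report id to the analyticsReportResults resource. The site, credentials, and report id below are placeholders, and the HTTP opener is injectable for testing:

```python
# Sketch: execute a predefined Analytics Report via the REST
# analyticsReportResults resource (site/credentials/report id are placeholders).
import base64
import json
import urllib.request

def run_analytics_report(site, user, password, report_id,
                         opener=urllib.request.urlopen):
    """Run a saved Analytics Report and return (column names, rows)."""
    url = f"{site}/services/rest/connect/v1.3/analyticsReportResults"
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    body = json.dumps({"id": report_id}).encode()
    req = urllib.request.Request(url, data=body, headers={
        "Authorization": f"Basic {token}",
        "Content-Type": "application/json",
        "OSvC-CREST-Application-Context": "Run report",
    })
    with opener(req) as resp:
        data = json.load(resp)
    return data.get("columnNames", []), data.get("rows", [])
```

The report's filters and row limits are defined in the Agent Desktop report itself, so the caller only needs its id.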

c. Third Party Providers

A number of third-party providers, including the Progress ODBC and JDBC drivers, also allow bulk extraction of Rightnow data. These providers internally use the same ROQL-based approach, but provide a higher level of abstraction by automating pagination and other needs.

Conclusion

In this blog we looked at a couple of ways to receive outbound events from Service Cloud, and how ICS can be used to seamlessly receive the events in a UI-driven fashion.
We also saw how PHP-based Rightnow Custom processes can be used as object triggers.
Finally, we saw a few options available for bulk data export from Rightnow, using ROQL, Rightnow Analytics and third-party providers.
