In this post we will create a small BPMN process that queries an order with a specific status from Salesforce using the Salesforce Adapter. Some of the order information is then transmitted to SAP using the SAP Adapter.
To use the adapters and build the process, some preparation must be done in advance.
1. Obtain Security Token and Salesforce artefacts
In the next steps you have to provide information and artefacts from the Salesforce system:
- Salesforce User
- Salesforce Password
- Salesforce Security Token
- Salesforce certificate
- Salesforce enterprise.wsdl
Details on how to obtain this information and these artefacts can be found in the Oracle Salesforce Adapter documentation.
2. Configure the Salesforce Adapter for runtime
To be able to connect to the Salesforce system at runtime, the following steps are necessary.
Create a credential store key
Because the key we create here will be referenced later during the design-time configuration of the Salesforce Adapter, it is mandatory to do this step in advance.
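Normally you create this key via Enterprise Manager or WLST as described in the documentation. Purely to illustrate what gets stored, here is a small, hedged Java sketch using the OPSS credential store API; the map name SFDC, the key name SFDC_CS_KEY and the credentials are placeholders, and the exact API should be verified against your Fusion Middleware version.

import oracle.security.jps.JpsContext;
import oracle.security.jps.JpsContextFactory;
import oracle.security.jps.service.credstore.CredentialFactory;
import oracle.security.jps.service.credstore.CredentialStore;
import oracle.security.jps.service.credstore.PasswordCredential;

public class CreateSalesforceCredential {
    public static void main(String[] args) throws Exception {
        // Look up the credential store of the current JPS context (runs inside the server JVM).
        JpsContext ctx = JpsContextFactory.getContextFactory().getContext();
        CredentialStore store = ctx.getServiceInstance(CredentialStore.class);

        // Map and key names are placeholders - they must match what you later
        // select in the Salesforce Adapter wizard (design time, step 7).
        PasswordCredential credential =
            CredentialFactory.newPasswordCredential("my.user@example.com", "myPassword".toCharArray());
        store.setCredential("SFDC", "SFDC_CS_KEY", credential);
    }
}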
Add the Salesforce certificate to the key store
In order to connect to Salesforce at runtime, the Salesforce certificate must be added to the WebLogic keystore. In most environments this is straightforward and it is described in detail in the adapter documentation. Using the integrated WebLogic server in my development environment, I had to spend quite some effort to get this part up and running. As mentioned earlier, this will be described in an upcoming post.
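The import itself is usually done with keytool against the WebLogic trust store. Purely as an illustration of what this step does, here is a hedged standalone Java sketch that adds a downloaded Salesforce certificate to a JKS keystore; all paths, the alias and the password are placeholders.

import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.security.KeyStore;
import java.security.cert.Certificate;
import java.security.cert.CertificateFactory;

public class ImportSalesforceCert {
    public static void main(String[] args) throws Exception {
        char[] storePass = "changeit".toCharArray();   // placeholder password

        // Load the existing trust store (path is a placeholder for your WebLogic trust store).
        KeyStore keyStore = KeyStore.getInstance("JKS");
        try (FileInputStream in = new FileInputStream("/path/to/truststore.jks")) {
            keyStore.load(in, storePass);
        }

        // Read the certificate downloaded from Salesforce (see step 1).
        Certificate cert;
        try (FileInputStream certIn = new FileInputStream("/path/to/salesforce.cer")) {
            cert = CertificateFactory.getInstance("X.509").generateCertificate(certIn);
        }

        // Add it under an alias and write the keystore back.
        keyStore.setCertificateEntry("salesforce", cert);
        try (FileOutputStream out = new FileOutputStream("/path/to/truststore.jks")) {
            keyStore.store(out, storePass);
        }
    }
}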
More details concerning the configuration can be found in the Oracle Salesforce Adapter documentation
3. Determine your Salesforce query
Since we want to query data from Salesforce, we have to provide a Salesforce-compliant query during the adapter configuration step. It therefore makes sense to test this query first, directly in the Salesforce environment, using the Salesforce Developer Console.
In my case I will use the following query:
select id, OrderNumber, totalamount from order where SAP_Status__c = ''
The field SAP_Status__c is a custom field I added to the Order object (all custom fields in Salesforce carry the suffix __c).
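If you want to double-check the query outside the Developer Console as well, a small, hedged sketch like the following runs it via the Salesforce SOAP API. It assumes the stubs generated from the enterprise.wsdl (package com.sforce.soap.enterprise) together with the Salesforce WSC library, and it uses placeholder credentials; remember that the API password is the concatenation of your password and the security token from step 1.

import com.sforce.soap.enterprise.Connector;
import com.sforce.soap.enterprise.EnterpriseConnection;
import com.sforce.soap.enterprise.QueryResult;
import com.sforce.soap.enterprise.sobject.Order;
import com.sforce.soap.enterprise.sobject.SObject;
import com.sforce.ws.ConnectorConfig;

public class TestOrderQuery {
    public static void main(String[] args) throws Exception {
        // Placeholder credentials; the API password is <password> + <security token>.
        ConnectorConfig config = new ConnectorConfig();
        config.setUsername("my.user@example.com");
        config.setPassword("myPassword" + "mySecurityToken");

        EnterpriseConnection connection = Connector.newConnection(config);

        // The same SOQL statement we will later paste into the adapter wizard.
        QueryResult result = connection.query(
            "select id, OrderNumber, totalamount from order where SAP_Status__c = ''");
        for (SObject record : result.getRecords()) {
            Order order = (Order) record;
            System.out.println(order.getOrderNumber() + " -> " + order.getTotalAmount());
        }
    }
}

If this returns the expected orders, the same query string can be used unchanged in the Salesforce Adapter wizard in step 7.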
4. Configure the SAP Adapter for runtime
The SAP Adapter runtime configuration will be referenced in the design-time configuration via a JNDI name. Although the runtime settings will not be checked during design time, we will nevertheless provide the information in advance so that we can test our process directly after deployment.
The settings can be applied via the WebLogic console (we will use the existing connection factory FMWDEMO).
Now we have to provide at least the following DestinationDataProvider properties:
- JCO_ASHOST: IP address of the SAP server
- JCO_CLIENT: SAP Client (e.g. 001)
- JCO_LANG: Language (e.g. EN)
- JCO_PASSWD: Password to connect to SAP
- JCO_SYSNR: SAP System number (e.g. 00)
- JCO_USER: Username to connect to SAP
After changing this information, the SAP Adapter deployment must be updated.
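These are the same properties the SAP Java Connector (JCo) itself expects. Purely as a hedged sanity check outside WebLogic (not part of the adapter configuration), a standalone JCo test program could use them as follows; it assumes sapjco3 is on the classpath (see step 6) and all connection values are placeholders.

import java.io.FileOutputStream;
import java.util.Properties;

import com.sap.conn.jco.JCoDestination;
import com.sap.conn.jco.JCoDestinationManager;
import com.sap.conn.jco.ext.DestinationDataProvider;

public class TestSapDestination {
    public static void main(String[] args) throws Exception {
        // The same values you enter in the WebLogic console (all placeholders here).
        Properties props = new Properties();
        props.setProperty(DestinationDataProvider.JCO_ASHOST, "10.0.0.1");
        props.setProperty(DestinationDataProvider.JCO_SYSNR,  "00");
        props.setProperty(DestinationDataProvider.JCO_CLIENT, "001");
        props.setProperty(DestinationDataProvider.JCO_USER,   "myUser");
        props.setProperty(DestinationDataProvider.JCO_PASSWD, "myPassword");
        props.setProperty(DestinationDataProvider.JCO_LANG,   "EN");

        // JCo standalone picks up <name>.jcoDestination files from the working directory.
        // Writing the password to a plain file is only acceptable for a quick local test.
        try (FileOutputStream out = new FileOutputStream("FMWDEMO.jcoDestination")) {
            props.store(out, "test destination");
        }

        // Throws a JCoException if host, system number or credentials are wrong.
        JCoDestination destination = JCoDestinationManager.getDestination("FMWDEMO");
        destination.ping();
        System.out.println("Connection to " + destination.getApplicationServerHost() + " is working");
    }
}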
5. Select the SAP object to store the order information
If you use an operational SAP system, you can most likely use existing BAPIs or RFCs (Remote Function Calls) to store the order information. In my case I had to use an SAP BW system, which doesn't contain any operational objects or tables.
In this case the preparatory steps need some effort on the SAP side. I had to do the following steps in SAP:
- Create a custom table
- Create a function module (FM) that is exposed as an RFC
- Create a small ABAP routine in the FM that receives the parameters and stores them in the custom table
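The interface of the function module is of course specific to my system, but to give an idea of the shape: once the FM (in my case Z_CREATE_ORDER, used again in step 7) is RFC-enabled, it can be called from Java via JCo roughly as sketched below. The import parameter names I_ORDER_ID, I_ORDER_NUMBER and I_TOTAL_AMOUNT are hypothetical and must match your own FM definition.

import com.sap.conn.jco.JCoDestination;
import com.sap.conn.jco.JCoDestinationManager;
import com.sap.conn.jco.JCoFunction;

public class CallCreateOrder {
    public static void main(String[] args) throws Exception {
        // Reuses the destination from step 4 / the previous sketch.
        JCoDestination destination = JCoDestinationManager.getDestination("FMWDEMO");

        // Z_CREATE_ORDER is my custom RFC-enabled FM; the parameter names are hypothetical.
        JCoFunction function = destination.getRepository().getFunction("Z_CREATE_ORDER");
        if (function == null) {
            throw new IllegalStateException("Z_CREATE_ORDER not found in the SAP repository");
        }
        function.getImportParameterList().setValue("I_ORDER_ID", "801xx000003GabcAAC");
        function.getImportParameterList().setValue("I_ORDER_NUMBER", "00000101");
        function.getImportParameterList().setValue("I_TOTAL_AMOUNT", 199.99);

        function.execute(destination);
        System.out.println("Z_CREATE_ORDER executed");
    }
}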
6. Embed the SAP Java Connector
The SAP Adapter uses a library (the SAP Java Connector, JCo) which is provided by SAP. You can get the library from the SAP Marketplace.
Follow the instructions in section 2.1.2 of the Oracle SAP Adapter documentation to embed the SAP Java Connector in your environment.
7. Create the BPM process and use the adapters in design time
We start by building a BPM Application with a BPMN process.
Configure the Salesforce Adapter (design time)
In the composite view first add a Salesforce Adapter component as a reference.
Now import the enterprise.wsdl (see step 1) into the project (or, better, store it in the MDS and reference it via oramds:).
Provide the Salesforce credentials and reference the credential store key (see step 2).
Now you can check whether your configuration is valid. Click the Test connection button. As a result you should see Success!
Click Next twice and then Finish. You now have a valid reference in your composite, which we will use shortly in our BPM process.
Configure the SAP Adapter (design time)
In the composite view first add a SAP Adapter component as a reference.
Click Next and click the green plus sign to open the dialog in which you provide the settings for the SAP system. Also note the JNDI name (which we configured in step 4).
Provide all the necessary information (the same as in step 4) and click the Test connection button. The result should be Test connection successful.
Click OK and Next. You now have direct access to all BAPIs, RFCs and IDocs available in the SAP system. You can easily check whether the connection is working correctly by using a standard SAP RFC. For instance, you can use the remote function call RFC_GET_TABLE_ENTRIES to get all rows from a specific table (in my case my custom table ZMGRORDER). By right-clicking on the RFC you can directly execute a test and see the result.
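The built-in test in the wizard is usually sufficient; if you want to run the same check outside JDeveloper, a hedged JCo sketch could look like this (destination and table name as used in this post; RFC_GET_TABLE_ENTRIES returns each row as a flat character line in the ENTRIES table parameter).

import com.sap.conn.jco.JCoDestination;
import com.sap.conn.jco.JCoDestinationManager;
import com.sap.conn.jco.JCoFunction;
import com.sap.conn.jco.JCoTable;

public class ReadOrderTable {
    public static void main(String[] args) throws Exception {
        JCoDestination destination = JCoDestinationManager.getDestination("FMWDEMO");

        // Standard SAP RFC that returns the rows of a table as flat character lines.
        JCoFunction function = destination.getRepository().getFunction("RFC_GET_TABLE_ENTRIES");
        function.getImportParameterList().setValue("TABLE_NAME", "ZMGRORDER");
        function.execute(destination);

        // Each row is returned in the field WA of the ENTRIES table parameter.
        JCoTable entries = function.getTableParameterList().getTable("ENTRIES");
        for (int i = 0; i < entries.getNumRows(); i++) {
            entries.setRow(i);
            System.out.println(entries.getString("WA"));
        }
    }
}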
In my case I use my custom RFC (Z_CREATE_ORDER) to write the information into my custom table. If you use an existing operational table/object, you can most likely use one of the standard BAPIs/RFCs provided by SAP at this point.
Click Next. You can now configure the behaviour of the SAP Adapter. For instance, you can define how often and at what interval the adapter will retry if the SAP system is unavailable.
Click Next and Finish. The reference for the SAP access is now also part of the composite and can be used.
Complete the BPM process
Create the first service activity and select the service to query data from Salesforce.
Open the Data Association dialog and create a new data object in which you can store the Salesforce response (you can use an existing type from the Salesforce WSDL).
Map the output of the service to this new process data object so that you can use the contained information throughout the process.
Now create a second service activity and select the service to create the order in SAP.
Map your Salesforce result to the input parameters of your SAP call, either directly or by using an XSLT transformation.
The process and the composite should now look as follows
8. Deploy and test
By starting the process via the Enterprise Manager test console and looking at the instance log, we can see the following result.
And if we look in the SAP system using the data browser we can see that the order was successfully created.
In reality you will have to consider additional aspects, e.g.
- Closed loop: after the order has been updated in SAP, transfer the necessary information back to the Salesforce system and update Salesforce using the CRUD operations of the Salesforce Adapter.
- XREFs: in a proof of concept we did a similar scenario at a customer site but faced the problem that the Salesforce order Id, which is case-sensitive, was automatically converted to upper case when the Id was stored in a standard SAP table (a strange feature on the SAP side that fires for a lot of standard tables). Therefore the Id stored in SAP could not directly be used to identify the right order in Salesforce on the way back (updating Salesforce in the closed-loop scenario). You have to overcome this problem by using an XREF table in which you store the original key from Salesforce and map it to an identifier in SAP.
- The usage of a common data model can be required. You then have to add transformations, which can be very complex.
- You have to build a good reference architecture that increases the reusability of services, e.g. put the access to SAP and Salesforce (via the adapters) into a separate layer (basic services) so that these services can be reused in the orchestration layer.
However, I hope this post helped to show how powerful the new Oracle 12c adapters are and that they can really facilitate such integration scenarios.
Cloud is one of the big hype topics at the moment, and Oracle is really pushing in this direction. The existing Cloud Adapters (and all the new cloud adapters on the current roadmap) will help to meet these demands.