Integration: Not Just Aggregation

When talking about enterprise application integration, we tend to think of using Web services technologies such as SOAP and UDDI to virtualize a data model across a large enterprise. The thinking is that with a consistent interface, the data stores of the company can be abstracted behind a Web services layer and returned as XML, which can then be shaped to the particular needs of your application.

In an ideal world, this would be simple. However, we don't live in an ideal world, so there are a number of problems. The first problem is that we often want to take data from one service and use it as a parameter to call another, take returned data from that service and use it to call another, and so forth. We would then want to take pieces of data from each and aggregate them together to form our response to the original call.

In addition, data-type conversion may be necessary to map parameters to each other. For example, service 1 may return a double that service 2 requires as an index, but service 2 expects that value as a float. Interoperability is more than just being able to parse WSDL to create a proxy - operations such as data-type conversion are paramount.
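
To make the point concrete, here is what that glue code looks like in plain Java. The two service methods in this sketch are purely illustrative stand-ins for generated proxies, but the type rule is real: Java will not narrow a double to a float implicitly, so someone (or some tool) has to supply the cast.

public class TypeConversionDemo {

    // Illustrative stand-ins for two generated Web service proxies;
    // the names and signatures are hypothetical.
    static double getIndexFromServiceOne() { return 42.5d; }   // service 1 returns a double

    static String lookupOnServiceTwo(float index) {            // service 2 expects a float
        return "fund-" + index;
    }

    public static void main(String[] args) {
        double index = getIndexFromServiceOne();

        // lookupOnServiceTwo(index);   // would not compile: no implicit double-to-float narrowing
        String result = lookupOnServiceTwo((float) index);      // explicit narrowing cast required
        System.out.println(result);
    }
}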

Finally, you cannot expect all the services to run on the same platform. There may be .NET Web services, COM or DCOM components, or J2EE Web services on other application servers that need to be integrated, to name a few. This is far more complicated than simple data aggregation, but it is a real-world requirement; and with Web services development platforms alone it would require a lot of hand coding, defeating the purpose.

Here's where a tool such as BEA WebLogic Workshop 8.1 and a platform such as BEA WebLogic Integration 8.1 come into play.

A Real-World Use Case
A fictional company, The Pension Source (TPS), provides pension administration for their customers. Their pension plans have always been built on funds based in the U.S., but they want to branch into international funds. Their offices in Europe have a Web service that provides foreign exchange information, and these offices have standardized on Oracle as their application server. In addition, they have a domestic database that contains the recent price history and currency code for each fund. They would like to integrate these services to provide a new application. What will this entail?
1.  Exposing an interface that allows the user to query the dollar price of a particular fund
2.  Pulling the price of the fund from the database
3.  Pulling the currency code of the fund from the database
4.  Converting the price to U.S. dollars
5.  Returning the converted price to the user

In step 2, the fund price comes back from the service as a float. In step 4, the remote currency conversion Web service requires the price to be passed to it as a double.
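
Before building this visually, it is worth sketching what the orchestration would amount to if hand-coded in Java against generated proxies. The proxy interfaces and the changeCurrency method name below are assumptions made for illustration only; the getPrice and getCurrency operations and the float/double mismatch are the ones described in the steps above.

// A rough, hand-coded equivalent of the five steps above. PensionsApi and
// CurrencyConverter are hypothetical stand-ins for the generated Web service
// proxies that the workflow will use.
public class FundPriceOrchestration {

    interface PensionsApi {
        float getPrice(String symbol);       // step 2: fund price from the database service
        int getCurrency(String symbol);      // step 3: currency code from the database service
    }

    interface CurrencyConverter {
        double changeCurrency(double amount, int currencyCode); // step 4: convert to U.S. dollars
    }

    private final PensionsApi pensions;
    private final CurrencyConverter converter;

    public FundPriceOrchestration(PensionsApi pensions, CurrencyConverter converter) {
        this.pensions = pensions;
        this.converter = converter;
    }

    // Step 1: the operation exposed to the user; step 5: return the converted price.
    public double getDollarPrice(String symbol) {
        float price = pensions.getPrice(symbol);        // comes back as a float...
        int currencyCode = pensions.getCurrency(symbol);
        // ...but the conversion service expects a double, so the value is widened here.
        return converter.changeCurrency(price, currencyCode);
    }
}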

We'll see how straightforward the design and implementation of such a workflow and orchestration are with BEA WebLogic Workshop.

Building the Integrated Application
The downloads for this article (located at www.sys-con.com/weblogic/sourcec.cfm) include a zip file (CurrencyConv.zip) containing an emulation of the currency conversion Web service as an Oracle JDeveloper project that may be installed in Oracle JDeveloper and run on OC4J. Please check the readme file included with the download for instructions. In addition, PensionAPI.zip contains a WebLogic Workshop 8.1 application that emulates a service that retrieves data from the pension fund's database and exposes it as an API. For simplicity, neither application is tied to a back-end database; they simply emulate a real-world scenario in which they would be.

Once both services are up and running, start WebLogic Workshop 8.1.

Getting Started with the Workflow and Adding a Client Request

  • Create a new empty application called TPSReport. For the server, make sure that you select the "Integration" server on the dropdown list.
  • Create a new empty workflow project called GetFundInformation.
  • Within the workflow folder you will see a file called workflow.jwf. Double-click it to open the designer.
  • Double-click the "Starting Event" and select "Invoked By a Client Request".
  • Double-click the Client Request node, select "String", and click Apply.
  • Select the "Receive Data" tab. This is where you will assign the parameter that has been passed by the user to a variable that is internal to your workflow. That variable does not yet exist, so in the next steps you will add it.
    - On the right side of your screen, you should see the Data Palette. Click "Add" beside the Variables tab, then call the variable strSymbol and make it a string.
    - On the "Receive Data" tab of the Client Request node, click "Select Variable"; the variable strSymbol is now in the list. Select it.
  • Click "Apply" and then "Close" to update the Client Request node.

You've finished the first step of the workflow: taking the data in from a client request. Your screen should look something like Figure 1.

Consuming the Fund Price Web Service
Now that you have the client input - a string denoting the symbol of the fund you are interested in - the next step is to consume the Web service that exposes the Fund Price History and Currency Code.
Consuming the Price History:

  • From the palette on the bottom left side of the screen, select "Control Send With Return" and drag it onto the green line between the Client Request node and the Finish.
  • Double-click the node that appears.
  • You can see that this node requires a control to continue. With the designer you can add a new one on the fly as follows:
    - On the right side of the screen (use Figure 1 as a guide) the Data Palette contains a Controls tab. On it, click "Add" and select "Web Service".
    - In the ensuing dialog, on step 1, call the control ctlPensionDetails.
    - In step 2, select "Create a New Web Services Control to Use" and call it ctlPensionsAPI.
    - In step 3, enter the URL to the WSDL of the Pensions API Web service that you downloaded and installed earlier.
    - Click "Create" and you will return to the node designer.
  • You can now select "ctlPensionDetails" as the control implementing the Web service, and when selected, the methods list box is populated.
  • Select the method "getPrice" and click "Apply".
  • Select the "Send Data" tab. This is where you set up the parameters that you are passing to the Web service and where they come from.
  • Click "Select Variable" and select "strSymbol". This will take the variable created from the client request in the previous node and dispatch it as a parameter to the PensionsAPI Web service. Click "Apply".
  • Select the "Receive Data" tab. This is where you specify what to do with the data that comes back from the Web service.
  • We don't have a variable to hold this data, so create a new one as before. Make it a Java float, and call it nAmount.
  • It will now be available on the "Select Variable" list, so select it and click "Apply".
  • Click "Close". You now have a node in your workflow that dispatches the client request to a Web service and gets a value back.

Repeat these steps to add another node, using the same control but this time calling the "getCurrency" method. Map the returned value to a new integer called nCurrency.

Your screen should now look like Figure 2.

Orchestrating the Change Currency Web Service
Now it gets interesting. At this point you have a workflow that uses only WebLogic Web services. Next you are going to integrate a service running on another platform - in this case, the Currency Conversion service running on Oracle OC4J.

As before, drag a "Control Send with Return" node onto the workflow, dropping it just above the "Finish" node.

  • Double-click it to configure it.
  • You do not yet have a control for the Web service, so click "Add" near "Controls" on the Data Palette.
  • Select "Web Service".
    - Step 1: Call the instance ChangeCurrency.
    - Step 2: Select "Create a new Control". Call it ctlChangeCurrency.
    - Step 3: Enter the WSDL URL of the running Web service. If you followed the instructions in the download, it should look something like: http://machinename:8888/WebLogicArticle-CurrencyConv-contextroot/mypackage1.ChangeCurrency?WSDL
    - Click "Close".
  • Return to configuring the node.
    - On "General Settings", select the control "ChangeCurrency" and the method "ChangeCurrency", and click "Apply".
    - On "Send Data" select nAmount and nCurrency in step 1.
    - You'll notice in the dialog that the input variables are a "double" and an "int", whereas the values you are about to pass are a "float" and an "int". To transform them, on step 2a in the dialog do the following (a sketch of this conversion appears after this list):
  • Click "New". In step 1, set the Variable Name to "TransformForCurrency" and, in step 2, set the control name to dtfTransformCurrency.
  • Enter "TransformAmount" in the Method box and click "Create Transformation" in step 2b.
  • The Map Variables dialog appears. Drag the float to arg1 and the int to arg2.
  • Click OK when you're done.
    - Click "Apply" to update the "Send Data" settings.
    - On the "Receive Data" tab, the returned double needs to be mapped to a variable. Create a new double variable as before, and call it nReturn. Select it, click "Apply", and then click "Close".
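
For context, the transformation you just mapped boils down to widening the float fund price to the double that the remote service expects, while the currency code passes through unchanged. Expressed as plain Java (the class and signature below are illustrative only; Workshop generates the actual mapping for you), it is simply:

public class CurrencyTransform {

    // Rough Java equivalent of the TransformAmount mapping: widen the float
    // fund price to the double the remote conversion service expects. The
    // currency code int needs no conversion and passes through unchanged.
    public static double transformAmount(float amount) {
        return amount; // implicit float-to-double widening, no precision lost
    }
}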

You have now integrated the remote Web service, and it was done as seamlessly as integrating the Web services running on the same WebLogic platform.

Returning the Data to the User
The user has passed your workflow a string containing the symbol of the fund they are interested in, and the workflow has pulled out the fund's details and converted its price to U.S. dollars. The last step is to return the converted value to the user.

This is very simple. On the palette, select a "Client Response" node and drop it just above the "Finish" node.

  • Double-click the "Client Response" node and select a "Double" on the "General Settings" tab.
  • On the "Send Data" tab, select the variable called "nReturn", click "Apply", and then click "Close".

Your finished workflow should now look like Figure 3.

Testing the Workflow
Build the application and run it. The test harness will run and display the screen shown in Figure 4.

Enter "ABC" in the textbox and click "clientRequest". Wait a few seconds and click "Refresh" on the Message Log. Keep doing this until you see a callback.clientResponse entry in the Message Log.

By clicking on the nodes within the harness, you can see the calls and the data being passed around the different Web services.

In the case of "ABC", the amount is reported in currency code 1 (U.S. dollars), so no conversion is required. As an exercise, try amending the workflow so that it doesn't bother to call the Currency Conversion Web service in this case. (Hint: use a decision node.)
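
A sketch of the branching logic such a decision node would encode is shown below. The converter interface is a hypothetical stand-in for the ChangeCurrency control; the assumption that currency code 1 means U.S. dollars comes from the example data above.

public class ConversionDecision {

    private static final int US_DOLLARS = 1; // assumption taken from the example data

    // Skip the remote conversion call when the fund is already priced in dollars.
    public static double toDollars(float nAmount, int nCurrency, CurrencyConverter converter) {
        if (nCurrency == US_DOLLARS) {
            return nAmount;                                   // no conversion needed
        }
        return converter.changeCurrency(nAmount, nCurrency);  // convert as before
    }

    // Hypothetical stand-in for the ChangeCurrency Web service control.
    interface CurrencyConverter {
        double changeCurrency(double amount, int currencyCode);
    }
}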

By trying the symbols "BBC" and "CBC", you can watch the currency conversion as it happens.

One particularly useful aspect of this is that the compiled workflow itself runs as a Web service, so it in turn could become a node in another, more complex orchestration!
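
Because the workflow is exposed as a Web service, any SOAP-capable client can start it. As a rough illustration, the sketch below posts a hand-built clientRequest message using java.net.HttpURLConnection. The endpoint URL, namespace, and parameter element are placeholders you would take from the workflow's generated WSDL, and because this workflow replies through callback.clientResponse rather than a synchronous response, the sketch only fires the request and checks the HTTP status.

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class WorkflowClient {

    public static void main(String[] args) throws Exception {
        // Hand-built SOAP envelope; the namespace and <symbol> element are placeholders.
        String soapRequest =
              "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
            + "<soap:Body>"
            + "<clientRequest xmlns=\"http://example.com/GetFundInformation\">"
            + "<symbol>ABC</symbol>"
            + "</clientRequest>"
            + "</soap:Body>"
            + "</soap:Envelope>";

        // Placeholder endpoint; take the real URL from the workflow's generated WSDL.
        URL endpoint = new URL("http://localhost:7001/GetFundInformation/workflow.jwf");
        HttpURLConnection conn = (HttpURLConnection) endpoint.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        conn.setRequestProperty("SOAPAction", "\"clientRequest\""); // placeholder action

        OutputStream out = conn.getOutputStream();
        out.write(soapRequest.getBytes("UTF-8"));
        out.close();

        // The converted price arrives later via callback.clientResponse, so only
        // the transport-level status is reported here.
        System.out.println("HTTP status: " + conn.getResponseCode());
    }
}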

Conclusion
In this example you built a simple orchestration, and you have barely touched on the features that are available in BEA WebLogic Workshop 8.1 for orchestration. Other nodes can have Java code associated with them, be conditional, loop, or run asynchronously. The goals of true application integration - data aggregation, service chaining, and orchestration - are now in your hands.

More Stories By Laurence Moroney

Laurence Moroney is a senior Technology Evangelist at Microsoft and the author of 'Introducing Microsoft Silverlight' as well as several more books on .NET, J2EE, Web Services, and Security. Prior to working for Microsoft, his career spanned many different domains, including interoperability and architecture for financial services systems, airports, casinos, and professional sports.
