Which Integration Approach is Best?

Recently I made a long trip to the East Coast. While there, I was able to meet with a number of developers, customers, and partners. I spoke to a variety of people and heard about a number of interesting community goings-on.

Lately, I've been on a big Web services kick. I've spent a significant amount of time studying, speaking, and writing about this technology. While speaking to a number of different users groups in the area, I was surprised to see one question bubbling to the surface repeatedly.

People kept asking about the difference between the J2EE Connector Architecture (JCA) and Web services. They were interested in knowing what the criteria should be for selecting one over the other. At first, I was completely perplexed because the answer seemed obvious. But after giving it some thought, I can easily understand why confusion has started to seep in. Vendors have done such a good job of marketing Web services as standardized integration, and JCA is pitched in much the same way, that companies and individuals have started to voice angst. It naturally leads to the conclusion that these technologies compete and conflict, right? Well, not exactly.

Understanding Positioning
Web services and JCA are both standards that help with integration problems, but they certainly don't compete with one another. It's important that developers and companies understand how the two technologies differ and where each one fits.

Web services is positioned as a technology that standardizes integration. The JCA specification is designed to standardize enterprise application integration (EAI), which is a subset of the broader integration problem. Therefore, asking the question, "If I have an EAI problem, should I use Web services, JCA, or both?" is completely valid.

The Biggest Difference Is Intrusion

Intrusion is the amount of modification to the legacy system required to support the integration technology.

  • JCA: This technology requires little or no intrusion into the legacy system. JCA adapters gain access to the legacy system through native APIs, sockets, data access, and a number of other techniques that don't require modifying any existing code base. This allows a company to leverage the investment already made in a deployed system rather than spending additional resources to update it just to support integration (a minimal client sketch follows this list).
  • Web services: In order for a legacy system to natively support using Web services as an EAI approach, a high level of intrusion would have to occur. The legacy system would have to be updated and modified to contain a Web services stack. This stack would handle all SOAP message parsing and legacy system invocations - quite a big task. Most enterprises would balk at the idea of modifying an existing COBOL or older mainframe application to introduce an unproven Web services stack.
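
To make the JCA side concrete, here is a minimal sketch of what a client call through an adapter's Common Client Interface (CCI) might look like. The JNDI name, record name, and field name are hypothetical, and the InteractionSpec comes from whatever resource adapter is deployed; the key point is that the legacy system itself is never modified.

import javax.naming.InitialContext;
import javax.resource.cci.Connection;
import javax.resource.cci.ConnectionFactory;
import javax.resource.cci.Interaction;
import javax.resource.cci.InteractionSpec;
import javax.resource.cci.MappedRecord;
import javax.resource.cci.Record;

public class LegacyOrderClient {

    // Hypothetical JNDI name for a resource adapter deployed in the app server.
    private static final String ADAPTER_JNDI = "java:comp/env/eis/LegacyOrderSystem";

    // The InteractionSpec is supplied by the specific resource adapter; record and
    // field names ("OrderLookup", "orderId") are invented for this sketch.
    public Record lookupOrder(String orderId, InteractionSpec spec) throws Exception {
        ConnectionFactory factory =
                (ConnectionFactory) new InitialContext().lookup(ADAPTER_JNDI);
        Connection connection = factory.getConnection();
        try {
            Interaction interaction = connection.createInteraction();
            MappedRecord input =
                    factory.getRecordFactory().createMappedRecord("OrderLookup");
            input.put("orderId", orderId);

            // The adapter turns this call into a native invocation of the legacy
            // system - no code in the legacy system changes.
            return interaction.execute(spec, input);
        } finally {
            connection.close();
        }
    }
}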

As a result of this scenario, if a company wants to use Web services for EAI, the most likely strategy would be to purchase a Web services gateway that would intercept Web services messages, do a native invocation of a legacy system, get the response, and convert it to a response Web service message. JCA could easily be used as the technology that does the native legacy system invocation, but many companies might object to having so many layers of message handling to enable EAI.
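
As a rough sketch of that gateway chain (not a definitive implementation), assume a plain servlet front end: it accepts the incoming Web service request, delegates to a JCA-backed helper like the one sketched earlier, and wraps the native result in an XML response. The endpoint, element names, and helper methods are invented placeholders; a real gateway would parse the full SOAP envelope and map faults properly.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Minimal gateway sketch: accept an XML request over HTTP, invoke the legacy
// system through a JCA adapter, and return an XML response.
public class LegacyGatewayServlet extends HttpServlet {

    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws IOException {
        // Read the incoming payload (the Web service request message).
        StringBuffer xml = new StringBuffer();
        BufferedReader reader = request.getReader();
        String line;
        while ((line = reader.readLine()) != null) {
            xml.append(line);
        }

        // Hypothetical steps: pull the order id out of the request, then call the
        // legacy system natively through a JCA-backed helper such as the one above.
        String orderId = extractOrderId(xml.toString());
        String status = invokeLegacySystemViaJca(orderId);

        // Convert the native result into a response message for the Web service client.
        response.setContentType("text/xml");
        PrintWriter out = response.getWriter();
        out.println("<orderResponse><status>" + status + "</status></orderResponse>");
    }

    private String extractOrderId(String xml) {
        return "42"; // placeholder for real XML parsing
    }

    private String invokeLegacySystemViaJca(String orderId) {
        return "SHIPPED"; // placeholder for a CCI interaction like the earlier sketch
    }
}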

As a side note, I believe that this EAI problem will ultimately be a thorn in Microsoft's .NET strategy. Since most legacy systems don't run on Microsoft Windows, it's impossible for .NET to do direct legacy system integration. Microsoft has openly declared that Web services are its strategy for EAI in the future. This means that Microsoft is depending upon companies to embed Web services stacks into legacy systems to gain access to them. I would argue that enterprises will balk at the idea and not make this investment. Those enterprises that commit to .NET will be forced to purchase an EAI server (WebLogic Server is a good choice) to intercept Web services requests and then use an EAI-specific technology such as JCA to do the integration.

Context Propagation
A serious infrastructure issue to consider is transaction and security context propagation from clients consuming a legacy system to the legacy system itself.

  • JCA: JCA supports security and transaction context propagation from the JCA server to the legacy system. This is one of the fundamental infrastructure concepts incorporated into the design of JCA and a strong reason it's used in many integration scenarios. It means that if an XA transaction context or a security context identifying a user is created in the application server, that context is propagated to the legacy system when the application server invokes a service on it. In terms of XA transactions, this means that if the legacy system is an XA-compliant resource manager, it can participate in an XA transaction. This allows architects to design EAI systems that ensure ACID transactions spanning the legacy system and the client consuming it. In terms of security, it means that a single sign-on identifier for an individual can be carried from the client to the legacy system. These are critical characteristics to consider when designing interaction with a legacy system (a minimal sketch follows this discussion).
  • Web services: It's conceivable that a Web service message could propagate a transaction or security context as part of a SOAP header extension, but there is no standard for doing so today. Additionally, even if such a standard existed, it would not be used to propagate XA transactions and internal security user identifiers. Web services decouple clients from servers much the same way MOM systems decouple producers and consumers. Yes, it is possible to propagate an XA transaction context across a Web service message, but the very nature of Web services means there is no predictability as to when response messages will be sent or arrive at their destination. XA transactions are very short-lived, usually with an automatic rollback occurring after 30 seconds. The unpredictable and disconnected nature of Web services makes them an unrealistic transport for XA transaction contexts. A similar argument can be made for system-level security identifiers. This means that a client using Web services to access a remote system cannot do so with XA transaction semantics.
Don't be discouraged by Web services, however. Web services are intended to expose a business-level interface to a piece of business logic, not a fine-grained method. They still require transactions, security, and a host of other infrastructure technologies, but these must be applied in the context of the environment Web services operate in (unpredictable response times, unpredictable load, applications that come and go, XML formats that change, etc.). For example, OASIS is working on the Business Transaction Protocol (BTP) to model long-lived transactions for systems in this environment. Additionally, a number of Web services security specifications are being created. Web services will ultimately change the way developers look at business services and the architecture they use to build them, but you will have to decide whether those new approaches are appropriate for EAI.
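
Returning to the JCA side of context propagation, here is a minimal sketch, assuming a container-managed environment and a hypothetical XA-capable resource adapter: the client demarcates an XA transaction through JTA, and a connection obtained inside that transaction is enlisted by the application server so the legacy resource manager can participate. The JNDI names are invented, and the exact enlistment behavior depends on the application server and adapter configuration.

import javax.naming.InitialContext;
import javax.resource.cci.Connection;
import javax.resource.cci.ConnectionFactory;
import javax.transaction.UserTransaction;

public class OrderUpdateClient {

    public void updateOrder(String orderId) throws Exception {
        InitialContext ctx = new InitialContext();
        UserTransaction tx =
                (UserTransaction) ctx.lookup("java:comp/UserTransaction");

        // Hypothetical adapter for an XA-capable legacy resource manager.
        ConnectionFactory factory =
                (ConnectionFactory) ctx.lookup("java:comp/env/eis/LegacyOrderSystem");

        tx.begin();
        Connection connection = null;
        try {
            // A connection obtained inside the transaction is enlisted by the
            // application server, so the legacy system sees the same XA context
            // (and, with single sign-on configured, the same security principal).
            connection = factory.getConnection();
            // ... perform CCI interactions against the legacy system here ...
            tx.commit();
        } catch (Exception e) {
            tx.rollback();
            throw e;
        } finally {
            if (connection != null) {
                connection.close();
            }
        }
    }
}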

Code Binding
Code binding is a simple, yet important, aspect of EAI. A comparison of JCA and Web services can be made here, too.

  • JCA: JCA is a Java-based technology. This does not mean that the legacy system has to be authored in Java, but it does mean that the consumer of a legacy application using JCA must be written in Java. This has certain implications for architects. First, any application that uses JCA is bound to its use of JCA: if the application's use of the legacy system needs to change, a developer must physically rewrite the JCA access code. This may or may not be problematic for some companies. Second, applications can take advantage of strong typing and compile-time checking when accessing a legacy system. Since the data types and methods used to access the legacy system are exposed as Java types, a Java application that uses JCA gets compile-time checking of how it invokes the system (this doesn't validate that the system is being used appropriately, just that the calls are syntactically correct). A sketch contrasting the two binding styles follows this list.
  • Web services: A Web service EAI approach would allow a client authored in any language to access the legacy system, whether a Java, C#, Fortran, or COBOL client. Also, since Web services use XML messages to communicate with other systems, a client that dynamically connects to a Web service and obtains a WSDL document to understand message structure doesn't get any benefits associated with compile-time checking. However, the Web service's format can evolve in real time and the application can dynamically evolve with it - i.e., changing the Web service doesn't necessarily mean rewriting the client using the Web service.
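
The contrast between the two binding styles can be sketched roughly as follows. The typed interface, endpoint, namespace, and operation names are all hypothetical, and the dynamic half uses a JAX-RPC dynamic invocation (DII) call purely as one illustration of a client assembled from service metadata at runtime.

import java.rmi.RemoteException;
import javax.xml.namespace.QName;
import javax.xml.rpc.Call;
import javax.xml.rpc.ParameterMode;
import javax.xml.rpc.Service;
import javax.xml.rpc.ServiceFactory;

public class BindingStyles {

    // JCA style: the adapter (or a facade built on it) is consumed through a
    // strongly typed Java interface, so misuse is caught at compile time.
    // The interface itself is hypothetical.
    public interface OrderSystem {
        String getOrderStatus(String orderId);
    }

    public String typedLookup(OrderSystem legacy) {
        // Passing the wrong type or misspelling the method would not compile.
        return legacy.getOrderStatus("42");
    }

    // Web service style: a dynamic call assembled from service metadata at
    // runtime. Mistakes surface only at runtime, but the client is not bound
    // to Java types and can follow a changing service description.
    public Object dynamicLookup(String orderId) throws Exception {
        String xsd = "http://www.w3.org/2001/XMLSchema";
        Service service = ServiceFactory.newInstance()
                .createService(new QName("urn:orders", "OrderService"));
        Call call = service.createCall();
        call.setTargetEndpointAddress("http://example.com/services/orders");
        call.setOperationName(new QName("urn:orders", "getOrderStatus"));
        call.addParameter("orderId", new QName(xsd, "string"), ParameterMode.IN);
        call.setReturnType(new QName(xsd, "string"));
        try {
            return call.invoke(new Object[] { orderId });
        } catch (RemoteException e) {
            // Contract mismatches show up here rather than at compile time.
            throw e;
        }
    }
}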

Data Binding
Data binding deals with changing the format of the legacy system data to a format that is consumable by the client application.

  • JCA: JCA requires a binding of the data into a Java data type (Java class). This is simple and straightforward for simple data types where there is a clear mapping between the native type and the Java type, but quickly gets sticky when the legacy system has complex relational or binary data that has to be manipulated. JCA 1.0 didn't have a standard way to represent the format of data in the legacy system; dynamic clients couldn't connect to a JCA adapter and "discover" what the data format of the legacy system was. JCA 2.0 will contain a metadata facility that will allow JCA clients to query a JCA adapter to dynamically discover the shape and format of data in the legacy system.
  • Web services: Web services poses an interesting problem for legacy systems, because the data in the legacy system must first be translated into XML for transport and then into the programming language format of the client consuming the Web service. If the EAI system uses the client -> Web service -> server middleware -> native integration adapter -> legacy system approach, the data goes through an ugly chain of translations: native format -> the adapter's language format -> XML -> the language format of the client consuming the Web service. Unless data-binding techniques drastically improve, a lot of the meaning of the original data could be lost in these translations (not to mention the blatant performance slowdown).
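
As a rough illustration of that translation chain, with invented field names: the adapter surfaces the legacy record as a Java structure, the middleware serializes it to XML for the Web service hop, and the consuming client must parse it yet again into its own types.

import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;

public class DataBindingChain {

    // Step 1: the JCA adapter has already translated the native record into a
    // Java structure (a CCI MappedRecord in practice; a Map stands in here).
    public Map legacyRecordAsJava() {
        Map record = new HashMap();
        record.put("orderId", "42");          // hypothetical field names
        record.put("status", "SHIPPED");
        record.put("shipDate", "2002-06-15"); // native date/decimal formats can lose fidelity here
        return record;
    }

    // Step 2: the middleware serializes the Java structure into XML for the
    // Web service hop; type information tends to flatten into strings.
    public String toXml(Map record) {
        StringBuffer xml = new StringBuffer("<order>");
        for (Iterator it = record.entrySet().iterator(); it.hasNext();) {
            Map.Entry entry = (Map.Entry) it.next();
            xml.append("<").append(entry.getKey()).append(">")
               .append(entry.getValue())
               .append("</").append(entry.getKey()).append(">");
        }
        return xml.append("</order>").toString();
    }

    // Step 3 (not shown): the consuming client parses the XML back into its
    // own language's types - a third translation of the same data.
}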

On the other hand, there could be value for many companies in representing legacy data in XML format. XML carries semantic business meaning that can be leveraged across applications and systems in a number of different ways. Also, if the legacy system natively exposes its data in XML, Web services might provide a seamless approach next to which JCA seems too cumbersome. Many systems built in the past three years are now legacy systems that already use XML as their data format, so this approach may well be feasible.

Conclusion
I hope this article has provided a rapid introduction to the major architectural differences between JCA and Web services for the purpose of doing EAI. We're always interested in hearing about additional positioning arguments and further comparisons. If you have something to add, let us hear about it! Send me an e-mail and let's discuss it.

More Stories By Tyler Jewell

Tyler Jewell is VP, Product Management & Strategy, Oracle Public Cloud.



IoT & Smart Cities Stories
Disruption, Innovation, Artificial Intelligence and Machine Learning, Leadership and Management hear these words all day every day... lofty goals but how do we make it real? Add to that, that simply put, people don't like change. But what if we could implement and utilize these enterprise tools in a fast and "Non-Disruptive" way, enabling us to glean insights about our business, identify and reduce exposure, risk and liability, and secure business continuity?
The deluge of IoT sensor data collected from connected devices and the powerful AI required to make that data actionable are giving rise to a hybrid ecosystem in which cloud, on-prem and edge processes become interweaved. Attendees will learn how emerging composable infrastructure solutions deliver the adaptive architecture needed to manage this new data reality. Machine learning algorithms can better anticipate data storms and automate resources to support surges, including fully scalable GPU-c...
DXWorldEXPO LLC announced today that Telecom Reseller has been named "Media Sponsor" of CloudEXPO | DXWorldEXPO 2018 New York, which will take place on November 11-13, 2018 in New York City, NY. Telecom Reseller reports on Unified Communications, UCaaS, BPaaS for enterprise and SMBs. They report extensively on both customer premises based solutions such as IP-PBX as well as cloud based and hosted platforms.
Digital Transformation: Preparing Cloud & IoT Security for the Age of Artificial Intelligence. As automation and artificial intelligence (AI) power solution development and delivery, many businesses need to build backend cloud capabilities. Well-poised organizations, marketing smart devices with AI and BlockChain capabilities prepare to refine compliance and regulatory capabilities in 2018. Volumes of health, financial, technical and privacy data, along with tightening compliance requirements by...
@DevOpsSummit at Cloud Expo, taking place November 12-13 in New York City, NY, is co-located with 22nd international CloudEXPO | first international DXWorldEXPO and will feature technical sessions from a rock star conference faculty and the leading industry players in the world. The widespread success of cloud computing is driving the DevOps revolution in enterprise IT. Now as never before, development teams must communicate and collaborate in a dynamic, 24/7/365 environment. There is no time t...
DXWorldEXPO LLC announced today that "IoT Now" was named media sponsor of CloudEXPO | DXWorldEXPO 2018 New York, which will take place on November 11-13, 2018 in New York City, NY. IoT Now explores the evolving opportunities and challenges facing CSPs, and it passes on some lessons learned from those who have taken the first steps in next-gen IoT services.
SYS-CON Events announced today that Silicon India has been named “Media Sponsor” of SYS-CON's 21st International Cloud Expo, which will take place on Oct 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA. Published in Silicon Valley, Silicon India magazine is the premiere platform for CIOs to discuss their innovative enterprise solutions and allows IT vendors to learn about new solutions that can help grow their business.
SYS-CON Events announced today that CrowdReviews.com has been named “Media Sponsor” of SYS-CON's 22nd International Cloud Expo, which will take place on June 5–7, 2018, at the Javits Center in New York City, NY. CrowdReviews.com is a transparent online platform for determining which products and services are the best based on the opinion of the crowd. The crowd consists of Internet users that have experienced products and services first-hand and have an interest in letting other potential buye...
In his general session at 19th Cloud Expo, Manish Dixit, VP of Product and Engineering at Dice, discussed how Dice leverages data insights and tools to help both tech professionals and recruiters better understand how skills relate to each other and which skills are in high demand using interactive visualizations and salary indicator tools to maximize earning potential. Manish Dixit is VP of Product and Engineering at Dice. As the leader of the Product, Engineering and Data Sciences team at D...
"IBM is really all in on blockchain. We take a look at sort of the history of blockchain ledger technologies. It started out with bitcoin, Ethereum, and IBM evaluated these particular blockchain technologies and found they were anonymous and permissionless and that many companies were looking for permissioned blockchain," stated René Bostic, Technical VP of the IBM Cloud Unit in North America, in this SYS-CON.tv interview at 21st Cloud Expo, held Oct 31 – Nov 2, 2017, at the Santa Clara Conventi...