Canonical Message Formats

Avoiding the Pitfalls

As the scope of enterprise integration grows, IT organizations are demanding greater efficiency and agility from their architectures and are moving away from point-to-point integration, which is proving increasingly cumbersome to build and maintain.

They are migrating toward adaptive platforms such as BEA's 8.1 Platform and many-to-many architectures that support linear cost growth as well as simplified maintenance. To connect systems, WebLogic developers are shifting away from creating individual adapters between every pair of systems in favor of Web services. For data, they are shifting away from individual mappings between data sources and targets in favor of Liquid Data enhanced with canonical messages. But implementing canonical messages can be rife with problems, such as infighting between departments and rigid models that become obstacles to progress. Fortunately, these pitfalls can be avoided through proper architecture.

Introduction
As IT professionals engineer greater efficiency and flexibility into their enterprises, the industry has seen a shift from point-to-point connectivity to platforms such as BEA WebLogic that support many-to-many integration. The heart of this shift is a service-oriented architecture (SOA) that brings the world one step closer to IT infrastructure that is truly plug-and-play. However, any CIO who has implemented an SOA can tell you that the SOA still has some hurdles to clear before it delivers on the plug-and-play future. The most imposing of these hurdles is data interoperability.

Data documents are exchanged blindly because the SOA offers no systematic way to standardize how inbound data is interpreted and validated. Even if an organization standardizes on an industry document-type definition (DTD), there is still no automated way to reconcile, for example, the strings "two," "2," and "2.00."

XML Schema Definitions (XSDs) and Liquid Data help with data typing and transformations, but that's just the tip of the iceberg. What if the value of 2 is derived from multiple data sources? What if another system reports that the value is really 3? When should the value 2.001 be rounded to 2? What if a business requirement dictates that the value must be greater than 5?
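
To make these operations concrete, here is a minimal Java sketch - the class and rules are hypothetical, not part of any BEA product - that reconciles "two," "2," and "2.00" to one canonical value, applies the rounding rule, and enforces the greater-than-5 business constraint:

    import java.math.BigDecimal;
    import java.math.RoundingMode;
    import java.util.HashMap;
    import java.util.Map;

    public class QuantityReconciler {

        // Spelled-out numbers that feeds are known to emit (assumed list).
        private static final Map<String, String> WORDS = new HashMap<String, String>();
        static {
            WORDS.put("one", "1");
            WORDS.put("two", "2");
            WORDS.put("three", "3");
        }

        // "two", "2", and "2.00" all normalize to the canonical value 2.00;
        // 2.001 is rounded to two decimal places per the (assumed) rounding rule.
        public static BigDecimal normalize(String raw) {
            String trimmed = raw.trim().toLowerCase();
            String digits = WORDS.containsKey(trimmed) ? WORDS.get(trimmed) : trimmed;
            return new BigDecimal(digits).setScale(2, RoundingMode.HALF_UP);
        }

        // Business rule from the text: the value must be greater than 5.
        public static void checkBusinessRule(BigDecimal value) {
            if (value.compareTo(new BigDecimal("5")) <= 0) {
                throw new IllegalArgumentException("value must be greater than 5: " + value);
            }
        }

        public static void main(String[] args) {
            System.out.println(normalize("two"));   // 2.00
            System.out.println(normalize("2"));     // 2.00
            System.out.println(normalize("2.001")); // 2.00
            checkBusinessRule(normalize("two"));    // throws: rule requires > 5
        }
    }

Even this toy version hard-codes the rules into application logic; the argument of this article is that such rules belong in shared metadata instead.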

By most accounts, over 40% of the cost of integrating applications is spent writing point-to-point adapters for the transformation, aggregation, validation, and business operations needed to effectively exchange data - and much of the code is redundant, inconsistent, and costly to maintain. This approach results in a brittle architecture that is expensive to build and difficult to change without more hand coding.

Canonical models are a step in the right direction, but exchange models - a superset of canonical models - solve the entire problem with a many-to-many solution for modeling, automating, and managing the exchange of data across an SOA. They capture data reconciliation and validation operations as metadata, defined once but used consistently across the SOA. In this way, rules are enforced, reused, and easily modified without incurring additional integration costs. Without such infrastructure in place, data interoperability problems threaten the flexibility and reuse of the SOA - the incentives for implementing an SOA in the first place.

SOAs Ignore Underlying Data
Web services provide a standards-based mechanism for applications to access business logic and ship documents, but they provide nothing for interpreting and validating the data in the documents within the context of the local application. In addition to the simple examples given above, IT departments are left with the sizable task of writing code to interpret and convert data every time a document is received. The operations performed include:

  • Transformations: Schema mapping, such as converting credit bureau data into an application document
  • Aggregation: Merging and reconciling disparate data, such as merging two customer profiles with different and potentially conflicting fields
  • Validation: Ensuring data consistency, such as checking that the birth date precedes the death date (this case is sketched in code after the list)
  • Business operations: Enforcing business processes, such as ensuring that all orders fall within corporate risk guidelines
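
As a minimal illustration of the validation case referenced above - the record and rule are hypothetical, not tied to any particular schema - consider:

    import java.time.LocalDate;

    // A person record as it might be assembled from an inbound document (illustrative).
    class PersonRecord {
        LocalDate birthDate;
        LocalDate deathDate; // null for living persons
    }

    public class BirthDeathValidator {

        // The consistency rule from the list above: birth date must precede death date.
        public static void validate(PersonRecord p) {
            if (p.deathDate != null && !p.birthDate.isBefore(p.deathDate)) {
                throw new IllegalStateException("birth date " + p.birthDate
                    + " does not precede death date " + p.deathDate);
            }
        }

        public static void main(String[] args) {
            PersonRecord p = new PersonRecord();
            p.birthDate = LocalDate.of(1950, 3, 1);
            p.deathDate = LocalDate.of(1949, 3, 1); // deliberately inconsistent
            validate(p); // throws IllegalStateException
        }
    }

Today, every service that receives such a document must re-implement this check - precisely the redundancy described below.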

Consider the apparently simple concept of "order status". This concept is vital to the correct execution of many business processes within the enterprise, but what are the true semantics of the term? Is it the status of issuing a request for a quote, or placing a purchase order? What about the status of a credit check on the customer? What limitations should be placed on the order if the credit check reveals anomalies? Perhaps "order status" refers to the manufacturing, shipping, delivery, return, or payment status. In practice, it's probably all of these things as well as the interdependencies between them. Applications in different functional areas of the corporation will have different interpretations of "order status" and different databases will store different "order status" values.

Each database and application within a given area of responsibility, such as order management, manufacturing, or shipping, uses a locally consistent definition of "order status". However, when an enterprise begins using an SOA to integrate processes across these areas, source systems must convert their local definitions to the local definitions of the target systems. And the problem increases with the number of target systems - order management systems may send messages to both manufacturing systems and shipping systems, each with a different definition of "order status".
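
A minimal Java sketch of one such conversion - the status vocabularies here are hypothetical - shows how quickly this turns into bespoke code:

    import java.util.HashMap;
    import java.util.Map;

    // Converts the order management system's local "order status" vocabulary
    // to the shipping system's vocabulary (all status values are hypothetical).
    public class OrderStatusConverter {

        private static final Map<String, String> ORDER_MGMT_TO_SHIPPING =
            new HashMap<String, String>();
        static {
            ORDER_MGMT_TO_SHIPPING.put("RELEASED", "READY_TO_SHIP");
            ORDER_MGMT_TO_SHIPPING.put("IN_TRANSIT", "SHIPPED");
            ORDER_MGMT_TO_SHIPPING.put("CLOSED", "DELIVERED");
        }

        public static String toShipping(String orderMgmtStatus) {
            String mapped = ORDER_MGMT_TO_SHIPPING.get(orderMgmtStatus);
            if (mapped == null) {
                // No local equivalent: exactly the gap that otherwise gets hand-coded.
                throw new IllegalArgumentException("no shipping status for " + orderMgmtStatus);
            }
            return mapped;
        }

        public static void main(String[] args) {
            System.out.println(toShipping("RELEASED")); // READY_TO_SHIP
        }
    }

Every additional target system requires another table like this one, which is the point-to-point scaling problem in miniature.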

Converting these messages for the target system is labor intensive and goes beyond simple mappings. IT departments are often forced to alter applications and embed additional code in order to execute any of the four exchange operations listed earlier.

These tasks are often implemented with hand coding, which creates a brittle set of data services that scatters the semantics of data reconciliation across the entire service network. Worse, the code is not reusable from project to project, and it yields an architecture that is not amenable to change: changing a single logical rule anywhere on the network sets off a domino effect in which every application service that implements the rule must also be changed - assuming each instance of the rule can even be found.

The interoperability of data plays just as big a role in SOAs as the interoperability of services, and if the mechanics of exchanging data are not addressed the two key reasons for implementing an SOA in the first place - flexibility and reuse - are severely threatened. Fortunately, this problem can be addressed through good architecture that includes an exchange model for data in the middle tier.

Migrating Point-to-Point to Many-to-Many
As the scope of enterprise integration grows, IT organizations are demanding greater efficiency and agility from their architectures. This demand has fueled a shift in the industry from point-to-point integration - which has proven to be increasingly cumbersome to build and maintain - to a many-to-many architecture that supports linear cost growth and simplified maintenance.

From EAI to SOA
Enterprise application integration (EAI) packages rely on point-to-point integration for interoperability between services and incorporate individual adapters between every pair of systems. The costs to build and maintain such an architecture become unmanageable as the number of systems grows: integrating n systems requires n(n-1) adapters, so just ten systems call for ninety of them.

To overcome this problem, IT departments are turning toward a many-to-many approach for integrating diverse systems. This shift involves moving away from individual adapters between every pair of services toward publishing Web services with standard interfaces on the enterprise message bus. This is the heart of an SOA.

From Mapping Tools to Canonical Messages
IT organizations are making the same transition in how they architect for data interoperability. Rather than creating individual mappings between every pair of systems, organizations are implementing canonical messages as a move toward a many-to-many data architecture. Canonical messaging involves publishing standard schemas and vocabularies, and all systems map to the agreed-upon models to communicate. This is a major step toward controlling the initial costs of data interoperability.
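
In practice, "mapping to the agreed-upon model" means each system validates its messages against the shared schema before placing them on the bus. A minimal sketch using the standard Java (JAXP) validation API - the schema file name and sample message are hypothetical:

    import java.io.File;
    import java.io.StringReader;
    import javax.xml.XMLConstants;
    import javax.xml.transform.stream.StreamSource;
    import javax.xml.validation.Schema;
    import javax.xml.validation.SchemaFactory;
    import javax.xml.validation.Validator;

    public class CanonicalMessageValidator {
        public static void main(String[] args) throws Exception {
            // Load the agreed-upon canonical schema (file name is hypothetical).
            SchemaFactory factory =
                SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
            Schema schema = factory.newSchema(new File("canonical-order.xsd"));

            // Every participating system runs inbound and outbound messages
            // through the same validator; validate() throws if the message
            // does not conform to the canonical schema.
            Validator validator = schema.newValidator();
            String message = "<order><status>RELEASED</status></order>";
            validator.validate(new StreamSource(new StringReader(message)));
            System.out.println("message conforms to the canonical schema");
        }
    }

Note what this buys and what it doesn't: the validator checks structure and data types, but, as discussed next, it says nothing about the semantic rules that govern each field.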

Canonical message formats represent a many-to-many approach to sharing data across integrated systems, but they fall short of a complete data exchange solution. Early attempts to implement canonical message formats have often been rife with problems, such as forcing agreement between departments and creating rigid models that become obstacles to future growth. While they do simplify some aspects of integration, their shortcomings prevent them from being a silver bullet:

  • Requires agreement between departments: One of the biggest drawbacks of imposing common message formats or vocabularies across the enterprise is accounting for diversity among users. Forcing groups with different needs and goals to agree on messages can cause infighting and usually results in a lowest-common-denominator model that inevitably hinders progress.
  • Inflexible: Because all systems write to the canonical model, it is impossible to change the model without considerable cost and disruption of service. A single change in a message schema requires code in all systems to be rewritten, tested, and redeployed.
  • Ignores interpretation and validation operations: Canonical messages do nothing to address the semantic differences discussed in the previous section or to make sure that the information is valid within the context of the application. Even with all systems using the identical schema for "order status," the rules that govern the usage of each field are not defined - and this ambiguity has to be handled with custom code.

From Canonical Messages to Data Exchange Models
Making a many-to-many SOA deliver on the promise of simplified integration requires more than Web services and canonical message formats. The missing link is infrastructure that standardizes and administers all exchange operations, including data transformation, aggregation, validation, and business operations. Canonical messages enriched with metadata that defines these operations - together constituting an exchange model - complete the architecture and bridge the gap between the diverse services that share information.

The Real Solution: Exchange Modeling
For an SOA to deliver on its promise, the architecture needs a mechanism for reconciling the inherent inconsistencies between data sources and data targets. Exchange modeling technology goes one step further than canonical message formats by creating an exchange model that captures the semantics and usage of data as metadata, in addition to the syntax and structure. Exchange modeling fits into the BEA 8.1 Platform (see Figure 1).

The metadata in an exchange model is used to deploy rich sets of shared data services that formalize and automate the interaction with data across an SOA - ensuring data interoperability and data quality.

Exchange models consist of three tiers (see Figure 2): one tier each for data sources, data targets, and the intervening canonical message formats. Because there are multiple tiers, each department can program to its own models, and changes can be made locally without disrupting other systems. Schemas are derived directly from existing systems, enterprise data formats (EDFs), existing canonical messages, or industry-standard schemas such as RosettaNet, ACORD, or MISMO. They are then enriched with metadata that defines the transformation, aggregation, validation, and business operations associated with each field, class, or the entire schema.
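
What "enriching a schema with operation metadata" might look like in code is sketched below; the rule model and field names are illustrative and not any vendor's API:

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;
    import java.util.function.Predicate;

    // Attaches exchange-operation metadata to individual schema fields
    // (illustrative model only).
    public class EnrichedSchema {

        private final Map<String, List<Predicate<String>>> fieldRules =
            new HashMap<String, List<Predicate<String>>>();

        // A rule is defined once against the schema and reused by every service.
        public void addRule(String field, Predicate<String> rule) {
            List<Predicate<String>> rules = fieldRules.get(field);
            if (rules == null) {
                rules = new ArrayList<Predicate<String>>();
                fieldRules.put(field, rules);
            }
            rules.add(rule);
        }

        public boolean isValid(String field, String value) {
            List<Predicate<String>> rules = fieldRules.get(field);
            if (rules == null) {
                rules = Collections.emptyList();
            }
            for (Predicate<String> rule : rules) {
                if (!rule.test(value)) {
                    return false;
                }
            }
            return true;
        }

        public static void main(String[] args) {
            EnrichedSchema orderSchema = new EnrichedSchema();
            // The rule travels with the schema as metadata, not with application code.
            orderSchema.addRule("orderStatus",
                v -> v.equals("RELEASED") || v.equals("SHIPPED") || v.equals("DELIVERED"));
            System.out.println(orderSchema.isValid("orderStatus", "SHIPPED"));  // true
            System.out.println(orderSchema.isValid("orderStatus", "UNKNOWN"));  // false
        }
    }

The point of the design is that the rule is registered against the schema once and consulted by every service, rather than being re-coded in each application.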

At runtime, services within the SOA need only access the shared data services for data, rather than having to go to multiple individual data sources via tightly coupled integration channels. During each exchange, data is fully converted and documents are guaranteed to be valid before they are submitted to back-end systems.

In the "order status" example discussed earlier, a developer would create a set of rules on canonical messages sent from order management, manufacturing, and shipping systems. These conditions would fire based on the source system, the target system, and the canonical message type. In cases where a transformation is necessary, the fired condition would apply a mapping that described how to translate among the various definitions of status. Exchange models are a practical part of any SOA because:

  • Conditions and mappings are described as metadata: Unlike logic hand-coded into applications, they can be changed dynamically at runtime.
  • Exchange operations are defined centrally: The transformation, aggregation, validation, and business operations are defined once and take effect throughout the enterprise without exception, eliminating redundant coding and ensuring consistency.
  • Data services are deployed locally: There is no single point of failure or performance bottleneck.
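
A minimal sketch of the rule dispatch described above, keyed on source system, target system, and canonical message type - again with hypothetical names rather than a vendor API:

    import java.util.HashMap;
    import java.util.Map;
    import java.util.function.UnaryOperator;

    // Dispatches exchange rules on (source system, target system, message type),
    // as in the "order status" example above (all names are hypothetical).
    public class ExchangeRuleRegistry {

        private final Map<String, UnaryOperator<String>> rules =
            new HashMap<String, UnaryOperator<String>>();

        private static String key(String source, String target, String messageType) {
            return source + "->" + target + ":" + messageType;
        }

        // Rules are metadata: they can be registered or replaced at runtime
        // without touching the systems on either end.
        public void register(String source, String target, String messageType,
                             UnaryOperator<String> mapping) {
            rules.put(key(source, target, messageType), mapping);
        }

        public String exchange(String source, String target, String messageType,
                               String status) {
            UnaryOperator<String> mapping = rules.get(key(source, target, messageType));
            return mapping == null ? status : mapping.apply(status);
        }

        public static void main(String[] args) {
            ExchangeRuleRegistry registry = new ExchangeRuleRegistry();
            registry.register("orderMgmt", "shipping", "OrderStatus",
                s -> s.equals("RELEASED") ? "READY_TO_SHIP" : s);
            System.out.println(
                registry.exchange("orderMgmt", "shipping", "OrderStatus", "RELEASED"));
            // prints READY_TO_SHIP
        }
    }

Because the mapping is looked up as metadata rather than compiled in, replacing a rule is a registry update, not a rebuild of every participating service.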

Conclusion
IT organizations are architecting their infrastructures to support a greater level of integration and a greater level of flexibility in responding to an ever-changing business climate. For many, this includes a shift to an SOA for service interoperability, but an SOA does nothing to address data interoperability. Many organizations resort to adding data interoperability code to every application; others implement canonical data models, only to run into the shortcomings described above.

Data exchange models go further by allowing data structures and message formats to be enriched with semantic information to fully describe how to interpret and convert the information that is shared between systems. Exchange models are also tiered so that organizations have control over their own data structures and are not forced to program to common models. As an added bonus, the change management facilities inherent in exchange models create infrastructure that can be responsive to the changing needs of business.

About the Author

Coco Jaenicke was, until recently, the XML evangelist and director of product marketing for eXcelon, the industry's first application development environment for building and deploying e-business applications. She is a member of XML-J's Editorial Advisory Board.

