
Defragment Your View of the World for a Quiet Life

An innovative approach to the distributed data problem

The value of two-phase commit transactions has always been that programmers can write applications that access data spread across multiple databases, confident that any updates they make will, at the end of the transaction, be reflected consistently in all of the databases or in none of them.
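The all-or-nothing guarantee can be sketched as a prepare phase followed by a commit phase. This is a minimal, hypothetical simulation of the idea only - the class and function names are invented for illustration, and a real transaction manager also has to log state durably to survive crashes:

```python
# Minimal simulation of two-phase commit: every participant must vote
# "yes" in the prepare phase before any of them is told to commit.
class Participant:
    def __init__(self, name, can_commit=True):
        self.name = name
        self.can_commit = can_commit   # simulate a resource that may refuse
        self.state = "active"

    def prepare(self):
        # Phase 1: the resource guarantees it could commit, then votes.
        self.state = "prepared" if self.can_commit else "aborted"
        return self.can_commit

    def commit(self):
        self.state = "committed"

    def rollback(self):
        self.state = "rolled back"


def two_phase_commit(participants):
    # Phase 2 only begins if *every* participant voted yes in phase 1.
    if all(p.prepare() for p in participants):
        for p in participants:
            p.commit()
        return "committed"
    for p in participants:
        p.rollback()
    return "rolled back"
```

Note that a single "no" vote rolls back every participant, including the healthy ones - that is exactly the consistency property described above.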

Clearly, the reality underlying this observation is that data is split across multiple stores. And how! As project-based applications are built, or bought, data about a single business entity (let's take "customer" as an example) becomes diffused over many back-end systems, and before long the poor guy is on the phone, complaining that you got his personal details wrong, despite his having informed you of his change of address five times in the last six weeks. But that's just the customer's perspective. All sorts of other people feel this pain: managers can't get consolidated views of the things they care about - customers, products, sales, you name it - and are doubtless populating their own personal spreadsheets with data gleaned from five or six different back ends in order to inform their usual run of day-to-day decisions.

So, in response, the IT department sets off on a crusade to build a portal that centralizes access to the data, which is good as far as it goes - at least now the code that is the equivalent of that spreadsheet is managed somewhere on a server (although the manager may debate this as a benefit, since now he can't change anything for three months while he waits for a developer to become free). In the longer term, though, all this turns out to be a mirage as well: as the data landscape changes at the back end, unpredictable quantities of this front-end data integration logic break and need rework; not to mention that almost none of it is reusable - a million different little queries for a million different specific purposes, and growing fast.

This landscape is about as far from the "service-oriented architecture" vision as it is possible to be - all of the pain of a stovepiped approach to functionality (and therefore data) in one painful place. It is because of the importance of logically organized, coherent data, and the grim reality of its fragmented storage that a data services layer is one of the first architectural layers that gets looked into as IT departments start to review their desired reference architecture for a move to SOA.

In the past, this problem has been addressed by building data-mart-style intermediate databases, drip-fed via ETL from the fragmented data sources of record. This does provide a consistent data model to query, but at the cost of building and maintaining the ETL, and with the caveat that the data can never be absolutely up-to-date. After all, you are querying a consistent view of the data, but it's a snapshot - not the real thing. Just imagine if there were an infrastructure that could do for the logical presentation of data what the transaction manager does for its transactional consistency...

Well, there is! And as with all of the best and most creative artists, it has recently changed its name. The product formerly known as Liquid Data - now called AquaLogic Data Services - is an infrastructure that provides for a systematic solution to these issues.

Enter ALDS
At the root, you model your data as you'd like it to appear (the logical data model, canonical data representation, call it what you will) and your data as it physically exists (be that data held in relational schemas, applications, Web services, or whatever), and then you declaratively define how the physical data elements relate to the logical data model using the W3C-standard XQuery language. In this way, you have solved the fundamental problem of understanding where the data you want to look at comes from, and you have done it in a systematic, managed way. So, looks about as good as the portal vs. spreadsheet phase of evolution then?
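To make the declare-once idea concrete: ALDS expresses these mappings in XQuery, but the shape of the idea can be sketched in a few lines of hypothetical Python, with each logical field declaring which physical fragment it is assembled from. All names and data here are invented for illustration:

```python
# Two "physical" sources each hold a fragment of the same customer.
crm_rows = {42: {"first": "Ada", "last": "Lovelace"}}
billing_rows = {42: {"addr": "12 Analytical Row", "balance": 10.0}}

# The logical model is declared once: each logical field says how it is
# derived from the physical fragments (ALDS does this in XQuery).
CUSTOMER_VIEW = {
    "name": lambda cid: crm_rows[cid]["first"] + " " + crm_rows[cid]["last"],
    "address": lambda cid: billing_rows[cid]["addr"],
    "balance": lambda cid: billing_rows[cid]["balance"],
}

def get_customer(cid):
    # Assemble the logical record by evaluating each declared mapping.
    return {field: resolve(cid) for field, resolve in CUSTOMER_VIEW.items()}
```

Every consumer now queries the one declared `CUSTOMER_VIEW` rather than hand-stitching the two stores together itself.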

Actually, it looks a lot better. Recall that all we did so far is declare how the data is built up via relationships. The actual execution of the queries is managed at runtime by the ALDS query engine, which is guided by your declared relationships. What you run queries against is the logical data model, and what that physically causes to happen is driven by what data you are looking for - if you look for a single element of data that comes ultimately from a single back-end database or application, the query that actually runs will just retrieve the necessary information, not all that it would take to populate the entire logical data model. We have achieved, therefore, not only centralized management of data relationships, but reusable relationships - instead of the million aforementioned combinations of data retrieval for a million different specific purposes we have one declared relationship with as many different filters applied to it at runtime as are needed to address all of the different use cases. Wow, reuse! Sounds like a bit of a SOA to me! Maybe it's not an accident that the name changed and the product became part of BEA's new AquaLogic Service Infrastructure product family!
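The pushdown point - ask for one field, touch one backend - can be demonstrated with a toy sketch that counts queries per source. Again, this is an invented illustration of the behavior described above, not how the ALDS engine is implemented:

```python
# Each "backend" counts how often it is actually queried.
class Source:
    def __init__(self, rows):
        self.rows = rows
        self.queries = 0

    def fetch(self, cid, col):
        self.queries += 1
        return self.rows[cid][col]

crm = Source({42: {"name": "Ada Lovelace"}})
billing = Source({42: {"balance": 10.0}})

# One declared relationship: logical field -> (backend, physical column).
FIELD_SOURCE = {"name": (crm, "name"), "balance": (billing, "balance")}

def query(cid, fields):
    # Only the backends holding the requested fields get hit.
    result = {}
    for f in fields:
        src, col = FIELD_SOURCE[f]
        result[f] = src.fetch(cid, col)
    return result
```

Requesting only `balance` leaves the CRM backend completely untouched - the filter applied at runtime decides what physical work happens.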

And because the queries are pushed down to the data sources of record at runtime, the data we see is absolutely up-to-date, removing another of the problems with the traditional "just add another database" elastoplast solution. Finally, in addition to all this, ALDS allows you to model the relationships between your physical data sources, so as the back ends change you can quickly analyze what will be impacted (and, of course, the impact will be restricted to the ALDS tier - the front ends will continue to work against the same logical data model).
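The impact-analysis idea reduces to walking the declared dependencies between physical sources and logical views. A hedged sketch, with hypothetical table and view names:

```python
# Declared dependencies: which physical tables each logical view is built on.
DEPENDS_ON = {
    "customer_view": ["crm.person", "billing.account"],
    "order_view": ["orders.order_header"],
    "statement_view": ["billing.account", "orders.order_header"],
}

def impacted_by(physical_table):
    """Logical views that need review if this physical table changes."""
    return sorted(v for v, srcs in DEPENDS_ON.items() if physical_table in srcs)
```

Because the relationships are declared in one place, the question "what breaks if `billing.account` changes?" becomes a lookup instead of an archaeology project.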

But that's not all: the product allows updates as well as queries. Updates are decomposed and pushed to the back-end data sources by the query engine, and there is an update framework in which you can manage the consistency of updates across multiple stores as appropriate.
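Update decomposition is the mapping run in reverse: each changed logical field is routed back to the physical store and column it came from. A minimal sketch of that routing, with invented names (the real update framework also has to worry about ordering and consistency across stores):

```python
# Two physical stores, as before (plain dicts standing in for databases).
crm = {42: {"cust_name": "Ada Lovelace"}}
billing = {42: {"addr": "12 Analytical Row"}}

# Reverse mapping: logical field -> (store, physical column).
WRITE_ROUTE = {"name": (crm, "cust_name"), "address": (billing, "addr")}

def update_customer(cid, changes):
    # Decompose the logical update and push each piece to its own store.
    for field, value in changes.items():
        store, col = WRITE_ROUTE[field]
        store[cid][col] = value
```

An address change written against the logical model lands only in the billing store; the CRM record is never touched.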

And what's the role of the transaction manager in all of this? Well, if all your data sources are XA-compliant, ALDS can use the transaction manager to give you update capability with no need to write any code. Magic - I'll have one!

In conclusion, I would encourage anybody who is grappling with these types of issues to give ALDS a test run. Because it's an innovative approach to the distributed data problem, a lot of people's initial reaction is "oh, it won't go fast enough" or some such. In practice, I have never failed to see these same individuals' hearts melt as they see the power (and efficiency) of the product and its approach to data integration. Go on... give it a spin! You know it makes sense! Read more about the ALDS product at http://dev2dev.bea.com/dataservices/.

More Stories By Peter Holditch

Peter Holditch is a senior presales engineer in the UK for Azul Systems. Prior to joining Azul he spent nine years at BEA Systems, starting as one of their first Professional Services consultants in Europe and finishing up as a principal presales engineer. He has an R&D background (having originally worked on BEA's Tuxedo product) and his technical interests are in high-throughput transaction systems. Off the pitch, Peter likes to brew beer, build furniture, and undertake other ludicrously ambitious projects - but (generally) not all at the same time!

