Transactions: That's enough of your source!

A common complaint in the transaction newsgroup is, "I've done my database updates in a JTA transaction, but they didn't complete as a unit!"

In many cases, the explanation for this unfortunate loss of ACID is that the database connections used in the logic weren't obtained from a transactional data source, or Tx Data Source as it's abbreviated in the console. The short explanation is that any database connection that is to participate in a distributed transaction needs to expose an XA interface, so you need a data source that can pass back an XA connection and that references a pool of XA database connections. The data source capable of passing through an XA connection is, you guessed it, the Tx Data Source. The bottom line: when you set up a database connection, use a Tx Data Source and an XA-compliant JDBC driver if there's any chance you will want to access it in a distributed transaction. In fact, I would boldly go so far as to say that you should always use XA drivers (where available) and Tx Data Sources, to avoid future problems as code and infrastructure reuse change the interactions between the elements of your system.

By way of illustration, the example server shipped with WebLogic Server 7.0 provides a demo pool of connections to the PointBase sample database called demoXAPool. The driver class name used by this pool is com.pointbase.xa.xaDataSource, the PointBase driver class that exposes the XA interface. This pool is in turn accessed through the Tx Data Source called examples-dataSource-demoXAPool, which exposes the XA capabilities of the PointBase driver up to the calling layers, allowing database updates to participate in JTA distributed transaction processing.
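For reference, this wiring amounts to two entries in the server's config.xml: a connection pool using the XA driver class, and a Tx Data Source layered over it. The sketch below is from memory and the attribute set and URL are approximate, so treat it as illustrative and check a console-generated config.xml for the exact form:

```xml
<!-- Connection pool whose DriverName is PointBase's XA driver class -->
<JDBCConnectionPool Name="demoXAPool"
    DriverName="com.pointbase.xa.xaDataSource"
    URL="jdbc:pointbase:server://localhost/demo" />

<!-- Tx Data Source over the pool, exposing its XA capability via JNDI -->
<JDBCTxDataSource Name="examples-dataSource-demoXAPool"
    JNDIName="examples-dataSource-demoXAPool"
    PoolName="demoXAPool" />
```

The key point is the pairing: an XA driver class in the pool, and a Tx Data Source (not a plain data source) in front of it.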

Get Something for Nothing?
If you've been following this example in your console, you may have noticed a checkbox in the configuration of the Tx Data Source entitled "Emulate Two-Phase Commit for non-XA Driver". If you were feeling optimistic when you saw it, your mind may have started racing and you may have reached the conclusion that if you have a non-XA-capable driver, you need simply check this box and all will be well, by virtue of WebLogic Server's ability to somehow emulate XA at the click of a checkbox. If so, my friend, I advise you to wake up and smell the heuristics, preferably quickly, before you lose the integrity of your valuable business data. You may have been led to this conclusion even more quickly in the 6.x versions of the server, where, if memory serves, this option was simply entitled "Enable Two-Phase Commit". It would be surprising if all the effort that database vendors put into supporting the XA standard could be emulated simply by checking a box, and the world being generally free from such surprises, it cannot.

Recall from past articles that the XA interface allows a transaction manager to commit a database transaction in two phases. Instead of just saying "commit" - the standard transaction demarcation verb available in all database drivers - the TM can, through the offices of XA, say "prepare" and then, when it has finally made up its mind, say "commit". Without the XA interface there is no way to achieve this, and in fact the "emulation" of two-phase commit provided by WebLogic Server is simply to answer "yes" to an instruction to prepare and then pass an instruction to commit through to the (non-XA) database connection as an ordinary one-phase commit. If all goes well, this yields something that looks like a distributed transaction.
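To make the hazard concrete, here is a deliberately simplified model - my own toy sketch, not WebLogic code - of the emulation. The emulated resource answers "yes" to prepare without actually preparing anything, so its later commit can still fail after the other resources are past the point of no return:

```java
import java.util.ArrayList;
import java.util.List;

public class EmulatedTwoPhaseCommit {
    interface Resource {
        boolean prepare();
        boolean commit();
        String name();
    }

    // A real XA resource: once prepare succeeds, commit is guaranteed.
    static class XaResource implements Resource {
        private final String name;
        XaResource(String name) { this.name = name; }
        public boolean prepare() { return true; }   // durably prepared
        public boolean commit()  { return true; }   // cannot fail after prepare
        public String name()     { return name; }
    }

    // The emulated resource: "yes" to prepare is a bluff; commit can still fail.
    static class EmulatedResource implements Resource {
        private final boolean failsAtCommit;
        EmulatedResource(boolean failsAtCommit) { this.failsAtCommit = failsAtCommit; }
        public boolean prepare() { return true; }            // nothing actually prepared
        public boolean commit()  { return !failsAtCommit; }  // ordinary one-phase commit
        public String name()     { return "non-XA"; }
    }

    // Returns the names of the resources that committed. If any prepare fails,
    // everything rolls back and ACID holds; a commit failure cannot be undone.
    static List<String> runTransaction(List<Resource> resources) {
        for (Resource r : resources) {
            if (!r.prepare()) return new ArrayList<>(); // roll back all: still ACID
        }
        List<String> committed = new ArrayList<>();
        for (Resource r : resources) {
            if (r.commit()) committed.add(r.name());    // failures here are unrecoverable
        }
        return committed;
    }

    public static void main(String[] args) {
        List<Resource> tx = new ArrayList<>();
        tx.add(new XaResource("db1"));
        tx.add(new EmulatedResource(true)); // fails after its bluffed prepare
        tx.add(new XaResource("db2"));
        // db1 and db2 commit; the non-XA update is silently lost
        System.out.println(runTransaction(tx));
    }
}
```

Run it and you see the partial outcome the article describes: the two XA databases commit while the non-XA update vanishes, with no rollback possible.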

If all goes badly, this will be the beginning of your (or your WebLogic administrator's) nightmare. By way of illustration, imagine a failure mode in which the non-XA database becomes corrupted, or fails, after the prepare has happened. When the time comes for the commit call to be issued, there will be no way back: the commit will fail because of the corruption or disconnection, but by this point all the other databases will be gaily committing their own portions of the transaction, despite the loss of the update. Real ACID would demand that all the other databases roll back to preserve transactional integrity. As you can see, we have lost our ACID, and with it our consistent data. In fact, this "simulation" is only guaranteed to work if nothing in the processing environment goes wrong during certain critical windows around commit processing. Since transactions are supposed to be all about recovering from failures - especially tricky failures during critical windows - the simulation is of no real value. The advice must be: use it as a convenience during development if you don't have an XA database driver at hand, but never check that box in a production system.

Last One to the Moon by Magic
So the XA transaction simulation certainly isn't anything close to magic moon dust. But there is something that is… Last Resource Optimization, or LRO.

Imagine a transaction spanning several resources (generally, a resource is a fancy word for a database connection) where all but one of the resources are XA compliant. Now imagine that at prepare time the transaction manager (TM) successfully prepares all the XA resources. Without the non-XA resource, with all the prepares having come back OK, the decision to commit would be logged by the TM and the commits would be issued. With the non-XA resource, we can shrink the window for failure to a very small duration if we log the decision and then immediately commit the single non-XA resource with a one-phase commit instruction. If this one-phase commit succeeds, we can go ahead with the normal second-phase commitment of the rest of the resources. With this scheme, we are only vulnerable to failure in the small window between the time the commit decision is logged and the time the non-XA commit completes. After that point, if anything fails, the non-XA resource will already have been committed, and eventually the XA resources will be rolled forward by the standard recovery processing.

In fact, we could go a step further and record the transaction id in the non-XA resource itself. Now we could recover the transaction with certainty, because during recovery processing the TM could check the non-XA datastore to see whether the non-XA commit succeeded: if the in-doubt transaction id is stored there, the commit happened; otherwise it didn't. On that basis, the rest of the resources could be recovered to their pre- or post-transaction states as appropriate.
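The ordering is the whole trick, so here is a minimal sketch of it - again my own illustration, not the TM's actual implementation, with made-up resource names - that emits the steps a coordinator would take under LRO:

```java
import java.util.ArrayList;
import java.util.List;

public class LastResourceOptimization {
    // The ordered steps an LRO-capable TM would take, returned as strings
    // so the sequence can be inspected: prepare all XA resources, log the
    // decision, one-phase-commit the single non-XA resource, then commit
    // the XA resources in the normal second phase.
    static List<String> commitSequence(List<String> xaResources, String nonXaResource) {
        List<String> steps = new ArrayList<>();
        for (String r : xaResources) {
            steps.add("prepare " + r);              // phase one: XA resources only
        }
        steps.add("log commit decision");           // point of no return
        steps.add("one-phase commit " + nonXaResource); // the small vulnerable window
        for (String r : xaResources) {
            steps.add("commit " + r);               // phase two: roll everyone forward
        }
        return steps;
    }

    public static void main(String[] args) {
        // Hypothetical resources: two XA databases plus one non-XA packaged app
        commitSequence(List.of("db1", "db2"), "packagedApp")
            .forEach(System.out::println);
    }
}
```

Note that the non-XA commit sits strictly between the logged decision and the XA commits: everything before it can still be rolled back, and everything after it can be rolled forward by recovery.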

In a world containing nothing but XA resources, this LRO technique would be nothing more than an interesting curiosity, and most databases are XA compliant, so why am I boring you with this arcana? Well, many applications touch more than just databases: in these days of systems composed from a combination of bought and built functionality, most bespoke code accesses both databases (which are indeed generally XA compliant) and packaged applications (which generally are not). In fact, the whole point of the BEA WebLogic Platform is that most applications will involve elements of integration. In this scenario, to lever the maximum value out of the application server's transaction management capabilities, LRO would be a good feature to have.

A Little Bird Speaks of Clear Blue Water
As it turns out, a little bird told me that the next release of WebLogic Server is scheduled to support the small-window variant of LRO. When it is released, you will be able to build applications that use the transaction manager to protect data integrity across databases and an application view connected to an application through a J2EE Connector Architecture adapter, even if the underlying application doesn't support XA. Or distribute a transaction across databases and a nontransactional message bus, or across multiple databases where one doesn't support XA, via a new supercharged version of the WebLogic JTS driver.

It's features such as LRO, fitting as they do neatly beneath the J2EE API definitions without being specified by any of them, that sort the men from the boys and keep WebLogic Server ahead of the app server pack - a pack composed not only of competing vendors' products but of open-source implementations and the like. J2EE support may be a commodity, in the same way that support for ANSI SQL in an RDBMS is, but it is in the application server beneath - the engine room of the modern enterprise - that commodity stops and clear blue water divides the best from the rest.

More Stories By Peter Holditch

Peter Holditch is a senior presales engineer in the UK for Azul Systems. Prior to joining Azul he spent nine years at BEA Systems, starting as one of their first Professional Services consultants in Europe and finishing up as a principal presales engineer. He has an R&D background (originally having worked on BEA's Tuxedo product) and his technical interests are in high-throughput transaction systems. Off the pitch, Peter likes to brew beer, build furniture, and undertake other ludicrously ambitious projects - but (generally) not all at the same time!


