Building Adaptive Applications With Reusable Components

Two daunting tasks face application architects and project managers alike. The first is to architect a solution that reduces the risks involved in implementing new and changing business requirements during both the development and post-deployment stages of a project. The second is to architect a solution that shortens development time and increases the corporation's return on investment (ROI) in past projects by reusing prebuilt visual and business components.

BEA WebLogic Portal 4.0 contains frameworks designed to allow the creation of adaptive applications using visual, data-cleansing, and business components reaped from previous application development projects. The Webflow and Pipeline frameworks enable developers to dynamically link together reusable components that follow the Model-View-Controller (MVC) design pattern.

Webflow Framework
The Webflow framework is designed to build Web applications that can adapt to changing Web-business requirements. The main controller behind the Webflow framework is the WebflowExecutor class (see Figure 1). The WebflowExecutor intercepts events emanating from a visual, input-processor, or business component, consults a centralized XML-based application scenario file, and invokes the next predefined visual, input-processor, or business component.

The XML-based application scenario file enables the decoupling of application components, which is key to developing adaptive applications with reusable components. During runtime, each component sends an event type and location identifier to the WebflowExecutor via the WebflowServlet. The WebflowExecutor then determines the next component in the sequence to execute. The runtime determination of sequence flows by the WebflowExecutor allows the developer to simply modify the scenario file in order to alter the business flow of the online application. The alteration of the sequence file may include rearranging existing links between components, removing existing components, or adding new components. The ability to add and remove components in an application that uses the Webflow and Pipeline frameworks without any code modifications is made possible by using the provided JSP Webflow tag libraries (in the case of visual components) and by implementing the appropriate interfaces (in the case of input processors and pipeline components).
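As a rough illustration, a scenario entry maps an origin and event to the next component to invoke. The element and attribute names below are invented for this sketch and do not reproduce the actual WebLogic Portal schema:

```xml
<!-- Hypothetical scenario entry; element names are illustrative only -->
<webflow namespace="order">
  <node name="checkout.jsp" type="jsp">
    <event name="link.continue">
      <destination name="quantityProcessor" type="inputprocessor"/>
    </event>
  </node>
</webflow>
```

Rerouting "link.continue" to a different destination changes the application flow without touching the JSP itself.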

Visual Components
The visual components that can be used in the Webflow framework are JSP and HTML pages. The only events visual components can generate are hyperlinks and form submissions. In the case of JSP pages, the developer need only inform the Webflow servlet of the event type and origination ID, which are then forwarded to the WebflowExecutor. Helper JSP tag libraries are provided to simplify this, as shown below.

<a href="<webflow:createWebflowURL origin="checkout.jsp" event="link.continue" httpsInd="http"/>"> Check Out </a>

The example above is invoked when the user selects the corresponding hyperlink; the origin ID ("checkout.jsp"), the event type ("link.continue"), and the protocol to be used ("http") are then sent to the Webflow servlet. An HTML visual component passes the Webflow servlet the same information as a JSP visual component. However, since JSP tag libraries cannot be used in HTML pages, the information must be placed explicitly in the appropriate format, as shown below:

<a href="http://localhost:7501/example/application?namespace=order&origin=checkout.jsp&event=link.continue"> Check Out</a>

The fact that destinations aren't hard-coded in each visual component allows the developer to rearrange the site map without having to modify JSP or HTML code.

Input Processor Components
In order to ensure that the information being sent to the business components is valid and well-formatted, a data-cleansing step should be added to the execution flow. InputProcessors are specialized Java classes that can be used to perform the data-cleansing task. To add an InputProcessor to the Web application scenario, a developer would simply implement the InputProcessor interface or extend the InputProcessorSupport class (see Figure 2), and link the InputProcessor to the appropriate components in the XML-based scenario file.
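The actual BEA interface isn't reproduced here; the following is a minimal sketch under the assumption of a simplified InputProcessor contract in which the returned string names the result that the WebflowExecutor looks up in the scenario file:

```java
import java.util.Map;

// Hypothetical, simplified contract -- not the actual WebLogic Portal API.
interface InputProcessor {
    String process(Map<String, String> requestParams) throws Exception;
}

// Validates that the submitted quantity is a positive integer before any
// business component sees it.
class QuantityInputProcessor implements InputProcessor {
    public String process(Map<String, String> requestParams) {
        String qty = requestParams.get("quantity");
        try {
            if (Integer.parseInt(qty) <= 0) {
                return "invalid"; // scenario file maps this back to the form
            }
        } catch (NumberFormatException e) {
            return "invalid"; // missing or non-numeric input
        }
        return "success"; // scenario file maps this to the next component
    }
}
```

In the scenario file, the "success" and "invalid" results would each be linked to a different destination, which is also how the same mechanism supports conditional branching.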

The InputProcessor developer may also use an implementation of the ValidatedValues interface to report the results of the completed data-cleansing task back to the visual component, where the status of each form field can then be accessed via the provided JSP tags. In addition to cleansing data, InputProcessors can be used to perform conditional branching within a Web application: depending on the branching logic contained in the InputProcessor, the return value tells the WebflowExecutor which visual or business component to execute next.

After the InputProcessor completes its processing, a return object is sent back to the WebflowExecutor. The return object, coupled with knowledge of the InputProcessor type, allows the WebflowExecutor to determine the next component to execute by referring to the XML-based scenario file. As with the visual components, hard-coded links to other components are not implemented in InputProcessors, which allows the developer to reuse data-cleansing and conditional-branching components and to change the application's behavior without recoding.

Pipeline Framework
The Pipeline framework is designed to build business applications that can adapt to changing business requirements. The framework follows the Pipe-and-Filter architectural pattern and works in conjunction with the Webflow framework. After the WebflowExecutor has consulted the XML-based application scenario file and determined that a pipeline needs to be executed, control is handed over to the PipelineExecutor to execute the pipeline. After completion of the pipeline invocation, control is handed back to the WebflowExecutor to determine which components in the application sequence to invoke next. It's important to understand that all the components in the Webflow and Pipeline frameworks can be linked together in any sequence and that information may be shared between these components through the PipelineSession object.

Pipeline
The PipelineExecutor serves as the controlling class in the Pipeline framework, handling the linkage and exceptions between pipeline components. The pipeline components are essentially business components that perform the required business functionality. Each pipeline is composed of a sequence of one or more pipeline components, and any of them can generate exceptions. These exceptions are either fatal, which stops the processing of the pipeline, or nonfatal, which allows the next component in the sequence to be invoked. A pipeline can also be wrapped in a single transactional context, allowing the components in the pipeline to act as one atomic unit of work. As with the WebflowExecutor in the Webflow framework, the PipelineExecutor references a pipeline scenario script to determine which pipeline component to invoke next, given a specific event and origination.
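As a sketch of the fatal/nonfatal distinction (the class and method names here are illustrative, not the actual PipelineExecutor API), an executor loop might look like this:

```java
import java.util.List;
import java.util.Map;

// Illustrative exception types -- not the actual WebLogic Portal classes.
class NonfatalPipelineException extends RuntimeException {
    NonfatalPipelineException(String m) { super(m); }
}
class FatalPipelineException extends RuntimeException {
    FatalPipelineException(String m) { super(m); }
}

interface PipelineComponent {
    void process(Map<String, Object> session);
}

class SimplePipelineExecutor {
    // Runs each component in order: a nonfatal exception is recorded and the
    // next component still runs; a fatal one aborts the rest of the pipeline.
    void execute(List<PipelineComponent> pipeline, Map<String, Object> session) {
        for (PipelineComponent component : pipeline) {
            try {
                component.process(session);
            } catch (NonfatalPipelineException e) {
                session.put("lastWarning", e.getMessage()); // keep going
            } catch (FatalPipelineException e) {
                session.put("fatalError", e.getMessage());
                return; // abort remaining components
            }
        }
    }
}
```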

Business Component
The business components that can be linked together in a single pipeline may consist of Java classes and stateless session EJBs. A business component (i.e., pipeline component) is generally used to execute coarse-grained business logic or to integrate with existing back-end systems.

To add a business component to the pipeline sequence, a developer would simply implement the PipelineComponent interface or extend the PipelineComponentSupport class (see Figure 3), and link the business component implementation to the other business components in the XML-based pipeline scenario file.

In order to allow the sharing of information in a pipeline, the PipelineSession object is passed between business components via the required Process method implementation. The PipelineSession object can also be contained within the same transactional context as the components in the pipeline itself.
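Under similar simplifying assumptions (the interface below is illustrative, with the PipelineSession modeled as a plain map rather than the real BEA class), a business component that shares data through the session might look like this:

```java
import java.util.List;
import java.util.Map;

// Illustrative, simplified contract -- not the actual WebLogic Portal API;
// the PipelineSession is modeled here as a plain Map.
interface PipelineComponent {
    void process(Map<String, Object> pipelineSession);
}

// Reads line-item prices placed in the session by an earlier component and
// writes the computed order total back for downstream components.
class CalculateOrderTotalComponent implements PipelineComponent {
    public void process(Map<String, Object> pipelineSession) {
        double total = 0.0;
        for (Object price : (List<?>) pipelineSession.get("itemPrices")) {
            total += ((Number) price).doubleValue();
        }
        pipelineSession.put("orderTotal", total);
    }
}
```

Because each component only reads from and writes to the shared session, components can be reordered in the pipeline scenario file without code changes, as long as their data dependencies are satisfied.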

GUI Editor Tool
Since the XML-based scenario files are integral to the Webflow and Pipeline frameworks, the ability to create and edit them is built into the WebLogic Portal GUI tool, the E-Business Control Center (EBCC). Each framework has its own specialized drawing area (see Figures 4 and 5), and multiple scenario files are differentiated via a namespacing scheme. The visual editor offers drag-and-drop, zoom, scenario-linking, and attribute-editing features. The ability to graphically create an application by linking together visual and business components from a corporation's certified IT application component library is a powerful step toward building adaptable applications in a substantially reduced time frame.

Conclusion
The Webflow and Pipeline frameworks, which are part of the BEA WebLogic Portal product, address the need for an architecture that is able to adapt to changing business requirements and improve ROI by enabling the reuse of application components.

By simply implementing the required framework interfaces, visual, data-cleansing, and business components can easily be linked together to form a fully functional application. This application can then be adapted to address new business functionality by visually relinking new or existing components from previous projects. Additionally, the implementation of the MVC design pattern provides the separation between the presentation logic and the underlying business processes, thus allowing the ability to modify any one of the tiers without impacting the others.

More Stories By Dwight Mamanteo

Dwight Mamanteo is a technical manager with the Global Alliances Technical Services organization at BEA Systems. He has been with BEA since 1999, and his current job responsibilities include providing technical enablement support to BEA's strategic SI and ISV partners. He has been involved with object-oriented programming, design, and architecture since 1993.
