Building Adaptive Applications With Reusable Components

Two daunting tasks face application architects and project managers alike. The first is to architect a solution that reduces the risks involved in implementing new and changing business requirements during the application development and post-deployment stages of a project. The second is to architect a solution that reduces development time and increases the corporation's Return on Investment (ROI) in past projects by reusing prebuilt visual and business components.

BEA WebLogic Portal 4.0 contains frameworks designed to allow the creation of adaptive applications using visual, data-cleansing, and business components reaped from previous application development projects. The Webflow and Pipeline frameworks enable developers to dynamically link together reusable components that follow the Model-View-Controller (MVC) design pattern.

Webflow Framework
The Webflow framework is designed to build Web applications that can adapt to changing Web-business requirements. The main controller behind the Webflow framework is the WebflowExecutor class (see Figure 1). The WebflowExecutor intercepts events emanating from a visual, input-processor, or business component; consults a centralized XML-based application scenario file; and invokes the next predefined visual, input-processor, or business component.

The XML-based application scenario file enables the decoupling of application components, which is key to developing adaptive applications with reusable components. During runtime, each component sends an event type and location identifier to the WebflowExecutor via the WebflowServlet. The WebflowExecutor then determines the next component in the sequence to execute. The runtime determination of sequence flows by the WebflowExecutor allows the developer to simply modify the scenario file in order to alter the business flow of the online application. The alteration of the sequence file may include rearranging existing links between components, removing existing components, or adding new components. The ability to add and remove components in an application that uses the Webflow and Pipeline frameworks without any code modifications is made possible by using the provided JSP Webflow tag libraries (in the case of visual components) and by implementing the appropriate interfaces (in the case of input processors and pipeline components).
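Conceptually, each entry in the scenario file maps an (origin, event) pair to the next component to invoke. The actual WebLogic Portal DTD is not reproduced here; the element and attribute names below are illustrative only, to show the shape of such a mapping:

```xml
<!-- Illustrative sketch, not the real WebLogic Portal scenario schema -->
<webflow namespace="order">
  <!-- From the cart page, "continue" leads to the checkout page -->
  <transition origin="shoppingcart.jsp" event="link.continue" destination="checkout.jsp"/>
  <!-- From checkout, the form submit is routed to a data-cleansing input processor -->
  <transition origin="checkout.jsp" event="form.submit" destination="ValidateAddressIP"/>
  <!-- On success, the input processor hands off to a business pipeline -->
  <transition origin="ValidateAddressIP" event="success" destination="checkoutPipeline"/>
</webflow>
```

Rerouting the application then amounts to editing `destination` values rather than touching component code.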

Visual Components
The visual components that can be used in the Webflow framework are JSP and HTML pages. The only supported events that visual components can generate are hyperlinks and form submissions. In the case of JSP pages, the developer need only inform the Webflow servlet of the event type and origination ID, which are then forwarded to the WebflowExecutor. JSP helper tag libraries are provided to simplify JSP development, as shown below.

<a href="<webflow:createWebflowURL origin="checkout.jsp" event="link.continue" httpsInd="http"/>"> Check Out </a>

The example above is invoked when the user selects the corresponding hyperlink, in which case the origin ID ("checkout.jsp"), the event type ("link.continue"), and the protocol to be used ("http") are sent to the Webflow servlet. In the case of HTML visual components, the developer passes the Webflow servlet the same information as a JSP visual component. However, since JSP tag libraries are not usable in HTML pages, the information has to be placed explicitly in the appropriate format, as shown below:

<a href="http://localhost:7501/example/application?namespace=order&origin=checkout.jsp&event=link.continue"> Check Out</a>

The fact that destinations aren't hard-coded in each visual component allows the developer to rearrange the site map without having to modify JSP or HTML code.

Input Processor Components
In order to ensure that the information being sent to the business components is valid and well-formatted, a data-cleansing step should be added to the execution flow. InputProcessors are specialized Java classes that can be used to perform the data-cleansing task. To add an InputProcessor to the Web application scenario, a developer would simply implement the InputProcessor interface or extend the InputProcessorSupport class (see Figure 2), and link the InputProcessor to the appropriate components in the XML-based scenario file.
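The BEA interfaces themselves are not reproduced here; the sketch below uses an illustrative stand-in for the InputProcessor contract (the real interface lives in the WebLogic Portal packages and declares a ProcessingException) to show the general shape of a data-cleansing component:

```java
import java.util.Map;

// Hypothetical stand-in for the Webflow InputProcessor contract.
interface InputProcessor {
    // Returns an event string that the WebflowExecutor looks up
    // in the XML-based scenario file to pick the next component.
    String process(Map<String, String> request);
}

// Data-cleansing component: validates a zip code field from a checkout form.
class ValidateAddressIP implements InputProcessor {
    public String process(Map<String, String> request) {
        String zip = request.get("zip");
        if (zip == null || !zip.matches("\\d{5}")) {
            return "failure";   // scenario file routes back to the form page
        }
        return "success";       // scenario file routes on to the next component
    }
}
```

Because the component only reports an event name, the decision of where "success" or "failure" leads stays in the scenario file.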

The InputProcessor developer may also use an implementation of the ValidatedValues interface to report the results of the completed data-cleansing task back to the visual component. The visual component developer can then access the status of each form field via the provided JSP tags. In addition to cleansing data, InputProcessors can be used to perform conditional branching within a Web application: depending on the branching logic contained in the InputProcessor, the return value tells the WebflowExecutor which visual or business component to execute next.

After the InputProcessor completes its processing, a return object is sent back to the WebflowExecutor. The return object, coupled with knowledge of the InputProcessor type, allows the WebflowExecutor to determine the next component to execute by referring to the XML-based scenario file. As with the visual components, hard-coded links to other components are not implemented in the InputProcessors, which allows the developer to reuse data-cleansing and conditional-branching components and provides the ability to change the application's behavior without recoding.

Pipeline Framework
The Pipeline framework is designed to build business applications that can adapt to changing business requirements. The framework follows the Pipe-and-Filter architectural pattern and works in conjunction with the Webflow framework. After the WebflowExecutor has consulted the XML-based application scenario file and determined that a pipeline needs to be executed, control is handed over to the PipelineExecutor to execute the pipeline. After completion of the pipeline invocation, control is handed back to the WebflowExecutor to determine which components in the application sequence to invoke next. It's important to understand that all the components in the Webflow and Pipeline frameworks can be linked together in any sequence and that information may be shared between these components through the PipelineSession object.

Pipeline
The PipelineExecutor serves as the controlling class in the Pipeline framework, handling the linkage and exceptions between pipeline components. The pipeline components in the Pipeline framework are essentially business components that perform the required business functionality. Each pipeline can be composed of a sequence of one or more pipeline components, and exceptions can be generated by any of them. These exceptions can either be fatal, which stops the processing of the pipeline, or nonfatal, which allows the next component in the sequence to be invoked. A pipeline can also be wrapped in a single transactional context, allowing the components in the pipeline to act as one atomic unit of work. As with the WebflowExecutor in the Webflow framework, the PipelineExecutor references a pipeline scenario file to determine which pipeline component to invoke next, given a specific event and origination.
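The fatal/nonfatal distinction can be sketched as a simple control loop. This is not the real PipelineExecutor API; the class and exception names below are illustrative, and a plain Map stands in for the PipelineSession:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Illustrative exception carrying the fatal/nonfatal flag.
class PipelineException extends RuntimeException {
    final boolean fatal;
    PipelineException(String msg, boolean fatal) { super(msg); this.fatal = fatal; }
}

// Hypothetical stand-in for the pipeline component contract.
interface PipelineComponent {
    void process(Map<String, Object> session);
}

class PipelineExecutor {
    // Runs each component in order. A fatal exception aborts the pipeline;
    // a nonfatal one is recorded and the next component still runs.
    static List<String> run(List<PipelineComponent> pipeline, Map<String, Object> session) {
        List<String> nonfatalErrors = new ArrayList<>();
        for (PipelineComponent component : pipeline) {
            try {
                component.process(session);
            } catch (PipelineException e) {
                if (e.fatal) throw e;          // stop processing the pipeline
                nonfatalErrors.add(e.getMessage()); // continue with next component
            }
        }
        return nonfatalErrors;
    }
}
```

Wrapping the loop in a transaction (so the pipeline commits or rolls back as one unit) is the piece this sketch omits.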

Business Component
The business components that can be linked together in a single pipeline may be Java classes or stateless session EJBs. The business component (i.e., pipeline component) is generally used to execute coarse-grained business logic or to integrate with existing back-end systems.

To add a business component to the pipeline sequence, a developer would simply implement the PipelineComponent interface or extend the PipelineComponentSupport class (see Figure 3), and link the business component implementation to the other business components in the XML-based pipeline scenario file.

In order to allow the sharing of information in a pipeline, the PipelineSession object is passed between business components via the required process() method implementation. The PipelineSession object can also be contained within the same transactional context as the components in the pipeline itself.
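A business component in this style reads its input from the shared session and writes its result back for the next component in the chain. As before, this is a sketch: a plain Map stands in for the PipelineSession, and the class name is invented for illustration:

```java
import java.util.List;
import java.util.Map;

// Illustrative pipeline component: totals the order's line prices.
// A real component would implement BEA's PipelineComponent interface
// and receive a PipelineSession rather than a plain Map.
class CalcOrderTotalPC {
    @SuppressWarnings("unchecked")
    public Map<String, Object> process(Map<String, Object> session) {
        // Read what an upstream component placed in the session...
        List<Double> prices = (List<Double>) session.get("linePrices");
        double total = prices.stream().mapToDouble(Double::doubleValue).sum();
        // ...and publish the result for the next component in the pipeline.
        session.put("orderTotal", total);
        return session;
    }
}
```

Because components communicate only through session keys, any component that produces "linePrices" can precede this one, which is what makes relinking them in the scenario file safe.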

GUI Editor Tool
Since the XML-based scenario files are integral to the Webflow and Pipeline frameworks, the ability to create and edit these files is incorporated into the WebLogic Portal GUI tool, the E-Business Control Center (EBCC). Each framework has its own specialized drawing area (see Figures 4 and 5) and multiple scenario files are differentiated via a namespacing scheme. The visual editing tool is very easy to use with drag-and-drop, zoom, scenario-linking, and attribute-editing features. The ability to graphically create an application by linking together visual and business components from a corporation's certified IT application component library is a powerful step in building adaptable applications in a substantially reduced time frame.

Conclusion
The Webflow and Pipeline frameworks, which are part of the BEA WebLogic Portal product, address the need for an architecture that is able to adapt to changing business requirements and improve ROI by enabling the reuse of application components.

By simply implementing the required framework interfaces, visual, data-cleansing, and business components can easily be linked together to form a fully functional application. This application can then be adapted to address new business functionality by visually relinking new or existing components from previous projects. Additionally, the implementation of the MVC design pattern separates the presentation logic from the underlying business processes, allowing any one tier to be modified without impacting the others.

More Stories By Dwight Mamanteo

Dwight Mamanteo is a technical manager with the Global Alliances Technical Services organization at BEA Systems. He has been with BEA since 1999 and his current job responsibilities include providing technical enablement support to BEA's strategic SI and ISV partners. He has been involved with object-oriented programming, design and architecture since 1993.


