
Consistent Content Management and Delivery


The mantra "Content is King" has proven itself on the Internet, and many now ask, "How do we make content a focus?" Whether you run a content-heavy Internet Web site or maintain a vast base of enterprise content inside your corporate intranet, you want to give your users a quick and efficient way to access and produce relevant content.

A common architecture for easing content creation and consumption pairs a portal server with a content management system (CMS). Both classes of product offer many services that apply business rules to how enterprise content is created and consumed - in our case, WebLogic Portal and Documentum 4i.

BEA WebLogic Portal 4.0 and Documentum 4i are two application platforms whose integration raises many architectural issues and concerns. In this article, I'll discuss two architectures that show how to secure, organize, manage, and deliver Web content. The difference is that one installation is focused on intranet consumption, the other on Internet consumption.

Documentum 4i
Documentum 4i is an enterprise content management solution. We are primarily concerned with how it handles Web content (XML and HTML). Documentum 4i's content repository is called the DocBase. Documentum has many add-on products that interact with the DocBase, including WebPublisher, WebCache, and eConnector.

WebPublisher is the Web front end and allows administrators, publishers, editors, and authors to create, edit, and manage the enterprise content. One of WebPublisher's best features is its Web-based XML authoring tool. This allows the content publishing team to fully manage XML content without knowing anything about XML and its specification. To create different outputs from the XML, WebPublisher allows "renditions" to be created from the XML. These rendition templates are standard XSLs.

WebCache is essentially the deployment engine. It has an internal scheduler that runs deployment jobs configured to control who deploys what content from the Documentum DocBase, and when, where, and why. WebCache has a sender, "WebCache Source," and a receiver, "WebCache Target." The Source needs to interact with the DocBase to retrieve the content; the Target needs to sit wherever your content delivery solution is.

Finally, we have the eConnector for BEA. The eConnector is Documentum's implementation of the WebLogic Content Manager. What is the Content Manager? This is the edocs.bea.com description:
The Content Manager runtime subsystem provides access to content through tags and EJBs. The Content Management tags allow a JSP developer to receive an enumeration of Content objects by querying the content database directly using a search expression syntax. The Content Manager component works alongside the other components to deliver personalized content, but does not have a GUI-based tool for edit-time customization.

WebLogic Portal
I won't go into all aspects of BEA WebLogic Portal, but will discuss some of what it can do for our Web content. I'll look at pieces of WebLogic Portal like the content manager, EJBs, content selectors, and the PZ tag library. At a high level, Portal's personalization server provides a framework so that developers can deliver content in a secure, organized, and dynamic way.

Similar Architectures; Different Requirements
The Internet Architecture

The Documentum architecture is typical for an Internet Web site installation (see Figure 1). I call it typical because Documentum, and the content, rests securely inside the corporate intranet and your content is behind many stringent firewall rules. In the corporate DMZ, WebLogic Portal waits to deliver the content to our Internet customers. WebCache Source is installed on the same server as the main Documentum product, inside the intranet. WebCache Target is installed on the same server as WebLogic Portal, in the DMZ. These products are configured to securely deliver "active" content from the Source through the firewalls to the Target. The eConnector product is installed using the WebCache implementation method.

The Intranet Architecture
In a corporate intranet portal (see Figure 2) we're far less concerned with firewalls and with securing the CMS and delivery solutions from the Internet than we are in the Internet architecture, because all the installations sit safely inside the corporate firewall. Documentum and WebLogic Portal are installed with no firewalls between them, so there's no need to involve WebCache in this installation. You may ask: how will WebLogic Portal deliver the content from Documentum? The eConnector can be configured to interact directly with the DocBase. This is much simpler - there are no firewall or port restrictions to work around, and no WebCache products to install and configure.

Setting Up Documentum to Work with eConnector
Documentum is a large application with many different ways to manage structured and unstructured enterprise content. As a side note, I strongly suggest engaging a senior-level Documentum consultant when you initially implement the product: every installation eventually digs into its guts, and the learning curve is steep. I'll point out several pieces of the product that directly affect the integration of these robust applications.

Creating the Content
Every piece of content that goes into the CMS needs an associated document type. A document type is a DocBase object that defines what a piece of content is to the CMS. We are mainly concerned with the document-type properties, which hold the content classification data. For our purposes, this classification ties content into our Web site's taxonomy. The properties also enable more precise searches for content, both inside the CMS and on our Web site.

After you set up the document type, you'll need to set up a content template in WebPublisher. You have to establish three different objects and set some configuration information. First, you need to assign a document type to this content type. The next property is a default workflow. Next, you'll need a content template - a blank XML document that contains all available XML tags. We all know that this should be a DTD but that's not what Documentum implemented. To complement the content template we need to define a rules file. The rules file is an XML document that maps content template tags to Web form controls. Finally, we have the renditions, also called output presentations. Presentations are XSLs that format the XML content to the various outputs you may need. For our purposes, we'll need an HTML presentation.
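To make the pairing concrete, here is a rough sketch of a content template and its rules file. The element names, and the rules-file schema in particular, are illustrative only and do not reproduce Documentum's exact format:

```xml
<!-- Sketch of a content template: a blank XML instance that lists
     every available tag (an instance document, not a DTD). -->
<press_release>
  <title></title>
  <summary></summary>
  <body></body>
</press_release>
```

```xml
<!-- Sketch of the matching rules file: maps template tags to Web
     form controls. Element and attribute names are hypothetical. -->
<rules>
  <widget tag="title"   type="textline" label="Title"/>
  <widget tag="summary" type="textarea" label="Summary"/>
  <widget tag="body"    type="content"  label="Body"/>
</rules>
```

The point of the split is that authors only ever see the generated Web form; the template and rules file together decide what the form captures.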

Now Documentum is ready to have an author create some content. I must point out that the XML authoring tool does have an embedded HTML authoring tool for the text body form format, but it doesn't perform much more than RTF capabilities (see Listing 1; the code for this article can be found online at www.sys-con.com/weblogic/sourcec.cfm).

Managing the Content
After the author creates and saves the content, the default workflow is initiated (see Figure 3). We will use a workflow that automatically promotes the content to an "active" status; in practice this is strongly discouraged, since usually an editor needs to view, edit, and approve the content before it moves to active status. "Active content" is Documentum's term for a piece of content that has been fully approved through a workflow approval cycle. At this point our piece of content is ready for deployment to the Web site. Although I won't explain them in depth, there are many other benefits to using a CMS like Documentum. Features of a CMS include, but are not limited to, user/content security, aggregation services, workflow, versioning, outputting/exporting services, and workgroup collaboration services.

Deploying the Content
In comes WebCache Source. We need to set up a Web Publishing Configuration to deploy our active content, which we can do in the Documentum DocBase Administrator. Among the things we need to know are the IP address, database table, and target directory of the WebCache Target, so the Source can communicate with it. The most important part of setting up a Web Publishing Configuration is the additional attributes section; these attributes directly relate to the document-type properties. Web Publisher sends the additional attributes to the database table, which will go into the eConnector setup. Finally, we need to set up a Documentum Job. The Documentum Job is the internal scheduler, like a crontab; we want a Job that runs the Web Publishing Configuration every x hours or x minutes. These WebCache Jobs run transactionally, meaning they only send content that has changed since the last run, conserving network and processing resources.

Initial BEA WebLogic Setup Including the Documentum Client
One configuration common to both architectures is that BEA WebLogic must be configured to communicate with the Documentum client. The Documentum client is a set of Java libraries, OS libraries, and some property files. The important Java API between WebLogic and the Documentum client is the dfc.jar file, the Documentum Foundation Classes. This set of Java classes communicates with the Documentum client, which is only available on a few OSs. (Consult Documentum for their support requirements; I have used it on Solaris.)

First, install eConnector. Next, add all the necessary Documentum client files to the WebLogic paths. Listing 2 establishes the Documentum eConnector client home. Then put two JAR files, dmjdbc.jar and the all-important dfc.jar, into the db_classpath along with the Oracle JDBC drivers; the db_classpath eventually goes into the classpath. Next, we need a pointer to the Documentum client property file, dmcl.ini, in our shell environment. This file contains the properties that connect the Documentum client to the Documentum DocBase. Finally, we need the actual Documentum client library, the OS-dependent libdmcl40.so, in our LD_LIBRARY_PATH.
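Listing 2 itself lives online, but a minimal sketch of the shell environment it sets up might look like the following. All paths are assumptions to adjust for your installation, and DMCL_CONFIG is the standard Documentum variable pointing at dmcl.ini:

```shell
# Hypothetical installation paths -- adjust for your site.
export DCTM_HOME=/opt/documentum

# JARs the eConnector needs on the WebLogic classpath:
# dmjdbc.jar, the all-important dfc.jar, and the Oracle JDBC driver.
DB_CLASSPATH=$DCTM_HOME/lib/dmjdbc.jar:$DCTM_HOME/lib/dfc.jar:$DCTM_HOME/lib/classes12.zip
export CLASSPATH=$DB_CLASSPATH${CLASSPATH:+:$CLASSPATH}

# Point the Documentum client at its property file, dmcl.ini.
export DMCL_CONFIG=$DCTM_HOME/config/dmcl.ini

# OS-dependent Documentum client library (libdmcl40.so on Solaris).
export LD_LIBRARY_PATH=$DCTM_HOME/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}
```

Typically these exports go into the same startup script that launches the WebLogic server, so the container inherits them.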

Configure Portal Application to Use eConnector
Now that the WebLogic environment is set up with the Documentum client, we need to configure our portal application to use the eConnector. First, install and deploy the appropriate eConnector EJB for the chosen architecture; we're using the WebCache configuration and the Webcache-ejb.jar, which is the WebCache implementation of the WebLogic Portal Content Manager. Next, configure a JDBC connection pool (see Listing 3). Alongside the connection pool in the WebLogic application, the eConnector has a property file, dmjdbc.property, that holds more specific database information; edit it and supply the actual database driver and connection details. We'll also need to point the WebLogic Portal property EJB at our content manager EJB by changing the following EJB descriptors: the EJB Ref Name field to ejb/ContentManagers/WebCacheDocument; the EJB Reference Description to ejb/ContentManagers/WebCacheDocument; and the JNDI name field to ${APPNAME}.BEA_personalization.WebCacheDocumentManager. Make sure the new EJB and the new connection pool are targeted and deployed to your server.
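In the spirit of Listing 3, a connection pool entry in the WebLogic config.xml might look like the sketch below. The pool name, target, host, SID, and credentials are placeholders, not values from the original listing:

```xml
<!-- Placeholder pool pointing at the WebCache Target database. -->
<JDBCConnectionPool
    Name="webcachePool"
    Targets="portalServer"
    DriverName="oracle.jdbc.driver.OracleDriver"
    URL="jdbc:oracle:thin:@dbhost:1521:WCDB"
    Properties="user=webcache;password=webcache"
    InitialCapacity="2"
    MaxCapacity="10"/>
```

The same driver, URL, and credentials then go into the eConnector's dmjdbc.property file so both sides reach the same WebCache database.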

Finally, and most important, we need to map WebCache Source attributes to something our JSP developers can write queries against. eConnector has a property file named {$DCTM_HOME}Webcache map.property. This file lets you map a comprehensible name to a WebCache attribute, and it matters because the WebCache attributes are our content metadata. Note that if an attribute has no mapping set up, JSP developers will not be able to query against it.
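A sketch of what such a mapping file might contain follows; both the friendly names on the left and the WebCache attribute names on the right are hypothetical examples, not values from a real installation:

```properties
# friendly query name = WebCache additional attribute
title=a_title
author=a_author
region=a_region
content_type=a_content_type
```

Every attribute a JSP query will filter on needs a line here; anything unmapped is invisible to the delivery side.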

What Do We Do with the eConnector?
Now that our environment and portal application are set up to use the Documentum eConnector for BEA, we need to display the content. To pull the content from our new Content Manager, we use a tag from the Personalization (pz) tag library. The pz:contentQuery tag allows us to query the WebCache Content Manager for our specific content (see Listing 4).
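Listing 4 lives online, but a hedged sketch of such a query might look like the following JSP fragment. The taglib URIs, the queried attribute names, and the content home JNDI name are assumptions based on the setup described above, not a reproduction of the original listing:

```jsp
<%@ taglib uri="pz.tld" prefix="pz" %>
<%@ taglib uri="cm.tld" prefix="cm" %>
<%@ taglib uri="es.tld" prefix="es" %>

<%-- Query the WebCache Content Manager; content_type and region
     are hypothetical attributes mapped in map.property. --%>
<pz:contentQuery id="newsItems"
    contentHome="yourApp.BEA_personalization.WebCacheDocumentManager"
    query="content_type = 'press_release' && region = 'emea'" />

<ul>
<es:forEachInArray array="<%= newsItems %>" id="item"
    type="com.bea.p13n.content.Content">
  <li><cm:printProperty id="item" name="title" encode="html" /></li>
</es:forEachInArray>
</ul>
```

The query expression filters on the mapped metadata, which is exactly why the map.property step matters: unmapped attributes can't appear in the query string.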

The metadata is the "glue" that binds the content between these two products and our Web users. A content item's metadata is information (data) that classifies it so that, in a dynamic Web site, the content can integrate with the site's taxonomy. Careful requirements gathering and analysis are needed to negotiate the metatagging scheme, and metadata can make or break your long-term content goals. You will need to balance authors' dissatisfaction with lengthy entry forms against the value of content-rich classifications - frustrated authors can leave a good system stale. Alternatively, if you don't enrich your content with metadata, your content may not be agile enough to repurpose for other Web services.

Documentum has a form of clustering called federating - a way to allow multiple Documentum installations to interact with each other's content. WebLogic has many clustering features that other WLDJ articles have discussed. What I do want to mention are some options specific to the Internet installation. The WebCache Target needs to be installed on each server that runs WebLogic Portal, because of the eConnector; you really can't use a shared drive to host the content between the different servers. In addition, the WebCache Source job needs to deploy content to all of the installed WebCache Targets.

Once again, the Documentum suite of products is very large. Bringing in one individual who has a wealth of Documentum experience will help to solidify a successful Documentum implementation.

Content Aggregation vs Content Presentation
Keep content templates separate from content presentation; that is, keep content style away from the authoring process. This is essential but sometimes difficult advice for any CMS project, at a time when editors tend to want very specific WYSIWYG HTML authoring tools. If you let editors embed presentation logic in the content via HTML tags, then every presentation change becomes a content change, costing many hours of content editing. Put the time and effort into developing granular data-capture templates instead, and handle end-user presentation in your JSP pages on the delivery side.

The benefits of matching a CMS with a personalization server are endless. A CMS like Documentum can help manage, centralize, and organize your corporate content assets. On the delivery side, WebLogic Portal offers a strong platform that provides a reliable, scalable, and secure way of presenting your enterprise content assets to your Web-based audience.

More Stories By Travis Wissink

Travis Wissink, an independent consultant, calls the Washington, DC metro area home. He specializes in WebLogic (J2EE) development and content management, as well as Interwoven implementations. Most recently, Travis has been a lead WebLogic Portal consultant integrating BEA Portal with Documentum, work that this article draws on.

