
Consistent Content Management and Delivery


The mantra "Content is King" has proven legitimate on the Internet. Many now ask, "How do we make content a focus?" Whether you run a content-heavy Internet Web site or maintain a vast base of enterprise content inside your corporate intranet, you want to give your users a quick and efficient way to access and produce relevant content.

A recent architecture for easing content creation and consumption pairs a portal server with a content management system (CMS). Both products offer services that apply business rules to the creation and consumption of enterprise content: hence WebLogic Portal and Documentum 4i.

BEA WebLogic Portal 4.0 and Documentum 4i are two application platforms whose integration raises many architectural issues and concerns. In this article, I'll discuss two architectures that describe how to secure, organize, manage, and deliver Web content. The difference between them is that one installation is focused on intranet consumption, the other on Internet consumption.

Documentum 4i
Documentum 4i is an enterprise content management solution. We are primarily concerned with how it handles Web content (XML and HTML). Documentum 4i's content repository is called the DocBase. Documentum has many add-on products that interact with the DocBase, including WebPublisher, WebCache, and eConnector.

WebPublisher is the Web front end that allows administrators, publishers, editors, and authors to create, edit, and manage enterprise content. One of WebPublisher's best features is its Web-based XML authoring tool, which lets the content publishing team fully manage XML content without knowing anything about XML or its specification. To produce different outputs from the XML, WebPublisher allows "renditions" to be created; these rendition templates are standard XSLs.
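Since renditions are standard XSL transformations, the mechanism can be illustrated with ordinary JAXP from the JDK. This is a minimal sketch, not WebPublisher code: the element names (pressRelease, headline) and the stylesheet are hypothetical stand-ins for a content template instance and an HTML presentation.

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class RenditionDemo {
    // A trivial content template instance (hypothetical element names).
    static final String XML =
        "<pressRelease><headline>Q3 Results</headline></pressRelease>";

    // A minimal HTML "presentation" stylesheet, standing in for a
    // WebPublisher rendition XSL.
    static final String XSL =
        "<xsl:stylesheet version='1.0' "
      + "xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
      + "<xsl:output method='html' omit-xml-declaration='yes'/>"
      + "<xsl:template match='/pressRelease'>"
      + "<h1><xsl:value-of select='headline'/></h1>"
      + "</xsl:template>"
      + "</xsl:stylesheet>";

    // Apply the presentation XSL to the content XML, as a rendition would.
    public static String render() throws Exception {
        Transformer t = TransformerFactory.newInstance()
            .newTransformer(new StreamSource(new StringReader(XSL)));
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(XML)),
                    new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(render());
    }
}
```

The same content template instance could be fed through a second stylesheet (say, a WML or plain-text presentation) without touching the XML, which is the whole appeal of keeping renditions in XSL.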

WebCache is essentially the deployment engine. It has an internal scheduler that runs deployment jobs configured to determine who deploys which content from the Documentum DocBase, when, and to where. WebCache has a sender, "WebCache Source," and a receiver, "WebCache Target." The Source interacts with the DocBase to retrieve the content; the Target sits wherever your content delivery solution is.

Finally, we have the eConnector for BEA. The eConnector is Documentum's implementation of the WebLogic Content Manager. What is the Content Manager? This is the edocs.bea.com description:
The Content Manager runtime subsystem provides access to content through tags and EJBs. The Content Management tags allow a JSP developer to receive an enumeration of Content objects by querying the content database directly using a search expression syntax. The Content Manager component works alongside the other components to deliver personalized content, but does not have a GUI-based tool for edit-time customization.

WebLogic Portal
I won't go into all aspects of BEA WebLogic Portal, but will discuss some of what it can do for our Web content. I'll look at pieces of WebLogic Portal like the content manager, EJBs, content selectors, and the PZ tag library. At a high level, Portal's personalization server provides a framework so that developers can deliver content in a secure, organized, and dynamic way.

Similar Architectures; Different Requirements
The Internet Architecture

The Documentum architecture is typical for an Internet Web site installation (see Figure 1). I call it typical because Documentum, and the content, rest securely inside the corporate intranet, behind many stringent firewall rules. In the corporate DMZ, WebLogic Portal waits to deliver the content to our Internet customers. WebCache Source is installed on the same server as the main Documentum product, inside the intranet. WebCache Target is installed on the same server as WebLogic Portal, in the DMZ. These products are configured to securely deliver "active" content from the Source through the firewalls to the Target. The eConnector product is installed using the WebCache implementation method.

The Intranet Architecture
In a corporate intranet portal (see Figure 2) we're far less concerned with firewalls and with securing the CMS and delivery solutions from the Internet than in the Internet architecture, because all the installations sit safely inside the corporate firewall. Documentum and WebLogic Portal are installed with no firewalls between them, so there's no need for WebCache in this installation. You may ask: how will WebLogic Portal deliver the content from Documentum? The eConnector can be configured to interact directly with the DocBase. This is much easier because there are no firewall or port restrictions, and the WebCache set of products doesn't have to be installed and configured.

Setting Up Documentum to Work with eConnector
Documentum is a large application with many ways to manage structured and unstructured enterprise content. As a side note, I strongly suggest engaging a senior-level Documentum consultant when you initially implement the product. As you dig into its guts, and every installation will, you'll face a steep learning curve. I'll point out several pieces of the product that directly impact the integration of these robust applications.

Creating the Content
Every piece of content that goes into the CMS needs an associated document type. A document type is a DocBase object that defines what a piece of content is to the CMS. We are mainly concerned with the document-type properties, which hold the content classification data. For us, the classification helps content align with our Web site's taxonomy. The properties also enable more precise searches for content inside the CMS and on our Web site.

After you set up the document type, you'll need to set up a content type in WebPublisher, which means establishing three objects and some configuration information. First, assign a document type to the content type. Next, set a default workflow. Then you'll need a content template: a blank XML document that contains all available XML tags. (We all know this should be a DTD, but that's not what Documentum implemented.) To complement the content template, define a rules file, an XML document that maps content template tags to Web form controls. Finally, there are the renditions, also called output presentations. Presentations are XSLs that format the XML content for the various outputs you may need. For our purposes, we'll need an HTML presentation.
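To make the template/rules pairing concrete, here is an illustrative sketch of a blank content template and the kind of tag-to-widget mapping a rules file expresses. The element names and the rules syntax below are hypothetical; consult the WebPublisher documentation for the actual rules-file schema.

```xml
<!-- Blank content template: all available tags, no data (hypothetical names) -->
<pressRelease>
  <headline/>
  <body/>
</pressRelease>

<!-- Rules file, conceptually: maps template tags to Web form controls.
     Element and attribute names here are illustrative, NOT the real schema. -->
<rules>
  <rule tag="headline" widget="textline" required="true"/>
  <rule tag="body" widget="richtext"/>
</rules>
```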

Now Documentum is ready for an author to create some content. I should point out that the XML authoring tool does include an embedded HTML authoring tool for the text body form format, but it offers little more than RTF capabilities (see Listing 1; the code for this article can be found online at www.sys-con.com/weblogic/sourcec.cfm).

Managing the Content
After the author creates and saves the content, the default workflow is initiated (see Figure 3). We will use a workflow that automatically promotes the content to an "active" status; this type of workflow is strongly discouraged, since usually an editor needs to view, edit, and approve the content before it moves to active status. "Active content" is the Documentum term for a piece of content that has been fully approved through a workflow approval cycle. At this point our piece of content is ready for deployment to the Web site. Although I won't explain them in depth, there are many other benefits to using a CMS like Documentum. Features of a CMS include, but are not limited to, user/content security, aggregation services, workflow, versioning, outputting/exporting services, and workgroup collaboration services.

Deploying the Content
In comes WebCache Source. We need to set up a Web Publishing Configuration to deploy our active content, which we can do in the Documentum DocBase Administrator. Among the things we need to know are the IP address, database table, and target directory of the WebCache Target, so the Source can communicate with it. The most important part of setting up a Web Publishing Configuration is the additional attributes section, whose entries directly relate to the document-type properties. Web Publisher sends the additional attributes to the database table, which will come into play in the eConnector setup. Finally, we'll need to set up a Documentum Job. The Documentum Job scheduler is internal, like a crontab; we want a Job that runs the Web Publishing Configuration every x hours or x minutes. These WebCache Jobs run transactionally, meaning they send only content that has changed since the last run, which conserves network and processing resources.

Initial BEA WebLogic Setup Including the Documentum Client
One configuration common to both architectures is that BEA WebLogic needs to be configured to communicate with the Documentum client. The Documentum client is a set of Java libraries, OS libraries, and some property files. The important Java API between WebLogic and the Documentum client is the dfc.jar file, the Documentum Foundation Classes. This set of Java classes communicates with the Documentum client, which is only available on a few OSs. (Consult Documentum for their support requirements; I have used it on Solaris.)

First, install eConnector. Next, add all necessary Documentum client files to the WebLogic paths. Listing 2 establishes the Documentum eConnector client home. Then put two JAR files, dmjdbc.jar and the all-important dfc.jar, into the db_classpath along with the Oracle JDBC drivers; the db_classpath eventually goes into the classpath. Next we need a pointer to the Documentum client property file, dmcl.ini, in our shell environment. This file contains the properties that connect the Documentum client to the Documentum DocBase. Finally, we need the actual Documentum client in our LD_LIBRARY_PATH; the needed library file is libdmcl40.so, the OS-dependent file.
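The environment pieces described above can be sketched in shell form. All paths here are hypothetical (your equivalent of Listing 2 will differ), and DMCL_CONFIG is the conventional pointer to dmcl.ini; verify the variable name against your Documentum client documentation.

```shell
# Hypothetical install location - adjust to your environment
DOCUMENTUM=/opt/documentum
export DOCUMENTUM

# JARs that must reach the WebLogic classpath (Oracle driver path is a guess)
DB_CLASSPATH=$DOCUMENTUM/dfc/dfc.jar:$DOCUMENTUM/dfc/dmjdbc.jar:$DOCUMENTUM/lib/classes12.zip
CLASSPATH=$DB_CLASSPATH:$CLASSPATH
export CLASSPATH

# Pointer to the Documentum client property file
DMCL_CONFIG=$DOCUMENTUM/dmcl.ini
export DMCL_CONFIG

# Directory containing the OS-dependent client library, libdmcl40.so
LD_LIBRARY_PATH=$DOCUMENTUM/shared:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH
```

These exports belong in the shell script that starts WebLogic, so the server JVM inherits them.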

Configure Portal Application to Use eConnector
Now that the WebLogic environment is set up with the Documentum client, we need to set up our portal application to use the eConnector. First, install and deploy the appropriate eConnector EJB for the chosen architecture; we're using the WebCache configuration and the Webcache-ejb.jar. This is the WebCache implementation of the WebLogic Portal Content Manager. Next, configure a JDBC connection pool (see Listing 3). Along with the connection pool in the WebLogic application, the eConnector has a property file for more specific database information: edit the dmjdbc.property file and put the actual database driver and connection information into it. We'll also need to edit the WebLogic Portal property EJB to point to our content manager EJB by changing the following EJB descriptors: the EJB Ref Name field to ejb/ContentManagers/WebCacheDocument; the EJB Reference Description to ejb/ContentManagers/WebCacheDocument; and the JNDI name field to ${APPNAME}.BEA_personalization.WebCacheDocumentManager. Make sure that the new EJB and the new connection pool are targeted and deployed to your server.
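As an illustration of the connection pool step, a WebLogic 6.1-era config.xml entry might look like the following. The pool name, target, URL, and credentials are hypothetical, and Listing 3 shows the article's actual configuration; this sketch only shows the shape of the element.

```xml
<!-- Illustrative sketch only: all names, URL, and credentials are hypothetical.
     The pool points at the database holding the WebCache Target table. -->
<JDBCConnectionPool
    Name="webcachePool"
    Targets="portalServer"
    DriverName="oracle.jdbc.driver.OracleDriver"
    URL="jdbc:oracle:thin:@dbhost:1521:WCDB"
    Properties="user=webcache;password=secret"
    InitialCapacity="2"
    MaxCapacity="10"
/>
```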

Finally, and most important, we need to map WebCache Source attributes to names that our JSP developers can write queries against. eConnector has a property file named {$DCTM_HOME}Webcache map.property. This file lets you map a comprehensible name to a WebCache attribute, and it's important because the WebCache attributes are our content metadata. Note that if a mapping isn't set up, JSP developers will not be able to query against that attribute.
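The idea behind the mapping file can be sketched as a standard properties file. The names below are hypothetical on both sides; check the eConnector documentation for the actual syntax and for which side of the mapping comes first.

```properties
# Hypothetical sketch: friendly query name = WebCache attribute.
# Verify the exact format against the eConnector documentation.
headline=a_headline
author=r_creator_name
region=region_code
```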

What Do We Do with the eConnector?
Now that our environment and portal application are set up to use the Documentum eConnector for BEA, we need to display the content. To pull content from our new Content Manager, we use a tag from the Personalization (pz) tag library. The pz:contentQuery tag allows us to query the WebCache Content Manager for our specific content (see Listing 4).
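A sketch of the JSP side follows. The taglib URIs, the query expression, the contentHome value, and the property name are all hypothetical stand-ins (Listing 4 shows the article's actual code), and the tag attribute names are drawn from memory of the WebLogic Portal 4.0 documentation, so verify them against edocs.bea.com.

```jsp
<%@ taglib uri="pz.tld" prefix="pz" %>
<%@ taglib uri="cm.tld" prefix="cm" %>
<%@ taglib uri="es.tld" prefix="es" %>

<%-- Query the WebCache Content Manager; the attribute names in the
     query string must have mappings in the map.property file. --%>
<pz:contentQuery id="releases"
    contentHome="<%= application.getInitParameter("CONTENT_MANAGER_HOME") %>"
    query="type = 'pressRelease' && region = 'us'" />

<es:forEachInArray array="<%= releases %>" id="doc"
    type="com.bea.p13n.content.Content">
  <h2><cm:printProperty id="doc" name="headline" /></h2>
</es:forEachInArray>
```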

The metadata is the "glue" that binds the content between these two products and our Web users. A content item's metadata is information (data) that classifies it so that, in a dynamic Web site, it can integrate with the site's taxonomy. A great deal of requirements gathering and analysis is needed to carefully negotiate the metatagging scheme. Metadata can make or break your long-term content goals. You'll need to balance authors' dissatisfaction with lengthy entry forms against rich content classification: frustrated content authors can leave a good system stale, but if you don't enrich your content with metadata, it may not be agile enough to repurpose for other Web services.

Documentum has a form of clustering called federating, a way to allow multiple Documentum installations to interact with each other's content. WebLogic has many clustering features that other WLDJ articles have discussed. What I want to mention are the options specific to the Internet installation. The WebCache Target needs to be installed on each server that runs WebLogic Portal; the eConnector requires this, and you really can't use a shared drive to host the content between the different servers. In addition, the WebCache Source job needs to deploy content to all of the installed WebCache Targets.

Once again, the Documentum suite of products is very large. Bringing in one individual who has a wealth of Documentum experience will help to solidify a successful Documentum implementation.

Content Aggregation vs Content Presentation
Keep content templates separate from content presentation; that is, keep content style away from the authoring process. This is essential but sometimes difficult advice for any CMS project at a time when editors tend to want very specific WYSIWYG HTML authoring tools. If you let editors pollute the content entry system with presentation logic via HTML tags, then when you want to change the presentation, your content will also have to change, and that will take many hours of content editing. Put the time and effort into developing granular data-capture templates instead, and take care of end-user presentation within your JSP pages on the delivery side.
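The contrast can be illustrated with two hypothetical capture formats for the same item; the element names are invented for this example.

```xml
<!-- Discouraged: HTML presentation baked into the captured content.
     A site redesign now means re-editing every stored document. -->
<pressRelease>
  <body><![CDATA[<font color="red"><b>Q3 Results</b></font> are strong...]]></body>
</pressRelease>

<!-- Preferred: granular, style-free capture. The JSPs on the delivery
     side decide that "high" importance renders bold and red today,
     and can decide something else tomorrow without touching content. -->
<pressRelease>
  <headline>Q3 Results</headline>
  <importance>high</importance>
  <body>Q3 results are strong...</body>
</pressRelease>
```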

The benefits of matching a CMS with a personalization server are endless. A CMS like Documentum can help manage, centralize, and organize your corporate content assets. On the delivery side, WebLogic Portal offers a strong platform that provides a reliable, scalable, and secure way of presenting your enterprise content assets to your Web-based audience.

More Stories By Travis Wissink

Travis Wissink, an independent consultant, calls the Washington, DC metro area home. He specializes in WebLogic (J2EE) development and content management, as well as Interwoven implementations. Most recently, Travis has been a lead WebLogic Portal consultant integrating BEA Portal with Documentum, work that closely resembles this article's content.
