Consistent Content Management and Delivery

We know the mantra "Content is King" holds true on the Internet. Many now ask, "How do we make content a focus?" Whether you run a content-heavy Internet Web site or maintain a vast base of enterprise content on your corporate intranet, you want to give your users a quick and efficient way to access and produce relevant content.

A common architecture for easing content creation and consumption pairs a portal server with a content management system (CMS). Both products provide services that apply business rules to the creation and consumption of enterprise content - in our case, WebLogic Portal and Documentum 4i.

BEA WebLogic Portal 4.0 and Documentum 4i are two application platforms whose integration raises many architectural issues and concerns. In this article, I'll discuss two architectures that show how to secure, organize, manage, and deliver Web content. The difference is that one installation is focused on intranet consumption, the other on Internet consumption.

Documentum 4i
Documentum 4i is an Enterprise Content Management solution. We are primarily concerned with how it handles Web content (XML and HTML). Documentum 4i's content repository is called the DocBase. Documentum has many add-on products that interact with the DocBase, including WebPublisher, WebCache, and eConnector.

WebPublisher is the Web front end and allows administrators, publishers, editors, and authors to create, edit, and manage the enterprise content. One of WebPublisher's best features is its Web-based XML authoring tool. This allows the content publishing team to fully manage XML content without knowing anything about XML and its specification. To create different outputs from the XML, WebPublisher allows "renditions" to be created from the XML. These rendition templates are standard XSLs.
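
Because renditions are standard XSL stylesheets, a minimal sketch shows what an HTML rendition might look like. The <article>, <title>, and <body> element names here are hypothetical content-template tags, not Documentum-defined names:

<?xml version="1.0"?>
<!-- Hypothetical HTML rendition: transforms an authored article into a Web page. -->
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="html"/>
  <xsl:template match="/article">
    <html>
      <head><title><xsl:value-of select="title"/></title></head>
      <body>
        <h1><xsl:value-of select="title"/></h1>
        <!-- Copy the authored body markup through to the page. -->
        <xsl:copy-of select="body/node()"/>
      </body>
    </html>
  </xsl:template>
</xsl:stylesheet>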

WebCache is basically the deployment engine. It has an internal scheduler that runs deployment jobs, which are configured to control who, what, when, and where content in the Documentum DocBase is deployed. WebCache has a sender, "WebCache Source," and a receiver, "WebCache Target." The Source needs to interact with the DocBase to retrieve the content. The Target needs to be wherever your content delivery solution is.

Finally, we have the eConnector for BEA. The eConnector is Documentum's implementation of the WebLogic Content Manager. What is the Content Manager? This is the edocs.bea.com description:
The Content Manager runtime subsystem provides access to content through tags and EJBs. The Content Management tags allow a JSP developer to receive an enumeration of Content objects by querying the content database directly using a search expression syntax. The Content Manager component works alongside the other components to deliver personalized content, but does not have a GUI-based tool for edit-time customization.

WebLogic Portal
I won't go into all aspects of BEA WebLogic Portal, but will discuss some of what it can do for our Web content. I'll look at pieces of WebLogic Portal like the content manager, EJBs, content selectors, and the PZ tag library. At a high level, Portal's personalization server provides a framework so that developers can deliver content in a secure, organized, and dynamic way.

Similar Architectures; Different Requirements
The Internet Architecture

The Documentum architecture is typical for an Internet Web site installation (see Figure 1). I call it typical because Documentum, and your content, rests securely inside the corporate intranet behind many stringent firewall rules. In the corporate DMZ, WebLogic Portal waits to deliver the content to our Internet customers. WebCache Source is installed on the same server as the main Documentum product, inside the intranet. WebCache Target is installed on the same server as WebLogic Portal, in the DMZ. These products are configured to securely deliver "active" content from the Source through the firewalls to the Target. The eConnector product is installed using the WebCache implementation method.

The Intranet Architecture
In a corporate intranet portal (see Figure 2) we're far less concerned with firewalls and security between the Internet and our CMS and delivery solutions, because all the installations sit safely inside the corporate firewall. Documentum and WebLogic Portal are installed with no firewalls between them, so there's no need to involve WebCache in this installation. How, then, will WebLogic Portal deliver the content from Documentum? The eConnector can be configured to interact directly with the DocBase. This is a lot easier because you have no firewall or port restrictions, and no WebCache products to install and configure.

Setting Up Documentum to Work with eConnector
Documentum is a large application with many different ways to manage structured and unstructured enterprise content. As a side note, I strongly suggest finding a senior-level Documentum consultant when you initially implement this product; every installation eventually digs into the guts of the product, and the learning curve is steep. I'll point out several pieces of the product that directly impact the integration of these robust applications.

Creating the Content
Every piece of content that goes into the CMS needs an associated document type. A document type is a DocBase object that defines what a piece of content is to the CMS. We are mainly concerned with the document-type properties, which hold the content classification data. For us, this classification helps the content fit our Web site's taxonomy. The properties also enable more precise searches for content inside the CMS and on our Web site.

After you set up the document type, you'll need to set up a content template in WebPublisher. You have to establish three different objects and set some configuration information. First, assign a document type to this content type. The next property is a default workflow. Next, you'll need a content template - a blank XML document that contains all the available XML tags. (We all know this should be a DTD, but that's not what Documentum implemented.) To complement the content template, we define a rules file - an XML document that maps content template tags to Web form controls. Finally, we have the renditions, also called output presentations. Presentations are XSLs that format the XML content for the various outputs you may need; for our purposes, we'll need an HTML presentation.
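
As a rough illustration, a content template might be nothing more than an XML document with every available tag left empty. The element names below are hypothetical, not WebPublisher-defined:

<?xml version="1.0"?>
<!-- Hypothetical content template: one empty element per field an author can fill in. -->
<article>
  <title></title>
  <summary></summary>
  <author></author>
  <body></body>
</article>

The rules file would then map each of these tags to a Web form control: a text field for <title>, the embedded HTML editor for <body>, and so on.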

Now Documentum is ready for an author to create some content. I should point out that the XML authoring tool does have an embedded HTML authoring tool for the body text form, but it doesn't offer much more than RTF-level capabilities (see Listing 1; the code for this article can be found online at www.sys-con.com/weblogic/sourcec.cfm).

Managing the Content
After the author creates and saves the content, the default workflow is initiated (see Figure 3). We'll use a workflow that automatically promotes the content to an "active" status; this type of workflow is strongly discouraged in practice, since usually an editor needs to view, edit, and approve the content before it moves to active status. "Active" is the Documentum term for a piece of content that has been fully approved through a workflow approval cycle. At this point our piece of content is ready for deployment to the Web site. Although I won't explain them in depth, there are many other benefits to using a CMS like Documentum; its features include, but are not limited to, user/content security, aggregation services, workflow, versioning, outputting/exporting services, and workgroup collaboration services.

Deploying the Content
In comes WebCache Source. We need to set up a Web Publishing Configuration to deploy our active content, which we can do in the Documentum DocBase Administrator. Among the things we need to know are the IP address, database table, and target directory of the WebCache Target, so the Source can communicate with it. The most important part of setting up a Web Publishing Configuration is the additional attributes section; these attributes directly relate to the document-type properties. Web Publisher sends the additional attributes to the database table, which will figure in the eConnector setup. Finally, we'll need to set up a Documentum Job. The Documentum Job is run by Documentum's internal scheduler, much like cron. We want a Job to run the Web Publishing Configuration every x hours or x minutes. These WebCache Jobs run transactionally, meaning they send only content that has changed since the last run, conserving network and processing resources.

Initial BEA WebLogic Setup Including the Documentum Client
One configuration common to both architectures is that BEA WebLogic needs to be configured to communicate with the Documentum client. The Documentum client is a set of Java libraries, OS libraries, and some property files. The important Java API between WebLogic and the Documentum client is the dfc.jar file, the Documentum Foundation Classes (DFC). This set of Java classes communicates with the Documentum client, which is only available on a few OSs. (Consult Documentum for their support requirements; I have used it on Solaris.)
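
To give a feel for what the DFC is used for, here is a minimal sketch of opening a DocBase session through the classic DFC API. The DocBase name and credentials are placeholders, and the exact entry points may vary by DFC version:

import com.documentum.com.DfClientX;
import com.documentum.fc.client.IDfClient;
import com.documentum.fc.client.IDfSession;
import com.documentum.fc.common.IDfLoginInfo;

public class DocbaseSmokeTest {
    public static void main(String[] args) throws Exception {
        // The DFC factory; the underlying client locates the DocBase via dmcl.ini.
        DfClientX clientx = new DfClientX();
        IDfClient client = clientx.getLocalClient();

        IDfLoginInfo login = clientx.getLoginInfo();
        login.setUser("dmadmin");      // placeholder account
        login.setPassword("changeme"); // placeholder password

        // "MyDocbase" is a placeholder DocBase name.
        IDfSession session = client.newSession("MyDocbase", login);
        System.out.println("Connected to " + session.getDocbaseName());
        session.disconnect();
    }
}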

First, install eConnector. Next, add all the necessary Documentum client files to the WebLogic paths. Listing 2 establishes the Documentum eConnector client home. Then put two JAR files - dmjdbc.jar and the all-important dfc.jar - into the db_classpath along with the Oracle JDBC drivers; the db_classpath eventually goes into the classpath. Next we need a pointer to the Documentum client property file, dmcl.ini, in our shell environment. This file contains the properties that connect the Documentum client to the Documentum DocBase. Finally, we need the actual Documentum client library, the OS-dependent libdmcl40.so, in our LD_LIBRARY_PATH.
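
A hedged sketch of that shell setup follows; the client home and Oracle driver paths are placeholders, and Listing 2 has the actual version:

# Placeholder location for the Documentum eConnector client home
DCTM_HOME=/opt/documentum
export DCTM_HOME

# JARs that eventually go into the WebLogic classpath
db_classpath=$DCTM_HOME/lib/dmjdbc.jar:$DCTM_HOME/lib/dfc.jar:/opt/oracle/lib/classes12.zip
CLASSPATH=$CLASSPATH:$db_classpath
export CLASSPATH

# Point the Documentum client at its property file
DMCL_CONFIG=$DCTM_HOME/dmcl.ini
export DMCL_CONFIG

# OS-dependent client library (libdmcl40.so lives here)
LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$DCTM_HOME/lib
export LD_LIBRARY_PATH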

Configure Portal Application to Use eConnector
Now that we have our WebLogic environment set up with the Documentum client, we need to set up our portal application to use the eConnector. First, we install and deploy the appropriate eConnector EJB for the chosen architecture; we're using the WebCache configuration and the Webcache-ejb.jar. This is the WebCache implementation of the WebLogic Portal Content Manager. Next we configure a JDBC connection pool (see Listing 3). Along with the connection pool in the WebLogic application, the eConnector has a property file for more specific database information: edit the dmjdbc.property file and put the actual database driver and connection information into it. We'll also need to edit the WebLogic Portal property EJB to point to our content manager EJB by changing the following EJB descriptors: the EJB Ref Name field to ejb/ContentManagers/WebCacheDocument; the EJB Reference Description to ejb/ContentManagers/WebCacheDocument; and the JNDI Name field to ${APPNAME}.BEA_personalization.WebCacheDocumentManager. Make sure the new EJB and the new connection pool are targeted and deployed to your server.
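
Purely as a sketch (Listing 3 has the real version), a WebLogic 6.x connection pool entry in config.xml looks roughly like this; the driver class, URL, and credentials are placeholders that come from the eConnector documentation and dmjdbc.property:

<!-- Hypothetical eConnector pool; DriverName and URL are placeholders. -->
<JDBCConnectionPool
    Name="dctmPool"
    Targets="portalServer"
    DriverName="com.documentum.placeholder.Driver"
    URL="jdbc:documentum:placeholder"
    InitialCapacity="2"
    MaxCapacity="10"
    Properties="user=placeholder;password=placeholder"/>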

Finally, and most important, we need to map WebCache Source attributes to something our JSP developers can write queries against. eConnector has a property file named {$DCTM_HOME}Webcache map.property. This file lets you map a comprehensible name to a WebCache attribute, and it matters because the WebCache attributes are our content metadata. Note that if a mapping isn't set up, the JSP developers won't be able to query against that attribute.
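
The exact file syntax comes from the eConnector documentation; purely as a hypothetical illustration, each line pairs a friendly query name with a WebCache attribute:

# Hypothetical mappings: friendly name on the left, WebCache attribute on the right
headline=a_headline
category=a_category
status=a_status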

What Do We Do with the eConnector?
Now that our environment and portal application are set up to use the Documentum eConnector for BEA, we need to display the content. To pull the content from our new Content Manager, we use the Personalization (pz) tag library. The pz:contentQuery tag lets us query the WebCache Content Manager for our specific content (see Listing 4).
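
Listing 4 has the real page; the fragment below is only a sketch of WebLogic Portal 4.0 tag usage. The query attributes (category, status, headline) are assumed to be names mapped in map.property, and depending on how the EJB reference was wired, you may need to pass the WebCacheDocumentManager JNDI name as the contentHome instead of the default shown here:

<%@ page import="com.bea.p13n.content.ContentHelper" %>
<%@ taglib uri="pz.tld" prefix="pz" %>
<%@ taglib uri="es.tld" prefix="es" %>
<%@ taglib uri="cm.tld" prefix="cm" %>

<%-- Ask the content manager for active press releases. --%>
<pz:contentQuery id="docs"
    contentHome="<%= ContentHelper.DEF_DOCUMENT_MANAGER_HOME %>"
    query="category = 'press_release' && status = 'active'" />

<es:forEachInArray array="<%= docs %>" id="doc"
    type="com.bea.p13n.content.Content">
  <%-- Print a mapped metadata attribute, then the document content itself. --%>
  <p><cm:printProperty id="doc" name="headline" encode="html" /></p>
  <cm:printDoc id="doc" />
</es:forEachInArray>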

Metadata
The metadata is the "glue" that binds the content between these two products and our Web users. The content's metadata is information (data) that classifies the content so that, in a dynamic Web site, the content can integrate with the site's taxonomy. A lot of requirements gathering and analysis is needed to carefully negotiate the metatagging scheme: metadata can make or break your long-term content goals. You'll need to balance author dissatisfaction with lengthy entry forms against content-rich classification. Angry content authors can let a good system go stale; alternatively, if you don't enrich your content with metadata, your content may not be agile enough to repurpose for other Web services.

Clustering
Documentum has a form of clustering called federation - a way to allow multiple Documentum installations to interact with each other's content. WebLogic has many clustering features that other WLDJ articles have discussed. What I want to mention are the options specific to the Internet installation. Because of the eConnector, the WebCache Target needs to be installed on each server that runs WebLogic Portal; you really can't use a shared drive to host the content between the different servers. In addition, you need the WebCache Source job to deploy content to all of the installed WebCache Targets.

Once again, the Documentum suite of products is very large. Bringing in one individual who has a wealth of Documentum experience will help to solidify a successful Documentum implementation.

Content Aggregation vs Content Presentation
Keep content templates separate from content presentation; that is, keep content style away from the authoring process. This is essential but sometimes difficult advice for any CMS project, at a time when editors tend to want very specific WYSIWYG HTML authoring tools. If you let editors fill the content entry system with presentation logic via HTML tags, then when you want to change the presentation your content will also have to change, and that will take many hours of content editing. Put the time and effort into developing granular data-capture templates instead, and take care of end-user presentation within your JSP pages on the delivery side.
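
To make the point concrete, here's a hypothetical before-and-after. The first fragment bakes presentation into the content; the second captures the same information granularly and leaves styling to the delivery-side JSPs and rendition XSLs:

<!-- Bad: presentation logic baked into the content. -->
<body><font color="red"><b>Q3 earnings up 12%</b></font></body>

<!-- Good: granular capture; the delivery side decides how a headline looks. -->
<headline>Q3 earnings up 12%</headline>
<summary>Earnings rose 12 percent over the previous quarter.</summary>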

Summary
The benefits of matching a CMS with a personalization server are endless. A CMS like Documentum can help manage, centralize, and organize your corporate content assets. On the delivery side, WebLogic Portal offers a strong platform that provides a reliable, scalable, and secure way of presenting your enterprise content assets to your Web-based audience.

About the Author

Travis Wissink is an independent consultant based in the Washington, DC metro area. He specializes in WebLogic (J2EE) development and content management, as well as Interwoven implementations. Most recently, Travis has been working as a lead WebLogic Portal consultant, integrating BEA Portal with Documentum - work that closely mirrors this article's content.


