Removing Performance Bottlenecks Through JSP Precompilation

Welcome to this edition of "In the Admin Corner," a new monthly column devoted to the administration, configuration, management, and deployment aspects of WebLogic Server.

The goal of this column is to provide you with a closer look at the nondevelopment issues of J2EE that are commonly encountered when working with WLS. Developers and administrators alike will find this column to be of value, as the material applies to both the development and production ends of the application spectrum. Furthermore, this feature draws heavily upon experiences from both the field and the engineering lab, providing detailed answers to real-world problems that go beyond the hand-waving solutions of corridor conversations and whiteboard sessions.

The Need for JSP Precompilation
This month's article looks at removing a potential system performance bottleneck by addressing one of the most common problems that plague nearly all J2EE development projects - the overhead of JavaServer Page (JSP) compilation at server runtime. Although JSPs are the ideal choice for presenting dynamic HTML views within J2EE applications, they do impact performance in a way that, while more annoying than detrimental, initially creates the perception that the application is slow.

According to the J2EE specification, JSPs are primarily HTML files in which embedded Java code is used to interact with other system components and present information dynamically. The specification states that, upon a client request for a given JSP, any J2EE-compliant application server supporting JSP will (see the sketch following this list):

  • Convert the JSP from its HTML format into a Java class (in Java source format) of servlet type, replacing any shorthand JSP notation with fully qualified Java syntax

  • Compile the newly generated Java source into its .class bytecode format

  • Execute the appropriate interface method on the newly compiled class and return the response to the requesting client
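
To make the cost of these steps concrete, here is a rough, hypothetical sketch of the kind of servlet source the translate step produces for a trivial page. The file name time.jsp, the class name __time, and the method shown are purely illustrative; real generated names, packages, and base classes are container-specific, and WLS in particular writes its generated classes under the Web application's working directory.

    <%-- time.jsp (hypothetical): HTML with a shorthand JSP expression --%>
    <html><body>The time is now <%= new java.util.Date() %>.</body></html>

    // Roughly the servlet source a container might generate for time.jsp
    // (illustrative only - not the actual WLS-generated code).
    import java.io.IOException;
    import java.io.PrintWriter;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public final class __time extends HttpServlet {
        protected void service(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            response.setContentType("text/html");
            PrintWriter out = response.getWriter();
            out.print("<html><body>The time is now ");
            out.print(new java.util.Date());    // the <%= ... %> expression becomes a print call
            out.print(".</body></html>");
        }
    }

Until source like this has been compiled to bytecode, the container cannot serve the request - which is precisely the delay that precompilation is meant to remove.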

While this is an excellent approach - from a development perspective - for managing dynamic HTML generation within the presentation layer, its impact on the server runtime environment is that a JSP must be parsed, converted to Java code, and compiled before it can be executed to handle a given client request. The obvious impact on the end user is that a response will, at minimum, be delayed by the time it takes for the JSP compilation phase to occur for the requested JSP file. Take into account the possibility that a single user request may touch two or more JSP files, and the time lost to compilation is multiplied that many more times.

For the end user who first hits a given JSP and thus forces the initial compilation of the requested file, the perception is that the application is slow and unresponsive. In practice, the compilation of a given JSP file is generally done only once in the lifespan of an app server VM instance, so its overall impact on performance is a nuisance rather than a critical roadblock to the application's overall response time. Nonetheless, any team that aims to deliver a JSP-based J2EE application in a production environment must overcome this pitfall and make compilation transparent to the end user.

So how can a production environment benefit from the use of JSP files, yet escape the performance hit of runtime compilation? The answer is simple - implement a process commonly referred to as JSP precompilation. With JSP precompilation, JSP files are compiled in an offline environment and then deployed into production alongside their compiled equivalents. If precompilation and deployment of the resultant class files are done correctly, the application server will execute the previously compiled classes for the JSP files rather than forcing a given request to recompile the JSP at runtime. The application then operates without the unnecessary overhead of compilation, allowing the administrator to remove a perceived bottleneck to the overall performance of the system.

Different Methodologies and Approaches
There's no doubt that the promises of JSP precompilation sound exciting. However, in order to fulfill such promises, you must first understand the different approaches that can be taken to implement the technique, and the advantages and disadvantages of each.

Exercising the Application to Force Precompilation
The most obvious approach to JSP precompilation is to exercise a given site application by requesting all possible JSP pages in the application before the production release, so compilation is done before end users access the site. This can be done either by manually browsing the entire site for the first time or by launching automated requests from test suite applications or other scripted clients (such as LoadRunner or SilkPerformer), as sketched below. While this is the simplest of all possible JSP precompilation methods, its disadvantages quickly become apparent. Perhaps the biggest is that it's difficult to implement across clustered environments, where the number of requests required for a single node instance is multiplied by the number of nodes within the cluster. Furthermore, it's even harder to ensure that each server instance within a clustered environment undergoes JSP precompilation when the cluster is proxied by one or more Web servers or hardware load balancers, since there is generally no way of knowing to which app server the proxy will forward a given request. Additionally, this method must be repeated every time the application server is recycled, making it a painful task that is practical only for the smallest of sites. Therefore, we don't recommend this approach to JSP precompilation.
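
The sketch referenced above is a minimal warm-up script, assuming a Unix shell, a command-line HTTP client such as wget, and a hypothetical urls.txt file listing the path of every JSP in the application; the host and port are illustrative:

    # Request each JSP once so it is compiled before real users arrive.
    # Note: against a proxied cluster there is no guarantee which node
    # actually serves each request, which is one of the drawbacks above.
    while read path; do
        wget -q -O /dev/null "http://appserver:7001${path}"
    done < urls.txt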

Using Compilation Tools to Implement Precompilation
Since manually exercising a site application to force JSP precompilation has significant disadvantages in real-world production environments, the alternative of compiling the JSPs into servlets before they are ever requested becomes much more enticing. Fortunately, WLS offers two ways to do this. The first performs precompilation at server startup during deployment of a given Web application (declarative precompilation), while the second offers a command-line Java tool (weblogic.jspc) that allows the process to be handled completely offline (programmatic precompilation). While both have their advantages, programmatic precompilation is the more flexible option of the two, and offers more compelling reasons to be used.

Declarative Precompilation
For declarative precompilation under WLS, a given Web application (standalone or as part of an EAR) can be configured so all of its JSPs are precompiled during application deployment (at server startup) and redeployment (at runtime). The necessary configuration changes are made to the WEB-INF/weblogic.xml deployment descriptor, which uses the precompile <jsp-param/> directive as follows:
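
A minimal sketch of the corresponding weblogic.xml fragment, with the element nesting as used in the WLS 6.x descriptor (verify it against the weblogic.xml DTD shipped with your release):

    <weblogic-web-app>
      <jsp-descriptor>
        <jsp-param>
          <param-name>precompile</param-name>
          <param-value>true</param-value>
        </jsp-param>
      </jsp-descriptor>
    </weblogic-web-app>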


Upon deployment (or redeployment) of a given Web application, WLS will attempt to precompile all JSP files within the WAR if the above parameter is set to true, recursively working its way down from the root directory of the Web application in the process (and skipping over WEB-INF). Files with either a .jsp or .JSP extension become targets for compilation. Compiled class files are then placed underneath the temporary working directory of the Web application (by default a subdirectory of WEB-INF, unless explicitly specified within weblogic.xml) in the appropriate package directory structure.

While this method is by far the most convenient approach to JSP precompilation (the "flick-a-switch" approach), it has a number of disadvantages that render it almost useless. If an error occurs during compilation of a JSP at the time of deployment (or redeployment), precompilation of the Web application will halt at the point of the exception. Additionally, in situations where there are a large number of JSP files within a given Web application, declarative precompilation significantly impacts deployment time, blocking the deployment until all of the files have been compiled. For large applications, such deployment times tend to be on the order of minutes (10 to 15 minutes in some cases, potentially even longer in others) when hundreds of JSP files are present and declarative precompilation is implemented.

Imagine starting a server instance in which a given Web application cycles into the deployment phase with declarative precompilation enabled. If there are a large number of JSP files within the app, and the deployment, nearing completion and having already taken a significant amount of time, suddenly fails because an exception is thrown during compilation, frustration will surely ensue. While convenient at first look, declarative precompilation poses a significant risk to production system management and should be utilized only with great consideration.

Programmatic Precompilation
The most reliable way to precompile JSP files under WLS is to use the Java command-line utility, weblogic.jspc, located in the weblogic.jar file under the lib directory of the WLS installation. This tool allows a developer to compile the desired JSP files during the development phase and iron out compile-time issues before deployment. It also provides an administrator with the ability to implement JSP precompilation for production systems. The major benefits of this utility are:

  • Files can be precompiled once and then deployed multiple times. (This isn't affected by the recycling of a server instance.)

  • Compile-time exceptions can be worked out in advance without affecting deployment.

  • Classes can be deployed across a cluster.

The drawbacks are that using weblogic.jspc requires manual intervention and that it must be rerun when JSP files become out-of-date in development. However, given the issues with the other two methods discussed above, we hardly find this inconvenience to be a disadvantage, and recommend it as the most reliable and flexible mechanism to implement JSP precompilation.

Executing weblogic.jspc
In order to use weblogic.jspc effectively, you must first understand its usage and syntax. For this article, we will utilize the features of the tool from WLS 6.1 SP2. Note: The syntax and best practices given below should apply to all versions of WLS 6.1, and in part to the new WLS 7.0.

To invoke the command-line JSP compiler (weblogic.jspc), you must ensure the following:

  • The PATH environment variable must include the binary directory of the J2SE 1.3 package installed on your machine for JVM runtime support (e.g., /opt/j2se/1.3.1/sdk/bin or c:\sunsoft\j2se\1.3.1\sdk\bin). If you plan on using javac as your Java compiler for JSP compilation, be sure your PATH includes the binary directory of the full Java 1.3 software development kit (SDK), and not simply the Java Runtime Environment (JRE), as no compiler is shipped with the JRE. If you plan on using a compiler other than javac (such as Jikes), be sure to include the appropriate directories within the PATH for that compiler as well.

  • The Java system classpath must include the weblogic.jar file from the WLS 6.1 SP2 installation, by default found under the product library directory (e.g., /opt/bea/wlserver6.1/lib/weblogic.jar or c:\bea\wlserver6.1\lib\weblogic.jar). Additionally, be sure to reference from the classpath any libraries (JAR or class files) that your JSPs might need during the compilation stage. A sample shell setup follows this list.
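
The sample setup below is a minimal sketch for a Bourne-style Unix shell, reusing the example paths above; the lib.jar entry stands in for any application libraries your JSPs reference. Adjust the paths to your own installation; on Windows you would use set with semicolon-separated entries instead.

    # Put the JDK 1.3 tools on the PATH and weblogic.jar (plus app libraries) on the classpath.
    PATH=/opt/j2se/1.3.1/sdk/bin:$PATH
    export PATH
    CLASSPATH=/opt/bea/wlserver6.1/lib/weblogic.jar:/u/apps/dist/src/lib.jar
    export CLASSPATH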

Before executing weblogic.jspc for the first time, it's a good idea to quickly test your command-line configuration as set above. This can be achieved by simply running a version check of WLS with the command "java weblogic.version", which should return the following:

    WebLogic Server 6.1 SP2 12/18/2001 11:13:46 #154529
    WebLogic XML Module 6.1 SP2 12/18/2001 11:28:02 #154529

If your output isn't similar to the above (appropriate to the version you are running), be sure to revisit the PATH and classpath variables set within your current command-line environment before proceeding with JSP precompilation.

The general syntax of weblogic.jspc is given below:

    java weblogic.jspc [options] <jsp files>...

The JSP compiler can compile a single JSP file or a set of JSPs in a single invocation, and can be configured in a number of different ways via command-line options. A working example is provided below:

    java weblogic.jspc
        -webapp mywebapp
        -compiler javac
        -compileFlags "-g"
        -classpath /u/apps/dist/src/lib.jar
        -d .
        -package com.slackwerks.mywebapp.jsp
        <jsp files>
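
With the -d . and -package values shown, the generated class files land in a javac-style package directory under the current directory, along the lines of the hypothetical listing below (the class name is illustrative; WLS derives its own names from the JSP file names). To deploy, the generated classes are typically copied into the Web application's class path, for example under WEB-INF/classes, before the WAR is packaged; verify the exact layout, and any packagePrefix setting needed in weblogic.xml to match the -package value, against the WLS documentation for your release.

    ./com/slackwerks/mywebapp/jsp/__index.class    (hypothetical generated class)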

This article shows a single example, but so you may better understand how weblogic.jspc can be used and managed in your environment, we provide the complete set of working options, implications of use, and associated issues at

While the argument for JSP precompilation can easily be made, there are a number of approaches that can be taken to implement it. However, given the advantages and disadvantages of those shown above, it should be readily apparent that programmatic precompilation via weblogic.jspc provides the most flexible option for overcoming the pitfalls inherent to JSP. Becoming familiar with the tool early in the development cycle will improve the administration and performance aspects of the application during production.

Steve Mueller is a principal consultant for BEA Systems, where he specializes in the design, development, and administration of enterprise systems running on WebLogic Server.

Scot Weber consults on J2EE design, development, and administration, focusing specifically on WebLogic Server. He has served as a lead architect and systems engineer in the field of distributed applications, and has participated in and witnessed the maturation of traditional OLTP, RPC, and messaging methodologies into their current J2EE manifestations.

