

How to Speed Up sites like vancouver2010.com by more than 50% in 5 minutes

A standard approach to getting a high-level analysis result in 5 minutes

Many web sites that use JavaScript frameworks to make their pages more interactive and more appealing to the end user suffer from poor performance. Over the past couple of months I’ve been contacted by users of our free dynaTrace AJAX Edition asking me to help them analyze their problems. In doing so, I’ve developed a standard approach that gets me to a high-level analysis result in 5 minutes.

As the Winter Olympics are a hot topic right now, I checked out vancouver2010.com to see if there is any potential to improve its web site performance. It seems I found a perfect candidate for this 5-minute guide :-)

Minute 1: Record your dynaTrace AJAX Session

Before I start recording a session I always turn on argument capturing via the preferences dialog:

Turn on Argument Capturing in the Preferences Dialog

The reason I do that is because I want to see the CSS selectors passed to the $ or $$ lookup functions of JavaScript frameworks like jQuery or Prototype. The most common problem I’ve identified in my work is CSS selectors by className, which cause huge overhead on pages with many DOM elements. I wrote two blogs about the performance impact of CSS selectors in jQuery and Prototype.
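To see why className lookups are so expensive, here is a minimal sketch of what a selector engine has to do when the browser offers no native getElementsByClassName (as was the case in IE at the time). Plain objects stand in for DOM nodes; this is illustrative, not jQuery’s actual code:

```javascript
// Without native class-name lookup support, the selector engine must
// visit every element in the document and check its class attribute.
function getByClassName(nodes, cls) {
  var matches = [];
  for (var i = 0; i < nodes.length; i++) {
    // a class attribute is a space-separated list of class names
    if ((' ' + nodes[i].className + ' ').indexOf(' ' + cls + ' ') !== -1) {
      matches.push(nodes[i]);
    }
  }
  return matches; // O(n) over the whole document -- on every single call
}
```

On a page with thousands of DOM elements, every single `$('.someClass')` call pays this full traversal again.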

Now it’s time to start tracing. I executed the following scenario:
1. went to http://vancouver2010.com
2. clicked on Alpine Skiing
3. clicked on Schedules & Results
4. clicked on the results of the February 17th race (that’s where we Austrians actually made it onto the podium)

Minute 2: Identify poorly performing pages

After closing the browser, I return to dynaTrace AJAX Edition and look at the Summary View to analyze the individual page load times and to identify whether there is a lot of JavaScript, Rendering or Network time involved. Let’s see what we got here:

Identifying HotSpots on every page

Here is what we can see:
1. Across the board we have high JavaScript execution times. The last page (Schedules & Results) tops the list with almost 7 seconds of pure JavaScript
2. The first page has a large amount of Rendering Time – that is time spent in the browser’s rendering engine
3. Pages 2 and 4 have page load times (time until the onLoad event was triggered) of more than 5 seconds!!
4. Page 3 has a very high Network Time although its page load time is not that bad. This means we have content that was loaded after the onLoad event

Minute 3: Analyze Timeline of slowest Page

I pick page 4 because it shows both a very high Page Load time and a very high JavaScript time. I drill down to the Timeline view and analyze the page’s characteristics:

Where is the time spent on this page?

Here is what I can read from this timeline graph (moving the mouse over these blocks gives me a tooltip with timing and context information):
1. the readystatechange handler takes 5.6 seconds of JavaScript. This handler is used by jQuery and calls all registered load handlers
2. the script FB.share takes 792 ms when it gets loaded
3. an XHR request at the very beginning takes 820 ms
4. we have about 80 images, all coming from the same domain – this could be improved by using multiple domains
5. we have calls to external apps like Facebook, Google Ads and Google Analytics
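The first point explains why a single handler can account for so much time: a library like jQuery queues every registered ready callback and fires them back to back inside one readystatechange handler. A minimal sketch of that batching (hypothetical names, not jQuery’s actual internals):

```javascript
// All "document ready" callbacks end up in one queue and run
// sequentially inside a single readystatechange handler -- so the
// profiler attributes the sum of all of them to that one handler.
var readyQueue = [];
function onReady(fn) {
  readyQueue.push(fn);
}
function fireReady() {
  // one event handler, many callbacks: the 5.6 seconds measured here
  // is really the total of everything the page scheduled for load time
  for (var i = 0; i < readyQueue.length; i++) {
    readyQueue[i]();
  }
}
```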

Minute 4: Identify poorly performing CSS Selectors

The biggest block is the JavaScript executed in the readystatechange handler. I double-click on it and end up in the PurePath view showing me the JavaScript trace of this event handler. I navigate to the actual handler implementation, which gets called by jQuery, and expand it to see the methods it calls and which ones consume the most time. It is not surprising to see a lot of jQuery selector calls in there using a CSS className to identify elements:

PurePath View showing HotSpots in the onLoad event handlers

I highlighted those calls that have a major impact on the performance of this event handler. You can see that most of the time is actually spent in the $ method, which is used to look up elements. Another thing I can see is that they change the class name of the body element to “en”, which takes 550 ms to execute.

As I am sure there are tons of calls to jQuery selector lookups in this JavaScript handler as well as in all other JavaScript handlers on the vancouver2010.com web site, I open up the HotSpot view. The HotSpot view shows me the JavaScript, DOM Access and Rendering hotspots across all pages. I am interested in the $ method only. In the HotSpot view I therefore filter for “$(” and also filter to show only the DOM API (we account the $ method to the DOM API and not to jQuery). Here is what I get after sorting the table by the Total Sum column:

HotSpot View showing all jQuery CSS Selectors and their performance overhead

The problem here is easy to explain. The site makes heavy use of CSS selectors to look up elements by class name. This type of lookup is not natively supported by Internet Explorer, so jQuery has to iterate through the whole DOM to find those elements. A better solution would be to use unique IDs – or at least to add the tag name to the selector string. The tag name helps jQuery because it first finds all elements by tag name (which is natively implemented and therefore rather fast) and then only has to iterate through that much smaller set. Instead of an average lookup time of between 50 ms and 368 ms, this can be brought down to 5-10 ms -> nice performance boost, eh? :-)
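The fix can be as simple as changing the selector strings. The helper below is a hypothetical illustration of the idea – in practice you would edit the selectors in the page’s own code:

```javascript
// Hypothetical helper: turn a class-only selector into a tag-qualified
// one, e.g. '.result-row' -> 'tr.result-row'. With the tag name present,
// jQuery can call the fast native getElementsByTagName first and only
// check the class attribute on that much smaller candidate set.
function qualifySelector(selector, tagName) {
  return selector.charAt(0) === '.' ? tagName + selector : selector;
}
```

So `$(qualifySelector('.result-row', 'tr'))` stands in for `$('.result-row')` – and a unique ID, as in `$('#results')`, skips the traversal entirely.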

Minute 5: Identify network bottlenecks

In the timeline I saw many image requests coming from the same domain. As most browsers limit the number of physical network connections per domain (e.g. IE7 uses 2), the browser can only download so many images in parallel; all other images have to wait for a physical connection to become available. Drilling into the Network view for page 4, I can see all these 70+ images and how they “have to wait” to be downloaded. Once these images are cached this problem is no longer such a big deal – but for first-time visitors it definitely slows down the page:
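A back-of-the-envelope model shows how hard that connection limit bites (the per-image time is an illustrative assumption, not a measurement from the site):

```javascript
// With C parallel connections and N similar-sized images taking about
// t ms each, the images download in waves: total ~ ceil(N / C) * t.
function imageWaveTime(numImages, connections, msPerImage) {
  return Math.ceil(numImages / connections) * msPerImage;
}
// 70 images at ~100 ms each on IE7's 2 connections: 35 waves = 3500 ms;
// spread across 4 connections it drops to 18 waves = 1800 ms
```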

Network View showing waiting times for Images

The solution to this problem is domain sharding. Using two domains to host the images allows the browser to open twice as many physical connections and download more images in parallel. This can speed up the download of those images by roughly 50%.
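A sketch of how sharding could look in the page’s image URL generation (the shard host names are made up for illustration; both would point at the same image server):

```javascript
// Deterministically map each image path to one of two image hosts.
// Hashing the path keeps a given image on a stable host, so repeat
// views still hit the browser cache instead of re-downloading.
var SHARDS = ['http://img1.example.com', 'http://img2.example.com'];
function shardedUrl(path) {
  var hash = 0;
  for (var i = 0; i < path.length; i++) {
    hash = (hash * 31 + path.charCodeAt(i)) >>> 0; // simple unsigned hash
  }
  return SHARDS[hash % SHARDS.length] + path;
}
```

The browser only cares that the host names differ – it grants each one its own connection pool.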


It is easy to analyze the performance hotspots of any web site out there. This is my approach to identifying the most common problems I’ve seen in my work. Besides the problems with CSS selectors and network requests, we see problems with poorly performing JavaScript routines (very often from third-party libraries), too many JavaScript files on the page, too many XHRs (XmlHttpRequests) to the server and slow server responses to those XHR requests. Especially for that last piece we use our end-to-end monitoring solution, integrating the data captured with dynaTrace AJAX Edition with the server-side PurePath data captured with dynaTrace CAPM. Also – check out my blog about why end-to-end performance analysis is important and how to do it.

Feedback on this is always welcome. I am sure you have your own little tricks and processes to identify performance problems on your web sites. Feel free to share them with us at blog.dynatrace.com.

Related reading:

  1. Performance Analysis of dynamic JavaScript Menus
  2. 101 on jQuery Selector Performance
  3. 101 on Prototype CSS Selectors
  4. Ensuring Web Site Performance – Why, What and How to Measure Automated and Accurately
  5. 101 on HTTPS Web Site Performance Impact

More Stories By Andreas Grabner

Andreas Grabner has been helping companies improve their application performance for 15+ years. He is a regular contributor within Web Performance and DevOps communities and a prolific speaker at user groups and conferences around the world. Reach him at @grabnerandi
