Load Testing: Same Old, Same Old or a Whole New Ball Game?

How can we be prepared for the future of load and performance testing?

Bonjour Foxes!

I started my career as a Telecom Engineer for Rational Software in the load testing space back in the late '90s. When I look back on the last decade, I see enormous advances in the broader IT world, including development methodologies, processing speeds, network speeds, and mobile devices (it's hard to believe the first iPhone was released only six years ago). But the questions remain: Has the world of load and performance testing really changed all that much? Are the missions the same? Are the challenges different? And how can we be prepared for the future of load and performance testing?

Let's start with what hasn't changed:

Websites crash.
Crashes are a reality today just as much as they were 10 years ago. While many companies have become aware of big events (e.g., Cyber Monday, big ad campaigns) that cause traffic spikes, exact traffic levels are still difficult to predict. One big difference between now and then is the sheer number of websites on the Internet. According to Netcraft, the number of active sites on the Internet has increased by almost 1000% over the past ten years. More websites = more crashes.

Developers still think their code is bug-free.
Being a load tester comes with the occasional power trip: crashing an application written by developers who were sure their code was perfect. Their code may well be very good, but load testing can still quickly uncover flaws in the code or architecture that cause performance issues.

Load testing is still pushed off until late in the development cycle.
This is a reality most testers simply have to deal with. Testing in general, and load and performance testing in particular, is deferred until the end of the development cycle, which is especially frustrating for testers, who then get blamed for holding up a release.

HTTP is still a stateless protocol.
Furthermore, it's being used to drive a world that is becoming increasingly connected. In fact, the vast majority of advances in web technologies (cookies, sessions, AJAX, WebSockets, SPDY, etc.) were created to work around HTTP's statelessness. For load testers, this means the same old complication: correlating session state across otherwise independent requests.
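
To make this concrete, here's a minimal Python sketch of the difference between bare, stateless requests and a cookie-carrying session; the requests library is real, but the example.com endpoints and credentials are hypothetical. Each virtual user in a load test has to maintain its own session in exactly this way.

import requests

# Two bare requests share nothing: the server sees two anonymous clients.
requests.get("https://example.com/account")
requests.get("https://example.com/account")

# A Session persists cookies across requests, letting the server stitch
# them into one logical "user". A load tool does this per virtual user.
with requests.Session() as user:
    user.post("https://example.com/login",
              data={"user": "alice", "pass": "secret"})   # hypothetical endpoint
    r = user.get("https://example.com/account")           # carries the session cookie
    print(r.status_code, user.cookies.get_dict())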

What Has Changed?
Load testing is becoming a mandatory step in the development process.

I've been talking to more and more companies these days that require all applications to go through load and performance testing before they're deployed to production. This is especially true for new online services, which know they have only one chance to make a good first impression, or they will likely lose that customer forever.

The performance of applications is becoming more important than the breadth of functionality.
For some companies, such as insurers and banks, application performance matters more than the sheer number of functions their apps can perform. As a result, load and performance testers are playing more prominent roles within these development organizations.

The number of technologies built over HTTP is growing each month.
More and more technologies are being developed to make the web faster, more secure, and more responsive, and development organizations are adopting them at ever faster rates. This means load testers are asked to test apps built on complex technologies they've never seen before, and it's no easy task to test apps that use AJAX, SPDY, WebSockets, video streaming, and the like.
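
As a rough illustration of what that means in practice, here is a sketch of driving concurrent WebSocket virtual users in Python. The third-party websockets package is real; the wss://example.com/feed endpoint and message format are assumptions made for the example.

import asyncio
import websockets

async def virtual_user(user_id: int) -> None:
    # Each virtual user holds its own long-lived WebSocket connection.
    async with websockets.connect("wss://example.com/feed") as ws:
        await ws.send(f"subscribe user-{user_id}")
        for _ in range(5):   # read a few pushed messages, like a real client
            await ws.recv()

async def main() -> None:
    # 100 concurrent users; a real test would also record latencies and errors.
    await asyncio.gather(*(virtual_user(i) for i in range(100)))

asyncio.run(main())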

App developers have to consider the performance of their apps on mobile devices and networks.
Morgan Stanley has predicted that mobile Internet users will surpass desktop Internet users by the end of the year. With this in mind, performance testers need to be able to re-create the use cases and network conditions actual users will experience across a range of network types and devices.
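
A quick back-of-the-envelope calculation shows why this matters: the same page can feel instant on fiber and unusable on 3G. The profile numbers below are illustrative assumptions, not measurements.

# Rough page load time: one round trip per request plus raw transfer time.
PROFILES = {              # (round-trip time in seconds, bandwidth in bits/s)
    "fiber": (0.010, 100e6),
    "4g":    (0.050, 20e6),
    "3g":    (0.300, 1e6),
}

def rough_load_time(page_bytes, rtt, bandwidth, requests=20):
    return requests * rtt + (page_bytes * 8) / bandwidth

for name, (rtt, bw) in PROFILES.items():
    print(f"{name:>5}: ~{rough_load_time(2e6, rtt, bw):.1f}s for a 2 MB page")

Under these assumptions, a 2 MB page loads in roughly 0.4 seconds on fiber but over 20 seconds on 3G, a gap a desktop-only test would never reveal.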

What can you do to handle the realities of today and be prepared for the load testing world of tomorrow?

1. Don't avoid load and performance testing.
If you're one of those people who think your apps will be fine and users will do the testing in production, I hope for your sake that your developers actually do write bug-free code. Remember, end users are not as forgiving or as patient as virtual users when your application buckles under load; recall the old Amazon.com statistic that every 100 ms of added response time cost 1% of sales.

2. Make your tests as realistic as possible.
Load testing is much easier these days thanks to the tools available, but don't think of it as a point-and-click operation. I see too many companies running "load tests" that simulate neither the number of users observed in production nor the conditions under which the apps will be used.
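
As an illustration of what "realistic" looks like, here is a short sketch using the open-source Locust tool (my choice for the example, not the only option): think times between pages and a weighted mix of user journeys instead of one URL hammered in a tight loop.

from locust import HttpUser, task, between

class Shopper(HttpUser):
    wait_time = between(2, 8)      # humans pause between pages

    @task(10)                      # browsing dominates real traffic...
    def browse(self):
        self.client.get("/products")

    @task(1)                       # ...while checkouts are rarer but heavier
    def checkout(self):
        self.client.post("/cart/checkout", json={"sku": "demo-123"})

Run it against a test host with something like locust -f shopper.py --users 500 --spawn-rate 10, ramping users up gradually rather than unleashing them all at once.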

3. Make sure your load testing tool can match the rhythm of the technical "dance".
Developers and architects are going to want to take advantage of the latest technologies, even some that are still in beta. As a tester, you and your tool shouldn't be the bottleneck for the product launch. Make sure your tool supports your needs as well as the needs of your development organization.

With all of the advances in web and mobile application technologies and the instant response times end-users expect these days, the performance of applications is only going to grow in importance. My advice to load testers is to stay on top of these trends because they move quickly. It's certainly an exciting time to be a load tester.

More Stories By Hervé Servy

Hervé Servy is a Senior Performance Engineer at Neotys. He spent 10 years in pre-sales and marketing roles for IBM Rational and Microsoft in France and the Middle East. During the past 3 years, as a personal project, Hervé founded a nonprofit organization in the health 2.0 area. If that isn't techie enough, Hervé was also born on the very same day Apple Computer was founded.
