What are the Goals of Performance Testing?

So what is the point of performance testing?  I get this question often.  And depending on who you talk to, you get different answers.

Let me begin by telling you what the goals of performance testing and validation are NOT.



  • Writing a great script

  • Creating a fantastic scenario

  • Knowing which protocols to use

  • Correlating script data

  • Data Management

  • Running a load test


This is not to say that all of these are not important. They are very important, but they are not the goals. They are the means to the end.


So why DO people performance test? What are the goals?



  • Validating that the application performs properly

  • Validating that the application conforms to the performance needs of the business

  • Finding, analysing, and helping fix performance problems

  • Validating that the hardware for the application is adequate

  • Doing capacity planning for future demand of the application


The goals of testing are these outcomes, not the artifacts produced along the way. It seems basic. Of course these are the goals. But...



  • How many people really analyse the data from a performance test?

  • How many people use diagnostic tools to help pinpoint the problems?

  • How many people really know that the application performs to the business requirements?

  • How many people just test to make sure that the application doesn't crash under load?


Even though these goals seem obvious, many testers and engineers are not focusing on them correctly, or are not focusing on them at all.



  • Analysing the data is too hard.

  • If the application stays up, isn't that good enough?

  • So what if it's a little slow?


These are the reasons that I hear. Yes, you want to make sure that the application doesn't crash and burn. But who wants to go to a slow website? Time is money. That is not just a cliché; it's the truth. Customers will not put up with a slow app or website. They will go elsewhere, and they do go elsewhere. Even if it is an internal application, if it is slow at performing a task, then it takes longer to get the job done, and that means it costs more to get that job done.


Performance engineering is needed to ensure that applications perform properly and perform to the needs of the business. These engineers do not just write performance scripts. Just because someone knows Java does not mean that they are a developer. And just because a person knows how to write a performance script does not mean that they are a performance engineer.


Performance engineering requires skills that not all testers have. Performance engineers need to understand the application under test (AUT), databases, web servers, load balancers, SSO, and so on. They also have to understand the impact of CPU, memory, caching, I/O, bandwidth, and so on. These are not skills that are learned overnight, but skills that are acquired over time.
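To make the resource side of that list concrete, below is a minimal sketch (my own illustration, not tied to any particular product) of sampling CPU, memory, disk I/O, and network counters on a server while a test runs. It assumes the third-party Python package psutil is installed; the duration and interval values are arbitrary examples.

    # Illustrative only: sample basic host resources during a test window.
    # Assumes the third-party 'psutil' package is installed (pip install psutil).
    import time
    import psutil

    def sample_resources(duration_s=600, interval_s=5):
        """Print CPU, memory, and I/O deltas at a fixed interval."""
        disk_prev = psutil.disk_io_counters()
        net_prev = psutil.net_io_counters()
        end = time.time() + duration_s
        while time.time() < end:
            cpu = psutil.cpu_percent(interval=interval_s)  # blocks for interval_s
            mem = psutil.virtual_memory().percent
            disk = psutil.disk_io_counters()
            net = psutil.net_io_counters()
            print(f"cpu={cpu:5.1f}%  mem={mem:5.1f}%  "
                  f"disk_read_bytes={disk.read_bytes - disk_prev.read_bytes}  "
                  f"net_sent_bytes={net.bytes_sent - net_prev.bytes_sent}")
            disk_prev, net_prev = disk, net

    if __name__ == "__main__":
        sample_resources()

In practice you would run something like this (or your monitoring tool's equivalent) on the application, web, and database servers for the whole test window, save the samples, and line them up against the response times afterwards.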


I wrote a previous blog entry on "you get what you pay for." If you pay for a scripter, you get a scripter. If you pay for a performance engineer, you get a performance engineer (well, not always :smileyhappy:. Sometimes people exaggerate their skills :smileyhappy: ).


Companies can always divide and conquer. They can have automators/scripters create the scripts and the tests, then have performance engineers look at the tests and analyse the results. In any case, the performance engineer is a needed role if you want to properly performance test and validate.


Knowing what metrics to monitor, and what those metrics mean, needs to be mandatory, as does knowing how to use diagnostic tools. Again, in a previous blog I mentioned that if you are not using diagnostics, you are doing an injustice to your performance testing. Without this analysis knowledge you are not truly performance testing; you are just running a script under load. Performance testing is both running scripts and analysing the runs.
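As an example of what "knowing what the metrics mean" looks like in practice, here is a minimal sketch that turns raw results into the numbers the business actually cares about: error rate and response-time percentiles checked against a service-level target. The results.csv file, its column names, and the 2-second 95th-percentile target are all assumptions made up for the illustration.

    # Illustrative only: summarise a load-test run against an example SLA.
    # Assumes a hypothetical results.csv with columns:
    #   transaction, response_time_s, status
    import csv
    import statistics

    SLA_95TH_SECONDS = 2.0  # example business requirement, not a standard

    def summarise(path="results.csv"):
        times, errors, total = [], 0, 0
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                total += 1
                if row["status"] != "OK":
                    errors += 1
                    continue
                times.append(float(row["response_time_s"]))
        if not times:
            print("no successful transactions recorded")
            return
        times.sort()
        p95 = times[max(int(0.95 * len(times)) - 1, 0)]
        print(f"transactions={total}  errors={errors} ({100 * errors / total:.1f}%)")
        print(f"avg={statistics.mean(times):.2f}s  "
              f"median={statistics.median(times):.2f}s  95th={p95:.2f}s")
        print("SLA met" if p95 <= SLA_95TH_SECONDS else "SLA missed")

    if __name__ == "__main__":
        summarise()

The point is not the script itself; it is that "the test ran" and "the 95th percentile met the business requirement" are two very different statements, and only the second one answers the question the business is asking.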


By looking at the monitoring metrics and diagnostic data, you can begin to correlate the data and pinpoint problems. You can also notice trends that may become problems over time. Just running a load test without analysis will not give you that insight. It will just let you know that the test appeared to run OK for that test run. Many times just running the test will give you a false positive. People wonder why an application in production runs slowly when it already passed performance validation. Sometimes this is the reason (you never want this to be the reason). Proper analysis will ensure a higher quality application.
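To show what "noticing trends" can look like, here is another minimal sketch (again an illustration, with made-up column names and thresholds) that fits a straight line to response time versus elapsed time for a run. A positive slope on a run where every transaction "passed" is exactly the kind of false positive described above: the test looks fine, but latency is creeping up, which often points to a leak or gradual resource saturation.

    # Illustrative only: flag a run whose response times drift upward over time.
    # Assumes a hypothetical results.csv with columns: elapsed_s, response_time_s
    import csv

    def response_time_slope(path="results.csv"):
        """Least-squares slope of response time, in seconds per hour of test time."""
        xs, ys = [], []
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                xs.append(float(row["elapsed_s"]) / 3600.0)  # hours into the run
                ys.append(float(row["response_time_s"]))
        n = len(xs)
        if n < 2:
            return 0.0
        mean_x, mean_y = sum(xs) / n, sum(ys) / n
        var = sum((x - mean_x) ** 2 for x in xs)
        cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        return cov / var if var else 0.0

    if __name__ == "__main__":
        slope = response_time_slope()
        print(f"response-time trend: {slope:+.3f} s per hour")
        if slope > 0.1:  # arbitrary example threshold
            print("warning: latency is climbing even though the test 'passed'")

A run that sits flat at two seconds and a run that starts at one second and ends at three can produce the same average; only the trend tells you which one you have.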


As I said, these are not skills that are created overnight. Performance engineers learn on the job. How do you make sure that this knowledge stays with a company as employees come and go? That is where a Center of Excellence (CoE) comes into play (you knew I was going to have to pitch this :smileyhappy: ). If you centralize your testing efforts, then the knowledge becomes centralized, as opposed to dispersed throughout the company only to get lost when the employees with the knowledge leave. You can read yet another one of my blogs for more information on the CoE. Wow! I've just been pitching my blog entries today :smileyhappy:. But I digress.


Let's stop thinking that proper performance testing is writing a good script and agree that performance engineering is not an option but a must. Let's start to focus on the real goals of performance testing, and then all of the important "means to the end" will just fall into place.

Comments
Runner53 | 10-09-2009 06:34 PM

I just want to print this Blog entry out on parchment paper and frame it on the wall of my cubicle!

Here at Deere & Company I feel like the lone evangelist for the Performance Testing philosophy you are describing. I am constantly trying to convince Management and others that our job as Application Performance Testers is not just conducting some "load tests" on an application 2 days before it gets deployed into Production. When I try to incorporate the use of monitoring during testing I have to fight for the necessary "rights" to even access the Servers on which the application runs. And I am pretty much not allowed to even think about monitoring the Databases (DB2, SQL Server, and Oracle), the Middleware (MQ), or the Network. Worse yet, trying to get the various experts in the aforementioned areas to work cooperatively with our team during testing is an ongoing struggle.

As a result of the above "fights", I am struggling to demonstrate the value of the testing that we do conduct. In the past week alone, here are excerpts from two "analysis" reports from my colleagues: "We tested the application with 100 VUsers, 150 VUsers and 200 VUsers. It ran OK with 100, but crashed with the other loads." and "The application ran well for 45 minutes, but then it experienced weirdness". I'm serious! On top of this, I see constant reports that are nothing more than the LoadRunner Analysis Report in Word Doc form, without any real explanation of what a "VUser" really is, what Hits per Second or Throughput even mean (and whether we should care), and whether or not the application will run at the projected loads.

The reason I am even on the HP site right now is I was looking to see if I could find more current information on several things I personally need for a project on which I am working: 1) A better understanding of how to monitor WebSphere Applications running on a virtualized Apache/Linux environment. 2) A way to monitor DB2 running on AIX. 3) A way to monitor AIX in general, with consideration for the concept of entitlement. 4) A way to diagnose Application "crashes" in WebSphere (including how to read Heapdumps and Javacores). Unfortunately or fortunately, I stumbled on this Blog, and have to give up my search again for today. Oh well, I have been trying to get the HP Diagnostics software to run in our environment since at least February, so what's a few more days? I can always open another HP Service Request, learn Spanish so I can decipher the recommendations I am offered, and give up again a month from now.

I would like to ask if you can find someone who can draft a "Performance Testing Specialist's Oath" that addresses all of the points you have made. That way I can at least give the new testers on our team something to shoot for.

(anon) | 09-29-2010 11:05 AM

I agree completely with Runner53.  I experience the same hurdles and I am at a loss to articulate the value of utilizing the LR product to the full extent.  Not allowing LR access to a server via a service account to gather perfmon stats is just one of the hurdles.   A basic understanding of the complexity of testing is another.  Scripting is a means to an end, not the end goal.


I am on the HP site looking for standard best practices to present to the PMs, to give them an understanding that performance testing is not just click and script and run.  It takes preparation, LOB input and interpretation of results.   


Having an "LR Best Practices" site would be helpful. I would like to see HP take an active role in assisting the performance engineer in explaining performance testing and the value it adds to the company. The company has purchased a very powerful and useful product, and under the guidance of an experienced engineer, production issues could be caught in QA. Our goal as performance engineers is to take our years of experience in engineering and architecting, combined with knowledge of the LR product, to deliver a quality application and a positive end-user experience. After all, we want the customer to return to our site or application and use it; that is why we developed it. It keeps us in business.
