HP LoadRunner and Performance Center Blog


Performance Testing Guidance

One of the last things I worked on before leaving Microsoft was a book on Performance Testing Guidance, written with some of the best performance engineers I'd known for years at Microsoft. These guys were intense about the subject and really worked the authors and reviewers like crazy to get it done.


This guide shows you an end-to-end approach for implementing performance testing. Whether you are new to performance testing or looking for ways to improve your current approach, you will find insights that you can tailor to your specific scenarios. The guide covers Microsoft's recommended approach for implementing performance testing for Web applications, with steps for managing and conducting the testing. For simplicity and tangible results, the approach is broken down into activities with inputs, outputs, and steps. You can use the steps as a baseline or to help you evolve your own process.


Written by: J.D. Meier, Scott Barber, Carlos Farre, Prashant Bansode, and Dennis Rea


Reviewed by: Alberto Savoia, Ben Simo, Cem Kaner, Chris Loosley, Corey Goldberg, Dawn Haynes, Derek Mead, Karen N. Johnson, Mike Bonar, Pradeep Soundararajan, Richard Leeke, Roland Stens, Ross Collard, Steven Woody, Alan Ridlehoover, Clint Huffman, Edmund Wong, Ken Perilman, Larry Brader, Mark Tomlinson, Paul Williams, Pete Coupland, and Rico Mariani.


The book is an excellent starting point for learning about performance testing and performance engineering practices, and I can tell you it is loaded with good advice about how to "think" about load testing. It covers the concepts and perspectives that some of the best engineers in our industry apply every day to some of the toughest performance problems.


One of those guys is Scott Barber (note: an unabashed plug for Scott's work), who was a major contributor to this book. According to Scott's own blog entry about the book: "Even though this is a Microsoft patterns & practices book, it is a tool, technology, & process agnostic book...the book should apply equally well to a LoadRunner/Eclipse/Agile project as it applies to a VSTS/.NET/CMMI project." You can read more of Scott's blog here. In retrospect, I think I failed to truly appreciate Scott's experience and contributions to the writing - in fact, I know I did.

Check out the book (buy it, get the PDF, or view it online)...it's a great way to get started with the performance testing discipline.

 

Labels: LoadRunner

Email Questions about Think Times

**********************************************

From: Prasant 
Sent: Monday, August 04, 2008 7:55 AM
To: Tomlinson, Mark
Subject: Some questions about LR 

Hi Mark,

I am Prasant. I got your email ID from the Yahoo LR group. I have just started my career in performance testing and got a chance to work on LR; currently I am working with LR 8.1. I have one doubt regarding think time. While recording one script, think time automatically got recorded in the script. While executing the script I am ignoring the think time. Is it required to ignore the think time, or do we have to consider the think time while executing the script?

I have a question in mind: think time is considered the time the user takes before giving input to the server. In that case, while recording a script for a particular transaction I may take 50 seconds of think time, and my friend recording the same script may take less (let's say 20 seconds). So the think time will vary between his script and mine for the same transaction. If I execute both scripts considering the think time, the transaction response times will vary, which may create confusion during result analysis. Can you please give some of your viewpoints about this?

Thanks,
Prasant

**********************************************
From: Tomlinson, Mark 
Sent: Thursday, August 07, 2008 2:59 AM
To: Prasant
Subject: RE: Some questions about LR 

Hi Prasant,

Yes – it is good that think time gets recorded, so the script will be able to replay just like the real application – with delays for when you are messing around in the UI. But you must be careful: if you get interrupted while recording your script…perhaps you go to the bathroom, or take a phone call…you will see VERY LONG THINK TIMES getting recorded. You should keep track of this, then manually edit those long delays – make them shorter in the script, and more realistic, like a real end user.
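For example (a hypothetical snippet – your recorded values will differ), a recorded script might contain a pause from an interruption that you would trim down by hand:

    // Recorded while I stepped away from the desk – over 5 minutes idle:
    lr_think_time(312);

    // Edited down to a realistic pause for a real user on this page:
    lr_think_time(15);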

Also, as a general rule of thumb, you should try *not* to include think time statements between your start and end transactions. You are right that it will skew the response time measurements. But for longer business processes where you have a wrapper transaction around many statements…it might be impossible to keep every transaction clean.
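To make the idea concrete, here is a minimal VuGen-style sketch (the transaction name and URL are made up for this example) with the think time placed before the transaction starts, so the user delay is simulated but never measured:

    Action()
    {
        // Simulate the user pausing BEFORE the measured work begins.
        lr_think_time(8);

        lr_start_transaction("load_home_page");

        // Hypothetical request – substitute your own recorded steps here.
        web_url("home",
                "URL=http://myapp.example.com/home",
                "Resource=0",
                "Mode=HTML",
                LAST);

        lr_end_transaction("load_home_page", LR_AUTO);

        return 0;
    }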

Here are 3 other tips for you:

First – in the run-time settings, you have options to limit or adjust the think time for replay…you can set a maximum limit, or multiply the recorded amount. The combinations are very flexible. You can also choose to ignore think times and run a stress test, although I typically include at least 1 second of iteration pacing for most stress tests I run.
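The run-time settings handle this for you, but just to make the semantics concrete, here is "multiply, then cap" sketched as a hypothetical script function (the function name and numbers are purely illustrative):

    // Apply a multiplier to the recorded think time, then enforce a cap.
    void capped_think_time(double recorded, double multiplier, double cap)
    {
        double t = recorded * multiplier;
        if (t > cap)
            t = cap;   // never sleep longer than the maximum limit
        lr_think_time(t);
    }

    // e.g. capped_think_time(50, 0.5, 10); sleeps 10 seconds, not 25.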

Second – you can write some advanced functions in the script to randomize the think times programmatically. This could be used to dynamically adjust the think time from a parameter value, in the middle of the test.
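A minimal sketch of that idea, assuming a script parameter named {MaxThink} has been defined in the parameter list (the parameter name is an assumption for this example):

    int max_think;

    // Read the upper bound from a script parameter, so it can be
    // changed per scenario without editing the code.
    max_think = atoi(lr_eval_string("{MaxThink}"));
    if (max_think < 1)
        max_think = 1;   // guard against a missing or invalid parameter

    // Pause somewhere between 1 and max_think seconds.
    lr_think_time(rand() % max_think + 1);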

Third – even if you do have think times inside your start and end transactions, there is an option in the Analysis tool to include or exclude the think time overhead from the measurements displayed in the Analysis graphs and summary.

I hope you’ll find that with those 3 tips, you can get all the flexibility you need to adjust think times in your scripts – try to make the most realistic load scenario you can.

Best wishes,
Mark

Flash compared to Silverlight

Today I read an article from my colleague Brandon which compares Adobe Flash and Microsoft Silverlight. He makes some excellent points about the strength of Flash's market penetration compared to Silverlight's latest enhancements. For rich internet applications, I think we still see Flash as the primary UX platform out there…and it is a challenge for any testing tool to keep up with the fast pace of Adobe's innovations.

Brandon points out that one of the main advantages Silverlight has is "Speed to Production" - getting the app delivered to production quickly. The advantage is better responsiveness and agility for the business. Unfortunately, that usually equates to less time for proper testing, and especially performance testing.


It's also interesting how he points to the performance comparison at the presentation layer - which I think could be described as the "perceived performance" of the entire application system. In an enterprise RIA or a mashed-up application, users might not perceive on-par performance from either Flash or Silverlight, depending on the backend systems.

In a big RIA, you have multiple points of exposure to latency risk in the data services calls behind the application - so even if the UI is responsive, the data retrieval might be slow. Check out James Ward's blog on "Flex data loading benchmarks" - showing AMF together with BlazeDS, which is proving to be a very scalable and responsive combination.

Tags: Silverlight

Welcome to LoadRunner at hp.com

Greetings fellow LoadRunner Gurus! This is the introductory entry to the new LoadRunner @ hp.com blog, written by yours truly - Mark Tomlinson, the Product Manager for LoadRunner here at HP Software.

Although you might be expecting a lengthy personal introduction for the first blog entry, I've decided not to deliver on that foolishly stereotypical initiation. Instead, I'd like to start off with a few opportunities to engage with you directly for the betterment of LoadRunner's future.

First, we are looking for a few good and challenging applications for LoadRunner to go after, as our New Horizon research team is developing some exciting new solutions for advanced client record and replay. If you've got an application with any extreme architecture, including:

  1. Web 2.0 or Rich Internet Applications on Java, .NET, or Adobe
  2. Multi-multi-multi-protocol…like a mashed-up app with several backend systems
  3. Encoded/Serialized or Secured communication protocols
  4. Asynchronous, multi-threaded client(s) or data-push technologies
  5. Any combination or all of the above.

If you have a challenge for LoadRunner, we'd love to hear from you.

Second, we have a new release of LoadRunner coming soon, and we are just putting together our plans for the early-access beta program. If you're an existing customer and you're interested in getting into our formal beta program for LoadRunner, drop us an email. The early-access program does require your feedback, production usage, and a reference for the new release. We'd love to have your support for all of that - but I certainly understand that some folks just want to share feedback on the new stuff. We need that too, if that's all you can do.

Lastly, I'd love to hear from you - so drop me an email (loadrunner at hp . com). What do you love about the product, and what do you not like so much? What kinds of testing are you doing? What new applications are you being asked to test? How do you get your testing done? How do you derive meaning from the load test results? What is your favorite BBQ restaurant? Let me know your thoughts and feedback - the good, the bad, the ugly. I have been using LoadRunner for nearly 15 years - so I plan to include your input in our strategy for moving forward with innovating our solutions. I will post back weekly with some Q&A, if you'd like to share the conversation with our community.

Again - all of these initiatives are really important to the future of LoadRunner. Your participation is encouraged and greatly appreciated!

Thanks - and again, welcome to the blog!

About the Author(s)
  • I have been working in the computer software industry since 1989. I started out in customer support, then moved to software testing, where I was a very early adopter of automation - first functional test automation and then performance test automation. I worked in professional services for 8 years before returning to my roots in customer support, where I have been a Technical Account Manager for HP's Premier Support department for the past 4 years. I have been using HP LoadRunner since 1998 and HP Performance Center since 2004. I also have a strong technical understanding of HP Application Lifecycle Management (Quality Center) and HP SiteScope.
  • Malcolm is a functional architect, focusing on best practices and methodologies across the software development lifecycle.
  • Michael Deady is a Pr. Consultant & Solution Architect for HP Professional Services and HP's ALM Evangelist for the IT Experts Community. He specializes in software development, testing, and security. He also loves science fiction movies and anything to do with Texas.
  • Mukulika is Product Manager for HP Performance Center, a core part of the HP Software Performance Validation Suite, addressing the enterprise performance testing COE market. She has 14 years of experience in IT consulting, software development, architecture definition, and SaaS. She is responsible for driving future strategy, roadmap, optimal solution usage, and best practices, and serves as primary liaison for customers and the worldwide field community.
  • HP IT Distinguished Technologist. Tooling HP's R&D and IT for product development processes and tools.
  • WW Sr Product Marketing Manager for HP ITPS VP of Apps & HP LoadRunner