HP LoadRunner and Performance Center Blog

Displaying articles for: May 2009

Learn: what can you really get out of a Performance CoE

I've been working in centralized performance testing organizations for more than 6 years, giving presentations and consulting on Performance CoEs - to the point where I didn't think there was much more to learn. Then I read the preliminary research from Theresa Lanowitz, who digs deeper into the true value of doing centralized performance testing. There is compelling new evidence showing the real value our customers are finding, the clear benefits, and some of the hidden value.

Please consider attending this session from Theresa Lanowitz, founder of voke Inc., as she presents the findings of a recent Market Snapshot on Performance Centers of Excellence. In this presentation, she will discuss:

  • Building a performance CoE

  • Achieving performance CoE ROI – qualitative and quantitative

  • Gaining organizational maturity through a performance CoE

  • Realizing the benefits, results, and strategic value of a performance CoE

Attendees will receive a copy of the voke "Market Snapshot™ Report: Performance Center of Excellence."

Click here to Register

Labels: LoadRunner

ROI: You Get What You Pay For

We've all heard that saying. But how many times do we really follow it? We have bought, ok I have bought, cheap drills, exercise machines, and furniture, only to be sorry when they break prematurely. Or you find a great deal on shoes only to have them fall apart while you are in a meeting with a customer. I'm not saying that happened to me, but I know how that feels.

Cheaper always seems like it's a better deal. Of course it's not always true. I can tell you that now I pay more for my shoes and I'm much happier for it :). No more embarrassing shoe problems in front of customers (not saying that it happened to me). In fact, when my latest pair of shoes had an issue, I sent them back to the dealer and they mailed me a new pair in less than a week! That's service. You get what you pay for.

The same holds true for cars, clothes, hardware, repairs, and of course software testing tools. You knew I was going to have to go there.

I hear from some people that Performance Center is too expensive. I'm always amazed when I hear that. I'm not saying Performance Center is for everyone. If you don't need PC's features and functionality, it may appear pricey. If you are only looking for a simple cell phone, then a phone that connects to the internet and to your email and also has a touch screen may seem a little pricey. But if you need those features then you are able to see the value in those features.

I could sit here, go through each unique feature in Performance Center, and explain its value to you (not saying that won't come in a future blog :) ). But why would you listen to me? I'm the product manager. Of course I'm going to tell you that there is a lot of value in PC. Well, IDC, a premier global provider of market intelligence and advisory services, has just released an ROI case study around HP Performance Center.

A global finance company specializing in real estate finance, automotive finance, commercial finance, insurance, and online banking was able to achieve total ROI in 5.6 months. Yes! Only 5.6 months, not 1 or 2 years, but a total return on investment in 5.6 months. If anything, I think we should be charging more for Performance Center :). This company did not just begin using PC; they have been using it for the last 4 years, and during that time they have found a cumulative benefit of $24M. I'd say they got a lot more than what they were paying for. Not only did they see a 5.6-month return on the investment, but they are also seeing a 44% reduction in errors and a 33% reduction in production downtime.
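To make the payback arithmetic concrete, here is a quick sketch. The post doesn't publish the study's underlying cost figure, so the investment amount below is purely illustrative; only the $24M benefit over 4 years comes from the case study.

```python
# Hypothetical payback-period arithmetic. Only the $24M / 4-year benefit
# is from the case study; the investment figure is illustrative.
cumulative_benefit = 24_000_000        # $24M benefit reported over 4 years
months = 4 * 12
monthly_benefit = cumulative_benefit / months   # $500,000 per month

# If the total investment were, say, $2.8M, the payback period would be:
investment = 2_800_000                 # assumed, not from the study
payback_months = investment / monthly_benefit
print(f"Payback: {payback_months:.1f} months")  # prints: Payback: 5.6 months
```

In other words, at roughly half a million dollars of benefit per month, any investment on that order of magnitude pays for itself well inside a year.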

What gave them these fantastic numbers?

Increased Flexibility
By moving to Performance Center they were able to schedule their tests. Before PC, some controllers sat idle while others were in high demand, depending on the scenarios installed on them. Once they could schedule their tests, they began performing impromptu load tests and concurrent load tests, and they found they could run more tests with fewer controllers.
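The scheduling gain can be illustrated with a small sketch (plain Python, not Performance Center functionality; the time slots are made up). With a controller dedicated to each scenario, you need one machine per scenario; with a shared, scheduled pool, you only need as many controllers as the peak number of concurrent tests:

```python
# Illustrative sketch: minimum pooled controllers = peak concurrency.
# The test time slots below are invented for the example.
def controllers_needed(tests):
    """Count the maximum number of overlapping tests, which is the
    minimum number of shared controllers required to run them all."""
    events = []
    for start, end in tests:
        events.append((start, 1))   # test begins: one controller busy
        events.append((end, -1))    # test ends: controller freed
    busy = peak = 0
    for _, delta in sorted(events):  # ends sort before starts at a tie
        busy += delta
        peak = max(peak, busy)
    return peak

# Six hypothetical test runs as (start hour, end hour):
tests = [(0, 2), (1, 3), (2, 5), (4, 6), (5, 7), (6, 8)]
print(controllers_needed(tests))  # prints 2
# Dedicated controllers would need six machines, mostly sitting idle.
```

The design point is simply that scheduling converts controllers from scenario-bound machines into a fungible pool, so utilization rises and the machine count falls.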

Increased Efficiency
While they were able to increase their testing output, they didn't increase their testing resources. Their testers and engineers were able to do more through PC than they could with any other tool.

Central Team
With the central team they were able to build up their knowledge and best practices around performance testing. By doing this, along with performing more tests, they were able to reduce their error rate by 44% and their production downtime by 33%.

So you get what you pay for. Put in the time and money. Get a good enterprise performance testing product. Invest in a strong central testing team. You will get more out than you put in. In the case of this customer, they got $21M out over 4 years.

Also invest in good shoes. Cheap shoes will only cause you headaches and embarrassment (Not saying that it happened to me).

Managing Changes – Web 2.0, where's your shame

Oh, this strange fascination started in 2004, when the term 'Web 2.0' was coined for a new generation of web development. I witnessed this evolution of technology from my seat in steerage at Microsoft, as customers switched from the old Active architecture (remember Windows DNA?) to the warm impermanence of .NET and J2EE architectures for web applications. Out with the old and in with the new, but the performance problems were generally the same: memory management, caching, compression, heap fragmentation, connection pooling, and so on. It might have had a new name, but it was the same people making the same mistakes. Back then we dismissed some of these new architectures as unproven or non-standard. But that didn't last long. Now, almost 5 years later with Web 2.0, any major player in the software industry that hasn't adopted the latest web architectures is spit on as plainly outdated, or stuck with the label of being traditional.

When it comes to testing tools and Web 2.0, I think that "traditional" does not equate to obsolete, no matter what some of the "youngsters" in the testing tool market may like to imply. The software industry is certainly competitive, and I think new companies should simply evangelize the positive innovations they have and let the facts speak for themselves. If the 'old guys' can't support the new Web 2.0 stuff, then it will be obvious soon enough. For instance, if a testing tool company doesn't fully support AJAX clients, that's just unacceptable at this point.

However, I do believe it is fair game to evaluate existing software solutions (pre-Web 2.0) on how well they can be adapted to support newer innovations in technology. As for LoadRunner, I think we have a long history of adapting to and embracing every new technology that has come along. I started using LoadRunner with X-Motif systems running on Solaris. That era and generation of technology has long since died (no offense intended to Motif or Sun). Today, the same concepts for record, replay, execution, scripting, and analysis are still innovative and very relevant. As long as the idea behind the product is still valid, you can still deliver a valid product.

When adapting to changes here in LoadRunner, we usually start by overcoming the technical hurdles of creating a new virtual user, or updating an existing one. And as I stated above, we have a long and rich history of doing this - probably more than any other testing tool. As an example, in versions 9.0, 9.1, and 9.5 we have continued to improve our support for AJAX, Flex, and asynchronous applications. We respond to change quite well, even if we take some extra time to evaluate every aspect of what this 'new web' change means to our customers. It's worth getting right and not being swayed by the hype of the 'Web 2.0' label.

Let me finish by stating that these new web technologies are a challenge to testing tools, but they are even more of a challenge to testers. I've heard that many a tester gets a surprise when the next version of the AUT has quietly implemented a new Web 2.0 architecture, or even started making web services calls to a SOA. Change is a surprise only if you're unaware or unconscious. Sure, it would be a failure not to communicate to QA that significant technology changes were coming, right? To some, this will sound all too familiar: an institutionalized version of "throw it over the wall" behavior. But honestly, these new technologies (like AJAX) have been around for nearly 5 years now.

As for most testers, here’s a thanks to Web 2.0 – “You've left us up to our necks in it!”

About the Author(s)
  • Lending 20 years of IT market expertise across 5 continents as an innovation adoption change agent.
  • I have been working in the computer software industry since 1989. I started out in customer support, then software testing, where I was a very early adopter of automation - first functional test automation and then performance test automation. I worked in professional services for 8 years before returning to my roots in customer support, where I have been a Technical Account Manager for HP's Premier Support department for the past 4 years. I have been using HP LoadRunner since 1998 and HP Performance Center since 2004. I also have a strong technical understanding of HP Application Lifecycle Management (Quality Center) and HP SiteScope.
  • GTM Marketing for HP Software's ADM team. I am passionate about design, digital marketing, and emerging tech.
  • Malcolm is a functional architect, focusing on best practices and methodologies across the software development lifecycle.
  • Michael Deady is a Pr. Consultant & Solution Architect for HP Professional Service and HP's ALM Evangelist for IT Experts Community. He specializes in software development, testing, and security. He also loves science fiction movies and anything to do with Texas.
  • Mukulika is Product Manager for HP Performance Center, a core part of the HP Software Performance Validation Suite, addressing the Enterprise performance testing COE market. She has 14 years experience in IT Consulting, Software development, Architecture definition and SaaS. She is responsible for driving future strategy, roadmap, optimal solution usage and best practices and serves as primary liaison for customers and the worldwide field community.
  • HP IT Distinguished Technologist. Tooling HP's R&D and IT for product development processes and tools.
  • Rick Barron is a Program Manager for various aspects of the PM team and HPSW UX/UI team; and working on UX projects associated with HP.com. He has worked in high tech for 20+ years working in roles involving web design, usability studies, and mobile marketing. Rick has held roles at Motorola Mobility, Symantec and Sun Microsystems.
  • WW Sr Product Marketing Manager for HP ITPS VP of Apps & HP Load Runner
The opinions expressed above are the personal opinions of the authors, not of HP. By using this site, you accept the Terms of Use and Rules of Participation.