HP LoadRunner and Performance Center Blog

HP Discover tips and tricks: the performance testing edition with HP and Major Healthcare MCO

The International Data Corporation (IDC) forecasts that by late 2013, mobile phones will surpass PCs as the most common web access device globally. Are you prepared?

 

As a result of this global shift to mobile, it is vital that you stay one step ahead of the game and make sure your mobile applications are up to par. Keep reading to find out how you can learn more about mobile performance testing at HP Discover.

Your guide to performance testing at HP Discover 2014!

HP Discover 2014 is just around the corner! There are so many activities and meetings that it is hard to keep up with everything! To help you find your way around and make sure you don't miss anything, I created this go-to guide for HP Discover!

 

 

We have tons of great sessions from customers, new products and releases, and plenty of networking! You won't want to miss all the fun!

Performance testing—now powered by SaaS

HP Performance Center on SaaS takes the cloud theme of the Apps 12 launch one step further. HP PC on SaaS eliminates the need to have hardware on premises and gives you the flexibility to burst into the HP SaaS cloud, into clouds purchased elsewhere, or into a combination of the two, as your testing needs demand.

 

Keep reading to find out how HP SaaS can help you quickly unlock the value of Performance Center to get through your first round of testing within days of signing up.

 

Guest post by Mustali Barma, HP PC on SaaS Solution Manager

Best practices and proven methodologies for Mobile Performance Testing

Mobile performance testing is playing an increasingly important role in the automated tech landscape. This is why we are addressing this issue head-on. You can join us on April 9 at our webinar with Vivit, where we will discuss the latest best practices and methodologies for creating an exceptional mobile performance testing strategy.

 

Keep reading to find out how you can join us.

Cloud testing with HP Performance Center 12

Is your department being asked to perform more cloud testing as a result of increasing demand for mobile and analytics? Cloud testing allows you to quickly and elastically scale up performance tests to meet the demands of your customer-facing business applications. As a result, you can reduce the expenditure and overhead of managing dedicated machines.

 

Keep reading to find out how you can easily perform cloud testing using HP Performance Testing Solutions.

HP Cloud Testing Webinar: 3 Essential Facts to Simplify Performance Testing

To meet business and technology shifts, agile IT enterprises are adopting technologies and architectures, such as cloud computing and cloud testing, that have increased the speed of application development.

 

Keep reading to learn how you can improve your continuous performance testing.

Webinar: Move Performance Testing to the Next Level with HP Performance Center

Are you satisfied with your performance testing processes and standards? Do disparate teams in IT and in your LOB organizations share performance testing resources and results? Can you trace all defects and see how overall quality is tracking across your projects and releases? If not, it's time to think about standardizing, sharing, and even building a Center of Excellence.

This webinar will focus on these best practices and highlight them by demonstrating the capabilities of HP Performance Center, a solution expressly designed for end-to-end application performance.

 

Register here

Virtual Event: The power of change with United Airlines, T-Mobile and Nationwide

Industry experts from T-Mobile, United Airlines, Forrester, SunTrust, Nationwide, and Iron Mountain gather to share how you can stay ahead of technology change and make it your competitive advantage. Join the HP Power to Change virtual event on September 10!

 

Register here

Virtual services in load testing with the HP LoadRunner and HP Service Virtualization integration

Imagine you’re a performance engineer who has to test a composite application which relies on a number of business services, some of which are developed in-house, and some of which are third-party services. You find out that one of the in-house services doesn’t work, and you also discover that you’re not allowed to perform load testing on one of the third-party services because it’s a production service. How can you go ahead with your performance testing, when there are key services that you can’t access?

 

Continue reading to find out how HP Service Virtualization is integrated with HP LoadRunner to create virtual services that let you carry on working even when services aren’t available.

 

This post was written by Sergey Kutsos and Andrey Gaevskiy from the LoadRunner R&D team

Your guide to Performance testing at HP Discover!

Is your head swirling with HP Discover information and sessions? I am here to help you uncover the sessions, booths and events that should be at the top of your list.

 

Keep reading to follow me as I act as your tour guide through the fun and educational experience of HP Discover.

4 Tips for Replaying Remote Desktop Protocol (RDP) scripts in LoadRunner

LoadRunner scripts that use the Remote Desktop Protocol (RDP) don’t always replay smoothly.  My colleague Kert (Yang Luo) from the LoadRunner R&D team has collected some tips that will save you time and effort, and help you ensure that your RDP scripts run correctly.   Continue reading to learn more.
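To give a flavor of the scripts these tips apply to, here is a minimal sketch of a VuGen RDP action. The step names, snapshot files and coordinates are hypothetical, and exact argument lists vary by LoadRunner version, so treat it as a shape rather than a recipe; the recurring theme of the tips is to synchronize before you interact.

```c
Action()
{
    // Sketch only: snapshot names and coordinates are made up.
    // Tip: synchronize on a known screen image before interacting, so
    // the click is not replayed before the remote desktop has rendered.
    rdp_sync_on_image("StepDescription=Wait for logon screen",
                      "Snapshot=snapshot_1.inf",
                      LAST);

    rdp_mouse_click("StepDescription=Open application",
                    "Snapshot=snapshot_2.inf",
                    "MouseX=120", "MouseY=240",
                    "MouseButton=LEFT_BUTTON",
                    LAST);

    // Tip: keep realistic think time so replay does not outrun the
    // remote session.
    lr_think_time(5);

    return 0;
}
```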

Performance testing solutions for the New Style of IT: HP LoadRunner 11.52 is now available

Last week at the StarEast Testing Conference in Orlando, FL, we made some pretty big announcements. We released new versions of our flagship quality and testing software products. We are calling the whole initiative "The New Style of IT".

 

This new way of thinking about IT is impacting LoadRunner as well. Keep reading to find out the highlights of LoadRunner 11.52 and why I am as excited about the launch as a kid getting a new toy.

 

Start your mobile application testing with the discounted HP Performance Validation Suite

Last week I attended a class about mobile application testing and learned a great deal about what needs to be tested and how, specifically from a mobile application point of view. In a matter of minutes, we were able to find all sorts of bugs in the applications. Overall, it was a really fun and empowering experience.

 

Is testing mobile applications the same as testing web applications? It seems like it is, but it is so NOT!

 

Today HP is launching a time-limited, incentive-bundled mobile performance testing promotion to help you get started with this most important initiative.

 

Read the blog to learn more about mobile application testing and the discounted HP Performance Validation Suite.

 

 

Testing asynchronous business processes with LoadRunner

In the early days of the World Wide Web, web applications communicated using conversations that had a synchronous nature: the user interacts with an application that is presented in a web browser, the browser submits a request to the web server, the server sends a response, and the application in the browser is updated.

 

However, synchronous applications have a number of limitations. One limitation is the inability of the client to reflect events occurring at the server. This could be critical if, for example, the application in question displays stock prices or sports results and must present up-to-date information at all times. The answer to these limitations is asynchronous communication, whereby responses from the server arrive periodically or as a reaction to server-side events.

 

Many of today’s applications combine synchronous (user-initiated) and asynchronous (application-initiated) modes of communication between the client and the server. Usually the two occur in parallel (e.g. a user is navigating within the bank’s site, while at the same time stock updates keep pouring in), which makes it hard to accurately emulate such applications using traditional Vuser scripts. 

 

Continue reading to learn how to use VuGen to create a script for asynchronous situations.
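As a taste of what the full post walks through, here is a minimal sketch of a VuGen Web (HTTP/HTML) script that registers an asynchronous conversation alongside ordinary synchronous steps, using the async API available in recent LoadRunner versions. The URLs, conversation ID and polling interval are made-up placeholders.

```c
Action()
{
    // Register the asynchronous conversation (a made-up stock ticker
    // polled every 3 seconds) before the step that triggers it.
    web_reg_async_attributes("ID=StockTicker",
                             "Pattern=Poll",
                             "URL=http://example.com/ticker",
                             "PollIntervalMs=3000",
                             LAST);

    // Synchronous, user-initiated navigation proceeds as usual while
    // the polling conversation runs in parallel in the background.
    web_url("home", "URL=http://example.com/", LAST);

    lr_think_time(10);   // the user reads the page; polls keep arriving

    // Stop the asynchronous conversation when the emulated user leaves.
    web_stop_async("ID=StockTicker", LAST);

    return 0;
}
```

The registered conversation keeps polling for as long as the emulated user stays on the page, which is exactly the mixed synchronous/asynchronous behavior described above.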

Performance testing from a woman’s perspective

For all dear women: Happy International Women’s day!

 

Today is a special day for all women! It is International Women's Day, celebrated across the globe! It is YOUR day!

 


Guys, please take the time to acknowledge all the women who are part of your life today! This includes your wife, girlfriend, co-workers, sisters and friends. I am sure they will appreciate the acknowledgement.

 

To celebrate the day, I want you to "meet" a great performance engineer who inspires other women currently working in this area or planning to become performance engineers.

 

 

I would like to introduce Megan Shelton. She has a deep passion for performance testing and performance management, and has been working as a Performance Engineer for 10 years.  

Tips for browser emulation in network-level scripts

Different browsers can have a dramatic effect on a web application’s performance. In order to accurately test an application’s performance, it is important to emulate different browsers to understand how the application’s behavior and network utilization are affected. Continue reading to find out how different browsers can be emulated in network-level scripts, such as LoadRunner Web (HTTP/HTML).
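As a preview of one of those tips, a network-level script can override the User-Agent header so the server returns the content it would serve to a particular browser. A minimal sketch follows; the UA string shown is just an example, not a prescribed value.

```c
vuser_init()
{
    // Present the script to the server as Chrome; substitute the UA
    // string of whichever browser you want to emulate.
    web_add_auto_header("User-Agent",
        "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 "
        "(KHTML, like Gecko) Chrome/27.0.1453.116 Safari/537.36");

    return 0;
}
```

Overriding the header changes what the server sends back; the Browser Emulation run-time settings then govern how the Vuser downloads it (parallel connections, caching), and both halves matter for realistic network utilization.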

Download HP LoadRunner and enhance your experience with the tips below

I am sure we have some similarities as "techies"! One of those similarities may be a love of learning about new software. I love trying new solutions, discovering how they work, and learning how they can help.

 

I understand that trying a new solution can be tricky, so I decided to write a post to enhance your experience when trying HP LoadRunner. Keep reading to find out how to have the best load testing experience ever.

Questions and answers from our Introduction to Performance Testing webinar

The “introduction to performance testing” webinar was a big hit last week.  An attendee told us:

 

“This session is very useful for me to start my first new project”

This demonstrates that there are still many questions in the market about the performance testing process.

 

We received excellent questions and want to share them with the community. We not only want to share them, but also to get your input and continue the conversation. If you have experience to add to this blog, or have additional questions, please feel free to join in!

HuddleUp Performance Testing Tweetchat - Join a live community chat

Looking to talk about performance testing, performance engineering or the performance application lifecycle? Well, you are in luck. We are beginning a series of Tweetchats, and we want you to join us.

 

Keep reading to find out how you can join the conversation.

  • No registration is required to attend the Tweetchat sessions.
  • To participate, follow the hashtag #loadrunner.
  • Just mark your calendars for the following dates: the Wednesday of the 2nd and 4th weeks of the month, at 12 pm CST.
  • Log in at TweetChat and join the conversation! Bring your challenges, your tips and your ideas!

 

Performance Testing 101 – A webinar series focused on your needs!

 

I am sure you receive tons of emails every day inviting you to webinars. Working here at HP is no different, and sometimes it is hard to keep up with the "good ones".

 

I want to make your life easier and give you the chance to collaborate with me! My New Year's resolution is to bring you the best webinar information around performance. Cool, huh?

 

We are now providing a continuous series of webinars focused on performance testing, performance engineering and the performance lifecycle, all of which add value to your job.

 

My plan is to bring you the "HOT" topics, as well as great speakers and unique ideas from our community. And by the way, the HP Performance Testing community is the largest in the industry! I am proud to be part of it.

 

Learn more!

Results Positive, HP partner, now offering HP LoadRunner on Demand in the Cloud!

I am pleased to announce that Results Positive is now offering a performance testing solution on demand in the cloud, powered by HP LoadRunner. The new solution, LoadRunner on Demand, provides options to test from an hour to a day, as well as managed services, scripting and consulting services.

 

Read the blog to learn more about LoadRunner on Demand.

 

 

Can a plumber or a chef make a good performance engineer?

Can a plumber or a chef make a good performance engineer? Yes, testifies one of our most advanced performance engineering customers.

Autobiography of a Performance User Story

I am a performance requirement and this is my story. I just got built and accepted in the latest version of a Web-based SaaS (software as a service) application (my home!) that allows salespersons to search for businesses (about 14 million in number) and individuals (about 200 million in number) based on user-defined criteria, and then view the details of contacts from the search results. The application also allows subscribers to download the contact details for further follow-up.


I’m going to walk through my life in an agile environment—how I was conceived as an idea, grew up to become an acknowledged entity, and achieved my life’s purpose (getting nurtured in an application ever after). First a disclaimer – the steps described below do not exhaustively describe all the decisions taken around my life.



It all started about three months back. The previous version of the application was in production with about 30,000 North American subscribers. The agile team was looking to develop its newer version.


One of the strategic ideas that had been discussed in some detail was to upgrade the application's user interface to a modern Web 2.0 implementation, using more interactive and engaging on-screen process flows and notifications. The proposed changes were primarily driven by business conditions, market trends and customer feedback. Management had a vision of capturing a bigger share of the market. The expectation was to add 100,000 new subscribers within twelve months of release, all from North America. A big revenue opportunity! Because the changes were confined to the user interface, no one thought of the potential impact on application performance. I was nowhere in the picture yet!


Due to the potential revenue impact of the user interface upgrade, the idea moved high up the application roadmap for immediate consideration. The idea became a user story that moved from the application roadmap to the release backlog. Application owners, architects and other stakeholders started discussing the upgrade in more detail. During one such meeting, someone asked the P-question: what about performance? How will this change impact the performance of the application? It was agreed that the performance expectations of the user-interface changes should be clearly captured in the release backlog. That's when I was conceived. I was vaguely defined as: "As an application owner of the sales leads application, I want the application to scale and perform well to as many as 150,000 users so that new and existing subscribers are able to interact with the application with no perceived delays."


During sprint -1 (the discovery phase of the release planning sprint), I was picked up for further investigation and clearer definition. Different team members investigated what delivering me would entail. The application owner considered application usage growth for the next three years and came back with a revised peak number of users (300,000). The user interface designer built a prototype of the recommended user-interface changes, focusing on the most intensive transaction of the application: when a search criterion is changed, the number-of-contacts-available counter on the screen needs to be updated immediately. The architect tried to isolate possible bottlenecks in the network, database server, application server and Web server caused by the addition of more chatty Web components such as AJAX and JavaScript. The IT person looked at the current utilization of the hardware in the data center to identify any possible bottlenecks and came back with a recommendation to cater to the expected increase in usage. The lead performance tester identified the possible scenarios for performance testing the application. At the end of sprint -1, I was re-defined as: "As an application owner of the sales lead application, I want the application to scale and perform well to as many as 300,000 simultaneous users so that when subscribers change their search criteria, an updated count of leads available is refreshed within 2 seconds on the screen." I was defined with more specificity now. But was I realistic and achievable?


During sprint 0 (the design phase of the release planning sprint), I was picked up again to assess the impact I would have on the application design. The IT person realized that to support the revised number of simultaneous users, additional servers would need to be purchased. Since that process would take a long time, his recommendation was to scale the number of expected users back to 150,000. Given the short timeframe, the user interface designer decided to limit the Web 2.0 translation to the search area of the application and put the remaining functional stories in the product backlog. The architect made recommendations to modify the way some of the Web services were being invoked and to fine-tune some of the database queries. A detailed design diagram was presented to the team leads along with compliance guidelines. The lead performance tester focused on getting the staging area ready for me. I was re-shaped to: "As an application owner of the sales lead application, I want the application to scale and perform well to as many as 150,000 simultaneous users so that when subscribers change their search criteria, an updated count of leads available is refreshed within 2 seconds on the screen." I was now an INVESTed agile story, where INVEST stands for independent, negotiable, valuable, estimable, right-sized and testable.


During the agile sprint planning and execution phase, developers, QA testers and performance testers were all handed the requirements (including mine) for the sprint. While developers started making changes to the code for the search screen, QA testers got busy writing test cases, and performance testers finalized their testing scripts and scenarios. Builds were prepared every night, and incremental changes were tested as soon as new code was available for testing. Both QA testers and performance testers worked closely with the developers to ensure functionality and performance were not compromised during the sprint. Daily scrums provided the much-needed feedback to the team so that everyone knew what was working and what was not. A lot of time was spent on me to ensure my 2-second requirement did not slip to 3 seconds, as that would have a direct impact on customer satisfaction. I felt quite important, sometimes even more so than my cousin story, the search screen user interface upgrade! At the end of a couple of 4-week sprints, the application was completely revamped with Web 2.0 enhancements, with functionality and performance fully tested, ready to be released. I was ready!


Today, I will be deployed to the production environment. No major hiccups are expected, as during the last two weeks I was beta tested by some of our chosen customers on the staging environment. The customers were happy with the outcome, and so were the internal stakeholders. During these two weeks, I hardened myself and got ready to perform continuously and consistently. Even though my story is ending today, my elders have told me that I will always be a role model (baseline) for future performance stories to come. I will live forever in some shape or form!


Are We Done Yet?

When is a user story considered done in agile projects? Depending on whom in the project I ask, the response will be different. A developer might consider a story done when it has been unit tested and its defects have been addressed. A QA person might consider a story done when its functionality has been successfully tested against its acceptance criteria. An application owner or a stakeholder might consider a story done when the story has been architected, designed, coded, functionally tested, performance tested, integration tested, accepted by the end-user, beta tested, and successfully deployed.


Clearly, a standard is needed to properly define the term "done" in agile projects. The good news is that you can have your own definition of "done" for your agile projects. However, it is important that everyone on the team collaboratively agrees to this definition. The definition of done might vary by the adoption stage of agile methodologies in an organization (see figure below). During the early adoption days of agile methodologies, a team might agree that the definition of done is limited to Analysis, Design, Coding, and Functional and Regression Testing (the innermost circle). This means that the team is taking on a performance testing debt from each sprint and moving it to the hardening sprint. This is a common mistake, as most performance issues are design issues and are hard to fix at a later stage.




As the team becomes more comfortable and mature with agile methodologies, they expand the definition of done circle to first include Performance Testing and then include User Acceptance Testing – all within a sprint.


I have some tips for you to include performance testing in the definition of done:


  • Gather all performance-related requirements and address them during system architecture discussions and planning
  • Ensure that the team works closely with the end-users/stakeholders to define acceptance criteria for each performance story
  • Involve performance testers early in the project, even in the Planning and Infrastructure stages
  • Make performance testers part of the development (sprint) team
  • Ensure that performance testers work on test cases and test data preparation while developers are coding those user stories
  • Get performance testers to create stubs for external Web services that are being utilized (see the sketch below)
  • Deliver each relevant user story to performance testers as soon as it is signed off by the functional testers
  • Ensure that performance testers provide continuous feedback to developers, architects and system analysts
  • Share performance test assets across projects and versions
  • Schedule performance tests for off-hours to maximize the utilization of time within the sprint


It is important to remember that performance tests are code too, and should be planned just like coding the application, so that they become part of sprint planning and execution.
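On the stubbing tip above: in the HP toolchain this is typically done with a virtualized service, but even a hand-rolled stub unblocks performance testers. Here is a minimal sketch in C (POSIX sockets) of a stub that answers every request on a placeholder port with a canned JSON payload; error handling is omitted for brevity.

```c
#include <string.h>
#include <unistd.h>
#include <sys/socket.h>
#include <netinet/in.h>
#include <arpa/inet.h>

int main(void)
{
    /* Canned answer standing in for the real external Web service. */
    const char *reply =
        "HTTP/1.1 200 OK\r\n"
        "Content-Type: application/json\r\n"
        "Content-Length: 15\r\n"
        "\r\n"
        "{\"status\":\"ok\"}";

    int srv = socket(AF_INET, SOCK_STREAM, 0);
    struct sockaddr_in addr = {0};
    addr.sin_family      = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port        = htons(8080);   /* placeholder port */

    bind(srv, (struct sockaddr *)&addr, sizeof(addr));
    listen(srv, 16);

    for (;;) {
        int  cli = accept(srv, NULL, NULL);
        char buf[4096];
        read(cli, buf, sizeof(buf));       /* drain the request */
        write(cli, reply, strlen(reply));  /* send the canned answer */
        close(cli);
    }
}
```

Point the performance scripts at the stub's host and port until the real service is available, or approved for load.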


To me, including performance testing in the definition of done is a very important step in confidently delivering a successful application to its end-users. Only the paranoid survive – don’t carry a performance debt for your application!

Performance Management for Agile Projects

Performance management is an integral part of every software development project. When I think of agile projects, I think about collaboration, time to market, flexibility, etc. But to me, the most important aspect of agile processes is its promise of delivering a "potentially shippable product/application increment". What this promise means for application owners and stakeholders is that, if desired, the work done in an iteration (sprint) has gone through enough checks and balances (including meeting performance objectives) that the application can be deployed or shipped. Of course, the decision to deploy or ship the application is also driven by many other factors, such as the incremental value added to the application in one sprint, the effect of an update on the company's operations, and the effect of frequent updates on customers or end-users of the application.


Often application owners fail to get an objective assessment of application performance in the first few sprints, or until the hardening sprint just before the application is ready to be deployed or shipped. That is an "Agile Waterfall" approach, where performance and load testing are kept aside until the end. What if the application's architecture or design needs to change to meet the performance guidelines? There is also a notion that performance instrumentation, analysis and improvement are highly specialized tasks, which results in resources not being available at the start of a project. This happens when the business and stakeholders are not driving the service level measurements (SLMs) for the application.


Application owners and stakeholders should be interested in the performance aspects of the application right from the start. Performance should not be an afterthought. The application’s backlog in agile contains not only the functional requirements of the application but also the performance expectations from the application. For example, “As a user, I want the application site to be available 99.999% of the time I try to access it so that I don’t get frustrated and find another application site to use”.  Performance is an inherent expectation behind every user story. Another example may be, “As an application owner, I want the application to support as many as 100,000 users at a time without degrading the performance of the application so that I can make the application available globally to all employees of my company”. These stories are setting the SLMs or business-driven requirements for the application, which in turn will define the acceptance criteria and drive the test scripts.
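To make that concrete, here is a minimal sketch of how such a business-driven SLM can flow into a LoadRunner script as an acceptance check. The transaction name, URL and 2-second threshold are hypothetical stand-ins for whatever the story specifies:

```c
Action()
{
    double elapsed;

    lr_start_transaction("search_refresh");
    web_url("search", "URL=http://example.com/search?q=acme", LAST);

    // lr_get_transaction_duration must be called while the
    // transaction is still open.
    elapsed = lr_get_transaction_duration("search_refresh");

    if (elapsed > 2.0) {   /* the SLM taken from the user story */
        lr_error_message("SLM breached: search refresh took %.2f s", elapsed);
        lr_end_transaction("search_refresh", LR_FAIL);
    } else {
        lr_end_transaction("search_refresh", LR_PASS);
    }

    return 0;
}
```

In practice you would also define the same threshold as an SLA at the scenario level in the Controller or Performance Center, so it is enforced across the whole run and not just inside one script.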


It is important that, if a sprint backlog has performance-related user stories (and I'll bet nearly all of them do), its team has IT infrastructure and performance testers as contributors ("pigs" in Scrum terminology). During release planning (preferably) or sprint planning sessions, these contributors must spend time analyzing what testing must be performed to ensure that these user stories are considered "done" by the end of the sprint. Whether they need to procure additional hardware, modify the IT infrastructure for load testing, or work on the automation of performance testing, these contributors are active members of the sprint team, participating in daily scrums. They must keep constant pressure on developers and functional testers to deliver the functionality for performance testing. After all, the success of the sprint is measured by whether or not every member delivered, on time, a final product that fully met the acceptance criteria.




To me, performance testing is an integral part of the agile process, and it can save an organization real money. The longer you wait to conduct performance tests, the more expensive it becomes to incorporate changes. So don't just test early and often: test functionality and performance in the same sprint!


Video: Understanding Concurrency and Concurrent Users

So, just what is the proper definition of concurrency? This can be a hot topic when it comes to arguing about the validity and accuracy of stress testing analysis. Of course, there is no ONE simple answer here, so it's up to you to establish a common definition on your teams or for the Performance CoE. These videos will give you some tips on what concurrency means for performance testing. You will also learn about a common set of definitions for concurrent users: concurrent, active and named users.
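One handy back-of-the-envelope model to keep in mind while you watch (my shorthand, not a definition from the videos) is Little's Law: concurrent users = arrival rate × average session duration. For example, if 10 new user sessions start per second and each session lasts about 30 seconds of response time plus think time, the system holds roughly 10 × 30 = 300 concurrent users at steady state, even though the named-user population may be far larger and the number of truly active users (those with a request in flight at a given instant) far smaller.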


LoadRunner Concurrency Part 1:



(if your browser doesn't show the video in-line, just click here)


 


 


Of course, I thought of several other items for this...and so there is also LoadRunner Concurrency Part 2:



(if your browser doesn't show the video in-line, just click here)

Video: Running LoadRunner Virtualized

If you've ever needed to understand how LoadRunner should be implemented in a virtual environment, you should enjoy this video walkthrough explaining the best practices to do just that. Make a specific note of how your Iteration Pacing and Think Time settings really affect the health, scalability and accuracy of your load test.



(if your browser doesn't show the video in-line, just click here)
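As a companion to the video, here is a minimal sketch of the script side of that advice; the URLs and think-time value are illustrative, and iteration pacing itself is configured in the Run-Time Settings rather than in code:

```c
Action()
{
    web_url("home", "URL=http://example.com/", LAST);

    // Emulate the user reading the page. Without realistic think time
    // (and sensible iteration pacing), each Vuser hammers the server
    // in a tight loop and inflates load generator CPU, a problem that
    // bites even harder when the generators are virtualized and
    // sharing host resources.
    lr_think_time(8);

    web_url("catalog", "URL=http://example.com/catalog", LAST);

    return 0;
}
```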

Video: Ten Virtual Users Just Aren't Enough

The Rolling Stones once sang, "You can't always get what you want", and that should all make sense when you watch this video about stress testing.




(if your browser doesn't show the video in-line, just click here)

About the Author(s)
  • I have been working in the computer software industry since 1989. I started out in customer support, then moved to software testing, where I was a very early adopter of automation: first functional test automation and then performance test automation. I worked in professional services for 8 years before returning to my roots in customer support, where I have been a Technical Account Manager for HP's Premier Support department for the past 4 years. I have been using HP LoadRunner since 1998 and HP Performance Center since 2004. I also have a strong technical understanding of HP Application Lifecycle Management (Quality Center) and HP SiteScope.
  • Malcolm is a functional architect, focusing on best practices and methodologies across the software development lifecycle.
  • Michael Deady is a Pr. Consultant & Solution Architect for HP Professional Service and HP's ALM Evangelist for IT Experts Community. He specializes in software development, testing, and security. He also loves science fiction movies and anything to do with Texas.
  • Mukulika is Product Manager for HP Performance Center, a core part of the HP Software Performance Validation Suite, addressing the enterprise performance testing COE market. She has 14 years of experience in IT consulting, software development, architecture definition and SaaS. She is responsible for driving future strategy, roadmap, optimal solution usage and best practices, and serves as primary liaison for customers and the worldwide field community.
  • HP IT Distinguished Technologist. Tooling HP's R&D and IT for product development processes and tools.
  • WW Sr. Product Marketing Manager for HP ITPS, VP of Apps & HP LoadRunner
The opinions expressed above are the personal opinions of the authors, not of HP. By using this site, you accept the Terms of Use and Rules of Participation