Are We Done Yet?

When is a user story considered done in an agile project? The answer depends on whom in the project I ask. A developer might consider a story done when it has been unit tested and its defects have been addressed. A QA engineer might consider a story done when its functionality has passed testing against the acceptance criteria. An application owner or stakeholder might consider a story done only when it has been architected, designed, coded, functionally tested, performance tested, integration tested, accepted by the end user, beta tested, and successfully deployed.

Clearly, a standard is needed to properly define the term “done” in agile projects. The good news is that you can have your own definition of “done” for your agile projects; what matters is that everyone on the team collaboratively agrees to it. The definition of done may vary with an organization’s stage of agile adoption (see figure below). In the early days of adopting agile methodologies, a team might agree that the definition of done covers only Analysis, Design, Coding, and Functional and Regression Testing (the innermost circle). This means the team is taking on performance testing debt in each sprint and pushing it to the hardening sprint. That is a common mistake: most performance issues are design issues and are hard to fix at a later stage.

As the team becomes more comfortable and mature with agile methodologies, it expands the definition-of-done circle to include first Performance Testing and then User Acceptance Testing – all within a sprint.

Here are some tips for including performance testing in the definition of done:

· Gather all performance-related requirements and address them during system architecture discussions and planning.

· Ensure that the team works closely with the end users/stakeholders to define acceptance criteria for each performance story.

· Involve performance testers early in the project, even in the Planning and Infrastructure stages.

· Make performance testers part of the development (sprint) team.

· Have performance testers prepare test cases and test data while developers are coding those user stories.

· Get performance testers to create stubs for the external Web services being utilized.

· Deliver each relevant user story to the performance testers as soon as the functional testers sign it off.

· Ensure that performance testers provide continuous feedback to developers, architects, and system analysts.

· Share performance test assets across projects and versions.

· Schedule performance tests for off-hours to maximize the utilization of time within the sprint.
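To illustrate the stubbing tip above, here is a minimal sketch of a stub for an external Web service, so performance tests do not depend on a third party's availability or rate limits. The service, endpoint, response values, and simulated latency are all illustrative assumptions, not details from any particular project; it uses only Python's standard library.

```python
import json
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical stub for an external pricing service the application calls.
# All names and values below are illustrative assumptions.
CANNED_RESPONSE = {"price": 42.0, "currency": "USD"}
SIMULATED_LATENCY_SECONDS = 0.05  # assumption: rough latency of the real service

class StubHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        time.sleep(SIMULATED_LATENCY_SECONDS)  # mimic network/processing delay
        body = json.dumps(CANNED_RESPONSE).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        # silence per-request logging so load-test output stays readable
        pass

def run_stub(port=8080):
    """Serve the canned response until the process is stopped."""
    HTTPServer(("localhost", port), StubHandler).serve_forever()
```

Pointing the application under test at the stub's URL keeps response times predictable, which makes performance measurements repeatable from sprint to sprint.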

It is important to remember that performance tests are code too, and should be planned just like the application code, so that they become part of sprint planning and execution.

To me, including performance testing in the definition of done is a crucial step toward confidently delivering a successful application to its end users. Only the paranoid survive – don’t carry performance debt for your application!

The opinions expressed above are the personal opinions of the authors, not of HP. By using this site, you accept the Terms of Use and Rules of Participation.