Defining, Measuring, and Managing Success in App Modernization Projects

This week, I am enjoying another set of overseas customer discussions in the UK for the HP ALM SIG (More Info).  These presentations and discussion groups are being held in London, Manchester, and Edinburgh. As I prepared my talking points for these sessions, I recalled the customer feedback from last week's blog posts on the business justification for application modernization efforts.

 

In my second post of last week, we started to introduce the topic of success criteria for application modernization projects.  To quote a section from last week's post: "Quality no longer equates to back-end testing; instead, it includes managing the complete application lifecycle through a quality lens."

 

I wanted to double-click on this "quality lens" and see if we could start a discussion on the top success criteria we see our customers focusing on for application modernization projects.  For the purposes of this post, let's use the popular phrase "Key Performance Indicators" instead of the vaguer "success criteria" to help frame our conversation.

 

As any software project team knows, Key Performance Indicators (or KPIs) are critical elements to define, measure, and manage to ensure accurate (and cost-effective) go-live decisions.  In my experience, KPIs are usually defined in the software project plan and then tracked throughout the lifecycle.

 

A quick sample of the types of KPIs I commonly see:

 

Progress Metrics for Requirements Definition and Management

-       % Requirements Documentation Completed

-       % Requirements Validated

-       % Requirements Accepted for Development

Progress Metrics for Agile Development Sprints (Multiple)

-       % Code Complete

-       % Code Unit Tested

-       % Code Deployed

Progress Metrics for Quality Assurance Testing

-       % of Tests Executed

-       Number of Tests Passed, Number of Tests Failed

-       % of Tests Failed vs. Total

-       Number of Defects

-       Number of High Priority Defects

Performance Metrics

-       Current transactions under load vs. goal

-       Response rates for target load

-       Resource utilization under load vs. expected

-       Root cause analysis metrics

Security Metrics

-       Number of Security Defects

-       Number of High Priority Security Defects

 

…and the list goes on. 
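To make a few of these concrete, here is a minimal sketch in Python of how some of the KPIs above roll up from raw counts. The numbers and field names here are entirely hypothetical; in practice they would come from your requirements, test, and defect repositories.

# Minimal sketch: deriving a few of the KPIs listed above from raw counts.
# All values below are hypothetical examples, not data from any real project.

def percent(part, whole):
    """Return a percentage, guarding against division by zero."""
    return round(100.0 * part / whole, 1) if whole else 0.0

# Hypothetical project snapshot
requirements_total = 120
requirements_validated = 84

tests_planned = 450
tests_executed = 390
tests_passed = 352

defects_open = 61
defects_high_priority = 9

kpis = {
    "% Requirements Validated": percent(requirements_validated, requirements_total),
    "% Tests Executed": percent(tests_executed, tests_planned),
    "% Tests Failed vs. Total": percent(tests_executed - tests_passed, tests_executed),
    "Open Defects": defects_open,
    "High Priority Defects": defects_high_priority,
}

for name, value in kpis.items():
    print(f"{name}: {value}")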

 

In my opinion, KPIs (such as the short sample list above) represent the "quality lens" through which to manage the lifecycle of complex application modernization projects.  Since application modernization projects almost always touch multiple systems and multiple development efforts, and almost universally impact mission-critical business processes, these KPIs need not only to be defined, but also to be measured accurately (and continually).

 

In ALM 11's Project Planning and Tracking module, HP takes a unique approach to providing technology to define, measure, and manage KPIs associated with a lifecycle approach to quality.  Like any project tracking tool, HP ALM 11 gives users the means to define tasks, cycles, and milestones through an easy-to-use, drag-and-drop interface.  From a tool perspective, this is the easy part.

 

What makes ALM 11 different is that the project planning and tracking system is fully integrated with all of the HP ALM modules, including requirements management, development management, quality management, performance management, and security management.  Actually, the word "integrated" is not strong enough: ALM 11 builds Project Planning and Tracking into the same system, with the same repository, as all of those modules.

 

This distinction matters because decision makers can digitize KPIs into the HP ALM system at the start, and the HP ALM system does the work of communicating how well the project is actually doing.  HP ALM tracks the activities, progress, and key application metrics associated with tasks performed against the application, so the related KPIs stay up to date in real time.  In other words, when a developer, tester, or business analyst performs their work in the HP ALM system, the KPIs are updated accordingly.
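To illustrate the contrast with manually re-keyed dashboards, here is a rough, generic sketch in Python. This is my own illustration of the underlying idea, not the HP ALM API: when planning and execution share one repository, KPIs can be derived from the work records themselves every time the dashboard is read, rather than copied into a separate spreadsheet after a status call.

# Generic illustration (not the HP ALM API): KPIs derived on demand from a
# single shared repository of work records, so they reflect the latest work.

from dataclasses import dataclass, field

@dataclass
class Repository:
    tests: list = field(default_factory=list)      # each: {"status": "passed" | "failed" | "not run"}
    defects: list = field(default_factory=list)    # each: {"priority": "high" | "medium" | "low"}

    def kpis(self):
        executed = [t for t in self.tests if t["status"] != "not run"]
        return {
            "% Tests Executed": 100 * len(executed) / len(self.tests) if self.tests else 0,
            "Open Defects": len(self.defects),
            "High Priority Defects": sum(1 for d in self.defects if d["priority"] == "high"),
        }

repo = Repository()
repo.tests = [{"status": "passed"}, {"status": "failed"}, {"status": "not run"}]
repo.defects = [{"priority": "high"}]

# A tester logging a run or a developer raising a defect updates the same
# records the dashboard reads, so the next call reflects reality immediately.
print(repo.kpis())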

 

A close friend and fellow IT professional best illustrates why manually maintained KPIs represent a serious quality issue.

 

To roughly quote him:

 

"Project status reporting at my company is, at best, a finger-in-the-air exercise and the main reason our deadlines are missed.  Visibility into project progress is gained through a 60-minute conference call held every Monday.  The call includes 30 people talking over each other, much context is lost in translation, and our project manager often fills in gaps with wild guesses.  These calls result in KPI updates to a manual dashboard that no one can possibly trust."

 

Sound familiar?

 

Let's take this to yet another level.  If you are using an (inaccurate?) manual dashboard like the one in my friend's rough quote above, what happens when you see that too many defects are currently open, or that the performance metrics are not hitting the mark, or that the requirements validation effort is not complete?  How do you drill down from there?

 

Well, in most cases, the next steps involve email messages, voice mails, instant messages, and yet more out-of-context discussions.  The cycle repeats...

 

I want to close this blog with four questions for our readers:

 

Do you see KPI definition, measurement, and monitoring as key to quality in application modernization projects?

 

Can you tell us how your group defines and measures KPIs? 

 

If you are using HP ALM 11, can you tell us what KPIs you use and how it is working for you?

 

If you use manual KPI measurement, do you find it is an accurate representation of project health?

 

Please send me your feedback directly at Matt.Morgan@hp.com.

 

 
