Tearing down the wall between Dev and Ops to create quality DevOps

Today’s enterprises need to become more agile in order to compete effectively. To keep up, IT organizations not only need to deliver services faster; at the same time, they need to make end-to-end quality their focus. For IT organizations ready to take the journey to what we call DevOps, COBIT 5 can serve as an instructional guide. COBIT 5 defines an expansive role for quality, describing it as effectively the outcomes generator for the enterprise. Seen this way, the wall between development and operations is an artificial one.

Quality clearly needs to be designed into the application, but it also needs to be measured throughout the application lifecycle. For this reason, COBIT 5 defines this process as ensuring the consistent delivery of solutions and services that meet enterprise requirements and satisfy stakeholder needs. A key idea here is that projects and programs deliver services (enterprise capabilities), and these capabilities need to be continuously improved and made more efficient over their lifetime. How many of you in quality think this way?

Quality needs to be measured

It was W. Edwards Deming who popularized the plan-do-check-act cycle. But how many of you see quality as having an end-to-end role in this process? Deming did! COBIT 5 suggests changing our thinking on IT quality by having IT organizations measure themselves against three process improvement goals. Let’s explore each goal along with its recommended metrics to get a better idea of how we should be running quality in a DevOps era.

1. Stakeholders are satisfied with the quality of solutions and services. Notice that the first quality improvement goal is not about testing. It is about the quality of solutions and services: the outputs of tested programs and projects. Three metrics are used to measure success against this goal:

  • The average stakeholder satisfaction with solutions and services
  • The percent of stakeholders satisfied with IT quality
  • The number of services with a formal quality management plan

Do your solutions and services have a quality management plan? The metrics here start with a survey of customer satisfaction but end by asking whether there is a quality management plan. This means we should treat services like a manufacturing line and ask, over time, how well we are improving them and what our plan is for doing so. How many IT organizations really think this way? My guess is that the number is small.

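To make these metrics concrete, here is a minimal sketch of how they could be computed from survey and service records. The data shapes, field names, and the satisfaction threshold are all assumptions for illustration, not anything COBIT 5 prescribes:

```python
# Hypothetical survey and service records; field names are illustrative.
surveys = [
    {"stakeholder": "finance", "score": 4},   # satisfaction on a 1-5 scale
    {"stakeholder": "sales", "score": 5},
    {"stakeholder": "support", "score": 2},
]
services = [
    {"name": "order-entry", "has_quality_plan": True},
    {"name": "billing", "has_quality_plan": False},
]

SATISFIED_THRESHOLD = 4  # assumed cut-off for counting a stakeholder as "satisfied"

# Metric 1: average stakeholder satisfaction with solutions and services
avg_satisfaction = sum(s["score"] for s in surveys) / len(surveys)

# Metric 2: percent of stakeholders satisfied with IT quality
pct_satisfied = 100 * sum(s["score"] >= SATISFIED_THRESHOLD for s in surveys) / len(surveys)

# Metric 3: number of services with a formal quality management plan
services_with_plan = sum(s["has_quality_plan"] for s in services)

print(f"Average stakeholder satisfaction: {avg_satisfaction:.2f}")
print(f"Percent of stakeholders satisfied: {pct_satisfied:.0f}%")
print(f"Services with a quality management plan: {services_with_plan}")
```

The point of the sketch is simply that all three metrics are cheap to compute once the underlying records exist; the hard organizational work is collecting the surveys and maintaining the quality plans.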
2. Project and service delivery results are predictable. Predictable means the end-to-end quality processes occur as you expect them to. Three metrics are recommended to measure this area:

  • The percent of projects reviewed that meet target quality goals and objectives
  • The percent of solutions and services delivered with formal certification
  • The number of defects uncovered prior to production

How often does this happen at your organization? Clearly, projects and programs need to have target quality goals as requirements. Meanwhile, quality certification is a formal acknowledgment that we comply with standards. And lastly, process quality is about discovering defects prior to production. This is essential for DevOps to work.

3. Quality requirements are implemented in all processes. This says that quality is not just designed in but managed over the service or process lifetime. Three metrics are recommended to manage this process goal:

  • The number of processes with defined quality requirements
  • The number of processes with a formal quality assessment report
  • The number of SLAs that include quality acceptance criteria

A quality requirement here means that, during service design, a nonfunctional requirement must specify the quality-of-service level for the business processes a service supports. This is clever because most organizations define service levels only after a service is in production. The concept is amplified by the notion of quality acceptance criteria in SLAs. Lastly, there is the idea that a formal quality assessment report should be produced on an ongoing basis. This means we need to look both to the past and to the future in the delivery of business processes and services.

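As a sketch of what quality acceptance criteria embedded in an SLA might look like when checked continuously against measured service levels, consider the following. The SLA structure and every field name are assumptions made up for illustration:

```python
# Hypothetical SLA carrying quality acceptance criteria, evaluated
# against measurements taken while the service is running.

sla = {
    "service": "order-entry",
    "acceptance_criteria": {
        "availability_pct": 99.9,   # minimum availability
        "p95_response_ms": 500,     # maximum 95th-percentile latency
        "max_open_defects": 0,      # no known unresolved defects
    },
}

measured = {"availability_pct": 99.95, "p95_response_ms": 420, "open_defects": 0}

def meets_acceptance_criteria(sla, measured):
    """Check measured service levels against the SLA's quality criteria."""
    c = sla["acceptance_criteria"]
    return (
        measured["availability_pct"] >= c["availability_pct"]
        and measured["p95_response_ms"] <= c["p95_response_ms"]
        and measured["open_defects"] <= c["max_open_defects"]
    )

print(meets_acceptance_criteria(sla, measured))
```

Running such a check on a schedule, rather than once at go-live, is what turns a service-design artifact into the ongoing quality assessment the third goal calls for.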
So where should you start?

As always, my suggestion is to start where the most immediate value can be delivered. I would start by demonstrating acceptable quality and successful testing. What do you think? I would love to hear from you; feel free to reach out in the comments section below.

I also encourage you to learn more about HP Application Lifecycle Management here.

Related links:

Solution page:  HP Application Lifecycle Management

Twitter: @MylesSuer

About the Author
Mr. Suer is a senior manager for IT Performance Management. Prior to this role, Mr. Suer headed IT Performance Management Analytics Product ...