IT Service Management Blog
Follow information regarding IT Service Management via this blog.

Is Your CMS "On Fire"?

How does one measure the "Quality" of something?  What does CMS "Quality" mean?   "High-Quality" is thrown around without much substantiation, especially in the world of software.


My friend Dennis says that for a CMS to work, it must be actionable.  Of course we all agree.  But how many of us are measuring (or even trying to measure) the actionability of the CMS?  What are the metrics for "actionability," and for the other, lower-level functions that support it?  How is it possible to measure fuzzy, subjective, inexact things like "data quality"?  At what point does measuring the CMS become ROI analysis?  I'm full of questions today.


Let's start pedantically for fun:


On Fire: adj. 1. Positive connotation: A continual period of producing exceptional work.  On a winning or lucky streak.  "Three goals in one game, he's on fire!".  The good kind of on fire.


2. Negative connotation: Exceptionally behind schedule or fraught with so many problems as to seriously hinder, halt, or even reverse forward progress.  "Our waiter is so in the weeds he's on fire."  Aka "mega-backlogged" or "dead in the water".      The bad kind of on fire.


3. Aflame, as in, seriously hot, or producing a glow or light.  Can apply to either of the prior definitions.


Fighting Fires:  Helping someone who is on fire in the bad way.  Commonly for someone important.    It is possible to catch fire  from fighting too many fires at once.  So much for my dictionary-writing skills.


For whatever acronym is commercially and culturally significant to you, there's a way to say you're On Fire - in both the good and the bad way.


Is your CMS on fire?  How would you know?  What metrics would one look at?  Is there such a thing as a CMS "thermometer"?    Let's call it a CMS-o-Meter:





During implementation, it's easier to tell if your CMS project is on fire.  Assuming we defined clear goals and have reasonable success criteria, we can look to the early deliverables and status reports, like any other project, to determine how on fire we are - one way or the other.


But operationally, once you get the CMS or part of it built, how does one measure its temperature?


What if your pile of CIs were as important as, say, a nuclear pile?  You pretty much couldn't go without a thermometer - kind of important to avoid catching fire.  Big, hot fire.  The kind that burns you for a long time.  How important is the CMS to your IT?  Got anything valuable in there?  Research like this and personal experience show that it is just as easy to blow up your CMDB project.


Two tried-and-true ways to find and measure quality for almost anything are: 1) what's important to the consumer, and 2) what's important to whoever is responsible for the maintenance.  This is true for a car or a Service Desk or a CMDB or a CMS.


Research from Gartner suggests that monitoring data quality is not widespread; the decision to monitor data quality is either an afterthought or chronically short on resources, given the cost not only of doing the monitoring but of learning how.


Use-case- and consumer-based metrics could include qualitative vectors like the timeliness and accuracy of the consumed information.  These are often difficult and costly to measure, but they're the best indicators.  I believe you should invest here if you can.  Talk to the users.  Measure the value chain as far up as you can, up to and including the business.  Are your change control and closed-loop incident/problem management processes working better?  Are your MTBF and MTTR improving?  Did the business lose less money due to critical availability downtime?
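As one illustrative sketch of that kind of consumer-facing measurement (the incident records, timestamps, and field layout below are entirely hypothetical - pull the real ones from your incident management tool), MTBF and MTTR can be derived from a simple log of failure and restore times:

```python
from datetime import datetime

# Hypothetical incident records for one service: (failure time, restore time).
incidents = [
    (datetime(2010, 3, 1, 9, 0), datetime(2010, 3, 1, 11, 30)),
    (datetime(2010, 3, 10, 14, 0), datetime(2010, 3, 10, 15, 0)),
    (datetime(2010, 3, 20, 8, 0), datetime(2010, 3, 20, 8, 45)),
]

# MTTR: average time from failure to restoration, in hours.
mttr_hours = sum(
    (restored - failed).total_seconds() for failed, restored in incidents
) / len(incidents) / 3600

# MTBF: average uptime between a restoration and the next failure, in hours.
gaps = [
    (incidents[i + 1][0] - incidents[i][1]).total_seconds()
    for i in range(len(incidents) - 1)
]
mtbf_hours = sum(gaps) / len(gaps) / 3600

print(f"MTTR: {mttr_hours:.2f} h, MTBF: {mtbf_hours:.1f} h")
# → MTTR: 1.42 h, MTBF: 225.8 h
```

Trend these month over month and you have one concrete answer to "is the CMS making incident handling better?"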


Process-based measurements could be important too, such as the performance of the CMS itself: what is the latency when you open an RFC, or when the Service Desk creates an incident (both processes can consume or provide data to/from the CMS)?
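A minimal sketch of that kind of process-based measurement - timing a CMS lookup the way an RFC form or Service Desk tool would experience it.  The `fetch_ci` function here is a hypothetical stand-in for your real CMS call; the wrapper is the point:

```python
import time

def fetch_ci(ci_id):
    """Hypothetical stand-in for a real CMS lookup (e.g., behind an RFC form)."""
    time.sleep(0.05)  # simulate backend work
    return {"id": ci_id, "type": "server"}

def timed_call(fn, *args):
    """Run fn and return (result, elapsed seconds) so latency can be trended."""
    start = time.perf_counter()
    result = fn(*args)
    return result, time.perf_counter() - start

ci, latency = timed_call(fetch_ci, "ci-1001")
print(f"Fetched {ci['id']} in {latency * 1000:.0f} ms")
```

Log those elapsed times somewhere durable and you can watch the CMS's "temperature" drift long before users complain.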


Administration-based metrics are usually more foundational and architectural:  Does the system work?  Is it secure?  Is working with the software more like dancing or more like wrestling?  Do you get good support from the vendor, and just as importantly, is it easy to work with support?  Is R&D responsive about patching major problems?  Is the vendor forthcoming with its road map?


Stratifying the measurements this way will help. 


The takeaway here is not a comprehensive list; it's that you should be concerned about, and invest in, quality measurement of your CMS and its data as much as in the data itself.


Think about making the CMS actionable.


Build yourself the right thermometer for your CMS.


Calibrate your CMS-o-Meter to make sure it's reporting accurately.


Then monitor these metrics.  Operationally, in Production, like you mean it.  Treat it as you would any other production application, according to its priority in your organization.


When you're on fire, what do you do?  Let us know with a reply.  We'd always like to hear if you found this post useful, offensive, or just amusing for a few minutes.  Thanks.









Matt | 03-26-2010 10:20 PM


I love the titles of your articles, but the few I looked at quickly went from a title on a topic I wanted to read to an article where, frankly, I don't know what you are trying to tell me.

You make some good points here and there, but I really would love to hear more from you on the topic of your titles.



Jody Roberts | ‎04-15-2010 08:15 PM

First Matt, THANKS for the feedback!  We're in need of that.

Second, what am I trying to say?  Well, can you answer the title question?  Do you know the quality "level" of your CMS and its data?  If not, are you really OK with that?  Can you guarantee the quality of the data delivered through the CMS to the consumer?  I'm trying to make people think here, not hand out our IP.

The CMS-O-Meter is funny, but the plumbing behind it is serious business.  If you can't measure the quality of what your CMS is delivering, you're in trouble.

And we can help you avoid or get out of that trouble.

That's what I'm trying to say.

About the Author
Jody Roberts is a researcher, author, and customer advocate in the Product Foundation Services (PFS) group in HP Software. Jody has worked ...

The opinions expressed above are the personal opinions of the authors, not of HP. By using this site, you accept the Terms of Use and Rules of Participation.