Does IT matter? Alan Turing vs. Nicholas Carr

By Charlie Betz
 
Charlie Betz is research director for IT portfolio management at Enterprise Management Associates (EMA) and author of the white paper, “Business Intelligence for the Business of IT.”

 

We’re coming up on some significant anniversaries in the history of IT. This year is the 100th anniversary of Alan Turing’s birth. Turing, you may remember, is often considered the father of modern computer science, and his Turing machine one of the first conceptual models of a general-purpose computer. He worked as a codebreaker in British intelligence during World War II (among other things, breaking the infamous Enigma codes), and his life has been the subject of books, films and plays.
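(Turing’s abstract machine is simple enough to sketch in a few lines of code. The simulator and transition table below are purely illustrative, not anything from Turing’s paper: a tape, a read/write head, and a table mapping each (state, symbol) pair to a new symbol, a head movement, and a next state.)

```python
def run_turing_machine(tape, table, state="start", blank="_"):
    """Run a transition table over a tape until the machine halts."""
    tape = list(tape)
    head = 0
    while state != "halt":
        # Read the symbol under the head (blank if we run off the tape).
        symbol = tape[head] if head < len(tape) else blank
        # Look up what to write, which way to move, and the next state.
        new_symbol, move, state = table[(state, symbol)]
        if head < len(tape):
            tape[head] = new_symbol
        else:
            tape.append(new_symbol)
        head += 1 if move == "R" else -1
    return "".join(tape).rstrip(blank)

# A one-state transition table that inverts every bit, then halts at the blank.
INVERT = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine("10110", INVERT))  # -> 01001
```

The point of the model is that the table, not the hardware, defines the computation; swap in a different table and the same machine does a different job. That is the essence of the general-purpose computer.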

 

And this coming May it will be the 10th anniversary of Nicholas Carr’s provocative Harvard Business Review article, “IT Doesn’t Matter.” (Later, when Carr turned his article into a book, he softened the title to “Does IT Matter?” but you get the idea.)

 

Aside from giving us some fun excuses to visit Wikipedia, what does any of this have to do with the challenges CIOs are facing today with cloud computing and reports (in some quarters) of their inevitable demise? A lot, as it turns out.

 

IT is a commodity
I recently wrote a blog post about whether it’s wise to view IT simply as a sourcing issue. That supply-centric approach is the logical endpoint of seeing IT as a commodity, and I think it carries considerable dangers. For one thing, you can end up outsourcing everything – and there go your crown jewels, which are basically your capacity for innovation, because so much innovation revolves around IT.

 

But we must face the fact that parts of IT are commodities. And Turing recognized this. Sixty years ago he basically said that IT continually commoditizes itself: “As soon as any technique becomes at all stereotyped it is possible to devise a set of electronic instruction tables that will enable the computer to do it for itself.” You could say – perhaps with just a bit of overstatement – that Turing was the first to say IT doesn’t matter, beating Nicholas Carr to the punch by 50 years.

 

I’ve always thought this statement by Turing was a great quote. IT has been in this perpetual cycle of self-commoditization ever since it started. And of course, something is commoditized precisely at the point when it becomes a sourcing problem.

 

But IT doesn’t stay a commodity
With IT, however, there’s always the next thing. To say that IT is only a sourcing problem is to say there is no more innovation: no more new methods being tried out that can in turn be commoditized. And we know this is false. It’s the continual definition of new problems that supplies the necessary input into IT’s evolutionary process.

 

This is where I think subscribers to the Nicholas Carr school of thought miss the point: they assume that at some point the process stops. But the process never stops. People are creative, competitive and entrepreneurial, and they will keep coming up with new ways to compete using computers. They will continue to demand that the computer do new things and be stretched in new ways; that drives innovation, which leads (again) to eventual commoditization.

 

What should YOU do about this, and how can YOU derive advantage from these admittedly philosophical musings? I think the important thing is this: Recognize that IT is in a constant cycle of innovation and commoditization, and think about where the cycle is in relation to your business and to the critical initiatives you have underway. Just as you don’t want to outsource your engine for innovation, you also don’t want to lose sight of the opportunity to be ruthlessly efficient. You don’t want to be the CIO who realizes you’ve just spent a million dollars having developers build something you could have bought on the market for $50,000, all because you didn’t recognize that what you had was a commodity problem.

 

Related links:
• Blog: Is today’s CIO really just another supply chain manager?
• Blog: Does your application portfolio belong on an episode of “Hoarders”?
• Blog: 2 lessons from business school that can transform your organization
• White paper: Business Intelligence for the Business of IT
• Charlie Betz on Twitter: @CharlesTBetz

 

For more insights on the future of IT and how you can optimize IT performance to drive business results, subscribe to the Discover Performance ezine.

Labels: Future of IT
Comments
cebess | ‎09-14-2012 08:24 AM

Although the components of IT may be viewed as a commodity, what information gathering, analysis and automation can perform definitely is not a commodity. If it is a commodity, the organization is only scratching the surface of the possibilities. It is time to be bold about the use of IT.

 

 

Local IT Support | ‎09-19-2012 09:23 AM

This posting is a nice article. Is IT a commodity? I don’t think so. If it were, it couldn’t have gotten this popular.
