Nicholas Carr loves to write about "utility computing". I am not sure he has seen many outsourcing MSAs or SLAs - or lived through multi-year cycles of managing an outsourcing relationship (though he could argue he has a hosted blog and has experience managing that vendor) - but he continues to put a lot of faith in utility computing. It is an underpinning of his premise that outsourcing has become so easy that we can start thinking about the end of corporate IT. In a recent post he projects that an increase in fuel prices may provide the shock that makes utility computing more attractive.
I am myself a "buy" over "build" fan - and I am in the business of helping CIOs outsource. But to me, the most successful outsourcing is selective outsourcing. Most of today's "IT utilities" - the large vendors - simply have not made the economics or performance compelling enough for wholesale outsourcing. And frankly, many buyers, even after two decades, do not know how to measure and manage outsourced IT performance, even as they run complex industrial supply chains. So you have to negotiate these deals hard, then monitor performance like a hawk. By their nature, they take far more time and attention than "utilities" deserve. If each of us had to negotiate that hard and spend that much time monitoring our electric, water, gas, and lawn service - our utilities - we would have little budget or time left for anything else.
Why selective outsourcing?
1) Today's "IT Utilities" already have scale, but have not done much with it
EDS has over 100,000 employees. The average Fortune 500 CIO has 500 IT employees. Infosys has delivered over 18,000 projects using its GDM. The average CIO has done fewer than 10. Microsoft spent $6 billion on R&D last year. The average Fortune 500 CIO's total IT budget (not just on software) is less than $50 million. Yet vendors cannot price their products or deliver performance on a utility-scale model? How much more scale do they need?
2) Today's "IT Utilities" have not passed along falling prices in a number of areas
Accenture has over 10,000 resources in India and the Philippines. IBM has even more in low-cost markets. Yet when it comes to pricing, they are reluctant to showcase these resources. Most software vendors are starting to leverage open source components. Most have also moved at least some of their R&D to low-cost markets - Oracle has over 6,000 resources in India. Why are these savings not being passed along? And how long before they would pass along any fuel-related increases Carr talks about?
3) Today's "IT Utilities" have been inconsistent in delivery - and buyers still do not know how to buy such services
The media has written about several high-profile outsourcing failures - JPMorgan Chase and Sears, among others - but even in deals that run their full term there is more churn and re-scoping than gets reported. Gartner says 80% of outsourcing deals are renegotiated over their typical three-year term. Also see the Deloitte analysis here. This is not always the vendor's fault, but the fact is that buying and selling these services is still an evolving art.
OK, so how about more "bite-sized" utility offerings, like salesforce.com? While the initial experiences are promising and AppExchange will bring more to the fray, today there just are not that many viable SaaS offerings available. Their predecessors, the ASPs, scared off too much investment in the model until recently.
Business 2.0 has a story on Joe Kraus this month. The co-founder of Excite says he needed $3 million to go from idea to launch. At his new venture, JotSpot, the comparable cost is $100,000. Cheaper hardware, open source software, global labor, and newer search-driven marketing have made building new technology that much cheaper in the last few years. CIOs read such stories. Until utility computing delivers those kinds of efficiencies, it is a lot more efficient for CIOs to continue to build their own "generators".
The "fuel" that keeps today's utility computing unattractive is not the kind we can blame Arab sheiks for.