New business models within the IT sector have been relatively hard to find since the end of the dotcom boom. However, in the past 12 months or so, a new buzzword has emerged that is gaining a huge amount of interest – ‘utility computing’. If you have not heard of it yet, it is probably only a matter of time as the major IT vendors position themselves to launch their own version of what may be the next big thing in IT.
So what is it? Fundamentally, it is a very straightforward concept: make the provision of IT services as easy as obtaining electricity or water, where the flick of a switch or the turn of a tap provides as much of the resource as is required. With utility computing, the customer simply pays by the hour, day or month depending on the computing resources used. After all, why should companies run a large number of systems to handle peaks in demand, but then leave those systems idle for the rest of the time? Alternatively, who wants to run a small number of systems that may be unable to provide the service needed during peak periods? The solution: allow customers to share computing resources, either internally between departments or externally between customers, so that any level of computing service is available as and when it is needed.
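The economics behind this trade-off can be sketched with a toy calculation. All figures below are hypothetical, purely for illustration: a company that provisions for peak demand pays for every hour of capacity, busy or idle, while a utility model charges only for the server-hours actually consumed.

```python
# Toy comparison of peak-provisioned vs utility ("pay as you go") computing costs.
# All rates and utilisation figures are hypothetical.

HOURS_PER_MONTH = 720

def owned_cost(peak_servers: int, cost_per_server_hour: float) -> float:
    """Owning enough servers for peak demand: you pay for every hour,
    whether the machines are busy or idle."""
    return peak_servers * cost_per_server_hour * HOURS_PER_MONTH

def utility_cost(server_hours_used: float, rate_per_hour: float) -> float:
    """Utility model: pay only for the server-hours actually consumed."""
    return server_hours_used * rate_per_hour

# Example: demand peaks at 100 servers, but average utilisation is only 20%.
peak = 100
used_hours = peak * 0.20 * HOURS_PER_MONTH  # 14,400 server-hours per month

fixed = owned_cost(peak, cost_per_server_hour=0.50)        # 36,000.0
metered = utility_cost(used_hours, rate_per_hour=1.25)     # 18,000.0

# Even at a much higher hourly rate, paying only for hours used is cheaper
# when utilisation is low and demand is spiky.
print(fixed, metered)
```

The point of the sketch is that the utility rate can be well above the cost of an owned server-hour and still come out ahead, provided average utilisation is low enough.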
Although the movement to utility computing is still in its early stages, it is clearly far more than smoke and mirrors. The concept was described by independent technology research company Forrester Research (Business Week, 25 August) as the “third major computing revolution after mainframes and the internet”, and a growing list of companies has signed up to large outsourcing deals involving the provision of utility computing services. American Express, Deutsche Bank and JP Morgan Chase are just three of the high-profile names. Among the vendors, IBM is investing $10bn (£5.8bn) in its version, branded ‘On demand’.
The jury is still out on whether vendors can deliver on their promises of services provided in such a flexible, straightforward way. There are also specific legal issues that any customer will need to address before embarking on such a deal. Here are some of the major issues to consider.
What is being provided?
‘Grid computing’, ‘Organic IT’, and ‘On demand computing’ are just a few of the terms that are used in describing the concept. Major vendors such as Sun Microsystems, IBM and Hewlett-Packard are all giving a slightly different slant on how they will provide utility computing-type services. While this lack of clarity is a sign of the immaturity of the utility computing market, it also suggests the need for customers to: be specific in contractual terms about what it is they are procuring; have sufficient detail around the service levels set out in the contract; and undertake appropriate due diligence to ensure the vendor is capable of delivering on their marketing promises.
Utility computing models suggest a long-term commitment to a vendor. The customer may be restricted from selecting another vendor with superior products or services; from making changes without incurring liability for any sunk or capital costs; or from exiting the relationship. If customers lose control by renting or borrowing resources instead of owning them, there is the potential that either the utility computing vendor or the relationship itself will end up influencing the development of a customer’s network, with procurement decisions potentially biased towards the supplying vendor’s products.
Any customer of a utility computing solution will need to manage its internal workforce to be able to take full advantage of the proposed solution. As with any outsourcing deal, if the customer fails to manage the strategic planning process, the implementation of the solution or the ongoing operations, it may fail to achieve its stated business goals. This could lead to legal disputes about compliance with the contract. Utility computing models might, therefore, only be suitable for customers with strong internal management of the outsourcing model.
One of the key promises of utility computing is the fast, automated re-use of computing resources. However, will vendors be able to guarantee that data is ‘cleaned up’ before the disk space it occupied is allocated to someone else? While the vendor will need to be under an obligation to ensure that each customer’s data and applications are at least segregated from those of other customers, there may well be significant risks beyond those faced by a business running data and applications on its own servers. Accordingly, customers will need to be quite clear and specific about the security levels required, and sufficient security warranties and indemnities will need to be provided for in the agreement. Customers should consider the use of third parties, such as specialist security organisations, to provide an independent assessment of remotely accessed systems.
Given that the central marketing idea behind IT as a utility is that the data, applications and technology are available whenever needed, it will be essential that vendors can ensure that costs are more or less predictable. It should be analogous to using a mobile phone – users are not restricted in what they can do, but the cost is known up-front, so an informed decision can be made.
Yet is this a concept that is only going to work for those businesses with high-volume data transactions flowing through their systems, such as American Express? For companies that do not experience seasonal variations in how they use IT infrastructure, the need for a utility computing-type model is arguably quite low. And what about when users test the much-vaunted flexibility of the system and perhaps stop needing certain services? Will vendors be able to turn off the system as quickly as it can be turned on?
Payment for software licences is another interesting area. Currently most software vendors operate on models whereby they ‘sell’ licences for each product. The licences concerned may be structured in a variety of ways, but most often operate on a per-server, per-central-processing-unit or per-user basis. In a utility computing model, flexibility is going to be vital. It is apparent that in order to function cost-effectively, utility computing requires that software licences be made available quickly, and that customers incur the licence fee only while the software is actually in use – a major difference from the standard approach.
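The contrast between the two licensing approaches can be sketched as a simple calculation. The user counts, rates and usage hours below are hypothetical, chosen only to show where a metered fee diverges from a per-seat fee.

```python
# Sketch contrasting a conventional per-user licence with a metered,
# pay-while-in-use licence. All figures are hypothetical.

def per_user_licence_cost(users: int, annual_fee_per_user: float) -> float:
    """Conventional model: a fixed annual fee per named user,
    payable regardless of how much the software is used."""
    return users * annual_fee_per_user

def metered_licence_cost(usage_hours: float, rate_per_hour: float) -> float:
    """Utility model: the fee accrues only while the software is in use."""
    return usage_hours * rate_per_hour

# 500 licensed users, but each uses the software only ~5 hours a month.
users = 500
annual_hours = users * 5 * 12  # 30,000 hours of actual use per year

conventional = per_user_licence_cost(users, annual_fee_per_user=200.0)  # 100,000.0
metered = metered_licence_cost(annual_hours, rate_per_hour=1.0)         # 30,000.0

# With light per-user usage, paying only for hours in use undercuts
# the per-seat model - which is why vendors' standard licence terms
# sit awkwardly with the utility approach.
print(conventional, metered)
```

The same arithmetic run the other way (heavy usage per seat) favours the conventional licence, which is one reason vendors have been slow to offer usage-based terms.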
Viable solution or market hype?
So, is utility computing the panacea for every IT director’s woes? The major IT vendors would certainly like to think so, but the reality is that it is at an early market development stage. Customers need to separate the hype from the reality. While there seems little doubt that IT services, particularly those provided through the larger outsourcing deals, are likely to include significant components of the utility-computing model, at this stage of the game good business sense and good lawyers will be needed to avoid trouble.
Peter Brudenall is a senior assistant at Simmons & Simmons