Industry Girl has an interesting observation about what is driving utility computing. She believes that the drive toward "Web 2.0 sites and applications, like the video on YouTube or the social networking pages on MySpace" is creating huge demand on backend server infrastructure--unpredictable demand, I might add--which, in turn, creates the need for truly dynamic capacity allocation. Add to that the trend of Web 2.0 technologies being adopted by more and more commercial and public organizations, and you begin to see why it's time to turn your IT into a utility.
I have to say I agree with her, but I would observe that this is only a (large) piece of the overall picture. In reality, most of the sites she specifically mentions are actually "Software as a Service" sites targeted at consumers rather than businesses. It's the general trend toward getting your computing technology over the Internet that is the real driver.
For "utility computing" plays such as SaaS companies, managed hosting vendors, booksellers :), and others, the need for utility computing isn't just the need to find capacity, it is also the need to control capacity. This, in turn, means intelligent, policy-based systems, that can deliver capacity to where it is needed, share capacity among all compatible software systems and meter capacity usage in enough detail to allow the utility to constantly optimize "profit" (which may or may not be financial gain for the capacity provider itself).
Service Level Automation, anyone?
Web 2.0 drives utility computing, which in turn drives service level automation. So, Industry Girl, I welcome your interest in utility computing, and offer that utility computing will succeed exactly to the extent that such infrastructure delivers the functionality required at the service levels demanded. Welcome to our world...
2 comments:
While it's true that many Web 2.0 companies find utility computing models like EC2 and The GridLayer attractive for dealing with unpredictable load, that's only the tip of the proverbial iceberg.
Many enterprise IT shops are straining to deal with growth in an era when good sys admins are scarce and datacenter upgrades are measured in megawatts. Utility computing gives them a new option.
Hence, while the Web 2.0 companies get a lot of attention, especially in the blogs, enterprises account for a large percentage of the resources being used.
barjimo, what you say is absolutely true. The vast majority of computing power out there is in corporate or government data centers--and mostly unautomated.
However, I think there is some truth to the idea that utility computing is driven by software solutions where the system load is unpredictable, both in the short term (e.g. peak load) and the long term (e.g. growth of the user community, acquisitions, etc.).
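Those two time scales call for different responses, and a minimal sketch makes the distinction obvious. The thresholds and the scaling_decision() function below are purely illustrative assumptions:

    def scaling_decision(recent_load, capacity):
        """Return an action given a window of load samples and current capacity."""
        if recent_load[-1] > 0.85 * capacity:  # short term: react to a peak right now
            return "add temporary capacity"
        average = sum(recent_load) / len(recent_load)
        if average > 0.60 * capacity:          # long term: sustained growth trend
            return "plan a permanent capacity increase"
        return "hold steady"

    print(scaling_decision([40, 55, 70, 90], capacity=100))  # peak -> add temporary capacity

A utility has to handle both cases automatically; an enterprise data center typically handles neither without a human in the loop.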
The Web 2.0 technologies Industry Girl identified in her post are interesting in that they are highly dynamic users of infrastructure. I think enterprise computing increasingly shows many of the same properties (lord knows that's true for most Web 1.0 applications).