Industry Girl has an interesting observation about what is driving utility computing. She believes that the drive towards "Web 2.0 sites and applications, like the video on YouTube or the social networking pages on MySpace" is creating huge demand on backend server infrastructure--unpredictable demand, I might add--which, in turn, is creating the need for truly dynamic capacity allocation. Add to that the trend of Web 2.0 technologies being used by more and more commercial and public organizations, and you begin to see why it's time to turn your IT into a utility.
I have to say I agree with her, but I would observe that this is only a (large) piece of the overall picture. In reality, most of the sites she specifically mentions are actually "Software as a Service" sites targeted at consumers rather than businesses. It's the trend towards getting your computing technology over the Internet in general that is the real driver.
For "utility computing" plays such as SaaS companies, managed hosting vendors, booksellers :), and others, the need for utility computing isn't just about finding capacity; it is also about controlling capacity. This, in turn, means intelligent, policy-based systems that can deliver capacity to where it is needed, share capacity among all compatible software systems, and meter capacity usage in enough detail to allow the utility to constantly optimize "profit" (which may or may not be financial gain for the capacity provider itself).
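To make the idea concrete, here is a toy sketch of what "policy-based" capacity delivery with metering might look like. Everything here--the `Workload` and `Utility` names, the priority-as-policy scheme--is my own illustrative assumption, not a description of any vendor's actual system:

```python
# Toy policy-based capacity allocator: workloads declare demand, the
# "utility" grants capacity from a shared pool by priority (the policy)
# and meters every grant so usage can later be billed or optimized.
from dataclasses import dataclass, field

@dataclass
class Workload:
    name: str
    demand: int    # units of capacity requested
    priority: int  # higher priority is served first -- the "policy"

@dataclass
class Utility:
    capacity: int                               # total shared pool
    meter: dict = field(default_factory=dict)   # cumulative usage records

    def allocate(self, workloads):
        """Grant capacity in priority order; record every grant in the meter."""
        remaining = self.capacity
        grants = {}
        for w in sorted(workloads, key=lambda w: -w.priority):
            granted = min(w.demand, remaining)
            remaining -= granted
            grants[w.name] = granted
            self.meter[w.name] = self.meter.get(w.name, 0) + granted
        return grants

utility = Utility(capacity=100)
grants = utility.allocate([
    Workload("video-frontend", demand=70, priority=2),
    Workload("batch-reports", demand=50, priority=1),
])
# The high-priority workload gets its full 70 units; the batch job
# gets whatever remains (30), and both grants land in the meter.
```

A real utility would obviously replace the priority sort with far richer policies (service-level targets, time-of-day rules, financial weighting), but the shape is the same: one shared pool, policy-driven allocation, and metering detailed enough to optimize against.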
Service Level Automation, anyone?
Web 2.0 drives utility computing, which in turn drives service level automation. So, Industry Girl, I welcome your interest in utility computing, and offer that the extent to which utility computing is successful is the extent to which such infrastructure delivers the functionality required at the service levels demanded. Welcome to our world...