In a continuation of the discussion regarding Nick Carr's "The Big Switch: Rewiring the World from Edison to Google" and Yochai Benkler's "The Wealth of Networks: How Social Production Transforms Markets and Freedom", I want to focus today on the shortcomings of the electric utility analogy--or any other analogy I have heard of, for that matter--in describing the compute capacity utility story. It is important to note that, while the electricity-as-utility story has dominated the utility computing discussion to date, other interesting analogies have been put forth lately that illuminate some aspects of the compute story while clouding (no pun intended) others.
Let's start with the electric utility analogy that Carr focuses on in his work. Nick does an excellent job of laying out the history of electric production and distribution in the United States, and of mapping those developments to similar aspects of compute utilities. As Nick puts it:
"The commercial and social ramifications of the democratization of electricity would be hard to overstate...Cheap and plentiful electricity shaped the world we live in today. Its a world that didn't exist a mere hundred years ago, and yet the transformation that has played out over just a few generations has been so great, so complete, that it has become almost impossible for us to imagine what life was before electricity began to flow through the sockets in our walls.
"Today we're in the midst of another epochal transformation, and it's following a similar course. What happened to the generation of power a century ago is now happening to the processing of information. Private computer systems, built and operated by individual companies, are being supplanted by services provided over a common grid--the Internet--by centralized data-processing plants. Computing is turning into a utility, and once again the economic equations that determine the way we work and live are being rewritten."
OK, so it's hard to argue with the basic premise that we are undergoing a change similar to the introduction of cheap, readily available electricity in the early twentieth century. Nick is masterful at pointing out how the evolution of electric technology fed changes in societal norms, and vice versa. "It's a messy process--when you combine technology, economics and human nature, you get a lot of variables," he writes, "but it has an inexorable logic, even if we can trace it only in retrospect."
Unfortunately, the same can be said about a variety of other technical advances that didn't end up looking like the electric marketplace; take manufacturing, food production, and music and film production, for example. All of these have elements that can be seen as paralleling utility computing, social production or both. Yet none of them maps completely, and the flaws in each analogy have a "chaos"-like tendency to magnify as history plays out.
Now, to Nick's credit, he does start Part 2 of the book--his in-depth examination of the social implications of utility computing--with the following comments:
"Before we can understand the implications for users...we first need to understand how computing is not like electricity, for the differences between the two technologies are as revealing as their similarities."
He goes on to highlight the following differences, using them to make key points about how the effects of compute utilities on society may not be nearly as beneficial as those of electric utilities:
- With electricity, the applications of the commodity lie outside of the utility--i.e. the appliances, electronics, lighting, etc. that consume the power. With computing, the applications themselves are deliverable over the network, and can be shared by anyone who wants to (and is allowed to) use them.
- Computing is much more modular than the electric grid, meaning that the components that make up the commodity service (storage, processing, networking) can be split up and offered by a variety of different parties.
- The compute utility is programmable; it can be made to perform a variety of custom tasks as required by its customers. Electricity from your basic power outlet is a fixed-state commodity--there are exacting standards for what it is and how it is delivered, as well as laws of physics that limit how it can be used.
- Choosing an electric utility was generally an all-or-nothing choice; you either got power from the grid, or you had your own power generation. The modularity of computing, however, allows for a slow transitional change from private to public consumption. (I think there is a serious flaw in this analogy, for what it's worth. Look at the increasing installation of solar power systems in residential applications--all while the homes remain a part of the grid. This seems to indicate a gradual transition to a hybrid public/private power grid in the electricity space.)
- The compute utility allows others to participate directly in creating value for the utility, and to do so cheaply and simply. Providing power to the electric grid has always been expensive and very technical (which, I have to admit, also applies to my objection in point 4).
These are excellent examples, and all are important to note (even point 4). However, I think Nick fails to note the most important difference between electricity and data processing, namely:
data != electricity
There are huge implications in what is being moved over the network versus what is being moved over the power grid, beyond just the programmable elements. These differences are critical when analyzing the compute-as-utility story, and it's a shame he doesn't address them.
For example, checking his index for the terms "security", "data security" or "software security" turns up exactly zero entries. When talking about the transition of data, as opposed to electricity, to a utility model, it seems critical that one consider the sensitivities that people and organizations have about how that data is transmitted. "Privacy" is the subject of a seven-page essay highlighting what we have been willing to give up so easily, but he basically uses the subject to highlight a specific trait of the network without investigating how related issues will cause compute capacity to differ from electricity. My own opinion is that these two subjects--security and privacy--are exactly what will slow down the "total conversion" to centralized computing utilities for customers like banks, classified federal bureaucracies and health care providers. I spoke of this in detail before.
As noted earlier, others have commented on some of these issues, and have used other analogies like manufacturing to counter the electricity analogy. One excellent example of this is an article by Michael Feldman of HPCwire in which he argues that a better analogy is food production. As he puts it:
"When food became a commodity, agribusiness conglomerates took over and replaced lots of family farms with much larger, more efficient "factory farms." Today, crops like wheat and soybeans are typically grown on multi-hundred acre land parcels. But not all food products are easily commoditized. Specialty fruits, vegetables, and organic products don't usually lend themselves very well to large-scale production. According to the U.S. Department of Agriculture about a quarter of farm revenue is still generated on family farms. Many of these farms are focusing on these specialty items and have formed cooperative arrangements in order to remain economically viable."
This analogy works from the standpoint that it describes a system in which people care about the varying qualities of the service output by the "utility". For example, we all know the amount of effort spent by the FDA and others to make sure our meats aren't tainted with deadly bacteria. In fact, some specialty food producers have built their marketing message around food safety and health, and many of those are small, boutique producers. Other small players have provided specialty food items to very specific markets with great success. I have believed all along that the compute market will evolve into a few major players and hundreds (thousands?) of small boutique specialty players, especially in the SaaS space. ("Special SaaS with that?" Please forgive me...)
Unfortunately, the food analogy also breaks down in one critical way:
data != food
In this case, it's the real, physical nature of food, and the accompanying issues with logistics, cost of production (including fixed real estate costs), and brick-and-mortar sales, that don't compare well to the near-zero marginal production cost of data. Replicating food and shipping it to a new customer destination are expensive acts; doing the same with data costs nearly nothing. Furthermore, geographic location matters very little for computing. Food, on the other hand, is subject to cultural, climatological and logistical limitations on where it can be produced and sold.
For this reason, computing will tend toward a much higher level of centralization than food production has seen. Intuitively, one must believe that this will lead to a larger displacement of private data centers than would have happened if it were more expensive to share infrastructure.
I'm still trying to digest all of this, but I have a growing feeling that Carr's dependence on the "Edison analogy" (to coin a phrase for no good reason) actually limits the strength of some of his arguments. He also seems to assume that the economics of the web won't evolve much from where they are today--largely advertising-based, with millions of people willing to do stuff for free and few existing cultural industries willing to produce for online audiences. I want to bring Benkler back into the conversation when I cover this in a later post.
(One sidebar on the commercial production of online content: did anyone see the news from NBC today?)