Tuesday, January 29, 2008

One Step To Prepare For Cloud Computing

Some of you may be wondering why I am making such a big stink about software architecture on a blog about service level automation (SLAuto). Well, as Todd Biske points out, "the relationships (and potentially collisions) between the worlds of enterprise system management, business process management, web service management, business activity monitoring, and business intelligence" are easier to resolve if the appropriate access to metrics is provided for a software service. For SLAuto, this means the more feedback you can provide from the service, process, data and infrastructure levels of your software architecture, the easier it is to automate service level compliance.

Let's look at a few examples for each level:
  • Service/Application: From the end user's perspective, this is what service levels are all about. Key metrics such as transaction rates (how many orders/hour, etc.), response times, error rates, and availability are what the end users of a service (e.g. consumers, business stakeholders, etc.) really care about.
  • Business Process: Business process metrics can warn the SLAuto environment about cross-service issues, business rule violations or other extraordinary conditions in the process cycle that would warrant capacity changes at the BPM or service levels.
  • Data Storage/Management: Primarily, this layer can inform the SLAuto system about storage needs and storage provisioning, which in turn is critical to automated deployment of applications into a dynamic environment.
  • Infrastructure: This is the most common form of metric used to make SLAuto decisions today. Such metrics as CPU utilization, memory utilization and I/O rates are commonly used in both virtualized and non-virtualized automated environments.

As noted, digital measurement of these data points can feed an SLAuto policy engine to trigger capacity adjustment, failure recovery or other applicable actions as necessary to remain within defined service thresholds. While most of the technology required to support SLAuto is available, the truth is that the monitoring/metrics side of things remains the most uncharted territory. As an action item, I ask all of you to take Todd's words of wisdom into account and design not only for functionality, but also for manageability. This will aid you greatly in the quest to build fluid systems that can best take advantage of utility infrastructure today.
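
To make that concrete, below is a minimal sketch of the kind of threshold policy an SLAuto engine might evaluate against feedback from the four levels above. Everything here is hypothetical--the metric names, thresholds and actions are invented for illustration, not drawn from any real product.

```python
# Minimal sketch of an SLAuto-style policy check. All metric names,
# thresholds and actions are illustrative assumptions, not a real product API.

# A snapshot of metrics gathered from each architectural level.
metrics = {
    "service.response_time_ms": 850,      # service/application level
    "process.rule_violations": 0,         # business process level
    "storage.free_pct": 12,               # data storage/management level
    "infra.cpu_utilization_pct": 91,      # infrastructure level
}

# Each policy pairs a metric with a threshold test and a corrective action.
policies = [
    ("service.response_time_ms",  lambda v: v > 500, "add application capacity"),
    ("process.rule_violations",   lambda v: v > 0,   "alert the BPM operator"),
    ("storage.free_pct",          lambda v: v < 15,  "provision more storage"),
    ("infra.cpu_utilization_pct", lambda v: v > 85,  "start another instance"),
]

def evaluate(metrics, policies):
    """Return the corrective actions triggered by the current snapshot."""
    return [action for name, breached, action in policies
            if breached(metrics[name])]

for action in evaluate(metrics, policies):
    print("SLAuto action:", action)
```

The point of the exercise: the more of those metric keys your architecture can actually populate, the more of the service level the policy engine can manage for you.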

Monday, January 28, 2008

It's the labor, baby...

I'm getting ready to go back to work on Wednesday, so I decided today (while Owen is at school and Mia has Emery) to get caught up on some of the blog chatter out there. First, read Nick Carr's interview with GRIDToday. Damn it, I wish this were the sentiment he communicated in "The Big Switch", not the "it's all going to hell" tone the book actually conveyed.

Second, Google Alerts, as always, is an excellent source, and I found an interesting contrarian viewpoint about cloud computing from Robin Harris (apparently a storage marketing consultant). Robin argues there are two myths that are propelling "cloud computing" as a buzz phrase, but that private data centers will never go away in any real quantity.

Daniel Lemire responds with a short-but-sweet post that points out the main problem with Robin's thinking: he assumes that hardware is the issue, and ignores the cost of labor required to support that hardware. (Daniel also makes a point about latency being the real issue in making cloud computing work, not bandwidth, but I won't address that argument here, especially with Cisco's announcement today.)

The cost of labor, combined with genuine economies of scale, is the core of the economics of cloud computing. Take this quote from Nick Carr's GRIDToday interview:
If you look at the big trends in big-company IT right now, you see this move toward a much more consolidated, networked, virtualized infrastructure; a fairly rapid shift of compressing the number of datacenters you run, the number of computers you run. Ultimately … if you can virtualize your own IT infrastructure and make it much more efficient by consolidating it, at some point it becomes natural to start to think about how you can gain even more advantages and more cost savings by beginning to consolidate across companies rather than just within companies.
Where does labor come into play in that quote? Well, consider "compressing the number of datacenters you run", and add to that the announcement that the Google datacenter in Lenoir, North Carolina will hire a mere 200 workers (up to four times as many as the announced Microsoft and Yahoo data centers). This is a datacenter that will handle traffic for millions of people and organizations worldwide. If, as Robin implies, corporations will take advantage of the same clustering, storage and network technologies that the Googles and Microsofts of the world leverage, then certainly the labor required to support those data centers will go down.

The rub here is that, once corporations experience these new economies of scale, they will begin to look for ways to push the savings as far as possible. Then the "consolidat[ion] across companies rather than just within companies" takes hold, and companies begin to shut down their own datacenters and rely on the compute utility grid. It's already happening with small business, as Nick, I and many others have pointed out. Check out Don MacAskill's SmugMug blog if you don't believe me. Or GigaOM's coverage of Standout Jobs. It may take decades, as Nick notes, but big business will eventually catch on. (Certainly those startups that turn into big businesses using the cloud will drive some of these economics.)

One more objection to Robin's post: to argue that "networks are cheap" is a fallacy, he notes that networks still lag behind processors, memory, bus speeds and the like. Unfortunately, that misses the point entirely. All that is needed are network speeds at which functions complete in a time that is acceptable to human users and economically viable for system-to-system communication. That requirement is independent of the network's speed relative to other components. For example, my choice of Google Analytics to monitor blog traffic depends solely on my satisfaction with the speed of the conversation. I don't care how fast Google's hardware is, and all evidence seems to point to the fact that their individual systems and storage aren't exceptionally fast at all.
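
To put some entirely made-up numbers behind that argument (a sketch of the reasoning, not a measurement of any real service):

```python
# Back-of-the-envelope version of the "good enough" argument. Every number
# here is an illustrative assumption, not a measurement.

human_acceptable_ms = 1000  # assume a user tolerates about a second per interaction

network_rtt_ms = 80         # hypothetical round trip to the cloud provider
server_time_ms = 120        # hypothetical time the provider spends computing

total_ms = network_rtt_ms + server_time_ms
print(f"total response: {total_ms} ms; acceptable: {total_ms <= human_acceptable_ms}")

# Once the total clears the human threshold, making the network as fast as a
# local bus buys the user nothing--which is why comparing network speed to
# CPU, memory or bus speed misses the point.
```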

Thursday, January 24, 2008

Data propagation and software fluidity

Jon Udell has an interesting post commenting on Jeff Jonas' explanation of Out-bound Record-level Accountability in Information Sharing Systems. The central thesis of Jeff's post is that tracking who specifically received a given datum is very expensive yet highly necessary in many applications. The example given is that of a user who no longer wishes to receive email from a site they have an account with, or from any of the other sites the original site shared that preference with. How does the original site know whom to contact? The high cost is a result of the difficulty of tracking whom data has been forwarded to.

Jon replies very simply that a "publish" model, much like blogging, might be the answer. "Data blogging", coined by fellow blogger Galvin Carr, refers essentially to the problem of syndication, but Udell projects it into a much wider arena of data types. As he notes, there is much evidence out there that "push" models are generally only applicable to edge systems calling "inward". "Publish and subscribe"-style pull models are far easier to implement when running "outward" from the cloud to edge systems (as well as, generally, within the cloud--aka event-driven architectures).

There are two valuable results of this approach:
  1. The originating system can require users of data to subscribe with a unique identity, and each "pull" of published data could be tracked (if necessary) to identify who is up to date and who isn't.
  2. For software fluidity purposes, it further decouples the originating system from its subscribers, meaning both the subscribers and the originating system can be "moved" from physical environment to physical environment with no loss of communication. The worst that could happen is that the originating publisher's DNS name changes in the course of the move, but redirects and other techniques could mitigate even that issue.

I am commenting on this, of course, largely for the second item. Access to data, services and even edge devices must be very loosely coupled to work in a cloud computing world. This is one great example of how you could architect for that eventuality, in my opinion.
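
For the curious, here is a minimal sketch of the publish-and-pull tracking idea. The class and method names are hypothetical, invented for illustration; a real system would sit behind HTTP/Atom feeds rather than an in-memory object.

```python
# Sketch of a published feed that tracks pulls by subscriber identity.
# Names are invented for illustration; this is not any real syndication API.

class PublishedFeed:
    def __init__(self):
        self.versions = []       # every published revision of the datum
        self.last_pulled = {}    # subscriber id -> index of revision pulled

    def publish(self, datum):
        """The originating system appends a revision; subscribers pull later."""
        self.versions.append(datum)

    def pull(self, subscriber_id):
        """A subscriber pulls the latest revision under a unique identity,
        letting the originator know who is up to date (result 1 above)."""
        self.last_pulled[subscriber_id] = len(self.versions) - 1
        return self.versions[-1]

    def stale_subscribers(self):
        """Who has not yet pulled the newest revision?"""
        latest = len(self.versions) - 1
        return [s for s, v in self.last_pulled.items() if v < latest]

feed = PublishedFeed()
feed.publish({"email_opt_out": False})
feed.pull("partner-site-a")
feed.publish({"email_opt_out": True})    # the user changes their preference
print(feed.stale_subscribers())          # ['partner-site-a'] still has old data
```

Because subscribers come to the published address under their own identities, result 2 holds as well: either side can move between physical environments, and the relationship survives as long as the address does.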

Wednesday, January 23, 2008

Children of the Net: Why Our Descendants Will Love The Cloud

Our children--or perhaps our grandchildren--won't remember a time when there was a PC on every desk, or when you had to go to Fry's Electronics to buy a shrink-wrapped copy of your favorite game. This, as Nick notes frequently in The Big Switch, is one of the real parallels between what our ancestors went through with electrification and what we have yet to go through with compute utilities. Heck, I already find it hard to remember when I didn't have access to the World Wide Web, and in what year all of that changed. Also, I'm frankly already taking the availability of services from the cloud for granted.

My Dad used to tell me stories of when he lived in a house in Scotland with only a few lights and no other electrical appliances, no indoor plumbing and no telephone. I can't imagine living like that, but that was just 50-60 years ago. Those born in the latter half of the twentieth century (in an industrialized country) are perhaps the first to live a lifetime without ever experiencing life without multiple sockets in every room. It is almost unimaginable what life was like for our ancestors pre-electrification.

There will likely be both positive and negative consequences from any innovation, but the innovator's descendants won't remember things any other way. In the end, once basic needs are taken care of, all humankind cares about is lifestyle anyway, so the view of how "good" an era is, is largely driven by how well those needs are taken care of. One of those basic needs is the need to create/learn/adapt, but another is the need for predictability of outcome. This constant battle between the yearning for freedom and the yearning for control is what makes human culture evolve in brilliantly intricate ways.

I for one hold out hope that our descendants will be increasingly satisfied with their lifestyles, which--in the end--is probably what we all want to see happen. Will those lifestyles be better or worse from our perspective as ancestors? Who knows...but it won't really matter, now, will it?

Of course, one of the biggest challenges to humanity is meeting even the basic needs of its entire population. To date, the species has failed to achieve this--the study of economics is largely aimed at understanding why. Cloud computing could, as Nick suggests, actually make it more difficult for some groups of people to meet their basic needs, but I would argue that this would be counterproductive for the rest of society.

At the core of my argument is the fact that so much of online business is predicated on massive numbers of people being able to afford a given product. Nick argues that life in the newspaper world shows us the future of most creative enterprises: the ease with which the masses can create and find content makes it difficult to sell advertising to support newspapers, and thus the papers struggle. But if huge numbers of people are out of work, with no one valuing their talents and experience, that will lead to less consumer spending. Less consumer spending will lead to less advertising, which will in turn lead to less income for "the cloud" (i.e. those companies making money from advertising in the cloud). It's a vicious cycle for online properties/services, and one I think will fail to come to pass.

The alternative is that the best of the talent out there continues to find ways to get paid, while the masses are still encouraged to participate. Newspaper journalists are already finding opportunities online, though perhaps at a slower pace than some would like. I believe that ventures such as funnyordie.com and even YouTube will create economic opportunities for videographers and film makers to rise above the noise. Musicians are already experimenting with alternative online promotion and sales tools that will change the way we find, buy and consume music. Yes, the long tail will flourish, but the head of the tail will continue to make bank.

The result of this is simply a shifting of the economic landscape, not a wholesale collapse into a black hole. Yeah, the wealth gap thing is a big deal (see Nick's book), but I believe that the rich are going to start investing some of that money back into the system when the new distribution mechanisms of the online world mature--and that should create jobs, fund creative talent and create a new world in which those that adapt thrive, and those that don't struggle.

Did I mention I think the utility computing market is a complex adaptive system?

Sunday, January 20, 2008

Evidence of pending doom and imminent salvation...

Two news items that broke as soon as I went offline for the birth of my daughter provide further evidence of the importance of service level automation and image portability between vendors:

  • Joyent, one of the most ambitious new "capacity on demand" managed hosting services, has experienced a multi-day outage affecting two of its prime storage services. No failover path was available to users of the services, and there is no mention of functionality or services to assist customers with moving--temporarily or permanently--to another vendor's service. Odds are high that some of these customers have lost access to key data, or are flying without substantial backups of key systems. Any decision to move to a different service (as Twitter will, according to the post) is on the customer's own dime.

    A prime example of the dangers of vendor lock-in that Simon and I have been warning you about...


  • Oracle has announced its intention to build and sell "Grid 2.0" technology that will target--yes, you heard right--service level automation. Welcome to the SLAuto game, boys. I hope you're ready to talk standards for image and policy portability, as well as policy platform interoperability. Otherwise, you're just creating a new DB grid "silo", and not helping anyone in the long run. Please, feel free to educate me if you think otherwise...

These events illustrate the caution that users of cloud services must exercise. Be ready to take on increased integration responsibilities as you deploy more and more elements of your datacenter to the cloud, automate more of the management of those elements, and find the product landscape one in which there (still) is no silver bullet. You may not be writing apps, but you sure as heck will be writing the orchestration that ties the apps you employ into a cohesive business process ecosystem. You may also find yourself writing backup integration again, just in case you experience "Joyent 2.0"...
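
As an illustration of that last point, here is a bare-bones sketch of the kind of defensive backup integration I mean. The function bodies are placeholders--the real export and ship steps depend entirely on whatever APIs your vendors actually expose.

```python
# Hypothetical "Joyent 2.0" insurance: periodically copy critical data out of
# the primary cloud vendor to an independent location. Function names and
# storage targets are placeholders, not real vendor APIs.

import shutil
import time
from pathlib import Path

def export_from_primary(dataset: str) -> Path:
    """Placeholder: dump `dataset` from the primary vendor to a local file."""
    dump = Path(f"{dataset}.dump")
    dump.write_text(f"export of {dataset} at {time.ctime()}\n")
    return dump

def copy_to_secondary(dump: Path, vault: Path) -> None:
    """Placeholder: ship the dump to a second, unrelated provider or site."""
    vault.mkdir(parents=True, exist_ok=True)
    shutil.copy(dump, vault / dump.name)

if __name__ == "__main__":
    for dataset in ("orders", "customers"):
        copy_to_secondary(export_from_primary(dataset), Path("vault"))
```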

Thursday, January 17, 2008

Off Topic: Introducing Emery Anne Urquhart


Emery Anne Urquhart
Born 1/16/2008
7 lb, 12 oz
19 3/4"
Mom and baby are fine. Dad is scared silly, however...

Tuesday, January 15, 2008

Off Topic: The next two weeks...

Just a quick note about what I will be doing for the next two weeks, starting tomorrow morning. At 5:30 AM sharp, my wife and I will arrive at the hospital for the birth of my daughter, my second child. I will post pictures and/or video when they are available. (Also off topic, I know, but I'm just too proud...)

Once "Baby Girl" has arrived, I will be splitting my time between caring for my son, caring for my wife and baby girl, and caring for all three. In other words, the blogging will suffer a bit. Once we get settled in a bit, I'll start posting again. That may be a few days or a few weeks. Please be patient.

On a vaguely related note, I finally got Feedburner set up for my site, and I was happy to find so many of you were regular subscribers. I hope many of these are mutual subscriptions where I also follow your work, but if you'd like to let me know where and what you post, please post a comment here and I'll check it out.

In the meantime, for utility computing related topics, stay in touch with Nicholas Carr, Simon Wardley and Anne Zelenka. (Anne's post on GigaOM is especially good, and one that I seriously wish I wrote myself. She captured much of what I would want to say about the effect of utility computing on the middle class, and placed Nick's book in exactly the right context.)

Friday, January 11, 2008

The Compute Grid is Like Nothing Before It

In a continuation of the discussion regarding Nick Carr's "The Big Switch: Rewiring the World from Edison to Google" and Yochai Benkler's "The Wealth of Networks: How Social Production Transforms Markets and Freedom", I want to focus today on the shortcomings of the electric utility analogy--or any other analogy I have heard of for that matter--in describing the compute capacity utility story. It is important to note that, while the electricity-as-utility story has dominated the utility computing discussion to date, other interesting analogies have been put forth lately that enlighten some aspects of the compute story while clouding (no pun intended) others.

Let's start with the electric utility analogy that Carr focuses on in his work. Nick does an excellent job of laying out both the history of electric production and distribution in the United States, as well as mapping those to similar aspects of compute utilities. As Nick puts it:
"The commercial and social ramifications of the democratization of electricity would be hard to overstate...Cheap and plentiful electricity shaped the world we live in today. Its a world that didn't exist a mere hundred years ago, and yet the transformation that has played out over just a few generations has been so great, so complete, that it has become almost impossible for us to imagine what life was before electricity began to flow through the sockets in our walls.

Today we're in the midst of another epochal transformation, and it's following a similar course. What happened to the generation of power a century ago is now happening to the processing of information. Private computer systems, built and operated by individual companies, are being supplanted by services provided over a common grid--the Internet--by centralized data-processing plants. Computing is turning into a utility, and once again the economic equations that determine the way we work and live are being rewritten."
OK, so it's hard to argue with the basic premise that we are undergoing a change similar to the introduction of cheap, readily available electricity in the early twentieth century. Nick is masterful at pointing out how the evolution of electric technology fed changes in societal norms, and vice versa. "It's a messy process--when you combine technology, economics and human nature, you get a lot of variables", he writes, "but it has an inexorable logic, even if we can trace it only in retrospect."

Unfortunately, the same can be said about a variety of other technical advances that didn't end up looking like the electric marketplace; take manufacturing, food production, and music and film production, for example. All of these have elements that can be seen as paralleling utility computing, social production or both. Yet none of them really map completely, and the flaws in the analogy have a "chaos"-like ability to magnify as history bears out.

Now, to Nick's credit, he does start Part 2 of the book--his in-depth comparison of the social implications of utility computing--with the following comments:
"Before we can understand the implications for users...we first need to understand how computing is not like electricity, for the differences between the two technologies are as revealing as their similarities."
He goes on to highlight the following differences, using them to make key points about how the effects of compute utilities on society may not be nearly as beneficial as the effect of electric utilities:
  1. With electricity, the applications of the commodity lie outside of the utility--i.e. the appliances, electronics, lighting, etc. that consume the power. With computing, the applications themselves are deliverable over the network, and can be shared by anyone that wants to (and is allowed to) use them.


  2. Computing is much more modular than the electric grid, meaning that the components that make up the commodity service (storage, processing, networking) can be split up and offered by a variety of different parties.


  3. The compute utility is programmable; it can be made to perform a variety of custom tasks as required by its customers. Electricity from your basic power outlet is a fixed-state commodity--there are exacting standards for what it is and how it is delivered, as well as laws of physics that limit how it can be used.


  4. Choosing an electric utility was generally an all-or-nothing choice; you either got power from the grid, or you had your own power generation. The modularity of computing, however, allows for a slow transitional change from private to public consumption. (I think there is a serious flaw in this analogy, for what it's worth. Look at the increasing installation of solar power systems in residential applications--all while remaining a part of the grid. This seems to indicate a gradual transition to a hybrid public/private power grid in the electricity space.)


  5. The compute utility allows others to participate directly in creating value for the utility, and to do so cheaply and simply. Providing power to the electric grid has always been expensive and very technical (which, I have to admit, also applies to the solar installations in my objection under point 4).
These are excellent examples, and all are important to note (even point 4). However, I think Nick fails to note the most important difference between electricity and data processing, namely:

data != electricity

There are huge implications to what is being moved over the network versus what is being moved over the power grid, beyond just the programmable elements. These differences are critical when analyzing the compute-as-utility story, and it's a shame he doesn't address them.

For example, checking his index for the terms "security", "data security" or "software security" turns up exactly zero entries. When discussing the transition of data versus electricity, it seems critical to consider the sensitivities that people and organizations have about how data is transmitted. "Privacy" is the subject of a seven-page essay highlighting what we have been willing to give up so easily, but he basically uses the subject to highlight a specific trait of the network without investigating how related issues will cause compute capacity to differ from electricity. My own opinion is that these two subjects--security and privacy--are exactly what will slow the "total conversion" to centralized compute utilities for customers like banks, classified federal bureaucracies and health care. I spoke of this in detail before.

As noted earlier, others have commented on some of these issues, and have used other analogies, like manufacturing, to counter the electricity analogy. One excellent example is an article by Michael Feldman of HPCwire in which he argues that a better analogy is food production. As he puts it:

"When food became a commodity, agribusiness conglomerates took over and replaced lots of family farms with much larger, more efficient "factory farms." Today, crops like wheat and soybeans are typically grown on multi-hundred acre land parcels. But not all food products are easily commoditized. Specialty fruits, vegetables, and organic products don't usually lend themselves very well to large-scale production. According to the U.S. Department of Agriculture about a quarter of farm revenue is still generated on family farms. Many of these farms are focusing on these specialty items and have formed cooperative arrangements in order to remain economically viable."

This analogy works from the standpoint that it describes a system in which people care about the varying qualities of the service output by the "utility". For example, we all know the amount of effort spent by the FDA and others to make sure our meats aren't tainted with deadly bacteria. In fact, some specialty food producers have built their marketing message around food safety and health, and many of those are small, boutique producers. Other small players have provided specialty food items to very specific markets with great success. I have believed all along that the compute market will evolve into a few major players and hundreds (thousands?) of small boutique specialty players, especially in the SaaS space. ("Special SaaS with that?" Please forgive me...)

Unfortunately, the food analogy also breaks down in one critical way:

data != food

In this case, it's the real, physical nature of food, and the accompanying issues with logistics, cost of production (including fixed real estate costs), and brick-and-mortar sales that don't compare well to the near-zero marginal production cost of data. Replicating food and shipping it to a new customer destination are expensive acts; doing the same with data costs nearly nothing. Furthermore, geographic location means nearly nothing for computing. Food, on the other hand, is subject to cultural, climatological and logistical limitations on where it can be produced and sold.

For this reason, computing will tend toward a much higher level of centralization than food production has seen. Intuitively, one must believe that this will lead to a larger displacement of private data centers than would have happened had it been more expensive to share infrastructure.

I'm still trying to digest all of this, but I have a growing feeling that Carr's dependence on the "Edison analogy" (to coin a phrase for no good reason) actually weakens some of his arguments. He also seems to assume that the economics of the web won't evolve much from where they are today--largely advertising-based, with millions of people willing to do stuff for free and few existing cultural industries willing to produce for online audiences. I want to bring Benkler back into the conversation when I cover this in a later post.

(One side bar on the commercial production of online content: did anyone see the news from NBC today?)

Monday, January 07, 2008

7 Businesses to Start in 2008

Rather than offer a list of predictions for 2008, I thought I'd have some fun suggesting some businesses that could make you money in 2008 or the few years following.

  1. SaaS<-->Enterprise data conversion practice: All those existing enterprise apps will need their data migrated to that trendy new SaaS tool, and should anyone actually decide they hate their first vendor, they'll be spending that money again to convert to the next choice. Perhaps they'll even get fed up and return to traditional enterprise software. Easy money.
  2. Enterprise Integration as a Service: No matter how much functionality one SaaS vendor provides, it will never be enough. Integration will always be necessary, but where and how will it be delivered? Go for the gold with a browser-based integration option. Just figure out how to do it better/cheaper/faster than force.com, Microsoft, Google, Amazon, etc...
  3. SaaS meter consolidation service: Given the problem stated in 2 above, who wants 5 or 6 bills where it's impossible to trace the cost of a transaction across vendors? Provide a single billing service that consolidates the charges of the vendor stable and provides additional analytic capabilities to break down where costs and revenues come from. Then get ready to defend yourself against the data ownership walls put up by those same vendors (see 4 below).
  4. SaaS/HaaS Customer litigation practice: Given the example of Scoble's experience with Facebook, there are clearly a lot of sticky legal issues to be worked out about "who owns what". Ride that gravy train with litigation expertise in data ownership, vendor contractual obligations and the role of code as law.
  5. SaaS industry (or SaaS customer) data ownership rights lobbyist: Given 4 above, each industry player is going to want a voice in Congress to protect and promote its interests. Drive the next set of legislation that screws up online equality and individual rights.
  6. Sys Admin retraining specialist: All those sys admins who will be out of work thanks to cloud computing are going to need to be retrained to monitor SLAs across external vendor properties, and to get good at waiting on hold for customer service representatives.
  7. Handset recycling services: The rate at which "specialized" hardware will evolve will raise the rate of obsolescence to a new high. Somebody is going to make a killing from all those barely used precious metals, silicon and LCD screens going to waste. Why not you?

Friday, January 04, 2008

"Social Production" vs. "Greed" Online

I want to start my comparison of Yochai Benkler's tome, "The Wealth of Networks: How Social Production Transforms Markets and Freedom", and Nick Carr's "The Big Switch: Rewiring the World from Edison to Google" with coverage of the direct critique of the former in the latter.

Benkler proposes that we are entering a new phase of economic history, which he calls the "networked information economy". Counter to the prior industrial economy, this phase is highlighted by the rising effect of "non-market" production on the creation of intellectual capital, made possible by the near zero cost of creating and sharing content on the Internet.

According to Benkler, in a network based economy:

  1. "Individuals can do more for themselves independently of the permission or cooperation of others."


  2. "Individuals can do more in loose affiliation with others, rather than requiring stable, long-term relations, like coworker relations or participation in formal organizations, to underwrite effective cooperation."
As a result of this, says Benkler, "we can make the twenty-first century one that offers individuals greater autonomy, political communities greater democracy, and societies greater opportunities for cultural self-reflection and human connection."

In chapter 7 of Carr's book, titled "From the Many to the Few", Carr makes an argument about the inequitable effects of social networking and unpaid content creation. With specific reference to Benkler and others writing about the rising importance of the so-called "gift economy", he notes that

"[t]here's truth in such claims, as anyone looking at the Web today can see...[b]ut there is a naivete, or at least a short-sightedness, to these arguments as well. The Utopian rhetoric ignores the fact that the market economy is rapidly subsuming the gift economy."
As evidence, Carr notes that two of the most important Web 2.0 acquisitions of the last couple of years--that of Flickr by Yahoo, and YouTube by Google--were driven in large part by the incredible economics of these companies. When Flickr was acquired for $35 million, there were fewer than 10 people on staff. YouTube had fewer than 70 employees when it was bought for $1.65 billion.

However, perhaps the most astounding commonality between the two is that both had millions of people producing, organizing and promoting content, yet effectively none of them got a single dime of equity. When YouTube was sold, each of the three founders got about a third of a billion dollars for 10 months of work. It's hard to argue that Google bought the web site software for that price. Google bought content and traffic, both of which were largely attributable to those unpaid millions.

I think Carr is right, unfortunately, that we overestimate the influence that "open" technologies will have on the incumbent industrial system. Carr cites important evidence like the growing income gap between the richest Americans and the rest of us, as well as the struggle of newspapers and other media companies to generate sufficient income to sustain their businesses--and, in turn, their employees' standard of living. I will add that even the distinct line between "open source" and "proprietary" projects is blurring, as Anne Zelenka notes on GigaOM today. The result of this trend will, of course, be mixed. At times the content created out of love, frustration or even narcissism will loosen the grip of corporate systems on our society, but these gains may always be offset by new controls and entrepreneurial successes by those same systems.

On the other hand, I think Nick is too skeptical about the amount of change that will beset business in the coming decades. It is easy to think of ways to provide equity to those that produce content, and I believe someone will come up with a business that does so in the next year or two. Furthermore, the process of democracy itself may change significantly in the next two decades, as both the government and entities seeking influence over the government (or seeking to loosen its control) find new ways to tweak the system. Jon Udell at Microsoft has covered an interesting corollary, public access to government data, and noted some of the progress made in that space.

Those of you who have read me for a while know that I am extremely interested in complexity theory and its applications to technological development. In the end, I believe what we are going to see in the coming decades is an "edge of chaos" process, where the forces of liberalization continually struggle against the forces of social and economic inertia. In the long term, however, I believe this process will continually better the lives of those swept up in it; with (significant) luck, the lives of everyone on Earth. What is left to chance, however, is the amount of pain and suffering that may be felt as change takes place.

Wednesday, January 02, 2008

First look at Nick Carr's "The Big Switch" and Yochai Benkler's "The Wealth of Networks"

Welcome back one and all. I hope everyone enjoyed the holidays as much as I did this time. While I enjoyed several visits with family and friends throughout the week, most of my time was spent either playing with my son, or preparing the house for the arrival of my daughter in two weeks. As you might imagine, the latter is taking up most of my mental cycles these days.

I did, however, spend some time reading two books over the break, both covering the broad topic of the effects of Web 2.0 and the compute cloud system on society and culture. One is a very positive economic analysis of what the possibilities may be, while the other is a skeptical comparison of the history of the electric grid with the evolving history of the compute grid.

"The Wealth of Networks: How Social Production Transforms Markets and Freedom", by Yochai Benkler--which can be read for free online--surprised me as being a much more fascinating read than I expected it to be. I knew that Benkler was going for a more formal economic analysis of the effects of "non-market" production online (e.g. videos submitted to YouTube, photos on Flickr, etc.), but his analysis of both the trends and possibilities was actually very easy for anyone in technology to understand, and didn't require a lot of economics knowledge. I'm still working on this one, but I will provide some more in depth discussion in later posts.

Nicholas Carr's latest, "The Big Switch: Rewiring the World from Edison to Google", is everything you would hope for from Nick, though perhaps with a somewhat darker outlook than expected. However, I believe this is a must-read for anyone contemplating the utility computing revolution, as it lays out an honest assessment of the evils that utility computing will bring along with the good. Using the history of the electric utility grid as a model, Nick points out that that earlier technological revolution brought with it promises of the democratization of humankind, but actually unfolded with much more mixed results. Utility computing will be no exception, Carr argues, and I heartily agree with him--though I am not sure I agree with all of his detailed examples and predictions.

I actually recommend reading these two books in parallel, as I have been. Here's what I did, and I think it allowed me to read both texts with a more critical eye:
  • Read Chapters 1-3 of Benkler to get a sense of the economic arguments about how social production will change the way we interact, generate information and entertainment, and possibly change our political and cultural landscape to create a more egalitarian society.
  • Now read Carr's work in its entirety, mostly to get sucked back to earth about how Benkler's grandiose vision is just that, a vision; many of the positive developments Benkler looks for can easily be countered by opposing forces looking to maintain or enhance the status quo.
  • Now finish Benkler's work to gain a detailed perspective of the economics at work in the online world, but with a more critical eye towards his desired social and political outcomes.

I am still working on Benkler's book, but I can say now that my eyes have been opened to how much change is before us, and how the great value we get from social production is tempered by the effects on certain careers, economic segments and perhaps even the quality of work we produce itself.

I will dig into a few specific subjects soon, comparing Benkler's vision with Carr's, and adding my own "special sauce". I would really welcome comments from my contemporaries who have read one or both of these works, including critiques of my critiques. My intuition tells me that those that understand what is at stake, and what could happen--both good and bad--will have a distinct advantage as the next two decades play themselves out.

Update: Below are links to the follow-on posts for this joint review of the two works: