Thursday, October 12, 2006

The Datacenter is Dead! (Or Just Mutating Badly!)

I used to be a member of an object oriented programming user group in St. Paul, MN run by a professor at the University of St. Thomas (if I recall correctly--I can't even remember his name). This man was a tireless organizer of what was then a critical forum for fostering MN software development expertise. He was also a frequent speaker to the group, and one speech always comes immediately to mind when I remember the "good old days".

This computer science professor stood in front of a highly attentive audience one evening and declared "data is dead!"

His point was that if we changed our model of how computers store data persistently to an "always executing" approach, databases that manage the storage and retrieval of data from block-based storage would become obsolete. (I view "always executing" systems much like your cell phone or Palm device today: when you turn the device back on, applications are in the same state they were in when you last shut it off.)
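
To make the idea concrete, here is a minimal sketch in Java (invented class names, nothing from the professor's actual talk) of an "always executing" model in miniature: the application's live object graph is written out on shutdown and read back in on startup, so there is no separate database step in between.

    import java.io.*;

    // Hypothetical application state: an ordinary serializable object graph.
    class AppState implements Serializable {
        String lastScreen = "inbox";
        int unreadCount = 3;
    }

    public class AlwaysOn {
        static final File SNAPSHOT = new File("appstate.bin");

        // On shutdown: freeze the live objects exactly as they are.
        static void suspend(AppState state) throws IOException {
            try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(SNAPSHOT))) {
                out.writeObject(state);
            }
        }

        // On startup: resume from the frozen objects, or start fresh the first time.
        static AppState resume() throws IOException, ClassNotFoundException {
            if (!SNAPSHOT.exists()) return new AppState();
            try (ObjectInputStream in = new ObjectInputStream(new FileInputStream(SNAPSHOT))) {
                return (AppState) in.readObject();
            }
        }

        public static void main(String[] args) throws Exception {
            AppState state = resume();   // pick up exactly where we left off
            state.unreadCount++;
            suspend(state);              // no database, no SQL, no schema
        }
    }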

It's funny to remember how much we thought objects were going to replace everything, given the intense dependency we have on relational databases today. But his arguments forced us to really think about the relationship between the RDBMS and object-oriented applications. One result of years of this thinking, for example, is Hibernate.
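
For anyone who hasn't watched that thinking play out, the flavor of it is roughly this (a sketch using JPA-style annotations and an invented Customer class, not any particular production mapping): you describe once how an object maps onto a relational table, and Hibernate generates the SQL to load and store it.

    import javax.persistence.*;

    // Hypothetical example: one annotated class describes the object-to-table
    // mapping, and Hibernate (or any JPA provider) handles the SQL behind it.
    @Entity
    @Table(name = "customers")
    public class Customer {
        @Id
        @GeneratedValue
        private Long id;

        @Column(name = "full_name")
        private String name;

        // Getters and setters omitted for brevity.
    }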

Jonathan Schwartz, my beloved leader in a former life, recently blogged about the future of the datacenter, contending that the days of large, centralized computing facilities are numbered. In other words, "the datacenter is dead".

His contention is that the push towards edge and peer computing with "fail in place" architectures will make central facilities tended by technology priests obsolete. Ultimately, his point is that we should reexamine current enterprise architectures given the growing ubiquity of these new technologies.

I have to say, I think he makes a good argument...up to a point. My problem is that he seems to ignore two things:
  • Data has to live somewhere (i.e. data is certainly not dead)
  • People expect predictable service levels from shared services--and the more critical those service levels are, the more important it is that they can be guaranteed.
Rather, I think that the days of the company-owned datacenter are beginning to wane, and that the future lies in a combination of edge computing and commercial computing utilities that offer service delivery at guaranteed service levels.

I think the good news is that we can reach such a vision by taking baby steps away from the static, siloed, humans-as-service-level-managers approach of today's IT shops.

As you may have guessed from my previous blogs:
  • I believe the first of these steps is to shed dependencies between software services and infrastructure components.
  • Following that, we need to begin turning monitors into meters, capturing usage data both for real-time correction of service level violations and for analysis of usage and incident trends (see the sketch after this list).
  • Finally, we need the automation tools that guarantee these service levels to operate across organization boundaries, allowing businesses to drive the behavior (and associated cost) of their services wherever they may run in an open computing capacity marketplace.
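
To give a feel for that second step, here is a minimal sketch (invented names, no particular monitoring product assumed) of the difference between a monitor and a meter: instead of just checking whether a service is up, every request is recorded as a usage event that can drive both real-time SLA checks and later trend analysis or billing.

    import java.util.*;
    import java.util.concurrent.*;

    // Hypothetical usage event: who called what, when, and how long it took.
    record UsageEvent(String service, String consumer, long timestampMillis, long latencyMillis) {}

    class ServiceMeter {
        private final Queue<UsageEvent> events = new ConcurrentLinkedQueue<>();
        private final long slaLatencyMillis;

        ServiceMeter(long slaLatencyMillis) { this.slaLatencyMillis = slaLatencyMillis; }

        // Called on every request: capture the measurement, not just a green/red status.
        void record(UsageEvent e) {
            events.add(e);
            if (e.latencyMillis() > slaLatencyMillis) {
                // Real-time correction hook: alert, reroute, or provision more capacity.
                System.out.println("SLA violation on " + e.service() + " for " + e.consumer());
            }
        }

        // Later: the same events feed usage and incident trend analysis (and chargeback).
        double averageLatency() {
            return events.stream().mapToLong(UsageEvent::latencyMillis).average().orElse(0);
        }
    }
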
The cool thing to think about, though, is how SLAs apply to edge devices. Can we guarantee that necessary processing will occur both in back-end data and service utilities and in our edge and interface devices? How about in a peer network environment, especially one where no single organization owns or manages all of the computing capacity running the service?

No, neither data nor the datacenter is dead; they are just evolving quickly enough that they may soon be unrecognizable...
