When I was a student some <mumble, mumble> years ago I decided I wanted to try ‘proper’ photography. I bought a second-hand SLR and got some books out of the library. I quickly grasped the basics of shutter speed, aperture, light metering and so on and went and did what all the books told me: I took lots of photographs, carefully logging what settings I’d used as I went. Once a film was full I’d send it off to the processing lab, wait a week or so, and then check the results against my notes in an attempt to learn more about how different settings affect the quality of the resulting photograph.
When the first digital cameras came out, my initial reaction was kind of “so what”: replacing film with a CCD and a memory card seemed a simple evolution. The first digital SLRs were superficially identical to the film SLRs they replaced.
Nikon D70 Digital SLR
Nikon F70 SLR
But I was wrong. The first thing I found with a digital SLR was that the tedious, lengthy feedback loop between taking a photograph and working out why it looked better or worse than other shots had disappeared. Now I could take a shot, look at the result, make adjustments and re-shoot on the spot … I probably learned more about taking photographs in three months with my D-SLR than I did in five years of owning an SLR.
This was no doubt helped by a complete inversion of the economic model. My original SLR was fairly cheap to buy but the recurring costs (for film and film processing mainly) were high and directly proportional to the activity of taking photographs. The D-SLR was more expensive to buy but I could take as many photographs as I liked without worrying about how much it was going to cost me to see the results.
And the differences in process and economics were really just the start. Showing photographs to people is no longer something that has to be done one-to-one, handing over physical prints. The transition from camera to online is smooth, simple and fast. I could put my old film photographs online, but that required a long, hard slog involving a flat-bed scanner, some very ropey editing software and a lot of patience. Flickr, Twitpic and the like have moved the game on from people simply showing their photographs to sharing them with whoever might be interested.
And so on … in short, ‘going digital’ was a revolution for photography, not an evolution.
I feel the same about cloud computing. My initial reaction was “that’s kind of nice”, but I was immersed in the world of building large-scale on-premise enterprise software and failed to see beyond the superficial similarities: the systems I was building lived in data centres and were accessed through a browser, so what really was new?
On-premise data centre ... Or is it a cloud computing data centre?
However, after spending the past six months building systems for the cloud in the cloud I see the same kind of fundamental shifts in enterprise software that ‘going digital’ brought to photography.
The process of delivering systems has changed. As someone who worries a lot about the non-functional characteristics of a system (availability, scalability, performance, security, manageability, maintainability and so on), I now worry a lot less about how to build these into the system and a lot more about understanding what the platform I’m building on gives me. With a ‘raw’ cloud environment like Amazon EC2 some of these still need quite a bit of work; with a Platform-as-a-Service environment like Force.com, it’s all there for me.
As someone who likes to build systems using weekly iterations, the flexibility of cloud environments is a godsend. Previously we would set up a number of different environments for production, staging, QA, development and so on, with strict processes governing what was released into each environment (and making all this work on a weekly cycle was hard). In cloud environments we still need to look after production, but any other environment needed for whatever purpose is extremely simple to create, populate and then decommission when we’re done.
The economics of delivering systems has changed. On-premise solutions need to have sufficient capacity to deal with the sharpest peak in demand. Adding further capacity can be a lengthy and very expensive process, whether or not you ever get the benefit of that capacity, and reducing capacity can often be so expensive that it’s cheaper not to bother. There are all sorts of regulations that (quite rightly) protect the integrity of the enterprise IT infrastructure against the vagaries of the system you’re introducing to it, and these can be extremely expensive to adhere to or get an exception from.
Cloud environments turn most of this on its head. Now you pay for what you need and, if you need to increase or reduce capacity, that’s a very simple and quick operation with a directly proportional increase or reduction in what you pay. And many of the regulations needed to protect the corporate IT infrastructure simply don’t apply, because the system isn’t in your corporate IT infrastructure. Of course that doesn’t mean that governance isn’t required, especially around the data you move into the cloud, but many of the regulations that are aimed at limiting access to shared resources, for example, become irrelevant.
Last week I integrated Amazon S3 storage with the Force.com platform to give me a way of providing secure, well-managed access to an almost unlimited amount of document storage through a rich, browser-based interface. Suppose I wanted to use this to allow a team of, say, 30 users to manage a content repository of 1TB of information. At a rough estimate this would cost ~$2000 per year. How much would it cost to build and deploy an on-premise solution to do the same kind of thing?
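That estimate is easy to sanity-check. The sketch below assumes S3 storage pricing of roughly $0.15 per GB-month (approximately the published rate around the time of writing — treat the figure as an assumption) and ignores request fees and the Force.com licensing side:

```python
# Back-of-envelope cost sketch for a 1 TB S3-backed document repository.
# The storage rate is an assumed figure, not an official quote.

GB_PER_TB = 1024
STORAGE_RATE_PER_GB_MONTH = 0.15  # assumed S3 storage price, USD
MONTHS_PER_YEAR = 12

def annual_storage_cost(terabytes: float,
                        rate: float = STORAGE_RATE_PER_GB_MONTH) -> float:
    """Yearly S3 storage cost for the given capacity, ignoring request fees."""
    return terabytes * GB_PER_TB * rate * MONTHS_PER_YEAR

if __name__ == "__main__":
    print(f"~${annual_storage_cost(1.0):,.0f} per year for 1 TB")
    # ~$1,843 per year -- in the right ballpark for the ~$2000 figure
```

Scaling up or down is just a change to the `terabytes` argument (and, in reality, to your monthly bill) — no procurement cycle involved.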
As with digital photography, once you’ve got past the differences in process and economics, a whole set of new possibilities emerges that simply wouldn’t work in the ‘old’ way of doing things. It’s still relatively early days and I think the really exciting opportunities that cloud computing brings have yet to be seen, but here’s one big new possibility to start with. Suppose I wanted to sell my ‘unlimited capacity managed document repository’ solution (other companies have already produced identical products, so I probably won’t). In the ‘old’ way of doing things I’d need to buy a huge amount of storage capacity, set up a number of servers to serve the management interface, write a load of software to do user management and security, and so on. The cost of just getting started would be massive and, even then, who would trust a small company with the secure storage of their important documents? This is a business that could only work with a serious amount of VC money behind it. In the world of cloud computing, however, my start-up cost is simply the cost of writing the integration between S3 and Force.com, and that integration is all I’m asking customers to trust. The secure storage is provided by S3 and the management interface by Salesforce.com: both billion-dollar companies that invest extremely heavily in security, availability and so on.
Evolution is a process of gradual, progressive change; revolution is radical and pervasive change. To me, cloud computing feels like a revolution, not an evolution.