At breakfast with Jonathan Schwartz yesterday he was talking about Sun’s recent launch of a true utility – a grid computer available to all. His pitch is essentially that computing should become a service like water, electricity or gas: one that you can turn on or turn off at will. The missing piece is that for something to become a true utility there needs to be transparent pricing, i.e. a clear and indisputable metric in units that are standard to everyone. With electricity, that metric was the kWh – and the move to utility led to ubiquity.
The comparison between the kWh and the CPU-hour isn’t straightforward – “CPU”, after all, is a moving target. Sun have helpfully defined what they mean, though – a 2.4GHz Opteron (with accompanying stats on disk storage and so on) – along with issuing a none-too-subtle challenge to IBM. Unlike electricity, you should get progressively more for less as time goes on – the folks who manage the grid perform upgrades and, hopefully, the price falls as the cost of computing is driven down by ubiquity and accompanying widespread use. I haven’t noticed my electricity bill go down recently but, as Jonathan says, the first person to have his house fully wired was JP Morgan and he needed full-time staff to manage the generator; since then, bills have certainly come down.
Whenever I looked at the task manager on my PC I was always amused to see it registering mostly 95% idle (when I worked on VAX systems, they said “System tasks” rather than “idle” – in case any senior manager happened to glance at it and wonder why they were paying such huge bills, I guess). So in CPU-hour terms, I am paying through the nose for ‘CPU’ and getting very little ‘hour’. And if I’m paying through the nose, then any corporate or public sector entity is getting nosebleeds – unless they’re running intensive workloads all the time (oil exploration surveys come to mind), or they’re running on out-of-date hardware and really sweating their assets.
We’re all used to paying for computing as a capital cost, though – we buy a laptop or a desktop for £1,000, £2,000 or £3,000 – and then we manage it operationally (for perhaps £1,000–£3,000 a head per year, from what I hear). Over three to five years of depreciation that’s a lot of money for probably relatively few truly productive CPU-hours. We probably don’t even know how many hours that is, or how much each one costs.
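As a sanity check on that ballpark, here is a back-of-envelope calculation. Every figure in it is an illustrative assumption drawn from the ranges above, not a measurement:

```python
# Rough cost per genuinely productive hour for a managed desktop.
# All figures are assumptions for illustration, not real accounts.
capital_cost = 2000          # one-off purchase price (£)
annual_management = 2000     # support/management per head per year (£)
years = 4                    # depreciation period

total_cost = capital_cost + annual_management * years  # £10,000 over the life

# Suppose the machine does truly productive work 2 hours a day,
# 250 working days a year -- again, an assumption.
productive_hours = 2 * 250 * years

print(total_cost / productive_hours)  # → 5.0, i.e. £5 per productive hour
```

Even with those fairly generous assumptions, the per-hour figure lands a long way above anything a shared grid would charge.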
What Sun have offered here is a benchmark – a pure number that allows direct comparison with our own costs. I don’t think anyone has tried that before. We don’t know, of course, whether that’s Sun’s true cost of service provision (with appropriate margins built in, etc.) or whether it’s a loss leader (their accounts a few quarters down the line will perhaps tell that story). But we do have a number that anyone can compare to their own data.
The problem, I think, is that few will have the data to really determine their cost in CPU-hour terms. To start, maybe it will be enough to sum the annual cost of the data centre and divide by the number of CPU-hours in a year. That, for most people, will come out at more than $1 an hour. And if, like every system I’ve ever seen, you’re mostly idle, then all of a sudden it starts to look like $10, $100 or $1,000 an hour. Or maybe more?
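A minimal sketch of that sum, with invented figures standing in for a real data centre’s books (the annual cost, CPU count and utilisation are all assumptions):

```python
# Data-centre cost per CPU-hour, then adjusted for idle time.
# The figures here are placeholders for illustration only.
annual_cost = 1_000_000      # total annual cost of the data centre ($)
cpus = 100                   # number of CPUs it runs
hours_per_year = 24 * 365

# Cost per CPU-hour whether the CPU is busy or idle.
raw_cost = annual_cost / (cpus * hours_per_year)

# Now charge all of that cost to the hours that do useful work.
utilisation = 0.05           # 5% busy, 95% idle -- the task-manager picture
effective_cost = raw_cost / utilisation

print(round(raw_cost, 2), round(effective_cost, 2))  # → 1.14 22.83
```

The first number already sits above the $1 mark; dividing by a realistic utilisation is what turns it into tens or hundreds of dollars per useful hour.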
Then come all the objections: the comparisons, the dependencies, the issues and restrictions. It will be something like “we couldn’t move our data to Sun’s place because of confidentiality”, or “security”, or “data protection”, or “we don’t run their software stack”, or whatever. That’s the beauty of Sun’s move, though – they’re provoking a debate. Some folks will take on the challenge, run the numbers and see whether there’s a way they can make use of Sun. Others will run the numbers and look for ways to cut their own costs and get more efficient. And others still will ignore it, because they don’t really want to know what their own costs are or how far away they are from true utility computing.
If I could hook a truly dumb terminal up to their grid and run my own basic computing needs against it (I really don’t need a grid, but I’ll take it if it’s there) over ubiquitous wireless connectivity, I think my computing bill would be $10 a year or less. I wonder if there’s a model there for the folks who don’t have PCs yet and don’t quite know why they’d need one – a fully subscription-based system with minimal hardware, practically zero management and a simple monthly fee (after all, it works for satellite TV). You turn your terminal on and you start paying – just like water, electricity and gas. I’m sure Jonathan is not yet ready to take this to the consumer market – he wants weather forecasting or seismic surveys or something – but once it’s proven, maybe there’s a model with the right partner to do exactly that.
I don’t think Sun will succeed in making this a profitable business, but I do give them credit for trying. And I’d love to see a debate provoked around what the true costs are for others.