The dramatic, near-simultaneous price drops from the major public Cloud providers a few weeks ago have led to a pandemonium of punditry. One deep thinker after another has plumbed the pedestrian proportions of this phenomenon, looking for meaning. Now it’s time for me to jump into this pool of profundity.
Our pivot point: Bezos’s Law, first pontificated by Greg O’Connor of AppZero. In a paean to Moore’s Law, O’Connor defined Bezos’s Law as “a unit of computing power price is reduced by 50 percent approximately every three years.” Hallelujah! No matter how large your data center or how complex your IT environment, moving it to the Cloud will eventually result in paying pennies for infrastructure where you’re now paying dollars. Furthermore, not moving to the Cloud is a fool’s choice, in spite of the fact that hardware prices are also dropping like a stone. The conclusion: everybody will eventually move to the Cloud, simply by virtue of Bezos’s Law and the dramatic cost savings over time it promises.
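To see what the stated halving rate implies, here is a minimal sketch. The three-year halving period comes from O’Connor’s definition quoted above; the $1.00 starting price is a made-up figure purely for illustration:

```python
# Hypothetical illustration of Bezos's Law as quoted above: the unit
# price of computing power halves roughly every three years.
def bezos_price(initial_price, years, halving_period=3.0):
    """Projected unit price after `years`, halving every `halving_period` years."""
    return initial_price * 0.5 ** (years / halving_period)

# Starting from a notional $1.00 per unit of compute:
for y in (0, 3, 6, 9):
    print(f"year {y}: ${bezos_price(1.00, y):.4f}")
# year 0: $1.0000
# year 3: $0.5000
# year 6: $0.2500
# year 9: $0.1250
```

So under the stated rate, a decade of waiting cuts the unit price by roughly a factor of eight, which is the arithmetic behind the “pennies where you’re now paying dollars” pitch.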
Sorry to pierce this presumptuous perspective, but not so fast. Bezos’s Law – just like its patriarchal predecessor, Moore’s Law – doesn’t operate in a vacuum. In fact, there’s an opposing principle (or more like a rule of thumb) that continually acts to counteract these laws: Parkinson’s Law – or more precisely, various corollaries to Parkinson’s Law.
Parkinson’s Law states that “work expands so as to fill the time available for its completion.” We can extend this law, and hence derive a plethora of corollaries from it, to something like “whenever people have an excess amount of a resource, they’d rather use it up than bank such excess and consider it savings.”
We’ve long experienced the corresponding corollary to Parkinson’s Law that counteracts Moore’s Law: “people will create software that maximizes its use of available processing power.” Here’s an example. My first computer, a Macintosh SE back in the 1980s, had an 8 MHz Motorola 68000, a 16-bit processor. It took, oh, about 45 seconds to boot. Today I’m writing this post on a Dell Latitude 3540 running Windows 8.1. This behemoth features a 1.7 GHz, 64-bit processor – a clock rate some 213 times faster than my venerable old Mac’s. How long does my laptop take to boot? Hmm, about 45 seconds. Maybe a bit more – after all, it is running Windows.
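For the curious, the back-of-the-envelope arithmetic behind that 213× figure is clock rate only – it ignores word size, cores, caches, and everything else that changed over thirty years, so treat it as a rough sketch:

```python
# Rough clock-rate comparison behind the "213 times faster" claim.
# Clock speed alone is a crude proxy for performance, but it's the
# comparison the post is making.
mac_se_hz = 8e6       # 8 MHz Macintosh SE
latitude_hz = 1.7e9   # 1.7 GHz Dell Latitude 3540

ratio = latitude_hz / mac_se_hz
print(ratio)  # 212.5, which the post rounds up to 213
```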
What happened here? Moore’s Law more or less predicted the 213-fold performance increase, and yet the same task (more or less) takes the same amount of time as it did back in the days of big hair and legwarmers (more or less). The problem, of course: Parkinson’s Law – specifically, its corollary to Moore’s Law. Microsoft knows full well that people will put up with a 45-second boot time (after all, we’re used to it, which is part of the point), so they cram as much as they can into those 45 seconds.
Back to Bezos’s Law. True, the cost of IaaS is dropping like a stone, and will continue to drop. But any enterprise expecting to actually save money over time by moving to the Cloud is in for a rude awakening. Yes, Parkinson’s Law – call this one Bezos’s Corollary – rears its ugly head once again: cheaper processing infrastructure means we’ll get to do more stuff. It doesn’t mean we’ll get to save money.
Keep in mind that the Parkinson’s corollaries are just rules of thumb. Sure, you could forgo increasing your capabilities in favor of cost savings, just as some systems boot much faster than my laptop because they don’t load nearly as much stuff at boot, or because they don’t boot from little spinning disks. But as a rule of thumb, Parkinson put his finger on a core principle of human nature: give us more resources, and we’d much rather find uses for them than save them.
It’s no wonder, then, that there’s also a Big Data corollary to Parkinson’s Law: the size of the data sets you’ll be dealing with will always expand to fill your capacity for collecting, storing, and analyzing such data. The more you can crunch, the more you’ll have to crunch. But faced with the choice of crunching more Big Data or saving money, it is human nature – and probably good business – to go for the crunching.