The Rise of Utility Computing
October 5, 2011

Oracle (Nasdaq: ORCL) is a big company, and that point gets driven home when you start to go in-depth on its products. At a show like OpenWorld, which is dedicated more or less to touching on every aspect of the business, you can quickly get out of your depth…
Since the opening keynote on Sunday, the talk has mainly been about things I know about but don't cover. So I've learned what's new in the company's computing hardware and operating systems, heard plenty about Big Data, and gotten a look at the new SPARC T4 chip.
You have to expect that. Oracle is a database company first, so something like Exadata, a storage machine for very large databases, is very important, especially to large Oracle customers. Exadata is supposed to speed up data processing by orders of magnitude and let users compress their data, significantly reducing the number of disks needed and the electricity to drive and cool them. As I say, it's important if you are a big company with significant IT issues, not the least of which is power consumption.
I can say the same about Exalogic, a compute server that offers massive parallelism, meaning many processors working side by side on the same job within the box. Parallelism is important if you need to support hundreds of thousands or even millions of users on a website and need to ensure uptime around the clock. And Oracle just introduced Exalytics, a machine that does for business intelligence what the others do for their respective IT fiefdoms.
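To make the idea concrete, here is a minimal sketch of parallelism in Python: the same job fanned out across many worker processes so throughput scales with the hardware. This is my own illustration, not anything from Oracle's stack; handle_request and the request list are hypothetical stand-ins.

```python
# A minimal sketch of parallelism: fan identical work out across
# worker processes so many requests are handled at once.
# handle_request and REQUESTS are hypothetical stand-ins, not any
# Oracle API.
from concurrent.futures import ProcessPoolExecutor

def handle_request(request_id: int) -> str:
    # Stand-in for a CPU-bound piece of work per request.
    checksum = sum(i * i for i in range(100_000))
    return f"request {request_id} done (checksum {checksum})"

REQUESTS = range(16)

if __name__ == "__main__":
    # Each worker process takes a share of the requests in parallel.
    with ProcessPoolExecutor() as pool:
        for result in pool.map(handle_request, REQUESTS):
            print(result)
```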
Massive Parallelism
So, it's all definitely important, but not exactly CRM, except for one small idea. All of the gear mentioned is necessary for taking the next step from cloud computing to a genuine form of utility computing. Think about your phone service, your electric service, or older utilities like water and natural gas. Those things just about never go out. True, they can falter in a natural disaster, but otherwise outages are rare. The providers generally achieve seven to nine "9s" of reliability (up to 99.9999999 percent uptime).
Do the math, and nine 9s works out to a hiccup once a year, a few hundredths of a second of downtime. Now compare that with the three or four 9s a cloud offering provides these days, which is hours, or the better part of an hour, of downtime a year. Outages, though still rare, are something cloud computing grapples with, in part because many cloud vendors do not offer the kind of redundancy and massive parallelism that other utilities do. It's not that the electric grid is perfect; what keeps it up nearly all the time is massive parallelism. The same is true of almost any system we've come to depend on. With its Exa- hardware, Oracle is trying to provide that parallelism.
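To put numbers on that, here's a back-of-the-envelope sketch in Python. It's my own arithmetic, not anything presented at the show, and it assumes downtime is a flat fraction of an 8,760-hour year and that redundant components fail independently.

```python
# Back-of-the-envelope availability math. Assumes downtime is a flat
# fraction of the year and redundant components fail independently.

HOURS_PER_YEAR = 24 * 365  # 8,760

def downtime_per_year(nines: int) -> float:
    """Hours of downtime per year at a given number of nines."""
    unavailability = 10 ** -nines
    return unavailability * HOURS_PER_YEAR

for n in (3, 4, 9):
    hours = downtime_per_year(n)
    print(f"{n} nines: {hours:.6f} hours (~{hours * 3600:.2f} seconds) down per year")

# Why massive parallelism helps: n independent components, each
# available a fraction `a` of the time, are all down at once only
# (1 - a) ** n of the time.
def combined_availability(a: float, n: int) -> float:
    return 1 - (1 - a) ** n

print(f"one 99.9% component:           {combined_availability(0.999, 1):.9f}")
print(f"three 99.9% parts in parallel: {combined_availability(0.999, 3):.9f}")
```

On paper, three unremarkable three-9s components run in parallel already get you to nine 9s; real systems share failure modes, so the gain is smaller, but that's the basic idea.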
My point is that cloud computing is growing up, and not a moment too soon. The amount of data they're talking about at this show will soon be measured in zettabytes, if my friends at IDC are right; a zettabyte is a billion terabytes. That's a number that makes even the national debt look insignificant, and it's just on the horizon.
What's also on the horizon is whopping energy costs for data processing, unless we find better ways to power and cool our machines, and the Exa- family is certainly a step in that direction.
A ‘Private’ Cloud?
Implicit in all this is the inexorable move toward cloud computing. This week I am seeing and hearing a lot about private, public and hybrid clouds, and much of this will lead to more efficient processing that will make the zettabyte world realistic.
Nonetheless, I must say that some of what I hear, not just from Oracle but from partners like EMC too, seems a bit self-serving. Specifically, I must respectfully disagree with anyone who uses the term "private cloud" for what amounts to simply moving a data center from within a company's walls to a vendor's facility.
While the IT processing might be delivered through the Internet and the solution might shave off some computing costs, cloud computing is much more than moving the data center and continuing to run the same applications in the same old way. Hint: if your data center is moving to the cloud, shouldn't some of your business processes move too? Shouldn't you be looking at reinventing your business processes as well as making them cheaper to run? Sometimes I worry that extracting cost from IT this way will simply kick the business process question down the road, and the processes will just ossify.
We're embarking on a new computing era that will be powered by massively parallel machines, yet we seem to behave as if we can simply move our heretofore terrestrial applications to the cloud. That will work, but it won't take advantage of all the opportunities the cloud offers.
Now's the time to rethink the ways we do business and the applications we do business with. The result of that thinking should be a set of programs in every company designed to move into the new era of business, one that gets closer to the customer and, by the way, to the remote employee or contractor. Granted, that's a big job, but so far it's the elephant in the room: no one is talking about it. New languages, new tools and new approaches are being rolled out as if they're simply nice to have. Not so; they're necessary.