Building a Cloud Is Easy, Once You Have the Right Infrastructure
October 5, 2011

Cloud computing has come on so fast and so strong over the past year that it’s understandable that much of what we "know" about its architecture and capabilities is more myth than fact.
Some proponents would have you believe that CIOs need only sign on the dotted line to gain instant access to unlimited, low-cost resources.
The reality is that the cloud is highly complex, comes in a variety of flavors and requires a fairly sophisticated in-house data architecture, all of which can have significant sway over the capabilities it brings to the table and the operational efficiencies it produces.
"The progression from an initial cloud deployment to a mature cloud strategy is not necessarily linear, and one type of cloud is not necessarily better than another," said Scott Powers, director of technical and product marketing at cloud platform developer Gale Technologies. "It all depends on the needs and situation of each organization."
For example, Powers says many small companies use public cloud services for the bulk of their server and storage infrastructure, avoiding hefty capital investments. Larger organizations that specialize in batch processing also use the public cloud, but their focus is more on bursting workloads to outside capacity during peak loads. Meanwhile, mid-sized firms are trending toward private clouds to take advantage of dynamic provisioning and self-service as a means to cut costs, accelerate the delivery of business services or both.
With that in mind, firms like Gale Technologies are hoping to provide the next best thing to an instant cloud: a series of templates that can be used to configure cloud resources for a wide range of legacy environments. The idea is to use a set of pre-built automation and workflow configurations, containing adapters for many of the leading server, storage, networking and virtualization platforms, to specify and control relationships between internal and external resources. The company says it can produce a fully functional cloud environment in less than two weeks.
"With template-based automated provisioning, the templates and workflows are pre-designed, tested and approved – with appropriate domain expertise and ownership as required," Powers says. "Once published, these templates can be offered through a Web-based service portal for end-user self-service automated fulfillment."
The concept of template-based cloud deployment has been taken to extremes by a startup called Piston Cloud Computing. The company has developed a streamlined workflow system called CloudKey that fits on a single USB stick; the company says it allows anyone with a laptop to build a private cloud infrastructure in a few hours. The system uses the OpenStack platform and the company’s pentOS operating system to allow firms to easily convert to a hybrid format when it becomes necessary to tap outside resources.
"We’ve basically taken the time-intensive process of building a private cloud and automated it to the point that a competent admin can install a private cloud with a laptop and a coffee break," says Christopher MacGown, CTO and co-founder of Piston. "A lot of small businesses are being forced to choose between the expense and convenience of public cloud providers, or the inconvenience but tangible cost and performance benefit of hosting services on their own hardware. Private cloud makes it possible for a small business to keep the ease of use and convenience of cloud provisioning while retaining the cost and performance benefits of owning your own hardware.
Of course, the cloud can only be built on an internal infrastructure that is ripe for resource pooling, dynamic load balancing and broad automation. Virtualization is a major enabler for all of these things, although it is not an absolute requirement.
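To make the resource-pooling point concrete, here is a minimal sketch, assuming a virtualized pool whose per-host utilization is already being measured: a new workload simply lands on whichever host has the most headroom. Host names and utilization figures are invented.

```python
# A minimal sketch of resource pooling and dynamic placement: new workloads
# land on the least-loaded host in the virtualized pool. All figures invented.

pool = {"host-01": 0.72, "host-02": 0.35, "host-03": 0.58}  # utilization, 0..1

def place_workload(pool, load=0.10):
    """Choose the host with the most headroom and account for the new load."""
    target = min(pool, key=pool.get)
    pool[target] += load
    return target

print(place_workload(pool))  # -> host-02
```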
More than anything, though, the key factor in a successful cloud deployment is a strong knowledge of existing infrastructure. As Peggy Canale, government segment manager for the Avocent division of Emerson Network Power, puts it: "If you don’t know what you have, you are starting in a cloud-risky posture."
"There are four areas we examine to ensure that the physical infrastructure is ready for the cloud," she says. "First, you need tight control and management over your current assets. It sounds simple, but the federal government just went through a prolonged review in this area and our private sector customers often use outdated diagrams as well. Second, performance data along with the ability to access all rack-level devices, no matter how remote, is necessary for a cloud-ready environment.
"Third, because a cloud’s virtualization technology means that a dynamic compute environment is built on a static foundation, you must be able to maintain headroom to handle normal peaks in demand, but also the odd spikes from sudden and intense cloud utilization. Last, the cloud is all about flexibility and scalability. More than half of the organizations considering the cloud are worried about how it will affect availability and performance. You need to be able to fully visualize the effects of change on load, power and devices to plan confidently."
Still, the cloud represents more than just a streamlined architecture. In fact, it alters some of the fundamental concepts that have guided IT development for years.
A key example is the division between the enterprise LAN and the WAN. Traditionally, local networks have been built for high-traffic, short-haul hops between multiple end points, while the wide area catered to long-haul point-to-point communications between remote offices. With the cloud, that notion is turned on its head as multiple users will now require long-haul access to both server and storage resources. In essence, the WAN is the new LAN and will need the chops to handle LAN-style traffic.
"As companies rely on the cloud for applications, services and infrastructure, they find that resources are externally located ," says Steve Schick, marketing director for WAN optimization developer Blue Coat Systems. "The WAN was created based on resources that are housed inside the company. Ultimately, this begs an architectural question — does the inside/outside paradigm still make sense? Is the common practice of backhauling all branch office Internet traffic to and from the data center still sensible?"
Blue Coat’s response to this change is to augment its traditional symmetrical approach to WAN optimization, in which acceleration appliances of equal capability are located at each end of the WAN, with the new CloudCaching system that places all of the acceleration on the end-user side.
"A SaaS application vendor will not allow or be able to accommodate a WAN optimization server, physical or virtual, for each customer," Schick says. "To effectively accelerate SaaS applications, one has to move to an asymmetric model. This ‘one-sided’ acceleration takes place only at the branch office or headquarters location and not in the cloud."
Conventional wisdom holds that before launching any new technology endeavor, enterprise managers need to devise a well-thought-out plan for exactly how they want to implement it and, more importantly, what they want to do with it. That’s a tall order for cloud computing, considering that many of its capabilities are speculative at this point.