Demystifying the myths of cloud computing

August 3, 2011, by David
Grazed from DatacenterDynamics. Author: Penny Jones.

Myth #1 – Cloud isn’t secure 
For any cloud provider, security is always the top priority. Most companies don't have the luxury of dedicating resources to security. The cloud uses the same security tactics and strategies that enterprise data centers have used for the last 30 years, including physical data center security, network separation, server hardware isolation, and storage isolation, and good providers will in fact invest considerably more in them.

 
AWS maintains security at the physical data center with badge-controlled access, guard stations, monitored security cameras, alarms, separate cages, and strictly audited procedures and processes. It has the added physical security advantage that customers never need to access the servers and networking gear inside. Because of this, access to the data center is even more strictly controlled than in traditionally rented facilities.
 
As for network security in the AWS cloud, AWS maintains packet-level isolation of network traffic and supports industry-standard encryption. Because Amazon Web Services' Virtual Private Cloud allows a customer to establish its own IP address space, customers can use the same tools and software infrastructure they are already familiar with to monitor and control their cloud networks.
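To make the "own IP address space" point concrete: the same network-planning habits an enterprise already has carry straight over. A minimal sketch, using Python's standard `ipaddress` module and an assumed 10.0.0.0/16 range (the CIDR block and subnet sizes here are illustrative, not anything AWS prescribes):

```python
import ipaddress

# Hypothetical VPC address plan: the customer picks a familiar private
# range and carves it into subnets, just as it would on-premises.
vpc = ipaddress.ip_network("10.0.0.0/16")
subnets = list(vpc.subnets(new_prefix=18))  # four /18 subnets

print(vpc.num_addresses)   # 65536 addresses in the VPC block
for s in subnets:
    print(s)               # 10.0.0.0/18, 10.0.64.0/18, ...
```

Because the address space is the customer's own choice, existing monitoring and firewall tooling that expects these ranges keeps working unchanged.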
 
Finally, the scale of a cloud computing provider can also allow significantly more investment in security controls and countermeasures than almost any large company could afford.  
 
Myth #2 – Cost is the only cloud advantage 
The reality is that cost is just one advantage; the more important one is the ability to move quickly and accelerate time-to-market. Ask software development engineers at enterprises how long it takes to get a server when they want to run an experiment or expand a project, and the answers range from four weeks to three months, just for a server. That frustrates engineers and stifles innovation.
 
With the cloud you can spin up large amounts of server capacity in minutes to expedite development work. As an example, before transitioning to AWS, it took Guardian News and Media (GNM) three weeks for hardware to be delivered and installed, plus additional time for budget approval. Now the entire process is completed within 30 minutes.
 
Myth #3 – You should move all infrastructure to the cloud in one fell swoop   
If you are a start-up, that is what you should do. It makes no sense to build on top of the old-world model of buying infrastructure that you may or may not need.
 
For enterprises building something new, it is easy to build it on top of the cloud and quickly take advantage of those benefits. For enterprises with many legacy applications and systems, moving everything at once is not advisable.
 
Most enterprises move more methodically, picking a diverse set of initial applications to try as proofs of concept in the cloud. They run them for a few weeks to a few months to see how the cloud is different and to understand how to operate in it before moving more of their applications. This is typically followed by a 12-to-24-month migration plan.
 
For example, GNM first started working with AWS when it encountered the need to scale quickly in response to unpredictable demand changes for its new online services. GNM worked to automate the launching of its servers in the cloud using shell scripts and Puppet, a tool for configuring new Amazon EC2 instances. The group has two Amazon Machine Images (AMIs)—32 and 64 bit—and provides user data when creating each instance to determine which Puppet manifests to download and apply to create the right type of server. Now having successfully launched multiple applications on AWS, GNM is developing plans to incorporate increasingly sophisticated cloud deployments in the future, including Auto Scaling.  
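The pattern described above, user data deciding which Puppet manifest an instance applies, can be sketched in a few lines of shell. This is a hypothetical illustration of the technique, not GNM's actual script; the role names and manifest filenames are invented for the example:

```shell
#!/bin/sh
# Sketch of a user-data-driven bootstrap. On a real EC2 instance the role
# would be fetched from the instance metadata service, e.g.:
#   ROLE=$(curl -s http://169.254.169.254/latest/user-data)
# Here we take it from the first argument so the script runs anywhere.

# Map a role (from user data) to a Puppet manifest. Names are illustrative.
manifest_for() {
  case "$1" in
    webserver) echo "web.pp" ;;
    dbserver)  echo "db.pp" ;;
    *)         echo "base.pp" ;;
  esac
}

ROLE="${1:-webserver}"
MANIFEST="$(manifest_for "$ROLE")"
echo "Applying Puppet manifest: $MANIFEST"
# In the real bootstrap the script would then run something like:
#   puppet apply "/etc/puppet/manifests/$MANIFEST"
```

Because a single generic AMI plus per-instance user data replaces a zoo of role-specific images, GNM only needs two AMIs (32- and 64-bit) no matter how many server types it runs.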
 
Myth #4 – I can get all the benefits of the cloud with my own private cloud  
The reality is that when you dig into the details of these private or internal clouds, they are usually very expensive, fixed-cost, private installations of infrastructure that lack the key benefits of the cloud. Companies that build these internal clouds still carry all the capital expense of the data centers and incur high ongoing maintenance costs.
 
For example, beyond the initial investment in hardware, businesses need to account for performance checks and maintenance fees. Upgrading to the newest routers or load balancers is expensive. For customers requiring a multi-million-dollar investment in infrastructure, thinking about how the value of that asset degrades over time is a critical part of the analysis. The average data center utilization is around 10%, and 20-25% is the highest figure we hear. An experienced cloud provider will run at a much higher utilization, because the demand curve is smoothed across many thousands of customers.
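The utilization figures above translate directly into cost per useful hour of compute. A rough back-of-the-envelope sketch (the $1.00/hour fixed cost and the 70% provider utilization are assumptions for illustration; only the 10% on-premises figure comes from the article):

```python
def cost_per_useful_hour(hourly_fixed_cost, utilization):
    """Fixed hourly cost spread over the fraction of hours doing useful work."""
    return hourly_fixed_cost / utilization

# Same fixed cost per server-hour, very different effective prices:
on_prem  = cost_per_useful_hour(1.00, 0.10)  # 10% utilization -> $10.00
provider = cost_per_useful_hour(1.00, 0.70)  # 70% utilization -> ~$1.43

print(round(on_prem, 2), round(provider, 2))
```

At 10% utilization, every useful server-hour effectively costs ten times its nominal price, which is the core of the fixed-cost argument against private clouds.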
 
Running a highly reliable and available IT infrastructure of course requires far more than deploying servers in a datacenter. Organizations need to have reliable storage, including backup. 
 
To achieve realistic disaster recovery, all of the data centers and servers involved have to be constantly utilized; if they sit idle, it’s almost certain they won’t function as desired when activated from a cold start. So an organization needs to account for both the cost and the complexity of this redundancy when evaluating deployment. 
 
Many people also ignore the true cost of the sizable IT infrastructure teams needed to handle the "heavy lifting": managing heterogeneous hardware and the related supply chain, staying up to date on data center design, negotiating contracts, dealing with legacy software, operating data centers, moving facilities, and managing scaling and physical growth. Companies should question what a "private cloud" really is: a term with "cloud" in it that lacks the key benefits of the cloud.
 
Cloud computing enables IT to be a true business enabler. It allows companies to focus their capital and resources on innovations that accelerate their time to market, rather than on the undifferentiated heavy lifting of running and maintaining infrastructure.
 
Think about what happened over 100 years ago, when most companies generated their own electricity. They had to install and operate generators on their premises to power their factories. Over time the electricity grid evolved; its economies became too good for companies to pass up, and running your own generator eventually became obsolete.
 
The same could happen to computing. In the fullness of time, very few companies will own their data centers, and those that do will have tiny footprints. The economies of the cloud are simply too great for forward-looking companies to ignore.