Is Cloud Computing The Biggest Green Technology?
November 14, 2012. Grazed from CloudTweaks. Author: Abdul Salam.
Global warming and climate change sit at the top of the world's list of concerns, and one of the reasons is our dependence on dirty energy from fossil fuels. We usually point to the transportation sector as the main source of atmospheric pollution, but is the IT industry really exempt from blame?
Governments around the globe hold factories and industrial facilities to stringent standards on energy consumption and emissions, yet the energy consumed by IT laboratories and data centers is largely overlooked, with the exception of some universities and research organizations. There are no standards or laws to follow when putting up such a facility, and that is a big problem. Research suggests that a great deal of energy is wasted converting power from AC to DC, and that cooling a server can consume twice as much energy as running it…
Let's say a server is rated at 500 W and runs 24/7; it would consume 4,380 kWh per year. Now assume ten such servers run continuously at every IT company with more than 1,000 employees. The census bureau counted 2,916 such companies in 2007, and the number has likely grown considerably over the last five years. That works out to roughly 127,700 MWh of consumption in 2007 alone, with twice that again spent on cooling the servers. And that assumes every company ran identical 500 W servers in the same way; the real-world figure could be greater still, since many older, less efficient servers remain in service. Nor is all of that hardware doing useful work; underutilization is the biggest waste of resources. This is a very big concern, especially since most of that energy is not renewable…
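The arithmetic above can be rechecked with a short script. The constants are the article's own assumptions (500 W servers running around the clock, ten per company, the 2007 census count of 2,916 companies, and cooling at twice the compute energy); none of them come from measured data.

```python
# Back-of-the-envelope estimate of data-center energy use, using the
# article's assumed figures (not measured values).

WATTS_PER_SERVER = 500        # assumed rated draw per server
HOURS_PER_YEAR = 24 * 365     # 8,760 hours of continuous operation
SERVERS_PER_COMPANY = 10      # assumed servers per large IT company
COMPANIES = 2916              # census bureau count for 2007

# Energy per server per year, in kWh (W * h / 1000).
kwh_per_server = WATTS_PER_SERVER * HOURS_PER_YEAR / 1000

# Total across all assumed servers, converted to MWh.
mwh_total = kwh_per_server * SERVERS_PER_COMPANY * COMPANIES / 1000

# The article assumes cooling costs twice the compute energy.
mwh_cooling = 2 * mwh_total

print(f"Per server:  {kwh_per_server:,.0f} kWh/year")
print(f"All servers: {mwh_total:,.1f} MWh/year")
print(f"Cooling:     {mwh_cooling:,.1f} MWh/year")
```

Running it gives 4,380 kWh per server and about 127,720.8 MWh across all servers, with roughly 255,441.6 MWh more for cooling.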
Read more from the source @ http://www.cloudtweaks.com/2012/11/is-cloud-computing-the-biggest-green-technology/


