From The Verge. Author: Tim Carmody.
A new report by The New York Times on the internet’s energy consumption estimates that data centers worldwide use 30 billion watts of electricity, including as much as 10 billion watts in the United States alone. According to McKinsey & Company, in a report commissioned by the Times, between 6 and 12 percent of that energy powers actual computations; the rest keeps servers running in case of a surge or crash. "This is an industry dirty secret," an anonymous executive told the Times. Other sources quoted in the story call the growing energy needs of servers unsustainable and a "waste," the result of companies building out server capacity far beyond current demand because they want "too many insurance policies" against even a millisecond of downtime.
"Power, Pollution and the Internet" is the first entry in a multi-part investigative series on the environmental impact of cloud computing. Judging by this opening salvo, the series promises to be sharply critical of the technology industry. It calls the industry secretive, slow to change its practices, and unrealistic in its presentation of the internet as an environmental boon — taking particular care to single out Google, Amazon, and Facebook for specific consideration…
The report also presents a distorted and outdated view of the internet and cloud computing. It focuses on frivolous media and entertainment, or "fantasy points and league rankings, snapshots from nearly forgotten vacations kept forever in storage devices." It doesn’t really grapple with the cloud as an increasingly essential element of infrastructure, powering industry, government, finance, and commerce, as well as personal communication and data storage…
Read more at the source: http://www.theverge.com/2012/9/23/3377868/cloud-internet-infrastructure-waste-energy-new-york-times