10 Ways Data Centers Are Becoming Greener

April 13, 2011 · By David
Grazed from GigaOM.  Author: Katie Fehrenbacher.

Energy-efficient data centers have solidly moved into the (low-power) spotlight in recent weeks thanks to the Open Compute Project from Facebook. Last week, the social network giant shared an unprecedented amount of data about the low-power servers and data center designs that have made its new data center in Oregon remarkably energy-efficient. To me, the move shows just how important energy efficiency has become for the leading Internet companies as a way to stay competitive and keep their energy costs as low as possible.

At our Green:Net 2011 event (register here) coming up on April 21 (just next week!), we’ll be featuring conversations about energy-efficient and clean-powered data center innovation from Internet companies like Google, Yahoo and Microsoft; data center energy gear vendors like GE; and startups like Calxeda and SeaMicro (two of our Big Ideas companies). These entrepreneurs’ and engineers’ ideas for reducing energy usage and adding in clean power are diverse, inspiring and cost-effective, and could one day be the norm in computing.

Here are 10 innovative ways that data centers are becoming greener:

1. ARM Wrestling. Calxeda builds servers out of clusters of cell phone chips (ARM chips) in order to optimize power efficiency. The startup’s tech has received ringing endorsements from analysts, and large companies like graphics chip maker Nvidia have followed suit with their own plans to use the ARM architecture for servers.

2. Wireless Networks and Sensors. Good ol’ wireless networks and sensors can determine which areas of a data center are running too hot or too cold. It’s basically using wireless tech to fill an energy blind spot. SynapSense is a startup launched by Intel’s Peter Van Deventer and U.C. Davis computer science professor Raju Pandey that makes wireless sensor technology to feed power and cooling data into real-time visualization software. Startup Sentilla takes a slightly different approach: the company’s “Energy Manager for Data Centers” lets IT shops wirelessly monitor their energy consumption in multi-vendor environments.
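To make the hot/cold-spot idea concrete, here’s a minimal Python sketch of the kind of logic this monitoring feeds into. The zone names, thresholds and readings are hypothetical assumptions for illustration, not SynapSense’s or Sentilla’s actual data model.

```python
# Hypothetical sketch: flagging hot and cold zones from wireless sensor readings.
# Zone names, thresholds and values are illustrative assumptions only.
from statistics import mean

# Simulated inlet-temperature readings (degrees C) reported by sensors, per zone
readings = {
    "row-A": [27.1, 28.4, 29.0],
    "row-B": [18.2, 17.9, 18.5],
    "row-C": [23.0, 22.6, 23.4],
}

TOO_HOT = 27.0   # above this, servers risk throttling or failure
TOO_COLD = 20.0  # below this, the room is likely overcooled (wasted energy)

for zone, temps in readings.items():
    avg = mean(temps)
    if avg > TOO_HOT:
        print(f"{zone}: {avg:.1f}C -- hot spot, check airflow and containment")
    elif avg < TOO_COLD:
        print(f"{zone}: {avg:.1f}C -- overcooled, raise setpoint to save energy")
    else:
        print(f"{zone}: {avg:.1f}C -- within target range")
```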

3. Computing Power On Demand. Why not build a system that uses only as much computing power as website operators need at any given moment? That’s what Power Assure, a startup founded three years ago, is doing. The company makes a Software-as-a-Service product that ramps data center power consumption up and down to match the demand of its web company users. So in the middle of the night, when few people are pinging its customers’ websites, Power Assure’s service can cut energy consumption accordingly; when traffic spikes in the morning, it can quickly ramp capacity back up.
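As a rough illustration of the concept, here’s a tiny sketch of demand-based capacity sizing. The per-server capacity, headroom factor and minimum pool size are assumptions made up for this example, not Power Assure’s actual algorithm.

```python
# Hypothetical sketch of demand-based capacity control. Numbers are assumptions.
import math

REQUESTS_PER_SERVER = 500   # assumed capacity of one active server (req/s)
HEADROOM = 1.25             # keep 25% spare capacity for sudden traffic spikes
MIN_ACTIVE = 2              # never drop below a small always-on pool

def servers_needed(current_rps: float) -> int:
    """Return how many servers should stay powered on for the current load."""
    needed = math.ceil(current_rps * HEADROOM / REQUESTS_PER_SERVER)
    return max(needed, MIN_ACTIVE)

# Overnight lull vs. morning spike
print(servers_needed(300))     # -> 2  (most servers can be powered down)
print(servers_needed(12_000))  # -> 30 (capacity ramps back up quickly)
```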

4. Clean Power Computing Experiment. A handful of companies are trying to aggressively combine clean power and data centers, including Greenqloud (geothermal), Baryonyx (wind) and even this trash-powered data center idea. But one of the few companies that seems to be realistically working out an economic approach to weaving clean power into data centers is Google. The search engine giant has been investing significantly in clean power projects recently and has told us how it could integrate those resources when the timing and the economics allow.

5. Servers That Act Like People. Humans have evolved to be pretty good at using energy only when we need it. It’s a biological conservation method: we conserve calories so we can burn them when we need to fight or flee, not when we want to sleep and chillax. Servers, on the other hand, run pretty inefficiently, and oftentimes use the same amount of energy when lightly used (say, when web services are pinging them less frequently) as they do at maximum use (during the live streaming of Obama’s inauguration, for example). It’s similar to the issue Power Assure is trying to tackle, but Google thinks servers should be designed and engineered from the ground up to work this way, versus relying on a third party’s software to manage the process.
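A simple linear power model shows why the idle floor matters. The wattage figures below are assumptions chosen purely for illustration, not measurements from Google or anyone else.

```python
# Rough illustration of energy proportionality. Wattage figures are assumptions.
def power_draw(idle_watts: float, max_watts: float, utilization: float) -> float:
    """Simple linear power model: an idle floor plus a load-dependent component."""
    return idle_watts + (max_watts - idle_watts) * utilization

# A lightly loaded server (10% utilization) with a high idle floor barely saves
# anything; an "energy-proportional" design with a low idle floor saves a lot.
typical = power_draw(idle_watts=200, max_watts=300, utilization=0.10)       # 210 W
proportional = power_draw(idle_watts=30, max_watts=300, utilization=0.10)   # 57 W

print(f"typical server at 10% load:             {typical:.0f} W")
print(f"energy-proportional server at 10% load: {proportional:.0f} W")
```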

6. Get Some Fresh Air. Turns out tapping into fresh air can do wonders for data center energy efficiency. The massive cooling systems that chill data center servers can account for up to half of a data center’s power consumption. Some operators are eliminating those chillers by building data centers in climates suited to open-air cooling and by developing open-air designs that block out dirt and humidity while using the natural environment to keep the facility at an optimal temperature. Yahoo has developed a chicken coop-inspired data center design that lets in outside air.

7. Water Sprayed Cooling. Take a page from outside-air cooling and add a spritz of water. Facebook keeps its data center in Oregon cool by spraying water into the incoming air, and it designed its servers to work in that warmer, more humid environment.
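Items 6 and 7 boil down to a simple control decision: use outside air when conditions allow, add an evaporative assist when it’s a bit warmer, and fall back to chillers otherwise. Here’s a minimal sketch of that logic; the temperature and humidity limits are illustrative assumptions, not Yahoo’s or Facebook’s actual setpoints.

```python
# Hypothetical air-side economizer decision. Limits are illustrative assumptions.
def cooling_mode(outside_temp_c: float, outside_rh_pct: float) -> str:
    """Pick a cooling mode based on outside-air conditions."""
    if outside_temp_c <= 24 and 20 <= outside_rh_pct <= 80:
        return "free-air"                        # open the louvers, fans only
    if outside_temp_c <= 29 and outside_rh_pct <= 80:
        return "free-air + evaporative assist"   # spray water into incoming air
    return "mechanical chillers"                 # too hot or too humid outside

print(cooling_mode(18, 55))   # free-air
print(cooling_mode(27, 40))   # free-air + evaporative assist
print(cooling_mode(33, 85))   # mechanical chillers
```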

8. Liquid-Cooled Servers. Water isn’t just helpful for cooling in little sprays in the data center. What about dunking entire servers in it, or better yet, a weird fluid mixture? While it’s still early days for liquid-cooled servers, a few startups are developing technology in this area. Iceotope has built a liquid-cooled server system that it says can cut data center cooling costs by a whopping 93 percent. Green Revolution Cooling also submerges servers in an inert liquid (mineral oil, yuck).

9. Data Center in a Box. It’s all about modular computing. Microsoft has developed a green, pre-fab “data center in a box” technology called the ITPAC (IT pre-assembled components). The ITPAC is manufactured by air cooling company Saiver, uses outside air for cooling, and is basically plug-and-play, servers and all. Because the operator doesn’t need to construct a traditional building, the approach also cuts the need for concrete, steel, piping, copper and so on. The U.N. recently turned to the ITPAC system for its office in Nairobi.

10. Cloud Computing? Is cloud computing more energy-efficient, or less? Who knows, but many vendors, including Microsoft and Akamai, have arguments ready for why cloud computing is more energy-efficient and a better use of computing resources. Other studies suggest the answer is more complicated and depends on web traffic, usage patterns and the underlying infrastructure.