Intel Experiment Could Save Millions in Data Center Power Costs
An Intel experiment in cooling data centers has found that cooling servers with plain outside air is almost as effective as using air conditioning, with only a slight increase in the server failure rate. [Intel is a sponsor of SVW]
This could save data centers millions of dollars in power costs and allow expansion. Currently, many data centers cannot expand because they are constrained by how much electric power is available to them.
Here are the details from ZDNet:
. . . The experiment was run for 10 months, between October 2007 and August 2008. Server units with over 900 blades, used for production design, were split into two compartments. One of the compartments was air cooled, with temperatures ranging from 18 to 32°C. The other compartment was cooled using air conditioning, and used as a control.
. . . Intel used a normal air filter that took larger particles out of the air but not fine dust. While the 32 servers and racks became coated in dust, and humidity was monitored but not controlled, the failure rate was 4.46 percent, compared with a 3.83 percent failure rate in Intel's main datacenter over the same period.
. . . Intel estimated an annual cost reduction of approximately $143,000 (£79,000) for a small, 500kW datacenter, based on electricity costs of eight cents per kWh. In a larger 10MW datacenter, the estimated annual cost reduction was $2.87 million.
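As a rough sanity check on the figures above, the $2.87 million number for the 10MW datacenter is just the 500kW estimate scaled linearly. A quick sketch, assuming a constant load running year-round at the quoted eight cents per kWh:

```python
# Sanity check on the quoted figures. Assumption (mine, not Intel's):
# savings scale linearly with datacenter power draw, and the load is constant.
RATE = 0.08  # dollars per kWh, as quoted


def annual_power_cost(load_kw, rate=RATE):
    """Annual electricity cost of a constant load at the given rate."""
    return load_kw * 24 * 365 * rate


baseline_500kw = annual_power_cost(500)         # baseline power bill at 500 kW
savings_500kw = 143_000                         # Intel's estimate for 500 kW
savings_10mw = savings_500kw * (10_000 / 500)   # scale linearly to 10 MW

print(round(baseline_500kw))  # 350400
print(round(savings_10mw))    # 2860000
```

The scaled figure comes out to $2.86 million, which matches Intel's $2.87 million estimate to within rounding, so the two numbers are consistent with simple linear scaling.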
Foremski's Take: This is a big power saving. It will enable expansion at some data centers, but it could also delay the installation of more power-efficient hardware based on Intel's latest power-saving chipsets, as well as power-saving hard drive systems from 3PAR and others. Power savings have been a key incentive for many data centers to rip out and replace their older, power-hungry gear.
Since high humidity appears to be less of a problem than feared, maybe we'll see server farms combined with greenhouses :-) Cloud computing plus vine-ripened tomatoes.