Data centers are notorious energy hogs, locked in a constant cycle of powering servers up and cooling them back down. In September, Google disclosed that its data centers continuously draw nearly 260 million watts, roughly equal to 25 percent of the output of a nuclear power plant. The New York Times noted that Google defends its power use partly by pointing out the energy saved when people search for something online instead of, for example, taking a car trip to the library.
But new conclusions from a Microsoft Research study may offer a less abstract way of arguing the sustainability of data centers. The concept? A home heating “data furnace.” According to the study, the heat created by servers could be used to heat homes, as long as each home has a broadband connection that links its “micro data center” to the larger companies requiring computing power. Instead of a furnace, homes might have a cabinet filled with computer servers connected to a circulation fan and ductwork.
“It’s a classic example of being able to transform waste into energy,” says Dominic Basulto of Big Think. “In some parts of America, vast data centers holding hundreds of thousands of servers are generating heat right now as they crunch through all the information being generated in the cloud. So much heat, in fact, that they have been specifically relocated to cooler climates where it is easier (and cheaper) to dissipate all the energy they are creating,” he says. In New York City alone, he notes, there are 50 different municipal data centers for storing government data, “not to mention all the data centers in hollowed-out buildings used by corporate clients.”
Randall Stross points out a few more advantages of data furnaces in the Times: “A conventional data center must invest about $400 a year to run each server, or about $16,000 for a cabinet filled with 40 of them. (This includes the costs of building a bricks-and-mortar center and cooling the machines.)” A company’s cost to operate the same cabinet in a home would be less than $3,600 a year, and would cover the homeowner’s electricity costs for the servers.
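For a rough sense of the economics, here is a back-of-the-envelope sketch using only the figures quoted above (the $3,600 home figure is the upper bound Stross cites, with the homeowner's electricity reimbursement already folded in):

```python
# Back-of-the-envelope cost comparison using the article's figures.
SERVERS_PER_CABINET = 40
COST_PER_SERVER_CONVENTIONAL = 400  # $/year, incl. building and cooling

conventional_cabinet = SERVERS_PER_CABINET * COST_PER_SERVER_CONVENTIONAL
home_cabinet = 3_600  # $/year, upper bound quoted in the article

savings = conventional_cabinet - home_cabinet
print(f"Conventional cabinet: ${conventional_cabinet:,}/year")
print(f"Home data furnace:    ${home_cabinet:,}/year")
print(f"Savings per cabinet:  ${savings:,}/year "
      f"(~{savings / conventional_cabinet:.0%})")
```

On these numbers, a company would save roughly three-quarters of its per-cabinet operating cost by moving the servers into a home.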
Unfortunately, there are also potential glitches in the proposed system. For example, if temperatures were to top 95 degrees Fahrenheit, the servers would have to be shut down. On the upside, the researchers say the data furnaces wouldn’t make homes hotter in the summer, as homeowners could simply vent the hot air outside via an additional duct, “as harmless as a clothes dryer’s.”
There is also the obvious security question, discussed by Rebecca Boyle in Popular Science. “How could IT companies ensure that a client’s confidential data is safe in some random family’s basement?” she posits. “What about floods, power outages, or server snafus?” The researchers respond that the servers would remain under the control of a company’s central data center and all data would be encrypted. If a micro center were to fail, its work would be automatically reassigned to another server.
With chilly months ahead, does capturing heat from data centers sound like a positive move toward sustainability or a risky inconvenience to you?