How Google Cools Its 1 Million Servers
1sockchuck writes "As Google showed the world its data centers this week, it disclosed one of its best-kept secrets: how it cools its custom servers in high-density racks. All the magic happens in enclosed hot aisles, including supercomputer-style steel tubing that transports water — sometimes within inches of the servers. How many of those servers are there? Google has deployed at least 1 million servers, according to Wired, which got a look inside the company's North Carolina data center. The disclosures accompany a gallery of striking photos by architecture photographer Connie Zhou, who discusses the experience and her approach to the unique assignment."
STEEL TUBES, full of WATER?! (Score:0, Insightful)
Oh my god, what an innovation! Google has invented PLUMBING, and run tubes FULL OF WATER mere INCHES AWAY from its servers!
This is unheard of! How do they avoid getting everything wet, and having a couple feet of standing water on the floor?! I must know more about this magical technology Google has invented!
These are strange days we're livin', bros.
Re:The same way as everybody else. (Score:3, Insightful)
Take the heat you produce, and dump it somewhere else.
Sure but there are different ways of doing it.
Google says they have the cold air come up from their raised floor.
Facebook does it differently: the cold air drops down from above.
http://opencompute.org/2012/08/09/water-efficiency-at-facebooks-prineville-data-center/ [opencompute.org]
I'm no data center engineer but the Facebook way makes more sense to me.
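The "take the heat and dump it somewhere else" principle boils down to a simple energy balance: the airflow through a rack has to carry away its heat load, Q = ṁ · cp · ΔT. A minimal sketch of that arithmetic (the 10 kW rack load and 12 K cold-aisle-to-hot-aisle temperature rise are made-up illustrative numbers, not Google's or Facebook's figures):

```python
# Energy balance for air cooling: heat removed = mass flow * specific heat * temp rise.
# Constants and the example rack load below are illustrative assumptions.

AIR_DENSITY = 1.2   # kg/m^3, dry air near sea level at room temperature
AIR_CP = 1005.0     # J/(kg*K), specific heat of air at constant pressure

def airflow_for_heat_load(heat_watts, delta_t_kelvin):
    """Volumetric airflow (m^3/s) needed to remove heat_watts when the
    air warms by delta_t_kelvin passing over the servers."""
    mass_flow = heat_watts / (AIR_CP * delta_t_kelvin)  # kg/s
    return mass_flow / AIR_DENSITY                      # m^3/s

# Hypothetical 10 kW rack, air warming 12 K across the servers:
flow_m3s = airflow_for_heat_load(10_000, 12.0)
flow_cfm = flow_m3s * 2118.88   # 1 m^3/s is about 2118.88 CFM
print(f"{flow_m3s:.2f} m^3/s (~{flow_cfm:.0f} CFM)")
```

Whichever direction the cold air travels, the balance is the same; a bigger allowed ΔT (i.e., a hotter hot aisle, which enclosed hot aisles make practical) means proportionally less air needs to be moved.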
Re:The same way as everybody else. (Score:2, Insightful)
Not particularly, either to the different ways or to the Facebook way.
Heat naturally rises, so cool air dropping down from above would fight that natural ventilation effect.