
How Google Cools Its 1 Million Servers

1sockchuck writes "As Google showed the world its data centers this week, it disclosed one of its best-kept secrets: how it cools its custom servers in high-density racks. All the magic happens in enclosed hot aisles, including supercomputer-style steel tubing that transports water — sometimes within inches of the servers. How many of those servers are there? Google has deployed at least 1 million servers, according to Wired, which got a look inside the company's North Carolina data center. The disclosures accompany a gallery of striking photos by architecture photographer Connie Zhou, who discusses the experience and her approach to the unique assignment."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward

    Take the heat you produce, and dump it somewhere else.

    Be nice if it could be my house! I want to avoid turning the heat on till Thanksgiving if possible, but it's getting a bit tough.

    • Re: (Score:3, Insightful)

      by TheLink ( 130905 )

      Take the heat you produce, and dump it somewhere else.

      Sure but there are different ways of doing it.

      Google says they have the cold air come up from their raised floor.

      Facebook does it differently; the cold air drops down:
      http://opencompute.org/2012/08/09/water-efficiency-at-facebooks-prineville-data-center/ [opencompute.org]

      I'm no data center engineer but the Facebook way makes more sense to me.

      • by sjwt ( 161428 )

        I'm no thermodynamic engineer, but it seems to me Facebook's way would lose out on the fact that the hot air would mingle with the cold air on the way down and warm it up a little.

        • Re: (Score:2, Funny)

          by TheLink ( 130905 )
          If you have hot air coming out of your servers into the cold aisle, you're doing things wrong.
        • by Cramer ( 69040 )

          Hot Aisle Containment. Without it, yes, the cold air would be mixing with rising hot air.

        • by tlhIngan ( 30335 )

          I'm no thermodynamic engineer, but it seems to me facebook's way would lose out on the fact that the hot air would mingle with the cold air on the way down and warm up a little

          For office or building heating, Facebook's method is completely conventional: A/C comes from the ceiling tiles and dribbles down to the floor, mixing with the existing warmer office air (which is why the A/C is on; in a lot of places, you use A/C year round), creating a somewhat even temperature vertically (the ceiling is not appreciably warmer than the floor).
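
          A back-of-envelope check of how much such mixing warms the supply air (a sketch in Python; the 25 C supply, 40 C hot-aisle, and 10% leakage figures are illustrative assumptions, not from the article):

            # Mass-weighted mixing temperature of two air streams.
            # Treats the specific heat of air as constant, which is fine
            # for a back-of-envelope estimate over this range.
            def mixed_temp(t_cold, t_hot, hot_fraction):
                return t_cold * (1 - hot_fraction) + t_hot * hot_fraction

            # Assumed figures: 25 C supply air, 40 C hot-aisle leakage,
            # with 10% of the hot stream mixing into the cold stream.
            print(mixed_temp(25.0, 40.0, 0.10))  # 26.5 C: it warms up a little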

      • Re: (Score:2, Insightful)

        by Anonymous Coward

        Not particularly, neither to the different ways in general nor to the Facebook way in particular.

        Heat naturally flows up; cool air dropping down would fight that ventilation effect.

        • Indeed, and the heated air rising up and being ventilated out will cause lower pressure, i.e. draw the cold air in as well :)

      • by rriven ( 737681 )

        Google says they have the cold air come up from their raised floor.

        Google uses a different approach.

        Google realized that the so-called cold aisle in front of the machines could be kept at a relatively balmy 80 degrees or so—workers could wear shorts and T-shirts instead of the standard sweaters. And the “hot aisle,” a tightly enclosed space where the heat pours from the rear of the servers, could be allowed to hit around 120 degrees. That heat could be absorbed by coils filled with water, which would then be pumped out of the building and cooled before being circulated back inside. Add that to the long list of Google’s accomplishments: The company broke its CRAC habit.

        They also might not have a million servers:

        a tiny embossed plaque that reads "July 9, 2008. Google's millionth server." But executives explain that this is a cumulative number, not necessarily an indication that Google has a million servers in operation at once.
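
        To put the water-coil numbers from that quote in perspective, a back-of-envelope estimate of the flow involved (a sketch in Python; the 10 kW rack load and 10 C water temperature rise are illustrative assumptions, not Google's numbers):

          # Water mass flow needed to absorb a given heat load:
          # Q = m_dot * c_p * dT, so m_dot = Q / (c_p * dT).
          C_P_WATER = 4186.0  # J/(kg*K), specific heat of water

          def water_flow_kg_per_s(heat_w, delta_t_k):
              return heat_w / (C_P_WATER * delta_t_k)

          flow = water_flow_kg_per_s(heat_w=10_000.0, delta_t_k=10.0)
          print(f"{flow:.2f} kg/s, about {flow * 60:.0f} L/min per 10 kW rack")
          # ~0.24 kg/s (~14 L/min): modest plumbing moves a lot of heat.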

        • Google realized that the so-called cold aisle in front of the machines could be kept at a relatively balmy 80 degrees or so—workers could wear shorts and T-shirts instead of the standard sweaters. And the “hot aisle,” a tightly enclosed space where the heat pours from the rear of the servers, could be allowed to hit around 120 degrees.

          Our Los Angeles datacenter does something similar; I haven't had to wear a sweater in a datacenter in at least twelve years. We use air, though, rather than water, since we do colocation and a water-cooled system isn't as friendly to the standard racks and servers clients often want to install.

          We benefit from the Los Angeles outside air temperature; we don't need to run chillers unless the outside temperature is over 72 degrees, and while it's often over that during summer days, it's almost always below that at night.

      • I thought the Facebook way was to blame it on the users misconfiguring things.
    • Be nice if it could be my house! I want to avoid turning the heat on till Thanksgiving if possible, but it's getting a bit tough.

      Figure out a way to efficiently transport low-grade heat long distances (or even extremely efficiently transport it short distances), and you might not get a Nobel prize, but you could get quite wealthy, and even feel good about all the energy the world saves.

      Until it's revealed that the materials used cause cancer in cute puppies, but, eh, maybe they won't find out until you're dead.

  • by Anonymous Coward on Saturday October 20, 2012 @12:49PM (#41715571)

    They use the chilling effect from all those DMCA notices they receive.

  • by Anonymous Coward on Saturday October 20, 2012 @12:51PM (#41715585)

    Google faked at least one picture. Take a look at this picture. [google.com]

    The left-hand side is an exact copy of the right-hand side. Take a look at the details: the halos from the lights and the text on the white labels.

    • The grill near the first triangular shadow at the bottom is far more lit on the right. The triangular shadow is also different. However, those labels do seem far too similar...

      • by Anonymous Coward

        Not only are they similar, they're mirror images.

        Why put [label] [barcode] (left) on half your servers, then [barcode] [label] (right) on the other half?

        Also, the UPSs (or whatever they are, the grey boxes on top) are mirrors (do they make left-handed and right-handed ones?).
        Perhaps more damning, the bus-bar power take-offs along the top (middle). Again, left-handed and right-handed ones (power cutoff switch).

        • Furthermore, if you look at the shadows cast by the cables on the floor, you will notice they are exactly the same on both left and right.
        • I agree it's been flipped (the labels make no sense otherwise), but that grill is different (or my eyes are going in my old age, of course).

    • by jbwolfe ( 241413 )
      Not by chance related to Steven Wright [youtube.com], are you?
    • by hresult ( 902522 )
      It's not Google who took and processed the pictures (there is hardly any ambient light in data centers, as you can imagine); it's the photographers who were invited to the facility and who ultimately decided to make the racks look prettier by making them strictly symmetrical.
    • by swillden ( 191260 ) <shawn-ds@willden.org> on Saturday October 20, 2012 @09:10PM (#41718711) Journal

      Google faked at least one picture. Take a look at this picture. [google.com]

      The left-hand side is exact copy of the right-hand side. Take a look at the details: The halos from the lights and the texts in the white labels.

      If you read the link with the interview with the photographer you'll find that she's into heavy post-production editing. Arguably, *all* of the images are "faked" to some extent. She takes many shots of each scene and layers them together selectively to get the effect she wants. She clones out stuff she doesn't want (e.g. she mentions removing an exit sign) and clones in stuff she feels is needed to make the image symmetric, and therefore more beautiful. She doesn't worry about barrel, pincushion and perspective distortion in the original shots and does heavy correction of the final images to straighten the lines and make the angles pleasing to the eye. She shot almost all of the images with long exposures in a darkened room, which makes the relatively small LEDs appear to glow intensely and makes their cast light powerful enough to be very visible when in reality it's not very visible at all.

      In short, she's interested in beauty more than in fidelity, and does whatever it takes to achieve it. Personally, I think her results are fantastic.

  • by stevegee58 ( 1179505 ) on Saturday October 20, 2012 @12:55PM (#41715607) Journal
    So Ted Stevens (may he rest in peace) was right! It really *is* a series of tubes!
  • by Anonymous Coward

    With 1 million interns from India that wave their arms really fast!

  • by theshowmecanuck ( 703852 ) on Saturday October 20, 2012 @01:04PM (#41715661) Journal
    I'm really starting to like the idea that they should submerge all their servers in circulating liquid and cool them that way. It could absorb a lot more heat, and it would probably make it a lot easier to reclaim the energy from the waste heat. Air has a really low specific heat. It's why they make fluffy down-filled jackets: to trap air, which acts as a good thermal insulator (and getting one wet in the winter will kill you if you don't get somewhere warm, because liquids conduct and transfer heat better due to their greater specific heat).
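
    For scale, the volumetric heat capacity (how much heat a litre of coolant carries per degree) makes the point numerically. A sketch in Python with textbook room-temperature constants; the mineral-oil figure is a typical assumed value, and oil, not water, is what immersion systems actually use on bare electronics:

      # Volumetric heat capacity = density * specific heat, J/(m^3*K).
      coolants = {
          # name: (density kg/m^3, specific heat J/(kg*K))
          "air":         (1.2,   1005.0),
          "mineral oil": (850.0, 1900.0),  # typical value, varies by product
          "water":       (998.0, 4186.0),
      }
      for name, (rho, c_p) in coolants.items():
          print(f"{name:12s} {rho * c_p / 1000:7.1f} kJ/(m^3*K)")
      # air ~1.2 vs water ~4178: liquids carry thousands of times
      # more heat per unit volume than air.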
    • Comment removed (Score:5, Informative)

      by account_deleted ( 4530225 ) on Saturday October 20, 2012 @01:11PM (#41715695)
      Comment removed based on user account deletion
      • There has to be something better than air to cool things. People figured that out with respect to engines a long time ago.
      • by anerki ( 169995 )

        Why not build specific cases that fit perfectly, in a single mold? Then you could make a group of 4 and submerge that (you could even leave the top open and connect a tube at the bottom to create a current inside, turning it into a huge heatsink too). It won't be perfect obviously, but I reckon it would work better than tubes, and since you could submerge them completely but keep them in containers that are open at the top, serviceability would be easier, at least.

    • by joib ( 70841 )
      Current-generation Google datacenters already have a PUE around 1.1, so whatever they do by tweaking the cooling, they cannot reduce the total energy consumption by more than about 10%. Of course, at their scale 10% is still a lot of energy, but the question is how much they could actually reduce that by going to immersion cooling. So far the anecdotal answer seems to be "not enough", since otherwise they would surely already have done it.
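
      The arithmetic behind that ceiling, as a sketch in Python (the PUE 1.1 figure is from this comment; the 2.0 comparison point is a commonly cited value for conventional facilities of that era, assumed here for contrast):

        # PUE = total facility energy / IT equipment energy, so the
        # non-IT overhead (cooling, power distribution, lighting) is
        # the fraction (PUE - 1) / PUE of total consumption.
        def overhead_fraction(pue):
            return (pue - 1.0) / pue

        print(f"PUE 1.1 -> {overhead_fraction(1.1):.1%} overhead")  # 9.1%
        print(f"PUE 2.0 -> {overhead_fraction(2.0):.1%} overhead")  # 50.0%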
      • I'm thinking not only of the efficiency but of something similar to cogeneration. Maybe it would be better to locate centres further north so they could help heat a small town in winter with the excess heat. IIRC in Winnipeg, for a long time, "waste steam" from a smaller electrical generation station near the core was used to heat a whole section of office buildings in the downtown (this was from the 1920s to maybe the 80s... but the idea is still valid, I think). Or would it be possible to use the excess heat...
        • by joib ( 70841 )
          The problem is that the waste heat from the servers is pretty low grade; Google runs their data centers hotter than most, and they report a waste heat temp of about 50 C. I would guess that the water they use to cool the air thus gets heated to at most 45 C or so. So it's difficult to use efficiently or economically. At least over here, district heating systems have an input temperature around 100 C (in some cases slightly more; the pressure in the system prevents boiling).

          I don't see how this would be any different...
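
          One way to quantify "low grade": upgrading that water to district-heating temperature takes a heat pump, and even the ideal (Carnot) numbers only look like this. A sketch in Python using the 45 C and 100 C figures above; the "half of Carnot" efficiency for real machines is an assumption for illustration:

            # Ideal (Carnot) heating COP for a heat pump lifting heat
            # from t_cold_c to t_hot_c: T_hot / (T_hot - T_cold), in kelvin.
            def carnot_cop_heating(t_cold_c, t_hot_c):
                t_cold, t_hot = t_cold_c + 273.15, t_hot_c + 273.15
                return t_hot / (t_hot - t_cold)

            ideal = carnot_cop_heating(45.0, 100.0)
            print(f"Carnot COP, 45 C -> 100 C: {ideal:.1f}")       # ~6.8
            print(f"Realistic (~half of Carnot): {ideal / 2:.1f}")  # ~3.4
            # Workable thermodynamically, but the extra machinery and
            # piping are what make it hard to justify economically.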

      • by epine ( 68316 )

        So far the anecdotal answer seems to be "not enough", since otherwise they would surely already have done it.

        If Cinderella's coach was going to turn into a pumpkin, it would surely have already happened before the clock struck twelve.

        You're suggesting that at the exact moment when it becomes effective to do so, all the Google data centers turn into immersive pumpkins, so the non-existence of such is a perfect correlate with its viability, since any friction around the decision is inconceivable.

        What did you...

        • by joib ( 70841 )
          That might be a relevant argument if immersion cooling (or more generally, liquid cooling of the servers themselves) were somehow new, innovative, or non-obvious. It's none of those. Secondly, I didn't mean to imply that Google would turn around on a dime, but rather that at least some of the newer data centers would use something better if it were available. The Hamina data center seen in those pictures, for instance, was opened in 2012 and seems to use the same air-cooled hot-aisle containment design. I haven't seen...
  • ... and risk damaging my belief they use ice-pops.

    -Oliver

  • Can someone give me a short and elegant argument why one would use hot aisle containment? Cold air is precious, so intuitively I would think you'd want to contain that.
    • by joib ( 70841 )
      Google runs their datacenters at quite high temperatures: the cold side is around 25 C, the hot side 50 C. I suppose it would be a pretty unpleasant working environment if the main space of the server rooms were at 50 C rather than 25 C.
    • by Jstlook ( 1193309 ) on Saturday October 20, 2012 @01:55PM (#41715953)
      Their cold air is essentially room temperature, as they're using 80 degrees (presumably F) for that side. So really they've just contained the servers, sucked all the heat out, cooled it down to room temperature, and dumped it back into the room. It's far more efficient because they're not using the servers to heat a whole room or building and then air-conditioning each room for human usage.
      • by godrik ( 1287354 )

        Their cold air is essentially room temperature, as they're using 80 degrees (presumably F) for that side.

        They are Google. I think it is 80K!

    • by CBravo ( 35450 )
      It is more efficient to cool down half the air by twice as much: less overhead with pumps, plumbing, cooling surface, etc. The only problem, as mentioned by others, is that you had better not let the hot air escape. IANACE.
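
      The fan-side arithmetic behind that, sketched in Python (the cube-law scaling is the idealized fan affinity rule; real systems deviate from it):

        # Q = m_dot * c_p * dT: halving the airflow while doubling the
        # temperature rise removes the same heat. Fan power, though,
        # scales roughly with the cube of flow (fan affinity laws).
        def relative_fan_power(flow_fraction):
            return flow_fraction ** 3

        print("Half the airflow, double the dT: same heat moved,")
        print(f"fan power ~{relative_fan_power(0.5):.1%} of baseline")  # 12.5%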
  • I wish there was a good way to hear the tremendous noise all those fans make in the hot aisle.

  • by Anonymous Coward

    There was a placard for the millionth server board. Google stated clearly that it was #1,000,000 and that it did not mean there were 1M in service. That was really clear in the article, so the summary messed it up or just missed it.

  • by vikingpower ( 768921 ) on Sunday October 21, 2012 @05:00AM (#41720535) Homepage Journal

    A Dutch newspaper, which this week published several of Zhou's photos, found out (after a thread on Reddit began mentioning possible photoshopping) that Zhou did indeed manipulate her pictures: http://www.nrc.nl/nieuws/2012/10/20/google-publiceert-prachtige-fotos-van-datacentra-maar-zijn-ze-echt-nee/ [www.nrc.nl] is the link (story, of course, in Dutch).

    When you look at the pictures with a critical eye, you see it quite quickly: on half of the servers, the LEDs are on the wrong side; they are simply mirrored. Zhou declared she is "crazy about symmetry". As one commenter on Reddit put it: "I knew it! For a long time, Google has been trying to make us believe that they have a lot of servers. Well, this proves that they only have very many servers." Google quite quickly admitted the news, but did not see a reason to take the Zhou series of pictures offline.

  • I smell an engineering contest. I'm far from an engineer... a couple of light-years, probably. But if that air moves, can't it be used for energy reclamation, driving some kind of micro-turbines (do those exist?) to get back a little of the energy spent, which could be redirected into the system again? Or is this one of my 'you've read too much science fiction' moments?
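
    Micro-turbines do exist, but the numbers are discouraging. A sketch in Python (the 3 m/s exhaust velocity and 1 m^2 opening are illustrative assumptions):

      # Kinetic power available in a moving air stream through an
      # opening: P = 0.5 * rho * A * v^3.
      RHO_AIR = 1.2  # kg/m^3 at roughly room temperature

      def kinetic_power_w(area_m2, velocity_m_s):
          return 0.5 * RHO_AIR * area_m2 * velocity_m_s ** 3

      print(f"{kinetic_power_w(1.0, 3.0):.0f} W per m^2 of exhaust")  # ~16 W
      # Tens of watts versus tens of kilowatts of heat per rack; and a
      # turbine adds back-pressure the fans then have to work against.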
