Intel and SGI Test Full-Immersion Cooling For Servers

itwbennett (1594911) writes "Intel and SGI have built a proof-of-concept supercomputer that's kept cool using a fluid developed by 3M called Novec that is already used in fire suppression systems. The technology, which could replace fans and eliminate the need to use tons of municipal water to cool data centers, has the potential to slash data-center energy bills by more than 90 percent, said Michael Patterson, senior power and thermal architect at Intel. But there are several challenges, including the need to design new motherboards and servers."
This discussion has been archived. No new comments can be posted.

  • Cray-2 (Score:5, Informative)

    by Anonymous Coward on Thursday April 10, 2014 @02:06AM (#46711555)

    The Cray-2 did this in 1985 using a liquid called Fluorinert also invented by 3M:

    http://en.wikipedia.org/wiki/Cray-2
    http://en.wikipedia.org/wiki/Fluorinert

    • 3M Novec 649 Engineered Fluid

      Novec 649 fluid is an advanced heat transfer fluid, balancing customer needs for physical, thermal and electrical properties, with favorable environmental properties. Novec 649 fluid is an effective heat transfer fluid with a boiling point of 49C. Novec 649 fluid is useful in heat transfer particularly where non-flammability or environmental factors are a consideration.

      • by Anonymous Coward

        49C is rather low for a boiling point for this application. Not that I'm terribly familiar with their product lines, but there are several in the 7xxx series that are more likely candidates - some with boiling points better than 100C. Running things hot (within tolerances) would help with the efficiency of heat transfer to the environment as well.

        • Re:I wonder... (Score:5, Interesting)

          by gl4ss ( 559668 ) on Thursday April 10, 2014 @03:47AM (#46711919) Homepage Journal

          it can be an advantage, as long as it doesn't break down on boiling.

that way the cpu can stay at 49C and the system can be built to not require pumps, just by piping the vapor to a cooling tower and from the tower back to the servers. of course this needs a redesign of the server and components, as said.

        • 49C is rather low for a boiling point for this application.

          Is it? Phase transitions generally require quite a lot of energy. It is my understanding that if you allow the vapors to condense externally and return the liquid back, you'll get a significantly improved heat transfer. In fact, this is why heat pipes work so well.

          • by mpe ( 36238 )
            Phase transitions generally require quite a lot of energy.

Phase transitions involve energy referred to as "latent heat". However, the latent heat of boiling and that of freezing (along with the specific heat capacity in any phase) depend very much on the substance involved.
            • by Polo ( 30659 ) *

It seems to be 99kJ/kg at its boiling point of 49C/120F

I always thought that, for all practical purposes, coolants increase in temperature to their boiling point and just stay there (or a little higher if under pressure, like a car radiator or pressure cooker)

              That would mean systems with this fluid would reach 120F and basically go no further (unless ALL the coolant boiled off, which I doubt would happen)
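As a back-of-the-envelope check on how much fluid a boiling coolant has to vaporize to carry a given load (a sketch using the ~99 kJ/kg latent-heat figure quoted above, which is an assumption taken from the comment, not a 3M spec sheet):

```python
# Back-of-the-envelope: how much coolant must boil off per second
# to carry away a given heat load. The latent heat is the figure
# quoted in the comment above (an assumption, not a vendor spec).

LATENT_HEAT_J_PER_KG = 99_000  # J/kg, heat of vaporization (quoted figure)

def boil_off_rate(heat_load_watts):
    """Mass of coolant vaporized per second to absorb the load, in kg/s."""
    return heat_load_watts / LATENT_HEAT_J_PER_KG

# A 300 W CPU boils off roughly 3 grams of fluid per second.
print(boil_off_rate(300))      # ~0.00303 kg/s
# A 100 kW heat load needs roughly 1 kg/s of vapor generation.
print(boil_off_rate(100_000))  # ~1.01 kg/s
```

Since the vapor condenses and drips back into the bath, this is the circulation rate the boiling itself provides for free, with no pump.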

        • The refrigerant in most air conditioner systems boils at about 45 F. The ability to transfer heat to the outside world depends on the compressor power - not on what the boiling temperature is. Because of the temperature and chemical compatibility with systems that run R-134a, there are going to be a lot of hardware cost reductions, multi-source suppliers, and existing infrastructure to support that technology. That boiling point yields a better technology ecosystem.

        • Comment removed based on user account deletion
Yes, boiling at this temperature is useful. It makes it easier to separate the hot from the cold: the equipment can be immersed in the liquid phase, and as it heats up, the fluid automatically separates out the part that needs cooling and re-condensing.

Transporting the hot part becomes easy; the system has a natural pump, driven by the heat itself, cycling the molecules around. So all that heat energy is more usefully absorbed by the system (into kinetic energy), and you are not putting additional energy in (such as a liquid pump), which also

      • Where did you see that it was Novec 649? There are a whole bunch of different 'Novec' engineered fluids... They could be referring to Novec 1230, which is a fire suppression fluid [mentioned in TFS]...that one doesn't seem very healthy to be around.

        • Where did you see that it was Novec 649? There are a whole bunch of different 'Novec' engineered fluids... They could be referring to Novec 1230, which is a fire suppression fluid [mentioned in TFS]...that one doesn't seem very healthy to be around.

          SWAG... scientific wild ass guess.
I looked at the Novec product line and picked one that I would try first.
          Note I changed the comment subject to "I wonder" not "I know".

        • by Polo ( 30659 ) *

Looks like the specs for both of them are about the same: 49C/120F boiling point, 88kJ/kg specific heat.

      • Re:I wonder... (Score:5, Informative)

        by grouchomarxist ( 127479 ) on Thursday April 10, 2014 @02:46AM (#46711699)

        None of the articles I've seen mentioned which version of Novec is being used. They have a great variety: http://solutions.3m.com/wps/po... [3m.com]

      • by Anonymous Coward

Silicone oil is not flammable, can withstand a lot of heat, has excellent heat transfer characteristics, doesn't conduct electricity ... and furthermore, silicone oil is CHEAP !!

Can silicone oil be used in a similar operation ?

        • by SuricouRaven ( 1897204 ) on Thursday April 10, 2014 @03:15AM (#46711803)

          Yes, it can. I've got a little bitcoin miner chip running right now as a proof of this. I'm not a bitcoin enthusiast, just wanted something hot and expendable to test immersion cooling on.

          There is one downside: Viscosity. It's thick stuff, so it takes a powerful pump to keep it actively circulating. It also tends to pool in spaces underneath components and anywhere not exposed to easy circulation, impeding cooling.

          • My experience with silicone oil was that it was very _thin_, and tended to try to creep out of its containers (lubricant for the heads on a drum recorder, back in the 70's and 80's). We weren't using it for heat transfer, just lubrication, and used cotton wicks to pull the oil out of the tray and apply it to the drum (no pump required).

            • Your oil must have shorter chains than mine. Much like hydrocarbon oil, it comes in a variety of thicknesses and other properties depending on chain length. I thought I had one of the shorter mixes, but not the shortest.

        • by advid.net ( 595837 ) <slashdot@nOsPaM.advid.net> on Thursday April 10, 2014 @05:16AM (#46712137) Journal
Their fluid is boiling; the phase transition takes a lot of heat out without pumping anything.
If not, you need to pump fluid between boards, which requires more space and energy, even more with a thicker fluid.
          • by Polo ( 30659 ) *

            Their fluid is boiling, phase transition takes a lot of heat out without pumping anything.

            That's the key point.

If you have a pot on your stove filled with water at 211 degrees F, it will absorb on the order of 1,000 calories and then the pot will be at 212 F.
But then the pot will absorb around 540,000 calories before it gets past 212 F, because the water has to boil off first.
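The ratio behind those numbers is easy to verify (a sketch using standard handbook values for water, not figures from the thread):

```python
# Why the pot 'sticks' at the boiling point: for water, vaporizing
# a kilogram takes ~540x the energy of raising it one degree C.
# Handbook values for water (assumptions, not from the article):
SPECIFIC_HEAT_KCAL = 1.0   # kcal/(kg*C) for liquid water
LATENT_HEAT_KCAL = 540.0   # kcal/kg, heat of vaporization at 100 C

mass_kg = 1.0
heat_one_degree = mass_kg * SPECIFIC_HEAT_KCAL * 1.0  # 1 kcal (1,000 calories)
heat_to_boil_off = mass_kg * LATENT_HEAT_KCAL         # 540 kcal (540,000 calories)

print(heat_to_boil_off / heat_one_degree)  # 540.0
```

The same principle is what a boiling immersion coolant exploits: the bulk of the heat goes into the phase change, so component temperature is pinned near the boiling point.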

There's also the fact that Fluorinert is potentially toxic, and it's also a greenhouse hazard. One would hope that 3M learned their lessons in the development of Novec and that it's not an environmental hazard.
      • by fnj ( 64210 )

There's also the fact that Fluorinert is potentially toxic, and it's also a greenhouse hazard. One would hope that 3M learned their lessons in the development of Novec and that it's not an environmental hazard.

        All right, I'll bite. Aside from "OMG, it is, gasp, a CHEMICAL", if it is inert, how can it be toxic? From the MSDS for Fluorinert FC-40 [3m.com]:

        "Not classified as hazardous according to OSHA Hazard Communication Standard, 29 CFR 1910.1200."
        "No occupational exposure limit values exist for any of the components lis

    • The Cray-2 did this in 1985 using a liquid called Fluorinert also invented by 3M:

      Cray-2 [wikipedia.org]
      Fluorinert [wikipedia.org]

      Yup, and I was an admin on one at NASA LaRC from 1988-92. Always wanted to put some fake floaty fish inside the thing, but people have no sense of humor about something that cost ~ $20M.

    • by Polo ( 30659 ) *

      I hope it's cheaper than Fluorinert, which I remember reading was hundreds or thousands of dollars a gallon.

  • by jones_supa ( 887896 ) on Thursday April 10, 2014 @02:10AM (#46711569)
A small history lesson for those who don't know: this is not the same SGI (or Silicon Graphics) as the one of graphics workstation fame. This one is Rackable Systems, which acquired the assets of the original SGI in 2009 (and SGI Japan in 2011).
    • by serviscope_minor ( 664417 ) on Thursday April 10, 2014 @02:31AM (#46711637) Journal

      It's also not the SGI that owned CRAY in the past, who used to make supercomputers immersed in 3M fluids.

Anyway, the summary doesn't quite ring true. The fluids are great at getting heat efficiently away from the servers (better than air, if rather less convenient), but it still has to go somewhere after that.

    • Ah.. My first thought was "SGI is still around?"

      • I really miss SGI. Wish at least the SGI that made Itanic supercomputers was still around.
I worked at SGI for a short while, at the Mountain View campus, before it was infested and taken over by Google...

          SGI was one of the coolest companies in the bay area, or even the world. I can't begin to describe the joy of working there and of just *being* there.

          really sad when they closed down. also sad when Sun closed down (I also worked there, too).

why do we lose good companies, while piece-of-shit things like facebook and twitter are the 'new computer economy'? we went backwards quite a bit, it seems. don

By the way, they finally stopped releasing bug fixes for IRIX [sgi.com] last December. The company still plans to keep phone tech support going. They say that the MIPS/IRIX products continue to be a viable solution for many customers, with millions of dollars invested over the years.
They ought to take one of the BSDs - F/N/O - and fine-tune their ports to their legacy platforms, so that their customers have a path to move to. In the process, they could also develop IRIX jails under BSD for software that just has to have IRIX. That gives their customers a migration route for their existing hardware that continues to serve them well. Whenever that hardware dies, things could then move to a BSD on x64 platforms.
          • That makes absolutely no sense. :) There is almost nothing left of the original SGI. Most of their customers have moved to OS X and Linux a long time ago.
        • by armanox ( 826486 )

          I wouldn't have opposed seeing a new release of IRIX. I still love my Octane.

    • by jasonla ( 211640 )
Thank you. I would think the current editors would pay a little more attention... Is this the new Slashdot? Have I been away for so long?
  • by Anonymous Coward on Thursday April 10, 2014 @02:24AM (#46711609)

    Many years ago I invested in a Hardcore Computer Reactor system. This was a giant custom built computer that had both the motherboard and GPUs submerged in a proprietary non-conductive coolant. It weighs over a hundred pounds filled, and they still needed pumps inside it to direct the coolant across a bunch of purpose built water blocks to extract heat from the hottest components (since liquid convection alone was not enough).

About a year ago I had to replace the motherboard (which is a proprietary part). I can't even begin to tell you what a gigantic pain in the ass this was. There is a ton of plumbing running around inside the system that you have to worry about, and beyond that the entire compute module comes out of the coolant dripping wet, so you can't just pop it out and chuck it down on your desk. I had to break out a pair of rubberized gloves just to service the damned thing, since it became obvious that the boards weren't going to dry themselves just sitting there - the coolant doesn't evaporate at all and you can't just take a towel to the raw PCB to clean it off. I ended up lining the inside of a large plastic bin with antistatic bags and doing the procedure there, which still made one hell of a mess.

I still run that system, but if anything else ever breaks I'm probably going to sell it off rather than try to fix it again. I honestly can't imagine trying to deal with that sort of a setup on a datacenter scale. General liquid cooling is easy enough to deal with, since you can just disconnect the cooling lines and pull out a module (which is precisely what IBM does with their extreme high-end PowerPC-based servers). Submerging the entire PCB is nasty business, and I wouldn't want to be the tech who has to go through that amount of trouble on a weekly or monthly basis.

    • by whois ( 27479 )

      While I don't doubt your experiences were sucky, I think this could be overcome if they designed the computers and the datacenter with it in mind. You could make the boards be pullable cards from above. Depending on the size of the chassis they might use a robot crane to retrieve the cards or it might be by hand (the crane would mean the entire datacenter floor could be liquid and the cards would be brought to a place where they could be serviced without messing up the place)

      As far as the plumbing getting

      • No way, chief. Datacenters are designed to be cost efficient, too. That automatic card pulling robot is likely to cost more than an entire row of server racks, and the servers themselves will be ridiculously expensive from custom hardware design and unorthodox cooling systems. What you'll really see is a traditional datacenter running on commodity hardware with humans doing all the manual maintenance. If there's a glut of money for crazy stuff, they'll invest it in either more servers, bigger networking
  • http://en.battlestarwiki.org/w... [battlestarwiki.org]

Not a new concept; Cylons have been using it for 3,000 years already.

  • I doubt it (Score:4, Interesting)

    by enriquevagu ( 1026480 ) on Thursday April 10, 2014 @02:39AM (#46711665)

(sorry for the duplicate posting; the previous one was cut off because of problems with the HTML markup)

In order to obtain a 90% reduction in the energy bill, cooling must account for 90% of the power of the DC. This implies a PUE [wikipedia.org] >= 10. As a reference, 5 years ago virtually any DC had a PUE lower than 3. Nowadays, a PUE lower than 1.15 can be obtained easily. As another reference, Facebook publishes the instantaneous PUE of one of its DCs in Prineville [facebook.com], which at the moment is 1.05. This implies that any savings in cooling would reduce the bill by a factor of at most 1.05 (1/1.05 = 0.9523).

On the other hand, I believe this is not the first commercial offer for a liquid-cooled server. Intel was already considering it two years ago [datacenterknowledge.com], and the idea has been discussed in other forums [electronics-cooling.com] for several years. I can't remember right now which company was actually selling these solutions, but I believe it was already on the market.
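The PUE arithmetic above can be sketched directly (the PUE values are the ones quoted in the comment; the formula assumes, as a simplification, that everything above PUE = 1.0 is cooling overhead):

```python
# Maximum possible bill reduction from eliminating cooling overhead,
# under the simplifying assumption that all power beyond PUE = 1.0
# (i.e. beyond the IT load itself) goes to cooling.

def max_cooling_savings(pue):
    """Fraction of the total energy bill that the overhead represents."""
    return (pue - 1.0) / pue

for pue in (3.0, 1.15, 1.05):
    print(pue, round(max_cooling_savings(pue), 4))
# PUE 3.0  -> up to ~66.7% savings
# PUE 1.15 -> up to ~13.0% savings
# PUE 1.05 -> up to ~4.8% savings (1/1.05 = 0.9523 of the bill remains)
```

So a 90% cut in the total bill from cooling alone would indeed require a PUE of 10 or more, far above anything a modern data center runs at.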

    • by Megol ( 3135005 )
While 90% is a lot, your calculations don't take into account that lowering temperature also lowers the power consumed for processing, due to leakage reductions and other effects. Look at http://www.realworldtech.com/supercomputers-cooling/
  • If you search for "computer immersion cooling" with Google it will throw up a bunch of people (and companies) doing PC systems totally immersed in mineral oil and things as a way to get even more power out of a system (even more than regular liquid cooling gets you)

Sure: mineral oil, cooking oil, Fluorinert, distilled water, a bunch of other esoteric fluids. What it really comes down to is the heat transfer between the component and the fluid itself. And this newer stuff is apparently leaps above Fluorinert, especially since it won't kill you quite so quickly and won't destroy the ozone layer quite so badly. You thought that freon was bad? Fluorinert makes freon look like a glass of water in terms of reactivity.

      • by Megol ( 3135005 )
Sure, mineral oil, cooking oil, fluorinert distilled water, bunch of other esoteric fluids. The real thing that it comes down to the heat transfer between the component and the fluid itself. And this newer stuff is apparently leaps above flurorinert, especially besides that it won't kill you quite so quickly and won't destroy the ozone layer quite so badly. You thought that freon was bad? Fluorinert makes freon look like a glass of water in terms of reacti
      • Sure, mineral oil, cooking oil, fluorinert distilled water, bunch of other esoteric fluids. The real thing that it comes down to the heat transfer between the component and the fluid itself. And this newer stuff is apparently leaps above flurorinert, especially besides that it won't kill you quite so quickly and won't destroy the ozone layer quite so badly. You thought that freon was bad? Fluorinert makes freon look like a glass of water in terms of reactivity.

HUH? Kill you? Fluorinert is just what the name means: it's inert! It's what the medical community was playing with years ago in an attempt to treat lung infections; you can breathe it, like in the movie "The Abyss" where they dunk the rat in the tank (they actually did that). It IS slightly toxic, and that is probably one of the reasons it never made it into actual medical use.

Toxicity Profile: Fluorinert liquid FC-70 is non-irritating to the eyes and skin, and is practically non-toxic orally. The product also de

        • by Mashiki ( 184564 )

          HUH? Kill you? Flourinert is just what it means, it's inert!

          There's plenty of inert things that will kill you in a painful manner.

    • by Chas ( 5144 ) on Thursday April 10, 2014 @03:05AM (#46711767) Homepage Journal

      Yep. Got to fiddle around with Fluorinert cooling years ago.

      Interesting, just not very practical.

You really DO need a fully sealed system and ostensibly clean-room assembly. Because, while the coolant itself is non-conductive, any detritus that accumulates in the fluid after settling out of the environment ISN'T. That's the thing about water: straight H2O isn't conductive. It's all the other things in the water - minerals, dust, etc. - that do the conducting.

      Also, as noted, there's STILL going to be use of fans and water. Because you still need systems that extract the thermal energy from the liquid medium. You simply remove them from the main system chassis.

      It also doesn't change the fact that it's still a TERRIBLY inefficient way to cool the system. Unlike water cooling loops, where you have no more than maybe a pint or so of fluid cooling the major heat sources in the system, you have QUARTS of fluid basically covering everything. And you really have no good flow control, other than extremely high volume fluid exchange, which is energy inefficient in and of itself.

      That's PROBABLY what a lot of the board re-engineering is about. Centralizing all the thermally active devices into a centralized area to limit the volume of immersion coolant required and to simplify flow control.

  • But there are several challenges, including the need to design new motherboards and servers.

    Swapping out that faulty network card gets to be a bitch [youtube.com].

    (might need a bit of context; something goes wrong with the super-cooled computers and Chris Evans has to dive in and fix it. Then he dies)

  • by Anonymous Coward

    Get rid of dynamic languages (like PHP which recompiles on every pageload) for web apps and use properly compiled ones.

    • by Anonymous Coward
      Agreed, using dynamic languages is hands down the biggest power waster.
      • by PPH ( 736903 )

        We'll just use JavaScript and generate all the heat on the client's systems.

    • by Anonymous Coward

      Web developers are too dumb to operate a compiler.

  • Transformer oil - move out!
    • Transformer oil - move out!

      Man, these greasy energon cubes are more than meets the eye, Optimus.

      I thought you had a taste for crunchy fried things, Bumblebee?

      When in Rome, do as the Romulans!

      Shut Up Starscream!

    • Transformer oil - move out!

Fast forward to the 80's and please ditch the PCBs. Just switch to mineral oil. Costs a bit more, but does the same job.

    • by PPH ( 736903 )

      Transformer oil? Not a good idea [capturedlightning.com].

  • by Viol8 ( 599362 ) on Thursday April 10, 2014 @04:21AM (#46712011) Homepage

Even though according to wonkypedia it has low GW potential and doesn't damage ozone, do we really want to be manufacturing more fluorinated hydrocarbons, which almost never decay in the environment by themselves and just build up over time in the soil, plants, and eventually us?

  • Comment removed based on user account deletion
  • by sirwired ( 27582 ) on Thursday April 10, 2014 @06:19AM (#46712349)

    Air cooling is inefficient, but it's not so horrible that that inefficiency alone accounts for 90% of data center power usage. Heat is heat, and Watts is Watts; they gotta go somewhere.

    And the "tons of water" that data centers use is generally used to spray the outdoor condenser (think cooling tower at a power plant); changing the servers to liquid cooling won't fix that.

    Liquid cooling makes less sense for smaller servers, as going to all the trouble to plumb a pizza box is generally more trouble than it's worth. Big Iron is already frequently liquid cooled, if not in an immersion bath.

    • changing the servers to liquid cooling won't fix that.

Not that I disagree with most of what you said, but on this point I think there is a possible efficiency gain. What we do today is pump cool air into systems, cooling that air to 70 degrees F using standard air conditioning. AC systems use a liquid phase-change process, fans, and compressors, which take large amounts of power to run.

      If we can design liquid cooled systems that operate at higher temperatures, and get that temperature significantly above what you can reliably get from a water evaporator, yo

  • http://www.datacenterknowledge... [datacenterknowledge.com] It's been done before.
  • I'd been talking about Novec 1230 being used as a computer coolant for years on this site. Prior art all over the fucking place.
