Bitcoin Hardware IT

Immersion Cooling Drives Server Power Densities To Insane New Heights (datacenterfrontier.com) 80

1sockchuck writes: By immersing IT equipment in liquid coolant, a new data center is reaching extreme power densities of 250 kW per enclosure. At 40 megawatts, the data center is also taking immersion cooling to an entirely new scale, building on a much smaller proof-of-concept from a Hong Kong skyscraper. The facility is being built by Bitcoin specialist BitFury and reflects how the harsh economics of industrial mining have prompted cryptocurrency firms to focus on data center design to cut costs and boost power. But this type of radical energy efficiency may soon be key to America's effort to build an exascale computer and to meeting the increasingly extreme data-crunching requirements of cloud and analytics.
This discussion has been archived. No new comments can be posted.

Immersion Cooling Drives Server Power Densities To Insane New Heights

Comments Filter:
  • by Anonymous Coward

    All this exascale capability will be used by the highest bidders to crack encryption to enable snooping without a warrant and model new nuclear weapons technologies...

    • by Anonymous Coward

      They really won't care about that.

      What they will care about is "which stock is going to make our company the most money this nanosecond?"

    • by Anonymous Coward

      I always read this. China builds a new computer, and it's for "cracking encryption" and modeling nuke poops. The US builds a CPU farm, and it's for the Illuminati to crack AES-256. Realistically, the computer winds up used for other things: better hurricane models (the current prediction technology had the worst hurricane on record cause -zero- deaths in Mexico).

      The computers that do the number crunching are not vector-based machines anyway. They are made for MIPS, not MFLOPS, for the most part, although FP is

      • by ShanghaiBill ( 739463 ) on Tuesday October 27, 2015 @01:07PM (#50811659)

        Better hurricane models (the current prediction technology had the worst hurricane on record cause -zero- deaths in Mexico.)

        That was caused by sensational journalism rather than bad models. Although Patricia was a record storm out at sea, models showed that it would lose energy as it passed over cooler waters close to the coast. The models correctly predicted that the winds would drop from 200 mph to about 165 mph by the time it came ashore.

        The next big use for large machines is HFT. A microsecond or two on a fast pipe can mean millions in stock gains.

        The glory days of HFT are in the past. Speed is no longer an advantage when everyone is doing it. Besides, HFT needs fast pipes, but doesn't really need a lot of computation.

        • by stooo ( 2202012 )

          >> HFT needs fast pipes, but doesn't really need a lot of computation.
          HFT needs short pipes, not really fast pipes.
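For a sense of why short beats fast here: signal propagation in optical fiber runs at roughly two-thirds the speed of light, so distance sets a hard floor on latency that no amount of bandwidth can buy back. A minimal back-of-the-envelope sketch in Python (the 1.5 refractive index and the distances are typical illustrative values, not figures from the thread):

```python
# Back-of-the-envelope: propagation delay over optical fiber.
# Light in silica fiber travels at roughly c/1.5 (refractive index ~1.5),
# so every kilometer of path adds about 5 microseconds one-way.

C_VACUUM_KM_PER_S = 299_792.458   # speed of light in vacuum, km/s
FIBER_INDEX = 1.5                 # typical refractive index of silica fiber

def one_way_delay_us(distance_km: float) -> float:
    """One-way propagation delay in microseconds over fiber."""
    return distance_km * FIBER_INDEX / C_VACUUM_KM_PER_S * 1e6

# A trading box 1 km from the exchange vs. one 50 km away:
for km in (1, 50):
    print(f"{km:>3} km: {one_way_delay_us(km):6.1f} us one-way")
# -> ~5 us vs. ~250 us. A fatter pipe doesn't close that gap;
#    a shorter one (co-location) does.
```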

      • by Anonymous Coward

        Better hurricane models (the current prediction technology had the worst hurricane on record cause -zero- deaths in Mexico.)

        There is a reason for that. Hurricane Patricia hit in a sparsely populated area and the Mexican government did a good job of evacuating people from areas that might get hit. The storm weakened considerably after hitting the mountain range that's near the coast. Patricia had quickly increased in power while off the coast but it didn't have a lot of time to get a strong storm surge going (the storm surges are what usually cause most of the damage). The sea floor drops significantly off the coast too, so t

  • by Anonymous Coward

    The website is slashdotted. Here's a mirror:

    1sockchuck writes:
    By immersing IT equipment in liquid coolant, a new data center is reaching extreme power densities of 250 kW per enclosure. At 40 megawatts, the data center is also taking immersion cooling to an entirely new scale, building on a much smaller proof-of-concept from a Hong Kong skyscraper. The facility is being built by Bitcoin specialist BitFury and reflects how the harsh economics of industrial mining have prompted cryptocurrency firms to fo

    • by Anonymous Coward

      Sounds like even with all that power, their servers still suck. Not the best PR.

  • by Anonymous Coward

    ...they want their cooling back: https://en.wikipedia.org/wiki/Cray-2

    • by fahrbot-bot ( 874524 ) on Tuesday October 27, 2015 @12:31PM (#50811379)

      ...they want their cooling back: Cray-2 [wikipedia.org]

      I was actually one of the admins for a Cray-2 (and other systems) at NASA LaRC from 1988-1992. It was pretty cool (no pun intended). The chassis was Plexiglas (or something else clear) and you could see the 3D circuit boards immersed in the Fluorinert [wikipedia.org] - which was wicked expensive back then. I always wanted to put some plastic fish inside the system... After it was decommissioned, the system was moved to the Virginia Air and Space Center (VASC) for a while.

    • by jeffb (2.718) ( 1189693 ) on Tuesday October 27, 2015 @12:33PM (#50811399)

      The difference with this approach is two-phase cooling, where they're actually boiling the heat transfer fluid. That can remove heat a lot more quickly, as long as you can keep a few issues under control:

      1) Getting a working fluid with an appropriate boiling point and otherwise acceptable physical parameters (non-flammable, doesn't dissolve your circuitry, etc). 3M has already stepped up to the plate on that.

      2) Recondensing the vapor fast enough. This is a lot easier than cooling the circuits directly.

      3) Preventing the hot chips from forming a vapor barrier, which insulates the chips from the coolant. The Leidenfrost effect [wikipedia.org] is an example of this, but you can lose efficiency long before you reach the droplets-skittering-around level, especially if there are lots of nooks and crannies where bubbles can get stuck. Presumably the designers have handled this as well.

      If they go with a transparent enclosure and some gratuitous lighting, this could become the new mad-scientist/Big Scary Computer visual trope. Let's face it, lab coats, blinking lights and reel-to-reel tape drives are really tired...

      • by fgouget ( 925644 )

        3) Preventing the hot chips from forming a vapor barrier, which insulates the chips from the coolant. The Leidenfrost effect [wikipedia.org] is an example of this, but you can lose efficiency long before you reach the droplets-skittering-around level, especially if there are lots of nooks and crannies where bubbles can get stuck. Presumably the designers have handled this as well.

        I don't know if they took additional steps, but their schematic shows the chips mounted vertically, minimizing the horizontal surfaces where vapor can collect for the Leidenfrost effect and making it easier for the vapor to move up.

        • The Leidenfrost effect is a form of film boiling. The bad news is that film boiling will take place on vertical surfaces about as easily as it will on horizontal surfaces. The one sure way to stop film boiling is to keep the heat transfer rate well below the departure-from-nucleate-boiling limit (nuclear reactors are designed to keep the peak heat transfer rate at least 3X below that limit) and never to let the circuitry become dry while powered.
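For what it's worth, the margin described above is usually expressed as a ratio between the critical (departure-from-nucleate-boiling) heat flux and the actual local peak heat flux. A minimal sketch of that check, assuming the 3X figure from the comment; the heat-flux numbers are purely hypothetical:

```python
# Sketch of a departure-from-nucleate-boiling (DNB) margin check.
# Design rule from the comment above: keep the peak heat flux at least
# 3x below the DNB limit so film boiling (a vapor blanket insulating
# the hot surface) never starts. All numbers are illustrative.

def dnb_ratio(q_dnb_limit_w_m2: float, q_peak_w_m2: float) -> float:
    """Ratio of critical heat flux to actual peak heat flux."""
    return q_dnb_limit_w_m2 / q_peak_w_m2

Q_DNB_LIMIT = 3.0e6   # hypothetical critical heat flux, W/m^2
Q_PEAK = 0.9e6        # hypothetical peak flux on the hottest chip, W/m^2

margin = dnb_ratio(Q_DNB_LIMIT, Q_PEAK)
print(f"DNB ratio = {margin:.2f}")                 # 3.33
print("OK" if margin >= 3.0 else "too close to film boiling")
```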
      • by bobbied ( 2522392 ) on Tuesday October 27, 2015 @01:30PM (#50811855)

        Two-phase cooling, eh? I don't know if that's a good idea.

        I would contend that it's usually not a good idea to cool something with a liquid that you let boil in contact with what you wish to cool. This is especially true for things you wish to keep evenly cool. I understand that you gain a lot of heat transfer capability by vaporizing the liquid, but you lose the ability to keep heat flowing evenly from a surface when you let vapor bubbles form on it. Perhaps you could deal with that issue by using conductive materials to spread the heat out (you are going to need some of that anyway), but it might be cheaper to implement a single-phase solution. Also, presumably they are suggesting a "closed loop" system for this liquid, where the vapor would need to be recycled by compressing and condensing it back into liquid. This puts the ambient temperature as the lowest you can get the liquid, without some other multi-phase process (and its associated expense).

        I would think that it would be better to keep the coolant liquid at all times and pump it through a heat exchanger to be cooled using conventional refrigeration methods. You avoid vapor bubbles causing hot spots, and you only need to come up with a suitable liquid based on its non-reactive nature that will stay liquid, without having to worry about it having the right phase-change pressure/temperature for your application. Plus, water chillers are already standard fare at current data centers and in industrial cooling equipment. Just pump liquid through the whole thing and push the thermodynamically expensive processes that involve phase changes off onto efficient equipment designs that already exist. In short, avoid reinventing the wheel...

        • by jeffb (2.718) ( 1189693 ) on Tuesday October 27, 2015 @02:24PM (#50812335)

          I would think that it would be better to keep the coolant liquid at all times and pump it through a heat exchanger to be cooled using conventional refrigeration methods.

          The thing is, you typically move immensely more heat via phase changes than by simply raising and lowering a liquid's temperature. For water, heating one mole (about 18g, or 18 ml, or 1.2 tablespoons) of liquid from the freezing point to the boiling point takes about 7.5 kJ; converting that same amount of water from liquid at the boiling point to gas at the boiling point takes over 40 kJ. (Standard pressure, etc, etc.)

          That confers a huge advantage in two-phase systems. Yes, you have to deal with bubbles and vapor barriers, but you also get free vigorous agitation, reducing the risks of boundary layers and poor mixing that complicate all-liquid systems.
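The parent's numbers check out against textbook constants for water at standard pressure; a minimal verification sketch:

```python
# Sensible heat vs. latent heat for one mole of water (~18 g),
# using textbook constants at standard pressure.

CP_WATER_J_PER_G_K = 4.18    # specific heat of liquid water
MOLAR_MASS_G = 18.02         # grams per mole
DH_VAP_J_PER_MOL = 40_650.0  # heat of vaporization at 100 C

# Heating the liquid from 0 C to 100 C:
sensible_j = CP_WATER_J_PER_G_K * MOLAR_MASS_G * 100.0   # ~7.5 kJ

# Boiling it at 100 C (the temperature doesn't change at all):
latent_j = DH_VAP_J_PER_MOL                              # ~40.7 kJ

print(f"heat 0 -> 100 C : {sensible_j / 1e3:5.1f} kJ")
print(f"boil at 100 C   : {latent_j / 1e3:5.1f} kJ")
print(f"ratio           : {latent_j / sensible_j:5.1f}x")  # ~5.4x
```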

          • The ideal system will re-condense the vapor quickly, perhaps before it is more than a few cm from where it boiled. The vapor movement accelerates the fluid near the "point of contact" and gets much more heat transfer from the fresh fluid brought in; the vapor transfers its heat to the cooler fluid until it is vapor no more.

            If those vapor bubbles live long enough to get near other boiling spots, then they're starting to have some negative effects. If you end up with a large reservoir of vapor at the top

        • Hot spots are worse in conventional systems. Unless you carefully design the boards, you will have areas with no flow that will overheat. Any hot spot in this system will immediately vaporize the liquid and be refreshed with cooler fluid. There's also simplicity: no pumps or hoses to worry about, and you could easily have a failsafe that shuts everything down if the fluid evaporates past a certain level.

        • They're not compressing it, they're simply condensing it on a cooling coil. And I assume they're going to need some form of refrigeration for that, so I'm not sure why they talk about getting rid of chillers.

          Also, the chip surfaces are shown to be vertical, so the bubbles will rise along the surface of the chip, likely creating a convection current in the process.

          • I was looking at the 3M site about the fluid they use. I have no idea how expensive this stuff is, but I have a feeling it's going to be pricey. That means you won't want it disappearing, so you will need a closed system, which implies some kind of airtight enclosure topped with a serious heat exchanger. I suppose you could get rid of the chillers if you could afford enough of the Novec liquid and didn't care that it evaporated away... But what's the point of the heat exchangers then?

        • by lott11 ( 1516269 )
          I was not going to say anything, but that is the most asinine way of thinking. So where did you get your degree, from a Cracker Jack box? In the first place, if you are going to make a water heat-exchange unit, that means pumps & pipes on a much larger scale. Not to mention that if you do not have a cold body of water, you would need a refrigeration unit to exchange the heat a second time. Have you ever heard of fluid dynamics, or even a heat-exchange system using low-boiling-point gasses, along with a heat ex
          • I was not going to say anything, but that is the most asinine way of thinking. So where did you get your degree, from a Cracker Jack box?

            No, it was the close cover before striking school of engineering...

            And with that, we are done.... I might choose to read the rest of your rant, some day when I have time to waste, just not today.

      • If I had to support a system with reel-to-reel tape drives, I would consider that Big Scary and I would also get really tired.
  • Really? (Score:5, Funny)

    by argStyopa ( 232550 ) on Tuesday October 27, 2015 @12:28PM (#50811361) Journal

    "INSANE" new heights?

    Has Slashdot been sold to the Gawker network now?

    • Re:Really? (Score:5, Funny)

      by thegarbz ( 1787294 ) on Tuesday October 27, 2015 @01:36PM (#50811917)

      No, you don't understand. This is revolutionary. The article has amazing facts such as:
       

      The Novec liquid inside a BitFury cooling enclosure actively boils as it changes phase, removing heat from bitcoin mining hardware.

      How have we not had stuff that ACTIVELY boils as it changes phase before! It's INSANE!

  • I remember reading an article about Moore's Law with some rough calculations showing that at some point we would need the energy of the Sun moving through our computers to keep up with performance. In some sort of Matrix-style universe, maybe the Milky Way is just some super-advanced alien data center. :-)

    • It's not really a law, but more of a set of guidelines... Moore or less.... He said so himself.

    • An interesting architecture might combine a processing unit with a solar panel on a simple stand. Large numbers of these units could then be placed in a desert area, communicating via a mesh network, with almost no additional infrastructure, not even roads. Coining cryptocurrency and code breaking do not require much interprocessor communication. Energy costs would be zero. There would be no need to locate near a power transmission grid, so the most desolate desert areas might be suitable, minimizing
      • Energy costs would be zero.

        No, energy cost would be the cost of the solar panel, amortized over the number of computations the unit performed during the lifetime of the panel.

        If batteries get cheap enough, they could be incorporated to allow processing to continue at night

        In which case you need to include the cost of the batteries in your estimate of energy cost.

        It's possible that your idea could be very cost-efficient, but definitely not zero.

        • Point taken, I should have said zero ongoing energy cost. But the cost of the solar panels should be compared with the cost of the equipment generating power for conventional bitcoin mining and the cost of constructing the transmission lines bringing that power to the data center. And solar is getting to be competitive on a cost-per-watt-hour basis at the source.
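To put rough numbers on that amortization argument, here is a minimal sketch; the panel price, capacity factor, and lifetime are hypothetical round numbers, not figures from the thread:

```python
# Amortized energy cost of a solar-powered compute node.
# All inputs are hypothetical round numbers for illustration only.

PANEL_COST_USD = 300.0    # assumed installed cost of one panel
PANEL_WATTS = 300.0       # nameplate output, W
CAPACITY_FACTOR = 0.25    # assumed desert average (sun isn't always up)
LIFETIME_YEARS = 20.0
HOURS_PER_YEAR = 24 * 365

lifetime_kwh = (PANEL_WATTS * CAPACITY_FACTOR
                * HOURS_PER_YEAR * LIFETIME_YEARS) / 1000.0
cost_per_kwh = PANEL_COST_USD / lifetime_kwh

print(f"lifetime energy : {lifetime_kwh:,.0f} kWh")    # ~13,000 kWh
print(f"amortized cost  : ${cost_per_kwh:.3f}/kWh")    # ~$0.023/kWh
# Cheap with these assumptions, but not zero -- and batteries for
# night operation would add their own amortized cost on top.
```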
  • I spilled a glass of wine on my laptop. It didn't go any faster. In fact, it died.
