Power IT

Dell To Sell Advanced Server Cooling Systems

Mitechsi writes "Dell has struck a deal with Emerson to sell advanced liquid cooling systems and services to data center owners. One type of supplemental cooling technology is called the Liebert XD. The XD consists of refrigerant-filled pipes that snake around the server racks in a data center. The liquid system cuts the cooling power load by about 30%–50% compared to other types of cooling systems."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Saturday June 30, 2007 @04:55PM (#19701597)
    Coors plans to sell advanced personnel liquid cooling systems to IT departments too.
  • It makes me XD.
  • It will be nice to start seeing these in our datacenters (hopefully sometime soon). Considering half of our servers are probably Dell, that's a good 15k potential installations.
  • by eviltypeguy ( 521224 ) on Saturday June 30, 2007 @05:07PM (#19701671)
    Kill me later for this, but let me be the first to say:

    "Snakes in a server room!"

    *ducks*
    • Re: (Score:1, Funny)

      by Anonymous Coward
      See here, please make it clear. Snakes, or ducks in a server room? And is it duck season, or is it wabbit season? And just what did Samuel L. Jackson mean by his cryptic remark, "No more m^$%&in wabbits in this m$#^%& server room!" And Dell Tech Support was no help. "We do not sell ducks, wabbits, snakes, or m$#^%& servers. However, we will sell you an extended warranty." I am so confused.
  • What I'd like to see is integrated liquid cooling using the XD systems built into Dell servers.
  • liquid nitrogen cooling systems :)

    Actually, I'm pretty sure I saw that on the Screensavers on TechTV a looooooooong time ago. It kicked ass.
  • by billcopc ( 196330 ) <vrillco@yahoo.com> on Saturday June 30, 2007 @05:45PM (#19701843) Homepage
    So Dell is going to offer water cooling, ok great! Why call it "Liebert XD"? What, it's not water? Oh ok then. I don't care if you pump it full of $400/gallon fluorinert and have plastic fishies floating through it, it's still just liquid cooling, something that existed in the server room long before Michael Dell ever sold a single server machine.
    • Re: (Score:3, Informative)

      by TubeSteak ( 669689 )

      I don't care if you pump it full of $400/gallon fluorinert and have plastic fishies floating through it, it's still just liquid cooling, something that existed in the server room long before Michael Dell ever sold a single server machine.

      If you RTFA, you'd know that they are not selling liquid cooling systems; they're selling liquid-to-vapor phase-change cooling systems. Read the third paragraph of TFA.

      You know those nifty heat pipes in fancy heat sinks?
      Imagine that on a bigger scale.

      • So it's not a water cooling system, it's a steam engine! Or at least the 'boiler' part of one.
    • by Anonymous Coward
      Disclaimer: Posting as AC because I work for Liebert.

      The ZDNet article is, unfortunately, very scant on details when it comes to our XD (shorthand for "eXtreme Density") system. For one, they give the impression that the cooling coils are water-based. They're not; they're refrigerant, just like a whole-house air conditioner. The specific coolers they seem to be referring to are the XDV, an 8kW unit which mounts directly to the top of the rack, and/or the XDO, a 16kW unit which hangs in the center...

      • by jbengt ( 874751 )

        "they're refrigerant, just like a whole-house air conditioner"

        Technically, they're not "just" like an air conditioner. They're (like the water-based systems also listed under the XD tag) only a heat exchange system - there's no compressor (although they also sell some of those under the XD name). Rather than compressing the vapor before condensing it, they pump the liquid refrigerant after it has condensed, which has the advantage of not cooling it below the room dewpoint, so there's no dripping condensate.
      • Disclaimer: Posting as AC because I work for Liebert.
        I realize you are AC for obvious reasons, but do you work out of the Columbus area? I was on a very short term gig there earlier this year and it was interesting to see there was a "Liebert Culture" and an "Emerson Culture," at least from the IT side. Pretty neat facility, out in Dearborn if memory serves.
      • Why the XDV/XDO and not the XDF? And looking at the XDF, which should be a better match for Dell (14kW max?!), Liebert isn't keeping up with Rittal. The XDF is a water-cooled or compressor-based high-density cabinet.

        Anybody that tries to buy this stuff from Dell and just install it like a server is in for quite some fun!
    • by Corgha ( 60478 ) on Saturday June 30, 2007 @06:35PM (#19702081)
      Why call it "Liebert XD"? What, it's not water?

      Ummm, because it's made by the Liebert Corporation? And it's their "XD" line of products?
      Is this the first time you've ever heard of branding?

      You might as well ask: "Why call it 'Toyota Camry'? What, it's not a car?"

      it's still just liquid cooling, something that existed in the server room long before Michael Dell ever sold a single server machine.

      So? Cars have been around a long time, too. That doesn't mean I want to drive a Model T.

      Anyway, the news here is not that there is a new HVAC product, but that Dell is going to be selling HVAC systems to datacenter owners.

      Also, this system uses a gas/liquid phase-change cycle [liebert.com], and it operates on a different scope (zone and spot cooling -- doesn't get anywhere near the CPU), so it's really not like what most people would think of as liquid cooling systems for computers. It's just a way of getting the cooling closer to the heat source instead of blowing cold air around in ducts, such that your HVAC operates more efficiently.

      Sure, this idea has been around for a while (though this system makes some improvements that are especially helpful for datacenter use). The news is that Dell is selling it.

      • Yes, this is OT, but I just wanted to congratulate Corgha on a well-delivered smackdown. It's rare that I laugh out loud at comments anymore...
  • Better Idea (Score:2, Informative)

    by Kamokazi ( 1080091 )
    Flood the server room with vegetable oil. [tomshardware.com]
  • by Anonymous Coward
    One constantly reads of problems with heat and cooling at datacenters, and exotic solutions, which would all be solved by leaving every other rack empty and renting twice as much space.

    This is North America. A data center, by definition, is remote from its users. There is no need to place it in one of the three or four regions where square footage is that expensive.

    If I were building a data center, I would select one of the empty Albertson's or Kmarts that recently closed in my area. I would pick the one...
    • This idea certainly has merit based on cost per square foot; however, it fails to address the employee factor. Datacenters require skilled IT professionals to work there. Unfortunately, these people tend to be drawn to highly populated, urbanized, and consequently expensive areas. I would imagine that it is much more difficult to find and retain these kinds of people in less-than-desirable locations such as you have suggested. Hey man, IT people like their mocha lattes, Apple Stores, and hot geek...

    • Bandwidth. (Score:2, Informative)

      One word. Bandwidth.

      Of course there are several other very important reasons, but let's start there. I work for one of the larger web hosting companies in the country (we are actually global, but that is another story). One thing you don't find in a back lot behind the K-mart is the top ten tier-one providers converging in one spot. The backbone needed to host things bigger than mom-and-pop websites is enormous and not readily available in most locations.

      "which would all be solved by leaving every other r

    • If this were such a great idea, somebody would have done it already. I actually work for a hosting company and we have renovated a few buildings into data centers, but it isn't easy. You need to install power systems and cooling, and have bandwidth available.

      Spreading the servers out isn't going to help that much; you need to have air moving in order to remove the excess heat and bring cool air in. Seriously, servers generate tons of heat, literally. We have 10 tons of AC for just one section of our data center...
    • by aaarrrgggh ( 9205 ) on Sunday July 01, 2007 @01:47PM (#19708649)
      Density has a number of benefits-- especially in reducing your network and power interconnect costs. We build data centers that are about as big as a big-box retail store, and the average power draw is around 5kW per cabinet, or 125W/SF of raised floor. Your 100,000 square foot retail space would tend to be half raised floor, so you have a total of about 6.2 MW of UPS load.

      The real problem for data centers is that some equipment works much better packed close together. Usually, it's only 20% or so, but you have to figure out solutions for this type of equipment.

      The most interesting strategies for data center cooling today are using air side free cooling. There are plenty of challenges, and it only works with certain combinations of local climate and building design, but it is another area that benefits from high density-- being able to exhaust 110F air from your cabinets directly to the outside rather than trying to cool it back down to 55F at the CRAC units makes a lot of sense.

      (As for converting a big box retail building to a data center... you might be able to put in 100kW of computer load and just run the air conditioning at night as suggested. If you pay $0.50/SF, that would be about $50k/month in rent. Rent in a co-lo for the same power density would be about $26k per month. In either scenario, you need to add power and UPS to the equation for the total picture.)
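
For readers who want to sanity-check the figures in the comment above, here is a minimal sketch of the arithmetic in Python. The square footage, raised-floor fraction, watts per square foot, and rent numbers are all the commenter's stated assumptions, not independent data.

```python
# Back-of-the-envelope check of the density and rent figures quoted above.
# Every input is the commenter's assumption, not measured data.

RETAIL_SQFT = 100_000          # big-box retail footprint, sq ft
RAISED_FLOOR_FRACTION = 0.5    # roughly half the space becomes raised floor
WATTS_PER_SQFT = 125           # ~5 kW/cabinet works out to ~125 W per sq ft of raised floor

raised_floor_sqft = RETAIL_SQFT * RAISED_FLOOR_FRACTION
ups_load_mw = raised_floor_sqft * WATTS_PER_SQFT / 1_000_000
print(f"UPS load: {ups_load_mw:.2f} MW")          # ~6.25 MW, i.e. "about 6.2 MW"

# Rent comparison for the low-density conversion scenario in the parenthetical:
RENT_PER_SQFT_MONTH = 0.50                        # $/sq ft/month, commenter's figure
retail_rent = RETAIL_SQFT * RENT_PER_SQFT_MONTH
print(f"Big-box rent: ${retail_rent:,.0f}/month") # ~$50k/month vs. ~$26k/month in a co-lo
```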
  • While it would do a much better job cooling than just blowing air around, it sounds like a nightmare to maintain, with coolant lines running everywhere.
  • Antarctica (Score:3, Funny)

    by Joebert ( 946227 ) on Saturday June 30, 2007 @06:50PM (#19702153) Homepage
    Why don't we just start building datacenters in Antarctica?
    • Relative humidity? You don't want condensation on your expensive servers.
      • by Joebert ( 946227 )
        Can't we spray them down with some sort of waterproof epoxy?
      • Relative humidity isn't a problem if you don't bring in any outside air-- just have the building vapor sealed and maybe insulated a bit to help trap the heat inside. A few 100% glycol lines to the outside and you have a nice free cooling system.

        Of course, the cost of that OC192 line might slightly offset the electricity savings, but that's just a detail. Generating power for the servers might be interesting as well...
  • Anyone else get the sneaking suspicion that we're not doing a good enough job of ensuring efficient energy consumption and helping to contribute to a greener world? Those of us in the tech industry in our clean, white-collar business-casual attire (or 'vintage' jeans and 'retro' button-ups if you work in 'frisco), working in air-conditioned offices on cutting-edge silicon, sometimes seem to forget (at least, I certainly do) that all this wonderful technology is humming along because we still burn millions of...

  • ... the server cooling system where I work -- it consists of a single floor fan positioned in the door of the server room.
  • by BBCWatcher ( 900486 ) on Saturday June 30, 2007 @09:22PM (#19702641)

    So Dell PC servers now have old-fashioned, pipes-in-the-data-centre liquid cooling, while IBM mainframes do not.

    We have come full circle, haven't we?

  • Liquid cooling is more effective, but where are the claimed savings coming from? The cooling cycle hasn't changed: they take cool refrigerant, expose it to heat, it evaporates, they compress it, spin out the heat, repeat. Every A/C in the world works this way. All they have done is change the 'cooling fluid' from air to refrigerant. The same heat is generated, and it has to be pumped out. The reason IBM and others used liquid cooling in the olden days was that the specific heat of air is bad; it is better...
    • I believe the idea is that instead of a massive A/C system freezing a room, the cooling coils normally in the A/C unit are run directly to the servers in a way that more efficiently absorbs the heat inside. I imagine it's much smaller than a normal A/C unit. Cooling is limited to the server boxes themselves, rather than cooling the entire room and having the room air do the cooling through the fans in the server boxes.
      • I would have to see it. I used to design HVAC systems. Having air around the server doesn't really do much. Back in the 70s "energy crisis" they made all the grocery stores turn up (warmer) their temperatures, and it ended up wasting energy because the produce and meat and frozen-food coolers had to work harder. If the article is just talking about cooling the CPU or northbridge, etc., then I can see it being more efficient at cooling those components. But as a total electrical bill, I don't see it.
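
A minimal sketch, in Python, of the "specific heat of air is bad" point raised in this thread: per unit volume, water carries a few thousand times more heat than air for the same temperature rise (and a phase-changing refrigerant absorbs latent heat on top of that), which is why bringing the fluid close to the racks beats pushing large volumes of chilled air around. The property values below are standard textbook figures at roughly room temperature.

```python
# Compare volumetric heat capacity: how much heat one cubic metre of the
# fluid carries per degree of temperature rise. Standard textbook values.

air_cp_kj_per_kg_k = 1.005       # specific heat of air, kJ/(kg*K)
air_density_kg_m3 = 1.2          # density of air near room temperature
water_cp_kj_per_kg_k = 4.186     # specific heat of water, kJ/(kg*K)
water_density_kg_m3 = 1000.0

air_vol_heat = air_cp_kj_per_kg_k * air_density_kg_m3          # ~1.2 kJ/(m^3*K)
water_vol_heat = water_cp_kj_per_kg_k * water_density_kg_m3    # ~4186 kJ/(m^3*K)

print(f"air:   {air_vol_heat:.1f} kJ per m^3 per K")
print(f"water: {water_vol_heat:.0f} kJ per m^3 per K")
print(f"ratio: roughly {water_vol_heat / air_vol_heat:.0f}x")  # a few thousand times
```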
  • Coming from Dell, I would have expected an "Advanced Cooling System" to just be a really big fan.
  • The refrigerant-based approach is an efficiency disaster. Any serious datacenter is cooled with chilled water, or it is using double the watts per ton of cooling it should be. A datacenter that is not using water-based cooling and some form of free cooling, which saves money even in climates like Phoenix or Atlanta in a 24/7 flat-load datacenter situation, should be sued for false advertising if it claims to be "efficient." There is benchmarking data available on this - a closed-loop DX system is an energy disaster.
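
For readers unfamiliar with the watts-per-ton figure of merit used above, here is a quick Python sketch of the conversion to COP (heat removed divided by electricity consumed). The kW/ton values below are illustrative assumptions for the comparison, not the benchmarking data the commenter refers to.

```python
# A "ton" of cooling is 12,000 BTU/hr of heat removal, i.e. about 3.517 kW.
# Converting a plant's kW/ton figure to COP makes efficiency comparisons clearer.

KW_PER_TON_OF_COOLING = 3.517  # heat removed by one ton of refrigeration, in kW

def cop(kw_per_ton_input: float) -> float:
    """COP for a plant drawing kw_per_ton_input kW of electricity per ton delivered."""
    return KW_PER_TON_OF_COOLING / kw_per_ton_input

# Illustrative (assumed) figures: an efficient chilled-water plant vs. a
# closed-loop DX system drawing roughly twice the power per ton of cooling.
for label, kw_per_ton in [("chilled-water plant (assumed)", 0.6),
                          ("closed-loop DX system (assumed)", 1.2)]:
    print(f"{label}: {kw_per_ton} kW/ton -> COP ~{cop(kw_per_ton):.1f}")
```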

  • cool.
  • ....Siberian gulag cools data center.
  • Emerson appliances around Dell servers, eh? Sounds like a disaster waiting to happen.
  • The broader message is that there are alternatives that enable better efficiency at both the server level and the facility level. This is essentially an endorsement of this cooling technology, and you may see further endorsements of other systems. If you dig further into the announcement and get down to the supporting white paper, you'll find that it is a joint marketing campaign explaining the efficiency benefits of the XD system and the benefits at the server level of Dell's Energy Smart brand. Energy...

"Been through Hell? Whaddya bring back for me?" -- A. Brilliant

Working...