IBM Pushing Water-Cooled Servers, Meeting Resistance

judgecorp writes "IBM has said that water-cooled servers could become the norm in ten years. The company has lately been promoting wider use of the forty-year-old mainframe technology (e.g., here's a piece from April 2008), which allows faster clock speeds and higher processing power. But IBM now says water cooling is also greener and more efficient, because it delivers waste heat in a form that's easier to re-use. The company estimates that water can be up to 4,000 times more effective than air at cooling computer systems. However, most new data center designs take the opposite approach: running warmer and using free-air cooling to expend less energy in the first place. For instance, Dutch engineering firm Imtech sees no need for water cooling in its new multi-story design, which reduces piping and cuts waste."
  • by Meshach ( 578918 ) on Tuesday May 19, 2009 @02:26AM (#28007921)
    These kinds of predictions always remind me of Bill Gates asserting that "640 K should be enough for anybody."

    Hardware and software change so fast; who has any idea what will be required or available in even ten years?
    • by MrEricSir ( 398214 ) on Tuesday May 19, 2009 @05:14AM (#28008887) Homepage

      I doubt Bill Gates ever said that. He's claimed the contrary on several occasions:
      http://www.usnews.com/usnews/biztech/gatesivu.htm [usnews.com]
      http://www.wired.com/politics/law/news/1997/01/1484 [wired.com]

      But yes, making predictions for the future is dumb. Unless you control the future, in which case it's not really a prediction *cough* Moore's Law *cough*

    • Re: (Score:3, Insightful)

      It's much easier to predict the past, however, if you've been paying attention. Early computers blended their cooling system with the heating system of the surrounding building. They were sometimes designed together that way.
      • Early computers blended their cooling system with the heating system of the surrounding building. They were sometimes designed together that way.

        You know, one day a couple of years ago I saw something that really blew my mind. Our huge server room had an AC outage, and slowly things were starting to overheat.
        The server team dragged out fans and portable AC's and started shutting servers down, basically helpless. Meanwhile, less than 20 feet from the server room was a window
        that could not be opened, and the

        • by jbengt ( 874751 )

          If only someone had thought to run a duct to the outside to use nature's AC, they could have saved a lot of money and headache.

          Or, they could have, you know, put some A/C on the emergency back-up power.
          Backing up power to the servers without backing up power to the A/C really just allows for an orderly shutdown.

  • The community in which a server farm is located surely has a need for what will be thousands of gallons a day. To the benefit of all, I'd suggest diverting a small amount of the heated water (hopefully near boiling) to another piping system in the building .... which would be routed to a building-wide coffee or espresso maker. Great for the employees, and with an outside tap, the community can get free coffee to boot. If anyone from Greenpeace shows up to protest about the water wastage, avoid telling them ab

    • Re: (Score:2, Funny)

      by Tablizer ( 95088 )

      would be routed to a building-wide coffee...the community can get free coffee to boot.

      Is that a pun?
         

    • by Ed Avis ( 5917 )

      Good idea! And the server farm can come ready-equipped with a camera and web server to show the status of the coffee maker.

    • by seanadams.com ( 463190 ) * on Tuesday May 19, 2009 @02:49AM (#28008079) Homepage

      The community in which a server farm is located surely has a need for what will be thousands of gallons a day. To the benefit of all, I'd suggest diverting a small amount of the heated water (hopefully near boiling) to another piping system in the building ....

      I'm sure it could be designed as a closed system with a heat exchange into the ground or outdoors. Indeed, it is the high temperature (relative to outdoors) at which the water is extracted straight off the CPU which makes this more efficient than air conditioning.

      However, if you wanted to let it feed into the building's hot water system, it turns out there is already a really elegant way to do that: a tempering valve. It's a mechanical device which mixes the right amounts of hot and cold water (each of arbitrary, variable temperature) to produce some fixed output temperature. So to make moderately hot water you can combine some warm water from the servers and some super-hot water from the boiler. The "free" server heat offsets the amount of water that needs to be heated by conventional means.
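For illustration, here is a minimal sketch of the energy balance a tempering valve solves; the function name and the example temperatures below are made up for the sketch, not taken from any particular product:

```python
def mix_fraction(t_out, t_low, t_high):
    """Fraction of the hotter stream needed so the blend leaves at t_out.

    Steady-state energy balance, assuming equal specific heats and no losses:
    t_out = f * t_high + (1 - f) * t_low.
    """
    if not t_low < t_out < t_high:
        raise ValueError("target must lie between the two supply temperatures")
    return (t_out - t_low) / (t_high - t_low)

# Example: blend 45 C server-loop water with 80 C boiler water to get 60 C taps.
f = mix_fraction(60.0, 45.0, 80.0)
print(f"boiler share: {f:.0%}, 'free' server-heat share: {1 - f:.0%}")
# -> boiler share: 43%, 'free' server-heat share: 57%
```

Every degree supplied by the server loop is a degree the boiler doesn't have to add, which is where the "offset" above comes from.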

      • Comment removed based on user account deletion
        • by Chrisq ( 894406 )

          That's still quite a bit of hot water. I'm sure very little would be used in an office environment. Residential would be more like it if possible.

          Perhaps it would be better to engineer the hardware to run warm-to-hot. Only having to chill the water down to outside ambient temperature (no compressors needed) would save a lot of energy and cost.

          Maybe google could get an environmental initiative grant to provide a staff swimming pool.

        • by horza ( 87255 )

          You could force the techies to take a shower at least once a day, to drain off the excess hot water, though this scheme may find some staff resistance.

          Phillip.

        • by jhw539 ( 982431 )

          Perhaps it would be better to engineer the hardware to run warm-to-hot. Only having to chill the water down to outside ambient temperature (no compressors needed) would save a lot of energy and cost.

          YES! This is exactly the approach being used in the most efficient datacenters today (short of the insane no-cooling-in-a-tent one-offs). Design the system to provide adequate space control at the typical outside ambient. Direct water cooling isn't even required; it can be done with large coils and evaporative cooling towers to take advantage of the wet-bulb depression. As for energy savings: on a typical 15 MW datacenter you can save about 6 MW for 8,000 hours a year. That adds up fast.
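To put those figures in perspective, a quick back-of-the-envelope calculation; the electricity price is an assumption for the example, not a figure from the comment:

```python
# Back-of-the-envelope savings from the figures in the parent comment.
saved_power_mw = 6          # MW avoided by free cooling
hours_per_year = 8000       # hours per year the savings apply
price_per_mwh = 80.0        # USD/MWh -- assumed for illustration

saved_energy_mwh = saved_power_mw * hours_per_year   # 48,000 MWh ~= 48 GWh
saved_dollars = saved_energy_mwh * price_per_mwh

print(f"{saved_energy_mwh:,.0f} MWh/yr, roughly ${saved_dollars / 1e6:.1f}M/yr at ${price_per_mwh:.0f}/MWh")
```

That is on the order of 48 GWh, and a few million dollars, per year for a single 15 MW site, which is why free cooling dominates new designs.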

      • Re: (Score:2, Interesting)

        by cdxta ( 1170917 )
        Why not just pipe the warm water from the servers into the boiler, so the boiler has to heat the water less?
        • by adolf ( 21054 )

          What if you need more hot water (in terms of volume) than the loop through the server room(s) can provide? All the hot faucets on at once. Someone in the kitchen filling up a steam kettle. Power-washing the lot. (Or any combination of these things.)

          At that point, you'd still need some sort of relatively complicated valving to bypass the server room loop for these instances. And as long as you're adding complexity, one might as well just use a tempering valve and be done with it.

      • If you have more hot water than you can use, it seems like you could pipe it next door, and charge them some fraction of what they pay to heat water, or even simply trade it for cold water (which you need to cool your servers.)

  • Seems like if one system goes bad, they'd have to shut down the whole array because of the shared water loop? I guess when people complain about not having hot water, they won't call their utility anymore; they'll call the datacenter.
    • A little piece of technology called an "isolation valve" helps with that one.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      This is IBM... Do you _really_ think they'd design it in such a way that you'd have to take down the whole thing to fix a small section?

      You wouldn't have one long pipe running to all of them, with no way to shut off segments/individual nodes.

      • by Chrisq ( 894406 )

        This is IBM... Do you _really_ think they'd design it in such a way that you'd have to take down the whole thing to fix a small section?

        You wouldn't have one long pipe running to all of them, with no way to shut off segments/individual nodes.

        No, but they might assume that the world will never need more than one hot water system.

    • Re: (Score:3, Insightful)

      Sort of depends on where the water's coming from, doesn't it? I remember once when the water piping in an old -- think it was a 360/95 or some such oddity -- failed (yes, it was a looong time ago) and the area under the false floor flooded. This was before Ethernet, and the floor was a rat's nest of individual terminal cables (not from the 360) -- hundreds of them, along with power cabling. The real problem surfaced (so to speak) some time later, when the actual rodents who did make a rat's nest of it displ
  • I attach the power cable and the network cord to my laptop.
    So will I now need to connect a water pipe carrying cold water too?
    I wouldn't mind it if I can get my drink of water from it too :)

    • Re: (Score:3, Interesting)

      by Hurricane78 ( 562437 )

      In case you didn't know: Water cooling must be in a closed loop. You should not ever need to replace that water. If you do, you can destroy your coolers, because growing crystals will burst them. I have seen pictures of that.

      • by miffo.swe ( 547642 ) <daniel@hedblom.gmail@com> on Tuesday May 19, 2009 @04:49AM (#28008777) Homepage Journal

        Problems with crystals come with some types of water that have a high degree of lime in them. While it's simpler to just use heat exchangers, you could also use water filters that separate the minerals from the water before use. Most places have water with low amounts of lime and minerals, so deposits aren't really a problem.

        I had a company that made solar panels (for heating houses) and inverters for house heating. In some cases we took ground water and extracted heat directly from it, and when taken apart those heat exchangers very rarely showed any deposits at all, even after ten years of use.

        The easiest way to see what type of water you have is to look in your toilet and your sink. If there are a lot of deposits there (not brown ones), you have water that's high in lime or other minerals.

        • I'm talking about crystals that you get with *distilled* water. ^^
          (Because it's never perfectly 100% pure.)

      • by sjames ( 1099 )

        They should have periodically flushed the system with a scale remover. However, it is better to keep the system closed. If evaporative cooling is wanted, that should be done in a secondary loop with a heat exchanger.

  • by Anonymous Coward on Tuesday May 19, 2009 @02:58AM (#28008137)

    Look at some of the newer blades or blade-ish solutions from Supermicro, with 4 dual-socket boards in 2U. Dang near half the space is now taken up by RAM slots. Suppose they switched to SO-DIMMs packed in like heatsink fins on one of those boards; you could possibly cram in 2 more sockets with waterblocks.

    Similar re-arrangement with blade boards would likely also be possible.

    In a dedicated datacenter, I can't think of any really great money-saving uses for the waste heat, but water cooling WOULD allow you to more easily cool everything with a large ground loop to get 50-55 degree water. Add a few more loops so you can melt the snow off the parking lot in the winter and bleed off heat there. Add a large tank of water and radiators to take advantage of cool nights to pre-cool the water before the chillers.

    Only chill the water with the chillers when needed. Seems to make a lot of sense to me.

    In smaller server rooms in large office buildings, pre-heat the hot water, pipe the hot water to help heat the building in the winter, and, depending on location and whether it's a mixed-use building, you MIGHT be able to sell some of the heat to other building tenants.

    Personally, if I were building a new house, I'd have a ground-loop heat pump for cooling and heating, put a decent-sized water tank on the top floor/attic that I could use to preheat hot water in the winter (it would also be good to hook into solar hot water on the roof), and put a water tank in the basement/crawl space as a source for cooling. Add some electronics to determine where to draw the water source and where to push the water return for different devices, depending on the temperature of each given tank, as well as when to run the ground-source heat pump or an outside radiator (a rough sketch of that selection logic follows below), and I think I could cut heating/cooling costs by a huge margin.

    Now if only we had a good way to pipe the light from all the blinking LEDs to where it's needed, to replace the ugly fluorescent lighting. That, or get everyone to work by the glow of their CRT/LCD.
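As a rough illustration of the "electronics to decide where to draw and return water" idea above, here is a minimal sketch of that selection logic; the tank names, temperatures, and thresholds are all hypothetical:

```python
# Hypothetical sketch of the tank-selection logic described above: pick the
# coolest tank as a heat sink for cooling loads and the warmest tank as a
# preheat source for hot water, falling back to the ground loop or an outdoor
# radiator when no tank is in a useful temperature range.

TANKS = {"attic": 35.0, "basement": 14.0}   # current tank temps, deg C (made up)
OUTDOOR_C = 5.0                             # outdoor air temperature, deg C
GROUND_LOOP_C = 12.0                        # approximate ground-loop temperature, deg C

def pick_cooling_sink(max_useful=18.0):
    """Return the coolest tank below max_useful, else the best non-tank sink."""
    name, temp = min(TANKS.items(), key=lambda kv: kv[1])
    if temp <= max_useful:
        return name
    return "outdoor radiator" if OUTDOOR_C < GROUND_LOOP_C else "ground loop"

def pick_preheat_source(cold_inlet=10.0):
    """Return the warmest tank if it is above the cold-water inlet temperature."""
    name, temp = max(TANKS.items(), key=lambda kv: kv[1])
    return name if temp > cold_inlet else None

print(pick_cooling_sink())    # -> "basement"
print(pick_preheat_source())  # -> "attic"
```

A real controller would also need hysteresis and pump scheduling, but the core decision is just comparing tank temperatures against each load's useful range.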

    • by jhw539 ( 982431 )
      A ground loop is not effective for a continuous cooling load -- the ground loop is more a seasonal heat-storage medium than a heat sink. A cooling-tower system is the traditional approach to achieving 55F water for free cooling. Actually, we're using air-to-water coils that control datacenter temperature with 65F water, which can be achieved from cooling towers the majority of the time in many climates.
  • Resistance (Score:5, Funny)

    by spectrokid ( 660550 ) on Tuesday May 19, 2009 @02:59AM (#28008153) Homepage
    If they meet resistance, can't they just add some salt to the water?
  • Not for all (Score:5, Interesting)

    by __aarvde6843 ( 1435165 ) on Tuesday May 19, 2009 @03:03AM (#28008183) Journal

    I worked in several banks using IBM mainframes. The server room was always like a freezer.

    I think for now, many companies are perfectly ok with air cooling solutions. Besides, it's much safer to have air-conditioning and fans than some liquid flowing. The simpler the system, the fewer accidents occur within it...

    And believe me when I say that, if a company owns an IBM mainframe, they pay big bucks and they *don't* want any accidents.

    • I worked in several banks using IBM mainframes. The server room was always like a freezer. I think for now, many companies are perfectly ok with air cooling solutions.

      Me too. I didn't have to wear an aqualung and enter through an airlock, so they can't have been water cooled.

      And believe me when I say that, if a company owns an IBM mainframe, they pay big bucks and they *don't* want any accidents.

      Better steer clear of them there new-fangled 3090 models, then. It's a fad, I tell you.

    • Re:Not for all (Score:4, Insightful)

      by stephanruby ( 542433 ) on Tuesday May 19, 2009 @04:18AM (#28008623)

      I think for now, many companies are perfectly ok with air cooling solutions. Besides, it's much safer to have air-conditioning and fans than some liquid flowing.

      If some companies can make fridges that do not leak coolant, I'm pretty sure IBM can make mainframes that do not leak their coolant either.

    • by Kupfernigk ( 1190345 ) on Tuesday May 19, 2009 @04:36AM (#28008719)
      Actually, you are wrong. Designing an air cooled system is hard. You have to deal with problems of filtration (there will be dust - but where do you want it to build up?), ensuring that the flow goes where you want, turbulence, finding room for the ducting, designing the system so that components do not mask other components, and needing to handle high volumes of air. With properly designed water cooling, you have a few quite simple heat removal blocks and a simple plumbing system which can route pretty much anywhere.

      This is why nowadays virtually all internal combustion engines of any power output use liquid cooling despite the apparent reliability benefits of air cooling. To take the transition period, WW2, as an example, you only have to look at the complexity of American rotary aircooled designs versus, say, the liquid cooled Merlin engine, to see the point. It would be astonishing if the same transition did not eventually occur for large computers.

      • by mangu ( 126918 ) on Tuesday May 19, 2009 @06:16AM (#28009245)

        you only have to look at the complexity of American rotary aircooled designs versus, say, the liquid cooled Merlin engine,

        I think you mean radial engines [wikipedia.org], because rotary engines [wikipedia.org] may look similar when not running but are an entirely different thing.

        Air-cooled engines are still used in small planes; their weight-to-power ratio is better than that of water-cooled engines. In larger aircraft, both water- and air-cooled engines have been replaced by turbines.

        Also, air-cooled engines are still widely used in motorcycles. I think the main reason for not using them in cars anymore is mostly the difficulty of cooling in an enclosed space; have you seen how cramped a modern car is under the hood?

        The main advantage of water cooling is that it's easier to carry the heat away to some place where it can be either reused for some other purpose or dumped to the environment. With air cooling you have to bring a substantial amount of cool air to where the heat is being generated.

        However I still think computers are mostly in the range where it's easier to bring the air in. The amount of heat dissipated per volume of equipment is not so great that the additional complexity of water cooling would be justified.

        • Also, air-cooled engines are still widely used in motorcycles. I think the main reason for not using them in cars anymore is mostly the difficulty of cooling in an enclosed space; have you seen how cramped a modern car is under the hood?

          Water-cooled engines are quieter. In addition to the sound-baffling that water provides, many air-cooled engines need additional fans. Think of the original VW Beetle or the older Porsche 911's. Their engines sound a lot different from most cars.

        • You are correct, but both radial and rotary engines are complex beasts. I think, too, that if you look into aircraft engines you will find that the way they made many air-cooled engines light and powerful was basically by wasting fuel -- the heat removed from the cylinder head by fuel evaporation was significant, and the specific output was terrible. Liquid-cooled engines can use high levels of super- and turbocharging while still being quite thermally efficient. And, as someone has noted above, modern motorcycle engines
        • I don't know how one can quantify "widely used," but off the top of my head, as far as street bikes go, only the less powerful Ducatis are air cooled. There are also the old-tech Thunderbolt-powered Buells, but those get so hot they could melt your leather boots. If you take an honest look, water cooling is more common. I suppose the only advantage of an air-cooled bike is the lower maintenance, and the Ducatis especially are trying to lower their cost of ownership. But that wouldn't be an issue in a d

          • by mangu ( 126918 )

            off the top of my head, as far as street bikes go, only the less powerful Ducatis are air cooled

            Ducati? I think you are way too elitist, a Ducati isn't what most people ride [google.com].

            Anyhow, internal combustion engines and computers have vastly different cooling needs. My car has a 1.6 liter 113 HP (83 kW) engine. Considering that the IC engine has an efficiency around 20%, there are hundreds of kilowatts of heat that must be dissipated from a volume of 1.6 liter. In a data center, it takes a roomful of equipment to

            • I don't understand your point about Ducati. Elitist? Hmm? They have a few air cooled monsters, no?

              Nor do I understand your link to that Honda. I suppose it's ridden in countries where most people still can't afford a car. But not in the States. You don't see many 125cc bikes here. Why wouldn't you just get a scooter at that point?

              And that link probably proved my point in that the main reason to use air cooling is for the lower maintenance and cost. In my dc, we haven't had many issues with the glycol that's used

      • by Targon ( 17348 )

        I would expect it to happen eventually to normal home computers, the key is in how reliable the systems are, plus getting the public to be aware of adding new coolant to the system. Many people HATE how loud computers can be, so liquid would help solve that in the long term.

        Before you can point out the problems with end-users and water cooling, keep in mind that as any technology gains in popularity, there will be increasing amounts of innovation that would improve on the designs.

      • Wait, are we talking about water-cooled servers (as in the water flows right over the cpu) or water-cooled rooms? Because if it is the former, then you are probably still going to have to deal with an air cooling system. Perhaps it won't be as complex or as important in the overall sense, but you probably still have to have the room cooled in some fashion, which brings up most of the issues you claim it will remove. If it is the latter, then I am unaware of the feasibility of water cooled rooms with such
      • Re: (Score:3, Interesting)

        Actually, you are wrong. Designing an air cooled system is hard. You have to deal with problems of filtration (there will be dust - but where do you want it to build up?), ensuring that the flow goes where you want, turbulence, finding room for the ducting, designing the system so that components do not mask other components, and needing to handle high volumes of air. With properly designed water cooling, you have a few quite simple heat removal blocks and a simple plumbing system which can route pretty much anywhere.

        The reality is that there are pros and cons to water cooling vs. air cooling and people have to weigh both and decide what works for them. While your post is essentially correct, it's also a bit heavy on the theoretical. I remember working at a place that had water cooled IBM mainframes. This was a US government facility and it was in a building I almost never had to go to. I remember downtimes there because "the water chiller is down" or there was a leak and a hellacious mess of water was under the flo

      • by jhw539 ( 982431 )
        I find it interesting that you bring up engines. With some clients now proposing things like 120 kW single racks, we honestly are approaching a computer rack putting off the same waste heat as a small automobile. And dealing with the waste heat needs to be treated the same way. The days of throwing CRAC units against the wall blowing into a raised floor are numbered unless power density starts trending dramatically down - you just can't control these types of loads that way.
    • it's much safer to have air-conditioning and fans than some liquid flowing.

      Because, of course, if you have A/C, then you don't have any liquids flowing.

      You have obviously never had the joy of having to deal with a data center where one of the A/C condenser lines broke.

  • Comment removed (Score:3, Informative)

    by account_deleted ( 4530225 ) on Tuesday May 19, 2009 @03:07AM (#28008209)
    Comment removed based on user account deletion
  • by yogibaer ( 757010 ) on Tuesday May 19, 2009 @03:20AM (#28008287)
    From the article: "We can use that to heat offices, or water for a swimming pool". Job ad: "Wanted: Sysadmin. We offer: all-year heated office (20+ C), swimming pool, Jacuzzi (body temperature), integrated coffee mug warmer in the tabletop, and always a nice, warm breeze from our datacenter." IBM could even use the swimming pool as a cooling tank (or is it the other way round?).
  • Where do people always get these kinds of numbers? I dare to guess that in reality it's not even 2 times more effective :p
    • by theheadlessrabbit ( 1022587 ) on Tuesday May 19, 2009 @03:33AM (#28008357) Homepage Journal

      Where do people always get these kinds of numbers?

      this is a situation where a link to goatse would actually answer your question.

    • Re:4000 times? (Score:5, Insightful)

      by Chrisq ( 894406 ) on Tuesday May 19, 2009 @03:49AM (#28008467)
      Just try this. See how long you can stand naked (OK wear some running shorts) in air at 5 degrees centigrade. Probably fifteen minutes standing still or indefinitely if running.

      Now see how long you can stay in water at 5 degrees centigrade. For most people it would be less than a minute - you may not even be able to get in.
      • A much simpler explanation is a nice hot oven. It's not a problem sticking your arm in there, even if it's 200 C (392 F), as long as you don't touch the sides. Doing the same with water that's not quite boiling (say 93 C, 200 F) is not really recommended.

    • Re:4000 times? (Score:5, Interesting)

      by TheTurtlesMoves ( 1442727 ) on Tuesday May 19, 2009 @04:06AM (#28008573)
      Water has a density of 1000 kg per meter^3; air is about 1 kg per meter^3, and water also has a much higher heat capacity per kilogram. Current systems go CPU->Air->Water, and you need a thermal gradient for each step, not to mention that blasting cold air through a server wastes quite a lot of energy. Cut out the air and 4000 times seems quite likely, but I can't be bothered running the numbers.
    • http://en.wikipedia.org/wiki/Specific_heat_capacity#Table_of_specific_heat_capacities [wikipedia.org]

      Water: 4.186 J/cm^3/K
      Air: 0.001297 J/cm^3/K

      3227 is closer to 3000 than 4000, I guess... but at least he got the right order of magnitude.
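A quick check of that ratio, using the volumetric heat capacities quoted just above:

```python
# Ratio of volumetric heat capacities quoted above (J/cm^3/K).
water = 4.186
air = 0.001297        # air at roughly room temperature and sea-level pressure

ratio = water / air
print(f"water carries ~{ratio:,.0f}x more heat per unit volume per kelvin")
# -> ~3,227x, the same order of magnitude as the "up to 4,000 times" claim
```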

    • by Targon ( 17348 )

      These numbers are generated scientifically, not just by some "study". Liquid cooling works MUCH better than air cooling, but generally requires more maintenance in a single computer system. With a full building system where water is being pumped in from a larger system, there might not be as much maintenance needed, but the need to replace various components, like the tubes or fasteners for the tubes might require more maintenance than some people are familiar with.

  • by NCamero ( 35481 ) on Tuesday May 19, 2009 @03:36AM (#28008385) Homepage Journal

    All IBM is saying is that water is a better heat conductor, and air is an insulator.

    http://en.wikipedia.org/wiki/Specific_heat_capacity [wikipedia.org]

    Water: ~4 J/(cm^3·K)
    Air: ~0.001 J/(cm^3·K)

    Water/Air ≈ 4000 times more heat capacity per unit volume.

    So, given the choice, you would use water to transfer heat.

    • Re: (Score:3, Insightful)

      Yes, but eventually, all that heat ends up in the air anyway ... the water is only the middleman. Water is actually probably the most efficient coolant around, however, the latent heat of evaporation means it works best when it is boiled off the surface to be cooled. This is not exactly ideal for a semiconductor, although it might be okay if the water was in direct contact with the silicon. (Silicon junction temperatures must be kept below 360 degrees Celsius.)
      • then datacenters simply need nuclear plant style evaporative cooling towers.

        • by dbIII ( 701233 )
          Spot on, plus evaporative cooling consumes a bit of water. Unfortunately they would also need a power station quality water treatment plant so all that hot copper doesn't corrode - that or use a lot of brass instead which is a reasonable conductor.
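To put "consumes a bit of water" in rough numbers, here is a quick estimate based on water's latent heat of vaporization; the heat load is an assumed example, not a figure from the thread:

```python
# Rough water consumption for purely evaporative heat rejection.
heat_load_kw = 1000.0          # assumed example: 1 MW of datacenter heat
latent_heat_kj_per_kg = 2257   # latent heat of vaporization of water (at ~100 C)

evap_rate_kg_s = heat_load_kw / latent_heat_kj_per_kg
litres_per_hour = evap_rate_kg_s * 3600    # 1 kg of water ~= 1 litre

print(f"~{litres_per_hour:,.0f} L/h evaporated to reject {heat_load_kw / 1000:.0f} MW")
# -> ~1,600 L/h, before blowdown and drift losses are counted
```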
      • Yes, but you have to *pump* that air. Pumping all that air through your rack door, your server covers, past all the components in the server, and out that rat's nest of cabling at the back is awkward, expensive, and unreliable, especially with (as someone else pointed out) dust collecting and clogging your filters or collecting on your heat sinks. It's not as bad in a good server room because the air is filtered, but it still collects, especially in less sophisticated server environments such as many office

      • I don't think they're talking about a phase-change cooling system; if they wanted to do that, there's liquids which boil closer to chip temperature than water.

    • by Aceticon ( 140883 ) on Tuesday May 19, 2009 @05:05AM (#28008851)

      My personal experience with using passive (no fans) water cooling on my desktop PC at home (the setup is similar to this: http://www.silent.se/bilder/reserator1_c_p-410.jpg [silent.se]) is that it's exceptionally effective.

      In my setup, a cylinder full of water, surrounded by fins to dissipate the heat and with a pump to make the water flow as the only active element, has replaced a big nasty CPU heatsink with a large fan (on a heavily overclocked CPU)* and a set of fans on a single high-end graphics card of the previous generation. At an ambient room temperature of about 20-25 C, the whole thing idles at 28 C and stays at around 60 C with everything going at max. Considering that at full throttle the system draws almost 400 W, it's impressive how efficient it all is.

      In practice, "home" water-cooling mostly just uses the water as a heat carrier to quickly move the heat around from the inside of the computer case (and it's constrained airflow) to a place where it is easier to dissipate that heat into the ambient air either with a more efficient radiator and fans (for the active systems) or with an outsized heatsink (like the one I use which has roughly 10 times the surface of the ones it replaces).

      In an "industrial" deployment, said heat being carried in the water cold potentially be used/dissipated in many more ways. For example large pipes could transport the hot water coming out of a data-center to the sea or a river and let it be dissipated there (keeping a closed circuit and returning the cool water back for reuse). The actual running costs in terms of active elements for such a system are limited to the cost of running a number of large efficient water pumps that make the water flow around the circuite as opposed to most data-centers out there at the moment that use (less efficient) small fans to move the air out of the blade boxes into the room and then active refrigeration to cool down the air in the room.

      * Since the point of my argument is not to show off my "virtual dick", I've moved the relevant stats down here for those that are curious on the details: CPU - Core 2 Quad 2.4 GHz which is overclocked to 3.2GHz, GPU - GTS280
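For what it's worth, the figures in the comment above imply a rough effective thermal resistance for the passive radiator; this back-of-the-envelope takes the quoted temperatures and power draw at face value and ignores heat leaving by other paths:

```python
# Rough thermal-resistance estimate from the figures quoted above.
power_w = 400.0          # approximate system draw at full load, W
t_coolant_c = 60.0       # reported loaded temperature, deg C
t_ambient_c = 22.5       # midpoint of the reported 20-25 C room temperature

r_thermal = (t_coolant_c - t_ambient_c) / power_w
print(f"~{r_thermal:.2f} K/W from coolant to room air")
# -> ~0.09 K/W: the finned cylinder sheds ~400 W with under 40 K of rise
```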

      • Strictly speaking, a "passive" cooling system has no moving parts... therefore your system was not passive, as you had a pump.
      • by Targon ( 17348 )

        You didn't mention that if the water is hot, it would be possible to recapture some of the energy in the form of an electric generator. The amount may not be terribly high initially, but if you are pumping water to help with cooling, the system could also supply some energy to help offset the water pump costs.

  • Sounds like a tall tale to me.
  • Whatever technologies are available, we shouldn't as a society use them to be wasteful. Even if we can efficiently cool a power-hungry, inefficient processor complex with a liquid rather than air, that doesn't make it a good idea. It would be better to design a data center that didn't require such amounts of energy in the first place.

    This reminds me of recycling schemes that make people think it is OK to overpackage goods in the first place.

    • Until the desire for efficient power use outweighs the desire (or even need) for faster, more powerful data centers, there will always be a need for cooling systems. When it gets to a point where people want efficiency (or need efficiency), then you'll see data centers that don't require as much power.

      You can't realistically develop towards maximizing efficiency AND power. At a certain point, you have to sacrifice a little bit of one for the other. To maximize both would require too much time in developmen
  • I expect integrated power-and-water-cooling sockets all over the house: forced flow, an air-conditioning unit integrated with the water-cooling loop, a pump and reservoir, and the heated water also serving as the heat source for a heat pump.

  • For those who are in a position to design their own building with this sort of thing in mind, then yes, there may be ways to just design the building to get a better cooling environment. That is not always possible or practical though. Using a liquid cooling solution may very well be the future, but the real key is to make sure the cooling systems require as little attention as possible, and the amount of maintenance for cooling is fairly low.

    Think about it, if you run your systems off-site, the last th

  • Then learn how to capture the waste heat from a datacenter. In the cooling field, there is a wide range of ways to provide very efficient cooling -- all the way down to putting 'em in a tent and just letting the wind blow through (a PUE of 1; more realistically, a PUE of 1.1-1.3 is easily doable if your mechanical is up to par). BUT we are still just fighting to throw away all that energy. Does Dean Kamen have a Stirling engine for us yet that can be hooked up to a generator to recover some of that usable heat? I ca
  • Geothermal heat pumps [wikipedia.org] use the ground as a heat sink, usually letting electric powered circulation pumps and compressors move 4-5x as much heat energy as electric power consumed when water (or a fluid like antifreeze) is the circulating medium. Generally they sink heat in the ground, which has only so much capacity (room heating/cooling apps get the heat back in the colder weather). But they could transfer the heat to sewage water that flows out of buildings, taking heat with it. Such a system could be 4-5x
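As a quick illustration of that 4-5x figure, here is the arithmetic; the COP and pump power are assumptions for the example, not numbers from the comment:

```python
# Heat moved by a heat pump versus electricity consumed, per the 4-5x claim above.
cop = 4.5              # coefficient of performance, midpoint of the quoted 4-5x
input_power_kw = 100   # assumed electrical input for pumps/compressors, kW

heat_moved_kw = cop * input_power_kw
print(f"{input_power_kw} kW of electricity moves ~{heat_moved_kw:.0f} kW of heat")
# -> 100 kW in, ~450 kW of heat rejected to the ground loop or sewage stream
```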

  • So a company that provides air-cooling solutions for data-center designs believes air cooling is the way to go? Who would have thought!

    Data center cooling requirements vary wildly. In some cases water cooling may be the correct answer. In others air cooling may be the correct answer. You will almost certainly continue to see both solutions for quite a while.

  • The idea of using water cooling and then re-using the stored heat energy by redirecting it to an area that is heat-deficient is sound, green, and all that, at least in principle. The objections, if there are any, would come from the implementation. Since that is unique to each circumstance, the right answer is "it depends", but most certainly not "that's crazy talk" or "everyone should be doing this".

    If we're talking about efficiency, then I'm not sure where to start ... getting more work out of a processor

  • IBM's water-cooling rear-door heat exchanger (known as "Cool Blue") was judged the most energy-efficient entry in a "chill-off" [datacenterknowledge.com] between cooling vendors last year. The contest was sponsored by the Silicon Valley Leaders Group and conducted in the data center on Sun's Santa Clara HQ campus. The same unit is also sold by Vette Corp., which offered a video demo [datacenterknowledge.com] at Data Center World.
