
Asetek LCLC Takes Liquid Cooling Mainstream

bigwophh writes "Liquid cooling a PC has traditionally been considered an extreme solution, pursued by enthusiasts trying to squeeze every last bit of performance from their systems. In recent years, however, liquid cooling has moved toward the mainstream, as evidenced by the number of manufacturers producing entry-level, all-in-one kits. These kits are usually easy to install and operate, but at the expense of performance. Asetek's aptly named LCLC (Low Cost Liquid Cooling) may resemble other liquid cooling setups, but it offers a number of features that set it apart. For one, the LCLC is a totally sealed system that comes pre-assembled. Secondly, plastic tubing and a non-toxic, non-flammable liquid are used to overcome evaporation issues, eliminating the need to refill the system. And to further simplify the LCLC, its pump and water block are integrated into a single unit. Considering its relative simplicity, silence, and low cost, the Asetek LCLC performs quite well, besting traditional air coolers by a large margin in some tests."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward
    Heck, I'm typing this on an out-of-the-box ~4 year old liquid-cooled Power Mac G5....
    • What's strange is that TFS extols the virtues of the new part, which sounds just like the AC Delco part that Apple used.
  • by bobdotorg ( 598873 ) on Saturday April 12, 2008 @05:49PM (#23049776)
    • Re: (Score:3, Interesting)

      Too bad they didn't compare it to a good air cooling solution like the Thermalright IFX-14 or Ultra-120.
    • Ummmmm (Score:2, Insightful)

      Wouldn't "is a totally sealed system" take care of "evaporation issues, eliminating the need to refill the system" without requiring "plastic tubing and a non-toxic, non-flammable liquid"???? I'm just saying....
      • Re:Ummmmm (Score:5, Informative)

        by ncc74656 ( 45571 ) * <scott@alfter.us> on Saturday April 12, 2008 @11:41PM (#23051744) Homepage Journal
        If you had RTFA, you would've found that making a sealed system apparently isn't enough by itself. The silicone tubing used in most liquid-cooling rigs is somewhat permeable, so water can seep through it and evaporate. Replacing silicone with vinyl fixes that, at the expense of slightly increased rigidity.
        • Re: (Score:3, Interesting)

          by kd4zqe ( 587495 )
          This is very true. I just recently disassembled my system in favor of a Core 2 Duo machine. I built the rig because my 1st-gen P4 3.6GHz was a pain to air-cool efficiently. About a year after assembling the system, I noticed that the temps climbed rapidly moments after power-up. I found that almost all my fluid had gone from the system.

          What I thought was fluid was actually UV dye that had permeated the silicone tubing from the cooling solution. Additionally, when I stripped the system, all the tubing
  • by mrogers ( 85392 ) on Saturday April 12, 2008 @05:51PM (#23049782)
    I'm surprised liquid cooling is still seen as a fringe/hobbyist technique. With water (or oil) having a much higher heat capacity than air, I would have thought liquid cooling would make sense for datacentres: instead of huge electricity bills for A/C, you could just plumb each rack into the building's water system (via a heat exchanger, of course; I don't really want to drink anything that's passed through a server rack). Does anyone know if this has been tried, and if so, why it didn't work?
    • Re: (Score:3, Interesting)

      by jfim ( 1167051 )
      As far as I know, that's what project Blackbox uses for cooling. Note the blurb where it specifies the water connectivity requirements [sun.com].
      • by algae ( 2196 )
        I believe that Sun's Blackbox uses water cooling for refrigeration between racks, not as a method of cooling the server hardware directly. Like the sibling poster says, too much risk of leakage near the electrical bits. With however many gallons/sec Blackbox requires though, you can turn a lot of hot air back into cold air and just move it around in a circle.
    • Re: (Score:2, Funny)

      by notgm ( 1069012 )
      do you really want plumbers called in when your site is down?
      • Re: (Score:1, Funny)

        by Anonymous Coward
        Not sure what you mean, isn't that what sys admins are anyway?

    • by ZeroExistenZ ( 721849 ) on Saturday April 12, 2008 @06:11PM (#23049910)

      I would have thought liquid cooling would make sense for datacentres: instead of huge electricity bills for A/C, you could just plumb each rack into the building's water system

      There are a few things that come to mind:

      • - A datacenter might have different clients renting a cage and running their own servers; you can't enforce the use of watercooling, so AC will have to be present and running in any case.
      • - Water + electricity is a risk. With tight SLAs, you don't want to fry your server and the extra investment in its redundant failover hardware along with it.
      • - Available server hardware isn't typically watercooled. Who's going to convince the client that hacking a watercooled system onto your most critical hardware is a good decision? For defects, a support contract with the hardware vendor is typical; if you mod it and soak it, you're out of warranty and can't fall back on your external SLA.
      • - Electricity "bills" aren't an issue: you get only so many amps you can run on each cage you rent, and you keep under that or you have to rent another cage (notice an advantage for the datacenter here?). It's always part of the calculated cost, so it's really a non-issue for datacenters, or for you when you want to rent a part of one.
      • Re: (Score:3, Insightful)

        by pavera ( 320634 )
        I don't know where you are hosting where "electricity bills" don't matter.

        I have systems hosted in 3 different DCs, 3 different companies. All of them raised their rates in the last year by 20-30% in one way or another. One DC includes the electricity in your flat monthly bill; the only incremental charge in that DC is bandwidth (i.e. you get 100 GB of transfer, and if you go over it's some dollars per GB). They raised their flat rate 20%, citing higher electricity costs.

        The other 2 DCs provide metered electricity
      • Re: (Score:2, Insightful)

        Y'all are basically idiots.

        I just came from NASA Ames Research Center (talk about heavy supercomputing!), and they are heavily water-cooled. Right now they have coolers on each of the processor blocks and radiators on the backs of the cabinets, but are quickly moving to directly chilling the water.
        They use quality hoses and fittings, no leakage.
        The efficiency is so much higher than air, and it makes the operating environment much nicer. (They have people in there regularly swapping out drives, tapes, whatev

      • I actually have a rack of watercooled equipment sitting in a datacenter.
        Air cooling was not an option because the air-cooling system was maxed out for that floor, whilst there was plenty of floorspace left.
        (Blame it on the silly cooling requirements of blade servers.)
    • by greyhueofdoubt ( 1159527 ) on Saturday April 12, 2008 @06:16PM (#23049954) Homepage Journal
      Because air has some undeniable advantages over water:

      -Free (both source and disposal)
      -Non-conductive
      -Non-corrosive
      -Lightweight
      -Will not undergo phase change under typical or emergency server conditions (think water>steam)
      -Cooling air does not need to be kept separate from breathing air, unlike water, which must be kept completely separate from potable water

      Imagine the worst-case scenario concerning a coolant failure WRT water vs air:
      -Water: flood server room/short-circuit motherboard or power backplane/cooling block must be replaced (labor)
      -Air: Causes the processor to scale down its clock speed

      I don't think water/oil cooling is ready for mainstream data farm applications quite yet. I also think that future processors will use technology that isn't nearly as hot and wasteful as what we use now, making water cooling a moot point.

      -b
      • by KillerBob ( 217953 ) on Saturday April 12, 2008 @07:13PM (#23050238)

        -Non-corrosive


        Air is one of the most corrosive substances there is. Specifically, the oxygen in the air is. It just takes time. Normally, a server won't be in operation long enough for this kind of corrosion to happen, especially if it uses gold-plated contacts, but it will happen.

        Air is less corrosive. But depending on the liquid that's in use in a liquid cooling rig, it usually isn't corrosive or dangerous to a computer anyway. Liquid cooling rigs usually use an oil such as mineral oil or an alcohol like propanol, neither of which is particularly harmful to electronics.

        Also... while it's a technicality, air *is* conductive. It just has a very high impedance. It *will* conduct electricity, and I'm pretty near certain you've seen it happen: it's called lightning.

        Finally... if your server is running hot enough that mineral oil is boiling off, you've got more serious things to worry about than that. (Its boiling point varies, based on the grade, between 260-330°C -- http://www.jtbaker.com/msds/englishhtml/M7700.htm [jtbaker.com] )
        • by evanbd ( 210358 ) on Saturday April 12, 2008 @10:11PM (#23051328)

          Also... while it's a technicality, air *is* conductive. It just has a very high impedance. It *will* conduct electricity, and I'm pretty near certain you've seen it happen: it's called lightning.

          If you want to get all technical about it, you're basically wrong. The resistivity of air is exceedingly high. However, like all insulators, it has a breakdown strength, and at electric field strengths beyond that, the conduction mode changes. It's not simply a very high value resistor -- nonconducting air and conducting air are two very different states, which is the reason lightning happens. The air doesn't conduct, allowing the charge to build higher and higher, until the field is strong enough that breakdown begins.

          For materials with resistivity as high as air in its normal state, it's not reasonable to call them conducting except under the most extreme conditions. Typical resistance values for air paths found in computers would be on the order of petaohms. While there is some sense in which a petaohm resistor conducts, the cases where that is relevant are so vanishingly rare that it is far more productive to the discussion to simply say it doesn't conduct.

          This is one of those cases. Claiming that air is conductive is detrimental to the discussion at best.
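
          To put rough numbers on that (a back-of-the-envelope sketch in Python; the resistivity figure and the gap geometry are assumptions, since real values vary a lot with humidity):

            # Rough resistance of a small air gap inside a PC, treating air as a plain resistor.
            rho = 1e16      # resistivity of dry air, ohm-metres (order-of-magnitude assumption)
            gap = 1e-3      # 1 mm gap between two conductors, metres (assumed)
            area = 1e-4     # 1 cm^2 of facing area, square metres (assumed)

            resistance = rho * gap / area
            print(f"{resistance:.0e} ohms")   # ~1e17 ohms, i.e. hundreds of petaohms

          Even with generous geometry the result stays up in the petaohm range, which is why treating air as a non-conductor is the productive way to talk about it.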

      • by Artuir ( 1226648 )
        I'm sure there's some engineering to be done to solve that problem for servers. You could run copper piping through the entire solution (this IS a server; no expense is spared, no need for flimsy tubing) that would be good for 70+ years. You don't typically keep servers operating 70 years. Well, at least *I* don't. Some of you guys might, just to brag about how much uptime your Linux box has to your great-grandchildren.

        Linux bastards.
      • If you live in a desert climate, air cooling sucks, and if the dust is not dealt with at regular intervals things fail quickly. First dust starts to accumulate on the fan blades (unevenly), putting them out of balance and thus placing greater strain on their bearings. Meanwhile, Intel's ingenious design of their retail cooling fan and heatsink ends up being clogged with dust. The ambient temperature inside the chassis begins to increase as the chassis fan and PSU fans have now ceased, leaving only the higher power cp
      • I can see a generation or two of blade-type applications returning to a CRAY-style apparatus:
        http://upload.wikimedia.org/wikipedia/commons/e/e4/Cray-1-p1010221.jpg [wikimedia.org]

        You might not have an entire DC relying on a common non-air cooling implementation, but doing it for a complete rack-sized unit is feasible.

        I'd personally like to see an entire rack siliconed up and flooded with mineral oil.
    • A DC might have 20,000 servers. That heat has to go SOMEWHERE. If it's pumped into the ambient air just like an air-cooled machine, you're still going to need large AC units to move that hot air out of the DC.
      • With the caveat that thermodynamics scares and confuses me, if you have a bunch of heat coming out of the servers' water-coolers, couldn't you pipe that into a heat pump and recover some cooling energy or even electricity? I'm familiar with a local facility which operates its air conditioning systems on steam, though I forget the name of the technology at the moment.
        • With the caveat that thermodynamics scares and confuses me, if you have a bunch of heat coming out of the servers' water-coolers, couldn't you pipe that into a heat pump and recover some cooling energy or even electricity?

          Yes. Now, THAT would be smart. Eliminate the cost of water heaters, augment winter HVAC bills, etc. Steam power plants use "waste" energy, the heat left over in the water after it runs the main turbines, to preheat the water going into the boiler. There's usually heat left over after THAT, and it is at a good temp for use in the power plant building itself. Any heat sent back out to the environment is wasted, and wasted energy = wasted $$.

          Now, if it's enough wasted energy to warrant the cost of ca

          • Now, if it's enough wasted energy to warrant the cost of capturing it... Depends. That's what they pay engineers for, to figure that out.

            Which means Google already probably knows the answer. ;) Those guys are ruthlessly (and brilliantly) cheap.

    • instead of huge electricity bills for A/C you could just plumb each rack into the building's water system

      Or you could use magical pixie dust...

      The ONLY THING water cooling does is (potentially) provide a larger surface area to disperse the heat. It does not magically "cool" anything. Unless ambient temperatures are always much lower than you want your datacenter to be, you'll still be running the water through an A/C. And if you're lucky enough to be someplace that ambient temperatures are always that lo

      • by eagl ( 86459 ) on Saturday April 12, 2008 @06:44PM (#23050096) Journal

        The ONLY THING water cooling does is (potentially) provide a larger surface area to disperse the heat.
        So totally wrong/ignorant... Is this a troll? Water cooling does a lot more than that.

        1. Can be a LOT quieter than normal air cooling.
        2. Allows for heat removal with a much smaller heat exchange unit on the heat source.
        3. Allows for heat transfer to a location less affected by the excess heat being dumped (such as outside a case), instead of just dumping the heat in the immediate vicinity of either the item being cooled or other components affected by heat.

        There are other reasons, but these alone are more than enough. Did you not know these, or were you just trolling?
        • It's also cheaper to pump a small amount of water than a huge volume of air.
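
          Rough numbers (assuming, say, a 500 W box and letting the coolant warm by 10 °C; the load and temperature rise are made-up example figures, the heat capacities are textbook values):

            # Volumetric flow needed to carry Q watts away with a coolant temperature rise of dT.
            Q = 500.0    # heat load, watts (assumed example)
            dT = 10.0    # coolant temperature rise, kelvin (assumed)

            water = 1000.0 * 4186.0   # volumetric heat capacity of water, J/(m^3*K)
            air = 1.2 * 1005.0        # volumetric heat capacity of air, J/(m^3*K)

            water_flow = Q / (water * dT)   # m^3/s
            air_flow = Q / (air * dT)       # m^3/s

            print(f"water: {water_flow * 60000:.1f} L/min")     # ~0.7 L/min
            print(f"air:   {air_flow * 2118.9:.0f} CFM")        # ~88 CFM
            print(f"air needs ~{air_flow / water_flow:.0f}x the volume")

          Moving a fraction of a litre of water per minute takes a tiny pump; moving ~90 CFM of air takes real fan power (and noise).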
        • Re: (Score:3, Insightful)

          by evilviper ( 135110 )
          1. Is a result of the larger heat exchange area. And makes no difference in a data center.
          2. No benefit for any practical application. Definitely makes no difference in a data center.
          3. Does not affect the cooling costs of a data center in the slightest.

          Nothing about water cooling will reduce the cooling and energy costs of a data center IN THE SLIGHTEST. You're doing a lot of magical thinking, with NO experience in the subject.
          • by eagl ( 86459 )
            My experience is with datacenters that are apparently not as generic as the ones you seem to be claiming experience with. Just because you do things one way, and maybe always will, does not mean that every customer's requirements can be satisfied by your cookie-cutter approach.

            Put a datacenter 300 ft underground, and see how far simple air cooling gets you. In that case, there MUST be a way to dump the heat that doesn't involve simply blowing air around. If it works for you, that's fine. But attem
            • Put a datacenter 300 ft underground, and see how far simple air cooling gets you.

              Do you have ANY experience with ANY air conditioning? You don't seem to have the foggiest idea how they fundamentally work.

              An A/C will work superbly underground, with less work on the building itself (i.e. never mind the per-machine hook-ups).
              • by eagl ( 86459 )
                You keep going on about experience, when it's obvious you're the one with the limited experience, since you have not seen any actual applications that require any more thought than "stick some more fans on it and it'll be ok" or "well, just put another AC exchanger on the roof and it'll probably work fine". You still have to get the heat out. And that's the whole point of water cooling: getting the heat out without relying on whooshing air around (whether it's air blowing on heatsinks or air in a clim
                • It's almost amusing to see you continually backpedaling, after every factually incorrect statement...

                  Good luck with your subterranean server farm.
          • Re: (Score:2, Interesting)

            by cheier ( 790875 )
            Liquid cooling can affect the energy costs in a big way depending on how well integrated the system is. As an example, CoolIT Systems had developed a server rack with an integrated liquid cooling system that they showed off at CES this year. The rack essentially used hydraulic fittings to allow you to hot-swap systems from the chassis while still keeping the cooling centralized.

            They had essentially used the radiator from a Honda Accord, which they found to be able to dissipate between 25 and 35 kW o
            • Liquid cooling can affect the energy costs in a big way

              No, it really can't.

              With a system like this centralizing the area where heat is dumped, fluids can be piped out to a radiator sitting outside, so essentially a large portion of the heat produced from a rack of computers can be relocated outside of the data center.

              You could similarly open up a data center, with just large fans blowing ambient air in and out.

              With either method, it just wouldn't work. A $50,000 server rack is not your home PC. It's not

              • by BLKMGK ( 34057 )
                The biggest issue with running a datacenter on 40°C ambient air with big fans blowing it in and out the doors is that air cooling is so inefficient that the cooled components would overheat, since they pick up so little of a temperature drop from AIR. 40°C WATER cooling, on the other hand, would bring those CPU and HDD temps down a good bit.

                You're failing to understand just how much better water transfers heat vs air.
            • by BLKMGK ( 34057 )
              This is actually pretty amusing, as when I set up my home office I designed much the same thing! Sadly I didn't put it into place, but indeed it would have worked quite well, I'm sure. Radiator in the crawlspace; temperature-sensing electric cooling fan mounted on the radiator (Ford Taurus fan, most likely). Copper or PVC piping up through the floor into the office in a loop, with a shutoff valve in the middle to regulate bypass flow. An agro pump to move the liquid, or perhaps a small pool pump. Fittings on the pipe mounted
          • Re: (Score:2, Informative)

            by jack8609 ( 1217124 )
            As someone who makes their living figuring out how to move heat from A to B (in avionics, not datacenters), this comment makes my head hurt for a number of reasons... First off, as others have pointed out, liquid cooling in data centers is a reality, and folks like IBM have worked on liquid cooling for decades. Due to many of the reasons already mentioned, everyone avoids liquid cooling as long as they can, and a number of technologies have helped on this. For example, the transition from Bipolar to CMOS a
        • As any scuba diver could have told him, water conducts heat far more efficiently than air. IIRC it is a factor of about 25.
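
          Standard room-temperature figures put it right around there (reference thermal conductivities, not measurements of any particular rig):

            # Thermal conductivity near 25 °C, standard reference values.
            k_water = 0.60    # W/(m*K)
            k_air = 0.026     # W/(m*K)
            print(f"water conducts ~{k_water / k_air:.0f}x better than air")   # ~23x

          And that's before counting water's far higher heat capacity per unit volume.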
          • by adolf ( 21054 )
            Ok, you've got a brain, so you should also know this:

            Given a water cooled rig and an air cooled rig which operate at the same efficiency (in terms of Watts dissipated per Watt of cooling power), water cooling and air cooling perform just about identically as long as things remain inside of the case.

            Move the water cooling system's radiator outside of the case, and things start to slant toward water cooling.

            Observations:

            1. They're equal in cooling capacity, but the air-cooled system is simpler and has fewer
            • by BLKMGK ( 34057 )
              You haven't actually ever built and run a water cooled rig have you?
              • by adolf ( 21054 )
                No, I have not.

                Would the basic rules of thermodynamics change if I had?

                • by BLKMGK ( 34057 )
                    What would change is that you would actually have a better understanding of what you were talking about, vs. simply making broad assumptions with ZERO experience. You start by assuming that air and water systems work at the same efficiency and that mounting the radiator inside the case somehow puts them there. This is not true; had you actually worked with any of this, you would know how well it works even with a radiator in the case.

                  But hey, all of the various big guys moving to water to cool their large s
                  • by adolf ( 21054 )
                    My statements are true. If you do not understand them, then please ask for clarification. If you'd like to refute them, feel free to use your anecdotal evidence to do so.

                     But all you're doing is waving your hands around and talking about "various big boys," as if the mere notion of it growing in popularity obviously means that it is better, while insinuating that the total concept of dissipating waste heat into air is impossible to grasp without actually experiencing a water cooling rig first.

                    But rather th
      • You could run the hot water for the building through a heat exchanger before you heat it up with a boiler: cold water -> heat exchanger -> warm water -> boiler -> hot water. Overall, the energy used to go from cold water to warm water is saved.
        • The same is equally true for an A/C unit, as it is for a liquid system.
        • Not only that but if you attach a Maxwell's Demon to the output you can get cooled water separated out from the hot and then feed that back in to the cooler while sending the separated hot water to the boiler!!!
        • You could run the hot water for the building through a heat exchanger before you heat it up with a boiler: cold water -> heat exchanger -> warm water -> boiler -> hot water. Overall, the energy used to go from cold water to warm water is saved.

          Unless your datacenter is co-located with a (large) laundromat, there just isn't that much demand for hot water at a datacenter. No laundry, no showers, little to no cooking.

          Someone check my numbers.
          Tap water: 7 °C
          Water heater hot water: 50 °C
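
          A rough check of those numbers (the 10 kW rack load below is an assumed example, not anything from the article):

            # Energy to bring mains water from 7 °C up to 50 °C, and how much water an
            # assumed 10 kW rack could heat per hour if all its waste heat were captured.
            c_p = 4186.0          # specific heat of water, J/(kg*K)
            t_in, t_out = 7.0, 50.0
            rack_heat = 10_000.0  # watts (assumed example load)

            energy_per_litre = c_p * (t_out - t_in)              # ~180 kJ per litre
            litres_per_hour = rack_heat * 3600 / energy_per_litre

            print(f"{energy_per_litre / 1000:.0f} kJ (~{energy_per_litre / 3.6e6:.2f} kWh) per litre")
            print(f"~{litres_per_hour:.0f} litres/hour from a 10 kW rack")

          That's on the order of 200 litres an hour per rack, far more hot water than a typical datacenter can use itself. There's also the catch that coolant leaving a rack is often well below 50 °C, so in practice you might only be preheating.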

          • Put a datacenter near a city center and you could sell the warm water to steam producers.

            At least in the US most major city centers use steam to heat their buildings.
    • by ibeleo ( 319444 )
      IBM recently released the p575, whose specs state "Cooling requirements: Chilled inlet water supply/return required for all systems". They also have a kit to turn the rear door of a rack cabinet into a heat exchanger. So there is a move in that direction.

      p575 spec http://www-03.ibm.com/systems/power/hardware/575/specs.html [ibm.com]
      Rear door exchanger http://www-132.ibm.com/webapp/wcs/stores/servlet/ProductDisplay?catalogId=-840&storeId=1&langId=-1&dualCurrId=73&categoryId=4611686018425028106&productId= [ibm.com]
    • Re: (Score:3, Interesting)

      by diablovision ( 83618 )
      The Black Box [sun.com] is a complete watercooled data center in a shipping container.
    • Given that they use AC because they can't be bothered to organise a proper air cooling system (pumping the hot air out of the back of the server instead of cooling all the air in the room, etc.), it's simply because it's cheaper to use AC than to actually organise anything.

      You do have a good point though: use of a non-conductive oil, cooled against water pipes, would mean the servers are just as safe as they are at the moment.
    • If a server came ready-built with fail-safe plumbing and cooling mechanisms, the answer would be yes. Water, oil, Fluorinert: these would all be excellent. Total immersive cooling would be more logical than piped cooling, as there are fewer parts that could fail and less possible damage from a failure. You could have a completely sealed compute unit that contained everything and was ready to go, eliminating any special skills on the part of the data centre or any special requirements in the way of plumb
    • by kisielk ( 467327 )
      It's being done though not on the system level but on the rack level. SGI's ICE platform has water-chilled doors: http://www.sgi.com/products/servers/altix/ice/features.html [sgi.com]

      This is a great bonus for high density HPC applications. Typically in a datacenter you are blowing air up from the raised floor in front of the servers. However, a good deal of it is taken up by the servers in the lower part of the rack, leaving the top servers running warmer than the lower servers. Supposedly the water chilled doors hel
    • It did work. IBM used it in the past. Check Google for "water-cooled ibm"
    • Liquid cooling hit the mainstream, mainframes in fact, 'way back in the early 1980s with the IBM 3090 mainframe. What we now call a water block, IBM called the Thermal Conduction Module. According to an article in Scientific American at the time, it combined a water block and chip packaging. The metal of the TCM directly contacted the chip substrates.

      Does anyone know about any water-cooled supercomputers?
    • by eth1 ( 94901 )
      I too have often wondered why more datacenters don't use water. (aside from the fact that you'd have to build the place from the ground up for it, probably)

      I think a standardized interface could help immensely:
      - each rack-mount server has a cold water input on one side, and a hot water output on the other.
      - the rack has a cold water rail up one side and a hot return down the other.
      - under the floor, each rack is plugged into hot and cold "bus" pipes, which feed into one of those industrial waterfall coolers
  • Even though the article tries hard to tout its benefits, its own stats show it's not worth it. Either it's a crappy implementation or it's simply not relevant.
    • Re: (Score:3, Insightful)

      by eagl ( 86459 )

      Even though the article tries hard to tout its benefits, its own stats show it's not worth it. Either it's a crappy implementation or it's simply not relevant.

      How so? They show that it's quieter and more effective than stock cooling, and significantly quieter than an aftermarket air cooling solution. What exactly are you looking for then? You gotta be more specific than just a completely unsupported criticism that doesn't even reflect the test results, let alone explain your personal criteria.

      Here, try somet

      • by gelfling ( 6534 )
        You don't have to be an entirely patronizing asshole. But I'm guessing you don't work in sales.

        Ok, so it's marginally quieter. As for its absolute cooling power, it's on par with whatever air-cooled unit you can get today, and those have a lot less complexity. All in all that's pretty weak justification. If that's your definition of "it works well," then you clearly care about noise above all other criteria. There are probably better ways to make your PC quieter than this.
        • by eagl ( 86459 )

          You don't have to be an entirely patronizing asshole.

          It's the internet, so actually I do (heh).

          But you are still arguing from a position lacking in factual information. Water cooling can be almost completely silent, and can remain so even when cooling hardware that would otherwise require very loud fans for conventional air cooling.

          This does not even address the additional cooling requirements seen in overclocking, small form factor, or otherwise special use equipment. A water cooled HTPC for example typically has to trade off performance for noise, as hig

          • by gelfling ( 6534 )
            So it's just quieter. Again, if that's your main concern, then fine. There are probably less problematic ways to address that. I for one would not want to ever have to worry about hundreds of little water cooling systems in a data center each with the potential to break down and cause a catastrophic failure among many machines.

            Back in the old TCM days you might have a dozen CEM complexes each with a TCM ganged into a single chiller pump system. That's a kind of failure rate that's manageable. But with 5000
            • by eagl ( 86459 )
              In a datacenter using blade servers, I'd expect some sort of hybrid heat exchange system would be more useful than pure water cooling. I strongly disagree that the only benefit is lower noise, but also remember that we're not just talking about datacenter applications here. All sorts of applications (such as the HTPC setup I described) could get not only lower noise but also higher performance, due to the better managed thermal load.

              And that is all water cooling does - allow a better and more manag
  • plastic tubing and a non-toxic, non-flammable liquid are used to overcome evaporation issues,

    If it is truly sealed, there should never be any "evaporation issues," as there is nowhere for it to evaporate to. Being non-toxic and non-flammable has nothing to do with it; I can think of another very common non-flammable, non-toxic (in most of its forms and uses) compound that's readily available but is NOT used, specifically because it tends to boil at relatively low temps and low pressures: dihydrogen monoxide. As for plastic tubing, what else are you going to make it from? Metal? You could, but most sys

    • by dwater ( 72834 )
      I noticed this too. I wonder what they were trying to say.

      ...or is it just marketing crap?
    • The article says that most water cooling systems use silicone tubing, which the author seems to think is not a plastic. I'm not an expert on plastics, but PVC seems like a poor choice to me. It's too likely to degrade over a decade or so and become brittle or fragile.
      • I'm not an expert on plastics, but PVC seems like a poor choice to me. It's too likely to degrade over a decade or so and become brittle or fragile.

        Like the PVC drainpipes in modern houses?
        Like the insulation on your home's wiring?
    • As for plastic tubing, what else are you going to make it from? Metal?

      Rubber?

    • by BLKMGK ( 34057 )
      The clear plastic "bling" tubing is often medical grade, and guess what? Over time liquid EVAPS from such systems. How? It actually manages to be absorbed by the tubing and slowly dissipate into the air, which is why this system uses a different kind of tubing and why they highlighted its lack of evaporation issues. You haven't run a liquid cooled rig, have you?

      Oh and plain old water is a BAD idea in a liquid cooled PC. For one it tends to oxidize things like copper heatsinks over time and for another you ge
  • I thought the G5 Power Mac took liquid cooling mainstream in 2004.

    I guess this is one of those phrases, like "the Year of Linux on Desktop," that we'll hear ad infinitum.
    • by dwater ( 72834 )
      Is that only one machine?

      If so, I don't see how that could be considered mainstream. Perhaps I misunderstand the term, but to me it means it is used on many different computers, not just one.

      Perhaps 'mainstream' is valid in this case because the one model sold a lot? I don't think that fits with my understanding of the word, but it's at least debatable, I suppose.
  • It doesn't seem much different to the Gigabyte kit I put in my computer 2 years ago http://www.cluboc.net/reviews/super_cooling/gigabyte/galaxy/index.htm [cluboc.net], the only difference being the pre-built bit, which could cause great difficulty if you want to do something sensible like mount the radiator on the outside. (Note: soon after I got mine they released a second version with a different pump and reservoir, and I can tell why; after 13 months, just out of warranty, my reservoir cracked.)
  • Didn't liquid cooling go mainstream when Apple used it in the last generation of Power Mac G5s?
  • This is kind of inevitable, and IMHO overdue. Monolithic heat sinks and fans the size of jet engine intakes have been a pain in the arse for top of the range gaming machines for years. Also, I don't know about anyone else, but the air cooling of my computer is a depressingly efficient mechanism for sucking dust and fluff into the computer and keeping it there.

  • Shuttle PCs have had a heat-pipe and heat exchanger liquid cooling system for years. This made possible their little "breadbox" systems.

    • by BLKMGK ( 34057 )
      Intel and AMD systems are also using heat pipes, just like the Shuttle XPCs, and have been for a year or three now. All of the best "air" heatsinks I am aware of use some liquid in them in the same fashion. Shuttle just managed to build it such that the radiator was a little further divorced from the heat source, is all.
    • The important difference here is that the heat exchanger in Shuttle PCs uses heat pipes, which, through the use of a pressurized fluid, utilize phase change to transport heat. Phase change systems include heat pipes and those using compressors. Liquid cooling in PCs involves pumps, tubing/piping, and a liquid; the fluid never changes to a gas. In this way, your point does not apply to the liquid cooling topic.
  • Welcome to 2008, OP. Sealed systems have been on the market for months. You can even find the Cooler Master Aquagate (on the market since 2007) in some of the larger retail stores.
    • Re: (Score:2, Informative)

      Hell, I've been using a "Consumer grade" easy to use water cooling system in my desktop for over a year in the form of the Titan Robela (http://www.titan-cd.com/eng/watercase/robela.htm or http://www.inland-products.com/singleproduct.asp?search=accessories&partnum=03011 [inland-products.com])

      I have the black Al faced one for longer PSs. It was extremely easy to set the water cooling up, and has kept my machine cool even with two extra blocks for the SLI cards and a chipset cooler. Yes it's not sealed, but then again, is that
  • by bcmm ( 768152 )
    What is this non-toxic, non-flammable liquid, given that it probably isn't allowed to be a CFC?
  • I can't help but think that this is a stop-gap measure. I used to read up on all the various methods of silencing a computer (with the intention of implementing them myself), but for consumer-grade applications I'd prefer to wait for a variant of Moore's Law to do its work: the propensity for performance per watt to keep increasing until it nears whatever limits are predicted by information theory.

    At that stage there will be an option to cool with no moving parts for typical desktop/laptop applications, and it w
  • Anybody actually find where you can buy this system? The article only says that they found one and the price for a minimal setup, but not where. I'm upgrading soon and this could be a good addition to some new hardware. Googling for "asetek liquid cooling system" only finds pages of news articles :(
    • Unfortunately, Asetek does not sell to the retail market, and has no plans to ever sell to the retail market. This means that even if you do manage to get ahold of one, you will get zero support from the manufacturer if anything goes wrong. Not something I'd risk.
  • when some standards have been defined and actually used. I'm sure one day we'll have an 'ATX+' power supply. As well as the plethora of wires hanging out the back of it, we'll have some loops of tubing with heat-exchangers on the end. Maybe standard ones for chipset, CPU and a couple of GPU ones. Buy a new graphics card and just snap on the right heatsink. It's never going to take off until the systems are all sealed (My mum is not going to buy a Dell with a bottle of 'UV Reactive' magic solvent). Sealed sy
    • While I completely agree with the need for standard fittings on the plumbing, is the high-voltage portion of your computer really the best place for the waterworks portion as well?
