Asetek LCLC Takes Liquid Cooling Mainstream

bigwophh writes "Liquid cooling a PC has traditionally been considered an extreme solution, pursued by enthusiasts trying to squeeze every last bit of performance from their systems. In recent years, however, liquid cooling has moved toward the mainstream, as evidenced by the number of manufacturers producing entry-level, all-in-one kits. These kits are usually easy to install and operate, but at the expense of performance. Asetek's aptly named LCLC (Low Cost Liquid Cooling) may resemble other liquid cooling setups, but it offers a number of features that set it apart. For one, the LCLC is a totally sealed system that comes pre-assembled. Secondly, plastic tubing and a non-toxic, non-flammable liquid are used to overcome evaporation issues, eliminating the need to refill the system. And to further simplify the LCLC, its pump and water block are integrated into a single unit. Considering its relative simplicity, silence, and low cost, the Asetek LCLC performs quite well, besting traditional air coolers by a large margin in some tests."

  • by Anonymous Coward on Saturday April 12, 2008 @06:46PM (#23049758)
    Heck, I'm typing this on an out-of-the-box ~4 year old liquid-cooled Power Mac G5....
  • by ZeroExistenZ ( 721849 ) on Saturday April 12, 2008 @07:11PM (#23049910)

    I would have thought liquid cooling would make sense for datacentres - instead of huge electricity bills for A/C, you could just plumb each rack into the building's water system

    There are a few things that come to mind:

    • - A datacenter might have different clients renting cages and running their own servers, so you can't enforce water cooling; A/C has to be present and running in any case.
    • - Water + electricity is a risk. With tight SLAs, you don't want a leak to fry your server and the redundant failover hardware you invested in along with it.
    • - Off-the-shelf server hardware isn't typically water-cooled. Who's going to convince the client that hacking a water-cooling loop onto their most critical hardware is a good decision? For defects, a support contract with the hardware vendor is typical; if you mod it and soak it, you're out of warranty and can't fall back on your external SLA.
    • - Electricity "bills" aren't the issue: you get a fixed number of amps per cage, and if you need more you rent another cage (notice an advantage for the datacenter here?). Power is always part of the calculated cost, so it's a non-issue for the datacenter, and for you when you rent part of one. A rough sketch of that power-budget arithmetic follows below.
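    To illustrate that last point: the binding constraint is the amperage provisioned to the cage, not the metered bill. A minimal Python sketch of that budget arithmetic, with every number assumed purely for illustration (not a quote from any datacenter):

        # Rough power budget for a rented cage.
        # All figures below are assumptions for illustration only.
        CIRCUIT_AMPS = 20          # amps provisioned to the cage (assumed)
        VOLTAGE = 208              # volts, a common datacenter circuit (assumed)
        DERATE = 0.80              # load circuits to ~80% of rating (assumed practice)
        SERVER_DRAW_WATTS = 350    # average draw per server (assumed)

        usable_watts = CIRCUIT_AMPS * VOLTAGE * DERATE
        servers_per_cage = int(usable_watts // SERVER_DRAW_WATTS)

        print(f"Usable power per cage: {usable_watts:.0f} W")
        print(f"Servers that fit under the cap: {servers_per_cage}")
        # Exceed the cap and the only remedy is renting another cage,
        # which is why the amp limit, not the kWh bill, is what tenants plan around.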
  • by greyhueofdoubt ( 1159527 ) on Saturday April 12, 2008 @07:16PM (#23049954) Homepage Journal
    Because air has some undeniable advantages over water:

    -Free (both source and disposal)
    -Non-conductive
    -Non-corrosive
    -Lightweight
    -Will not undergo phase change under typical or emergency server conditions (think water>steam)
    -Cooling air does not need to be kept separate from breathing air, unlike water, which must be kept completely separate from potable water

    Imagine the worst-case scenario concerning a coolant failure WRT water vs air:
    -Water: flood server room/short-circuit motherboard or power backplane/cooling block must be replaced (labor)
    -Air: Cause processor to scale down clock speed

    I don't think water/oil cooling is ready for mainstream data farm applications quite yet. I also think that future processors will use technology that isn't nearly as hot and wasteful as what we use now, making water cooling a moot point.

    -b
  • by gelfling ( 6534 ) on Saturday April 12, 2008 @07:22PM (#23049978) Homepage Journal
    Even the article tries hard to tout its benefits, but their own stats show it's not worth it. Either it's a crappy implementation or it's simply not relevant.
  • by MightyYar ( 622222 ) on Saturday April 12, 2008 @07:24PM (#23049986)
    Let me know when Asetek sells as many kits as Apple sells computers.
  • by eagl ( 86459 ) on Saturday April 12, 2008 @07:35PM (#23050050) Journal

    Even the article tries hard to tout its benefits, but their own stats show it's not worth it. Either it's a crappy implementation or it's simply not relevant.


    How so? They show that it's quieter and more effective than stock cooling, and significantly quieter than an aftermarket air cooling solution. What exactly are you looking for then? You gotta be more specific than just a completely unsupported criticism that doesn't even reflect the test results, let alone explain your personal criteria.

    Here, try something like this next time:

    It looks like a good/bad item because the performance was/was not what I'd expect from a water cooling system costing [insert price here]. You can get similar/better performance from [insert alternative product here] for less. Tradeoffs with the alternative are it's quieter/cheaper/louder/more expensive, but based on my own criteria of [insert your own priorities here], I think this product is great/teh suck.

    Give it a shot, you might like it.

  • by evilviper ( 135110 ) on Saturday April 12, 2008 @09:33PM (#23050698) Journal
    1. Is a result of the larger heat exchange area. And makes no difference in a data center.
    2. No benefit for any practical application. Definitely makes no difference in a data center.
    3. Does not affect the cooling costs of a data center in the slightest.

    Nothing about water cooling will reduce the cooling and energy costs of a data center IN THE SLIGHTEST. You're doing a lot of magical thinking, with NO experience in the subject.
  • Ummmmm (Score:2, Insightful)

    by Have Brain Will Rent ( 1031664 ) on Saturday April 12, 2008 @10:12PM (#23050930)
    Wouldn't "is a totally sealed system" take care of "evaporation issues, eliminating the need to refill the system" without requiring "plastic tubing and a non-toxic, non-flammable liquid"???? I'm just saying....
  • by pavera ( 320634 ) on Saturday April 12, 2008 @10:55PM (#23051238) Homepage Journal
    I don't know where you are hosting where "electricity bills" don't matter.

    I have systems hosted in 3 different DCs, 3 different companies. All of them raised their rates in the last year by 20-30% in one way or another. One DC includes electricity in your flat monthly bill; the only incremental charge there is bandwidth (i.e., you get 100 GB of transfer, and if you go over it's some dollars per GB). They raised their flat rate 20%, citing higher electricity costs.

    The other 2 DCs provide metered electricity to the cage: some amount is included in the cage rental, and overages are billed incrementally. These 2 data centers have both increased their incremental charges by 100% in the last year and raised their cage rental rates by 10-15%, citing increased electricity costs. Now you can say "they're just increasing their margins," but I live within 25 miles of 2 of the facilities, and I know my electric costs at home have more than doubled in the last year (up almost 250% in the last 5), so no, they aren't just marking things up unnecessarily; it's all the same electric company.

    All in all, this means an additional $500-600/mo in cost for our hosting: from $2000/mo to $2500-2600/mo, depending on electricity and bandwidth usage (and a hint: we've only gone over on our bandwidth once, for a total charge of $12). I can only imagine what it will look like once we've grown out; we plan in 3-5 years to have multiple racks in these three DCs and have budgeted ~$75k/mo for hosting costs (based on prices from a year ago). A 20-30% increase on that turns into real money, a $15-25k/mo increase (rough arithmetic below). Being able to save that money would mean being able to hire 3-5 full-time engineers at $60k/yr each. I'd much rather have the engineers than give that money to the electric company.
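    For a sense of scale, here is the same arithmetic the comment above is doing, as a minimal Python sketch (the $2,000/mo current spend and ~$75k/mo projection are figures from the comment; the 20-30% range is the poster's):

        # Added monthly cost from a 20-30% rate hike, at today's spend and at the
        # projected scale. Figures are taken from the parent comment.
        def increase_range(monthly_cost, low=0.20, high=0.30):
            """Return the added monthly cost for a 20% and a 30% increase."""
            return monthly_cost * low, monthly_cost * high

        today = increase_range(2_000)       # current hosting spend
        at_scale = increase_range(75_000)   # projected spend in 3-5 years

        print(f"Today:    +${today[0]:,.0f} to +${today[1]:,.0f} per month")
        print(f"At scale: +${at_scale[0]:,.0f} to +${at_scale[1]:,.0f} per month")
        # At scale that is roughly $180k-270k per year -- about the cost of the
        # 3-5 engineers at $60k/yr that the poster would rather hire.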
  • by evanbd ( 210358 ) on Saturday April 12, 2008 @11:11PM (#23051328)

    Also... while it's a technicality, air *is* conductive. It just has a very high impedance. It *will* conduct electricity, and I'm pretty near certain you've seen it happen: it's called lightning.

    If you want to get all technical about it, you're basically wrong. The resistivity of air is exceedingly high. However, like all insulators, it has a breakdown strength, and at electric field strengths beyond that, the conduction mode changes. It's not simply a very high value resistor -- nonconducting air and conducting air are two very different states, which is the reason lightning happens. The air doesn't conduct, allowing the charge to build higher and higher, until the field is strong enough that breakdown begins.

    For materials with resistivity as high as air in its normal state, it's not reasonable to call them conducting except under the most extreme conditions. Typical resistance values for air paths found in computers would be on the order of petaohms. While there is some sense in which a petaohm resistor conducts, the cases where that is relevant are so vanishingly rare that it is far more productive to the discussion to simply say it doesn't conduct.

    This is one of those cases. Claiming that air is conductive is detrimental to the discussion at best.
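    To put a number on "exceedingly high": a minimal Ohm's-law sketch using the petaohm order of magnitude from the comment above. The 12 V rail is just an assumed representative voltage, not anything from the article.

        # Leakage through an air gap inside a PC, treating the air path as a
        # ~1 petaohm resistor (the order of magnitude given above).
        AIR_PATH_RESISTANCE = 1e15   # ohms, ~1 petaohm (from the comment)
        RAIL_VOLTAGE = 12.0          # volts, assumed representative supply rail

        leakage_current = RAIL_VOLTAGE / AIR_PATH_RESISTANCE    # I = V / R
        leakage_power = RAIL_VOLTAGE * leakage_current           # P = V * I

        print(f"Leakage current: {leakage_current:.1e} A")
        print(f"Dissipated power: {leakage_power:.1e} W")
        # ~1.2e-14 A and ~1.4e-13 W: negligible next to the tens of amps a CPU
        # draws, which is the practical sense in which air "doesn't conduct"
        # below its breakdown field strength.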

  • by Paul server guy ( 1128251 ) on Sunday April 13, 2008 @02:05PM (#23055280) Homepage
    Y'all are basically idiots.

    I just came from NASA Ames Research Center (talk about heavy supercomputing!), and they are heavily water-cooled. Right now they have coolers on each of the processor blocks and radiators on the backs of the cabinets, but they are quickly moving to directly chilling the water.
    They use quality hoses and fittings, no leakage.
    The efficiency is so much higher than air, and it makes the operating environment much nicer. (They have people in there regularly swapping out drives, tapes, whatever.)

    Of COURSE water cooling is what you want to use for any high-performance computing. It's purely a matter of efficiency. (And you can use the hot water elsewhere.)
  • by MightyYar ( 622222 ) on Sunday April 13, 2008 @02:30PM (#23055406)
    If I were to define "mainstream" in terms of quality, then Windows would be a niche product :)
