Hardware

Sandia's Smart Heat Pipe 189

An anonymous reader writes "Science Blog is reporting a story from Sandia National Laboratory, best known for its nuclear weapons research. "Evacuating heat is one of the great problems facing engineers as they design faster laptops by downsizing circuit sizes and stacking chips one above the other. The heat from more circuits and chips increases the likelihood of circuit failures as well as overly heated laps. "Space, military, and consumer applications are all bumping up against a thermal barrier," says Sandia researcher Mike Rightley, whose newly patented "smart" heat pipe seems to solve the problem. The simple, self-powered mechanism transfers heat to the side edge of the computer, where air fins or a tiny fan can dissipate the unwanted energy into air."
  • by pr0c ( 604875 ) on Friday December 06, 2002 @09:53AM (#4826174)
    No matter what I do my laptop is one hot sucker! Especially when I have it docked; whoever designed my docking station (all from Dell) decided to block the fans on the back of the laptop when I dock it.

    Sometimes the better answer is simply a better-thought-out design. All this newer technology is good too, of course, but people need to stop substituting higher technology for stupidity.
    • by Lumpy ( 12016 ) on Friday December 06, 2002 @11:39AM (#4826805) Homepage
      Like why doesn't the "hot" air get blown through ducts that channel it up across the screen, or better yet through a duct that lines up when the laptop is open, so the tiny fan blows the air across the back of the LCD and the screen warms up to operating temperature faster? This would be a boon to us techs who sit in a foxhole with our laptop when it's 35 degrees, trying to figure out why a fiber node repeater isn't doing its job right. (Yes, I get to troubleshoot my own fiber stuff... it helps being the only IT guy in the company with confined space training and certification, but then they let me play with the fusion splicer too.)

      Or how about (GASP!) someone making an SMP laptop? A pair of mobile P-III processors with a modern OS will do the job quite nicely, and run cooler than the P4 2.2GHz oven or the equivalent AMD blast furnace. (Note to the SMP naysayers: W2K will take advantage of SMP even though the apps you run will not, the same way Linux has for years.)

      And this heat thing is only going to get worse... as we stop using batteries and start using fuel-burning power supplies in laptops.
  • Too late... (Score:4, Funny)

    by Quaryon ( 93318 ) on Friday December 06, 2002 @09:54AM (#4826176)
    ... for this guy [theregister.co.uk].

    Q.
    • by Big Mark ( 575945 ) on Friday December 06, 2002 @09:56AM (#4826194)
      Wonder what kind of science he was doing...

      Perhaps all the talk about "fluid interchange" was a bit too much for him to handle in a mature manner...

      -Mark
    • A lot of things were too late for that guy. Like common sense.

      Next you know he'll sue frying pan makers because he likes eating bacon right off the pan which he rests on his lap...

      If it's hot, don't touch it... simple rule.

      Tom
      • Re:Too late... (Score:5, Interesting)

        by JanneM ( 7445 ) on Friday December 06, 2002 @10:08AM (#4826248) Homepage
        The heating was gradual. It's a pretty well-known claim that if you put a frog in cool water and then gradually heat it, it will never jump out but will be boiled alive. To a lesser extent our own sensory systems work the same way; they react to differentials rather than absolute values.

        In this case, the machine probably got warm, but not so quickly nor so much that it ever became really uncomfortable (and if your attention is fixed on your work, the threshold is even higher). Also, to some extent you can exchange temperature for time in getting an equivalent burn; i.e. while something needs to be scalding hot to burn you with just a touch, it can be considerably cooler if it's in contact for a long period.

        • by AGMW ( 594303 )
          Indeed. A friend managed to give himself a third-degree burn on his foot from a hot water bottle that was OK to pick up. He went to sleep with his foot resting on the sucker and woke up with a cooked foot!

          Unfortunately, he's a vegetarian.

          • You can get burned without knowing it, easily.

            Here's a science project: take a cup of hot water, like really hot tap water. Then drop in some raw egg white. What you'll see, over a short period (but far from instant), is that the egg white becomes solid. That's because the proteins in the egg white are denaturing.

            Proteins denature at temperatures not incredibly far above body temperature, and the rate increases with rising temperature. Your cells are full of proteins, and prolonged exposure to temperatures that simply feel really warm can damage them. It does have to be for a pretty long time.

            Don't worry about hot tubs and saunas, unless you plan to live in one. If you can stand the heat over your entire body, it's not hot enough to denature your proteins yet. I believe 160F is about where there is a danger, but IANAD.
    • by badansible ( 630677 ) on Friday December 06, 2002 @09:58AM (#4826202)
      That's a 'smart heat pipe'? Now I understand.
    • OK, he sat with his PC on his lap long enough for it to get hot, and "suddenly" he burns himself, "there"; so why would that be?

      all together now

      "We know what you were doing
      We know what you were doing"

    • For those having trouble with the reference:

      phimosis: unretractable foreskin
      balanitis: inflammation of the helmet
    • If he hurries, he can have methanol spill into his unhealed lap. It's a disinfectant, you know. Call it Norton Laptop. No smoking, please; wicked methanol is highly flammable.
  • there's an idea... (Score:5, Insightful)

    by TechnoVooDooDaddy ( 470187 ) on Friday December 06, 2002 @09:54AM (#4826177) Homepage
    In colder climates, the heat could be dumped into hand warmers rather than undesirably into fabric and the flesh beneath.

    Colder climates being the 66F computer room? I know 66F isn't that cold, but when you're drinking a Code Red, your hands get quite numb in there. It'd be nice to be able to flip a switch and redirect that heat up into the keyboard instead of the edge...
    • by WPIDalamar ( 122110 ) on Friday December 06, 2002 @10:06AM (#4826241) Homepage
      Caffeine is a bad idea for cold rooms. It makes your blood vessels shrink a bit, bringing less warmth to your extremities. I never understood why computer geeks working in cold labs suck down the caffeine.
      • by jcoy42 ( 412359 ) on Friday December 06, 2002 @10:24AM (#4826327) Homepage Journal
        I never understood why computer geeks working in cold labs suck down the caffeine.

        Because when we find ourselves in the data center at the system console, it's usually because something Very Bad has happened. Our brain decides (at a subliminal level) to take drastic measures to avoid having to deal with such tasks in the future.
      • WPIDalamar [slashdot.org] writes:
        Caffeine is a bad idea for cold rooms. It makes your blood vessels shrink a bit, bringing less warmth to your extremities. I never understood why computer geeks working in cold labs suck down the caffeine.
        Ahh, but you see, by constricting your blood vessels and restricting the flow of heat to your extremities, you're keeping it in your core. This will help you survive longer if you fall asleep there and there aren't any Saint Bernards [sbernard.net] handy to pull you to safety.
      • "bringing less warmth to your extremities"... which also means less heat loss through your extremities; your hands and feet will get colder, but you get less degradation of your core temperature. Put on gloves and a hat, and drink hot coffee instead of soda, and you'll be fine.
      • When you drink enough caffeine, you begin to get jittery... the heat your body produces through the muscle spasms is similar to that produced by shivering... hence the caffeine helps keep you warm.
  • Uncool news (Score:2, Troll)

    by mnmn ( 145599 )

    Although it's nice to have better cooling for computers, this news is just redundant.
  • by jkcity ( 577735 ) on Friday December 06, 2002 @09:56AM (#4826190) Homepage
    When they find a way to channel the heat into keeping my cup of coffee warm while I'm reading, then I'll be interested.
  • by Andy Dodd ( 701 ) <atd7NO@SPAMcornell.edu> on Friday December 06, 2002 @09:57AM (#4826195) Homepage
    I see nothing in this article that distinguishes this "smart" heat pipe from standard heat pipes that have existed for quite some time.

    Yes, this technology is significantly better than air being blown over a heatsink on a CPU.

    No, it's nothing new. Shuttle small-form-factor PCs anyone? And Dell Inspiron 8x00 series laptops too. Probably other laptop manufacturers are also already using heat pipes.
    • I worked on a Compaq laptop 6 years ago that had a heat pipe. It was solid copper, not a fluid system, but the principle was the same. It's not exactly revolutionary...
      • If it was a heat pipe, it was probably not solid copper, though it looked like it. It would be a copper tube filled with a volatile liquid. Liquid evaporates at the hot end and diffuses to the cool end, where it condenses, transferring heat as it does so. But most of them looked solid.

        This invention just looks (from the uninformative article) as if they have some improvements in the mechanical structure and in helping the methanol get the right idea about where to flow (capillaries with "one way" structures, I would guess).

        As said elsewhere, only incremental. But then, the latest Pentium is "only incremental" on the original 386 - but those increments have taken us a long way.
    • The micro-grooves laid in the pipe by photolithographic techniques, so the medium can wick properly along designed paths, are probably what is patented here.
    • by Jeremiah Blatz ( 173527 ) on Friday December 06, 2002 @10:58AM (#4826526) Homepage
      I see nothing in this article that distinguishes this "smart" heat pipe from standard heat pipes that have existed for quite some time.
      It's an incremental improvement on a standard heat pipe. The most advanced laptop heat pipes today are phase change: a volatile liquid is heated to a gas and flows out to the cooling fins. These tend to rely on natural convection to work.

      This device (as it says at the end of the article) uses capillary action to move the cooling liquid from the hot side to the cool side. It doesn't say if this is more efficient than phase change. I expect that it would work better in non-stationary applications, where a phase change material would just get mixed up. They list military wearables as a potential application.

      • Capillary action (Score:4, Informative)

        by Andy Dodd ( 701 ) <atd7NO@SPAMcornell.edu> on Friday December 06, 2002 @11:29AM (#4826728) Homepage
        Existing heat pipes already use capillary action. I remember a while ago looking at info on heat pipes out of curiosity, and I saw a number of descriptions of various wicks that were in use, and this doesn't appear to be anything new, except that maybe they've made slightly more efficient wicks.

        Even these new heat pipes almost surely use a phase change - it's probably possible to do it without one, but far less effective/efficient. Current heat pipes use a phase change combined with capillary action: liquid vaporizes at the heat source, condenses at the radiator, and is wicked back. Heat pipes can be made without wicks, but they are orientation-sensitive, i.e. the condenser must be above the evaporator so gravity will bring the condensed medium back to the heat source. The Shuttle may not use a wick, since its condenser is higher than the CPU, but in Dell laptops they are level, so I'm positive that laptop heat pipes already use wicks.
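
        To put a rough number on why the phase change carries the load: a minimal sketch (my numbers, not the article's), assuming methanol's latent heat of vaporization is roughly 1.1 MJ/kg near its boiling point.

            # Back-of-the-envelope: heat moved by evaporating and re-condensing methanol.
            # Assumed value: latent heat of vaporization ~1.1 MJ/kg (approximate).
            H_VAP = 1.1e6          # J/kg

            def heat_transport_watts(mass_flow_g_per_s):
                # Heat carried by the vapor stream, in watts, for a given circulation rate.
                return (mass_flow_g_per_s / 1000.0) * H_VAP

            # Even a tiny wick-driven flow moves a laptop-class heat load:
            for flow in (0.01, 0.02, 0.05):   # grams of methanol per second
                print(f"{flow} g/s  ->  {heat_transport_watts(flow):.0f} W")

        A couple of hundredths of a gram per second of circulating methanol is already tens of watts of transported heat, which is why no mechanical pump is needed.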
    • by evocate ( 209951 ) on Friday December 06, 2002 @11:04AM (#4826560)
      I have a Shuttle SS51G w/ P4-2.533 + 1GB DDR and I'm very happy with it. The heat pipe keeps the inside surprisingly cool, and it is exceptionally quiet. Some have replaced the fan and fan grill or modified the case itself to lower the noise even further.
    • by photon317 ( 208409 ) on Friday December 06, 2002 @11:33AM (#4826748)

      Read the whole article, it is different. The difference is that:

      1) They're using methanol, which at least some of the current commercial heatpipes don't.

      2) They're using some sort of lithography to carve micron-scale curved pathways into the inside of the tubing. These are customized in order to wick the methanol to the correct locations. This allows them to really "shape" the methanol flow for much better efficiency (send 30% methanol to hot spot A and 70% to hot spot B, and release the heat at sink spot C), instead of just having the vapors/liquids roam around as they choose. This is a boon for any heatpipe, but especially if you have an embedded device that might need complex heatpipe routing to/from possibly multiple heat sources and heat sinks.
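
      To see why micron-scale grooves are worth patenting, here's a minimal sketch (my numbers, not the article's) of the capillary pumping pressure from the Young-Laplace relation, assuming a surface tension of about 0.022 N/m for methanol and perfect wetting:

          import math

          SIGMA = 0.022          # N/m, approximate surface tension of methanol
          THETA = 0.0            # contact angle in radians (assume perfect wetting)

          def capillary_pressure_pa(radius_m):
              # Young-Laplace: dP = 2 * sigma * cos(theta) / r
              return 2.0 * SIGMA * math.cos(THETA) / radius_m

          for r in (100e-6, 10e-6, 1e-6):   # effective groove radius in metres
              print(f"r = {r*1e6:5.1f} um  ->  {capillary_pressure_pa(r):8.0f} Pa")

      Shrinking the groove radius by a factor of ten buys a factor of ten in pumping pressure, which is what lets a designer route liquid to specific hot spots instead of letting it wander.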
      • It is also different in that they are using phase-change heat transfer; when most heat pipes boil off their water, they become completely ineffective.

        Also, traditional heat pipes rely on elevation differences to maintain flow.
        • As I mentioned in another post, phase-change heat transfer in heat pipes is old hat. So is using a wick to allow for the heat pipe to work without an elevation difference. For an example of the latter, see the aforementioned Dell Inspiron 8200. Has no problem working with the laptop level, or even with the laptop tilted backwards (i.e. evaporator above condenser)
  • Cool! (heh, heh)
    Actually my main thought is that this makes living comfortably off the grid even more viable.
    All that compressor-based stuff? Fridges with motors and coils and water traps? Naw, they's just for thems as don't know any better.

    I *love* living in the future!
    Rustin

    • It's more like living in the past. Early refrigerators didn't use electrical compressors and such. Your grandmother's refrigerator used a pilot flame to do its cooling. Sure, it wasn't able to cool and freeze quite as well as modern refrigerators do, but it still kept food cold and made ice.

      How cool is that, to use a flame for refrigeration? It's so cool that it is still used today in things like Recreational Vehicle refrigerators. See here [howstuffworks.com].
  • SUE (Score:1, Redundant)

    by katalyst ( 618126 )
    I distinctly remember a lawsuit which involved a customer suing McDonalds because they hadn't warned her that the coffee was HOT. I guess laptops will have to come with that warning too. Meanwhile, The Register (register.co.uk) had an article about a guy burning his weenie thanks to a laptop.
    The bonus - here is a link for a genius who wanted to water cool his CPU: http://www.avforums.com/frame.html?http://www.avforums.com/forums/showthread.php?s=24bcc587f9de1027657e1d7862a85f58&threadid=56924
    • Re:SUE (Score:3, Interesting)

      by iomud ( 241310 )
      Everyone brings up that McDonalds case; these are a few words from a friend of a friend who was actually involved in the case. While this was not 100% McDonalds' fault, they do share a burden of the blame for what happened.
      For example, the structural integrity of a McDonalds cup is substantially decreased when it is filled with hot coffee. It is meant to be used with the top on it, to reduce spilling and keep the cup in the proper form. Like many other people, the woman took the top off to let it cool down. By doing this she made the walls of the coffee cup much weaker. She didn't actually spill the coffee; the cup collapsed when she took it out of the cup holder. Before you laugh, consider that McDonalds deliberately created a low-cost coffee cup with weak walls. It was argued that if the integrity of the cup was compromised by taking the top off, then the top should be fixed in place. Or, the walls of the cup should be thicker.
      This would be like the heat sink on a laptop melting through the case due to poor and/or frugal engineering: a deliberate intention to create a lower-cost product without regard for the safety of the person using it.
    • The bonus - here is a link for a genius who wanted to water cool his cpu [avforums.com]

      Jesus! After reading on, it becomes apparent that it's not a wind-up; he genuinely made his PC watertight and filled it with water! Talk about running gung-ho into something without doing any research first!

      This guy is the stuff of legends. If he had touched the case and died from the electrical shock, we'd be reading about him in the Darwin Awards!

  • As an average Joe... (Score:3, Interesting)

    by suman28 ( 558822 ) <suman28NO@SPAMhotmail.com> on Friday December 06, 2002 @10:03AM (#4826226)
    we may not be interested in this type of news, but I see this as a great stepping stone toward more advanced and powerful machinery. I've always heard about computers not going past a certain speed in MHz because of various factors, one of them being the amount of heat they generate. So hats off to all the people who work hard to make life better for others.
  • Space? (Score:2, Interesting)

    by Anonymous Coward
    How can electronics overheat in space?
    • Re:Space? (Score:2, Informative)

      by Anonymous Coward
      There's nothing there to absorb the heat...
    • Re:Space? (Score:2, Informative)

      by Anonymous Coward
      See also: Thermos. A layer of vacuum in a silvered container.
    • Re:Space? (Score:3, Informative)

      by Muad'Dave ( 255648 )
      As has been mentioned in other replies, you have several choices for heat transfer. Conduction, convection, and radiation.

      Conduction is heat transfer through direct contact. You touch the stove, it burns your skin.

      Convection is the transfer of heat via a moving medium. Air at the earth's surface is warmed by the sun's radiation, causing the air to rise. The heat is then transferred to the surrounding cold air, which causes the previously warm air to sink back down.

      Radiation is the transfer of heat via electromagnetic radiation. All objects above absolute zero emit some EM radiation in proportion to the fourth power of their absolute temperature. Also involved is a coefficient that depends on how close a radiator is to an ideal 'black body' - i.e. a perfect radiator. See the Stefan-Boltzmann equation Inet = e*s*A*(T^4 - T0^4), where Inet is the net power radiated in watts, e is the emissivity coefficient, s is Stefan's constant = 5.6703 x 10^-8 W/m^2 K^4, A is the area, T is the absolute temperature, and T0 is the ambient temperature. (To get the total radiation emitted, set T0 = 0.) The peak wavelength of the radiation is given by Wien's displacement law, lambda = 2.898 mm * K / T, where 2.898 mm * K is a universal constant and T is the absolute temperature of the object.

      For example, a person has about 1.4m^2 of skin at 33C = 306K. If you assume they're a perfect radiator, in a room at 20C the person is emitting 111W of power, net. The emission peak wavelength is approx 9.5 um, which is in the part of the EM spectrum called "infrared".
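
      For anyone who wants to check the arithmetic, here's a minimal sketch reproducing the numbers above (treating skin as a perfect black body, as the parent assumes):

          SIGMA = 5.6703e-8      # Stefan's constant, W / (m^2 K^4)

          def net_radiated_watts(area_m2, t_k, t_ambient_k, emissivity=1.0):
              # Stefan-Boltzmann: Inet = e * s * A * (T^4 - T0^4)
              return emissivity * SIGMA * area_m2 * (t_k**4 - t_ambient_k**4)

          power = net_radiated_watts(1.4, 306.0, 293.0)   # 33 C skin in a ~20 C room
          peak_um = 2898.0 / 306.0                        # Wien's law, b = 2.898 mm*K
          print(f"net power ~ {power:.0f} W, emission peak ~ {peak_um:.1f} um")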

    • Re:Space? (Score:3, Informative)

      The single biggest constraint on the Apollo 13 lunar module was the amount of cooling water on board to keep the systems at their operational temperatures. In a vacuum, you don't get any convective cooling, and radiation is extremely inefficient. At one point, they were looking at re-using the astronauts' urine in the cooling systems, but it turned out that it was unnecessary.

      Zero-g is also a factor. Lovell actually commented in his debriefing that you could get warmer if you didn't move. A small blanket of warm air would form around you, and since there was not much to move it around (all the fans being shut off) it would just stay there. Then you'd move and you'd be freezing again.

  • Forget... (Score:2, Interesting)

    by athlon02 ( 201713 )

    cooling engineers. We need to continue working toward things like a 0.01-micron process (and smaller), fiber-optic interconnects, and using technologies like those from Alchemy, Inc., as I'm sure AMD is doing.

    What I'm really hoping for one day is a chip made entirely of fiber optics. Sure, it's a ways off, but it certainly should help with speed and heat issues.

  • by cybermace5 ( 446439 ) <g.ryan@macetech.com> on Friday December 06, 2002 @10:07AM (#4826245) Homepage Journal
    I remember sitting in on a presentation of heat pipe theory and applications.

    The article talks about how the methanol vaporizes at one end, and condenses at the other. Then the liquid wicks back to the first end, where it can be vaporized again. You don't necessarily have to use methanol; the coolant is varied according to the temperature range you operate in.

    The pipe pressure is carefully set so that the vaporization takes place at the optimal temperature. Usually these pipes are used in a vertical configuration, so that the vapor rises to the other end more quickly and the condensate sinks back down quickly. The heat pipe then behaves kind of like a passive heat diode (see the sketch after this comment for how that pressure might be chosen).

    A use for heat pipes was presented; apparently a lot of structures were sinking on the Alaska pipeline. When the ground was frozen, everything was fine...but the permafrost was receding in the warm months. The solution was to keep the ground frozen all the time, by removing heat from about 20 feet down. Heat pipes were constructed with a vaporization point at the desired temperature, and sunk into the ground at the problem areas. The ground stayed frozen, and the problem was solved.
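
    To make the "pressure is carefully set" point concrete, here's a minimal sketch (my numbers, not the presentation's) using the Clausius-Clapeyron relation with typical values for methanol (normal boiling point about 64.7 C, enthalpy of vaporization about 35.3 kJ/mol) to estimate the internal pressure that puts the boiling point at a chosen evaporator temperature:

        import math

        R = 8.314              # J / (mol K)
        DH_VAP = 35.3e3        # J/mol, approximate for methanol
        T_BOIL = 337.8         # K, methanol boiling point at 1 atm
        P_ATM = 101.325e3      # Pa

        def boiling_pressure_pa(target_temp_k):
            # Clausius-Clapeyron: P = P_atm * exp(-dHvap/R * (1/T - 1/T_boil))
            return P_ATM * math.exp(-DH_VAP / R * (1.0 / target_temp_k - 1.0 / T_BOIL))

        for t_c in (40, 50, 60):      # candidate evaporator temperatures, C
            print(f"{t_c} C  ->  {boiling_pressure_pa(t_c + 273.15) / 1000:.0f} kPa")

    In other words, sealing the pipe at a partial vacuum (very roughly 0.4 to 0.85 atm in this sketch) moves methanol's boiling point down into the range a warm chip actually runs at.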
    • Smart Heat Pipes (Score:2, Interesting)

      by Lt Razak ( 631189 )
      There is an HPT system to accommodate almost any central air conditioner, whether it be horizontal or vertical in design.

      Because HPT equipment treats the entire home with dry-cooled air, there is no need for additional dehumidifiers or special equipment. Not only is dry-cooling better for you, it costs less to operate, usually recovering a payback on installation within 2 to 4 years as you set the thermostat 2 to 3 F higher.

      The heat pipe dehumidification process is automatically activated any time the air conditioner is operating. In the winter, the smart heat pipes automatically deactivate, allowing your central heating system to operate as normal.

  • The simple, self-powered mechanism transfers heat to the side edge of the computer, where air fins or a tiny fan can dissipate the unwanted energy into air.

    Exacerbating the heat-death of the universe. Whee!

    psxndc

  • Other uses for heat (Score:5, Interesting)

    by UCRowerG ( 523510 ) <UCRowerG@y a h o o . c om> on Friday December 06, 2002 @10:12AM (#4826265) Homepage Journal
    "The simple, self-powered mechanism transfers heat to the side edge of the computer, where air fins or a tiny fan can dissipate the unwanted energy into air"

    I wonder what else designers could do with that extra heat energy. If these heat pipes turn methanol into vapor, carry it to heat fans, then recondense it (due to heat loss) back into liquid.... isn't this process quite similar to how turbines work with steam? I wonder how much power could be gleaned from the extra heat. Maybe someone could design a tiny electrical generator. I doubt you could run anything significant off the power output, but I'm sure there could be some use for it, rather than simply letting that extra energy go to waste.

    • by Jeremiah Blatz ( 173527 ) on Friday December 06, 2002 @11:09AM (#4826595) Homepage
      UCRowerG writes:

      I wonder what else designers could do with that extra heat energy. If these heat pipes turn methanol into vapor, carry it to heat fans, then recondense it (due to heat loss) back into liquid.... isn't this process quite similar to how turbines work with steam? I wonder how much power could be gleaned from the extra heat. Maybe someone could design a tiny electrical generator. I doubt you could run anything significant off the power output, but I'm sure there could be some use for it, rather than simply letting that extra energy go to waste.
      The problem with solutions like this is that the power generation step interferes with the cooling step. In other words, the inefficiency in the power generation reduces the efficiency of the cooling. However, the whole point of this is cooling, which means that you have to put in bigger, heavier cooling mechanisms to cope with the reduced efficiency.

      It might be worth it if you could come up with a super-efficient generator, but that's pretty unlikely. Furthermore, the temperature gradients here are pretty low (boiling point of methanol vs. room temp), so there's not a whole lot of ooomph to drive your generator. Heat pipe designers are pretty happy when they can use this thermal gradient just to power their heat pipe convection; actual generation seems a long way off.
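
      To quantify "not a whole lot of ooomph": a minimal sketch of the Carnot limit for a generator running between assumed temperatures of about 65 C (near methanol's boiling point) and 25 C room temperature, with an assumed 20 W CPU heat load:

          T_HOT = 65.0 + 273.15     # K, hot side (assumed)
          T_COLD = 25.0 + 273.15    # K, cold side (assumed)
          WASTE_HEAT_W = 20.0       # assumed laptop CPU heat load, W

          carnot_limit = 1.0 - T_COLD / T_HOT
          print(f"Carnot limit: {carnot_limit:.1%}")                       # ~11.8%
          print(f"Best case recovered power: {carnot_limit * WASTE_HEAT_W:.1f} W "
                f"(before any real-world losses)")

      Even the thermodynamic best case is a couple of watts, and a real miniature turbine or thermoelectric stack would deliver only a small fraction of that.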

    • Hmmm

      And then someone suggests that we use the generator to power the laptop so we don't need the battery anyway...

      I mean, there is just enough energy here to make sure the fluid in the heat pipe flows, once you account for all the mechanical losses involved in the miniature turbine and alternator, not to mention the heat generated by the turbine/alternator combination itself.

      Then there's the fact that the output of the alternator will need rectifying and regulating, as its speed varies according to heat load.

      Then you have issues with the noise generated by these mechanical devices.

      And you have to do all this with tiny mechanical devices that will fit in a laptop.

      It seems odd to come up with a system that can transport heat to a remote passive radiator in small-form devices so that, in an ideal world, you don't need a mechanical fan, and then bolt a tiny mechanical generating plant onto it.

      Don't get me wrong, it's an interesting thought experiment, but given the losses in power generation it's not practical.
      • Re:Thermodynamics (Score:3, Insightful)

        by Rich0 ( 548339 )
        Don't get me wrong, it's an interesting thought experiment, but given the losses in power generation it's not practical.

        Agreed.

        I think you would get more bang for the buck by improving the efficiency in the laptop components themselves so that they don't put out so much heat - which is exactly what is done. If you get a top-of-the-line laptop you'll need insulated pants to avoid 2nd degree burns, but if you get a new laptop built for battery life (and not performance) then you'll find it runs much cooler.

        The reason for the heat bleed is that they are always rushing to get the fastest processor out - by the time they can make it cooler nobody wants it.

        If one were to do the math, the wasted heat can't be more than a few watts at most, and there isn't a whole lot you could do with that even if you could efficiently turn it to electricity at a high enough voltage.
  • When using a laptop, especially when running on batteries, no energy is unwanted. If these scientists could design a system where they took the "unwanted" heat energy and somehow transferred it back into the battery, then it wouldn't be unwanted, no? Of course, there would be some loss, but it's still better than getting nothing but a burned lap from the heat generated by your laptop.
    • The same applies to car engines... a large portion of an engine's workings exists to remove the heat it generates. If you don't manage the heat, you get faced with an expensive repair bill.
  • Wrong chips! (Score:4, Insightful)

    by bgat ( 123664 ) on Friday December 06, 2002 @10:15AM (#4826273) Homepage
    Why the hell do we insist on using Intel heat pumps in our laptops anyway?! There are a dozen different non-Intel chips that are nearly as fast as a decent P-III (at least from the user's perspective) and don't need heatsinks at all! MIPS, ARM (ok, even StrongARM and XScale), SH, ...

    Oh, wait, Bill doesn't want to support Windows on those chips. My bad. He'd rather force the rest of the industry and users to deal with crappy, Intel-specific problems like heat and power consumption than construct a product that's actually well-designed and portable. Yea, that's "innovative".

    b.g.
    • Well, you can get a Mac...

      And I believe I've heard mention on this site of some alternative Operating Systems to Windows. I'll try to find the link to that article.
    • Exactly. I always think heatsinks are like drug cops. Change the rules a wee bit and you don't need them! Saving everyone a lot of hassle, money, and worries.

      Legalise drugs and ban Intel for a better world all round!
    • Oh, wait, Bill doesn't want to support Windows on those chips

      If you're going to rant, at least do it correctly. Windows NT used to run on many different architectures, but people only bought the x86 version. Windows CE does run on many architectures.


  • "It's clear now that the smaller we go, the more that cooling engineers need to be involved early in product design."
    How small could these pipes be? Could methanol-filled nanotubes vent heat from processors? Or would liquid nitrogen still be the move?
    I would think liquid nitrogen would be better for troops - I don't know about you, but were I a troop, predator or no predator, I would want the smallest infrared signature possible in combat, and processor temperatures ought to show up nicely while venting. Also, the troops could dip balloons and bananas in the nitrogen and then shatter them to impress villagers.
  • by DarkHelmet ( 120004 ) <mark&seventhcycle,net> on Friday December 06, 2002 @10:22AM (#4826319) Homepage
    Oh cool... Maybe I can now donate the excess heat generated from my Athlon to pour, freezing children in the street.

    Glad it's all done for a good cause. I just hope it's tax deductible

    • Why would you want to pour freezing children into the street? Don't they have enough problems already? ;) Ok. Bad joke.
    • If the children are frozen, you can't pour them.

      Oh, wait, I get it.... You'll *melt* them with the heat,
      then pour them?

      But... you said they were freezing, not melting...

      I'm confused again.

  • by elmegil ( 12001 ) on Friday December 06, 2002 @10:26AM (#4826339) Homepage Journal
    transfers heat to the side edge of the computer, where air fins or a tiny fan can dissipate the unwanted energy into air.

    Or your skin.

  • This isn't a new idea. Technology has come full circle: the first computers were cooled with fluid pumps, and guess what? We were right the first time.

    And the old problems with heat pumps will return: leaks that short out the machine, the added complexity of the design, yet another part to get disconnected, and idiots burning themselves by opening the box and touching the thing after it's been running for days.

    Heat sinks are just that: sinks. They hold the heat, they don't disperse it. Almost any heat dispersal method is preferable to heat sinks, which are preferable to no thermal control whatsoever.

    But they better make those tubes industrial-strength, especially on laptops. Computers are put through a lot rougher treatment than they're ever specced for; the hoses used for this had better be up to the task.

    I can see a very real possibility of a computer springing a leak and shorting itself out, and/or dripping on the user and scalding him/her- and that user very well might have reason to sue.

    It's a good idea. Just as it was the first time. But engineers need to take this type of thing into account in the original spec; it can't be slapped on at the end like just another Microsoft UI.
  • Patents? (Score:2, Interesting)

    by Buckbeak ( 591708 )
    Help me understand,

    This research is funded by the American taxpayer. Why are they patenting it? Doesn't it belong in the public domain?

    • U.S. government employees and contractors can obtain patents for inventions that are the result of government-funded research. You may be thinking of copyrights. The work of a U.S. government employee cannot be copyrighted; it is in the public domain.
  • I wonder if they can apply that acoustothermic cooling technology to CPUs that was posted a couple days ago.

    Has anything been said about energy impact?... moving liquids around requires more work than moving gases I would guess.
  • by oldstrat ( 87076 ) on Friday December 06, 2002 @10:36AM (#4826387) Journal
    /.
    This guy [news.com.au] may have had the external vents in the wrong location.

    On the other hand, the extra heat vented to the outside edges could be a handy deterrent to theft; just change from sleep mode to heat mode.

    And I'm eager to evaluate the new George Foreman laptop.
  • Yes, is new (Score:2, Informative)

    If I understand the article correctly, this is new in that it has a very small size, uses a very small amount of liquid to conduct the heat, and requires no mechanical pump to drive it, no rewiring, etc.

    Because of this, it can easily be fit into an existing design with minimal re-engineering of your product. That's where the cost comes into play for manufacturers -- or has no one noticed that we don't see liquid cooling in consumer computers yet? Too expensive to add into existing designs. Also, you get one leak, there goes your computer. Not to mention the potential hazards of having a liquid flowing over live electrical circuits.

    Small size, small amount of coolant liquid, and no need to add mechanical pumps. Any laptop manufacturer could add this and not have to increase the price to cover the retooling costs for the manufacturing process. This means a faster -- and naturally hotter -- chip could be put into the laptop. That will mean laptops that are as fast as desktops, instead of lagging behind by a few years.

    What's the name of the company that will be making these things? I want to buy stock NOW while I can still afford it!!!!
    • Not all liquids are a problem in an electrical environment. Only those that are conductive are worrisome. That is why people can submerge computers in vats of liquid nitrogen with no adverse problems. (Except for the whole damn cold thing)
    • Any laptop manufacturer could add this and not have to increase the price to cover the retooling costs for the manufacturing process.

      They already do. My Dell 8100 (bought 10 months ago) has one on the CPU moving heat to the back of the case.

      Jon.

  • by gregger ( 156275 ) on Friday December 06, 2002 @10:40AM (#4826407) Journal
    One of the largest applications of "heat pipes" is in the Alyeska Pipeline [alyeska-pipe.com]. The oil they're moving is hotter than the permafrost supporting the pipe. If the permafrost melts... well, we can guess what happens.

    So if you look at the picture on the site, the heat pipe is actually built into the support structure of the pipe joints. The little vanes on the posts dissipate heat that is absorbed from the ground. They use a working fluid with a low boiling point (anhydrous ammonia) in order to capitalize on the energy moved by its latent heat of vaporization and condensation (the condensation being caused by the cold Alaska air circulating around the vanes). You can find the details of this huge heat-pipe installation on their Web site [alyeska-pipe.com].

    Pretty cool (literally)!

    TTFN
  • Pretty soon they'll be reading [H]ardOCP and the Case & Cooling section of Ars Technica, experimenting with peltiers and putting their computers in refrigerators.

    Then the government will truly be l33t. :D
  • by jridley ( 9305 ) on Friday December 06, 2002 @11:07AM (#4826580)
    I read the article, and it doesn't say how this is different from existing heat pipes. My Dell Inspiron 8200 uses a heat pipe to move heat from the CPU to a radiator in the back. The Shuttle lunchbox machines use heat pipes to get heat to a large heatsink in the back. You've been able to buy heat pipes to speed up cooking the Thanksgiving turkey for years.

    What's the difference between them and this? They talk about technology but to those of us who don't know the specifics of *traditional* heat pipe manufacture, it means nothing.
  • I thought this problem had already been adequately solved by that scientist who used his penis to sink heat away from his laptop. So maybe this new heatpipe won't get blisters?

    http://www.manningworldnews.com/archives/00000264.php
  • I work on a laptop most of the time, on battery power a lot. Every time my fan kicks in when I'm on battery, I think to myself how absurd this is... I am using up a significant percentage of my stored electrons to generate heat I don't want... I then use up a significant percentage of my stored electrons running fans to make that heat go away.

    Seems to me that even a small improvement in thermal efficiency of the processor would reduce TWO reasons to consume my precious battery power. Anything short of this seems like a hack - a stopgap solution until we get better thermal efficiency at the source of the problem.
    • Amen brother,
      Any, any reduction in actual generated heat reduces the amount of additional energy needed to move that waste heat. The benefits are two-fold: higher initial electrical efficiency, coupled with lower power requirements for running mechanical fans. However, some heat pipe designs (depending on their thermal characteristics) move heat well enough to be able to remove the mechanical fan as well. So, I don't think it's a stop-gap, per se. Just a good solution to the wrong problem.

      But every time I make a suggestion that we work smarter, instead of brute-forcing everything, I get modded down. I guess that means I should just post more crap, instead of better... you be the judge.

  • Other posters have stated the obvious: heat pipes are nothing new; they have been around in industrial capacities at least since the '60s. The papers I've read indicate that the original development was done for satellites, to move heat from electronics modules to the skin, where fins were used to radiate the heat into space. Heat pipes are quite robust, in general.
    The article gave no detail about why these new devices are 'smart', so I suspect it's used as a buzzword to grab attention. While the heat pipes aren't particularly smart, applying them to CPU cooling is a good idea. I wish I had thought of it.
    However, even more interesting is the size. If I were to design a cooling system using these, I'd use a flexible ribbon to move the heat up to the back plane of the screen. This has the ideal characteristic of having a large radiating area that's rarely covered up. Back-of-the-envelope calculations show that you can cool a typical CPU by 40 degrees (130F to 90F) with only a 4.5F increase in the back plane temperature (see the sketch after this comment). This idea is even more attractive for metal-cased laptops.
    However, I suspect that their use will be more general, extending to desktops: imagine completely passive CPU cooling - no fan, no pump, just a heat pipe to the case.
    I'll be interested to see if this idea makes it into general use, or whether our PC manufacturers are too hidebound to change.
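
    For what it's worth, here is a minimal sketch of that sort of back-of-the-envelope estimate; the panel area, convection coefficient, and emissivity are all assumptions on my part, and the result is quite sensitive to them:

        SIGMA = 5.67e-8        # Stefan-Boltzmann constant, W / (m^2 K^4)

        def panel_dissipation_w(area_m2, rise_k, t_ambient_k=293.0,
                                h_conv=10.0, emissivity=0.9):
            # Natural convection (assumed h ~ 10 W/m^2K) plus radiation from one face.
            t_panel = t_ambient_k + rise_k
            convection = h_conv * area_m2 * rise_k
            radiation = emissivity * SIGMA * area_m2 * (t_panel**4 - t_ambient_k**4)
            return convection + radiation

        # Roughly a 30 cm x 22 cm screen back plane, 2.5 K (about 4.5 F) above ambient:
        print(f"{panel_dissipation_w(0.30 * 0.22, 2.5):.1f} W")

    With those assumptions the panel sheds only a few watts at a 4.5 F rise, so dumping a full CPU load would need a larger rise or a bigger area; the qualitative point about the screen being a large, rarely covered radiator still stands.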

  • I'm not an engineer, so I can't answer this question. I was wondering why exactly no one has adapted thermocouples to this heat problem. It seems like a dandy idea to get some electricity back into the batteries. I've seen an old kerosene lamp from Russia with a surrounding thermocouple that was adequate to run a normal radio. It looked like a normal kero lamp with fins around it, sort of like an air-cooled cylinder on a small engine, kinda sorta. My boss at a dairy I worked at brought it back from a trip he made in the merchant marine to Russia during WW2, and it worked great! It just took waste heat, made electricity, poof, done. Why can't something like this be done with hot chips? It seems like a decent way to help extend battery life and remove heat; the old two-birds-with-one-stone concept.
    • Re:thermocouples (Score:2, Informative)

      by Big_Breaker ( 190457 )
      Thermocouples have very low thermal efficiencies, due to heat conduction across the junction and heat generated by electrical resistance (see the sketch after this comment for just how low).

      When you have a 200-watt power supply driving the thing, that isn't a problem. For a laptop it would exhaust the batteries pretty quickly.
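
      To put a number on "very low": a minimal sketch of the standard figure-of-merit estimate for a thermoelectric generator's maximum efficiency, assuming ZT of about 1 (a good room-temperature material), a hot side near 65 C, and a cold side near 25 C:

          import math

          T_HOT, T_COLD = 338.15, 298.15    # K (assumed hot/cold sides)
          ZT = 1.0                          # assumed figure of merit

          def teg_max_efficiency(t_hot, t_cold, zt):
              # eta_max = Carnot * (sqrt(1+ZT) - 1) / (sqrt(1+ZT) + Tc/Th)
              carnot = (t_hot - t_cold) / t_hot
              root = math.sqrt(1.0 + zt)
              return carnot * (root - 1.0) / (root + t_cold / t_hot)

          print(f"max efficiency ~ {teg_max_efficiency(T_HOT, T_COLD, ZT):.1%}")   # ~2%

      A couple of percent of a 20-30 W heat load is well under a watt, so recharging the battery this way would be mostly symbolic.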
  • Reminds me of the Hoover Dam, where they had a heat problem when the concrete cured. The solution was to run small pipes through the wet concrete and let naturally cool water flow through them, which cooled the concrete as it set. Later, the pipes themselves were filled in with concrete.
  • Bumping the processor speed yet again isn't going to do squat when my win2k laptop swaps.

    Give me a laptop HD as fast as a low end desktop drive and then we can talk about better cooling....
  • Remember the guy who had his dick burnt by using his laptop on his lap?

    The thing which bothered me about that was that he felt nothing whilst using the laptop; the pain and blisters appeared some time after using the laptop.

    It occurs to me that this makes radiated or conducted heat from the laptop an unlikely culprit; I'd expect pain at the time of heat transference, not a delayed effect.

    Radiation on the other hand could produce such a delayed burn effect, right? Or not?

    At the CPU/bus speeds these days, 2GHz processors? They must be emitting some pretty serious radio signals, and that close up the inverse square law won't have blunted its teeth, so to speak.

    Maybe, just maybe, modern high speed procs need radiation shielding for close-quarters use?

    Heck, maybe Dubbya could have Saddam for possessing radiological weapons just for possessing a multi GHz proc or two... ;)
    oh that last quip was a *joke*
  • what a waste of energy. why generate heat, wasting your battery, only to throw it away. make the processors run cold, by LOWERING clock speeds and whatnot. nobody needs 3000000 gigahertz to run an editor or to email or to whatever. if you need the processing horsepower, put an optional processor that's usually sleeping but that comes on when heavy computations are done. it will heat up but that will get dissipated. oooooooooooh well.
