Hardware

Microcoolers Could Change Processor Design

Skaven writes: "Nature.com is reporting about these nifty new microcoolers, tiny thermoelectric heat sinks that can be built directly onto CPUs. Using the new technology, scientists cooled a processor at 100 degrees C by 7 degrees. That's still a fried t-bird, but what this means is that if the technology gets good enough, cooling chips could soon be getting a lot easier. If anything, small 'hot spots' on the CPU could be avoided by strategic placement of microcoolers, thus helping all of us overclockers out. Heck, maybe even increasing the voltage to your CPU would make it run cooler...how weird would that be?"
This discussion has been archived. No new comments can be posted.


  • the standard overclocking tool, exactly how?
  • This article. [slashdot.org]
  • This is really cool! (no pun intended)
    This has some really far-reaching effects. Where heat was previously one of the prime concerns, it will become less so. I've heard stories of supercooled Pentium II's overclocked to around 1GHz. This could mean an instant increase in processor speeds, without any changes in the actual design. R&D, baby. R&D
  • How is this different from ->

    http://slashdot.org/article.pl?sid=01/03/16/1452229
    ?
  • by rw2 ( 17419 ) on Wednesday March 21, 2001 @11:17AM (#349277) Homepage
    If these things are anything like Peltier devices then the energy crisis is going to get a lot worse. Solid state cooling takes a ton of energy to perform a small amount of cooling.

    It adds a lot of waste heat too. It would be funny to see the web farms have to upgrade their air conditioning plants because their chips require on-board heat disposal. A double whammy. Dissipate an extra 7C, but spend 200W to get it!

    --

  • I'm still waiting for a breakthrough in superconductors. Imagine a CPU that can run at extremely high speeds and generate no discernible heat. You wouldn't need a heat sink or fan, making your system run that much quieter.

    Now if we could just get hard drives to run silently, we'd be all set (yes, I know about the solid state drives, but they're way too expensive and have too little capacity). Maybe when holographic drives become reality...

    --

  • If you read past the first paragraph, it says "the layers can reduce the local temperature on a silicon chip by up to 7 degrees Celsius".

    I'm sorry, but that's not going to keep your CPU from turning into bubbly goo.

  • This would be awesome, though overclockers would have to worry about freezing their chips; hopefully it never gets that bad...I wonder if these could be inserted in other objects? I think they could be put to good use in a lot of things, like a battery-operated "cool" towel. Just things like that...
    -----
  • There was a test by an overclocking team who put the motherboard into a freezer. A 486dx25 or some such. They had the clock speed up to about 200 before it melted into slag.

    But people would look at you strange if you had your cables running into a freezer and just used it as a normal computer. wouldn't they?

    Okay, got to say it. I wonder what this would do for a beowulf cluster of overclocked computers...

    DanH
    Cav Pilot's Reference Page [cavalrypilot.com]
  • The distance between a cooling surface and the actual chip itself is now much smaller, and so the cooling effect is better. This seems like a very good substitute for heatsinks. However, 7 degrees difference at 100 degrees really doesn't seem all that much now. Hopefully there will be future advances to cool the chips even further.

    However, the cost of these things isn't mentioned. Judging by the way this article is presented though, it should be relatively cheap to make.

    The cooling units consist of 200 alternating layers of two semiconductors stacked atop one another like tiles.
    I wonder how many layers you can put on top of each other before it becomes inefficient.

    Better yet, make a chip that's 2 pieces, so that each side of the cooling mini-fridges could cool 2 sides. Of course this isn't feasible because of fabrication costs, but hey, it's an idea!

  • It sounds from the article (which was lacking in technical detail) like the microcoolers can chill the portion of the chip they're in contact with. Okay, I'm good with that. But where does the heat go?

    Assuming that it's redistributed, what we're really looking at is a way to take that 1GHz+ CPU and let it run nice and cool while we fry everything else inside the case, right?

  • by nublord ( 88026 ) on Wednesday March 21, 2001 @11:19AM (#349284)
    A few months ago I saw this article [sciencedaily.com]. It concerns making water run up hill so that micro coolers such as these can work in low gravity and zero gravity environments without the need of pumps.
  • Did you even read the article at all?

    Maybe you should...and take note about how they are more efficient....

    This looks interesting, especially if they can improve this tech even further...but if they are to be fabbed in with the chip, does this mean we will have to expect Intel, AMD, and/or Transmeta to adopt this?

    Caino

    Don't touch my .sig there!

  • Am I the only one amused that the lead researcher on the project to develop microcooling for electronics is named Xiaofeng Fan? It just happened to catch my eye. :)
  • by Overphiend ( 227888 ) on Wednesday March 21, 2001 @11:21AM (#349287)
    In a room temperature environment just about any thermally conducting material, even a really small one, will bring something that hot down that little amount. A circuit has to produce a lot of heat to stay at 100 degrees; even flowing air would drop the microcontroller down a few degrees. Let's see some tests at closer to room temperature, and then I'll believe in the product.
  • Actually that article talks about microfans that still have moving parts. These micro coolers have no moving parts and thus are not prone to breakdown. They work by using the flow of electricity from one semiconducting material to another to extract the heat in between. However, as of yet they are still quite inefficient. Someone did point out a Slashdot article that covers exactly this, however.
  • At least until someone manages to get a greater cooling effect without expending more power, I see the practical uses of this as few and far between.

    It is interesting science, though, and makes me wonder if this will lead to efficient cooling devices for non-computing applications. For example, if this were made very efficient, chair-rail air conditioners could become possible (and low-noise too!). Me, I'll wait until the next breakthrough before shouting triumphantly.

    --

  • The difference between these is that this one is "a layer of two semiconductors stacked atop one another like tiles" while the other article was about tiny masses of cooling fans. One has moving parts, the other doesn't.
  • Oops. This nature article refers to non-liquid coolers. Oh well, the link I supplied is still cool.
  • I've heard of people overclocking a 486 to 247MHz [totl.net] This should help them out a whole lot!
  • what are you supposed to be overclocking, then? maybe you'd want to overclock your floppy drive... it might run those text-based games a little faster.
  • How is this different from -> http://slashdot.org/article.pl?sid=01/03/16/1452229 [slashdot.org]

    Actually, there was a whole story on this thing, I think, here:

    http://slashdot.org/articles/01/01/23/1350208_F.shtml [slashdot.org].

    Originally New Scientist had a story on it (here [newscientist.com]), and now it looks like it made it into Nature.

    I guess it must be officially "cool" now.

    but we will not likely see it next year... it will take a while.

  • Why not just mount the CPU perpendicular to the motherboard? I know it would be really easy to break it off then (for frequent box openers), and a lot of pins would have to fit on a much smaller surface, but hey, now the CPU has 2 sides for cooling instead of 1. Seems like an idea, and since the coolers are quite big those could also be mounted on the motherboard to make it not so easy to break.
  • An extra 200W for every computer in the state isn't going to make any difference. Buying the distribution system? Well, that's going to solve everything, Mr. Davis. And spending $50mil on consultants to create commercials to cut residential use, so we save maybe 0.2%??? Priceless.
  • That test was a hoax. The clock on the motherboard couldn't support that kind of multiplier and bus speed. It was all just free publicity from /.
  • And if you read the very next sentence, you would see the part where it said "To be commercially useful, these devices will have to perform several times better than this; this should be possible with further improvements, the researchers estimate."
  • So far as I can tell, the 'hot' side is much closer to the 'cold' side. How this helps is certainly beyond me.

    Did anyone else wonder how on earth they're actually moving the heat? Seems like "We've made a device that can move destructive heat a very small distance from where it's generated!", which I don't get the point of. Wouldn't this, at best, create a more uniform distribution of the heat? Doesn't say anything about where it goes...

    Well, maybe they're using a big peltier on top of the chip and using these little things to move heat over to it. Maybe.

    -grendel drago
  • everybody knows that beer is the best coolant [totl.net] :)
  • Not really. Being an Asian guy myself, I know that common last names, such as Wong, Huang, and Wang are all pronounced the same. They are the names given by immigration officers who merely try to write out the phonetic sound of the name. The "Fan" of the inventor is the same as "Fong," which is my mother's maiden name.

    Oh crap. Better go change all my passwords now.

  • Thanks a lot, now I'm gonna have to build a mini-keg cooled by a watercooled peltier. Inspiration can make you do strange things...

    Someone oughta mod this up.
  • I've always liked the idea of getting the CPU itself to spin (on at least 2 axes), rather than having a fan. You'd still get efficient air flow across the CPU core, and it would look pretty good. The only problem I can envisage would be the connector getting twisted. I'm sure I've seen electric toothbrushes that alternate rather than just spin, so this may not be such a hurdle...
  • does this mean we will have to expect Intel, AMD, and/or Transmeta to adopt this?

    Yes. However, prohibitive licensing costs aside, it doesn't sound like this is a particularly expensive process, so I don't see why any chip manufacturer would shy away from this. Of course, I'm no engineer.

  • You mean like the Slot 1 and Slot A (Intel & AMD respectively) CPU packages? It's been done, it's just much more costly. Although I do remember in the heyday of the Celeron 300A, quite a few overclockers would make "Celery Sandwiches" with a Celeron in the middle and a heatsink and fan mounted on both sides. It sure made for an interesting look.
  • They do that, it's called Slot {1,2,A,Z,whatever}.

    -grendel drago
  • need a cooler chip? get a G4.
  • You know, this all seems cool and stuff, but this story just aint that good for generating conversation beyond "ok, that's great...next story please?" So . . . next story please?

    . . .

  • I believe that was called the Pentium II, and the AMD Athlon. There were a few other chips that were mounted that way too, although they never tried to cool both sides at once. I'm thinking that the chip would be too thin to actually attach the cooling plate to, though.
    =\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\= \=\=\=\=\
  • Would you still want to use a fan for something like this? Would it increase the efficiency of the micro cooler, or would it simply be required to move the thermal energy away from the chip and coolers?
  • Actually, it's even more amusing, if I'm guessing the tones right, xiao-feng means 'small-air'. So, the name is 'Small-Air Fan'; which is in some respect what was invented. How often do you invent something described by your own name?

    I should probably stay out of cooling research, my last name being 'Burns'. I knew there was a reason I left EE for CS... :-)

  • This was posted to the Science section last Friday:

    link [slashdot.org].
  • by Dest ( 207166 )
    If you actually run the CPU at the rated speed, you won't need to microcool it.
  • That's not necessarily true. Even if chips share the same core and the same bin, there is still demand for the proc in a lower speed bin, and perfectly good higher-bin procs are down-binned to fill that demand, so MTBF is the same when you clock it up again. Of course, this isn't always true, but my overclocked Celeron 300A, now at 450, seems to be doing just fine after two years.

    JOhn
  • What would you do with a "Peltier-on-a-chip"?

    1. Assemble a Beowulf cluster of them.

    2. Leech more mp3s from Napster.

    3. Bundle censorware with each one (in compliance with Texas law).

    4. Leech more mp3s from Napster, but call it "hacktivism".

    5. Wintel r00l3z!

    6. Cowboy Neal.

  • As I understand it, this gets the cooling action right down to the silicon. There is a lot of thermal resistance (physics anyone?) between the silicon and the package/outside world itself. If you can pull the heat directly off of the silicon, the world becomes a happy place. And as the story points out, it would eliminate hot spots, a major source of failures in ICs.
  • The way I see it, there are still some problems. The heat still has to be moved off the chip, presumably via a heatsink. I guess this technology will help make the cores a bit cooler and therefore able to run faster. The system that these are in will suffer because of the increased heat. And given the power consumption of Peltier devices, I think that researchers are better off trying to reduce heat generation through better materials and design rather than strapping a fridge onto the thing. So to simplify: generate less, not move more.



  • In Nature, one must have a very good peer-reviewed paper culled from years of top-notch research to get published.

    On Slashdot, basic high-school science principles can be posted and moderated to Insightful in a matter of minutes.

    Yes, gentlemen (and not-so-gentle women ;-), the world is a better place with Slashdot.

  • That is just what we need...computers that use more electricity. California is having power problems as it is. How much would this add to their power problems? I would like to see more chips made that run cooler.
  • This reminds me of the saying, "Everything has advanced but toilet paper..." Well everything has advanced in computers but the cooling fan.

    Well, maybe except the IDE cables, but hey, Bluetooth is coming out!
  • "If anything, small 'hot spots' on the CPU could be avoided by strategic placement of microcoolers, thus helping all of us overclockers out."

    And we all know how much they want to help us overclockers out. Heh.

    Would be a real boon to getting a quiet PC, though.

  • /me smacks Dest around a little with a large trout.

    If designers incorporate this into their chip, they'll be cheaper/cooler/whatever! It isn't like you can mount one of these things on your chip yourself, you know.

  • Of course I read it. I hope you noticed that the claim that they were more efficient was immediately followed by a paragraph claiming they were more efficient to manufacture and that the cooling was more localized, leaving it at least somewhat unclear as to whether the actual process of heat extraction was very efficient or just the method of use.

    --

  • by Bonker ( 243350 ) on Wednesday March 21, 2001 @12:09PM (#349324)
    Most chips manufactured are created to work at a certain maximum tolerance. If a chip won't test reliably at 1.5 GHz, it's thrown in a pile of identical chips labeled and sold as 1.4 GHz.

    If these advances allow for reliable on-chip cooling, then you can bet that both AMD and Intel will keep these chips clocked as absolutely high as they'll go, thus eliminating the practice of user overclocking altogether.
  • Well... I wasn't referring to "overclockers"
    This was a corporate effort IIRC. They weren't your average business machines or gaming machines, but high performance computing systems.
  • Well, it's not their serial ports... but you can use the parallel ports and use PLIP -- parallel line internet protocol.
    http://dmoz.org/Computers/Software/Operating_Systems/Linux/Hardware_Support/PLIP/ [dmoz.org]
  • Well, isn't this the case with any cooling?
    I hate to point out the obvious, but any form of cooling is heat displacement, including your current heatsink/fan combo. All it does is take the heat and pump it into your case.

    However, the heat a 1.x GHz CPU generates is nothing compared to the heat a 10k rpm HD generates. The hard drive has a larger surface, so it feels 'cooler' per square inch, but the total amount of energy displaced is a lot bigger. (This is why case fans are good to use.)

    So I don't think the comment 'fry everything else in the case' is very relevant, since the microcoolers don't change the question or situation, just the method for redistributing the heat.


    -- Chris Chabot
    "I dont suffer from insanity, i enjoy every minute of it!"
  • Besides, the copper that some chips are made of is a pretty good conductor of heat already. Perhaps all that we need are copper rods embedded in chips (cunningly placed to avoid short circuits, and equally cunningly shaped to avoid inductive effects on the useful copper) to help draw out the heat?

    But then I think that if I had actually read the article, I might have something more insightful to say.
  • I'm not sure of the licensing issues, but the full journal article can be found on the Applied Physics Online web site, at this URL. [aip.org]
  • they don't have annoying "Free Leonard Peltier" fliers stuck to them.
  • Besides, the copper that some chips are made of is a pretty good conductor of heat already.
    Maybe it would be, if the copper weren't just nanometer-wide ribbons on the surface of the silicon. Mostly the copper keeps things cool by being a better conductor than aluminum, which reduces resistive heating.
    --
    spam spam spam spam spam spam
    No one expects the Spammish Repetition!
  • It will certainly be embraced by the micro-breweries... haaaahhh hah huhahha
    watch me lose karma for the sake of a pun

    sic semper tyrannis
  • Here's the theory... Right now, you can't build a CPU that generates too much heat, or it will fry itself. If you come up with an efficient way to remove the heat (without pumping it out of the case entirely), then they'll make hotter-running CPUs, which will cause more heat in the case, which will cause nearby chips to overheat and fail instead.

  • That's impressive. I've always heard rumors (friend-of-a-friend variety) that Intel had the 486 line running well over 100MHz internally. The FOAF said his personal workstation at Intel was a 133MHz 486. It wasn't clear whether these were standard 486s running on odd mainboards or odd 486s running on standard mainboards.
  • I seem to recall that in bulk, liquid nitrogen is cheaper than milk. More expensive than beer, but not by a lot.

    My understanding is that in bulk quantities, beer is cheaper than milk - there are milk marketing boards that set prices on the white stuff. Nobody is doing that for tanker trucks of beer.

  • From the article:

    Heck, maybe even increasing the voltage to your CPU would make it run cooler...how weird would that be?

    We learned in physics that it takes energy to move energy. Leaving a refrigerator open will heat up the room eventually (that was actually an exam question, I remember). So putting the microcoolers on/in the chip might allow the chip to run with more stability at higher voltages and clock speeds, but it won't make the chip run cooler. If anything, overclockers will need even bigger heatsinks and peltiers than they're using now to deal with the heat of the chip plus the heat generated by the microcoolers moving heat from the inside of the chip to the outside.

    -ck
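    The energy-balance point above can be put in numbers. This is a toy sketch with assumed wattages (none of the figures come from the article): whatever heat an on-chip cooler pumps out, plus the cooler's own input power, all has to leave through the external heatsink.

    ```python
    # Toy energy balance for an on-chip thermoelectric cooler.
    # All numbers below are assumed for illustration, not from the article.

    def total_heat_to_reject(chip_watts, pump_watts):
        """Heat arriving at the hot side: chip dissipation plus the pump's own input power."""
        return chip_watts + pump_watts

    chip_power = 60.0   # W dissipated by the CPU core (assumed)
    pump_power = 40.0   # W of electricity fed to the microcooler (assumed)

    # The heatsink must reject more heat than the chip alone produced,
    # which is why the cooler can't make the system as a whole run cooler.
    print(total_heat_to_reject(chip_power, pump_power))  # 100.0
    ```

    Conservation of energy is the whole argument: the cooler only relocates heat at a cost, it never destroys it.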
  • by Ergo2000 ( 203269 ) on Wednesday March 21, 2001 @12:36PM (#349337) Homepage

    Forest...trees... The amount of power being used completely unnecessarily [yafla.com] by residential users is significant: maybe it doesn't make a big difference when you consider one single home and you can laugh at initiatives for conservation, but when you consider an entire state it can be substantial. In 1999 there were 11,490,000 households in California. If every one of them replaced a single 100W lightbulb with a 15W compact fluorescent, that is 976,650,000W of savings. Do you realize that a typical nuclear reactor produces around 1,000,000,000W? So there you've potentially eliminated the need for a whole nuclear power plant by REPLACING A LIGHTBULB and you're talking about how individual users don't make a difference? Give me a break...

    And you say that an extra 200W per PC, or >2,000MW over the state, isn't a big deal. Let me guess : You don't vote because your vote doesn't count, right?
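    The household arithmetic above is easy to sanity-check; a minimal sketch using the parent post's own figures (the household count and bulb wattages are the poster's, not independently verified):

    ```python
    # Sanity-checking the parent post's statewide savings figure.
    households = 11_490_000          # California households in 1999 (poster's figure)
    watts_saved_per_bulb = 100 - 15  # swap one 100 W incandescent for a 15 W CF

    total_saved_watts = households * watts_saved_per_bulb
    print(total_saved_watts)  # 976650000, i.e. about 977 MW statewide
    ```

    The multiplication checks out against the 976,650,000W claimed in the post.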

  • You would still have to have a fan to cool whatever heat sink you attached to the chip to draw the heat away from the 'internal' heat sinks. Unless I am greatly misunderstanding the tech.

    What I think will happen: the chip makers will overclock the slower chips and charge the higher chip prices for them. *shrug* People tell me that I am cynical.
    -CrackElf
  • by xtal ( 49134 ) on Wednesday March 21, 2001 @12:48PM (#349339)

    If every one of them replaced a single 100W lightbulb with a 15W compact fluorescent, that is 976,650,000W of savings.

    You assume that the lightbulb is on all the time, which is incorrect. I hardly have any lights on ever at my place, and most people I know at most use bulbs for a few hours per day - and they're not going to spend a hundred bucks swapping bulbs - those 15W ones are expensive as hell. Telling people to buy them at an added cost to them - less beer, for example - without raising the price accordingly flies in the face of the economics upon which your country was built.

    Not to say conversion isn't a good thing, but the reason people waste power IS BECAUSE THE PRICE IS ARTIFICIALLY LOW. If you want people to use less power, for god's sake, just RAISE THE PRICE. That's capitalism; aren't you guys the United States of America? The supply falls, the price rises, more people will want to build power stations - but oh, wait, you've gone and fucked yourselves with environmental legislation that flies in the face of reality. You SHOULD have several more nuclear power plants, or hydro, or coal, or whatever, if you want to sustain the current price to consumers.

    You can buy all the power you want from us in Canada - it just isn't going to be cheap. Raise the price, and watch all those 15W bulbs fly off the shelves. Lower the environmental regulations, and build some power plants. Just wait until people start using their A/C in summer - you have lots of people, well, you get lots of pollution to match.

  • My impression is that they take the heat energy and produce electricity ("thermoelectric"), which is just cool by itself. If a processor could recycle the heat it generates back into clock ticks, California would be all the better for it, let alone the cooling effect.
  • ...people who think their funny...

    "Funny" is not usually something thought of as being possessed. Usually you need a noun for possessives: "his shirt", "her car", "their country".

    Now if the intent was to talk about "people who think that they are funny", one might want to use a contraction (which generally is frowned upon in written communications) to say something like "people who think that they're funny". Dropping the "that" is probably fine for an informal forum such as this.

    One benefit of avoiding contractions (and their apostrophes) is that you can avoid the confusion between "it's" (contraction of "it is") and "its" (possessive of "it").

    Of course none of this is on topic...

  • You assume that the lightbulb is on all the time, which is incorrect. I hardly have any lights on ever at my place, and most people I know at most use bulbs for a few hours per day - and they're not going to spend a hundred bucks swapping bulbs - those 15W ones are expensive as hell.

    Ah, but therein lies the crunch: most of the power system in place is to deal with momentary peaks, because people do tend to all do the same things at the same time: everyone cranks their ovens on at the same time, and generally at the same time the AC powers up (and of course in warmer places like California every W of lighting turns into a W of heat that the AC has to remove from the air). At common times a good portion of the population has their hairdryers on in the morning, and their water heaters come on because they had a shower. Every W that is piled on top of that load is a W that has to be accommodated in the power grid.

    Having said that, a couple of quick points:

    • You can get those bulbs inexpensively now at places like IKEA (they're $3CDN here in Canada for an eq. to a 60W incandescent). Given that they last as long as 10 normal bulbs already they're a cost savings, but the 80% reduction in power consumption is an added bonus.
    • Even if everyone wasn't using them at the same time, a lot of power companies are moving to on demand power (i.e. banks of diesel engines in distributed locations). When you turn on that 100W light that diesel engine is cranking just a little bit harder, emitting just a bit more sulphur, etc., directly because of you. People underestimate their own effect on the environment, the power grid, etc., when in reality it is substantial.
  • The end doesn't necessarily justify the means. Saving power is great - I use a lot of fluorescent lights and try to be conscientious about power consumption, but in the grand scheme of things, 200W per computer is still chump change with regard to the current manufactured power crunch in California.

    Tell people to conserve, but don't make up faulty data to support your claim.

  • The point of these isn't for you to go buy them and put it on your chip to overclock it. Chip manufacturers like Intel and AMD can build these into a chip in order to keep it cooler, and therefore they can make faster chips. The reason you don't have heat problems unless you overclock is because Intel and AMD design the chips so that they will not overheat at their rated speed, but in order to keep making faster chips they need to keep finding ways to better cool the chips. Heatsinks and fans work for now, but as chips get faster, cooling will need to improve, and this is supposed to help in that.
  • And even that is not 100% true anymore.
    Kimberly-Clark to introduce wet toilet paper [naplesnews.com]
    What can I say, go figure. :-)
    I invented that years ago, something to do with it falling into the bathtub. :-)
    --------
  • Placing stuff on both sides of the cooler wouldn't work. The "coolers" don't actually change the total temperature, but move heat from one side to another. They're basically like micro-peltiers, and anyone who is into overclocking knows that putting a peltier on upside down will fry your chip. What happens is that one side cools down, but the other side heats up so putting something on both sides of the cooler would ruin whatever was on the hot side.
  • ...because "The new microcoolers are more efficient to make and to use because they can be fabricated directly onto a chip." That is, they are plated on by the chip manufacturer rather than bolted on later. It might reduce the cost. More important, it gives much lower thermal resistance between the chip and the heat pump. And it lets the chip's designers put the cooling right on the hottest spot(s).
  • I believe rw2's point is that the room was at room temperature, the chip or whatever was at 100C, and when the cooling thing was activated, the chip cooled by 7C. And he's right, that's not especially impressive. But then again, rw2 is missing the bigger point made in the Nature article, that this is something new and people are just getting their feet wet. rw2: give it a couple years and maybe they'll be able to drop the temperature by 50C.
  • Actually, the microcoolers do increase the total heat. On the hot side, you get all the heat they removed from the chip plus all the energy that was used to pump the heat. Still, the CPU+cooler isn't burning as much power as gets expended in other parts of the system like power supply, hard drive, and especially the monitor. It's just that you get 10's of watts in an area a fraction of an inch across at the CPU, rather than spread across the whole hard drive, etc. So you've got to take extraordinary measures to get the heat out of the CPU chip and into a sizable heat sink. And then you might have to look at whether that heat sink is too close to other chips. With proper design it shouldn't be a problem, but if the motherboard, chip with heatsink, and enclosure are all designed independently it might be...
  • No, they are running the thermoelectric stack backwards -- putting in electricity to pump the heat to a higher temperature at the top end, rather than letting the natural flow of heat to a lower temperature generate electricity. It would be nice if you could use the Peltier effect to recycle heat into electricity, but at the CPU the trouble is that the thermal resistance between the silicon and the heatsink is high enough to cause difficulties with keeping the silicon at a survivable temperature. Adding extra resistance in the form of a Peltier generator would cook the chip for sure.
  • ... something like

    "who the hell overclocked this comp ? we are freezing to death !"

  • "You would still have to have a fan to cool whatever heat sink you attached" Probably. Certainly if you were overclocking. But (assuming that further work on this provides much, much more cooling than the 7 degrees C mentioned in the article), it might also be used to run a chip at a moderate speed without a fan. In laptops, that's a big advantage. I don't expect to see that very often in desktops because chances are the Peltier cooler costs more than the fan.
  • People in the know about overclocking have been using this technology for years, except on a larger scale. The problem is that for every BTU of heat removed from one side of the device you create 2 BTU of heat on the other side, so cooling the peltier or "micro cooler" itself becomes the problem. They're also expensive, making the $960 dual Athlon boards seem like chicken feed.
  • You forgot 'Put "Free Peltier from his chip!" bumper stickers'.

  • You can buy all the power you want from us in Canada - it just isn't going to be cheap. Raise the price, and watch all those 15W bulbs fly off the shelves. Lower the environmental regulations, and build some power plants. Just wait until people start using their A/C in summer - you have lots of people, well, you get lots of pollution to match.

    As a sidenote I am from Canada (Ontario) and while we have a surplus of power in general, the fact that there is a large coal-burning power generation plant up the lake instantly tells me that we don't conserve enough, or we're not exploring more sustainable methods of power generation enough. With all the hydroelectric and alternative power generation, still we're burning coal?

    Power generation, just like all industry, should be based upon real numbers. If you're running a plant that is emitting massive pollutants into the air, the degradation of quality of life, increased health costs, lowered land value in proximity, and future cleanup costs should all be assessed and built into the cost of that power. If that were the case there would be far more environmentally friendly power generation facilities, because the real numbers would be more apparent. As it is we like to charge it on the environmental credit card and pretend it's cheaper while disparaging "environmentalists", when in the long run it always ends up costing us far more in health care costs, we live shorter lives, and we (the public) end up funding billions of dollars in cleanup costs. Although I generally think nuclear power is a good option, that same dreamland thinking has proven detrimental, because at the outset everyone talked about cheap nuclear power...not adding in the cost of tens of thousands of years of nuclear waste monitoring/cleanup.

  • by evanbd ( 210358 ) on Wednesday March 21, 2001 @02:40PM (#349356)
    I read a different article on this (don't have link, sorry) that suggested that these would be used to move heat between different regions of a chip. For example, cool a very hot ALU by dumping the heat elsewhere. Before you say that's stupid, in reality a more even heat distribution would decrease the maximum temperature of any given point on the chip, allowing it to run hotter on average (and hence faster). Also, it makes it easier to cool, as a greater region is in contact with the heat sink.
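A toy model makes the heat-spreading point concrete (the thermal-resistance figure and the per-region wattages below are invented purely for illustration; a real chip needs a full thermal simulation):

```python
# Toy model: each chip region's temperature rise is proportional to its
# local power, with the same thermal resistance to the heat sink for
# every region. The peak temperature, not the total power, sets the limit.

THETA = 0.5  # degC per watt, an illustrative thermal resistance

def region_temps(powers, ambient=35.0):
    return [ambient + p * THETA for p in powers]

hot_spot = [10, 10, 10, 50]  # one 50 W hot spot (say, the ALU)
spread   = [20, 20, 20, 20]  # same 80 W total, pumped out evenly

print(max(region_temps(hot_spot)))  # 60.0 -- the hot spot sets the limit
print(max(region_temps(spread)))    # 45.0 -- lower peak, same total power
```

Same 80 W either way; evening out the distribution drops the worst-case temperature, which is exactly the headroom the poster describes.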
  • Which ones are you using? Visible light output is measured in lumens, and if two lights have the same lumen output they are measurably and visibly putting out the same amount of light. Current 15W compact fluorescents put out the same lumens as a 75W incandescent.

    If you mean quality of light, I greatly disagree: I find that compact fluorescents put out whiter, more natural light. There is no flicker with CFs.

  • just right for when you need to carry that microbrew beer around but don't want the hassle of a regular sized cooler.

    --
  • People have been waiting for this for many a year and nothing has come of it. I remember reading that it required liquid-nitrogen cooling, which is not exactly economical.

    Right, but what I'm referring to are high temperature superconductors. If some day a way is determined to create such things, the liquid nitrogen will be unnecessary.



    --
  • You must not have any SCSI drives!
  • those 15W ones are expensive as hell



    They cost about 8x the price of a normal lightbulb, use 1/5 of the energy, and last 5-10x longer. What is the problem? Oh, you live in America - there is probably some 1000% tax (to get a $100 lightbulb) on energy-efficient devices, thanks to lobbying by traditional lightbulb manufacturers. Get real.

    Even when they originally came out and cost about 30x the price of a normal lightbulb, they would still save you money over the lifetime of the bulb because of the reduced electricity costs.

    However, where I live, a normal lightbulb gave out a few weeks ago. It was bought 70 years ago - got a good lifespan from that one. I think new cheap lightbulbs are like new floppy disks: they give up after a few months, or as soon as you actually use them.

    All the bulbs in my house are energy efficient: from the 7W candle bulbs (replacing 40W bulbs) that you can hold in your hand after they've been on for hours, to the 11W (60W) bulbs, to the 20W (100W) main lightbulb. I can have 4 lights on in the lounge drawing a total of 53W, not 240W.
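The payback argument is easy to check with rough numbers. This sketch uses the poster's ratios (~8x purchase price, ~1/5 the power, much longer life); the electricity price, bulb prices, and lifetimes are assumptions for illustration:

```python
# Lifetime cost = replacement bulbs + electricity over the same hours.
PRICE_KWH = 0.10  # dollars per kWh, an assumed rate

def lifetime_cost(bulb_price, watts, bulb_life_h, hours):
    bulbs = hours / bulb_life_h                  # replacements needed
    energy = watts / 1000 * hours * PRICE_KWH    # running cost in dollars
    return bulbs * bulb_price + energy

HOURS = 8000  # compare over one CFL lifetime of use
incandescent = lifetime_cost(0.50, 75, 1000, HOURS)  # 8 bulbs + energy
cfl          = lifetime_cost(4.00, 15, 8000, HOURS)  # 1 bulb + energy

print(round(incandescent, 2))  # 64.0 dollars
print(round(cfl, 2))           # 16.0 dollars
```

Even with the CFL costing 8x up front, the electricity dominates, which is why the bulb pays for itself well before it burns out.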

  • :::grumble::: They're not kidding... Of the 3 T-birds I've seen, 2 melted! Guess it's the high price/high stability/low performance of Intel vs. the low price/low stability/high performance of AMD. Go figure.
  • I agree with you up to the point about the environmental legislation. Environmental legislation was not the problem, the problem was that no cities wanted power plants next to them. People are rapidly rethinking this attitude, but it is too little too late. A power plant doesn't get built overnight, and by the time that it is built, the damage to the economy and environment is done.

    I agree that prices should be raised. What is happening now is that the utilities are having to take out loans, which will be repaid with interest both from higher utility rates and taxes. Great, as if CA state tax wasn't high enough already.
  • Just to state the obvious, fluorescent lights use less power AND give off less waste heat, which makes my air conditioner run less often.

    This summer I'm going to put aluminum foil over one of my apartment windows (there's no shade at all), has anyone tried this?
  • ..........i'm sorry, you've already reported this one. http://slashdot.org/article.pl?sid=01/03/16/1452229 . Next!
  • That coal plant is there for a reason. Have a look at this graph [uic.com.au].

    The problem is that many types of electricity generating plants cannot be "turned on" and "off" quickly. Things like nuclear plants and hydroelectric are great for supplying the base load but for the peaks you need something you can bring online quickly and take off just as quickly. Natural gas and coal are the most popular choices here. Bad Things (tm) happen if you have more or less generating capacity than the load you're trying to supply.
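The base-load vs. peaking split can be sketched in a few lines (the capacity and load numbers here are invented for illustration):

```python
# Toy dispatch: base-load plants (nuclear, hydro) run flat all day;
# fast-ramping plants (gas, coal in this comment's example) cover
# whatever the load does above that, and throttle to zero off-peak.

BASE_CAPACITY = 60  # MW of always-on base-load generation

def peaker_output(load_mw):
    # Fast plants cover the gap between demand and base load.
    return max(0, load_mw - BASE_CAPACITY)

daily_load = [55, 60, 80, 100, 90, 65]  # MW at points through the day
print([peaker_output(l) for l in daily_load])  # [0, 0, 20, 40, 30, 5]
```

The base plants never move; all the fast swings land on the plants that can follow them, which is the commenter's point about why the coal plant is there.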

  • Hey, great post, first of all. Cooling by increasing voltage is not weird at all! Think about it: when you double your voltage, the current needed to deliver the same power is cut in half. And if you remember your science classes, resistive heating goes as I²R, so twice the current means four times the heat - 20 amps dissipates 4x the heat of 10 amps in the same wire. That in turn means the gauge of the wire carrying the current can be smaller. Since it takes 60 amps at 110V to run the average external AC unit (for example), you can save copper and make the installation more robust by running the AC unit on 220V with 30-amp wire (as long as you don't live in San Francisco...hehehe). Higher voltage is almost always preferable to low voltage at high current; motors last longer, etc., because the copper is less susceptible to heat damage.

    The downside to high voltage is that it is more dangerous. It arcs more easily and kills much more completely, faster. Don't get me wrong: current is what kills, and with enough of it, 1 volt can kill you just as dead as 50,000 volts. High voltage scrambles neurons more than it burns, unless the current is also high. You can get knocked out by the ignition coil on your car, but it can't fry your arm off (you wouldn't be able to keep holding it anyway). Why? Low current. High voltage and high current together are a deadly mix indeed: 50,000 volts at 500,000 amps will vaporize you from a foot away - you don't even need to touch it. That is also why power companies use high-tension wires: the lines run cooler and less power is lost as heat.

    Hope this clarified that whole thing a bit. I am not an electrical engineer (just another programmer...), but I worked in an electric motor shop during college and used to know the equations. L8, Neil
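The I²R point above can be sanity-checked with a quick script (the wire resistance value is made up for illustration; the 60 A / 110 V figure is the poster's example):

```python
# Resistive loss in a wire is P_loss = I^2 * R. For a fixed delivered
# power, doubling the voltage halves the current and quarters the loss.

def line_loss(power_w, voltage_v, wire_resistance_ohm):
    current_a = power_w / voltage_v              # I = P / V
    return current_a ** 2 * wire_resistance_ohm  # P_loss = I^2 * R

R = 0.5  # ohms, an illustrative wire resistance
loss_110 = line_loss(6600, 110, R)  # 60 A at 110 V
loss_220 = line_loss(6600, 220, R)  # 30 A at 220 V

print(loss_110)  # 1800.0 W lost in the wire
print(loss_220)  # 450.0 W -- one quarter of the 110 V loss
```

Same 6600 W delivered either way; halving the current cuts the resistive loss by 4x, which is the whole case for high-tension transmission.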
  • I just printed this post and followed your advice, all over it.
  • Another interesting property the Peltier element has is that it can convert heat into electricity. So could you have them drive the fans, etc. to a degree? You could save a little energy that way, but it would add up in the long run.
  • Not sure if it will make a big difference in the long run, but the general trend (if I remember my physics classes...anyone verify this?) is that good thermal conductors tend to be good electrical conductors. This presents obvious problems if you try to make these things too big. Also, from a design standpoint, does this severely limit the way chips can be built around these thermal conductors? And since cooling capacity is a function of surface area, perhaps a design could dissipate more via more surface area. Comments?
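The trend the poster half-remembers is the Wiedemann-Franz law: in metals, thermal and electrical conductivity track each other. A quick check with approximate textbook room-temperature values for copper (quoted from memory, so treat them as rough):

```python
# Wiedemann-Franz: for metals, k / (sigma * T) is roughly constant,
# where k is thermal conductivity, sigma electrical conductivity, and
# T absolute temperature. The constant is the Lorenz number,
# about 2.44e-8 W*ohm/K^2.

L0 = 2.44e-8  # theoretical Lorenz number, W*ohm/K^2

def lorenz_ratio(k, sigma, temp_k):
    return k / (sigma * temp_k)

# Copper at room temperature: k ~ 400 W/(m*K), sigma ~ 5.96e7 S/m.
copper = lorenz_ratio(k=400.0, sigma=5.96e7, temp_k=300.0)
print(copper)  # ~2.2e-8, within roughly 10% of L0
```

So yes, for ordinary metals the two conductivities are tied together, which is why thermoelectric materials are engineered to break that link as far as possible.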
  • It doesn't matter if it only moves the heat a small distance.. it still moves it. If you touch a surface with your finger, it doesn't matter if the material 1 mm down is at a thousand degrees.... if the surface you are touching is at 0 degrees, it'll feel cold. Period.

    Peltier devices move heat away from one side to the other side.. and they also generate heat (which ends up on the hot side of course). That's why there is always a point where it's generating more heat than it can move, and becomes inefficient.

    The point is, it moves heat away from the chip surface faster and more reliably.. that heat still has to be bled off with a heat sink/fan/whatever.
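That energy balance is easy to put in numbers (a sketch; the COP value is an assumption, and real Peltier modules are often well below 1 when pumping across a large temperature difference):

```python
# Energy balance for a Peltier device: everything pumped off the cold
# side, plus the electrical power consumed doing it, arrives on the hot
# side -- and all of it must still be dissipated by a conventional sink.

def hot_side_heat(q_pumped_w, cop):
    p_in = q_pumped_w / cop   # electrical input needed for that pumping
    return q_pumped_w + p_in  # total heat the sink must now shed

# Pump 50 W off a CPU with an assumed COP of 0.5:
print(hot_side_heat(50, 0.5))  # 150.0 W -- the sink sees 3x the CPU's heat
```

This is the "point where it's generating more heat than it can move": the worse the COP, the more the cooler itself dominates the heat the sink has to handle.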
