AMD 90nm Evaluated

muyuubyou writes "The Tech Report has measured the new 90nm A64 3500+ against its 130nm counterpart and a 90nm 3.6GHz Pentium 4. AMD looks way ahead in the 90nm process, especially when it comes to power consumption. Note that these are total-system figures, including GeForce 6800 GTs and hefty PSUs. RTFineShortArticle for more detail on the configuration. Leaving the PC on overnight is probably not a good idea with these new Pentium 4s."
  • wow (Score:5, Interesting)

    by hruntrung ( 89993 ) on Monday October 04, 2004 @01:57PM (#10431114)
    That's impressive. Of course, since it's total system wattage, it'd be nice to have some information about disk usage over the period of time, etc.

    I like, though, that the 130nm Athlon 64 is still better than the 90nm P4. It might just be time to buy another desktop.
    • Re:wow (Score:5, Informative)

      by Tumbleweed ( 3706 ) * on Monday October 04, 2004 @02:01PM (#10431187)
      I'd wait until the Athlon 64 PCIe boards come out before buying a new system, so as to prolong useful system life. Nvidia's nForce4 chipset should be out in 4th quarter or thereabouts. _Then_ you can jump safely into a new system.
      • Re:wow (Score:2, Funny)

        by tanguyr ( 468371 )
        " I'd wait until the Athlon 64 PCIe boards come out before buying a new system, so as to prolong useful system life."

        soooooooooooon. [trustedreviews.com]
        • I don't think I'd go for a VIA solution over an NVidia one, but it's certainly best to wait for detailed reviews of both before deciding, especially if you tend to keep your systems around for a long time. I'm just now in the process of upgrading from a 550MHz Slot-A Athlon system. I have high hopes for BZFlag playing with my new NVidia 5900XTV over my old GeForce2mx (which wasn't doing textures very well any longer, either). :)
      • Question...

        Can any somewhat-modern Socket 939 motherboard run these new chips?

I'm looking at getting the MSI Neo 2 Platinum, but I'd like to get a 90nm A64. I've been waiting for a while.
It shouldn't require anything more than a BIOS update, if that. If you've been waiting a while, I'd suggest you keep waiting for the PCIe boards; it shouldn't be much more than a few months now, and will ensure a much longer useful life.
    • Re:wow (Score:3, Interesting)

      by stratjakt ( 596332 )
      This isn't just P4 vs AMD, this is:

      Asus Motherboard + AMD + DDR400 + AGP version of video card

      vs

      Abit motherboard + P4 + DDR2-533 + PCI-E version of video card

      The P4 is no doubt hotter, but the faster RAM and video bus on that rig must account for a good chunk of the extra wattage too. Note that the benchmarks used are particularly memory intensive (mpeg rendering, speech recognition, molecule modelling)..... Hmmmmmmmmmm..

      I hate rigged tests to make "my favorite corporate tech asshole company" look be
  • Power consumption (Score:4, Insightful)

    by TrollBridge ( 550878 ) on Monday October 04, 2004 @01:57PM (#10431124) Homepage Journal
    "Leaving the PC on overnight is probably not a good idea with these new Pentium 4s."

It's not going to make THAT much of a difference on your electric bill.

    Now what I want to see is an analysis of the possible benefits to notebooks, specifically in extending battery life. Intel's Centrino seems to be doing fairly well in that department, but where is AMD's response?

    • by lakiolen ( 785856 )
      I must say that computers make good space heaters. (Especially with winter just around the corner) And while heating up the room you're in they can also entertain you, unlike most regular space heaters.
      • Re:Power consumption (Score:2, Informative)

        by aldoman ( 670791 )
Most people have gas heating. Gas heating is far, far cheaper than horrible, inefficient electric heating.
        • Re: (Score:2, Funny)

          Comment removed based on user account deletion
    • Re:Power consumption (Score:4, Informative)

      by aldoman ( 670791 ) on Monday October 04, 2004 @02:01PM (#10431183) Homepage
      Yes, actually it is.

      Let's say your average 'gamer' system uses 500W of power, including monitor.

      At 10c per KWh, that is going to be 5c/hour, or $37/month.
      • Re:Power consumption (Score:5, Informative)

        by Minwee ( 522556 ) <dcr@neverwhen.org> on Monday October 04, 2004 @02:17PM (#10431399) Homepage
        That's being pretty generous -- The power supply of that 'average gamer system' would have to be running at peak capacity 100% of the time to use that kind of power.

        A large screen CRT monitor uses somewhere around 50-70W when active, and 1-2W in sleep mode. LCD displays use less power, but they're not what the average gamer uses.

        Steady state usage for the computer itself is more like 200W than 500 -- The 500W capacity on your average gamer's power supply is equal parts peak capacity for boot-up and lies told by marketing, and you would have to be playing Doom 3 all day long, every day to keep that up for the entire month. Even if you disabled power management and just let it idle all night long it would still use less than 100W.

Using these numbers, and assuming that your average gamer is playing twelve hours a day and in class or sleeping the other twelve, we're looking at an average power consumption of 175W, for a total of forty-two cents per day or $13 a month at your rates.

        The back of the envelope rests, your honour.
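The envelope math above can be written out in a few lines (all wattages, hours, and the 10c/kWh rate are the comment's assumptions, not measurements):

```python
# Back-of-envelope check of the parent's numbers: 12 h/day gaming at
# ~250 W (box + CRT), 12 h/day idling at ~100 W, $0.10 per kWh.
RATE = 0.10  # dollars per kWh, assumed from the thread

def monthly_cost(avg_watts, hours_per_day=24, days=30):
    kwh = avg_watts / 1000 * hours_per_day * days
    return kwh * RATE

avg = (250 * 12 + 100 * 12) / 24   # average draw over a full day
print(avg)                # 175.0 W average
print(monthly_cost(avg))  # ~12.6 dollars/month
```

Which lines up with the forty-odd cents a day / $13 a month figure.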
        • "A large screen CRT monitor uses somewhere around 50-70W when active, and 1-2W in sleep mode. "

          Sorry but you are not correct. The 19" Flat Screens do about 100-140W. My Sony G400 19" does about 140W [sony.com] and <1 W in standby.

          Samsung Syncmaster 957 MB 19" CRT: 110 W [samsung.com]
          ViewSonic E90 19" CRT: 100 W [viewsonic.com]
          Benq Professional P992 19" CRT: 110W [ncix.com]

          • Re:Power consumption (Score:5, Informative)

            by default luser ( 529332 ) on Monday October 04, 2004 @03:51PM (#10432489) Journal
            Actually, for CRTs, active power usage depends on the rate of the dot clock.

            The power usage of a monitor will increase linearly with dot clock (with some minimum accounting for the brightness of the display).

            Most high-end 19" monitors (with high-speed dot-clocks) have a maximum power usage of around 140w. Those numbers you have quoted are for THE HIGHEST supported resolution and refresh rate, with the maximum brightness...they vary because the maximum brightness and maximum dot clock speed vary among them.

            On the other hand, most people use the recommended resolution and brightness set by the manufacturer. That is usually 1280x1024@85Hz on a 19" monitor, for a dot clock of around 111MHz.

            For comparison, if you run your 19" monitor at 1600x1200@85Hz, you'll see a clock of 163MHz, and a proportionate increase in power usage.

            For example, my monitor (Vision Master Pro 454) has a maximum rated output of 135w. If we ignore the brightness issue, then we assume that at maximum frequency (1920x1440@85), or 235MHz, the power usage is 135w.

            So, scale down to a more reasonable resolution like 1600x1200, and we're only using ~ 93w. Or use the recommended resolution at 1280x1024, and we're sipping a cool ~ 63w.

            Of course, these numbers are probably a bit higher due to components I have not taken into account. I do recall that the instruction booklet for my 454 lists power usage at multiple resolutions, and they did display this linear relationship, but I don't have access to it now.
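The linear power-vs-dot-clock model described above can be sketched as follows (the 135 W maximum at 1920x1440@85 is the comment's figure; blanking intervals and the brightness floor are ignored, so the computed dot clocks run slightly low):

```python
# Estimate CRT power by scaling linearly with the dot clock,
# anchored at the monitor's rated maximum (135 W at 1920x1440@85Hz).
def dot_clock_mhz(width, height, refresh):
    return width * height * refresh / 1e6  # blanking intervals ignored

MAX_WATTS = 135
MAX_CLK = dot_clock_mhz(1920, 1440, 85)  # ~235 MHz

def est_power(width, height, refresh):
    return MAX_WATTS * dot_clock_mhz(width, height, refresh) / MAX_CLK

print(round(est_power(1600, 1200, 85)))  # ~94 W
print(round(est_power(1280, 1024, 85)))  # ~64 W
```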
        • A large screen CRT monitor uses somewhere around 50-70W

          When I last measured, it was nearer 150w when active, and about 15w when idle (Sony 15" screen, about 10 years ago)

        • Re:Power consumption (Score:3, Informative)

          by Lost Race ( 681080 )
          I have a buncha PCs here, all a few years old now. The mid-towers with 1GHz Pentium III CPU, GeForce4 video, one IDE drive, take about 60W for the system itself at idle. One particularly big PC server with 10 HDDs, 2 CPUs and lots of fans is about 125W idle. One system with a 2GHz Athlon and a few SCSI drives is about 100W idle. Power consumption goes up about 20-50W when 100% busy, depending on the CPU model. I think that idle-to-busy difference might be approaching 100W now with the latest Pentium 4
        • Suffice it to say... (Score:3, Interesting)

          by PCM2 ( 4486 )

          That's being pretty generous -- The power supply of that 'average gamer system' would have to be running at peak capacity 100% of the time to use that kind of power.

          Yeah, well for a while I was running my computer all the time as a way to host my personal Web page and networked et ceteras, on a DSL line with DynDNS etc. This was a Mac G4 with the monitor switched off most of the time, not doing much of anything except fielding HTTP requests from viruses. And while it didn't increase my power bill by $37,

      • Re:Power consumption (Score:2, Informative)

        by Cramer ( 69040 )
My entire "cluster" consumes less than 600W (554 "idle" -- 575 gaming). That's 1 x dual Opteron 240, 1 x dual PIII 850, 1 x dual PII 333 (dual Voodoo2 in the thing), 7 x 146G FC drives in a Eurologic FC7 shelf, cable modem, Cisco 1760, Sony LCD monitor, unmanaged ethernet switch, etc. That comes to, on average, $352 per year [~$30/month].

        I spend more per year using the kitchen toaster than all of the computer hardware combined.

        Just because you have a 500W power supply in the PC does not mean it consumes
      • Yes, actually it is.

        Let's say your average 'gamer' system uses 500W of power, including monitor.

        At 10c per KWh, that is going to be 5c/hour, or $37/month.

        500 W of power is grossly overestimating even a top end gaming system with a big CRT. Real world numbers are closer to 1/3 to 1/2 that even when playing games. If you can manage to play games 24/7 on a 50" big screen TV, well then, more power to you.

      • Re:Power consumption (Score:5, Informative)

        by Remlik ( 654872 ) on Monday October 04, 2004 @02:35PM (#10431604) Homepage


        Ok, I have a P4 3Ghz, ATI Radeon 9800, 3 HDD's (which do not sleep) and a 21" CRT monitor. The only power saving feature I use is putting the CRT into sleep mode after 15 minutes. Otherwise the computer and drives run full time.

My electricity bill is at times lower than $30 a month. No, I do not use the spread-your-payments-out option.
Tell me what the same system draws when it's idling, the monitor is off, and the computer is not running a 3D program. A graphics card draws most of its power when doing 3D rendering.
    • Re:Power consumption (Score:3, Informative)

      by Anonymous Coward
And remember that most modern operating systems (Windows 2000/XP, Linux) run a single "HALT" instruction in their idle thread. All CPUs since the Pentium Pro (1995) automatically enter their low-power (200 mWatt or so) idle mode in response to a HALT. As long as you're not running SETI@Home or Folding@Home in the background, your CPU isn't going to be wasting power unless you are DOING something.
      • Re:Power consumption (Score:5, Interesting)

        by dougmc ( 70836 ) <dougmc+slashdot@frenzied.us> on Monday October 04, 2004 @02:31PM (#10431564) Homepage
        200 mWatt or so
Yes, it does use less power when running the HLT instruction, but not *that* much less. Half the power wouldn't surprise me. Modern CPUs use over 50 watts of power -- I remember when the Pentium first came out, and people were amazed at its 13 watts of power use. If only they knew!

        Now, going into suspend or sleep mode, that saves a lot more power, but it can't do that thousands of times a second.

And in case people don't realize it, running things like SETI@Home or RC5 *does* cost them money. Their computer will probably use somewhat less energy when idle than when busy.

Also, it gets worse. Not only do you have to pay for the extra power used by your computer, but if you live somewhere hot, you'll have to pay for the extra air conditioning needed (after all, 200 watts of power used by your computer = 200 watts of heat generated). Somebody told me that as a rule of thumb, 5x the amount of power used to generate the heat is needed to remove it via air conditioning -- so 200 watts of computer = 1000 watts of A/C needed to keep it cool. Can anybody confirm or deny this rule of thumb? It sounds like too much to me.

Somebody told me that as a rule of thumb, 5x the amount of power used to generate the heat is needed to remove it via air conditioning -- so 200 watts of computer = 1000 watts of A/C needed to keep it cool. Can anybody confirm or deny this rule of thumb? It sounds like too much to me.

          Deny.

500 watts of A/C gives about 5200 BTU of cooling, which is what is recommended to cool 1000W of computers. And that's probably using less than 100% duty cycle.

          (A/C's are heat pumps, so they do not need as much

    • by Behrooz ( 302401 )
130 watts of continuous usage runs to almost 100 kWh per month, which works out to an additional $8-12 on my power bill. I live in a relatively cheap electricity market, too.

      Given that there are five high-end computers currently living in my basement, I'd say it adds up.

      • I was a little surprised at what you said, so I checked the math. At 7 cents per kilowatt-hour, the cost is $6.64/month:

        Power in Watts: 130
        Hours/Day: 24
        Days/Month [365/12]: 30.42
        Hours/Month: 730.00
        Watt-Hours/Month: 94,900.00
        Kilowatt-Hours/Month: 94.90
        $/KW-Hour 0.07
        Cost/Month: $6.64

        That's approximately $1 per penny of cost per KW-Hour.

        It is true that a desktop computer, with monitor off, draws a little over 1 Amp at 120 Volts, or approximately 130 Watts. I teste
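The table's arithmetic can be checked in a few lines (the 130 W draw and 7c/kWh rate are the comment's figures):

```python
# Monthly cost of a constant 130 W draw at $0.07 per kWh.
watts, rate = 130, 0.07
hours_per_month = 24 * 365 / 12                 # 730.0
kwh_per_month = watts * hours_per_month / 1000  # 94.9 kWh
cost = kwh_per_month * rate
print(round(kwh_per_month, 1))  # 94.9
print(round(cost, 2))           # 6.64 dollars/month
```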

    • Re:Power consumption (Score:5, Informative)

      by wherley ( 42799 ) * on Monday October 04, 2004 @02:05PM (#10431238)
Say the P4 uses roughly 100 watts more than the AMD, and you leave it on for 10 hours overnight each day for a year. Say you pay about 7 cents a kilowatt-hour. Then you end up paying 0.1 kW * 10 hrs/day * 365 days/yr * 7 cents/kWh = ~$25 a year extra to run the P4.
      say there are 100,000 P4 users doing this - there goes 2.5 million USD worth of electricity up in heat! :)
      • The problem is, it doesn't use 100 watts more than AMD. The difference at idle is 40 watts.

        The 6800GT in their test-bed sucks more power than any CPU they're testing.

        And who knows how the various motherboards being tested are affecting the measurements. They aren't even from the same maker (Asus vs Abit). Odd that they take a top tier manufacturer for the AMD tests (Asus), and a manufacturer known for shit (Abit) for the Intel.

        Not only that, they have the intel running DDR2 533 vs DDR 400 on the AMD -
        • Odd that they take a top tier manufacturer for the AMD tests (Asus), and a manufacturer known for shit (Abit) for the Intel.

I agree that they should have used motherboards from the same manufacturer, but I also can't see this making a measurable difference. Nothing on any motherboard I've ever used heats up aside from the chipset, so I can't imagine the power drain being significant.

          RAM, CPU, chipset, GPU, hard drives... they heat up. Not the motherboard itself.

          Also, Abit produces excellent qualit
    • Re:Power consumption (Score:2, Informative)

      by hattig ( 47930 )
      AMD have 35W mobile processors at up to 2GHz now based on the Athlon 64.

And leaving your PC on overnight does make a difference. Let's say you leave it on all the time, but only use it 8 hours a day. Intel P4:

16 hours * 150W (idle; 230W if folding) * 7 days * 52 weeks = 870 kWh (1.3 MWh if folding) of energy consumed beyond what you actually use.

      Now I don't know about your electricity prices, but 15c/unit is $130 a year to run the system without any use ($200 when folding). If you have an overnight cheap electricity rate though it won
    • Re:Power consumption (Score:5, Informative)

      by iwadasn ( 742362 ) on Monday October 04, 2004 @03:56PM (#10432543)

      The biggest thing you can do for your electric bill is get rid of your incandescent bulbs, compared to that, nothing else comes close. Replacing a 60 watter with a 20 watt CF will net you 40 watts each, and you get about 1.5x the brightness.

Your average house has something like 10 replaceable bulbs, so that's something like 400 watts -- more than even a couple of large computers.
  • by aldoman ( 670791 ) on Monday October 04, 2004 @01:58PM (#10431139) Homepage
With computers consuming more and more power, it's looking like we'll need a wind turbine or solar PV array for anyone to run a decent-sized network of computers at home.

    Anyone currently doing this? I'm thinking of installing a turbine, but unsure of where to start out.
    • Like most other /.ers, I'll only be interested if you can get the turbine to run linux.
    • You start by building a 50 to 100 foot tower on top of your home so that the turbine will be high enough to be in wind flow without blockage from nearby obstacles (i.e., trees and other houses).

      Research all about windpower here http://www.windpower.org/en/core.htm [windpower.org] as it seems to me to be the bible of wind power.

Then, integrate the power that the turbine generates into your home's power grid. This is a good resource: http://www.homepower.com/ [homepower.com]

      An article that caught my eye in Popular Science pointed t
    • Anyone currently doing this? I'm thinking of installing a turbine, but unsure of where to start out.

      From the sound of things, directly over the cooling fan for your CPU would be the ideal location. :-)

  • by JUSTONEMORELATTE ( 584508 ) on Monday October 04, 2004 @01:59PM (#10431153) Homepage
Can someone enlighten me on this? Is there a reason why SpeedStep and the other power-saving methods used in most laptops can't be adapted to desktop systems?
    The old joke is that all CPUs sleep at the same speed, but after seeing the power consumption graph on this site, it's obvious that "power-hungry CPU" doesn't just mean high heat during gaming. These suckers are hungry even while doing nothing at all.


    --
    Free gmail invites [slashdot.org]
    • by News for nerds ( 448130 ) on Monday October 04, 2004 @02:03PM (#10431206) Homepage
      Cool'n'Quiet [amd.com], baby.
    • As was already posted, AMD has Cool'n'Quiet on the desktop which runs chips at 1GHz using reduced voltage @ 22W.

      Intel is planning something similar for the Prescott before eventually getting the P-M to the desktop now that Tejas has been canned.

      http://www.xbitlabs.com/news/cpu/display/20040602 1 10858.html [xbitlabs.com]
      "The new capabilities Intel plans to include are the so-called AAC technology that adjusts performance depending on load in order to maintain low heat dissipation and quiet operation of personal compu
    • 150 watts at the wall. Part of that is due to inefficiency in the power supply. That "idle" system also contains a graphics card which is continuously flinging data out a port (VGA or DVI) to a monitor to display something. The hard drives may be spinning. DRAM requires periodic refreshes or it loses data. So that "idle" system still has a lot of internal maintenance functions to do. The CPU is the largest single chunk of system power usage, but most of the power goes to other components in the system.

      This
They certainly can be; and they are. It's just that the markets have a different focus. Thus, a few years ago, the engineers spent a disproportionate part of their time on speed. Today, it's a bit more about power, but it will take a few years for that to pan out (reference Intel's dual core strategy for their plans)

      The power saving methods are designed to cut the ACTIVE power use of the chip - the power that is dissipated by the transistors flipping from 0 to 1 (and 1 to 0). The challenge, as we shrink

  • by darkmeridian ( 119044 ) <william.chuangNO@SPAMgmail.com> on Monday October 04, 2004 @01:59PM (#10431156) Homepage
They found that the Prescott P4, with its emphasis on MHz, puts out a lot of heat in spite of its 90 nm architecture. The new 90 nm AMD 64 is cooler and uses less energy than the 130 nm version. Great.

But what about performance? The new 90 nm Pentium M processors, the ones with the funky names, aren't doing as well in terms of performance scalability because of electron leakage issues. Any such concerns here? How fast can the 90 nm Athlon 64 core go before it dies?

  • by Behrooz ( 302401 ) on Monday October 04, 2004 @01:59PM (#10431157)
    As the proud owner of an old-school Duron, my computer isn't a problem. However, living in a bachelor pad which happens to be filled with geeks, we have a cable modem.

    This makes our house faster than our friends' houses. So their computers migrate there also. And the bastards never remember to turn them off...

    Having five or six power-hungry gaming systems around explains much about our recent power bills.
    • by phorm ( 591458 )
      Just as an FYI, I've had a few durons, and they have tended to be among the more power-hungry and hot systems I've run.

Durons really aren't all that efficient... they deliver less performance than an Athlon, sure, but that doesn't mean they draw proportionally less power (they're just less efficient).
    • If I were you, I'd make sure your friends buy you beer/soda and pizza at least periodically... heh. Most of my friends would just do that because they're cool like that, though.
    • Just turn them off yourself... with extreme prejudice!
  • by hattig ( 47930 ) on Monday October 04, 2004 @02:00PM (#10431164) Journal
Buy an Intel Prescott based system if you live in the Arctic Circle ...

Looking at the data in the article, would I be mad in assuming that a 90nm 3500+ uses around 30W in idle mode?

    Assuming the power supply is 75% efficient:

    112W * 0.75 = 84W getting to the system at idle
    179W * 0.75 = 134W getting to the system under load (130nm chip, near its 89W TDP; call the CPU 84W)

    134W - 84W = 50W mobo, gfx, IDE, etc. power consumption

    84W - 50W = 34W at idle
    34W * 0.9 (motherboard VRM efficiency) ≈ 31W

I suppose that the rest of the system's power usage also drops in idle mode, though.

    Yes, these figures are extremely dodgy and vague and aren't worth much more than the speculation they are. It looks like the 3.4GHz P4 uses over 100W under load though - that is shockingly high.
  • What's the issue (Score:3, Informative)

    by Saeed al-Sahaf ( 665390 ) on Monday October 04, 2004 @02:00PM (#10431166) Homepage
    Leaving the PC on overnight is probably not a good idea with these new Pentium 4s

Well, in my book, power consumption is not a huge issue if there is proper cooling. Under normal and even high use conditions, the unit is designed to take the heat, and my server room needs a bit more heat anyway. Why shouldn't I leave it on? My units have good cooling, and since I run my boxes under a normal server configuration, I'm not "overclocking".... Heat? No issue.

  • Wow... (Score:5, Interesting)

    by T3kno ( 51315 ) on Monday October 04, 2004 @02:06PM (#10431261) Homepage
The ambient temperature in his office was 85 degrees F? I'm breaking a sweat at 72F. When the A/C turns off in our office over the weekend, the ambient climbs to about 85 and all of my servers' fans are on overdrive. I wonder if that had anything to do with the power consumption in this test. I'm curious to see what the difference is at a more normal operating temperature, say 69 degrees F.
    • Re:Wow... (Score:2, Informative)

      by Anonymous Coward
      "I'm breaking a sweat at 72F"

      Time to check your weight and/or blood pressure.
    • 69 is a normal operating temperature?

      I have my apartment air conditioning set at 79 degrees F. Granted, I've lived with 100 degree summers all my life, but still...

Running an HVAC system down to 69 degrees seems like a waste of (mains) power.
  • by etaluclac ( 818307 ) on Monday October 04, 2004 @02:06PM (#10431263)
    they'd bill this as a "feature." Buy the processor and we'll bundle the radiator for free. Remember, supplies are limited, so hurry before winter approaches.
  • Overclocking (Score:5, Informative)

    by Sivar ( 316343 ) <charlesnburns[ AT ]gmail DOT com> on Monday October 04, 2004 @02:08PM (#10431293)
    A friend of mine, an overclocking expert (inventor of the "Goldfinger devices" if anyone remembers those) said that the new shrunk cores overclock to around 3GHz if you can get your FSB high enough (though this won't be an issue with the FX chips, which aren't clock-locked).
To those paying attention, a 2.2GHz Athlon 64 can generally outperform a 3.4GHz Pentium 4, so this is a big deal.
  • by mreed911 ( 794582 ) on Monday October 04, 2004 @02:14PM (#10431358)
    The ambient temperature in my office was about 85F/29C,

    The *ambient* temp was 85F? Lord, I'd hate to think how much I'd be sweating in an 85 degree office with limited air movement...

    This magazine writer works at a place that can't afford air conditioning? Or does he have so many computers in there that he's just cooking himself voluntarily?!?

    What *does* roast-geek smell like?
    • This must be the data center for Kathi Lee Gifford's clothing sweatshop.
    • I worked in a lab last summer doing vapor deposition stuff.

      The temp in the building was probably around 72 (it's insane to keep it any lower than that--this was in Huntsville, AL, where summers are hot). However, when the ~15 kW of lab equipment all kicked on (heaters and coolers and microwave generators and computers and vacuum pumps and yada yada yada), it easily got to 85F.
  • 40% (Score:3, Informative)

    by InodoroPereyra ( 514794 ) on Monday October 04, 2004 @02:15PM (#10431377)
The relative difference in power consumption between the 90nm processors (defined as the difference divided by the average) is huge, and pretty consistent: 30% at idle, and then 43%, 45% and 44% for the other tests. These are huge numbers!
  • by 3770 ( 560838 ) on Monday October 04, 2004 @02:35PM (#10431605) Homepage
The P4 system he tested was drawing about 150 watts at idle.

    Now, if you are running an A/C unit then you will not only have to consider the 150W your computer is using, but also the power that your A/C is using to fight the heat that it produces.

    100% of the power used in the PC becomes heat (I think). So that is 150 W of heat. Your A/C, however is not 100% efficient. I really have no idea what the numbers are there. But it can't be more than 100% efficient so that is another 150 Watts (at least)

    So your 150W computer is costing you 300W at the least.

    Now, if you on the other hand live up north, then it looks much better. The heat produced will actually help your heating system, so that it doesn't have to run as much. My physics knowledge is a bit rusty, but I think you can say that if your heating system is based on electricity then it will cost you nothing extra to run your PC.

    Please let me know if/where I'm wrong.
    • I used to think that, but going back over my thermo I've realized that you can get more than 100% efficiency out of a heater. It's true that heat from computers costs you less in the winter (since it helps then and hurts during the summer), but modern heating systems are more efficient than that analysis indicates ... or so I think. I could be wrong.

      The key is the phrase "heat pump".

      Modern heat pumps, I believe, are nothing but refrigerators run backwards. For a simple analysis, consider your kitchen free
      • Hmm... First time i read that I thought for sure that you had just invented a perpetual motion machine.

        But maybe you are right. You don't seem to be violating physics.

        • The trick is that the extra power (above the amount of electricity you use) comes out of the ambient air, causing it to be even more frickin' cold right next to the heat pump.

          There's a limit to the ratio between the electricity you put in and the heat you pump out of the cold place and into the hot place. However, that limit isn't caused by conservation of energy, which is the principle that prevents perpetual motion; it's caused by some thermodynamics stuff that I really ought to go back over.
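The "thermodynamics stuff" is the Carnot limit on a heat pump's coefficient of performance; a rough sketch (the temperatures are made-up example values, not figures from the thread):

```python
# Ideal (Carnot) COP for heating: T_hot / (T_hot - T_cold), in kelvin.
# Real heat pumps achieve only a fraction of this, but it shows why
# "more than 100% efficient" doesn't violate conservation of energy.
def carnot_cop_heating(t_hot_c, t_cold_c):
    t_hot, t_cold = t_hot_c + 273.15, t_cold_c + 273.15
    return t_hot / (t_hot - t_cold)

# Heating a 20C room using 0C outdoor air:
print(round(carnot_cop_heating(20, 0), 1))  # ~14.7
```

The smaller the temperature difference, the higher the ideal COP, which is why the limit comes from thermodynamics rather than from conservation of energy.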
    • 100% of the power used in the PC becomes heat (I think). So that is 150 W of heat. Your A/C, however is not 100% efficient. I really have no idea what the numbers are there. But it can't be more than 100% efficient so that is another 150 Watts (at least)

      I can't remember the name of the measure of effectiveness of an air conditioning (or heat pump) system, but it's not 'efficiency.' There is no thermodynamic law saying that it takes at least 150 watts of input power to your heat transfer machine to reject

    • by NerveGas ( 168686 ) on Monday October 04, 2004 @02:59PM (#10431909)

      The one place where your figures aren't quite right is in the air conditioning department. An air conditioner, being a heat pump, just needs to move the heat from one spot to another, and the "typical" phase-change A/C unit is fairly efficient at it.

      To put some figures on it, an air conditioner with an EER of 12 means that it can move 12,000 BTUs with 1000 watt-hours of electricity.

      Now, 12,000 BTUs is equivalent to 3516 watt-hours of heat. So for every 3,516 watts of heat generation, you'll be expending 1,000 watts to move that heat to the outside of your building. And that's with an EER of 12, some units exist with EERs as high as 17.

      So, for every 150 watts of power your computer is using, figure 40 to 60 watts for your A/C.

On the other hand, were you using a peltier device for cooling, you'd be in bad shape. If the EER figure were applied to them, it would be less than 1. For example, to move 30 watts of heat across a peltier, you'd need to apply approximately 45 watts of power to it - meaning you'd be removing 30 watts from the cold side, but you'd need to remove 75(!) watts from the hot side.

      steve
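The EER arithmetic above condenses to a couple of lines (the EER values of 12 and 17, and the 150 W load, are the comment's figures):

```python
# EER = BTU/h of cooling per watt of input power; 1 W = ~3.412 BTU/h.
BTU_PER_HR_PER_WATT = 3.412

def ac_input_watts(heat_watts, eer):
    """Input power needed to pump `heat_watts` of heat outdoors."""
    return heat_watts * BTU_PER_HR_PER_WATT / eer

print(round(ac_input_watts(150, 12)))  # ~43 W
print(round(ac_input_watts(150, 17)))  # ~30 W
```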

  • The ambient temperature in my office was about 85F/29C

    If the ambient temperature in my office were that high I'd be looking furiously for a lower-power chip, too.

  • Recently, I took some "at-the-wall" measurements of various systems in my computing room. My router is a dual-CPU system, using Pentium 133's. I always thought I was probably being wasteful in the electrical department, but when I measured it, the entire machine, under load, only drew 47 watts of power. I was pretty surprised.

    My file server, a P3/650 with 4x120 gig drives and a 3ware card (running SETI) drew 85 watts. And my primary machine, under full gaming load, drew 270+ watts from the wall *w
  • Poor comparison? (Score:4, Insightful)

    by Epi-man ( 59145 ) on Monday October 04, 2004 @03:08PM (#10432004) Journal
    This seems a poor comparison between the AMD CPUs. Given they have taken a 130 nm chip and underclocked it, that means the chip is capable of higher clock speeds and therefore has "hotter" (from a speed sense) transistors as we used to say at AMD (used to work there). Since the transistors can deliver more current when on (leading to the higher clock speeds), by definition (subthreshold slope is limited by physics to ~60 mV/dec of current) they will "leak" more in the off state than transistors that don't supply so much current (and therefore run slower). I wish they had had equally rated (by AMD) chips to remove this uncertainty, although everyone seems to be focusing on the difference between the Intel and AMD boxes (which opens up a world of concerns....is it the motherboard under load increasing its demand, they have different memory systems which could contribute when stressed, is the PCI-E bus not as efficient as the (assumed) AGP, etc.).
What on earth are you talking about? Just because you can spout some crap that sounds technical doesn't make it remotely true. All CPUs from the same process/family are identical other than minuscule speed and thermal characteristics. Why do you think the low end version of a new process always overclocks so well? It's because the company simply marks down the CPU to a lower rating. That's all TR has done here. The type of characteristics you're talking about are the small differences between any
      • Re:Poor comparison? (Score:3, Informative)

        by Epi-man ( 59145 )
All CPUs from the same process/family are identical other than minuscule speed and thermal characteristics.... It's because the company simply marks down the CPU to a lower rating.

        Yes, they have the same maskset etc. Heck, different speed grades come off of the same wafer. However...these "miniscule speed and thermal characteristics" quickly add up when you have the number of transistors on a CPU, and QA knows what areas of the wafer are better than others, and bin the die accordingly. Believe me, AMD
  • A CNet article [com.com] had specific power details on the 130nm Athlon 64 and 90 nm Pentium 4:

    "The 3800+ chip consumes 91 watts of power at idle, rising to 172 watts under a full load. That compares with 155 watts at idle and 258 watts under a full load for the Pentium 4 560."

    The lower power consumption of the AMD parts arises from their lower clock frequency, as well as from AMD's use of silicon-on-insulator technology [com.com].

    It's not clear if Cool n Quiet was used, but it shows the much better power utilization in e
