PC Power Management, ACPI Explained In Detail 133
DK writes "Computer performance has increased steadily in recent years, and unfortunately so has power consumption. An ultimate gaming system equipped with a quad-core processor, two NVIDIA GeForce 8800 Ultra cards, 4 sticks of DDR2 memory, and a few hard drives can easily consume 500W without doing anything! To reduce power wastage, the industry standards APM and ACPI were developed to make our computers work more efficiently. ACPI is the successor to APM and is explained in detail in this article."
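For readers who want the quick version before diving into the article: ACPI defines global sleep states S0 through S5 (S2 is rarely implemented in practice). The mapping below is a minimal sketch; the state names follow the ACPI specification, but the per-state wattages are illustrative assumptions, not measurements:

```python
# ACPI global sleep states (S-states) with rough, ASSUMED idle draws
# for a mid-range desktop. The wattages are illustrative guesses only.
ACPI_S_STATES = {
    "S0": ("working", 150),          # fully on, idle at the desktop
    "S1": ("power-on suspend", 80),  # CPU halted, RAM still refreshed
    "S3": ("suspend to RAM", 5),     # only RAM (and wake logic) powered
    "S4": ("suspend to disk", 2),    # hibernate; state saved to disk
    "S5": ("soft off", 1),           # only wake circuitry powered
}

def estimated_draw(state: str) -> int:
    """Return the assumed idle draw in watts for an ACPI S-state."""
    return ACPI_S_STATES[state][1]
```

On Linux, the states your firmware actually supports are listed in /sys/power/state.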
500W? (Score:4, Interesting)
Is that why people don't blink at PS3s and X360s that eat 150-200W when they're idle? I guess that locks me and my 100W/system power budget out of gaming . . .
Seriously, what is it that uses up so much power? I've got a pretty standard dual-core system that idles at about 65W, and I can't push it beyond 150W even when I try.
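To see how a rig like the one in the summary could reach 500W at idle, it helps to add up per-component draws. The figures below are ballpark guesses for illustration, not datasheet numbers:

```python
# ASSUMED per-component idle draws (watts) for the gaming rig described
# in the summary. All numbers are rough estimates for illustration.
components = {
    "quad-core CPU": 90,
    "GeForce 8800 Ultra #1": 150,
    "GeForce 8800 Ultra #2": 150,
    "4x DDR2 DIMMs": 20,
    "hard drives (3x)": 25,
    "motherboard + fans": 40,
}

total = sum(components.values())
print(f"Estimated idle draw: {total} W")  # well past 400 W before PSU losses
```

With a power supply that is, say, 80% efficient, the draw at the wall is higher still, which is how a dual-GPU box can plausibly hit 500W doing nothing.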
Re:OS (Score:5, Interesting)
Of course, once you turn on your entertainment system, the power consumption (taking the above example) can easily jump to 7 GW even with fairly conservative systems. Now try the same simple maths with your fridge, microwave oven, oven clock (in fact, any clock), and anything else that consumes power in standby. Add in lights, even low-wattage ones, and your hot water heater (assuming electric off-peak rather than gas or solar), and the total consumption is massive. As for PCs and laptops, consumption depends on what you have and can vary from 20W to over 1000W. It is possible to put a laptop into standby or sleep mode, but that depends on whether you are using it as a standalone machine.
So what are we going to do about all that wastage? Well, if you pay for your electricity and you want convenience, then absolutely nothing; and that is exactly what most people will do.
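For anyone who does pay for their electricity, the cost of leaving a box running is a one-liner to estimate. The $0.12/kWh tariff below is an assumption; substitute your own rate:

```python
def annual_cost(watts: float, hours_per_day: float,
                price_per_kwh: float = 0.12) -> float:
    """Annual electricity cost in dollars; price_per_kwh is an assumed tariff."""
    kwh = watts / 1000 * hours_per_day * 365
    return kwh * price_per_kwh

# A 500 W box left on around the clock:
print(round(annual_cost(500, 24), 2))  # 4380 kWh/year at $0.12/kWh
```

At that assumed tariff, the 500W gaming rig from the summary costs over $500 a year just idling, which puts "absolutely nothing" in perspective.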
Re:OS (Score:3, Interesting)
OTOH, there might be something to what you are saying. I think a fast drive has a lot to do with how quickly Vista boots.
Re:500W? (Score:3, Interesting)
I used a Kill-A-Watt meter to measure power usage on my two computers. My main computer, less than two years old, is a single-core AMD-64 3800+ with 1 GB of RAM, two hard drives, an 83%-efficient power supply, a fanless water-cooled CPU, and a 20-inch flat-panel monitor, running Kubuntu Linux. The monitor uses 40 Watts and the rest of the computer uses about 94 Watts most of the time. In sleep mode the monitor uses only about 1 Watt. Under heavy use the CPU power draw is much higher. I don't like noise, so I chose a graphics card that does not require a fan and which probably does not use very much power.
I also have a second computer hooked to the same keyboard, monitor and mouse through a KVM switch. It is an AOpen Mini PC with an Intel Core 2 Duo T5600 1.83GHz CPU and 2 GB of RAM, running Windows XP Professional. It uses 23 Watts most of the time, but more under heavy usage. The 20-inch flat-panel monitor uses an additional 40 Watts, or just 1 Watt in sleep mode. Occasionally I run both computers at the same time, and with just one monitor, keyboard and mouse I can switch back and forth between them in about a second. Even when I run both at once, I am not using an unreasonable amount of power.
I am not a gamer, and for what I do both computers meet my needs very nicely. The AMD-64 machine running Kubuntu is my main computer. I haven't measured the power usage in all the different sleep modes, so my information is somewhat incomplete. With the monitor in its 1 Watt sleep mode, I can leave the computer on most of the day without feeling like I am wasting an unreasonable amount of power. To me, 500 Watts sounds way too wasteful.
Kill-A-Watt meter [thinkgeek.com]
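Using the idle figures measured above (94 W for the AMD-64 box, 23 W for the Mini PC, monitor excluded) and an assumed eight hours of use per day, the annual consumption works out as follows:

```python
HOURS_PER_DAY = 8   # assumed usage pattern, not from the measurements
DAYS = 365

def annual_kwh(watts: float) -> float:
    """Annual energy use in kWh for a constant draw over the assumed schedule."""
    return watts * HOURS_PER_DAY * DAYS / 1000

amd64_kwh = annual_kwh(94)   # main Kubuntu box, measured idle draw
mini_kwh = annual_kwh(23)    # AOpen Mini PC, measured idle draw
print(f"AMD-64: {amd64_kwh:.0f} kWh/year, Mini PC: {mini_kwh:.0f} kWh/year")
```

Even under this assumed schedule, the Mini PC uses roughly a quarter of the energy of the AMD-64 box, which is why low-idle-power machines matter more than peak ratings for most users.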