Curbing Energy Use In Appliances That Are Off
KarmaOverDogma writes "The New York Times has an interesting piece on the slow but steady movement to reduce the power drain of appliances that are never truly off when they are powered down. In the typical house that drain is enough to light a 100-watt light bulb 24/7, according to Lawrence Berkeley National Laboratory, a research arm of the Energy Department. In the United States alone, over $1 billion per year is spent powering devices such as TVs, VCRs, computers, and chargers while they are 'off.' These devices, called 'vampires' and 'wall warts' by energy experts, have prompted growing support for industry-wide standards that would require manufacturers to build appliances with significantly lower consumption when not in use."
Surge Protectors (Score:2, Informative)
Re:Kill A Watt (Score:3, Informative)
In this case, "you get what you pay for".
I have all CF bulbs in my house, and have noticed that the $5/3-packs from WallyWorld or Home Depot tend to take a second to start and then a long time to warm up, while the $7-each ones come on at full brightness just as fast as an incandescent.
Personally, I'll deal with the 30-second delay.
Re:Meter (Score:3, Informative)
Measuring wall-wart power usage. (Score:5, Informative)
My own house runs about 45 watts. The furnace alone has a microprocessor in it that takes a good 16 watts. Each GFI (ground-fault interrupter, the circuit breaker that prevents you from getting shocked in wet places like the kitchen, bathroom, or outdoors) takes about a watt, but you can eliminate that draw by leaving them "popped." I have three motion detector lights; they save energy, but they take about 2-3 watts each when the lights are off. The garage door opener has a radio receiver that draws about 4 watts. We have a remote control TV that takes 6 watts. Phone answering machines are good for about 4-5 watts. Oh, and a PC motherboard (it is always partly "on" when the computer is plugged in) is good for about 4 watts. I have mine on a "power center," but I can't get my wife to put her computer on one.
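The loads listed above can be tallied to reproduce the ~45 W figure. This is just a sketch: the dictionary keys and per-device wattages are taken from the comment, with multi-unit entries (GFIs, motion detectors) pre-multiplied using mid-range estimates.

```python
# Standby loads from the comment above; multi-unit entries are
# summed using approximate mid-range values.
standby_loads_w = {
    "furnace microprocessor": 16,
    "GFI breakers (3 @ ~1 W)": 3,
    "motion detector lights (3 @ ~2.5 W)": 7.5,
    "garage door opener receiver": 4,
    "remote control TV": 6,
    "answering machine": 4.5,
    "PC motherboard (soft-off)": 4,
}

total_w = sum(standby_loads_w.values())
print(f"Total standby draw: {total_w:.1f} W")  # roughly the 45 W observed
```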
I know this info either from a power meter that the local utility loans out through the public library, or from counting turns on the outside electric meter. If your meter says 7.2 on it, that means 7.2 watt-hours for every turn of the disk. If it takes 10 minutes to make one full turn with everything turned off, that is 3600/(10x60) = 6 turns per hour, so the house is using 6 x 7.2 = 43.2 watts. Instead of standing outside waiting for a full turn, you can turn on a light of known wattage inside to bias the reading higher so the meter turns faster. Also, you have to time a complete turn, because there is runout in the meter rotor: it goes faster and slower over different parts of a turn, but it is calibrated to read to better than 1 percent over a complete turn.
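The meter-turn arithmetic above can be written as a tiny helper. The function name is my own; the 7.2 Wh/turn constant and 10-minute turn come from the comment.

```python
def watts_from_meter(kh_wh_per_rev: float, seconds_per_rev: float) -> float:
    """Average power from a spinning-disk meter.

    kh_wh_per_rev: the Kh marking on the meter face, watt-hours per turn.
    seconds_per_rev: time for one complete turn (a full turn averages
    out the rotor runout mentioned above).
    """
    revs_per_hour = 3600.0 / seconds_per_rev
    return kh_wh_per_rev * revs_per_hour

# Kh = 7.2, one turn takes 10 minutes (600 s):
print(round(watts_from_meter(7.2, 600), 1))  # 43.2
```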
Re:$4 a person? (Score:5, Informative)
Microwave, washer, dryer, printer, phone, monitors, lamps, battery chargers (cell phone, laptop, etc.), cradles, etc. also take energy when in standby mode -- or what most people call off.
They list 1000 kWh per year per household, so at 7 cents per kWh that works out to closer to $70 per year. If it adds between $0.50 and $1 to the manufacturing cost to reduce that by 50%, it would probably be a net win for most devices plugged in for more than 6 months.
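The cost arithmetic above, made explicit (the figures are the ones cited in the comment, not independent data):

```python
standby_kwh_per_year = 1000.0   # cited household standby consumption
rate_per_kwh = 0.07             # $/kWh, the 7-cent figure above

annual_cost = standby_kwh_per_year * rate_per_kwh
annual_saving = annual_cost * 0.50  # if standby draw were cut in half

print(f"Annual standby cost: ${annual_cost:.2f}")   # $70.00
print(f"Household saving at -50%: ${annual_saving:.2f}")  # $35.00
```

Note the saving is per household, spread across many devices, so the per-device payback depends on how many standby loads share it.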
Re:here ye! (Score:3, Informative)
Interestingly enough, according to the web page it is more important to turn off the monitor than the PC.
Depends on where you live... (Score:5, Informative)
If you live in southern California this is a good idea, with paybacks in as little as 4 years (including government subsidies). If you live in MN like I do, you are looking at a 30-year payback if all goes well, which is longer than many roofs last. If you shovel the roof you might do better, but that is both dangerous (don't fall off the roof) and harmful to the panels, which tend to be easily damaged when walked on.
If you live in an area with a lot of sun you are stupid not to investigate this. Many people, though, live in climates where the panels never pay off.
My Canon is good (Score:3, Informative)
But my Canon inkjet (Pixma 8500) is fantastic. On the Kill-A-Watt you can see that within a minute of printing, it drops to less than 1 W consumption. Measuring it over time (the Kill-A-Watt can't resolve less than 1 W instantaneously) shows it takes about 150 mW in this standby mode. That's great. Off presumably takes even less. I have mine set to turn off after 20 minutes. Honestly, though, I bet it just turns off the LED (30 mW?) and remains in the same state otherwise, because it wakes from off very fast (and it does so over USB, without touching it).
Of course, the inks cost a lot compared to toner... (Cheaper than most though)
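The measure-over-time trick mentioned above is just energy divided by time; here is the 150 mW standby figure integrated over an average month (an illustrative calculation, not a claim from the comment):

```python
standby_w = 0.150        # ~150 mW printer standby draw, as measured above
hours_per_month = 730    # average hours in a month

# Energy accumulated over the month, in kWh: the same integration a
# Kill-A-Watt does internally when the instantaneous draw is below
# its 1 W display resolution.
kwh_per_month = standby_w * hours_per_month / 1000.0
print(f"{kwh_per_month:.2f} kWh/month")  # about a tenth of a kWh
```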
Re:Meter Kill A Watt P3 results (Score:4, Informative)
Here are some of my results:
Air conditioner (wall unit): 3.12 kWh over 2 hours 17 minutes; 1300 W when running.
Fridge from the 1970s: about 126 W when running.
Microwave from 1980: 888 W when running.
Clock radio from 1986 (radio on, volume low): 0 W measured.
Computer (AMD 1800+, 3 IDE hard drives, Radeon AIW 8500DV): approximately 185 W; 38.62 kWh over 214 hours; 188 kWh over 1083 hours.
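The cumulative kWh readings above imply an average draw, which is a useful cross-check on the instantaneous figure. A minimal sketch (function name is mine; the numbers are the ones reported above):

```python
def avg_watts(kwh: float, hours: float) -> float:
    """Average draw implied by a cumulative kWh reading over a period."""
    return kwh * 1000.0 / hours

# The computer's two logged periods:
print(round(avg_watts(38.62, 214), 1))   # 180.5
print(round(avg_watts(188.0, 1083), 1))  # 173.6
```

Both come out a little under the ~185 W spot reading, which is what you'd expect once idle and standby time are averaged in.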
current x voltage isn't usually watts (Score:3, Informative)
You cannot measure power factor with a multimeter. A Kill-A-Watt measures both VA (what you measured) and real power (watts), and since it knows both, it can report power factor too.
Additionally, with your setup, a device with a large surge current might blow your meter, or you might hurt yourself. Better to use an inductive current clamp (around only one conductor; you cannot pass the entire power cord, with line, neutral, and ground together, through it, because the fields cancel), since it cannot overload in any way that causes danger or costs money to repair.
But really, get a Kill-A-Watt. It can measure the power consumption of a device over time. For example, it'll tell me how much power my computer takes in a week of use, by measuring and integrating the power usage over time. I've had my PC plugged into the Kill-A-Watt for 650 hours and it has used 15.25 kWh in that time. A month has about 740 hours in it, so I use about 17.4 kWh/month for my PC. It is on for some of that time, off for some, and in standby for most of it.
There's no other way to measure how much power your fridge uses in a week, unless you want to measure standby power (including power factor), compressor on power (including power factor), and then manually record each time your fridge turns on and for how long.
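The VA-versus-watts distinction above is just multiplication: volts times amps gives apparent power, and real power is lower by the power factor. The values here are illustrative, not measurements from the thread:

```python
# Real vs. apparent power. A multimeter gives you volts and amps;
# their product is apparent power (VA). Real power (W), which is
# what a Kill-A-Watt reports and what you're billed for, is lower
# by the power factor.
volts = 120.0
amps = 0.9
power_factor = 0.65  # illustrative; typical of an uncorrected supply

apparent_va = volts * amps           # what V x A gives you
real_w = apparent_va * power_factor  # actual power consumed

print(f"{apparent_va:.0f} VA apparent, {real_w:.1f} W real")
```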
Re:No Rest for the Wicked... (Score:3, Informative)
Finally, on all modern UK plugs the tips of the live and neutral pins are exposed, while the half nearest the plug body is sheathed in plastic, so if the plug slips out a little way (much harder to happen than with the US design), it does not expose live metal anywhere easy to touch.
Oh, and your original point was about voltage. Perhaps you have never encountered the saying 'It's the volts that jolts, but the mills that kills' ('mills' being milliamps): high current is far more dangerous than high voltage, and low voltage requires you to draw more current to get the same power.
Straying even further off topic, the only sockets allowed in bathrooms in the UK are special low-drain ones with a different shape for driving electric shavers. I noticed on my last visit to the US that it is relatively common to have standard sockets in bathrooms there.
Re:Tilting at windmills (Score:1, Informative)
Re:bad editing (Score:3, Informative)
I agree. I've been thinking writers should just do away with the Watt altogether. The problem is that people tend to think a "Watt" is a unit of energy, rather than power. It's normal to hear "Watt" and think it's a thing you can hold in your hand, and that an appliance should use a "Watt/second", even though that's ludicrous.
The standard unit of power should be: "joule/second (Watt)". And the unit of energy, anything previously measured in kWh or BTU, should just be changed to megajoules.
Sometimes I think that the whole confusion is a conspiracy to make units intentionally incomprehensible.
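The kWh-to-megajoules change proposed above is a fixed conversion, since 1 kWh is exactly 3.6 MJ. A minimal sketch (function name is mine):

```python
MJ_PER_KWH = 3.6  # 1 kWh = 1000 W x 3600 s = 3.6e6 J, exactly

def kwh_to_mj(kwh: float) -> float:
    """Convert energy in kilowatt-hours to megajoules."""
    return kwh * MJ_PER_KWH

# The ~1000 kWh/year of household standby cited earlier in the thread:
print(kwh_to_mj(1000))  # 3600.0
```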