Curbing Energy Use In Appliances That Are Off

KarmaOverDogma writes "The New York Times has an interesting piece on the slow but steady movement to reduce the power drain of appliances that are never truly off when they are powered down. In the typical house that drain is enough to light a 100-watt light bulb 24/7, according to Lawrence Berkeley National Laboratory, a research arm of the Energy Department. In the United States alone, over $1 billion per year is spent powering devices such as TVs, VCRs, computers, and chargers while they are 'off.' Energy experts call these devices 'vampires' and 'wall warts,' and there is growing support for their recommendation to adopt industry-wide standards, which would require manufacturers to build appliances with significantly lower consumption when not in use."
This discussion has been archived. No new comments can be posted.

  • Surge Protectors (Score:2, Informative)

    by Rinnt ( 917105 ) on Friday November 18, 2005 @10:57PM (#14068418)
    What about using surge protectors to make sure your stuff is "off"? That's what I use for my whole network - okay, so it's only two computers. But still, everything runs to a master switch. When stuff is done for the day I hit the kill switch... I would say this cuts the power to the devices since my LAN link lights all go dead.
  • Re:Kill A Watt (Score:3, Informative)

    by pla ( 258480 ) on Friday November 18, 2005 @11:13PM (#14068486) Journal
    I wish I could find some that lit to near full brightness in a few seconds instead of the 15-30 they take to warm up.

    In this case, "you get what you pay for".

    I have all CF bulbs in my house, and have noticed that the $5 three-packs from WallyWorld or Home Depot tend to take a second to start and then a long time to warm up, while the $7-each ones come on at full brightness just as fast as an incandescent.

    Personally, I'll deal with the 30-second delay. ;-)
  • Re:Meter (Score:3, Informative)

    by max_entropy99 ( 638262 ) on Friday November 18, 2005 @11:20PM (#14068512) just happens to have such a device: []
  • by Latent Heat ( 558884 ) on Friday November 18, 2005 @11:33PM (#14068565)
    Appliances of any kind are typically rated in volt-amps, which tells you the worst-case current they draw at a given voltage, but not the power factor you would need to convert |S| (volt-amps) into P (watts). Appliance rating plates are meant to tell you how much current the device draws, and hence how big a circuit breaker or fuse you need -- they are not energy ratings.

    My own house idles at about 45 watts. The furnace alone has a microprocessor in it that takes a good 16 watts. Each GFI (ground fault interrupter circuit breaker that prevents you from getting shocked in wet places like the kitchen, bathroom, and outdoors) takes up a watt, but you can eliminate that draw by leaving them "popped." I have three motion detector lights -- they save energy, but they take about 2-3 watts each when the lights are off. The garage door opener has a radio receiver that draws about 4 watts. We have a remote control TV that takes 6 watts. Phone answering machines are good for about 4-5 watts. Oh, and a PCI motherboard (it is always "on" when the computer is plugged in) is good for about 4 watts -- I have mine on a "power center", but I can't get my wife to put her computer on one.

    I know these figures either from a power meter that the local utility loans out through the public library, or by counting turns of the disk on the outside electric meter. If your meter says 7.2 on it, that means 7.2 watt-hours for every turn of the disk. If it takes 10 minutes to make one full turn with everything turned off, the disk makes 3600/(10x60) = 6 turns per hour, so the house is using 6 x 7.2 = 43.2 watts. Instead of standing outside waiting out a slow turn, you can switch on a light of known wattage inside to bias the reading higher so the disk turns faster. Also, you have to time a complete turn because there is runout in the meter rotor -- it goes faster and slower over different parts of a turn, but it is calibrated to read to better than 1 percent over a complete turn.
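    The disk-counting arithmetic above can be sketched in a few lines of Python (the 7.2 Wh/rev constant and the 10-minute timing are the figures from the comment; this is a rough household estimate, not utility-grade metering):

    ```python
    # Estimate household load from the spinning disk on an analog electric meter.
    # The "Kh" constant printed on the meter face is watt-hours per full
    # revolution of the disk.

    def watts_from_meter(kh_wh_per_rev, seconds_per_rev):
        """Average power in watts, given Kh and the time for one disk revolution."""
        revs_per_hour = 3600.0 / seconds_per_rev
        return kh_wh_per_rev * revs_per_hour

    # Figures from the comment: Kh = 7.2, one revolution takes 10 minutes.
    print(watts_from_meter(7.2, 10 * 60))  # ~43.2 W
    ```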

  • Re:$4 a person? (Score:5, Informative)

    by rtaylor ( 70602 ) on Friday November 18, 2005 @11:39PM (#14068590) Homepage
    Ahem.. That is $4 per person per year for the TV and VCR only (two devices).

    Microwave, washer, dryer, printer, phone, monitors, lamps, battery chargers (cell phone, laptop, etc.), cradles, etc. also take energy when in standby mode -- or what most people call off.

    They list 1,000 kWh per year per household, so at 7 cents per kWh that works out to closer to $70 per year. If it adds between $0.50 and $1 to the manufacturing cost to cut that draw by 50%, it would probably be a net win for most devices that stay plugged in for more than 6 months.
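    The cost arithmetic works out as the comment says; a quick sketch using the comment's own figures (1,000 kWh/year of standby load at 7 cents/kWh):

    ```python
    # Annual cost of standby ("vampire") load per household.
    standby_kwh_per_year = 1000   # figure cited in the comment
    price_per_kwh = 0.07          # 7 cents per kWh

    annual_cost = standby_kwh_per_year * price_per_kwh
    print(f"${annual_cost:.2f} per year")  # $70.00 per year
    ```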
  • Re:here ye! (Score:3, Informative)

    by NCraig ( 773500 ) on Friday November 18, 2005 @11:41PM (#14068598)
    I've also heard that it's more efficient to leave your computer on all the time, because the cost in wear and tear from power cycling is more than the power saved by shutting it down. Anyone hear of this rumor or know anything about it?
    The argument is that thermal stress from turning the cooled-down PC on wears components out. I've seen many arguments for and against leaving a computer on all the time. This page [] details a few of them.

    Interestingly enough, according to the web page it is more important to turn off the monitor than the PC.
  • by bluGill ( 862 ) on Friday November 18, 2005 @11:52PM (#14068660)

    If you live in southern California this is a good idea, with paybacks in as little as 4 years (including government subsidies). If you live in MN like I do, you are looking at a 30-year payback if all goes well -- which is longer than many roofs last. If you shovel the roof you might do better, but that is both dangerous (don't fall off the roof) and harmful to the panels (which tend to be easily damaged when walked on).

    If you live in an area with a lot of sun you are stupid not to investigate this. But many people live in climates where the panels never pay off.

  • My Canon is good (Score:3, Informative)

    by YesIAmAScript ( 886271 ) on Saturday November 19, 2005 @12:18AM (#14068792)
    Laser printers take a lot of power when standing by, like copy machines. They keep the innards partially warmed up for fast response.

    But my Canon inkjet (Pixma 8500) is fantastic. On the Kill-A-Watt you can see that within a minute of printing, it drops to less than 1 W consumption. Measuring it over time (the Kill-A-Watt doesn't resolve less than 1 W instantaneously) says it takes about 150 mW in this standby mode. That's great. Presumably "off" takes even less. I have mine set to turn off after 20 minutes. Honestly though, I bet it just turns off the LED (30 mW?) and remains in the same state otherwise, because it wakes from off very fast (and without touching it, it does so over USB).

    Of course, the inks cost a lot compared to toner... (Cheaper than most though)
  • by saskboy ( 600063 ) on Saturday November 19, 2005 @12:35AM (#14068872) Homepage Journal
    I got a P3 for my Dad, and have since borrowed it to meter nearly everything in my house just for fun. [Yeah I said fun, this is Slashdot and if I consider plugging things in to test for Wattage use as fun, that's fine.] I got the meter from eBay, it was about $30.

    Here are some of my results:
    Air conditioner wall unit: 3.12 kWh over 2 hours 17 minutes; 1300 W when running.

    Fridge from the 1970s, about 126W when running.

    Microwave from 1980, 888W when running

    Clock Radio from 1986, with the radio on and volume low, 0W measured.

    Computer (1800+ AMD, 3 IDE hard drives, Radeon AIW 8500DV) plus speakers, monitor, modems, Sony VCR, 13" TV, and UPS, all typically used, with the computer running 24/7:
    approximately 185 W
    214 hours: 38.62 kWh
    1083 hours: 188 kWh
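    As a sanity check, the cumulative kWh readings above can be converted back into an average draw (the numbers are the poster's own; both come out close to the quoted 185 W):

    ```python
    # Average power implied by a cumulative energy reading.
    def avg_watts(kwh, hours):
        """Average draw in watts for kwh consumed over the given hours."""
        return kwh * 1000.0 / hours

    print(round(avg_watts(38.62, 214)))   # ~180 W over the first 214 hours
    print(round(avg_watts(188.0, 1083)))  # ~174 W over 1083 hours
    ```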
  • by YesIAmAScript ( 886271 ) on Saturday November 19, 2005 @01:05AM (#14068994)
    current x potential (voltage) x power factor = power (Watts)

    You cannot measure power factor with a multimeter. A Kill-A-Watt measures both VA (what you measured) and power (Watts), and since it knows both, the power factor too.

    Additionally, with your setup, a device with a large surge current might blow your meter, or you might hurt yourself. Better to use an inductive current clamp (around only one conductor -- you cannot pass the entire power cord through it, since the line and neutral currents cancel), because a clamp cannot overload in any way that causes danger or costs money to repair.

    But really, get a Kill-A-Watt. It can measure the power consumption of a device over time. For example, it'll tell me how much power my computer takes in a week of use. It does this by measuring and integrating the power usage over time. For example, I've had my PC plugged into the Kill-A-Watt for 650 hours and it has used 15.25 kWh in that time. A month has about 740 hours in it, so I use about 17.4 kWh/month for my PC. It is on for some of that time, off for some, and in standby for most of it.

    There's no other way to measure how much power your fridge uses in a week, unless you want to measure standby power (including power factor), compressor on power (including power factor), and then manually record each time your fridge turns on and for how long.
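    The monthly extrapolation described above is just a ratio of the accumulated reading; a sketch using the poster's figures (15.25 kWh over 650 hours, ~740 hours in a month):

    ```python
    # Extrapolate a cumulative Kill-A-Watt energy reading to a monthly figure.
    def monthly_kwh(kwh_so_far, hours_so_far, hours_per_month=740):
        """Scale the measured kWh by the ratio of a month to the measured period."""
        return kwh_so_far / hours_so_far * hours_per_month

    print(round(monthly_kwh(15.25, 650), 1))  # ~17.4 kWh/month
    ```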
  • by TheRaven64 ( 641858 ) on Saturday November 19, 2005 @06:43AM (#14069797) Journal
    The UK socket design is much safer than its US counterpart. For one thing, there is a requirement, both legal and mechanical, that every appliance be earthed. The live and neutral holes are protected by shutters which are only released once the earth pin is inserted -- this also helps protect children, since it is impossible to stick anything into the live hole without first opening the shutters by inserting something into the earth hole. Contrast this with the US design, where the holes are unprotected and about the right shape for a knife or a fork handle to go in.

    Finally, on all modern UK plugs only the tips of the live and neutral pins are exposed metal, while the half nearest the plug body is sheathed in plastic, so if the plug slips out a little way (much harder than with the US design), it does not expose a piece of live metal anywhere easy to touch.

    Oh, and your original point was about voltage. Perhaps you have never encountered the saying 'It's the volts that jolts, but the mills that kills' - high current is far more dangerous than high voltage, and low voltage requires you to draw more current to get the same power.

    Straying even further off topic, the only sockets allowed in bathrooms in the UK are special low-drain ones with a different shape for driving electric shavers. I noticed on my last visit to the US that it is relatively common to have standard sockets in bathrooms there.

  • by Anonymous Coward on Saturday November 19, 2005 @09:04AM (#14070091)
    Twenty VARs maybe, but twenty watts, no way!
  • Re:bad editing (Score:3, Informative)

    by benjamindees ( 441808 ) on Saturday November 19, 2005 @09:11AM (#14070114) Homepage
    why even bother mixing up the concepts of power and energy?

    I agree. I've been thinking writers should just do away with the Watt altogether. The problem is that people tend to think a "Watt" is a unit of energy, rather than power. It's normal to hear "Watt" and think it's a thing you can hold in your hand, and that an appliance should use a "Watt/second", even though that's ludicrous.

    The standard unit of power should be: "joule/second (Watt)". And the unit of energy, anything previously measured in kWh or BTU, should just be changed to megajoules.

    Sometimes I think that the whole confusion is a conspiracy to make units intentionally incomprehensible.
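    For what it's worth, the conversions the poster proposes are simple: 1 kWh is exactly 3.6 MJ, and a BTU is about 1055 J. A sketch:

    ```python
    # Convert the common energy units in this thread to megajoules.
    def kwh_to_mj(kwh):
        """1 kWh = 1000 W x 3600 s = 3.6e6 J = 3.6 MJ."""
        return kwh * 3.6

    def btu_to_mj(btu):
        """1 BTU (international table) is about 1055.06 J."""
        return btu * 1055.06e-6

    print(round(kwh_to_mj(1000)))     # 3600 MJ (the article's annual standby figure)
    print(round(btu_to_mj(1e6), 1))   # ~1055.1 MJ per million BTU
    ```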

"No, no, I don't mind being called the smartest man in the world. I just wish it wasn't this one." -- Adrian Veidt/Ozymandias, WATCHMEN