The Insatiable Power Hunger of Home Electronics

An anonymous reader writes "A Wall Street Journal columnist recently got his hands on a power meter and decided to write about his findings, the resulting article being discussed here on Slashdot. That author concluded that gadgets are getting a bad rap, and are relatively insignificant power consumers in the grand scheme of things. A rebuttal has appeared, arguing that not only are modern electronics already significant power consumers, but while everything else is becoming more efficient, home electronics seem to be getting worse. This echoes the Department of Energy's assertion that 'Electricity consumption for home electronics, particularly for color TVs and computer equipment, is also forecast to grow significantly over the next two decades.' Are gadgets unfairly maligned, or getting an unearned pardon?"
  • by crc32 ( 133399 ) <colin@crc3[ ]om ['2.c' in gap]> on Thursday December 28, 2006 @08:14AM (#17386464) Homepage
    In general, an LCD TV is 2x more energy efficient than a CRT. Modern dual-core processors are more energy efficient than older processors. However, as with all gains in efficiency, we're using MANY more of them. That's just what happens.
    • by Aladrin ( 926209 )
      I just thought I'd note that the parent DOES actually mean '2x more' and not '2x as'. This is rare these days, and I think it should be marked on a calendar or something.

      http://answers.google.com/answers/threadview?id=1082 [google.com]

      According to this link, a CRT uses 3x as much electricity in a year as an LCD. Which is, of course, the same as saying '2x more'.
    • Something central to human psychology. The more we have of something, the more we use. It's why supply and demand works, why scarce things are valuable.
       
    • Modern dual-core processors are more energy efficient than older processors.

      This is irrelevant, as long as the new devices still consume more power than their older counterparts.

      How does the power draw of a machine with, for example, a Pentium III compare with that of a Core 2 Duo machine?

      Yes, they are getting more efficient, but this isn't enough. They need to consume less power than previous versions for the gains to actually mean anything.
  • It's regional (Score:4, Interesting)

    by FST ( 766202 ) on Thursday December 28, 2006 @08:17AM (#17386492) Journal
    I think it's more regional than anything else. The current definition of National household electricity consumption is, in effect, an average of household electricity consumption in different regions across the United States and is affected by many factors. However, hot summers increase the amount of electricity used for air conditioning and other space cooling, so households in southern States will tend to use more electricity. Similarly, cold winters increase the amount of energy used for space heating. Although U.S. households more frequently rely on natural gas than on electricity for heating, in the South the reverse is true, meaning that households in southern States will tend to have a peak of electricity use in winter as well as in summer.

    Humidity is another climate-related factor that affects electricity consumption. Households in more humid regions tend to use air-conditioners and dehumidifiers to remove humidity. Households in arid regions, such as the Mountain States, are able to use evaporative coolers instead of air-conditioning for space cooling.
  • by thc69 ( 98798 ) on Thursday December 28, 2006 @08:19AM (#17386510) Homepage Journal
    particularly for color TVs and computer equipment
    Oh good, all my black and white TVs and computer equipment are okay...
  • by Noryungi ( 70322 ) on Thursday December 28, 2006 @08:20AM (#17386520) Homepage Journal
    So, who is right? The WSJ or the article referenced? Actually both.

    The article referenced talks about the trends for energy consumption. And, in that respect, the consumer electronics win hands down, since more and more people buy computers, flat-screen TVs and assorted electronic gadgets. On the other hand, the WSJ is right, since the overall energy consumption of these gadgets is still a very small fraction of the total.

    One thing that I'd personally like to do soon would be to compare the electricity used by all my computers (6 and counting, including a big Sun workstation, 3 laptops, a modem/router, a wireless access point, a laser printer, etc) vs the overall electricity usage in my home. I have relatively modern equipment, and I am currently switching everything to low-power equipment.
  • by tacocat ( 527354 ) <tallison1 AT twmi DOT rr DOT com> on Thursday December 28, 2006 @08:33AM (#17386638)

    Sure, they might run an instant-on feature that draws a little current 24x7 so they can do a warm start. Or a clock.

    Chase down the Off-Grid living web sites and you'll soon find that one of the biggest problems people have when they first try to do off grid is all their appliances that drain just a little power all night long, leaving insufficient power for the morning routines.

    I have three digital clocks in my kitchen, two in my entertainment center... I don't own a watch anymore because I realized that there is no place except the bathroom that I can stand in my house and not see a clock face. And I don't even own any standalone clocks!

    The need for everything to have a digital clock and instant-on takes up a lot more power than you think. Turn everything off and go look at your meter. It's still chugging along rather nicely. We could do much better if we dropped the clocks and dropped the instant-on. Tube televisions took minutes to warm up. Solid-state televisions take a few seconds to warm up. Instant-on only saves me 3 seconds at most.

    • Re: (Score:3, Interesting)

      by Xugumad ( 39311 )
      What particularly bugs me is that when I bought a new LCD TV last year, I discovered it had no power switch. It has a standby button, but the only way of turning it off is at the wall/powerstrip. On a related note, decidedly unhappy with the Wii's 24-hour on mode; I'd be more accepting if it wasn't required for things like Mii transfer to work, but there's no way of telling it to do network maintenance when it's first turned on each day.
      • My TV's even worse. Not only does it not have an Off switch, but it doesn't store its settings in Flash anywhere. So if I *do* unplug it, or if the power goes out, it defaults to the wrong input, channel 2 (wrong channel) and volume SUPER LOUD.
        • I thought the SUPER-LOUD-when-you-first-switch-them-on TVs (especially Philips models) were only found in hotels.

          I've always suspected that the audio defaults were finalised by a senior engineer who was a little bit deaf from years of working on TV and hifi equipment.
      • by jandrese ( 485 )
        The question here of course is "how much power does it draw in Standby mode?" I know there is a lot of gnashing of teeth about the power draw from standby mode, but most of the appliances I've tested draw only 300mA or so in standby, which is like leaving your front door open for an extra 3 seconds when entering the house.
        • 300mA is not a measurement of power. It's a measurement of amperage. P=VI, meaning watts (power) equals volts (electromotive force) times amperes (current). Are you saying 300mA at 120V? That's 36 watts (120 * 0.3)! 36 watts is a shitload of power! Double that and you can run a modern laptop!
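          A quick Python sanity check of that P = V * I arithmetic (the 120 V supply voltage is an assumption about a typical US outlet, not a measurement):

            # Rough check of P = V * I for the standby figure quoted above.
            # 120 V mains is an assumed, typical US household voltage.
            volts = 120.0
            standby_amps = 0.300            # the 300 mA standby figure quoted above

            watts = volts * standby_amps    # P = V * I  ->  36 W
            print(f"Standby draw: {watts:.0f} W")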
      • I don't know about your TV, but I've measured some LCD monitors in standby, and came out with draws of 3 or 4 watts. While that's not zero, and claims like "We could get rid of fifteen power plants..." are thrown around, it's hard for me to worry about that when my gaming computer draws a hundred times that much, and my air conditioner draws five hundred times that much. When an hour of using the AC is worth almost a month of standby time on my LCD, I have bigger fish to fry than unplugging the monitor.

        • by LunaticTippy ( 872397 ) on Thursday December 28, 2006 @12:50PM (#17389372)
          You're right that one 3 watt drain is insignificant. However, in my house I have probably 20 of these drains, between 3 and 20 watts. I also have at least 20 wall-wart transformers that suck juice whether they're hooked up to anything or not. I'd say that my (admittedly not normal) total standby power is 300 watts, 24 hours a day. That's a lot. It'll affect my bill substantially, and for no good reason. If the average house uses 100 watts, once you multiply that by hundreds of millions of houses you're talking about real power.

          It's like a leaky faucet. Sure it's only 1/10 gallon a minute, but it adds up and doesn't benefit anyone. Why not minimize it? I know manufacturers could lower standby power use if consumers demand it.
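          To put rough numbers on that, here is a small Python sketch (the 300 W and 100 W figures are the ones quoted above; the $0.10/kWh rate and the 100-million-household count are illustrative assumptions):

            # Annual energy and cost of a constant standby draw, plus a
            # back-of-envelope national aggregate. The electricity rate and
            # household count are assumptions, not data from the thread.
            rate_per_kwh = 0.10
            households = 100_000_000

            def annual_kwh(watts):
                return watts * 24 * 365 / 1000

            standby_kwh = annual_kwh(300)              # ~2628 kWh/yr for one house
            standby_cost = standby_kwh * rate_per_kwh  # ~$263/yr
            national_gw = 100 * households / 1e9       # 100 W per house -> ~10 GW continuous

            print(f"{standby_kwh:.0f} kWh/yr (about ${standby_cost:.0f}); national draw ~{national_gw:.0f} GW")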
    • One thing that really annoys me is that most televisions, DVD players, VCRs, game consoles, etc. lose their data and need some level of reconfiguration on start-up if you hard-kill their power. I'd love to be able to put all that crap on a power strip so I could flip everything on, or off, at once and save some power when I don't need them. It doesn't cost all that much more (a couple of dollars) to build such items to retain that information when cut off from power - most companies just don't bother.
      • If a clear signal is not important to you, you can. The problem is that the power'll really fuck up your reception.

        I do like your idea of having all devices recognize all other devices, though. Bluetooth would be perfect for that.
      • I've always wondered why home entertainment devices can't talk to each other at all. I mean, if I turn on my DVD player and hit Play, it should be able to automatically set my TV to the DVD input and make sure it's on. It should also talk to my surround receiver and turn it on and switch its input. This could all be done easily with USB, or Bluetooth (like another commenter suggested) and yet there's nothing in place.
        • With the move to HD, one of the proposed solutions was HAVi over Firewire [extremetech.com]. Basically, each device would have a firewire port (well, two, so you can daisy-chain), you run the daisy chain between the devices, and they provide their interface via Java.

          The studios HATED it, because it meant their content was moving around the network digitally (in MPEG-2), which was the point. Want to record something to D-VHS or AVHDD? Just choose to record it. The devices tell everyone that they record. No more PVR, or if you
  • by tgd ( 2822 ) on Thursday December 28, 2006 @08:37AM (#17386652)
    He's absolutely right. Ignoring AC costs, IMO it's house size that is causing the increase in usage, and it's changes in how houses are lit. 20 years ago houses were typically lit with a single fixture in a room, plus lamps. (Or, if you're in the northeast US, typically just lamps, although I couldn't tell you why that is...)

    These days lighting design is all the rage, and it's common to have 4 or more fixtures in a room, often R30 can lights at 65W each projecting downward, so you need 4 or more to light a room. The room I'm in right now visiting my parents has 4 can lights, a light with 4 60-watt bulbs in it, and two recessed spot lights of unknown power. Ignoring those, it's still 500 watts to light this room.

    My house is 60+ years old, but was renovated six years ago -- most of it is can-lit as well. It has 24 65-watt R30 can lights in it, among all the other lights.

    I saw a nearly $30 a month drop in my electric bill switching the entire house to CFL. Dimmable R30 bulbs are pricey, $12+ each, but they will have paid for themselves in a year. I'm typically fascist about keeping lights off, too... I'm sure the savings would be double that if I had kids leaving them on all the time.

    On a geek note, I also got a $30 savings a month by making changes in the data center in the basement. An old HP rack server was replaced with a much less power hungry desktop box which was faster... that saved 75% of the electricity it used to use. Three other desktop boxes which were slower were replaced with two free laptops with broken screens I got from friends who tend to break their laptops. The upside as well is that one small UPS can power everything for almost an hour.
    • by CharlieG ( 34950 )
      Where did you find the dimmable R-30s (and do they have R-40s)? I've been looking to replace the 7 in this room and the ones in the living room.
    • Next step is VMWare/Xen, and downsizing a couple of stray servers. This is when some sort of power-generation scheme for the home begins to look attractive. Thermoelectric materials near the stove/shower/fireplace would seem to be a good start, but probably not for anything larger than a CFL or two.
    • It's not just size. 60 years ago, your entire electrical appliance list probably consisted of a toaster, a television, a radio, and a clothes iron. You didn't have three televisions, three DVD players, two TiVos, two (or more) computers, two external hard drives, a home theater receiver, four cell phone chargers, a laptop charger, three CD players, a breadmaker, baby monitors, three hair curlers, two hair dryers, an air conditioner, and about a hundred other things.

      The NEC has constantly revised the electr
  • Peripheral Power control with screen saver
    http://www.instructables.com/id/EE62QUOM31EUOJJVA4 / [instructables.com]

    saves a few pennies here and there.....
  • Remotes + Sleep mode (Score:4, Interesting)

    by Gopal.V ( 532678 ) on Thursday December 28, 2006 @08:39AM (#17386670) Homepage Journal

    When we first got a TV (1988), the TV had a power switch, five channels and definitely no remote. So, whenever we didn't need the TV, we just switched off the power and turned it on when we needed it.

    When 1999 dawned, the TV was a flat screen 25" with a remote. And lo, we would turn off the power for the TV only when we left the house (locked up) or at night. And that was just because my house was on the very top of a hill and power lines were often hit by lightning (yeah, I had my modem explode once).

    And finally, now in 2006 (in a different city), I have six things plugged in - from DVD player to the TV itself. And it is such a big mess that nobody ever unplugs anything at all - just use the remote to turn it on & off. That sleep mode does take a fair bit of power (well, tens of watts) which is just going to an absolute waste (well, heating the room).

    It is these unnoticed devices which cause a constant, but economically negligible, drain - which could be avoided. The things you can fix aren't always the biggest consumers (water heaters, refrigerator) but small things like these - at a global level.

    It is not just such permanently-on stuff that you have - the average geek still has more connectors than you'd think. I realized this when I was in the high Himalayas - and we were charging [flickr.com] stuff before we left human habitation. (Oh, took the laptop to 18,000 feet.)

    • by djh101010 ( 656795 ) * on Thursday December 28, 2006 @12:04PM (#17388824) Homepage Journal
      And finally, now in 2006 (in a different city), I have six things plugged in - from DVD player to the TV itself. And it is such a big mess that nobody ever unplugs anything at all - just use the remote to turn it on & off. That sleep mode does take a fair bit of power (well, tens of watts) which is just going to an absolute waste (well, heating the room).

      That last bit is critical. Guys, we're not wasting ANY energy, at least during the heating season. The heat put out by the wall warts and other always-on stuff, helps heat your house. If you have electric heat, it's exactly a wash. If you heat with natural gas or propane, well, this is that much less fuel you'll burn. The cost per BTU even comes out in favor of electric, sometimes. For me, the on-peak rate is 5x as high as the off-peak rate, so during nights and weekends, electric heat is cheaper than propane.

      For off-peak heating, I use a 4500W water heater, piped into plastic tubing cast into the concrete slabs in my basement and kitchen. I can get a 1 degree (f) per hour temperature rise in the slabs, which doesn't sound like much but in practice is more than enough. The electric heat, in this case, saves me quite a bit in propane costs, somewhere around 20% in heating costs savings last time I calculated.

      Point is, that heat isn't wasted, unless you're running an air conditioner at the same time.
  • by brokeninside ( 34168 ) on Thursday December 28, 2006 @08:48AM (#17386742)
    The Christian Science Monitor has an excellent article on energy conservation in the home: Surprise: Not-so-glamorous conservation works best [csmonitor.com]. The two biggest issues to tackle are lighting and heating. Consider this:
    although residences consume only about two-fifths of this as electricity, because electrical generation is inherently inefficient, it accounts for 71 percent of household emissions. A home's electrical use may be responsible for more CO2 emissions than the two cars in the driveway.
  • A friend of mine rents a loft in my house and he asked me to check out why his part of our power bill was so much greater (he now has a meter). Turns out his standby power on all his devices is half of his total average power draw. They are on all the time, after all, whereas the bigger items are used more rarely. He also has more gizmos than you can shake a stick at. To sum that up: when he's away from the house on vacation or whatever, with TVs and computers off, his power draw is still at 50% of the normal amount.
  • by JustNiz ( 692889 ) on Thursday December 28, 2006 @08:53AM (#17386764)
    1) Off buttons that really turn off the power, not just put the device in a 'standby mode'.

    2) Manufacturers should be obliged to make low-voltage devices have internal transformers (wired after the power switch), and make those really annoying power bricks you now get with everything illegal.

    Apart from usually being a ridiculous single-piece design that occludes several other sockets in a power strip, they cause massive cable tangles and practical use requires that they be left permanently powered-on.
    • Actually - removing standby buttons would be a bloody silly idea. At least with Macs, I'm led to believe that the power used to boot the machine is greater than the power used to keep the machine sleeping for a week, so roll on those standby buttons. Bob
      • by walt-sjc ( 145127 ) on Thursday December 28, 2006 @09:44AM (#17387152)
        OK, here are the numbers for a mac mini (no monitor - just the cpu.)

        Powered off: 0.035A
        Booting: 0.250A - 0.320A
        On, but idle: 0.180A - 0.250A
        Sleep mode: 0.050A
        Unplugged: 0.0A

        So booting isn't that much more power than idle, and it's for a short period of time.

        I find it interesting that powered off isn't really powered off, so you are better off using the switch on your power strip than relying on the mac "off" mode, which isn't a whole lot better than sleep.

        Someone who wants to play with math more than me can figure out the break-even points, but it's clear that you are far better off unplugging your mac overnight and rebooting in the morning than leaving it in sleep mode. It's a no-brainer for a week. This basically says: unplug all your crap when you go on vacation, because with modern electronics, off isn't off.
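        Taking up that invitation, here is a rough break-even sketch in Python built from the currents measured above (the 120 V mains voltage and the one-minute worst-case boot time are assumptions):

          # Overnight (8 h): leave the mini asleep vs. kill it at the power
          # strip and reboot in the morning. Currents are the measurements
          # above; 120 V mains and a 1-minute boot are assumed.
          volts = 120.0
          sleep_w = 0.050 * volts          # 6.0 W asleep
          off_w   = 0.035 * volts          # 4.2 W in soft "off"
          boot_w  = 0.320 * volts          # 38.4 W while booting (worst case)

          hours = 8
          asleep_wh   = sleep_w * hours                    # 48 Wh
          soft_off_wh = off_w * hours + boot_w * (1 / 60)  # ~34 Wh
          strip_wh    = boot_w * (1 / 60)                  # ~0.6 Wh, truly unplugged

          print(f"sleep {asleep_wh:.0f} Wh, soft off {soft_off_wh:.0f} Wh, strip {strip_wh:.1f} Wh")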

      • by 1u3hr ( 530656 )
        At least with Macs, I'm led to believe that the power used to boot the machine is greater than the power used to keep the machine sleeping for a week,

        This is like that prevalent myth that turning a fluorescent light on and off uses up more energy than running it all day.

        If your Mac takes 1 minute to boot, for your claim to be true it would have to draw 60x24x7 times as much power as it does when "sleeping", i.e., if it draws 5W when sleeping (surely it would be more) then booting would draw over 50kW.

    • by swb ( 14022 )
      What's the manufacturing/engineering/economic reason that so many things use external power bricks instead of internal transformers?

      Has the manufacturing of power bricks become so efficient that they are in effect "free" and device designers simply assume a DC power source?

      Does the extra space/heat/complexity of including the transformer within the device and the larger power connector required to actually plug it in make the devices that much more expensive to manufacture or somehow less attractive to cust
      • by Rob the Bold ( 788862 ) on Thursday December 28, 2006 @09:59AM (#17387270)
        What's the manufacturing/engineering/economic reason that so many things use external power bricks instead of internal transformers?

        Glad you asked. The main reason is safety regulations. Devices that plug in to your household power need 3rd party certification (e.g. UL approval in US). Power supply design is a specialty, and although any EE could do it, not all can do it well, quickly and cheaply. If you (as designer) spec an external transformer, then you don't have to worry about the approval. You just buy an approved transformer and design your device to work on low voltage. This saves you thousands of dollars and many man-hours of time per design by not having to hire an independent lab to verify your safety compliance.

        As an additional benefit, you can sell your product to work with different AC voltages just by supplying the appropriate transformer for each market. Plus, when you buy an external transformer, you get economies of scale because it can power not only your devices but many others built by thousands of other firms.

        • Not to mention that a lot of the heat of the power supply is in the transformer and if you move that outside the unit then it not only becomes smaller and potentially more attractive, but it also runs cooler.
        • Re: (Score:3, Insightful)

          by Faeton ( 522316 )
          That is the case, but why don't they go one step further in the quasi-standardization of transformers and make the power plug (the one that goes to the device) all the same? I'm sure the vast majority of us don't have to charge every little gadget we have all at once. If they made all the charging plugs universal (say, mini-USB) and the same voltage, we could save a lot of power and socket space by unplugging all those wall-warts.
      • Re: (Score:3, Interesting)

        by b0s0z0ku ( 752509 )
        What's the manufacturing/engineering/economic reason that so many things use external power bricks instead of internal transformers?

        Manufacturing costs: you get economy of scale on the power supply circuits.
        Liability: if the power supply blows up, *you* didn't design or build it. Also, users of the circuit can't be directly exposed to 120VAC.
        Size: yes, the circuit can be smaller, and the extra parts are out of the way on a floor or wall.

        The problem with many wall wart bricks is that their transformer's primary winding is energized all the time and thus drawing power.

        • Re: (Score:3, Informative)

          The problem with many wall wart bricks is that their transformer's primary winding is energized all the time and thus drawing power.

          This is true to an extent, but the amount of power drawn in "zero load" conditions is quite small. The more load placed on the secondary, the more load seen on the primary. Some energy is wasted as heat. IIRC, a typical iron-core transformer is around 85% efficient at 50/60 Hz line current.

          Switch mode power supplies can be much more efficient (such as a laptop "brick" or a PC s
    • by ex-geek ( 847495 )
      2) Manufacturers should be obliged to make low-voltage devices have internal transformers (wired after the power switch), and make those really annoying power bricks you now get with everything illegal.

      What I'd like to do is to power peripherals with the efficient power supply of my PC instead of having to independently manage a myriad of said annoying bricks.

      USB maxes out at 2.5W, which is at least good enough to power a scanner, but not much more.

  • ...my computer is using a lot more power than before - then again, it didn't play back HDTV or very impressive 3D games before either. And my new TV, well, it's a lot bigger and thus draws a lot more watts than my last one. Compare that to a washing machine - it washes my clothes, they get clean. Two thumbs up for that; I don't need one spinning twice as fast. I must admit, I don't think energy efficiency when I look at power draw, I think cooling and noise - sitting in front of a computer is hardly an expe
  • by Secrity ( 742221 ) on Thursday December 28, 2006 @09:20AM (#17386962)
    I admit it, I now have more gadgets drawing current than I did five years ago. I have also reduced power consumption in the past five years. Five years ago, my typical electric bill was US $125 a month, it is now in the $75 range. None of the changes have caused any hardships or reduction in quality of life.

    1. Replaced the heat pump with a more efficient model and installed a setback thermostat. I lucked out: the compressor crapped out and I had a service policy. The impact on quality of life is nil; I just had to learn the new thermostat.

    2. Replaced the refrigerator with a more efficient model. It was expensive, but the old refrigerator was about 30 years old and was reaching the end of its service life. It is a nicer refrigerator than the old one and it is quieter.

    3. Replaced commonly used light bulbs with compact fluorescent. This was an inexpensive change and it had the most impact on quality of life. The color and light quality of the new compact fluorescents compares to the old lights but they take a few minutes to produce full light output. They remind me of a tube type radio warming up.

    I think that the most interesting replacements were the night lights. I replaced the 6 night lights that used to draw about 4 watts each with LEDs. I connected a wall wart to an unused wire pair in my home telephone wiring and I use the phone wiring to transport power to my night light LEDs. I had the wall wart, LEDs, and other parts in my junk box -- and they work great.

    The light conversion is both saving power used for lighting and reducing the summer air conditioning load. Someday I might even figure out how long it will take to save any money by replacing those lights. The main light in the living room was a 300-watt halogen torchiere, which I replaced with three fluorescent flood lights in a new floor lamp, at a cost of $35 for the lamp and bulbs; rated power consumption went from 300 watts down to about 75 watts, and I frequently don't turn on all three bulbs. This summer I noticed that the living room was much cooler with the new lights. The kitchen is saving a similar number of watts, but the lights in the kitchen are not used very often.
  • This article has COMPLETELY missed the point.

    Consumer electronics do increasingly contribute to a home's electric budget, but only by virtue of quantity. Except for PCs and TVs, most products draw a pittance. For PCs, they did draw more and more power from the mid-80s to a year or two ago, but newer CPUs have finally addressed that problem (and power supplies have gotten more efficient as well). For TVs, larger means more power, but the tech has drastically improved... A 50" plasma draws comparable to a
    • by Dunbal ( 464142 )
      but newer CPUs have finally addressed that problem (and power supplies have gotten more efficient as well).

      Are you saying that my PC XT with a CGA card, a dinky little power supply fan, and a 150 watt power supply was less power hungry than my Athlon 64 X2 Dual with 2 top of the line video cards, SIX fans, and that REQUIRES (I know this cos I have already burned 2 out) at least a 550 Watt power supply to run? Computers were more efficient a few years ago around the 1990's, but now they
    • A GeForce 8800 uses more power at idle alone than three 386 computers with 14" b/w monitors...
  • My TV has a power button, which works as a hard power button. There's also the TV remote, which puts the TV into the "soft-off" state where it's ready to turn on, but not exactly off. That's not all - when there's a power failure, the TV turns on as soon as power is restored. Given the size of the TV, I guess the manufacturer thought it would be used as a Kiosk where it needed to be always on rather than being used at home.

    I guess it's no worse than the "Wake on Modem" that's enabled by default in the co
  • It is interesting to take his numbers and do a bit of arithmetic. The highest power user is the kettle, but it is only on for (say) 10 minutes a day, whereas the DVD and microwave are on all day (1440 minutes) [I assume that you never cook anything or watch a film]:

    device      watts   minutes   watt-minutes
    microwave       3      1440          4,320
    dvd             7      1440         10,080
    kettle       1475        10         14,750

    So what you think is the big user (the kettle) uses about the same over a day as the DVD and microwave standby combined.
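    The same arithmetic as a quick script (wattages and run-times are the figures above; treating the microwave and DVD draws as constant all day is the original simplifying assumption):

        # Daily watt-minutes for the three devices in the table above.
        devices = {                  # (watts, minutes on per day)
            "microwave": (3, 1440),
            "dvd":       (7, 1440),
            "kettle":    (1475, 10),
        }
        for name, (watts, minutes) in devices.items():
            wmin = watts * minutes
            print(f"{name:10s} {wmin:7,d} watt-minutes = {wmin / 60 / 1000:.2f} kWh/day")
        # kettle ~0.25 kWh/day vs. microwave + dvd standby ~0.24 kWh/day combined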

  • My Creative Labs computer speakers draw 75% of their "on" power when supposedly turned "off". The power adapter for them is always hot. Lots of little devices like this are each costing $10-20 per year in electricity when not in use. It starts to add up. Multiply that across a nation, and that's a huge amount of wasted electricity, as well as pollution. Electricity companies aren't charities, but we're giving them more money for no reason than a lot of people give to charity. I'm sure if people turned the
    • I live in Houston, where every building is air-conditioned to somewhere in the 68-70F region during the summer. Everyone thinks it's too cold, but by some Divine Decree, that's the temperature indoors. At home, we have our thermostat set at 78 during the summer. Assuming an average (over a 24-hour period) temperature of 88 degrees, that means we're saving roughly half of our A/C costs by using 78 instead of 68 or 70F. As temperatures fall, that number increases dramatically.

      What really hit me was when
    • The big problem in North America is the sheer number of things like apartments with air conditioning -- but NO timer. The typical person in the summer goes to work, and leaves the AC on when no one's at home. This wastes tremendous amounts of power. But people just forget to turn it off when they go. An easy to use timer would fix this problem.

      I always tried to remember to turn off my AC. When I left the apartment, the power company mistakenly sent me a bill some time after I left - and I noticed the next p

  • I'd venture that LED lighting in the home will become mainstream within the next 10 years. Given that lightbulbs make up about 33% of a home's power consumption and will be going from 40-100 watts apiece to 2-6 watts, isn't the complaining about gadgets' power draw a little... hot-headed?

    So long as our power generation is cyclical when it comes to CO2, it really doesn't matter what we spend the energy doing. Getting to solar, wind & biofuel generation is a real target, not making a phone recharg
    • by Alioth ( 221270 )
      You don't need LED lighting - compact fluorescents already exist (and are cheap) and have been around for over a decade. Fluorescent is more efficient than semiconductor lighting.
  • I've dropped my electricity consumption by more than half. This chart on my blog [blogspot.com] shows my total KWHs consumed over five years.

    Sure, I replaced incandescent bulbs with CFLs when I moved in. So where is the savings? I optimized things like computers and then "insignificant, low-power" devices.

    I'd love to see this journalist's KWHs per year over the past 5 years. I'd love to see how many KWH he consumes a month. Perhaps given his waste, a savings of 100 KWH/month is insignificant.

    Some people think that
  • Gadgets smadgets (Score:3, Insightful)

    by Cadallin ( 863437 ) on Thursday December 28, 2006 @10:50AM (#17387772)
    The Wall Street Journal is right (for once). The vast majority of any house's electrical costs are heating, air conditioning, and water heating (barring designs using solar water heaters and below-ground air conditioning; I acknowledge that you exist, but let's face it, you're far less than 1% of the population). If electrical usage is rising, it's the fault of the rise of McMansions and larger housing in general. Most housing in the US is poorly designed and piss-poor insulated, with dozens of windows, all of which add hugely to HAC. Windows in particular are a huge elephant of electricity costs, especially the huge ones popular today, built with no consideration at all of where the sun is going to be in different seasons.
  • by intnsred ( 199771 ) on Thursday December 28, 2006 @11:04AM (#17387968)
    From the article theorizing that home electronic power usage seems to be getting worse:

    We could probably save the Earth a little more if we didn't do one to two loads of dishes a day, and if we didn't wash a dozen loads of laundry a week, but hey, that's modern life with small children. These are luxuries of modern living that I'm going to clutch onto until the ocean is lapping at the door.

    I wonder if his kids and grandkids will feel similarly about Dad's attitude?!

    Don't get me wrong, the guy seems to be doing more than most people. My point is that we are not "entitled" to lives of such "luxuries" (his word) as we kill off species and, indeed, the entire planet.

    We have a helluva lot of change to do -- either willingly or it'll be forced on us -- and most of that change needs to occur between our ears.
  • I have a 2.5GHz P4 with 1GB of memory and 4 HDD as well as two 21" CRT monitors.

    After 10 minutes in sleep mode it all consumes 5W.

    1. PC (asleep): it runs 24/7 and consumes 43 kWh, or $6 a year.
    2. Clothes drier: runs 6 hours a week at 4 kW; that's 416 kWh, or $60 a year.

    3. PC when the CPU is doing actual work: it sucks 147 W; that's 1300 kWh, or $206 a year. When I discovered this, I immediately disabled the protein folding project my PC was participating in.
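    For reference, those kWh and dollar figures follow from watts x hours; here is a quick Python sketch (the electricity rate is back-calculated from the figures above, roughly $0.15/kWh, and is an assumption):

        # Annual energy and cost for a constant draw running 24/7.
        # The rate is an assumed, back-calculated value, not a quoted one.
        def annual(watts, rate_per_kwh=0.15):
            kwh = watts * 24 * 365 / 1000
            return round(kwh), round(kwh * rate_per_kwh)

        print(annual(5))      # sleeping PC:    ~44 kWh,   ~$7/yr
        print(annual(147))    # crunching 24/7: ~1288 kWh, ~$193/yr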
  • i remember reading about a proposed dc bus in pop sci some years ago that really grabbed my imagination. basically, in your house, along with the 120 vac outlets, there would be connections for +/-12vdc and +5vdc -- the most common voltages for analog and digital electronics.

    the horribly stupid situation we've gotten ourselves into is that now we have a myriad transformers in our houses. i can think of five that are currently plugged in right now in this room. put your hand on one of these -- that heat i
    • This is a bad idea because low-voltage DC has fairly high losses. It would probably cost you more in large-gauge copper than you would save over the lifetime of the house. It makes more sense to A) buy more convergence devices that eliminate gadgets and B) unplug devices when they are not in use. Also C) buy more efficient gadgets to begin with. It's sad that people won't spend a few more cents to get something more efficient, so that we need legislation to force all devices to be efficient.
  • by evilviper ( 135110 ) on Friday December 29, 2006 @04:07AM (#17396564) Journal

    it's empirically relatively power efficient [...]

    No, that computer isn't *remotely* power efficient.

    The ridiculously high 130+watt idle numbers are probably due to the S2K Bus Disconnect bug/issue AMD had before the switch to 64bit CPUs. Running a program like VCool or FVcool would likely reduce that number by 20-60 watts.

    The trend in CPUs (still the biggest single power drain in modern computers) is toward MUCH more power-efficient models (especially when idle). A newer CPU and motherboard would be using significantly less power than that old Athlon, despite vastly outperforming the older chip.

    [...] given that it survives quite happily on a rather anemic 300W power supply, in an era when many advice guides are pushing monster 450W+ units.

    The recommendations are probably due to the ridiculous power consumption of Pentium 4 CPUs (which are thankfully behind us now) and $5 "500w" PSUs, which can't possibly deliver half the power advertised. Stay away from those two issues, and a 300W power supply is more than enough for modern systems.

    Additionally, 80% efficient power supplies like Seasonic's units are becoming more common, and more widely available, helping to significantly reduce power consumption as well.

    With all of this, many people are putting together new towers that use less power than their notebooks.

    A comparable LCD screen would use about 45-50W (yes, I've validated those numbers, and a 19" LCD really does use that much power).

    That's not a fair comparison. Those 19" CRTs probably have a "viewable" size of 17.9".

    Besides that, a power savings of approximately 50% is still huge, and better than you'd get trading in your old refrigerator for a new one... And with other improvements on the horizon, I predict computer displays will continue to outpace refrigerator efficiency gains for many years to come.

    Furnace blower motors have been moving to DC, significantly reducing their electricity consumption as well.

    I fail to see how a DC motor is inherently more efficient than an AC motor. For one thing, the power comes into your house as AC to begin with.

    the receiver/audio amplifier takes 51W, regardless of output

    I sincerely doubt most people watch TV with a surround-sound amplifier on, 40 hours a week.

    I don't see the majority of TV programming (things like news, game shows, soap operas, etc.) getting any more exciting when played over 6 speakers instead of the two built-in to the TV.

    (like most families, it's often on even when unwatched, especially given some of the great digital music channels our cable provider streams out)

    I don't know why anyone would leave their TV on to listen to digital music channels for hours a day, when it has already been established that the person in question is using a separate amplifier for their TV viewing already...

    But that didn't stop him from using this in his calculations, not to mention claiming that he's trying to save the earth...

    that PVR is just as efficient at turning 42W into heat as your baseboard would be with the same power.

    No, it isn't. As I've repeated on /. many times before:

    "An electric heater will be a purely resistive load, giving you a nearly perfect power factor of 1.0, whereas your VCR probably has a cheap power supply with a power factor as low as 0.4. So the VCR is causing a lot more power loss [line losses], even though it's the same 5watts."

    The measured difference between doing general desktop tasks on Vista Ultimate running with Aero Glass, and Windows 2003 running on the same hardware, was negligible.

    No doubt this test was done on the same 32-bit AMD Athlon system (without S2K Bus Disconnect enabled) WHICH DOESN'T IDLE P
