Monitor Draws Zero Power In Standby

fifthace writes "A new range of Fujitsu Siemens monitors don't draw power during standby. The technology uses capacitors and relays to avoid drawing power when no video signal is present. With political parties all over Europe calling for a ban on standby, this small development could end up as one of the most significant advances in recent times. The British Government estimates eight percent of all domestic electricity is consumed by devices in standby."
  • by kcbanner ( 929309 ) * on Thursday November 08, 2007 @09:44PM (#21289801) Homepage Journal
    ...when I see CRTs at work lighting up the room when they render "black".
You sure you don't mean LCDs? CRTs don't use any power to display black.

      Personally I've never seen a CRT display much light when showing black at night.
That's comparing my web-surfing monitor (white) to my IRC/Konsole monitor (black).
      • by InvalidError ( 771317 ) on Thursday November 08, 2007 @11:05PM (#21290421)
        Most of the power in a CRT goes into the H/V beam deflection electromagnets, not the electron gun. The H/V scanning electronics operate regardless of which color is being rendered. The filament heater also uses about 6W whenever the CRT is turned on. Between displaying 100% white at the highest brightness and the blackest black at the lowest brightness, there is only a 5-10% difference depending on resolution and refresh rates.

As for Fujitsu's 0W-standby monitor, they conveniently omit the fact that this extra relay's coil and related components will be drawing an extra 1W or so while the monitor/TV is on. I would prefer that they perfect ultra-low-power standby of around 1W, since the typical appliance today draws 4-10W in standby; having standby rely on capacitors means standby would occasionally fail to work as expected if it has been too long since the previous power-up.
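For scale, a quick back-of-the-envelope comparison in Python (usage hours and wattages are assumptions, not figures from the article or the comment):

```python
# Back-of-the-envelope annual energy comparison (all numbers assumed):
# a hypothetical 1 W relay coil held only while the monitor is ON,
# versus a conventional 4-10 W standby drawn while it is OFF.

on_hours = 8 * 365                     # assume the monitor runs 8 h/day
standby_hours = 24 * 365 - on_hours    # off/standby the rest of the time

relay_kwh = 1.0 * on_hours / 1000      # extra coil draw while on
standby_kwh_low = 4.0 * standby_hours / 1000
standby_kwh_high = 10.0 * standby_hours / 1000

print(f"extra relay draw: {relay_kwh:.1f} kWh/yr")  # ~2.9 kWh
print(f"4-10 W standby:   {standby_kwh_low:.1f}-{standby_kwh_high:.1f} kWh/yr")  # ~23-58 kWh
```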
        • Re: (Score:3, Informative)

          by SeaFox ( 739806 )

          Between displaying 100% white at the highest brightness and the blackest black at the lowest brightness, there is only a 5-10% difference depending on resolution and refresh rates.

          Blackle [blackle.com] seems to say differently. And people have done the math [blogspot.com].
          • by Sycraft-fu ( 314770 ) on Friday November 09, 2007 @02:07AM (#21291607)
For one, their math is not based in reality. Those are numbers pulled out of their asses, with no backing as to whether they are correct. However, even if there is some truth to them, you run into the fact that most people are using LCDs (and more convert all the time), and most LCDs are backwards. All LCDs run their backlights at full (or rather at the full level the user sets) the whole time they are displaying; they work by blocking light. The most common form of LCD, the Twisted Nematic, is open by default. That is to say, when there's no current across the junction, they pass the maximum amount of light, so to turn black they need full power applied to the junction. They actually use more power to display black than white. There are LCDs that do not work this way (IPS and VA variants) but they are by far the minority on computer displays.

            So a "Blackle" would increase power usage on LCD systems, which needs to be factored in.

If these people really cared about saving energy, maybe they'd look at things like old, inefficient air conditioning units. ACs use power like no other appliance in a normal home. However, there are many different quality levels out there, and good modern units can move a lot more heat per unit of energy input. This is generally measured by SEER, which is how many BTUs of cooling a unit delivers per watt-hour of energy input. For old units, SEER values of 9 or less are common. These days you can't get less than 13 (by law), and you can get them over 20 SEER. That means a unit roughly twice as efficient at cooling, which is some major, major energy savings right there. It doesn't take a lot of that to equal their theoretical Google numbers, and this is backed up by reality.
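A rough sanity check of that SEER arithmetic (the seasonal cooling load below is an assumed figure for illustration):

```python
# SEER is BTU of cooling per watt-hour of electricity, so for a fixed
# cooling load the energy used scales as 1/SEER.

season_btu = 20_000_000   # assumed cooling load for one season, in BTU

def season_kwh(seer):
    # BTU / (BTU per Wh) = Wh; divide by 1000 for kWh
    return season_btu / seer / 1000

for seer in (9, 13, 20):
    print(f"SEER {seer:2d}: {season_kwh(seer):5.0f} kWh per season")
# SEER 9 -> ~2222 kWh; SEER 20 -> ~1000 kWh: roughly half the energy
```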
        • Re: (Score:3, Insightful)

          "As for Fujitsu's 0W-standby monitor, they conveniently omit the fact that this extra relay's coil and related components will be drawing an extra 1W or so while the monitor/TV is on."

I'm sure that design could be improved, either by using a solid-state switch or a bistable (latching) relay that only needs a pulse to change state rather than power to hold a state. What Fujitsu have done is a good start.

          How long is a monitor on compared to off for most people anyway? In an average work place one would hope that most peo
          • by Zaffle ( 13798 ) on Friday November 09, 2007 @02:28AM (#21291721) Homepage Journal

            "As for Fujitsu's 0W-standby monitor, they conveniently omit the fact that this extra relay's coil and related components will be drawing an extra 1W or so while the monitor/TV is on."


1 Watt??? I built a circuit that used a relay for precisely this, though I described it from the other point of view: it turned itself off. There is no way you need 1 watt of power to hold anything but the largest relays.

Btw, this 0W standby only works when the thing you monitor to come out of standby is relatively simple: a line level. Try making a TV that draws 0W in standby yet can still be booted with just the remote. Actually, it's quite simple: you use a rechargeable battery to power an IR monitoring circuit, but that's cheating :)
        • by nmg196 ( 184961 ) * on Friday November 09, 2007 @05:13AM (#21292501)
          > As for Fujitsu's 0W-standby monitor, they conveniently omit the fact that this extra relay's coil
          > and related components will be drawing an extra 1W or so while the monitor/TV is on

Can you please post a link to the datasheet or page where you read that? I strongly suspect that you made it up, because I've never come across a relay that requires 1 *WATT* to work. A relay only requires a few milliamps to work. A 1 watt relay would be a brick-sized device that might be used to turn on some stadium lights or several miles of highway lighting or something - not an LCD screen sitting on your desk.

          I doubt it adds any significant power consumption wattsoever (geddit?).
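For what it's worth, a sketch with plausible, assumed values for a small PCB-mount relay (not from any specific datasheet) suggests the coil draw is tens of milliamps and a fraction of a watt:

```python
# Typical coil figures for a small PCB-mount relay (illustrative values,
# not from any real datasheet): a 5 V coil of roughly 125 ohms.

coil_voltage = 5.0       # volts (assumed)
coil_resistance = 125.0  # ohms (assumed)

coil_current = coil_voltage / coil_resistance  # I = V/R
coil_power = coil_voltage * coil_current       # P = V*I

print(f"coil current: {coil_current * 1000:.0f} mA")  # ~40 mA
print(f"coil power:   {coil_power * 1000:.0f} mW")    # ~200 mW, well under 1 W
```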
    • by RuBLed ( 995686 )
You should check whether your supplier has those screens that render dark black instead...
  • power isnt free (Score:4, Insightful)

    by Gothmolly ( 148874 ) on Thursday November 08, 2007 @09:46PM (#21289819)
    Then it just draws EXTRA power while running, to charge the capacitors. Electricity can't be produced from nothing.

    A more useful version would be one that used solar cells on the top of the LCD to absorb the already expended energy of ambient lighting.
    • Re:power isnt free (Score:5, Informative)

      by CaptainPatent ( 1087643 ) on Thursday November 08, 2007 @09:49PM (#21289839) Journal

      Then it just draws EXTRA power while running, to charge the capacitors. Electricity can't be produced from nothing.
      Yes, but it only draws enough electricity to fill the capacitors instead of constantly drawing enough power to bring the monitor out of standby.

Sure, you're going to use some extra electricity to come out of standby, but this cuts that amount down enormously.
      • Re: (Score:2, Informative)

        by Gothmolly ( 148874 )
        Dude... the total energy consumption remains constant. Think about it. For the capacitors to run the monitor that long, they MUST HAVE DRAWN THE POWER IN THE FIRST PLACE.
        • Re:power isnt free (Score:4, Informative)

          by Zekasu ( 1059298 ) on Thursday November 08, 2007 @10:02PM (#21289965)

          A relay cuts off the mains power whenever the video stream stops; capacitors store enough charge to flick the relay back when the signal returns. Solar panels provide enough power to maintain zero consumption mode for up to five days, after which you have to press a regular power button to bring the machine out of standby.

There's a difference here, and that is that this new monitor will draw enough power to wake itself out of standby, and then not draw any more power. Normal monitors generally go into standby and then continue consuming power, which is less power than an idle screen, but still more than just enough to charge some capacitors.

I don't see it winning a prize for groundbreaking innovation, though.
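For concreteness, a toy sketch of the scheme as the article describes it; the component behavior here is assumed, purely for illustration:

```python
# Toy model: a relay cuts mains when the video signal drops, and a
# capacitor (kept topped off by a small solar panel) holds just enough
# charge to flick the relay back on when the signal returns.

class ZeroStandbyMonitor:
    def __init__(self):
        self.mains_connected = True
        self.cap_charged = True  # recharged while running / by solar panel

    def video_signal(self, present):
        if not present:
            self.mains_connected = False   # relay drops out: zero mains draw
        elif self.cap_charged:
            self.mains_connected = True    # cap energy re-closes the relay
        # else: cap leaked flat (>5 days dark) -> press the power button

m = ZeroStandbyMonitor()
m.video_signal(False)  # PC sleeps: relay opens, no standby draw
m.video_signal(True)   # PC wakes: capacitor flips the relay back
print("mains connected:", m.mains_connected)  # True
```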
          • Re: (Score:2, Informative)

            by arodland ( 127775 )

There's a difference here, and that is that this new monitor will draw enough power to wake itself out of standby, and then not draw any more power.

            Except of course that that's not really possible since it needs to draw power to know when to come out of standby. That's where the constant draw comes from. The key to this is the solar panels they mention, which keep the caps topped off against leakage current. Without them, the design seems worthless to me, but with them you have an "alternative energy" monitor that puts photovoltaics to a use where, amazingly enough, they actually work.

            • Re: (Score:3, Interesting)

              by evilviper ( 135110 )

              The key to this is the solar panels they mention,

              Congratulations. You may well be the first non-idiot to post a reply to this story. It's been a painful read to see so much ignorance and stupidity getting points.

              The key to this is the solar panels they mention, which keep the caps topped off against leakage current.

              Indeed... Solar panels aren't cheap, though, and I thought of something else. A computer monitor has no point in turning on when there's no signal, so why not power the relay from the VGA/DVI

            • Re:power isnt free (Score:5, Insightful)

              by iamacat ( 583406 ) on Friday November 09, 2007 @03:35AM (#21291995)
              Nope, it's the absolutely worst use of solar panels. They could just draw mains power for one second every 6 hours. As it is, there is pollution created by manufacturing the panels, added cost for a component that does not add functionality and serious cases of remote control rage. And let's not get started on ceiling-mounted TVs.
        • Re: (Score:2, Informative)

          by amccaf1 ( 813772 )

          Dude... the total energy consumption remains constant. Think about it. For the capacitors to run the monitor that long, they MUST HAVE DRAWN THE POWER IN THE FIRST PLACE.

          According to the article:

Fujitsu Siemens showed two 22in widescreen test monitors with power meters attached at a press event in Augsburg, Germany. The display drew 0.6-0.9W when the monitor was switched off using its standby button and with an active video signal from a VGA cable present. When the display signal was switched off the mo

        • Re:power isnt free (Score:5, Insightful)

          by JonathanR ( 852748 ) on Thursday November 08, 2007 @10:05PM (#21289995)
Dude... Think about it. They're using capacitors and relays to detect a video signal and respond to it. Think of it like a mousetrap: it can remain armed for a long time without using any of its stored energy. The mousetrap is not powered while in standby mode, nor does it draw down the energy from the spring.
          • by guruevi ( 827432 )
Technically you should say the mousetrap is fully powered when armed. Yes, it's stored energy, but it is still powered in a sense; if not, put your fingers where the mouse is supposed to be.

And ideally the mousetrap doesn't draw down energy from the spring, but in practice it does. After months or years, the spring will lose tension and (if nothing trips it) will eventually relax all the way back to its original state.
            • Re: (Score:3, Informative)

              by JonathanR ( 852748 )

After months or years, the spring will lose tension and (if nothing trips it) will eventually relax all the way back to its original state.

              This is not true. Your symptoms might occur after repeated cycles of energising/de-energising the spring, but at normal ambient temperatures, creep does not occur (in metallic materials).

              Capacitors (to return to the monitor standby topic) will lose their charge over time, which is presumably what the solar cells are to mitigate in this application.

        • Re: (Score:2, Insightful)

          by xzaph ( 1157805 )
Except they're not "running the monitor that long", because the monitor isn't running. It's like saying that a battery that sits in a bin for a year draws as much power as a 110V->1.5V transformer that's been plugged in and turned on for a year: obviously, the transformer consumes much more power, because it's continually drawing power and wasting it all as heat if there's no other load on the system.
          • Re: (Score:2, Informative)

            by slazzy ( 864185 )
I think you've got it there - the transformer (AKA power supply) uses a lot of power when the monitor is doing nothing at all, i.e. in standby mode. The relay will disconnect the power supply, and the tiny amount of power needed to turn the relay back on is stored in a capacitor - seems like a good idea to me.
        • Are you serious or trolling?

The caps aren't running the monitor; all they are doing is reserving enough energy to start things up again on demand. That means the energy draw is fixed, regardless of how long the monitor stays off between switching itself off and being switched back on by user input.
          • Re: (Score:3, Insightful)

            by dgatwood ( 11270 )

            Yes, but the relay is now basically acting like a latch and is drawing power continuously to keep itself closed until the appropriate hardware cuts off the control voltage. Now I'm not saying that the relay might not have been there anyway, but if this is an additional relay, you have an efficiency problem. Also, when the capacitor bleeds down, there has to be another way to cause the relay to latch. So why not just use a pushbutton to latch the relay and be done with it. After all, you're sitting at th

            • Re:power isnt free (Score:4, Insightful)

              by Kadin2048 ( 468275 ) * <slashdot...kadin@@@xoxy...net> on Friday November 09, 2007 @02:03AM (#21291585) Homepage Journal
              I agree with you wrt the uselessness of soft-power settings on computer monitors. I habitually hit the "real" power switch on my (circa 1998 or so, so it has both) monitor when I'm going to leave for a while, rather than just leaving it to go into standby. Mostly because it tends to come out of 'sleep' at the slightest whim.

              But the real reason for all those soft-power settings I think has less to do with powering on than it does with powering off. Most devices don't like to be daisy-chained and controlled by a remote source, like lots of analog electronics were, because they can't stand having their power cut abruptly.

              In other words, it's the "shut down" procedure that's the killer, not the "start up" one. Lots of devices perform little rituals when you turn them off, writing settings to non-volatile memory for instance, that analog electronics just don't have to do. Because of this, you need to make sure that the user doesn't really have control over the device's whole power. So instead of a real switch, the user gets a soft-power button. That way, they can press it, and the device can start shutting down, and do its thing. But this basically necessitates 'standby' rather than 'off,' in order to be able to start up from the soft power button.

              Remote controls are the other driving force, but there are lots of devices that do 'standby' now, that don't have remotes. I think it's often because they have a power-off procedure; if you designed devices so that they could be unplugged at any time without consequence, then you could go back to centrally-controlled, daisy-chained power supplies.
      • by EmbeddedJanitor ( 597831 ) on Thursday November 08, 2007 @10:04PM (#21289987)
You don't need much power to run a very small 8-bit micro, enough to wake a sleeping monitor. We're talking about nanoamps here. A cheap capacitor can keep that going for months.

The biggest wastage in traditional designs is that they use switch-mode power supplies designed to run at full power. They don't operate very efficiently at very low (standby) power. It is far better to completely turn off the power supply and just use a local capacitor to keep the micro going.
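A rough feel for the numbers (all values assumed; real sleep currents are usually microamps rather than nanoamps, so this is on the conservative side):

```python
# How long could a capacitor keep a sleeping 8-bit micro alive?

C = 1.0          # farads: a small supercap (assumed)
v_start = 5.0    # volts, fully charged
v_min = 1.8      # volts, minimum operating voltage of the micro (assumed)
i_sleep = 1e-6   # amps: 1 uA sleep current (assumed)

# Constant-current discharge: t = C * dV / I
t = C * (v_start - v_min) / i_sleep
print(f"runtime: {t / 86400:.0f} days")  # ~37 days on one charge
```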

        • You wouldn't even need a capacitor in the sense of storage.

You would just need an RF diode coupled to the video input, rectifying it to bias the gate of a MOSFET that in turn drives a relay to connect mains power to the switch-mode PSU.

          The crazy thing is, what took me 10 seconds to design in my head will probably be patented, and used to extort millions!!
           
          • Re: (Score:3, Interesting)

            by Grishnakh ( 216268 )
You would just need an RF diode coupled to the video input, rectifying it to bias the gate of a MOSFET that in turn drives a relay to connect mains power to the switch-mode PSU.

            The crazy thing is, what took me 10 seconds to design in my head will probably be patented, and used to extort millions!!


            I'm not sure this would work anyway: in order to power the MOSFET, wouldn't you need a power supply of some sort? Maybe if you used a triac instead, something like this might work.
            • Re: (Score:3, Interesting)

              by rcw-home ( 122017 )

              I'm not sure this would work anyway: in order to power the MOSFET, wouldn't you need a power supply of some sort? Maybe if you used a triac instead, something like this might work.

VGA gets you 1V peak-to-peak at 75 ohms impedance (13 milliamps, probably per color). DVI gives you 5VDC @ 50mA through pins 14 and 15. The latter can drive a relay directly; the former would probably need a voltage multiplier circuit (which at those low voltages could probably be embedded on an IC, in fact you'd probably have
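Taking those figures at face value (check the real VGA/DVI specs before relying on them), the available power budget works out roughly as:

```python
# Power available from the video cable itself, per the figures above.

vga_v = 1.0                 # volts peak-to-peak per color channel
vga_i = vga_v / 75.0        # into 75 ohms: ~13 mA per channel
vga_p = vga_v * vga_i * 3   # three color channels, very rough upper bound

dvi_p = 5.0 * 0.050         # DVI: +5 V at up to 50 mA

print(f"VGA, all three colors (rough): {vga_p * 1000:.0f} mW")  # ~40 mW
print(f"DVI +5 V supply:               {dvi_p * 1000:.0f} mW")  # 250 mW
```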

      • Re: (Score:3, Insightful)

        And yet according to TFA this monitor still draws power when you press the standby / power button. It's only when the video signal ceases that the power usage drops to zero.

        If I press the "off" button and have to press it again to turn it on, why is the monitor still drawing power?

      • I honestly cannot see how standby can chew significant amounts of power.
        The circuitry is dead simple and very light.
        • Re: (Score:3, Insightful)

          by gmack ( 197796 )
What bothers me is that they are worried about all these half-watt drains while most of the electricity used in a house goes into heating, appliances, hot water and lights.

The big offenders need nailing first, so they should be banning hot water tanks (instant-on hot water uses 50% less energy) before they start regulating standby mode.
    • Re: (Score:2, Informative)

      by amccaf1 ( 813772 )

      A more useful version would be one that used solar cells on the top of the LCD to absorb the already expended energy of ambient lighting.
      Looks like it does... From TFA:

      Solar panels provide enough power to maintain zero consumption mode for up to five days, after which you have to press a regular power button to bring the machine out of standby.
    • A more useful version would be one that used solar cells

      *AHEM* From TFA:

      A relay cuts off the mains power whenever the video stream stops; capacitors store enough charge to flick the relay back when the signal returns. Solar panels provide enough power to maintain zero consumption mode for up to five days, after which you have to press a regular power button to bring the machine out of standby.


    • by mikael ( 484 )
The issue is that the output of one or two power stations in the UK is effectively being used to keep electronic components on standby. According to the article these TV sets will also have solar panels to keep the capacitors charged.

The other thing people can do is make sure they are using rechargeable batteries for the remote control. I wonder if solar panels could be added to rechargeable batteries, so you could recharge them simply by leaving them beside a window.
I wish the magnetic remote-charging technology would take off.
Then the remote could function off a capacitor as well.
  • by thatskinnyguy ( 1129515 ) on Thursday November 08, 2007 @09:52PM (#21289861)
    I believe the proper term is "hibernate". When my laptop is in standby, it still draws power. But when I close the lid on my laptop, and it goes into hibernation mode, it draws no power until I open the lid again. The same could be said of these monitors. They draw no power until a user does something analogous to me opening the lid on my laptop.
    • Most use some sort of supervisory micro or other electronics to sense you pressing the power switch etc. It might draw very little power, but it isn't nothing.
    • by tknd ( 979052 ) on Thursday November 08, 2007 @10:16PM (#21290081)
They're referring to the electronics standby, not computer OS standby. Nearly all electronic devices (TVs, monitors, computers, etc) are on standby unless they're unplugged. This allows you to turn on the device with an electronic switch or a remote rather than a physical switch, because part of the electronics are still "on". The surprising thing is some electronics are incredibly inefficient at standby. I tested some PSUs which would use 4 watts while the computer was "off". If you start adding up the number of electronic gadgets in your home, the watts start adding up, all while your stuff is doing absolutely nothing.
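The "adding up" arithmetic, with an assumed (illustrative) device list:

```python
# Adding up household standby draw. Device list and wattages are
# assumptions for illustration, not measurements.

standby_watts = {
    "PC PSU 'off'": 4,
    "monitor": 2,
    "TV": 6,
    "DVD player": 4,
    "microwave clock": 3,
    "stereo": 5,
}

total_w = sum(standby_watts.values())
kwh_per_year = total_w * 24 * 365 / 1000
print(f"{total_w} W of standby = {kwh_per_year:.0f} kWh/year doing nothing")
# 24 W -> ~210 kWh/year
```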
I think hibernation implies that the computer can be completely turned off (i.e., the power source can be disconnected, because the contents of RAM have been written to the hard disk). I think most laptops just go into a deep sleep, perhaps S3 suspend-to-RAM, where a small amount of power is still necessary to maintain the contents of RAM. I use S3 on my desktop, but if you pull the power plug in S3, you'll have to start up normally (or abnormally, since you probably shouldn't do that).
    • Re: (Score:2, Insightful)

Ah, hibernation. I remember it fondly. Upon discovering my new PowerBook G4 didn't support such an advanced feature I nearly returned it. Since then, when I'm not using my laptop it is constantly drawing enough power to refresh the RAM and pulse its LED. It is never off for more than an hour. I wish Apple would get with it and implement hibernation.
  • instead (Score:2, Insightful)

    why can't people just be disciplined enough to switch off their monitors before leaving for home/office?
    • Re:instead (Score:5, Insightful)

      by evanbd ( 210358 ) on Thursday November 08, 2007 @09:59PM (#21289929)
      Empirically, they can't. It does not matter why, unless with that answer comes some insight into how to change it. It would appear that simply telling them to do better has no impact. If *you* want to save power, then that method has some hope of success. If a large organization or society wants to save power, that method is almost hopeless. So, given that you can't just tell people to conserve energy and expect it to work, what can you do? Incentives or mandates for more efficient standby modes is one solution that might actually have an impact.
The only reason I'm in the habit of turning my monitor off at home is that, unlike most appliances, its "standby" mode includes a bright, blue, flashing light. The light is on, solid, when the machine is on, but it blinks on and off constantly when it's in standby. I realize the LED uses almost no power, but it both gives me a visual cue that the thing is still wasting power, and it actually keeps me awake at night (it's in my bedroom).

        But, I'd argue that no matter what the reason that people are lazy, or ev
      • Re:instead (Score:4, Insightful)

        by famebait ( 450028 ) on Friday November 09, 2007 @05:47AM (#21292653)
Damn, I wish I had points. That was the clearest rebuttal I've seen to date of the sort of numbskulled responses you see all the time on Slashdot these days: "why can't people just take responsibility blah blah blah".

It seems a lot of people simply can't tell the difference between "not my problem" and "not a problem" - between assigning responsibility for a problem and actually seeing it solved. You wouldn't expect the same people to argue "why can't all world leaders just sit down and hold hands and sort it all out peacefully", but it's exactly the same sort of worthless argument. Well, I don't know why, but your rhetorical question doesn't make whatever reasons there are suddenly disappear; hurrah if they all did what you suggest, but I'm sure as hell not going to carry on with my life pretending "well, that's solved, then".

This mental dodge is especially mind-boggling when the negative impact is on a third party and not on the one identified as 'responsible'. "Damn regulations. Parents should take some responsibility and screen their children's toys for toxic chemicals." Implicitly: "if they don't, then they deserve what they get". Errr, OK, let's just for the sake of argument assume that they did deserve it. Does that affect what their kids deserve?

This last variant also incorporates another common logical gem, the scapegoat fallacy: the idea that responsibility for something is a constant amount, so if you can blame someone, everybody else is off the hook. It's like saying "the hit man was just doing his job", or "don't blame me, blame the hit man I hired". No. You are both fully responsible for all easily foreseeable consequences of your actions, including how you affect the actions of others, and a longer list of parties who share responsibility for the result does not lessen yours unless it lessens your control over, or ability to predict, what happened.
    • If people were capable of that, we wouldn't need computers in the first place.
If you switch devices on and off all the time, they don't last very long. One reason modern electronic devices last for decades without failure is that they are never really switched off.
      • Re:instead (Score:4, Insightful)

        by rmerry72 ( 934528 ) on Thursday November 08, 2007 @10:35PM (#21290235) Homepage

If you switch devices on and off all the time, they don't last very long. One reason modern electronic devices last for decades without failure is that they are never really switched off.

Oh crap. Mechanical devices might have a problem - like spinning down and spinning up your hard drive - but not electrical devices. Modern electronic devices haven't been around for decades, maybe just over one. Most old-fashioned electronics - like old TVs and radios - did get turned on and off (they had no standby) and they did last decades.

        Modern devices barely last five years before needing replacing. Add the fact that they chew up power when they are in "stand-by" and I wonder what the definition of "progress" really is.

        • Re: (Score:3, Informative)

          by russotto ( 537200 )
          Look up "inrush current". Wear and damage due to switching devices on and off all the time is not limited to mechanical devices. You can get high voltages when turning a device off, as well.

          Old TVs certainly did have standby. It was called "instant on".

          Modern devices barely last five years before needing replacing.

Generally because of obsolescence, not failure. Or because of a failure that, in an older device in former times, would have been worth repairing. Those old TVs and radios and VCRs were not

    • Re: (Score:3, Insightful)

      by jmorris42 ( 1458 ) *
      > why can't people just be disciplined enough to switch off their monitors before leaving for home/office?

Go ahead, push the button on the front if it makes you feel 'green' or something. But other than the LED on the front going off instead of blinking and/or changing colors, you ain't done a goddamned thing. It is still wasting almost exactly as much power (less the couple of milliwatts for the LED) as if you hadn't pushed the button. Because the button on the front is just a 'soft button' on almost
      • What kind of LCD "locks up"?

        I have never, ever seen that happen.
      • Re: (Score:3, Funny)

        by Khyber ( 864651 )
        Yup, and knowing plenty about the problem, I keep my power strip right between my tower and my amplifier, right where I can reach over the keyboard and KILL EVERYTHING AT ONCE.

        No sissy waiting for stuff to shut down. All my programs are closed, hard disk activity light not blinking *click* everything's off.

        Why wait for a solution when we've had one for decades and it works more reliably than some software-controlled switch?
    • Because I'm that fucking lazy. And there's not much point if I'm not switching off the computer too, which I'm also not doing.
    • by SeaFox ( 739806 )

      why can't people just be disciplined enough to switch off their monitors before leaving for home/office?

      Remember old beige Macintoshes? The monitor power actually ran through the computer, so when you shut down the machine the monitor was powered down, too.

      One of those nice little touches we lost on the way to cost cutting and standardization with the PC industry.
  • by User 956 ( 568564 ) on Thursday November 08, 2007 @09:59PM (#21289927) Homepage
    A new range of Fujitsu Siemens monitors don't draw power during standby.

The monitor might not, but what about the power brick? Those things consume power even if no monitor is attached.
    • Since when do monitors have power bricks? I've never seen a monitor with a power brick.
Many LCDs use power bricks (several Dell LCD models I've worked with, as well as the Acer AL2051W I'm using right now, for example).
  • by HeyBob! ( 111243 ) on Thursday November 08, 2007 @09:59PM (#21289931)
I just want an Off switch on my printers and scanners! Or if they do have one, put it in the front. I use my scanner once a month; it's crazy to leave it plugged in all the time (no power switch). My printer's power switch is way around at the back, hard to reach - I only print once or twice a week. At least my LCD has an off button on the front, but it is never really off.
  • 8% sounds high (Score:2, Insightful)

    by stratjakt ( 596332 )
    What do they consider standby?

    I guess this is more save the planet stuff.

Now I need to buy new monitors, TVs, VCRs, DVD players, microwave, oven, unplug my clocks every day, etc. Lots more aluminum smelted. Lots more resources used up. Lots more pollution, but we can all sleep better knowing the residential power demand may shrink by a fraction of a percent.

I'll get right on that after I scrap my relatively new car and buy a Prius, and pull and toss all my perfectly functional lighting in favor of compa
  • by Wonko the Sane ( 25252 ) * on Thursday November 08, 2007 @10:12PM (#21290047) Journal
I like this trend. If a device wants to consume 0 power on standby, then manufacturers will finally stop putting those damn blue LEDs on everything electronic. Then I could have a dark bedroom at night without the use of electrical tape.
My Dad's eyes are "sensitive". He had me put black electrical tape over the Num Lock light on his keyboard because it bothered him. He never uses Caps Lock, and I don't even know what Scroll Lock does, so I guess he's good for now.
    • Re: (Score:3, Informative)

      by TurboStar ( 712836 )
How'd that get modded funny? I tape over mine too. Some blue LEDs literally hurt to even glance at in a dark room. Then you have the night-vision loss.
      • by TeknoHog ( 164938 )

        Back in the day, devices came with nice red LEDs that didn't ruin your night vision. A nice coincidence with the fact that red ones were the first/easiest LEDs to make.

One problem with blue LEDs is that the human eye has poor sensitivity to blue, at least resolution-wise. There's a great example of this problem here in Jyväskylä: a bicycle counter installed in a cycle path (probably using some inductive effect for detection, and intended to collect statistics for traffic planning). Its display consists

  • Pull the plug (Score:3, Interesting)

    by Drakin020 ( 980931 ) on Thursday November 08, 2007 @10:17PM (#21290093)
    So does that mean I can pull the plug and have the monitor remain in standby mode?
    • by Gryle ( 933382 )
      Why is the parent modded down? Granted my knowledge of circuitry is rather limited, but it still seems like a legitimate question.
Seems important to fix, but it's kind of a problem of degree, isn't it? Our traditional power sources are running out and our power needs will increase dramatically. In that kind of equation, even improving world power-usage efficiency by 50% would be of rather minor benefit in actually solving the problem.

    How much political power gets directed at stuff like this which could be more properly directed at new power sources?
  • by thogard ( 43403 ) on Thursday November 08, 2007 @10:22PM (#21290151) Homepage
Most very-low-power modern devices have nasty power factors. PC power supplies tend to be .6 to .8. CFLs run from about .2 to .6, while many phone chargers are about .2. That means for every watt delivered to the phone, the line losses in the grid are at least 3 W, if not more. There are also losses in the generator, so getting 1 watt into your phone (or CFL) may require more power than putting 5 W into a resistive load.
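The mechanism behind that claim, sketched with illustrative numbers (the fixed "3 W of line loss per watt" figure is the commenter's; what scales reliably is the relative I²R loss):

```python
# The grid must carry current for the apparent power (VA), and resistive
# line losses scale with the square of that current.

real_w = 1.0   # watt actually delivered to the phone
pf = 0.2       # power factor of a cheap charger (per the comment)

apparent_va = real_w / pf          # volt-amps the wiring must carry
current_ratio = apparent_va / real_w

print(f"apparent power: {apparent_va:.0f} VA for {real_w:.0f} W delivered")
print(f"line current: {current_ratio:.0f}x a unity-PF load, "
      f"so I^2R losses ~{current_ratio ** 2:.0f}x higher")
```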
    • Re: (Score:3, Interesting)

      by evanbd ( 210358 )
      I wonder what would happen if the electric company billed for the volt-amps consumed, instead of the watts, and then reported both numbers (together with your power factor) on the bill. I also wonder what would be required to do whole-house power factor correction? How much cost would it add if you were going to install a grid-tie solar system or something similar? How do these numbers compare to the added cost of power factor corrected power supplies in consumer electronics?
      • Re: (Score:3, Interesting)

        by evilviper ( 135110 )

        I wonder what would happen if the electric company billed for the volt-amps consumed,

That happens for companies already, and I believe even for homes in parts of Europe.

        I also wonder what would be required to do whole-house power factor correction?

        Some big-ass capacitors, just like the power companies do already to keep from being overwhelmed.

        How much cost would it add if you were going to install a grid-tie solar system or something similar?

        Funny you should mention it.

        Over the past few months, I've been notici

  • by TheLink ( 130905 ) on Thursday November 08, 2007 @10:34PM (#21290229) Journal
Yes, it's still a good thing, but meanwhile has anyone invented an air conditioner/heater or car that's much more efficient, but at the same time as practical and as affordable as the conventional stuff?

My air conditioner uses at least 1kW. 1 hour of air conditioning = 20 days of monitor standby.

    For those of you who live in countries that need central heating, the standby power isn't going to hurt as much during winter since you want stuff warmer anyway.

    I need a better designed house (to reduce cooling bills etc), but I can't afford one... An "Energy Star" legislation for houses here might be good, but I'm worried the builders will just use it as a way to make a lot more money.
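Checking the comment's arithmetic above (the standby wattage is an assumption):

```python
# 1 kW of air conditioning for one hour vs. ~2 W of monitor standby.

ac_kwh = 1.0                 # 1 kW air conditioner for one hour
standby_w = 2.0              # assumed monitor standby draw
hours = ac_kwh * 1000 / standby_w
print(f"1 h of air conditioning = {hours / 24:.0f} days of standby")  # ~21
```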

    • by blueg3 ( 192743 )
      Unfortunately, the efficiency of heating and cooling is seriously limited by thermodynamics. Cars have been developed that are a ton more efficient, but only some people buy them. Many others buy cars that are less efficient than the ones they had before.

      The best thing you can do for heating/cooling is to have a well-insulated house with good air movement and to take advantage of passive heating/cooling.
  • Quick Question:

    Do American power points have switches on them? Or are they just live the whole time?
  • by Animats ( 122034 ) on Thursday November 08, 2007 @10:49PM (#21290333) Homepage

    This is more of a stunt. It's relatively straightforward to design the control electronics for a display such that the electronics draws under a milliwatt in standby. The problem is how to get 1mW at 5V or so from the power line. Low-end switching power supplies don't even work right with no load, and better ones still draw a few percent of full-load current when unloaded. So you can't use the main power supply. Transformers have the same problem.

What's really needed are low-cost power supplies for obtaining something like a milliwatt from the power line without wasting more power than they deliver. But they have to be attached to the power line, and need the protection circuitry and isolation for that. It's not something that can be done with a single IC.

    One could power the standby electronics from an ultracapacitor, and when it gets low, bring up the main power supply for a few seconds for a recharge.
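A rough sizing of that ultracapacitor scheme (all component values assumed):

```python
# How often would the main supply need to wake up for a recharge?

C = 10.0                   # farads, ultracap (assumed)
v_full, v_min = 5.0, 3.0   # usable voltage window (assumed)
p_standby = 1e-3           # 1 mW standby electronics, per the comment

# Usable energy: E = 1/2 * C * (v_full^2 - v_min^2)
energy_j = 0.5 * C * (v_full**2 - v_min**2)   # 80 J
interval_h = energy_j / p_standby / 3600

print(f"{energy_j:.0f} J stored -> recharge every {interval_h:.0f} h")  # ~22 h
```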

From the introductory blurb: "uses capacitors and relays to avoid drawing power". Drawing on my memory from my hardware (as in soldering and breadboarding) geek phase (Z8 ForthChip, anybody?): a capacitor acts like a battery, so all this is doing is storing power before going into standby. That can't be saving power, just shifting it around.

The next part (my opinion) is the one that makes this work (FTFA): "Solar panels provide enough power to maintain zero consumption mode". Pretty nifty; I've seen solar
    • Re: (Score:3, Informative)

      by Alioth ( 221270 )
      No, it can be saving power and this is why.

If you power your standby circuit off the line power, you need a transformer or switch-mode supply to isolate it from line power and provide the low voltage (probably not 5V; more likely 3.3V for most modern devices). Unloaded, the power supply itself will consume several watts; at very low loads it is probably less than 1% efficient, so it's just wasting 99% of the energy.

      If you charge a capacitor instead, when the supply is under load and operating ef
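The arithmetic behind this point, with assumed numbers: a supply left idling to feed a standby circuit wastes far more energy than the one-time charge of a relay-flipping capacitor:

```python
# Idle-supply waste vs. the one-time energy to charge a capacitor.

idle_supply_w = 3.0    # overhead of an unloaded supply (assumed)
off_hours = 16         # monitor off 16 h/day (assumed)

idle_wh_per_day = idle_supply_w * off_hours            # 48 Wh/day

cap_j = 0.5 * 0.1 * 12.0**2   # 0.1 F charged to 12 V (assumed): 7.2 J
cap_wh = cap_j / 3600

print(f"idle supply:    {idle_wh_per_day:.0f} Wh/day")
print(f"one cap charge: {cap_wh:.3f} Wh")  # ~0.002 Wh
```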
It is not a trivial problem. Most video display technologies need some stabilization to display images accurately. That requires a constant current load, unless you want to go back to the '50s, '60s, and '70s, when TVs had to warm up before you could use them.

    I agree that most things don't require "stand-by" power. Hell, I have some USB external hard drives where the switch isn't on the actual power supply, but on the device, meaning that the power supply is always drawing some current even though th
  • Okay, so a relay flips the mains power off when there's no signal, and presumably the relay coil is off in this state. But when the monitor comes back on, presumably the relay needs to flip into its on state. Surely that would increase the "on" power consumption of the monitor, making it not very green for high-use applications.

    Unless they have a two-coil or polarity-reversing relay and some clever magnets on the relay contacts so one state doesn't need to be constantly fighting a spring.

Instead of storing power in the device, send power to it: put in an RFID-like receiver that can be energized from a distance, such as by a current-inducing remote control.

    Think outside the box ... no need to even have a capacitor.

    Ron
  • Remote Conservation (Score:3, Interesting)

    by Doc Ruby ( 173196 ) on Friday November 09, 2007 @12:10AM (#21290905) Homepage Journal
    Is there a really cheap remote power control for appliances that I can control via PC/Linux, which will shut off all power, and drain the minimum while watching for the powerup signal? Bluetooth or other wireless, or even over the electric wires in the wall.

It seems to me like some kind of RFID-type passive tech could do this with only the power from the RF signal itself to flip the transistors gating the appliance power on/off.
  • I'm not sure who, but someone has indeed already invented a mechanism by which a device draws Zero Power when not in use.

It's called 'OFF'. You may have heard of it.
  • by Gordonjcp ( 186804 ) on Friday November 09, 2007 @08:31AM (#21293559) Homepage
    No, I'm not a Mac fanboi, but I did have a Mac IIfx. That, in common with most Macs of the day, would draw no power at all in standby mode, but could be woken with a keystroke. There was a relay in the PSU that shut off all power, and a small battery that kept the clock running. The power switch fired the relay in the PSU through a couple of capacitors, enough to turn on the supply for long enough to bring up one of the supply rails and hold the relay on.

"I shall expect a chemical cure for psychopathic behavior by 10 A.M. tomorrow, or I'll have your guts for spaghetti." -- a comic panel by Cotham

Working...