Standby Electronics a Waste?
gnunick writes to tell us BBC News is reporting that UK citizens waste quite a bit of electricity each year by leaving electronic gadgets on standby or charging. Critics argue that standby mode on electronics is completely unnecessary and should be removed for a number of reasons. From the article: "To put it another way, the entire population of Glasgow could fly to New York and back again and the resulting emissions would still be less than that from devices left in sleep mode."
A small step in the right direction (Score:2, Informative)
In Europe you have to physically push a button to turn them on from standby mode. Unfortunately I haven't seen many devices (like radios) that work the same way.
But I guess the TV is something that almost everyone has and everyone left on standby, so it was a good choice for a device with a mandatory off switch.
Let's hope this practice spreads around elsewhere and to other devices. It's a small price to pay (moving your ass to turn
Re:A small step in the right direction (Score:3, Interesting)
Maybe the European TVs today are a hold-over from that.
Re:A small step in the right direction (Score:3, Informative)
This also lengthens the life of the tubes, since what kills them is usually related to heating and cooling cycles more th
Re:A small step in the right direction (Score:3, Informative)
A Small Step In The Wrong Direction (Score:5, Insightful)
Anyone with any sense who has a career in environmental protection tries to make people take one less flight per year (all the cars in the UK produce one tenth of the emissions that all the flights in the UK produce!). They persuade people that if they recycle anything, to recycle their aluminium, because the carbon savings from, e.g., glass are negligible if not negative, but the savings from aluminium are immense. They persuade people to buy electricity from companies that at least pretend to care about emissions. They persuade people to buy food that doesn't have to be flown from New Zealand to get to their plates.
They do not have a go at people about leaving devices on standby.
Standby is there to make life a little easier, and almost all devices make standby easy and full power-off harder. Standby wastes relatively bugger-all electricity. So put things in perspective and don't make people feel guilty about trivial shit, because they will assume that saving the environment is all as tedious and unpleasant as this, and choose to do nothing at all.
Re:A Small Step In The Wrong Direction (Score:3, Interesting)
My power supplier lends out compact power usage meters for about a week (about like this [thinkgeek.com]; yes, they seem to have recycled the pun in the dept. name).
Anyway, more or less coincidentally (/. has got these stories quite often, and I planned on posting about it as soon as I found the right occasion), I have got one pretty much right now. The claims you promote there, about the people with a career in environmental protection not promoting anti-sta
Re:A Small Step In The Wrong Direction (Score:3, Informative)
Dude, I have been to the UK. There is a damn good reason why all of your food is shipped in from New Zealand. When talking about the environment you need to be reasonable. Buying an efficient car, trying to use public transportation, cutting down on energy consumption, and recycling? All are reasonable. Having to eat native British food every single day for the rest of your life? Put a gun to my fucking hea
Re:A Small Step In The Wrong Direction (Score:3, Informative)
Don't know whether you have any specific emissions in mind, but I'd call this statement plain wrong. Currently total air-travel energy use is about a quarter of total car traffic energy use (though admittedly air travel is growing at an alarming rate). Airplanes produce more emissions per distance, and also some particularly nasty types of pollution (water vapour at high altitudes, for instance, is a greenhouse factor), but i
Re:A Small Step In The Wrong Direction (Score:3, Interesting)
I prefer the light from the modern compact fluorescents to incandescents -- I find that rooms just seem a lot "sunnier" when lit with the higher-color-temperature lamps. I'm not talking about the old greenish/blue-white ones, they're pretty disgusting, but the last few "warm fluorescent" ones I picked up at Home Depot are a lot nicer than the incandescent lights I replaced. I think they're around 4000-5000K, and going back to 2800K (typical incandes
Re:A Small Step In The Wrong Direction (Score:3, Insightful)
Really folks, this gadget-centric perspective is pretty ridiculous. If you want to save power, look at the bigger appliances and the heating and cooling efficiencies of your house. That's where the savings are to be had, not in obsessing about your roomba.
(Though as an aside I must say it would be real nice if the Linux Kernel folks would deal
Re:A small step in the right direction (Score:3, Informative)
Re:A small step in the right direction (Score:4, Interesting)
Completely off topic. Why have an eject button on a DVD remote? You still have to physically remove the disk!
Re:A small step in the right direction (Score:4, Informative)
This is also a good concept to remember in the context of this discussion. A CRT uses a very large electro-magnetic coil. When you first power this coil up, it draws an enormous current (if your house is wired poorly, you will see your lights dim). That energy is not dissipated, however; rather, it is stored in the coil as an electromagnetic field. As that field is used to control the electron beam that generates the image on the screen, the electromagnetic field is consumed, and the coil draws a current (much smaller than the initial current) in order to replace it. When the CRT goes into standby, that electromagnetic field is no longer being consumed, and the only current being drawn represents the energy being dissipated as heat -- the more efficient the design, the lower this current will be. Remember, there is a large amount of energy stored in the coil, and a small amount of energy being consumed. When you switch off the CRT, the circuit of which the coil is a part is broken. When this circuit is broken, the entire electromagnetic field will be dissipated at once as an electromagnetic pulse, wasting all of the energy that it was storing. So, depending on how often you use it, standby may waste less energy than repeatedly turning the device on and then off again.
Re:A small step in the right direction (Score:3, Informative)
TVs use a thermistor arrangement that results in a coil wound around the CRT sucking gobs of power for about a second on power-up. It's there to give the shadow mask a quick demagnetizing.
You can often hear this as a brief hum that quickly fades away on startup. If you then turn the TV off and quickly on again, the heavy draw and hum won't happen, because the thermistor is still hot.
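A rough break-even sketch of the trade-off being discussed here. All of the figures below (surge power, surge duration, standby draw) are illustrative assumptions, not measurements of any particular TV:

```python
# Compare the one-off energy of the power-up degauss surge with the
# ongoing cost of standby. All figures are assumed, for illustration only.

SURGE_POWER_W = 300.0   # assumed coil draw during the ~1 s degauss surge
SURGE_TIME_S = 1.0
STANDBY_POWER_W = 5.0   # assumed standby draw

surge_energy_j = SURGE_POWER_W * SURGE_TIME_S   # 300 J per power-up

# After this many seconds of standby, standby has consumed as much
# energy as one extra power-up surge would have:
break_even_s = surge_energy_j / STANDBY_POWER_W
print(f"break-even after {break_even_s:.0f} s of standby")
```

On these assumed numbers the surge argument only favours standby for gaps of about a minute; for overnight standby the surge energy is negligible.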
Mod parent up! (Score:3, Informative)
Re:Don't lie (Score:5, Informative)
US TV: "Power" button on the TV itself and the one on the remote do exactly the same thing: switch between "on" and "standby". The only way to get it off is to unplug the mains cord.
European TV: Power button on the TV requires some finger pressure and physically disconnects the power, leaving the remote impotent. The "power" button on the remote only puts it into standby.
Of course there are exceptions but this has typically been the situation with my and my family's relatively modern CRT TVs on both continents.
Re:Don't lie (Score:3, Informative)
That would be illegal in the UK and EU. It wouldn't meet the safety requirements.
As an aside, I had a TV that could be switched off from the remote - actually entirely off - but not on. The on/off switch had a solenoid on it. When you triggered the solenoid, it let go the power button, turning the set off. Big mid-80s Decca
Re:Don't lie (Score:5, Insightful)
Who cares? The CRT in my TV is turned off (to the point that it takes about 10 seconds to fully come back on), so the component that takes 99.9% of the power isn't drawing a thing. The only thing required for standby is the IR receiver circuit. How much current can that possibly draw (at low voltages to boot) when idle?
It's impossible to waste energy in the winter (Score:5, Insightful)
I actually bought one of those power outlet meters to try to reduce my home energy usage. [amazon.com]
But after I tested two or three appliances, I realized that this whole endeavor is complete nonsense except in summertime. If my computer, power amp, water heater, or even incandescent lights are running during the winter... every watt of power they consume ends up as heat and reduces my heating bill by almost exactly that watt.
Now yes, I do have electric heating. The tradeoff may differ for those who don't. But the fact remains that powering devices in the home is much less wasteful than it seems, for those who live in colder climates. Since this study was done in Britain, I wonder if they controlled for this factor.
In the summer, of course, I try to keep things off as much as possible. But this is primarily because it's too hot, and only secondarily to save power.
--
Dum de dum.
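The winter-offset argument above can be put in rough numbers. A minimal sketch, assuming resistive electric heating; the standby load, heating-season fraction, and tariff are all made-up illustrative figures:

```python
# With resistive electric heating, every joule a gadget dissipates indoors
# is a joule the heater doesn't have to supply. All numbers are assumptions.

STANDBY_W = 10.0        # assumed total standby draw of household gadgets
HOURS_PER_YEAR = 8760
HEATING_FRACTION = 0.5  # assumed fraction of the year the heating is on
PRICE_PER_KWH = 0.15    # assumed electricity price

gross_kwh = STANDBY_W * HOURS_PER_YEAR / 1000   # energy the gadgets draw
offset_kwh = gross_kwh * HEATING_FRACTION       # heat the heater skips
net_kwh = gross_kwh - offset_kwh                # what is actually "wasted"
print(f"gross {gross_kwh:.1f} kWh/yr, net {net_kwh:.1f} kWh/yr, "
      f"net cost ~{net_kwh * PRICE_PER_KWH:.2f}/yr")
```

So under these assumptions roughly half the nominal standby waste is recovered as useful heat; in a fully electrically heated home during the heating season, effectively all of it is.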
What about us? (Score:2)
It would have to be a lot I would think. Something to think about- what really needs standby and what doesn't?
Re:What about us? (Score:3, Interesting)
At home, I have my entire computer setup (box, monitor, printer, scanner, etc.) plugged into a single power strip with a big switch mounted to the desk; one switch to rule them all. If only I didn't need to manually shut down WinXP, it'd be perfect.
The main idea was to get rid of the annoying standby LEDs when I didn't need them, but the power saving is nice too.
Re:What about us? (Score:3, Insightful)
VCRs make sense. I don't necessarily need the visible clock display, but I do rely on the timer to kick the machine on and record my shows.
Televisions? What a waste. They could just sync up to the cable system's time when I power them on.
Cable Boxes? Please. Those things u
Any heat is good heat in winter (Score:4, Informative)
I'm not saying we shouldn't conserve energy, but these kinds of calculations are often off by orders of magnitude.
Re:Any heat is good heat in winter (Score:5, Interesting)
No, not quite that easy, unfortunately. I'm renovating a summer house, and though hardly an expert, I've learned that where you place the heat sources matters a lot. You want your radiators below the windows, for instance, because that is where the cold "falls" into the room. If you put the heating somewhere else (the PSU in the computer on your desk, for instance), you risk getting cold air currents along the floor and walls, and the nice heat going up to the ceiling and being wasted. Humans react to temperature changes; many will feel chilled if they get these cold draughts along the floor and walls.
Offtopic - What amazes me as a Swede is that all the Anglo-Saxon countries I've been to build such incredibly flimsy and energy-inefficient houses. England, Australia, and from what I've heard, the US as well. I mean, you are rich countries, why build like the third world?
When I lived in Australia, my host had an aircon constantly blasting heat in winter and cold in summer. Since there were big gaps under the doors and around the windows, and very little insulation in the ceiling, the desired temperature quickly escaped. In winter he closed off much of the house except one room where the aircon was, and we had to stay there wrapped in blankets. When I suggested he insulate the house to save money and energy, he said "No no, it is much too hot in summer here!" I tried to explain that insulating a house is like a thermos: it can keep your chocolate warm in winter, or your chilled drinks cold in summer. He remained sceptical.
Re:Any heat is good heat in winter (Score:3, Informative)
Where was that? In Victoria certainly almost all houses are insulated. It gets pretty hot in summer too; over 40C, and close to freezing (though never snow i
Re:Any heat is good heat in winter (Score:5, Insightful)
I live in Australia and it amazes me what primitive building codes they have. Most homes are timber-framed "brick veneer" and their thermal performance is abysmal. I think new regulations now force walls and roofspace to be insulated but it seems to have been a long time coming. My house was built in 1982 and it totally sucks - absolutely nothing in the walls and a limited layer of loose fill in the roof. Whenever I have done any interior work that involves exposing the frame I have insulated that bit, but it's very patchy. The roof space can be dealt with, but most of the problem is the walls and windows.
In addition, many homes are built individually to the owner's specification, and very few seem to have a clue about using the natural direction of the sun to create sensible areas of light and shade, areas that are warm in winter and cool in summer. Luckily in that respect my own house is situated correctly - in fact 180° to the orientation shown on the original plans! Obviously someone realised just before it was erected that the original orientation was stupid. Or maybe they just misread them...
The other thing that amazes me is that more homes are not built with built-in solar water heating and other solar-powered ventilation arrangements. These require no moving parts or external power, and are very simple and effective. There ARE some houses that have these features, and their benefits are obvious as soon as you walk into one - nice and cool in summer, and the sunnier it is, the cooler they get! Hot water for free. Instead most people fit reverse-cycle aircon to their homes to make them bearable, when all it would take is some better building codes. It's about time this was forced on builders by legislation, but there appears to be no sign of it. Even the UK is forcing new homes to be built with solar water heating, for god's sake!! I think outsiders think of Australians as being quite 'green conscious', and in some respects they are, but talk about missing the wood for the trees!
Have you considered blown-in insulation? (Score:3, Informative)
Here in the states, we have "blown-in" insulation. They simply drill a small hole (maybe 3/4" or so) in your wall, and blow little flecks of insulation into it. Actually, I think they drill two holes, one low and one high, and when they see the insulation passing the top hole they know the cavity has been filled. Because there are studs every 16" or so, they have to do this many times acr
Re:Any heat is good heat in winter (Score:5, Interesting)
As a Norwegian living in England, I have to agree... Here in the UK I think it's largely down to mild winters. Insulation is practically non-existent in older buildings here (most new builds seem to be better, thankfully) - just a thin wooden floor with huge cracks and 20 cm or so of air separating you from the ground is quite common. And hollow wooden floors with cracks, only sealed with plasterboard for the ceiling of the floor below, are pretty normal within residential houses.
Before I'd moved to the UK I hadn't even seen buildings built like that except in museums.
The lofts are usually equally bad - huge parts of the building mass still have completely uninsulated lofts (though admittedly there is a push to change that, with government grants often available to offset the cost of insulation) and huge cracks everywhere.
But my pet peeve is British builders' approach to leaks. Just fill the cracks with some silicone or other filler, paint over whatever stains there are, wait until the next crack develops and try again, instead of ensuring bathroom floors are properly sealed.
I guess it's a cost thing combined with the fact that the climate lets them get away with it (for those who haven't lived anywhere COLD: Imagine having your walls full of moisture. Then imagine that water freezing and expanding. Now imagine the cracks developing after a few years of that happening on a regular basis...). But it annoys the hell out of me when I see bathrooms built in a way that'll give the people on the floor below a nice shower if you get the floor a little bit wet.
British builders, though, seem to be in a league of their own, and that is not a compliment. I've never had to deal with such a bunch of incompetent twits. Just got to love how they think it's perfectly fine to keep pumping more silicone into a flat roof if it's leaking, instead of actually trying to find and fix the massive leaks in the top coating of the roof. Because apparently that's too much work for them.
The lack of a proper certification system and a proper education is really a problem - to the point where it's not uncommon for people here to hire in German builders to get things done properly even with the extra costs (for larger jobs they'll easily pay for themselves by actually doing things properly, and without the massive delays British builders seem to take great pride in...).
Re:Any heat is good heat in winter (Score:4, Funny)
Re:Any heat is good heat in winter (Score:3, Funny)
Re:Any heat is good heat in winter (Score:3, Interesting)
Re:Any heat is good heat in winter (Score:4, Interesting)
Re:Any heat is good heat in winter (Score:3, Informative)
Sort of: electricity is currently (ha!) about four times more expensive per kWh than gas in the UK. Presumably this is down to a) conversion losses at the power station b) transmission losses c) value - electricity can be used for more purposes in a typical home than gas. If you're heating your home with electricity, you're effectively doing c
Re:Any heat is good heat in winter (Score:3, Informative)
This is only true if it is a season and a time of day when you would normally have the heating on, and you normally heat your house with electricity only.
Heating with electricity generated from fossil fuels is ridiculously wasteful in any case. Burning them locally with a modern heating system is radically more efficient.
Re:Any heat is good heat in winter (Score:3, Insightful)
So where exactly does that power go? In the form of flying angels that flap around the room maybe?
Re:Any heat is good heat in winter (Score:3, Informative)
Wouldn't that also heat the room ;-)?
The angels would need to fly outside and flap there for the heat to be completely lost.
Re:Any heat is good heat in winter (Score:4, Insightful)
The real problem with this reasoning is that the original electricity was probably generated by a turbine that was driven by heat. The efficiency of that conversion is far below 50%. Only if your house is electrically heated, without employing a phase-change heat exchanger (a reversed fridge for the air leaving the building, making the outside a little cooler), is it equivalent, and one can still argue about how to achieve optimum airflow.
Of course, standby power in electrically heated buildings is less of a problem than in electrically cooled ones. In that case you have waste power for standby and waste heat from standby that must be handled by the AC, causing even more waste.
Re:Any heat is good heat in winter (Score:5, Informative)
Not possible! Unless that energy is actually performing some work -- causing motion, facilitating a chemical reaction etc -- ANY power drawn by an electronic device will come right on out as heat.
If a device uses 2 Watts of electricity while on standby, you'd better believe that 2 Watts of heat energy come out of that device. (minus the energy of any photons emitted by light-producing components)
GP is right in that in any environment where energy is being used to keep the room temperature UP, there's really no "waste" from this standby power. Electric heating is usually a bit more expensive than other energy sources, but your VCR on standby at 5 watts is no worse than running a small electric space heater at 5 watts.
The real problem comes in cases where energy is being used to COOL a space -- in any hot part of the country, or in data centers etc. In THOSE cases, you'd want to eliminate ANY power waste, since you're paying for that heat twice -- once for the energy that's producing the waste heat, and a second time for the cooling equipment to REMOVE that heat.
I don't mind leaving any/all lights on in my house during the winter. But during hot summers, I look at each 100W light bulb as an evil source of dastardly HEAT.
- Peter
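The "paying twice" point above can be quantified with a quick sketch. The standby draw and the air conditioner's coefficient of performance (COP) are assumed illustrative figures:

```python
# In a cooled space you pay once for the standby watts, then again for the
# electricity the AC uses to pump that heat back out. Numbers are assumed.

STANDBY_W = 5.0
AC_COP = 3.0  # assumed COP: watts of heat moved per watt of compressor input

ac_extra_w = STANDBY_W / AC_COP      # extra compressor power to remove the heat
total_w = STANDBY_W + ac_extra_w     # effective total draw attributable to standby
print(f"effective draw: {total_w:.2f} W ({total_w / STANDBY_W:.2f}x the nominal)")
```

So with a COP of 3, every standby watt effectively costs about 1.33 W while the AC is running.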
Re:Uh-huh... (Score:3, Insightful)
Consumers want standby? (Score:5, Insightful)
I remember my first exposure to "standby". An HP laserjet 4L I bought in 1995 -- it didn't have an off button. That bothered me so much I bought one of those undermonitor powerbars with switches on the front so I could turn the darn thing off. Since then, more and more things have come out that can't be shut off and I've sort of accepted "standby" now
Re: (Score:3, Insightful)
Re:Consumers want standby? (Score:5, Funny)
Re:Consumers want standby? (Score:5, Insightful)
I did the same thing to allow myself to power-off a Brother laser printer I bought around that same time with no off switch.
My plan backfired, though. Due to the design of the printer, a (long) cool-off period was required after anything was printed. I got in the habit of killing power to it immediately after printing, so the fans didn't blow, and I ended up ruining the fuser and having to get it replaced.
Now granted, not all devices have this type of passive power consumption required. But it pays to keep in mind WHY an appliance designer may have opted to design a standby mode instead of a power on/off switch.
Convenience (Score:5, Interesting)
Some devices, like my DVD player and amplifier, have no way of turning them fully off. The power button on the unit simply takes them out of standby or puts them back into standby. It is not a hard power switch like on devices of old. Even PCs these days (with ATX power supplies) can be considered to be on standby, since there will always be a little bit of power consumed.
Really, the only way you are going to stop this problem is by switching off everything at the wall. The power point for my hifi setup is behind a shelf and there is no way to easily reach it, so that option is out. The only other thing that comes to mind is for manufacturers to put the older-style power switches on equipment, but I can't see that happening in a hurry.
Re:Convenience (Score:5, Insightful)
I suspect actually that what is being angled for here is either UK or European legislation that would prohibit equipment from having a standby button and mandate hard on/off switches. Personally, I am sufficiently concerned by global warming to support such a move, though I'm a pretty big offender when it comes to leaving the TV on standby.
Re:Convenience (Score:3, Insightful)
AOL
To turn off my TV installation would mean separately switching off the television itself, the video recorder, the DVD player and the digital TV decoder. Neith
Re:Convenience (Score:3, Interesting)
I guess this would encourage many companies to invest a few bucks more into energy efficiency when it comes to standby. Even if devices get a little more expensive, consumers and the environment will benefit in the long run.
Re:Convenience (Score:3, Insightful)
Why? I turn my TV off at the button on the set every night. Doing so adds maybe an extra 5 seconds to the warm up time the next day when I switch it on, but so what? Same for my monitor - if I'm going to be away from the PC for more than a few minutes, off it goes.
I really don't see how it's an inconvenience.
Re:Convenience (Score:4, Insightful)
John.
Re:Convenience (Score:3, Interesting)
A small device to listen for an ON signal from a remote control is only going to consume a milliwatt or so. The real problem is that a normal power supply will waste more than that milliwatt with no load.
I have several devices in my home which run on plug packs at about the same voltage. I made a wiring harness to run them off the same supply. Doing it this way should waste less power.
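The contrast the parent draws -- a milliwatt-scale receiver versus a wall wart's no-load loss -- works out roughly as follows. Both wattages are assumed, illustrative figures:

```python
# Annual energy of a ~1 mW remote-control listener vs. an assumed 1.5 W
# no-load loss of a cheap linear plug pack. Both figures are illustrative.

HOURS_PER_YEAR = 8760

loads_w = {"IR receiver": 0.001, "idle plug pack": 1.5}
annual_kwh = {name: w * HOURS_PER_YEAR / 1000 for name, w in loads_w.items()}

for name, kwh in annual_kwh.items():
    print(f"{name}: {kwh:.3f} kWh/year")
```

On these numbers the receiver itself is three orders of magnitude below the plug pack's idle loss, which is why sharing one decent supply across several devices can come out ahead.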
forgetting the off button (Score:4, Interesting)
Re:forgetting the off button (Score:5, Funny)
Conference rooms in my office building have PIR movement detectors to switch on lights. When we developed problems with our mains power supply (too many computers and aircon units in the building) I suggested we use them all over the place.
One day I went past my manager's office. He was sitting at his desk in the dark. If he stops moving for long enough, the lights go off.
Re:forgetting the off button (Score:4, Funny)
*twitch*
Don't start me on the fact that we can't turn the lights off, so they're blazing away throughout summer. Although we kinda need them, because they made the windows tiny "to save energy".
Arrgh!
It's not just standby... (Score:3, Funny)
Believe it or not, Oil companies are to blame. (Score:3, Insightful)
Apple's Sleep Mode on Macs, A Question. (Score:4, Interesting)
Re:Apple's Sleep Mode on Macs, A Question. (Score:4, Insightful)
You might save time, of course, there is no denying that. Whether the process saves energy is a weirder question. Will your saved time result in the machine staying in sleep mode for one minute longer, or will you do actual work for one more minute? The difference in power usage between an idle and an active desktop system is not that significant, at least not when the shift won't involve heavy duty for the GPU in either case.
On the other hand, sleep mode will also induce almost all of the material fatigue in the various components that turning off would: the HD will spin down and so on.
Long live Suspend-to-disk, no matter what OS it is. Yes, it will take longer to resume than suspend-to-RAM, but it's still often quicker than a clean boot, and certainly quicker than a clean boot + resuming work where it was.
Re:Apple's Sleep Mode on Macs, A Question. (Score:5, Insightful)
Re:Apple's Sleep Mode on Macs, A Question. (Score:3, Informative)
If the iMac is like my Mac mini, then keeping it in sleep mode for a few hours saves more power than the extra power required for a full boot. I don't know the cutoff time, but I expect that if you're going to leave it all weekend, then off is the best option; but if you check it several times a day and it's only idle for the eight or so hours while you sleep, then keep it in sleep mode.
I've been looking for ways to replace as much of the "on all the time" junk with smaller more efficient systems
Smarter electronics or smarter people? (Score:4, Insightful)
While stuff that needs longer boots (like PCs) can take advantage of standby (or sleep) mode, everyday appliances like TVs, VCRs and so on could easily be smarter as far as power consumption is concerned.
Maybe the same could be done for power supply units and AC-to-DC adapters. Once the device is charged, a controlled circuit breaker could interrupt any further consumption.
But then how much pollution would be created by all those new things whose lifespan is within a couple of years?
Or maybe smarter people would be a much better solution!
Turn your appliances completely off if you know you won't need them for a while. Unplug your cell phone charger once you've used it.
And don't leave anything turned on only because you think you'll save some milliseconds of your time!
Don't forget Transformers (Score:5, Insightful)
I've read that 10% of a household's energy use is from transformers.
That they use power is obvious if you look at the electrical diagram -- the things have a loop through which current travels. There is some waste power that gets lost.
Do we all go around the house unplugging our transformers, to stop from using power? I doubt it.
I figure that my electronic devices, with their "waste heat" are actually heating my place. I don't see that as a bad thing -- I want the heat.
If, on the other hand, I had to run AC to cool down the building, then I'd be peeved at them sucking up power.
Comment removed (Score:4, Funny)
Re:Don't forget Transformers (Score:3, Interesting)
Well, kind of. But you're not really doing your bit for the national Kyoto commitment that way. Consider: if you heat your home by burning gas, you're getting pretty much 100% efficiency. All the energy turns to heat. If, OTOH, you heat your home with electricity, somewhere there's a power plant burning gas at much less than 100% efficiency to provide that power. Much
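The efficiency gap described above can be sketched numerically. The boiler and power-station efficiencies below are assumed round numbers, not official figures:

```python
# Primary energy needed per kWh of useful heat: burning gas in a local
# boiler vs. burning it in a power station to make electric resistance heat.
# Both efficiencies are rough assumptions for illustration.

BOILER_EFF = 0.90   # assumed modern gas boiler efficiency
PLANT_EFF = 0.40    # assumed power-station efficiency incl. grid losses

HEAT_NEEDED_KWH = 1.0
gas_in_boiler = HEAT_NEEDED_KWH / BOILER_EFF   # gas burned by the boiler
gas_in_plant = HEAT_NEEDED_KWH / PLANT_EFF     # gas burned at the plant
print(f"boiler: {gas_in_boiler:.2f} kWh gas; "
      f"electric: {gas_in_plant:.2f} kWh gas per kWh of heat")
```

On these assumptions, electric resistance heating burns more than twice the gas per unit of delivered heat, which is the parent's point in numbers (a heat pump would change the picture).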
Re:Don't forget Transformers (Score:5, Funny)
Transformers... more than meets the eye!
Standby mode doesn't have to suck (Score:5, Insightful)
The main reason sleep mode sucks, though, is that by its increasing ubiquity it's pushing good old circuit breakers to where you can't find them. Plenty of PC cases only have the soft-off button wired to the motherboard, and the only way to break the circuit is to remove the power plug from the socket (which incidentally is just great for repair and maintenance, since now you've also removed the ground connection). Many TVs have thoroughly hidden actual-off switches. And sometimes, when you switch something OFF you just want it to switch OFF. *sigh*
cold lights (Score:4, Funny)
Re:cold lights (Score:5, Funny)
Gets off with what? The cucumber?
Comment removed (Score:5, Insightful)
Re:Has anybody thought of or mentioned... (Score:5, Informative)
None. Motion-activated sensors would know if someone is in there who shouldn't be. I expect that local government could slash energy consumption by enforcing some kind of "out of hours" energy tax aimed at lights, computers etc. being left on overnight. Companies would certainly enforce a turn-off policy if it was hitting them in the wallet.
A job for the manufacturers (Score:5, Insightful)
Also there's an issue which no-one seems to have noticed - perhaps not with all TVs, but at least on the two that I own.
If I turn them off at the set, they lose their settings. I have to reset the time and any preferences etc.
I do agree that wasting all that power is plain crazy, so why can't the manufacturers just have an on/off on the remote, where off means a *tiny* amount of power is flowing just to keep the IR receiver active? All prefs should be saved in solid-state memory that does not require power - regardless of how cheap the TV is, surely all manufacturers can manage that without a cost implication.
I guess standby is a leftover from old TVs that took time to warm up - that's pretty much gone now, and I imagine non-existent with flat-screen TVs.
Seems bizarre really: 2006 and we haven't thought of a way to turn a TV off.
The real question is... (Score:5, Funny)
home entertainment issues (Score:3, Interesting)
Just to put all the devices into standby, I need to fumble with four different remote controls, or else they all end up heating the living room when nobody is in there. Typically, the TV is the only thing that gets put into standby.
Given that the VCR has auto-setup and can recover from a power outage (save for timer recording, which many people don't use), I guess it might make some sense to hook them up to one of those master-slave power bars, whereby when the TV stops drawing full current, the other sockets are switched OFF.
The digital satellite set-top box has a few issues with losing power (it loses EPG reminders, and defaults to some silly promotional channel, which I guess is mostly due to design by BSkyB).
Here's another thought: duplicate circuitry. All of those devices have DC power supplies. The digital satellite set-top box has MPEG-2 decoders, as does the DVD player, yet they are never used at the same time, but the circuitry is probably receiving its full power budget at all times. Likewise, the TV set and DVD player both have audio amplifiers, yet I've never used the speaker outputs on the DVD player.
If I had one well-designed appliance that had the screen, a DVD transport, a VHS transport (yes, they are still used), and an integral digital satellite decoder, it could use far less power overall. The problem there is obsolescence. In order to get that, I need to either sell, give away, or recycle the existing equipment, which uses energy. It also means that if I decide High Definition television is going to be good, I'd have to discard the lot and replace it - but with something with an HD-DVD or Blu-ray mechanism? It turns into diminishing returns.
If all such equipment responded to a standard "enter standby" remote control code, then I bet more equipment would go into standby rather than remaining at full power. If they could all go into a mode where they use less than a watt in standby, all the better.
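For a sense of scale, here's a back-of-the-envelope sketch of what a stack like this burns per year in standby. The wattages are illustrative assumptions on my part, not measurements of any particular model:

```python
# Rough annual standby energy for a home-entertainment stack.
# Wattages below are illustrative assumptions, not measured values.
standby_watts = {
    "TV": 5.0,
    "VCR": 6.0,
    "satellite set-top box": 9.0,
    "DVD player": 4.0,
}

HOURS_PER_YEAR = 24 * 365  # 8760

total_watts = sum(standby_watts.values())
kwh_per_year = total_watts * HOURS_PER_YEAR / 1000  # W x h -> Wh -> kWh

# If every device met a 1 W standby target instead:
target_kwh = len(standby_watts) * 1.0 * HOURS_PER_YEAR / 1000

print(f"Total standby draw: {total_watts:.0f} W")
print(f"Energy per year:    {kwh_per_year:.0f} kWh")
print(f"At 1 W each:        {target_kwh:.0f} kWh")
```

Even with these guessed figures, the sub-1 W target cuts the standby energy by a factor of about six.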
Haway the lads (Score:5, Insightful)
It's not the entire population of Glasgow flying to New York that worries me. It's the prospect of them coming back again.
Wasting electricity is an expensive pastime, no doubt. But worrying about standby mode is a gnat-bite compared to our hopeless dependence on the motor car and, in the UK's case, our increasing dependence on importing energy from rather unstable parts of the world. This sounds like a typical UK New Labour gambit: encouraging people to feel like good citizens while dodging all the tough questions.
Moving parts (Score:3, Insightful)
My parents routinely turn things off instead of using standby and get through a TV roughly every 4 years. Cause of failure? The power switch! Kettle? Hoover? Stereo? All die within a few years because of a dead power switch! In contrast, I moved out of the house some 10 years ago and have yet to have anything die. Go figure.
Sure, not using standby may save a few watts of draw, but what about the extra waste generated by dead electronics?
Re:Moving parts (Score:3, Insightful)
Simply Off (Score:4, Insightful)
The problem isn't that electronics are not smart enough; the problem is that electronics manufacturers aren't. As a customer, I would like one very simple thing: a button that, when I press it, actually means "off" - as in "absolutely no more electric power going into this device".
not a difference, sometimes (Score:5, Interesting)
Solution? I sacrificed the factory guarantee and am currently in the process of modifying the device. But I pity the consumer-electronics droids without knowledge of circuitry or soldering skills, and needless to say, I will never buy an AverMedia product again.
Exclamation mark instead of question mark (Score:3, Insightful)
Shouldn't that be an exclamation mark at the end instead of a question mark?
Alarmist graphs? (Score:3, Interesting)
Check out this graph [bbc.co.uk]. They seem to believe that electricity used by future TVs will grow faster than the number of new TVs on the market [bbc.co.uk]. I am skeptical of this claim, since it suggests that newer TVs will be more power-hungry than the older ones. Does this not account for the new LCD, plasma, and projection (DLP, LCD, and LCoS) sets, which should use significantly less electricity than their CRT counterparts?
In any case, extrapolating the numbers from the graphs, there is a projected 11% and 22% increase in the number of TVs in GB from 2000 to 2020, which (they claim) represents a 50% and 70% increase in power consumption by TVs. The numbers don't work out in any logical fashion, and don't account for the new, lower-power technology that will almost certainly replace most CRTs over the next 15 years.
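Working backwards from those figures shows what the projection implies per set. Pairing the low figures together (11% more sets, 50% more power) and the high figures together (22%, 70%) is my assumption:

```python
# Implied per-set power growth from the 2000-2020 projections.
# Low-with-low and high-with-high pairing is an assumption.
scenarios = [(0.11, 0.50), (0.22, 0.70)]

# (1 + power growth) / (1 + set growth) - 1 = growth per individual set
results = [(1 + power) / (1 + sets) - 1 for sets, power in scenarios]

for (sets, power), per_set in zip(scenarios, results):
    print(f"Sets +{sets:.0%}, consumption +{power:.0%} "
          f"-> each set drawing ~{per_set:.0%} more")
```

In other words, the forecast only adds up if each individual set draws roughly 35-40% more power than today's - the opposite of what a CRT-to-flat-panel transition should produce.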
This is beginning to sound like a bit of alarmism...which is sadly typical in the news (especially when it comes to issues of fear, including issues like terrorism or especially the environment and conservation).
Also, another bit of potential stupidity:
This is just silly, because the manufacturers will simply pass this cost along to the consumer. The statement is a clear attempt to obscure who ultimately pays for the new regulations. This article leads me to wonder whether most people are able to question anything when it comes to conservation, because it's not PC to question environmental rhetoric.
Re:Tell me exactly... (Score:2, Insightful)
Re:Tell me exactly... (Score:3, Insightful)
Well, it's pretty hard to fail to notice that my USB mouse receives power even when the computer is off. I mean, it's not just an LED - it's nearly bright enough to read by.
This is the second mouse I've had that emits this much light -- and we're not talking about special fancy geek-style mice. They were just the "tell the tech guy at work: 'do we have a mouse I can buy? I'm too damn lazy to go to
Re:Tell me exactly... (Score:5, Informative)
That's for sure. And there are even more devices where it isn't even standby - they're wasting power when "off" while providing no added functionality at all.
Anything fed by an A.C. adaptor is generally wasting power the whole time it is off. Switching designs help, but most adaptors have transformer core losses being fed all the time. I've found the same thing internally in some devices: looking around the house, I found that my soldering stations and a table radio had their power switches wired after the transformer. Things whose transformers or whole power supplies are live all the time include doorbells, thermostats, garage door openers, VCRs, CD/DVD players, cable/satellite boxes, printers, and cable/DSL/dialup modems. I remember my shock at discovering that my old electric toothbrush had a stand with a field coil powered all the time - the coil was the powered half of a motor that wound a spring in the hand-held unit.
Contrary to what the article says, cable boxes could be designed so that they could be shut down. The boxes could be designed to handle revalidation only when the box is on. Data could be retained while off by a small amount of CMOS memory and a capacitor, or by using flash memory. The main power supply could be switched on by routing power for the devices the box feeds signal to through the box itself, and sensing load current to trigger start-up. I don't think we should be paying for energy just to make someone's DRM work.
Devices with timers could be designed to run from charged capacitors; small half-farad capacitors are available. Some devices use lithium batteries, but I prefer to avoid those since they become toxic waste later.
I reduced the power consumption of an old L.E.D. digital alarm clock from 8 Watts to 1.2 Watts by replacing the transformer with a capacitive voltage divider, and eliminating the series-pass regulator by using S.C.R.s in place of two of the diodes in the bridge rectifier and controlling those. That savings was enough to power a bedroom color t.v. 2 hours a day.
I'd like to see someone design a cordless phone that was efficient enough to get by with powering the base unit from the phone line. They could at least use a switching supply for the base unit. Few people really need to have their microwave ovens programmed in advance to come on at a certain time. For years I kept my old microwave with a rotary knob mechanical timer. That oven didn't use any power when off. Most U.P.S.es could be designed to use less power once the battery is charged - they'd probably get better battery life too.
Devices that are powered all the time are at a greater risk of being fried by line surges.
On my old computer I wired an outlet box to the switched monitor power outlet, so things like my modem and amplified speakers had their power cut when the machine was off. If the machine had been designed to control that outlet in sleep mode, consumption could be cut even more. Powering those items from the computer's switching supply instead of separate transformers would save still more.
Sometimes when shopping I ask salespeople how many kilowatt hours per year a product uses when turned off. It's entertaining to see the weird looks I get. If a few more of us asked suppliers about these things it might speed design changes. Designers need to be educated about the need for reduced consumption also. Sometimes it seems like many don't worry about it except when too much heat is produced.
Consumers tend not to think of low power leeches as costing anything, but it adds up over the life of a product. Where I am it runs about $1 (U.S.) per month for every 10 Watts used continuously. In hot climates where air conditioning is used, waste costs are compounded with those to remove the waste heat from these devices.
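That rule of thumb implies a tariff of roughly 14 cents per kWh; a quick sketch of the arithmetic:

```python
# Check the "$1 (U.S.) per month for every 10 W continuous" rule of thumb.
watts = 10.0
hours_per_month = 24 * 30
kwh_per_month = watts * hours_per_month / 1000    # 7.2 kWh

dollars_per_month = 1.0
implied_rate = dollars_per_month / kwh_per_month  # ~$0.14 per kWh

print(f"{watts:.0f} W continuous = {kwh_per_month:.1f} kWh/month")
print(f"Implied tariff: ${implied_rate:.2f}/kWh")
```

So a stack of leeches totalling 50 W costs on the order of $60 a year, before any air-conditioning penalty.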
Re:Tell me exactly... (Score:5, Interesting)
These numbers are not new, and this story is 5 years late. See: http://www.berkeley.edu/news/media/releases/2001/
They will keep talking about energy wastage, and no amount of energy awareness is going to change that. Unless, of course, you have to refill your electricity "tank" at $5.00 a gallon - then everyone will buy the consumer-electronics equivalent of a Prius or Insight.
Re:Tell me exactly... (Score:5, Interesting)
I find it strange the way people use electricity as if it costs nothing. I suspect it's because the link between using it and paying for it is weak: you might pay for it up to a month after you use it. I firmly believe that _all_ electricity meters should have a display, in some prominent place, showing how much it is _actually_ costing you. How many people can honestly be bothered to climb into the broom cupboard to take a reading and then convert that reading from units into £/$/etc. using some tricky-to-understand pricing structure that changes with frightening regularity? It's just not going to happen, so people will keep paying whatever their bill shows without understanding how much different things cost to run.
Re:Tell me exactly... (Score:4, Interesting)
I'm currently having to stick some £15 a week into it (it's winter and the heating is on), so I know whether things can be reduced by turning them really off.
PS: I get 30 minutes' grace with the server, as it's the only thing on the UPS - enough time to get the emergency credit activated, which gives me a couple of days to get credit put on the payment card.
Re:Tell me exactly... (Score:5, Informative)
I know this might sound a little strange, but I actually looked into getting a pre-pay meter installed so that I could find out how much leccy was costing me. I couldn't believe the cost of it, though: you have to pay for the meter (if you want one installed by request), the electricity costs more, and you have the hassle of getting the card charged up.
I think it is absolutely stupid that we make the people that can least afford it pay the most for electricity.
Re:Tell me exactly... (Score:5, Interesting)
What they don't mention is why it's so high. I remember when we first got a TV with a standby mode: according to the specs, the draw in standby was absolutely minuscule (less than 1 W). It did exactly what it said on the tin. Yet when I just checked the specs on my monitors, one is 3-10 W in standby mode, and the other doesn't even bother listing standby power consumption. I don't get it - what on earth could they be doing that needs that much power? I don't agree with banning standby mode, but I do think it should be quite feasible to get devices down to less than 1 W in that mode.
Re:Tell me exactly... (Score:3, Informative)
Cheap circuitry. Of course you can get standby power down to below 1 W, but then you'd have to spend a few extra cents or bucks on the electronics. Since most consumers don't care (or know about) standby mode power consumption, the more profitable choice is to use the cheap design and let the consumer pay for it through higher electric bills.
Re:Tell me exactly... (Score:2)
Re:the entire population of Glasgow... (Score:3, Informative)
Re:the entire population of Glasgow... (Score:3, Funny)
Re:the entire population of Glasgow... (Score:3, Funny)
It would probably replace the mugging with drunk and disorderly, though.
Not saying that Glasgow is all rough neighbourhoods, but being a psycho is part of the rent agreement in some areas.
Re:the entire population of Glasgow... (Score:5, Funny)
Just wait till Chirac turns up in his nice new presidential A380. "Hey, is that Air Force One? What a cute little plane that 747 was."
A380 - When a SUV gets too cramped.
Re:the entire population of Glasgow... (Score:3, Interesting)
Re:the entire population of Glasgow... (Score:3, Funny)
Re:Somebody crack the heads together of the eco-nu (Score:4, Insightful)
Ah yes, I can see how that would be useful for televisions. Ahem.
Talk about the eco-nuts missing the point: it's not about making this a harsher world. I suspect the eco-nuts believe the world is going to get very harsh quite quickly if people aren't willing to take remedial steps such as... oh, I don't know - standing up to turn on the TV.
It's... about people being smarter.
Yes, yes it is.
Re:Somebody crack the heads together of the eco-nu (Score:3, Informative)
Assuming you can actually do that. Often enough, manufacturers are too cheap to put in a switch that completely separates the internal circuitry from the power outlet. The result is that the thing even draws power when it is "off" (not standby, but off). The only solution is an external switch.
Re:I'm sorry, what about the US??? (Score:3, Informative)
The data was gathered in the UK and therefore the conclusions were specifically pertinent to the UK, although applicable to the US.
Re:Back in The Day ..... (Score:3, Interesting)
The problem is, most "wall warts" are not just simple transformers. Most devices that use wall warts use DC, therefore, there has to be rectification of the AC current in there to make the devices work. On a wall wart, that is typically done through some form of diode bridge [wikipedia.org] with a capacitor in there to level off the power.
Most wall warts are incredibly inefficient (Popular Electronics once did an article on this
Re:This lazy article is meaningless (Score:5, Informative)
No. kWh is NOT Kilowatts per hour, it's Kilowatts times hours, aka Kilowatt-hours.
On top of that, kWh/yr isn't wrong at all, it is merely an equivalent to Watts that makes it easier to calculate how much money (power companies usually charge by the kWh) is wasted by the device over the course of one year.
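The conversion is just a multiply or divide by the hours in a year; a minimal sketch:

```python
# kWh/yr and watts are interchangeable for a continuous load:
# energy = power x time, so scale by the hours in a year.
HOURS_PER_YEAR = 24 * 365  # 8760

def watts_to_kwh_per_year(watts):
    return watts * HOURS_PER_YEAR / 1000

def kwh_per_year_to_watts(kwh):
    return kwh * 1000 / HOURS_PER_YEAR

# e.g. a box idling at 5 W around the clock:
print(watts_to_kwh_per_year(5.0))  # 43.8 kWh/yr
```

So a spec quoting "44 kWh/yr" in standby and one quoting "5 W" are describing essentially the same draw.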