TV Manufacturers Accused of Gaming Energy Usage Tests (cbslocal.com) 86
The Natural Resources Defense Council has issued a new report accusing Samsung, LG and Vizio of "misleading consumers and regulators about how much energy high-definition screens devour, alleging that the televisions were designed to perform more efficiently during government testing than in ordinary use." The report "estimates that the collective electricity bills during a decade of watching the high-definition TVs will be $1.2 billion higher than the energy ratings imply," and that "the higher energy usage generates an additional 5 million metric tons of carbon pollution." CBS Local reports: The findings are based on an analysis of high-definition TVs with screens spanning at least 55 inches made in 2015 and 2016. The estimates on electricity costs are based on high definition TVs with screens 32 inches and larger. The study concluded that Samsung and LG have gamed the system during government testing in an effort to get better scores on the "Energy Star" yellow labels that appear on the sets in stores. Those scores often influence the buying decisions of consumers looking to save money on their utility bills. The report said Samsung and LG did not break any laws in their manipulation of the tests, but rather exploited weaknesses in the Department of Energy's system to measure electricity usage. The Samsung and LG sets have a dimming feature that turns off the screens' backlight during part of the 10-minute video clip used in government tests. But that does not typically happen when the sets are being used in homes to watch sports, comedies, dramas and news programming. The analysis also found that Samsung, LG and Vizio disable energy-saving features in their TVs when consumers change the factory setting on the picture, a common practice. The energy-saving feature is turned off, with little or no warning on the screen, sometimes doubling the amount of electricity consumed, according to the NRDC report.
That's terrible (Score:3, Funny)
Re:That's terrible (Score:5, Insightful)
I'm sure a horde of lawyers would love to charge $50 for each $0.75 refund check cut in their class action.
Re: (Score:3, Insightful)
Yes, let's send the message to large corporations that fraud is a-ok and we aren't even going to bother pretending to punish them any more. There's no way that could possibly lead to further problems down the road!
Re: (Score:2)
As much as I dislike large corporations, the problem as I see it is slightly different here.
Of course when someone "gives you grades on some arbitrary test that has nothing to do with reality whatsoever", be it energy consumption, emission volumes or school grades, you optimize your behaviour to get the highest possible marks on that arbitrary test.
And then you get televisions, cars, employees, etc. that all scored great on some arbitrary test, but those tests don't really tell you anything about how they p
Re: (Score:2)
I can't afford the extra 75 cents this year. I was counting on the manufacturers' numbers to make my budget numbers. I guess I will just need to skip a meal. Damn you, LG!
Have you ever listened to people complain about price increases in their cable TV service? They will actually claim they "can't afford" the new price, even when it only goes up 50 cents a month.
Re: (Score:1)
Re: (Score:2)
My guess would be Strawmanitoba.
Re: (Score:1)
That's kind of where I was on this. I'm amazed that the manufacturers would cheat like this in order to drop their annual usage cost estimate by a dollar. Because that's about all it should be. Even with any power saving features off, these 55" screens use far less than your 32" CRT screen used.
Re: (Score:2)
Oh, I thought the summary said it would cost me an extra $1.2 billion a year. I was brokering a deal with the local nuke plant for dedicated use of their 2nd reactor and genset. What a relief.
Re: (Score:2)
Well, high-resolution TVs use a lot more power than you would think. Between the screen and the electronics to drive it, they can easily be one of the major power-consuming devices in a house; they use hundreds of Watts. Now that LED light bulbs use something like 4W instead of the 60W an incandescent did, a lot of the old advice, like turning down the lights to save power, isn't THAT useful anymore.
Going to a lower resolution screen saves a lot of power.
No end... (Score:3, Insightful)
Diesel, energy usage... this really has no end, right?
You start wondering if your home appliances, electronics and whatnot are really all that efficient, or if in fact it's just all the testing procedures that are rigged instead.
Re:No end... (Score:5, Informative)
I has Engineer degree...
I actually am an Engineer, a Test Engineer. An Experimental Physics Test Engineer. Sometimes the Physics Folks get a little too enamored of the Physics, without paying attention to the lowly Engineer tugging at their sleeve and trying to get across the point that something unintended is about to explode...
In my experience, such Energy Usage Tests are extremely conservative. The Vizio LED Monitor that I'm looking at right now, as stated in the manual, consumes 65 Watts. In reality, and this does vary, it's more like ~21 Watts. The 24" Polaroid Monitor on my boat that has a 12VDC Brick (that is precisely why I bought it...) consumes, at average lighting levels, only 9 Watts off the Ship's Batteries. The manual says 27.
I've gone around and measured the Energy Usage of just about everything in my house and boat, and consistently, with the exception of the Kettles and one space heater, they use substantially less electricity than advertised. Even this MacBook Air, whose sucking goes all over the place, averages out to about 3 Watts. Playing a Flash video with sound turned all the way up, it goes to 7 Watts. Turn off the Wifi, and just type away in near darkness... 2.1 Watts. Apple was very conservative in estimating battery life; I've gone 14 hours on one charge, typing away like mad until dawn. Apple says I should expect 9.
But there is one area that concerns none of us here except me maybe, and that is Power Factor. I just happen to have an ancient HP Vector Voltmeter. (A Vector Voltmeter, very simply, compares the Phase of two otherwise identical sinusoidal signals, and displays the Phase Difference on a meter.) The absolute values that the VV displays aren't of any real interest to me, as long as a calibrated Resistive source, like a Tea Kettle, centers the meter. Historically, the VV shows how Inductive a load is, like in a refrigerator compressor. It was rare to come across Capacitive loads, except maybe Cyclotrons. But increasingly now, these newfangled switching power supplies do show _Capacitive_ Reactance. That is, they draw more "Imaginary" Capacitive Reactive Watts than Resistive. We don't care, because we pay for Real Watts. But Power Companies, who rely on Energy Usage Tests to forecast demand and allow for it, do care. It gets _very_ complicated to put down in writing just how much Power a device uses under all circumstances, while a VV dances away. So, to make it easy for everyone, just state the worst-case scenario: fold worst-case Reactance into the measured Resistive load, add a fudge factor, and state that this Vizio consumes 65 Watts. Plan around this accordingly. (I'm not getting any further into this without getting very tiresome indeed; just Google "Power Factor".)
If everybody plays by the same rules, we have some semblance of Reality, with "Imaginary" Tendrils.
I don't know the Realities of this Case; I would need to take randomly chosen victims into my Underground Lair and subject them to the Vector Voltmeter, and once the acrid smoke clears, make some calculations. But with the premise of always underpromising and overdelivering for Government Certification, or facing the Consequences, "Gaming" Energy Usage Tests makes no sense here. Anybody buying a 55" TV to watch mostly commercials simply doesn't care.
I call... "Doubtful".
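For anyone curious what the real/reactive split described above looks like in numbers, here is a minimal Python sketch of the displacement power-factor math; the voltage, current, and phase figures are made-up illustration values, not measurements of any particular TV or monitor.

```python
import math

# Illustrative RMS values for a small AC load -- assumptions for the example,
# not measurements of any real device.
v_rms = 230.0        # volts
i_rms = 0.5          # amps
phase_deg = 40.0     # phase angle between the voltage and current waveforms

phi = math.radians(phase_deg)

apparent_power = v_rms * i_rms                   # volt-amps (VA)
real_power = apparent_power * math.cos(phi)      # watts -- what the meter bills
reactive_power = apparent_power * math.sin(phi)  # volt-amps reactive (VAR)
power_factor = math.cos(phi)

print(f"Apparent power: {apparent_power:.1f} VA")
print(f"Real power:     {real_power:.1f} W (power factor {power_factor:.2f})")
print(f"Reactive power: {reactive_power:.1f} VAR")
```

The sign of the phase angle tells you whether the load looks inductive or capacitive; the magnitudes above come out the same either way.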
Re: (Score:3)
But Power Companies, who rely on Energy Usage Tests to forecast demand and allow for it, do care.
You imply that power companies try to guess which items people buy, and how much they use them, and then use the Energy Usage Tests to figure out aggregate demand. This sounds highly improbable.
Re: (Score:2, Interesting)
That is _exactly_ what the Utilities do. (This also goes for, at least in California, Water...)
Very few people want new Power Plants. They are expensive, troublesome, and take many years to design and construct. One of the points about the "Energy Star" program is to get some insight into future purchasing trends, which helps to forecast aggregate Electricity Demand. Take the simple Light Bulb. I've been testing a bunch of the new LED Lamps. Now any individual LED Lamp that uses ~12% of the Equivalent Inca
Re: (Score:2)
Even bad its good (Score:5, Insightful)
Compared to the plasmas, rear projection screens, and even good old-fashioned CRTs of the last dozen years, the new LED units are positively energy sippers.
Even so, update the tests and fix the stickers; consumers should know what they are buying.
Although I do take exception to the idea that the auto dimming during the test is 'unrealistic' -- yeah, it's true there isn't a minute of blackness during the average Super Bowl. But I can't tell you how often a movie has ended, or someone walked away from the HTPC, or something and it's gone to sleep. When some of the devices go idle/sleep/off, the TV basically shuts the screen off. With others it goes to this blue no-signal screen, which it doesn't seem to detect as idle, and will sit there glowing blue nothing for hours... so yeah... testing what the screen does when it's not being used SHOULD be part of the testing.
Re: (Score:3)
But I can't tell you how often a movie has ended, or someone walked away from the HTPC, or something and it's gone to sleep
Microsoft probably can. ;)
Re: (Score:3)
Yep, my 40" Samsung LED backlit TV is rated at 40W. To give you an idea of how much power that is, the Samsung soundbar + subwoofer is rated at 180W. That's over 4x more power drawn (at peak) by the barely-midrange sound system than by the display. It's 8x more than my 5W rated Amazon Fire TV (streaming media device),
but, running my electric oven for 20 min to make my pizza uses more power than my TV, Speakers and streaming media device do in a month.
Shrug. There are bigger, better fights to pick than LED backlit TVs.
Re: (Score:2, Insightful)
Your sound bar would only be using 20 to 30 Watts max. Peak is a useless measure because it is a measure of the power the sound bar can pump out for a moment; if you try to drive it hard continuously it will just crap itself and you will very soon find yourself turning the volume down to a level it can actually handle.
Re: (Score:2)
Your sound bar would only be using 20 to 30 Watts max. Peak is a useless measure because it is a measure of the power the sound bar can pump out for a moment; if you try to drive it hard continuously it will just crap itself and you will very soon find yourself turning the volume down to a level it can actually handle.
The AC has it right. 180W is marketing. It will never take that from the socket.
Re: (Score:2)
Most people listening to a TV through the TV's speakers use less than one watt average. Only the pricks hooked up with high power 5.1 amps who are rattling every goddam window in the fucking neighborhood watching Mr. Robot are using very much audio power, and even those are probably averaging only a handful of watts most of the time.
Re: (Score:2)
Actually he was right, if you parse his words. 1000 watts is more than 40 watts. I mean, he did say POWER, not ENERGY.
Re: (Score:2)
Sorry; this got attached to the wrong comment.
Re: (Score:2)
Huh?
A month of 40W is about 29.2kWh.
For your electric oven to use more than that in 20 minutes, it would have to draw more than about 87,600W for each of those 20 minutes, or about 380A @ 230V.
Either you are lying, don't understand the math, or we'll be requiring pics showing this oven.
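For reference, the arithmetic above checks out. Here is a quick back-of-the-envelope in Python, assuming a ~730-hour month and 230 V mains as this comment does:

```python
# Sanity check: a month of continuous 40 W versus an oven running for 20 minutes.
tv_watts = 40.0
hours_per_month = 730.0                      # ~30.4 days

monthly_kwh = tv_watts * hours_per_month / 1000.0
print(f"A month of continuous {tv_watts:.0f} W: {monthly_kwh:.1f} kWh")  # ~29.2 kWh

oven_hours = 20.0 / 60.0
required_oven_watts = monthly_kwh * 1000.0 / oven_hours
required_amps = required_oven_watts / 230.0
print(f"Oven power needed to match that in 20 min: {required_oven_watts:,.0f} W "
      f"(~{required_amps:.0f} A at 230 V)")  # ~87,600 W, ~380 A
```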
Re: (Score:2)
Actually he was right if you come right down to it. 1000 watts is more than 40 watts. I mean, he did say POWER, not ENERGY.
Re: (Score:2)
Watts *is* power.
And he said his oven uses more power in 20 minutes, than a whole pile of other gear does in a month.
I thought he was pretty clear about what he meant, despite being wrong in his assumptions.
Re: (Score:2)
Sure, he said power when he clearly meant energy, but I suspect he's still wrong. Assuming his oven averages a typical 2000W when operating, and takes an extra 10 minutes to warm up before he puts in his pizza, we're still only talking 1000Wh to cook a pizza.
Spread that energy across a month and it would be ~33Wh per day, or less than an hour per day for his TV alone (at peak consumption), much less his speakers, etc. Maybe that's really all he watches, but that would put him well below the average.
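The same sanity check run the other way, under this comment's assumptions (2000 W oven, 30 minutes total, 30-day month):

```python
# How far one pizza's worth of oven energy stretches against a 40 W TV.
oven_watts = 2000.0
oven_hours = 0.5                        # 10 min preheat + 20 min cooking
pizza_wh = oven_watts * oven_hours      # 1000 Wh per pizza

days_per_month = 30
wh_per_day = pizza_wh / days_per_month  # ~33 Wh/day
tv_watts = 40.0
tv_hours = wh_per_day / tv_watts        # ~0.8 h/day of TV at peak draw

print(f"Energy per pizza: {pizza_wh:.0f} Wh")
print(f"Spread over a month: {wh_per_day:.0f} Wh/day, "
      f"about {tv_hours:.1f} h/day of a {tv_watts:.0f} W TV")
```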
Re: (Score:2)
The issue is honesty. In the EU many electrical items have to come with a sticker, displayed in the shop, showing energy efficiency. Lying to consumers, even if it is a relatively small amount, is frowned upon.
Having said that, multiply that 40W by the number of TVs in the world and it's actually something worth making an effort with. And yeah, I own a plasma.
Re: (Score:2)
>There are bigger, better fights to pick than LED backlit TVs.
But not many better than honesty in product information. So long as we live in a market-based economy, informed consumers can exert a surprising amount of pressure on the supply chain. Allow companies to make fraudulent claims about their products, and it becomes far more difficult to make informed purchasing decisions, to the point that almost no one would bother. (honestly, are you going to look online for information about every item you
Re: (Score:2)
Shrug. There are bigger, better fights to pick than LED backlit TVs.
...like DVRs. My Verizon DVR uses at least 32W all day and night long. That's about the same average energy usage as a present-day refrigerator. These things don't properly go to sleep, in a world of electronics that constantly sleep and wake on demand to save electricity.
That's the main reason I'm eager to see regulators stop the cable operators' monopoly over subscriber cable boxes. The boxes should become normal consumer electronics and use Energy Star ratings to compete against each other.
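A 32 W always-on box really does add up. Here is the rough annual math; the refrigerator figure is an assumed ballpark for a typical modern full-size unit, not a measurement of any specific model:

```python
# Annual energy of an always-on 32 W DVR versus a ballpark modern fridge.
dvr_watts = 32.0
hours_per_year = 8760.0

dvr_kwh_per_year = dvr_watts * hours_per_year / 1000.0   # ~280 kWh/year
fridge_kwh_per_year = 400.0   # assumed ballpark for a modern full-size fridge

print(f"DVR:    {dvr_kwh_per_year:.0f} kWh/year")
print(f"Fridge: ~{fridge_kwh_per_year:.0f} kWh/year (ballpark)")
```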
Mo Brightness, Mo Power? (Score:2)
NRDC and its consultant Ecos Research found that just a few clicks on a remote control could lead many 2015 and 2016 televisions from Samsung, LG, and Vizio to use up to twice the energy that consumers were told they would.
So, they are accusing the TV manufacturers of cheating on benchmarks, and they go on to say that if a user turns up the brightness, the TV will use more power. I just lost all respect for these clowns.
Re: (Score:1)
Re: (Score:2)
Yeah, there's an option I disabled before even knowing what it was: "Power saving: off".
If I'm spending coin on a shiny thing to be watched, I don't want to be continually fighting it over the one task I bought it for.
Re: (Score:2)
My supposedly "smart" Samsung TV detects when power saving activates on the attached device and puts up a bright white logo to inform me. The logo does not go away. At least it moves around, so the wear on the screen is somewhat even.
The only way to do power saving with modern TVs is to use ARC, and ARC support is just not very widespread yet.
Re: (Score:2)
The only way to do power saving with modern TVs is to use ARC, and ARC support is just not very widespread yet.
Two of the three devices plugged into my TV support ARC. Both my Chromecast and my Android box support it. Only my seldom-used Blu-ray player (I rip my Blu-rays and toss them on a fileshare for my Android box to play) does not support it.
Re: (Score:2)
Now that's a boneheaded design choice - even my budget Samsung "dumb" TV, circa 2005, goes to sleep after a short while without a signal.
Re: (Score:2)
I have an LG monitor and the backlight-dimming feature definitely does activate during fades to black in normal content, especially movies. I'm guessing that the "sports, comedies, dramas and news programming" that this organization chose to test these TVs just happened to have a lot less of those fades to black than a broader, more representative set of content would. Wonder who's paying their bills.
Re: (Score:2)
The whole point is that consumers don't know what they're buying. Energy Star rating on a 79 watt TV over a 109 watt TV that's on for 12 hours per day? That's $1.86/month. If your TV lasts 10 years, you might save $225. As such, this is quite possibly the least-important thing you should concern yourself with when buying a new TV--in fact, you should probably just flatly ignore the power consumption within the same class (e.g. Energy Star LED TVs). (Note: a 55 inch LED LCD panel consumes around 60 wa
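For context, here is roughly where numbers like that come from. The electricity rate below is an assumption chosen so the parent's figures work out, not a quoted tariff:

```python
# Rough sketch of the savings math: 79 W vs 109 W TV, 12 hours/day.
watts_a = 109.0
watts_b = 79.0
hours_per_day = 12.0
rate_per_kwh = 0.17            # USD/kWh, assumed

kwh_saved_per_month = (watts_a - watts_b) * hours_per_day * 30.4 / 1000.0
dollars_per_month = kwh_saved_per_month * rate_per_kwh
dollars_per_decade = dollars_per_month * 12 * 10

print(f"kWh saved per month: {kwh_saved_per_month:.1f}")
print(f"Savings: ${dollars_per_month:.2f}/month, ${dollars_per_decade:.0f} over 10 years")
```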
Who'da-Thunk It? (Score:2)
Wow! Who would have thought increasing the brightness and/or contrast of an LED screen would use more energy and make the power-saving measurement certification mode unusable?
OTOH, turning the screen off during video playback seems a little VW/Mitsubishi/Hyundai-like.
Re: Who'da-Thunk It? (Score:2)
BMW. Their cars were shown to pass the required tests, making the claims that passing the tests without modifications was impossible a little odd.
However.. The thing people seem to ignore is that the actual requirements are insanely tight.. And have been getting tighter and tighter at a fast rate.. To the point where diesel vehicles are now required to be significantly cleaner than petrol in many ways... I will give you exactly one guess who is pushing for this (hint.. Selling petrol is significantly mor
What's our take away on this supposed to be? (Score:4, Informative)
What's our take away on this supposed to be?
(A) These evil scoundrels are cheating on the government tests
(B) The people who are designing the government tests epically suck at their jobs, should be fired, and have competent people hired in their places
I'm going to have to vote "B" here, folks.
Re: (Score:2)
It's not that they suck at their jobs. Due to "fairness, transparency and accountability" requirements any testing methodology they come up with has to be fully documented and given to the manufacturers ahead of time. Manufacturers being the scum-sucking bastards that they are will, of course, run all these tests in their own labs ahead of time and tweak the crap out of things so they come out on top.
Re: (Score:3)
It's not that they suck at their jobs. Due to "fairness, transparency and accountability" requirements any testing methodology they come up with has to be fully documented and given to the manufacturers ahead of time. Manufacturers being the scum-sucking bastards that they are will, of course, run all these tests in their own labs ahead of time and tweak the crap out of things so they come out on top.
Sorry, but the tests are supposed to be "representative of normal usage".
Even if they document the tests, if they can be gamed in a test representative of "normal usage", then the same gaming will kick in on actual "normal usage", and so the test will not have been gamed.
You can have them be shitty at designing tests, or you can have them be shitty at determining what constitutes "normal usage", but it's not possible to game something that doesn't have a variance between expected use and actual use.
The m
Re: (Score:2)
Even if they document the tests, if they can be gamed in a test representative of "normal usage", then the same gaming will kick in on actual "normal usage", and so the test will not have been gamed.
Normal usage will be viewing a different movie than the one they test with. If you can get viewers to only watch the test signal, over and over, then sure there is no variance between expected use and actual use. However, I did not buy my TV to watch a specific set of video clips in a specific sequence, repeatedly.
Re: (Score:2)
Actual "Normal use" involves lots and lots of data. Hundreds of movies, millions of hours, TV content, commercials, use as a PC monitor, game systems, the lot. It's impossible to actually measure that in a controlled environment without actually running everything across the TV; it's possible to approximate it for naive algorithms (i.e. the TV doesn't know about the test, but knows about real-world usage behavior).
I've collected a set of various types of media--e-mails, Web pages, musics of different g
Re: (Score:3, Interesting)
I'm going to have to vote "B" here, folks.
Correct answer. Energy Star certified a gasoline-powered alarm clock [nytimes.com] in 2010. It's a pointless pencil-whipping operation; another collection of government lawyers sopping up a grand living from government teats.
Re: (Score:2)
Do you have proof the alarm clock used gasoline inefficiently?
Re: (Score:2)
The audit revealed, more than anything, that the pencil pushers didn't even look at the applications before approving them; those products didn't exist, yet got Energy Star approved. They set up a set of fake companies and sent Energy Star a set of devices they purported to make that were 20% more efficient than any competitor, eventually working up to a gas-powered alarm clock. The other thing they found is that once they got an Energy Star certification, they could plaster it on any product they wished, even if it hadn
Re: (Score:2)
Or maybe the test is designed so comparisons can be made between years, models and history?
I mean yes, you can design the test to be different and updated every single year, but then you lose the ability to compare m
Re: (Score:2)
What's our take away on this supposed to be?
(A) These evil scoundrels are cheating on the government tests
(B) The people who are designing the government tests epically suck at their jobs, should be fired, and have competent people hired in their places
I'm going to have to vote "B" here, folks.
C: Marketing people see any mandated metric as something to be gamed in order to get an edge over the competition. Marketers tell this to managers, managers order B to game the test.
Like fuel efficiency ratings on cars, I don't trust energy efficiency tests on consumer electronics to be accurate. I'm sure they're real, but were done under laboratory conditions which probably involved a lot of settings turned down.
So they studied for the test (Score:3, Informative)
Unless they did some Volkswagen-esque cheat that detected a test was running and changed settings on the fly, all they were doing was optimizing for the test, which is hardly fraud. They can't stop a consumer from switching to ultra-bright or from watching continuously flashing Pokemon episodes. It is impossible to come up with a "standardized" test that perfectly replicates "real world" conditions. My real-world conditions are different from yours. All you can do is come up with some standardized test that is hopefully representative. If the government failed, then change the test.
Re: (Score:2)
My Samsung TV has this setting (Score:2, Interesting)
The setting on my Samsung TV is called "Motion Lighting". It dims the screen when the image is perfectly still for more than a few seconds. It was, of course, one of the first things I disabled since it's absolute bullshit for normal use; it doesn't kick in for 99% of content, and when it does it's extremely disruptive, there's no reason static images should suddenly dim out of nowhere. I Googled it at that time (three months ago) and it seemed to be common knowledge that it was added to circumvent energy efficiency tests.
Re: (Score:2)
Besides this, dimming the screen during periods of low/no activity makes it impossible to watch any M. Night Shyamalan movies.
Re: (Score:3)
Re: (Score:2)
The setting on my Samsung TV is called "Motion Lighting". It dims the screen when the image is perfectly still for more than a few seconds. It was, of course, one of the first things I disabled since it's absolute bullshit for normal use; it doesn't kick in for 99% of content, and when it does it's extremely disruptive, there's no reason static images should suddenly dim out of nowhere. I Googled it at that time (three months ago) and it seemed to be common knowledge that it was added to circumvent energy efficiency tests.
The one use for this would be if you hit pause and stepped away from the TV. However, I would think that the timer should be configurable with a default of 10 or 15 minutes. Anything shorter and anything that can't be changed, as pointed out, is useless....
Coincidence? (Score:2)
The difference between tests and actual is highest when depicting VW's on the screen.
TIL (Score:3)
So today I learnt that manufacturers think that consumers give enough of a crap about the energy rating of a TV that the companies need to game the system.
Just another example of companies completely detached from their user base.
Cheat software? (Score:2)
Do they use sophisticated cheat software that somehow draws less from the kill-a-watt meter when an FCC employee is watching?
Re: (Score:2)
Re: (Score:2)
There are 150 kilocalories in a 1 oz snack bag of Cheetos; that's enough to run a 16W fluorescent bulb for 12 hours!
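Checking that back-of-the-envelope claim (1 food kilocalorie ≈ 4184 J), it comes out to roughly 11 hours, close to the figure above:

```python
# 150 kcal snack bag versus a 16 W fluorescent bulb.
kcal = 150.0
joules = kcal * 4184.0          # 1 kcal = 4184 J

bulb_watts = 16.0
hours = joules / bulb_watts / 3600.0
print(f"{kcal:.0f} kcal = {joules/1000:.0f} kJ, enough for about "
      f"{hours:.1f} hours of a {bulb_watts:.0f} W bulb")   # ~10.9 h
```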
How else are you going to fight fascist economics? (Score:3)
I can't say that I blame any private industry for trying to evade the capricious regulations fomented by fascist economics. Show me a single regulator that has an advanced degree in engineering and who has put it to actual use.
Those scores means nothing to me (Score:2)
Those scores often influence the buying decisions of consumers looking to save money on their utility bills.
Errr... no actually.
That's the very last thing that could influence my decision.
Size, Image quality, price, those are the determining factors for me.
Seriously, my TV represents less than 1% of my electric bill. Where I live, 80% of the bill is for heating.