
TV Manufacturers Accused of Gaming Energy Usage Tests (cbslocal.com) 86

The Natural Resources Defense Council has issued a new report accusing Samsung, LG and Vizio of "misleading consumers and regulators about how much energy high-definition screens devour, alleging that the televisions were designed to perform more efficiently during government testing than in ordinary use." The report "estimates that the collective electricity bills during a decade of watching the high-definition TVs will be $1.2 billion higher than the energy ratings imply," and that "the higher energy usage generates an additional 5 million metric tons of carbon pollution."

CBS Local reports: The findings are based on an analysis of high-definition TVs with screens spanning at least 55 inches made in 2015 and 2016. The estimates on electricity costs are based on high-definition TVs with screens 32 inches and larger. The study concluded that Samsung and LG gamed the system during government testing in an effort to get better scores on the "Energy Star" yellow labels that appear on the sets in stores. Those scores often influence the buying decisions of consumers looking to save money on their utility bills.

The report said Samsung and LG did not break any laws in their manipulation of the tests, but rather exploited weaknesses in the Department of Energy's system for measuring electricity usage. The Samsung and LG sets have a dimming feature that turns off the screens' backlight during part of the 10-minute video clip used in government tests, but that does not typically happen when the sets are being used in homes to watch sports, comedies, dramas and news programming. The analysis also found that Samsung, LG and Vizio disable energy-saving features in their TVs when consumers change the factory picture settings, a common practice. The energy-saving feature is turned off, with little or no warning on the screen, sometimes doubling the amount of electricity consumed, according to the NRDC report.
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • by 110010001000 ( 697113 ) on Wednesday September 21, 2016 @09:04PM (#52935395) Homepage Journal
    I can't afford the extra 75 cents this year. I was counting on the manufacturers' numbers to make my budget numbers. I guess I will just need to skip a meal. Damn you, LG!
    • Re:Thats terrible (Score:5, Insightful)

      by JBMcB ( 73720 ) on Wednesday September 21, 2016 @09:06PM (#52935411)

      I'm sure a horde of lawyers would love to charge $50 for each $0.75 refund check cut in their class action.

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      Yes, let's send the message to large corporations that fraud is a-ok and we aren't even going to bother pretending to punish them any more. There's no way that could possibly lead to further problems down the road!

      • by aix tom ( 902140 )

        As much as I dislike large corporations, the problem as I see it is slightly different here.

        Of course, when someone "gives you grades on some arbitrary test that has nothing to do with reality whatsoever", be it energy consumption, emission volumes or school grades, you optimize your behaviour to get marks as high as possible on those arbitrary tests.

        And then you get televisions, cars, employees, etc. that all scored great on some arbitrary test, but those tests don't really tell you anything about how they p

    • by SeaFox ( 739806 )

      I can't afford the extra 75 cents this year. I was counting on the manufacturers' numbers to make my budget numbers. I guess I will just need to skip a meal. Damn you, LG!

      Have you ever listened to people complain about price increases in their cable TV service? They will actually claim they "can't afford" the new price, even when it only goes up 50 cents a month.

    • That's kind of where I was on this. I'm amazed that the manufacturers would cheat like this in order to drop their annual usage cost estimate by a dollar. Because that's about all it should be. Even with any power saving features off, these 55" screens use far less than your 32" CRT screen used.

    Oh, I thought the summary said it would cost me an extra $1.2 billion a year. I was brokering a deal with the local nuke plant for dedicated use of their 2nd reactor and genset. What a relief.

    Well, high-resolution TVs use a lot more power than you would think. Between the screen and the electronics driving it, they can easily be one of the major power draws in a house, using hundreds of watts. Now that LED lightbulbs use something like 4W instead of the 60W an incandescent did, a lot of the old advice, like turning down the lights to save power, isn't THAT good anymore.

      Going to a lower resolution screen saves a lot of power.

  • No end... (Score:3, Insightful)

    by XSportSeeker ( 4641865 ) on Wednesday September 21, 2016 @09:06PM (#52935413)

    Diesel, energy usage... this really has no end, right?
    You start wondering if your home appliances, electronics and whatnot are really all that efficient, or if in fact it's just all the testing procedures that are rigged instead.

    • Re:No end... (Score:5, Informative)

      by Anonymous Coward on Wednesday September 21, 2016 @10:38PM (#52935811)

      I has Engineer degree...
      I actually am an Engineer, a Test Engineer. An Experimental Physics Test Engineer. Sometimes the Physics Folks get a little too enamored of the Physics, without paying attention to the lowly Engineer tugging at their sleeve and trying to get across the point that something unintended is about to explode...

      In my experience, such Energy Usage Tests are extremely conservative. The Vizio LED Monitor that I'm looking at right now, as stated in the manual, consumes 65 Watts. In reality, and this does vary, it's more like ~21 Watts. The 24" Polaroid Monitor on my boat that has a 12VDC Brick, (That is precisely why I bought it...), consumes at average lighting levels, only 9 Watts off the Ship's Batteries. The manual says 27.

      I've gone around and measured the Energy Usage of just about everything in my house and boat, and consistently, with the exception of the Kettles and one space heater, they use substantially less electricity than advertised. Even this MacBook Air, whose sucking goes all over the place, averages out to about 3 Watts. Playing a Flash video with sound turned all the way up, it goes to 7 Watts. Turn off the Wifi, and just type away in near darkness... 2.1 Watts. Apple was very conservative in estimating battery life; I've gone 14 hours on one charge, typing away like mad until dawn. Apple says I should expect 9.

      But there is one area that concerns none of us here except me maybe, and that is Power Factor. I just happen to have an ancient HP Vector Voltmeter. (A Vector Voltmeter, very simply, compares Phase of two otherwise identical sinusoidal signals, and displays the Phase Difference on a meter.) The absolute values that the VV displays isn't of any real interest to me, as long as a calibrated Resistive source, like a Tea Kettle, centers the meter. Historically, the VV shows how Inductive a load is, like in a refrigerator compressor. It was rare to come across Capacitive loads, except maybe Cyclotrons. But increasingly now, these newfangled switching power supplies do show _Capacitive_ Reactance. That is, they draw more "Imaginary" Capacitive Reactive Watts than Resistive. We don't care, because we pay for Real Watts. But Power Companies, who rely on Energy Usage Tests to forecast demand and allow for it, do care. It gets _very_ complicated to put down in writing just how much Power a device uses under all circumstances, while a VV dances away. So, to make it easy for everyone, just state the worst-case scenario- Fold worst-case Reactance into the measured Resistive load, add a fudge factor, and state that this Vizio consumes 65 Watts. Plan around this accordingly. (I'm not getting any further into this without getting very tiresome indeed; just Google "Power Factor".)
      If everybody plays by the same rules, we have some semblance of Reality, with "Imaginary" Tendrils.
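      The real/reactive split described above can be sketched in a few lines of Python. This is a hypothetical illustration; the 120 V / 0.5 A load with current leading by 60 degrees is made up for the example, not taken from any measured device:

      ```python
      import math

      def power_components(v_rms, i_rms, phase_deg):
          """Split apparent power into real and reactive parts for a
          sinusoidal load with the given voltage/current phase shift."""
          apparent = v_rms * i_rms                   # volt-amperes (VA)
          pf = math.cos(math.radians(phase_deg))     # power factor
          real = apparent * pf                       # watts: what the meter bills
          reactive = apparent * math.sin(math.radians(phase_deg))  # VAR
          return real, reactive, pf

      # Hypothetical capacitive switching supply: 120 V, 0.5 A,
      # current leading voltage by 60 degrees
      real, reactive, pf = power_components(120, 0.5, 60)
      # real ~= 30 W billed, reactive ~= 52 VAR the utility must still supply
      ```

      The point being: the household meter only charges for the 30 real watts, but the utility has to size its plant for the full 60 VA.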

      I don't know the Realities of this Case; I would need to take randomly chosen victims into my Underground Lair and subject them to the Vector Voltmeter, and once the acrid smoke clears, make some calculations. But with the premise of always underpromise and overdeliver for Government Certification, or face the Consequences, "Gaming" Energy Usage Tests makes no sense here. Anybody buying a 55" TV to watch mostly commercials simply doesn't care.

      I call... "Doubtful".

      • by amorsen ( 7485 )

        But Power Companies, who rely on Energy Usage Tests to forecast demand and allow for it, do care.

        You imply that power companies try to guess which items people buy, and how much they use them, and then use the Energy Usage Tests to figure out aggregate demand. This sounds highly improbable.

        • Re: (Score:2, Interesting)

          by Anonymous Coward

          That is _exactly_ what the Utilities do. (This also goes for, at least in California, Water...)
          Very few people want new Power Plants. They are expensive, troublesome, and take many years to design and construct. One of the points about the "Energy Star" program is to get some insight into future purchasing trends, which helps to forecast aggregate Electricity Demand. Take the simple Light Bulb. I've been testing a bunch of the new LED Lamps. Now any individual LED Lamp that uses ~12% of the Equivalent Inca

      • Power factor correction for electronics is old news; here's an article from 2010: http://www.edn.com/electronics... [edn.com]
  • Even bad its good (Score:5, Insightful)

    by vux984 ( 928602 ) on Wednesday September 21, 2016 @09:09PM (#52935423)

    Compared to the plasmas, rear-projection screens, and even good old-fashioned CRTs of the last dozen years, the new LED units are positively energy sippers.

    Even so, update the tests and fix the stickers; consumers should know what they are buying.

    Although I do take exception to the idea that the auto-dimming during the test is 'unrealistic' -- yeah, it's true there isn't a minute of blackness during the average Super Bowl. But I can't tell you how often a movie has ended, or someone has walked away from the HTPC, and it's gone to sleep. With some devices, when they go idle/sleep/off, the TV basically shuts the screen off. With others, it goes to a blue no-signal screen that the TV doesn't seem to detect as idle, and it will sit there glowing blue nothing for hours... so yeah, testing what the screen does when it's not being used SHOULD be part of the testing.

    • But I can't tell you how often a movie has ended, or someone walked away from the HTPC, or something and its gone to sleep

      Microsoft probably can. ;)

    • by Hadlock ( 143607 )

      Yep, my 40" Samsung LED-backlit TV is rated at 40W. To give you an idea of how much energy that is: the Samsung soundbar + subwoofer is rated at 180W. That's 4.5x more power consumed (at peak) by the barely-midrange sound system than by the display. And the display is itself 8x more than my 5W-rated Amazon Fire TV (streaming media device).

      But running my electric oven for 20 minutes to make a pizza uses more power than my TV, speakers and streaming media device do in a month.

      Shrug. There are bigger, better fights to pi

      • Re: (Score:2, Insightful)

        by Anonymous Coward

        Your sound bar would only be using 20 to 30 watts max. Peak is a useless measure because it measures the power the sound bar can pump out for a moment; if you try to drive it hard continuously it will just crap itself, and you will very soon find yourself turning the volume down to a level it can actually handle.

        • by amorsen ( 7485 )

          Your sound bar would only be using 20 to 30 watts max. Peak is a useless measure because it measures the power the sound bar can pump out for a moment; if you try to drive it hard continuously it will just crap itself, and you will very soon find yourself turning the volume down to a level it can actually handle.

          The AC has it right. 180W is marketing. It will never take that from the socket.

          • by fnj ( 64210 )

            Most people listening to a TV through the TV's speakers use less than one watt average. Only the pricks hooked up with high power 5.1 amps who are rattling every goddam window in the fucking neighborhood watching Mr. Robot are using very much audio power, and even those are probably averaging only a handful of watts most of the time.

          • by fnj ( 64210 )

            Actually he was right, if you parse his words. 1000 watts is more than 40 watts. I mean, he did say POWER, not ENERGY.

      • by adolf ( 21054 )

        Huh?

        A month of 40W is about 29.2kWh.

        For your electric oven to use more than that in 20 minutes, it would have to draw more than about 87,600W for each of those 20 minutes, or about 380A @ 230V.

        Either you are lying, don't understand the math, or we'll be requiring pics showing this oven.
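        The arithmetic above can be reproduced in a few lines of Python, assuming a 30.4-day average month and 230 V mains as in the parent:

        ```python
        # How much power would an oven have to draw for 20 minutes
        # to match a month of a 40 W TV?
        tv_watts = 40
        hours_per_month = 24 * 30.4                 # average month length
        monthly_wh = tv_watts * hours_per_month     # ~29,184 Wh ~= 29.2 kWh

        oven_minutes = 20
        oven_watts_needed = monthly_wh / (oven_minutes / 60)  # Wh / h -> W
        amps_at_230v = oven_watts_needed / 230
        # ~87,600 W, or roughly 380 A at 230 V: not a household oven
        ```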

        • by fnj ( 64210 )

          Actually he was right if you come right down to it. 1000 watts is more than 40 watts. I mean, he did say POWER, not ENERGY.

          • by adolf ( 21054 )

            Watts *is* power.

            And he said his oven uses more power in 20 minutes, than a whole pile of other gear does in a month.

            I thought he was pretty clear about what he meant, despite being wrong in his assumptions.

            • Sure, he said power when he clearly meant energy, but I suspect he's still wrong. Assuming his oven averages a typical 2000W when operating, and takes an extra 10 minutes to warm up before he puts in his pizza, we're still only talking 1000Wh to cook a pizza.

              Spread that energy across a month and it would be ~33Wh per day, or less than an hour per day for his TV alone (at peak consumption), much less his speakers, etc. Maybe that's really all he watches, but that would put him well below the average.
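              That estimate checks out in a quick Python sketch. The 2000 W average draw and 30-minute total cook time (20 minutes baking plus a 10-minute warm-up) are the parent's assumptions, not measured figures:

              ```python
              oven_watts = 2000                 # assumed average draw while operating
              cook_minutes = 20 + 10            # baking plus warm-up
              pizza_wh = oven_watts * cook_minutes / 60   # energy for one pizza, in Wh

              days = 30
              wh_per_day = pizza_wh / days      # spread across a month
              tv_watts = 40
              tv_hours_equiv = wh_per_day / tv_watts      # hours of TV at peak draw
              # One pizza a month ~= under an hour of TV per day
              ```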

      • by AmiMoJo ( 196126 )

        The issue is honesty. In the EU many electrical items have to come with a sticker, displayed in the shop, showing energy efficiency. Lying to consumers, even if it is a relatively small amount, is frowned upon.

        Having said that, multiply that 40W by the number of TVs in the world and it's actually something worth making an effort with. And yeah, I own a plasma.

      • >There are bigger, better fights to pick than LED backlit TVs.

        But not many better than honesty in product information. So long as we live in a market-based economy, informed consumers can exert a surprising amount of pressure on the supply chain. Allow companies to make fraudulent claims about their products, and it becomes far more difficult to make informed purchasing decisions, to the point that almost no one would bother. (honestly, are you going to look online for information about every item you

      • by Burz ( 138833 )

        Shrug. There are bigger, better fights to pick than LED backlit TVs.

        ...like DVRs. My Verizon DVR uses at least 32W all day and night long. That's about the same average energy usage as a present-day refrigerator. These things don't properly go to sleep, in a world of electronics that constantly sleep and wake on demand to save electricity.

        That's the main reason I'm eager to see regulators stop the cable operators' monopoly over subscriber cable boxes. The boxes should become normal consumer electronics and use Energy Star ratings to compete against each other.

    • from TFA:

      NRDC and its consultant Ecos Research found that just a few clicks on a remote control could lead many 2015 and 2016 televisions from Samsung, LG, and Vizio to use up to twice the energy that consumers were told they would.

      So, they are accusing the TV manufacturers of cheating on benchmarks, and then they go on to say that if a user turns up the brightness, the TV will use more power. I just lost all respect for these clowns.
      • In other news, if you drive your car at full throttle all the time, it won't get the 30 MPG on the sticker.
    • by amorsen ( 7485 )

      My supposedly "smart" Samsung TV detects when power saving activates on the attached device and puts up a bright white logo to inform me. The logo does not go away. At least it moves around, so the wear on the screen is somewhat even.

      The only way to do power saving with modern TVs is to use ARC, and ARC support is just not very widespread yet.

      • by gmack ( 197796 )

        The only way to do power saving with modern TVs is to use ARC, and ARC support is just not very widespread yet.

        Two of the three devices plugged into my TV support ARC. Both my Chromecast and my Android box support it. Only my seldom-used Blu-ray player (I rip my Blu-rays and toss them on a file share for my Android box to play) does not.

      • Now that's a boneheaded design choice - even my budget Samsung "dumb" TV, circa 2005, goes to sleep after a short while without a signal.

    • by makomk ( 752139 )

      I have an LG monitor and the backlight-dimming feature definitely does activate during fades to black in normal content, especially movies. I'm guessing that the "sports, comedies, dramas and news programming" that this organization chose to test these TVs just happened to have a lot less of those fades to black than a broader, more representative set of content would. Wonder who's paying their bills.

    • The whole point is that consumers don't know what they're buying. Energy Star rating on a 79 watt TV over a 109 watt TV that's on for 12 hours per day? That's $1.86/month. If your TV lasts 10 years, you might save $225. As such, this is quite possibly the least-important thing you should concern yourself with when buying a new TV--in fact, you should probably just flatly ignore the power consumption within the same class (e.g. Energy Star LED TVs). (Note: a 55 inch LED LCD panel consumes around 60 wa
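    The savings arithmetic in the comment above can be reproduced in Python. The ~$0.172/kWh electricity rate is an assumed figure, chosen because it matches the quoted $1.86/month; actual rates vary by region:

    ```python
    def monthly_cost_delta(watts_a, watts_b, hours_per_day, rate_per_kwh):
        """Monthly electricity-cost difference between two TVs,
        assuming a 30-day month."""
        delta_kwh = (watts_b - watts_a) * hours_per_day * 30 / 1000
        return delta_kwh * rate_per_kwh

    # 79 W vs 109 W set, on 12 hours per day, at an assumed $0.172/kWh
    per_month = monthly_cost_delta(79, 109, 12, 0.172)
    ten_years = per_month * 12 * 10
    # ~$1.86/month, a bit over $220 across a 10-year TV lifetime
    ```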

  • Wow! Who would have thought increasing the brightness and/or contrast of an LED screen would use more energy and make the power-saving measurement certification mode unusable?

    OTOH, turning the screen off during video playback seems a little VW/Mitsubishi/Hyundai-like.

  • by tlambert ( 566799 ) on Wednesday September 21, 2016 @09:32PM (#52935525)

    What's our take away on this supposed to be?

    (A) These evil scoundrels are cheating on the government tests

    (B) The people who are designing the government tests epically suck at their jobs, should be fired, and have competent people hired in their places

    I'm going to have to vote "B" here, folks.

    • (B) The people who are designing the government tests epically suck at their jobs, should be fired, and have competent people hired in their places

      It's not that they suck at their jobs. Due to "fairness, transparency and accountability" requirements, any testing methodology they come up with has to be fully documented and given to the manufacturers ahead of time. Manufacturers being the scum-sucking bastards that they are will, of course, run all these tests in their own labs ahead of time and tweak the crap out of things so they come out on top.

      • It's not that they suck at their jobs. Due to "fairness, transparency and accountability" requirements any testing methodology they come up with has to be fully documented and given to the manufacturers ahead of time. Manufacturers being the scum-sucking bastards that they are will, of course, run all these tests in their own labs ahead of time and tweak the crap out of things so they come out on top.

        Sorry, but the tests are supposed to be "representative of normal usage".

        Even if they document the tests, if they can be gamed in a test representative of "normal usage", then the same gaming will kick in on actual "normal usage", and so the test will not have been gamed.

        You can have them be shitty at designing tests, or you can have them being shitty at determining what constitutes "normal usage", but it's not possible to game something that doesn't have a variance between expected use and actual use.

        The m

        • by amorsen ( 7485 )

          Even if they document the tests, if they can be gamed in a test representative of "normal usage", then the same gaming will kick in on actual "normal usage", and so the test will not have been gamed.

          Normal usage will be viewing a different movie than the one they test with. If you can get viewers to only watch the test signal, over and over, then sure there is no variance between expected use and actual use. However, I did not buy my TV to watch a specific set of video clips in a specific sequence, repeatedly.

        • Actual "Normal use" involves lots and lots of data. Hundreds of movies, millions of hours, TV content, commercials, use as a PC monitor, game systems, the lot. It's impossible to actually measure that in a controlled environment without actually running everything across the TV; it's possible to approximate it for naive algorithms (i.e. the TV doesn't know about the test, but knows about real-world usage behavior).

          I've collected a set of various types of media--e-mails, Web pages, musics of different g

    • Re: (Score:3, Interesting)

      by Tailhook ( 98486 )

      I'm going to have to vote "B" here, folks.

      Correct answer. Energy Star certified a gasoline-powered alarm clock [nytimes.com] in 2010. It's a pointless pencil-whipping operation; another collection of government lawyers sopping up a grand living from government teats.

      • Do you have proof the alarm clock used gasoline inefficiently?

        • by guruevi ( 827432 )

          The audit revealed that the pencil pushers didn't even look at the applications before approving them; products that didn't even exist got Energy Star approved. The auditors set up a set of fake companies and sent Energy Star a set of devices they purported to make that were 20% more efficient than any competitor, eventually working up to a gas-powered alarm clock. The other thing they found is that once they got an Energy Star certification, they could plaster it on any product they wished even if it hadn

    • by tlhIngan ( 30335 )

      What's our take away on this supposed to be?

      (A) These evil scoundrels are cheating on the government tests

      (B) The people who are designing the government tests epically suck at their jobs, should be fired, and have competent people hired in their places

      I'm going to have to vote "B" here, folks.

      Or maybe the test is designed so comparisons can be made between years, models and history?

      I mean yes, you can design the test to be different and updated every single year, but then you lose the ability to compare m

    • by mjwx ( 966435 )

      What's our take away on this supposed to be?

      (A) These evil scoundrels are cheating on the government tests

      (B) The people who are designing the government tests epically suck at their jobs, should be fired, and have competent people hired in their places

      I'm going to have to vote "B" here, folks.

      C: Marketing people see any mandated metric as something to be gamed in order to get an edge over the competition. Marketers tell this to managers, managers order B to game the test.

      Like fuel efficiency ratings on cars, I don't trust energy efficiency tests on consumer electronics to be accurate. I'm sure they're real, but were done under laboratory conditions which probably involved a lot of settings turned down.

  • by Anonymous Coward on Wednesday September 21, 2016 @09:38PM (#52935537)

    Unless they did some Volkswagen-esque cheat that detected a test was running and changed settings on the fly, all they were doing was optimizing for the test, which is hardly fraud. They can't stop a consumer from switching to ultra-bright or from watching continuously flashing Pokemon episodes. It is impossible to come up with a "standardized" test that perfectly replicates "real world" conditions. My real-world conditions are different from yours. All you can do is come up with some standardized test that is hopefully representative. If the government failed, then change the test.

    • by havana9 ( 101033 )
      I think it's the same for other appliances. I have a dishwasher, and the appendix of its manual specifies how the energy test was made, with a specific program and a specific set of dishes and pans. If you have time you can use the "energy saving" program, but of course if you're in a hurry or need a high-temperature wash, the dishwasher will draw more energy. I suppose it's the same with fridges and so on. In the real world the conditions are too varied for a single standard test.
  • by Anonymous Coward

    The setting on my Samsung TV is called "Motion Lighting". It dims the screen when the image is perfectly still for more than a few seconds. It was, of course, one of the first things I disabled since it's absolute bullshit for normal use; it doesn't kick in for 99% of content, and when it does it's extremely disruptive, there's no reason static images should suddenly dim out of nowhere. I Googled it at that time (three months ago) and it seemed to be common knowledge that it was added to circumvent energy efficiency tests.

    • by PPH ( 736903 )

      Besides this, dimming the screen during periods of low/no activity makes it impossible to watch any M. Night Shyamalan movies.

    • The setting on my Samsung TV is called "Motion Lighting". It dims the screen when the image is perfectly still for more than a few seconds. It was, of course, one of the first things I disabled since it's absolute bullshit for normal use; it doesn't kick in for 99% of content, and when it does it's extremely disruptive, there's no reason static images should suddenly dim out of nowhere. I Googled it at that time (three months ago) and it seemed to be common knowledge that it was added to circumvent energy efficiency tests.

      The one use for this would be if you hit pause and stepped away from the TV. However, I would think the timer should be configurable, with a default of 10 or 15 minutes. Anything shorter, or anything that can't be changed, is, as pointed out, useless.

  • The difference between tests and actual is highest when depicting VW's on the screen.

  • by thegarbz ( 1787294 ) on Wednesday September 21, 2016 @11:06PM (#52935957)

    So today I learnt that manufacturers think that consumers give enough of a crap about the energy rating of a TV that the companies need to game the system.

    Just another example of companies completely detached from their user base.

  • Do they use sophisticated cheat software that somehow draws less from the kill-a-watt meter when an FCC employee is watching?

    • As other people will tell you, default TV settings might be too aggressive about reducing power usage. It is similar to WD Green HDDs, which have a 7-second timer to park their heads and spin down. That reduces average power usage if you use the drive as a secondary unit in Windows. It also makes it slower, which is OK for file storage. However, if you happen to insert this kind of drive into a Synology DiskStation, another 15-second timer will wake up the drive, thus stressing the HDD so mu
  • by RogueWarrior65 ( 678876 ) on Thursday September 22, 2016 @10:37AM (#52938655)

    I can't say that I blame any private industry for trying to evade the capricious regulations fomented by fascist economics. Show me a single regulator that has an advanced degree in engineering and who has put it to actual use.

  • Those scores often influence the buying decisions of consumers looking to save money on their utility bills.

    Errr... no actually.

    That's the very last thing that could influence my decision.

    Size, Image quality, price, those are the determining factors for me.

    Seriously, my TV represents less than 1% of my electric bill. Where I live, 80% of the bill is for heating.
