AMD — "We're Not Entirely Honest" About Batteries
Slatterz writes "In an apparent attack of the bleeding-obvious, an AMD rep has come clean and admitted (on behalf of the industry) that notebook and phone battery life figures are completely unreliable. AMD's senior vice president Nigel Dessau says that 'we are not being entirely honest with users about what PC battery life they can expect to actually experience.' He says AMD will now use a combination of idle time (where the machine is left to sit idle, and timed to see how long it takes for the battery to go dead), and 3DMark06 to measure battery life. Great in theory but some of the industry already bases battery figures on a two-test measurement, and the results are still wildly inaccurate."
Anonymous Coward (Score:1, Insightful)
"We're Not Entirely Honest" = We've been lying
Not that big of a deal (Score:1, Insightful)
Not much different than EPA ratings on cars (Score:4, Insightful)
Re:Not much different than EPA ratings on cars (Score:3, Insightful)
Re:Anonymous Coward (Score:5, Insightful)
No, that's the thing. Everything they've told you is technically true... under certain conditions. Possibly even the conditions that they've listed in a small-print disclaimer (available upon request, if you can arm-wrestle the tiger and win).
Re:Sounds familiar.. (Score:5, Insightful)
If you recall, AMD's performance rating was an important step forward for the CPU manufacturer industry at that time. Intel was pushing for higher and higher clock frequencies with longer and longer pipelines - something that made little sense.
Performance ratings allowed consumers to effectively compare AMD and Intel chips side by side in ways that are useful.
Re:Isn't this simple? (Score:3, Insightful)
What's needed are some regulations regarding how computer parts are advertised, but that would require a federal government that takes an interest in protecting the consumer, not protecting corporate profit.
Re:Won't matter soon (Score:4, Insightful)
Re:Isn't this simple? (Score:5, Insightful)
3) Advertise "minimum" battery life
What is wrong with that? Then I can expect at least 40 minutes of battery life and anything more than that is nice.
What's wrong with that? What's wrong is that you're telling the customer a number that by and large they aren't interested in. What they want to know is if they can watch a full DVD without recharging. If they can work on their Excel spreadsheet for the entire 6-hour cross-country flight if the plane doesn't have plugs. You tell them "minimum 40 minutes" and they say "Whoa! That's not long enough to do anything!" and you say "Well it's just the minimum, under typical usage conditions it will last much longer," and then they ask "And how long is typical? Long enough to watch Casino Royale on BluRay?"
What's your answer? Hypothetically you should be able to actually say whether it'll last long enough to watch the movie, but how do you answer that question in general? What is "typical"? That's what people really want to know, the minimum number doesn't really do them much good except to say that if they really load down the laptop, it won't last long. Which makes the product look bad, and is still by and large not that helpful.
It's not an easy question, more difficult in many ways than talking about performance. Considering that power has only become a major concern for commodity chip makers in recent times, I'm not surprised that their battery life estimates aren't very accurate. Of course, whatever estimate they do use, no matter how accurate, will be measured in a way that makes their parts look good. That won't change, ever. I'm sure that's part of your motivation for the minimum time metric -- there are far fewer ways to screw with it. Which is nice, but not sufficient by itself.
Exactly like MPG estimates (Score:5, Insightful)
>>This happens in every industry
This is a bit different from a breakfast cereal saying "now even tastier" or a soap promising "more suds!" The first is subjective (personal preference) but the second is objective -- it can be quantified and proven/disproven.
In this case with batteries, rather than taking an actual measurement of performance, the industry is building an estimate from a combination of measured behavior + a calculation based on a performance variable. It's no different than the automobile industry stating "EPA Estimated MPG city/highway" which is not based on a dynamometer test or actual performance measurement but instead is calculated based on the amount of CO2 which exits the exhaust pipe of the car! Is it any wonder, then, that hybrid cars which shut off their gasoline engine when stopped and at low speed/light acceleration, would give grossly inflated figures? Well, they did (and do), which explains why real-world MPG is often far less than this calculated (not even simulated) performance.
In short, they're both lying and it's obvious. Yet companies wonder why consumers are so cynical and therefore difficult to reach with advertising.
What is needed is real-world testing -- dynamometer ("rolling test track") testing for autos where the wind resistance, temperature, barometric pressure, etc. can all be carefully controlled. Similarly with computers, a pure performance-based measurement is needed which should account for idle time, network activity, etc. Just as an automobile is not tested at full-throttle for 3 hours, neither should a PC, but instead a variety of benchmarks (gaming, web browsing, spreadsheet, word processing, ???) could show performance figures for various activities.
In short, manufacturers, we want real numbers free of hype.
Give us synthetic and real-world benchmarks (Score:2, Insightful)
Give us "real" numbers like how long the battery will last sitting in a drawer or under "full load" in a particular device, and how long the battery will last under a variety of scenarios.
An "emergency" phone user is more interested in how long they can leave their phone in their glove compartment before recharging.
"Light" users want to know standby time and how many minutes of standby time they lose for every minute they talk.
"Heavy" users are more interested in talk time and how much "talk time" they lose if they leave their phone on but not charging overnight.
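The standby-versus-talk trade-off the "light" user cares about falls out of simple arithmetic once you have the two rated drain rates. A quick Python sketch, using made-up ratings (300 hours standby, 10 hours talk) rather than any real phone's figures:

```python
STANDBY_H = 300.0  # hypothetical rated standby time, hours
TALK_H = 10.0      # hypothetical rated talk time, hours

# Drain rates, as fraction of a full battery consumed per hour
standby_rate = 1 / STANDBY_H
talk_rate = 1 / TALK_H

# Standby minutes given up for each minute spent talking
cost = talk_rate / standby_rate
print(cost)
```

With these invented numbers, every minute of talk costs thirty minutes of standby, which is exactly the kind of figure a spec sheet could state directly.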
Re:Isn't this simple? (Score:4, Insightful)
What is wrong with that?
It gives me, the customer, absolutely nothing to work with. There's a reason we don't calculate "miles per gallon" (or km/l over here) for the "pedal to the metal" case.
In an ideal world (you know, where everyone knows basic math and nobody is fooled by politicians' campaign promises) you'd have a bunch of measurements at various loads and simply print a curve that tells me what I need to know because I have a somewhat good feeling for where my average use scenario is on the curve.
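In its crudest form, the curve asked for here is just pack capacity divided by draw. A Python sketch with a hypothetical 56 Wh pack (both the capacity and the wattage points are invented, and this naive division ignores the fact that heavy draw disproportionately reduces usable capacity):

```python
BATTERY_WH = 56.0  # hypothetical pack capacity, watt-hours

# Naive runtime across a range of loads, from near-idle to gaming
for watts in (8, 12, 18, 25, 35, 45):
    hours = BATTERY_WH / watts
    print(f"{watts:2d} W draw -> {hours:4.1f} h")
```

Even this simplistic table would let a buyer place their own usage on the curve, which is more than a single advertised number allows.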
Re:Anonymous Coward (Score:4, Insightful)
Re:Anonymous Coward (Score:5, Insightful)
"We're Not Entirely Honest" = We've been lying
Actually, "We're Not Entirely Honest" = "We have no idea how to give you an accurate estimate." As someone who's found himself sucked into the battery/mobile power side of a project recently, I can understand why they'd face difficulties.
When it comes to batteries, there are really only three options for measuring how much power is stored: completely drain it over several cycles to see what you get (which is how the manufacturer confirms capacity, but isn't too useful in situ); test the voltage across the terminals and estimate based on pre-measured battery curves (which is difficult because voltages don't change dramatically until they're nearly drained); or, in some chemistries, measure the temperature changes in the battery (which detects reactions that don't happen until the battery is almost completely drained). In practice, all you can do is take the manufacturer-specified capacity, derate that based on conditions in your application, and test to see if you came close.
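The second option, estimating charge from terminal voltage against a pre-measured discharge curve, is essentially a table lookup with interpolation. A rough Python sketch; the voltage/capacity pairs below are invented rather than taken from any real cell's datasheet, but the flat middle of the curve illustrates exactly why the method is so imprecise:

```python
# Hypothetical Li-ion discharge curve: (open-circuit volts, % capacity left).
# Real curves are measured per chemistry and temperature; these are illustrative.
CURVE = [(4.20, 100), (4.00, 85), (3.85, 70), (3.75, 50),
         (3.65, 30), (3.55, 15), (3.40, 5), (3.00, 0)]

def charge_from_voltage(v):
    """Linearly interpolate remaining % capacity from terminal voltage."""
    if v >= CURVE[0][0]:
        return 100.0
    if v <= CURVE[-1][0]:
        return 0.0
    for (v_hi, c_hi), (v_lo, c_lo) in zip(CURVE, CURVE[1:]):
        if v_lo <= v <= v_hi:
            frac = (v - v_lo) / (v_hi - v_lo)
            return c_lo + frac * (c_hi - c_lo)

print(charge_from_voltage(3.80))  # roughly 60% at 3.80 V
```

Note that between 70% and 30% capacity the voltage only moves 0.2 V in this made-up curve, so a small measurement error translates into a huge error in the estimate.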
In general, pulling more current from a battery disproportionately decreases remaining capacity. In general, it's pretty difficult to respond to sudden surges and lulls in power consumption for a user's unknown power cycle needs without making your estimate jump all over the place. In general, the problem is just a pain in the neck. It's like ordering a margarita with margarita-flavored ice cubes from a waiter who's never seen you before, then demanding to know exactly how long before you'll need to refill it (regardless of whether you intended to chug it or nurse it).
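That disproportionate effect of heavy draw is often approximated by Peukert's law, t = H·(C/(I·H))^k, where H is the rated discharge time in hours, C the rated capacity in amp-hours, I the actual current, and k a chemistry-dependent exponent (around 1.1-1.3 for lead-acid, milder for lithium). A sketch with illustrative numbers only:

```python
def runtime_hours(capacity_ah, rated_hours, current_a, k=1.2):
    """Peukert's law: runtime shrinks faster than linearly as current rises.
    k is chemistry-dependent; 1.2 here is purely illustrative."""
    return rated_hours * (capacity_ah / (current_a * rated_hours)) ** k

# Hypothetical 50 Ah battery rated at the 20-hour discharge rate (2.5 A)
print(runtime_hours(50, 20, 2.5))  # at the rated current you get the full 20 h
print(runtime_hours(50, 20, 10))   # naive C/I says 5 h; Peukert says under 4 h
```

Quadrupling the current here costs you well over four-fifths of your runtime, which is why "minutes remaining" estimates lurch around as load changes.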
I'm no expert, but you don't have to be to see it's not a trivial problem.
Re:Isn't this simple? (Score:4, Insightful)
Incorrect. I would respond, "That is the minimum time under the heaviest possible load it can be put under."
And they'd say "Okay, is playing a DVD the heaviest possible load, so I couldn't even play half of one movie? Or will I be able to play my movie? What about working on my earnings report?" and then you either have to refuse to give them any other number and lose the sale, or start talking about "typical" usage.
I'd like to kindly point you to my initial post where I quite clearly said "fully load" and not "typical load".
Yes, I noticed, and I'd like to point you to my post where I clearly understood what you are talking about and said "That isn't very useful to the customer". Just because the minimum battery life has the useful property of being easier to quantify without hand-waving and assumptions doesn't mean it's actually the more useful number. Customers want to know if their laptop will last on a cross-country flight doing what it is they usually do.
Re:Exactly like MPG estimates (Score:3, Insightful)
The problem with fuel economy and battery life measurements is that in the real world you do not drive the same as when the vehicle was tested on a dyno. The dyno test specifies how fast to drive, how quickly to accelerate, the number of stop lights, and how far to drive. Your daily commute will be different for each one of these parameters, which changes your actual fuel economy. Even on your daily commute, your average speed will change from day to day. So, even though your destination is the same, your fuel consumption will be different on a daily basis.
The problem with fuel economy testing is that it attempts to specify a driving cycle that represents the average for all Americans. As you know, the way people drive in Los Angeles is completely different than the way people drive in rural Wyoming. So, people in LA get completely different fuel economy than people in Wyoming.
The same problem exists with predicting battery life. You simply don't know how the machine is going to be used. How can you predict the future? If you are using the optical drive you will use more power, which shortens your battery life. The best you can do is to try and predict life based on some average statistics for a given machine. Standardized tests will help, but you will never be able to provide a precise number because of the variations in power consumption.
The other problem with battery life is that as the battery ages, the capacity of the battery decreases, which further shortens your battery life. In order to accurately predict battery life you need to model the aging properties of the battery. Having a background in batteries, I can say that this is not simple.
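As a crude illustration of capacity fade (not a real aging model, which would have to account for temperature, depth of discharge, and chemistry), here is a linear fade to 80% of rated capacity over a nominal cycle life; all of the constants are invented:

```python
def aged_capacity(initial_wh, cycles, rated_cycles=500, end_fraction=0.80):
    """Linear fade from 100% to end_fraction of capacity over rated_cycles.
    A rough illustration only; real fade curves are nonlinear and depend on
    temperature, depth of discharge, and cell chemistry."""
    frac = max(end_fraction, 1 - (1 - end_fraction) * cycles / rated_cycles)
    return initial_wh * frac

print(aged_capacity(56.0, 0))    # fresh pack: full 56 Wh
print(aged_capacity(56.0, 500))  # end of rated life: about 45 Wh
```

Even this toy model shows the problem: a battery-life figure printed on the box is only valid on day one.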
Re:Isn't this simple? (Score:5, Insightful)
1) If a customer doesn't know how much load playing a DVD is, they don't care about advertised battery life.
O_o Seriously?
You think knowing the % CPU utilization of watching a DVD (or BluRay, if this makes you feel better) is a pre-requisite to caring about the question "Will my laptop be able to play a full DVD between recharges?"
And I suppose anyone who doesn't know engine timings and torque curves wouldn't care about the question "Can I get to Grandma's house without refueling?" too. That's weird, because I care about MPG but know very little about the engine physics that inform it. Should they scrap the "city/highway" MPG usage models, and instead tell you what MPG you'd get with the accelerator floored the entire way?
You're being ridiculous. Obviously people will care about being able to do the things they want to do without knowing exactly how much load on the system that actually entails.
Of course, even if I accept this premise, you're still not giving them the information they want. Okay, so I know that the fully loaded laptop's battery life is 40 minutes, and I know that my DVD player uses 5% of my processor. Now what? What's the scaling factor so I can do the math? Oh, right, it's not that easy, even if you're an electrical engineer. I know what it is you said I should know, and you still can't answer the question I care about.
2) Minimum is a more useful number because it always applies. Typical usage figures can be plucked out of thin air because it varies too much.
Easy to figure out is not the same as useful. Minimum is rarely useful because it rarely applies to what the person is actually doing. When the "typical" numbers can be 6x higher than "minimum", and what people really care about is "typical" for all the difficulty of defining what that means, then no, minimum isn't that useful.
Sure the minimum should be specified. That does not get you out of the tricky problem of estimating 'typical' battery life, because that is what the customer wants to know, and for good reason. If you refuse to give them anything more useful than minimum, then you lose sales because you can't or won't answer the questions they care about.
Re:Exactly like MPG estimates (Score:3, Insightful)
It's no different than the automobile industry stating "EPA Estimated MPG city/highway" which is not based on a dynamometer test or actual performance measurement but instead is calculated based on the amount of CO2 which exits the exhaust pipe of the car!
I'm sick and tired of hearing this repeated over and over, and even on a website that's supposed to be read by geeky, sciency type people.
Yes, fuel consumption is measured by analyzing exhaust. However, this is an extremely accurate way of measuring it. Calculating the amount of C8H18 required to produce a given quantity of CO2 is a simple problem, one which a high school chemistry student could easily figure out. And it's certainly simpler than ripping apart the fuel system to try and measure every missing drop of fuel. And you don't have to worry about losses from evaporation or spills. If anything, it errs on the side of inefficiency, because of the extra CO2 introduced from any engine oil that is burned.
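The stoichiometry in question: treating gasoline as octane (C8H18, about 114 g/mol), complete combustion yields 8 moles of CO2 (44 g/mol each) per mole of fuel, so measured CO2 maps straight back to fuel burned. A sketch of the arithmetic (the real carbon-balance procedure also accounts for CO and unburned hydrocarbons, which are omitted here):

```python
M_C8H18 = 114.23  # g/mol, octane as a stand-in for gasoline
M_CO2 = 44.01     # g/mol

def fuel_grams_from_co2(co2_grams):
    """Carbon balance: C8H18 + 12.5 O2 -> 8 CO2 + 9 H2O,
    so each mole of octane produces 8 moles of CO2."""
    moles_co2 = co2_grams / M_CO2
    moles_fuel = moles_co2 / 8
    return moles_fuel * M_C8H18

print(fuel_grams_from_co2(1000))  # about 324 g of fuel per kg of CO2
```

It really is high-school chemistry, and measuring a gas stream is far easier than metering every drop of liquid fuel.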
They DO perform the tests on a dynamometer, you are simply wrong. They use wind resistance values from wind tunnel testing of the vehicles to ensure that the dyno is programmed correctly for each car.
The only real fault in the system is that the "virtual driver" in the test is not nearly as aggressive as a typical driver today. Hard 0-70 acceleration and 85 MPH cruise speeds are not represented in these tests. The stop and go city portion of the simulation has the car stopping and going at legal, sedate speeds; no full throttle dashes from one stop light to another.
A new set of test procedures was instituted for model year 2008 by the EPA, which is significantly more realistic. The previous tests were established in the late 70's and represented the typical driving conditions at that time. The old "highway" test routine had an average speed of only 45 MPH and a top speed of 60 MPH. Maximum acceleration in any of the old tests was 3.3 MPH/second. The new "high speed" test has a top speed of 80 MPH and an acceleration rate of 8.5 MPH/second. The average speed is unchanged, though; I don't know many people who average 48 MPH on the freeway.
Most cars' rated fuel economy plummeted between '07 and '08, reflecting these new test procedures.