Intel Claims Haswell Architecture Offers 50% Longer Battery Life vs. Ivy Bridge
MojoKid writes "As with any major CPU microarchitecture launch, one can expect the usual 10-15% performance gains, but Intel has apparently put its efficiency focus into overdrive. Haswell should provide 2x the graphics performance, and it's designed to be as power efficient as possible. In addition, the company states that Haswell should enable a 50% battery-life increase over last year's Ivy Bridge. There are a couple of reasons why Haswell is so energy-efficient versus the previous generation, but the major one is moving the CPU voltage regulator off of the motherboard and into the CPU package, creating a Fully Integrated Voltage Regulator, or FIVR. This is a far more efficient design, and with the use of 'enhanced' tri-gate transistors, current leakage has been reduced by roughly 2x to 3x versus Ivy Bridge."
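A rough way to see what the claimed leakage reduction could buy at idle (the idle power figure and the leakage share below are assumptions for illustration, not Intel numbers), as a minimal Python sketch:

    # Back-of-envelope: what a 2x to 3x leakage cut could mean for idle package power.
    # The idle power and the leakage/switching split are assumed, not from Intel.
    idle_power_w  = 2.0    # assumed idle package power on the older part (W)
    leakage_share = 0.6    # assumed fraction of that idle power lost to leakage
    for cut in (2.0, 3.0):
        new_idle = idle_power_w * (1 - leakage_share) + idle_power_w * leakage_share / cut
        print(f"{cut:.0f}x less leakage -> {new_idle:.1f} W idle")   # ~1.4 W and ~1.2 W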
OEMs don't always get voltage regulation right (Score:1)
Early last year some Lenovo Thinkpads had issues with lockups due to a voltage regulator being off spec.
Not terribly on-topic, but it was either that or scream: "I just bought an Ivy Bridge laptop dammit, Dammit, DAAAMMMIT!!!"
Re: (Score:3)
Source? This is the first I've heard of this; I haven't seen any articles on the subject, so this would be very enlightening. Generally Thinkpad quality is very high, even if their screen quality went to garbage starting around Thanksgiving 2012... It would be interesting to see more details on this, as I have been tracking the downward spiral of Thinkpad quality ever since Lenovo CEO Yang Yuanqing announced that they were going to square off the Thinkpad vs Ideapad brands under Lenovo at the cost of g
Re: (Score:2)
Check out the Lenovo forums regarding the "stop code" problem on the T430s model. They rectified the production problems in early September.
Incidentally, coming from Macbooks I have to say that press coverage of Windows/Linux systems and their performance issues is very scanty. It feels like no single model sells enough units to garner a critical mass of attention. With Apple stuff, every model has 3rd party teardown videos, other online guides and press attention just days after hitting the shelves. Maybe
Re: (Score:2)
Early last year some Lenovo Thinkpads had issues with lockups due to a voltage regulator being off spec.
Not terribly on-topic, but it was either that or scream: "I just bought an Ivy Bridge laptop dammit, Dammit, DAAAMMMIT!!!"
But putting the voltage reg in the CPU seems to be fraught with peril as well.
This means you are going to have to 1) have redundant regulation on the mo-bo for other components, and 2) subject your CPU to much higher (and unregulated) voltages. You've added another heat generation source right there on the CPU, and power excursions are likely to take out your processor.
Re: (Score:2)
But higher voltages mean less current, which helps.
Plus if the voltage regulators are in the CPU package, they can use the MUCH better thermal solution provided for it.
Re: (Score:2)
If that's true then maybe Intel is making this move so they can sell more product: Power breakdowns to stand in as a replacement for technological obsolescence (which has been petering out in recent years).
And before anyone calls me cynical, I know for a fact that Intel is concerned about keeping the replacement cycle going. They have stated it at times when investors were getting jittery, and they even had a TV ad in plain view that admitted they wanted to entice people who "thought" they were perfectly ha
Re: OEMs don't always get voltage regulation right (Score:2)
You already have a separate, programmable regulator for Vcore (overclockers fiddle with it all the time) and in both cases if the regulator fails the CPU is toast so there's no advantage in keeping it outside. I'm not sure how they integrated the reactive components, but they're surely more reliable than current electrolytics, plus shorter paths mean less voltage drop meaning less stress.
Re: (Score:3, Interesting)
This means you are going to have to 1) have redundant regulation on the mo-bo for other components,
Nope. Motherboards already had dedicated regulators just for the CPU.
High-speed CPU core logic needs very low supply voltages, around 1.0V these days. Lower-speed parts built in older processes need higher voltages -- 1.2V, 1.5V, 1.8V, or more. There's not much on the motherboard that can even share a supply with the CPU. Also, CPUs now dynamically vary their own core voltage (by sending commands to the regulator) in order to save power. That wouldn't work so well with other chips sharing the same reg
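On Linux you can watch the frequency side of that dynamic scaling through the cpufreq sysfs files; a minimal sketch (the path is the standard kernel interface, the sampling loop is just for illustration and is not Haswell-specific):

    # Sample the current clock of core 0 via Linux cpufreq sysfs; the voltage
    # follows whatever P-state the hardware/OS picked for that frequency.
    import time

    CPUFREQ = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq"

    for _ in range(5):
        with open(CPUFREQ) as f:
            khz = int(f.read().strip())
        print(f"core 0 running at {khz / 1000:.0f} MHz")
        time.sleep(1)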
Nice (Score:1)
That's fantastic. I love seeing efficiency, but I imagine that the screen would eat most of the battery life in consumer applications.
Re: (Score:2, Informative)
Depends on the screen you have, I would guess. https://www.google.com/search?q=laptop+screen+wattage&aq=f&oq=laptop+screen+wattage
If you look at the first link there, you'll see that the LCD screen takes up on the order of 5W of power at full brightness. The same paper says that the power usage roughly doubles when you start blasting the CPU. If you use your laptop like I do (I'm in an engineering program at college), that's some nice savings there if they can trim the CPU usage.
Re: (Score:3, Insightful)
Depends on the screen you have, I would guess. https://www.google.com/search?q=laptop+screen+wattage&aq=f&oq=laptop+screen+wattage [google.com] If you look at the first link there, you'll see that the LCD screen takes up on the order of 5W of power at full brightness. The same paper says that the power usage roughly doubles when you start blasting the CPU. If you use your laptop like I do (I'm in an engineering program at college), that's some nice savings there if they can trim the CPU usage.
Yes, screen technology is important.... Pixel Qi technology seems to be ignored and should not be.
Especially on laptops that mate well with a docking station for "work".
A big quality display at the office is a good thing, especially one that has been rotated to be tall. The ability to have a very low power transmissive/reflective display while mobile and a serious display at a desk at work is underserved.
Docking station tech is lame at best. First the battery charging logic is flawed. The charger sho
Re: (Score:2)
Docking station tech is lame at best. First the battery charging logic is flawed. The charger should disconnect from the battery once it is charged. It should test the battery once an hour thereafter and decide what to do. I cannot tell you how many batteries I have had die from long term over charging and lack of correct dynamics in use.
Or simply not charge the battery. I think this is a software problem as opposed to a hardware issue.
A docking station should have cooling designed to keep the battery as well as the CPU/logic cool. Most obstruct air flow and do neither well.
I think this was the purpose of Thunderbolt. You don't need a docking station anymore, just the charger and one cable for connections. As far as I know, Apple is the only one that fully embraces TB. Not surprisingly, I think this is because Apple doesn't have a docking station. Maybe it was for aesthetics that Apple never designed one. Other manufacturers are more hesitant to use TB as it means they can no
Re: (Score:2)
You do realize Apple was one of the first, if not the first, with a docking station years and years ago, right?
Re: (Score:2)
Re: (Score:2)
Docking station tech is lame at best. First the battery charging logic is flawed. The charger should disconnect from the battery once it is charged. It should test the battery once an hour thereafter and decide what to do. I cannot tell you how many batteries I have had die from long term over charging and lack of correct dynamics in use.
A docking station should have cooling designed to keep the battery as well as the CPU/logic cool. Most obstruct air flow and do neither well.
This depends entirely on the laptop/battery. The last two Lenovos I've had both offered smart charging where the battery would optionally not begin charging until below X% and would stop when the battery signaled it was full. The charging threshold could either be directly specified by the user or determined by the laptop based on usage pattern.
My previous machine I set to not recharge until below 85%. It was a power hog so the battery was pretty much a pack-along UPS. 15% represented a fairly small number
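For anyone curious, recent Linux kernels expose that kind of ThinkPad charge-threshold control through the power_supply sysfs class; a minimal sketch (the attribute names are the upstream kernel ones, but whether your battery exposes them depends on the model, firmware, and kernel version, and writing them needs root):

    # Sketch: set start/stop charge thresholds so the battery isn't topped up constantly.
    BAT = "/sys/class/power_supply/BAT0"

    def set_threshold(name, value):
        with open(f"{BAT}/{name}", "w") as f:   # requires root
            f.write(str(value))

    set_threshold("charge_control_start_threshold", 85)  # don't recharge until below 85%
    set_threshold("charge_control_end_threshold", 95)    # stop charging at 95%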
Re: (Score:3)
Re: (Score:2)
Re: (Score:1)
here is an imaginary i5-3439Y laptop power budget:
screen : 12 W
board : 1 W
cpu : 15 W
HDD : 1 W
Wifi : 1 W
Total : 30 W
To slash that by 50% you would have to have a magical CPU that consumes no power, so there is your upper bound on power reduction...
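Plugging that imaginary budget into the battery-life arithmetic makes the bound explicit; a quick sketch (all numbers are the parent's assumptions, not measurements):

    # Battery life scales as capacity / total draw, so only the CPU's slice of the
    # budget is available for improvement.
    budget = {"screen": 12, "board": 1, "cpu": 15, "hdd": 1, "wifi": 1}   # watts
    total = sum(budget.values())                                          # 30 W

    for cpu_cut in (0.25, 0.50, 0.67, 1.00):      # fraction of CPU power removed
        new_total = total - budget["cpu"] * cpu_cut
        gain = total / new_total - 1
        print(f"cut CPU power by {cpu_cut:.0%}: battery life +{gain:.0%}")
    # Prints roughly +14%, +33%, +50%, +100%; even a zero-watt CPU only doubles
    # run time under this load profile.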
Re: (Score:2)
Re:No way (Score:4, Insightful)
Tn = a + (1 - a)/N
Where Tn = execution time with N cores, normalized to the single-core time
N = number of cores
a (should be alpha) = fraction of instructions in serial code.
What you are talking about is:
Bp = (1 - ((Pt - Pc)/Pt)) * 100 = (Pc/Pt) * 100
While Amdahl is significant to the computer science world, are you claiming he invented percentages?
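To make the two formulas concrete (the serial fraction and the power split below are illustrative values only):

    # Amdahl's normalized time vs. the plain percentage the parent is pointing at.
    def amdahl_time(a, n):            # a = serial fraction, n = cores
        return a + (1 - a) / n        # execution time relative to a single core

    def cpu_share(p_total, p_cpu):    # the "Bp" percentage above
        return p_cpu / p_total * 100

    print(f"{1 / amdahl_time(0.1, 8):.1f}x")   # ~4.7x speedup on 8 cores, 10% serial
    print(f"{cpu_share(30, 15):.0f}%")         # 50% -- just the CPU's slice of the budget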
Re: (Score:2)
you forgot the 2-4 W for RAM, if not more.
Re: (Score:2)
I've highlighted the problem with this refutation.
Re: (Score:2)
Don't hold your breath. He obviously could have just said that the screen eats enough power that you couldn't possibly cut total power consumption by half with just the CPU. But that doesn't make him sound as intelligent and mysterious as citing a mathematical argument that he has no idea how to work out without actual numbers, and that isn't really relevant in the first place.
Re: (Score:2)
The analogy is sound: the "parallel" part is the processor, the "non-parallel" part is everything else, and it approaches the same power baseline with increased processor efficiency as it does the performance baseline with increased parallelization. But I feel it's a rather silly complication of the obvious, unlike parallelization. Yes, of course, if the screen is the biggest power hog, then it has the most potential for improvement. Note that it would be a fallacy to think it will always have the greatest improveme
Re: (Score:2)
More to the point, however, the OP's assumption that the screen uses the most power is dead wrong.
Re: (Score:2)
Re: (Score:3)
Amdahl's laws are many.
Here are four of them.
0. Amdahl’s parallelism law: If a computation has a serial component S and a parallel component P, then the maximum speedup is (S+P)/S.
1. Amdahl’s balanced system law: A system needs a bit of IO per second for each instruction per second: about 8 MIPS per MBps.
2. Amdahl’s memory law: alpha=1: that is, in a balanced system the MB/MIPS ratio, called alpha, is 1.
3. Amdahl’s IO law: Programs do one IO per 50,000 instructions.
Corollary:
In any d
Re: (Score:2)
Re: (Score:2)
But only what you listed as the 0th law is credible and broadly accepted, as a fundamental rule. There is great doubt as to under what conditions those other 'laws' can or will continue to hold.
Re: (Score:3)
Wait, I thought Adama's law was: The only good toaster is a dead toaster.
Re: (Score:3)
Maybe with the old fluorescent backlights, but not these days. A typical LED backlight on a laptop draws something like 3 watts at maximum brightness. It isn't lost in the noise, but it is by no means the main power draw. The CPU, chipset, and RAM take way more current.
Re: (Score:2)
Considering that everything is ultimately running off a battery that provides only a single voltage, the distinction is moot.
What about the display? (Score:1, Informative)
The biggest battery drain on my phone is always the display, followed by "Cell standby". How is a CPU and chipset able to promise a 50% increase in battery life when it's not even the biggest power user in the phone?
Re:What about the display? (Score:5, Insightful)
Re: (Score:2)
Maybe it's a modded Osborne 1 [oldcomputers.net]?
Probably not Ivy but... (Score:2)
http://i01.i.aliimg.com/img/pb/419/444/420/420444419_371.jpg [aliimg.com]
Re: (Score:2)
Re: (Score:1)
Phone CPUs and laptop/desktop CPUs are in different leagues.
It is no surprise the biggest draw in a phone is your screen. On a laptop the biggest ones are the CPU, then the video card/chipset, then the screen.
You are comparing apples and oranges. Many phones are SoCs these days, or at best 2-3 chips. Laptops are not there yet. Your phone CPU measures its draw in milliwatts; the laptop/desktop crew measures in watts.
They had a very decent boost last year with ivy. I went from a sandy bridge laptop to an ivy and the
Re:What about the display? (Score:5, Funny)
They had a very decent boost last year with ivy. I went from a sandy bridge laptop to an ivy and the battery life doubled.
That's nothing, wait till you see the Tacoma Bridge chips they're planning for the next year. I've heard they've made a real break-through with them.
This is about the cpu gpu? (Score:3)
Without checking the source, I bet it is only the CPU/GPU/power that is getting lower values. It is the old Intel story again. First it was the Atom CPU that was supposed to be super low power; however, they forgot to mention you needed a chipset along with it for the video, networking, and PCI that was not so super power-savvy.
Now the CPU/GPU is super power-saving. But the wifi/display/battery/2g/3g/nfc/audio/cam/gps might still drain your battery in 3 seconds.....
Re: (Score:2)
Seconded with Atom netbooks. I had one that lasted from LA to Tokyo at one point; it drew about 6W from the battery.
Although we're close to that now. Now I have a 14" gaming laptop with an i5 Ivy Bridge in it, and powertop reports that I can run the whole shebang (obviously with the GPU off) on 8W or so.
Re: (Score:2)
According to Wikipedia at least [wikipedia.org], the Haswell architecture will include a die-shrink in the PCH (Northbridge) chipset from 65nm to 32nm, so this issue is avoided I think.
Re: (Score:2)
No, sorry, you did not understand. The northbridge may be very power-savvy now (compared to what???), but besides the CPU with integrated northbridge you also need a lot of other supporting hardware.
Notice in the pictures that it is targeted for tablet size, not phablets or phones. They need a lower kind of power usage, I suppose.
Re: (Score:2)
The biggest battery drain on my phone is always the display, followed by "Cell standby". How is a CPU and chipset able to promise a 50% increase in battery life when it's not even the biggest power user in the phone?
I would guess that you suffered a brief lapse in reading comprehension. My take on this is that the Haswell uses 50% less power for the same performance / capability as an Ivy Bridge. Whether or not that cuts battery consumption overall by 50%... well I highly doubt it.
Barry Life or CPU Power Usage (Score:4, Insightful)
Is this seriously 50% increase in battery life? Or just 50% reduction in power usage by CPU? The article wasn't clear on this. I'm assuming the power usage thing.
Re: (Score:2, Insightful)
Very likely, they're talking about the CPU using 50% less power. Intel doesn't make laptop batteries, and battery technology is on a plateau right now since we're hitting the very limits of chemistry in Li-Ion and Li-Poly batteries at the moment.
Re: (Score:2)
Very likely they are talking about the CPU using 33% less power, thus increasing how long it can run on a given amount of Wh by 50%.
Re: (Score:2)
Very likely, they aren't, since they make specific claims about CPU power under different regimes and all of them are much more significant than that, and then go on to say that the CPUs will enable laptops using them to have 50% greater battery life.
Screen power usage (Score:2)
For workloads where the system is awake but mostly idle (think web browsing etc) you'll see enormous gains in energy efficiency; the less idle it is the less gain.
When I browse the web on my Nexus 7 tablet, "Screen" already takes at least 67 percent of the battery. And that's with ARM, which already sips less power in general than x86. What CPU upgrade will fix that?
A more efficient CPU is good but won't fix all (Score:2)
Re: (Score:2)
> Is this seriously 50% increase in battery life? Or just 50% reduction in power usage by CPU?
Assuming the CPU was the only element consuming power, a 50% reduction in power usage by the CPU would equate to a 100% increase in battery life. But, yes, what they are claiming is that the net effect of the various improvements is that it should enable a 50% increase in battery life, not that it will merely reduce power consumption on the CPU by the amount that would do that if the CPU was the only power draw.
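Spelled out, since battery life is proportional to capacity divided by average draw (pure arithmetic, no Haswell-specific numbers):

    # Cutting power by 50% doubles run time (+100%); cutting it by a third gives
    # 1/(2/3) = 1.5x the run time (+50%).
    def life_gain(power_cut):          # power_cut = fraction of total power removed
        return 1 / (1 - power_cut) - 1

    print(f"{life_gain(0.50):.0%}")    # 100%
    print(f"{life_gain(1/3):.0%}")     # 50%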
Re: How is Barry Life Formed? (Score:1)
How is Barry Life Formed? How Usage get Consemption?
they need to do way instain comsumer> who kill thier barrys. becuse these barry cant frigth back?
it was on the charger this mroing a user in ar who had kill their three divice.
they are taking the three barry back to zero charge too lady to rest.
My parry are with the tickle chrager who lost its powre ; i am truley sorry for voltage lots.
Re: (Score:2)
IIRC, Intel did a lot of work on the whole system (including motherboards - I think they actually worked with other manufacturers too), not just the chip. Not all the savings are from the CPU.
No, that's not it. (Score:5, Interesting)
Math tip: A 50% increase in battery life (what they actually claimed) isn't the same as doubling it.
Also, since a big selling point for Haswell (aside from power efficiencies) is the claimed great improvement (~2x for laptop-oriented models, ~3x for desktop-oriented models) in graphics performance, I'd be very surprised if their claims about battery life were focused on systems using discrete GPUs rather than relying on the integrated graphics on Haswell.
Well, except that they explicitly claimed that was overall battery life, and it was a 50% increase not 2x, and they actually cited numbers for improvement in idle life and it was much higher than the +50% claimed overall (or even the 2x you pulled out of who-knows-where), since their claimed idle-mode improvement was twenty times (TFA is less clear on this, but Computerworld covers the same event with more specificity: "And in idle or standby mode the chips will do even better, extending battery life by up to 20 times, [Rani Borkar, Intel's Architecture Group VP] said." [emphasis added])
I've got FIVR (Score:1)
Re: (Score:2)
FIVR in the mornin' FIVR in the evenin', FIVR all through the night!
Yeah, but it seems the biggest benefit they got is in sleep states, and I don't think sleeping in the morning, sleeping in the evening, and sleeping all through the night is what the song is all about...
Desktops? (Score:3)
Is this a laptop-only chipset, or does Intel have goodies for those who like to be chained to their desks?
Re: (Score:2)
Haswell is a laptop/desktop/server microarchitecture, but Intel doesn't care very much about the desktop anymore, so expect little press coverage of that angle.
Re: (Score:2)
Yeah, it's not like most of the stories on this announcement have covered Intel's claim of tripling the integrated graphics performance on desktop systems (and doubling it on laptop systems).
Well, except that that is exactly the case.
Yes and No. (Score:3)
Like most CPUs these days, they produce a lot of variants.
For this article they are likely talking about the "U" variant with 15W TDP.
http://en.wikipedia.org/wiki/Haswell_(microarchitecture)#Mobile_processors [wikipedia.org]
You can't really compare that with (or say in the same breath as) the desktop "K" variant with an 84W TDP (which also has twice the cores and threads).
http://en.wikipedia.org/wiki/Haswell_(microarchitecture)#Desktop_processors [wikipedia.org]
I am pretty sure the benchmarks will be wildly different. Anyway, the summary makes it sound like it is all one thing. I am sure it will be very good and all, but I know I won't be getting one of those power-saving versions. POWER! (To quote Clarkson)
I wonder how they compare (Score:2)
I wonder how the performance vs power consumption compares to the old Transmeta chips that started the trend.
Not much left for the motherboard to do (Score:1)
Soon motherboards will be just wiring for the I/O and CPU
Re: (Score:2)
Soon motherboards will be just wiring for the I/O and CPU
And despite that, there is no price decrease to be seen in motherboards... if anything, they are getting more expensive, despite having less silicon and intelligence on them <G>
"Transistors maintain the same operating frequency (Score:3)
Is that marketing speak for "we were unable to increase the operating frequency"?
Re: (Score:2)
AMD CPUs run 4.4GHz stock. There must be a different reason. It might be a tradeoff between complexity and pipeline depth.
Re: (Score:2)
Or it's simply the fact that dynamic power scales with frequency times the square of the voltage. So, increasing the clock and voltage causes the power to go up significantly for a given number of transistors.
So it's an evil tradeoff: add more transistors and increase leakage, or bump the clock rate and increase the dynamic power.
AMD has chosen the faster clock rate, for a few percent decrease in efficiency, while Intel has chosen the power-saving option for a few percent decrease in performac
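The textbook relation behind that tradeoff, with made-up voltage/frequency pairs just to show the scaling (none of these are actual chip specs):

    # Dynamic (switching) power: P ~ alpha * C * V^2 * f. Higher clocks usually
    # also need higher voltage, so power climbs much faster than frequency.
    def dynamic_power(alpha, c, v, f):
        return alpha * c * v**2 * f

    base = dynamic_power(0.2, 1e-9, 1.0, 3.5e9)   # ~3.5 GHz at an assumed 1.00 V
    fast = dynamic_power(0.2, 1e-9, 1.2, 4.4e9)   # ~4.4 GHz at an assumed 1.20 V
    print(f"{fast / base:.2f}x the switching power for {4.4 / 3.5:.2f}x the clock")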
And yet... (Score:5, Informative)
Too bad the CPU hasn't been the biggest consumer of watts in many years.
Hint: the biggest power draw in most laptops is the glowing part you look at.
E-ink laptop plz (Score:3)
Great! (Score:2)
Now they can make the OS and application coding less efficient!
Re: (Score:2, Informative)
Re: (Score:2)
Yes, but they don't have a ground plug, so jacking off in the can while plugged in greatly increases the risks of severe electric shock.
Re: (Score:2)
Yes, but they don't have a ground plug, so jacking off in the can while plugged in greatly increases the risks of severe electric shock.
You've obviously never tried electric stimulation...
laptops are dc and I don't think ground pass thoug (Score:2)
Laptops are DC, and I don't think ground passes through the power brick to the laptop.
Re: laptops are dc and I don't think ground pass t (Score:2)
The Y capacitor [imgur.com] can leak enough for an uncomfortable tingle on sensitive skin like your bare lap (eg wearing shorts) or the underside of your forearms.
What the flux? (Score:2)
Re: (Score:2, Funny)
If you're watching 3-hours of porn in a single sitting you're doing it wrong.
Re:Well then (Score:5, Funny)