Intel Claims Haswell Architecture Offers 50% Longer Battery Life vs. Ivy Bridge 120

Posted by Soulskill
from the just-don't-use-a-screen-or-any-other-hardware dept.
MojoKid writes "As with any major CPU microarchitecture launch, one can expect the usual 10-15% performance gains, but Intel apparently has put its efficiency focus into overdrive. Haswell should provide 2x the graphics performance, and it's designed to be as power efficient as possible. In addition, the company has gone further, stating that Haswell should enable a 50% battery-life increase over last year's Ivy Bridge. There are a couple of reasons why Haswell is so energy-efficient versus the previous generation, but the major one is moving the CPU voltage regulator off of the motherboard and into the CPU package, creating a Fully Integrated Voltage Regulator, or FIVR. This is a far more efficient design, and with the use of 'enhanced' tri-gate transistors, current leakage has been reduced by about 2x to 3x versus Ivy Bridge."


  • Early last year some Lenovo Thinkpads had issues with lockups due to a voltage regulator being off spec.

    Not terribly on-topic, but it was either that or scream: "I just bought an Ivy Bridge laptop dammit, Dammit, DAAAMMMIT!!!"

    • by Hadlock (143607)

      Source? This is the first I've heard of this, I haven't seen any articles on the subject, so this would be very enlightening. Generally Thinkpad quality is very high, even if their screen quality went to garbage starting around Thanksgiving 2012... It would be interesting to see more details on this, as I have been tracking the downward spiral of Thinkpad quality ever since the Lenovo CEO Yang Yuanqing announced that they were going to square off the Thinkpad vs Ideapad brands under lenovo at the cost of g

      • by Burz (138833)

        Check out the Lenovo forums regarding the "stop code" problem on the T430s model. They rectified the production problems in early September.

        Incidentally, coming from Macbooks I have to say that press coverage of Windows/Linux systems and their performance issues is very scanty. It feels like no single model sells enough units to garner a critical mass of attention. With Apple stuff, every model has 3rd party teardown videos, other online guides and press attention just days after hitting the shelves. Maybe

    • by icebike (68054)

      Early last year some Lenovo Thinkpads had issues with lockups due to a voltage regulator being off spec.

      Not terribly on-topic, but it was either that or scream: "I just bought an Ivy Bridge laptop dammit, Dammit, DAAAMMMIT!!!"

      But putting the voltage reg in the CPU seems to be fraught with peril as well.

      This means you are going to have to 1) have redundant regulation on the mo-bo for other components, and 2) subject your CPU to much higher (and unregulated) voltages. You've added another heat generation source right there on the CPU, and power excursions are likely to take out your processor.

      • by imsabbel (611519)

        But higher voltage means less current, which helps.

        Plus if the voltage regulators are in the CPU package, they can use the MUCH better thermal solution provided for it.

      • by Burz (138833)

        If that's true then maybe Intel is making this move so they can sell more product: Power breakdowns to stand in as a replacement for technological obsolescence (which has been petering out in recent years).

        And before anyone calls me cynical, I know for a fact that Intel is concerned about keeping the replacement cycle going. They have stated it at times when investors were getting jittery, and they even had a TV ad in plain view that admitted they wanted to entice people who "thought" they were perfectly ha

      • You already have a separate, programmable regulator for Vcore (overclockers fiddle with it all the time) and in both cases if the regulator fails the CPU is toast so there's no advantage in keeping it outside. I'm not sure how they integrated the reactive components, but they're surely more reliable than current electrolytics, plus shorter paths mean less voltage drop meaning less stress.

      • Re: (Score:3, Interesting)

        by Anonymous Coward

        This means you are going to have to 1) have redundant regulation on the mo-bo for other components,

        Nope. Motherboards already had dedicated regulators just for the CPU.

        High-speed CPU core logic needs very low supply voltages, around 1.0V these days. Lower speed parts built in older processes need higher voltages -- 1.2V, 1.5V, 1.8V, or more. There's not much on the motherboard which even can share supplies with the CPU. Also, CPUs now dynamically vary their own core voltage (by sending commands to the regulator) in order to save power. That wouldn't work so well with other chips sharing the same reg

  • by Anonymous Coward

    That's fantastic. I love seeing efficiency, but I imagine that the screen would eat most of the battery life in consumer applications.

    • Re: (Score:2, Informative)

      by Anonymous Coward

      Depends on the screen you have, I would guess. https://www.google.com/search?q=laptop+screen+wattage&aq=f&oq=laptop+screen+wattage
      If you look at the first link there, you'll see that the LCD screen takes up on the order of 5W of power at full brightness. The same paper says that the power usage roughly doubles when you start blasting the CPU. If you use your laptop like I do (I'm in an engineering program at college), that's some nice savings there if they can trim the CPU usage.
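      The rough numbers above (a screen on the order of 5W at full brightness, total draw roughly doubling under CPU load) can be turned into a back-of-the-envelope runtime estimate. A minimal sketch with purely illustrative figures (battery capacity and idle draw are assumptions, not from the comment):

      ```python
      def runtime_hours(battery_wh: float, screen_w: float, rest_w: float) -> float:
          """Rough laptop runtime: battery capacity divided by total draw."""
          return battery_wh / (screen_w + rest_w)

      # Illustrative numbers only: 50 Wh battery, 5 W screen at full brightness,
      # 5 W for CPU + platform at idle, doubling to 10 W under heavy CPU load.
      print(runtime_hours(50, 5, 5))   # idle: 5.0 hours
      print(runtime_hours(50, 5, 10))  # loaded: ~3.3 hours
      ```

      Under these assumptions, trimming the loaded CPU draw back toward its idle level buys well over an hour of runtime, which is why CPU-side savings still matter even when the screen is the single biggest consumer.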

      • Re: (Score:3, Insightful)

        by niftymitch (1625721)

        Depends on the screen you have, I would guess. https://www.google.com/search?q=laptop+screen+wattage&aq=f&oq=laptop+screen+wattage [google.com] If you look at the first link there, you'll see that the LCD screen takes up on the order of 5W of power at full brightness. The same paper says that the power usage roughly doubles when you start blasting the CPU. If you use your laptop like I do (I'm in an engineering program at college), that's some nice savings there if they can trim the CPU usage.

        Yes, screen technology is important... Pixel Qi technology seems to be ignored and should not be, especially on laptops that mate well with a docking station for "work".
        A big quality display at the office is a good thing, especially one that has been rotated to be tall. The ability to have a very low power transmissive/reflective display while mobile and a serious display at a desk at work is underserved.

        Docking station tech is lame at best. First the battery charging logic is flawed. The charger sho

        • Docking station tech is lame at best. First the battery charging logic is flawed. The charger should disconnect from the battery once it is charged. It should test the battery once an hour thereafter and decide what to do. I cannot tell you how many batteries I have had die from long term over charging and lack of correct dynamics in use.

          Or simply not charge the battery. I think this is a software problem as opposed to hardware issue.

          A docking station should have cooling designed to keep the battery as well as the CPU/logic cool. Most obstruct air flow and do neither well.

          I think this was the purpose of Thunderbolt. You don't need a docking station anymore, just the charger and one cable for connections. As far as I know, Apple is the only one that fully embraces TB. Not surprisingly, I think this is because Apple doesn't have a docking station. Maybe it was for aesthetics that Apple never designed one. Other manufacturers are more hesitant to use TB as it means they can no

          • by the_B0fh (208483)

            You do realize Apple was one of first, if not the first with a docking station years and years ago, right?

              Not since Jobs came back have they had a docking station. Like I said, it was probably for aesthetics that they didn't make one.
        • Docking station tech is lame at best. First the battery charging logic
          is flawed. The charger should disconnect from the battery once it is charged.
          It should test the battery once an hour thereafter and decide what to do. I cannot
          tell you how many batteries I have had die from long term over charging and
          lack of correct dynamics in use.

          A docking station should have cooling designed to keep the battery as well
          as the CPU/logic cool. Most obstruct air flow and do neither well.

          This depends entirely on the laptop/battery. The last two Lenovos I've had both offered smart charging where the battery would optionally not begin charging until below X% and would stop when the battery signaled it was full. The charging threshold could either be directly specified by the user or determined by the laptop based on usage pattern.

          My previous machine I set to not recharge until below 85%. It was a power hog so the battery was pretty much a pack-along UPS. 15% represented a fairly small number

  • by hawguy (1600213)

    The biggest battery drain on my phone is always the display, followed by "Cell standby". How is a CPU and chipset able to promise a 50% increase in battery life when it's not even the biggest power user in the phone?

    • by msauve (701917) on Friday May 24, 2013 @03:47PM (#43816073)
      I'd be interested to know what phone you have, that uses an Intel Ivy Bridge server/desktop/laptop processor.
    • by EvilSS (557649)
      For starters we are talking about laptops, with x86 CPUs that are much more power hungry than the ARM based proc in your phone.
    • by Anonymous Coward

      Phone CPUs and laptop/desktop CPUs are in different leagues.

      It is no surprise the biggest draw is your screen in a phone. On a laptop the biggest ones are the CPU, then the video card/chipset, then the screen.

      You are comparing apples and oranges. Many phones are SoCs these days, or at best 2-3 chips. Laptops are not there yet. Your phone CPU measures its draw in milliwatts; the laptop/desktop crowd measure in watts.

      They had a very decent boost last year with ivy. I went from a sandy bridge laptop to an ivy and the

    • Without checking the source, I bet it is only the CPU/GPU power that is getting lower values. It is the old Intel story again. First it was the Atom CPU that was supposed to be super low power. However, they forgot to mention you needed a chipset along with it for the video, networking, and PCI that was not so power-savvy.

      Now the CPU/GPU is super power-savvy. But the wifi/display/battery/2g/3g/nfc/audio/cam/gps might still drain your battery in 3 seconds...

      • According to Wikipedia at least [wikipedia.org], the Haswell architecture will include a die-shrink in the PCH (Northbridge) chipset from 65nm to 32nm, so this issue is avoided I think.

        • by leuk_he (194174)

          No, sorry, you did not understand. The northbridge may be very savvy now. (compared to???) But besides the CPU with integrated northbridge you also need a lot of other supporting hardware.

          Notice in the pictures that it is targeted for tablet size, not phablets or phones. They need a lower kind of power usage, I suppose.

    • by jittles (1613415)

      The biggest battery drain on my phone is always the display, followed by "Cell standby". How is a CPU and chipset able to promise a 50% increase in battery life when it's not even the biggest power user in the phone?

      I would guess that you suffered a brief lapse in reading comprehension. My take on this is that the Haswell uses 50% less power for the same performance / capability as an Ivy Bridge. Whether or not that cuts battery consumption overall by 50%... well I highly doubt it.

  • by Anonymous Coward on Friday May 24, 2013 @03:48PM (#43816085)

    Is this seriously 50% increase in battery life? Or just 50% reduction in power usage by CPU? The article wasn't clear on this. I'm assuming the power usage thing.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Very likely, they're talking about the CPU using 50% less power. Intel doesn't make laptop batteries, and battery technology is on a plateau right now since we're hitting the very limits of chemistry in Li-Ion and Li-Poly batteries at the moment.

      • by Carewolf (581105)

        Very likely they are talking about the CPU using 33% less power, thus increasing how long it can run on a given amount of Wh by 50%.

      • Very likely, they're talking about the CPU using 50% less power.

        Very likely, they aren't, since they make specific claims about CPU power under different regimes and all of them are much more significant than that, and then go on to say that the CPUs will enable laptops using them to have 50% greater battery life.

    • > Is this seriously 50% increase in battery life? Or just 50% reduction in power usage by CPU?

      Assuming the CPU was the only element consuming power, a 50% reduction in power usage by the CPU would equate to a 100% increase in battery life. But, yes, what they are claiming is that the net effect of the various improvements is that it should enable a 50% increase in battery life, not that it will merely reduce power consumption on the CPU by the amount that would do that if the CPU was the only power draw.
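      The arithmetic in the comments above can be sketched directly: with fixed battery capacity, runtime scales as the inverse of total power draw, so a 33% power cut yields a 50% runtime increase while a 50% cut would double it. A minimal sketch (the reduction figures are the ones discussed above; the inverse relation is the only assumption):

      ```python
      def battery_life_multiplier(power_reduction: float) -> float:
          """Runtime multiplier when total power drops by `power_reduction`
          (a fraction), assuming fixed battery capacity: life ∝ 1/power."""
          return 1.0 / (1.0 - power_reduction)

      # A 33% reduction in total power gives ~1.5x the runtime (a 50% increase)...
      print(battery_life_multiplier(1/3))
      # ...while a 50% reduction would double it (a 100% increase).
      print(battery_life_multiplier(0.5))  # → 2.0
      ```

      This is why "50% longer battery life" and "50% less power" are not the same claim: the former only requires cutting total platform power by a third.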

    • How is Barry Life Formed? How Usage get Consemption?

      they need to do way instain comsumer> who kill thier barrys. becuse these barry cant frigth back?

      it was on the charger this mroing a user in ar who had kill their three divice.
      they are taking the three barry back to zero charge too lady to rest.
      My parry are with the tickle chrager who lost its powre ; i am truley sorry for voltage lots.

    • by wisty (1335733)

      IIRC, Intel did a lot of work on the whole system (including motherboards - I think they actually worked with other manufacturers too), not just the chip. Not all the savings are from the CPU.

  • FIVR in the mornin' FIVR in the evenin', FIVR all through the night!
    • by Kjella (173770)

      FIVR in the mornin' FIVR in the evenin', FIVR all through the night!

      Yeah, but it seems the biggest benefit they got is in sleep states, and I don't think sleeping in the morning, sleeping in the evening, sleeping all through the night is what the song is all about...

  • by Jeremy Erwin (2054) on Friday May 24, 2013 @04:18PM (#43816317) Journal

    Is this a laptop only chipset, or does intel have goodies for those who like to be chained to their desks?

    • Haswell is a laptop/desktop/server microarchitecture, but Intel doesn't care very much about the desktop anymore, so expect little press coverage of that angle.

      • Haswell is a laptop/desktop/server microarchitecture, but Intel doesn't care very much about the desktop anymore, so expect little press coverage of that angle.

        Yeah, it's not like most of the stories on this announcement have covered Intel's claim of tripling the integrated graphics performance on desktop systems (and doubling it on laptop systems.)

        Well, except that that is exactly the case.

  • by DarthVain (724186) on Friday May 24, 2013 @04:23PM (#43816369)

    Like most CPUs these days, they produce a lot of variants.

    For this article they are likely talking about the "U" variant with 15W TDP.
    http://en.wikipedia.org/wiki/Haswell_(microarchitecture)#Mobile_processors [wikipedia.org]

    You can't really compare that with the (or say in same breath) desktop "K" variant with 84W TDP (also has twice the cores and threads).
    http://en.wikipedia.org/wiki/Haswell_(microarchitecture)#Desktop_processors [wikipedia.org]

    I am pretty sure the benchmarks will be wildly different. Anyway the summary makes it sound like it is all one thing. I am sure it will be very good and all, but I know I won't be getting one of those power saving versions. POWER! (To quote Clarkson)

  • I wonder how the performance vs power consumption compares to the old Transmeta chips that started the trend.

  • Soon motherboards will be just wiring for the I/O and CPU

    • by mysidia (191772)

      Soon motherboards will be just wiring for the I/O and CPU

      And despite that, there is no price decrease to be seen in motherboards... if anything, they are getting more expensive, despite having less silicon and intelligence on them <G>

  • Is that marketing speak for "we were unable to increase the operating frequency"?

  • And yet... (Score:5, Informative)

    by RMingin (985478) on Friday May 24, 2013 @08:02PM (#43818047) Homepage

    Too bad CPU power consumption hasn't been the biggest consumer of watts in many years.

    Hint: the biggest amount of consumed current in most laptops is the glowing part you look at.

    • Then perhaps the next step is to build user interfaces that aren't based on scrolling or other smooth motion, so that something with a laptop form factor and an e-ink display becomes viable.
  • Now they can make the OS and application coding less efficient!
