Intel Unveils 10-Watt Haswell Chip

adeelarshad82 writes "At IDF, Intel announced the company's fourth-generation Core processor, code-named Haswell. The chip is based on the same 22nm process used in the current third-generation Core products. What makes this chip remarkably different from the third-generation parts is its ability to deliver twice the graphics performance at much lower power consumption, which Intel has achieved by making use of a number of tactics." HotHardware has video of Haswell running a 3D benchmark.
  • Closing in on Atom (Score:5, Interesting)

    by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Tuesday September 11, 2012 @06:35PM (#41307005) Homepage
    Intel's top Atom chips have a 10W TDP. Of course the chipset/RAM also play a large factor, but still -- this is an amazingly frugal CPU.
    • Intel's top Atom chips have a 10W TDP. Of course the chipset/RAM also play a large factor, but still -- this is an amazingly frugal CPU.

      At IDF, Intel also talked about upcoming 5W Atom chips that will be out at the same time as Haswell.

    • by Anonymous Coward on Tuesday September 11, 2012 @06:46PM (#41307107)

      The part I was impressed with was how they did it...
      "[...] which Intel has acheived by making use of a number of tactics."
      +5 Informative!

      • by LordKronos ( 470910 ) on Tuesday September 11, 2012 @08:31PM (#41307825)

        The part I was impressed with was how they did it...
        "[...] which Intel has acheived by making use of a number of tactics."
        +5 Informative!

        Welcome to the internet. We have these things called hyperlinks. Anytime you see underlined text in a different color, you should consider clicking on it. If you had clicked on the phrase "number of tactics", you would have been taken to another article explaining many of these tactics.

    • by Unknown Lamer ( 78415 ) Works for Slashdot <clintonNO@SPAMunknownlamer.org> on Tuesday September 11, 2012 @06:52PM (#41307147) Homepage Journal

      The best part is that, unlike Atom, these things are usably fast. I have a 2x1.3GHz Core 2 process shrink or something with a TDP of 12W (total system is about 26W ... under full load). I mostly live in Emacs but I like a compositing window manager (wobbly windows are fun, alright) and GL screen hacks... the thing does great and can handle my regular abuse of PostgreSQL/SBCL/mlton/... all while getting about 8-9 hours of realistic use (ok, closer to 7 now that the battery is down to 72Wh from its 84Wh theoretical max when it was new) and all under 10W generally. Sign me up for something that uses about the same power and is just a bit slower than the current Ivy Bridge processors... (come on laptop, don't die until next summer).

      And it all Just Works (tm) with Debian testing (it even worked with Squeeze, but GL was a bit slow since it predated the existence of the graphics hardware and all). Now, if only someone would make one of these low voltage things with a danged Pixel Qi display or whatever Qualcomm has so that I can use it outside... not having to worry about finding power every 2 to 3 hours is great, but if you're still stuck indoors it's not half as great as it could be.
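      Those battery numbers are easy to sanity-check. A minimal sketch in Python (assuming a constant average draw, which real workloads won't give you):

          # Rough runtime estimate: capacity in watt-hours / average draw in watts.
          def runtime_hours(battery_wh, avg_draw_w):
              return battery_wh / avg_draw_w

          print(runtime_hours(84, 10))  # new 84Wh battery at ~10W -> 8.4 h
          print(runtime_hours(72, 10))  # worn 72Wh battery -> 7.2 h ("closer to 7")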

    • by default luser ( 529332 ) on Tuesday September 11, 2012 @09:24PM (#41308129) Journal

      Intel's top Atom chips have a 10W TDP. Of course the chipset/RAM also play a large factor, but still -- this is an amazingly frugal CPU.

      You're thinking of the wrong Atom CPU there. You want to compare Intel's lowest-power Core architecture to...their lowest-power Atom.

      Intel has placed an Atom Z2460 on a smartphone platform, complete with 1.6 GHz core speed and sub-1W typical power consumption [anandtech.com], and they've done it on just the old 32nm process. The 10W parts you're thinking of are for desktops.

      These 10W Haswell chips will also be the pick of the litter, but the power consumption will be nowhere near that of Atom (and neither will the price...expect to pay upwards of $300 for these exotic cores). The Lava Xolo X900 costs only $400 on the street [anandtech.com], so you can imagine Intel's charging around $25 for their chipset.

  • by Lord Lode ( 1290856 ) on Tuesday September 11, 2012 @06:41PM (#41307067)

    So wait, is this only about the graphics part inside the CPU or what?

    Who cares about that graphics part inside the CPU. Useful for a laptop maybe, but for the real stuff you need an actual graphics card.

    • by silas_moeckel ( 234313 ) <silas AT dsminc-corp DOT com> on Tuesday September 11, 2012 @06:44PM (#41307089) Homepage

      Because most PCs sold use integrated graphics, which have traditionally been abysmal. In the last few years, seemingly pushed by AMD, they have been looking to correct that.

      • Re: (Score:3, Informative)

        by Anonymous Coward

        The integrated graphics are still crap.

        The thermal overhead added to the CPU die limits the amount of computational power they can realistically add. Not to mention that on enthusiast systems it creates needless waste heat that could be better spent on CPU cycles. (Supposedly we'll see some tightly integrated CPU+GPU systems with shared memory space and registers and whatnot... but we're far away from that, as it presents a large departure from traditional PC architecture, let alone x86 arch. AMD is way ahead...

        • by Nimey ( 114278 )

          You could use integrated GPUs for vector/matrix math, something they're a lot better at than the x86 core, thus greatly increasing the efficiency of certain workloads.

          You don't have to use them only for making pretty pictures.
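          For instance, a minimal sketch using pyopencl (an assumed toolchain, purely illustrative, not tied to any particular Intel part) that pushes an element-wise multiply-add onto whatever GPU the OpenCL runtime exposes:

              import numpy as np
              import pyopencl as cl

              # Grab the first GPU OpenCL can see (on many laptops, the integrated one).
              devices = [d for p in cl.get_platforms() for d in p.get_devices()]
              gpu = next(d for d in devices if d.type & cl.device_type.GPU)
              ctx = cl.Context([gpu])
              queue = cl.CommandQueue(ctx)

              # y = a*x + y: the kind of wide vector math a GPU chews through easily.
              prog = cl.Program(ctx, """
                  __kernel void axpy(__global float *y, __global const float *x, float a) {
                      int i = get_global_id(0);
                      y[i] = a * x[i] + y[i];
                  }
              """).build()

              x = np.random.rand(1 << 20).astype(np.float32)
              y = np.zeros_like(x)
              mf = cl.mem_flags
              xb = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=x)
              yb = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=y)
              prog.axpy(queue, x.shape, None, yb, xb, np.float32(2.0))
              cl.enqueue_copy(queue, y, yb)  # y now holds 2*x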

          • Correct, and GPU/CPU combinations also gain the next-fastest memory tier below "the register", instead of going off-die like even discrete GPUs have to for some workloads.

        • by bemymonkey ( 1244086 ) on Wednesday September 12, 2012 @01:36AM (#41309661)

          "The integrated graphics are still crap."

          Depends what for, really... Office, web and HD video? Nope, they're pretty good at that - so good, in fact, that I don't buy machines with dedicated graphics cards unless I'm planning on playing games or running applications that specifically require a fast GPU.

          Even the HD3000 or HD4000 (Sandy and Ivy Bridge, respectively) graphics included with the last and current generations of Intel Core iX CPUs are overkill for most people - even a 4500MHD (Core 2 Duo 2nd generation) had perfect support for 1080p acceleration and ran Windows 7 at full tilt with all the bells and whistles, if you wanted those. What more do you want from integrated graphics?

          The fact that I can even play Starcraft II on low at 1080p on a Core i3 with the integrated HD3000 at acceptable framerates is just icing on the cake...

          Oh and have I mentioned the sub-5W total system power consumption on a 15.6" laptop with a standard voltage CPU? THAT is what integrated graphics are for. If you're looking to do gaming or CAD or use the GPU for computationally intensive tasks, you're not in the target audience...

          • by Solandri ( 704621 ) on Wednesday September 12, 2012 @06:47AM (#41310921)
            My 2-year old laptop has an nVidia GT 330M [notebookcheck.net]. At the time it was a mid-range dedicated mobile 3D video card.

            Ivy Bridge's HD4000 comes very close to matching its performance [notebookcheck.net] while burning a helluva lot less power. So the delta between mid-grade dedicated video and integrated video performance is down to a little over 2 years now. Intel claims Haswell's 3D video is twice as fast as HD4000. If true, that would put it near the performance of the GT 640M, and lower the delta to a bit over 1 year.

            This is all the more impressive if you remember that integrated video is hobbled by having to mooch off of system memory. If there were some way to give the HD4000 dedicated VRAM, then you'd have a fairer apples to apples comparison of just how good the chipset's engineering and design are compared to the dedicated offerings of nVidia and AMD.

            I used to be a hardcore gamer in my youth, but life and work have caught up and I only game casually now. If Haswell pans out, its integrated 3D should be plenty enough for my needs. It may be "crap" to the hardcore gamer, but they haven't figured out yet that in the grand scheme of things, being able to play video games with all the graphics on max is a pretty low priority.
          • have I mentioned the sub-5W total system power consumption on a 15.6" laptop with a standard voltage CPU?

            Obviously at least an LED backlight, and probably turned way down. Or what, is it OLED?

            • Laptop displays have been LED backlit for years now - you can't buy a CCFL backlit display except maybe as a standalone monitor in the clearance aisle of your local big box electronics store...

              As for AMOLED... that's useless as a laptop display, because it uses 2-5x as much power as a decently efficient LED-backlit display when displaying mainly-white content (such as Slashdot or other websites) - not to mention the fact that AMOLED displays at this size (15.6" diagonal in this case, but consider this sentence...

          • Depends what for, really... Office, web and HD video? Nope, they're pretty good at that - so good, in fact, that I don't buy machines with dedicated graphics cards unless I'm planning on playing games

            So if someone buys a laptop for "Office, web and HD video" and later decides to try games, what should he do? Buy another computer? Whatever happened to buying something that will grow with your requirements?

            • The problem is that buying a laptop (or even desktop - although the problems are usually more pronounced on laptops) with a high-powered graphics card has very negative side-effects:

              1. More heat - fan noise, uncomfortable heat during use, significant reduction in longevity (ever seen a non-plastic-POS Intel-based laptop without dedicated graphics overheat? I haven't...)
              2. Higher power consumption - the most efficient laptops with dedicated non-switchable graphics draw upwards of 10W idle... many draw 15 or 20...

        • GDDR3 is an optimized type of DDR2 memory. The DDR3-800 used by Atoms for the last couple of years is less than half the speed of the 700MHz GDDR3 in the Xbox 360, so even if you fit them with 4 times the memory they can't get close to the 360's graphics performance.
          In the context of Haswell you are talking about entry-level dual-channel DDR3-1600, or around 25.6 GB/s - beating the GDDR5 in bleeding-edge top-of-the-line discrete cards from just 4 years ago.
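          Back-of-the-envelope, in Python (peak theoretical numbers, assuming standard bus widths; real-world figures land lower):

              # Peak bandwidth = transfers per second * bus width in bytes * channels.
              def gb_per_s(mega_transfers, bus_bits, channels=1):
                  return mega_transfers * 1e6 * (bus_bits // 8) * channels / 1e9

              print(gb_per_s(1600, 64, channels=2))  # dual-channel DDR3-1600 -> 25.6 GB/s
              print(gb_per_s(1400, 128))             # Xbox 360 GDDR3 (700MHz DDR) -> 22.4 GB/s
              print(gb_per_s(800, 64))               # single-channel DDR3-800 Atom -> 6.4 GB/s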

        • Not true. Firstly, the memory channel to the iGPU is somewhat more sophisticated than just tacking onto the main memory bus. Secondly, the iGPU has much less processing power than a top-end dGPU, and therefore needs much less memory bandwidth; increasing bandwidth would be of no value.
          iGPUs are mainly for budget laptops, where there is going to be no dGPU installed. Ivy Bridge and Trinity iGPUs are powerful enough to run Crysis (on low settings). Something not to be scoffed at, considering low-budget buyers never...

      • Comment removed (Score:5, Insightful)

        by account_deleted ( 4530225 ) on Tuesday September 11, 2012 @09:18PM (#41308079)
        Comment removed based on user account deletion
        • nVidia knows this too. As you can see, they've been focusing on advanced 3D gaming and supercomputing.

          And mobile stuff (Tegra) where they have their own (licensed) CPU and GPU.

        • by fa2k ( 881632 )

          It's official. Intel on-board video is all you'll ever need for home and general office use.

          Agreed, but don't confuse this with "you should recommend integrated graphics to home users", though. Your examples are perfectly tuned for Intel graphics because many people have them. For a business that's fine - you only need a fixed set of applications. For a home user, it's likely to be some Flash game, Google Earth or some software that works much better with a dedicated graphics card. Good ones are quite cheap now, and if you're looking at an i5, spending some extra on a GPU gives more bang for the buck...

        • by Lennie ( 16154 )

          So what Intel has is "good enough" for 99.9% of the users, but AMD delivers the same thing but for less money ?

    • Useful for a laptop maybe

      Hmm, I wonder where these ultra low-power chips are intended to go...

    • by Hadlock ( 143607 )

      Laptops make up something like 50% of the consumer market. Integrated graphics are what go in most Dells for corporate users. An HD4000 has no problem pushing a dual or triple screen setup. The triple-head displays at my work choke on anything more than word processing: dragging a youtube video across all three makes things very choppy. Also, the HD4000 is an actually usable chipset, nothing like the integrated graphics of old - the GMA950 couldn't even load TF2. HD4000 will do TF2 at 250-

    • Actually, I find the integrated GPU interesting - not for graphics, but for additional GPGPU power. Those things are fully OpenCL/DX11.1 compliant, so you can probably run some fluid simulation or n-body on them while at the same time doing some different crunching on the CPU, all being rendered extra pretty by a powerful discrete GPU.
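      A quick sketch of how that split might look with pyopencl (an assumed tool here; the vendor-string match is just a heuristic): enumerate the devices, then hand compute work to the Intel iGPU while the discrete card keeps rendering:

          import pyopencl as cl

          # List every OpenCL device the installed drivers expose.
          for p in cl.get_platforms():
              for d in p.get_devices():
                  print(p.name, "|", d.name, "|", cl.device_type.to_string(d.type))

          # Pick the integrated Intel GPU for crunching; rendering stays on the dGPU.
          devices = [d for p in cl.get_platforms() for d in p.get_devices()]
          igpu = next(d for d in devices
                      if d.type & cl.device_type.GPU and "Intel" in d.vendor)
          ctx = cl.Context([igpu])  # queue compute kernels against this context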

    • by LWATCDR ( 28044 )

      You do know that laptops outsell desktops. As for real stuff, if you mean work, then boy are you wrong. For anything outside of the sciences, CAD, CAM, video, and audio production, these will work very well. For all the home users who run Quicken and go on the web to use Facebook and such, this will do very well.
      If these chips can get good enough performance on a 1080p monitor then they will be a giant boon for gaming. Most people use a single 1920x1080 monitor; if this allows for a lot of games t...

    • Why do you need an actual graphics card to provide hardware acceleration for your OS windows and web browsing?
      Why can't you have the integrated graphics render most things, and have your games/CAD software use a discrete card when they need it?
    • Who cares about that graphics part inside the CPU. Useful for a laptop maybe, but for the real stuff you need an actual graphics card.

      I have seen it asserted a couple of times now that the current Intel integrated graphics are acceptable for light gaming. If the new stuff is twice as powerful (I'm confused by the summary but don't care enough to RTFA, as I have no plans for new machines in that class any time soon) then it will be entirely useful for everyone but gamers who must play the latest and greatest at high resolution.

  • by Anonymous Coward on Tuesday September 11, 2012 @06:48PM (#41307123)

    Intel's statement was that it could produce similar results to Ivy Bridge at half the power consumption, OR around twice the performance at the same power consumption as Ivy Bridge's built-in chip.

    Which is still pretty good, all things considered.

    • Sounds awesome to me... I'll take half the power consumption, thanks. I wonder if that goes for total idle power consumption... I'm already seeing less than 5W idle (depending on which LCD panel is installed - add a watt for the enormously inefficient AUOptronics 1080p panel Lenovo uses) on my Sandy Bridge laptop (and that power consumption includes the SSD), so Haswell should hopefully be able to drop that to 3-4W... hopefully that'll also average out to ~2W less in actual use - meaning a 94Wh battery would...

      • Haha, what do you have, a T520? I was excited about the T530 until I saw the keyboard. It's not even the chiclet thing, it's the missing row and consequently fucked up layout!

        Aaanyway, I've been getting extremely impressive battery life out of the Sandy based laptops, so the future looks bright :)

  • Welcome to the world of the supersmall. As real as software, and just as hard to impress when going, "see this".

  • by Anonymous Coward

    Intel has laid its share of rotten eggs, but for the past few years they seem to "get it" relative to the technology market. Consumers want lower power consumption, small form factor, and hardware acceleration for mobile access to Internet services. Companies want higher core density per physical chip, lower power consumption, and virtualization to better deliver services to the Internet. If Intel delivers the best products for each segment of that ecosystem, they have a bright future ahead of them.

  • by Anonymous Coward

    I accidentally went into the article and near the bottom they mention an i7 powered coke machine. Now that's bloat.

    • That's just the code name for the Charlie Sheen bot they've got in skunkworks.

    • A lot of such devices are built around PCs (some use special embedded form-factor PCs, others just have a normal PC tower sitting inside them) despite them being overkill and not particularly reliable. I guess it's because Windows devs are easier to find than devs who can handle an ARM Linux board.

      I do wonder why an i7, though; a Celeron would be more than sufficient.

  • It looks like, based on what we're seeing from Intel's plans for Haswell, the upgrade path for those on Sandy Bridge-E is going to be Xeon going forward.
  • What I want for my ultimate mobile computing device:

    1. Small, lightweight, with a physical keyboard
    I walk a lot, so I want a small device that fits comfortably in my backpack (so that's below 7'') and weighs less than 1.5 (preferably 1) pounds. I'm not an all-day mobile warrior, so I can live with a cramped keyboard, but after testing my wife's Galaxy S2 touch keyboard I decided I DO NEED a physical keyboard for typing documents and playing games (like nethack and old dosbox-compatible games).

    2. MS application/IE compatibility...

    • Happy to know that the Bay Trail platform finally drops the PVR graphics core. I hope some manufacturer produces the small-form-factor platform I want in 2013.

    • MS IE only internet banking

      Other banks exist.

      I need a graphics core that supports Linux well and plays Angry Birds. The PVR core in Atom doesn't support either.

      Are PC makers really still using the GMA 500 (the PowerVR core) in new Atom netbooks? I thought they had all switched to four-digit GMAs, which have working drivers in Ubuntu.

  • by jbernardo ( 1014507 ) on Wednesday September 12, 2012 @03:43AM (#41310173)
    Like any other owner of that orphaned Intel chipset, I'll never buy another Intel integrated video solution. Even if they manage to get power consumption below competitive ARM SoCs, I will still not get that crap. The GMA500 disaster showed how much Intel cares for end users after selling them the hardware. So it is interesting that they managed to reduce power consumption so much, but my netbooks are still going to be AMD, and my tablets and phones ARM, possibly with NVidia's Tegra chipset. Intel will have to do a lot more to convince me to try their solutions again.
    • Re: (Score:2, Informative)

      by Anonymous Coward

      The GMA500 was for embedded devices anyway, and not a real Intel chipset. Intel knows of the problem and is actively working on replacing those PowerVR chips with their own. ARM chips have the same or even worse problems than the GMA500: you don't have working drivers for those either - maybe some for Android, but not for Xorg.

    • by Kjella ( 173770 ) on Wednesday September 12, 2012 @10:14AM (#41312739) Homepage

      The GMA500 disaster showed how much Intel cares for end users after selling them the hardware.

      GMA500 = rebranded PowerVR SGX 535. The graphics Intel develops itself isn't for serious gamers, but it has improved leaps and bounds over the last couple of years. You're of course free to be unhappy about the Poulsbo, and with good reason, but most people with a recent Intel IGP are very happy, and sales of discrete cards only go one way: down.

      • I know the Poulsbo chipset is a re-branded PowerVR. But that isn't the main problem here; I don't care if it was re-branded or developed in house. The problem is that Intel released that crap, then abandoned it. They had half-decent psb drivers, which were great for watching films without stressing the underpowered Atom CPU, but they just dropped any development (or even basic maintenance) for them. Then, after a huge outcry, they promised gallium drivers for it, had them almost finished, and never released the...

    • The GMA500 is not an Intel chipset. It's a rebranded PowerVR SGX something or other.
