Intel Unveils 10-Watt Haswell Chip

adeelarshad82 writes "At IDF, Intel announced the company's fourth-generation Core processor, code-named Haswell. The chip is based on the same 22nm process used in the current third-generation Core products. What makes this chip remarkably different from the third-generation chips is its ability to deliver twice the graphics performance at much lower power consumption, which Intel has achieved through a number of tactics." HotHardware has video of Haswell running a 3D benchmark.

  • by Anonymous Coward on Tuesday September 11, 2012 @07:48PM (#41307123)

    Intel's statement was that it could produce similar graphics results to Ivy Bridge at half the power consumption, OR around twice the performance at the same power consumption as Ivy Bridge's built-in chip.

    Which is still pretty good, all things considered.
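
    A quick sanity check with normalized, made-up numbers: both phrasings of the claim work out to the same factor-of-two gain in performance per watt (a minimal Python sketch, not Intel's figures):

        # Normalize Ivy Bridge's GPU to 1.0 perf at 1.0 power (hypothetical units).
        ivb_perf, ivb_power = 1.0, 1.0

        # Claim A: same performance at half the power.
        claim_a = (1.0 * ivb_perf) / (0.5 * ivb_power)
        # Claim B: twice the performance at the same power.
        claim_b = (2.0 * ivb_perf) / (1.0 * ivb_power)

        print(claim_a, claim_b)  # both print 2.0 -> a 2x perf/watt improvement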

  • by sexconker ( 1179573 ) on Tuesday September 11, 2012 @07:59PM (#41307207)

    Pushed by Intel. AMD is following... still.

    The GPU parts in AMD's "APUs" are miles beyond Intel's HD Graphics.

  • by Anonymous Coward on Tuesday September 11, 2012 @08:20PM (#41307367)

    The integrated graphics are still crap.

    The thermal overhead added to the CPU die limits the amount of computational power they can realistically add. Not to mention that on enthusiast systems it consumes thermal headroom that could be better spent on CPU cycles. (Supposedly we'll see some tightly integrated CPU+GPU systems with shared memory space, registers, and whatnot. But we're far away from that, as it presents a large departure from traditional PC architecture, let alone the x86 arch. AMD is way ahead on this path, though, and it may pay off for them in the future.)

    Above aside, the real elephant in the room comes down to memory speed. GPUs need memory bandwidth. Lots of it. GPU speed scales with memory bandwidth to the point that it's pretty much the most significant metric that separates price tiers in the traditional GPU card market. GPUs are supposed to have fast, high-clocked, tightly integrated memory subsystems using exotic high-speed memory types explicitly designed to be directly coupled to GPU chips, for their exclusive use (GDDR3, GDDR5, etc.).

    These CPU-GPUs have to make do with the plain old low-bandwidth, narrow-bus main memory in your system. And they have to share that bandwidth with the rest of the system. GPUs are so memory-speed sensitive that you can see drastic differences in performance on the AMD chips simply by getting faster main memory modules. Overclocking your memory yields even more improvement. But that's the thing: all of these solutions are budget oriented, so they'll be saddled with slow and cheap memory to begin with.

    You know the Xbox? It's got shared memory. How do they get fast performance? ALL of the system's main memory hangs off the GPU. It's ALL GDDR3. They can do this sort of unified memory arch because it's a special, custom-designed system.

    Until the memory bandwidth issue is solved, integrated GPUs will continue to be crap.
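
    To put rough numbers on the bandwidth gap described above, here's a small Python sketch using peak theoretical figures for typical 2012-era parts (the specific bus widths and clocks are illustrative assumptions, not benchmarks):

        def bandwidth_gb_s(bus_bits, transfers_mt_s):
            """Peak theoretical bandwidth: bus width (bits) x transfer rate (MT/s)."""
            return bus_bits / 8 * transfers_mt_s * 1e6 / 1e9

        # Integrated GPU: shares dual-channel DDR3-1600 (2 x 64-bit) with the CPU.
        ddr3  = bandwidth_gb_s(bus_bits=128, transfers_mt_s=1600)  # ~25.6 GB/s, shared
        # Mid-range discrete card: 128-bit GDDR5 at 5 GT/s, exclusive to the GPU.
        gddr5 = bandwidth_gb_s(bus_bits=128, transfers_mt_s=5000)  # ~80.0 GB/s, dedicated
        # Xbox 360-style unified memory: 128-bit GDDR3 at 1400 MT/s, hung off the GPU.
        gddr3 = bandwidth_gb_s(bus_bits=128, transfers_mt_s=1400)  # ~22.4 GB/s, unified

        print(f"DDR3 (shared):     {ddr3:5.1f} GB/s")
        print(f"GDDR5 (dedicated): {gddr5:5.1f} GB/s")
        print(f"GDDR3 (unified):   {gddr3:5.1f} GB/s")

    Even before sharing with the CPU, the integrated part starts with roughly a third of the discrete card's peak bandwidth, which is the gap the comment is pointing at.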

  • Re:Compared to ARM (Score:5, Informative)

    by Wallslide ( 544078 ) on Tuesday September 11, 2012 @10:03PM (#41307975)
    According to anandtech.com, the '20x lower power' statistic refers only to the chip's idle power state, not to power draw under any sort of processing load.
  • by default luser ( 529332 ) on Tuesday September 11, 2012 @10:24PM (#41308129) Journal

    "Intel's top Atom chips have a 10W TDP. Of course the chipset/RAM also play a large factor, but still -- this is an amazingly frugal CPU."

    You're thinking of the wrong Atom CPU there. You want to compare Intel's lowest-power Core architecture to...their lowest-power Atom.

    Intel has placed an Atom Z2460 on a smartphone platform, complete with 1.6 GHz core speed and sub-1W typical power consumption [anandtech.com], and they've done it on just the old 32nm process. The 10W parts you're thinking of are for desktops.

    These 10W Haswell chips will also be the pick of the litter, but the power consumption will be nowhere near that of Atom (and neither will the price... expect to pay upwards of $300 for these exotic cores). The Lava Xolo X900 costs only $400 on the street [anandtech.com], so you can imagine Intel is charging around $25 for their chipset.

  • by bemymonkey ( 1244086 ) on Wednesday September 12, 2012 @02:36AM (#41309661)

    "The integrated graphics are still crap."

    Depends what for, really... Office, web and HD video? Nope, they're pretty good at that - so good, in fact, that I don't buy machines with dedicated graphics cards unless I'm planning on playing games or running applications that specifically require a fast GPU.

    Even the HD3000 or HD4000 (Sandy Bridge and Ivy Bridge, respectively) graphics included with the last and current generations of Intel Core iX CPUs are overkill for most people. Even the GMA 4500MHD (from the Core 2 Duo era) had perfect support for 1080p acceleration and ran Windows 7 at full tilt with all the bells and whistles, if you wanted those. What more do you want from integrated graphics?

    The fact that I can even play Starcraft II on low at 1080p on a Core i3 with the integrated HD3000 at acceptable framerates is just icing on the cake...

    Oh and have I mentioned the sub-5W total system power consumption on a 15.6" laptop with a standard voltage CPU? THAT is what integrated graphics are for. If you're looking to do gaming or CAD or use the GPU for computationally intensive tasks, you're not in the target audience...

  • by Anonymous Coward on Wednesday September 12, 2012 @05:50AM (#41310425)

    The GMA500 was for embedded devices anyway, and not a real Intel chipset. Intel knows of the problem and is actively working on replacing those PowerVR chips with their own. ARM chips have the same or even worse problems than the GMA500: you don't have working drivers for those either, maybe some for Android, but not for Xorg.

  • by Solandri ( 704621 ) on Wednesday September 12, 2012 @07:47AM (#41310921)
    My 2-year old laptop has an nVidia GT 330M [notebookcheck.net]. At the time it was a mid-range dedicated mobile 3D video card.

    Ivy Bridge's HD4000 comes very close to matching its performance [notebookcheck.net] while burning a helluva lot less power. So the delta between mid-grade dedicated video and integrated video performance is down to a little over 2 years now. Intel claims Haswell's 3D video is twice as fast as the HD4000. If true, that would put it near the performance of the GT 640M and lower the delta to a bit over 1 year (a toy version of this calculation follows after this comment).

    This is all the more impressive if you remember that integrated video is hobbled by having to mooch off of system memory. If there were some way to give the HD4000 dedicated VRAM, then you'd have a fairer apples-to-apples comparison of just how good the chipset's engineering and design are compared to the dedicated offerings of nVidia and AMD.

    I used to be a hardcore gamer in my youth, but life and work have caught up and I only game casually now. If Haswell pans out, its integrated 3D should be plenty enough for my needs. It may be "crap" to the hardcore gamer, but they haven't figured out yet that in the grand scheme of things, being able to play video games with all the graphics on max is a pretty low priority.
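
    The parent's scaling argument as a toy calculation, with normalized, made-up scores: if Haswell's IGP really doubles the HD 4000, which already sits near a 2010 mid-range GT 330M, it lands in 2012 mid-range GT 640M territory (all values are illustrative assumptions, not benchmark results):

        # Hypothetical normalized 3D scores -- illustrative only.
        scores = {
            "GT 330M (2010 mid-range)": 1.0,   # baseline
            "HD 4000 (2012 IGP)":       0.95,  # "comes very close" to the GT 330M
            "GT 640M (2012 mid-range)": 1.9,   # assumed ~2x the GT 330M
        }

        haswell_igp = 2 * scores["HD 4000 (2012 IGP)"]  # Intel's claimed doubling
        print(f"Projected Haswell IGP: {haswell_igp:.2f} "
              f"vs. GT 640M: {scores['GT 640M (2012 mid-range)']:.2f}")

    On those assumptions the projected IGP (1.90) matches the 2012 mid-range discrete part, which is how the delta shrinks from a little over 2 years to a bit over 1.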
  • by Kjella ( 173770 ) on Wednesday September 12, 2012 @11:14AM (#41312739) Homepage

    "The GMA500 disaster showed how much Intel cares for end users after selling them the hardware."

    GMA500 = rebranded PowerVR SGX 535. The graphics Intel develops itself aren't aimed at serious gamers, but they've improved by leaps and bounds over the last couple of years. You're of course free to be unhappy about Poulsbo, and with good reason, but most people with a recent Intel IGP are very happy, and sales of discrete cards only go one way: down.
