
Haswell Integrated Graphics Promise 2-3X Performance Boost

crookedvulture writes "Intel has revealed fresh details about the integrated graphics in upcoming Haswell processors. The fastest variants of the built-in GPU will be known as Iris and Iris Pro graphics, with the latter boasting embedded DRAM. Unlike Ivy Bridge, which reserves its fastest GPU implementations for mobile parts, the Haswell family will include R-series desktop chips with the full-fat GPU. These processors are likely bound for all-in-one systems, and they'll purportedly offer close to three times the graphics performance of their predecessors. Intel says notebook users can look forward to a smaller 2X boost, while 15-17W ultrabook CPUs benefit from an increase closer to 1.5X. Haswell's integrated graphics have other perks besides better performance, including faster Quick Sync video transcoding, MJPEG acceleration, and support for 4K resolutions. The new IGP will also support DirectX 11.1, OpenGL 4.0, and OpenCL 1.2." Note: Same story, different words, at ExtremeTech and Hot Hardware.
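For anyone wondering how those API versions surface to software, here is a minimal sketch of querying what a driver advertises, assuming the third-party pyopencl package and an installed OpenCL runtime (the exact strings reported are driver-dependent):

    import pyopencl as cl  # third-party binding: pip install pyopencl

    # List every OpenCL platform/device pair and the version string it
    # reports, e.g. "OpenCL 1.2 ..." on a Haswell-class IGP with current drivers.
    for platform in cl.get_platforms():
        for device in platform.get_devices():
            print(platform.name, "/", device.name, "->", device.version)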
This discussion has been archived. No new comments can be posted.
  • by Anonymous Coward

    http://en.wikipedia.org/wiki/SGI_IRIS

  • by earlzdotnet ( 2788729 ) on Thursday May 02, 2013 @09:16AM (#43609515)
    One nice thing is that Intel provides very nice open source drivers for their integrated GPUs.
    • Tell that to Poulsbo chipset buyers...

      The promised Gallium3D driver [phoronix.com] was never delivered, even though it was apparently nearly ready for release, and Intel just kicked users around between their desktop and automotive teams, releasing some crappy binary drivers to keep them quiet.

      • by jonwil ( 467024 )

        I suspect a lot of the problem there has to do with Imagination Technologies (creator of the PowerVR GPU core in the Poulsbo parts) and how much Imagination Technologies were willing to let Intel release (either as binary drivers or as source code)

        • I suspect a lot of the problem there has to do with Imagination Technologies (creator of the PowerVR GPU core in the Poulsbo parts) and how much Imagination Technologies were willing to let Intel release (either as binary drivers or as source code)

          And you're probably right with respect to not releasing an open source driver for the Poulsbo chipset. But Intel treated their customers with the utmost disrespect, only pretending to be working on a usable driver until the product was discontinued and abandoned. No decent closed source driver was released after the first iteration, before the promise of the Gallium 3D driver. Users were lied to, and pushed from one team to the other looking for drivers. That wasn't Imagination Tech - that was Intel stalling a

      • by h4rr4r ( 612664 ) on Thursday May 02, 2013 @09:55AM (#43609999)

        Not an intel chip, it came from PowerVR.

        Anyone who bought one of those was simply a fool. Everyone gets to be a fool now and then, so don't feel too bad if it bit you.

        • Anyone who bought one of those was simply a fool. Everyone gets to be a fool now and then, so don't feel too bad if it bit you.

          Intel fooled me once. They won't fool me twice, as I won't buy their chipsets again. My netbooks are AMD APUs, my tablets and smartphones are ARM. Good riddance!

          • by h4rr4r ( 612664 )

            When AMD makes a decent open source driver I will use their graphics cards. So far I use Intel GPUs and Nvidia ones. The latter will be gone from my home when Intel gets good enough. At that point I will likely stop buying AMD CPUs.

            • When AMD makes a decent open source driver I will use their graphics cards. So far I use Intel GPUs and Nvidia ones.

              You're in luck. AMD's open source driver trounces NVIDIA's [slashdot.org].

            • by higuita ( 129722 )

              So you are using open Intel drivers, which are more or less on par with the AMD ones (the AMD drivers usually lag Intel's by a few months; because they share most of the Mesa code, it's easier to catch up than to build from the ground up), with AMD cards being more powerful and therefore faster.

              I play Linux games (via Humble Bundle, Steam, and Desura) with the AMD open drivers, so I can tell you that they work. If you check Phoronix benchmarks, the AMD closed and open drivers have an average performance difference of 20%, which isn't perfect, but not

    • Bullshit if it's the PowerVR crap - no open source drivers at all. Check it out before you make a blanket statement. But I will agree that for their HD series (of which Iris is simply an improvement) they do offer a compelling reason to consider them if going the open source route.

    • How long until we finally have Intel and other CPU vendors create a unified memory model now that we have a GPU on die? I mean if anything I'd think the point would be to have your on-die GPU integrate with a discrete card so that both low and high-end setups gain something from this. PS4 will have a unified memory model; how long until the rest of us do on our desktops?
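      As an aside on the unified-memory question above, OpenCL already exposes a related (much weaker) property: integrated GPUs that share physical RAM with the CPU report CL_DEVICE_HOST_UNIFIED_MEMORY as true. A minimal check, again assuming the third-party pyopencl package; note the flag only indicates shared physical memory, not the single coherent address space the PS4 comparison implies:

          import pyopencl as cl

          for platform in cl.get_platforms():
              for device in platform.get_devices():
                  # True on typical integrated GPUs (Intel HD/Iris, AMD APUs),
                  # False on discrete cards with their own VRAM.
                  unified = device.get_info(cl.device_info.HOST_UNIFIED_MEMORY)
                  print(device.name, "host-unified memory:", bool(unified))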

  • by Anonymous Coward

    the Hot Hardware link confirms DisplayPort 1.2, which is the only thing I /really/ care about. The others are nice, but 4K out of the laptop means my next mid-range laptop can be my primary desk machine as well. This should push along the display manufacturers after their decade of stalling (perhaps soon we'll see screens in the 20-24" range with more resolution than our 5" displays have).

    • Making a 4k display isn't as simple as manufacturers just wanting it bad enough. I know people like to look at little phone screens and say "If these can be high rez, why can't big displays be higher rez!" but all that shows is a lack of understanding of the situation.

      Transistors cost money and pixels require them. How many pixels a display has is not a small part of its cost. So you can't just say "Let's have 4k or 8k desktop displays, shouldn't be hard!" because it in fact is hard.

      That isn't to say we won't, it i

      • The technology is out there; witness the few dirt-cheap dumb Korean high-res screens.
        They only cost a little more than similar lower-res, HD screens from the same featureless no-name brands.

        What is lacking is a huge market, so that economies of scale kick in and producing über-high-resolution screens becomes worthwhile.
        Currently the biggest chunk of all flat panels produced ends up in TV screens. It makes more economic sense for the manufacturer to just put the same panels into computer screens than to market a special different type o

      • http://www.engadget.com/2013/04/12/seiki-50-inch-4k-1300/ [engadget.com]

        $1300 for a 4k display. Granted, it's locked to 30Hz, but for most of us 60Hz will be as fast as we need to go (though we'll get more for that god-awful 3D crap they keep trying to push). 4k at 50 inches is very close to my 2560x1600 30" monitor in pixel size, which is fine enough for me at my working distance.

        We stalled at 1920x1080 because everyone moved to TV production. Now that 4k/8k has broken free, we can get over that hump. Not saying there aren't

        • 1080i60 is quite common on broadcast TV; there is really little point to 4K unless you have an 80-inch TV, and those 240Hz TVs aren't actually letting you input 240Hz (it's interpolation, which makes everything look like blurry soap opera shit).
          • 1080i60 is really only 30 full frames per second (quick arithmetic in the sketch below). The fact that the TV industry actually managed to dupe the FCC into allowing interlaced standards into HD is, perhaps, one of the biggest snow jobs ever.

            The part about 240Hz is actually my point - people will gravitate towards the shiny, even if it gets them nothing - as long as it's out there. I just want higher res to be produced for less than a king's ransom so I can use it for computing. I deal with large architectural CAD files (and photography for fun, and
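            The arithmetic behind the 30fps claim, as a quick sanity check ("i60" means 60 interlaced fields per second, and two fields of alternating lines make one full frame):

                fields_per_second = 60          # 1080i60: 60 fields/s
                lines_per_field = 1080 // 2     # each field carries every other line
                full_frames = fields_per_second / 2
                print(lines_per_field, full_frames)  # 540 lines/field, 30.0 full frames/s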

      • Seiki has a 50" TV with a 3840x2160 resolution, available right now for $1499 [tigerdirect.com]. So I don't buy the argument that it's somehow technologically prohibitive. Why can this crappy company no one has ever heard of bring out a 4K TV under $1500, but no one else can make a 4K monitor in an even smaller size (32" or so) without charging over $5500? (and that's for Sharp's offering, which is the next least expensive – most 4K monitors cost as much as a new car). As far as I can tell, it's not technological barri

      • Yeah it reminds me of the "no wireless" comment about the iPod. Geeks complaining that their MP3 player didn't have wireless in 2001. The ramifications and practical considerations never come to mind.
    • by tlhIngan ( 30335 )

      the Hot Hardware link confirms DisplayPort 1.2, which is the only thing I /really/ care about. The others are nice, but 4K out of the laptop means my next mid-range laptop can be my primary desk machine as well. This should push along the display manufacturers after their decade of stalling (perhaps soon we'll see screens in the 20-24" range with more resolution than our 5" displays have).

      Know what else supports 4K? HDMI.

      And I'm fairly sure that if you're willing to pay $10K for a 4K screen (the current cheapest on the market - some Chinese knockoff), display manufacturers are willing to give you the 4K display you want.

      • by Kjella ( 173770 )

        And I'm fairly sure that if you're willing to pay $10K for a 4K screen (the current cheapest on the market - some Chinese knockoff), display manufacturers are willing to give you the 4K display you want.

        Try less than $5K for the Sharp PN-K321 [compsource.com] professional monitor.

        Or you could pay $1K and get high-res 27-30" screens, just as you always could. You won't be able to find a 4K screen for $100 until 4K TVs are down in the under-$1000 range like HDTVs are now.

        Is $1300 [shopnbc.com] close enough?

        The only reason display resolutions "stalled" was because everyone was attracted to cheap cheap cheap - cheap laptops, cheap LCD monitors, etc.

        LCDs have been growing and dropping steadily in price; I picked up a 60" LCD at about half of what a 42" LCD cost me five years earlier. That's double the screen estate ((60/42)^2 ≈ 2) for considerably less, while resolution has remained ridiculously expensive. 4K - as opposed to 2560x1600/1440, which never saw any adoption outside a few niche computer monitors - has the potential to be the new HDTV. Right now you're paying early adopter

      • Know what else supports 4K? HDMI.

        The current HDMI revision only supports 4K at frame rates of 30 fps or below, so it's not really suitable for anything except watching film-sourced content. Supposedly HDMI 1.5 might support 4K@60Hz, but this is not confirmed. You need DisplayPort to do it now.
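        Rough numbers behind that limit, treating each link as a raw pixel pipe and ignoring blanking intervals (a simplification; real video timings need somewhat more headroom):

            def approx_gbps(width, height, bits_per_pixel, refresh_hz):
                # Approximate uncompressed video bandwidth in gigabits per second.
                return width * height * bits_per_pixel * refresh_hz / 1e9

            print(approx_gbps(3840, 2160, 24, 30))  # ~6.0 Gbps: fits HDMI 1.4's ~8.16 Gbps payload
            print(approx_gbps(3840, 2160, 24, 60))  # ~11.9 Gbps: needs DisplayPort 1.2 (~17.28 Gbps payload)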

  • Last November it was revealed that Intel's processors would have GT1, GT2, and GT3 graphics in Haswell. The only difference is that Intel has lifted the muzzle of a press embargo on Haswell to push more Ivy Bridge (and yes, even Sandy Bridge) units out the door to clear out backlogs.
     
    It's been known since last year that the release date for Haswell is June 2nd, but nobody is allowed to report on that for fear of losing Intel advertising dollars.

  • They might even be able to turn up their graphics in WoW now :D
    • Exactly. It seems we hear this every time Intel has an integrated graphics upgrade. By that I mean a chorus of marketing speak to make us believe their graphics offerings compete with and surpass lower-end discrete GPUs. In reality, there is hype, hope, and ultimately disappointment as the parts actually end up in the hands of users. It's getting old. Integrated GPUs are still great for office applications and basic OS effects rendering. The real benefit is cost savings to manufacturers and battery savings to use
      • by h4rr4r ( 612664 )

        Honestly, unless you are playing Crysis they are getting pretty good. I played Portal 2 on an IGP. That is two years old, but the laptop I played on was already a year old at that point.

        • by Shinobi ( 19308 )

          Portal 2 runs on an engine that's effectively 12 years old by now, with just some updates. It's far more CPU dependent than more modern engines, for example.

          Same thing with Left 4 Dead 2, whose benchmark Valve rigged by using an update for the Linux version that was 1½ years newer than what's available for the Windows version, an update that actually shifts more work to the GPU.

        • Honestly, unless you are playing Crysis they are getting pretty good. I played Portal 2 on an IGP. That is two years old, but the laptop I played on was already a year old at that point.

          New Intel IGPs do handle Crysis [youtube.com] with fluid framerates, even with quality settings turned up high.

          • To maximum? I doubt it. And even if it plays it OK, Crysis is 6 years old now. It'll be 5-10 years before an IGP can play Crysis 3 with all the niceties like tessellation and bounce lighting.
    • by alen ( 225700 )

      yep

      I love buying games the day of release, with all the DRM, bugs, and always-connected-to-the-internet issues where they can't support all the players, etc.

      I'd rather buy a game a year or two after release, once it's on sale at least 50% off

    • New Intel GPUs are surprisingly competent. No, they don't stand up to higher end discrete solutions but you can game on them no problem. You have to turn down the details and rez a bit in some of the more intense ones, but you can play pretty much any game on it. (http://www.notebookcheck.net/Intel-HD-Graphics-4000-Benchmarked.73567.0.html). For desktops I always recommend dropping $100ish to get a reasonable dedicated card but for laptops, gaming on an integrated chip is realistic if your expectations are

      • by oic0 ( 1864384 )
        I've got one of those hybrid laptops with both Intel and Nvidia GPUs in it. When the settings are screwed up and it forgets to switch to the Nvidia card, even simple games like League of Legends run jerkily, and more modern stuff becomes completely unplayable unless you completely neuter the settings and resolution.
        • In other words, precisely what I said. Yes, you have to back off on rez and settings. Guess what? That's fine, and expected for something as low power profile as an integrated GPU. Fact remains you can game on it just fine, even new games. Not everyone needs everything cranked, not everyone wants to spend that kind of money. Intel GPUs are extremely competent these days. They are low end and will always be because they get a fraction of the 30-40ish watts of power a laptop CPU/GPU combo can have rather than

          • Yes, you have to back off on rez and settings. Guess what? That's fine, and expected for something as low power profile as an integrated GPU. Fact remains you can game on it just fine, even new games

            One could game just fine on an original PlayStation or a Nintendo DS, and new DS games were still coming out until perhaps a few months ago. It's just that the settings have to be scaled so far back that things look like paper models [wikipedia.org] of what they're supposed to be. The DS in particular has a limit of 6000 vertices (about 1500-2000 polygons) per scene unless a game enables the multipass mode that dramatically lowers frame rate and texture resolution. Fortunately, the HD 4000 in Ivy Bridge runs games with det

      • Re: (Score:2, Informative)

        by Anonymous Coward

        I put together a tiny mini-ITX system with an Ivy Bridge i3-3225. The case is super tiny and does not have space for a dual-slot video card. Even low-to-mid grade video cards are dual slot nowadays, and I didn't have any on hand that would fit.

        I shrugged, and decided to give it a go with the integrated HD4000. The motherboard had really good provision for using the integrated graphics anyway: dual HDMI+DVI and even WiDi support via a built-in Intel wireless N card. (WiDi only works via the integrated GPU, so

      • No, they don't stand up to higher end discrete solutions

        They don't stand up to AMD's integrated graphics either.

          • Actually, given the benchmarks, the GT3e should be about 20% faster than the A10-5800K's graphics chip.

    • The high end part runs Dirt 3. Intel showed a demo running as fast as a GT 650M: http://www.anandtech.com/show/6600/intel-haswell-gt3e-gpu-performance-compared-to-nvidias-geforce-gt-650m [anandtech.com]

  • Wouldn't these kinds of things be more accurately described as GPUs with integrated CPUs?

    It's been 10 years since Intel started panicking when they realized a Pentium core could be tucked into a tiny corner of a GPU, as far as transistor count went.

  • Amazing! (Score:2, Insightful)

    by DarthVain ( 724186 )

    Wow so rather than the 11 FPS you were getting, you might be able to get 22 FPS!

    You can almost play a video game at those speeds! Well done!

    • Actually, if you look at some benchmarks [anandtech.com], you'll see that the current (Ivy Bridge) chips play current games at 30-40fps (even Crysis on pretty high detail settings). So you're looking at 60-100fps with Haswell's GT3e.

      • Not sure I would trust those benchmarks.

        First: Metro is all on lowest settings and scores between 11-25FPS
        Second: For Crysis, I am not sure how "performance" is higher than "mainstream"... unless "performance" is just a nicer way to say lowest quality, highest performance.
        Third: The resolutions they are talking about are 1366x768, which might have been relevant 10 years ago, but are hardly what I would call modern.

        So if you are saying that these integrated solutions will barely play modern games if at all on their lo

        • First: Metro is all on lowest settings and scores between 11-25FPS

          Okay, so one single game shows poorish performance, and will show entirely adequate performance on Haswell GT3e.

          Second: For Crysis, I am not sure how "performance" is higher than "mainstream"... unless "performance" is just a nicer way to say lowest quality, highest performance.

          In other words, it manages 60fps already on lowest settings, and will do 120-150fps on Haswell, and manages 30-40fps on decent settings, and will manage 60-80 on Haswell.

          Third: The resolutions they are talking about are 1366x768, which might have been relevant 10 years ago, but are hardly what I would call modern.

          Actually, 1366x768 is the single most common resolution out there these days on new machines. This is more true on laptops, which these IGPs are aimed at, where it ships on 95% of machines. Sure, us geeks are likely to want shin

          • I suppose. I wasn't really thinking of laptops. But you are right: on laptops integrated video is more common, and dedicated graphics are horribly expensive for even a middling card. Which is why I would not really consider gaming on a laptop.

            So I suppose, as a target, it will make some difference for those who wish to play some games on their laptop. At any rate, it will raise the bar as to which games are possible vs. impossible to play on a regular laptop that doesn't cost 2 grand.

  • by Gordo_1 ( 256312 ) on Thursday May 02, 2013 @10:41AM (#43610593)

    Haswell parts are expected to be 10-15% faster than Ivy Bridge, which was itself barely any faster than Sandy Bridge.

    Anyone remember the days when computing performance doubled or even tripled between generations?

    I have a desktop PC with a Sandy Bridge i5-2500K running at a consistent 4.5GHz (on air). At this rate, it could be another couple of generations before Intel has anything worthwhile as an upgrade... I suspect that discrete-GPU-buying home PC enthusiasts are going to continue to be completely ignored while Intel focuses on chips for tablets and ultrabooks.

    • by s7uar7 ( 746699 )
      I use a 5 year old Q6600 for my everyday computing and it's only now starting to feel a little sluggish. I suspect that with more RAM and/or an SSD it could quite happily chug along for another 2 or 3 years.
    • Intel has publicly stated many, many times that they are on a "tick-tock" development cycle, where they don't expect people to buy new every release. This is a "tock", which introduces a new microarchitecture with new features and performance improvements, followed next year by a "tick", which will be a die shrink with even more power savings, a slight bump in performance, and mostly the same features.

      This is why most Sandy Bridge main boards just needed a firmware update to support Ivy Bridge - they were mostly the same

      • by Gordo_1 ( 256312 )

        OK, but Haswell delivers virtually nothing computing-performance-wise over a reasonably overclocked Sandy Bridge, which is a full tick-tock cycle earlier.

    • by Hadlock ( 143607 )

      Modern CPUs are so fast nowadays that the bottleneck is feeding them, not processing the data. The bus between memory and the CPU allows it to process 24GB/s, but most computers only come with 2, 4, or sometimes 8GB. And then there's the issue of reading from the drive. The only way you're going to consistently peg an i5 for more than a few seconds (let alone an i7) is crunching terabytes of scientific data. Otherwise it's a software issue, like Kerbal Space Program which only uses 50% of one core instead of 80%
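      A back-of-the-envelope illustration of that feeding problem, using round numbers from the era (exact figures vary by platform; the ~24GB/s matches the dual-channel DDR3 estimate above):

          memory_bw = 24e9   # ~24 GB/s: dual-channel DDR3, as cited above
          ssd_bw = 500e6     # ~500 MB/s: a fast SATA SSD
          hdd_bw = 120e6     # ~120 MB/s: a typical hard disk

          working_set = 8e9  # streaming an 8 GB data set once
          for name, bw in [("RAM", memory_bw), ("SSD", ssd_bw), ("HDD", hdd_bw)]:
              print("%s: %.1f s" % (name, working_set / bw))  # RAM 0.3, SSD 16.0, HDD 66.7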

    • by Kjella ( 173770 )

      Haswell parts are expected to be 10-15% faster than Ivy Bridge, which was itself barely any faster than Sandy Bridge. (...) I suspect that discrete-GPU-buying home PC enthusiasts are going to continue to be completely ignored while Intel focuses on chips for tablets and ultrabooks.

      It's taken Intel from 2004 to now to go from the 3.8 GHz Pentium 4 to the 3.5-3.9 GHz i7-4770K, so where do you want them to go? While they're still making moderate IPC improvements, there's just no easy way to scale to 5 GHz and beyond, and scaling with more cores has rather tapped out; interest in their six-core LGA2011 platform is minimal. Of course price is a big part of it, but also that people don't have much to put the two extra cores to work on. Heck, if all you're doing is gaming many would suggest the

  • by slashmydots ( 2189826 ) on Thursday May 02, 2013 @12:26PM (#43611801)
    Wow, that will bring it up to almost the same speed as the 2-year-old AMD APUs, if I'm not mistaken, lol. Talk about being behind! I was amazed when the first couple P-series graphics adapters came out built into the first Sandy Bridge chips. You could actually play Skyrim on an i3-2100 at low settings, and it killed at HD video playback. Now for $110, my demo unit at my shop is an A10 APU with a 6.9 graphics rating. It can play StarCraft II at almost maxed settings at 60FPS at 1280x1024, and the CPU is just a hair slower than an i5-2400 for a crap-ton less money - at least $60 less, if I remember correctly. The 1866 native memory controller helps too, since even Ivy Bridge only hits 1600. So then it's better at video encoding and gaming than any i5. Intel is really playing catch-up at this point. I don't know why everyone's all doom and gloom over AMD getting crushed by Intel. I would have thought that was over by now, especially after releasing Trinity, Zambezi, and Vishera. Those three really are better than Intel in almost every way.
  • Twice nothing is still nothing, although getting a 1.5x increase for low-wattage applications is nice to hear.

    End result, I'm still going to get people complaining to me that The Sims runs slow. Only difference is it'll be stop-motion instead of slideshow.
  • Ars Technica has a more in-depth look [including architectural details] at:
    http://arstechnica.com/gadgets/2013/05/a-look-at-haswell/ [arstechnica.com]
