Early Ivy Bridge Benchmark: Graphics Performance Greatly Improved

The folks over at AnandTech managed to spend some time with early Ivy Bridge production samples and ran a few benchmarks. The skinny: CPU performance is mildly improved, as expected, but the GPU is 20-50% faster than the Sandy Bridge GPU. Power consumption is also down about 30W under full load. The graphics, however, are still slower than AMD's Llano (but the Ivy Bridge CPU beats the pants off the Fusion's). Is the tradeoff worth it?
  • Tradeoff? (Score:5, Insightful)

    by Sycraft-fu ( 314770 ) on Wednesday March 07, 2012 @11:46AM (#39275955)

    It isn't meant to be powerful graphics, and it isn't a "tradeoff". Intel's HD graphics are meant to be very low power, but competent enough to run the basics, at least the shiny OS features. That they do, and it sounds like IB is even better at it. Getting a good CPU with basic graphics isn't a "tradeoff"; it's the norm. If you need good graphics, discrete is still the way to go, and there are plenty of reasonable options.

    From the look of it, Ivy Bridge is quite a win. Sandy Bridge, but a bit better. Nothing not to like there.

    • by tepples ( 727027 )

      If you need good graphics discrete is still the way to go

      Do they have interchangeable discrete video cards for typical laptops yet?

      • by Sycraft-fu ( 314770 ) on Wednesday March 07, 2012 @11:56AM (#39276063)

        So basically all laptops that have discrete graphics have it socketed in an nVidia MXM slot. It's way cheaper for the manufacturers to have one board and just drop different cards onto it. The thing is, since MXM is aimed at OEMs and not consumers, it isn't as easy to swap as a PCI card. It is all on you to make sure the card you are getting is physically the right size, electrically something your system can handle, and thermally not too much.

        Also, pretty much only Sager actually supports doing it; other laptop manufacturers will tell you to GTFO if you ask them about it. As such, even finding the parts isn't easy.

        With laptops you don't really upgrade much other than maybe the RAM or disk.

        However, IB will be useful in laptops not only because it can give better performance for integrated-only systems, but because it'll be nice for switchable ones. You can get ATI systems where you manually switch between discrete and integrated graphics, and nVidia ones that do it on the fly. Better integrated graphics means you can use it for more things, so when on battery it is more feasible to run on integrated and leave the discrete GPU shut down.

        However, note this isn't a laptop part they are talking about; this is the desktop part.

        • by rhook ( 943951 )

          99% of laptops that have discrete graphics have the GPU soldered to the mainboard.

          • by Creepy ( 93888 )

            And when they do have MXM slots, those are sometimes tied to manufacturer drivers and customized hardware that will only start if it sees the manufacturer's hardware. Some people have said drivers from laptopvideo2go and such work, but I haven't tried it (my last two laptops both had soldered on discrete graphics).

          • Crack them open some time. Slots are the big thing since they keep production costs down.

            • I did that a few weeks ago on my old MSI laptop with discrete ATI video. It's soldered to the mainboard. Makes sense too, since, like all laptops, there is a custom heatsink that will only fit in the chassis the mainboard is designed for, and the GPU must align with it perfectly.
      • I've always wondered why there isn't some kind of expansion port standard for video cards on laptops. Let me plug in a video card black box onto the side of my laptop! I don't care if I need a power adapter for the video card box. That way I can use the normal onboard graphics as needed, but occasionally, when I want to game, I can just plug in my video card box, turn on my laptop, and the laptop will automatically switch to using it for graphics. Heck, maybe the port could be PCI express (without power if ne

        • Thunderbolt will do this. There are prototype thunderbolt GPU enclosures out there now. We'll start seeing them soon, hopefully.
        • by voidptr ( 609 )

          Thunderbolt is essentially external PCIe, and there are a few external PCIe enclosures now designed for this use so you can attach a better graphics card to a MacBook Pro or Air when you're at your desk.

        • There is the promise of this with Thunderbolt, but latency is an issue. However, I believe there is one (at least) on the market here. [extremetech.com]
        • For external cards, you're looking for Thunderbolt. I have high hopes for it.

          Internal cards are caught between all the factors you mentioned plus the very limited internal space. Laptop manufacturers don't have much incentive to reserve a large volume for an aftermarket upgrade that most users will never be interested in. It's a niche someone might eventually cater to, but don't hold your breath.

        • Re:Tradeoff? (Score:5, Informative)

          by fuzzyfuzzyfungus ( 1223518 ) on Wednesday March 07, 2012 @12:56PM (#39276805) Journal
          There have been a few stabs at it, I think both ATI and Nvidia have released more-or-less-orphaned-on-launch partnerships with some laptop outfit or other, using proprietary cabling.

          My understanding is that there are a few major hurdles:

          Historically, there really haven't been any good standardized high-bandwidth interfaces to the outside world on laptops. The proprietary docking station port, if provided, might connect directly to the PCI bus; but your next best bets were relatively lousy things like PCMCIA or USB. Even with PCIe, you get 1x from an ExpressCard slot; but the standards for external cabling for anything beefier than that have been languishing in the PCI-SIG forever...

          Unless you are content to use an external monitor only, an 'expansion GPU' both has to have access to all the usual bandwidth that a GPU requires and have enough bandwidth (and suitable software/firmware cooperation) to dump its framebuffer back to whatever internal GPU is driving the laptop screen. You can get (albeit at unattractive prices) enclosures that connect to the 1x PCIe lane in an ExpressCard slot and break that out into a mechanically 16x PCIe card enclosure with supplemental power. Assuming the BIOS isn't a clusterfuck, you can pop in an expansion card just as you would on a desktop. That only gets you the video outs, though; it doesn't solve the trickier and more system-specific problem of driving the laptop screen.

          Docking stations: At present, laptop manufacturers get to designate one line as 'enterprise' by including the necessary connector, and then charge a stiff fee for the proprietary docking station as your only option to drive a few extra heads. I imagine that this blunts the enthusiasm of the major enterprise laptop players for a well-standardized and high bandwidth external connector.
          • What's wrong with a DisplayPort/HDMI "in" port to drive the internal screen? Or an LVDS connector of some description to directly drive the LCD panel?
            • A video in would actually be much more broadly useful (surely the IT minions of the world who occasionally have to deal with a headless box would pay a premium for a laptop that could use its own monitor, keyboard, and mouse as a KVM with just a single cable connection?); but I've never seen one, not even a vaporware or overpriced specialty one...

              More generally, though, driving the internal monitor is hardly an impossible problem(either through feeding a video output, or agreeing on some standard way of g
        • by rhook ( 943951 )

          Look into the ViDock, it does exactly this.

          http://www.villageinstruments.com/tiki-index.php?page=ViDock [villageinstruments.com]

        • If they took out the optical drive, you would have plenty of space for a pretty spiffy graphics card. I'm sure the majority of people would rather have a really nice interchangeable graphics card in their laptop than an optical drive. If you really need an optical drive, there's always an option for USB. Sure, there are things like Thunderbolt, which is basically external PCIe, but I think it would make much more sense to just leave the graphics card in the main laptop body, and do away with the optical driv
        • by drsmithy ( 35869 )

          I've always wondered why there isn't some kind of expansion port standard for video cards on laptops.

          There is: ExpressCard and Thunderbolt.

          The reason you don't see anyone actually doing it is because serious customer demand for upgradeable GPUs in laptops is, for all intents and purposes, nonexistent.

        • You should check out the Vaio Z with the Power Media Dock.
    • by Luckyo ( 1726890 )

      Thing is, many people like games. And games are demanding. Llano and Brazos allow playing mainstream 3D games (as in, not Angry Birds/solitaire) at low settings.
      Sandy/Ivy Bridge and Atom, on the other hand, are utterly useless for that. They can run Aero and give very low-end support to video decoding in hardware, and that's pretty much it.

      So if you're buying a machine where you intend to actually use that GPU for anything more graphically intensive than Aero, Intel is simply not an option unless you're also g

      • by tepples ( 727027 )

        Thing is, many people like games. And games are demanding.

        Indie games tend not to be quite as demanding due to the cost of producing detailed assets, and mainstream games tend to be ported to consoles. So a lot of people will buy a homework-and-Facebook PC with integrated graphics and buy a console for those games that won't run on a homework-and-Facebook PC.

        • by Luckyo ( 1726890 )

          Gamers who only play indie games are an extremely small minority, likely in the low single digits as a percentage. Most people who play indie games also play non-indie games.

          • by tepples ( 727027 )

            Most people who play indie games also play non-indie games.

            And they have the PC with a GMA for homework, Facebook, and indie games, and the console for major label games.

          • And there are a billion users out there who play Facebook games.
    • by b0bby ( 201198 )

      My reading was that the tradeoff was between Intel's more powerful CPU/less powerful GPU, and AMD's more powerful GPU/less powerful CPU offerings. In that case there is a real tradeoff - you can't get both the more powerful CPU & GPU in one package.

    • I thought the tradeoff mentioned in the summary was with regards to Intel vs AMD: You get better graphics with the integrated AMD Fusion chips, but poorer CPU performance.

      In other news, I bought one of the new AMD 6-core FX processors. Despite the miserable benchmarks, that thing feels faster than any other CPU I've had the privilege of using.
      • In other news, I bought one of the new AMD 6-core FX processors. Despite the miserable benchmarks, that thing feels faster than any other CPU I've had the privilege of using.

        Yeah, AMD's marketing department is full of fail. They were telling everyone "50% faster than Sandy Bridge" and giving people high expectations, so after the benchmarks came out instead of people thinking "meh, it's OK" everybody was running around predicting the end of the world.

        On top of that, they sent reviewers the FX-8150, the 8-thread version with the worst single-thread performance per dollar, because you're paying for eight threads whether you use them or not. So the reviewers compared it to the Inte

    • Comment removed (Score:5, Interesting)

      by account_deleted ( 4530225 ) on Wednesday March 07, 2012 @02:26PM (#39278077)
      Comment removed based on user account deletion
      • Hats off to your incisive analysis. I certainly will not make the mistake of buying an Atom-based computer again. Atom runs too hot for the amount of computing it does. Against ARM it is just no contest. The only thing it has going for it is x86 compatibility, and I guess I would prefer to get more of that for less money from AMD, bundled with a decent GPU.

        One thing to add: OpenCL is a game changer. It shifts the multi-core equation onto the GPU. Four cores? Feh, how about 80, or 800, all cranking sing
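
        For anyone who hasn't touched OpenCL, here is a minimal sketch of what "shifting work onto the GPU" looks like in practice, using the pyopencl bindings for Python (the kernel, array size, and device selection are illustrative assumptions, not something from this thread):

            import numpy as np
            import pyopencl as cl

            # Pick whatever OpenCL device the runtime exposes -- on a machine
            # with no discrete card, that can be the integrated GPU.
            ctx = cl.create_some_context()
            queue = cl.CommandQueue(ctx)

            a = np.random.rand(1_000_000).astype(np.float32)
            mf = cl.mem_flags
            a_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=a)

            # One work-item per element: those "80, or 800" cores each run this body.
            prg = cl.Program(ctx, """
                __kernel void square(__global float *a) {
                    int gid = get_global_id(0);
                    a[gid] = a[gid] * a[gid];
                }
            """).build()

            prg.square(queue, a.shape, None, a_buf)
            out = np.empty_like(a)
            cl.enqueue_copy(queue, out, a_buf)

        The same kernel source runs unchanged on an IGP or a discrete card; only the device chosen at context creation differs.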

  • by Kenja ( 541830 ) on Wednesday March 07, 2012 @11:47AM (#39275961)
    Frankly, I am sick and tired of these integrated GPUs. The theory is that it's a cost saver, but since I just put in a dedicated graphics card, it ends up being a cost with no benefit. Ah well.
    • by tepples ( 727027 )
      People who don't use a computer other than for homework and Facebook don't feel the need to put in a dedicated graphics card. As I understand it, as long as Office and YouTube work, non-gamers and console gamers are perfectly fine if a computer's 3D capability is comparable to a Voodoo3 from over a decade ago.
      • by Sir_Sri ( 199544 )

        These are a hell of a lot better than that. They aren't good, but they will manage to play Skyrim, for example (albeit at a relatively low resolution for a decent framerate). http://www.anandtech.com/show/5626/ivy-bridge-preview-core-i7-3770k/15

        These are probably the wrong direction for the product, though. They don't viably compete with a discrete GPU, so people who can would rather not have to buy an integrated GPU at all, and for business it's so powerful that it lets employees game on work computers,

        • by tepples ( 727027 )

          They aren't good, but they will manage to play Skyrim, for example (albeit at a relatively low resolution for a decent framerate).

          Just in case someone doesn't care to click through to the benchmark, allow me to summarize: AnandTech reports 46 fps at 720p and no AA for Skyrim, a PC game comparable to PS3 games. So no, using integrated graphics doesn't mean going back to Dreamcast-class graphics anymore.

        • Arguably, any business whose strategy to prevent staff gaming involves the GMA950, rather than software administration or hiring responsible people, deserves the huge number of games of Farmville and Snood currently being played on their network...
        • Comment removed based on user account deletion
          • by Sir_Sri ( 199544 )

            Come back again in a console generation.

            Right now you can't do enough better on a PC to warrant doing major PC-exclusive graphics compared to the 360/PS3. Higher resolution, better FPS, sure, but not significantly better. Intel is reasonably closing the gap on PS3/Xbox 360-level performance, but that puts them at about 0.1 of a good graphics card*.

            Of course the 'next gen' consoles are in the making now, and that means we'll see consoles about on par with what you can do with a decent rig today. So then the

      • People who only use their computer for homework and Facebook don't need a Sandy or Ivy Bridge processor. A Core2 with GMA is more than sufficient.

        I think it makes sense for mobile applications, but for desktops it doesn't. You can get a $40 card that will outperform the onboard graphics. That being said, I'm sure Dell etc. love it. They love charging for upgrades. They're the car salesmen of the computer world. Once you add the goodies onto your base model you could have bought the top end that came with those feature
        • You can get a $40 card that will outperform the onboard.

          True yesterday, false today: [anandtech.com]

          Based on what we've seen, discrete GPUs below the $50 - $60 mark don't make sense if you've got Intel's HD 4000 inside your system. The discrete market above $100 remains fairly safe however.

          • They've got some weasel words in there, "..based on what we've seen..." and no benchmark scores.
            • Not sure what you mean? That was their final-page conclusion based on a dozen pages of benchmarks.
              • Lol. Didn't see it was a full review. I don't read AnandTech, so I missed the dropdown for the different pages.

                It didn't do so well on DX11; in fact it was outdone by a GT 440 (which I can get locally for $54). After quick browsing it appears the GT 440 does better on all the games as well. Well enough for me to prefer not having integrated video (as in, don't include it on the chip and pass the savings on to me).

                Ivy Bridge does do significantly better than Sandy Bridge 2600K integrated, so Intel is improvi
      • Comment removed based on user account deletion
        • The sad thing, though, is that we gamers/enthusiasts are basically paying for a GPU we'll never use. It would be nice if these CPUs were also sold without an integrated GPU.

          They actually do exist; check out the Core i5 2550, for example. It has a higher clock than the 2500 for the same price. The difference is they removed the iGPU from the chip.

        • by Kjella ( 173770 )

          Well, it's not like Intel does it just to annoy you. The top Intel chips have 16 EUs, which is roughly equal to 32 shaders. A top graphics card like the 7970 has 2048 shaders. So if you use AMD's $450 price as a basis, that works out to about $7 for the Intel IGP; make it $10 to include QuickSync and whatnot. For that small a savings, Intel would have to validate a new design and risk a potential shortage of chips with/without IGP. Look at the die layout [pcper.com] for Sandy Bridge; there's no Ivy Bridge layout yet but it's proba
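
          As a worked version of that arithmetic (hypothetical pro-rata pricing, not Intel's actual cost structure):

            radeon_7970_price = 450.0    # USD, the $450 figure cited above
            radeon_7970_shaders = 2048
            igp_shader_equivalent = 32   # 16 EUs ~ 32 shaders, per the comment

            igp_value = radeon_7970_price * igp_shader_equivalent / radeon_7970_shaders
            print(round(igp_value, 2))   # 7.03 -- the "about $7" above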

        • by timeOday ( 582209 ) on Wednesday March 07, 2012 @12:51PM (#39276757)
          I would like to see the ability to use the integrated GPU, even if not for graphics. The traditional CPU is good for sequential logic. But for pattern recognition, physics simulation (which is basically what 3D graphics is), encoding, or code-cracking (e.g. Bitcoin), the highly parallel structure of the GPU is better. Now you might argue: my offboard GPU is still the same thing, but better. OK. But these are inherently parallel tasks, so if you could use the built-in one AND the add-on, you wouldn't be wasting anything.
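
          A sketch of that idea with the pyopencl bindings: enumerate every GPU OpenCL can see (integrated and discrete alike) and hand each a slice of the same data-parallel job. The kernel and data here are placeholders, and the slices are dispatched one after another for brevity; real code would overlap the queues with threads or events:

            import numpy as np
            import pyopencl as cl

            # Every GPU device across all platforms: IGP plus discrete card(s).
            devices = [d for p in cl.get_platforms() for d in p.get_devices()
                       if d.type & cl.device_type.GPU]

            data = np.random.rand(2_000_000).astype(np.float32)
            chunks = np.array_split(data, max(len(devices), 1))

            KERNEL = """
                __kernel void square(__global float *x) {
                    int i = get_global_id(0);
                    x[i] = x[i] * x[i];
                }
            """

            results = []
            for dev, chunk in zip(devices, chunks):
                ctx = cl.Context([dev])
                queue = cl.CommandQueue(ctx)
                buf = cl.Buffer(ctx, cl.mem_flags.READ_WRITE | cl.mem_flags.COPY_HOST_PTR,
                                hostbuf=chunk)
                cl.Program(ctx, KERNEL).build().square(queue, chunk.shape, None, buf)
                out = np.empty_like(chunk)
                cl.enqueue_copy(queue, out, buf)
                results.append(out)

            combined = np.concatenate(results) if results else data ** 2  # CPU fallback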
        • I don't even know why Intel bothers with the integrated graphics on the Core i7. It makes perfect sense on the Core i3 and the Pentiums, and maybe the i5. But how many people are going to buy a $250+ processor and not use a discrete GPU?
    • Well then it sounds like Ivy Bridge is your best bet over Llano. For most people, the latest generation of integrated graphics is good enough for even 1080p video. It can even work for casual gaming.
    • Well, even for you, one of the advantages of the AMD Fusion platform is the ability to add in a discrete card and combine the power of the two (figuring the discrete card is another AMD and you're running under Windows). Though from what I've seen, the Fusion platform is capable enough for most 3D tasks unless you're a serious gamer who wants every bell and whistle at 1600x1200+.

      It's also a different story when it comes to laptops. Fusion is incredibly useful in the laptop market where the entire lower end of the mar

    • And there is a lot of use for them.

      In terms of desktop chips it is for low end use. A lot of people just do web/e-mail/word with their systems and an Intel HD graphics setup is perfect for them. It is plenty of power to do the shiny OS interface, accelerate video, and so on, and comes with the system.

      In terms of laptop chips, you really always want it on account of switchable graphics. If your laptop has switchable graphics it can use the integrated for low power consumption and only kick on the discrete wh

      • Not really. See the Athlon II X4 631. It costs quite a bit less than the A6-3650. Not much, but enough for the GP to have a point. Why pay even a little for something that you're not going to use at all?

        • The Llano is a different beast. AMD is trying to whack a bigass graphics card, relatively speaking, on there. Intel's HD graphics are tiny; that is why they are low performers. But sure, if he really wants Intel processors with no graphics he can have them: Intel's high-end CPUs, the LGA 2011 and LGA 1366 ones, don't feature them. However, they cost more, so there you go.

          Intel's mainstream CPUs have integrated GPUs. They are very reasonably priced so just deal with it.

          • "Deal with it"? I though I was doing that just fine. I'm not losing my shit here, suing Intel or even registering a IHATEINTELGPUS.com. All I did was point out that buying something you'll never use is a bad deal, regardless of price.

            I think AMD had the right idea: bundle a good GPU, strong enough to beat discrete graphics of past generations, or it's pointless. And allow CrossFire in case the user wants to upgrade, so as not to waste any resources. As for Intel, if I already have a Radeon HD5570 or Geforce

    • http://www.anandtech.com/show/5626/ivy-bridge-preview-core-i7-3770k/11 [anandtech.com]

      It was faster than low end cheapo cards. Which is mainly the point.

      If you are putting in $200 cards, they are a long way off, but they essentially obsolete the need for a low-end card, which is a good thing.

      And since all most people need is a low end card, this is sufficient for most people.

      For desktop, internet, video, web games, older games and even new games at modest settings this is fine.

      Frankly, I am sick and tired of these integrated GPUs. The theory is that it's a cost saver, but since I just put in a dedicated graphics card, it ends up being a cost with no benefit. Ah well.

      According to this review, the AMD A8-3850 is 50% faster than the ~$50 Radeon 6450, but 50% slower than the ~$75 Radeon 6670. [pcper.com]

      So sure, it's not better than a $200 dedicated card, but it's far better than what integrated graphics used to be like. Integrated will never be faster than dedicated, but if I can get about 50% of the performance from integrated, that's reasonable until I have an extra $200 for a "real" video card.

      • The A8 is way faster than Intel's offerings, works with a 6670 in CrossFire so as not to diminish its value when upgrading, and can be bought without a GPU as the Athlon II X4 641. I think he was referring to the fact that going Intel forces you to pay for a GPU that you'll have no use for if you have anything better than a Radeon HD5570, which is often the case, especially with processors more powerful than, say, an i3.

    • by Bengie ( 1121981 )

      Having a graphics card integrated into the CPU is only one benefit. The future benefit is using the GPU as a co-CPU. AMD already has plans for the IGP to understand context switching and respect protected memory.

      Some people say, "why? The IGP is slower than discrete." But no one thinks: oh, the IGP has 2-3 orders of magnitude less latency than a discrete GPU while being less than 1 order of magnitude slower.

      Think of future multimedia where the CPU and IGP ping-pong data back and forth. I like to think of what kind of physic

    • The latest benchmark shows that the integrated graphics are better than the budget discrete cards currently offered. For uses where graphics performance does not matter as much (business desktops and laptops) this is a cost savings for them. Also the current trend today is ultrabooks which benefit greatly from not having a discrete card.
    • by durrr ( 1316311 )

      Where they really shine is when combined with a mini-ITX mobo. Now if I got around to getting an inverter and some decent batteries, I could bring my desktop computer with me as a moderately bulky laptop replacement.

    • Actually, there is one place where Intel's integrated GPU knocks the socks off all the competition... Video encoding!

      Just look at the benchmarks and image examples from AnandTech's review [anandtech.com].

      And that's the old Sandy Bridge. If we see a 30-50% improvement over that again... I can see some uses for the integrated card :)
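
      For the curious: the AnandTech numbers come from vendor transcoding tools, but as a rough, hypothetical illustration of driving the same Quick Sync hardware from a script -- assuming an ffmpeg build with the QSV (h264_qsv) encoder enabled, and with placeholder file names:

        import subprocess

        # Offload H.264 encoding to Intel's hardware encoder via ffmpeg's QSV path.
        subprocess.run([
            "ffmpeg", "-i", "input.mp4",
            "-c:v", "h264_qsv",   # Quick Sync H.264 encoder
            "-b:v", "5M",         # target bitrate
            "output.mp4",
        ], check=True)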

  • How does Ivy Bridge compare to ARM? From what I've read, it appears that ARM has lower wattage, but I'm not sure about the performance.
  • The CPU in Llano is two generations back, basically an Athlon II. Beating the pants off Bulldozer is easy for Intel: just find a benchmark that is optimized for single threads, compiled with ICC, or weighted toward the single-threaded result. One of the major new features, the random number generator, wasn't even tested. Monte Carlo benchmarks, where are you? [nist.gov]
  • The graphics, however, are still slower than AMD's Llano (but the Ivy Bridge CPU beats the pants off the Fusion's). Is the tradeoff worth it?

    It depends on what you're doing with it! Duh... Seriously, that was a deeply stupid question.

    • by Nimey ( 114278 )

      It's a leading question in a Slashdot summary. It's hardly meant to be intelligent; I think the purpose is to drive discussion.

      You see that somewhat often on news stories elsewhere, probably more at lower-quality establishments whose MO is to drum up controversy.

  • 1. AMD CPU bug
    2. AMD divesting from its fab
    3. Intel pulling even MORE ahead on performance and even lowering power usage at the same time!

    Not to mention AMD's financial troubles and the fact that their chips have a tendency to burn up.

    • Plus Intel focusing on ultrabooks which is helped by better integrated graphics. I don't think AMD has an answer to that.
    • Perhaps AMD throwing away GF is due to TSMC already having 22nm capability while GF is stuck on 28nm. They can't compete at 28nm when Intel is on the way to 14nm next year.
  • I'm a bit confused by the target market for these improvements. If you're buying one of these fancy chips for a desktop, you must have some reason to need all that power, and 90% of the people who have such a reason will also need their computer to have a discrete graphics card. If all you're doing is Facebook and photos, a cheap Core 2 Duo is more than you need. If you're gaming, you still can't do it without a discrete card. So now we hear that the only thing that really got improved in this generation is
  • Soooo, you built a CPU that barely runs faster than the previous generation of CPU. However, the integrated graphics are 20-50% better.

    Integrated graphics for anything other than the most basic tasks are horrible by several orders of magnitude. You can buy a $130 discrete video card that will deliver 1000% of the graphics performance.

    In real-world terms this is like taking a game that runs at, say, 12 FPS and making it run at 14-18 FPS, which is still unplayable. More realistically you will take a game that is completely unplay
