Early Ivy Bridge Benchmark: Graphics Performance Greatly Improved 146

The folks over at AnandTech managed to spend some time with early Ivy Bridge production samples and run a few benchmarks. The skinny: CPU performance is mildly improved, as expected, but the GPU is 20-50% faster than the Sandy Bridge GPU. Power consumption is also down about 30W under full load. The graphics, however, are still slower than AMD's Llano (but the Ivy Bridge CPU beats the pants off the Fusion's). Is the tradeoff worth it?
This discussion has been archived. No new comments can be posted.

  • by Kenja ( 541830 ) on Wednesday March 07, 2012 @12:47PM (#39275961)
Frankly, I am sick and tired of these integrated GPUs. The theory is that it's a cost saver, but since I just put in a dedicated graphics card it ends up being a cost with no benefit. Ah well.
  • by UnknowingFool ( 672806 ) on Wednesday March 07, 2012 @01:07PM (#39276241)
True, but integrated is getting better. At this point the budget Nvidia and AMD discrete cards are slightly better than Intel, but IMO not worth the $50 for the slight upgrade. You are better off spending a little more and moving to mid-range for a lot more performance.
  • Re:Tradeoff? (Score:5, Interesting)

    by hairyfeet ( 841228 ) <bassbeast1968 AT gmail DOT com> on Wednesday March 07, 2012 @03:26PM (#39278077) Journal

But it IS a tradeoff, Blanche, it is. You see, most folks are embracing the wonder that is "The Internet" and all the TV, movies, and other entertainment that this wonderful medium has to offer, and Intel GPUs... well, they suck REALLY hard.

But here is the dirty little secret AMD knows and Intel doesn't want you to hear, after going so far as to shoot its Atom division in the face by killing off the Nvidia chipset business and hobbling Atom with insanely shitty rules like "only 10 inches with crappy resolution" and "only 2GB of RAM." The secret is this: most folks simply aren't slamming even 5-year-old chips hard enough to worry about, much less the newer ones. You see, chips passed "good enough" for the vast majority once we hit dual cores, so the fact that AMD's chips are 30% slower really doesn't matter if the user is only using less than half the power available anyway. And having that really nice GPU makes everything nice and smooth, with great HD video and even gaming if you so desire, although the majority isn't playing heavy CPU-slamming games but crap like Farmville and Mob Wars.

This is why both my desktop and netbook are AMD, and I sold my full size for the netbook because I found that when I was mobile I simply wasn't hitting the CPU hard enough to matter. My Thuban X6 has OCing headroom up the wazoo should I ever need it, but with most games barely hitting dual cores, and transcoding on 6 cores being so sweet, I doubt I'll need it. And the E350? Man, whoever designed that chip needs to be given a Corvette and a raise by AMD, because that thing is bloody brilliant! Six hours playing HD video at default voltages (BTW, if you have an E or C series, check out Brazos Tweaker [], as you can add 20%-30% battery life by using it) and the ability to just pop in an HDMI cable for full 1080p goodness. Hell, it even plays L4D, Bioshock II, and GTA:VC (I could play the newer GTA games but I don't care for them), all while staying cool to the touch and quiet as a churchmouse. The OEMs have taken notice (now that Intel isn't bribing them not to anymore), and you can see everything from HTPCs to laptops and netbooks to all-in-ones and desktops running Brazos. In fact, last time I walked into my local Walmart Supercenter there were only 2 Intel units, both of which were bottom o' the line Atoms. The rest of the store? All AMD Fusions. I've built several office boxes (the traditional stronghold of Intel) with the E350, and the employees just love them: whisper quiet while giving them plenty of power for their everyday tasks.

Where Intel screwed the pooch was being too greedy. They SHOULD have made a deal with Nvidia to ensure plenty of new GPUs for their chips; instead they wiped out the entire Nvidia chipset business and made many of their chips simply overpriced and underperforming, especially in the laptop arena where you can't add a discrete card. ION and Optimus were the perfect answer, with the low-power shitty Intel chip for when you were on battery and the Nvidia chip when you were plugged in, but now that the option is gone, frankly I wouldn't touch Intel on a mobile unless it had a discrete, and I warn my customers of the same. As we get more and more multimedia heavy, folks want good graphics with smooth video and nice gaming, and Intel just doesn't have that. You can buy an AMD A series for probably half of what this chip is gonna cost, an E series for like one fifth, and while you won't notice the CPU unless you are doing number crunching or some other task one doesn't often do on a mobile, you WILL notice the much nicer graphics. Intel just went the wrong direction on this IMHO and will pay the price.
