Early Ivy Bridge Benchmark: Graphics Performance Greatly Improved
The folks over at AnandTech managed to spend some time with early Ivy Bridge production samples and run a few benchmarks. The skinny: CPU performance is mildly increased, as expected, but the GPU is 20-50% faster than the Sandy Bridge GPU. Power consumption is also down about 30W under full load. The graphics, however, are still slower than AMD's Llano (but the Ivy Bridge CPU beats the pants off Fusion's). Is the tradeoff worth it?
Tradeoff? (Score:5, Insightful)
It isn't meant to be powerful graphics, and it isn't a "tradeoff". Intel's HD graphics are meant to be very low power, but competent enough to run the basics, shiny OS features at least. That they do, and it sounds like IB is even better at it. Getting a good CPU with basic graphics isn't a "tradeoff"; it's called "normal". If you need good graphics, discrete is still the way to go, and there are plenty of reasonable options.
From the look of it, Ivy Bridge is quite a win. Sandy Bridge, but a bit better. Nothing not to like there.
Re: (Score:2)
If you need good graphics discrete is still the way to go
Do they have interchangeable discrete video cards for typical laptops yet?
Depends on what you mean (Score:5, Informative)
So basically all laptops that have discrete graphics have it socketed in an nVidia MXM slot. It's way cheaper for manufacturers to have one board and just drop different cards on it. The thing is, though, that since it is for OEMs and not consumers, it isn't as easy to swap as a PCI card. It is all on you to make sure the card you are getting is physically the right size, electrically something your system can handle, and thermally not too much.
Also, pretty much only Sager actually supports doing it; any other laptop manufacturer will tell you to GTFO if you ask them about it. As such, even finding the parts isn't easy.
With laptops you don't really upgrade much other than maybe the RAM or disk.
However, IB will be useful in laptops not only because it can give better performance for integrated-only systems, but because it'll be nice for switchable ones. You can get ATI systems where you manually switch between discrete and integrated graphics, and nVidia ones that do it on the fly. Better integrated graphics means you can use it for more things, so when on battery it is more feasible to run on it and leave the discrete GPU shut down.
However, note that this wasn't a laptop part they were talking about; this is the desktop part.
Re: (Score:2)
99% of laptops that have discrete graphics have the GPU soldered to the mainboard.
Re: (Score:2)
And when they do have MXM slots, those are sometimes tied to manufacturer drivers and customized hardware that will only start if it sees the manufacturer's hardware. Some people have said drivers from laptopvideo2go and such work, but I haven't tried it (my last two laptops both had soldered on discrete graphics).
Not anymore (Score:3)
Crack them open some time. Slots are the big thing now, since they keep production costs down.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
I've always wondered why there isn't some kind of expansion port standard for video cards on laptops. Let me plug a video card black box into the side of my laptop! I don't care if I need a power adapter for the video card box. That way I can use the normal onboard graphics as needed, but occasionally, when I want to game, I can just plug in my video card box, turn on my laptop, and the laptop will automatically switch to using it for graphics. Heck, maybe the port could be PCI express (without power if ne
Re: (Score:3)
Re: (Score:2)
Thunderbolt is essentially external PCIe, and there are a few external PCIe enclosures now designed for this use so you can attach a better graphics card to a MacBook Pro or Air when you're at your desk.
Re: (Score:2)
Re: (Score:2)
For external cards, you're looking for Thunderbolt. I have high hopes for it.
Internal cards are caught between all the factors you mentioned plus the very limited internal space. Laptop manufacturers don't have much incentive to reserve a large volume for an aftermarket upgrade that most users will never be interested in. It's a niche someone might eventually cater to, but don't hold your breath.
Re:Tradeoff? (Score:5, Informative)
My understanding is that there are a few major hurdles:
Historically, there really haven't been any good standardized high-bandwidth interfaces to the outside world on laptops. The proprietary docking station port, if provided, might connect directly to the PCI bus; but your next best bets were relatively lousy things like PCMCIA or USB. Even with PCIe, you get 1x from an ExpressCard slot; but the standards for external cabling for anything beefier than that have been languishing in the PCIe SIG forever...
Unless you are content to use an external monitor only, an 'expansion GPU' both has to have access to all the usual bandwidth that a GPU requires and has to have enough bandwidth (and suitable software/firmware cooperation) to dump its framebuffer back to whatever internal GPU is driving the laptop screen. You can get (albeit at unattractive prices) enclosures that connect to the 1x PCIe lane in an ExpressCard slot and break that out into a mechanically 16x PCIe card enclosure with supplemental power. Assuming the BIOS isn't a clusterfuck, you can pop in an expansion card just as you would on a desktop. That only gets you the video outs, though; it doesn't solve the trickier and more system-specific problem of driving the laptop screen.
Docking stations: At present, laptop manufacturers get to designate one line as 'enterprise' by including the necessary connector, and then charge a stiff fee for the proprietary docking station as your only option to drive a few extra heads. I imagine that this blunts the enthusiasm of the major enterprise laptop players for a well-standardized and high bandwidth external connector.
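A quick back-of-envelope on the framebuffer-readback problem described above (illustrative Python; the panel resolution, refresh rate, and usable link bandwidth are assumptions, not measurements):

```python
# How much of a 1x PCIe link does copying rendered frames back to the
# laptop's internal GPU consume? Assumed numbers: a 1366x768 panel at
# 60 Hz, 4 bytes/pixel, ~500 MB/s usable on one PCIe 2.0 lane.
width, height, bytes_per_pixel, fps = 1366, 768, 4, 60
readback_bytes_per_sec = width * height * bytes_per_pixel * fps

link_bytes_per_sec = 500e6  # one PCIe 2.0 lane, rough usable figure (assumed)

print(f"Readback needs ~{readback_bytes_per_sec / 1e6:.0f} MB/s")
print(f"That's {readback_bytes_per_sec / link_bytes_per_sec:.0%} of the link")
```

Roughly half the lane is gone before the card moves a single texture or draw call, which is part of why these enclosures usually push you toward an external monitor instead.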
Re: (Score:2)
Re: (Score:2)
More generally, though, driving the internal monitor is hardly an impossible problem (either through feeding a video output, or agreeing on some standard way of g
Re: (Score:2)
Look into the ViDock, it does exactly this.
http://www.villageinstruments.com/tiki-index.php?page=ViDock [villageinstruments.com]
Re: (Score:2)
Re: (Score:2)
There is: ExpressCard and Thunderbolt.
The reason you don't see anyone actually doing it is because serious customer demand for upgradeable GPUs in laptops is, for all intents and purposes, nonexistent.
Re: (Score:2)
Re: (Score:3)
Thing is, many people like games. And games are demanding. Llano and Brazos allow playing mainstream 3D games (as in, not Angry Birds/solitaire) at low settings.
Sandy/Ivy Bridge and Atom, on the other hand, are utterly useless for that. They can run Aero and give very low-end support for video decoding in hardware, and that's pretty much it.
So if you're buying a machine where you intend to actually use that GPU for anything more graphically intensive than Aero, Intel is simply not an option unless you're also g
Re: (Score:2)
Thing is, many people like games. And games are demanding.
Indie games tend not to be quite as demanding due to the cost of producing detailed assets, and mainstream games tend to be ported to consoles. So a lot of people will buy a homework-and-Facebook PC with integrated graphics and buy a console for those games that won't run on a homework-and-Facebook PC.
Re: (Score:2)
Gamers who only play indie games are an extremely small minority, likely below single digit in terms of percentage. Most people who play indie games also play non-indie games.
Re: (Score:2)
Most people who play indie games also play non-indie games.
And they have the PC with a GMA for homework, Facebook, and indie games, and the console for major label games.
Re: (Score:2)
Re: (Score:2)
The vast majority of these games (or more specifically, gamers) have nothing to do with the indie niche. They're playing Zynga games.
Re: (Score:2)
Indie games tend not to be quite as demanding
Just like Zynga games.
Re: (Score:2)
Re: (Score:2)
Having looked at them, I stand by my opinion. Even the highest-end part available, the HD 4000, loses to the comparable AMD offering by around a third to a half. That's not even counting the cheating in filtering tests (which apparently was reduced).
For example, in my book, SC2 is barely playable on the AMD offering. Losing a third to half the FPS takes it quite far into unplayable territory.
Re: (Score:2)
Re: (Score:2)
I suspect that you and I have a very different idea of what "playable" is. To me, 10 fps with occasional drops to sub-1 fps is NOT playable, and that's what I've seen the HD 3000 get. On two different machines.
Re: (Score:2)
Re: (Score:2)
But that's just the thing. It's NOT sufficient to run most games on low settings. Try StarCraft 2, for example. It's very optimized for the low end, just like all Blizzard offerings, and it's still spectacularly unplayable on Intel offerings.
Re: (Score:3)
My reading was that the tradeoff was between Intel's more powerful CPU/less powerful GPU, and AMD's more powerful GPU/less powerful CPU offerings. In that case there is a real tradeoff - you can't get both the more powerful CPU & GPU in one package.
Re: (Score:3)
In other news, I bought one of the new AMD 6-core FX processors. Despite the miserable benchmarks, that thing feels faster than any other CPU I've had the privilege of using.
Re: (Score:2)
In other news, I bought one of the new AMD 6-core FX processors. Despite the miserable benchmarks, that thing feels faster than any other CPU I've had the privilege of using.
Yeah, AMD's marketing department is full of fail. They were telling everyone "50% faster than Sandy Bridge" and giving people high expectations, so after the benchmarks came out, instead of people thinking "meh, it's OK", everybody was running around predicting the end of the world.
On top of that, they sent reviewers the FX-8150, the 8-thread version with the worst single thread performance per dollar because you're paying for eight threads whether you use them or not. So the reviewers compared it to the Inte
Comment removed (Score:5, Interesting)
Re: (Score:2)
Hats off to your incisive analysis. I certainly will not make the mistake of buying an Atom-based computer again. Atom runs too hot for the amount of computing it does. Against ARM it is just no contest. The only thing it has going for it is x86 compatibility, and I guess I would prefer to get more of that for less money from AMD, bundled with a decent GPU.
One thing to add: OpenCL is a game changer. It shifts the multi-core equation onto the GPU. Four cores? Feh, how about 80, or 800, all cranking sing
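For readers unfamiliar with the model the parent is describing: in OpenCL you write a per-element kernel, and the runtime launches one instance per data item across however many compute units the GPU has. Here's a rough sketch of that mental model in plain Python (no GPU or OpenCL runtime needed; the names are illustrative, not real OpenCL API):

```python
# In OpenCL you write only the loop body (the "kernel"); the device runs
# one instance per work-item, each identified by get_global_id(0).
def vec_add_kernel(gid, a, b, out):
    out[gid] = a[gid] + b[gid]  # each work-item handles exactly one element

a = [1.0, 2.0, 3.0, 4.0]
b = [10.0, 20.0, 30.0, 40.0]
out = [0.0] * len(a)

# A GPU would run these iterations concurrently across its cores;
# Python just loops here to show the semantics.
for gid in range(len(a)):
    vec_add_kernel(gid, a, b, out)

print(out)  # [11.0, 22.0, 33.0, 44.0]
```

The point of the programming model is that nothing in the kernel cares whether there are 4, 80, or 800 cores; the same code scales with the hardware.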
Re: (Score:2)
Re: (Score:2)
Intel very rarely falls behind anything AMD does.
Not true at all. But when they do fall behind, which has happened several times (e.g. the slow move to serial interconnect), they will spend any amount of money and employ any means, fair or foul, to catch up.
This time round, though, AMD has a very real chance to be on a 20nm process while Intel spends a year or two at 22nm. That is because TSMC has already taped out Cortex-A15s at 20nm, while Intel has a long way to go on its 14nm process. That will go a long way towards levelling the playing field. Hmm, intere
Re: (Score:3)
Seriously, there is no reason at all to go amd right now.
I went AMD very recently when building a cheap home server, because AMD motherboards tend to have higher SATA port counts on consumer-level hardware. They also don't make a habit of over-zealously disabling key features of the CPU like Intel does to differentiate their pricing structure (the lack of VT-x bit me pretty hard when my P7450 laptop arrived, and Intel documentation hadn't been released indicating the lack of VT-x at the time I purchased it). The Intel CPUs are faster, but that didn't mean much
But still slower then a "real" video card... (Score:4, Interesting)
Re: (Score:3)
Re: (Score:3)
These are a hell of a lot better than that. They aren't good, but they will manage to play Skyrim, for example (albeit at a relatively low resolution for a decent framerate). http://www.anandtech.com/show/5626/ivy-bridge-preview-core-i7-3770k/15
These are probably the wrong direction for the product, though. They don't viably compete with a discrete GPU, so people who can would rather not have to buy an integrated GPU at all, and for business it's so powerful it's letting employees game on work computers,
Re: (Score:2)
They aren't good, but they will manage to play Skyrim, for example (albeit at a relatively low resolution for a decent framerate).
Just in case someone doesn't care to click through to the benchmark, allow me to summarize: AnandTech reports 46 fps at 720p and no AA for Skyrim, a PC game comparable to PS3 games. So no, using integrated graphics doesn't mean going back to Dreamcast-class graphics anymore.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Come back again in a console generation.
Right now you can't do enough better on a PC to warrant doing major PC-exclusive graphics compared to the 360/PS3. Higher resolution and better FPS, sure, but not significantly better. Intel is reasonably closing the gap on PS3/360-level performance, but that puts them at about 0.1 of a good graphics card.
Of course the 'next gen' consoles are in the making now, and that means we'll see consoles about on par with what you can do with a decent rig today. So then the
Re: (Score:2)
Re: (Score:2)
I think it makes sense for mobile applications, but for desktops it doesn't. You can get a $40 card that will outperform the onboard graphics. That being said, I'm sure Dell etc. love it. They love charging for upgrades; they're the car salesmen of the computer world. Once you add the goodies onto your base model, you could have bought the top end that came with those feature
Re: (Score:3)
True yesterday, false today: [anandtech.com]
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
It didn't do so well on DX11; in fact it was outdone by a GT 440 (which I can get locally for $54). After some quick browsing, it appears the GT 440 does better in all the games as well. Well enough for me to prefer not having integrated video (as in, don't include it on the chip and pass the savings on to me).
Ivy Bridge does do significantly better than the Sandy Bridge 2600K's integrated graphics, so Intel is improvi
Re: (Score:2)
Re: (Score:3)
The sad thing though is that us gamers/enthusiasts are basically paying for a GPU we'll never use. It would be nice if these CPUs were sold also without an integrated GPU.
They actually do exist, check out the Core i5 2550 for example. It has a higher clock than the 2500 for the same price. The difference is they removed the iGPU from the chip.
Re: (Score:2)
Re: (Score:2)
Well, it's not like Intel does it just to annoy you. The top Intel chips have 16 EUs, which is roughly equal to 32 shaders. A top graphics card like the 7970 has 2048 shaders. So if you use AMD's $450 price as a basis, that works out to about $7 for Intel's, make it $10 to include QuickSync and whatnot. For that small saving, Intel would have to validate a new design and risk a potential shortage of chips with/without IGP. Look at the die layout [pcper.com] for Sandy Bridge, there's no Ivy Bridge layout yet but it's probably the same. You see that huge chunk called "graphics"? Me neither, it's somewhere in those small "misc io" bits. That's the only little thing of your CPU you aren't using with a dGPU.
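The ~$7 figure above can be reproduced with simple proportions (the shader counts and the $450 card price come from the comment; treating price as linear in shader count is obviously a crude assumption):

```python
# Crude estimate of the die-area "cost" of Intel's IGP, scaling a
# discrete card's price by relative shader count (assumed linear).
intel_eus = 16
shaders_per_eu = 2            # comment's rough equivalence: 16 EUs ~ 32 shaders
radeon_7970_shaders = 2048
radeon_7970_price = 450.0     # USD, per the comment

intel_equiv_shaders = intel_eus * shaders_per_eu
igp_cost = radeon_7970_price * intel_equiv_shaders / radeon_7970_shaders
print(f"~${igp_cost:.2f}")    # ~$7.03
```

So removing the IGP would save only single-digit dollars per chip against the cost of validating and stocking a second die variant.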
Re:But still slower then a "real" video card... (Score:5, Informative)
Look at the die layout [pcper.com] for Sandy Bridge, there's no Ivy Bridge layout yet but it's probably the same. You see that huge chunk called "graphics"? Me neither, it's somewhere in those small "misc io" bits. That's the only little thing of your CPU you aren't using with a dGPU.
I guess that's simply a chip without an integrated GPU. Here's a picture of a Sandy Bridge Core i7 with GPU [pcmag.com].
Re:But still slower then a "real" video card... (Score:5, Insightful)
Re: (Score:2)
Which ION based desktop PC? (Score:2)
Even an ION board blows the doors off a Voodoo from a decade ago.
I've noticed that Best Buy doesn't have the ION-based Aspire Revo PC anymore. Is the EeeBox any good for those who choose to buy rather than build? And I thought NVIDIA had got out of the Intel chipset business anyway [slashdot.org] due to patent licensing squabbles.
Re: (Score:2)
The former is an Intel Atom, running in the GTL mode that allows it to pair with core logic designed for P4s, using an Nvidia chipset instead of the bottom-barrel, GMA-950-hobbled Intel chipsets that were the cheap (but not especially power-efficient or high-performance) thing to pair with Atom parts.
With the later Atom revisions, Intel moved to a new chipset interface that they assert Nvidia has no right to interface with (they moved to a different, also Nvidia-disallowed, inte
Re: (Score:2)
Voodoo from a decade ago.
Here's a link to a benchmark [guru3d.com] of 3dfx's never-released Voodoo5-6000 (modified and overclocked).
3dfx Voodoo5 6000 3700A Gold SE @201 MHz ( 3dmark2001se ): 6341 marks
Looks like you're right. An integrated nVidia ION does indeed beat it in benchmarks: http://hwbot.org/hardware/videocard/nvidia_ion_integrated/ [hwbot.org]
Re: (Score:2)
Re: (Score:2)
Well, even for you, one of the advantages of the AMD Fusion platform is the ability to add in a discrete card and combine the power of the two (figuring the discrete card is another AMD and you're running under Windows). Though from what I've seen, the Fusion platform is capable enough for most 3D tasks unless you're a serious gamer who wants every bell and whistle at 1600x1200+.
It's also a different story when it comes to laptops. Fusion is incredibly useful in the laptop market where the entire lower end of the mar
Doesn't cost much (Score:3)
And there is a lot of use for them.
In terms of desktop chips, it is for low-end use. A lot of people just do web/e-mail/word processing with their systems, and an Intel HD graphics setup is perfect for them. It has plenty of power to do the shiny OS interface, accelerate video, and so on, and it comes with the system.
In terms of laptop chips, you really always want it on account of switchable graphics. If your laptop has switchable graphics it can use the integrated for low power consumption and only kick on the discrete wh
Re: (Score:2)
Not really. See the Athlon II X4 631. It costs quite a bit less than the A6-3650. Not much, but enough for GP to have a point. Why pay even a little for something that you're not going to use at all?
Re: (Score:2)
Llano is a different beast. AMD is trying to whack a bigass graphics card, relatively speaking, on there. Intel's HD graphics are tiny; that is why they are low performers. But sure, if he really wants Intel processors with no graphics he can have them. Intel's high-end CPUs don't feature them, namely the LGA 2011 and LGA 1366 CPUs. However, they cost more, so there you go.
Intel's mainstream CPUs have integrated GPUs. They are very reasonably priced, so just deal with it.
Re: (Score:2)
"Deal with it"? I thought I was doing that just fine. I'm not losing my shit here, suing Intel, or even registering IHATEINTELGPUS.com. All I did was point out that buying something you'll never use is a bad deal, regardless of price.
I think AMD had the right idea: bundle a good GPU, strong enough to beat discrete graphics of past generations, or it's pointless. And allow CrossFire in case the user wants to upgrade, so as not to waste any resources. As for Intel, if I already have a Radeon HD5570 or Geforce
Re: (Score:2)
Isn't an i5 overkill for an HTPC? Unless you often reencode your media, a Pentium G620 would probably work just as well.
Faster than low end Card. (Score:2)
http://www.anandtech.com/show/5626/ivy-bridge-preview-core-i7-3770k/11 [anandtech.com]
It was faster than low-end cheapo cards, which is mainly the point.
If you are putting in $200 cards, they are a long way off, but they essentially obsolete the need for a low-end card, which is a good thing.
And since all most people need is a low end card, this is sufficient for most people.
For desktop, internet, video, web games, older games and even new games at modest settings this is fine.
Re: (Score:2)
Frankly, I am sick and tired of these integrated GPUs. The theory is that it's a cost saver, but since I just put in a dedicated graphics card, it ends up being a cost with no benefit. Ah well.
According to this review, the AMD A8-3850 is 50% faster than the ~$50 Radeon 6450, but 50% slower than the ~$75 Radeon 6670. [pcper.com]
So sure, it's not better than a $200 dedicated card, but it's far better than what integrated cards used to be like. Integrated will never be faster than dedicated, but if I can get about 50% of the performance from integrated, then that's reasonable until I have an extra $200 for a "real" video card.
Re: (Score:2)
The A8 is way faster than Intel's offerings, works with a 6670 in crossfire so as not to diminish its value when upgrading and can be bought without a GPU as an Athlon II X4 641. I think he was referring to the fact that going Intel forces you to pay for a GPU that you'll have no use for if you have anything better than a Radeon HD5570, which is often the case, especially with processors more powerful than, say an i3.
Re: (Score:3)
Having a graphics card integrated into the CPU is only one benefit. The future benefit is using the GPU as a co-CPU. AMD already has plans for the IGP to understand context switching and respect protected memory.
Some people say "why, the IGP is slower than discrete". But no one considers that the IGP has 2-3 orders of magnitude less latency than a discrete GPU while being less than an order of magnitude slower.
Think of future multimedia where the CPU and IGP ping-pong data back and forth. I like to think of what kind of physic
Re: (Score:2)
Re: (Score:2)
Where they really shine is when combined with a mini-ITX mobo. Now if I got around to getting an inverter and a decent battery, I could bring my desktop computer with me as a moderately bulky laptop replacement.
Re: (Score:2)
Actually, there is one place where Intel's integrated GPU knocks the socks off all the competition... Video encoding!
Just look at the benchmarks and image examples from AnandTech's review [anandtech.com].
And that's the old Sandy Bridge. If we see a 30%-50% improvement over that again... I can see some uses for the integrated card :)
Re: (Score:2)
Ivy bridge vs ARM (Score:2)
Re:Ivy bridge vs ARM (Score:5, Insightful)
As soon as ARM tries to catch up to the performance of x86 (and x64) it no longer has the lower power consumption.
Llano (Score:2)
Like all tradeoffs (Score:2)
It depends on what you're doing with it! Duh... Seriously, that was a deeply stupid question.
Re: (Score:2)
It's a leading question in a Slashdot summary. It's hardly meant to be intelligent; I think the purpose is to drive discussion.
You see that somewhat often on news stories elsewhere, probably more at lower-quality establishments whose MO is to drum up controversy.
Yet another nail in AMD's coffin (Score:2)
1. AMD CPU bug
2. AMD divesting from its fab
3. Intel pulling even MORE ahead on performance and even lowering power usage at the same time!
Not to mention AMD's financial troubles and the fact they have a tendency to burn up.
Re: (Score:2)
Re: (Score:2)
New CPUs made for laptops in mind? (Score:2)
I'll File this under "Who Cares?" (Score:2)
Soooo, you built a CPU that barely runs faster than the previous generation CPU. However, the integrated graphics are 20-50% better.
Integrated graphics for anything other than the most basic tasks are horrible by several orders of magnitude. You can buy a $130 discrete video card that will deliver 1000% the graphics performance.
In real-world terms, this is like taking a game that runs at, say, 12 FPS and making it run at 14-18 FPS, which is still unplayable. More realistically you will take a game that is completely unplay
Re:GPU performance always wins (Score:5, Interesting)
Re: (Score:2)
Back in the bad old days, buying a bottom-barrel graphics card meant getting a single VGA out (and possibly one where the manufacturer had cheaped out so hard that analog quality issues were visible...) and lousy performance, and the PCI ones that you needed to run more than one, once your AGP slot filled, were always mysteriously overpriced (alas, this still seems to be
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re:GPU Performance (Score:5, Insightful)
Re: (Score:2)
The main reason that integrated GPU performance matters (aside from the fact that it is all the GPU you get in any too-cheap or too-skinny device that doesn't have a discrete option) is that it defines the (overwhelmingly common) baseline for what 'PC graphics' means. If that situation is uniformly awful, GPU-intensive stuff will continue to be fairly niche, which leads to a chicken-and-egg issue: if integrated graphics suck, the market for GPU-intensive stuff will be constrained, which will reduce the incentive to improve GPU performance, and so it goes...
And this is exactly how Intel wants it - any level playing field that emphasizes the GPU would put them at the mercy of ATI/NVidia, as Intel's previous efforts at a competitive GPU (see Larrabee) were pretty dismal.
However, with the rise of mobile devices (iOS, Android) and ARM (even Microsoft is targeting ARM for Win8), they are cooking their own goose. They can't keep fighting yesterday's battle - it will be a Pyrrhic victory. When Win8 releases with full ARM support and PC laptop manufacturers (likely follo
Re: (Score:3)
Re: (Score:3)
The AMD Llano chips are better than just competent for MMO games. A laptop Llano chip will run EQ1, EQ2, WoW, The Old Republic, etc. without any discrete GPU.
They even handle things like Fallout 3 and Fallout: New Vegas plus lots of mods without needing a discrete card.
The Llano chips also do GPGPU without crushing your battery. So if you are on battery power, you can do calculations hundreds of times faster than an Intel chip can, if you can use GPGPU, and not kill your battery doing it.
For me I run into more and
Re: (Score:3, Insightful)
Re: (Score:2)
^ This
Not all workloads need lots of throughput; some are very sensitive to latency. The IGP is a great trade-off between latency and throughput: tens of times faster at SIMD than a CPU could ever be, and 100-1000 times less latency than a discrete GPU.
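The latency-vs-throughput trade described above can be sketched with a toy model (all four numbers below are illustrative assumptions, not benchmarks of any real part):

```python
# Toy model: time_to_result = dispatch_latency + work / throughput
igp_latency_s  = 5e-6    # on-die IGP dispatch, assumed ~5 us
dgpu_latency_s = 50e-6   # round trip over PCIe to a discrete card, assumed ~50 us
igp_flops      = 0.5e12  # integrated throughput, assumed 0.5 TFLOPS
dgpu_flops     = 3.0e12  # discrete throughput, assumed 3 TFLOPS

# Crossover work size W where the discrete card starts winning:
#   igp_latency + W/igp_flops = dgpu_latency + W/dgpu_flops
crossover_flop = (dgpu_latency_s - igp_latency_s) / (1 / igp_flops - 1 / dgpu_flops)
print(f"Discrete only wins above ~{crossover_flop / 1e6:.0f} MFLOP per dispatch")
```

Below that work size, launch latency dominates and the "slower" IGP returns the answer first; above it, raw throughput wins. That is the whole argument for the IGP as a co-processor for small, frequent kernels.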