Intel Officially Lifts the Veil On Ivy Bridge
New submitter zackmerles writes "Tom's Hardware takes the newly released, top-of-the-line Ivy Bridge Core i7-3770K for a spin. All Core i7 Ivy Bridge CPUs come with Intel HD Graphics 4000, which, despite its DirectX 11 support, provides only a modest boost over Sandy Bridge's Intel HD Graphics 3000. However, the new architecture tops the charts for low power consumption, which should make the Ivy Bridge mobile offerings more desirable. In CPU performance, the new Ivy Bridge Core i7 is only marginally better than last generation's Core i7-2700K. Essentially, Ivy Bridge is not the fantastic follow-up to Sandy Bridge that many enthusiasts had hoped for, but an incremental improvement. In the end, those desktop users who decided to skip Sandy Bridge to hold out for Ivy Bridge probably shouldn't have. On the other hand, since Intel priced the new Core i7-3770K and Core i5-3570K the same as their Sandy Bridge counterparts, there is no reason to purchase the previous-generation chips."
Reader jjslash points out that coverage is available from all the usual suspects — pick your favorite: AnandTech, TechSpot, Hot Hardware, ExtremeTech, and Overclockers.
Let me get this straight... (Score:4, Insightful)
Re:Let me get this straight... (Score:5, Informative)
For people familiar with Intel's Tick-Tock cadence, this should not come as much of a surprise. Some people may have gotten caught up in marketing and expected more, but this is a "Tick," which brings a process shrink, power savings, and a modest performance increase. It delivers just about that, though perhaps on the softer side of things.
Sandy Bridge was a Tock - a BIG performance improvement. Haswell should be a Tock - a BIG performance improvement.
On the tick, they set more modest performance goals and focus on getting the process shrink right and tuning things up. On the tock, they should knock our socks off. So maybe Ivy Bridge is disappointing, but familiarity with their product development strategy helps to manage expectations.
Re:Let me get this straight... (Score:5, Funny)
Sounds a little like Microsoft's method.
Win 95 - Tock
Win 98 - Tick
Win Me - Sproing
Win 2000 - Tock
Win XP - Tock
XP SP1 - Tick
XP SP2 - Tock
XP SP3 - Tick
Vista - Tock sprooooing
Win 7 - Tick
Win 8 - Tock (maybe)
Re: (Score:2)
Terrible analogy, given that there's no analog in software for the alternation between manufacturing-process and microarchitecture design steps that Tick/Tock represents.
Re: (Score:2)
For people familiar with Intel's Tick-Tock cadence, this should not come as much of a surprise....
It comes as a great surprise. Normally, the process shrink delivers a clock boost, a power-efficiency improvement, or both. Normally there is also a speedup from additional superscalar hardware, but Intel explained that one away as "improved graphics." Well, where did the clock boost go, then? Power efficiency? OK, all missing in action. So the big unwritten subtext here is: Intel's 22nm node has got problems. Big problems. Trigate not working out so well?
Re: (Score:3)
Higher clock speeds use more power. Intel hasn't gone much above 3.3 GHz for years, with 3.7 GHz (I believe) being the top clock rate they have ever done. You expect them to change that now, when the focus is on higher efficiency, more cores, and lower power usage?
It doesn't represent a problem at all, and for the record, all of the benchmarks I've seen on HotHardware (linky [hothardware.com]) show it as being faster than Sandy Bridge, so there's that speedup you're complaining about.
They never said that there would be a clock boo
Re: (Score:2)
You twisted my words up entirely. Let me put it more simply: I am underwhelmed by the "tock" this time. As is every commentator with a clue. This process node appears to be a fail for Intel.
Re: (Score:3)
And, oh yes, I am underwhelmed by the "tick". On the face of it, Intel would have accomplished more with another go around at 28nm.
Now for the Intel fanboys in the thread, let's shed some authoritative light [pcper.com] on the subject.
Re: (Score:3)
Well, where did the clock boost go then?
Normally, also a speedup due to additional superscalar hardware, but Intel explained that one away as "improved graphics".
Your words, not mine. You wanted a clock boost (which they really haven't done for about six years now, and did not promise) and a speedup (which they delivered). You claimed that power efficiency is, to quote your post, "missing in action" (even though it isn't).
You seem to have assumed they reneged on all their promises despite the reality of the situation, for no other apparent reason than that you wanted something to rail about. Possibly check the sources before buying into the Slashdot spin BS.
Re:Let me get this straight... (Score:5, Interesting)
Power efficiency? OK, all missing in action.
Per some of the articles, power consumption is down nearly 20W between the two generations.
So, the big unwritten subtext here is: Intel's 22nm node has got problems. Big problems. Trigate not working out so well?
Far too early to tell. The fact that they introduced a brand-new, immensely complex process into manufacturing and it is working so well actually says a lot of good about how the trigate process is faring. It will, of course, need some tuning and massaging. But it is already performing as well as or slightly better than the previous generation on its first release, at lower power (at least per Anand).
IVB is also farking small, which as the process matures, should mean more parts and lower prices.
Re: (Score:2)
Re: (Score:3, Interesting)
Re:Let me get this straight... (Score:4, Informative)
Leakage was handled in several ways. Materials technology in semiconductor manufacturing (in particular CPU manufacturing) advanced a lot in the last decade and a half. It used to be that chip gates were all made with polysilicon. Eventually, as transistors shrank toward the nanoscale, work was done on new materials (so-called low-k and high-k materials). You probably heard names such as Black Diamond low-k or hafnium high-k (aka metal gates) along the way. These reduced the leakage issue. Instead of using aluminum for the wires, today we use copper to reduce power consumption, because copper is a better conductor. Then there is germanium doping to produce so-called "strained silicon," in which the silicon atoms are pushed further apart to improve electron mobility. With these material changes and a couple of design changes, today's processors are clocking higher than they were 10 years ago, even if not at the rate Intel used to predict back then. You have probably noticed by now that we are either hitting or close to hitting 4 GHz on CPUs, while not so long ago they were at 2 GHz or less with regular air cooling.
Today, people are doing chip stacking (e.g., on cellphones it is common to stack the DRAM and flash on top of the CPU module) to make the system more compact. Then there are people working on so-called vertical transistors and trigate transistors instead of regular planar transistors. Ivy Bridge, for example, is the first processor featuring trigate transistors, which is one reason for its lower power consumption and reduced leakage compared to Sandy Bridge. It has been more trouble than usual, but it seems everything is OK for the next two process shrinks to work in technological terms. Ultimately we will see the whole system on a chip; CPU/GPU integration is simply the first step, with DRAM probably following soon afterwards.
Re: (Score:3)
Re: (Score:3)
It offers a significant power reduction (~22%), plus a slight boost in IPC, the same clock rates, and a notable boost in IGP performance (~30%). For instance, the i7-3770K (77W TDP and HD 4000) vs. the i7-2700K (95W TDP and HD 3000) [anandtech.com]. Both are quad-core, 8-thread, 3.5GHz with a max turbo of 3.9GHz and 8MB of L3 cache. On the mobile CPU side, there is a new i7-3612QM: a 35W quad-core, 8-thread part with 6MB of L3 cache and HD 4000 graphics, compared to at least 45W TDP on all prior quad-core mobile i7 CPUs (with slower IGPs).
Re:Intel's Tick-Tock cadence (Score:3)
Okay, so I have been planning a long term computer strategy since 2006 when I got a decent first gen Quad Core.
So hopefully, if I can hold out that long, I should wait for the Tock, the Haswell architecture, while also waiting for the post-Win8-Metro consensus, which might be either a Tock for Microsoft or maybe even a paradigm explosion into Apple and/or Linux if, by some Mayan miracle, Microsoft implodes as a company. Or, if there is no "Windows 9", then I'll have to think about what to do then.
Re: (Score:2)
Smart way of doing things too (Score:2)
It avoids a problem many companies have of fighting with a new design and a new process at the same time, ending up with a product that gets delayed, has issues, etc. They either use a new design on a stable process, or an untested process with a proven design.
Sometimes a new process can give a moderate sized performance bump due to higher clock speeds, but that doesn't always happen.
It does reduce power consumption though and that is always nice.
Re: (Score:2)
Haswell should be a Tock - a BIG performance improvement.
Don't expect a raw BIG performance improvement. Not like P4 vs. Core 2. But Haswell DOES implement a breakthrough: Transactional Synchronization Extensions. It translates to "transactional memory for threads": the addition of a couple of new instructions (which may even be available on current CPUs, as they are already documented in the development manuals) to control thread context. Check http://software.intel.com/file/41604 [intel.com] for details.
Re:Let me get this straight... (Score:5, Informative)
Re: (Score:2)
Either way, the party don't stop.
Re:Let me get this straight... (Score:5, Funny)
I know this is how Intel defines it, but that's always seemed odd to me. Tick comes before tock, and a new architecture comes before the refinement of that architecture. It seems like the tick should be the new architecture and the tock the refinement.
It should come as no surprise given that Intel also got the order of bytes in a word backwards.
Re: (Score:3)
Consider a single byte. We call the low bit, which contributes 2^0, 'bit 0'. We call the high bit, which contributes 2^7, 'bit 7'. Why should we not use the same order for larger constructions?
In a little endian word, bit n is in the (n/8)th byte. Big endian is just weird.
Re:Let me get this straight... (Score:5, Informative)
No, Tocks (Merom, Nehalem, SandyB, Haswell) are new architectures; Ticks (Penryn, Westmere, IvyBridge, Haswell-successor-on-next-gen-XXnm-process) are updated architectures on a new process.
Re: (Score:2)
The successor to Haswell is Broadwell. 14nm process, if I remember correctly.
Re:Let me get this straight... (Score:4, Insightful)
Hopefully meta mods are paying attention to this one.
Re: (Score:2)
the whole analogy is retarded anyway, because it's invariably qualified with a sentence explaining that it's a die shrink or a new architecture. So why bother with the whole tick-tock business?
Because Intel wants it to sound like a clock ticking away the life of its competitors?
Re: (Score:2)
the whole analogy is retarded anyway, because it's invariably qualified with a sentence explaining that it's a die shrink or a new architecture. So why bother with the whole tick-tock business?
Because Intel wants it to sound like a clock ticking away the life of its competitors?
Does this mean that after a few more tick-tocks it will end with a great big kaboom?
Intel is probably hoping to hear just a muted whimper and a brief death rattle...
Re: (Score:2)
Care to link the benchmarks where the SB outperforms IB? The linked sources seem to agree that it is generally 10-15% faster, with the GPU being substantially faster.
Re: (Score:2)
under-performing the older "Sandy Bridge" is pretty damning to Intel.
In what category does Ivy Bridge 'under-perform' Sandy Bridge? It exceeds it in every category (especially performance per watt, which matters in Ultrabooks) and is behind NOWHERE. Not living up to someone's expectations about how much better it is than SB is not the same as under-performing.
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
Power consumption is important to some people... Gamers won't care about either...
Wrong. Power consumption determines cooling requirements, which determine fan noise. Power consumption also determines whether your long-suffering power supply needs yet another upgrade. We are already in circuit-breaker-blowing territory on a lot of gaming rigs.
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Insightful)
Re:Let me get this straight... (Score:5, Insightful)
A 50% GPU improvement [anandtech.com] over Sandy Bridge is VERY significant.
Not particularly. A 50% faster GPU will still suck for gamers and will be irrelevant to non-gamers.
No, not really (Score:2)
Intel's onboard GPUs are good enough to play games these days. No, you won't be cranking up the graphics detail, but they'll do the trick for many games. You might notice on the linked page that it is running Skyrim at medium detail at a playable framerate. That is a modern game title. The other games tested are similar. None of them run stellar, but they do 30fps at medium quality.
For non-gamers, well more and more is making use of the GPU. All that shiny UI stuff is done on the GPU, and all t
Re: (Score:2)
Sure, but low resolution and medium quality is of no interest to most gamers. Being able to play a game badly isn't really a big selling point when you can play it well for another $100.
Re: (Score:2)
Laptops are certainly a possible candidate, but do you really believe the average gaming laptop consumes 320W?
If I remember correctly, the power brick on my gaming laptop is only rated for 80W. I've no idea how the graphics performance compares to Ivy Bridge, but I've only recently had to stop running with high-quality graphics settings on most new games.
Re: (Score:2)
Not for gaming
"More disheartening is the fact that you have to dial down to 1280x720 and use detail settings that make five-year-old consoles look good."
Re: (Score:2)
Intel HD Graphics 2000 is sufficient to play StarCraft 2 at reasonable levels, so I think a 50% improvement over the already 50%-faster HD 3000 would be very welcome for gamers.
Re: (Score:2)
As a system builder, you have a very desktop-oriented view of the world. But look at what Intel is facing [inquisitr.com]. 110V power and a big enclosure for lots of discrete components are luxuries that a diminishing minority of "computers" will have going into the future.
Re: (Score:3)
Very significant for Intel, not for users. AMD is leading in that segment by far, still being the only relevant option in integrated graphics. And Trinity will only widen that gap. Sandy Bridge was already enough for desktop effects, video playback and legacy gaming, and Ivy is good for exactly the same things. The performance gains, impressive as they are (50% is a major leap) aren't that significant for any of those tasks nor improve serious usage by a lot. And, given that Llano has far superior performan
Re: (Score:2)
Not really true: the HD 4000 is fast enough to beat all Llanos except the top-end one. That, and the fact that it's possible to buy a faster Intel CPU *and* a discrete Radeon 6570 for less than the top-end Llano with its integrated 6550, makes Llano pretty much irrelevant.
Re: (Score:2)
That's not my point. My point is that the HD 4000 isn't significantly better than the HD 3000, because there's a monstrous leap in needed performance between "casual use" and "serious use." It's especially not enough for an i7, being too much of a bottleneck. If they make it to the Celerons without too much gimping, though, Intel will pack a reasonable punch in the low end.
Also, Llano won't be competing against Ivy, Trinity will. We will have to see how good of an iGPU it has, but seems like they have achieved the same 50% inc
Re: (Score:2)
Re: (Score:3)
A 50% GPU improvement [anandtech.com] over Sandy Bridge is VERY significant.
Compared to other Intel. But compared to AMD and NVidia it still sucks major donkey poo.
Re:Let me get this straight... (Score:4, Insightful)
And now that they can't squeeze any more performance out of the designs, they're working on decreasing energy consumption.
Is it really because they can't squeeze out more performance, or is it because decreased energy consumption is primarily what consumers are demanding these days?
I can't remember the last time I heard anyone complaining about their CPU being too slow (barring software problems), but people still wish their laptop/tablet had longer battery life.
Re: (Score:2)
Re: (Score:2)
An HTPC only needs to be on when you're using it.
Even MythTV supports the idea of putting a backend to sleep when it's not being used. Putting a frontend to sleep is pretty trivial. You hit the off switch.
Intel GPUs continue to be disappointing: something you either try to ignore or work around (by upgrading to AMD or Nvidia).
Re:Let me get this straight... (Score:2)
But a "tock" which I feel nobody has mentioned, and which is almost the sole reason I am patiently waiting for the next MBP, is 4K screen resolution. I feel that "retina display"-type DPI becomes possible with this feature. The next release of OS X shows development to utilize 4K potential.
Gaming may be poor performance since GPUs may have to get a substantial overhaul and nobody probably
Re: (Score:2)
Right, a company with no competition releasing better products for the same price... That's really unfair on consumers.
Re:Let me get this straight... (Score:4, Informative)
Yes, IB isn't a massive improvement on SB. But it's also worth stating what Intel did right:
Same price
Compatible with old sockets/motherboards
And who said every generation of processors had to be a significant improvement? Toyota puts out essentially the same car every year for a decade, with only minor, incremental improvements. There's no reason why you can't do the same for processors. The only downside is for people who like to brag about having the very-latest processor.
Personally, I'm going to be grabbing an Ivy Bridge laptop, if only because my old, reliable Core 2 laptop finally died. And I'll probably skip over Haswell, maybe Broadwell too, before upgrading again.
Long story short, if you've got a Sandy Bridge, you don't need to upgrade yet. If you've got a Nehalem and some spare cash, an upgrade may (or may not) be useful. If you're on something before that, IB is the chip to upgrade to.
PS: I'm not really a fanboy for either company (I've used both extensively; the Phenoms were great, and even my old Athlon 900 still sees service now and again), but AMD really doesn't have any attractive higher-end options. The Fusion processors look good compared to Intel's low-power options, though. I seriously considered getting a small Fusion laptop and then building a more powerful SB or IB desktop at home, but decided single-device was better.
Re: (Score:2)
Maybe, or you may just be seeing a mature product segment.
Look at airliners. They are not getting any faster, for the most part, but incrementally more efficient. Every technology reaches a point of maturity where improvements become incremental. The i7 right now is fast enough for the vast majority of users' needs; what will be interesting is to see how the i5 and i3 do.
Re:Let me get this straight... (Score:5, Informative)
Except the summary seems wrong by its own sources:
TechSpot
Since late last year Ivy Bridge seems to be the architecture everyone is waiting for. Although Intel is only anticipating a 10–15% processing performance bump when compared to Sandy Bridge,
Which is what they have been saying for about a year now, and what everyone expected. And for the record, 15% speed boost at the same clock with lower power usage is not insignificant, at all.
AnandTech:
Ivy Bridge is a tick+, as we've already established. ... The end result is a reasonable increase in CPU performance (for a tick), a big step in GPU performance, and a decrease in power consumption.
SemiAccurate:
For raw numbers, the top HD 4000 only has 16 shaders, but the underlying architecture is completely new. ... Intel is claiming about 2x the graphics performance from 33% more units. We don't think these claims are out of line for the general case.
Way to go, summary, you successfully implied that the chip was a flop when your sources indicate it hit its target, has substantially better GPU performance, and has a launch price in line with its current lineup. Slashdot truly is master of the art of spin.
Re: (Score:2)
Re: (Score:2)
So, Intel, a company with no real competition right now in the market, has produced a product that offers only a very slight performance boost, and relied on tons of marketing to drum up anticipation for this mediocre offering.
I'm guessing you weren't working in the industry back in the '90s, when we had Cyrix in the game too. Even back then, Intel would do this during the MHz race, if only to one-up the competition. Hell, it was even worse during the socket/slot fiasco.
Comment removed (Score:5, Interesting)
Re: (Score:2)
> What exactly do you expect, Captain High Expectations, a wormhole/laser based CPU that is 10000X faster at 1 millionth the power usage?
Something that would make me consider upgrading any of my machines would be a nice start.
Re: (Score:2)
Depending upon what you have NOW, this could very well be reason to upgrade. I'm on an i7 920 clocked to a bit over 4.2 GHz. Judging from the benchmarks I'm seeing on this new CPU, it's more than worth upgrading for me. Look at the overclocked benchmarks, particularly on video processing, which is what I do a great deal of. The new CPU is a good bit faster and, oh yeah, uses less power. Considering that some of the processing I do sees as low as 8FPS, I'd surely appreciate the kick in speed. Best part is being abl
Re: (Score:2)
forgot this -> http://www.overclockersclub.com/reviews/intel_corei7_3770k/4.htm [overclockersclub.com]
Note that they overclock this CPU as well as the existing one I mentioned. The deltas are impressive IMO.
Review Roundup (Score:5, Informative)
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
X-bit Labs [xbitlabs.com] review.
Not much new stuff in there compared to other reviews. I miss the days when they accurately measured CPU and GPU power consumption... Now it's just meaningless "total power".
Skipping Sandy Bridge (Score:2)
In the end, those desktop users who decided to skip Sandy Bridge to hold out for Ivy Bridge, probably shouldn't have.
Well, that rather depends on how many Ivy Bridge recalls there will be, doesn't it?
Waiting for Sandy Bridge price cut (Score:2)
When building a new PC I went for the Pentium G620. It's pretty much the lowest-end Sandy Bridge CPU in existence. I've been running this model for a while now.
With Ivy Bridge coming out, hopefully the prices on Sandy Bridge CPUs will come down. Maybe I could move to an i3 on the cheap, then. Or perhaps I'll even wait for Haswell; Sandy Bridge CPUs will probably be dirt cheap by then.
Re: (Score:2)
(I'm not compiling C++ or even Java most of the time.)
Telling us what you are doing is probably more likely to encourage helpful suggestions than telling us what you aren't doing...
Re: (Score:2)
Mostly web development (that's where the market is): including frameworks like Symfony, Drupal, etc. which have large codebases. Using Netbeans (Java-based) to step through and debug that code.
Running standard MySQL/Apache, etc.
Running virtual machines to simulate environments. Running MongoDB and Postfix for mail. Running unit tests. Running integration tests (like automated in-browser testing).
In addition, running standard productivity tools (email, office, etc.). Running light graphics tools (Inkscape, G
Re: (Score:3)
Don't worry about the CPU and spend your money on a big SSD & lots of RAM.
Re: (Score:2)
I was thinking I needed to have at least a Core i3 because it supports Intel Virtualization Extensions (VT-x). But then I read that VirtualBox doesn't really use hardware virtualization much. So even a Dual Core B940 should suffice, right?
Correct, but it is still a good idea to have hardware virtualization. VirtualBox does use hardware virtualization, but it is only required when virtualizing 64-bit guests. If you are running 32-bit guests, hardware virtualization can still be used and should allow for better performance. Hardware virtualization will also let you play around with KVM and other VM software. If you're getting a new machine, why limit yourself?
Re: (Score:2)
Speaking of 64 vs 32, anybody want to say anything about 64 or 32 being faster?
I.e., if you don't have >2GB or >4GB datasets, is 32-bit faster because it doesn't have the overhead? (I.e., 64-bit pushes more data around in every single machine instruction because the addresses specified are longer.)
Re: (Score:2)
(I.e., 64-bit pushes more data around in every single machine instruction because the addresses specified are longer.)
No, it's not. In fact, you're less likely to need an address in an instruction on an x64 CPU, because you have twice as many registers, so you're not perpetually pushing values out to RAM and reading them back in order to free up registers for other uses.
Re: (Score:2)
Oh, so if you're setting up a virtual machine (either for testing locally or for production on a VPS somewhere like Rackspace), even if it's only a 1GB machine, it should be 64-bit? I'm assuming 512MB should be 32-bit.
Re: (Score:2)
Power to compute ratio (Score:2)
Ivy Bridge feature summary (Score:3)
For those of us who need a reminder:
https://en.wikipedia.org/wiki/Ivy_Bridge_(microarchitecture) [wikipedia.org]
Yeah, it's Wikipedia. But it's short and to the point.
Re:HD 4000 (Score:5, Interesting)
The vast majority of users will use it. Intel integrated has been a good enough solution for most users for a long time now.
It would cost more to fab a chip without it, since they would be making so few. Would you pay extra for that?
This is a normal tick in the Intel tick-tock cycle. You will get that 50%-100% with Haswell.
Re: (Score:2)
This is cheaper for you; it would cost more to fab a separate unit. Motherboards are also cheaper now.
You have no real complaint.
Re: (Score:2)
If you care about CPU performance and you aren't already running a Sandy Bridge CPU, then yes, you should get an Ivy Bridge system.
Sandy Bridge is a very fast architecture, so if you already have one, then it makes more sense to wait for Haswell. Even users who care about speed are unlikely to appreciate the improvement made by Ivy Bridge over Sandy Bridge.
Meanwhile, nobody who cares about graphics on the desktop is going to want an on-die GPU, regardless of the CPU it's attached to. Ivy Bridge is really
Re: (Score:2)
Every technology that sees huge improvements over short timespans will begin to plateau eventually. There's only so much you can do before you start bumping into major constraints, such as the laws of physics.
Re: (Score:3)
Re: (Score:2)
Why? If you've waited this long, just hold out for Haswell. Intel has already confirmed they're changing the socket, so really that gives you the best odds of an upgrade path in the future should they decide to keep LGA1150 around for Skylake (the successor to Haswell). At the very least you're stuck with yet another obsoleted socket, but with a likely impressive performance upgrade over Ivy Bridge. By buying in now, you've already lost 1 year on current performance levels going forward... the best time
Re: (Score:2)
Re: (Score:2)
If you are expecting to upgrade and actually want to *plan* on retaining a motherboard, Intel is actually a pretty poor choice. Socket AM2 (first out in 2006) can still run the latest Bulldozer-based AMD processors. Meanwhile, LGA1155 was released in 2011 and is already hitting the end of the line with Ivy Bridge. Given that he waited this long to upgrade Conroe (my desktop is still Conroe), there is nearly zero chance that his next round of reasonable upgrade will be within the life of LGA1155 or LGA
Re: (Score:2)
Besides, whatever plan Intel currently has for Haswell, I wouldn't be surprised if they slow down a bit. When they only face 'real' competition from themselves, they tend to get a bit more sluggish and unimpressive with their product-line enhancements. Sure, they'll be hammering on Medfield successors to try to make inroads into the ARM-dominated mobile space, but the desktop and server lines don't have a lot of pressure forcing them to be aggressive in product development right now.
Re: (Score:2)
Re: (Score:3)
Well, any machine at retail will. Retail margins are just too slim for the extra cost.
Graphics CAN'T be removed, because they are built into the CPU-chipset combo... and nobody else is licensed to make chipsets. Intel is forcing OEMs to go back to "external" chips on the PCI-E bus... which is 100% more circuitry and super-complex firmware just to get back to what you got from Nvidia. That adds $100-$200 to the wholesale price.
Things like MacBook Air are forced to choose battery/size or performance... Which is why Apple
Re: (Score:2)
GPUs are no longer "graphics" cards, but co-processors. Not much takes advantage of them yet, but in the near future software will.
Microsoft, AMD, Intel, nVidia, ARM, IBM, and some others are all working on taking advantage of the massive number-crunching power of modern GPUs. They want to make it easy for programmers to use them and let the OS interface in a simple manner.
These frameworks mostly just look at throughput vs latency. They plan to make it easy to not ju
Re:HD 4000 (Score:4, Informative)
Other than the obvious (Score:2)
"For the budget conscious consumer, there is no reason to get the budget nVidia or Radeon."
Except for the fact that an AMD Llano A8 will blow an Ivy Bridge out of the water when it comes to games. Plenty of consumers who buy $500 budget PCs never open the case.
Re: (Score:2)
Odd. You seem to be uninformed or lying. Ivy Bridge beats the A8 in games, go look at the benchmarks.
What now?
Re: (Score:3)
http://www.anandtech.com/show/5626/ivy-bridge-preview-core-i7-3770k/12 [anandtech.com]
You mean these benchmarks???
"What Ivy Bridge does not appear to do is catch up to AMD's A8-series Llano APU. It narrows the gap, but for systems whose primary purpose is gaming AMD will still likely hold a significant advantage with Trinity. The fact that Ivy Bridge hasn't progressed enough to challenge AMD on the GPU side is good news. The last thing we need is for a single company to dominate on both fronts."
Re: (Score:2)
In that set of benchmarks, HD4000 does indeed edge out A8 in three of the tests, one by 40%, one by 25% and another by 5%. In the rest, AMD wins by 67%, 12%, 23%, and 34%.
So yes, the as-yet-unavailable HD 4000-based product starts making the results ambiguous compared to AMD's contemporary of the HD 3000. However, AMD still comes out on top more often than not, and at larger average gaps. Advance the clock to when Ivy Bridge will be realistically available and the FM2-based AMD offerings will also be available,
Re: (Score:2)
I take it you used the classic "common sense says it must be true, so it is" reasoning there.
Have you actually looked at benchmarks?
Re: (Score:2)
Re: (Score:2)
Since when is a 6550 a budget GPU? The 6350/6450/7350/7450 are easily dusted by the Intel HD 4000. If you want to move to mid-range you'll get more performance for the money, but a consumer is better off not spending the extra $50 and sticking with the HD 4000.
Since the 6570 is just over $60, I'd say that the 6550 is a budget GPU.
Re: (Score:3)
Re: (Score:2)