Intel Details Nehalem CPU and Larrabee GPU 166
Vigile writes "Intel previewed the information set to be released at IDF next month including details on a wide array of technology for server, workstation, desktop and graphics chips. The upcoming Tukwila chip will replace the current Itanium lineup with about twice the performance at a cost of 2 billion transistors and Dunnington is a hexa-core processor using existing Core 2 architecture. Details of Nehalem, Intel's next desktop CPU core that includes an integrated memory controller, show a return of HyperThreading-like SMT, a new SSE 4.2 extension and modular design that features optional integrated graphics on the CPU as well. Could Intel beat AMD in its own "Fusion" plans? Finally, Larrabee, the GPU technology Intel is building, was verified to support OpenGL and DirectX upon release and Intel provided information on a new extension called Advanced Vector Extension (AVX) for SSE that would improve graphics performance on the many-core architecture."
Nehalem? Larrabee? (Score:2, Interesting)
Re:Intel Vs. AMD? (Score:2, Interesting)
AMD's 64-bit processing is better. Depending on the type of processing you're doing, that could mean a lot.
We all know what a debacle Intel's integrated graphics were in the past. I'm not sure they should be using that as a marketing point.
Since AMD acquired ATI, I would assume AMD's integrated graphics would be far superior.
NVIDIA's stock price hasn't been doing so well in the last couple of months. Could this mean a return of integrated graphics? I'd bet my money on AMD, which already owns one of the big players.
Re:Intel Vs. AMD? (Score:3, Interesting)
Video on the CPU may be faster, but you're still using the same system RAM, and that's not as fast as the RAM on a video card, which the GPU has all to itself.
Re:Nehalem? Larrabee? (Score:4, Interesting)
Heck, I remember when "Itanium" came out and people laughed...
But before they laughed, I remember a bunch of companies folded up their project tents (Sun, MIPS, the remains of DEC/Alpha). I'm not so sure companies will do the same this time around... Not saying Intel doesn't have its ducks in a row this time, but certainly, the past is no indication of the future...
Re:Nehalem? Larrabee? (Score:3, Interesting)
Re:Intel Vs. AMD? (Score:4, Interesting)
HyperThreading (Score:3, Interesting)
Gosh, I hope it's more effective, because in my implementations I actually saw a slowdown instead of a speedup. Even then, I'm generally not happy with hyper-threading. The OS and applications simply don't see the difference between two real cores and a hyper-threaded core. If I run another thread on a hyper-threaded core, I'll slow down the thread already running there. That might not always be what you want to see happening. IMHO, the advantage should be over 10-20% for a desktop processor to even consider hyperthreading, and even then I want back that BIOS option that disables hyperthreading again.
I've checked, and both the Linux and Vista kernels support a large number of cores, so that shouldn't be a problem.
Does anyone have any information on how well the multi-threading works on the multi-core Sun niagara based processors?
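One way to see the problem the parent describes is that on x86 Linux, SMT siblings and real cores show up differently in /proc/cpuinfo: the `siblings` field counts logical CPUs per package, while `cpu cores` counts physical cores. A minimal sketch of telling them apart (field names assume the standard x86 Linux layout):

```python
# Sketch: distinguish physical cores from SMT logical CPUs by parsing
# /proc/cpuinfo-style text (x86 Linux field layout assumed).

def smt_ratio(cpuinfo_text):
    """Return (physical_cores, logical_cpus) for the first package."""
    siblings = cores = None
    for line in cpuinfo_text.splitlines():
        if ":" not in line:
            continue
        key, _, value = line.partition(":")
        key = key.strip()
        if key == "siblings" and siblings is None:
            siblings = int(value)
        elif key == "cpu cores" and cores is None:
            cores = int(value)
    return cores, siblings

# Example: a HyperThreading CPU reports twice as many siblings as cores.
sample = """processor : 0
siblings : 4
cpu cores : 2
"""
print(smt_ratio(sample))  # -> (2, 4): 2 real cores, 4 logical CPUs
```

A scheduler (or a pinning script) that treats all four logical CPUs as equal is exactly the mistake described above: two threads landing on siblings of the same core will contend for its execution units.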
Re:Gflargen and Blackeblae (Score:3, Interesting)
Ummmmm, no (Score:5, Interesting)
However, this will be much faster since it fixes a major problem with integrated graphics: Shared RAM. All integrated Intel chipsets nab system RAM to work. Makes sense, this keeps costs down and that is the whole idea behind them. The problem is it is slow. System RAM is much slower than video RAM. As an example, high end systems might have a theoretical max RAM bandwidth of 10GB/sec if they have the latest DDR3. In reality, it is going to be more along the lines of 5GB/sec in systems that have integrated graphics. A high end graphics card can have 10 TIMES that. The 8800 Ultra has a theoretical bandwidth over 100GB/sec.
Well, in addition to the RAM not being as fast, the GPU has to fight with the CPU for access to it. All in all, it means that RAM access is just not fast for the GPU. That is a major limiting factor in modern graphics. Pushing all those pixels with multiple passes of textures takes some serious memory bandwidth. No problem for a discrete card, of course; it'll have its own RAM just like any other.
In addition to that, it looks like they are putting some real beefy processing power on this thing.
As such, I expect this will perform quite well. Will it do as well as the offerings from nVidia or ATi? Who knows? But this clearly isn't just an integrated chip on a board.
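The "serious memory bandwidth" point is easy to put numbers on with back-of-envelope arithmetic. The figures below (resolution, frame rate, pass count, texel size) are illustrative assumptions, not from the article:

```python
# Back-of-envelope sketch: rough memory bandwidth needed just for texture
# reads per frame. All input numbers are made-up but plausible examples.

def texture_bandwidth_gb_s(width, height, passes, bytes_per_texel, fps):
    """Bytes read per second for texturing, in GB/s (1 GB = 1e9 bytes)."""
    bytes_per_frame = width * height * passes * bytes_per_texel
    return bytes_per_frame * fps / 1e9

# 1600x1200 at 60 fps, 4 texture passes, 4 bytes per texel:
print(texture_bandwidth_gb_s(1600, 1200, 4, 4, 60))  # -> 1.8432 GB/s
```

And that's texture reads alone: add framebuffer and Z-buffer traffic, overdraw, and CPU contention on a shared bus, and a ~5GB/sec effective system-RAM budget gets eaten quickly, which is why discrete cards ship with dedicated RAM an order of magnitude faster.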
Re:Nehalem? Larrabee? (Score:5, Interesting)
Anti-Trust Question... (Score:2, Interesting)
I can see this going two ways:
1) Intel forces AMD outta business. AMD ends up liquidating its stock/technology to foreign companies (read: outside USA).
2) AMD Brings an Anti-Trust case against Intel for 'unfair practices' or some crap (IANAL).
However, there is ALWAYS the possibility that Intel pulls another Pentium Bug [wikipedia.org]. Remember the mid-to-late '90s? (God, how _could_ we _forget_ the '90s!?) Either way, AMD needs to diversify its R&D and/or look for more lucrative business opportunities (whatever that means), or (the winner, IMHO) work with IBM on this power-saving crusade.
What do you think, Slashdot crowd?
Re:HyperThreading (Score:3, Interesting)
(Since a compute element is designed for one specific task, you end up with a maximum number of supportable virtual cores equal to the number of pools times the number of elements in each pool. The minimum number of cores would be determined by the maximum number of threads generated by any instruction supported. If the CPU was really smart, it could "hotplug" CPUs to increase and reduce the number of cores that appear to the operating system, so that if there's a heavy, sustained use of the threading, the CPU doesn't try to overcommit resources.)
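The parent's arithmetic can be sketched in a few lines. Everything here (the function, its parameters, the example numbers) is just an illustration of the scheme being proposed, not any real CPU's specification:

```python
# Sketch of the pooled-compute-element idea above: the maximum number of
# virtual cores is pools * elements_per_pool, and the minimum is set by
# the widest instruction's thread demand. Illustrative only.

def virtual_core_range(pools, elements_per_pool, max_threads_per_insn):
    """Return (min_cores, max_cores) such a CPU could expose to the OS."""
    max_cores = pools * elements_per_pool
    # The widest instruction must fit, so the OS can never be shown
    # fewer cores' worth of resources than its thread demand.
    min_cores = max_threads_per_insn
    return min_cores, max_cores

# 4 pools of 8 elements, widest instruction spawning 2 threads:
print(virtual_core_range(4, 8, 2))  # -> (2, 32)
```

The "hotplug" idea would then amount to the CPU picking a value inside that range based on sustained thread load, rather than always advertising the maximum.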
Re:Intel Vs. AMD? (Score:3, Interesting)
Instead of waiting for some real competition, Intel released Penryn more or less right on schedule when the only competition they had was their own 65nm processors. Of course the quad cores are only just being released now, but they are still releasing them way before AMD has anything to really compete with them. People make all kinds of cynical statements about business methods without even considering corporate culture. Has it ever occurred to anyone that Intel simply may not believe in only releasing new tech when they absolutely have no choice due to competition?
I'm not saying competition is not a good thing, but I don't think AMD is presenting much competition to Intel at the moment. AMD is in big trouble, and Intel is well aware of that fact. I just don't think that it is competition that is driving Intel forward. Competition may affect their pricing, but I think Intel would keep right on with their two-year tick-tock cycles and process shrinks even if AMD folded tomorrow.
Re:Intel Vs. AMD? (Score:3, Interesting)
Also, console games don't have mods, user-made maps, and other add-ons, and they don't have that many free games.
You don't have many MMORPGs on them, and the Xbox is pay-to-play online versus free on the PC and PS3.
The Giant is awakened (Score:3, Interesting)
Re:dual monitor support? (Score:2, Interesting)
There is a standard called ADD+ that allows you to connect the transmitters via an AGP or PCIe card. However, given that drivers are validated with specific transmitters, it's unusual to find ADD+ cards outside of driver development groups or validation teams.
However, if you can find an ADD+ card with a pair of common transmitters such as the Chrontel CH7307, then you can get your dual DVI outputs.
(Not speaking as an official representative of Intel Corporation)
Re:More Integrated Garbage? (Score:3, Interesting)
Re:Gflargen and Blackeblae (Score:3, Interesting)
They weren't even used in the Arab world until modern times -
http://en.wikipedia.org/wiki/History_of_the_Hindu-Arabic_numeral_system [wikipedia.org]