Intel Details Nehalem CPU and Larrabee GPU
Vigile writes "Intel previewed the information set to be released at IDF next month, including details on a wide array of technology for server, workstation, desktop and graphics chips. The upcoming Tukwila chip will replace the current Itanium lineup, offering about twice the performance at a cost of 2 billion transistors, and Dunnington is a hexa-core processor using the existing Core 2 architecture. Details of Nehalem, Intel's next desktop CPU core that includes an integrated memory controller, show a return of HyperThreading-like SMT, a new SSE 4.2 extension, and a modular design that features optional integrated graphics on the CPU as well. Could Intel beat AMD at its own "Fusion" game? Finally, Larrabee, the GPU technology Intel is building, was confirmed to support OpenGL and DirectX at release, and Intel provided information on a new SSE extension called Advanced Vector Extensions (AVX) that should improve graphics performance on the many-core architecture."
Nehalem? Larrabee? (Score:2, Interesting)
Re:Nehalem? Larrabee? (Score:5, Informative)
Intel has a rich collection of silly code names.
Re:Nehalem? Larrabee? (Score:4, Funny)
Re: (Score:2)
Re: (Score:2)
Re:Nehalem? Larrabee? (Score:5, Informative)
Re: (Score:3, Funny)
How long until they release a chip named after Intercourse, PA?
Or my favourite, Wankers Corner, OR.
AMD will join the fun and look to France for inspirational place names, such as Condom, Tampon and Herpes.
Not to be outdone, poor old Amiga Inc finally release a new computer named after the village of Shittington in the UK, with an update scheduled for 2025 named after Mount Buggery in Australia.
Re: (Score:2)
Pronunciation (Score:2)
Re: (Score:2)
Ok, not that last one.
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
http://www.wouldyoubelieve.com/larabee.html [wouldyoubelieve.com]
Hymie was the robot
http://www.wouldyoubelieve.com/hymie.html [wouldyoubelieve.com]
Re: (Score:2, Informative)
Re:Nehalem? Larrabee? (Score:4, Interesting)
Heck, I remember when "Itanium" came out and people laughed...
But before they laughed, I remember a bunch of companies folded up their project tents (Sun, MIPS, the remains of DEC/Alpha). I'm not so sure companies will do the same this time around... I'm not saying Intel doesn't have its ducks in a row this time, but certainly, the past is no indication of the future...
Re:Nehalem? Larrabee? (Score:5, Interesting)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
The x86 is still king. It may be as ugly as a pig with a rocket strapped on, but it still flies faster than those elegant RISC eagles.
While IBM's POWER stuff might be faster, it sure doesn't look like a RISC anymore - definitely not a very "reduced" instruction set.
Re: (Score:3, Interesting)
Re: (Score:1, Informative)
Someone even wrote a song about the place: http://www.google.com/search?q=everclear+Nehalem [google.com]
Intel Vs. AMD? (Score:4, Insightful)
Re: (Score:1)
Hell yeah! Without AMD, we'd all be on x86 technology. Although, there is/was Motorola. Wouldn't it be nice to run multiple time line(s) scenarios?
Re: (Score:1)
This could give AMD an advantage beyond quad cores; however, I'm sure Intel is hard at work to make sure it stays in the lead.
Re: (Score:2, Interesting)
AMD's 64-bit processing is better. Depending on the type of processing you're doing, that could mean a lot.
We all know what a debacle Intel's integrated graphics were in the past. I'm not sure they should be using that as a marketing point.
Since AMD acquired ATI, I would assume AMD's integrated graphics would be far superior.
NVIDIA's stock price hasn't been doing so well in the last couple of months. Could this mean a return of integrated graphics? I'd bet my m
Re: (Score:2)
Your post makes me think that Intel will attempt a take-over of Nvidia, hostile or otherwise. But I have no knowledge in this area.
Re:Intel Vs. AMD? (Score:4, Interesting)
Re: (Score:2)
To your general point- I fully agree, what with my 8800 equipped Ubuntu box 'n all.
Re:Intel Vs. AMD? (Score:4, Informative)
Re: (Score:3, Interesting)
Video on the CPU may be faster, but you are still using the same system RAM, and that is not as fast as the RAM on a video card, which has it all to itself.
Re: (Score:3, Insightful)
Nobody could argue against that, but the two approaches solve different problems currently. If the drift is towards an all in one solution, then the drift is towards less capable, but cheaper tech. Most gamers are console gamers, perhaps the chip makers are coming to the conclusion that dedicated GPUs for the PC are a blind alley (a shame IMHO).
Re: (Score:3, Interesting)
Also, console games don't have mods / user maps and other add-ons; they also don't have
Re: (Score:2)
True enough about the restricted nature of console gaming, but don't expect that to inform 'Big Silicon' in its future decisions. Money is their only friend.
Re: (Score:2)
The reason Intel has so much of
Re: (Score:2)
Re: (Score:3, Interesting)
without AMD we wouldn't be seeing these releases.
Actually this seems a bit disingenuous to me. Intel released Penryn way before they had to. Intel (the hare) was so far ahead of AMD (the tortoise) with the 65nm Core 2 that they could have sat back and relaxed for a while, saving R&D costs while waiting for AMD to catch up at least a little. I mean look at Nvidia for a perfect counterexample. Most people believe that they already have a next gen GPU ready but that they are sitting on it until they have someone to compete with besides themselves. To a
Re: (Score:3, Insightful)
Re: (Score:2)
There
Re: (Score:2)
Though, 45nm processors are currently in short supply. They're usually sold out, and are marked up considerably.
http://techreport.co [techreport.com]
Re: (Score:2)
Indeed. And furthermore, for a very long time intel was avoiding actual innovation and instead just arbitrarily segmenting the market. For example, back in the 1990s, they were selling 486SX chips. To make a 486SX, intel had to manufacture a 486DX and then go through the extra step of disabling the math coprocessor. In spite of the fact that they took that extra step - thus necessarily increasing the manufacturing cost,
Re: (Score:2)
Re: (Score:2)
No, without a demand for these advances, competition would exist only to lower prices, but because this demand exists, the competition also includes innovation. If AMD weren't in the running, some other company or companies would be. Hurray for the market being properly represented.
More Integrated Garbage? (Score:5, Insightful)
Re: (Score:1)
Re: (Score:1)
Re:More Integrated Garbage? (Score:5, Insightful)
"It was clear from Gelsinger's public statements at IDF and from Intel's prior closed-door presentations that the company intends to see the Larrabee architecture find uses in the supercomputing market, but it wasn't so clear that this new many-core architecture would ever see the light of day as an enthusiast GPU. This lack of clarity prompted me to speculate that Larrabee might never yield a GPU product, and others went so far as to report "Larrabee is GPGPU-only" as fact.
Subsequent to my IDF coverage, however, I was contacted by a few people who have more intimate knowledge of the project than I. These folks assured me that Intel definitely intends to release a straight-up enthusiast GPU part based on the Larrabee architecture. So while Intel won't publicly talk about any actual products that will arise from the project, it's clear that a GPU aimed at real-time 3D rendering for games will be among the first public fruits of Larrabee, with non-graphics products following later.
As for what type of GPU Larrabee will be, it's probably going to have important similarities to what we're seeing out of NVIDIA with the G80. Contrary to what's implied in this Inquirer article, GPU-accelerated raster graphics are here to stay for the foreseeable future, and they won't be replaced by real-time ray-tracing engines. Actually, it's worthwhile to take a moment to look at this issue in more detail."
Shamelessly ripped from:
http://arstechnica.com/articles/paedia/hardware/clearing-up-the-confusion-over-intels-larrabee.ars/2 [arstechnica.com]
Re: (Score:2)
Re: (Score:3, Interesting)
Re: (Score:2)
Carmack has had a good track record of figuring out nifty tricks that current popular tech, or at least the popular near-cutting-edge tech, can achieve. I remember just barely managing to play the first Doom on a 386SX; it sure looked a lot better than the other stuff out there.
He used tricks for Commander Keen, Wolfenstein 3D, Doom (a 2D game with some 3D), and so on.
Carmack's engines tend to do pretty decent
Re: (Score:2)
Yes, if AVX includes 256-bit = 4*64-bit FPU calculations with reasonable performance, I can imagine many computer scientists drooling over this..
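Since the parent is talking about AVX's 256-bit registers holding four 64-bit doubles, here is a minimal sketch of what 4-wide double-precision math looks like with the <immintrin.h> intrinsics that later shipped for AVX. The array names and values are made up for illustration, and it needs a compiler with AVX support (e.g. gcc -mavx):

    /* Minimal sketch: four 64-bit multiplies per instruction using AVX intrinsics. */
    #include <immintrin.h>
    #include <stdio.h>

    int main(void)
    {
        double a[4] = {1.0, 2.0, 3.0, 4.0};
        double b[4] = {10.0, 20.0, 30.0, 40.0};
        double c[4];

        __m256d va = _mm256_loadu_pd(a);    /* load four doubles into one 256-bit register */
        __m256d vb = _mm256_loadu_pd(b);
        __m256d vc = _mm256_mul_pd(va, vb); /* four double-precision multiplies at once */
        _mm256_storeu_pd(c, vc);

        printf("%f %f %f %f\n", c[0], c[1], c[2], c[3]);
        return 0;
    }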
Ummmmm, no (Score:5, Interesting)
However, this will be much faster since it fixes a major problem with integrated graphics: shared RAM. All integrated Intel chipsets nab system RAM to work. Makes sense; this keeps costs down, and that is the whole idea behind them. The problem is that it is slow. System RAM is much slower than video RAM. As an example, high end systems might have a theoretical max RAM bandwidth of 10GB/sec if they have the latest DDR3. In reality, it is going to be more along the lines of 5GB/sec in systems that have integrated graphics. A high end graphics card can have 10 TIMES that. The 8800 Ultra has a theoretical bandwidth over 100GB/sec.
Well, in addition to the RAM not being as fast, the GPU has to fight with the CPU for access to it. All in all, it means that RAM access is just not fast for the GPU. That is a major limiting factor in modern graphics. Pushing all those pixels with multiple passes of textures takes some serious memory bandwidth. No problem for a discrete card, of course; it'll have its own RAM just like any other.
In addition to that, it looks like they are putting some real beefy processing power on this thing.
As such I expect this will perform quite well. Will it do as well as the offerings from nVidia or ATi? Who knows? But this clearly isn't just an integrated chip on a board.
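A back-of-the-envelope sketch of the bandwidth gap described above; the figures are illustrative assumptions (dual-channel DDR2-800, i.e. a 128-bit effective bus at 800 MT/s, versus an 8800 Ultra-class card with a 384-bit bus at roughly 2.16 GT/s), and theoretical bandwidth is simply bus width in bytes times transfer rate:

    /* Theoretical memory bandwidth: bytes per transfer * transfers per second.
       The bus widths and transfer rates below are illustrative assumptions. */
    #include <stdio.h>

    static double bandwidth_gbps(int bus_bits, double transfers_per_sec)
    {
        return (bus_bits / 8.0) * transfers_per_sec / 1e9; /* GB/s */
    }

    int main(void)
    {
        double system_ram = bandwidth_gbps(128, 800e6);  /* dual-channel DDR2-800 */
        double video_ram  = bandwidth_gbps(384, 2.16e9); /* 8800 Ultra-class card */

        printf("System RAM: %.1f GB/s\n", system_ram);   /* ~12.8 GB/s */
        printf("Video RAM:  %.1f GB/s\n", video_ram);    /* ~103.7 GB/s, roughly 8x more */
        return 0;
    }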
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
HyperThreading (Score:3, Interesting)
Gosh, I hope it is more effective, because in my implementations I actually saw a slowdown instead of an advantage. Even then, I'm generally not happy with hyper-threading. The OS and applications simply don't see the difference between two real cores and a hyperthreading core. If I run another thread on a hyperthreading core, I'll slow down the other thread. This might not always be what you want to see happening. IMHO, the advantage should be over 10-20% for a desktop processor to even consider hyperthreading, and even then I want that BIOS option back that disables hyperthreading.
I've checked, and both the Linux and Vista kernels support a large number of cores, so that should not be a problem.
Does anyone have any information on how well the multi-threading works on the multi-core Sun Niagara-based processors?
Re: (Score:3, Interesting)
Re: (Score:2)
The OS and applications simply don't see the difference between two real cores and a hyperthreading core. If I run another thread on a hyperthreading core, I'll slow down the other thread.
Wrong. The Linux scheduler can distinguish between two real cores and a hyperthreading core (i.e. it prefers to run the threads on independent cores). The Linux scheduler can also take core-to-socket mapping into consideration (it prefers to run threads on cores in a single socket in order to allow the other sockets to lower their power consumption).
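For what it's worth, the kernel exposes the topology the scheduler uses, so you can see which logical CPUs are HyperThreading siblings of the same physical core. A rough sketch, assuming the standard Linux sysfs layout (error handling kept minimal; exact file contents vary by kernel):

    /* Rough sketch: print which logical CPUs share a physical core (SMT/HT siblings).
       Assumes the standard Linux sysfs topology files; stops at the first missing CPU. */
    #include <stdio.h>

    int main(void)
    {
        char path[128], buf[64];

        for (int cpu = 0; ; cpu++) {
            snprintf(path, sizeof path,
                     "/sys/devices/system/cpu/cpu%d/topology/thread_siblings_list", cpu);
            FILE *f = fopen(path, "r");
            if (!f)
                break;                  /* no more logical CPUs */
            if (fgets(buf, sizeof buf, f))
                printf("cpu%d shares a core with: %s", cpu, buf); /* e.g. "0,4" with HT on */
            fclose(f);
        }
        return 0;
    }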
Re: (Score:2)
Re: (Score:2)
Good and bad (Score:2)
Why the brick wall? (Score:2)
Re: (Score:1)
I can't even find the clock speed in that article, which means we're STILL probably stuck at 3.5 GHz +/- 0.5 GHz, where we've been stuck for what, three, four years? What the hell happened? If we're still shrinking components, why are we not seeing clock speed increases?
Intel's current designs are basically focusing on what I'd consider horizontal scaling instead of vertical. That is, they are increasing the number of cores, which run at a lower frequency, and that makes up for not raising the clock speed. In addition, they run cooler. You aren't losing ground. If the Core 2 Duos weren't more efficient and didn't provide better performance, then Intel wouldn't be beating AMD's ass with them. You now have up to 4 cores in a single package, each running at 2-3GHz (not sure the exact number for t
Re:Why the brick wall? (Score:5, Informative)
For example, let's start with a single-core Core 2 @ 2GHz. Let's say it uses 10 W (not sure what the actual number is).
Running it at twice the frequency (which in practice also means raising the voltage roughly in proportion, so power scales roughly with the cube of frequency) results in a (2^3) = 8X power increase. So we can either have a single-core 4 GHz Core 2 at 80W, or we can have a quad-core 2GHz Core 2 at 40W. Which one makes more sense?
Re: (Score:2)
We still don't have much software that can really take advantage of multiple cores. A single core running at 4GHz is going to be MUCH faster on almost every benchmark than 2 cores running at 2GHz each.
But, it doesn't matter. Multi-cores are the future, and we need to figure out a way to take advantage of them.
Re: (Score:2)
The problem with this is that to achieve twice the frequency (for the same cpu), you likely need to increase the voltage (increasing voltage increases power at a rate of voltage^2), and there is only so much you can increase the voltage... If you'd design the cpu to reach h
Re: (Score:2)
Instead, what is needed when power becomes excessive is simply a shift to a newer technology--and there are many options, so there's no fundamental issue, just a monetary one, and so any possible profits will be milked from the slow silicon substrate for as long as possible, even if progress is slowed down because of it.
Re: (Score:2)
power generally increases at a rate of frequency^3
No, power is linear in clock frequency, and quadratic in voltage. References are easy to find on the Web; here's one [poly.edu].
Re: (Score:2)
DYNAMIC POWER = FREQUENCY * CAPACITIVE LOAD * VOLTAGE^2
The above ignores leakage, but as another poster mentioned, that is not related to frequency. Leakage actually scales LINEARLY with the device voltage.
Adding more cores DOES increase power linearly, but the frequency^3 comment is completely off-base. The worst offender is actually voltage, which adds quadratic dynamic power and linear leakage power. As you raise the frequency, power consumption can increase even more
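A tiny sketch of the dynamic-power formula above (P = C * V^2 * f), showing where the two claims diverge: doubling frequency at constant voltage doubles power, while doubling frequency and (as an exaggerated rule of thumb) scaling voltage with it lands on the 8x figure quoted earlier. The capacitance and voltage numbers are arbitrary placeholders:

    /* Dynamic power sketch: P = C * V^2 * f. The constants are arbitrary placeholders. */
    #include <stdio.h>

    static double dyn_power(double cap_farads, double volts, double freq_hz)
    {
        return cap_farads * volts * volts * freq_hz;
    }

    int main(void)
    {
        double C = 1e-9;                              /* effective switched capacitance */
        double base      = dyn_power(C, 1.2, 2e9);    /* baseline: 1.2 V @ 2 GHz */
        double fast      = dyn_power(C, 1.2, 4e9);    /* 2x frequency, same voltage: 2x power */
        double fast_hi_v = dyn_power(C, 2.4, 4e9);    /* 2x frequency and 2x voltage: 8x power */

        printf("baseline %.2f W, 2x freq %.2f W, 2x freq + 2x volts %.2f W\n",
               base, fast, fast_hi_v);
        return 0;
    }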
Re:Why the brick wall? (Score:5, Informative)
2) We have also hit the "Memory Wall": modern microprocessors can take 200 clocks to access DRAM, while a floating-point multiply may take only four clock cycles; in other words, on the order of 50 multiplies can complete in the time of a single DRAM access.
3) Because of this, processor performance gain has slowed dramatically. In 2006, performance is a factor of three below the traditional doubling every 18 months that occurred between 1986 and 2002.
To understand where we are, and why the only way forward now is parallelism rather than clock speed increases, see The Landscape of Parallel Computing Research: A View from Berkeley [berkeley.edu].
Re: (Score:2)
Anti-Trust Question... (Score:2, Interesting)
Re: (Score:2)
Since the Pentium, all Intel (and AMD) processors have used microcode. That is, there is a layer of abstraction between machine code that the processor executes and the actual electronic logic on the chip. It's a layer between the physical processor and Assembly. What it allows you to do is provide bug fixes for processor design errors. It's slightly slower because it's an extra decode operation, but it allow
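For the curious, that microcode layer is visible from userspace on Linux: the currently loaded revision shows up as a "microcode" field in /proc/cpuinfo (on kernels and architectures that expose it). A quick sketch under that assumption:

    /* Quick sketch: print the microcode revision lines from /proc/cpuinfo on Linux x86.
       Assumes the usual "microcode : 0x..." field, which not every kernel exposes. */
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        FILE *f = fopen("/proc/cpuinfo", "r");
        if (!f) {
            perror("/proc/cpuinfo");
            return 1;
        }

        char line[256];
        while (fgets(line, sizeof line, f)) {
            if (strncmp(line, "microcode", 9) == 0)
                printf("%s", line);     /* e.g. "microcode : 0xf0" */
        }
        fclose(f);
        return 0;
    }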
closing the FSB loophole (Score:2)
The Giant is awakened (Score:3, Interesting)
Re:The Giant is awakened (Score:5, Informative)
AMD also has other clever stuff in the pipeline, e.g.
http://www.tech.co.uk/computing/upgrades-and-peripherals/motherboards-and-processors/news/amd-plots-16-core-super-cpu-for-2009?articleid=1754617439 [tech.co.uk]
What's with the Hebrewlish? (Score:2)
"Hebrew English is to be helpings and not to be laughings at."
prefix confusion... (Score:2)
</pedantic>
OpenGL support needed to be confirmed? (Score:2)
Re: (Score:2)
Re: (Score:2)
Compare the number of Mac sales to the number of PC sales, and if Apple moved over to AMD, I doubt Intel would give it a second glance either.
Apple probably buys around 10 percent of all laptop chips that Intel produces, and mostly goes for the more expensive ones, so I would estimate about 20 percent of dollar revenue. And Apple buys a good amount of expensive quad core server chips as well. And they don't buy any of the $50 low end chips that end up in your $399 PC. So financially, losing Apple would be a major hit for Intel.
Re: (Score:3, Informative)
Re: (Score:2)
Apple probably buys around 10 percent of all laptop chips that Intel produces, and mostly goes for the more expensive ones, so I would estimate about 20 percent of dollar revenue.
I notice you've tried to sneak the adjective "laptop" in there. I think it would be erring on your side to suggest that no more than half the chips Intel produces are for laptops, the remainder being for desktops and servers. If your figures are correct (which I seriously doubt), that puts Apple at buying a maximum of 5% of Intel's overall chip production. (Even then, whilst I accept there is possibly a higher proportion of Apple users in the US, that is not the case here in Europe, where Apple's penetration for computers is very low.)
And they don't buy any of the $50 low end chips that end up in your $399 PC.
Except that you're now (presumably) talking about $399 PCs in general, not just laptops - I detect some serious massaging of figures now on your part.
However, if you're talking about $399 (or in my case £399) laptops, then I call BS on you. Sure, a lot of home users buy a cheap laptop as a second home machine, but the biggest buyers of laptops are corporates, who do not buy the cheapest machines. Therefore, by supposition, higher-grade chips also go into Dell's, HP's, Lenovo's, etc. mid- to high-end laptops, which, because more of those are sold than there are Macs, puts Apple into a much smaller minority than you are claiming.
So please do not exaggerate the Mac's penetration (outside of the US at least) - there really are not that many of them about. As I've said previously on Slashdot, having spent 25+ years as a technical person in telecoms and IT, travelling quite regularly around Europe and parts of the Middle East, I have seen a total of three Macs ever - one belonged to an American tutor on a course I did, one to a student posing with it in the local Starbucks, and one is a surplus Mac a friend of mine was given by his boss; he has no idea what to do with it and it is still in the box.
My original comment stands. Having friends at both Intel and Apple, I know the close relationship that has developed, and the cross-pollination of technical knowledge benefits both companies. However, with the upcoming products Apple has in the pipeline, and its impressive gains in several established and upcoming markets, it's clear that Intel would lose several billion dollars of future revenue by having Apple leave.
Let me also point out the stagnation of the Intel stock, which benefits from its h
Re: (Score:2)
And Apple buys a good amount of expensive quad core server chips as well.
I doubt that very much. Mac Pros aren't exactly a volume seller, and with only a single mid-range 1U server offering - and not an especially compelling one at that - Apple are far, far from a major player in the server market.
So financially, losing Apple would be a major hit for Intel.
No, they wouldn't. A hit, yes, but not a major one.
Re:Gflargen and Blackeblae (Score:5, Informative)
You can't trademark numbers. When AMD started releasing "x86"-numbered processors, Intel filed suit and lost. The judge stated that you can't trademark numbers. It's such an old case that this is all I found in the last 10 minutes regarding Intel and trademarking numbers [theinquirer.net].
I'm tired and too lazy to find the actual lawsuit.
Re: (Score:1)
Re: (Score:3, Interesting)
Re: (Score:2)
But you see, Core and Core2 are two completely different architectures. This would make Core3 a fitting name for the next big thing.
Core is simply a Pentium M with a streamlined FP unit. The SSE units are still 64-bit and there are still only three instruction decoders.
Core 2 widens the SSE units to 128 bits, adds a fourth decoder, and a mess of other optimizations. This is certainly an architectural change.
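One concrete way to tell the two apart in software: the original Core (Yonah) tops out at SSE3, while Core 2 (Merom) adds SSSE3 along with the wider SSE units, so a CPUID feature check distinguishes them. A hedged sketch using GCC/Clang's <cpuid.h>; the SSSE3 flag is bit 9 of ECX in CPUID leaf 1:

    /* Sketch: check for SSSE3 (CPUID leaf 1, ECX bit 9), present on Core 2 but not on
       the original Core. Assumes GCC/Clang's <cpuid.h>. */
    #include <cpuid.h>
    #include <stdio.h>

    int main(void)
    {
        unsigned int eax, ebx, ecx, edx;

        if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
            printf("CPUID leaf 1 not supported\n");
            return 1;
        }

        if (ecx & (1u << 9))            /* SSSE3 feature bit */
            printf("SSSE3 present: Core 2 (Merom) class or newer\n");
        else
            printf("No SSSE3: original Core (Yonah) class or older\n");
        return 0;
    }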
Re: (Score:2)
Re:Gflargen and Blackeblae (Score:5, Funny)
It took them a while to get that it was a joke.
Re: (Score:3, Interesting)
They weren't even used in the Arab world until modern times -
http://en.wikipedia.org/wiki/History_of_the_Hindu-Arabic_numeral_system [wikipedia.org]
In the Arab world - until modern times - the Arabic numeral system was used only by mathematicians. Muslim scientists used the Babylonian numeral system, and merchants used the Abjad numerals. It was not until Fibonacci that the Arabic numeral system was used by a large population.
Re: (Score:2)
Re: (Score:2, Insightful)
For something to be considered "trademarkable" there has to be some form of association with the trademark. For example: Mickey Mouse and the Walt Disney castle are trademarks of Walt Disney, since when you see or hear them, you conjure up images of Disney and such. Now if Intel could prove such links with numbers, perhaps there is a chance. HOWEVER, the reason this has been (and always will be) a tota
Re: (Score:2, Insightful)
Your post confuses me (or I'm being retarded, this has happened twice before in my life, along with the 3 times I've been wrong), and it forces me to conclude, that y
Re:TPM (Score:4, Funny)
Re: (Score:2)
me, personally, i'm all for it.
others may disagree, and I respect that.
Re:Please stop naming after WA and OR places (Score:4, Funny)
Re: (Score:1)
Re: (Score:2)
Re: (Score:2, Interesting)
There is a standard called ADD+ that allows you to connect the transmitters via an AGP or PCIe card; however, given that drivers are validated with specific transmitte
Re: (Score:2)