Intel Hardware

Intel Details Nehalem CPU and Larrabee GPU

Posted by ScuttleMonkey
from the business-is-war dept.
Vigile writes "Intel previewed the information set to be released at IDF next month, including details on a wide array of technology for server, workstation, desktop and graphics chips. The upcoming Tukwila chip will replace the current Itanium lineup with about twice the performance at a cost of 2 billion transistors, while Dunnington is a hexa-core processor built on the existing Core 2 architecture. Details of Nehalem, Intel's next desktop CPU core with an integrated memory controller, show a return of HyperThreading-like SMT, a new SSE4.2 extension, and a modular design that optionally puts integrated graphics on the CPU as well. Could Intel beat AMD at its own "Fusion" plans? Finally, Larrabee, the GPU technology Intel is building, was confirmed to support OpenGL and DirectX upon release, and Intel provided information on a new extension to SSE called Advanced Vector Extensions (AVX) that would improve graphics performance on the many-core architecture."
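
For readers wondering what the SSE4.2 additions look like from software, here is a minimal C sketch using the CRC32 instruction that SSE4.2 introduces, accessed through compiler intrinsics. The helper name, the input string, and the -msse4.2 build flag are illustrative assumptions, not anything taken from Intel's disclosure.

    /* Minimal sketch: CRC-32C via the SSE4.2 CRC32 instruction.
       Build with something like: gcc -msse4.2 crc_demo.c -o crc_demo
       (only runs on CPUs that actually have SSE4.2). */
    #include <stdio.h>
    #include <string.h>
    #include <nmmintrin.h>   /* SSE4.2 intrinsics */

    /* Hypothetical helper: byte-at-a-time CRC-32C using _mm_crc32_u8.
       Wider variants (_mm_crc32_u32/_u64) exist for better throughput. */
    static unsigned int crc32c(const unsigned char *buf, size_t len)
    {
        unsigned int crc = 0xFFFFFFFFu;
        for (size_t i = 0; i < len; i++)
            crc = _mm_crc32_u8(crc, buf[i]);
        return crc ^ 0xFFFFFFFFu;
    }

    int main(void)
    {
        const char *msg = "Nehalem adds SSE4.2";   /* example input */
        printf("crc32c = 0x%08x\n",
               crc32c((const unsigned char *)msg, strlen(msg)));
        return 0;
    }
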
  • by TechyImmigrant (175943) * on Monday March 17, 2008 @05:38PM (#22778270) Journal
    They are code names, not product names.

    Intel has a rich collection of silly code names.

  • by iknownuttin (1099999) on Monday March 17, 2008 @05:44PM (#22778314)
    Haven't they heard of numbers?

    You can't trademark numbers. When AMD started releasing "x86"-numbered processors of its own, Intel filed suit and lost; the judge ruled that you can't trademark numbers. It's such an old case that this is all I found in the last 10 minutes regarding Intel and trademarking numbers [theinquirer.net].

    I'm tired and too lazy to find the actual lawsuit.

  • by Have Blue (616) on Monday March 17, 2008 @06:12PM (#22778514) Homepage
    Most of Intel's codenames are names of real places [wikipedia.org].
  • by Anonymous Coward on Monday March 17, 2008 @06:19PM (#22778560)
    This is what it's named after: http://www.oregonstateparks.org/park_201.php [oregonstateparks.org]

    Someone even wrote a song about the place: http://www.google.com/search?q=everclear+Nehalem [google.com]

  • by glitch23 (557124) on Monday March 17, 2008 @06:41PM (#22778716)
    They typically (maybe all of them) come from place names and geographic features (e.g. mountains, like McKinley) in the northwest of North America. You'll notice many sound similar, such as Tukwila and Willamette.
  • by djohnsto (133220) <<dan.e.johnston> <at> <gmail.com>> on Monday March 17, 2008 @06:49PM (#22778772) Homepage
    Because power generally increases at a rate of frequency^3 (that's cubed). Adding more cores generally increases power linearly.

    For example, let's start with a single-core Core 2 @ 2 GHz. Let's say it uses 10 W (not sure what the actual number is).

    Running it at twice the frequency results in a (2^3) = 8X power increase. So, we can either have a single-core 4 GHz Core 2 at 80W, or we can have a quad-core 2GHz Core 2 at 40W. Which one makes more sense?
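
A rough sketch of that arithmetic, taking the poster's assumed 10 W / 2 GHz single-core baseline and the simplified power ~ frequency^3 rule at face value (both are illustrative numbers, not measured figures):

    /* Back-of-envelope: "clock it higher" vs "add cores",
       using power ~ frequency^3 and a 10 W, 1-core, 2 GHz baseline. */
    #include <stdio.h>

    int main(void)
    {
        const double base_power_w  = 10.0;  /* assumed: 1 core @ 2 GHz */
        const double base_freq_ghz = 2.0;

        /* Option A: one core at 4 GHz -> power scales with (f/f0)^3 */
        double ratio   = 4.0 / base_freq_ghz;
        double power_a = base_power_w * ratio * ratio * ratio;

        /* Option B: four cores at 2 GHz -> power scales roughly linearly */
        double power_b = base_power_w * 4;

        printf("1 core  @ 4 GHz: ~%.0f W\n", power_a);   /* ~80 W */
        printf("4 cores @ 2 GHz: ~%.0f W\n", power_b);   /* ~40 W */
        return 0;
    }
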
  • by TheSync (5291) * on Monday March 17, 2008 @07:39PM (#22779102) Journal
    1) We've hit the "Power Wall", power is expensive, but transistors are "free". That is, we can put more transistors on a chip than we have the power to turn on.

    2) We also have hit the "Memory Wall", modern microprocessors can take 200 clocks to access DRAM, but even floating-point multiplies may take only four clock cycles.

    3) Because of this, processor performance gains have slowed dramatically. As of 2006, performance was a factor of three below the traditional doubling every 18 months that occurred between 1986 and 2002.

    To understand where we are, and why the only way forward now is parallelism rather than further clock-speed increases, see The Landscape of Parallel Computing Research: A View from Berkeley [berkeley.edu].
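
To put those round numbers side by side (roughly 200 cycles per DRAM access versus roughly 4 cycles per floating-point multiply, both approximate figures quoted above), here is a tiny C sketch of the ratio:

    /* Memory-wall arithmetic with the approximate figures quoted above:
       ~200 cycles per DRAM access vs ~4 cycles per FP multiply. */
    #include <stdio.h>

    int main(void)
    {
        const int dram_latency_cycles = 200;
        const int fp_mul_cycles       = 4;

        printf("One trip to DRAM costs roughly %d multiplies' worth of time,\n",
               dram_latency_cycles / fp_mul_cycles);
        printf("so a core stalled on memory wastes most of its cycles -- one reason\n"
               "to spend transistors on more cores and threads rather than more GHz.\n");
        return 0;
    }
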

  • Re:Intel Vs. AMD? (Score:1, Informative)

    by Anonymous Coward on Monday March 17, 2008 @08:35PM (#22779422)
    Given that you can purchase a system with 3-4 GB quite cheaply, and that prices will have dropped further by the time these processors come to market, it stands to reason that a computer with one of these processors will ship with 4 GB of RAM (though it's unlikely that computers will commonly come with more than 4 GB until 64-bit is more mature, or until there's some application that actually needs it). As someone who runs Vista with 3 GB, I can assure you that it is enough for any needs I have found: I seldom get over 70% usage, and when I do, at least 20% of that is Firefox and Pidgin with their memory leaks, or I'm running a game. (No IM client should hog 170+ MB of memory with only AIM running and only a few windows open, but I have the RAM lying around, so I don't mind too much.)

    Vista also has a ceiling of about 3.5 GB if I remember correctly. I forget exactly what the missing chunk gets used for, but if you put 4 GB of RAM in a (32-bit) Vista system, the system will not show all of it, and it isn't some amount being dedicated to integrated graphics.
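
The missing memory is mostly an address-space issue rather than anything Vista-specific: a 32-bit physical address map tops out at 4 GB, and devices (PCI/PCIe apertures, the graphics card, firmware) claim a region just below 4 GB, so that slice of RAM has nowhere to be mapped. A minimal sketch of the arithmetic, with the size of the device hole picked as an example value since it varies by chipset:

    /* Why a 32-bit system reports ~3-3.5 GB with 4 GB installed:
       part of the 4 GB address space is reserved for memory-mapped I/O.
       The 768 MB hole below is an assumed example, not a fixed number. */
    #include <stdio.h>

    int main(void)
    {
        const double GiB = 1024.0 * 1024.0 * 1024.0;
        double addr_space    = 4.0 * GiB;            /* 32-bit physical limit */
        double installed_ram = 4.0 * GiB;
        double mmio_hole     = 768.0 * 1024 * 1024;  /* example reservation */

        double usable = addr_space - mmio_hole;
        if (usable > installed_ram)
            usable = installed_ram;

        printf("Installed: %.1f GB, visible to a 32-bit OS: ~%.2f GB\n",
               installed_ram / GiB, usable / GiB);
        return 0;
    }
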
  • by Hal_Porter (817932) on Tuesday March 18, 2008 @03:04AM (#22780914)
    I think AMD will do OK. Once Dell and the like get used to using CPUs from multiple sources, AMD will probably survive. And a small company like AMD probably has an edge in terms of shorter design cycles and the ability to pick niches. AMD64 was a brilliant hack in retrospect: it gave people most of the features of Itanium that they wanted (64-bit, more registers) and none of the ones they didn't (an expensive, single-source CPU with crap integer performance). Meanwhile Intel got hopelessly bogged down trying to sell people Itaniums that they didn't want.

    And AMD has other clever stuff in the pipeline, e.g.:

    http://www.tech.co.uk/computing/upgrades-and-peripherals/motherboards-and-processors/news/amd-plots-16-core-super-cpu-for-2009?articleid=1754617439 [tech.co.uk]

    What's more, with that longer instruction pipeline in mind, it will be interesting to see how Bulldozer pulls off improved single-threaded performance. Rumours are currently circulating that Bulldozer may be capable of thread-fusing or using multiple cores to compute a single thread. Thread fusing is one of the holy grails of PC processing. If Bulldozer is indeed capable of such a feat, the future could be very bright indeed for AMD.
  • Re:Intel Vs. AMD? (Score:4, Informative)

    by wild_berry (448019) on Tuesday March 18, 2008 @05:17AM (#22781268) Journal
    Unreal creator Tim Sweeney says that Intel's integrated graphics are a real setback for PC gaming (http://games.slashdot.org/article.pl?sid=08/03/10/1239205 [slashdot.org]). Intel keep promising and failing to deliver substantive graphics performance (and even insisted that the 'Vista Capable' label be applied to the Aero-incapable i915 graphics chipset to sell more of those chips - see http://slashdot.org/articles/08/03/01/1312233.shtml [slashdot.org] and http://yro.slashdot.org/article.pl?sid=08/02/28/1746211&from=rss [slashdot.org]). AMD have released the 780G chipset, which includes a Radeon HD2x00-class onboard graphics chip and offers a good basic capability for playing recent games.
  • by pandrijeczko (588093) on Tuesday March 18, 2008 @07:28AM (#22781842)
    "Apple probably buys around 10 percent of all laptop chips that Intel produces, and mostly goes for the more expensive ones, so I would estimate about 20 percent of dollar revenue."

    I notice you've tried to sneak the adjective "laptop" in there. I think it would be erring in your favour to suggest that as much as half of the chips Intel produces are for laptops, the remainder being for desktops and servers. If your figures are correct (which I seriously doubt), that puts Apple at buying a maximum of 5% of Intel's overall chip production. (Even then, whilst I accept there is probably a higher proportion of Apple users in the US, that is not the case here in Europe, where Apple's penetration in computers is very low.)

    "And they don't buy any of the $50 low end chips that end up in your $399 PC."

    Except that you're now (presumably) talking about $399 PCs in general, not just laptops - I detect some serious massaging of figures now on your part.

    However, if you're talking about $399 (or in my case £399) laptops, then I call BS on you. Sure, a lot of home users buy a cheap laptop as a second home machine, but the biggest buyers of laptops are corporates, who do not buy the cheapest machines. Therefore, by extension, higher-grade chips also go into the mid- to high-end laptops from Dell, HP, Lenovo and the like, and because more of those are sold than Macs, that puts Apple in a much smaller minority than you are claiming.

    So please do not exaggerate the Mac's penetration (outside of the US at least) - there really are not that many of them about. As I've said previously on Slashdot, having spent 25+ years as a technical person in telecoms and IT travelling quite regularly around Europe and parts of the Middle East, I have seen a total of 3 Mac machines ever: one belonged to an American tutor on a course I did, one to a student posing with it in the local Starbucks, and one is a surplus Mac a friend of mine was given by his boss, which he has no idea what to do with and which is still in its box.
