Intel Hardware

Intel Details Nehalem CPU and Larrabee GPU 166

Vigile writes "Intel previewed the information set to be released at IDF next month, including details on a wide array of technologies for server, workstation, desktop, and graphics chips. The upcoming Tukwila chip will replace the current Itanium lineup, offering about twice the performance at a cost of 2 billion transistors, and Dunnington is a hexa-core processor using the existing Core 2 architecture. Details of Nehalem, Intel's next desktop CPU core, which includes an integrated memory controller, show a return of HyperThreading-like SMT, a new SSE 4.2 extension, and a modular design that features optional integrated graphics on the CPU as well. Could Intel beat AMD at its own "Fusion" plans? Finally, Larrabee, the GPU technology Intel is building, was verified to support OpenGL and DirectX upon release, and Intel provided information on a new extension called Advanced Vector Extensions (AVX) for SSE that would improve graphics performance on the many-core architecture."

  • Intel Vs. AMD? (Score:4, Insightful)

    by Naughty Bob ( 1004174 ) on Monday March 17, 2008 @06:38PM (#22778268)

    Could Intel beat AMD in its own "Fusion" plans?
    Intel is far ahead of AMD at this point; however, without AMD we wouldn't be seeing these releases. Hurray for the market, I guess....
  • Re:TPM (Score:2, Insightful)

    by trickonion ( 943942 ) on Monday March 17, 2008 @06:48PM (#22778338) Homepage
    I don't understand your comment. I, like many other people, don't like the idea of TPM, and from your post it seems you are sarcastically agreeing with me (via the word "slipped"). You also, however, say "so we can get the advantage of owning an expensive cable box" (which I could actually see as an advantage, if you already have one in your house).
    Your post confuses me (or I'm being dense; this has happened twice before in my life, along with the 3 times I've been wrong), and it forces me to conclude that you, AC, are in fact a woman and are using feminine wiles.
  • by immcintosh ( 1089551 ) <slashdot@SLACKWA ... org minus distro> on Monday March 17, 2008 @07:05PM (#22778464) Homepage
    So, this Larrabee, will it be another example of integrated graphics that "supports" all the standards while being too slow to be useful in any practical situation, even basic desktop acceleration (Composite / Aero)? If so, I've gotta wonder why they even bother rather than saving some cash and just making a solid 2D accelerator that would be for all intents and purposes functionally identical.
  • by Kamokazi ( 1080091 ) on Monday March 17, 2008 @07:19PM (#22778566)
    No, far, far from integrated garbage. Larrabee will actually have uses as a supercomputer CPU:

    "It was clear from Gelsinger's public statements at IDF and from Intel's prior closed-door presentations that the company intends to see the Larrabee architecture find uses in the supercomputing market, but it wasn't so clear that this new many-core architecture would ever see the light of day as an enthusiast GPU. This lack of clarity prompted me to speculate that Larrabee might never yield a GPU product, and others went so far as to report "Larrabee is GPGPU-only" as fact.

    Subsequent to my IDF coverage, however, I was contacted by a few people who have more intimate knowledge of the project than I. These folks assured me that Intel definitely intends to release a straight-up enthusiast GPU part based on the Larrabee architecture. So while Intel won't publicly talk about any actual products that will arise from the project, it's clear that a GPU aimed at real-time 3D rendering for games will be among the first public fruits of Larrabee, with non-graphics products following later.

    As for what type of GPU Larrabee will be, it's probably going to have important similarities to what we're seeing out of NVIDIA with the G80. Contrary to what's implied in this Inquirer article, GPU-accelerated raster graphics are here to stay for the foreseeable future, and they won't be replaced by real-time ray-tracing engines. Actually, it's worthwhile to take a moment to look at this issue in more detail."

    Shamelessly ripped from: []
  • Re:Intel Vs. AMD? (Score:3, Insightful)

    by Naughty Bob ( 1004174 ) on Monday March 17, 2008 @07:23PM (#22778590)

    Video on the CPU may be faster, but you are still using the same system RAM, and that is not as fast as the RAM on a video card, which has that RAM all to itself.
    Nobody could argue against that, but the two approaches currently solve different problems. If the drift is towards an all-in-one solution, then the drift is towards less capable, but cheaper, tech. Most gamers are console gamers; perhaps the chip makers are coming to the conclusion that dedicated GPUs for the PC are a blind alley (a shame, IMHO).
  • by Whiteox ( 919863 ) on Monday March 17, 2008 @08:02PM (#22778856) Journal
    Mind you, it takes a genius like me to know what the hell you're talking about!
  • Re:Intel Vs. AMD? (Score:3, Insightful)

    by Naughty Bob ( 1004174 ) on Monday March 17, 2008 @08:17PM (#22778952)
    It is only about the money. All decisions ultimately come back to that. With Penryn, huge fabrication plants were coming online, and Intel couldn't have justified (to shareholders) not following through. That it kept Intel's jackboot firmly on AMD's windpipe was, in that instance, a happy sweetener.
  • by BrunoUsesBBEdit ( 636379 ) on Monday March 17, 2008 @10:13PM (#22779636) Homepage
    [quote]ATI lost me as a customer with their many years of zero Linux support, not to mention they still don't support FreeBSD. I won't use them except for some integrated server boards where it doesn't matter.[/quote]

    No forgiveness for ATI. I think we need to stay loyal to the companies that first showed us respect and show us the most respect today. Intel has poured resources into Linux and Xorg. When we are able to offload all HD video decoding from our CPUs to our GPUs, it will be Intel that makes that possible. For years ATI and nVidia have taunted the MythTV community with $25 512MB video cards that could easily handle HD video if only the manufacturers would support us. This is a grievance I can't easily let go of.
  • by DiEx-15 ( 959602 ) on Tuesday March 18, 2008 @12:09AM (#22780172)
    Now, please keep in mind my understanding of the law is next to "naive" but here is my understanding:

    For something to be considered "trademarkable" there has to be some form of association with the trademark. For example: Mickey Mouse and the Walt Disney castle are trademarks of Walt Disney because when you see or hear them, you conjure up images of Disney and such. Now if Intel could prove such links with numbers, perhaps there is a chance. HOWEVER, the reason this has been (and always will be) a total exercise in futility is that numbers can't generate the same iconic images as words or pictures. Numbers are numbers and signify values, not property or anything tangible. Granted, there are trademarks with numbers in them, but usually they have a letter or two thrown in. That is where it goes from just numbers to a word, a word with numbers in it. That is when it can be trademarked.

    What Intel is trying to do is go "If you use 10206 as a name for something, we will sue!" The problem is:
    1) I will sue Intel because that is part of a story I have and have proof I beat them to. (Although that is totally off the real topic here & I would meet with their pit bull lawyers)
    2) If you got 10206 as a math answer, how would the law differentiate between it and Intel's property?
    3) If 10206 was part of a formula, bar code, serial number, part number, etc., how would the system know if it is a violation of trademark laws?

    Think about this: the number 42 is part of the Hitchhiker's Guide story. I can safely use "42" in anything I want because it's a number, AS LONG AS I don't go and say "it's the meaning of life," BECAUSE then it would have an association. Now as for Intel, they can't say "the number is associated with our chips" because there is such a weak (at best) association between a number and something physical (the chip). Mostly I think the law has told Intel "Whatever. The numbers look more like a serial number than a trademark-worthy thing." That is why Intel can't get its wishes.

    Anyways, that is my ten cents (my two cents is free...), and I could totally be wrong here. However, that is my understanding.
