Intel Abandons Discrete Graphics

Stoobalou writes with this excerpt from Thinq: "Paul Otellini may think there's still life in Intel's Larrabee discrete graphics project, but the other guys at Intel don't appear to share his optimism. Intel's director of product and technology media relations, Bill Kircos, has just written a blog about Intel's graphics strategy, revealing that any plans for a discrete graphics card have been shelved for at least the foreseeable future. 'We will not bring a discrete graphics product to market,' stated Kircos, 'at least in the short-term.' He added that Intel had 'missed some key product milestones' in the development of the discrete Larrabee product, and said that the company's graphics division is now 'focused on processor graphics.'"
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Wednesday May 26, 2010 @02:28PM (#32351250)

    They've never been able to bring the most innovative designs to market.. they bring 'good enough' wrapped in the x86 instruction set.

    If x86 was available to all I think we'd see Intel regress to a foundry business model.

  • Both good and bad (Score:3, Interesting)

    by TheRealQuestor ( 1750940 ) on Wednesday May 26, 2010 @02:37PM (#32351368)
    This is bad news for one reason: competition. There are only two major players in discrete graphics right now, and that is horrible for the consumer. Now the good: Intel SUCKS at making GPUs. I mean seriously. Either way, Intel has no hope of making a 120-core GPU based on x86 that is cheap or fast enough to compete. Go big or stay at home. Intel, stay at home.
  • Re:Both good and bad (Score:1, Interesting)

    by Anonymous Coward on Wednesday May 26, 2010 @03:15PM (#32351850)

    What about them?

    VIA's chips suck, like everything else they put out (e.g. their ridiculous re-badged Cyrix CPUs and stability-challenged motherboard chipsets).
    Matrox is a niche player (multi-monitor etc.). Their performance is on par with Intel's GMA, and their prices are a lot higher than "free", which is what a GMA core basically costs when buying a modern Intel CPU or chipset.

    Larrabee was indeed the only serious contender in discrete GFX we've seen for the better part of a decade or so, but it seems the Larrabee project management over-promised and under-delivered.
    Intel corp. basically chose to scrap the project rather than suffer the embarrassment of a card that would probably have been one or two generations slower than NVIDIA and AMD's enthusiast parts, while using more power.

    It would have been kick-ass for specialty graphics (truly fast vector graphics, voxel rendering, anything that doesn't conform to the standard vertex+effects shader pipeline of today's 3D graphics) and GPGPU, though.
    Shaders are a horrible kludge compared to just having a bunch of real CPUs with real memory and just running your algorithm like on a normal cluster.

  • by Funk_dat69 ( 215898 ) on Wednesday May 26, 2010 @03:20PM (#32351904)

    I kind of think Larrabee was a hedge.

    If you think about it, around the time it was announced (very early on in development, which is not normal), you had a bunch of potentially scary things going on in the market.
    Cell came out with a potentially disruptive design, Nvidia was gaining ground in the HPC market, and OpenCL was being put forward by Apple as a proposed standard for hybrid computing.

    All of a sudden, it looked like maybe Intel was a little too far behind.

    Solution: Announce a new design of their own to crush the competition! In Intel-land, sometimes the announcement is as big as the GA. Heck, the announcement of Itanium was enough to kill off a few architectures. They would announce Larrabee as a discrete graphics chip to get gamers to subsidize development and....profit!

    Lucky for them, Cell never found a big enough market and Nvidia had a few missteps of their own. Also, Nehalem turned out to be successful. Add all that up, and it becomes kind of clear that Larrabee was no longer needed, which conveniently papered over the fact that it was a huge failure, performance-wise.

    Intel is the only company that can afford such huge hedge bets. Looks like maybe another one is coming to attack the ARM threat. We'll see.

  • by gman003 ( 1693318 ) on Wednesday May 26, 2010 @03:33PM (#32352094)

    The Larrabee chips actually looked pretty good. There was a lot of hype, especially from Intel. They demoed things like Quake Wars running a custom real-time ray-tracing renderer at a pretty decent resolution. Being able to use even a partial x86 ISA for shaders would have been a massive improvement as well, both in capabilities and performance.

    From what I've been able to piece together, the problem wasn't even the hardware, it was the drivers. Apparently, writing what amounts to a software renderer for OpenGL/DirectX that got good performance was beyond them.

    Another part was an odd insistence on doing all the rendering in software, even stuff like texel lookup and blitting, but that's another story.

  • by Jackie_Chan_Fan ( 730745 ) on Wednesday May 26, 2010 @03:51PM (#32352328)

    I disagree. Intel has been destroying AMD these past 4 years.

    AMD's 64-bit instruction set and Athlons were a huge improvement where Intel had failed...

    But now Intel's chips are faster, and AMD has been playing catch-up. For a while there, AMD didn't have an answer for Intel's Core line of CPUs.

    Now they do, and their parts are slightly cheaper than Intel's, but they don't perform as fast.

  • by Rockoon ( 1252108 ) on Wednesday May 26, 2010 @07:12PM (#32354850)

    AMD does beat Intel on the price curve... but not in performance.

    AMD does seem to have an edge in the multiprocessor arena, although I am not sure why.

    According to PassMark, the fastest machines benchmarked with their software are a 4 x Opteron 6168 (4 x 12 cores = 48 cores) system and an 8 x Opteron 8435 (8 x 6 cores = 48 cores) system.

    The actual numbers are:

    4 x Opteron 6168 : 23,784 Passmarks.
    8 x Opteron 8435 : 22,745 Passmarks.
    4 x Xeon X7460 : 18,304 Passmarks.
    2 x Xeon X5680 : 17,910 Passmarks.

    That $200 AMD chip that everyone is raving about, the Phenom II 1055T, scores 5,661 Passmarks. If AMD keeps that up, Intel might be in some trouble soon even in the high-end market unless it can cut prices dramatically. Intel doesn't offer anything comparable for the money.
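    To put those numbers in perspective, here's a quick back-of-the-envelope calc using only the scores quoted above (prices for the big Opteron/Xeon systems aren't given, so all we can compute is score ratios plus the 1055T's score-per-dollar at its quoted $200):

```python
# Score figures quoted in the parent comment.
scores = {
    "4 x Opteron 6168": 23784,
    "8 x Opteron 8435": 22745,
    "4 x Xeon X7460": 18304,
    "2 x Xeon X5680": 17910,
    "Phenom II 1055T": 5661,
}

# Score per dollar for the one part whose price is quoted ($200).
per_dollar = scores["Phenom II 1055T"] / 200.0
print(f"1055T: {per_dollar:.1f} Passmarks per dollar")

# How many 1055T-equivalents each system represents, score-wise.
for name, score in scores.items():
    ratio = score / scores["Phenom II 1055T"]
    print(f"{name}: {ratio:.1f}x a 1055T")
```

    So the fastest system on the chart scores only about 4.2x what one $200 desktop chip does, which is the gap the parent is pointing at.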

  • Some more fuel (Score:3, Interesting)

    by RelliK ( 4466 ) on Wednesday May 26, 2010 @08:08PM (#32355528)

    There was a company called RapidMind, which built a library and tools for writing code that could target various GPUs, multi-core CPUs, etc. Something similar to OpenCL, I suppose, but easier to program (theoretically -- I never actually tried it). Intel bought it and killed it.

    Another company, Havok, developed a successful physics & AI library. They were going to port it to GPUs. Then Intel bought it and canceled the GPU port.

  • Re:Both good and bad (Score:3, Interesting)

    by BikeHelmet ( 1437881 ) on Wednesday May 26, 2010 @09:29PM (#32356496) Journal

    Are you kidding me? This is great for consumers.

    If Intel got their claws into the discrete graphics market (which is already showing signs of stagnation rather than growth), they'd take a huge chunk of nVidia and ATI's R&D budgets away. With less money to put towards advancement, new GPU generations (and their price drops) would come more slowly. Meanwhile, Intel would use its advanced (and cheap) fabs to make a killing in that market, just as it does with IGPs.

    End result? Slower progress, nVidia and ATI suffer, Intel rakes in cash.

    Our current duopoly is GOOD. Video cards drop in price and increase in performance more quickly than CPUs do. Adding a behemoth competitor would hurt the industry.
