Intel Kills Consumer Larrabee Plans

An anonymous reader tips news that Intel has canceled plans for a consumer version of their long-awaited and oft-delayed Larrabee chip, opting instead to use it as a development platform product. From VentureBeat: "'Larrabee silicon and software development are behind where we had hoped to be at this point in the project,' said Nick Knuppfler, a spokesman for Intel in Santa Clara, Calif. 'Larrabee will not be a consumer product.' In other words, it’s not entirely dead. It’s mostly dead. Instead of launching the chip in the consumer market, it will make it available as a software development platform for both internal and external developers. Those developers can use it to develop software that can run in high-performance computers. But Knuppfler said that Intel will continue to work on stand-alone graphics chip designs. He said the company would have more to say about that in 2010."
  • In other words... (Score:5, Insightful)

    by sznupi ( 719324 ) on Friday December 04, 2009 @09:14PM (#30331586) Homepage

    A nicer way of saying:

    Uhm, guys, remember how we were supposed to ship a year ago and said recently we will ship a year from now? Well, add 5 to that now...but we will provide and totally kick ass, promise.

  • Re:Oh rats (Score:5, Insightful)

    by QuantumRiff ( 120817 ) on Friday December 04, 2009 @09:24PM (#30331672)

    I would say ATI/AMD is about to become the leader. Intel is making it more difficult to ship mobile systems without the craptastic Intel graphics. Larrabee was supposed to be a decent-performance GPU that would act almost like a co-processor.

    AMD has slightly slower CPUs, but their integrated graphics blow the snot out of the Intel ones, and are getting even better. What good is a super-fast CPU if you can't play any games, or even do basic stuff, without leaning on the power-hungry CPU?

  • by sznupi ( 719324 ) on Friday December 04, 2009 @09:51PM (#30331874) Homepage

    Hm, yeah... a variant of FUD; spreading wonderful stories about a future product just to stall / eradicate the competition; just so potential clients will wait.

    What doesn't add up in this case is that Intel, at this point in time, seems quite cautious in their claims about Larrabee - they hardly promise anything / are themselves very skeptical about it, even in the face of major delay & reengineering?

  • by Foredecker ( 161844 ) * on Friday December 04, 2009 @10:04PM (#30331934) Homepage Journal
    Vaporware is not faster than existing products.
  • by segedunum ( 883035 ) on Friday December 04, 2009 @10:23PM (#30332036)
    I certainly had forgotten, thanks. I certainly haven't forgotten with regard to marrying a powerful Intel processor with anything like acceptable integrated graphics.

    I guess this means that the only option we have to get half-decent graphics with an Intel processor is with an nVidia chipset. However, that relationship looks a bit rocky, and very soon we'll probably only be left with the incredibly shitty Intel integrated graphics systems that work passably (i.e. you can display a Vista/7 desktop with it and that's it) until you actually want it to do anything even remotely... graphical. Their acceleration performance for video isn't too hot either.

    Either that, or you move to AMD/ATI if you want a decent processor/chipset/integrated graphics combination. AMD must be pleased. This is the best news they've had in quite a while. Their purchase of ATI looks to be paying off. If Intel can't get Larrabee working then I don't know where they go from here, apart from try again and actually get it working or start being nice to nVidia.
  • by Anonymous Coward on Friday December 04, 2009 @10:27PM (#30332072)

    This is being mis-reported or mis-communicated by Intel, I believe.
    The first version of Larrabee silicon isn't going to consumers, that's all.
    From the consumer's perspective, it's a delay. Yet to be seen if it's fatal.
    Otherwise, who'd want to use it to develop software?

  • by dwinks616 ( 1536791 ) on Friday December 04, 2009 @11:17PM (#30332256) Homepage
    Oh, well please show me where I can buy this discrete card for my laptop?
  • by Kjella ( 173770 ) on Friday December 04, 2009 @11:59PM (#30332472) Homepage

    Vaporware is not faster than existing products.

    Vaporware is always faster than existing products.

  • by Foredecker ( 161844 ) * on Saturday December 05, 2009 @12:36AM (#30332636) Homepage Journal
    I have no insider knowledge, but I strongly suspect they had problems with both the hardware and the software. I suspect the hardware actually worked pretty well (it is Intel - they don't suck at all) but the problem was cost. Both ATI and NVIDIA have been at this a long time, and producing cost-effective graphics silicon is quite difficult. The software is also quite complex. The rendering model is -very- different, and nobody is going to re-write all their software to accommodate something alien. So, they had to make it work with existing models. This is expensive, both in terms of run-time efficiency and engineering calendar time. I suspect they figured out they simply couldn't compete in the mass PC graphics market. I suspect Jen-Hsun Huang at NVIDIA is having a very, very good day.
  • by bertok ( 226922 ) on Saturday December 05, 2009 @12:39AM (#30332652)

    Wow... thanks for your insight! Should have known Intel would be logical even about their failures, and roll them over into something that has a chance of applicability. The only thing I wish they would do is skip the 64-bit crap and make 128-bit architectures that are compatible with both 32- and 64-bit predecessors. It would ease the development of new applications, since the lifetime of 128-bit archs would be decades, as opposed to developing all 64-bit apps only to have 128-bit archs appear in 5-10 years.

    I'm not sure if you're trolling or not, but 64-bit memory capacity is not "twice" as big as 32-bit; it's 4.3 billion times as big. That's more than just 5 to 10 years of Moore's law; that's more like 50 years. Physical bus widths have nothing to do with architecture bitness either: there are memory buses for 64-bit architectures that only have a few pins.
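The arithmetic behind that correction is easy to check. A quick sketch (the 2-year doubling period for memory demand is an assumption, which is why the estimate here lands near 64 years rather than the parent's rougher "50"):

```python
import math

# A 64-bit address space versus a 32-bit one: the ratio is 2^32,
# not a factor of two.
ratio = 2**64 // 2**32
print(ratio)  # 4294967296, i.e. ~4.3 billion times larger

# Assuming demand doubles roughly every 2 years (Moore's-law-style
# assumption), those 32 extra address bits buy 32 doublings:
doublings = math.log2(ratio)
print(doublings * 2)  # 64.0 years
```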

  • by ThatMegathronDude ( 1189203 ) on Saturday December 05, 2009 @02:16AM (#30333072)
    I have a 4-year CS degree, and I can tell you with certainty that that blogger is full of shit. The problems that are already parallelizable are easily multithreaded with current technology. The problems with serial dependency are not, and never will be, easily multithreaded.

    Rendering graphics is already done, because it's easy to split the task of rendering a bunch of pixels into pixel-sized chunks. Each small thread can read from the same shared memory (the scene graph, textures, etc.) and write to a distinct location (its one pixel in the frame buffer).

    Encoding video using motion-compensation techniques (basically all modern video codecs) will never be satisfactorily parallelizable, because the best bang/bitrate can only be achieved when frames are processed serially. Frames need to be processed as a whole to optimize for panning and other full-scene motion, and the results of the previous frame's motion analysis are typically needed to compute the next delta. You can break the processing up into multiple threads easily enough, but you miss out on opportunities to make the output more efficient or better looking.

    When Mr. PseudoScience blogger can parallelize the video-encoding problem without so many dependencies that it's essentially a serial process, then he should get some credit - not before then.
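The contrast the parent draws can be sketched in toy form: per-pixel work reads shared input and writes disjoint outputs, so it maps cleanly onto a worker pool, while motion compensation carries each frame's analysis into the next frame's computation. This is illustrative only; `shade` and the frame "analysis" are made-up stand-ins, not real renderer or codec code:

```python
from multiprocessing import Pool

def shade(pixel_index):
    # Stand-in for real shading math. Each pixel reads shared scene data
    # and produces exactly one output; no pixel depends on another, so
    # the work is embarrassingly parallel.
    return (pixel_index * 31) % 256

def render(width, height):
    # Every pixel is an independent task: trivially split across workers.
    with Pool() as pool:
        return pool.map(shade, range(width * height))

def encode(frames):
    # Motion compensation (toy version): each frame's delta is computed
    # against the previous frame's analysis, so the loop is inherently
    # serial - you cannot start frame N before frame N-1 is analyzed.
    encoded, prev = [], 0
    for frame in frames:
        encoded.append(frame - prev)  # stand-in for motion search
        prev = frame
    return encoded

if __name__ == "__main__":
    print(len(render(8, 8)))     # 64 independent pixel tasks
    print(encode([10, 12, 15]))  # [10, 2, 3]
```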
  • by Anonymous Coward on Saturday December 05, 2009 @02:56AM (#30333198)

    If I am correct, Intel doesn't want a repeat of the first-gen Itanium, where the brand name was blemished at release by less-than-expected performance. ...
    It's not as if Intel needs Larrabee in the near future anyway - AMD doesn't have anything significant in the near future either; even if they do, with Intel's brute engineering capability, they will just pull a Core 2 again. ...
    Another possibility is that no game company is able to support Larrabee's architecture.

    Intel is great at manufacturing and CPUs, but they couldn't make a decent GPU and driver if their life depended on it. Until Intel can produce a GPU that is competitive with ATI/nVidia, any pie-in-the-sky talk (like Larrabee) is just vaporware and should be largely ignored.

  • by ppanon ( 16583 ) on Saturday December 05, 2009 @06:06AM (#30333776) Homepage Journal

    We're going into TMI territory. I've worked in Intel labs. The people there are first rate.

    Oh, I totally agree that Intel has some top-drawer engineers. I've heard their compiler division is first rate (which company was it that they bought for their patent portfolio again?). Intel's production process group is also tops and has been instrumental in keeping them ahead of the curve. Core is a testament to their CPU and chipset design teams. I've just never seen any indication that their graphics teams are of the same relative caliber in that domain. Just what historical market would good Intel graphics chips cannibalize, anyway? SSE(n+1)? On the other hand, Microsoft Research has hired some amazing people, and yet you don't hear a lot about groundbreaking stuff coming out of there, so I'll grant you that you may be right and that Intel could just be keeping an ace up their sleeve rather than playing it, to avoid drawing more antitrust heat. But they haven't been above some major strategic blunders out of greed (*cough* RDRAM *cough*) either. Still, if you can get Intel executives to shed some light on their heart of darkness, more power to you.
