
Intel Kills Consumer Larrabee Plans

An anonymous reader tips news that Intel has canceled plans for a consumer version of their long-awaited and oft-delayed Larrabee chip, opting instead to use it as a development platform product. From VentureBeat: "'Larrabee silicon and software development are behind where we had hoped to be at this point in the project,' said Nick Knuppfler, a spokesman for Intel in Santa Clara, Calif. 'Larrabee will not be a consumer product.' In other words, it’s not entirely dead. It’s mostly dead. Instead of launching the chip in the consumer market, it will make it available as a software development platform for both internal and external developers. Those developers can use it to develop software that can run in high-performance computers. But Knuppfler said that Intel will continue to work on stand-alone graphics chip designs. He said the company would have more to say about that in 2010."
  • No consumer version means this will turn into another i860. I guess ATI will remain the only viable competitor to NVIDIA then.
    • Re:Oh rats (Score:5, Insightful)

      by QuantumRiff ( 120817 ) on Friday December 04, 2009 @08:24PM (#30331672)

      I would say ATI/AMD are about to become the leader. Intel is making it more difficult to ship mobile systems without the craptastic Intel graphics. Larrabee was supposed to be a decent-performance GPU that would act almost like a co-processor.

      AMD has slightly slower CPUs, but their integrated graphics blow the snot out of the Intel ones, and they're getting even better. What good is a super-fast CPU if you can't play any games, or even do basic stuff, without using the power-hungry CPU?

      • I like the Fusion concept, and feel that Intel will ultimately be forced to imitate it as well. Their abandonment of Larrabee is consistent with that. Hell, I even hope that Scorpius will become the foundation for Nintendo's Wii-2 or Wii-HD.
      • Re: (Score:3, Interesting)

        by alvinrod ( 889928 )
        I don't know about that. Intel's offerings slated to come out in 1Q-1H 2010 could give AMD some problems. Right now AMD has the performance advantage in the server space, but Gulftown [wikipedia.org] will likely trump their offerings. Arrandale [wikipedia.org] also looks quite impressive, especially the quad-core i7 with an 18-watt TDP. The cores only run at 1.2 GHz, but with Turbo Boost the chip can clock up to 2.2 GHz. That will offer some amazing battery life for laptops and still provide good performance. I do believe
      • Craptastic as the Intel cards may be, in overall performance terms, I could happily take any of the integrated parts by Intel that has decent Linux support on my next desktop, even if that meant a massive reduction in performance. I have an Xbox 360 for playing games on and I would love for my desktop to Just Work as well as my Eee does with Linux. That said, with ATI cards getting better and better support under Linux it is quite possible that they'll be the best option by the time I upgrade again...

      • I disagree (Score:2, Informative)

        by Sycraft-fu ( 314770 )

        Many people really don't care about their graphics card. If you don't do games, an Intel chipset graphics unit works fine. It accelerates the shiny interface in Windows 7 and everything is nice and responsive. For business uses, this is plenty.

        Ok well if you do care about games, then you want a discreet graphics solution. Integrated solutions will just never do well. The big reason is memory: you can make your card as fast as you like, but if it shares system memory it is severely bottlenecked. Graphics cards need

        • by TheLink ( 130905 )

          > Ok well if you do care about games, then you want a discreet graphics solution.

          The graphics hardware for games tends to be rather indiscreet: big, rapidly spinning fans, hot, noisy, lots of shiny/glossy metal, and just plain big.

          See the second pic:
          http://techreport.com/articles.x/17986 [techreport.com]

          Integrated graphics solutions (which are non-discrete) tend to be way more discreet: just one small chip (or even just part of another chip), quiet, fanless, small.

        • My wife and I play WoW, but most users prefer a Wii or PS3 if they want to play games.

          It's frustrating, and I agree that the Intel chipsets and integrated chips (not true video cards) put desktops 5-6 years behind and piss off game developers, pushing them to port only to consoles.

          The netbook phenomenon shows this trend toward slim, boring graphics that are cheap, cheap, and, uh, cheap.

          Most game developers have left the PC as a result, due to angry kids whose parents get a nice i945-graphics-chipset computer for

      • ATI are about to become the leader? They are already the leader in every category - perf/$, perf/W, and absolute perf - and at every price point. See the list below. For gaming performance, the GFLOPS ratings are a rough (+/- 30%) but good enough approximation for comparing ATI and Nvidia. For GPGPU performance, the GFLOPS rating is actually unfair to ATI, because Nvidia's GT200 microarchitecture causes it to be artificially inflated (they assume a MUL+MAD pair executing 3 floating-point ops per cycle, whereas ATI assumes a
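        (For a rough sense of where peak-GFLOPS numbers like that come from, here is a back-of-the-envelope sketch in C. The GTX 285 figures used below - 240 stream processors at a 1.476 GHz shader clock - are the commonly quoted specs, assumed here purely for illustration and not taken from the comment above.)

        /* Illustration only: how theoretical peak-GFLOPS figures are usually derived.
         * Assumes the commonly quoted GTX 285 numbers. */
        #include <stdio.h>

        int main(void)
        {
            const double shader_cores = 240.0;   /* GTX 285 stream processors */
            const double shader_clock = 1.476;   /* shader clock in GHz */

            /* Nvidia's marketing figure counts a dual-issued MAD+MUL as 3 flops per cycle. */
            double peak_3ops = shader_cores * shader_clock * 3.0;
            /* Counting only the MAD (2 flops per cycle) gives the more conservative number. */
            double peak_2ops = shader_cores * shader_clock * 2.0;

            printf("MAD+MUL (3 flops/cycle): %.0f GFLOPS\n", peak_3ops);  /* ~1063 */
            printf("MAD only (2 flops/cycle): %.0f GFLOPS\n", peak_2ops); /* ~708  */
            return 0;
        }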

        • by Khyber ( 864651 )

          I find it very funny that my two-generations old 9800GTX+ has more power than the pretty new GTS250.

          And I can get it for 89 bux off pricewatch. So for a pair running SLI, you get roughly the same performance as the card that costs 40 bucks more (GTX260.)

          Biggest difference is DX version support.

          Glad my bet on the 9800GTX+ a couple years ago was a good one to make!

          • by gzunk ( 242371 )

            The GTS250 is a rebranded 9800GTX+, which itself was a die shrink of the 8800GTS with higher clocks. So the design goes back to the G80 in 2006, through to the G92 in late 2007.

      • by pjbass ( 144318 )

        I don't play games on my laptop, but I do run compiz-fusion with many of the features enabled. It's very eye-candy-heavy, and my integrated Intel graphics chip keeps up just fine. My CPUs don't bear much load at all. I don't think things are as grossly out of proportion as you make them out to be. 5 years ago, yes. Today, not so much.

  • by Anonymous Coward

    So they intend to take a product whose chief advantage was that it could run old x86 code, and only sell it to people who are designing new software? Am I the only one who sees a problem with this?

  • In other words... (Score:5, Insightful)

    by sznupi ( 719324 ) on Friday December 04, 2009 @08:14PM (#30331586) Homepage

    A nicer way of saying:

    Uhm, guys, remember how we were supposed to ship a year ago, and how we recently said we would ship a year from now? Well, add 5 to that now... but we will deliver and totally kick ass, promise.

    • Re: (Score:2, Funny)

      by symbolset ( 646467 )
      An Itanium class part, then.
      • Re: (Score:3, Insightful)

        by sznupi ( 719324 )

        Hm, yeah... a variant of FUD: spreading wonderful stories about a future product just to stall or eradicate the competition, so that potential clients will wait.

        What doesn't add up in this case is that Intel, at this point in time, seems quite cautious in its claims about Larrabee - they hardly promise anything and seem themselves very skeptical about it, even in the face of a major delay and re-engineering.

        • Re:In other words... (Score:5, Interesting)

          by ppanon ( 16583 ) on Saturday December 05, 2009 @04:10AM (#30333612) Homepage Journal
          Yeah, I've been wondering about that. For the last year I've heard people parrot how great Larrabee was going to be, and it reminded me a lot of the hype about how the Pentium 4 (or even Itanium, for that matter) was going to kick ass. I couldn't see Intel all of a sudden going from dead last in graphics performance to top of the heap. They would have needed some top graphics-system designers on both the h/w and s/w sides, and those people just haven't been at Intel.

          I can't help but wonder if Larrabee FUD and the chipset disputes with NVidia might have been a one-two punch meant to knock NVidia's market capitalization down a peg or two and make it a cheaper buyout. Then Intel's in the driver's seat to get NVidia's expertise and patents for a song instead of paying top dollar for them. Intel could have been planning this from the moment AMD bought out ATI two years ago, or even earlier, when the latter two were still in preliminary talks. I doubt there would be any email smoking guns over it, though; Intel's where the paranoid survive, after all. But if I'm right, then I would expect Intel to make a play for NVidia inside of two years. To wait much longer would give AMD/ATI too much of a head start in a market increasingly dominated by laptops. Somehow, 18 months after Intel buys NVidia, Larrabee II will show up with graphics performance slightly better than NVidia's last GPU (and those suckers doing Larrabee development are going to find the pipeline/rendering model significantly changed to look a lot like NVidia's).
          • We're going into TMI territory. I've worked in Intel labs. The people there are first rate. There's no way to describe how much more fun it is to deal with folks who can think.

            The executive suite there could use a broom. That's all I can say about that.

            We'll have our progress with or without Intel. If Intel gets behind enabling individuals to do more without worrying about how much that "cannibalizes" their historical markets, they will have learned what I tried to teach them. I did try.

            • Re: (Score:3, Insightful)

              by ppanon ( 16583 )

              We're going into TMI territory. I've worked in Intel labs. The people there are first rate.

              Oh, I totally agree that Intel has some top-drawer engineers. I've heard their compiler division is first-rate (which company was it that they bought for their patent portfolio again?). Intel's production-process group is also tops and has been instrumental in keeping them ahead of the curve. Core is a testament to their CPU and chipset design teams. I've just never seen any indication that their graphics teams are of

  • by billstewart ( 78916 ) on Friday December 04, 2009 @08:22PM (#30331648) Journal

    In case you've forgotten what a Larrabee was (like I had), it was Intel's planned graphics / vector processing chip, competing with nVidia and AMD / ATI graphics systems. Here's the Wikipedia article [wikipedia.org].

    • Re: (Score:3, Insightful)

      by segedunum ( 883035 )
      I certainly had forgotten, thanks. What I certainly haven't forgotten is the pain of marrying a powerful Intel processor with anything like acceptable integrated graphics.

      I guess this means the only option we have for getting half-decent graphics with an Intel processor is an nVidia chipset. However, that relationship looks a bit rocky, and very soon we'll probably be left with only the incredibly shitty Intel integrated graphics systems that work passably (i.e. you can display a Vista/7 desktop with it a
    • I'm not one for conspiracy theories, although I wouldn't be terribly shocked if Intel surprised everybody and launched Larrabee a few months after AMD releases a competing product.

      In the past, Intel's deliberately stifled product development and engaged in anticompetitive behaviors that would even make Microsoft look twice (and has been found guilty and forced to pay up to this extent). Remember how quickly Intel brought consumer x86-64 chips to market after AMD proved that the platform was technically and

    • Why doesn't Intel just stop trying to create GPUs? I've got a new laptop with one of their graphics chipsets, and it absolutely sucks. Seems like Intel should stick to 'normal' processors.
  • by WoTG ( 610710 ) on Friday December 04, 2009 @08:23PM (#30331662) Homepage Journal
    Hmm... I think Intel's plan is for Larrabee GPUs to launch at the same time as Duke Nukem Forever! :)
  • So the next Mac mini, low-end iMac, and 13" MacBook will be stuck with shit video, and the Mac Pro will start at $3000 with 6-core CPUs.

    Will Apple move to AMD just to get better video in low-end systems?

    • Re: (Score:1, Informative)

      by Anonymous Coward

      Apple already dropped the GMA for low-end stuff; they're using the GeForce 9400M instead. They're also using Radeons in most iMac models.

    • by jasonwc ( 939262 )
      I'm not sure what you're referring to. MacBooks and MacBook Pros are configured with Nvidia 9400M or 9600M chipsets. They may not be powerful, but at least they are dedicated graphics solutions - far superior to Intel integrated graphics - and they provide working hardware acceleration for H.264/VC-1. The Intel G45 chipset does too, but only with MPC-HC, not for commercial Blu-ray playback, and it had some corruption issues last I checked.
      • i3/i5 cut off Nvidia, and the low-end CPUs have GMA built in; Apple will likely put an i3 in the mini and stick it with crap video at $800 as well.

      • Re: (Score:3, Interesting)

        by willy_me ( 212994 )

        but at least they are dedicated graphics solutions

        Actually, the 9400M is not. It uses system memory, but it does a much better job than Intel. It also acts as the memory controller and handles system I/O. The reason for the parent's comments is that all future Intel CPUs will have integrated memory controllers (like the i7 and i5) and an integrated GPU. Performance will suck, but it will make for cheap systems. That will make it difficult for system builders to build a low-end system with good graphics performance, as the market for such systems will be small.

    • "The new macbook pro, now with AMD... and only 3 hours of battery"

      Somehow I think AMD still has a few things to learn about mobiles, and that's the mac's main market.

    • It's likely that Apple will have to use discrete graphics on all but the lowest-of-the-low (a theoretical $799 MacBook) in order to not regress graphically. NVIDIA GT240 could be an option as a discrete replacement for the integrated 9400M.

      It will require motherboard redesigns, but the CPU will force that anyway. The Intel I/O hub for the new systems is quite small, so there should be room.

      However, Apple has regressed graphically in the past (Radeon 9550M -> the rubbish 2006 Intel integrated graphics). It wo

  • by Plasmoid2000ad ( 1004859 ) on Friday December 04, 2009 @08:53PM (#30331890)
    I spent most of my internship at Intel arguing with the people hyping Larrabee as the second coming of Jesus that it would never happen... and now I can finally say HAH!
    • So when the big guy does show up we will know what kind of a processor he'll be rockin', cool.
      Just remember:

      "Thou shalt NOT rootkit The Lord thy Admin."
    • If I worked at Intel in the group developing a product, I would keep my mouth shut, even if I were an intern. There is probably a large group of smart, dedicated people trying to make this happen.
  • by Anonymous Coward

    This is being misreported, or miscommunicated by Intel, I believe.
    The first version of Larrabee silicon isn't going to consumers, that's all.
    From the consumer's perspective, it's a delay; it remains to be seen whether it's fatal.
    Otherwise, who'd want to use it to develop software?

  • Maybe it's just resting?
    Stunned?
    Pining for the fjords?

    I'll show myself out.

  • Intel has shown real commitment to supporting their video hardware on Linux, with full-time staff [intellinuxgraphics.org] employed to produce high-quality open-source drivers, in addition to providing open specifications for (most of) their contemporary hardware. Unfortunately, this hardware provides only limited 3D acceleration. I was hoping that Larrabee would combine the two and provide vendor-supported, open, high-performance accelerated 3D for Linux.

    So much for that happening anytime soon...

    I can't understand why Intel cede

    • They don't have the experience and all the good computer graphics engineers are at Nvidia and ATI.

  • by bertok ( 226922 ) on Friday December 04, 2009 @10:39PM (#30332358)

    I don't think the recent announcement of the 48-core Intel 'Bangalore' chip [slashdot.org] is a coincidence.

    When I first read about the Larrabee chip, I thought the decision to make it a cache-coherent SMP chip was simply insane - architectures like that are very difficult to scale, as the inter-core chatter grows roughly as the factorial of the number of cores. Remember how Larrabee was designed around a really wide 1024-bit ring bus? I bet that's required because otherwise the cores would spend all of their time trying to synchronize with each other.

    So Larrabee is effectively cancelled just a day or two after Intel announced an almost identical-sounding part without cache coherence! It sounds to me like they've given up on 100% x86 compatibility and realised that a chip with some extra instructions for explicit, software-controlled memory synchronization and message passing would scale way better. Without cache coherence, a "many-core" chip is basically just an independent unit repeated over and over, so scalability should be almost infinite and wouldn't require design changes for different sizes. That sounds like a much better match for a graphics processor.

    While Intel kept their cards relatively close to their chest, from all of the presentations I've seen, no first-gen Larrabee chip could scale beyond 24 cores even with a 1024-bit bus, while the new Bangalore chip starts at 48 cores. There's no public info on how many lanes Bangalore has in its on-chip bus, but based on the bandwidth of its 80-core experimental predecessor, I'm guessing it's either 32-bit or 64-bit (per core).
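    (To make the coherence-chatter point concrete, here is a minimal pthreads/C11 sketch. This is not Intel's programming model or any Larrabee API - just an assumed toy on an ordinary multicore machine - contrasting every thread hammering one shared, cache-coherent counter with each thread keeping a private total and handing back a single result, which is roughly the discipline an explicitly message-passing, non-coherent part would force.)

    /* Toy comparison: contended shared counter vs. per-thread private totals.
     * Build with: cc -std=c11 -O2 -pthread coherence_toy.c  (file name is illustrative) */
    #include <pthread.h>
    #include <stdatomic.h>
    #include <stdio.h>

    #define NTHREADS 4
    #define ITERS    10000000L

    static atomic_long shared_counter = 0;

    /* Coherent-SMP style: every increment bounces the counter's cache line between cores. */
    static void *shared_worker(void *arg)
    {
        (void)arg;
        for (long i = 0; i < ITERS; i++)
            atomic_fetch_add(&shared_counter, 1);
        return NULL;
    }

    /* "Message passing" style: each thread works on private state, hands back one result. */
    static void *private_worker(void *arg)
    {
        long local = 0;
        for (long i = 0; i < ITERS; i++)
            local++;
        *(long *)arg = local;            /* single write back to the combiner */
        return NULL;
    }

    int main(void)
    {
        pthread_t t[NTHREADS];
        long results[NTHREADS];

        for (int i = 0; i < NTHREADS; i++)
            pthread_create(&t[i], NULL, shared_worker, NULL);
        for (int i = 0; i < NTHREADS; i++)
            pthread_join(t[i], NULL);
        printf("shared counter:  %ld\n", atomic_load(&shared_counter));

        for (int i = 0; i < NTHREADS; i++)
            pthread_create(&t[i], NULL, private_worker, &results[i]);
        long total = 0;
        for (int i = 0; i < NTHREADS; i++) {
            pthread_join(t[i], NULL);
            total += results[i];
        }
        printf("combined locals: %ld\n", total);
        return 0;
    }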

    • The problem is, a many-core non cache-coherent x86-like system isn't particularly interesting. The big advantage of Larrabee was that you could treat it like a normal SMP system, including (presumably) running standard multithreaded C code on it. Once you have to deal with memory synchronization explicitly, Larrabee starts to look a lot more (from a programming standpoint) like Fermi, Cypress or whatever other Nvidia/ATI GPUs are out at the time.

      There's nothing magic about x86/AMD64 in the HPC world. It's a

      • Re: (Score:3, Interesting)

        by bertok ( 226922 )

        The problem is, a many-core non cache-coherent x86-like system isn't particularly interesting. The big advantage of Larrabee was that you could treat it like a normal SMP system, including (presumably) running standard multithreaded C code on it. Once you have to deal with memory synchronization explicitly, Larrabee starts to look a lot more (from a programming standpoint) like Fermi, Cypress or whatever other Nvidia/ATI GPUs are out at the time.

        There's nothing magic about x86/AMD64 in the HPC world. It's attractive because it is cheap and has good performance. Clusters can be, have been, and still are built using POWER and other architectures.

        But for "embarrassingly parallel" problems, which are the target application for these chips, cache coherence is often not necessary, and simply imposes a design burden. There are lots of problems where it's better to have 1000x the performance than 1/2 the developer time.

        It may not even involve less development time: Others have pointed out that the Unix "fork" mechanism combined with "copy-on-write" at the memory page level would also work, and wouldn't require cache coherency. Similarly, any existing cod
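        (The comment above is cut off, but the fork/copy-on-write idea it mentions looks roughly like the sketch below. This is an illustrative toy under the assumption of a 64-bit Unix system, not anyone's actual code: the parent builds the dataset, fork() shares it with the workers through copy-on-write pages, and partial results come back over pipes, with no cache-coherent sharing of writable state.)

        /* Toy fork/copy-on-write parallel sum. Assumes a 64-bit Unix; names are illustrative.
         * Build with: cc -O2 fork_cow_toy.c */
        #include <stdio.h>
        #include <stdlib.h>
        #include <unistd.h>
        #include <sys/wait.h>

        #define NWORKERS 4
        #define N        (1L << 20)

        int main(void)
        {
            /* Dataset allocated before fork(); children see the same pages via copy-on-write. */
            long *data = malloc(N * sizeof *data);
            if (!data) return 1;
            for (long i = 0; i < N; i++)
                data[i] = i;

            int pipes[NWORKERS][2];
            for (int w = 0; w < NWORKERS; w++) {
                if (pipe(pipes[w]) != 0) return 1;
                pid_t pid = fork();
                if (pid < 0) return 1;
                if (pid == 0) {                           /* worker process */
                    long chunk = N / NWORKERS;
                    long start = w * chunk;
                    long end   = (w == NWORKERS - 1) ? N : start + chunk;
                    long sum   = 0;
                    for (long i = start; i < end; i++)    /* reads the shared COW pages */
                        sum += data[i];
                    if (write(pipes[w][1], &sum, sizeof sum) != (ssize_t)sizeof sum)
                        _exit(1);                         /* explicit message, no shared writes */
                    _exit(0);
                }
            }

            long total = 0;
            for (int w = 0; w < NWORKERS; w++) {
                long part = 0;
                if (read(pipes[w][0], &part, sizeof part) == (ssize_t)sizeof part)
                    total += part;
                wait(NULL);
            }
            printf("total = %ld (expected %ld)\n", total, N * (N - 1) / 2);
            free(data);
            return 0;
        }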

  • "mostly dead". maybe Miracle Max has a cure!
  • I suspect the Intel/PowerVR partnership may have something to do with the lack of consumer Larrabee plans. That partnership has already produced the CE 3100, and PowerVR has been working on some 1080p media accelerators. Larrabee uses a lot of power for the level of 3D performance it would offer; perhaps Intel and PowerVR have come up with something that does not draw 160 watts.
  • Sounds like a character played by Rodney Dangerfield in a teen grope movie.
