Intel Kills Consumer Larrabee Plans
An anonymous reader tips news that Intel has canceled plans for a consumer version of their long-awaited and oft-delayed Larrabee chip, opting instead to use it as a development platform product. From VentureBeat:
"'Larrabee silicon and software development are behind where we had hoped to be at this point in the project,' said Nick Knupffer, a spokesman for Intel in Santa Clara, Calif. 'Larrabee will not be a consumer product.' In other words, it’s not entirely dead. It’s mostly dead. Instead of launching the chip in the consumer market, Intel will make it available as a software development platform for both internal and external developers. Those developers can use it to develop software that can run in high-performance computers. But Knupffer said that Intel will continue to work on stand-alone graphics chip designs. He said the company would have more to say about that in 2010."
Larrabee = Graphics Chip competing w nVidia (Score:5, Informative)
In case you've forgotten what a Larrabee was (like I had), it was Intel's planned graphics / vector processing chip, competing with nVidia and AMD / ATI graphics systems. Here's the Wikipedia article [wikipedia.org].
Re:the performance is there (Score:1, Informative)
Read the comments. It looks like a lopsided comparison, with other folks getting higher results from e.g. ATI 4800, 5800.
Re:So the next mini, low end imac and 13" macbook' (Score:1, Informative)
Apple already dropped GMA for low end stuff, they're using GeForce 9400M instead. They're also using Radeons on most iMac models.
Re:Oh rats (Score:1, Informative)
NVidia historically had a dominant position
I suppose "historically" is a relative term. I remember when just about EVERY graphics card was ATI.
ATI had the OEM market in the bag for quite a while.
From 1999: [findarticles.com]
What this also does is put a dent in the armor of ATI Technologies Inc., Toronto, Canada. ATI is the PC graphics market share leader with revenues close to $1 billion and has been steamrolling over the competition in the PC space for the past year or so. This includes S3, Trident Microsystems, 3Dfx, 3Dlabs and even Intel. The only companies to put up much of a fight were Nvidia, which is much smaller than ATI, and Montreal, Canada-based Matrox Graphics Inc., which has a similar business model to ATI.
Until the nVidia juggernaut took off [zdnetasia.com] in 2000:
Nvidia has overtaken ATI Technologies as the biggest maker of chips to enhance graphics on desktop computers, according to a new study by industry consultant Mercury Research.
In the third quarter, Nvidia chips were in 48 percent of all desktop computers, more than doubling its market share from 20 percent in the third quarter last year, Mercury said. ATI slipped to 34 percent from 39 percent.
I disagree (Score:2, Informative)
Many people really don't care about their graphics card. If you don't do games, an Intel chipset graphics unit works fine. It accelerates the shiny interface in Windows 7 and everything is nice and responsive. For business uses, this is plenty.
Ok well if you do care about games, then you want a discrete graphics solution. Integrated solutions will just never do well. The big reason is memory. You can make your chip as fast as you like; if it shares system memory, it is severely bottlenecked. Graphics cards need their own dedicated high speed memory to perform well.
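The shared-memory bottleneck is easy to see with back-of-envelope bandwidth math. A rough sketch below; the clock and bus-width figures are illustrative, period-typical values, not the specs of any particular part:

```python
# Rough peak-bandwidth comparison: integrated graphics (sharing system RAM)
# vs. a discrete card with dedicated GDDR3. Figures are illustrative only.

def peak_bandwidth_gb_s(mega_transfers_per_sec, bus_width_bits):
    """Peak memory bandwidth in GB/s: transfer rate times bus width in bytes."""
    return mega_transfers_per_sec * 1e6 * (bus_width_bits / 8) / 1e9

# Integrated: dual-channel DDR2-800 system memory (128-bit bus, 800 MT/s),
# and the CPU is competing for that same bus the whole time.
integrated = peak_bandwidth_gb_s(800, 128)    # 12.8 GB/s

# Discrete: mid-range card with 256-bit GDDR3 at 1800 MT/s effective.
discrete = peak_bandwidth_gb_s(1800, 256)     # 57.6 GB/s

print(f"integrated: {integrated:.1f} GB/s, discrete: {discrete:.1f} GB/s")
```

Even before the CPU takes its share, the integrated part has well under a quarter of the discrete card's peak bandwidth, which is why faster shaders alone can't save it.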
As such I just don't see ATi having a slightly better integrated solution as something people will care much about. The bigger question is who makes the better CPUs, and that is firmly Intel's arena. Their CPUs are faster, and can be lower power. So regardless of whether you want a power-saving setup or a performance solution, they've got a good chip.
AMD really has to get their chips up to snuff before they'll start competing with Intel more. They don't have to beat Intel at everything, but they need to have at least one area where they are better, and right now they really don't seem to. They also need to do better with chipsets and motherboards. A big advantage Intel has in the reseller market is that they do their own solutions. Intel will sell you a CPU, chipset and motherboard that all work together well. OEMs like this: it cuts down on supply chain problems and on vendors blaming each other when there's trouble.
This has also historically been a weak point for AMD. I remember when their Athlons came out and there was no question, they beat the P3's price/performance ratio. They were the kings of the hill. I bought one... and returned it two weeks later. The reason? Chipsets. I could not get a chipset that would work with my GeForce 256 properly. They had poor regulation of the AGP signal and it just wouldn't work. Bought an Intel chip/board and it worked flawlessly the first time.
So when AMD has a good CPU/chipset/mobo combo and CPUs competitive with Intel in at least one arena, I think maybe they'll make gains. Until then, I think they'll mainly be relegated to "cheap brands" and to enthusiast BYO systems.
Re:Oh rats (Score:3, Informative)
I might agree with you if ATI/AMD would finally get serious about producing drivers that aren't complete crap. Their hardware is fine, but Linux drivers, as well as OpenGL drivers on Windows just plain suck.
It's not just the video drivers. ATI also has a horrible software stack (SDK, runtime, compiler and documentation) for their Stream GPGPU computing architecture, which is why everybody uses NVIDIA and its excellent CUDA. Generally speaking, ATI has excellent hardware, but such hardware is useless if you don't have matching software to exploit it.
Re:Oh rats (Score:4, Informative)
Don't forget about the NVidia Ion platforms. They pair a "just-enough" CPU, Intel's Atom, with a more capable NVidia GPU to make nicely integrated HD set-top boxes. Nice little platforms for MythTV frontends.