Graphics Software Intel Hardware

Nvidia Firmly Denies Plans To Build a CPU

Posted by timothy
from the this-time-we-mean-it dept.
Barence writes "A senior vice president of Nvidia has denied rumours that the company is planning an entry into the x86 CPU market. Speaking to PC Pro, Chris Malachowsky, a co-founder and senior vice president, was unequivocal. 'That's not our business,' he insisted. 'It's not our business to build a CPU. We're a visual computing company, and I think the reason we've survived the other 35 companies who were making graphics at the start is that we've stayed focused.' He also pointed out that such a move would expose the company to fierce competition. 'Are we likely to build a CPU and take out Intel?' he asked. 'I don't think so, given their thirty-year head start and billions and billions of dollars invested in it. I think staying focused is our best strategy.' He was also dismissive of the threat from Intel's Larrabee architecture, following Nvidia's chief architect calling it a 'GPU from 2006' at the weekend."
This discussion has been archived. No new comments can be posted.

  • Focused (Score:5, Insightful)

    by Akita24 (1080779) on Wednesday August 27, 2008 @10:45AM (#24765375)
    Yeah, they've stayed focused on graphics chips, that's why there are so many motherboards with nVidia chipsets ... *sigh*
  • Only reason (Score:3, Insightful)

    by Z00L00K (682162) on Wednesday August 27, 2008 @10:46AM (#24765393) Homepage

    The only reason that they might build an x86 chip (64-bit or not) would be either to use it for a special application or as a proof of concept.

    A GPU and a CPU are different, but it may be a way to test if a GPU architecture can be applied to a CPU with a classic instruction set. The next step is to sell the knowledge to the highest bidder.

    To compete with Intel would just be futile.

  • Just a thought... (Score:5, Insightful)

    by darkvizier (703808) on Wednesday August 27, 2008 @10:48AM (#24765441)
    If you're 30 years behind them in their market, and they're 2 years behind you in yours, maybe it's not wise to be "dismissive of the threat" ?
  • by Fourier404 (1129107) on Wednesday August 27, 2008 @10:50AM (#24765469)
    I would be very, very surprised if that was any cheaper than just buying 2, one manufactured as a GPU, the other as a CPU.
  • rumour machine (Score:3, Insightful)

    by Anonymous Coward on Wednesday August 27, 2008 @10:54AM (#24765533)

    rather handy that this rumour gives nvidia, a GPU company, the chance to point out how futile it would be for them to try and enter the CPU market... then point over to intel, a CPU company, trying to make a GPU...

  • by Van Cutter Romney (973766) <sriram@venkataramani.geemail@com> on Wednesday August 27, 2008 @10:58AM (#24765605)

    Even if the rumors were true, letting news out to market about it would give Intel time to prepare a response (and legal action).

    I don't get the legal action part. Is the x86 architecture patented by Intel? Even if it is, wouldn't the patent have expired by now? After all, it's more than 30 years old. Do AMD, VIA etc. pay licensing fees to Intel for building processors using the x86 architecture? If so, why can't NVidia?

  • by Toffins (1069136) on Wednesday August 27, 2008 @11:01AM (#24765661)
    Who said price is the most interesting issue? I'd definitely choose the versatility of an open-source microcode GPU that could be dynamically reprogrammed to have any of several different instruction sets. It would be significantly simpler than the hassle of designing with FPGAs because much of the infrastructure (floating point logic etc) would already be available hardcoded into the GPU's silicon.
  • by mandark1967 (630856) on Wednesday August 27, 2008 @11:03AM (#24765685) Homepage Journal

    Remove their heads from their collective rectum and correct the damn problems they have with their video cards and motherboard chipsets.

    I've been a loyal nVidia customer since the good old days of the Diamond V550 TNT card through the 8800GTX but they have really hosed up lately.

    My 780i board has major data corruption problems on the IDE channel, and my laptop is one of the ones affected by their recall, so I am not too pleased with their ability to execute lately...

  • And why not? (Score:5, Insightful)

    by geogob (569250) on Wednesday August 27, 2008 @11:08AM (#24765745)

    I wouldn't mind seeing more players in the computer processor industry. The headlines really make it sound like it would be a bad thing. Maybe I'm getting the headlines wrong, but having Nvidia present new alternatives in a market almost exclusively owned by Intel and AMD would be interesting.

  • From 2006 (Score:5, Insightful)

    by Alioth (221270) <no@spam> on Wednesday August 27, 2008 @11:16AM (#24765883) Journal

    "A GPU from 2006" sounds a lot like famous last words.

    I wonder if anyone at DEC made comments in a similar vein about Intel CPUs, back when the Alpha was so far ahead of anything Intel was making? NVidia's architect should not underestimate Intel; if he does, he does so at his company's peril.

  • by AKAImBatman (238306) * <akaimbatman@gUUU ... inus threevowels> on Wednesday August 27, 2008 @11:23AM (#24765981) Homepage Journal

    Is anyone actually surprised that the CEO is denying this?

    Not at all. As you say, he would have denied it even if NVidia WAS planning a CPU. What actually speaks volumes, IMHO, is the vehemence with which he denied it. Any CEO who's cover-denying a market move is not going to close his own doors by stating that the company could never make it in that space. He would give far weaker reasons, so that when the announcement comes the market will still react favorably to the new product.

    In other words: stick a fork in it, because this bit of tabloid reporting is dead.

  • by Bruce Perens (3872) * <bruce@perens.com> on Wednesday August 27, 2008 @11:42AM (#24766263) Homepage Journal

    I think the reason we've survived the other 35 companies who were making graphics at the start is that we've stayed focused.

    3DFx was the first company to publish Open Source 3D drivers for their 3D cards. nVidia sued them, then bought them at a discount, and shut down the operation. So, we had no Open Source 3D for another 5 years.

    That's not "staying focused". It's being a predator.

    Bruce

  • by Rufus211 (221883) <rufus-slashdot AT hackish DOT org> on Wednesday August 27, 2008 @12:06PM (#24766683) Homepage

    What on earth are you talking about? 3DFx died because it was horribly mismanaged and ran out of money. There were lawsuits, but 3dfx sued NV first in 1998, and then in 2000 NV counter-sued (source [bluesnews.com]). True, NV's countersuit came right before 3dfx died, but a simple lawsuit that has gone nowhere in the courts doesn't cause a company to go bankrupt overnight.

    Personally I'll believe one of my (ex-3dfx Austin) friend's explanation for their downfall: the fully stocked Tequila bar that was free to all employees. Or there's a whole list of problems leading to their decline on wikipedia [wikipedia.org].

  • by CodeBuster (516420) on Wednesday August 27, 2008 @12:35PM (#24767169)

    Maybe that is where some journalist got mixed up and where all this "nVidia is preparing an x86 chip" rumor began?

    This is what happens when technical information is filtered through the brain of a salesperson, manager, or executive. It comes out completely mangled on the opposite side or, even worse, it morphs into something which, while technically correct, is NOT the information that the non-technical person thought they were conveying (i.e. they have unknowingly modified the requirements specification in a way that is logically consistent from a technical standpoint, but will result in the wrong product being built).

  • by AKAImBatman (238306) * <akaimbatman@gUUU ... inus threevowels> on Wednesday August 27, 2008 @01:08PM (#24767625) Homepage Journal

    Otherwise we would be able to tell what he's doing, and he wouldn't be able to deny anything, no?

    No. Because any CEO who immediately kills the market he's about to enter with his own statements is a fool.

    If you want to get into the market of competing with Intel, you don't say that you could never make a CPU as good as Intel can.

  • by Kjella (173770) on Wednesday August 27, 2008 @02:55PM (#24769051) Homepage

    Who said price is the most interesting issue? I'd definitely choose the versatility of an open-source microcode GPU that could be dynamically reprogrammed to have any of several different instruction sets.

    As long as they're Turing complete, any of them can in principle do anything. Yes, then at least to me it comes down to price - if it's cheaper to have a car, a boat, and a plane than to make a transformer that can do all three, sucks at all three, and costs a bajillion more, I'll go for traditional chips, thank you very much.

  • by Steveftoth (78419) on Wednesday August 27, 2008 @03:16PM (#24769267) Homepage

    They are very good at doing research in making their chips very cheap to make, and they own the whole stack of production from start to finish. This is how they have managed to make it despite many, many missteps along the way.

    nVidia doesn't own the factories that make their chips; they just design them and use foundries like TSMC. nVidia would be stupid to compete with Intel in the same space (x86 CPUs) until they own fabs and can build chips as efficiently as Intel can.

    AMD was the only other one doing it, as they tried their best to own all their own fabs; however, they are running in the red and are trying to sell some of them now. We'll see if they can pull it together, but they are still one of the only other companies out there that actually tries to build chips from start to finish.

    Intel's latest graphics offering is going to fail, not because they don't have the hardware (actually their new Larrabee looks really fast), but because their graphics drivers have always stunk, and there is little evidence to suggest that they will be able to make a leap forward in graphics driver quality that will make their solution better than AMD's or nVidia's. They have to write full DX9, DX10, and OpenGL drivers to really compete with nVidia, then they have to optimize all those drivers for all the popular games (because nobody will rewrite Doom, HL, UT, FarCry, etc. just for this new graphics card).

    It could happen, but will it?

    I do hope that Larrabee turns out to be an awesome coprocessor for other tasks. We'll just have to see if people actually port their code to it.

  • by MarcQuadra (129430) on Wednesday August 27, 2008 @03:36PM (#24769507)

    Transmeta tried that. It was slow, expensive, and inconsistent. Also, nobody ever used any other 'instruction sets' besides x86, mostly because that's the most-common-denominator in the computing world.

    It sucks, it's not the -best- way to do it, but it's the way the market seems to favor. Just ask Apple, Sun, DEC, and HP.
