
Nvidia Firmly Denies Plans To Build a CPU

Posted by timothy
from the this-time-we-mean-it dept.
Barence writes "A senior vice president of Nvidia has denied rumours that the company is planning an entry into the x86 CPU market. Speaking to PC Pro, Chris Malachowsky, co-founder and senior vice president, was unequivocal. 'That's not our business,' he insisted. 'It's not our business to build a CPU. We're a visual computing company, and I think the reason we've survived the other 35 companies who were making graphics at the start is that we've stayed focused.' He also pointed out that such a move would expose the company to fierce competition. 'Are we likely to build a CPU and take out Intel?' he asked. 'I don't think so, given their thirty-year head start and billions and billions of dollars invested in it. I think staying focused is our best strategy.' He was also dismissive of the threat from Intel's Larrabee architecture, following Nvidia's chief architect calling it a 'GPU from 2006' at the weekend."
This discussion has been archived. No new comments can be posted.

  • Inaccurate headline (Score:5, Informative)

    by TheRaven64 (641858) on Wednesday August 27, 2008 @10:42AM (#24765331) Journal
    nVidia are building a CPU, a Cortex A9 derivative with a GPU on-die and a load of other nice features. The summary states that they're not building an x86 CPU, but this is not what the headline says.
    • x86 rumors origin ? (Score:4, Interesting)

      by DrYak (748999) on Wednesday August 27, 2008 @10:59AM (#24765617) Homepage

      Currently nVidia is partnering with VIA for small form factor x86 boxes. And they have made several presentations about a combination of (VIA's) x86-64 Isaiah and (their own) embedded GeForce,
      touting that the platform would be the first small form factor box able to sustain Vista in all its DX10 and full Aero glory.

      Maybe that is where some journalist got mixed up and where all this "nVidia is preparing an x86 chip" rumor began?

      • by CodeBuster (516420) on Wednesday August 27, 2008 @12:35PM (#24767169)

        Maybe that is where some journalist got mixed up and where all this "nVidia is preparing an x86 chip" rumor began?

        This is what happens when technical information is filtered through the brain of a salesperson, manager, or executive. It comes out completely mangled on the other side or, even worse, it morphs into something which, while technically correct, is NOT the information that the non-technical person thought they were conveying (i.e. they have unknowingly modified the requirements specification in a way that is logically consistent from a technical standpoint, but will result in the wrong product being built).

    • by LWATCDR (28044)

      Exactly. It is a lot easier to go into the mobile space than x86.

    • nVidia are building a CPU, a Cortex A9 derivative with a GPU on-die and a load of other nice features

      A CPU is a sequential processor and, as such, it has no business being in a parallel processor. Heterogeneous processors are hideous beasts that will be a pain in the ass to program. What the world needs is a pure MIMD vector processor in which every instruction is an independent vector that can be processed in parallel. There is no reason to have a CPU for general purpose programs and a GPU for graphics an

  • Anyone Surprised? (Score:5, Interesting)

    by Underfoot (1344699) on Wednesday August 27, 2008 @10:42AM (#24765335)

    Is anyone actually surprised that the CEO is denying this? Even if the rumors were true, letting news out to market about it would give Intel time to prepare a response (and legal action).

    • Re: (Score:2, Insightful)

      Even if the rumors were true, letting news out to market about it would give Intel time to prepare a response (and legal action).

      I don't get the legal action part. Is the x86 architecture patented by Intel? Even if it is, wouldn't the patent have expired by now? After all, it's more than 30 years old. Do AMD, VIA, etc. pay licensing fees to Intel for building processors using the x86 architecture? If so, why can't NVidia?

    • Re: (Score:2, Interesting)

      by Hal_Porter (817932)

      Is anyone actually surprised that the CEO is denying this? Even if the rumors were true, letting news out to market about it would give Intel time to prepare a response (and legal action).

      The original story came from Charlie at The Inquirer. Charlie and NVidia hate each other.

      • Re: (Score:2, Interesting)

        The original story came from Charlie at The Inquirer. Charlie and NVidia hate each other.

        Possibly related to Charlie's vast holdings of AMD stock...

        • I heard it was some sort of falling out between Charlie and NVidia over some issue I don't know which has turned into a long running feud. He writes stuff to piss them off, they try to cut off his information about them. There is a cycle.

          So I don't believe a word he says about NVidia any more.

      • by Gromius (677157)
        I thought the Inquirer hated everybody and mostly runs sensationalist news stories that turn out to be a bit iffy in the end. To me they have less credibility than some random guy's blog.

        In the interests of fairness, this is maybe because I don't regularly read them, only what's picked up by slashdot/other news stories, which tend by their very nature to be sensationalist and often made up. Which is why I don't regularly read them :)
    • by AKAImBatman (238306) * <akaimbatman @ g m a i l . c om> on Wednesday August 27, 2008 @11:23AM (#24765981) Homepage Journal

      Is anyone actually surprised that the CEO is denying this?

      Not at all. As you say, he would have denied it even if NVidia WAS planning a CPU. What actually speaks volumes IMHO, is the vehemence with which he denied it. Any CEO who's cover-denying a market move is not going to close his own doors by stating that the company could never make it in that space. He would give far weaker reasons so that when the announcement comes the market will still react favorably to their new product.

      In other words: stick a fork in it, because this bit of tabloid reporting is dead.

      • by dnwq (910646)
        And if what you say is true, any CEO who's intending to cover-deny would be just as vehement as NVidia's CEO is now.

        Otherwise we would be able to tell what he's doing, and he wouldn't be able to deny anything, no?
        • Re: (Score:3, Insightful)

          by AKAImBatman (238306) *

          Otherwise we would be able to tell what he's doing, and he won't be able to deny anything, no?

          No. Because any CEO who immediately kills the market he's about to enter with his own statements is a fool.

          If you want to get into the market of competing with Intel, you don't say that you could never make a CPU as good as Intel can.

  • Reprogrammable GPU? (Score:5, Interesting)

    by Wills (242929) on Wednesday August 27, 2008 @10:44AM (#24765361)
    When hell freezes over, they could release a GPU where the instruction set is itself microprogrammable with open-source design, and then end users could decide whether they want to load the GPU's microcode with an x86 instruction set, a dsp set, or whatever.
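The "load your own instruction set" idea above can be sketched in software: a toy execution engine whose opcode table is loadable data, standing in for reprogrammable microcode. All names and opcodes below are invented for illustration; real microcode operates far below this level.

```python
# Toy model of a "microprogrammable" processor: the opcode table is
# loadable data, so one engine can expose different instruction sets.
def run(program, microcode, stack=None):
    """Execute a list of (opcode, arg) pairs against a loadable opcode table."""
    stack = stack if stack is not None else []
    for op, arg in program:
        microcode[op](stack, arg)  # dispatch through the loaded "microcode"
    return stack

# One loadable instruction set: a small stack machine.
stack_isa = {
    "push": lambda s, a: s.append(a),
    "add":  lambda s, a: s.append(s.pop() + s.pop()),
    "mul":  lambda s, a: s.append(s.pop() * s.pop()),
}

result = run([("push", 3), ("push", 4), ("add", None),
              ("push", 2), ("mul", None)], stack_isa)
# result[-1] == 14, i.e. (3 + 4) * 2
```

Swapping `stack_isa` for a different table gives the same engine a different instruction set, which is exactly the versatility the comment is asking for.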
    • Re: (Score:2, Insightful)

      by Fourier404 (1129107)
      I would be very, very surprised if that was any cheaper than just buying 2, one manufactured as a GPU, the other as a CPU.
      • Re: (Score:3, Insightful)

        by Toffins (1069136)
        Who said price is the most interesting issue? I'd definitely choose the versatility of an open-source microcode GPU that could be dynamically reprogrammed to have any of several different instruction sets. It would be significantly simpler than the hassle of designing with FPGAs because much of the infrastructure (floating point logic etc) would already be available hardcoded into the GPU's silicon.
        • by Fizzl (209397) <`ten.lzzif' `ta' `lzzif'> on Wednesday August 27, 2008 @01:36PM (#24768053) Homepage Journal

          And I want a microwave that can be bludgeoned into a bicycle. Where do you people get the idea that you can do hardware in software?

          • by Ant P. (974313)

            I dunno... Intel maybe? They haven't made x86-opcode hardware in years now.

          Like the other Anons... see FPGAs, or EEPROM, or a host of other firmware-y devices for the "idea" of doing hardware in software... or software in hardware. It's quite likely that 10-20 years down the road we'll be using machines where we can completely reconfigure the hardware on the fly. Not that I'm saying that it will happen, or that it will be ubiquitous... but with FPGA technology progressing like it is, the opportunities for reconfigurable computing in academia, military/govt, and the consumer markets are loo
        • Re: (Score:3, Interesting)

          by TheLink (130905)
          But who really wants that sort of versatility- who wants so many different instruction sets? The compiler writers? I doubt more than a few people want that.

          Would such a GPU be faster? It might be faster for some custom cases, but is it going to be faster at popular stuff than a GPU that's been optimized for popular stuff?

          The speed nowadays is not so much because of the instruction set, it's the fancy stuff the instruction set _control_ e.g. FPU units, out of order execution, trace cache, branch prediction e
        • by Kjella (173770) on Wednesday August 27, 2008 @02:55PM (#24769051) Homepage

          Who said price is the most interesting issue? I'd definitely choose the versatility of an open-source microcode GPU that could be dynamically reprogrammed to have any of several different instruction sets.

          As long as they're Turing complete, any of them can in principle do anything. Yes, then at least to me it comes down to price - if it's cheaper to have a car, boat and plane than making a transformer that can do all three, but sucks at all three and costs a bajillion more, I'll go for traditional chips, thank you very much.

        • by MarcQuadra (129430) on Wednesday August 27, 2008 @03:36PM (#24769507)

          Transmeta tried that. It was slow, expensive, and inconsistent. Also, nobody ever used any other 'instruction sets' besides x86, mostly because that's the most-common-denominator in the computing world.

          It sucks, it's not the -best- way to do it, but it's the way the market seems to favor. Just ask Apple, Sun, DEC, and HP.

    • Difficult (Score:4, Informative)

      by DrYak (748999) on Wednesday August 27, 2008 @10:55AM (#24765551) Homepage

      Microcode upgrades are possible for CPUs that have a big, complex, reprogrammable pipeline like the current top-of-the-line CPUs, or CPUs where the pipeline is handled in software (like the Transmeta chips).

      GPUs, on the other hand, have a very short and simplistic pipeline which is hard-fixed. They draw their tremendous performance from the fact that this pipeline drives ultra-wide SIMD units which process a fuck-load of identical threads in parallel.

      But there's not much you could reprogram currently. Most of the die is just huge caches, huge register files, and a crazy amount of parallel floating-point ADD/MUL blocks for the SIMD. The pipeline is completely lost amid the rest.
      (Whereas on a CPU, even if the cache dwarfs the other structures, there are quite complex logic blocks dedicated to instruction fetching and decoding.)
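The ultra-wide SIMD execution described above can be caricatured in a few lines of plain Python: one instruction stream drives every lane in lockstep, and a branch becomes a per-lane mask rather than per-lane control flow. This is a conceptual sketch only, not how any real GPU is programmed.

```python
# Lockstep SIMD caricature: one instruction sequence, many lanes,
# "branching" handled by masks instead of per-lane control flow.
def simd_abs(lanes):
    mask = [x < 0 for x in lanes]    # which lanes take the negate path
    negated = [-x for x in lanes]    # every lane executes the path anyway...
    return [n if m else x            # ...and the mask selects the live result
            for x, n, m in zip(lanes, negated, mask)]

print(simd_abs([3, -1, 0, -7]))  # [3, 1, 0, 7]
```

The cost of divergence falls out directly: both sides of a branch are executed for all lanes, so work in masked-off lanes is simply wasted.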

      • Re: (Score:3, Interesting)

        by Wills (242929)
        I was aiming for the extreme reprogrammability and versatility of an open-source microcode CPU design with SIMD, RISC and CISC sections all on a single die. Sure, the trade-off is that you don't get as much capability in each subsection (compared to the capabilities of a dedicated GPU, or a dedicated modern CPU) because the sub-sections all have to fit inside the same total area of silicon. But what you get instead is an open-source microcode CPU which has great versatility, without needing to go down the
        • Re:Difficult (Score:4, Informative)

          by billcopc (196330) <vrillco@yahoo.com> on Wednesday August 27, 2008 @12:57PM (#24767493) Homepage

          Let me guess: you've never read anything about microprocessor engineering, have you?

          What you describe is what every non-engineer dreams of. You want a chip that any idiot can reprogram, without knowing the "less simple" ways of FPGAs. That's kind of like saying you want a car that gets 200 miles to the gallon, can park in a shoebox and carry 20 kids in the back seat - oh, and it drives itself automagically so your kids can take themselves to soccer practice without bugging you.

          The reason why no one ever builds such monstrosities is because there is simply no point to it, when you can have purpose-built chips designed and fabbed for a fraction of the cost. People don't stop breathing just because their device needs 2 distinct chips instead of one jesus-truck.

            Let me guess: you've never read anything about microprocessor engineering, have you?

            Actually I do my own FPGA designs, and write microcode too. Where do you get that I "want a chip that any idiot can reprogram"? I don't. I want an open-source microcode chip on the market that I can reprogram. That's not something "every non-engineer dreams of." Purpose-built chips are fixed in purpose. I don't want that. I want versatility in a single chip. That's why I want an open-source microcode chip. I would use tha

          • But I like Jesus trucks....
        • Re: (Score:3, Informative)

          by dreamchaser (49529)

          What you are describing is a pipe dream. Even *if* they managed to do something like that, performance would be utter crap, die size would be huge, and the odds are it just plain would suck.

      • by rew (6140)

        Still, I can imagine a top Nvidia engineer spending a couple of weekends on adapting the GPU to run x86 code. If that showed promise, they could put a team on it, finish the project, and make a surprise move.

        You correctly state that a GPU is usually a SIMD machine. So, they have an instruction fetch for the "I" in SIMD. They also have huge IO bandwidth for the "MD" part. If you go MIMD (multicore in modern CPU terminology), you also need the huge IO bandwidth for the MI part. That's already done! Th

    • by HerculesMO (693085) on Wednesday August 27, 2008 @11:01AM (#24765663)

      If hell froze over they wouldn't have to worry about the cooling on their chips.

      I guess that's a plus.

    • by rew (6140)

      At the computer architecture lab here at the University of Delft, we built a CPU and then tried to emulate x86 on it. It didn't go fast.

      Then a guy from HP visited. A year later HP came out with a design awkwardly similar to what we came up with. But they did emulate x86 quickly. The trick to a quick emulator is that you don't have to handle corner cases. So if your architecture has an "add" instruction that leaves the flags register exactly as the emulated architecture would, then you'll be able to emulate quickly
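The corner cases alluded to above are largely flag semantics: an emulator whose host ISA does not reproduce the guest's flags must recompute them after every arithmetic instruction. A rough, illustrative sketch of what an emulated 8-bit x86-style ADD has to do (simplified; real x86 also tracks AF and PF):

```python
# Flags a naive emulator recomputes after every 8-bit ADD. A host whose
# own ADD already leaves identical flags can skip all of this work.
def add8(a, b):
    full = a + b
    result = full & 0xFF
    flags = {
        "CF": full > 0xFF,          # unsigned carry out of bit 7
        "ZF": result == 0,          # result is zero
        "SF": bool(result & 0x80),  # sign bit of the result
        # signed overflow: operands agree in sign, result disagrees
        "OF": bool(~(a ^ b) & (a ^ result) & 0x80),
    }
    return result, flags

r, f = add8(0x7F, 0x01)
# r == 0x80; OF and SF are set, CF and ZF are clear
```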

  • Focused (Score:5, Insightful)

    by Akita24 (1080779) on Wednesday August 27, 2008 @10:45AM (#24765375)
    Yeah, they've stayed focused on graphics chips, that's why there are so many motherboards with nVidia chip sets .. *sigh*
    • by Anonymous Coward
      Yes, no CPU for them. Just GPU as CPU motherboards and such.
    • by frieko (855745)
      Well, for quite a while an nForce chipset was the only (good) way to connect your Athlon to your GeForce. Can't sell a car if there's no roads.
      • Re: (Score:3, Informative)

        by microbrew_nj (764307)
        I can think of a few good reasons for Nvidia to roll their own chipsets. SLI is one. The market for integrated motherboards (with their chipset) is another.
    • by Akita24 (1080779)
      I never said they didn't have a good/valid reason, in fact, I'm damn glad they did. However, they *have* focused on something else, even if the reason for it was forwarding their graphics agenda. :-)
    • by Kjella (173770)

      Yeah, they've stayed focused on graphics chips, that's why there are so many motherboards with nVidia chip sets .. *sigh*

      Of course, if you want to deliver integrated chipsets (you know, the other, much higher volume market for graphics chips), then you have to be able to build the rest of that chip as well, or it wouldn't be integrated. Seeing as how graphics capability becomes more and more important while the other features seem quite stable, it'd be much stranger for them *not* to be in that market IMO.

    • The primary reasons why motherboards don't include as many nVidia chipsets (or any other good chipsets for that matter) as they might otherwise are (1) cost, (2) heat, and (3) space. The mainboard attempts to combine as many functions as are practical into the smallest and cheapest to manufacture area possible. Those who want the nVidia chipsets were always free to purchase the video card of their choice aftermarket and install that into the graphics slot on their motherboard. For everyone else (mostly cons
      • by Jthon (595383)

        Umm, no. The volume in chipset sales isn't from individuals buying boards but from OEM sales. Intel is so widely used because they basically throw in the chipset with integrated graphics for free when you buy their CPUs.

        The profit margins on CPUs are something like 500%, and the chipsets sell at or below cost to help motivate CPU sales.

        Considering the vast majority of PCs sold are to businesses that don't need anything that couldn't have been done with a VGA controller from 1995, decent mainboard/graphi

    Between nForce and their new ARM11 CPU, it's hard to take comments like "we've stayed focused" too seriously.

  • Only reason (Score:3, Insightful)

    by Z00L00K (682162) on Wednesday August 27, 2008 @10:46AM (#24765393) Homepage

    The only reasons that they may build a chip for x86 (64-bit or not) would be either to use it for a special application or as a proof of concept.

    A GPU and a CPU are different, but it may be a way to test if a GPU architecture can be applied to a CPU with a classic instruction set. The next step is to sell the knowledge to the highest bidder.

    To compete with Intel would just be futile.

    • To compete with Intel would just be futile.

      Hopefully we won't be saying the same about AMD in another few years.

    • Re: (Score:3, Interesting)

      by ratboy666 (104074)

      How is a "GPU" different from a "CPU"? If you take them to be the SAME, you end up with Intel's Larrabee. If you take them as somehow DIFFERENT, you end up with nVidia's proclamation.

      If they are considered the SAME, but with different performance tunings, other applications begin to open up.

      As an example: it is currently true that the "GPU" is given an exorbitant amount of resources to do one thing -- create visuals for games.

      And that's it. It contains a significant amount of the system memory, and processing

      • Re:Only reason (Score:4, Informative)

        by Lisandro (799651) on Wednesday August 27, 2008 @03:12PM (#24769231)

        How is a "GPU" different from a "CPU"?

        The GPU is a specialized (vector) processor, while the CPU is a general purpose one. What the GPU does, it does great. But its reach ends pretty much there.

        The nVidia is programmed with a specific higher-order assembly language. We rely solely on the hardware vendor for tools. I think that this is UNIQUE in the (mass-market) processor world. And this is why Intel, with an x86-compatible GPU, is such a threat.

        You're confused. Intel is not working on a "x86 GPU". Intel is working on a new GPU design - the kicker being that this is a relatively high performance one, instead of the kind of GPUs they offered so far (feature packed, but lacking in performance). The x86 instruction set has nothing to do with it, and in fact, has nothing to do with GPU programming, which is a completely different beast.

        Can anyone else produce an OpenGL shader compiler for the nVidia? Or, better yet, extend it to do NON-shader tasks. How about for the AMD?

        If I'm not mistaken, nVidia's Cg compiler is now open-sourced. So yes.

        • by ratboy666 (104074)

          The GPU is (generally) a vector processor with VERY limited branching capability, and VERY limited data sourcing. But, these things can be "fixed".

          Yes, Intel is working on an "x86 GPU".

          "Larrabee can be considered a hybrid between a multi-core CPU and a GPU, and has similarities to both. Its coherent cache hierarchy and x86 architecture compatibility are CPU-like, while its wide SIMD vector units and texture sampling hardware are GPU-like." (from http://en.wikipedia.org/wiki/Larrabee_(GPU) [wikipedia.org] )

          As to Cg being "o

  • He seems rather confident with a two year head start on a company that has "billions and billions of dollars."

  • Just a thought... (Score:5, Insightful)

    by darkvizier (703808) on Wednesday August 27, 2008 @10:48AM (#24765441)
    If you're 30 years behind them in their market, and they're 2 years behind you in yours, maybe it's not wise to be "dismissive of the threat" ?
    • by Kjella (173770)

      If you're 30 years behind them in their market, and they're 2 years behind you in yours, maybe it's not wise to be "dismissive of the threat" ?

      You're comparing apples to oranges. nVidia has 13 years of experience in the market (NV1 - 1995) but it doesn't say anything about how fast someone else could catch up or how far they'd stay behind. Anyone could shave 20+ years off Intel's "head start" easily, it's the last few years to make a competitive product that are hard. nVidia could within a few years produce a processor some years behind Intel in technology, but it'd be marketwise dead on arrival. If Intel really is 3+ (you see any Larrabees this y

      • Yeah, I agree. His wording was a bit pretentious, but I expect both companies will be in the game for a long time yet.

        Regardless though, our hardware is finally going parallel. From a programmer's point of view, I'm just very happy to see things like CUDA [wikipedia.org] emerging, which will make parallel programming a whole lot more feasible. I think we're going to see some really impressive things developed as a result of this.
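The CUDA model mentioned above amounts to writing a scalar kernel that the runtime launches once per thread index over a grid. That shape is easy to mimic in plain Python (a conceptual sketch only; real CUDA kernels are written in C/C++ and run on the device).

```python
# Miniature of the CUDA launch model: a scalar "kernel" invoked once per
# thread index. A real GPU runs these bodies in parallel, not in a loop.
def launch(kernel, n_threads):
    for tid in range(n_threads):
        kernel(tid)

def saxpy_kernel(tid, a, x, y, out):
    out[tid] = a * x[tid] + y[tid]   # each thread handles one element

x, y = [1.0, 2.0, 3.0], [10.0, 20.0, 30.0]
out = [0.0] * len(x)
launch(lambda tid: saxpy_kernel(tid, 2.0, x, y, out), len(x))
# out == [12.0, 24.0, 36.0]
```

The appeal for programmers is that the kernel is ordinary scalar code; the parallelism lives entirely in the launch.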

    • Doesn't matter how far behind you are in their market if the only thing in question is your own. NVIDIA has consistently put out vastly superior graphics hardware than Intel.
    • Re: (Score:1, Interesting)

      by Anonymous Coward

      Some would say that the way we use devices is changing; that feature-packed cell phones, UMPCs, and specialist devices like consoles are beginning to dominate the home space. These platforms often don't use an x86 CPU. They use a RISC CPU like an ARM or a Freescale chip.
      These people are significant rivals to Intel.
      The XBOX and the PS2 both have quasi-CISC CPU chips in them designed by IBM.

      What I'm saying is that although Intel probably is now the dominant player in the x86 market, this is simply leading to a lot

  • If more companies entered the same market, that would give us more choices and better prices. I say go for it, Nvidia: make a CPU and see how you do against Intel and AMD.

    I really wish that we could have the same socket in the motherboard for a CPU from Intel, AMD, Nvidia, etc. That would rock and give a real head-to-head test of which CPU is best for what you are doing. Never happen, but it would be cool to see.

    • by gnick (1211984)

      If more companies entered the same market that would give us more choices and better prices. I say go for it Nvidia make a cpu and see how you do against Intel and AMD.

      No, I do not think that would be a good thing. The up-front R&D cost for making CPUs is huge. Fabricating them ain't cheap either. Sure, NVIDIA has a lot of talent and would have a big jump on the R&D. And they have fabrication facilities that could be retuned for CPUs instead of GPUs. But I think that the end result of NVIDIA attempting to compete with Intel/AMD on the x86 CPU front would be death or serious damage to NVIDIA and we'd lose competition on the graphics card market rather than gai

      • by ThisNukes4u (752508) * <tcoppi@[ ]il.com ['gma' in gap]> on Wednesday August 27, 2008 @11:40AM (#24766227) Homepage
        Actually nVidia doesn't own any fabs; they contract out all their chips to TSMC, same as ATI. Although now ATI/AMD are going to be making their Fusion chips at TSMC, so TSMC will definitely have the expertise to make x86 chips in the near future.
        • by gnick (1211984)

          Absolutely correct. Perhaps I should have said 'access to fabrication facilities' or 'fabrication relationships'. The point is that they have no resource issues barring them from the game, just a lot of catch-up work, stiff competition, and the good sense to lack motivation.

          'Decide what you're going to do and focus on doing it well' is a good business model and, whether you're an NVIDIA fan or not, that's certainly what they're trying. And, so far, it's working out a lot better for them than a lot of the

      • The real losers would be Via and AMD. If NVidia made a big entry into the x86/x86-64 space, they would take as much or more market share from the smaller players as from Intel. NVidia would be poorly served by knocking Via out and especially by knocking AMD out. Even though those companies compete for graphics dollars, they give NVidia somewhere to put its graphics and chipsets other than on Intel-CPU boards.

    • There was such a socket for some time, Socket 7, around the time of the AMD K6 generation. You could put most Intel and AMD CPUs of the era into the same motherboard.

      • by MBGMorden (803437)

        Not just Intel and AMD. There was a time when you could use an Intel, AMD, Cyrix, IDT, or a Rise (and I'd bet even a couple more) CPU all in the same motherboard. Back then I didn't even DREAM of building a machine with an Intel chip - Cyrix and AMD were less than half the cost (close to 1/3rd the cost in some areas). And when those costs were in the hundreds of dollars for entry level stuff (rather than the $35 that you can get a budget CPU for now), it really made a difference.

        Of course, that was when

      • Indeed, and not just Intel and AMD either, but Cyrix and IDT as well. Then Intel moved to Slot 1, which IIRC involved some proprietary stuff that stopped anyone else using it. The competitors stayed on Socket 7 for a while, then AMD moved to Slot A and the others either died out or moved to processors soldered directly to the motherboard.

        • by cnettel (836611)
          Mostly correct, but I actually think that VIA was Socket 370 (PIII) compatible for a while, and also stayed on a similar bus even when Tualatin was all but obsolete.
    • by swordgeek (112599)

      Nice idea, but no.

      CPU manufacture has become the most expensive part of computing. The cost of designing, prototyping, and then fabricating CPUs is INSANE! Worse, the price grows fantastically as the trace-size shrinks. It's been suggested that one of the reasons Intel moved so aggressively from 65nm to 45nm is to push AMD to the sidelines.

      nVidia is roughly five percent the size of Intel. Trying to enter a market outside of their core competence against a behemoth like that is suicide.

  • rumour machine (Score:3, Insightful)

    by Anonymous Coward on Wednesday August 27, 2008 @10:54AM (#24765533)

    Rather handy that this rumour gives Nvidia, a GPU company, the chance to point out how futile it would be for them to try to enter the CPU market... then point over to Intel, a CPU company, trying to make a GPU...

  • Remove their heads from their collective rectum and correct the damn problems they have with their video cards and motherboard chipsets.

    I've been a loyal nVidia customer since the good old days of the Diamond V550 TNT card through the 8800GTX but they have really hosed up lately.

    My 780i board has major data corruption problems on the IDE channel, and my laptop is one of the ones affected by their recall, so I am not too pleased with their ability to execute lately...

    • by Tuoqui (1091447)

      Well said...

      Don't go breaking into someone else's house while yours is burning down.

  • And why not? (Score:5, Insightful)

    by geogob (569250) on Wednesday August 27, 2008 @11:08AM (#24765745)

    I wouldn't mind seeing more players in the computer processor industry. The headlines really make it sound like it would be a bad thing. Maybe I'm getting the headlines wrong, but having Nvidia presenting new alternatives to a market almost exclusively owned by Intel and AMD would be interesting.

    • I completely agree... I generally do a major upgrade or new build every 2-3 years, and I was on a tight budget earlier this year when I performed the ritual. It was a nice moment as a consumer to be able to buy a comparable (for my needs) cpu from AMD for 2/3 the cost of the intel lineup. Sure, the Opteron X2 isn't gonna knock out a Core2Duo, but for my needs it was plenty, and considerably cheaper. It would be VERY nice to see a 3rd player in the game, especially if it was a company I trust as much as
  • From 2006 (Score:5, Insightful)

    by Alioth (221270) <no@spam> on Wednesday August 27, 2008 @11:16AM (#24765883) Journal

    "A GPU from 2006" sounds a lot like famous last words.

    I wonder if anyone at DEC made comments in a similar vein about Intel CPUs, when the Alpha was so far ahead of anything Intel was making? NVidia's architect should not underestimate Intel, if he does, he does it at his company's peril.

    • Re:From 2006 (Score:4, Interesting)

      by Lumpy (12016) on Wednesday August 27, 2008 @11:46AM (#24766339) Homepage

      The Alpha failed because the motherboards were $1300 and the processors were $2600. Nobody in their right mind bought the stuff when you could get Intel motherboards for $400 and processors for $800 (dual-proc boards, high-end processors).

      DEC died because they could not scale up to what the Intel side was doing. You had thousands of motherboards made per hour for Intel versus maybe 4 a day for Alpha. It's game over at that point.

      I loved the Alphas. I had a dual Alpha motherboard running Windows NT; it rocked as a server.

      • by cbreaker (561297)

        The same can be said about Itanium. The original Itaniums (and even the current ones) were so DAMNED expensive, and they didn't offer any real performance increase.

        What really killed Itanium was AMD's x64 extensions.

        Itanium will be around for awhile but it will never become commonplace outside of high end, massively SMP UNIX servers.

        • by ishobo (160209)

          Itanium will be around for awhile but it will never become commonplace outside of high end, massively SMP UNIX servers.

          They were never marketed as such. Itanium competes on the same playing field as Power and Sparc. The primary problem with Itanium was that it was three years behind schedule, rushed out to replace the aging PA and MIPS (and the end-of-lifed Alpha), and lost momentum because of its flaws. This happened years before 64-bit hit the x86 CPUs. The idea of moving x86 code to Itanium was not the compelling selling point for large enterprise customers; they had to replace their existing PA, Alpha, and MIPS systems.

          • by cbreaker (561297)

            Yeah, they were. Dell and HP were initially pushing Itanium servers running Windows hard. This was Intel's answer to the 64-bit question.

            Who knows, maybe if AMD didn't create AMD64 Itanium would have been more accepted and eventually the prices may have dropped some. But we'll never know, and I'm glad for that. I much prefer x86/x64 running the show, as it's accessible to everyone, including the enthusiast, for running server operating systems.

            HP has had some success with Itanium on their HP-UX machines.

      • by ishobo (160209)

        DEC died because they could not scale up to what the Intel side was doing. You had thousands of motherboards made per hour for Intel, with maybe 4 a day for Alpha. It's game over at that point.

        You clearly do not understand the high end market. You cannot compare low-end servers with P2 chips to systems based around Alpha (or Power, PA, etc). Alpha died because DEC was sold to Compaq (an Intel partner). Prior to the sale, Alpha systems were doing brisk business. This was 1998, folks. The P3 would not be released until the following year, and Itanium would not see the light of day until 2001.

      • ... nobody in their right mind bought the stuff...

        ...I had a dual-Alpha motherboard running Windows NT, and it rocked as a server.

        So, would it be fair to say that you weren't in your right mind? ;-)

    • by schwaang (667808)

      It's just the time-honored sports tradition of trash-talking your opponent. One example was when DEC's CEO Ken Olsen [wikipedia.org] famously said that "Unix is snake-oil".

      That's just hilarious, Ken, ya Fred Thompson ugly dinosaur-scaly bastard, since a few years later I bought a DEC Alpha from you running Ultrix instead of VMS.

    • by cbreaker (561297)

      Yea, but think about it: A good GPU from 2006 is still PRETTY DAMNED GOOD!

      I'm still using an AGP 6800GT in one of my machines, and it's still trucking. I can't run everything at high quality but it's usable.

      Yesterday, Intel made a GPU as good as a GPU from 2002. Today it's 2006. Tomorrow they might be competitive. And honestly, with Intel GPU specs being FAR more open than nVidia's or ATI's, I welcome it. We might actually be able to get GOOD graphics with completely open-source drivers on Linux.

    • by CAIMLAS (41445)

      Yep: it doesn't matter if the Intel technology is akin to something nVidia was doing 2 - or 10 - years ago. What matters

      Hell, Microsoft has made that their primary means of income for the past 20 years through superior marketing and underhanded business practices.

  • by Anonymous Coward

    Rewrite the software in place to run on a different architecture (whatever their latest GPUs implement). Maybe, just maybe GPUs have evolved to a point where interpreted generic-x86 wouldn't be (completely) horrible.

    • That would be interesting if you could turn a GPU into a general-purpose CPU. That way they would have a CPU without having to invest many additional resources into developing it, using the same core for both. But I have no idea if that is possible. It is likely that the GPU actually has less processing power than a current CPU, so it might not be nearly as fast as regular CPUs. It could work for the low-end market or embedded. The ISA, though, may be designed around 3D graphics operations, and perhaps you wouldn't hav

  • by Anonymous Coward

    http://www.hackthematrix.org/matrix/pics/m3/arch/1.gif

  • by Bruce Perens (3872) * <bruce@perens.com> on Wednesday August 27, 2008 @11:42AM (#24766263) Homepage Journal

    I think the reason we've survived the other 35 companies who were making graphics at the start is that we've stayed focused.

    3DFx was the first company to publish Open Source 3D drivers for their 3D cards. nVidia sued them, then bought them at a discount, and shut down the operation. So, we had no Open Source 3D for another 5 years.

    That's not "staying focused". It's being a predator.

    Bruce

    • by Rufus211 (221883) <(gro.hsikcah) (ta) (todhsals-sufur)> on Wednesday August 27, 2008 @12:06PM (#24766683) Homepage

      What on earth are you talking about? 3DFx died because it was horribly mismanaged and ran out of money. There were lawsuits, but 3dfx sued NV first in 1998 and then in 2000 NV counter-sued (source [bluesnews.com]). True NV's countersuit was right before 3dfx died, but a simple lawsuit that's gone nowhere in the courts yet doesn't cause a company to go bankrupt overnight.

      Personally I'll believe one of my (ex-3dfx Austin) friend's explanation for their downfall: the fully stocked Tequila bar that was free to all employees. Or there's a whole list of problems leading to their decline on wikipedia [wikipedia.org].

      • Pixar has had a great many employee perks, starting with cohabitant insurance benefits long before they were profitable. It's not very well known that they went bankrupt, repurchased employee stock, and refinanced once, although with Steve Jobs as the only major creditor they didn't need to go through formal bankruptcy in court.

        They asked a lot of employees, and the benefits had to match that.

        I think nVidia's lawsuit was strategically positioned to be the straw that closed out additional investment prospects.

      • by rtechie (244489) *

        3DFx died because NVIDIA crushed them with the GeForce. 3Dfx had already released a very disappointing product in the Banshee (it was buggy and slower than the Voodoo 2 SLI that preceded it). Hardware T&L, controversial at the time, proved to be a killer feature.

    • by alen (225700) on Wednesday August 27, 2008 @12:22PM (#24766955)

      3dfx's problem was they could never figure out how they sold their cards. They flip-flopped between making the cards themselves and having others make them, like Nvidia does. After so many flips, no one wants anything to do with you, because it's bad for business planning.

      nvidia has had it's current selling model for 10 years and only its partners have changed. if you want to sell video cards you can trust that if you sell cards based on nvidia's chips they won't pull the rug out from under you next year and decide to sell the cards themselves

      • by Bruce Perens (3872) * <bruce@perens.com> on Wednesday August 27, 2008 @12:41PM (#24767263) Homepage Journal
        Pixar had an OEM model too, back in its days of making hardware and software products (the Pixar image computer, Renderman, Renderman hardware acceleration) while waiting for the noncompete with Lucasfilm to run out. It's a very difficult way to run a business, because you have to pull your own market along with you, and you can't control them.

        It does look like 3DFx bought the wrong card vendor. They also spun off Quantum3D, then a card vendor, which is still operating in the simulation business.

    • Actually, it's both. Bruce - you just don't like predatory behaviour, and I don't either. Removing competition is a common tool to relax a rapid and expensive development pace.
    They are very good at doing research into making their chips very cheap to make, and they own the whole stack of production from start to finish. This is how they have managed to make it despite many, many missteps along the way.

    nVidia doesn't own the factories that they use to make their chips, they just design them and use factories like TSMC. nVidia would be stupid to compete with intel in the same space (x86 CPUs) until they own and can efficiently build chips like intel can.

    AMD was the only ones doing it as th

    • by breeze95 (880714)

      Intel's latest graphics offering is going to fail, not because they don't have the hardware (actually their new Larrabee looks really fast), but because their graphics drivers have always stunk, and there is little evidence to suggest that they will be able to make a leap forward in graphics driver quality that will make their solution better than AMD's or nVidia's. They have to write full DX9, DX10, and OpenGL drivers to really compete with nVidia, then they have to optimize all those drivers for all the popular games (cause nobody will re-write Doom, HL, UT, FarCry, etc., just for this new graphics card).

      It could happen, but will it?

      That's what they used to say about ATI drivers a few years ago. It didn't stop customers from flocking to ATI video cards. The reason for that was that ATI hardware was just as good as or better than Nvidia's. These days I don't hear too much fussing about AMD/ATI drivers. For graphics cards, hardware is the key. The best driver will not overcome hardware shortcomings, but drivers can be upgraded.

      There is no reason to think that Intel will have driver problems out the box. Drivers are nothing more than firmware, a

  • Nvidia has denied rumours that the company is planning an entry into the x86 CPU market

    Of course they've denied building an x86 CPU; they're working on an x64 model. 'nuff said.

  • by bagofbeans (567926) on Wednesday August 27, 2008 @05:58PM (#24771069)
    I don't see an unequivocal denial in the quotes. Just an implied no, and then answering a question with a question. If I were defining products at Nvidia, I would propose an updated Via C7 (CPU+GPU) product anyway, not a simple standalone CPU.

    "That's not our business. It's not our business to build a CPU. We're a visual computing company, and I think the reason we've survived the other 35 companies who were making graphics at the start is that we've stayed focused."

    "Are we likely to build a CPU and take out Intel?"
    • by cheier (790875)
      Trust me on this one: just because the article only has an implied no, from someone who wasn't actually the CEO, doesn't mean there wasn't an actual no somewhere. If the author of the article had decided to show up at the CEO's emerging companies summit Q&A session, the quote would have read more like "No, absolutely not," with pretty much the gist of the senior VP's quote appended on the end of that. Not much else to read into it unless you think the guy is lying, rather than batting around t
      • ...so let's presume that the CEO explicitly said no. I still expect Nvidia to offer a combined CPU+GPU combo. S'pose I am just annoyed that the reporter didn't explore the subject a bit more.
