



Nvidia Firmly Denies Plans To Build a CPU
Barence writes "A senior vice president of Nvidia has denied rumours that the company is planning an entry into the x86 CPU market. Speaking to PC Pro, Chris Malachowsky, co-founder and senior vice president, was unequivocal. 'That's not our business,' he insisted. 'It's not our business to build a CPU. We're a visual computing company, and I think the reason we've survived the other 35 companies who were making graphics at the start is that we've stayed focused.' He also pointed out that such a move would expose the company to fierce competition. 'Are we likely to build a CPU and take out Intel?' he asked. 'I don't think so, given their thirty-year head start and billions and billions of dollars invested in it. I think staying focused is our best strategy.' He was also dismissive of the threat from Intel's Larrabee architecture, following Nvidia's chief architect calling it a 'GPU from 2006' at the weekend."
Inaccurate headline (Score:5, Informative)
x86 rumors origin? (Score:4, Interesting)
Currently nVidia is partnering with VIA for small form factor x86 boxes, and they have made several presentations about a combination of (VIA's) x86-64 Isaiah and (their own) embedded GeForce.
They tout the platform as the first small form factor able to sustain Vista in all its DX10 and full Aero glory.
Maybe that is where some journalist got mixed up and where all this "nVidia is preparing an x86 chip" rumor began?
Re:x86 rumors origin? (Score:5, Insightful)
Maybe that is where some journalist got mixed up and where all this "nVidia is preparing an x86 chip" rumor began?
This is what happens when technical information is filtered through the brain of a salesperson, manager, or executive. It comes out completely mangled on the opposite side or, even worse, it morphs into something which, while technically correct, is NOT the information that the non-technical person thought they were conveying (i.e., they have unknowingly modified the requirements specification in a way that is logically consistent from a technical standpoint, but will result in the wrong product being built).
Re: (Score:2)
Exactly. It is a lot easier to go into the mobile space than x86.
The World Does Not Need Another CPU (Score:1)
nVidia are building a CPU, a Cortex-A9 derivative with a GPU on-die and a load of other nice features.
A CPU is a sequential processor and, as such, it has no business being in a parallel processor. Heterogeneous processors are hideous beasts that will be a pain in the ass to program. What the world needs is a pure MIMD vector processor in which every instruction is an independent vector that can be processed in parallel. There is no reason to have a CPU for general purpose programs and a GPU for graphics an
MOD PARENT DOWN (Score:2)
Anyone Surprised? (Score:5, Interesting)
Is anyone actually surprised that the CEO is denying this? Even if the rumors were true, letting news out to market about it would give Intel time to prepare a response (and legal action).
Re: (Score:2, Insightful)
Even if the rumors were true, letting news out to market about it would give Intel time to prepare a response (and legal action).
I don't get the legal action part. Is the x86 architecture patented by Intel? Even if it is, wouldn't the patent have expired by now? After all, it's more than 30 years old. Do AMD, VIA, etc. pay licensing fees to Intel for building processors using the x86 architecture? If so, why can't NVidia?
Re: (Score:1)
The /. article on the rumor goes into that quite a bit. http://hardware.slashdot.org/article.pl?sid=08/08/20/1917239/ [slashdot.org]
Re:Anyone Surprised? (Score:4, Informative)
Re:Anyone Surprised? (Score:5, Informative)
I don't get the legal action part. Is the x86 architecture patented by Intel? Even if it is, wouldn't the patent have expired by now? After all, it's more than 30 years old. Do AMD, VIA, etc. pay licensing fees to Intel for building processors using the x86 architecture? If so, why can't NVidia?
Yes. Various pieces of the x86 architecture that were developed within the last 20 years (notably, stuff related to the IA32 architecture of the 386, 486, Pentium and later lines) are all still under patent.
Patents filed before June 8, 1995 get a term of 17 years from the grant date or 20 years from the filing date, whichever is greater.
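To make the arithmetic concrete, here's a toy sketch (my own illustration, with a hypothetical filing/grant pair; real terms also depend on exact dates and extensions):

    #include <stdio.h>

    /* Pre-June-1995 US rule, simplified to whole years:
       the term runs to the later of grant + 17 or filing + 20. */
    static int old_rule_expiry(int filing_year, int grant_year)
    {
        int by_grant  = grant_year  + 17;
        int by_filing = filing_year + 20;
        return by_grant > by_filing ? by_grant : by_filing;
    }

    int main(void)
    {
        /* Hypothetical Pentium-era patent: filed 1993, granted 1996. */
        printf("expires %d\n", old_rule_expiry(1993, 1996));  /* prints "expires 2013" */
        return 0;
    }

So plenty of 386/486/Pentium-era patents were still live in 2008, which is the point being made here.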
Re: (Score:2, Interesting)
Is anyone actually surprised that the CEO is denying this? Even if the rumors were true, letting news out to market about it would give Intel time to prepare a response (and legal action).
The original story came from Charlie at The Inquirer. Charlie and NVidia hate each other.
Re: (Score:2, Interesting)
The original story came from Charlie at The Inquirer. Charlie and NVidia hate each other.
Possibly related to Charlie's vast holdings of AMD stock...
Re: (Score:1)
I heard there was some sort of falling out between Charlie and NVidia over some issue I don't know about, which has turned into a long-running feud. He writes stuff to piss them off; they try to cut off his access to information about them. There is a cycle.
So I don't believe a word he says about NVidia any more.
Re: (Score:2)
In the interests of fairness, this is maybe because I don't regularly read them, only what's picked up by Slashdot and other news stories, which tend by their very nature to be sensationalist and often made up. Which is why I don't regularly read them.
Re:Anyone Surprised? (Score:5, Insightful)
Not at all. As you say, he would have denied it even if NVidia WAS planning a CPU. What actually speaks volumes, IMHO, is the vehemence with which he denied it. Any CEO who's cover-denying a market move is not going to close his own doors by stating that the company could never make it in that space. He would give far weaker reasons so that when the announcement comes, the market will still react favorably to their new product.
In other words: stick a fork in it, because this bit of tabloid reporting is dead.
Re: (Score:2)
Otherwise we would be able to tell what he's doing, and he wouldn't be able to deny anything, no?
Re: (Score:3, Insightful)
No. Because any CEO who immediately kills the market he's about to enter with his own statements is a fool.
If you want to get into the market of competing with Intel, you don't say that you could never make a CPU as good as Intel can.
Reprogrammable GPU? (Score:5, Interesting)
Re: (Score:2, Insightful)
Re: (Score:3, Insightful)
Re:Reprogrammable GPU? (Score:4, Funny)
And I want a microwave that can be custom-bludgeoned into a bicycle. Where do you people get the idea that you can do hardware in software?
Re: (Score:1)
I dunno... Intel maybe? They haven't made x86-opcode hardware in years now.
Re: (Score:1)
Re: (Score:3, Interesting)
Would such a GPU be faster? It might be faster for some custom cases, but is it going to be faster at popular stuff than a GPU that's been optimized for popular stuff?
The speed nowadays is not so much because of the instruction set; it's the fancy stuff the instruction set _controls_, e.g. FPU units, out-of-order execution, trace cache, branch prediction e
Re:Reprogrammable GPU? (Score:4, Insightful)
Who said price is the most interesting issue? I'd definitely choose the versatility of an open-source microcode GPU that could be dynamically reprogrammed to have any of several different instruction sets.
As long as they're Turing complete, any of them can in principle do anything. Yes, then at least to me it comes down to price - if it's cheaper to have a car, boat and plane than to make a transformer that can do all three, suck at all three and cost a bajillion more, I'll go for traditional chips, thank you very much.
Re:Reprogrammable GPU? (Score:4, Insightful)
Transmeta tried that. It was slow, expensive, and inconsistent. Also, nobody ever used any other 'instruction sets' besides x86, mostly because that's the most common denominator in the computing world.
It sucks, it's not the -best- way to do it, but it's the way the market seems to favor. Just ask Apple, Sun, DEC, and HP.
Difficult (Score:4, Informative)
Microcode upgrades are possible for CPUs that have a huge, complex, reprogrammable pipeline, like the current top-of-the-line CPUs, or CPUs where the pipeline is handled in software (like the Transmeta chips).
GPUs, on the other hand, have a very short and simplistic pipeline which is hard-fixed. They draw their tremendous performance from the fact that this pipeline drives ultra-wide SIMD units which process a fuck-load of identical threads in parallel.
But there's nothing much you could reprogram currently. Most of the die is just huge caches, huge register files, and a crazy amount of parallel floating-point ADD/MUL blocks for the SIMD. The pipeline is completely lost amid the rest.
(Whereas on a CPU, even if the cache dwarfs the other structures, there are quite complex logic blocks dedicated to instruction fetching and decoding.)
Re: (Score:3, Interesting)
Re:Difficult (Score:4, Informative)
Let me guess: you've never read anything about microprocessor engineering, have you?
What you describe is what every non-engineer dreams of. You want a chip that any idiot can reprogram, without knowing the "less simple" ways of FPGAs. That's kind of like saying you want a car that gets 200 miles to the gallon, can park in a shoebox and carry 20 kids in the back seat - oh, and it drives itself automagically so your kids can take themselves to soccer practice without bugging you.
The reason no one ever builds such monstrosities is that there is simply no point to it, when you can have purpose-built chips designed and fabbed for a fraction of the cost. People don't stop breathing just because their device needs 2 distinct chips instead of one jesus-truck.
Hey, quit the dissing and flamebait (Score:2)
Actually I do my own FPGA designs, and write microcode too. Where do you get that I "want a chip that any idiot can reprogram"? I don't. I want an open-source microcode chip on the market that I can reprogram. That's not something "every non-engineer dreams of." Purpose-built chips are fixed in purpose. I don't want that. I want versatility in a single chip. That's why I want an open-source microcode chip. I would use tha
Re: (Score:1)
Re: (Score:3, Informative)
What you are describing is a pipe dream. Even *if* they managed to do something like that, performance would be utter crap, die size would be huge, and the odds are it just plain would suck.
Re: (Score:2)
Still, I can imagine a top Nvidia engineer spending a couple of weekends on adapting the GPU to run x86 code. If that showed promise, they could put a team on it, finish the project, and make a surprise move.
You correctly state that a GPU is usually an SIMD machine. So, they have an instruction fetch for the "I" in SIMD. They also have huge IO bandwidth for the "MD" part. If you go MIMD (multicore in modern CPU terminology), you also need the huge IO bandwidth for the MI part. That's already done! Th
Re:Reprogrammable GPU? (Score:5, Funny)
If hell froze over they wouldn't have to worry about the cooling on their chips.
I guess that's a plus.
Re: (Score:2)
At the computer architecture lab here at the University of Delft, we built a CPU and then tried to emulate x86 on it. It didn't go fast.
Then a guy from HP visited. A year later HP came out with a design awkwardly similar to what we came up with. But they did emulate x86 quickly. The trick to a quick emulator is that you don't have to handle corner cases. So if your architecture has an "add" instruction that leaves the flags register exactly as the emulated architecture would, then you'll be able to emulate quickly.
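To make that concrete, here is a rough sketch (my own illustration, not our lab's or HP's actual code) of what a software emulator has to do for a single 32-bit ADD when the host's flags don't line up with x86's:

    #include <stdint.h>

    typedef struct { int cf, zf, sf, of; } x86_flags;   /* carry, zero, sign, overflow */

    static uint32_t emulated_add32(uint32_t a, uint32_t b, x86_flags *f)
    {
        uint32_t r = a + b;
        f->cf = (r < a);                                 /* unsigned wrap-around           */
        f->zf = (r == 0);
        f->sf = (r >> 31) & 1;
        f->of = ((~(a ^ b) & (a ^ r)) >> 31) & 1;        /* same-sign inputs, flipped sign */
        return r;
    }

If the host's own add instruction already leaves an identical flags register, all of that bookkeeping disappears - which is exactly the corner-case-free fast path described above.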
Focused (Score:5, Insightful)
Re: (Score:1)
Re: (Score:2)
Re: (Score:3, Informative)
Re: (Score:1)
Re: (Score:2)
Yeah, they've stayed focused on graphics chips; that's why there are so many motherboards with nVidia chipsets... *sigh*
Of course, if you want to deliver integrated chipsets - you know, the other, much higher volume market for graphics chips - then you have to be able to build the rest of that chip as well or it wouldn't be integrated. Seeing as how the graphics capability becomes more and more important while the other features seem quite stable, it'd be much stranger for them *not* to be in that market IMO.
Re: (Score:2)
Re: (Score:1)
Umm, no. The volume on chipset sales isn't from individuals buying boards but from OEM sales. Intel is so widely used because they basically throw in the chipset with integrated graphics for free when you buy their CPUs.
The profit margins on CPUs are something like 500%, and the chipsets sell at or below cost to help motivate CPU sales.
Considering the vast majority of PCs sold are to businesses that don't need anything that couldn't have been done with a VGA controller from 1995, decent mainboard/graphi
Focused, except for MID CPU and nForce (Score:2)
Between nForce and their new ARM11 CPU, it's hard to take comments like "we've stayed focused" too seriously.
Only reason (Score:3, Insightful)
The only reasons they might build an x86 chip (64-bit or not) would be either to use it for a special application or as a proof of concept.
A GPU and a CPU are different, but it may be a way to test if a GPU architecture can be applied to a CPU with a classic instruction set. The next step is to sell the knowledge to the highest bidder.
To compete with Intel would just be futile.
Re: (Score:2)
To compete with Intel would just be futile.
Hopefully we won't be saying the same about AMD in another few years.
Re: (Score:3, Interesting)
How is a "GPU" different from a "CPU"? If you take them to be the SAME, you end up with Intels LARRABEE. If you take them as somehow DIFFERENT, you end up with nVidias proclamation.
If they are considered the SAME, but with different performance tunings, other applications begin to open up.
As an example: it is currently true that the "GPU" is given an exorbitant amount of resources to do one thing -- create visuals for games.
And that's it. It contains a significant amount of the system memory, and processing
Re:Only reason (Score:4, Informative)
How is a "GPU" different from a "CPU"?
The GPU is a specialized (vector) processor, while the CPU is a general purpose one. What the GPU does, it does great. But its reach ends pretty much there.
The nVidia is programmed with a specific higher-order assembly language; we rely solely on the hardware vendor for tools. I think that this is UNIQUE in the (mass-market) processor world. And this is why Intel, with an x86-compatible GPU, is such a threat.
You're confused. Intel is not working on an "x86 GPU". Intel is working on a new GPU design - the kicker being that this is a relatively high-performance one, instead of the kind of GPUs they've offered so far (feature-packed, but lacking in performance). The x86 instruction set has nothing to do with it and, in fact, has nothing to do with GPU programming, which is a completely different beast.
Can anyone else produce an OpenGL shader compiler for the nVidia? Or, better yet, extend it to do NON-shader tasks. How about for the AMD?
If I'm not mistaken, nVidia's Cg compiler is now open-sourced. So yes.
Re: (Score:2)
The GPU is (generally) a vector processor with VERY limited branching capability, and VERY limited data sourcing. But, these things can be "fixed".
Yes, Intel is working on an "x86 GPU".
"Larrabee can be considered a hybrid between a multi-core CPU and a GPU, and has similarities to both. Its coherent cache hierarchy and x86 architecture compatibility are CPU-like, while its wide SIMD vector units and texture sampling hardware are GPU-like." (from http://en.wikipedia.org/wiki/Larrabee_(GPU) [wikipedia.org] )
As to Cg being "o
Confident ? (Score:2)
He seems rather confident with a two year head start on a company that has "billions and billions of dollars."
Just a thought... (Score:5, Insightful)
Re: (Score:2)
Good logic there and you make a valid point, but being perfectly honest, 2 years in the GPU industry is more like 5 years in the CPU industry.
And Intel's currently more like 6 years behind NV/ATI. LRB may change that, but Intel shouldn't count its chickens before they're rendered. Even then, don't expect LRB to approach 2 year old NV/ATI performance at the same price or power draw point.
Re: (Score:2)
If you're 30 years behind them in their market, and they're 2 years behind you in yours, maybe it's not wise to be "dismissive of the threat"?
You're comparing apples to oranges. nVidia has 13 years of experience in the market (NV1 - 1995), but that doesn't say anything about how fast someone else could catch up or how far they'd stay behind. Anyone could shave 20+ years off Intel's "head start" easily; it's the last few years to make a competitive product that are hard. nVidia could within a few years produce a processor some years behind Intel in technology, but it'd be marketwise dead on arrival. If Intel really is 3+ (you see any Larrabees this y
Re: (Score:2)
Yeah, I agree. His wording was a bit pretentious, but I expect both companies will be in the game for a long time yet.
Regardless though, our hardware is finally going parallel. From a programmer's point of view, I'm just very happy to see things like CUDA [wikipedia.org] emerging, which will make parallel programming a whole lot more feasible. I think we're going to see some really impressive things developed as a result of this.
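For a flavour of what that looks like (a minimal sketch with made-up names, just standard CUDA kernel syntax): you write the code for a single element, and the hardware schedules thousands of copies of it in parallel.

    // Each thread scales and adds one element; the GPU runs them en masse.
    __global__ void saxpy(int n, float a, const float *x, float *y)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)                    // guard the tail of the array
            y[i] = a * x[i] + y[i];
    }

    // Host side: launch enough 256-thread blocks to cover n elements, e.g.
    //   saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, d_x, d_y);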
Re: (Score:2)
Re: (Score:1, Interesting)
Some would say that the way we use devices is changing, that feature-packed cell phones, UMPCs, and specialist devices like consoles are beginning to dominate the home space. These platforms often don't use an x86 CPU. They use a RISC CPU like an ARM or a Freescale chip.
These people are significant rivals to Intel.
The Xbox and the PS2 both have quasi-CISC CPU chips in them designed by IBM.
What I'm saying is that although Intel probably is now the dominant player in the x86 market, this is simply leading to a lot
wouldn't this be a good thing? (Score:2)
If more companies entered the same market, that would give us more choices and better prices. I say go for it, Nvidia: make a CPU and see how you do against Intel and AMD.
I really wish that we could have the same socket in the motherboard for a CPU from Intel, AMD, Nvidia, etc. That would rock and give a real head-to-head test of which CPU is best for what you are doing. It'll never happen, but it would be cool to see.
Re: (Score:2)
If more companies entered the same market, that would give us more choices and better prices. I say go for it, Nvidia: make a CPU and see how you do against Intel and AMD.
No, I do not think that would be a good thing. The up-front R&D cost for making CPUs is huge. Fabricating them ain't cheap either. Sure, NVIDIA has a lot of talent and would have a big jump on the R&D. And they have fabrication facilities that could be retuned for CPUs instead of GPUs. But I think that the end result of NVIDIA attempting to compete with Intel/AMD on the x86 CPU front would be death or serious damage to NVIDIA and we'd lose competition on the graphics card market rather than gai
Re:wouldn't this be a good thing? (Score:4, Informative)
Re: (Score:2)
Absolutely correct. Perhaps I should have said 'access to fabrication facilities' or 'fabrication relationships'. The point is that they have no resource issues barring them from the game, just a lot of catch-up work, stiff competition, and the good sense to lack motivation.
'Decide what you're going to do and focus on doing it well' is a good business model and, whether you're an NVIDIA fan or not, that's certainly what they're trying. And, so far, it's working out a lot better for them than a lot of the
Re: (Score:2)
The real losers would be Via and AMD. If NVidia made a big entry into the x86/x86-64 space, they would take as much or more market share from the smaller players as from Intel. NVidia would be poorly served by knocking Via out and especially by knocking AMD out. Even though those companies compete for graphics dollars, they give NVidia somewhere to put its graphics and chipsets other than on Intel-CPU boards.
Re: (Score:2)
There was such a socket for some time: Socket 7, around the time of the AMD K6 generation. You could put most Intel and AMD CPUs of the era into the same motherboard.
Re: (Score:3)
Not just Intel and AMD. There was a time when you could use an Intel, AMD, Cyrix, IDT, or a Rise (and I'd bet even a couple more) CPU all in the same motherboard. Back then I didn't even DREAM of building a machine with an Intel chip - Cyrix and AMD were less than half the cost (close to 1/3rd the cost in some areas). And when those costs were in the hundreds of dollars for entry-level stuff (rather than the $35 that you can get a budget CPU for now), it really made a difference.
Of course, that was when
Re: (Score:2)
Indeed, and not just Intel and AMD either, but Cyrix and IDT as well. Then Intel moved to Slot 1, which IIRC involved some proprietary stuff that stopped anyone else from using it. The competitors stayed on Socket 7 for a while, then AMD moved to Slot A, and the others either died out or moved to processors soldered directly to the motherboard.
Re: (Score:2)
Re: (Score:2)
Nice idea, but no.
CPU manufacture has become the most expensive part of computing. The cost of designing, prototyping, and then fabricating CPUs is INSANE! Worse, the price grows fantastically as the trace-size shrinks. It's been suggested that one of the reasons Intel moved so aggressively from 65nm to 45nm is to push AMD to the sidelines.
nVidia is roughly five percent the size of Intel. Trying to enter a market outside of their core competence against a behemoth like that is suicide.
rumour machine (Score:3, Insightful)
Rather handy that this rumour gives nVidia, a GPU company, the chance to point out how futile it would be for them to try and enter the CPU market... then point over to Intel, a CPU company, trying to make a GPU...
What they need to do is (Score:2, Insightful)
Remove their heads from their collective rectum and correct the damn problems they have with their video cards and motherboard chipsets.
I've been a loyal nVidia customer since the good old days of the Diamond V550 TNT card through the 8800GTX, but they have really hosed up lately.
My 780i board has major data corruption problems on the IDE channel, and my laptop is one of the ones affected by their recall, so I am not too pleased with their ability to execute lately...
Re: (Score:2)
Well said...
Don't go breaking into someone else's house while yours is burning down.
And why not? (Score:5, Insightful)
I wouldn't mind seeing more players in the computer processor industry. The headlines really make it sound like it would be a bad thing. Maybe I'm getting the headlines wrong, but Nvidia presenting new alternatives to a market almost exclusively owned by Intel and AMD would be interesting.
Re: (Score:1)
From 2006 (Score:5, Insightful)
"A GPU from 2006" sounds a lot like famous last words.
I wonder if anyone at DEC made comments in a similar vein about Intel CPUs, back when the Alpha was so far ahead of anything Intel was making. NVidia's architect should not underestimate Intel; if he does, he does it at his company's peril.
Re:From 2006 (Score:4, Interesting)
The Alpha failed because the motherboards were $1,300 and the processors were $2,600. Nobody in their right mind bought the stuff when you could get Intel motherboards for $400 and processors for $800 (dual-proc boards, high-end processors).
DEC died because they could not scale up to what the Intel side was doing. You had thousands of motherboards made per hour for Intel versus maybe 4 a day for Alpha. It's game over at that point.
I loved the Alphas. I had a dual-Alpha motherboard running Windows NT, and it rocked as a server.
Re: (Score:2)
The same can be said about Itanium. The original Itaniums (and even the current ones) were so DAMNED expensive, and they didn't offer any real performance increase.
What really killed Itanium was AMD's x64 extensions.
Itanium will be around for a while, but it will never become commonplace outside of high-end, massively SMP UNIX servers.
Re: (Score:1)
Itanium will be around for a while, but it will never become commonplace outside of high-end, massively SMP UNIX servers.
They never were marketed as such. Itanium competes on the same playing field as Power and SPARC. The primary problem with Itanium was that it was three years behind schedule, was rushed out to replace the aging PA and MIPS (and the end-of-lifed Alpha), and lost momentum with its flaws. This happened years before 64-bit hit the x86 CPUs. The idea of moving x86 code to Itanium was not the compelling selling point for large enterprise customers; they had to replace their existing PA, Alpha, and MIPS systems.
Re: (Score:2)
Yeah, they were. Dell and HP were initially pushing Itanium servers running Windows hard. This was Intel's answer to the 64-bit question.
Who knows, maybe if AMD hadn't created AMD64, Itanium would have been more accepted, and eventually the prices might have dropped some. But we'll never know, and I'm glad for that. I much prefer x86/x64 running the show, as it's accessible to everyone, including the enthusiast, for running server operating systems.
HP has had some success with Itanium on their HP/UX machine
Re: (Score:1)
DEC died because they could not scale up to what the Intel side was doing. You had thousands of motherboards made per hour for Intel versus maybe 4 a day for Alpha. It's game over at that point.
You clearly do not understand the high-end market. You cannot compare low-end servers with P2 chips to systems based around Alpha (or Power, PA, etc). Alpha died because DEC was sold to Compaq (an Intel partner). Prior to the sale, Alpha systems were doing brisk business. This was 1998, folks. The P3 would not be released until the following year, and Itanium would not see the light of day until 2001.
Re: (Score:2)
... nobody in their right mind bought the stuff...
...I had a dual-Alpha motherboard running Windows NT, and it rocked as a server.
So, would it be fair to say that you weren't in your right mind? ;-)
Re: (Score:2)
It's just the time-honored sports tradition of trash-talking your opponent. One example was when DEC's CEO Ken Olsen [wikipedia.org] famously said that "Unix is snake-oil".
That's just hilarious Ken, ya Fred Thomson ugly dinosaur-scaly bastard, since a few years later I bought a DEC Alpha from you running Ultrix instead of VMS.
Re: (Score:2)
Yea, but think about it: A good GPU from 2006 is still PRETTY DAMNED GOOD!
I'm still using an AGP 6800GT in one of my machines, and it's still trucking. I can't run everything at high quality but it's usable.
Yesterday, Intel made a GPU as good as a GPU from 2002. Today it's 2006. Tomorrow they might be competitive. And honestly, with Intel GPU specs being FAR more open than nVidia's or ATI's, I welcome it. We might actually be able to get GOOD graphics, with completely open-sourced drivers, on Linux.
Re: (Score:2)
Fat chance getting GOOD open-source drivers in a timely fashion (as in, before the hardware is 2-3 generations behind), unless Intel writes them.
That's the point. Intel writes open source Linux drivers for their graphics hardware. They've done so for a while now. I find closed source graphics drivers to be a headache, so if Intel is getting into the high-end market, that's a hopeful sign. What are you complaining about?
Re: (Score:2)
Yep: it doesn't matter if the Intel technology is akin to something nVidia was doing 2 - or 10 - years ago. What matters
Hell, Microsoft has made that their primary means of income for the past 20 years through superior marketing and underhanded business practices.
How about Transmeta style technology? (Score:1, Interesting)
Rewrite the software in place to run on a different architecture (whatever their latest GPUs implement). Maybe, just maybe, GPUs have evolved to the point where interpreted generic x86 wouldn't be (completely) horrible.
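The fallback in any such scheme is a plain interpreter loop, something like this toy sketch (made-up opcodes, nothing to do with Transmeta's actual code-morphing software); the real win only comes once hot traces get translated to the native instruction set.

    #include <stdint.h>

    enum { OP_HALT = 0, OP_LOAD_IMM, OP_ADD };          /* toy opcodes, not real x86 */

    static void interpret(const uint8_t *code, int32_t *reg)
    {
        for (;;) {
            switch (*code++) {                          /* fetch and decode one op */
            case OP_LOAD_IMM: { uint8_t r = *code++; reg[r] = *code++; break; }
            case OP_ADD:      { uint8_t d = *code++, s = *code++; reg[d] += reg[s]; break; }
            case OP_HALT:     return;
            default:          return;                   /* unknown opcode: bail out */
            }
        }
    }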
Re: (Score:2)
That would be interesting: turning a GPU into a general-purpose CPU. That way they would have a CPU without having to invest many additional resources into developing it, using the same core for both. But I have no idea if that is possible. It is likely that the GPU actually has less processing power than a current CPU, so it might not be nearly as fast as regular CPUs. It could work for the low-end market or embedded. The ISA, though, may be designed around 3D graphics operations, and perhaps you wouldn't hav
NVidia's Architect (Score:1, Funny)
http://www.hackthematrix.org/matrix/pics/m3/arch/1.gif
How nVidia "Survived" (Score:5, Insightful)
3DFx was the first company to publish Open Source 3D drivers for their 3D cards. nVidia sued them, then bought them at a discount, and shut down the operation. So, we had no Open Source 3D for another 5 years.
That's not "staying focused". It's being a predator.
Bruce
Re:How nVidia "Survived" (Score:5, Insightful)
What on earth are you talking about? 3DFx died because it was horribly mismanaged and ran out of money. There were lawsuits, but 3dfx sued NV first in 1998, and then in 2000 NV counter-sued (source [bluesnews.com]). True, NV's countersuit was right before 3dfx died, but a simple lawsuit that's gone nowhere in the courts yet doesn't cause a company to go bankrupt overnight.
Personally, I'll believe one of my (ex-3dfx Austin) friends' explanations for their downfall: the fully stocked tequila bar that was free to all employees. Or there's a whole list of problems leading to their decline on Wikipedia [wikipedia.org].
Re: (Score:2)
They asked a lot of employees, and the benefits had to match that.
I think nVidia's lawsuit was strategically positioned to be the straw that closed out additional investment prospects.
Re: (Score:2)
3DFx died because NVIDIA crushed them with the GeForce. 3dfx had already released a very disappointing product in the Banshee (it was buggy and slower than the Voodoo 2 SLI that preceded it). Hardware T&L, controversial at the time, proved to be a killer feature.
Re:How nVidia "Survived" (Score:5, Interesting)
3dfx's problem was they could never figure out how to sell their cards. They flip-flopped between making the cards themselves and having others make the cards like Nvidia does. After so many times, no one wants anything to do with you, because it's bad for business planning.
nVidia has had its current selling model for 10 years, and only its partners have changed. If you want to sell video cards, you can trust that if you sell cards based on nVidia's chips, they won't pull the rug out from under you next year and decide to sell the cards themselves.
Re:How nVidia "Survived" (Score:4, Interesting)
It does look like 3DFx bought the wrong card vendor. They also spun off Quantum3D, then a card vendor, which is still operating in the simulation business.
That's not "staying focused".It's being a predator (Score:2)
intel is a process company (Score:2, Insightful)
They are very good at doing research into making their chips very cheap to make, and they own the whole stack of production from start to finish. This is how they have managed to make it despite many, many missteps along the way.
nVidia doesn't own the factories that they use to make their chips; they just design them and use foundries like TSMC. nVidia would be stupid to compete with Intel in the same space (x86 CPUs) until they own and can efficiently build chips like Intel can.
AMD was the only one doing it, as th
Re: (Score:1)
Intel's latest graphics offering is going to fail, not because they don't have the hardware (actually their new Larrabee looks really fast), but because their graphics drivers have always stunk, and there is little evidence to suggest that they will be able to make a leap forward in graphics driver quality that will make their solution better than AMD's or nVidia's. They have to write full DX9, DX10, and OpenGL drivers to really compete with nVidia, then they have to optimize all those drivers for all the popular games (because nobody will re-write Doom, HL, UT, FarCry, etc. just for this new graphics card).
It could happen, but will it?
That's what they used to say about ATI drivers a few years ago. It didn't stop customers from flocking to ATI video cards. The reason for that was ATI hardware was just as good as or better than Nvidia's. These days I don't hear too much fussing about AMD/ATI drivers. For graphics cards, hardware is the key. The best driver will not overcome hardware shortcomings, but drivers can be upgraded.
There is no reason to think that Intel will have driver problems out of the box. Drivers are nothing more than firmware, a
of course they deny (Score:1)
Nvidia has denied rumours that the company is planning an entry into the x86 CPU market
Of course they've denied building an x86 CPU; they're working on an x64 model. 'Nuff said.
Nvidia has denied... not really. (Score:3, Interesting)
"That's not our business. It's not our business to build a CPU. We're a visual computing company, and I think the reason we've survived the other 35 companies who were making graphics at the start is that we've stayed focused."
"Are we likely to build a CPU and take out Intel?"
Re: (Score:1)
Okay,... (Score:2)
...so let's presume that the CEO explicitly said no. I still expect Nvidia to offer a combined CPU+GPU part. S'pose I am just annoyed that the reporter didn't explore the subject a bit more.