NVIDIA To Exit Chipset Business (185 comments)

The rumor that we discussed a few months back is looking more real. Vigile writes "Once the darling of the enthusiast chipset market, NVIDIA has apparently decided to quit development of future chipsets for all platforms. This 'state of NVIDIA' editorial at PC Perspective first highlighted the fact that the company was backing away from its plans to develop a DMI-based chipset for Intel's Lynnfield processors due to legal pressure from Intel and debates over licensing restrictions. That effectively left NVIDIA out in the cold in terms of high-end chipsets, but even more interesting is the later revelation that NVIDIA has only one remaining chipset product to release, what we know as ION 2, and that it was mainly built for Apple's upcoming products. NVIDIA still plans to sell its current offerings, like MCP61 for AMD platforms and current generation ION for netbooks and nettops, but will focus solely on discrete graphics options after this final release."

  • by TheGratefulNet ( 143330 ) on Friday October 09, 2009 @12:29PM (#29694707)

    Due to many problems: reports of data corruption at the design level (not build or parts, but *design* faults), and their ethernet drivers were a horrible reverse-engineering job that never came close to the stability of the eepro1000, for example, at least on Linux.

    There were also issues with SATA and compatibility.

    In short, they were in over their heads. Glad they finally admitted it (sort of).

  • by PolarBearFire ( 1176791 ) on Friday October 09, 2009 @12:33PM (#29694763)
    They'd better have a compelling product with the upcoming Fermi then, but from what I hear they're trying to design their GPUs for more general-purpose computing, specifically scientific computation. It's a really big gamble and I can't see it being a huge market. Their upcoming products are supposed to have 3 billion transistors, which is way more than 4x the count in an i7 CPU. It's probably going to cost a ton too.
  • Re:Intel? (Score:5, Interesting)

    by linhares ( 1241614 ) on Friday October 09, 2009 @12:37PM (#29694845)
    I think this is a clever ploy to make Intel play nice with Nvidia. By "letting go" of the market, true or not, Nvidia sends a message that Intel is a monopoly, which puts Intel in a much worse position (remember the EU) than it was in while competing with Nvidia in the chipset market. Obviously, it's impossible to know what's going to happen. But if I were at the top of Intel, I'd be freaking out a little, because this tiny little company "we have crushed" (that's how Nvidia makes it look) would put us in the regulators' spotlight. I'm gonna go get some popcorn.
  • by blackchiney ( 556583 ) on Friday October 09, 2009 @12:48PM (#29695027)
    I don't see why they couldn't go for the GPGPU and scientific computation market. They've acquired a lot of SGI and Cray IP. The x86 has been done to death; except for more cores and a faster bus there isn't much more R&D there. And I'm not really sure why they got into the chipset business in the first place. Intel and AMD had it sewn up, leaving very little room for a third competitor.

    Their core competence is in GPUs; they have a lot of IP there. This is valuable for negotiating licenses against the likes of Intel. And Intel's only dominance is in low-margin integrated GPUs, which is great for retailers but not great for the R&D team.
  • Old news (Score:2, Interesting)

    by Groo Wanderer ( 180806 ) <{charlie} {at} {semiaccurate.com}> on Friday October 09, 2009 @12:50PM (#29695061) Homepage

    This isn't new, they knifed it a year+ ago. I wrote it up then:
    http://www.theinquirer.net/inquirer/news/1021993/nvidia-chipsets-history [theinquirer.net]
    and no one believed it. Now that NV has no choice but to admit it, they stopped pretending. Yay?

    They are doing the same thing with their "not killing" of the GTX 285/275/260; it's just a temporary shortage or some twaddle. This one won't take a year to admit, though.

                          -Charlie

  • Re:Bad idea?? (Score:4, Interesting)

    by Kjella ( 173770 ) on Friday October 09, 2009 @01:05PM (#29695269) Homepage

    Intel will be putting graphics on the CPU, according to their roadmap.
    AMD will be putting graphics on the CPU, according to their roadmap.

    At that point the GPU is already a "sunk cost"; no one will buy an integrated GPU that's only slightly better than another integrated GPU. It's also not only about legal reasons, but about pricing, timing, access to resources and so on. Intel can increase license costs, do its accounting so more of the profit lands on processors, delay launches of competing chipsets, deny access to resources when trying to work out incompatibilities or instabilities, and so on. Intel is doing extremely well and is ready to make that land grab, one way or the other. I think nVidia is playing it better as the victim of Intel's legal department rather than being gently pushed out the door as the GPU joins the CPU.

  • Re:This is False (Score:3, Interesting)

    by default luser ( 529332 ) on Friday October 09, 2009 @01:12PM (#29695383) Journal

    That's great. Nvidia is outselling ATI chipsets by dumping stock of their nForce4 (that is what the MCP61 is; you'd know these things if you read the PCPer article linked in the summary), a chipset from 2006 that doesn't even support PCIe 2.0. If that's not a sign of things to come, I don't know what is.

    And Nvidia is developing ONE new chipset - ION 2, for Apple. Since the rest of the world is moving on to mobile i7/i5/i3, and even Atom is getting on-die graphics, I can't foresee Nvidia really investing anything in future chipset tech.

  • by RMingin ( 985478 ) on Friday October 09, 2009 @01:16PM (#29695439) Homepage

    "Give credit where it is due. During the Athlon64 days (socket 939?), Nvidia were in a class of their own."

    They were only in a class of their own because no one else was attending that school. VIA was always a joke; Nvidia just provided the punchline. AMD was pulling out of chipsets at that point, and Intel had no interest in chipsets for AMD CPUs. So who was left?

    AMD bought ATI, between the two of them they managed to synthesize half a decent chipset, and voilà, Nvidia is irrelevant. Since no one on the Intel side ever had much love for NV, they managed to put THEMSELVES out in the cold.

    nForce2 was the high point for Nvidia chipsets. Since then it has all been a slow decline.

  • Re:Old news (Score:2, Interesting)

    by Zoson ( 300530 ) on Friday October 09, 2009 @01:25PM (#29695617) Journal

    And then you were promptly fired for writing FUD.

    Nobody believes a word you say. You lost all credibility long ago.

    It's just a shame the inquirer has not removed your negative, blatantly biased garbage.

  • Re:This is False (Score:3, Interesting)

    by TJamieson ( 218336 ) on Friday October 09, 2009 @01:27PM (#29695637)

    Hmm.. I've got an MCP61-based AMD system, and it also has PCI-Express 2.0. YMMV?

  • Re:Bad idea?? (Score:3, Interesting)

    by MBGMorden ( 803437 ) on Friday October 09, 2009 @01:29PM (#29695653)

    Which means what the GP said. Nvidia's integrated graphics solutions come in the form of Nvidia chipsets (of which the nForce is the most common). If they're no longer making chipsets, then they're no longer making integrated graphics. There's still the possibility of a board maker taking a discrete chip and adding it separately to the motherboard PCB, but with virtually every modern northbridge chip already having built-in graphics, I don't see that happening. The people who are satisfied with integrated graphics will use that; the people who want something better will get it via upgradeable add-on cards.

    Truthfully, I just don't see the wisdom in this decision. I'd have sooner expected Nvidia to announce that they were leaving the discrete graphics chip market rather than the chipset market.

  • by MBGMorden ( 803437 ) on Friday October 09, 2009 @01:33PM (#29695737)

    You have to look at the target market though. Sure, they might be the deal of the century for the occasional scientist looking for supercomputer power on a budget, but in reality few regular users - hell, few extreme power users - need anything resembling a supercomputer (not just raw speed, but supercomputers are designed much more for parallel processing, and a ton of what users do is better suited to serial processing; see the sketch below).

    Overall, I think they do indeed have a target market - I just don't see that target market being sufficient for them.
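    A minimal sketch of that serial-versus-parallel distinction, assuming only a multi-core CPU and an OpenMP-capable compiler (a hypothetical illustration, not anything from NVIDIA or the article): the same dot product written once as a plain serial loop and once as a data-parallel loop. The parallel form is the embarrassingly parallel kind of arithmetic GPUs and supercomputers are built for; most everyday desktop work looks more like the serial path.

        /* Hypothetical sketch: serial vs. data-parallel dot product.
         * Build (GCC assumed): gcc -O2 -fopenmp dot.c -o dot
         */
        #include <stdio.h>
        #include <stdlib.h>

        #define N 10000000

        /* Serial version: one core walks the arrays element by element. */
        static double dot_serial(const double *a, const double *b, long n) {
            double sum = 0.0;
            for (long i = 0; i < n; i++)
                sum += a[i] * b[i];
            return sum;
        }

        /* Parallel version: the iterations are independent, so OpenMP can
         * split them across cores (a GPU would split the same work across
         * thousands of threads). */
        static double dot_parallel(const double *a, const double *b, long n) {
            double sum = 0.0;
            #pragma omp parallel for reduction(+:sum)
            for (long i = 0; i < n; i++)
                sum += a[i] * b[i];
            return sum;
        }

        int main(void) {
            double *a = malloc(N * sizeof *a);
            double *b = malloc(N * sizeof *b);
            for (long i = 0; i < N; i++) { a[i] = 1.0; b[i] = 2.0; }

            printf("serial:   %f\n", dot_serial(a, b, N));
            printf("parallel: %f\n", dot_parallel(a, b, N));

            free(a);
            free(b);
            return 0;
        }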

  • Heh heh (Score:3, Interesting)

    by Groo Wanderer ( 180806 ) <{charlie} {at} {semiaccurate.com}> on Friday October 09, 2009 @01:55PM (#29696081) Homepage

    Try plugging an SSD into one of those chipsets and see how far you get. Especially an Intel SLC SSD. Then go look for a patch on Nvidia's site.

    Intel had to patch around Nvidia's bugs, Nvidia wouldn't. There is a long list of these things.

    They may exist, but I wouldn't call them good. They essentially haven't been touched, just renamed, and are seriously showing their age.

                    -Charlie

  • Re:Bad idea?? (Score:3, Interesting)

    by Sandbags ( 964742 ) on Friday October 09, 2009 @02:17PM (#29696371) Journal

    Well, as Apple made public knowledge when they switched to Intel (not an exact quote): "we develop, compile, and test OS X on multiple hardware platforms, always have since the very first day of development, include new processor platforms as they become available, and can change to an alternate platform at any time."

    IBM appears to be working on a low-power P6/P7 architecture, AMD has some nice new stuff, they have their own fab now for low-power CPUs, and I'm sure they're compiling against Atom and likely even Cell...

    Honestly, as long as GPUs remain separate from CPUs, it's long past time for the north/southbridge to be integrated into the core CPU silicon. They already added the memory controller and other mainboard resources; now the base system bus and other common components could all be included. nVidia really is doing the right thing moving into alternate markets; this one IS dying. This may actually be good for both nVidia and Intel, as it gives Intel the advantage of being able to separate itself and move away from current trends more easily, and gives nVidia a more consolidated and focused research effort for GPU/CPU acceleration - generic core processing technology.

    nVidia will still reap a LOT of profit from the existing systems for years, and makes a killing in GPUs and AMD chipsets. Between saving this research money, shutting down the facilities, and in the end almost certainly winning a case against Intel for a few hundred million in cash down the road, this is a great opportunity for them, and I commend their decision.

  • by rgviza ( 1303161 ) on Friday October 09, 2009 @02:21PM (#29696425)

    Then there's also the whole thing of nVidia producing utter crap chipsets... That might have a teeny weeny little something to do with it.

    It has nothing to do with Intel's "market dominance" and everything to do with nVidia's inability to be competitive in a market segment they know little about, and the shoddy crap they try to pass off as a chipset. Once you put the Kool-Aid down and have an objective look at their product, it simply sucks.

    I've had 3 of them and all three were utter garbage. DFI, Gigabyte, ASUS, it didn't matter. Every time it turned out to be the MCP in the chipset, or some other part of it, failing or not working correctly to begin with. In one case the interrupt controller didn't work at all with a dual-core CPU, and on both Linux and MS Windows a "software" interrupt controller had to be put in the kernel to make it work. As you might guess, this made multi-CPU performance _worse_ than a single CPU. And this was a chipset designed for multi-core CPUs.

    I've subsequently had 2 AMD CrossFire chipsets; both worked perfectly. nVidia chipsets are 0 for 3 in my book.

    Good riddance...

    That's hundreds of thousands of consumers that won't get burned. Intel or AMD chipsets for the win...

  • Re:Not Intel (Score:1, Interesting)

    by Anonymous Coward on Friday October 09, 2009 @03:16PM (#29697153)

    I'm afraid I'd have to say that if it is true, then good riddance. I have had nothing but problems with nVidia chipset mobos (though I've only had two mobos based on nVidia, one of which is my current computer).

    The first would simply freeze entirely. Oddly enough, it stopped doing that when I replaced the mobo with an Intel-based one (which really didn't surprise me, since nVidia couldn't get the chipset certified for use with Intel processors).

    The second, which has an AMD processor, frequently reboots itself, with the STOP error referring to... a problem with the chipset!

  • by Big Boss ( 7354 ) on Friday October 09, 2009 @03:41PM (#29697481)

    Unless you run Linux. Check the MythTV mailing list sometime: in nearly every post referencing an ATI gfx chip, someone can't get even basic stuff working. NVidia gave us VDPAU (see the sketch below); ATI has yet to answer that one. I have no idea how ATI does in Windows for HTPCs, as I don't run Windows on my HTPCs. The license alone would be 30% of the cost of the machine even if I wanted to use it. Too much for too little.
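    A minimal sketch of what VDPAU looks like from the application side, assuming an X11 desktop with libvdpau and an NVIDIA driver installed (a hypothetical example, not code from MythTV or NVIDIA's documentation): open a VDPAU device on the current display and ask whether the hardware can decode H.264 High profile, the kind of check a player makes before handing decode off to the GPU.

        /* Hypothetical sketch: probe VDPAU for H.264 High profile decode support.
         * Build (libvdpau and Xlib assumed): gcc vdpau_check.c -o vdpau_check -lvdpau -lX11
         */
        #include <stdio.h>
        #include <X11/Xlib.h>
        #include <vdpau/vdpau.h>
        #include <vdpau/vdpau_x11.h>

        int main(void) {
            Display *dpy = XOpenDisplay(NULL);
            if (!dpy) {
                fprintf(stderr, "cannot open X display\n");
                return 1;
            }

            /* Create a VDPAU device; this fails if the driver has no VDPAU backend. */
            VdpDevice dev;
            VdpGetProcAddress *get_proc;
            if (vdp_device_create_x11(dpy, DefaultScreen(dpy), &dev, &get_proc) != VDP_STATUS_OK) {
                fprintf(stderr, "no usable VDPAU driver for this GPU\n");
                return 1;
            }

            /* VDPAU hands out its entry points through get_proc_address. */
            VdpDecoderQueryCapabilities *query;
            get_proc(dev, VDP_FUNC_ID_DECODER_QUERY_CAPABILITIES, (void **)&query);

            VdpBool supported;
            uint32_t max_level, max_macroblocks, max_width, max_height;
            query(dev, VDP_DECODER_PROFILE_H264_HIGH, &supported,
                  &max_level, &max_macroblocks, &max_width, &max_height);

            printf("H.264 High hardware decode: %s (up to %ux%u)\n",
                   supported ? "yes" : "no", max_width, max_height);

            XCloseDisplay(dpy);
            return 0;
        }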

  • Re:WebGL (Score:3, Interesting)

    by tepples ( 727027 ) <tepplesNO@SPAMgmail.com> on Friday October 09, 2009 @05:02PM (#29698543) Homepage Journal

    "GMail is a fairly typical AJAX front-end."

    My point is that AJAX front-ends like that of Gmail became "fairly typical" only after Google had released Gmail into a limited beta. Imagine Google Maps Street View using polygonal models of nearby buildings rather than still skyboxes. It'd be like the step from Myst (1993) to Real Myst (2000) [wikipedia.org].
