
NVIDIA To Exit Chipset Business

The rumor that we discussed a few months back is looking more real. Vigile writes "Once the darling of the enthusiast chipset market, NVIDIA has apparently decided to quit development of future chipsets for all platforms. This 'state of NVIDIA' editorial at PC Perspective first highlighted the fact that the company was backing away from its plans to develop a DMI-based chipset for Intel's Lynnfield processors due to legal pressure from Intel and debates over licensing restrictions. That effectively left NVIDIA out in the cold in terms of high-end chipsets, but even more interesting is the later revelation that NVIDIA has only one remaining chipset product to release, what we know as ION 2, and that it was mainly built for Apple's upcoming products. NVIDIA still plans to sell its current offerings, like MCP61 for AMD platforms and current generation ION for netbooks and nettops, but will focus solely on discrete graphics options after this final release."
Comments:
  • Intel? (Score:5, Insightful)

    by _PimpDaddy7_ ( 415866 ) on Friday October 09, 2009 @12:22PM (#29694595)

    Do we get mad at Intel?

    This is a sad day.

    Competition is good, I'm sorry.

  • WebGL (Score:5, Insightful)

    by tepples ( 727027 ) <tepplesNO@SPAMgmail.com> on Friday October 09, 2009 @12:25PM (#29694657) Homepage Journal

    Do we get mad at Intel?

    Yes. Intel hasn't produced a competitive GPU for its integrated graphics. This will become painfully apparent once web sites start to use JavaScript bindings for OpenGL ES [khronos.org].
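    [Editor's note: the "JavaScript bindings for OpenGL ES" mentioned above are what shipped as WebGL. A minimal sketch of what such in-page GPU code looks like, written against the modern "webgl" context name; the canvas id and log message are illustrative only:]

        // Ask the browser for a hardware-accelerated drawing context
        // from a <canvas id="demo"> element already on the page.
        // Browsers return null when the GPU/driver can't support it;
        // a weak IGP may hand back a context but render painfully slowly.
        const canvas = document.getElementById("demo") as HTMLCanvasElement;
        const gl = canvas.getContext("webgl");
        if (gl === null) {
            console.log("No hardware WebGL context available");
        } else {
            gl.clearColor(0.0, 0.0, 0.0, 1.0); // opaque black
            gl.clear(gl.COLOR_BUFFER_BIT);     // a single GPU-side clear
        }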

  • by mpapet ( 761907 ) on Friday October 09, 2009 @12:35PM (#29694805) Homepage

    I would argue Intel's strength rests in part on U.S. intellectual property laws and procedures. If the country loosened intellectual property law, Nvidia might actually stand a chance.

    But this is also about a global market where 80% of product comes from maybe 10% of all possible manufacturers, and there are few laws preventing Intel from doing all kinds of market shenanigans in places like China.

    I know the loosening of intellectual property laws would help Nvidia's case, but I don't think it would bring about a semi-competitive marketplace because this market (global OEM) has few legal constraints.

  • by MrNemesis ( 587188 ) on Friday October 09, 2009 @12:36PM (#29694827) Homepage Journal

    ...that nVidia are at least going to make a stab at providing graphics-enabled southbridges or something... for things like HTPCs, an Intel CPU + nVidia integrated graphics is brilliant. If I'm in a market that calls for integrated graphics (for HTPCs, because of power usage and space considerations), then the GPU is more important than the CPU... and I find myself being pushed to AMD for the whole platform.

    Intel is really shooting itself in the foot with all the bus licensing stuff, IMHO. By scaring off nVidia IGPs, they're left with their own mediocre offerings which, in my experience, are vastly inferior even in graphics tasks that don't involve 3D.

    If nVidia can supply us with minuscule IGPs-on-a-PCIe-stick for a tenner, then great, but their recent developments seem to be pushing them into niche applications (primarily bigger and bigger GPU dies), and I'm worried an Intel platform will make me choose between an Intel IGP and a power-guzzling graphics card. Heck, pretty much every machine I've built for others in the last five years has come with an ATI or nVidia IGP, because I don't know anyone who games.

    Disclaimer: I have every type of GPU in my house. I use nVidia IGPs for all my HTPCs, since they're the only ones that are consistently good for HD content under both Windows and Linux. Intel IGPs suck for video (my X3100 can't keep up with SD x264 scaled to a 1900x1200 screen without tearing and lag) but are fine for my laptops (low power usage preferred), and I have a mix of ATI and nVidia graphics cards in the machines that need 3D. I was annoyed enough when nVidia IGPs stopped appearing for AMD boards, but not having them at all will be a serious pain in the arse.

  • Re:Intel? (Score:4, Insightful)

    by noundi ( 1044080 ) on Friday October 09, 2009 @12:50PM (#29695067)

    Do we get mad at Intel?

    This is a sad day.

    Competition is good, I'm sorry.

    This is competition. Just not an instance of it that you like.

  • Not Intel (Score:5, Insightful)

    by Groo Wanderer ( 180806 ) <{charlie} {at} {semiaccurate.com}> on Friday October 09, 2009 @01:05PM (#29695267) Homepage

    "Do we get mad at Intel?"

    Yeah, they made Nvidia look bad by putting out chipsets that met spec, survived average use, and then had the gall to not hide the fact! (see http://support.apple.com/kb/TS2377 [apple.com]) I mean really, how can Intel do business like that? And people wonder why Nvidia is bailing, then trying to hide it before Wall Street notices and downgrades them more.

    The story goes like this.
    1) Nvidia stops designing future chipsets
    2) Nvidia blames Intel for nebulous atrocity
    3) Nvidia hides the facts
    4) It gets out
    5) Nvidia admits it
    6) Wall Street notices (several analyst reports out on the subject today)
    7) Nvidia realizes that Wall Street noticed
    8) Nvidia backpedals, hard, fast, and with all due slime

    The 'denial' they are throwing around now states that they are not going to develop AMD chipsets anymore, not going to develop Intel chipsets anymore, and are only going to continue selling the ones they have already made. Once Intel stops making FSB chips in a few months, then it WILL be Intel's fault somehow.

    Back to the original question, can you explain how Nvidia voluntarily stopping design of AMD chipsets is Intel's fault? :)

    I saw this coming a year ago, when they stopped most if not all future chipset products. I wrote it up. Nvidia denied it. A year later, they announce a stoppage, for a few hours, until the implications sink in. Then they deny it again.

    Yup. Intel. Those bastards!

    I agree about the competition part, but this isn't sad, it was planned.

                        -Charlie

  • by magus_melchior ( 262681 ) on Friday October 09, 2009 @01:10PM (#29695337) Journal

    They have a huge contract with Apple, which has adopted NVidia chipsets for pretty much the entire Mac product line. Given that Jobs would have preemptively shifted to another chipset platform in the last round of announcements if this were even remotely true, I seriously doubt that NVidia would even think of limiting further R&D in their chipsets to Ion 2.

    Unfortunately I'm used to the editors slipping at least twice a day...

  • by Sloppy ( 14984 ) on Friday October 09, 2009 @01:23PM (#29695579) Homepage Journal

    It looks like, long-term, Intel and AMD/ATI are going to be the only games in town. That wouldn't worry me a whole lot, because I think their stuff looks good on paper, and they'll compete. And both of them are slowly advancing their open source drivers. But the key word is "slowly." If, say, you want to buy a machine to use as a MythTV box or something like that, NVidia is currently the only one it makes sense to buy. Anybody else, and you're going to have to decode your video with the CPU and read promises about how some day you might not have to. I hate reading promises.

    I am not looking forward to the day when these two windows of acceptability don't overlap. What happens when you want to build a box and neither Nvidia, Intel, nor AMD has a product that can actually be used, either because they're gone (Nvidia) or because their drivers aren't yet working (Intel and AMD)? That is going to suck.

  • by linhares ( 1241614 ) on Friday October 09, 2009 @01:53PM (#29696057)

    I hate reading promises.

    Well then the Nobel committee won't have you.

  • Re:WebGL (Score:4, Insightful)

    by Anonymous Coward on Friday October 09, 2009 @02:23PM (#29696453)

    I'm not looking forward to that day. Everything done with JavaScript so far has sucked filthy penises.

    Take the stupid comment slider here at Slashdot, for example. The old non-AJAX approach worked just fine. You didn't have to click "More" and then wait, click "More" and then wait, etc. hundreds of times just to see all of the comments.

    And you could view the -1 comments easier, as well. Even now I still don't know how to show the hidden comments. The piece of shit sidebar panel says "# Hidden", but I pull on the dragger thing and it refuses to move! The other one works fine, though.

    I see the equation as being:
    Idiot Web Developers + JavaScript + OpenGL ES = Totally Fucking Horrible Web Sites Which Make Me Want to Cry

  • Re:Not Intel (Score:3, Insightful)

    by Groo Wanderer ( 180806 ) <{charlie} {at} {semiaccurate.com}> on Friday October 09, 2009 @03:36PM (#29697425) Homepage

    Yes and no. Their excuse is the legal inability, but they have known about that for ~2 years. Why it suddenly becomes an issue AFTER they realized they needed a public scapegoat is something you will have to ask them.

    The basic problem is that there will not be any chipsets in about a year, with memory controllers, graphics, and PCIe moving on-package or on-die depending on the exact chip, and all on-die shortly thereafter. What is a chipset then? A SATA controller, boot ROM, and USB ports? And why do I need an NVidia-branded one for $50 when everyone else is selling the same commodity part for $5?

    NV is out of the business, and they played a really stupid game with Wall Street. Wall Street didn't like the surprise, a surprise that wouldn't have been there had Nvidia come clean about it a year ago. Now the analysts have based their models on something that not only wasn't true, but that NV knew wasn't true.

    The analysts look stupid, and NV is to blame. So they are putting out a childish attempt at the blame game. As someone who watches their little shenanigans, I'd find it entertaining as hell to write about, but since this is time #23 of this same game, it is just tiring.

    Basically, NV ended the program(s) over a year ago. They led the financial guys on to believe the business was strong and ongoing, and the finance people took their word for it. When word got out (again, not mine, Ryan/PCPer's above), they had to have an excuse NOW. So Intel! Yeah, they are big and bad, blame them! So they did.

    Remember, nothing is Nvidia's fault, EVER. They still have not released a list of the 'Bumpgate' bad chips, or done anything to help the affected people. If you imply something is their fault, you will be blacklisted. Blame something else, or else! Been there, seen that, time for a new trick. Maybe if I buy some dog biscuits before CES........

                -Charlie

  • by Rudeboy777 ( 214749 ) on Friday October 09, 2009 @04:44PM (#29698315)

    Come on now, I know RTFA is not in fashion, but YOU JUST QUOTED THE ARTICLE, minus the commentary pointing out that this letter IS NVIDIA'S ADMISSION that they are leaving the chipset business!
