
NVIDIA To Exit Chipset Business

The rumor that we discussed a few months back is looking more real. Vigile writes "Once the darling of the enthusiast chipset market, NVIDIA has apparently decided to quit development of future chipsets for all platforms. This 'state of NVIDIA' editorial at PC Perspective first highlighted the fact that the company was backing away from its plans to develop a DMI-based chipset for Intel's Lynnfield processors due to legal pressure from Intel and debates over licensing restrictions. That effectively left NVIDIA out in the cold in terms of high-end chipsets, but even more interesting is the later revelation that NVIDIA has only one remaining chipset product to release, what we know as ION 2, and that it was mainly built for Apple's upcoming products. NVIDIA still plans to sell its current offerings, like MCP61 for AMD platforms and current generation ION for netbooks and nettops, but will focus solely on discrete graphics options after this final release."
  • Intel? (Score:5, Insightful)

    by _PimpDaddy7_ ( 415866 ) on Friday October 09, 2009 @12:22PM (#29694595)

    Do we get mad at Intel?

    This is a sad day.

    Competition is good, I'm sorry.

    • WebGL (Score:5, Insightful)

      by tepples ( 727027 ) <tepplesNO@SPAMgmail.com> on Friday October 09, 2009 @12:25PM (#29694657) Homepage Journal

      Do we get mad at Intel?

      Yes. Intel hasn't produced a competitive GPU for its integrated graphics. This will become painfully apparent once web sites start to use JavaScript bindings for OpenGL ES [khronos.org].
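
      To make that concrete: WebGL is essentially a JavaScript binding to the OpenGL ES 2.0 API, so page scripts issue commands straight to the GPU. A minimal sketch of what that looks like (the canvas id here is made up, and early builds expose the context as "experimental-webgl"):

        // Grab a GPU-backed rendering context from a <canvas> element.
        var canvas = document.getElementById('demo');
        var gl = canvas.getContext('webgl') ||
                 canvas.getContext('experimental-webgl');
        if (!gl) throw new Error('No WebGL: weak integrated GPU or old drivers');
        // Even this trivial clear is executed by the GPU, not the CPU,
        // which is where a non-competitive IGP will start to hurt.
        gl.clearColor(0.0, 0.0, 0.0, 1.0); // opaque black
        gl.clear(gl.COLOR_BUFFER_BIT);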

      • Re:WebGL (Score:4, Insightful)

        by Anonymous Coward on Friday October 09, 2009 @02:23PM (#29696453)

        I'm not looking forward to that day. Everything done with JavaScript so far has sucked filthy penises.

        Take the stupid comment slider here at Slashdot, for example. The old non-AJAX approach worked just fine. You didn't have to click "More" and then wait, click "More" and then wait, etc. hundreds of times just to see all of the comments.

        And you could view the -1 comments easier, as well. Even now I still don't know how to show the hidden comments. The piece of shit sidebar panel says "# Hidden", but I pull on the dragger thing and it refuses to move! The other one works fine, though.

        I see the equation as being:
        Idiot Web Developers + JavaScript + OpenGL ES = Totally Fucking Horrible Web Sites Which Make Me Want to Cry

        • by tepples ( 727027 )

          Idiot Web Developers + JavaScript + OpenGL ES = Totally Fucking Horrible Web Sites Which Make Me Want to Cry

          Not all web developers are idiots. What happens when Google's non-idiot developers start doing amazing things with WebGL?

Then I will be able to access the horrendously formatted yet relevant content through a prettier and more convenient interface.

        • Everything done with JavaScript so far has sucked filthy penises.

Google Maps Street View works pretty well for me. It certainly beats a crappy Flash plugin that messes up webpage input!

          I'm looking forward to GPU-accelerated video through JavaScript + OpenGL ES.

      • by Jaysyn ( 203771 )

My laptop with its built-in ATI PCIe chip recently died, and I had to swap the drive into another Dell laptop I had lying around, one whose HD had died on a friend who basically didn't want it anymore. It had one of those Intel IGP chips in it. I was pleasantly surprised when it would still play NWN & Dungeon Keeper II. I was freaking shocked that it played DDO, and that it ran faster & looked even better than the ATI PCIe chip did.

    • by mpapet ( 761907 ) on Friday October 09, 2009 @12:35PM (#29694805) Homepage

I would argue Intel's strength relies in part on U.S. intellectual property laws and procedures. If the country loosened intellectual property law, Nvidia might actually stand a chance.

      But this is also about a global market where 80% of product comes from maybe 10% of all possible manufacturers and there are few laws preventing Intel from doing all kinds of market shenanigans in places like China.

      I know the loosening of intellectual property laws would help Nvidia's case, but I don't think it would bring about a semi-competitive marketplace because this market (global OEM) has few legal constraints.

      • by rgviza ( 1303161 ) on Friday October 09, 2009 @02:21PM (#29696425)

        Then there's also the whole thing of nVidia producing utter crap chipsets... That might have a teeny weeny little something to do with it.

        It has nothing to do with intel's "market dominance" and everything to do with nVidia's inability to be competitive in a market segment they know little about, and the shoddy crap they try to pass off as a chipset. Once you put the koolaid down and have an objective look at their product, it simply sucks.

I've had 3 of them and all three were utter garbage. DFI, Gigabyte, ASUS, it didn't matter. Every time it turned out to be the MCP in the chipset, or some other part of it, failing or not working correctly to begin with. In one case the interrupt controller didn't work at all with a dual core CPU, and on both Linux and MS Windows they had to put a "software" interrupt controller in the kernel to make it work. As you might guess, this made multi-CPU performance _worse_ than a single CPU. And this was a chipset designed for multi-core CPUs.

        I've subsequently had 2 AMD crossfire chipsets, both worked perfectly. nVidia chipsets are 0-3 in my book.

        Good riddance...

        That's hundreds of thousands of consumers that won't get burned. Intel or AMD chipsets for the win...

nVidia was playing the same game in terms of wanting an excessive amount to license SLI to Intel. That was why Intel boards used to support only Crossfire, not SLI. ATi licensed it for a minimal fee; nVidia didn't, because they wanted to push their own chipset products.

    • Re:Intel? (Score:5, Interesting)

      by linhares ( 1241614 ) on Friday October 09, 2009 @12:37PM (#29694845)
I think this is a clever ploy to make Intel play nice with Nvidia. By "letting go" of the market, true or not, Nvidia sends the message that Intel is a monopoly, which puts Intel in a much worse position (remember the EU) than it was in while merely competing with Nvidia in the chipset market. Obviously, it's impossible to know what's going to happen. But if I were at the top of Intel, I'd be freaking out a little: this tiny little company "we have crushed" (that's how Nvidia makes it look) will put us in the regulators' spotlight. I'm gonna go get some popcorn.
      • by jaypifer ( 64463 )
This is one of the best insights I've read in a while. Brilliant! And I'm sure you're dead on... the EU is on a monopoly-crushing tear and needs fresh meat.
    • Re:Intel? (Score:4, Insightful)

      by noundi ( 1044080 ) on Friday October 09, 2009 @12:50PM (#29695067)

      Do we get mad at Intel?

      This is a sad day.

      Competition is good, I'm sorry.

      This is competition. Just not one of the occasions that you like.

    • Not Intel (Score:5, Insightful)

      by Groo Wanderer ( 180806 ) <{charlie} {at} {semiaccurate.com}> on Friday October 09, 2009 @01:05PM (#29695267) Homepage

      "Do we get mad at Intel?"

Yeah, they made Nvidia look bad by putting out chipsets that met spec and survived average use, and then had the gall not to hide the fact! (see http://support.apple.com/kb/TS2377 [apple.com]) I mean really, how can Intel do business like that? And people wonder why Nvidia is bailing, then trying to hide it before Wall Street notices and downgrades them more.

      The story goes like this.
      1) Nvidia stops designing future chipsets
      2) Nvidia blames Intel for nebulous atrocity
      3) Nvidia hides the facts
      4) It gets out
      5) Nvidia admits it
      6) Wall Street notices (several analyst reports out on the subject today)
      7) Nvidia realizes that Wall Street noticed
      8) Nvidia backpedals, hard, fast, and with all due slime

      The 'denial' they are throwing around now states that they are not going to develop AMD chipsets anymore, not going to develop Intel chipsets anymore, and only going to continue selling the ones they have made. Until Intel stops making FSB chips in a few months, then it WILL be Intel's fault somehow.

      Back to the original question, can you explain how Nvidia voluntarily stopping design of AMD chipsets is Intel's fault? :)

I saw this coming a year ago when they stopped most if not all future chipset products. I wrote it up. Nvidia denied it. A year later, they announce a stoppage, for a few hours, until the implications sink in. Then they deny it.

      Yup. Intel. Those bastards!

      I agree about the competition part, but this isn't sad, it was planned.

                          -Charlie

      • Re:Not Intel (Score:5, Informative)

        by ByOhTek ( 1181381 ) on Friday October 09, 2009 @01:34PM (#29695743) Journal

        That's GPUs, not mobo chipsets.

And pretty much every manufacturer has had screwups. That being said, nVidia has made some nice performance chipsets in the past, and it's a shame to see them go. Really, in my experience, and in terms of reliability, they have been the only company to produce chipsets that could compete with Intel.

        • Re: (Score:3, Informative)

Yup, you are right, but the same thing happened with their chipsets, same problem. Look up the recent Sony admission on the same topic, and Dell and HP along with many others. I won't keep spamming my own links/stories here; you can find them and a lot more with a little searching.

          I would not say their chipsets are reliable, nor bug free, but they did have speed at times. This may be OK for a home user, but looking at the data corruption problems for their RAID setups, drive controller issues in general, networ

I'm still not clear. Are you saying that all of Nvidia's problems are their own, due to poor quality, and that none of it is due to their legal inability to produce chipsets supporting current Intel chips?
            • Re: (Score:3, Insightful)

              Yes and no. Their excuse is the legal inability, but they have known that for ~2 years. Why it suddenly becomes an issue AFTER they realized they needed to publicly have a scapegoat is something you will have to ask them.

The basic problem is that there will not be any chipsets in about a year, with memory controllers, graphics and PCIe moving on-package or on-die depending on the exact chip, but all on-die shortly thereafter. What is a chipset then? A SATA controller, boot ROM and USB ports? And why do I need

For the longest time, they were the only company that had decent SATA controllers. I can remember getting 300MB/sec sustained read speeds in RAID-0, with burst speeds close to 700MB/sec, off their nForce 5 boards. At the time Intel's controllers (ICH9R?) were choking on SSDs and couldn't manage more than 300MB/sec burst; sustained read was significantly lower.

          Ahh... how companies fall over time.

      • Back to the original question, can you explain how Nvidia voluntarily stopping design of AMD chipsets is Intel's fault? :)

Easily, if producing chipsets is only profitable with economies of scale due to fixed costs. It might not be profitable enough to design only AMD chipsets.

        It doesn't mean Intel did anything wrong (I'm not familiar with the circumstances), but that's up to the courts to decide.

    • Mad is an understatement.

      This article needs a GODDAMNIT tag...

No. More like their chipsets having been utter crap for some years now. Always hailed as the greatest in the tests, but when you actually buy them, you notice weird things. Like the main bus not being big enough, so that an average RAID-0 can make professional sound cards crackle beyond usability. Or the built-in NIC being so bad that you actually have to buy another one and disable the on-board one in the BIOS to keep it from crashing your OS on the first transferred packet. Things like that.

      I would have never e

      • True for me. The last nVidia chipset I was happy with was the nForce 1: perfect reliability, outstanding sound. Since then I've always had small problems, like RAM sticks working everywhere except with their chipsets, heavy HD loads causing OS crashes, heavy USB loads causing OS unresponsiveness...

        I had blacklisted nVidia chipsets years ago, I personally won't miss them. A mono/duopoly isn't ever good, though.

      • by dgatwood ( 11270 )

        Funny, I've been using Macs with NVIDIA chipsets for a while and haven't noticed any of those problems. Maybe it's not the chipset so much as poor BIOS and Windows support for ACPI interrupt steering, poor chipset drivers for Windows, poor Windows drivers that spend way too much time in interrupt handlers... hmm... I think I see a common theme here... Windows.... :-)

  • Comment removed based on user account deletion
    • Re: (Score:2, Informative)

      by Anonymous Coward
No, they are stopping production of their nForce line of chipsets.
      • Re: (Score:3, Informative)

        by Sandbags ( 964742 )

        ONLY for the new i5/i7 architecture and beyond...

        • >>>ONLY for the new i5/i7 architecture and beyond...

I for one welcome our new Intel overlords. Maybe Apple will get smart and switch to AMD-based Macintoshes. Too bad the 68000 series no longer exists; then we could have some real alternatives.

          • Re: (Score:3, Interesting)

            by Sandbags ( 964742 )

Well, as Apple made public knowledge when they switched to Intel (not an exact quote): "we develop, compile, and test OS X on multiple hardware platforms, and always have since the very first day of development, including new processor platforms as they become available, and we can change to an alternate platform at any time."

IBM appears to be working on a low-power P6/P7 architecture, AMD has some nice new stuff, they have their own fab now for low-power CPUs, and I'm sure they're compiling against Atom and likely even Ce

      • Re: (Score:3, Interesting)

        by MBGMorden ( 803437 )

        Which means what the GP said. Nvidia's integrated graphics solutions come in the form of Nvidia chipsets (of which the nForce is the most common). If they're no longer making chipsets, then they're no longer making integrated graphics. There's still the possibility of a maker taking a discrete chip and adding it separately to the motherboard PCB, but with virtually every modern northbridge chip having built in graphics already I don't see that happening. The people who are satisfied with integrated will

        • They are in a legal dispute with Intel and currently cannot produce chipsets for Intel's new CPUs.

          They probably find that they cannot recoup the costs of developing an IGP chipset for just the AMD platform.

And in the quite short term (1 yr), video will move off the chipset and onto the CPU package, making IGP chipsets a dead end.

          Since the Video part has always been the strong point of nVidia's chipsets, they see no point in continuing in the chipset business with non-IGP parts. I understand why.

          If I were th

Their only chance of getting into the CPU business is ARM. x86 is a licensing dead-end. Luckily, companies like TI, Apple, Nokia, and Google are driving a wedge in there, so they might be able to get a foot in the door with those high-performance 2GHz quad-core ARMs that are supposed to come out in 2010 or 2011.

    • Let me quote the fine summary:

      [T]he company [is] backing away from its plans to develop a DMI-based chipset for Intel's Lynnfield processors due to legal pressure from Intel and debates over licensing restrictions.

I'll let you decide which of these two questions that quote is relevant to:

      Are the profit margins too slim on integrated graphics chips?
      Or are they just tired of dealing with Intel's legal dept?

    • They are stopping their nForce line of chipsets (as in, northbridge/southbridge). I couldn't be the only one to see this coming a mile away, could I? Before AMD acquired ATI, they and Nvidia were perfect partners. After that they became a lot less relevant. With Intel and AMD producing their own well regarded "gamer-grade" products for some time now, I can see why Nvidia sees little point in fighting.
    • by LWATCDR ( 28044 )

All of the above, plus Intel is going to put the GPU on the CPU soon.
      Intel is going to kill the integrated graphics market with that move, and AMD/ATI is planning to do the same thing.
      So since Intel's GPUs are terrible, we will just have to wait and see what comes of this.
      The big impact I see is on Apple. They are really tied to Intel but have been using nVidia GPUs.

    • Re:Bad idea?? (Score:4, Interesting)

      by Kjella ( 173770 ) on Friday October 09, 2009 @01:05PM (#29695269) Homepage

      Intel will be putting graphics on the CPU, according to their roadmap.
      AMD will be putting graphics on the CPU, according to their roadmap.

At that point the GPU is already a "sunk cost"; no one will buy an integrated GPU that's only slightly better than another integrated GPU. It's also not only legal reasons, but also pricing, timing, access to resources and so on. Intel can increase license costs, do its accounting so more profits go on processors, delay launches of competing chipsets, deny access to resources for working out incompatibilities or instabilities, and so on. Intel is doing extremely well and is ready to make that land grab, one way or the other. I think nVidia plays better as the victim of Intel's legal department than as someone gently pushed out the door as the GPU joins the CPU.

I stopped using discrete modems and went for winmodems (softmodems) almost immediately because of the latency getting the data through the serial port [linmodems.org]. Sadly, it is the same for graphics cards, which is why you will never catch me dead with one in my machine. I will pown (sic) you all every day of the week.
  • by TheGratefulNet ( 143330 ) on Friday October 09, 2009 @12:29PM (#29694707)

Due to many problems: reports of data corruption at the design level (not build or parts, but *design* faults). Their ethernet drivers were horribly reverse engineered and never came close to the stability of the eepro1000, for example, at least on Linux.

    There were issues with SATA and compatibility.

    In short, they were in over their heads. Glad they finally admitted it (sort of).

  • by PolarBearFire ( 1176791 ) on Friday October 09, 2009 @12:33PM (#29694763)
They had better have a compelling product with the upcoming Fermi then, but from what I hear they're trying to design their GPUs for more general purpose computing, specifically scientific computation. It's a really big gamble, and I can't see it being a huge market. Their upcoming products are supposed to have 3 billion transistors, which is more than 4x the count in an i7 CPU. It's probably going to cost a ton too.
    • It's probably going to cost a ton too.

Sure it will, but it's meant as a replacement for a clusterf*ck of metal that costs in the millions. If it can compete with small supercomputers, they have a good chance IMO. They're also attacking from below with Tegra, and with ChromeOS running on ARM, so I think Nvidia is a company to watch.

      • by Svartalf ( 2997 )

        Ah...but NVidia said WinCE was the way to go on ARM...

      • Re: (Score:3, Interesting)

        by MBGMorden ( 803437 )

You have to look at the target market though. Sure, they might be the deal of the century for the occasional scientist looking for supercomputer power on a budget, but in reality few regular users - hell, few extreme power users - need anything resembling a supercomputer (not just raw speed; supercomputers are designed much more for parallel processing, and a ton of what users do is better suited to serial processing).

        Overall, I think they do indeed have a target market - I just don't see that target ma

And yet 3 billion transistors is only 50% more than the AMD Radeon HD 5800 series.

      Considering that they're adding general purpose functionality and direct C++ programming on the chip, an extra billion transistors might not be an entirely unreasonable result. But time will tell.

    • Re: (Score:3, Interesting)

      by blackchiney ( 556583 )
I don't see why they couldn't go for the GPGPU and scientific computation market. They've acquired a lot of SGI and Cray IP. The x86 has been done to death; except for more cores and a faster bus, there isn't much more R&D there. And I'm not really sure why they got into the chipset business in the first place. Intel and AMD had it helmed up, leaving very little for a third competitor.

      Their core competences are in GPUs; they have a lot of IP there. This is valuable for negotiating licenses against the
      • Re: (Score:3, Informative)

        by Kjella ( 173770 )

        And I'm not really sure why they got into the chipset business in the first place. Intel and AMD had it helmed up

You must have a very warped memory of when nVidia entered the chipset business. Their first chipsets came out before AMD bought ATI, and nForce mostly killed off a terrible line of VIA chipsets. They were really good at their best; they're just being squeezed out of the market.

  • by MrNemesis ( 587188 ) on Friday October 09, 2009 @12:36PM (#29694827) Homepage Journal

...that nVidia are at least going to make a stab at providing graphics-enabled southbridges or something, because for things like HTPCs an Intel CPU + nVidia integrated graphics is brilliant. If I'm in a market that calls for integrated graphics (in the case of HTPCs, for power usage and space considerations), then the GPU is more important than the CPU, and I find myself being pushed to AMD for the whole platform.

    Intel is really shooting themselves in the foot with all this bus licensing stuff, IMHO. By scaring off nVidia IGPs, they're left with their own mediocre offerings which, in my experience, are vastly inferior even in graphics tasks that don't involve 3D.

    If nVidia can supply us with minuscule IGPs-on-a-PCIe-stick-for-a-tenner then great, but their recent developments seem to be pushing them into niche applications (primarily bigger and bigger GPU dies), and I'm worried an Intel platform will make me choose between an Intel IGP or a power-guzzling graphics card. Heck, pretty much every machine I've built for others in the last five years has had an ATI or nVidia IGP, because I don't know anyone who games.

    Disclaimer: I have every type of GPU in my house. I use nVidia IGPs for all my HTPCs, since they're the only ones that are consistently good for HD content under both Windows and Linux. Intel IGPs suck for video (my X3100 can't keep up with SD x264 scaled to a 1900x1200 screen without tearing and lag) but are fine for my laptops (low power usage preferred), and I have a mix of ATI and nVidia graphics cards on the machines that need 3D. I was annoyed enough when nVidia IGPs stopped appearing for AMD boards, but not having them at all will be a serious pain in the arse.

Have you tried AMD IGPs? They're quite good for HTPC.

      • by Big Boss ( 7354 ) on Friday October 09, 2009 @03:41PM (#29697481)

Unless you run Linux. Check the MythTV mailing list sometime; nearly every post referencing an ATI gfx chip is from someone who can't get even basic stuff working. NVidia gave us VDPAU; ATI has yet to answer that. I have no idea how ATI does in Windows for HTPCs, as I don't run Windows on my HTPCs. The license alone would be 30% of the cost of the machine even if I wanted to use it. Too much for too little.

Agreed. I'm a MythTV man myself, and I gave up on the ATI stuff for decent high-end video playback ages ago. It's just sad to face the prospect of no good IGPs for my Myth boxes.

I know how you feel. I've chosen AMD + nVidia for just about every desktop I've built that didn't require discrete graphics. Can't beat the price or the performance.

      Looks like I'm switching to AMD + ATI! I certainly won't be going Intel - IGP is the weakest link in an HTPC, and Intel's IGPs certainly can't compete!

  • by Anonymous Coward

    ... if it weren't a complete fabrication [hardocp.com].

  • This is False (Score:5, Informative)

    by Sycon ( 1622433 ) on Friday October 09, 2009 @12:39PM (#29694879)
    http://www.tomshardware.com/news/nvidia-gpu-graphics-chipset,8821.html [tomshardware.com] They have explicitly stated they have no intention of leaving the chipset business.
Interesting. Something on the Internet that isn't true. Worse, it's on Slashdot... oh, too bad it's not the glory days, when everything on the Internet was true and you didn't have to worry about hoaxes or fake news stories.
So, they are stopping development of AMD chipsets, and stopping development of Intel chipsets, leaving... what again?

      And their triumphant "no we are not leaving" statement amounts to "We are going to sell the ones we have designed." Great. As long as Intel makes FSB chips, they can continue to trickle out older chipsets. But no new ones. And they aren't leaving. And there are no American tanks in Baghdad.

      Come on, the only reason they are countering this is because the financial community is noticing,

    • If you always believe what a company says about itself, I have some bridges that are just coming on the market that might interest you as an excellent, ground-floor, turnkey investment opportunity.
    • Re: (Score:3, Interesting)

That's great. Nvidia is outselling ATI chipsets by dumping stock of their nForce4 (that's what the MCP61 is; you'd know these things if you read the PCPer article linked in the summary), a chipset from 2006 that doesn't even support PCIe 2.0. If that's not a sign of things to come, I don't know what is.

And Nvidia is developing ONE new chipset - ION2, for Apple. Since the rest of the world is moving on to mobile i7/i5/i3, and even Atom is getting on-die graphics, I can't foresee Nvidia really investing an

Or did they? I know from experience that companies state all the time that they have no intention of *ever* doing something... right up until the day they do what they had long planned and just wanted to keep secret.

      Of course, that makes such a company look like complete, untrustworthy idiots. But hey, managers are managers for a reason (= huge ego; everything that makes them look bad "does not exist"). ^^

x86 would have gone nowhere if only IBM could make PCs; only the open OEM market let it achieve dominance over competitors like Apple or Commodore. If Intel won't let other companies release chipsets and motherboards for its own processors while AMD is a free-for-all, the technical advantages of Core/Xeon may not be enough to keep market share from slowly eroding in favor of the more open product.

  • by Anonymous Coward on Friday October 09, 2009 @12:44PM (#29694969)

    Reported at HardOCP... http://www.hardocp.com/news/2009/10/08/nvidia_statement_on_chipset_business

    NVIDIA's Ken Brown wanted to give us NVIDIA's thoughts on the current state of its chipset business. So here it is in its full text.

    Hi,

    We've received a number of inquiries recently about NVIDIA's chipset (MCP) business. We'd like to set the record straight on current and future NVIDIA chipset activity.

    On Intel platforms, the NVIDIA GeForce 9400M/ION brands have enjoyed significant sales, as well as critical success. Customers including Apple, Dell, HP, Lenovo, Samsung, Acer, ASUS and others are continuing to incorporate GeForce 9400M and ION products in their current designs. There are many customers that have plans to use ION or GeForce 9400M chipsets for upcoming designs, as well.

On AMD platforms, we continue to sell a higher quantity of chipsets than AMD itself. MCP61-based platforms continue to be extremely well positioned in the entry CPU segments where AMD CPUs are most competitive vs. Intel.

    We will continue to innovate integrated solutions for Intel’s FSB architecture. We firmly believe that this market has a long healthy life ahead. But because of Intel’s improper claims to customers and the market that we aren’t licensed to the new DMI bus and its unfair business tactics, it is effectively impossible for us to market chipsets for future CPUs. So, until we resolve this matter in court next year, we’ll postpone further chipset investments for Intel DMI CPUs.

    Despite Intel's actions, we have innovative products that we are excited to introduce to the market in the months ahead. We know these products will bring with them some amazing breakthroughs that will surprise the industry, just as GeForce 9400M and ION have shaken up the industry this year.

    We expect our MCP business for both Intel and AMD to be strong well into the future.

    Let me know if you have any questions, and thanks for your interest.

    Best,

    Ken

    • Re: (Score:3, Insightful)

      by Rudeboy777 ( 214749 )

      Come on now, I know RTFA is not in fashion but YOU JUST QUOTED THE ARTICLE minus the commentary that points out this letter IS NVIDIA'S ADMISSION that they are leaving the chipset business!!!

  • Old news (Score:2, Interesting)

    This isn't new, they knifed it a year+ ago. I wrote it up then:
    http://www.theinquirer.net/inquirer/news/1021993/nvidia-chipsets-history [theinquirer.net]
    and no one believed it. Now that NV has no choice but to admit it, they stopped pretending. Yay?

    They are doing the same thing about their "not killing" the GTX285/275/260, it is just a temporary shortage or some twaddle. This one won't take a year to admit though.

    -Charlie

Charlie - you claim that Nvidia will be dropping their midrange graphics cards, but offer no explanation why. While I tend to agree with your insight, I can't see why Nvidia would be willing to give up market share just to staunch the bleeding a little. I mean, what the hell else does Nvidia make money on, aside from midrange graphics (Tegra? Too early to tell. Chipsets? They're gone. HPC? Small market.)? It would be foolish to let their one remaining profitable business languish.

      But I

      • I was trying not to pimp my own stuff, but since you asked.....

Short story #1: the G200b-based cards are huge and need expensive PCBs. They cost more to make than the upcoming and likely faster ATI Juniper parts, so NV will have to wrap a $20 bill around each card to make them sell. Not a good long-term business plan. I can't say more, because I was pre-briefed on the ATI cards and agreed not to talk about them. When you read this, keep in mind that I gave Nvidia a very generous benefit of the doubt. You will

        • Re: (Score:3, Informative)

          Thanks Charlie. You don't have to say any more about ATI, because the cat's already out of the bag (some site broke the Tuesday NDA). They'll be moving exclusively to GDDR5 on 128-bit bus for their midrange parts. This means that right now, they could sell a cheap 512MB 5850 with 4 memory chips for next to nothing. And once the 2Gbit GDDR5 parts ship next year, those 1GB 5770 parts can be paired with just 4 memory devices, and could probably be sold for the same cheapo $100.

          The power of a 4890 (almost)

        • by Ant P. ( 974313 )

          I thought you were just trolling at first, but after seeing that response from nVidia...

          Wow. I am NEVER giving that bunch of smarmy douchebags a single penny ever again.

    • Re: (Score:2, Interesting)

      by Zoson ( 300530 )

      And then you were promptly fired for writing FUD.

      Nobody believes a word you say. You lost all credibility long ago.

      It's just a shame the inquirer has not removed your negative, blatantly biased garbage.

    • by tjb ( 226873 )

      How much do they pay you to make shit up?

I don't know if nVidia stole his girlfriend or killed his puppy or what, but god, that man is on a mission!

  • Fuckin' A.

    I never thought it would come to this and I'm sorry to see them go.

They have a huge contract with Apple, which has adopted NVidia chipsets for pretty much the entire Mac product line. Given that Jobs would have preemptively shifted to another chipset platform in the last round of announcements if this were even remotely true, I seriously doubt that NVidia would even think of limiting further chipset R&D to Ion 2.

    Unfortunately I'm used to the editors slipping at least twice a day...

You know Apple switched to the x86 architecture a while ago and uses Intel processors exclusively, right?

      If Nvidia can't produce chipsets for the new Intel processors, that deal is only going to last as long as the FSBs remain marketable. As soon as DMI is the norm from high end to low end Nvidia won't be selling chipsets to anybody.

      Sure, it will be a while, but that deal was doomed as soon as it was written - it is not a long term contract.

Maybe this is a sign that NVIDIA is moving more towards ARM, which has always been a system-on-a-chip architecture. The Tegra lineup is already a very nice product, and with ARM going Cortex-A9 and multicore this year, maybe Nvidia just has a more important space to play in than tinkering around with x86 chipsets?
  • by Zoson ( 300530 ) on Friday October 09, 2009 @01:21PM (#29695545) Journal

    nVidia has published an official response.
    http://hardocp.com/news/2009/10/08/nvidia_statement_on_chipset_business

    • by pavon ( 30274 ) on Friday October 09, 2009 @04:43PM (#29698299)

The summary and the official response say the same damn thing. Furthermore, if you had RTFA, you would know that it quotes the official statement everyone is posting, giving a paragraph-by-paragraph critique of how it doesn't refute anything, just tries to spin it nicely for the stockholders.

      NVIDIA currently has no plans to create any new AMD or Intel chipsets after the ION2. Period.

  • by Sloppy ( 14984 ) on Friday October 09, 2009 @01:23PM (#29695579) Homepage Journal

It looks like, long-term, Intel and AMD/ATI are going to be the only games in town. That wouldn't worry me a whole lot, because I think their stuff looks good on paper, and they'll compete. And both of them are slowly advancing their open source drivers. But the key word is "slowly." If, say, you want to buy a machine to use as a MythTV box or something like that, NVidia is currently the only one it makes sense to buy. Anybody else, and you're going to have to decode your video with the CPU and read promises about how some day you might not have to. I hate reading promises.

I am not looking forward to the day when these two windows of acceptability don't overlap. What happens when you want to build a box and neither Nvidia nor Intel nor AMD has a product that can actually be used, either because they're gone (Nvidia) or their drivers aren't working yet (Intel and AMD)? That is going to suck.

    • Re: (Score:3, Insightful)

      by linhares ( 1241614 )

      I hate reading promises.

      Well then the Nobel committee won't have you.

    • I think you're confused. nVidia isn't leaving the graphics card business. Just the mainboard chipset market (allegedly). I suppose this will mean fewer integrated video solutions based on nVidia, but you'll always be able to go buy a discrete PCI Express 2.0 card for your MythTV box. And on top of that, Intel has really good open drivers for their mainboard chipsets, so the combination of the two could actually make good sense for your situation.
  • Really? That can't be good.

So the new iMac and mini get thinner, with Intel GMA video at half the speed of our old systems, and cost $100 less.

Apple should move to ATI/AMD and dump the low-end Intel systems. The ATI 780G/790GX video systems beat an Intel laptop CPU + Intel GMA video, and the mini and iMacs need desktop CPUs and much better video cards. Apple could keep Intel in the laptops (+ ATI video) and the high-end Mac Pro, and make the xMac.
