
Nvidia Walked Away From PS4 Hardware Negotiations 255

Posted by Soulskill
from the sony-forced-to-console-themselves-with-amd dept.
An anonymous reader writes "Tony Tamasi, Nvidia's senior vice president of content and technology, has said that providing hardware for use in the PlayStation 4 was on the table, but they walked away. Having provided chips for use in both the PS3 and the original Xbox, that decision doesn't come without experience. Nvidia didn't want to commit to producing hardware at the cost Sony was willing to pay. They also considered that by accepting a PS4 contract, they wouldn't have the resources to do something else in another sector. In other words, the PS4 is not a lucrative enough platform to consider when high-end graphics cards and the Tegra line of chips hold so much more revenue potential."
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • by Looker_Device (2857489) * on Friday March 15, 2013 @09:35AM (#43181813)

    You can bet MS has approached them on providing chips for Durango too. I wonder if they told *them* to piss off.

    • by robthebloke (1308483) on Friday March 15, 2013 @09:46AM (#43181919)
      I wonder if MS and Sony simply decided that going to a single supplier for both the CPU and GPU was cheaper than using two suppliers, one for each component.
    • Messages are mixed, the rumour mill is convinced it's going to be AMD ... but there are few credible sources.

    • by MBCook (132727) <foobarsoft@foobarsoft.com> on Friday March 15, 2013 @10:53AM (#43182459) Homepage

      Can you? One of the reasons the original XBox was pulled off the market as soon as the 360 came out (and no slim was ever made) was because nVidia reportedly refused to do a die shrink or combine dies, etc. So MS was left with a big, hot, expensive chip while Sony was able to shrink theirs and lower their costs dramatically.

      MS might still hold a grudge on that one.

      • by rwise2112 (648849) on Friday March 15, 2013 @11:00AM (#43182515)

        Can you? One of the reasons the original XBox was pulled off the market as soon as the 360 came out (and no slim was ever made) was because nVidia reportedly refused to do a die shrink or combine dies, etc. So MS was left with a big, hot, expensive chip while Sony was able to shrink theirs and lower their costs dramatically.

        MS might still hold a grudge on that one.

        No sane business holds grudges like that. If MS wants it, it'll be written into the next contract and either nVidia will agree or not get the contract.

        • Re: (Score:2, Insightful)

          by Anonymous Coward

          No sane business holds grudges like that.

          The electronics industry isn't sane, then.

        • No sane business holds grudges like that. If MS wants it, it'll be written into the next contract and either nVidia will agree or not get the contract.

          Apple anyone? Rumor was Apple was going with Nvidia. Nvidia announced that it had a deal with Apple, and then Apple (well, Jobs) killed the deal. Why? Apple has to announce things on Apple's schedule, i.e. at some hyped Apple event.

          Many businesses are run by what many would consider to be not sane people. Sometimes that helps the business, and other times it hurts the business.

          • by am 2k (217885)

            Apple anyone? Rumor was Apple was going with Nvidia. Nvidia announced that it had a deal with Apple and then Apple (well, Jobs) killed the deal.

            Major breach of an NDA is a pretty good reason to go with another supplier, not only for Apple.

      • by Nyder (754090)

        Can you? One of the reasons the original XBox was pulled off the market as soon as the 360 came out (and no slim was ever made) was because nVidia reportedly refused to do a die shrink or combine dies, etc. So MS was left with a big, hot, expensive chip while Sony was able to shrink theirs and lower their costs dramatically.

        MS might still hold a grudge on that one.

        The Xbox was pulled because MS wanted everyone to buy an Xbox 360. Why? Because they stand to make more money off new releases, which would be for the Xbox 360, than they would for any original Xbox games being bought. Granted, they still made money off of Xbox Live with the original Xboxes, but not enough, I imagine, to keep producing them.

        I think Sony was the only company still making machines for the previous generation while games were still being made for it. Think it was a year or so after the PS3 was out that new PS2 games finally stopped coming

        • Exactly, I think everyone forgets how Microsoft works. Another fine example is when they end support for an OS: they remove everything from their website that would be of any help to you. It's Microsoft's policy to hook you and force you to upgrade by removing even the most basic self-help. You guys should know that by now, really.

          As for the article, it simply makes sense that AMD offers a fully integrated solution with a low price. And it's a good deal for AMD also.

        • Think it was a year or so after the PS3 was out that new PS2 games finally stopped coming

          Much, much longer than that - new PS2 games were still coming as recently as Q4 2012!

        • by PRMan (959735)

          New PS2 games were made for about 5-6 years after the PS3 came out.

          Six years after the PS3's release, there is still an occasional game being released for the PS2, such as the Final Fantasy XI expansion Seekers of Adoulin, which releases March 27 of this year.

          http://www.extremetech.com/gaming/144342-13-years-after-the-playstation-2-changed-the-industry-sony-finally-halts-production [extremetech.com]

        • by Rotag_FU (2039670)

          Similar to the claim the parent made, my understanding is also that the Xbox was pulled so quickly and replaced by the 360 because of the infeasibility of doing a die shrink to make a higher-margin (or, more accurately, some margin rather than a significant loss) design with the original Xbox.

          My understanding is that Microsoft did not procure the rights to the implementations of the CPU nor the graphics chip used in the original Xbox. This was presumably because the Xbox was rushed and/or MS was not familiar w

  • by h4rr4r (612664) on Friday March 15, 2013 @09:35AM (#43181817)

    You have to provide lots of parts at low cost, and they will surely write in a lower price for each continued year of the console. That means you are tying up fab time on something that is on an outdated process a few years down the road.

    On the other hand, AMD had to do this; they need the money, so any margin is likely acceptable.

    • by Anonymous Coward on Friday March 15, 2013 @09:51AM (#43181967)

      Money, yes, but possibly also market share. People currently often write and test games only on Nvidia hardware, and then, if it does not break totally on AMD cards, consider it done. With the differences between the cards this will give Nvidia a performance advantage in all games written this way, although I have no idea how much. AMD just turned the tables for all games written originally for the PS4, quite a win for PC ports of console games too I expect.

      • by Luthair (847766) on Friday March 15, 2013 @09:59AM (#43182013)
        This isn't really true; both the Xbox 360 and the Wii run AMD (well, ATI) GPUs
      • Re: (Score:2, Informative)

        by The Raven (30575)

        Developers develop on NVIDIA because their drivers are better. Flat out better. More compliant, reliable, etc. This has been true for a long time... id Software's Carmack wrote about this years ago, and the situation has not improved since then.

        • Re: (Score:2, Insightful)

          by Anonymous Coward

          Developers develop on NVIDIA because their drivers are more tolerant of stupid programming mistakes.

          Fixed.

          • Someone mod this anon to 11. He's so right. The drivers fail to report GL errors correctly (unlike ATI/Intel); hell, you can even link shaders without having compiled them (just setting the source is good enough). Nvidia drivers are the bane of my life.
        • That hasn't been my experience. I've simply had better reliability from AMD's offerings (on Linux) than Nvidia's. But everyone is entitled to their opinion. And by the way, Carmack also said that Linux wasn't viable as a gaming platform, so... Gabe disagrees.

    • by DarthVain (724186)

      If nVidia was this small minded, they deserve whatever they get.

      Having all games (and thus their ports) on millions and millions of Xbox and PS consoles designed and optimized for your specific hardware for the next 10 years is worth money. Any profit they actually get is just icing.

      • Re: (Score:2, Insightful)

        by Osgeld (1900440)

        Yeah, but that optimization only lasts a few months before it's totally outdated, usually before the console hits the stores. The embedded solutions, on the other hand, have a much higher rate of product rotation, meaning you can get the latest and greatest out to customers without holding up fab capacity on a 10-year-old design for systems that usually only have high sales within the first couple of years.

        • by citizenr (871508)

          Please tell me more about how optimizing games to be properly multithreaded will only last a few months and then be outdated.

      • by wonkey_monkey (2592601) on Friday March 15, 2013 @10:20AM (#43182175) Homepage

        Having all games (and thus their ports) on millions and millions of Xbox and PS consoles designed and optimized for your specific hardware for the next 10 years is worth money. Any profit they actually get is just icing.

        Quick, better call Nvidia and tell them this before they make a terrible, terrible mistake! Just say you're calling from Slashdot - they'll put you straight through to the CEO.

        • by Nerdfest (867930) on Friday March 15, 2013 @10:40AM (#43182323)

          You're making the assumption that they thought about this. The people involved in the decision probably numbered in the dozens tops, with most of them marketing and finance people. With the way companies seem to be run to realize maximum profits in the short term these days, it's even possible they realized this but turned down the long term gain anyway.

          • by dnahelicase (1594971) on Friday March 15, 2013 @11:03AM (#43182551)

            You're making the assumption that they thought about this. The people involved in the decision probably numbered in the dozens tops, with most of them marketing and finance people. With the way companies seem to be run to realize maximum profits in the short term these days, it's even possible they realized this but turned down the long term gain anyway.

            Given the fact that we're talking about AMD and Nvidia, my guess is that it was a thoughtful decision.

            Given that they have walked away before, that AMD is in previous consoles, and that everyone from the tech world to Wall Street keeps crying that AMD is near its end (even though it's not), it sounds like they might have made a good decision.

            AMD is going to spend a lot of time making a low margin product that is going to be outdated next year but one that they have to keep spending resources and time on for years. Nvidia is going to be spending their time on supercomputer applications, drivers, and pushing their image as a higher end card.

            Sometimes you walk away from a business deal because you want your competitor to win it.

            • by Nerdfest (867930)

              That's most likely in this case, but I still don't think it's guaranteed. Companies make bad, short sighted decisions all the time.

              • by PRMan (959735)
                NVidia's CEO has repeatedly shown himself to be a shrewd man who knows where he wants the company to go. They have stayed in business a long time while others have come and gone. If he doesn't want this business (at the probably ridiculous bargain-basement prices that Sony wanted to pay), then he's probably right.
          • You're making the assumption that they thought about this.

            More specifically, I'm making the assumption that Nvidia, the multi-billion dollar company, has thought about this deal harder and for longer than the kind of Slashdotter who likes to chip in on these stories a few minutes after reading about it.

          • by tlhIngan (30335)

            You're making the assumption that they thought about this. The people involved in the decision probably numbered in the dozens tops, with most of them marketing and finance people. With the way companies seem to be run to realize maximum profits in the short term these days, it's even possible they realized this but turned down the long term gain anyway.

            Or they DID think about this.

            Of the last gen consoles, two went ATi/AMD - Xbox360 and Wii. One went nVidia - PS3 (the RSX). nvidia was involved in the gener

        • by DarthVain (724186)

          Yet, do you disagree?

          Very likely the "mistake" is out of their hands, and there's nothing a CEO can do about it other than build a time machine, go back in time, and either A) prevent AMD from buying ATI, B) buy ATI themselves, or C) somehow convince Intel to buy nVidia; then go forward in time and place a competitive bid on something they couldn't have won without the advent of time travel.

      • Having all games (and thus their ports) on millions and millions of Xbox and PS consoles designed and optimized for your specific hardware for the next 10 years is worth money.

        The problem is that AMD changed graphics card architecture with the HD 3xxx series, meaning that free console optimization only exists on the 1xxx and 2xxx series.

      • Having all games (and thus their ports) on millions and millions of Xbox and PS consoles designed and optimized for your specific hardware for the next 10 years is worth money. Any profit they actually get is just icing.

        Profit is never "just icing". Profit is the entire purpose for a for-profit company. No (sane) business exists to just make revenue. They have to do more than break even on the job.

    • AMD has another advantage in this sort of business. Since they no longer own their own fabrication plants, they can simply contract the work out to another fab if capacity becomes a constraint at their first-choice vendor.
      • by 0123456 (636235)

        AMD has another advantage in this sort of business. Since they no longer own their own fabrication plants, they can simply contract the work out to another fab if capacity becomes a constraint at their first-choice vendor.

        Uh, they could always have done that. No one forced them to use their own fabs for all their chips.

        While flexibility is an advantage, being totally reliant on third-party suppliers is not.

      • by Guspaz (556486)

        Except nVidia has never had their own fabs, so that's not an advantage for AMD.

    • by unixisc (2429386)
      Not if the margin is negative. Once it goes into the red, then selling more just makes the losses greater.
  • by GrosTuba (227941) on Friday March 15, 2013 @09:36AM (#43181835)

    Just sayin'...

  • The only thing I can take from this is that the potential growth in mobile platforms far outstrips the costs associated with developing hardware for another game console platform. Like a previous comment asked, I wonder if they told Microsoft to go away as well. If they did, what does this mean in the bigger picture? Is the future of gaming on tablets?
    • Re:dem Economics (Score:4, Interesting)

      by David_Hart (1184661) on Friday March 15, 2013 @10:08AM (#43182073)

      The only thing I can take from this is that the potential growth in mobile platforms far outstrips the costs associated with developing hardware for another game console platform. Like a previous comment asked, I wonder if they told Microsoft to go away as well. If they did, what does this mean in the bigger picture? Is the future of gaming on tablets?

      My thought is that tablets will allow us to extend games and make them portable. For example, I would have loved to have been able to play Skyrim on the PS3 and the tablet: the PS3 at home and the tablet when on the road. Saved games would be synched to the cloud, similar to what Steam does today, and downloaded to the tablet so that you could pick up where you left off. The capabilities of tablets would have to improve quite a bit before this happens, but it is coming...

      • Without some very clever thinking(or a greater acceptance among tablet users of peripherals), that is going to be a brutal UI problem...

        Even between PC and console, which are practically cousins in the 'lots of buttons and a pointing device' family of interface devices, you can smell a console port a mile away because of how wrong its interface feels. Some are salveagable(Thank you, thank you SkyUI!), some are basically game-breakers(Sorry GTA IV, I wanted to enjoy you...)

        Tablets are a whole different kettl

        • by Dishevel (1105119)

          Why would you not just hook up a PS3 controller via bluetooth to your tablet?

          • Why would you not just hook up a PS3 controller via bluetooth to your tablet?

            "(or a greater acceptance among tablet users of peripherals)" Architecturally, there wouldn't be any significant barrier, and it would be the easiest thing to do. I've just never(in a nontrivial amount of observing heavy tablet-use areas) seen any peripheral use aside from keyboard/case quasi-laptop style arrangements, speaker docks(mostly for smaller devices), and video dongles for projector connections. There isn't anything specifically stopping people; but they just don't seem to.

      • by dnahelicase (1594971) on Friday March 15, 2013 @11:06AM (#43182585)

        My thought is that tablets will allow us to extend games and make them portable. For example, I would have loved to have been able to play Skyrim on the PS3 and the tablet: the PS3 at home and the tablet when on the road. Saved games would be synched to the cloud, similar to what Steam does today, and downloaded to the tablet so that you could pick up where you left off. The capabilities of tablets would have to improve quite a bit before this happens, but it is coming...

        I was thinking the same thing as I was playing SimCity the other day... man, it would be nice if this game was synched to the cloud...

      • If you think playing an Elder Scrolls game on a console (where you can't patch it using the Construction Set) is a good idea, we should all take your opinion as advice on what not to do!
  • Hmm... (Score:5, Interesting)

    by fuzzyfuzzyfungus (1223518) on Friday March 15, 2013 @09:45AM (#43181911) Journal

    I wonder how much of the 'opportunity cost/things we could have been working on instead' factor has to do with the fact that AMD is simply in a tighter spot than Nvidia, and how much it has to do with the fact that AMD already makes CPU/GPU combination packages(and seems interested in making more), while Nvidia has nothing of that sort except their 'Tegra', which might be a snappy mobile part; but is fundamentally punching in a different weight class(if nothing else, Sony's plans for 8GBs of RAM get a lot uglier on a 32-bit architecture. Yes, ARM also has something PAE-like; but PAE is mostly a hack that makes running multiple independent programs on a 32 bit system with more than 4GB of RAM palatable, not something you'd want to design a game engine around.)

    This isn't to say that Nvidia couldn't have done it(heck, what would buying VIA cost these days?); but Nvidia would need, essentially, an entire new flavor of product line for this job, while AMD, whether they call it this or not, is punching out a modestly customized APU, which almost certainly shares substantially with the ones that they sell for PCs.

    • by teg (97890)

      ...how much it has to do with the fact that AMD already makes CPU/GPU combination packages(and seems interested in making more), while Nvidia has nothing of that sort except their 'Tegra', which might be a snappy mobile part...

      This is my guess. AMD can offer an integrated part with good performance. If the choice of a PC-like architecture had already been made (no "cell 2"), then there were two other options: An integrated Intel solution (not very good graphics), or a combination of CPU from Intel and GPU from Nvidia. This would mean more/larger assembly, and two solutions to pay for rather than one.

    • by Guspaz (556486)

      All indications are that this AMD APU has way more graphics hardware onboard than anything you'd find in a consumer part, though. This also seems to be the first proper consumer 8-core chip that AMD has produced; they've never put anything out with that sort of core count before in the consumer market that didn't use the quasi-multicore design where every set of two cores were sharing a lot of the hardware (their answer to SMT). The point is that they're already scaling this thing way the heck up from what

    • Agreed, but it also sounds like nVidia has unquestionably reached the "big corporation" stage. A scrappy startup would have found a way to make the business happen - today's nVidia says, "nah, not a big enough margin" like IBM would. Some of the more interesting corps would have thrown a skunkworks or subsidiary at it if it was that thin of an effort.

      The trouble is, large slow-moving corporations aren't known for innovation, which is essential in this product space. Ordinarily I'd say nVidia ought to wat

      • by 0123456 (636235)

        One thing you're missing is: how many of these consoles are they actually going to sell? Casual gamers have a lot more options with tablets and smartphones than they did when the last generation of consoles came out with really only PCs and older consoles to compete against.

  • by Luthair (847766) on Friday March 15, 2013 @09:51AM (#43181959)

    Why are people running a blatantly self-serving PR story?

    We lost but... we didn't really want to win it anyway!

    • by Kjella (173770) on Friday March 15, 2013 @09:59AM (#43182017) Homepage

      Why are people running a blatantly self-serving PR story?

      We lost but... we didn't really want to win it anyway!

      Yeah, that was what I was thinking too, of course they say that. And if they'd won instead they'd say the exact opposite and we'd hear this drivel from AMD. It's not like Sony and Microsoft had a lot of other options, who should they have gone to? Intel? VIA? PowerVR? No, if both AMD and nVidia had told them to buzz off they'd come back with a better offer. I doubt AMD sold themselves that cheap, since they knew nVidia wouldn't do that either. Just cheap enough to win, keep their volume up and live to fight another day.

    • by evilRhino (638506)
      Running a PR story as news is low hanging fruit for the lazy reporter. The sponsor company probably wrote the whole piece themselves.
  • Allegedly (Score:5, Funny)

    by Thanshin (1188877) on Friday March 15, 2013 @09:51AM (#43181965)

    They, Allegedly, walked away.

    Without video proof, we can't be sure they didn't stroll, strut, or even ramble away.

    • Re:Allegedly (Score:4, Informative)

      by Narishma (822073) on Friday March 15, 2013 @10:09AM (#43182079)

      Not to mention, with phrases like "I'm sure there was a negotiation that went on," the guy just seems to be speculating about what happened, instead of, you know, being there during the negotiations.

    • Re: (Score:3, Funny)

      by Anonymous Coward

      Perhaps they moseyed [penny-arcade.com].

    • by gman003 (1693318) on Friday March 15, 2013 @10:42AM (#43182339)

      Observers from the Ministry of Silly Walks have confirmed (to their disappointment) that their walk was one of the most serious ever recorded, and that they did not amble, dawdle, gambol, hustle, limp, meander, mosey, march, ramble, sashay, saunter, scamper, scurry, sidle, skulk, slink, slog, skip, stroll, stomp, strut, swagger, tiptoe, traipse. They did not even do a forward aerial half turn every alternate step with the left leg, which itself is hardly a silly walk at all.

    • by c (8461)

      They, Allegedly, walked away.

      Without video proof, we can't be sure they didn't stroll, strut, or even ramble away.

      And those are just the "legs" options. We have to consider wheelchairs and crutches (or even "limping away"), or rolling down the hall in a conference room chair shouting "weeeeee!" the whole way. If there was alcohol involved, crawling is certainly an option. If it was a crack team of negotiators, there may have been rappelling...

      Yes, we need video.

  • Bullshit (Score:5, Interesting)

    by DarthVain (724186) on Friday March 15, 2013 @10:02AM (#43182039)

    Considering AMD is producing the CPU chips for both platforms, and the GPU as well, it isn't surprising that nVidia "walked" away. This is the eventual benefit of AMD buying ATI, in that they can now produce both. I have no doubt that AMD either has special considerations or could simply offer a better bid than nVidia could.

    Regardless of the profit, this would be a big feather in AMD's cap. AKA "We produce both the CPU and GPU of all modern game consoles, don't you want to buy our chips?". Also, in the bigger scheme of things: if you have game developers in such numbers making games for YOUR video card, on millions and millions of consoles, games which are then ported to, say, PC, what do you think those games will be optimized for? AMD. Which will look better? AMD. This is something that is going to change things in a pretty large way over the next 10 years.

    nVidia should have paid money to be a part of this if only to prevent their rival in AMD from doing so. Perhaps they didn't have the money. More likely they think they have something that will make a difference. I doubt it.

    "I'm not fired, I quit" is the sentiment I get from nVidia's statement...

    • Re:Bullshit (Score:5, Insightful)

      by Enderandrew (866215) <enderandrew@noSPam.gmail.com> on Friday March 15, 2013 @10:17AM (#43182149) Homepage Journal

      This. I'm shocked no one else saw what was obvious here.

      AMD is providing a unified CPU/GPU on a single die that shares the same memory and bandwidth. For Nvidia to provide a separate GPU to compete at the same performance and price would be really difficult, if not impossible.

      • But the price of the AMD solution goes up because they have to use GDDR5 instead of DDR3 for that memory pool. Estimates I have seen are 2-3x the cost versus DDR3, so it adds an extra 30-50 bucks to the BOM.

        You would spend the same amount of money buying 8GB DDR3 plus 2GB GDDR5 for your GPU, and you could choose whichever combination of CPU/GPU you want! It would also mean you could use cheaper 1Gbit GDDR5 chips.

        I think that Sony is betting on the unified memory architecture giving them an advantage in GP
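The memory-cost argument above works out as simple back-of-envelope arithmetic. Only the 2-3x GDDR5 multiplier and the $30-50 premium come from the comment; the ~$30 baseline for 8 GB of DDR3 is an assumed illustrative price:

```python
# Rough BOM sketch (hypothetical prices): 8 GB of unified GDDR5 versus a
# split pool of 8 GB DDR3 for the CPU plus 2 GB GDDR5 for the GPU.
DDR3_PER_GB = 3.75       # assumption: ~$30 for 8 GB of DDR3
GDDR5_MULTIPLIER = 3.0   # comment's upper estimate: GDDR5 costs 2-3x DDR3

gddr5_per_gb = DDR3_PER_GB * GDDR5_MULTIPLIER

unified_pool = 8 * gddr5_per_gb                   # Sony's unified design
split_pool = 8 * DDR3_PER_GB + 2 * gddr5_per_gb   # conventional split design

extra_cost = unified_pool - split_pool
print(f"unified ${unified_pool:.2f} vs split ${split_pool:.2f}: "
      f"${extra_cost:.2f} extra on the BOM")
```

With the 3x multiplier, the unified pool's premium works out to $37.50, inside the $30-50 range quoted above.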

    • by citizenr (871508)

      if you have game developers in such numbers making games for YOUR video card, on millions and millions of consoles, games which are then ported to, say, PC, what do you think those games will be optimized for?

      Forget the GPU and think about the end of Intel IPC ruling the CPU market. AMD just won the CPU race in the gaming market sector. Games will be written for AMD's 8-core architecture from the ground up, using every possible x86 extension AMD has to offer, and compiled with something other than Intel's "let me check if you're running this on Genuine Intel so I can decide if I'll slow down the code" compiler.

      Also think end of PhysX.

      • by DarthVain (724186)

        Agree.

        Though games, most games anyway, tend to be more limited by the GPU than the CPU, so I can see more optimization (specialization) there. That said, in the long run you are right; I think this will eventually give AMD a bump in market share outside of consoles.

        I know the CPU (and likely the GPU as well) will be a special product, so it will be interesting to see what the details actually become and what they throw in there. If they are strategic and implement some interesting things that is not supported by

    • Any idea on what GPU the "Steam Box" uses? AMD could end up powering gaming in the living room totally. </ConspiracyTheory>
      • by DarthVain (724186)

        If I was a betting man, I would go with AMD also, simply for their integration which translates into low cost. Of course it remains to be seen how revolutionary the "Steam Box" is. Could be huge or a big flop, or even a non-starter.

      • by Narishma (822073)

        Nobody knows what a Steam Box is, let alone what hardware it uses.

    • by Guspaz (556486)

      "We produce both the CPU and GPU of all modern game consoles, don't you want to buy our chips?"

      The Wii U stuck to an IBM PowerPC processor, although it does have an AMD GPU.

      IBM made the processors in all three of last generation's consoles (360/PS3/Wii), what impact did that have on IBM's processor sales?

  • by abigsmurf (919188) on Friday March 15, 2013 @10:18AM (#43182161)
    60 million units doesn't have revenue potential?

    Not only that, the tech they came up with could likely be used for new laptops and set top boxes.

    I suspect it was more likely because they didn't have the level of tech needed. ATI had their APU systems lined up already, and with tweaking, they're perfect for a console. I'm not sure that NVidia had anything approaching the power of these APUs drawn up (their focus has been on desktop graphics and tablets).

    Rumours suggest that the 3DS was going to use NVidia Tegra-based tech, but they couldn't keep the heat down, so Nintendo went with the as-seen-in-every-bargain-bucket-Chinese-tablet Mali+ARM combination.
    • 60 million units doesn't have revenue potential?

      Certainly it has revenue potential, but that doesn't mean it has sufficient ROI compared to other uses of their capital to justify committing the resources. For example, I can spend today on a project that will make me $100 or on one that will make me $120, and absent compelling reasons to choose the former, I'm going with the latter.

    • by Microlith (54737)

      The 3DS GPU isn't Mali but something developed by a Japanese company, I don't believe it's been used in anything else at this point.

    • 60 million units doesn't have revenue potential?

      Wrong question. The correct question is whether it has profit potential. Revenue is just how much you sell. Profits are how much you keep. Profits = Revenue - Expenses. If the revenue is not sufficiently larger than the costs, then there is no point in being in that line of business.

      • by abigsmurf (919188)
        Seems to me like it could've funded their own alternative to AMD's APU line, which seems to be something they'd like a lot, given they're investing in CPU development.
        • And if they only made $0.50 per chip in profit, yes, that's $30 million, but if that ties up half of the engineers at a company that makes $4B in revenue and $500M in profit, it's not worth it, is it? For a console or its consumer, the chips (even the CPU) are on thin margins as it is.
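The scale of that trade-off is easy to check. The $0.50-per-chip margin and the $4B-revenue/$500M-profit company are this commenter's hypotheticals, and the 60 million lifetime units figure is borrowed from the comment that opened this thread:

```python
# Scale check: $0.50 profit per chip over a console's lifetime, compared
# against one year of profit at a hypothetical $500M/year chipmaker.
LIFETIME_UNITS = 60_000_000    # from the "60 million units" comment above
PROFIT_PER_CHIP = 0.50         # this commenter's hypothetical margin

console_deal = LIFETIME_UNITS * PROFIT_PER_CHIP   # spread over many years
annual_profit = 500_000_000                       # hypothetical company

share = console_deal / annual_profit
print(f"console deal: ${console_deal / 1e6:.0f}M total, "
      f"{share:.0%} of a single year's profit")
```

On those numbers, the entire multi-year console commitment returns $30M, about 6% of just one year of the hypothetical company's profit, which is the commenter's point about tying up engineers.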
  • Forget about AMD GPU optimization. Specific shader optimization accounts for maybe 10%.

    Think about future games being natively written with 8 cores in mind. No more buying 2 Nvidia cards to see some PhysX sparks; all games will use Havok, Bullet, or some other physics library computing on AMD GPUs.

  • There seems to be a lot of similarities to the Snapper Lawnmower story [slashdot.org].

    "Jim Wier believed that Snapper's health -- indeed, its very long-term survival -- required that it not do business with Wal-Mart. "

  • by onyxruby (118189) <{onyxruby} {at} {comcast.net}> on Friday March 15, 2013 @04:10PM (#43185779)

    Low profit plus opportunity cost equals a bad decision. Nvidia made the right business choice. That capacity can now be used for more profitable products.
