NVIDIA To License Its GPU Tech

An anonymous reader writes "Today in a blog post, NVIDIA's General Counsel, David Shannon, announced that the company will begin licensing its GPU cores and patent portfolio to device makers. '[I]t's not practical to build silicon or systems to address every part of the expanding market. Adopting a new business approach will allow us to address the universe of devices.' He cites the 'explosion of Android devices' as one of the prime reasons for this decision. 'This opportunity simply didn't exist several years ago because there was really just one computing device – the PC. But the swirling universe of new computing devices provides new opportunities to license our GPU core or visual computing portfolio.' Shannon points out that NVIDIA did something similar with the CPU core used in the PlayStation 3, which was licensed to Sony. But mobile seems to be the big opportunity now: 'We'll start by licensing the GPU core based on the NVIDIA Kepler architecture, the world's most advanced, most efficient GPU. Its DX11, OpenGL 4.3, and GPGPU capabilities, along with vastly superior performance and efficiency, create a new class of licensable GPU cores. Through our efforts designing Tegra into mobile devices, we've gained valuable experience designing for the smallest power envelopes. As a result, Kepler can operate in a half-watt power envelope, making it scalable from smartphones to supercomputers.'"
  • Translation: (Score:5, Interesting)

    by SeaFox ( 739806 ) on Tuesday June 18, 2013 @09:23PM (#44045733)

    We want to transition to an IP company.
    Then we only have to employ lawyers and executives, and save ourselves the trouble of all that making stuff.

    • Re:Translation: (Score:5, Insightful)

      by Trepidity ( 597 ) <[delirium-slashdot] [at] [hackish.org]> on Tuesday June 18, 2013 @09:28PM (#44045751)

      Notice who gave the announcement?

      NVIDIA's General Counsel, David Shannon, announced that...

    • We want to transition to an IP company. Then we only have to employ lawyers and executives, and save ourselves the trouble of all that making stuff.

      Nah, that's not why. They're following in the footsteps of Apple and Sega - license out your key strengths to strategic partners, and you're sure to succeed.

      Right?!?

      • Nah, that's not why. They're following in the footsteps of ARM - license out your key strengths to strategic partners, and you're sure to succeed.

        Right?!?

        FTFY.

[I]t's not practical to build silicon or systems to address every part of the expanding market.

        citing the 'explosion of Android devices' as one of the prime reasons for this decision.

        'This opportunity simply didn't exist several years ago because there was really just one computing device – the PC. But the swirling universe of new computing devices provides new opportunities to license our GPU core or visual computing portfolio.'

So breaking a long-held monopoly and opening a market to competition has led to vastly increased opportunity and innovation. Gee, who'd have thought it?

    • Re:Translation: (Score:5, Insightful)

      by amirulbahr ( 1216502 ) on Tuesday June 18, 2013 @10:10PM (#44045983)

      Yeah because designing a GPU is not really making stuff. A bit like how writing software is done by lawyers and executives.

      This sounds like good news and an obvious step to me. It should lead to smaller and more energy efficient computing devices in the future.

      • Yeah because designing a GPU is not really making stuff. A bit like how writing software is done by lawyers and executives.

        This sounds like good news and an obvious step to me. It should lead to smaller and more energy efficient computing devices in the future.

I suspect that they also don't have too much of a choice: the cost and energy savings of die-level integration with the CPU are difficult to ignore (and, even if they were less impressive, AMD and Intel both have pet GPUs that they integrate into most of their cores, and can freeze out anything more tightly integrated than a PCIe device at their whim, as Intel indeed did when they changed Northbridge interfaces). Either Nvidia commits to building SoCs that are all things to all people (a rather tall order), o…

        • Re:Translation: (Score:4, Insightful)

          by rahvin112 ( 446269 ) on Tuesday June 18, 2013 @11:06PM (#44046211)

          Either Nvidia commits to building SoCs that are all things to all people

That is what Project Denver was supposed to be. This announcement probably confirms that Project Denver is a failure that will never see the light of day. Denver was supposed to be the company's salvation after HPC, Tegra and everything else failed to meet the projections they set with Wall Street.

        • Comment removed based on user account deletion
Imagine laptops and tablets that could run full Windows and Linux when plugged into the wall, but when on battery you would switch into "uber battery" mode and get ARM battery life with none of the downsides... who wouldn't want to buy that?

Sounds like a crappy experience - a Windows "laptop" that sucks when not plugged in? "No thanx," said the world!

            • Comment removed based on user account deletion
Look, if I need a Windows laptop then I'm going to get a Windows laptop. If it's a POS normally and only becomes a Windows laptop when plugged in, I might as well just get two devices, or a desktop computer for that matter!
          • Via's greatest value is to somebody else. So if nVidia tried to buy them they would just be outbid.
          • " like having baked in hardware crypto support"

            I was very disappointed indeed to learn the Atom in my router does not have this.

          • Its not like you couldn't afford to buy a little pipsqueak like Via

            Buy? VIA and Nvidia have similar revenue (4-5B/yr). The only thing that could happen would be a merger, and that would never fly because Jen-Hsun Huang would insist on running the combined company.

            And there's no guarantee that the VIA x86 license would still be honored after a buyout. I believe that's one of the reasons nobody has made a move to purchase them.

Imagine laptops and tablets that could run full Windows and Linux when plugged…

          • So it would be the converse of AMD/ATI - it would be nVidia buying Cyrix/Centaur??
        • The question I have is why this is actually necessary. Is the market actually demanding to pair nvidia GPUs with crappy CPU cores? Because nVidia is already pairing them with good ones and offering SoCs, e.g. Tegra. Tegra has a metric assload of CPU, it's hard to imagine that they couldn't offer a dual-core and a quad-core version and cover the vast majority of cases.

    • We want to transition to an IP company.
      Then we only have to employ lawyers and executives, and save ourselves the trouble of all that making stuff.

Nvidia has been fabless since the beginning; the only difference with this announcement is that they'll sell you the ability to put their GPU on your die, rather than exclusively buying and reselling TSMC-fabbed GPUs of their design...

    • Comment removed (Score:4, Insightful)

      by account_deleted ( 4530225 ) on Tuesday June 18, 2013 @10:37PM (#44046105)
      Comment removed based on user account deletion
      • Re:Translation: (Score:5, Insightful)

        by Cassini2 ( 956052 ) on Tuesday June 18, 2013 @10:59PM (#44046183)

        Intel periodically cuts patent cross-licensing deals with AMD that have the side-effect of bailing AMD out financially. This keeps AMD around as a competitor.

        If Intel adopted Apple's "thermonuclear war" attitude, AMD would have been out of business from the legal fees and injunctions long ago. However, if AMD was out of business, then Intel would be a monopoly and that would be bad for Intel.

        Intel manages AMD, as best it can, such that AMD gets 20% market share, and no x86 profits to speak of. With "only" 80% market share, Intel gets to keep all of the profitable market segments, with no FTC and DOJ oversight. AMD is left appealing to those who want cheap CPUs.

        • Comment removed (Score:5, Interesting)

          by account_deleted ( 4530225 ) on Wednesday June 19, 2013 @12:05AM (#44046557)
          Comment removed based on user account deletion
          • Comment removed based on user account deletion
            • Re: (Score:2, Offtopic)

              Comment removed based on user account deletion
              • by Khyber ( 864651 )

                " I would suggest Arctic Silver, first pre-treat both the CPU and the heatsink to get the paste into the tiny imperfections"

That isn't going to do you any good, considering the ball size of the thermal compound components is larger than most imperfections on the surface of the heat sink and processor packaging (40 or so nm, compared to the 15 or so nm of a heat sink). This is why we're looking into carbon nanotube transfer pads, to fit into those very small imperfections and make for much better heat transfer…

                • Comment removed based on user account deletion
                  • by Khyber ( 864651 )

                    "dude we are talking about an Acer here, you ever seen how little polishing their heatsinks get?"

                    Depends on the model. The old EEEpc had a pretty nice lapped surface on the thermal module. The Aspire series, yea fuck that, they should've just soldered the damned thing to the package.

                    "I posted hard data showing that not only is it not slower the 4225 GPU it has is significantly better than what comes with any Atom netbook."

I am not so sure about that - I had the 4250HD in my dual-core DV7 laptop, that hunk o…

            • I usually don't buy AMD to avoid the repeated erratum issues.

              Intel errata lists are not only plenty long, but the probability of them increasing in size with bugs they didn't find and/or didn't want to admit initially is... well, let's just say I produced an FDIV error while trying to calculate it.

            • by Khyber ( 864651 )

              * The bug still occurs if I place the MFENCE+NOP at the beginning
              of the function.

* Placing the MFENCE+NOP at the end of the function causes the
bug to stop occurring.

              The bug disappears completely in that case with testing over
              2 days.

* Placing just a NOP at the end of the function…
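
For anyone skimming the quoted report: the workaround being described amounts to executing a serializing MFENCE instruction (followed by a NOP) just before the affected function returns. A minimal C sketch of that pattern using GCC-style inline assembly on x86; the function name and body here are hypothetical placeholders, not taken from the report:

    /* Hypothetical function standing in for the one the report describes. */
    static void affected_function(volatile int *shared)
    {
        *shared = 1; /* placeholder for the function's real work */

        /* The quoted workaround: MFENCE serializes all prior loads and
         * stores; the trailing NOP matches the pattern in the report.
         * The "memory" clobber also stops the compiler from reordering
         * accesses across the fence. */
        __asm__ __volatile__("mfence\n\tnop" ::: "memory");
    }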

          • by SLi ( 132609 )

            But that still don't explain why in the fuck Intel don't get busted, after all Apple and Linux were around when the DoJ busted MSFT's ass and if anything Intel has a tighter lock on the market than MSFT ever did. so I want to know who is cashing the checks, who is getting paid off, as i smell some dirty dealing which as we saw with the kickback scandal is SOP for Intel.

Because the crime is not being in a monopoly position, but abusing it. Keeping a competitor alive and at 20% market share, while quite an interesting tactic (if true), actually tends to also prevent the worst monopoly abuses.

This is a popular myth, that Intel must keep AMD around, or otherwise it will be broken up by the government, or something.
In fact, there is nothing illegal about having a monopoly in itself. What is illegal is certain business practices carried out by monopolies.
Intel does not help AMD in any way. In fact, it wants 100% of the x86 market. You can see this by the way it is now going after the lower end of the market with Silvermont-based Celerons.

        • by Khyber ( 864651 )

          "Intel periodically cuts patent cross-licensing deals with AMD that have the side-effect of bailing AMD out financially."

Good thing AMD won the entire console war this upcoming generation, because that means AMD isn't going to have much of a financial problem for the next half a decade, at least.

          "Intel manages AMD, as best it can, such that AMD gets 20% market share, and no x86 profits to speak of."

You might want to rethink that, considering the majority of Intel chips using x86-64 are using AMD 64-bit instructions…

Back then Apple could have had a real nice Mac Pro with dual AMD CPUs and the Nforce Pro chipset, so the Mac Pro 1 would not have needed those PCIe switches all over the place with fewer PCIe lanes than the older G5 had.

      • Re:Translation: (Score:4, Interesting)

        by Kjella ( 173770 ) on Wednesday June 19, 2013 @01:55AM (#44047055) Homepage

        Actual translation "Intel fucked us in the ass more than AMD that at least got a billion plus for their ass reaming, all we got was the curb. Now we are just gonna have to become patent trolls because with AMD owning ATI and Intel going their own way we missed the boat...damn we should have bought Via". (...) Oh and for Nvidia fans...sorry but I could have told ya so. AMD [has been so much smarter]

Yes, because AMD has totally been flowers and sunshine ever since. In their Q1 2013 finances, stockholders' equity was down to $415 million; one more total-disaster quarter like Q4 2012, with its $473 million loss, and they're filing for bankruptcy. Meanwhile nVidia's market cap is more than twice as big as AMD's (and that is after AMD's stock recovered; it was 5x for a little while there) and they're making money, so this is not a back-against-the-wall move. It's the realization that building a complete SoC is complicated and just having good graphics is not enough; better to play the PowerVR game (and PowerVR are not productless IP trolls) and be in other SoCs than to be nowhere at all.

      • by Anonymous Coward

Actually, Intel paid Nvidia over 1 billion [latimes.com] in a settlement two years ago. Also note that Nvidia has announced plans for building a new and impressive campus. I am going to guess that it will cost substantially less than a billion dollars.

Part of Nvidia's agreement with Intel was to cease development of x86-compatible devices, which explains the shift of Project Denver from x86 to ARM. And with ARM came partnerships with Google/Android and that ecosystem, which has outlasted any Tegra deals Nvidia has attempted with…

      • What I want to know is...what in the hell does intel have on the DoJ to keep getting away with this shit?

        They build the CPUs that PRISM runs on, and the CPUs backdoor them a copy of everything hoovered up about DoJ employees.

    • David Shannon says:

      PC sales are declining with the rise of smartphones and tablets.

      Uh oh, our traditional PC market is dying.

      High-definition screens are proliferating, showing up on most every machine. Android is increasingly pervasive. Yesterday’s PC industry, which produced several hundred million units a year, will soon become a computing-devices industry that produces many billions of units a year. And visual computing is at the epicenter of it all.

      But wait! The mobile market is hot hot hot!

      For chip-makers like NVIDIA that invent fundamental advances, this disruption provides an opening to expand our business model.

      We should go all in on mobile and get some of that delicious moolah.

      But it’s not practical to build silicon or systems to address every part of the expanding market. Adopting a new business approach will allow us to address the universe of devices.

      How can we like, totally dominate this market?

      So, our next step is to license our GPU cores and visual computing patent portfolio to device manufacturers to serve the needs of a large piece of the market.

Let's license out our IP! You saw how it like, totally worked for ARM, right?

      The reality is that we’ve done this in the past. We licensed an earlier GPU core to Sony for the Playstation 3. And we receive more than $250 million a year from Intel as a license fee for our visual computing patents.

      We tried it in baby steps, and the money was delicious.

      Now, the explosion of Android devices presents an unprecedented opportunity to accelerate this effort.

      More money is good.

Not sure what you're talking about; sounds a lot more like Xerox (well, PARC anyway) or ARM, both of which were damn good.
    • We want to transition to an IP company.
      Then we only have to employ lawyers and executives, and save ourselves the trouble of all that making stuff.

      Oi, ARM is an IP company too! Nobody seems to have any problem with it!

      • There are lots of IP companies that no one has a problem with. There are basically two business models for IP companies:
        • File or buy a load of patents and then, the next time someone independently invents something you've patented, ask for royalties and sue them if you don't get them.
        • Design things of value and sell the rights to use the designs to companies that would end up paying more if they developed something in house.

There are a load of companies in the second category that are very profitable an…

  • I'm guessing the High Performance Computing guys might be interested as well.
    • I'm guessing the High Performance Computing guys might be interested as well.

      I'd imagine that it depends on how heavily current GPU/CPU compute systems lean on the 'CPU' side of the arrangement:

If the CPU actually keeps reasonably busy (either with aspects of the problem that aren't amenable to GPU work, or with assorted housekeeping tasks required to keep the GPUs fed and coordinated across the cluster), Intel or AMD offer pretty good prices for chips that provide a lot of PCIe lanes, support tons of RAM, and are supported by most of the world's horrid legacy software. Plus, motherb…

      • For super-computing type workloads, ARM does not have a CPU fast enough to deliver the Ethernet, Infiniband, SSD, and other communications traffic to keep a Tesla fed with data.

However, Nvidia's long-term strategy must be to sell low-power and high-power ARM chips with GPU accelerators. Within 2 to 3 years, Intel will have a Xeon product that merges the existing 12-core Xeon processors with the 60-core Xeon Phi accelerators. Similarly, AMD will be building equivalent APU units with their mixed x86, ARM a…

        • by dbIII ( 701233 )
It depends on the workload. Currently, for a wide range of problems, nothing can keep these things fed quickly enough for them to be able to finish the job before a normal CPU can. For other problems they finish an order of magnitude quicker than normal CPUs can. Memory usage is the main thing that separates the tasks that will or won't work on a GPU.
I know the above poster would be aware of this; I'm just trying to simplify it for everyone else.
  • AMD (Score:5, Interesting)

    by Guppy ( 12314 ) on Tuesday June 18, 2013 @10:24PM (#44046053)

If you're wondering about AMD, they also had a project doing graphics for ARM CPUs, but it was outright sold off to Qualcomm.

    Qualcomm's "Adreno" GPU? The name is an anagram of Radeon.

    • Huh, nice catch there, Sparky.

      Aren't AMD getting back into the ARM+GPU game themselves now?
    • Qualcomm's "Adreno" GPU? The name is an anagram of Radeon.

      That explains why the drivers blow so hard. With an assortment of tweaks [xda-developers.com] you can increase Adreno 205 performance by literally 50%.

  • by Anonymous Coward

Maybe now Intel can license the tech and *finally* get a decent GPU in its chips.

  • by Anonymous Coward on Tuesday June 18, 2013 @10:43PM (#44046123)

    The ONLY company on this planet with an interest in very high-end desktop class GPU technology for their own use is Intel. No-one else has the need (PowerVR fills the gap for most companies that license GPU designs) or the ability to build such a complex design into their own SoC.

    Anyone else with an interest in Nvidia GPU capabilities would opt to buy discrete chips from Nvidia, or one of Nvidia's existing ARM SoC parts.

AMD is currently devastating Nvidia in the high end gaming market. Every one of the 3 new consoles uses AMD/ATI tech for the graphics. EA (the massive games developer) has announced their own game engines will be optimised ONLY for AMD CPUs and GPUs (on Xbone, PS4 and PC). Nvidia is falling out of the game.

The x86 space is moving to APUs only: chips that combine the CPU cluster with the GPU system. Intel's integrated GPU is pure garbage. However, Intel spends more on the R&D for its crap GPU than Nvidia and AMD combined. It would be insanely cheaper for Intel to simply license Nvidia's current and future designs. Doing so would give Intel parts that compete with AMD for the first time ever. Of course, it still wouldn't fix the problem that AMD tech is in the only hardware AAA games developers care about.

Next year AMD completes its project to take desktop x86 parts to full HSA and hUMA (as seen in the PS4). Next year Intel begins the process of using this tech (and will be two years behind AMD at best). Both companies are moving to PC motherboards that solder memory and CPU on the board itself. Both are moving to a 256-bit memory interface, although again AMD will have a significant lead here.

    Intel wants to copy AMD's GDDR5 memory interface (again, as seen in the PS4) but that requires a lot of tech Intel does not have, and cannot develop in-house (god only knows, they've tried). Nvidia also has massive expertise with GDDR5 memory interfaces, and the on-chip systems to exploit the incredible bandwidth this memory offers.

Everyone should know Intel wanted to buy Nvidia, but would not accept Nvidia's demand to have their people run the combined company. The top of Intel is BRAINDEAD, composed of the useless morons who claimed credit for the 'Core' CPU design, when all Core was, in reality, was a return to the Pentium 3, after Netburst proved to be a horrible dead-end. This political power grab is responsible for all Intel's current problems, including the biggest disaster in semiconductor history: Larrabee. Intel's FinFET project has crashed twice (Ivy Bridge was much worse than Sandy Bridge, despite the shrink, and Haswell is worse again). Intel has no new desktop chips for 2014 as a consequence.

Now we can see it is likely Intel is readying Nvidia-based parts for 2015 at the earliest. Intel has used licensed GPU tech before, notably the PowerVR architecture. However, Intel's utter inability to write or support drivers meant the PowerVR-based chips were a disaster for Intel. Intel's biggest problem with its current GPU design is NOT that it is a Larrabee-scale failure, but that Intel is actually making headway. So why is this an issue?

    Well companies like S3 also made successful headway with their own designs, but this didn't matter because they were way behind the competition at the time. It is NEVER a case of being better than you were before, but a question of being good enough to go up against the market leaders. Intel knows its progress means that internally its GPU team is being patted on the back and given more support, and yet this is a road to nowhere. Intel needs to bite the bullet, give up on its failed GPU projects, and buy in the best designs the market has to offer. Nvidia is this.

    Unlike PowerVR, which is largely a take it or leave it design (which is why Intel got nowhere with PowerVR), Nvidia comes with software experts (for the Windows drivers) and chip making experts, to help integrate the Nvidia design with Intel's own CPU cores.

    • by Nutria ( 679911 )

      all Intel's current problems

      With US$18Bn in cash and other marketable securities, sales of US$53.3Bn and net income of $11.0Bn, I'll take Intel's problems any day.

    • by rahvin112 ( 446269 ) on Tuesday June 18, 2013 @11:33PM (#44046389)

Intel isn't going to buy or license nVidia stuff. They already have a license to use all their patents through a cross-license deal that excluded a large chunk of Intel patents and IP. Intel is 100% focused on power consumption at this point, and nVidia tech would do nothing but hurt them on this front. Haswell includes a GPU that's almost as good as the nVidia 650 and uses less power than Ivy Bridge. It's also cheaper for the OEMs/ODMs and provides better total power use.

It's trivially easy for Intel to just keep advancing the GPU with each processor generation. As people have been saying for years, nVidia's biggest problem is that as Intel keeps raising the low end with integrated processors that don't suck, they erode significant revenue from nVidia. The reason prices for top-end nVidia parts keep going up is that they are continuing to lose margin on the middle end and have lost the low end. Better than half the computers sold no longer even include a discrete GPU. As Intel continues its slow advance, they will continue to eat more and more of the discrete marketplace. Considering the newest consoles are going to be only marginally better than the current consoles, we're probably looking at another 7 years of gaming stagnation, which in the long run will damage nVidia more as fewer games require more resources than integrated GPUs offer. I seriously doubt nVidia can go much higher than the current $1100 Titan and expect to sell anything at all. I expect over the next two years nVidia will see consecutive quarterly declines in revenue. They've already eroded margin and they can't push price much higher.

They bet their lunch on HPC, and didn't even come close to their projections on sales. Then they bet the farm on Tegra: they sold none of Tegra 1, had just short of no sales on Tegra 2, did OK but only with tablets for Tegra 3, and have announced not a single win for Tegra 4. Project Denver was supposed to be the long-term break with Intel that would provide the company the opportunity to move forward as a total-service SoC company. Denver is supposed to be a custom-designed 64-bit ARM processor with an integrated nVidia GPU. It was projected for the end of 2012. After missing 2012 they claimed end of 2013; this announcement makes me personally believe Project Denver has been canceled. Things haven't looked good for nVidia ever since Intel integrated GPUs and blocked them from the chipset market. They won't be selling to Intel because Intel doesn't want them. The other SoC vendors appear to be satisfied with PowerVR products (which focus on power use), except for Qualcomm, which has the old AMD mobile cores to work with. I can't help but believe that this is, as others have said, an attempt to go total IP and try to litigate a profit. This is probably the beginning of a long slow slide into oblivion. nVidia's CEO has already sold most of his holdings (except for unexercised options, also a very bad sign).

      • by adolf ( 21054 )

        Better than half the computers sold no longer even include a discrete GPU

        Has it ever been the case in the past decade that more than half of the computers sold included a discrete GPU?

Once integrated graphics became a usable thing, the vast majority of systems that I see* do not have a dedicated graphics card: integrated graphics of the day have always been adequate for any non-gaming usage of that same day, and people are (as a rule) cheap.

*: This is an anecdote based on a couple of decades of fixing co…

    • by Anonymous Coward

AMD is currently devastating Nvidia in the high end gaming market

In what universe? AMD are currently anathema due to them STILL failing to fix their massive frame-timing issues in multi-GPU setups, and significantly lower efficiency (performance/watt) for smaller or laptop machines.

And inclusion in consoles is not as big a 'win' as you might think. Console margins are razor thin. It may be a constant source of profit for the next few years, but that profit is not huge and is reliant on continuing to improve the processes the chips are built on (to follow the pressure from console…

    • by Kjella ( 173770 )

      Well companies like S3 also made successful headway with their own designs, but this didn't matter because they were way behind the competition at the time. It is NEVER a case of being better than you were before, but a question of being good enough to go up against the market leaders. Intel knows its progress means that internally its GPU team is being patted on the back and given more support, and yet this is a road to nowhere. Intel needs to bite the bullet, give up on its failed GPU projects, and buy in the best designs the market has to offer. Nvidia is this.

The Steam hardware survey [steampowered.com] seems to disagree; 14% of gamers are now happy running Intel chips, so how many non-gamers do you think find them good enough? A GPU running as part of a CPU with a <100W total power budget is never going to compete with dual SLI/CF 200W+ discrete chips; both Intel and hardcore gamers know that. Intel just wants to be in mainstream products without AMD/nVidia getting discrete chip sales, and they're succeeding; check any statistics for computers shipped with discrete graphics and…

AMD is currently devastating Nvidia in the high end gaming market. Every one of the 3 new consoles uses AMD/ATI tech for the graphics. EA (the massive games developer) has announced their own game engines will be optimised ONLY for AMD CPUs and GPUs (on Xbone, PS4 and PC). Nvidia is falling out of the game.

Where exactly are you getting the numbers for AMD devastating Nvidia? Both the Titan and the 780 are better than AMD's GPUs, and the Steam hardware survey [steampowered.com] still shows 52% for Nvidia and 33% for AMD. The PS4 and the Xbone will use AMD because having a combined CPU and GPU is more convenient for a console, and Intel GPUs are not exactly good.

As for EA, do you understand that they got paid by AMD to do that? This is something that both Nvidia and AMD have done for years; all those games that have a "better with Nvidia/AMD" i…

Consoles are focused on the lowest possible cost of their hardware, since they sell to consumers at a loss, or at best a slim profit. They need their suppliers to give them hardware for bottom dollar. That means you don't get much profit per unit.

Now that doesn't mean AMD is getting screwed; I'm sure they are making money per unit sold. But make no mistake: the reason they got the contracts is that they could offer the lowest price, and that means a thin profit. So 10 million chips sold in the console is less pro…

    • Sucks, because nVidia's drivers blow away AMD's. Radeon is a nice architecture, but after years of abuse I was completely fed up with stuff not working, and my new GTX is rock solid and runs everything I throw at it.

      I'm talking about Windows drivers, of course, so little of this matters to the consoles and embedded developers.

Everyone should know Intel wanted to buy Nvidia, but would not accept Nvidia's demand to have their people run the combined company. The top of Intel is BRAINDEAD, composed of the useless morons who claimed credit for the 'Core' CPU design, when all Core was, in reality, was a return to the Pentium 3, after Netburst proved to be a horrible dead-end.

They didn't just throw Netburst away. Bits of it appeared in Core, alongside the Pentium 3 technology.

      Unlike PowerVR, which is largely a take it or leave it design (which is why Intel got nowhere with PowerVR), Nvidia comes with software experts (for the Windows drivers) and chip making experts, to help integrate the Nvidia design with Intel's own CPU cores.

The difference is that PowerVR is crap and has always been crap. I owned the Riva TNT, I owned the original PowerVR board, I owned the original 3dfx... I've owned examples of all (plus Radeons) since, and PowerVR is the biggest failure; their drivers are even worse than AMD's.

    • by Xest ( 935314 )

      You seem to be talking up AMD a lot and talking down Intel and nVidia.

Given your points, and Intel's supposed major management failures and AMD's devastating of nVidia in the gaming market, could you explain how Intel's $11bn of profit and nVidia's $0.5bn of profit factor into the equation against AMD's -$1bn of profit? Yes, AMD lost twice what nVidia made last year, and made $12bn less than Intel.

Something about your argument doesn't seem to stack up. If AMD was doing so well, and Intel was so badly run, and n…

This sounds a lot like what ARM is doing right now. NVidia will sell the design of their GPU core, maybe some software, IP, and other tech to other companies so they can make and design their own chips based on the NVidia architecture. The thing is, there is demand for high end GPUs, but for it to be reasonable to include NVidia's tech, they need more freedom to design and implement the hardware as they wish. This is going to be a completely new market for NVidia and will bring higher quality graphics and NVidia IP to mor…

  • ...and one week later they'll find themselves competing against a hundred Chinese brands that use exactly the same designs.

  • Shannon points out that NVIDIA did something similar with the CPU core used in the PlayStation 3, which was licensed to Sony

    Really? NVIDIA licensed an AMD CPU core to Sony? Nifty.

They've been selling only to Cray and the Chinese, so their prices are through the roof. I could be developing departmental analytic machines, but apparently working on volume is a scary proposition.
Think about it. If they license Kepler patents to a third-party SoC developer, then that company will be directly competing against their own Tegra 5 chip. So the only way it makes sense is if they are canceling the Tegra 5 project.

No, their successive Tegras will be "high end" mobile chips used in tablets, laptops, consoles like the Ouya and Shield, etc., and maybe expensive 5" cell phones. But going for high end ARM performance means too much power use for lower end and smaller applications (low end tablets, mid-range cell phones, etc.), and the Cortex A15 is already showing this. Licensing the GPU design means expanding at the bottom, where almost nothing currently runs nvidia anyway.

It's also nice for a few nerds that want a Linux cell phone or a…
