Graphics Patents Hardware

NVIDIA To License Its GPU Tech 111

Posted by Soulskill
from the sea-change dept.
An anonymous reader writes "Today in a blog post, NVIDIA's General Counsel, David Shannon, announced that the company will begin licensing its GPU cores and patent portfolio to device makers. '[I]t's not practical to build silicon or systems to address every part of the expanding market. Adopting a new business approach will allow us to address the universe of devices.' He cites the 'explosion of Android devices' as one of the prime reasons for this decision. 'This opportunity simply didn't exist several years ago because there was really just one computing device – the PC. But the swirling universe of new computing devices provides new opportunities to license our GPU core or visual computing portfolio.' Shannon points out that NVIDIA did something similar with the CPU core used in the PlayStation 3, which was licensed to Sony. But mobile seems to be the big opportunity now: 'We'll start by licensing the GPU core based on the NVIDIA Kepler architecture, the world's most advanced, most efficient GPU. Its DX11, OpenGL 4.3, and GPGPU capabilities, along with vastly superior performance and efficiency, create a new class of licensable GPU cores. Through our efforts designing Tegra into mobile devices, we've gained valuable experience designing for the smallest power envelopes. As a result, Kepler can operate in a half-watt power envelope, making it scalable from smartphones to supercomputers.'"
This discussion has been archived. No new comments can be posted.


  • Translation: (Score:5, Interesting)

    by SeaFox (739806) on Tuesday June 18, 2013 @10:23PM (#44045733)

    We want to transition to an IP company.
    Then we only have to employ lawyers and executives, and save ourselves the trouble of all that making stuff.

  • AMD (Score:5, Interesting)

    by Guppy (12314) on Tuesday June 18, 2013 @11:24PM (#44046053)

    If you're wondering about AMD, they also had a project doing graphics for ARM CPUs, but it was outright sold off to Qualcomm.

    Qualcomm's "Adreno" GPU? The name is an anagram of Radeon.
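    The anagram claim is easy to verify mechanically; a one-line check (not from the original comment, just an illustration) in Python:

    ```python
    # Both names are built from the same six letters, so sorting them
    # character-by-character yields identical lists.
    assert sorted("adreno") == sorted("radeon")
    print("Adreno is an anagram of Radeon")
    ```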

  • by Anonymous Coward on Tuesday June 18, 2013 @11:43PM (#44046123)

    The ONLY company on this planet with an interest in very high-end desktop class GPU technology for their own use is Intel. No-one else has the need (PowerVR fills the gap for most companies that license GPU designs) or the ability to build such a complex design into their own SoC.

    Anyone else with an interest in Nvidia GPU capabilities would opt to buy discrete chips from Nvidia, or one of Nvidia's existing ARM SoC parts.

    AMD is currently devastating Nvidia in the high-end gaming market. Every one of the 3 new consoles uses AMD/ATI tech for the graphics. EA (the massive games developer) has announced its own game engines will be optimised ONLY for AMD CPUs and GPUs (on Xbone, PS4 and PC). Nvidia is falling out of the game.

    The x86 space is moving to APUs only: chips that combine the CPU cluster with the GPU system. Intel's integrated GPU is pure garbage. However, Intel spends more on R&D for its crap GPU than Nvidia and AMD combined. It would be insanely cheaper for Intel to simply license Nvidia's current and future designs. Doing so would give Intel parts that compete with AMD for the first time ever. Of course, it still wouldn't fix the problem that AMD tech is in the only hardware AAA games developers care about.

    Next year AMD completes its project to take desktop x86 parts to full HSA and hUMA (as seen in the PS4). Next year Intel begins the process of adopting this tech (and will be two years behind AMD at best). Both companies are moving to PC motherboards that solder memory and CPU onto the board itself. Both are moving to a 256-bit memory interface, although again AMD will have a significant lead here.

    Intel wants to copy AMD's GDDR5 memory interface (again, as seen in the PS4) but that requires a lot of tech Intel does not have, and cannot develop in-house (god only knows, they've tried). Nvidia also has massive expertise with GDDR5 memory interfaces, and the on-chip systems to exploit the incredible bandwidth this memory offers.

    Everyone should know Intel wanted to buy Nvidia, but would not accept Nvidia's demand to have their people run the combined company. The top of Intel is BRAINDEAD, composed of the useless morons who claimed credit for the 'Core' CPU design, when Core was in reality just a return to Pentium 3 after Netburst proved to be a horrible dead-end. This political power grab is responsible for all of Intel's current problems, including the biggest disaster in semiconductor history: Larrabee. Intel's FinFET project has crashed twice (Ivy Bridge was much worse than Sandy Bridge, despite the shrink, and Haswell is worse again). Intel has no new desktop chips for 2014 as a consequence.

    Now we can see it is likely Intel is readying Nvidia-based parts for 2015 at the earliest. Intel has used licensed GPU tech before, notably the PowerVR architecture. However, Intel's utter inability to write or support drivers meant the PowerVR-based chips were a disaster for Intel. Intel's biggest problem with its current GPU design is NOT that it is a Larrabee-scale failure, but that Intel is actually making headway. So why is this an issue?

    Well, companies like S3 also made successful headway with their own designs, but it didn't matter because they were way behind the competition at the time. It is NEVER a case of being better than you were before, but a question of being good enough to go up against the market leaders. Intel knows its progress means that internally its GPU team is being patted on the back and given more support, and yet this is a road to nowhere. Intel needs to bite the bullet, give up on its failed GPU projects, and buy in the best designs the market has to offer. Nvidia is that.

    Unlike PowerVR, which is largely a take-it-or-leave-it design (which is why Intel got nowhere with PowerVR), Nvidia comes with software experts (for the Windows drivers) and chip-making experts to help integrate the Nvidia design with Intel's own CPU cores.

  • by rahvin112 (446269) on Wednesday June 19, 2013 @12:33AM (#44046389)

    Intel isn't going to buy or license nVidia stuff. They already have a license to use all their patents through a cross-license deal that excluded a large chunk of Intel patents and IP. Intel is 100% focused on power consumption at this point, and nVidia tech would do nothing but hurt them on this front. Haswell includes a GPU that's almost as good as the nVidia 650 and uses less power than Ivy Bridge. It's also cheaper for the OEM/ODMs and provides better total power use.

    It's trivially easy for Intel to just keep advancing the GPU with each processor generation. As people have been saying for years, nVidia's biggest problem is that as Intel keeps raising the low end with integrated processors that don't suck, they erode significant revenue from nVidia. The reason prices for top-end nVidia parts keep going up is that they are continuing to lose margin in the middle and have lost the low end entirely. Better than half the computers sold no longer even include a discrete GPU. As Intel continues its slow advance, it will eat more and more of the discrete marketplace. Considering the newest consoles are going to be only marginally better than the current consoles, we're probably looking at another 7 years of gaming stagnation, which in the long run will damage nVidia more as fewer games require more resources than integrated GPUs can provide. I seriously doubt nVidia can go much higher than the current $1100 Titan and expect to sell anything at all. I expect over the next two years nVidia will see consecutive quarterly declines in revenue. They've already eroded margin and they can't push price much higher.

    They bet their lunch on HPC, and didn't even come close to their projections on sales. Then they bet the farm on Tegra: they sold none of Tegra 1, had just short of no sales on Tegra 2, did OK but only with tablets for Tegra 3, and have announced not a single win for Tegra 4. Project Denver was supposed to be the long-term break with Intel that would give the company the opportunity to move forward as a total-service SoC company. Denver is supposed to be a custom-designed 64-bit ARM processor with an integrated nVidia GPU. It was projected for the end of 2012. After missing 2012 they claimed end of 2013; this announcement makes me personally believe Project Denver has been canceled. Things haven't looked good for nVidia ever since Intel integrated GPUs and blocked them from the chipset market. They won't be selling to Intel because Intel doesn't want them. The other SoC vendors appear to be satisfied with PowerVR products (which focus on power use), except for Qualcomm, which has the old AMD mobile cores to work with. I can't help but believe that this is, as others have said, an attempt to go total IP and try to litigate a profit. This is probably the beginning of a long slow slide into oblivion. nVidia's CEO has already sold most of his holdings (except for unexercised options, also a very bad sign).

  • Re:Wow (Score:5, Interesting)

    by symbolset (646467) * on Wednesday June 19, 2013 @01:00AM (#44046525) Journal
    nVidia's graphics drivers include proprietary and patented Microsoft technologies. They cannot open source them, ever. They made their deal with the devil and they have to live with it.
  • Re:Translation: (Score:5, Interesting)

    by hairyfeet (841228) <bassbeast1968.gmail@com> on Wednesday June 19, 2013 @01:05AM (#44046557) Journal

    Which is why I sell AMD exclusively, because I hate insider douchebaggery and Intel is king of the douchebags. And ya know what? I have yet to hear a single complaint that their system is too slow, not one. And I put my money where my mouth is: I have an AMD hexacore that just chews through games and transcoding (and does both at the same time if I want) and cost me $105 shipped, less than a Pentium Dual. So unless a person is in one of those rare fields where they need every possible cycle they can squeeze out of a machine, they really are just pissing money away. And as a bonus, the money I saved let me get a nicer gaming board, twice the memory I would have gotten otherwise, and plenty of upgrade options down the road if I want to go even faster or get the octo-core.

    But that still don't explain why in the fuck Intel don't get busted. After all, Apple and Linux were around when the DoJ busted MSFT's ass, and if anything Intel has a tighter lock on the market than MSFT ever did. So I want to know who is cashing the checks, who is getting paid off, as I smell some dirty dealing, which as we saw with the kickback scandal is SOP for Intel.

  • Re:Translation: (Score:4, Interesting)

    by Kjella (173770) on Wednesday June 19, 2013 @02:55AM (#44047055) Homepage

    Actual translation "Intel fucked us in the ass more than AMD that at least got a billion plus for their ass reaming, all we got was the curb. Now we are just gonna have to become patent trolls because with AMD owning ATI and Intel going their own way we missed the boat...damn we should have bought Via". (...) Oh and for Nvidia fans...sorry but I could have told ya so. AMD [has been so much smarter]

    Yes, because AMD has totally been flowers and sunshine ever since. In their Q1 2013 finances, stockholders' equity was down to $415 million; one more total-disaster quarter like Q4 2012, with its $473 million loss, and they're filing for bankruptcy. Meanwhile nVidia's market cap is more than twice as big as AMD's (and that is after AMD's stock recovered; it was 5x for a little while there), and they're making money, so this is not a back-against-the-wall move. It's the realization that building a complete SoC is complicated and just having good graphics is not enough; better to play the PowerVR game (who are not productless IP trolls) and be in other SoCs than to be nowhere at all.
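    The bankruptcy arithmetic in that comment can be checked directly; a quick sketch using only the figures the comment itself cites (in millions of dollars):

    ```python
    equity_q1_2013 = 415    # AMD stockholders' equity, Q1 2013 ($M, per the comment)
    q4_2012_loss = 473      # loss in the cited "total disaster" Q4 2012 quarter ($M)

    # One more quarter like Q4 2012 would push equity below zero,
    # which is the basis for the bankruptcy claim above.
    remaining = equity_q1_2013 - q4_2012_loss
    print(remaining)  # -58
    ```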
