
Nvidia To Make CPUs, Going After Intel (bloomberg.com)

Nvidia said it's offering the company's first server microprocessors, extending a push into Intel's most lucrative market with a chip aimed at handling the most complicated computing work. Intel shares fell more than 2% on the news. From a report: The graphics chipmaker has designed a central processing unit, or CPU, based on technology from Arm, a company it's trying to acquire from Japan's SoftBank Group. The Swiss National Supercomputing Centre and U.S. Department of Energy's Los Alamos National Laboratory will be the first to use the chips in their computers, Nvidia said Monday at an online event. Nvidia has focused mainly on graphics processing units, or GPUs, which are used to power video games and data-heavy computing tasks in data centers. CPUs, by contrast, are a type of chip that's more of a generalist and can do basic tasks like running operating systems. Expanding into this product category opens up more revenue opportunities for Nvidia.

Founder and Chief Executive Officer Jensen Huang has made Nvidia the most valuable U.S. chipmaker by delivering on his promise to give graphics chips a major role in the explosion in cloud computing. Data center revenue contributes about 40% of the company's sales, up from less than 7% just five years ago. Intel still has more than 90% of the market in server processors, which can sell for more than $10,000 each. The CPU, named Grace after the late pioneering computer scientist Grace Hopper, is designed to work closely with Nvidia graphics chips to better handle new computing problems that will come with trillion-parameter AI models. Nvidia says systems working with the new chip will be 10 times faster than those currently using a combination of Nvidia graphics chips and Intel CPUs. The new product will be available at the beginning of 2023, Nvidia said.

  • by sinij ( 911942 ) on Monday April 12, 2021 @12:46PM (#61264262)
    Appears someone important at Nvidia bought into the drivel management consultants were pushing. I am skeptical that Nvidia would be able to compete in the cutthroat CPU market, seeing how they are not even able to cope with increased demand in the GPU market. They have what is essentially a license to print free money and they can't even take full advantage of that!
    • by phalse phace ( 454635 ) on Monday April 12, 2021 @01:07PM (#61264350)

      Appears someone important at Nvidia bought into the drivel management consultants were pushing. I am skeptical that Nvidia would be able to compete in the cutthroat CPU market, seeing how they are not even able to cope with increased demand in the GPU market. They have what is essentially a license to print free money and they can't even take full advantage of that!

      How is Nvidia supposed to deal with their GPU shortages when they're a fabless company that outsources their GPU manufacturing to TSMC?

      • By announcing plans for adding more chips to their product mix and using that to generate funding to build their own fab.

        • Re: (Score:3, Insightful)

          by Anonymous Coward

          It takes years to build a fab and get it up and running. That will not address the immediate GPU shortage.

          And you don't just acquire the skills needed to make advanced 5nm and smaller chips (what Nvidia needs) overnight. There's a reason why TSMC and Samsung are the only ones capable of successfully making them right now.

        • by ShanghaiBill ( 739463 ) on Monday April 12, 2021 @01:43PM (#61264524)

          Nvidia has no expertise in fabbing ICs. They have no ability to build a fab, nor to run one.

          The cost of getting into the game is astronomical. TSMC is investing $100B over the next five years. Nvidia's annual profit is $6B.

          Companies with decades of fabbing experience and expertise are failing and getting out of the business, or only using legacy-scale technology (14 nm or larger).

          Designing ICs and fabbing ICs are two completely different businesses. The only company left that tries to do both is Intel, and Intel is not doing well.

        • Re: (Score:2, Redundant)

          Build their own fab! Ahahahaha. Good one. Yes, because that's what one does when one needs to make more chips, they just "build their own fab". You guys crack me up.

          The guy who said "hey, we should build our own fabs" got laughed out of the NVidia headquarters.

          • Not everyone can do it. But the world can handle at least one more. If they announce such plans, nobody else would try because it would be a losing proposition once the pandemic shortage ends.

            There just has to be one company out there crazy enough to try to get funded and go for it. Even if it triples the size of their company in the process, it's still easier than a new startup trying to do the same.

            • Not everyone can do it. But the world can handle at least one more.

              There is already one more: Samsung has a 5 nm fab.

              If they announce such plans, nobody else would try

              Nobody else would try because they would be laughing too hard.

              because it would be a losing proposition once the pandemic shortage ends.

              The lead time to build a fab is 5 years if you know what you are doing. Nvidia doesn't know anything about building or running a fab.

              TSMC is investing $100B in new fabs, some at 3 nm. They would be on-line long before Nvidia could make their first chip.

              There just has to be one company out there crazy enough to try to get funded and go for it.

              Nobody would fund it.

        • By announcing plans for adding more chips to their product mix and using that to generate funding to build their own fab.

          Building their own fab? Like it's just that easy. Even Apple doesn't have their own chip fabs, and Intel, which has decades of experience in chip fabrication, is lagging behind TSMC and Samsung. How on earth do you think Nvidia, a company with no experience in fabricating processors, is just going to "build their own fab"?

          If they have the money then they are much better off investing in TSMC or Samsung, companies that do have experience with cutting edge processor fabrication, to build out more capacity for the

      • By building into their silicon hardware countermeasures against their being used for computing hashes: either block it outright, or throttle it after detecting long-term hashing. This will kill the market for these cards among crypto miners so they can get back into the hands of gamers, engineers and computer enthusiasts. I seriously don't understand why people are still GPU mining; I thought ASICs had far surpassed GPU mining, oh, I don't know, nearly a decade ago?
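        A toy sketch of that throttling idea in Python (purely illustrative; the "hash-like" signal, window length, threshold and clock cut are all made-up assumptions here, not how any real driver or Nvidia's actual limiter works):

        from collections import deque

        WINDOW = 60            # seconds of workload history to consider
        THRESHOLD = 0.9        # fraction of the window that must look hash-like
        THROTTLED_CLOCK = 0.5  # fraction of full clock speed once throttled

        class HashThrottle:
            def __init__(self):
                self.history = deque(maxlen=WINDOW)  # 1 = hash-like second, 0 = normal

            def observe(self, hash_like: bool) -> float:
                """Record one second of workload; return the clock multiplier to apply."""
                self.history.append(1 if hash_like else 0)
                sustained = sum(self.history) / WINDOW >= THRESHOLD
                return THROTTLED_CLOCK if sustained else 1.0

        # Sustained mining-like load trips the throttle; short bursts don't.
        gpu = HashThrottle()
        for _ in range(59):
            gpu.observe(True)
        print(gpu.observe(True))   # 0.5 -> clocks cut after sustained hashing
        print(gpu.observe(False))  # still 0.5 until the window drains below threshold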
        • by Zak3056 ( 69287 )

          By building into their silicon hardware countermeasures against their being used for computing hashes: either block it outright, or throttle it after detecting long-term hashing.

          So you're suggesting that the way for them to get the best use out of their current ability to print money is to smash the printing press with a hammer so you can have a cheap GPU?

          Good luck with that.

          • All depends on how long the crypto craze lasts. If they shoot themselves in the foot and piss off the users who want to use the cards as they were designed but can't get them, those users will go over to AMD/ATI or whatever they brand their video chips under now. Then crypto takes a crap, or someone comes out with some crazy ASIC, and they're going to be left holding the bag with a shit ton of video chipsets/cards that no one will want to touch because everyone has gone somewhere else where they can get what they need. It may take t
            • If the unavailability of cards that you can actually game on goes on long enough, you might start having game devs tweak their games to perform better on what people can actually get their hands on. What's the point of optimizing for something that only single-digit percentages of your customer base can get their hands on? Once game devs start optimizing for hardware that is not what you provide, that will provide even more momentum for people to switch teams. That turns AMD into the leader, and Nvidia is having to reverse
        • by Saffaya ( 702234 )

          The price of cryptos has inflated so much that it is profitable to GPU mine some of them.
          Yes, that should tell you something about the irrationality of this rise.
          When the craziness subsides and prices come down again, only ASICs will be profitable, and even then only the ones made on smaller lithography. Of those there won't be many, as the fabs these days are busy with orders for everything but ASIC miners, and thus the makers can't produce more.

      • How is Nvidia supposed to deal with their GPU shortages when they're a fabless company that outsources their GPU manufacturing to TSMC?

        Fab companies have limited capacity, that much is true. Companies like Nvidia, Apple, AMD and Qualcomm compete for that capacity. It works on a bid system: the company that offers the most is the one that gets more chips made (see the toy sketch below).

        So Nvidia can absolutely get more chips produced, they just have to up their bid, or engage with more manufacturers. They are not powerless, they just don't want to pay more.

        In the long term this will balance out. TSMC profits from the bidding war, and they are currently investing t
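        To make the bidding point concrete, a toy Python sketch (the company names, dollar figures and greedy highest-bid-first allocation are all invented for illustration; real foundry allocation is negotiated, not a simple auction):

        capacity_wafers = 100_000  # hypothetical monthly wafer capacity

        bids = {"Apple": 17_000, "Nvidia": 12_000, "AMD": 13_500, "Qualcomm": 12_500}    # $/wafer offered
        demand = {"Apple": 60_000, "Nvidia": 40_000, "AMD": 30_000, "Qualcomm": 25_000}  # wafers wanted

        allocation = {}
        remaining = capacity_wafers
        for company in sorted(bids, key=bids.get, reverse=True):  # highest bidder served first
            allocation[company] = min(demand[company], remaining)
            remaining -= allocation[company]

        print(allocation)  # Nvidia gets nothing here unless they up their bid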

      • Yeah! Totally not at all like all those other companies that are having absolutely no trouble whatsoever keeping up with massive increases in consumer demand during a global pandemic that's causing shortages of everything from semiconductors to shipping containers at every step in the supply chain! You know, companies like ummm [theguardian.com] ...?

    • by DarkOx ( 621550 )

      Umm, because diversification of your product portfolio, as long as it's not too far from your core competencies, is usually a good idea.

      After all, if your only big money maker is GPUs and your best argument for your product is more performance per dollar than the other guys, there is always a risk you get leapfrogged. Ask Intel.

      At least if you have a solid set of complementary products, you might be able to push stock of your primary product based on being a single-source vendor, integration, or the compleme

      • by Creepy ( 93888 )

        Also, it's based on Arm (originally the Acorn RISC Machine), a company they are attempting to acquire, though I think the deal is unlikely to go through (I don't see the EU approving it). That said, ARM manufactures nothing and licenses its technology to pretty much everyone. This strongly suggests these machines will target the server market, though both Apple and Microsoft have taken steps to move end users to ARM, so that space should open up.

        Apple jumped in with both feet, abandoning Intel; Microsoft is being a little more careful, with Surfa

    • by waspleg ( 316038 )

      Don't forget, Intel is quietly working on discrete GPUs as well.

    • One way of dealing with this is by offering CPUs with integrated graphics. Buttcoin miners will not invest so much in CPUs where 90+ percent of the cost comes from non-GPU functionality. Yes, it will be slower than discrete graphics, but gamers and developers will adapt.


  • Just when you thought the news couldn't get any worse for Intel.
    • by bill_mcgonigle ( 4333 ) * on Monday April 12, 2021 @12:57PM (#61264302) Homepage Journal

      > Just when you thought the news couldn't get any worse for Intel.

      Intel's problems are largely speculative.

    • by phalse phace ( 454635 ) on Monday April 12, 2021 @01:08PM (#61264362)

      It's not good news for Intel or AMD.

      AMD stock dropped 3% on the news.

      • So what fantastic news did AMD have that made its stock skyrocket by 3% close to two weeks ago? 3% is a completely irrelevant move in the stock market. I mean, just look at the past 4 weeks: there were 3% moves on March 17th, March 22nd and March 31st, then it was stable for 2 weeks, and now another 3% move.

        If you look at AMD's stock price over this time frame you couldn't even identify the NVIDIA announcement, to say nothing of extending the view back a few months (AMD is 20% down from w

    • Comment removed based on user account deletion
      • by Anonymous Coward

        I remember another company that decided to bequeath the low-end server market to another company, thinking that they would continue to profit on the higher-end systems.

        Sun Microsystems ended up imploding and getting sold off to Oracle in a fire sale, with Intel pretty much being able to dominate the server space for the last decade.

        This stupidity will repeat itself, only with Intel taking the place of Sun. I'm not saying that Intel will roll over and die, but they will have a much harder time having t

      • Nvidia is in an excellent position to make ARM more relevant in the compute-heavy part of computing because of their GPU tech.

        People already doing most of their heavy lifting on GPU will be able to get the same performance for less money by avoiding paying for a full x86 processor.

      • by Creepy ( 93888 )

        As I commented elsewhere, Microsoft has a preview of Windows with x64 compatibility out and already has x86 support (I've heard Adobe software still has issues, so it isn't ready for prime time yet). Apple has already fully moved into that space with the Mac, and I believe they have both an emulator and Rosetta 2, which converts apps to native ARM on install (that still doesn't mean there won't be app problems; for example, VirtualBox depends on x86/hypervisor/etc.). Linux has multiple ARM-based flavors but 3rd

    • Yes, because NVidia will produce these advanced chips on either unicorn farts or the slot they'll be scheduled into TSMC's fabs in... (checks watch) 2031.

      Slight exaggeration, but the fact is TSMC's fabs can only make so much shit. Neither AMD nor Intel need be overly worried about this unless they literally fucking sit still for the next 3 years.

  • AMD Laptop GPUs? (Score:5, Interesting)

    by K. S. Kyosuke ( 729550 ) on Monday April 12, 2021 @12:49PM (#61264276)
    I wonder what this will do to the anticompetitive practice of not putting AMD's GPUs into high-end laptops, now that nVidia and Intel will like each other a bit less.
    • AMD can get away with not having raytracing and MLP accelerators for the moment, but they really need an alternative to DLSS (built on something faster than big MLPs, such as TSVQ). Without it they'll have a hard time competing.

      • Raytracing seems more important to me than DLSS. I'm not sure DLSS does anything except in some specific instances.

        • > I'm not sure DLSS does anything except in some specific instances.

          DLSS is basically "smart upscaling" or smart upsampling.

          For example, naive upscaling of native 480p to 1080p is going to have massive artifacts (jagged lines, etc.). Anti-aliasing is one attempt to draw smooth edges, but it has problems with transparency. [wikipedia.org]

          With DLSS you render a high-resolution "ground truth" or "reference image", say at 16K resolution. Then you compare how your native 480p upscaled to 1080p looks against the "referen
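          A minimal numpy sketch of the scoring step described above (illustrative only, and not Nvidia's actual pipeline; the synthetic frame, the 2x factor and the PSNR metric are stand-ins): upscale a low-resolution render naively, then measure it against the high-resolution reference a learned upscaler would be trained to match.

          import numpy as np

          def nearest_neighbor_upscale(img: np.ndarray, factor: int) -> np.ndarray:
              """Naive upscaling: repeat each pixel `factor` times along both axes."""
              return img.repeat(factor, axis=0).repeat(factor, axis=1)

          def psnr(reference: np.ndarray, candidate: np.ndarray) -> float:
              """Peak signal-to-noise ratio vs. the ground-truth reference (higher is better)."""
              mse = np.mean((reference.astype(np.float64) - candidate.astype(np.float64)) ** 2)
              return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

          # Synthetic "ground truth" frame: a diagonal gradient standing in for a hi-res render.
          reference = np.fromfunction(lambda y, x: (x + y) % 256, (960, 960)).astype(np.uint8)
          low_res = reference[::2, ::2]                    # simulated low-resolution render
          upscaled = nearest_neighbor_upscale(low_res, 2)  # what naive upscaling produces

          print(f"PSNR of naive upscale vs reference: {psnr(reference, upscaled):.2f} dB")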

        • Ray tracing is kind of cool in its own right, and when used correctly it can make a good-looking game downright gorgeous, but DLSS is a bigger deal if it can be pulled off well.

          Right now if you have an APU or use integrated graphics you're largely limited to 720p unless you want a slide show. DLSS offers the ability to make 1080p gaming at reasonable FPS an actual possibility without having to get a dedicated GPU.

          Additionally the dedicated hardware for ray tracing isn't particularly useful for anything
          • There are many ways to skin a cat; you can have a classifier determine blending parameters based on something more efficient, such as TSVQ.

          • DLSS on the other hand requires hardware that makes ML algorithms run faster and that's a lot easier to generalize. If it comes down to where to spend the transistor budget, I think the hardware accelerators for something like DLSS are far more valuable than those that make ray tracing feasible.

            These days there is no transistor budget. They're effectively unlimited. In laptops and small-form-factor PCs the limiting factor is now thermal design power. It's easy (for very expensive values of fab 'easy') to pack in more than enough transistors to overwhelm any compact cooling system.

            Nowadays the conversation is where to spend the watts. If you can dream it up, the transistors are there. Keeping them from melting is the hard part.

      • AMD has ray tracing in the desktop GPUs now. They are still behind in performance, but AMD has got a foot in the door.

        • Well, they have a hardware implementation of DXR and Vulkan ray tracing (the ratified KHR extensions, not the NV-specific ones), but it runs on their general-purpose compute units, like we did with realtime ray tracing in the past, rather than on specialized hardware as on Nvidia, so naturally they lag in performance until they have dedicated hardware for those tasks.
      • by Creepy ( 93888 )

        nVidia's "Ray Tracing" is kind of a half baked, anyway. Yes, it traces rays, but it bothers me that they call it ray tracing because it isn't traditional ray tracing. It's like calling socialism communism when communism is a subset of socialism and the reverse is not true - also socialism's definition is nebulous as is ray tracing.

  • This is clearly targeted at HPC, not at the web and business logic... saying this is for servers, while strictly true, is misleading.

  • by doug141 ( 863552 ) on Monday April 12, 2021 @01:08PM (#61264360)

    You need to own chip fabs to actually "make" CPUs. Intel has fabs, and is building more. Nvidia begs for time on 3rd party fabs to get their stuff made.

    • I wouldn't characterize paying for a service for which there are multiple suppliers as begging. I am sure Intel has third-party component and machine suppliers; they don't make everything from dirt, do they?

      • If it is critical to your core business then you are pretty much caught begging. One layer of abstraction should be in-house; additional layers can reasonably be outsourced if that is the only viable solution.

      • Multiple suppliers? You mean 2, one of whom (Samsung) may or may not be especially amenable to making a possible future competitor's chips?

        TSMC and Samsung can't make enough chips right now. There's no clear sign this will get better anytime soon. Might be a good time to have your own fabs, even if you're a generation behind.

    • by yarbo ( 626329 )

      They pay for it. I don't beg the supermarket for food, I don't beg my landlord for an apartment, I don't beg my ISP for Internet.

      • They pay for it. I don't beg the supermarket for food, I don't beg my landlord for an apartment, I don't beg my ISP for Internet.

        You must not have Comcast...

      • by doug141 ( 863552 )

        They pay for it. I don't beg the supermarket for food, I don't beg my landlord for an apartment, I don't beg my ISP for Internet.

        There's no supply shortage of the things you mentioned like there is for chips. Actually, if you pay rent in US dollars, it's probably about to go up due to a huge increase in the money supply and corresponding real-estate inflation. For more info, you can search for "chip shortage." You can also search for an Nvidia GPU, but you won't find one for sale at retail.

  • ...Remember when Google was just a search engine, and Nvidia just made graphics cards?

    This happens to all companies that receive overwhelming success, such as Google, which literally branched out into everything, even self-driving cars.
    Nvidia is experiencing a sales boom not even they had expected, 3x the demand; this gives them an unprecedented opportunity to invest in everything. And since they're already producing GPUs (which are essentially a form of CPU, just with more specialized ins

  • Samsung, TSMC, and Intel are building fabs in the US. Only one of these three is an American company.

    If you're the military, who gets the contract?

    • Well, the two non-US companies are still from countries which have historically been strong allies with the US.

      • by waspleg ( 316038 )

        The question is, do we fight World War 3 when China finally makes its move for Taiwan? They only have one aircraft carrier, but it's doing "training" operations there, and they have planes in Taiwanese airspace.

        Fun fact: Mao backed off Taiwan because we threatened nukes. Somehow I don't think it would play out that way this time.

        • Re: (Score:2, Insightful)

          Comment removed based on user account deletion
          • Oh not just that. There's nothing preventing Taiwan from doing the same thing Saddam did when pulling out of Kuwait.

          • 20 years ago the same thing was said about marginalizing Hong Kong.

          • Reminds me of the old Eddie Izzard joke where everyone was fine with Stalin killing millions of Russians because they (every other country) had all been trying to kill them (the Russians) for ages, but no one could stand Hitler because he started killing people next door. It's pretty sad to realize that no one would have gone to war over the Holocaust if it were completely contained to Germany.
    • The US military is probably the last large entity on the planet that would migrate to a new CPU architecture. They're not going to overcome their momentum for decades, even if this new chip is utterly fantastic.

      They'd be looking at AWS, Google and Azure long before the US Military.

  • Comment removed based on user account deletion
  • After dealing with the abysmally underpowered, overheating CPU/SoC Nvidia made for the Google Nexus 7 tablets, I'll keep far away from this attempt at reentering the CPU market. https://en.wikipedia.org/wiki/... [wikipedia.org]
  • Idiots (Score:2, Interesting)

    by backslashdot ( 95548 )

    Why didn't they go with RISC-V? Almost wishing the UK blocks their purchase of ARM.

    • Re:Idiots (Score:4, Informative)

      by J. T. MacLeod ( 111094 ) on Monday April 12, 2021 @02:30PM (#61264764)

      RISC-V has an interesting future, but we are still years away from a part that can compete with other high-end CPUs, and from the tooling surrounding it.

      • Fair enough .. but that's where we should encourage nVidia to help.

        • Nvidia is a for-profit company. Right now GPUs and ARM CPUs are the biggest sellers so that's what they're going to make.

          You may want to ask smaller, low-volume entities for RISC-V hardware, such as the Raspberry Pi Foundation. Keep in mind that even their newly-launched $4 Raspberry Pi Pico uses a dual-core ARM CPU.

          • Yes, they are a for-profit company... that's why I am saying shareholders and people making purchase decisions should reward them for short-term gains.

  • by Tablizer ( 95088 ) on Monday April 12, 2021 @02:14PM (#61264664) Journal

    These appear to be ARM chips, not x86. It's not clear if ARM is good for higher-end servers. x86 has a head-start in server-oriented features and instructions such that ARM may not be able to catch up any time soon. Investors don't like long-term bets and may be disappointed.

    • not x86. It's not clear if ARM is good for higher-end servers

      That depends on the server. Are you after performance at all costs, or performance per watt? ARM servers already exist for this reason. I wonder if NVIDIA is taking on Intel or if they are taking on the Marvell ThunderX2. https://www.gigabyte.com/Enter... [gigabyte.com]

  • If only they could even make GPUs now; it is harder than getting a camel through the eye of a needle to get any NVIDIA GPU!!!! Of course you can get Chinese counterfeits =) They are fun to use, since they're reported to destroy whole machines....
  • Intel is slowly but surely encroaching on their space of data center AI processing. It is not for lack of spending billions on this effort. NVIDIA has no choice but to defend this space by building CPUs, since Apple has shown everyone that the future is the computer on a chip. Expect Intel to announce something similar soon. I am confused about AMD here: instead of beefing up their GPUs to compete with Nvidia, they are going into the FPGA space. Confusing...
  • ARM chips are going to become a standard and eat Intel's and AMD's lunch. Apple already switched to ARM, and soon a bigger computer will be based on the Raspberry Pi with a faster ARM chip.

    I wanted to buy an ARMiga, which is an ARM-based Amiga that emulates old Amigas through software.

  • Are you fucking serious??

    Start being editors, you lazy fucks!

  • Make that RTX 3080 I ordered AND PAID FOR 4 months ago now, k thx... fucking Americans and their American ways.
