Hardware Technology

Nvidia's Top AI Chips Are Selling for More Than $40,000 on eBay (cnbc.com) 32

Nvidia's most-advanced graphics cards are selling for more than $40,000 on eBay, as demand soars for chips needed to train and deploy artificial intelligence software. From a report: The prices for Nvidia's H100 processors were noted by 3D gaming pioneer and former Meta consulting technology chief John Carmack on Twitter. On Friday, at least eight H100s were listed on eBay at prices ranging from $39,995 to just under $46,000. Some retailers have offered it in the past for around $36,000. The H100, announced last year, is Nvidia's latest flagship AI chip, succeeding the A100, a roughly $10,000 chip that's been called the "workhorse" for AI applications. Developers are using the H100 to build so-called large language models (LLMs), which are at the heart of AI applications like OpenAI's ChatGPT. Running those systems is expensive and requires powerful computers to churn through terabytes of data for days or weeks at a time. They also rely on hefty computing power so the AI model can generate text, images or predictions. Training AI models, especially large ones like GPT, requires hundreds of high-end Nvidia GPUs working together.
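The summary's claim that training needs hundreds of high-end GPUs for days or weeks can be sanity-checked with the common back-of-envelope rule that training cost is about 6 FLOPs per parameter per token. The throughput, parameter, and token figures below are illustrative assumptions, not reported numbers:

```python
# Back-of-envelope: why LLM training needs many GPUs for weeks.
# Uses the common approximation train_flops ~ 6 * parameters * tokens;
# the hardware and dataset numbers below are assumptions, not vendor specs.

params = 175e9          # GPT-3-scale parameter count
tokens = 300e9          # assumed training-token count
train_flops = 6 * params * tokens          # ~3.15e23 FLOPs

per_gpu_flops = 4e14    # assumed sustained throughput per high-end GPU (FLOP/s)
n_gpus = 1000

seconds = train_flops / (per_gpu_flops * n_gpus)
print(f"{seconds / 86400:.1f} days")       # ~9 days on 1,000 GPUs
```

Even with generous per-GPU throughput, a single run at this scale ties up a thousand-GPU cluster for over a week, which is why the hardware commands these prices.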
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by iAmWaySmarterThanYou ( 10095012 ) on Friday April 14, 2023 @04:35PM (#63450386)

    I can list my belly button lint for $40k. But I'm certain not to get any buyers.

    Would be nice if the editors cleaned up the drama before posting.

    • Absolutely - checking sold listings shows none, zip, nada....

  • by xack ( 5304745 ) on Friday April 14, 2023 @04:37PM (#63450390)
    At least gaming GPUs are spared this time, because they lack VRAM.
    • by Rei ( 128717 )

      Well, many kinds of gaming GPUs. Each card that's a step up in VRAM on a given GPU line will be in demand for AI. For example, in the RTX 3000 series, the 3060 and the 3090 (non-Ti) are the in-demand ones: the 3060 is your cheapest way to get 12 GB and the 3090 is your cheapest way to get 24 GB.
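The VRAM tiers above follow from a rough sizing rule: a model's weights need roughly parameter count times bytes per parameter, plus overhead. The overhead fraction below is a guessed allowance, not a measured figure:

```python
# Rough rule of thumb (an assumption, not an official figure): weights need
# about parameter_count * bytes_per_parameter of VRAM, plus some overhead
# for activations and framework buffers.

def estimate_vram_gb(params_billions: float, bytes_per_param: int = 2,
                     overhead_fraction: float = 0.2) -> float:
    """Back-of-envelope VRAM estimate for loading model weights.

    bytes_per_param: 4 for fp32, 2 for fp16/bf16, 1 for int8.
    overhead_fraction is a guessed allowance for activations/buffers.
    """
    weights_gb = params_billions * 1e9 * bytes_per_param / (1024 ** 3)
    return weights_gb * (1 + overhead_fraction)

print(round(estimate_vram_gb(7), 1))                    # fp16: ~15.6 GB, wants a 24 GB card
print(round(estimate_vram_gb(7, bytes_per_param=1), 1)) # int8: ~7.8 GB, fits in 12 GB
```

This is why the cheapest card at each VRAM tier, rather than the fastest card overall, is what AI hobbyists chase.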

      Also, gamers who go with AMD cards will be mostly spared, as AI generally flocks to NVIDIA cards for CUDA.

      • I bought AMD GPUs for years and finally decided to get Nvidia. It was faster for a similar price, with no glitches, better software, and generally just a better experience.

        Nvidia would have to stumble hard for me to switch back.

    • At least gaming GPUs are spared this time, because they lack VRAM.

      A TPU is eight times as energy efficient as a GPU at machine learning.

      The days of using GPUs for machine learning are fading.

      • A TPU is eight times as energy efficient as a GPU at machine learning.

        The days of using GPUs for machine learning are fading.

        That's hard to believe when Google has no public plans to sell their TPU boards outside of their own cloud business, which is struggling to compete with AWS and Azure. The TPU's current moment in the sun is based on a self-reported paper on arXiv that compared the latest TPU to the previous Nvidia generation. We'll see if Google is willing to claim those performance numbers in the next MLPerf results.

        It's good to see advances in the state of the art driven by competition. Hopefully, Google, AMD, and others pu

  • by larryjoe ( 135075 ) on Friday April 14, 2023 @04:39PM (#63450392)

    There are several things to keep in mind about the high price of H100s: (1) eBay is not the best way to gauge market prices. eBay is the equivalent of looking at after-market ask prices and thinking those prices predict tomorrow's opening price. Still, preorders one year ago were reported to be in the mid-$30Ks. (2) It's not clear how available supply is affecting H100 prices. (3) How many buyers are interested in one or a few cards? Most serious buyers will want a bunch, and there are volume discounts for large buyers. Those volume buyers get priority, likely constricting supply for smaller buyers. (4) ChatGPT mania is inflating current H100 prices, which, combined with supply issues, likely inflates them further.

    Nvidia reports quarterly numbers in a few weeks, so we'll get a better feel for how well H100 has been selling (at least in an indirect way since Nvidia lumps all the data center revenue numbers together).

    • by Rei ( 128717 )

      I disagree. Take digital graphics art, for example. Which do you think takes more energy to complete an art project, an artist not using AI tools who works on her computer for 8 hours, or one who uses AI art tools and completes the job in 20 minutes, with the GPU only in use for a few of those minutes?

      Relative to throughput, AI is a big energy saver. Now, there may be induced demand offsetting this, but that just means economic growth, aka people making and buying more of the things they want to own.

      Furt

      • I disagree. Take digital graphics art, for example. Which do you think takes more energy to complete an art project, an artist not using AI tools who works on her computer for 8 hours, or one who uses AI art tools and completes the job in 20 minutes, with the GPU only in use for a few of those minutes?

        In the real world, though, this is not what happens. Real artists still have to work on their computers for 8 hours, and the rest of us can ask DALL-E to draw us salmon wearing underwear and free-diving.

        So, a waste of resources.

      • What was the cost to train the AI to do that, the developer time to write the AI, and the cost of all the hardware to develop, build, support, and run the AI in production?

        Those are not free things you can zero out of the equation.

        • by Rei ( 128717 )

          What was the cost to train the AI to do that, the developer time to write the AI, and the cost of all the hardware to develop, build, support, and run the AI in production?

          Utterly minuscule compared to usage energy consumption. Stable Diffusion has over 10 million users. ChatGPT, over 100 million.

    • Are environmentalists temporarily embarrassed dictators?

    • The CO2 emitted in training a new transformer system is large in the sense of being about the same as a transatlantic flight. But there are over 1,000 of those daily https://www.eurocontrol.int/news/celebrating-100-years-transatlantic-flights [eurocontrol.int] and only a handful of new models are being trained. Running a model, even a top-tier one, uses far less energy. As climate concerns go, this should be nowhere near the top of anyone's list.
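The comparison above can be put in rough numbers. Both emission figures below are assumptions chosen only to make the arithmetic concrete, not measured data:

```python
# Illustrative back-of-envelope comparison; the two CO2 figures are
# assumptions for the sake of arithmetic, not measured values.
TRAINING_RUN_TONNES_CO2 = 500    # assumed CO2 for one large training run
FLIGHT_TONNES_CO2 = 500          # assumed CO2 for one transatlantic flight
FLIGHTS_PER_DAY = 1000           # order of magnitude from the linked article

daily_flight_co2 = FLIGHT_TONNES_CO2 * FLIGHTS_PER_DAY

# One training run is roughly one flight's worth of CO2; compare it to a
# full year of transatlantic flying.
share = TRAINING_RUN_TONNES_CO2 / (daily_flight_co2 * 365)
print(f"{share:.6%}")            # ~0.0003% of a year of flights
```

Under these assumptions a single training run is a rounding error next to one sector of aviation, which is the poster's point.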
  • On the latest Video Games.
    $40,000 seems reasonable

    • by EvilSS ( 557649 )

      On the latest Video Games. $40,000 seems reasonable

      These cards suck for gaming. You will get better performance from a consumer GPU in games.

  • We were barely getting a glimmer of hope now that the shitcoins have collapsed, and now AI is going to drive the price gouging back up again.

    • You'll be fine. Just buy a 40k gpu and it can play your games for you while you sleep.

    • by kriston ( 7886 )

      The days of buying gaming GPUs for AI are long over. There won't be a dent in gaming GPU prices or availability.

      Fun fact: it is against Nvidia's license agreement to use their desktop gaming GPUs for AI on servers, but that didn't stop some AI startups from doing it.

    • by MrL0G1C ( 867445 )

      What really facilitates the price gouging is consumers paying the high prices; as long as they keep paying, Nvidia will keep gouging.

      These AIs are going to change the world. Computers changed the world over decades; AI will do it over the next few years, well before this decade is over. And with Nvidia's prediction that AI processing power can improve a million-fold, we'll have a situation where everybody can afford a PC with genius-level AI.

      • Our only hope is that computers get smarter at a faster rate than humans getting dumber. With a hint of luck they'll finally take over the world as the dominant species... and hopefully we're cute enough to be pets.

  • It's a top-tier market to start with, a price hike on this order happens pretty much any time there's moderate unexpected demand. If it was a true shortage, prices would be inflated by an order of magnitude or more, not just a factor of four. Right now it's a matter of paying extra to get chips first, just like it was with crypto.

    What we have here is more like finding out that card you were waiting for and expected to be $500 is actually going to be $2000. Sounds familiar, doesn't it? You have the options o

  • The high-end Nvidia cards were all being grabbed by the cryptocurrency faddists during the big runup that digital tulips enjoyed over the last few years. After last year's crash, we saw a resurgence in the use of graphics cards for graphics once again. But recently the price of bitcoin has crept upward anew (Russians? Money launderers? Cartelistas?) to a level that may be high enough to make the Nvidia cards newly viable for mining.
