Graphics Hardware

Resale Prices Triple for NVIDIA Chips as Gamers Compete with Bitcoin Miners (yahoo.com) 108

"In the niche world of customers for high-end semiconductors, a bitter feud is pitting bitcoin miners against hardcore gamers," reports Quartz: At issue is the latest line of NVIDIA graphics cards — powerful, cutting-edge chips with the computational might to display the most advanced video game graphics on the market. Gamers want the chips so they can experience ultra-realistic lighting effects in their favorite games. But they can't get their hands on NVIDIA cards, because miners are buying them up and adapting them to crunch cryptographic codes and harvest digital currency. The fierce competition to buy chips — combined with a global semiconductor shortage — has driven resale prices up as much as 300%, and led hundreds of thousands of desperate consumers to sign up for daily raffles for the right to buy chips at a significant mark-up.

To broker a peace between its warring customers, NVIDIA is, essentially, splitting its cutting-edge graphics chips into two dumbed-down products: GeForce for gamers and the Cryptocurrency Mining Processor (CMP) for miners. GeForce is the latest NVIDIA graphics card — except key parts of it have been slowed down to make it less valuable for miners racing to solve crypto puzzles. CMP is based on a slightly older version of NVIDIA's graphics card which has been stripped of all of its display outputs, so gamers can't use it to render graphics.

NVIDIA's goal in splitting its product offerings is to incentivize miners to buy only CMP chips and leave the GeForce chips for the gamers. "What we hope is that the CMPs will satisfy the miners...[and] steer our GeForce supply to gamers," said CEO Jensen Huang on a May 26 conference call with investors and analysts... It won't be easy to keep the miners at bay, however. NVIDIA tried releasing slowed-down graphics chips in February in an effort to deter miners from buying them, but it didn't work. The miners quickly figured out how to hack the chips and make them perform at full speed again.

  • Such a bs (Score:5, Insightful)

    by angar4ik ( 6808878 ) on Sunday May 30, 2021 @06:04PM (#61438024)
    Bitcoin miners haven't used GPUs for years.
    • Well, more accurately, nvidia's strategy is stupid... i.e. the problem isn't people using nvidia's cards to mine bitcoins; the problem is that the developers of ASIC miners are buying up the same manufacturing resources nvidia wants, keeping nvidia from making its cards... which makes nvidia's attempt to block mining in its graphics cards silly on all levels.
      • Re:Such a bs (Score:4, Insightful)

        by willy_me ( 212994 ) on Sunday May 30, 2021 @07:03PM (#61438194)
        Not a stupid strategy. Without graphics, the crypto-cards cannot be resold to gamers. In a few years these crypto-cards will be discarded - or used by university / student researchers. If they had graphics, they would be resold to gamers, with each sale stealing a potential customer away from nvidia. So by separating the cards into two types, nvidia is preserving their future market. AMD will also be helped -- presumably an unfortunate byproduct from nvidia's perspective.
        • That is interesting, although the lack of potential resale value will hurt demand for these cards. People don't mine for fun, they mine for money, so they have to be thinking about what the resale value will be as they shell out tens of thousands of dollars to build their operation.
        • by tlhIngan ( 30335 )

          It also helps them use chips where the graphics output port is not working.

          Modern chips are complex and defects are almost a given. The trick has always been to work around the defects to increase yield. If even 10% of the chips don't have working graphics output (bad HDMI or DisplayPort transmitters), rebranding them as "compute" chips makes a lot of sense - it means you have those 10% of failing chips to sell instead of throwing them in the trash.

          And this is utterly common - most SoCs do way more than their spec s

          • by Binestar ( 28861 )

            But I don't think it's just miners snapping up cards - scalpers are still snapping them up to resell.

            Scalpers don't hold them long enough to be considered a part of the demand. They're certainly a part of why it's hard to get cards at MSRP, but once they have them they're the ones doing the selling at the 3x markup and want to get rid of them as fast as possible. Sitting on inventory is extremely bad for a scalper.

          • It also helps them use chips where the graphics output port is not working.

            Yeah but what are the odds that's the part that will fail? That's not the part that's hard to make reliably.

        • It's NV's fault if old miner cards compete with their latest-and-greatest products. Their new products should be more desirable to gamers than discarded and janky miner dGPUs.

      • NV produced their latest gen video cards on an 8nm Samsung node. Are you suggesting they are losing access to 5nm or 7nm TSMC wafers, or Samsung wafers, due to companies like Bitmain?

      • by Pimpy ( 143938 )

        NVIDIA is in the business of selling GPUs, which doesn't exactly seem to be a problem, so I don't know that I'd characterize their strategy as stupid. It's perhaps not a bad idea to create product variants that better target the needs of the different user demographics and to apply quotas to these, but many of these will still use the same underlying GPUs, so it doesn't fundamentally solve the supply issue, such as it is. From a strategy point of view, it only really makes sense to delve into supply quota

    • Re: (Score:1, Flamebait)

      by OverlordQ ( 264228 )

      What?

      Plenty of coins are still GPU-mined because they use ASIC-resistant algorithms.

      • Re: (Score:2, Informative)

        by XArtur0 ( 5079833 )

        Let K = { Bitcoin, Ethereum, Monero, ... }   // all cryptocurrencies
        Bitcoin != K
        Bitcoin ∈ K

        Q.E.D. You are wrong.

    • and a few other currencies used for money laundering. News outlets use "Bitcoin" the way folks use "Kleenex," or the way your gran used to call every video game a Nintendo.
      • No they don't. They use the term "cryptocurrency".

        • by pjt33 ( 739471 )

          If only. It seems that most of the time nowadays they use "crypto", apparently ignorant of that abbreviation's long heritage in representing "cryptography" in general.

    • Then why are they buying up huge stockpiles of GPUs?

      The thing is, whether ASICs or GPUs get better performance depends on the algorithm used for proof-of-work verification. Bitcoin is absolutely faster on an ASIC; however, Bitcoin isn't the only game in town anymore, and for Ethereum, Monero, etc., GPUs blaze ahead.
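
      (For context on why the algorithm matters: Bitcoin's proof-of-work is plain double SHA-256 over an 80-byte block header, which maps neatly onto fixed-function ASIC pipelines, while Ethash-style algorithms force large random memory reads that keep GPUs competitive. A minimal Python sketch of the Bitcoin case - the header and target below are placeholders, not real chain data:)

      import hashlib

      def double_sha256(data: bytes) -> bytes:
          return hashlib.sha256(hashlib.sha256(data).digest()).digest()

      header = bytes(80)                         # placeholder 80-byte block header
      target = int("0000ffff" + "ff" * 28, 16)   # placeholder (very easy) target

      for nonce in range(200_000):
          candidate = header[:76] + nonce.to_bytes(4, "little")
          # The digest is compared to the target as a little-endian 256-bit integer.
          if int.from_bytes(double_sha256(candidate), "little") < target:
              print("found nonce", nonce)
              break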

  • moronic title (Score:3, Informative)

    by slashmydots ( 2189826 ) on Sunday May 30, 2021 @06:05PM (#61438026)
    Actually, they're mining Ethereum primarily. Literally nobody is mining Bitcoin with GPUs. The ASICs out now are 300,000x as power-efficient as a GPU.
  • Bullshit (Score:5, Informative)

    by carvalhao ( 774969 ) on Sunday May 30, 2021 @06:05PM (#61438028) Journal
    Bitcoin is mined with ASICs, not GPUs. Ethereum (soon to be proof-of-stake) and altcoins are the ones getting GPU processing power.
    • by leonbev ( 111395 )

      Yeah, I really wish that folks who write tech articles for mainstream media would stop dumbing down their articles about the link between Bitcoin mining and video card prices. Would it really be THAT hard for them to remove the word "Bitcoin" and replace it with "Ethereum"?

    • I came here to say this, but I knew in my heart that it had already been said!

    • Re: (Score:2, Informative)

      by MatthiasF ( 1853064 )

      Stating that ALL bitcoin mining is being done on ASICs now is completely ludicrous. Some bigger miners use FPGAs as well.

      But MOST of the bitcoin mining is being done on GPUs.

      So long as it is still reasonably profitable to use GPUs, someone out there will be using them.

      https://www.nicehash.com/profi... [nicehash.com]

      Over $200 a month for a six-month return on investment is what I would call reasonably profitable.
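
      (A back-of-envelope version of that payback arithmetic, in Python. The card price, power draw, and electricity rate are made-up illustrative figures, not NiceHash data:)

      # Rough payback estimate for a GPU bought to mine with (hypothetical inputs).
      card_price_usd      = 1200.0   # assumed street price of the card
      gross_usd_per_month = 200.0    # the "over $200 a month" figure claimed above
      power_draw_watts    = 220.0    # assumed draw while mining
      electricity_per_kwh = 0.10     # assumed electricity price

      hours_per_month = 24 * 30
      power_cost = power_draw_watts / 1000 * hours_per_month * electricity_per_kwh
      net_per_month = gross_usd_per_month - power_cost

      print(f"power cost  : ${power_cost:.2f}/month")     # ~ $15.84
      print(f"net income  : ${net_per_month:.2f}/month")  # ~ $184.16
      print(f"payback time: {card_price_usd / net_per_month:.1f} months")  # ~ 6.5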

      • by drtitus ( 315090 )

        If you're using Nicehash as an argument for "Bitcoin mining being done on GPUs" - that's not the case. Nicehash typically has ETH miners getting paid in BTC. It may look like you're mining and getting BTC for it, but it's not Bitcoin mining. You're not doing anything with the Bitcoin blockchain/network, apart from getting paid in BTC.

      • by fr ( 185735 )

        If you scroll down on your own link you will see it's not bitcoin that is mined. Currently DaggerHashimoto is the most profitable algorithm for that card. You just get paid in bitcoin on the site.

    • by dohzer ( 867770 )

      Are GPUs not ASICs?

      • No.
        ASIC: Application-Specific Integrated Circuit.
        GPUs nowadays are as versatile as (if not more versatile than) a CPU.
        Think of the shader units (or compute units, et cetera) as tiny CPUs.
        The only silicon that competes with GPUs for versatility are CPLDs (limited logic) and FPGAs.

  • to break up their duopoly. And don't say Intel, because I don't think they make high-end gaming GPUs. Too bad 3dfx went under; I used to have a Voodoo 5 card back around the turn of the century and it was a really cool card. It could do screen resolutions that other cards ignored, and Win2k ran beautifully with it. I miss the old days.
    • by jonwil ( 467024 )

      If the leaks and rumors around the new Intel discrete cards are anything to go by, they may well be a genuine competitor in the not-too-distant future.

      • But don't forget how Intel management works, especially on non-x86 projects.

        If their discrete graphics sales don't live up to the hype on the first outing, they'll first diminish project funding, then quietly kill it, disband the group, lose the collective memory, toss it all in 55-gallon drums to be ground into dust, and never mention the code names again. Wave-makers get noticed, get avoided and purged, so everyone just finds a new project.

        If it is successful, some semi-clueless snake with seniority will push t

      • Eh, maybe. Remember that RDNA3 and Hopper/Ampere Next will be out by then though. And if AMD has actually managed to properly nail MCM tech on the GPU, I seriously doubt Intel will be able to compete.

        • There are no bad products, just bad prices (well, almost). Intel can fill a niche in the $100-$300 price range if they can't compete on raw performance.

    • The problem isn't the duopoly. The problem is that neither company in the duopoly has its own fab plants, so they are competing against each other and every other IC maker in the world for fab time at the contract fab houses that make their chips. Intel could be killing it right now if they hadn't sat on their ass for the past 20 years and had actually innovated, rather than just copying and pasting more cores into the templates for their fab machines and painstakingly crawling at shrinking their processe
    • Voodoo5 was awful. If that's the direction you want the industry going in then no thanks.

  • The miners quickly figured out how to hack the chips and make them perform at full-speed again?
    No, NVIDIA released a development version of the driver and firmware that "accidentally" unlocked the full hash rate of these cards...

  • by downfromtherafters ( 7922416 ) on Sunday May 30, 2021 @06:38PM (#61438114)
    The CMP cards are positioned as helping gamers by giving miners a card of their own, but all it's doing is taking up manufacturing capacity to produce cards that are worthless for resale once the miners are done with them, eliminating the resale market.
    • Is there no way to generate a video output from those "miner" cards?
      Because I don't see why those couldn't be used to render graphics...
      I think there might be a way to copy the generated frame into the framebuffer of another card, but that would add a serious bottleneck.
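
      (Roughly sizing that frame-copy idea in Python, with assumed resolution, refresh rate, and bus figures. The raw bandwidth fits within a PCIe 3.0 x16 link, so the practical pain would be the added latency and contention rather than throughput alone:)

      # Back-of-envelope bandwidth for copying each rendered frame from a headless
      # "miner" GPU to a display GPU. All numbers are illustrative assumptions.
      width, height   = 3840, 2160   # 4K frame
      bytes_per_pixel = 4            # RGBA8
      fps             = 144

      frame_bytes = width * height * bytes_per_pixel
      throughput_gb_s = frame_bytes * fps / 1e9
      pcie3_x16_gb_s = 15.75         # approximate usable PCIe 3.0 x16 bandwidth

      print(f"frame size: {frame_bytes / 1e6:.1f} MB")                      # ~ 33.2 MB
      print(f"copy rate : {throughput_gb_s:.1f} GB/s "
            f"(~{throughput_gb_s / pcie3_x16_gb_s:.0%} of PCIe 3.0 x16)")   # ~ 4.8 GB/s, ~30%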

      • Is there no way to generate a video output from those "miner" cards?

        The cards lack video output hardware. However, software comes to the rescue. I can't find the video right now, but I recall someone flashing a firmware designed for laptops onto a previous-generation mining-only card and using the NVIDIA Optimus engine (used in laptops to dynamically switch between the power-efficient iGPU and the actually capable NVIDIA GPU) to render on the NVIDIA card and redirect the output to Intel's integrated graphics.

        But it was quite a mission getting that to work. No idea if this wor

      • It's not that you can't generate output, it's that you can't generate timely output to a monitor.

        If you wanted to use them for rendering you could. But then, you wouldn't want to.

          They really are useless for anything but compute. They could be used for non-mining compute tasks - perhaps for PhysX or something - but not as your graphics card.

    • Nvidia loves that, because it means that when cards are finally available, gamers will have to buy a new card rather than picking one up second-hand. In the end, China will probably buy up all the cards with no video output, strip the chips off them, slap them on some no-frills reference-design card that does have video output, and still sell them at second-hand market prices.
    • Also, the bitcoin miners will probably keep snatching up the cards with graphics output and finding ways around the throttling, because the ones with graphics output will have resale value after they have spent the last 6-12 months cranking out coins. Throw one up on eBay and get a few hundred for it when the next, higher-performing cards come out, rather than having to deal with disposing of e-waste.
  • An interesting example of unintended consequences. Ridiculous amounts of power consumed. People not being able to buy video cards. Governments cracking down on it. Others copying it. Hysteria over the price shooting up. Interesting indeed.
  • by BrendaEM ( 871664 ) on Sunday May 30, 2021 @06:52PM (#61438162) Homepage
    What could go wrong?
    • Not so. nVidia produces their 3000-series dGPUs on a Samsung process. AMD uses TSMC for their 6000-series dGPUs. TSMC in particular fields a lot of demand on their fabs, though, so your point isn't entirely wrong. It's just that dGPUs are competing with CPUs and other ICs for wafers.

  • Oh, we don't exist. Sorry, my mistake.
    • So much this. Been trying for months to get a 3090 specifically for rendering/modeling. Such a letdown to find out there's really no end in sight to the problem.
      • You'd be better off buying a pre-built with a 3090 in it and selling off the rest of the system piecemeal. It'll take you maybe a month to get the system built and shipped. OEMs don't have to pay scalper prices, and if you're smart, neither do you.

        • by base3 ( 539820 )
          All the pre-builts I've seen have near-scalper pricing incorporated for the cards. Wouldn't be a bad way to go if you were in the market for a PC anyway but otherwise might as well pay the scalpers and not have to deal with the hassle of selling off the rest of the machine.
          • Depends on what card you're looking for. Cheapest I can get a 6900XT is ~$1800 pre-owned without doing a lot of digging. MSRP is $999, and nobody charges MSRP (if you find one in a store, like MC, they will mark it up to around $1400 or more). 3090 is around $2500 or more right now (outside of some dubious card sales from Estonia and Poland, not sure what those are all about). It really comes down to how much you have to pay for a prebuilt and how much you can hope to recover by parting it out or resell

            • by base3 ( 539820 )
              Right - and the parting out/reselling takes time and energy so the difference would have to be bigger than it is to be worth it to me. I'm going to just wait it out and go without if prices never come down to something less ridiculous. I figure at some point there will be a new generation of cards and someone will figure out a way to defeat the Ethereum mining detection that's being used to gimp them. At that point, I'm hoping to be able to grab one of the flood of used cards that should hit the market.
  • So now all the fast computers and good graphics cards have been chucked into the Bitcoin maw. The rest of us will be stuck with slow gear and rolling power blackouts until this thing finally crashes.

    David Foster Wallace predicted this in fiction years ago. Read “Infinite Jest” and weep.

  • I'm thinking of diving into a project that could take me a year or two to complete, and this got me thinking I could make bank by selling my water-cooled 1080ti for a high price and buying something else later when I'm ready to game again, but I checked eBay and it doesn't look like the price has gone up since the beginning of March. On the other hand, it's selling for about $800 on eBay, which is the same price I paid for it 2 years ago, so that ain't bad!
  • by Kryptonut ( 1006779 ) on Sunday May 30, 2021 @10:30PM (#61438556)
    Can we get the damn AMD cards to be first-class citizens in the likes of TensorFlow? There's a definite bias toward NVIDIA GPUs / CUDA.
    • No. For all of NV's problems, their CUDA hardware/software stack is peerless in the industry. I love AMD but I have to admit that even ROCm (and all the cruft associated with it) just can't compete.

        • While ROCm et al. may not be in the same league as CUDA, it would still be nice to have some sort of OpenCL implementation that could benefit from AMD GPUs, given how many are out there. Being stuck with NVIDIA as the only performant GPU solution (I am aware of OpenVINO, but Intel graphics? Meh) with these sorts of supply issues is insane.
          • AMD still has a functional OpenCL driver stack. Technically you can code with an OpenCL target as part of AMD's ROCm stack... if you can suss out how to get it all set up and working.

          If you're an org speccing out hardware for a GPGPU compute cluster and you're going to be rolling your own software from scratch:

          1) your developers will probably prefer CUDA
          2) even if your developers are platform-agnostic, getting proper memory coherence between GPUs is going to take more work on AMD accelerators

          If yo
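
          (For anyone trying the ROCm route mentioned above, a minimal Python check of which accelerators a given TensorFlow 2.x build actually exposes. The same call works whether the CUDA wheel or the tensorflow-rocm build is installed - this only verifies visibility, not performance:)

          # List the GPUs visible to the installed TensorFlow build.
          import tensorflow as tf

          gpus = tf.config.list_physical_devices("GPU")
          if gpus:
              for gpu in gpus:
                  print("found accelerator:", gpu.name)   # e.g. /physical_device:GPU:0
          else:
              print("no GPU visible; this build will fall back to the CPU")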

  • ...part of the miners' capital strategy is to liquidate valuable GPUs on the resale market if there is a sudden downturn in coin prices that makes mining unprofitable.
    If they have to resell them in a down market for mining, clearly the would-be buyers will be gamers, not other miners, so it's crazy for miners to buy special-purpose cards that can't be sold to a wider audience at resale time.

    • The miner-only cards also typically sit in a bad spot on the V/F curve for any given GPU. A miner equivalent of an RTX 3070, for example, will require more voltage and power to sustain the same hashrate as a legit RTX 3070. What miner wants to mess around with that garbage?
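
      (A crude illustration of that binning penalty, using the usual dynamic-power approximation P ≈ C·V²·f in Python. The voltages and the baseline wattage here are hypothetical, and real board power also includes memory and other fixed costs:)

      # Dynamic power scales roughly with C * V^2 * f, so a weaker die that needs
      # extra voltage to hold the same clock pays a quadratic power penalty.
      baseline_voltage_v   = 0.90    # assumed voltage of a good bin at the target clock
      weaker_bin_voltage_v = 1.00    # assumed voltage a weaker bin needs for the same clock
      baseline_power_w     = 220.0   # assumed board power of the good bin while mining

      penalty = (weaker_bin_voltage_v / baseline_voltage_v) ** 2
      print(f"same hashrate costs ~{baseline_power_w * penalty:.0f} W "
            f"(+{(penalty - 1) * 100:.0f}%) on the weaker bin")   # ~272 W, +23%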

  • Got a boutique prebuilt with a 3080, and the $1900 it cost looks like a bargain now.

  • Are people really still trying to mine bitcoin on graphics cards? I thought ASICs far surpassed graphics cards YEARS ago? I understand there are other cryptos that are more geared towards CPU/GPU mining as they were purposely developed to not have an edge with ASIC designs. But I highly doubt that anyone who knows what they are doing is mining bitcoin on these graphics cards.
    • Not trying. Doing. The question isn't whether something else is better; the question is whether it makes financial sense. The answer with Bitcoin (and even more so Ethereum) has been a resounding yes in the past 6 months, thanks largely to the stupidity of the human race driving up the price of these nothing-tokens.

      • LOL no.

        What's the BTC hashrate on a 3090? I'm waiting.

        • I just googled it; apparently around 110-120 MH/s. An ASICMiner Block Erupter - a tiny little USB "thumb drive" style ASIC that could do 330 MH/s when they were released, probably nearly a decade ago - still beats the shit out of a 3090. The last time I used that thing, maybe 4-5 years ago, I would earn maybe $2-3 a month in BTC. It wasn't even worth it anymore. The people making money mining are doing it on ASIC contraptions about the size of a shoe box that consume the same amount of power as a space heater, and they
          • Yeah, that's what I thought. Even if a 3090 could pull 1 TH/s, it would still lose money even if you were only paying $0.10 per kWh.
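
            (Checking that arithmetic in Python with rough 2021-era assumptions for network hashrate and price; even a fictional 1 TH/s GPU would earn far less than its electricity costs:)

            # Expected Bitcoin mining revenue vs. power cost. All inputs are
            # illustrative assumptions, not live network data.
            my_hashrate_hs      = 1e12      # 1 TH/s, far above any real GPU's SHA-256 rate
            network_hashrate_hs = 150e18    # assumed ~150 EH/s total network hashrate
            block_reward_btc    = 6.25
            blocks_per_day      = 144
            btc_price_usd       = 35_000.0  # assumed spot price
            gpu_power_kw        = 0.35      # assumed 350 W draw
            electricity_per_kwh = 0.10

            share = my_hashrate_hs / network_hashrate_hs
            revenue_per_day = share * blocks_per_day * block_reward_btc * btc_price_usd
            power_cost_per_day = gpu_power_kw * 24 * electricity_per_kwh

            print(f"expected revenue: ${revenue_per_day:.2f}/day")    # ~ $0.21
            print(f"electricity cost: ${power_cost_per_day:.2f}/day") # ~ $0.84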

  • If I buy a card to game on... why should you prevent me from subsidising the cost of my card by mining when I'm not gaming? I know people will scream about privacy... but is there some way we can tie our Steam/Epic/Microsoft ID to a preorder platform... like one card per person for people who actively play games?
  • The problem doesn't only lie with the miners. My biggest gripe is with the people selling GPUs on eBay IN BULK! If a miner and his friends camp outside an electronics store to buy cards one at a time, well, that's fair game, but how are gamers supposed to compete with people selling 30+ GPUs in bulk for $80k on eBay? Those GPUs were supposed to appear as inventory in a store somewhere but instead ended up on eBay. Stop blaming miners and start blaming the unscrupulous store owners and distributors that scalp t
  • Games are toys not tools, and tool buyers will generally outspend toy buyers.

    Games are fun but not a necessity. MONEY is a necessity. If you cannot afford a particular toy, pass or find a different hobby with a lower barrier to entry. The market will eventually sort the problem but it may take a few years.

    • Entertainment is also a need in humans and many other animals.

      Sure, people intending to make money on something will often be more willing to spend money to obtain that thing in order to increase their returns, but they won't spend amounts that will cause them to lose money either, while hobbyists sometimes will. The cryptocoin miners certainly take all that, and the expected longevity of the mining techniques and the market, into account - except for the incompetent ones that will go broke when the cryptobubble burst
  • Gamers aren't the only ones affected by the shortage of high-end GPUs. It's also affecting those of us who use high-end cards to render video in applications like Adobe Premiere Pro and DaVinci Resolve.

  • You can't successfully mine Bitcoin with a GPU. ASICs only.
