GPU Prices Soar as Bitcoin Miners Buy Up Hardware To Build Rigs (computerworld.com)
"Bitcoin and other cryptocurrency miners have created a dearth of mid-range and high-end GPU cards that are selling for twice as much as suggested retail," reports Computerworld. "The reason: miners are setting up server farms with the cards."
Lucas123 writes: GPU prices have more than doubled in some cases... Some of the most popular GPUs can't even be found anymore as they've sold out due to demand. Meanwhile, some retailers are pushing back against bitcoin miners by showing favoritism to their traditional gamer customers, allowing them to purchase GPUs at manufacturer's suggested retail price. Earlier this year, NVIDIA asked retailers of its hardware to prioritize sales to gamers over cryptocurrency miners.
Re:so fucking stupid (Score:4, Informative)
Check the nVidia roadmap; this is exactly what they're working on this year.
Re: so fucking stupid (Score:5, Insightful)
Re: (Score:2)
They won't buy them cuz then they'd have zero resale value after the card is no longer powerful enough to mine. Selling used cards to gamers gets at least a few bucks back.
I wouldn't say zero, so long as they support SLI and will pair up with some card gamers use.
Re: so fucking stupid (Score:5, Insightful)
They won't buy them cuz then they'd have zero resale value after the card is no longer powerful enough to mine. Selling used cards to gamers gets at least a few bucks back.
I seriously don't want a graphics card that has been abused in a mining rig. They aren't meant to run at full power 24/7, and I doubt there's more than a couple of years of use left in them.
Re: so fucking stupid (Score:4, Interesting)
Unless they are getting free power they're a lot more likely to slightly undervolt the cards, as running at 70-80% uses 50% less power than trying to max things out.
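A rough sketch of why that helps: dynamic power scales roughly as C*V^2*f, so even a modest undervolt plus underclock compounds quickly. The 80% clock / 85% voltage figures below are illustrative only, not measured on any particular card.

    # Illustrative only: dynamic power scales roughly with C * V^2 * f.
    def relative_power(clock_scale, voltage_scale):
        return clock_scale * voltage_scale ** 2

    stock = relative_power(1.00, 1.00)   # baseline
    tuned = relative_power(0.80, 0.85)   # ~80% clock, ~85% voltage
    print(f"power vs stock: {tuned / stock:.0%}")   # about 58% of stock power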
Re: (Score:2)
Correct in a lot of instances. For most coins you would undervolt and underclock the core, and then overclock the RAM. The RAM takes some abuse sometimes, but nothing more than overclocking the RAM in your PC. You can even modify memory timings on the AMD cards. NVIDIA encrypts their BIOS files, so you can't tweak them, unfortunately.
Re: (Score:3)
Yeah, well, gamers figured out real quick that cards run in overdrive for 18 months don't last long.
Re: (Score:2)
They won't buy them cuz then they'd have zero resale value after the card is no longer powerful enough to mine.
Mining is not the only market for crunching numbers. These would have resale value to the same people who buy NVIDIA Tesla products.
Re: (Score:2)
Wait a minute... When did Tesla partner up with nVidia?
Re: (Score:2)
That would be illegal. There are no rules saying you can't use the cards for computing, and with all the drivers and documentation I would say quite the opposite.
Re: (Score:2, Insightful)
I don't believe you've thought this through. There is a finite supply of GPUs that can be made by these multi-billion dollar foundries in any given period. Reducing the cost of cards used by miners (no video ports, etc.) will just enable miners to buy more cards and grab a bigger chunk of the GPU supply. Reducing the cost of cards that miners want isn't going to increase the supply of cards that gamers want; ultimately they're all coming from the same source of integrated circuits. If the supply of mine
Re: (Score:2, Insightful)
The solution is to badger and shame these cryptocurrency inventors into using different "proof" algorithms so that they aren't pulling a large fraction of the planet's power supply and buying up all the hardware.
No, the solution is to be a GPU manufacturer.
The one who gets rich in a gold rush is the one who sells shovels.
The hard part is to make them keep buying new cards long after the coins have become too expensive to mine.
Re: (Score:2)
Or run ASICs on a node one generation older that has better yields and faster throughput.
Re: (Score:2)
Why in the fuck would they do that? Adding the ports is literally a couple of bucks in parts. Add in the fact that both models would require separate certifications and differentiated parts, and it's just not worth it at all.
This is a basic supply and demand problem. The demand outstrips supply in a radical way. The winners of this are currently Intel with their embedded GPUs.
Re: (Score:1)
The demand outstrips supply in a radical way. The winners of this are currently Intel with their embedded GPUs.
The main problem is that we didn't really have a healthy GPU market before the cryptocurrency fad.
Had there been more than a handful of brands to pick from, then not only would the cards have been cheaper to begin with, but the response would have been to try to crank out as many cards as they could.
With the current situation? A higher demand just means that you can abuse your customers more.
Create an even larger shortage, raise the prices, and watch the consumers gladly pay through the nose if they are lucky enough
Re: so fucking stupid (Score:2)
What a shitty post, even for slashdot... (Score:5, Informative)
1) Bitcoin is NOT mined on GPUs, and hasn't been for like 5 years. Only on special ASIC devices. You meant to write that other crypto-currencies, ALT-coins, are GPU mined
2) This has been going on for like 1-2 years now, including the GPU shortage as a result of ALT-coin mining
Re:What a shitty post, even for slashdot... (Score:5, Informative)
Indeed. In the last few months, GPU prices have actually dropped a fair bit. In January, it was common to see Radeon Vega 64 cards offered for almost 4x MSRP.
Also prices are down 25% (Score:3)
Re: (Score:2)
Harder to make an ASIC for, not impossible. It's effectively impossible to make a cryptocurrency you can't build an ASIC for.
The company that has the prototype is the same Chinese company that rules the roost when it comes to Bitcoin ASIC mining. They're experts in the field, and it took them what, a year and then some to get the ASIC designed.
Re: (Score:2)
Bitmain, the company behind the ASIC in question and the dominant player in the Bitcoin mining field, appears to disagree with you, as they have a working ASIC.
Re: (Score:1)
Bitmain, the company behind the ASIC in question and the dominant player in the Bitcoin mining field, appears to disagree with you, as they have a working ASIC.
That's interesting; I had not noticed that Bitmain had released an Ethereum miner just a few days ago. In general, ASICs are all over the place now. Monero just got ASICs as well. Publicly, that is; I guess these have been running for quite a while now in private.
Monero also just forked to avoid ASICs, but then others continued with the ASIC-compatible version and call it by a slightly different name. Ethereum has been planning to switch from proof of work to proof of stake for a long time now, so that will drop an
Re: Also prices are down 25% (Score:2)
True, but the Ethereum development community is planning an update to break the rigs a few weeks/months after they have been released.
Ethereum takes an active role in breaking ASIC rigs.
Re: (Score:2)
They can try. And if miners ignore them, as they recently did with Monero, it'll be just the "community" with no computational power and with yet another alt-coin no one cares about.
Re: (Score:2)
Yes, but Ethereum is very interesting in that it's not just "a cryptocurrency". It's a whole blockchain platform. You can base your blockchain product on that. The Ethereum "coin" can tank and that won't affect you.
Re: (Score:2)
I don't think you quite understand why the people other than a handful of idealists are in on the blockchain craze.
What you're suggesting is that high-flying goals that no one but a tiny minority cares about are more important than profit. Like I said, if you think that, that's how you end up with a useless alt-coin and a platform no one cares for.
Re: (Score:2)
Digging holes and filling them. What a waste of energy. Why build in ASIC resistance? In a sensible system, efficiency is good and waste is bad.
Re: (Score:2)
Bitmain is literally the main player in the ASIC market. It has the lion's share of it.
Re: (Score:2)
Exact same reason why all other tooling manufacturers do the same..
Re:Also prices are down 25% (Score:4, Informative)
Ethereum's hash function is designed to use a lot of memory bandwidth, whereas the Bitcoin hash function is primarily just arithmetic. That means that an ASIC can pop down tons of dedicated hardware for the Bitcoin hash function and be much, much more power-efficient than a CPU or GPU. An Ethereum ASIC does not have the same relative efficiency gain -- but it does have some.
For any proof-of-work scheme, there will be some point where an ASIC will be more profitable than a CPU or GPU, but most (that use novel hash functions) don't reach that point because the one-time costs of designing the ASIC are so high. Antminer apparently thinks Ethereum has reached that point -- which may push it towards adopting proof-of-stake sooner.
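A toy sketch of the difference, in pure Python and purely illustrative (real Ethash uses Keccak and a multi-gigabyte DAG, not SHA-256 over a 1 MB buffer):

    import hashlib, os

    # Bitcoin-style work: nothing but hashing, so dedicated SHA-256
    # circuits beat CPUs and GPUs on efficiency by orders of magnitude.
    def arithmetic_pow(header: bytes, nonce: int) -> bytes:
        data = header + nonce.to_bytes(8, "little")
        return hashlib.sha256(hashlib.sha256(data).digest()).digest()

    # Ethash-style work, very loosely: each attempt chases pseudo-random
    # offsets through a large dataset, so memory bandwidth rather than
    # raw arithmetic becomes the limiting factor.
    DATASET = os.urandom(1 << 20)   # small stand-in for the multi-GB DAG

    def memory_hard_pow(header: bytes, nonce: int, accesses: int = 64) -> bytes:
        mix = hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
        for _ in range(accesses):
            offset = int.from_bytes(mix[:4], "little") % (len(DATASET) - 64)
            mix = hashlib.sha256(mix + DATASET[offset:offset + 64]).digest()
        return mix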
Re: (Score:2)
You'd think someone would have worked out how to make an ASIC that took cheap RAM modules.
There are multiple tutorials on getting FPGAs working with generic DDR3. FPGAs were stepping stones to ASICs for Bitcoin; there should be some board out there that can handle memory-intensive coins.
Re: (Score:2)
A time traveller from 19?? would disagree with you. There is no such thing as a cheap RAM module.
Re: (Score:2)
When I was a kid, I paid $100 for 16k of slow DRAM, and we liked it.
Re: (Score:2)
There are more cryptocurrencies out there that all those GPUs can be put to work on, like Monero (the ASIC version is a fork), Litecoin and Ripple. Those are commonly known ones too, so as Ethereum miners move to ASICs, those GPU rigs
Re: (Score:2)
Now if only DDR4 would go back down to $10/GB so I can justify getting another 8 GB for my laptop...
Re: (Score:2)
^^ THIS.
Only a complete noob is using a GPU to mine.
* Mining Hardware Comparison [bitcoin.it]
* Non-specialized Hardware comparison [bitcoin.it]
Re:What a shitty post, even for slashdot... (Score:4, Insightful)
Or possibly people mining cryptocurrencies other than bitcoin.
Incidentally, only cunts use the term 'noob'. It's an infallible indicator.
Re: (Score:2)
1) Bitcoin is NOT mined on GPUs, and hasn't been for like 5 years.
It's almost like something happened to bitcoin in the past 5 years that makes it quite viable to use GPUs. Now what was it again? Oh that's right, a 1000x increase in value.
Re: (Score:2)
And you aren't a moron by assuming you know everything? LOL.
NOT BITCOINS! (Score:1)
No one has mined Bitcoin or any of its offspring with GPU for years. Get your story straight!
Re: NOT BITCOINS! (Score:3)
Re: (Score:2)
His mom doesn't have the basement on a separate meter.
Old news (Score:4, Informative)
This has been known for a while. Post some stuff that isn't stale bread.
Pumping their own awful prices (Score:2)
Re: Pumping their own awful prices (Score:2)
very stale news (Score:3, Interesting)
literally anyone who would care about this phenomenon already knows about it. In fact, aren't prices coming back down now that the hype has subsided?
Re: very stale news (Score:2)
literally anyone who would care about this
That, sadly, doesn't include the editors of this quality site.
Prices are actually falling (Score:4, Informative)
Hothardware reports that pricing is now on a downward trend [hothardware.com], with GPU prices approaching MSRP. They suggest that this is at least in part due to a new Ethereum ASIC miner [hothardware.com]. And they provide citations to show that the prices are actually falling, while Computerworld simply makes a claim with no evidence...
Re:Prices are actually falling (Score:4, Informative)
"Bitmain Launches Ethereum ASIC Miner With Hash Rate Performance Of 8 GTX 1080 GPUs For Just $800"
Wow. I'd expect to see a flood of used GPU's piling up on Ebay with this news. Let's see...
Search GTX 1070, click "used", results: 971 listings with the first ~275 under $400.
Yep. "Crisis" over. Expect prices to fall precipitously.
I believe Ethereum is/was the real source of GPU demand given that bitcoin miners long since moved to ASICs. Ethereum was designed to be ASIC resistant [stackexchange.com]. So what has changed? Has there been some breakthrough in ASIC design, driven by cryptocurrency?
Re: (Score:2)
Ethereum's proof of work is based on directed graphs and apparently it is a memory and bandwidth hog so it works pretty well on a GPU but you can't really just make a single ASIC that can tear through hashes like you can with bitcoin. You can certainly make an ASIC that does the work, but you also have to have lots of memory interfaced to that ASIC through a high speed interconnect. So you're really talking about designing a custom computer rather than just an ASIC with a simple interface.
Thanks for the G
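Rough numbers on why the memory system is the wall, assuming the commonly cited Ethash parameters of 64 reads of 128 bytes per hash (treat the figures as approximate):

    bytes_per_hash = 64 * 128          # ~8 KB of random DAG reads per hash attempt
    gpu_bandwidth  = 256e9             # ~256 GB/s, a GTX 1070-class card
    print(gpu_bandwidth / bytes_per_hash / 1e6, "MH/s upper bound")   # ~31 MH/s

That lines up with the ~25-30 MH/s such cards actually achieve, so an ASIC mostly just puts cheaper logic next to the same expensive memory.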
Re: (Score:2)
Honestly, 970 to 1070 is not worth it unless you actually have games that are straining you just below your point of comfort.
Re: (Score:1)
Honestly, 970 to 1070 is not worth it unless you actually have games that are straining you just below your point of comfort.
Thanks for that tip; I have dual Zotac 950 AMP!s (one of them was a warranty replacement for a 750Ti, which they didn't have in stock any more) and they bench out at about the same level as a 970. Looks like it's going to be some while yet before I upgrade my graphics card, which is a shame. I was feeling ready for an upgrade.
Re: (Score:3)
I have a 970 myself, and gaming on 1080p, I just can't find all that many (really any) games that go below my point of comfort (which is around 60fps) without going so far below it that 1070 wouldn't be enough of an upgrade to matter.
In general, a good rule of thumb is that you upgrade every two to three generations, depending on your preferences in games and the resolution you play at. The thing that throws many people nowadays off is that the last three generations lasted for a much longer time than those before the
Re: (Score:2)
Yeah, my 970 was driving 1440p effortlessly back when it was new, and is now driving 1080p effortlessly on my 'has steering wheel' backup PC.
My 1070 is struggling on brand new games at 1440p if I leave all settings at max, by which I mean it can drop under 50fps occasionally.
Re: (Score:2)
How do you find interfaces on newer games in 1440p btw? I keep hearing horror stories from friends who decide to "upgrade" from 1080 to 1440 only to discover that interfaces become much harder to decipher.
Re: (Score:2)
Never had an issue with newer games. Some very old games struggle at that resolution but new ones kind of expect it.
Re: (Score:2)
Some of us just want a cheap CUDA device for playing around with.
Re: (Score:2)
Then you'd be looking at used market for 700- and 900-series cards.
Re: (Score:2)
The 970 drives my 4K screen reasonably well, but a 1070 would do better. I wouldn't make that upgrade for retail price, but for $200, sure.
Also, I do medical imaging and loading a full volume into memory and then manipulating it works in 4 GB but would work better in 8. It's handy to have a local machine with a decent card so you don't have to debug on the cluster all the time.
Re: (Score:2)
If you're mainly struggling with the memory side of things, a 1060 with 6GB may be sufficient. That said, if your company is paying for it, you may as well get a 1080.
Re: (Score:2)
Be careful with eBay graphics cards right now - some scammers have figured out you can hack the firmware on a card to spoof the model identifier. Buy a 1050, replace stickers, fiddle firmware, sell as a 1080 - as far as software reports, it is a 1080. Still performs like a 1050 though, and hope the buyer doesn't realise. Can be done on AMD cards too.
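One way a buyer can sanity-check a card is to measure throughput instead of trusting what the driver reports. A sketch using CuPy, assuming it and a CUDA runtime are installed; the spoofed BIOS can lie about the name string, but it can't fake the arithmetic throughput of the bigger die:

    import time
    import cupy as cp

    n = 8192
    a = cp.random.random((n, n)).astype(cp.float32)
    b = cp.random.random((n, n)).astype(cp.float32)

    cp.matmul(a, b)                     # warm-up
    cp.cuda.Stream.null.synchronize()

    start = time.perf_counter()
    cp.matmul(a, b)
    cp.cuda.Stream.null.synchronize()
    elapsed = time.perf_counter() - start

    # An n x n matrix multiply is ~2 * n^3 floating-point operations.
    print(f"~{2 * n ** 3 / elapsed / 1e12:.1f} TFLOPS measured")

A genuine 1080 should land several times higher on that number than a rebadged 1050, whatever the sticker says.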
Re: (Score:1)
With any other vendor I could return it. I can't do that with eBay.
eBay guarantees your purchases, and so does PayPal. So if you get a card that doesn't work, you should be able to reverse the transaction. It doesn't matter if the seller "allows returns" if the product is bad. You're using eBay wrong.
Re: (Score:2)
Actually, anybody selling off multiple cards that were used for mining kept the temps below 70 max, most below 60. Anybody mining in a datacenter keeps them in the 30-40 range.
Re: (Score:2)
This story is about 3-6 months beyond its use date (Score:2)
CW is like 3 months behind on this news (Score:2)
welcome to 2 years ago (Score:5, Insightful)
Welcome to 2016. What moron posted this article? GPU prices are dropping, not soaring; Bitcoin hasn't been mined on GPUs for years now, and the decline in alt-coin mining since the price crashes has led to nice drops in GPU prices.
Re: (Score:2)
A miner who has used cards to sell?
Re: (Score:2)
A miner who has used cards to sell?
Too late for that, ebay et al are flooded with used cards now. Someone trying to offload cards either has to quickly take what they can get or hope for a recovery in crypto.
Okay, ignoring the fact Bitcoin is on ASICs now (Score:3)
If there were a smaller market for GPUs, the economy of scale aspect wouldn't be working in anyone's favor. No chipmaker gives a shit whether your framerates are 30FPS or 60FPS or that you can bump your resolution to 4K versus ... unless you can do so on their competitor's cards at a price point that threatens viability of their own offerings (If nobody buys it because someone else has something way better, they don't make back the sunk costs of R&D, tooling, manufacturing, marketing, etc).
There are still coins you can mine from GPUs. I'm actually intrigued by what the Dogethereum project might do to the market since that's shifting back to GPU.
100,000 gamers: "I want a new GPU for cheap because I want higher framerates, but I'm poor!"
10,000,000 cryptocurrency miners: "I want several better GPUs because I can make more money from them and I'm willing to pay for that privilege if you'll deliver a respectable ROI."
GPU maker: "Okay, miners, you'll get your new card. Gamers, since yields are never perfect, we'll offer the cards to you for cheap if you're okay with the cores that don't work being disabled in hardware. It's still over four times as fast as anything else the other guys can offer for the price."
All: "Great!!"
Re: (Score:3)
Nvidia at least sees the cryptocurrency thing as a flash in the pan. Miners have demanded a lot of high end GPUs all of a sudden, but they might not express that demand over decades like gamers do. They're certainly not driving GPU R&D.
Re: (Score:2)
They're certainly not driving GPU R&D.
Not long-term, but if you run a company and smell easy money, you adapt and chase it. That's just good business, if only for being able to survive... or beat your competition to the punch.
R&D departments don't necessarily release the best of their best all at once unless they want to stomp a competitor. It's like I related back in the 90s: If you know how to make a 24X CD-ROM drive (that may not be reliable above 20x) and your competitor can only make a 6x, just release an 8x that can be boosted to 12x
Re: (Score:3)
"Not long-term, but if you run a company and smell easy money, you adapt and chase it. That's just good business, if only for being able to survive... or beat your competition to the punch."
Chip design isn't easy money. Some people smelled easy money and whipped off some simple ASICs. They probably made back some fraction of a percent of what Nvidia pulls in.
If you're a smart business you try to supply some long term, sustainable demand. Nvidia has publicly said that they don't think cryptocurrency minin
Re: (Score:2)
I don't think I'm entirely mistaken, just off by a bit. But I think you helped me refine my views. I've been every type of gamer, from ditch to high-end. When 3D acceleration was still a novelty in the late-90s, I settled for a 2MB Voodoo Rush card from Intergraph (okay, it had a dedicated 4MB in 2D mode, but that's not what I bought it for). When VR hit, I pre-ordered a GTX 1080 the same day I switched my Rift order to a Vive.
I hear a lot more about "70" GPUs being employed for mining than "80
Free Market (Score:2)
The end (Score:2)
Re: (Score:2)
Of the coin miners I know, like 90% bought solar power rigs for their houses. One had his ASIC miners linked to his brother's under-construction house and paid for something like 30kW of solar gear outright. He made a profit from the start (expensive electricity at that level, plus offsetting heating costs).
When the miners are garbage, the panels will still work.
Selling for what? (Score:2)
selling for twice as much as suggested retail
In dollars or Bitcoin?
Gamers over crypto miners (Score:2)
Okay, /. just notices this NOW? (Score:2)
Seriously. It's been an issue for over a year now...
Hardware lock on GPU's? (Score:1)
Why can't they put a hardware lock on GPUs which detects and prevents cryptocurrency mining, and separately sell a card which is solely used for mining?
Getting tired of seeing GPUs going for 2-4 times their original price. It must be putting a dent in the PC builder market.
Re: (Score:2)
Why can't they put a hardware lock on GPUs which detects and prevents cryptocurrency mining,
Seems like it'd be potentially complicated overhead - how would it detect crypto mining versus other heavy usage? Wouldn't that affect performance across the board, no matter what you are using the card for? And of course, why would they want to lock out particular uses of the card? (Which is just me pointing out a possible mindset, not agreeing or disagreeing with it - though it could be argued that the problem is not the usage alone, but the people buying up huge quantities - supply, purchasing,
I want GPU for Machine Learning, Want ML hardware (Score:1)
I have set up one GPU inside a Linux container, but I need more GPUs.
Homebrew miners must have ... (Score:1)
... very very low electricity rates because at this point most cryptocurrencies have reached the point where a GPU spends more money in energy than it generates currency. Either you must be that thick and unwilling to acknowledge this or you have enough solar power on the premises.
Re: (Score:2)
Either you must be that thick and unwilling to acknowledge this ...
That seems to characterize these people pretty well. After all, they are literally mining hot air and are dependent on a "greater fool" to buy from them.
Re: (Score:2)
... very very low electricity rates because at this point most cryptocurrencies have reached the point where a GPU spends more money in energy than it generates currency. Either you must be that thick and unwilling to acknowledge this or you have enough solar power on the premises.
I see this argument quite often and all I can think is people don't know how much power actually costs, or how much power GPU mining actually uses. Even now, with prices in the crapper, most cards are profitable at power rates of $0.40/kWh and lower.
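Quick back-of-envelope; the per-card figures are hypothetical and move with coin price and difficulty, but they show why the break-even rate sits roughly where the parent claims:

    revenue_per_day = 2.00    # USD of coin produced per day (pool estimate, varies)
    card_watts      = 150     # wall power while mining, lightly undervolted
    price_per_kwh   = 0.40    # the break-even-ish rate claimed above

    power_cost = card_watts / 1000 * 24 * price_per_kwh    # = $1.44/day
    print(f"power ${power_cost:.2f}/day, margin ${revenue_per_day - power_cost:.2f}/day")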
I hate cryptocurrencies (Score:2)
Not for long (Score:2)
There are plenty in stock (Score:2)
only because the retailers are charging a fortune for them. I suppose there is a limit to what the miners are willing to pay for them.
Nvidia 1080s are plentiful at the Fry's I was at earlier this evening. They have price tags of $1k each, which is probably why they're sitting there on the shelf.
Story from last year maybe? (Score:1)