Businesses Bitcoin The Almighty Buck Hardware

GPU Prices Are Falling (venturebeat.com) 149

An anonymous reader shares a report: If you were looking for a new graphics card for your PC over the last year, your search probably ended with you giving up and slinging some cusses at cryptocurrency miners. But now the supply of video cards is on the verge of rebounding, and I don't think you should wait much longer to pull the trigger on a purchase. Earlier this week, Digitimes reported that GPU vendors like Gigabyte, MSI, and others were expecting to see their card shipments plummet 40 percent month-over-month. The market for digital currencies like Bitcoin and Ethereum is losing some of its momentum, and at the same time, large mining operations are pulling back on their investment in GPUs in anticipation of dedicated mining rigs (called ASICs) that are due out before the end of the year. These factors working in conjunction seem like they are leading to more supply, which in turn is forcing retailers to cut prices. For example, the Gigabyte GeForce GTX 1080 video card is selling on Amazon right now for $700. Other retailers even have it listed at the original MSRP of $600. These are the lowest prices of 2018 so far.
This discussion has been archived. No new comments can be posted.

  • Since the Monero branch, RX Vega cards have become the best value for mining Monero, and there's no chance of getting any of those cards at MSRP
    http://www.nowinstock.net/comp... [nowinstock.net]

    • That's fine. Compared to my Nvidia 1080 they're useless for gaming, so I wouldn't consider a Vega.

      • Compared to my Nvidia 1080 they're useless for gaming

        "Useless" is a wild exaggeration. More accurate: 1080 turns in 8 to 15% higher framerates at ultra-high quality. Vega overclocks better, closing the framerate gap and bumping up power consumption to considerably higher than NVidia (maybe 30% more?). Actually, since nobody really needs 100 FPS, you aren't going to notice much difference in practice. Vega handles 4K better. Vega costs twice as much because of mining, that is my only serious issue with it. Going to be hanging onto the Rx 480 for a while yet.

        • I'm far from a serious gamer, so take my opinion with the pound of salt that it deserves. I'm playing Doom on an older monitor with 1152 vertical pixels (at 60 Hz, I think), an RX 480, and a Ryzen CPU, and am quite happy. I've played Overwatch with my oldest kid (he's in California, and I'm in Alaska) and still managed to barely keep up with him and his 980 Ti and i7-something-or-other. The RX 480 is worth holding on to for a bit more, in my humble opinion.
        • by MrL0G1C ( 867445 )

          Not that I'm fussy about frame rate myself, having gamed through the nineties, but the average FPS doesn't matter anywhere near as much as the 0.1% lows, because those lows are what get noticed as 'stutter'. There are plenty of gaming purists who are the gaming equivalent of audiophiles. These purists want 144+ fps all the time so their PC stays in lockstep with their super-wide monitors and never drops a frame, god forbid.
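          To make the "0.1% lows" point concrete, here is a minimal sketch (my own illustration, not taken from any real benchmarking tool) that computes the average FPS versus the 1% and 0.1% low FPS from a list of per-frame render times:

```python
# Illustration only: average FPS vs "1% / 0.1% low" FPS from frame times (ms).
def fps_stats(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    # The "lows" are the FPS implied by the slowest frames:
    # sort descending by frame time and average the worst 1% / 0.1%.
    worst_first = sorted(frame_times_ms, reverse=True)
    def low_fps(fraction):
        k = max(1, int(n * fraction))
        return 1000.0 / (sum(worst_first[:k]) / k)
    return avg_fps, low_fps(0.01), low_fps(0.001)

# Example: mostly ~7 ms frames (about 144 fps) with occasional 40 ms hitches.
frames = [7.0] * 990 + [40.0] * 10
avg, low1, low01 = fps_stats(frames)
print(f"avg {avg:.0f} fps, 1% low {low1:.0f} fps, 0.1% low {low01:.0f} fps")
```

          The average still looks high in that example, but the lows collapse to the hitch framerate, which is exactly the stutter people notice.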

          • Normally those people have bigger wallets than brains. Otherwise they would know that the 144 FPS they are playing at is unnoticeable to the human eye, and therefore it's a waste of money. The only reason to get a better GPU is to play at a solid 60 FPS at a higher resolution with better quality settings. Anything else is wasted money and electricity, which I feel would be better used for mining.

            • by MrL0G1C ( 867445 )

              That's where we differ: unless it's a cold winter, there's no good reason to mine. It's a 100% artificial requirement; the people designing the coins could simply set the difficulty low and mine all of the coins on day one. Setting the difficulty high is an affront to ecology.

              • The difficulty changes with the network. On day 1 you are correct: very low difficulty. But if the coin devs mined all the coins on day 1, they would essentially make that coin worthless. The value of the coin is in the network of miners. It's not as scammy as people who don't understand it or the community make it sound. You should check it out. I've bought 4 GPUs that have paid for themselves in the last 6 months, and I bought them at rather high prices; now it's basically all profit from here on out.
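                As a rough illustration of how difficulty tracks the network (a simplified Bitcoin-style retarget rule, not any particular coin's actual code): if the last batch of blocks came in faster than the target interval, difficulty rises proportionally; if slower, it falls.

```python
# Simplified Bitcoin-style difficulty retarget (illustration only).
# Real implementations add clamping, integer math, and per-coin rules.
def retarget(old_difficulty, actual_timespan_s, target_timespan_s):
    # More hashpower -> blocks found faster -> actual < target -> difficulty rises.
    return old_difficulty * (target_timespan_s / actual_timespan_s)

# Target: 2016 blocks at 10 minutes each (about two weeks).
target = 2016 * 600
# Suppose miners joined and the last 2016 blocks took only one week.
print(retarget(1.0, target / 2, target))  # -> 2.0 (difficulty doubles)
```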

                • by MrL0G1C ( 867445 )

                  And they've basically picked the worst way to create new coins: by encouraging people to buy hardware to mine the coins and by wasting extreme amounts of electricity. The fact that there's been a GPU shortage for the last year speaks volumes. The question is: who is buying these coins? It seems to me like it's all one massive game of speculation and very little else.

                  • Exactly, so you have to get it while the getting is good. Some of the coins seem promising because they have an actual use and may make it, but the rest is exactly that: speculation.

              • by stooo ( 2202012 )

                That's where we differ: unless it's a cold winter, there's no good reason to game. It's a 100% artificial requirement.

            • by Mashiki ( 184564 ) <mashiki&gmail,com> on Monday April 30, 2018 @10:02AM (#56528925) Homepage

              You do know that was debunked back in the 1990s. The difference between 144 fps and 300 fps is noticeable; if you don't think so, go buddy up with someone who works at an IMAX theater and give it a go. The visual acuity of fighter pilots is in the 480-620 fps range, just to give you an example. Conscious (i.e. direct-focus) perception for most people is in the 50-90 fps range, but your brain is 'discarding' unimportant information unless you train it not to.

              • Yes, the kids complaining about GPU prices and teh fps's!!!! have trained themselves to see 120+ frames. Highly doubtful.

                • by Mashiki ( 184564 )

                  Seriously? Is this /. or "the eye can only see 30fps" pathetic console peasantry? Go dig up a CRT or LED that can output to 120Hz, and you can see the difference.

            • I could easily spot the difference between 60, 90, and 120 fps when playing Quake on a CRT (LG Flatron 795) that could do 120Hz @1024x768.

              The difference is very noticeable when panning quickly. I can see each individual screen update as an image along the movement path - i.e. the opposite of a motion blur. The higher the framerate, the more images populate the movement path for the same movement, and the better I felt I could see what was going on. Some people will of course settle for less and be happy wi

        • I'm speaking partially about the compatibility issues. I don't have time to wait for AMD to fix issues after a game releases. I gave them their last chance years ago.

          If you're happy with their hardware, that's great. Myself, I can just pick a top or near top tier Nvidia card when I build a system and save myself a lot of grief.

          And I have a 165 Hz monitor. I'll take as high a frame rate as I can get while keeping detail high. I do notice. I didn't think I would, but I do.

          • I'm speaking partially about the compatibility issues. I don't have time to wait for AMD to fix issues after a game releases. I gave them their last chance years ago.

            I think you're imagining some disparity between AMD and NVidia. As far as I can see, they have roughly equal number of issues. Try searching for "directx issues amd" vs "directx issues nvidia". Roughly equal hits, and drilling into them shows roughly similar kinds of issues.

            I run Linux, so it's no contest: I like the open source AMD driver, which recently is arguably better than the binary-only driver, and better than any driver for Nvidia. A simpler way to put it: fuck you, Nvidia.

  • Used Cards (Score:5, Insightful)

    by Ailicec ( 755495 ) on Sunday April 29, 2018 @05:22PM (#56525143)
    Better question: when will the big operations start dumping cards by the thousands onto eBay? A plentiful secondhand market + crypto not buying new cards should deepen the effect. I'm fine with used stuff, but do wonder how much life a GPU that has run flat-out 24/7 for a year or two has left.
    • Re:Used Cards (Score:5, Insightful)

      by Bing Tsher E ( 943915 ) on Sunday April 29, 2018 @05:50PM (#56525225) Journal

      That is called an extensive burn-in. If silicon lasts through the initial burn-in and was not operated in an abusive environment, it's better than a random new card fresh from packaging.

      • by Tablizer ( 95088 )

        and was not operated in an abusive environment

        Wouldn't CC miners expecting better hardware soon stretch the limits?

      • There is some amount of damage due to electromigration. There is also the potential for damage due to running on a sagging supply rail because of the sheer number of cards, but that would mostly apply to the power delivery stages, which are repairable, and it would only be found in relatively poorly built mining rigs. There's also the potential drying of the thermal paste, or its migration. The warmer the environment, the more liquid many become, and the cards are not always designed to run in a vertical orientatio

        • by Mal-2 ( 675116 )

          Also some chips are notorious for breaking the BGA connections that let them talk to the circuit board under them. This is fixable, but not by your average home user. Just baking without properly resoldering might get you six months or so of service.

        • by Anonymous Coward

          You're exaggerating concerns. Cards for mining are typically underclocked and undervolted. Fan wear is your primary concern and it's easy to research which cards are likelier than others to have low quality ball bearings etc. If the card has been running fine for two years it'll run fine for another five.

      • That's assuming that they were. We're on the tail end of a rush, so you can be sure that a lot of the people who are about to start their farm sell-offs will be amateurs who got into it as a "get rich quick" scheme. I wouldn't even be surprised if the share of people who don't know what they're doing is greater among those now selling their mining cards than it was among the people who set up mining operations in the first place.

        So all in all it's a bit of a gamble. You can get a good card, but there's also the equal
    • by ceoyoyo ( 59147 )

      As soon as ASICs become available. The big operations won't dump their GPUs while they're still profitable, but as soon as ASICs raise the competitive bar, to ebay they'll go.

    • but do wonder how much life a GPU that has run flat-out 24/7 for a year or two has left.

      Is that from the "GPUs have a limited number of instructions they can process before they melt into a puddle" corner of the internet?

      Personally I'd much rather have a second-hand GPU from a company that keeps them in cooled racks in air-conditioned rooms than one from some box under the desk being bumped every few days by a vacuum cleaner. Thermal effects are not going to kill a GPU in a few years unless you run it waaaay out of spec.

  • by Anonymous Coward

    I just bought a new one.

    Fuck you, Murphy.

  • by Fly Swatter ( 30498 ) on Sunday April 29, 2018 @05:56PM (#56525243) Homepage

    I don't think you should wait much longer to pull the trigger on a purchase

    Let the prices keep falling. I think we should all just wait till it hits rock bottom, then wait some more. Seriously though, I always bought the value cards at around $100. Even those appear to be about 50 percent over what I would care to pay today, if I thought I needed anything more than integrated graphics. When I do game, it's older titles anyway. For me, gaming peaked at Quake 3. Get off my lawn and all that.

    • Re:No. (Score:4, Insightful)

      by Sir Holo ( 531007 ) on Sunday April 29, 2018 @08:10PM (#56525741)

      For me, gaming peaked at Quake 3. Get off my lawn and all that.

      Try Far Cry 5. It has multiplayer FPS deathmatch and such, just like Quake (and Unreal Tournament, the true pinnacle, or how about Avara, the FIRST truly 3D internet-multiplayer game).

      Far Cry 5 rocks. Offline or with a friend, you can play through a campaign that has a very interesting and sometimes unpredictable AI, simple random events colliding in the open world for unique situations. Puzzles. Fishing. Prince-of-Persia-like parkour and climbing puzzles. Meaningfully diverse weapons set. Player specialties. NPC specialties. Command-able AI companions.

      Run it on a GTX 1080 Ti for 1080p 60 fps gorgeousness. 4k at 45 fps or so. !!! A Ryzen probably gets you the same.

      • While impressed with FC5, I have to say that there are some amazing oversights that rip me right out of that world and start questioning what the hell they were thinking.

        Yes, they have a big open world to explore, and the thing is fairly non-linear in nature. However, it's mostly a big EMPTY open world, so in order to not feel empty, they cranked up the random spawns of shit-kicker cultists in pickup trucks to a truly absurd level. You can go back and forth between two intersections killing rednecks all d

        • And don't even get me started on the completely out-of-nowhere ending.

          Yeah. They could have built up to the ending over a longer span of the game. But if you make a habit of listening to the radio, you'll get some of that build-up. Outside communications are cut off, but you'd think the NPCs would be talking about the latest news they heard (it's usually music).

          PS: IIRC, there are six distinct endings possible in FC5.

    • I came here to say this... why would you buy when you are on the *cusp* of falling prices? Let them go down, and when they bottom out, then leap on it... I'm seriously hoping to pounce on a 1080ti.

  • The author says "... anticipation of dedicated mining rigs (called ASICs)." This is wrong. Dedicated mining rigs may use ASICs as the main compute engine (or GPUs, or Xeons, or Unicorn smegma, or...) but Application Specific Integrated Circuits are NOT "mining rigs".
    C'mon /. you know better! (Hopefully, anyway.)
    • by DRJlaw ( 946416 )

      The author says "... anticipation of dedicated mining rigs (called ASICs)." This is wrong. Dedicated mining rigs may use ASICs as the main compute engine (or GPUs, or Xeons, or Unicorn smegma, or...) but Application Specific Integrated Circuits are NOT "mining rigs".
      C'mon /. you know better! (Hopefully, anyway.)

      The sentence made perfect sense in context - "large mining operations are pulling back on their investment in GPUs in anticipation of dedicated mining rigs (called ASICs) that are due out before the

      • As you said they (mining rigs) are USING ASICs - they are NOT ASICs themselves, which is what the sentence said. Not implied - said!
        No shame here for knowing what sentences are supposed to fucking mean.
        • by DRJlaw ( 946416 )

          As you said they (mining rigs) are USING ASICs - they are NOT ASICs themselves, which is what the sentence said. Not implied - said!

          It most certainly did not.

          "large mining operations are pulling back on their investment in GPUs in anticipation of dedicated mining rigs (called ASICs) that are due out before the end of the year."

          It said the dedicated rigs replacing GPUs are called ASICs.

          They [buybitcoinworldwide.com] are [digitaltrends.com] called [vice.com] that [techradar.com], and more to the point your original post asserted that they called ASICs (all ASICs) dedicated mining

  • by Billly Gates ( 198444 ) on Sunday April 29, 2018 @07:54PM (#56525659) Journal

    Jesus. I remember in the good old days when $200 was a good chunk for a great GPU and $350 was for the very fastest ones.

    Wtf happened? An Nvidia monopoly and gamers ready to open their wallets because of the Nvidia label seem to be destroying the market. The YouTube comments about how the 1050 Ti is God when referring to AMD products and the Xbox One X confirm this brainwashing and monopoly.

    Hey PC master race, just don't be shocked when us regular peasants switch to consoles, where you can get the same performance for cheaper.

    • by Anonymous Coward

      It's not that simple. A classic entry-level gaming GPU back in 2004 would run you about $200, which is the same as $270 today. $270 will get you a GTX 1050 Ti or 1060, which both fill the same general entry-level gaming GPU slot. So there's not really much of a difference between today and yesterday.

      On the higher end, that same $350 in 2004 is now $472 today, which would get you about a 1070 or the TI version.

      What happened between then and now (other than basic inflation that people like to forget) is that NV
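      (For the curious, the adjustment above is just a multiplication; the ~1.35x factor for 2004-to-2018 dollars is an approximation on my part, not an exact CPI figure.)

```python
# Rough sketch of the inflation adjustment above; the factor is approximate.
INFLATION_2004_TO_2018 = 1.35

for price_2004 in (200, 350):
    print(f"${price_2004} in 2004 is roughly ${price_2004 * INFLATION_2004_TO_2018:.0f} today")
# -> about $270 and $472, matching the numbers above
```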

    • You can't get the same performance in a console. Ever. Pull the other one. Certainly not the same performance of a 1080 or 1080ti.

      Not even the same neighborhood.

      • by Wolfrider ( 856 )

        > Pull the other one ...it's got bells on? ;-)

      • I disagree. This used to be true, but the Xbox One X uses a modified RX 580, which is on par with a 1060. A 1060 costs $380; a whole console is just $20 more with the same graphics. $2,000 to play games is stupid and soo 1999. Times have changed and now they're going backwards.

    • by gman003 ( 1693318 ) on Sunday April 29, 2018 @08:48PM (#56525909)

      The top-end cards became halo products. Just like few cars on the road are Mustangs, few video cards are 1080 Tis.

      Checking the Steam Hardware Survey, about 60% of all cards are Nvidia GeForces in the xx50 to xx70 range - the normal, reasonably-priced cards. (AMD is only 10% of the market right now, with much of that being integrated. Their fundamental architecture was just way better-suited for mining, so their prices spiked even harder than Nvidia's did.)

      Also, be aware the GF1xxx series MSRPs were already inflated by mining demand. The "normal" price for a top-end xx80 Ti is $500-$600, and the only-slightly-slower xx80 is usually $350-$400. But the 9xx series was already selling for nearly double MSRP when the 10xx series came out; they'd have been idiots not to bump up the stock prices.

      • AMD is only 10% of the market right now, with much of that being integrated. Their fundamental architecture was just way better-suited for mining, so their prices spiked even harder than Nvidia's did.

        My concrete interpretation of that is that Nvidia sank its transistor budget into tile-based rasterization while AMD went for more vector FLOPS, the former being a better tradeoff for video rendering and the latter for general-purpose computing on GPU (not just mining!). That would also explain why AMD's Vega/Navi roadmap focuses on GPGPU for the next year or two. I suppose AMD's next GPU arch will also join the tiler party.

    • by MrL0G1C ( 867445 )

      Jesus. I remember in the good old days when $200 was a good chunk for a great GPU and $350 was for the very fastest ones.

      Three things happened: cryptocurrency mining; memory in high demand (partly smartphones) plus a memory cartel; and other new use cases, including but not limited to GPUs doing AI computation and being used in self-driving vehicles.

      It's about time GPUs forked between gaming and general compute; general compute is becoming big enough that it should get its own products with different GPUs. For i

    • Jesus. I remember in the good old days when $200 was a good chunk for a great GPU and $350 was for the very fastest ones.

      Wtf happened?

      Nothing happened. You just moved the goalposts to an unreasonable position. You'll have no problem playing any modern computer game with a $200 GPU (MSRP, that is), and a $350 one will happily get you those high frame rates with your ultra-fast G-Sync monitor, or whatever the hell you're doing.

      If you're pushing 4k at 120Hz in 3D across 2 monitors.... well there have always been $1000 GPUs out there if you cared enough to look.

      Personally I have a GTX 1060, MSRP at $249 and I have yet to find a game out there

        • Except a 1060 is worth $390 today. For $10 more I can get an Xbox One X, which has the same GPU performance since it uses an RX 580.

        The 1060 is not even high end??! Something is up and I smell monopoly.

        • Except a 1060 is worth $390 today.

          What it's worth is irrelevant. You were talking about price comparisons based on monopolies and not the temporary effect of the coin market. So only the MSRP is relevant.

          The 1060 is not even high end??!

          To which I again suggest you define where the goalposts should be. You use "high end" arbitrarily. The 1060 is just as high end as a $200 card from well back in the day you're comparing it to in your original post, able to edge out similar levels of performance (high to ultra high quality at native resolution of common screens) from the gam

    • by ebvwfbw ( 864834 )

      Happened to me. I bought an Nvidia card, I think it was $350. I was thinking hot damn... that'll be something. Then I found out I ONLY have a 1060. To be something it has to be at least a 1070. Well, it works for me.

  • by Lord Ender ( 156273 ) on Sunday April 29, 2018 @08:10PM (#56525739) Homepage

    Ethereum and Monero are the reason GPUs are being snatched up by miners. The value of those coins crashed horribly earlier in the month... to the point where it was barely profitable to mine. But prices have rebounded recently, so you can expect GPUs to start selling out again soon.

    When the blockchain "difficulty factor" for ETH and XMR solidly surpasses their record highs, then you will know these ASICs are really rolling out. From there it won't be long until these $700 cards can be found on Ebay for chump change.

    You can track the difficulty here: https://www.coinwarz.com/diffi... [coinwarz.com]
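    As a rough sketch of that "solidly surpasses the record high" check (my own illustration; the CSV file, its header-less date,difficulty format, and the 10% margin are all assumptions, e.g. an export from a tracker like the one above):

```python
import csv

# Hypothetical input: a header-less CSV of "date,difficulty" rows for ETH or XMR.
def asics_likely_online(path="difficulty_history.csv", margin=1.10):
    with open(path, newline="") as f:
        difficulties = [float(row[1]) for row in csv.reader(f)]
    previous_record = max(difficulties[:-1])
    latest = difficulties[-1]
    # "Solidly surpasses" the old record: here, by at least 10%.
    return latest > previous_record * margin

# print(asics_likely_online())  # True once difficulty blows past its old peak
```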

  • by BlueCoder ( 223005 ) on Sunday April 29, 2018 @08:44PM (#56525897)

    I'm interested in the AMD RX Vega 64 since I invested in a FreeSync monitor. The price right now is around $800. That is $300 more than the MSRP of $500 at release. This is more than six months later, so by now the second revision and/or custom vendor versions should be coming out and the original card should be going for around $450. By my calculation that is far from being on par.

  • calling 'bullshit' (Score:2, Insightful)

    by Anonymous Coward

    ... source is 'venturebeat'.

    yup. bullshit.

    checked price of a 1050ti..... total bullshit.

    and this isn't even a very good mining card. price of these went up because they were all that was left.... and then, these disappeared from store shelves, too, even when priced as high as two hundred fucking dollars... not the $99 or less they should be selling at eighteen months after its introduction...

    if it weren't for mining, we'd have the geforce gtx 11xx series out by now, too. this mining shit isn't just affectin

  • Yeah.

    Fuck that.

  • Wait Until Summer (Score:4, Informative)

    by mentil ( 1748130 ) on Monday April 30, 2018 @01:08AM (#56526845)

    I don't think you should wait much longer to pull the trigger on a purchase

    Actually, rumor is that Nvidia is going to release the 1100 series GPUs in June or July. They're expected to have about 40% higher performance than the 1000 series. Also, the Ethereum ASICs are dropping in July; assuming there's not a hard fork that makes them useless (and even if there is), there will be a sharp price drop in Ethereum at that time, leading to lower GPU demand by cryptominers.

  • by Anonymous Coward

    For example, the Gigabyte GeForce GTX 1080 video card is selling on Amazon right now for $700.

    Fuuuuck that.

    $700 is stupid for a GPU.

  • I wonder if there has been long-term damage to PC gaming. Combined with extortionate RAM prices, I think a serious chunk of gamers will have moved to the PS4 and Xbox, particularly with their attractive recent 'Pro' updates. Still, there is the problem of no adaptive sync with Nvidia.
  • ...because parents are realizing their children's power consumption and forbidding them to mine. It's easy to mine if you don't have to pay for the power (the same as with indoor grass growing).
  • GPU vendors also announced they are drastically cutting back their production to coincide with the mining reduction. They will do everything they possibly can to retain the inflated prices.

  • I wanted to build a gaming computer at the beginning of the year. Looked at the prices, was almost ready to take the plunge on a 1000 GBP PC (Ryzen 7, GTX 1080, plans to splash on VR), with more than half the price being the video card.

    Then a friend woke me up with words to the effect of "Nvidia will launch new products soon". OK... I'm not in any hurry... My Phenom II just needed a usable video card for games instead of the GT 210 I had inside, so I bought a GTX 560 (I was too cheap to go for a 660) with p

  • I get 100% markups still
  • This comes as a surprise to some people, and no surprise to others.

    In the short run, the quantity of a thing for sale is rather fixed and the price is rather variable.

    In the longer run, the quantity of a thing for sale is rather variable and the price is rather fixed.

"Protozoa are small, and bacteria are small, but viruses are smaller than the both put together."

Working...