Desktop GPU Sales Hit 20-Year Low (tomshardware.com)

Demand for graphics cards significantly increased during the pandemic as some people spent more time at home playing games, whereas others tried to mine Ethereum to get some cash. But it looks like now that the world has re-opened and Ethereum mining on GPUs is dead, demand for desktop discrete GPUs has dropped dramatically. From a report: In fact, shipments of discrete graphics cards hit a ~20-year low in Q3 2022, according to data from Jon Peddie Research. The industry shipped around 6.9 million standalone graphics boards for desktop PCs -- including the best graphics cards for gaming -- and a similar number of discrete GPUs for notebooks in the third quarter.

In total, AMD, Intel, and Nvidia shipped around 14 million standalone graphics processors for desktops and laptops, down 42% year-over-year based on data from JPR. Meanwhile, shipments of integrated GPUs totaled around 61.5 million units in Q3 2022. In fact, 6.9 million desktop discrete add-in-boards (AIBs) is the lowest number of graphics cards shipped since at least Q3 2005 and, keeping in mind sales of standalone AIBs were strong in the early 2000s as integrated GPUs were not good enough back then, it is safe to say that in Q3 2022 shipments of desktop graphics boards hit at least a 20-year low.
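A quick sanity check on those figures (a throwaway sketch; the ~14 million units and 42% year-over-year drop are the JPR numbers quoted above):

```python
# Back-of-the-envelope check on the JPR figures quoted above:
# ~14M discrete GPUs shipped in Q3 2022, down 42% year-over-year,
# implies roughly 24M units shipped in Q3 2021.
q3_2022_units = 14_000_000
yoy_drop = 0.42
q3_2021_units = q3_2022_units / (1 - yoy_drop)
print(f"Implied Q3 2021 shipments: {q3_2021_units / 1e6:.1f}M")
```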

This discussion has been archived. No new comments can be posted.

  • by skam240 ( 789197 ) on Friday December 30, 2022 @05:06PM (#63169060)

    Nvidia's pricing doesn't help. A new 4080 costs $1,200, and good luck finding a 4090 for less than $2k.

    I'm really bummed AMD's latest offerings weren't more impressive, as Nvidia desperately needs real competition in the high-end market to help bring down prices.

    • by Luckyo ( 1726890 ) on Friday December 30, 2022 @05:11PM (#63169064)

      It's not that bad on pricing. You just need to stop thinking in terms of performance per currency and start thinking in terms of performance per kilo.

      My old benchmark, the GTX 970 (the most popular card of its generation, which held that spot well into the next generation), weighed only about 0.5 kilos and cost 320 EUR. A reference RTX 4090 is around 2,000 EUR, but it's close to five kilos in weight. So it's actually only 400 EUR per kilo, compared to the GTX 970's 640 EUR per kilo.

      A true bargain!
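      For the record, the joke's arithmetic checks out, using the comment's own (exaggerated) weight figures — a correction further down the thread puts the 4090 closer to 2.2 kg:

```python
# Tongue-in-cheek "performance per kilo" pricing from the joke above,
# using the comment's own figures (the 4090's real weight is ~2.2 kg).
def eur_per_kg(price_eur: float, weight_kg: float) -> float:
    return price_eur / weight_kg

print(eur_per_kg(320, 0.5))  # GTX 970: 640.0 EUR/kg
print(eur_per_kg(2000, 5))   # RTX 4090: 400.0 EUR/kg
```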

      • by skam240 ( 789197 )

        As a gamer, my thinking is value versus the alternatives. I love PC gaming and have been a PC gamer since I bought and built my own PC for the first time at 16. Twelve hundred dollars for a new card when I'm building a new rig is an awful lot of money when an entire gaming console sells for half that, though. Given that gamers are some of the biggest drivers of at least high-end GPU sales, it's not surprising GPU sales are falling with such poor value being offered relative to competing gaming platforms.

        • by Luckyo ( 1726890 ) on Friday December 30, 2022 @06:12PM (#63169216)

          If you want my take on PC gaming right now, games seem to be in a pretty bad place. The Ethereum shitshow with GPUs lasted for about two years, during which time GPUs were almost unobtainium for average gamers.

          Game developers took notice, and a lot of games don't really require a good GPU nowadays. My GTX 970 lasted me until very recently, and I don't think there was a game other than CP2077 that it wouldn't run at least decently on low/medium 1080p, usually with some issues holding 60 fps, but not much.

          I upgraded after one of the fans started giving out and I couldn't be bothered to buy a new fan and basically rebuild the card with it. Instead I went to the used GPU market and got a half-year-old 2060 for 140 EUR. It has two and a half years of warranty left on it. I can now run most games on high 1080p at above 60 fps, with minimums dipping just below.

          Moral of the story: if you REALLY need to upgrade right now, buy used. Ignore the new GPU market. There's nothing there right now that's worth the price unless you just don't care about price to performance.

          • I am a casual gamer. Everything runs nicely on my 1050 Ti, even at high settings (HD screen). I may represent a pretty big part of the market. For a lot of people, good is good enough.
            To me, it looks like gaming is going down the same road as cars. A few diehards spend loads of money on state-of-the-art hardware to get that extra Hz or fps, bragging about some number that is higher than the competition's. ... Let them pay 2k for their graphics card. It will make them happy.
        • Not sure about where you are, but here where I am in Canada, PS5 consoles are harder to find and just as price inflated as video cards for PCs. Rarely in stock anywhere, and have been that way since they were released. People do manage to get them, but it is not as easy as just going to the store (or ordering online).

          Building a gaming PC is actually much easier than scoring a console, though admittedly more expensive.
      • by skam240 ( 789197 )

        I know you're joking btw, I just thought I'd spell out my thinking on these new cards having poor pricing.

      • I'm not having problems playing modern games on my 2080 Ti. I like new shiny techy things and I can afford them, but non-gouge pricing on 3090s came around about the time the 40xx cards came out. So the 3000 series isn't new and shiny, and the 4000 series is in full-on price-gouging mode. I haven't splashed out yet.

        My next discrete GPU purchase is probably for the Intel card for AV1 hardware encoding, which is something I need. My games will continue to work fine on the 2080ti for a while and the 'new shiny thing' psy

        • by Rei ( 128717 ) on Friday December 30, 2022 @05:41PM (#63169130) Homepage

          It depends what your target is. For example, in my circle it's all about VRAM on NVidia cards for CUDA, for AI art and other AI applications. So until recently, it was only a two-card race: the 3060 for 12GB if you were on a budget, or the 3090 for 24GB if you had tons of money. The new 40xx series options add in new possibilities so it's not as narrow of a choice (though I expect most to stick with 30xx series for now).

          I myself recently got (but haven't yet set up... fingers crossed) a used 3090 formerly-crypto-mining card with no functional HDMI ports, for $550, alongside a new motherboard+ram+cpu so that it'll (hopefully) all fit and work together. Given that new 3090s are >=$1300 right now, it could either be a steal, or a big waste of money. Gonna try to run both the 3090 and 3060 at once if they'll fit, though I'll probably need to underclock them.

          • I'd love to be able to use GPUs in my day job, because I do a bucket load of data analysis (entropy analysis for cryptography mostly) but those algorithms don't map to GPUs. So I make do with high core count CPUs and the occasional FPGA. The FPGAs take a bit more design investment to design the circuits, but they process a lot of data quickly once it's done.

        • My next discrete GPU purchase is probably for the Intel card for AV1 hardware encoding, which is something I need.

          If it's not happening immediately, might as well wait for the RTX 40 prices to come down, then you won't have a GPU that's only useful for encode.

        • Meet Cyberpunk 2077. Turn on ray tracing, connect a 1440p or 4K monitor, and get back to me if you still think the 2080 Ti is awesome.

          Hell, I own a 3080 Ti and it melts and has coil whine at 1440p when I turn ray tracing up to max settings

      • by Rei ( 128717 )

        If I lash my 3090 to a pole, it doubles nicely as a sledgehammer for driving in fence posts.

      • Wow, Jensen was right, the more kilos you buy, the more you save!

        • by Luckyo ( 1726890 )

          Indeed. Why buy one 4090, when you can buy two and save even more!

          Just SLI them or something.

      • by _xeno_ ( 155264 )

        My old benchmark, the GTX 970 (the most popular card of its generation, which held that spot well into the next generation), weighed only about 0.5 kilos and cost 320 EUR. A reference RTX 4090 is around 2,000 EUR, but it's close to five kilos in weight. So it's actually only 400 EUR per kilo, compared to the GTX 970's 640 EUR per kilo.

        OK, so you're joking, and I know that, but - I think you're on to something.

        GPU performance has pretty much plateaued. So they're trying to increase it now by making the cards ever larger and drawing ever more power. Current GPUs are freaking huge! Some of them require special supports because they're so large that the case and PCI connector simply cannot support their weight. I mean, a 5 kg GPU, just think about that. That's legitimately the weight of a small dumbbell. And these GPUs were literally melting thei

      • Spoken like a former drug dealer, or a street corner pharmacist. Only keys matter, forget metric or Imperial

      • The 4090 is 2,186 g (2.186 kg) in weight. Close to 5 pounds, not 5 kilos.

        Still, that's a LOT of weight.

        But this is all good news - I'm in the market for a pair of new decent video cards, but I can wait - no rush. Overpaid early this year during the shortages, so I'll be quite happy to see what the market is like in 3-6 months. Kind of "make it up."

        Decent specs, decent power consumption. Because power consumption is a thing - you'll end up paying way more for power to run a 4090 over its lifetime tha

    • by hdyoung ( 5182939 ) on Friday December 30, 2022 @05:24PM (#63169100)
      For the most part, those cards are “prestige purchases”. You buy one of those if you already have a 2000 dollar PC with a 1500 dollar gaming monitor, and want a graphics card that lets you play *insert currently popular shooter* at a resolution and graphics settings so high that you can’t even make out the differences with the naked eye, at a frame rate so high that it’s literally meaningless because it’s already twice as fast as the reflexes of the fastest person on the planet. Why would you do this? So you can brag about it on social media. That’s literally all it’s good for.

      Meanwhile, for normal people, $1200 buys a full PC, with a monitor, that’s capable of playing literally any PC game at halfway decent settings.

      Let’s set aside the scientists that use graphics cards for actual research. That’s a different story, but most of those types don’t buy the gaming-grade cards. Nvidia has an entirely separate GPU line for that kind of work.

      There are also a few thousand people who crunch video for a living. There, a $1200 graphics card is a business purchase. Other than that, this discussion is purely all about gaming.
      • by skam240 ( 789197 )

        Meanwhile, for normal people, $1200 buys a full PC, with a monitor, that’s capable of playing literally any PC game at halfway decent settings.

        Poor value there compared to what $1,200 can get you given today's GPU prices alone, as that $1,200 PC won't perform any better than consoles costing half as much. Plus that $1,200 gaming PC is likely to feel out of date sooner than a brand-new console.

        I know PC gaming has always been more expensive than console gaming but with these GPU prices nowadays it really feels much more so.

        • by mobby_6kl ( 668092 ) on Friday December 30, 2022 @06:53PM (#63169266)

          I really doubt that, because my $500 PC from 4 years ago is pretty close to the consoles as it is. I put a GTX 1070 into an eBay Optiplex and it ran Cyberpunk (at release, not now that it's been debugged) at an actual 1440p. Which is... what the consoles are doing. The only difference is they can upscale with FSR, and there are some light RT effects thrown in.

          • I put a GTX1070 into an ebay Optiplex and it ran Cyberpunk (at release, not now that it's been debugged) at an actual 1440p

            At what, 2 FPS? I have the same card and even at 1080p (which is my monitor resolution, still) most games can't make 60 fps even at middle settings.

            • What? No, 30-40 fps, medium-high settings: https://i.imgur.com/lQThNgj.jp... [imgur.com]

              Not amazing but playable and comparable to the consoles when they actually render at that resolution. Cyberpunk runs into a CPU bottleneck I think, so I don't remember seeing over 45fps even at 1080p, but something newer than fucking Ivy Bridge would crush it no problem, e.g.: https://youtu.be/qkuR8ZK5TVQ?t... [youtu.be]

              Obviously I'm not saying it's a 1:1 substitute, you're missing RT and DLSS stuff but this was just to illustrate that it's no

      • by guest reader ( 2623447 ) on Saturday December 31, 2022 @01:24AM (#63169804)

        at a frame rate so high that it's literally meaningless because it's already twice as fast as the reflexes of the fastest person on the planet.

        High frame rate is not meaningless according to the following articles:

        Humans perceive flicker artefacts at 500 Hz

        Humans perceive a stable average intensity image without flicker artifacts when a television or monitor updates at a sufficiently fast rate. This rate, known as the critical flicker fusion rate, has been studied for both spatially uniform lights, and spatio-temporal displays. These studies have included both stabilized and unstabilized retinal images, and report the maximum observable rate as 50 - 90 Hz. A separate line of research has reported that fast eye movements known as saccades allow simple modulated LEDs to be observed at very high rates. Here we show that humans perceive visual flicker artifacts at rates over 500 Hz when a display includes high frequency spatial edges. This rate is many times higher than previously reported. As a result, modern display designs which use complex spatio-temporal coding need to update much faster than conventional TVs, which traditionally presented a simple sequence of natural images.

        https://www.ncbi.nlm.nih.gov/p... [nih.gov]

        Human Eye Frames Per Second, 220 FPS

        The USAF, in testing their pilots for visual response time, used a simple test to see if the pilots could distinguish small changes in light. In their experiment a picture of an aircraft was flashed on a screen in a dark room at 1/220th of a second. Pilots were consistently able to "see" the afterimage as well as identify the aircraft. This simple and specific situation not only proves the ability to perceive 1 image within 1/220 of a second, but the ability to interpret higher FPS.

        http://amo.net/NT/02-21-01FPS.... [amo.net]
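        To put those rates in frame-time terms (a quick sketch; the Hz figures are the ones cited above):

```python
# Frame interval for the refresh/perception rates discussed above:
# a 60 Hz display, the USAF 1/220 s test, and the 500 Hz flicker study.
def frame_interval_ms(hz: float) -> float:
    return 1000.0 / hz

for hz in (60, 220, 500):
    print(f"{hz} Hz -> {frame_interval_ms(hz):.2f} ms per frame")
```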

      • > at a frame rate so high that it's literally meaningless because it's already twice as fast as the reflexes of the fastest person on the planet.

        This is actually misleading and ignoring some important contexts, especially for multiplayer:

        1. You want a high framerate so when the worst case is hit with all the particles / smoke / transparency effects (i.e. overdraw) that your framerate is still high at 120+ FPS. Some reviews [youtube.com] will show the 1% Lows in relation to the max FPS.

        2. There is a HUGE difference be
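        For readers unfamiliar with the "1% lows" mentioned in point 1, here is one common way it's computed from a frame-time capture (a sketch; exact definitions vary between reviewers, and the sample capture is hypothetical):

```python
# One common definition of "1% low" FPS: average the instantaneous
# FPS of the slowest 1% of frames in a capture (frame times in ms).
def one_percent_low_fps(frame_times_ms):
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = max(1, len(worst) // 100)                 # worst 1% of the capture
    return sum(1000.0 / t for t in worst[:n]) / n

# Hypothetical capture: 99 smooth 10 ms frames plus one 20 ms stutter.
times = [10.0] * 99 + [20.0]
print(one_percent_low_fps(times))  # 50.0, vs. an average of ~99 FPS
```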

      • Bro, if you ever own a 144 Hz or 165 Hz panel with vsync turned on, or at least adaptive refresh rates, you will experience a smoothness of motion that is like no other.

        I got that on a 144 Hz panel in 2015 and can't go back, similar to an SSD. Yes, your eyes can see it when you spin or turn, once you hit 120 fps with 120 Hz perfectly in sync and no 1% lows. Most people still have 60 Hz panels or have shitty GPUs that only do 70 fps, so they never experience it.

        I got 1440p over 4K just for this effect. My go

      • The other reason to do this is to avoid upgrading in the near future. If you've got a new card with good ray tracing support and enough VRAM, you'll be good to go for several years.

    • by Darinbob ( 1142669 ) on Friday December 30, 2022 @05:43PM (#63169138)

      Those are not really intended for the average gamer. Those are ridiculous cards, either for crypto mining or the leet gamers who have something to prove.

      I don't get new GFX cards, I get something a couple years old and the price is much more reasonable. Games are at the point where graphics requirements are not going up, especially if you're sticking to an HD monitor. You do not need 120fps on a 4K monitor, save that money and put it into something useful, like more RAM.

    • Re: (Score:2, Insightful)

      by rsilvergun ( 571051 )
      AMD never competes at the high end. It's not worth it. The sales only look good if you don't consider what it takes to develop those super-high-end GPUs. Never mind the fact that you're taking silicon you probably could have sold to workstation customers and selling it to gamers at a fraction of the price, just so you can have some bragging rights.

      AMD's newest offerings are fine for their price point, right now the problem seems to be quality control with heating problems because paste and pads aren't being
    • A somewhat disingenuous argument. You're omitting the fact that some of the upfront cost of the 4090 is defrayed by reducing your heating bills (god help you come summer though.)

    • Price was what killed it for me too. I've been wanting to upgrade my 2015 Hackintosh desktop with some more RAM, a better CPU, and a newer video card. Honestly, I just wanted to be able to play RDR2 on it. I wasn't even planning on getting crazy. ~$350 for a 3050 even after the release of the 40x0s was a bridge too far.

      I bought an ASUS TUF 15 i7/3070 for $999 on Black Friday. The SSD is too small to hold numerous AAA games at once, but it is upgradeable. I dislike the small keys, coming from an M1 Ai

    • I'm really bummed AMD's latest offerings weren't more impressive as Nvidia desperately needs real competition in the high end market to help bring down prices.

      Being within ±10% of Nvidia's performance in normal workloads that use stream processors, for slightly less money, seems like a good deal to me. Nvidia doesn't need to reduce its prices no matter the competition, though, because its fixed-function hardware and the CUDA vendor lock-in have many sub-industries by the balls.

      Nvidia was once the go-to for decent Linux driver support; that crown has been in AMD's hands for the last ten years. The only real reason to buy Nvidia is to use the software that uses nv

    • by mjwx ( 966435 )

      Nvidia's pricing doesn't help. A new 4080 costs $1,200 and good luck finding a 4090 for less than $2k.

      I'm really bummed AMD's latest offerings weren't more impressive as Nvidia desperately needs real competition in the high end market to help bring down prices.

      This.

      NVIDIA/AMD: we've jacked up the prices on our cards, why aren't they selling?

      Gamers: Wait... are you talking to us? I couldn't hear you over the sound of you fellating the crypto-bros.

      TBF, I'm glad I got a 3070 FE because Nvidia insisted that _SOME_ cards had to be sold to gamers (allegedly). However, the problem is that there are no cheap cards. In 2016 I could buy a 970 for under £300. A 3070 is still upwards of £550 (I paid £480 for my FE) and that is for the non-Ti versio

  • How can they compare the units of shipped discrete GPUs vs integrated GPU? It doesn't make any sense. Every desktop needs a GPU but a discrete GPU is just an option. It's like comparing the number of laptops sold world-wide vs discrete GPUs.

    • by Luckyo ( 1726890 )

      Anecdotal, but I got my father a new laptop recently, and we got a great deal on an RTX 3060 / 12th-gen Intel i5 laptop. Not a bad one, either: it had a great IPS screen, a good-quality keyboard, and a solid cooling system.

      It cost less than a prebuilt desktop with similar specs (I could probably build one from parts and some bargain hunting for a comparable cost). And those sales are everywhere now.

      • by Rei ( 128717 )

        Note that the laptop GPUs are not comparable to their desktop namesakes. A laptop 3060 is not the same as a desktop 3060. For example, the laptop 3060s only have 6GB of VRAM, vs. 12GB on the desktops.

        • Frequencies (core and memory) and bus width are also usually significantly lower.

        • by Luckyo ( 1726890 ) on Friday December 30, 2022 @06:29PM (#63169232)

          Before I get into the detail, I need to mention that I fully agree with you that as a matter of a rule, laptop GPUs are cut down variants of desktop GPUs across the board.

          But this is actually somewhat wrong for 3060 specifically.

          The desktop 3060 has fewer CUDA cores than the mobile 3060 (double-check it if you want, it's actually true). Specifically, the desktop variant uses a cut-down "smaller 30-series die" whereas the mobile variant uses the full "smaller 30-series die". But the desktop version runs at a higher power limit and a slightly higher top frequency. The desktop version also has a wider memory bus, but its memory runs at a significantly lower frequency. The desktop part does have twice the memory, but it's pretty well understood that having 12 GB of VRAM on a 3060, as opposed to 10 GB on the original 3080, is a nod to miners, who could use memory in multiples of 6 GB (you needed about 5 GB of VRAM for a single instance of Ethereum mining), rather than to gamers, who can't really use it in most cases. The same applies to memory bus width.

          My point is that the desktop 3060 isn't actually a GPU aimed at gamers but at miners: lots of fast memory, a cut-down die. The laptop 3060 is the one actually aimed at gamers: less and slower memory, but a full-sized die.

          In this light, 6 GB of VRAM on a mobile variant makes way more sense for a card of this performance level for gaming.

          • by Rei ( 128717 )

            As someone who buys graphics cards specifically for AI art (a small but rapidly growing subset of buyers), it's all about VRAM. I don't care about whether memory is in multiples of 6GB. I care whether I can run Dreambooth, and if I can, at what resolutions and how much do I have to offload to the CPU. What sort of resolutions can I generate in img2img. Whether I can run the X4 upscaler. Whether I'll be able to handle 1024x1024 models when they come out. On and on.

            Performance is of course great - everyon

      • The prices here in Brazil are completely different: assembling a desktop with the same horsepower as a laptop generally costs 5x less...
        • by Luckyo ( 1726890 )

          The difference is usually around 40-50% here in Northern Europe.

          But we live in a really weird period.

  • by devslash0 ( 4203435 ) on Friday December 30, 2022 @05:17PM (#63169074)

    ...or is it just the cry of despair caused and exaggerated by the fall of all the crypto ponzi schemes?

    • by leonbev ( 111395 )

      They probably had record sales to crypto miners last year, so this year was probably a huge shock to them.

      For consumers, this was a no-brainer. Buy a used 3080 on eBay for $700 that was priced twice that much a year ago, or spend $1,200 for a 4080 which is only marginally faster in most game titles.

  • good (Score:3, Interesting)

    by bloodhawk ( 813939 ) on Friday December 30, 2022 @05:17PM (#63169076)
    couldn't be more deserved. Scumbag price gougers like Nvidia need a healthy dose of reality, and with GPU mining collapsing they are thankfully gonna get it in smelly truckloads.
    • I just wish it didn't come at the expense of consumer choice. By all means NVIDIA can go f*** themselves for what they are charging, but I would still like a new high end graphics card, and no AMD doesn't meet my needs (CUDA, AI, Raytracing).

      • Dependence on one company's fixed function hardware (what cuda uses) isn't necessarily a great thing.

        AMD cards can do AI things, but pytorch/tensorflow not having either opencl or vulkan-compute backends severely limits things.

        Strangely enough, NCNN (from tencent) seems to be one of the most trouble-free libraries I've seen, supporting any backend you choose. I've used it on AMD cards for AI video frame interpolation and some other uses and it works a treat.

        Capabilities wise AMD cards seem to fare better on

  • News at 10 (Score:4, Informative)

    by TechyImmigrant ( 175943 ) on Friday December 30, 2022 @05:18PM (#63169078) Homepage Journal

    People don't appreciate being ripped off.

  • I'd LOVE a new GPU (Score:4, Insightful)

    by phoenix182 ( 1157517 ) on Friday December 30, 2022 @05:24PM (#63169098)
    Drop the price of midrange cards by 40-80% so I can consider getting one. I'm not paying $600 for a 4 year old video card, nor will I pay $2000 to build a new system in order to have the connection and power requirements of the new kids on the block (even when the GPU itself is often cheaper).
  • It's almost as if treating your customers like crap, jumping on the crypto craziness, and selling direct to crypto miners while ignoring your traditional customer base has an impact. As a result people are making do without GPUs by using phones, tablets, and game consoles. The next-gen cards are still crazy expensive; even the new, not-yet-shipping RTX 4070 (the 4th-fastest Nvidia card) is $800 and has the same bus width as an RTX 3060 and *LESS* than a 3060 Ti.

    Sure the new cards are faster, ev

    • and it was 60% cheaper than the 4070 ti is - the equivalent tier. I've been waiting to build a new machine anyway, but I'm in no rush especially with Nvidia's absolutely tone deaf bullshit.

  • by xack ( 5304745 ) on Friday December 30, 2022 @05:42PM (#63169134)
    GPUs have already exceeded the fps and resolutions that most human eyes can perceive, so demand was artificially pumped up by cryptocoins and AI. Now that those are collapsing, GPU manufacturers are stuck holding the bag.

    They deserve to lose, we could have had capable gpus for $499 but they decided to play with dogecoins instead, so I will stick with my 2070 for another generation.
  • Talking about sales is meaningful for those of us who wish prices were lower, but the real question is whether this is working for nvidia (and to a similar extent, AMD, although they have a less attractive GPU lineup as usual) or it's going to come back to bite 'em. It seems to me like AMD should be selling at cost if necessary to pick up market share, but maybe their driver situation is still dire enough that they don't have the confidence for it.

    • but maybe their driver situation is still dire enough that they don't have the confidence for it.

      On linux AMD has been the go-to for nearly a decade due to the quality of drivers and "just working". I've heard the windows situation is fairly abysmal, but this being slashdot that really shouldn't bother too many of us.

      • On linux AMD has been the go-to for nearly a decade due to the quality of drivers and "just working".

        Unless you have a brand new card that just came out, and then for some reason the Linux drivers still lag behind the Windows ones. So the Windows users get fucked hardest by never having working drivers, but the Linux users get fucked by never being able to buy the latest card and expect it to work. So AMD has gone from never having good drivers, on any platform, to eventually having good drivers on one platform.

        Maybe they should hire some competent driver programmers? It's only been since they were a whole

  • And this is what happens. Plus anything less than 4 years old is competent enough for casual gaming.

    I have always been a 'mid-range' bargain hunter and refused to pay over about $120 US for a new mid-range card. I finally broke that rule and splurged $185 on a mid-range card 2 years ago. I am not seeing a reason to upgrade at all. But then I don't play new whiz-bang AAA game sequels.
  • It looks like that demand is finally satisfied. Expect prices to come down soon. Nvidia is going to try to bully its board partners into keeping prices high. I expect in about 6 months to a year to be reading the story about a price-fixing scheme that Nvidia paid a fine for while accepting no wrongdoing.

    It sure would be nice if I got picked up by the cops for something and all I had to do was pay a token amount of money and have zero criminal record as a result. Must be nice
  • I'm running a card that's about a decade old and it's fine for every single game I own. I would upgrade, but there's not that much benefit for a not-so-small pile of cash. Not only that, I'd have to upgrade my decade-old PC that can play over 99% of all 2022 game releases. Outside of some tyrant at Microsoft shoving in a requirement for new hardware, there's no reason to upgrade anything.

  • ...with a proper GPU: there's no need to upgrade desktop CPUs nowadays
    • by pjrc ( 134994 )

      If you do streaming or video recording, nVidia's 2000 series (Turing) made significant visual quality improvement in nvenc's real-time video encoding. Apparently nvenc is exactly the same in 3000 and 4000 series cards.

    • by Luckyo ( 1726890 )

      Honestly, this hasn't been true with some of the latest games. I've been CPU capped on my overclocked 6600k on some of the latest Far Cry games with weird hitching because CPU occasionally tilts.

      Most games that are properly optimized run fine though.

    • Mis-read text aside, a CPU greater than 5 years old will be unable to run a supported version of Windows receiving security updates within the next 2 years. And that's before we get into actual CPU-bound scenarios. A good GPU doesn't help Photoshop batch-process images, doesn't help compression operations, doesn't help process layers in Premiere, doesn't help run large Excel sheets or many engineering tools. CPU architectures have changed as well. I/O-bound applications would benefit greatly from high end PC

  • Integrated graphics is good even for gaming, nowadays
    • No, integrated graphics is passable even for gaming. "Good" is a word reserved for something that can actually run games at highest graphical qualities.

  • still waiting (Score:5, Insightful)

    by awwshit ( 6214476 ) on Friday December 30, 2022 @06:23PM (#63169224)

    I'm still waiting for the market to drop further.

  • by Anonymous Coward on Friday December 30, 2022 @06:35PM (#63169240)

    Why is there seemingly nothing between integrated graphics and outrageously power hungry high end cards that occupy multiple slots with enormous fans?

    I can't be the only one who wants a modest single-slot GPU that isn't using an ancient architecture which is even worse than years old integrated graphics.

    • Because for the past 5 years NVIDIA has chased those 40% generational gains and have brute forced it by doing the equivalent of 1% gain for $1. The performance per dollar doesn't increase. NVIDIA just raises the cap on the high end and slots their products in that new space. The old $300 mid-range slot doesn't exist anymore and what they used to call an xx70 series card is now just a label slapped onto what would have been a Titan on earlier generations.

      They are finding out that while you can technically

    • A single slot really limits your power draw: 75 watts max. (Note: cards that cover 2 slots but only plug into one, such as the 1050, have the same power limitation.) At some point you just need more juice. Not for everyday use, when the cards will power down their fans and only consume 9-18 watts (the 3060 series as an example), but when you need maximum performance and 200-245 watts.

      The real price action now is in 3060s. They've dropped in price by about 25%-30%, so expect more price drops in the fut

      • At that point you might as well buy a console.

        This saddens me, as I saw the PC rise from a rich kid's elitist thing, where all the real exclusives were on consoles because of DRM and price, to where the PC outdid them and Steam brought games to the masses.

        Now it is going backwards, and if I had been born 20 years later I would be buying an Xbox and ignoring the PC. Motherboard makers and case makers too are making expensive products that outpace inflation, as it feels like a boat or high end sports car modd

    • by msk ( 6205 )

      You're not the only one. I want a single-slot card for my DVR that can handle hardware encoding of HEVC. AV1 would be really nice to have, but good luck with that.

      The single-slot Nvidia TU117-based PNY T400 series cards can do limited HEVC encoding, but prices are still a bit high.

      My gaming desktop is still fine with a (two-slot) GTX 760 bought in 2014.

    • That's because the integrated graphics have improved greatly. If you aren't doing something where you need one of those high-powered GPUs, then chances are the integrated graphics are plenty good enough.

      This isn't like 15-20 years ago where I'd have to buy some GPU, even if it was a basic $50 card, just to have something that would provide a reasonable desktop experience, or could handle more than one display, or to be able to drive that fancy new LCD with something better than crappy obsolete VGA.

    • Re: (Score:2, Flamebait)

      Why is there seemingly nothing between integrated graphics and outrageously power hungry high end cards that occupy multiple slots with enormous fans?

      Because the iGPUs already have a big heatsink with a powerful fan attached.

  • by ffkom ( 3519199 ) on Friday December 30, 2022 @06:53PM (#63169270)
    ... if they had not been priced so far out of proportion while becoming ever more power hungry.
  • I'm still happily trudging along with a GTX960. When it came time to upgrade a few years back nVidia was in full 'screw the customer' mode. They still are. I will never purchase a new GPU card again. When prices finally settle I'll get a nice used card, probably a 2000 series. I'll never support a company that treated their customers the way nVidia did.
    • by Osgeld ( 1900440 )

      while I agree to a point, it was about this time last year when my wife was working at her computer and said, "do you smell that?" I did; it smelled of electronics burning, and the machine died shortly thereafter

      That was a nice, well made EVGA 980. The high power demand of cards made in the last few generations, plus the ridiculous price of new cards, means there's much more run time on cards before they fail at the weakest link. Hers was just a power FET, which could have been repaired, but it toasted and

      • Every six months I open the case and vacuum my system out to help prevent this. The 960's fan does accumulate a fair amount of dust. When I realized I would need to hold onto the 960 for a while longer, I removed the heat sink and installed new thermal pads. I'm glad I did; the old ones were crumbling. Hopefully regular maintenance will help to avoid "that smell" for a little while longer. :) I'm really surprised the fan has not given up the ghost yet. I have a few friends for whom I've jerry-rigged a few 40mm an
  • Yes the 3000 series sucked donkey dick. Yes everything is overpriced. But wait a few months for the next catastrophe to hit and they will beg us to sell them a 1060 for $1000.
  • When money is tight and the customer base takes rightful offense at being shat upon, why buy yet more expensive toys? Miners bought GPUs as tools, but they're no longer worth owning for that purpose.

    Tool buyers don't need them and toy buyers don't want them.

    • by Dwedit ( 232252 )

      GPUs are used for AI-based algorithms. I've probably done more AI upscaling on my GPU than actual games.

      • All the guys I know who do this stuff just rent on AWS or Azure. A PC is too expensive, and Nvidia has crippled things like double-precision CUDA performance on their consumer-grade, non-datacenter GPUs.

  • by Opportunist ( 166417 ) on Friday December 30, 2022 @08:23PM (#63169426)

    Could it be that people don't want to pay 1000 bucks for a mid-range GPU and get a new mortgage for a high end one?

    Come back down to sane prices and we'll talk about buying one.

    • by Osgeld ( 1900440 )

      but how will you play Quake 2 with ray tracing on the 32x32 pixel textures and dozens of polygons per character? The whole RTX series is a solution looking for a problem, and with NVIDIA at least it's that or the 16 series, which still costs as much as a 980 did in 2015

  • by merde ( 464783 ) on Saturday December 31, 2022 @02:08AM (#63169846)

    My desktop has a 1080 Ti. For everything I do (and play!) it is good enough. Sure, a more modern card would allow better 4K gaming, but for the games I play the Ti is good enough. It is also hybrid cooled and quiet.

    So why would I blow a huge pile of cash on something noisy that makes little difference and burns more power?

  • by jonwil ( 467024 ) on Saturday December 31, 2022 @02:32AM (#63169884)

    I bought an MSI GTX 1660 Super Aero ITX 6G OC (which I am still using, and which is more than good enough for what I need it for) back in January of 2020, before things went to crap. The same card from the same vendor is more expensive right now than it was when I bought it (this is in Australia).

    • A PlayStation will perform much better than that card. This is the opposite of 2016, and crazy to even type. That GPU isn't bad for entry-level games and business graphics, but it's painful as a long-time PC enthusiast.

      AMD really needs to get their act together and stop moving in lockstep with Nvidia on prices. If their crappy 7900 XTX were a 7800 and the 7900 XT were a 7700, costing $499 and $599 respectively, I would say that would be a positive start.

      I think TSMC is simply triple charging because they ha

  • About a year back, I finally gave up on my trusty GTX 970 and got an RTX 3060.
    Thing is, I could've probably eked another few years out of that card, but convinced myself I needed an upgrade.

    With energy prices having gone through the roof in Europe, with more pain to come, and my PSU consuming 300 watts in some games, it seems a total waste of energy to me.

    It's about time we saw some optimisation in this space, both from manufacturers and games developers, because it has reached ridiculous levels.

    People l

  • The last time I purchased a graphics card was for the release of Cyberpunk 2077. I knew that my old 970 wasn't really up to the task and would detract from my play. While I can afford a new card, there isn't anything on my gaming list where I must have it. So I am not going to buy it.
