Desktop GPU Sales Hit 20-Year Low (tomshardware.com) 167
Demand for graphics cards increased significantly during the pandemic, as some people spent more time at home playing games while others tried to mine Ethereum for some cash. But now that the world has reopened and Ethereum mining on GPUs is dead, demand for desktop discrete GPUs has dropped dramatically. From a report: In fact, shipments of discrete graphics cards hit a ~20-year low in Q3 2022, according to data from Jon Peddie Research. The industry shipped around 6.9 million standalone graphics boards for desktop PCs -- including the best graphics cards for gaming -- and a similar number of discrete GPUs for notebooks in the third quarter.
In total, AMD, Intel, and Nvidia shipped around 14 million standalone graphics processors for desktops and laptops, down 42% year-over-year based on data from JPR. Meanwhile, shipments of integrated GPUs totaled around 61.5 million units in Q3 2022. In fact, 6.9 million desktop discrete add-in-boards (AIBs) is the lowest number of graphics cards shipped since at least Q3 2005 and, keeping in mind sales of standalone AIBs were strong in the early 2000s as integrated GPUs were not good enough back then, it is safe to say that in Q3 2022 shipments of desktop graphics boards hit at least a 20-year low.
Nvidia pricing doesn't help (Score:5, Interesting)
Nvidia's pricing doesn't help. A new 4080 costs $1,200, and good luck finding a 4090 for less than $2K.
I'm really bummed AMD's latest offerings weren't more impressive, as Nvidia desperately needs real competition in the high-end market to help bring down prices.
Re:Nvidia pricing doesn't help (Score:5, Funny)
It's not that bad on pricing. You just need to stop thinking in terms of performance per currency and start thinking in terms of performance per kilo.
My old benchmark, the GTX 970, which was the most popular card of its generation and held that spot long into the next generation, only weighed about 0.5 kilos and cost 320 EUR. A reference RTX 4090 is around 2,000 EUR, but it's close to five kilos in weight. So it's actually only 400 EUR per kilo, compared to the GTX 970's 640 per kilo.
A true bargain!
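Spelling out the joke's arithmetic as a quick sketch (the weights and prices are the poster's rough figures, not official specs):

    # Tongue-in-cheek "performance per kilo" metric, using the numbers above.
    cards = {
        "GTX 970":  {"price_eur": 320,  "weight_kg": 0.5},
        "RTX 4090": {"price_eur": 2000, "weight_kg": 5.0},
    }
    for name, c in cards.items():
        print(f"{name}: {c['price_eur'] / c['weight_kg']:.0f} EUR/kg")
    # GTX 970: 640 EUR/kg; RTX 4090: 400 EUR/kg. A true bargain indeed.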
Re: (Score:3)
As a gamer, my thinking is value versus alternatives. I love PC gaming and have been a PC gamer since I bought and built my own PC for the first time at 16. But twelve hundred dollars for a new card when I'm building a new rig is an awful lot of money when an entire gaming console sells for half that. Given that gamers are some of the biggest drivers of at least high-end GPU sales, it's not surprising GPU sales are falling with such poor value on offer relative to competing gaming platforms.
Re:Nvidia pricing doesn't help (Score:5, Insightful)
If you want my take on PC gaming right now, games seem to be in a pretty bad place. The Ethereum shitshow lasted about two years, during which GPUs were almost unobtainium for average gamers.
Game developers took notice, and a lot of games don't really require a good GPU nowadays. My GTX 970 lasted me until very recently, and I don't think there was a game other than CP2077 that it couldn't run at least decently on low/medium 1080p, usually with some trouble holding 60 fps, but not much more.
I upgraded after one of the fans started giving out and I couldn't be bothered to buy a new fan and basically rebuild the card around it. Instead I went to the used GPU market and got a half-year-old 2060 for 140 EUR. It has two and a half years of warranty left on it. I can now run most games on high 1080p at above 60 fps, with minimums dipping just below.
Moral of the story: if you REALLY need to upgrade right now, buy used. Ignore the new GPU market. There's nothing there right now that's worth the price unless you just don't care about price to performance.
Re: Nvidia pricing doesn't help (Score:3)
To me, it looks like gaming is going down the same road as cars. A few diehards spend loads of money on state-of-the-art hardware to get that extra Hz or fps, bragging about some number that is higher than the competition's.
Re: (Score:2)
Building a gaming PC is actually much easier than scoring a console, though admittedly more expensive.
Re: (Score:2)
I know you're joking btw, I just thought I'd spell out my thinking on these new cards having poor pricing.
Re: (Score:3)
I'm not having problems playing modern games on my 2080 Ti. I like new shiny techy things and I can afford them, but non-gouge pricing on 3090s only came around about the time the 40xx cards did. So the 3000 series isn't new and shiny anymore, and the 4000 series is in full-on price-gouging mode. I haven't splashed out yet.
My next discrete GPU purchase is probably for the Intel card for AV1 hardware encoding, which is something I need. My games will continue to work fine on the 2080ti for a while and the 'new shiny thing' psy
Re:Nvidia pricing doesn't help (Score:5, Interesting)
It depends what your target is. For example, in my circle it's all about VRAM on NVidia cards for CUDA, for AI art and other AI applications. So until recently, it was only a two-card race: the 3060 for 12GB if you were on a budget, or the 3090 for 24GB if you had tons of money. The new 40xx series options add in new possibilities so it's not as narrow of a choice (though I expect most to stick with 30xx series for now).
I myself recently got (but haven't yet set up... fingers crossed) a used 3090 formerly-crypto-mining card with no functional HDMI ports, for $550, alongside a new motherboard+ram+cpu so that it'll (hopefully) all fit and work together. Given that new 3090s are >=$1300 right now, it could either be a steal, or a big waste of money. Gonna try to run both the 3090 and 3060 at once if they'll fit, though I'll probably need to underclock them.
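Since VRAM is the whole ballgame for these workloads, here is a minimal sketch (assuming a CUDA-enabled PyTorch build is installed) of checking what each installed card actually offers before committing a model to it:

    import torch  # assumes a CUDA-enabled PyTorch build

    if torch.cuda.is_available():
        for i in range(torch.cuda.device_count()):
            props = torch.cuda.get_device_properties(i)
            # total_memory is reported in bytes; AI art workloads live or die on this
            print(f"GPU {i}: {props.name}, {props.total_memory / 2**30:.1f} GiB VRAM")
    else:
        print("No CUDA device found")

On a 3090 + 3060 combo like the one described, this doubles as a quick sanity check that both cards are visible after the rebuild.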
Re: (Score:3)
I'd love to be able to use GPUs in my day job, because I do a bucketload of data analysis (entropy analysis for cryptography, mostly), but those algorithms don't map well to GPUs. So I make do with high-core-count CPUs and the occasional FPGA. The FPGAs take a bit more up-front investment to design the circuits, but they process a lot of data quickly once that's done.
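For flavor, the textbook version of the byte-level entropy measurement being described, as a hedged sketch (the real CPU/FPGA pipelines would be doing something far heavier than this):

    import math
    import os
    from collections import Counter

    def shannon_entropy(data: bytes) -> float:
        """Shannon entropy in bits per byte (8.0 = looks perfectly random)."""
        counts = Counter(data)
        n = len(data)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    # Output of a decent RNG should score very close to 8.0 bits/byte.
    print(f"{shannon_entropy(os.urandom(1 << 20)):.4f} bits/byte")

This toy version would parallelize fine; the poster's point is that their real analyses apparently don't map well onto GPU execution models.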
Re: (Score:2)
My next discrete GPU purchase is probably for the Intel card for AV1 hardware encoding, which is something I need.
If it's not happening immediately, might as well wait for the RTX 40 prices to come down, then you won't have a GPU that's only useful for encode.
Re: (Score:2)
Meet Cyberpunk 2077. Turn on ray tracing, connect a 1440p or 4K monitor, and get back to me if you still think the 2080 Ti is awesome.
Hell, I own a 3080 Ti and it melts and has coil whine at 1440p when I turn ray tracing up to max settings.
Re: (Score:2)
If I lash my 3090 to a pole, it doubles nicely as a sledgehammer for driving in fence posts.
Re: (Score:2)
Honestly, I'd watch the hell out of that youtube video.
Re: (Score:2)
Wow, Jensen was right, the more kilos you buy, the more you save!
Re: (Score:2)
Indeed. Why buy one 4090, when you can buy two and save even more!
Just SLI them or something.
Re: (Score:3)
My old benchmark, the GTX 970, which was the most popular card of its generation and held that spot long into the next generation, only weighed about 0.5 kilos and cost 320 EUR. A reference RTX 4090 is around 2,000 EUR, but it's close to five kilos in weight. So it's actually only 400 EUR per kilo, compared to the GTX 970's 640 per kilo.
OK, so you're joking, and I know that, but - I think you're on to something.
GPU performance has pretty much plateaued. So they're trying to increase it now by making the cards ever larger and drawing ever more power. Current GPUs are freaking huge! Some of them require special supports because they're so large that the case and PCIe connector simply cannot support their weight. I mean, a 5 kg GPU, just think about that. That's legitimately the weight of a small dumbbell. And these GPUs were literally melting thei
Re: Nvidia pricing doesn't help (Score:2)
Spoken like a former drug dealer, or a street-corner pharmacist. Only keys matter; forget metric or Imperial.
Re: (Score:2)
The 4090 is 2,186 g (2.186 kg) in weight. Close to 5 pounds, not 5 kilos.
Still, that's a LOT of weight.
But this is all good news - I'm in the market for a pair of decent new video cards, but I can wait - no rush. I overpaid early this year during the shortages, so I'll be quite happy to see what the market is like in 3-6 months. Kind of "making it up."
Decent specs, decent power consumption. Because power consumption is a thing - you'll end up paying way more for power to run a 4090 over its lifetime tha
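The lifetime-power point is easy to put numbers on. A hedged sketch, where the electricity price and usage pattern are assumptions rather than figures from the thread:

    # Rough lifetime electricity cost of a high-draw card (all inputs assumed).
    board_power_w = 450    # RTX 4090 rated board power
    hours_per_day = 4      # assumed gaming time
    years = 4              # assumed ownership period
    eur_per_kwh = 0.40     # assumed European electricity price

    kwh = board_power_w / 1000 * hours_per_day * 365 * years
    print(f"{kwh:.0f} kWh -> {kwh * eur_per_kwh:.0f} EUR over {years} years")
    # ~2628 kWh -> ~1051 EUR, i.e. a mid-range GPU's worth of electricity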
Re:Nvidia pricing doesn't help (Score:5, Insightful)
Meanwhile, for normal people, $1200 buys a full PC, with a monitor, that’s capable of playing literally any PC game at halfway decent settings.
Let’s set aside the scientists that use graphics cards for actual research. That’s a different story, but most of those types don’t buy the gaming-grade cards. Nvidia has an entirely separate GPU line for that kind of work.
There are also a few thousand people who crunch video for a living. There, a $1,200 graphics card is a business purchase. Other than that, this discussion is purely about gaming.
Re: (Score:2)
Meanwhile, for normal people, $1200 buys a full PC, with a monitor, that’s capable of playing literally any PC game at halfway decent settings.
That's poor value given what $1,200 gets you at today's GPU prices alone, as that $1,200 PC won't perform any better than consoles costing half as much. Plus, that $1,200 gaming PC is likely to feel out of date sooner than a brand-new console.
I know PC gaming has always been more expensive than console gaming but with these GPU prices nowadays it really feels much more so.
Re:Nvidia pricing doesn't help (Score:4, Interesting)
I really doubt that, because my $500 PC from four years ago is pretty close to the consoles as it is. I put a GTX 1070 into an eBay Optiplex and it ran Cyberpunk (at release, not now that it's been debugged) at an actual 1440p. Which is... what the consoles are doing. The only difference is they can upscale with FSR, and there are some light RT effects thrown in.
Re: (Score:2)
I put a GTX 1070 into an eBay Optiplex and it ran Cyberpunk (at release, not now that it's been debugged) at an actual 1440p
At what, 2 FPS? I have the same card and even at 1080p (which is my monitor resolution, still) most games can't make 60 fps even at middle settings.
Re: (Score:2)
What? No, 30-40 fps, medium-high settings: https://i.imgur.com/lQThNgj.jp... [imgur.com]
Not amazing but playable and comparable to the consoles when they actually render at that resolution. Cyberpunk runs into a CPU bottleneck I think, so I don't remember seeing over 45fps even at 1080p, but something newer than fucking Ivy Bridge would crush it no problem, e.g.: https://youtu.be/qkuR8ZK5TVQ?t... [youtu.be]
Obviously I'm not saying it's a 1:1 substitute, you're missing RT and DLSS stuff but this was just to illustrate that it's no
Re:Nvidia pricing doesn't help (Score:4, Informative)
at a frame rate so high that it's literally meaningless because it's already twice as fast as the reflexes of the fastest person on the planet.
High frame rate is not meaningless according to the following articles:
Humans perceive flicker artefacts at 500 Hz
Humans perceive a stable average intensity image without flicker artifacts when a television or monitor updates at a sufficiently fast rate. This rate, known as the critical flicker fusion rate, has been studied for both spatially uniform lights and spatio-temporal displays. These studies have included both stabilized and unstabilized retinal images, and report the maximum observable rate as 50-90 Hz. A separate line of research has reported that fast eye movements known as saccades allow simple modulated LEDs to be observed at very high rates. Here we show that humans perceive visual flicker artifacts at rates over 500 Hz when a display includes high-frequency spatial edges. This rate is many times higher than previously reported. As a result, modern display designs which use complex spatio-temporal coding need to update much faster than conventional TVs, which traditionally presented a simple sequence of natural images.
https://www.ncbi.nlm.nih.gov/p... [nih.gov]
Human Eye Frames Per Second, 220 FPS
The USAF, in testing their pilots for visual response time, used a simple test to see if the pilots could distinguish small changes in light. In their experiment, a picture of an aircraft was flashed on a screen in a dark room at 1/220th of a second. Pilots were consistently able to "see" the afterimage as well as identify the aircraft. This simple and specific situation not only proves the ability to perceive 1 image within 1/220 of a second, but the ability to interpret higher FPS.
http://amo.net/NT/02-21-01FPS.... [amo.net]
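To put the cited rates into frame-time terms (straight arithmetic, no claims beyond the numbers above):

    # Frame budget at the refresh/perception rates discussed above.
    for hz in (60, 90, 220, 500):
        print(f"{hz:>4} Hz -> {1000 / hz:.2f} ms per frame")
    #   60 Hz -> 16.67 ms (classic flicker-fusion territory)
    #   90 Hz -> 11.11 ms (upper end of the 50-90 Hz studies)
    #  220 Hz ->  4.55 ms (the USAF 1/220 s exposure)
    #  500 Hz ->  2.00 ms (edge flicker still perceptible per the paper)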
Re: (Score:3)
> at a frame rate so high that it's literally meaningless because it's already twice as fast as the reflexes of the fastest person on the planet.
This is actually misleading and ignores some important context, especially for multiplayer:
1. You want a high framerate so that when the worst case hits, with all the particles / smoke / transparency effects (i.e. overdraw), your framerate is still high at 120+ FPS. Some reviews [youtube.com] will show the 1% lows in relation to the max FPS.
2. There is a HUGE difference be
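For anyone unfamiliar with the "1% lows" metric from point 1, here is a hedged sketch of one common way reviewers derive it from captured frame times (exact methodology varies site to site):

    def one_percent_low_fps(frame_times_ms: list[float]) -> float:
        """Average FPS over the slowest 1% of frames (one common definition)."""
        worst = sorted(frame_times_ms, reverse=True)   # largest times = worst frames
        n = max(1, len(worst) // 100)
        return 1000 / (sum(worst[:n]) / n)

    # A run that mostly renders at ~120 FPS but hitches for 10 of 1000 frames:
    times = [8.3] * 990 + [40.0] * 10
    print(f"1% low: {one_percent_low_fps(times):.0f} FPS")   # ~25 FPS despite the high average

This is exactly why the overdraw worst case matters: averages hide the hitches, while 1% lows expose them.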
Re: (Score:2)
Bro, if you ever own a 144 Hz or 165 Hz panel with vsync turned on, or at least adaptive refresh rates, you will experience a smoothness of motion that is like no other.
I got that on a 144 Hz panel in 2015 and can't go back, similar to an SSD. Yes, your eyes can see it when you spin or turn, once you hit 120 fps with 120 Hz perfectly in sync and no 1% lows. Most people still have 60 Hz panels or shitty GPUs that only do 70 fps, so they never experience it.
I got 1440p over 4K just for this effect. My go
Re: Nvidia pricing doesn't help (Score:2)
The other reason to do this is to avoid upgrading in the near future. If you've got a new card with good ray tracing support and enough VRAM, you'll be good to go for several years.
Re:Nvidia pricing doesn't help (Score:4, Insightful)
Those are not really intended for the average gamer. Those are ridiculous cards, either for crypto mining or the leet gamers who have something to prove.
I don't buy new GFX cards; I get something a couple of years old and the price is much more reasonable. Games are at the point where graphics requirements are not going up, especially if you're sticking to an HD monitor. You do not need 120 fps on a 4K monitor; save that money and put it into something useful, like more RAM.
Re: (Score:2)
Except a console won't play the games I want, and I will use that computer for other stuff. So it's basically an above average computer, but with an added $150 card. And if you're comparing to the popular consoles, like xbox 360, a moderate gfx card will outperform it.
Re: (Score:2)
Re: (Score:2)
5 years old? Practically new! (I meant to say xbox One, but some confusion on Microsoft version numbering made me think that 360 > 1.) I heard the latest one (xbox series uber?) was hard to get a hold of. I assume console players are like PC players and make sure their purchase lasts 10 years before replacing? Otherwise if they upgrade every time there's a new model then that definitely costs more than a PC.
Re: (Score:2)
Now show us how well your console runs Cura and outputs to my 3D printer. No?
DaVinci Resolve? No?
Sound editing software? No?
The Windows-only software the company I work for makes? No?
(and they paid half the cost of the PC, btw, so I only paid $600)
Can your console create the fucking PowerPoint slides that play a goddamn video clip of me using my company's software so some mid-level manager doesn't have to actually interact with the software to see how it works because it would fucking break their finger
Re: (Score:2, Insightful)
AMD's newest offerings are fine for their price point; right now the problem seems to be quality control, with heating problems because paste and pads aren't being
Re: (Score:2)
A somewhat disingenuous argument. You're omitting the fact that some of the upfront cost of the 4090 is defrayed by reducing your heating bills (god help you come summer though.)
Re: (Score:2)
Price was what killed it for me too. I've been wanting to upgrade my 2015 Hackintosh desktop with some more RAM, a better CPU, and a newer video card. Honestly, I just wanted to be able to play RDR2 on it. I wasn't even planning on getting crazy. ~$350 for a 3050 even after the release of the 40x0s was a bridge too far.
I bought an ASUS TUF 15 i7/3070 for $999 on Black Friday. The SSD is too small to hold numerous AAA games at once, but it is upgradeable. I dislike the small keys, coming from an M1 Ai
Re: (Score:2)
I'm really bummed AMD's latest offerings weren't more impressive, as Nvidia desperately needs real competition in the high-end market to help bring down prices.
Within ±10% of Nvidia's performance in normal things that use stream processors, for slightly cheaper, seems a good deal to me. Nvidia doesn't need to reduce their prices no matter the competition, though, because their fixed-function hardware and the CUDA vendor lock-in have many sub-industries by the balls.
Nvidia was once the go-to for decent Linux driver support; that crown has been in AMD's hands for the last ten years. The only real reason to buy Nvidia is to use the software that uses nv
Re: (Score:2)
Nvidia's pricing doesn't help. A new 4080 costs $1,200, and good luck finding a 4090 for less than $2K.
I'm really bummed AMD's latest offerings weren't more impressive, as Nvidia desperately needs real competition in the high-end market to help bring down prices.
This.
NVIDIA/AMD: we've jacked up the prices on our cards, why aren't they selling?
Gamers: Wait... are you talking to us? I couldn't hear you over the sound of you fellating the crypto-bros.
TBF, I'm glad I got a 3070 FE, because Nvidia insisted that _SOME_ cards had to be sold to gamers (allegedly). The problem, however, is that there are no cheap cards. In 2016 I could buy a 970 for under £300. A 3070 is still upwards of £550 (I paid £480 for my FE), and that is for the non-Ti versio
Re: (Score:2)
Not to mention the huge power draw. Most people are going to need to build a new rig to use these cards.
Apples and Oranges (Score:2)
How can they compare units of shipped discrete GPUs vs. integrated GPUs? It doesn't make any sense. Every desktop needs a GPU, but a discrete GPU is just an option. It's like comparing the number of laptops sold worldwide vs. discrete GPUs.
Re: (Score:2)
Anecdotal, but I got my father a new laptop recently, and we got a great deal on an RTX 3060 / Intel 12th-gen i5 machine. Not a bad one, either: it had a great IPS screen, a good-quality keyboard, and a solid cooling system.
It cost less than a prebuilt desktop with similar specs (I could probably build one from parts and some bargain hunting for a comparable cost). And those sales are everywhere now.
Re: (Score:2)
Note that the laptop GPUs are not comparable to their desktop namesakes. A laptop 3060 is not the same as a desktop 3060. For example, the laptop 3060s only have 6GB of VRAM, vs. 12GB on the desktops.
Re: (Score:2)
Core and memory frequencies and bus width are also usually significantly lower.
Re:Apples and Oranges (Score:4, Informative)
Before I get into the details, I should mention that I fully agree with you that, as a rule, laptop GPUs are cut-down variants of desktop GPUs across the board.
But this is actually somewhat wrong for the 3060 specifically.
The desktop 3060 has fewer CUDA cores than the mobile 3060 (double-check it if you want; it's actually true). Specifically, the desktop variant uses a cut-down version of the smaller 30-series die, whereas the mobile variant uses the full smaller 30-series die. But the desktop version runs at a higher power limit and a slightly higher top frequency. The desktop version also has a wider memory bus, but its memory runs at a significantly lower frequency. The desktop part does have twice the memory, but it's pretty well understood that having 12 gigs of VRAM on a 3060, as opposed to 10 on the original 3080, was a nod to miners, who could use memory in multiples of 6 GB (you needed about 5 GB of VRAM for a single instance of Ethereum mining), rather than to gamers, who can't really use it in most cases. The same applies to memory bus width.
My point is that the desktop 3060 isn't actually a GPU aimed at gamers but at miners: lots of memory on a cut-down die. The laptop 3060 is actually the GPU aimed at gamers: less and slower memory, but the full-sized die.
In this light, 6 GB of VRAM on the mobile variant makes way more sense for a card at this performance level for gaming.
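The multiples-of-6GB logic in concrete terms, as a sketch (the ~5 GB-per-instance figure is the poster's; the Ethereum DAG actually grew over time):

    # Why miners wanted VRAM in multiples of ~6 GB, per the poster's figures.
    gb_per_instance = 6    # ~5 GB DAG plus working headroom, rounded up
    for vram_gb in (6, 8, 12, 24):
        instances = vram_gb // gb_per_instance
        print(f"{vram_gb:>2} GB VRAM -> {instances} mining instance(s)")
    # 6 GB -> 1;  8 GB -> 1 (2 GB stranded);  12 GB -> 2;  24 GB -> 4

Under that model, 12 GB on a desktop 3060 is exactly two instances with nothing stranded, while a gamer at 1080p rarely touches it.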
Re: (Score:3)
As someone who buys graphics cards specifically for AI art (a small but rapidly growing subset of buyers), it's all about VRAM. I don't care about whether memory is in multiples of 6GB. I care whether I can run Dreambooth, and if I can, at what resolutions and how much do I have to offload to the CPU. What sort of resolutions can I generate in img2img. Whether I can run the X4 upscaler. Whether I'll be able to handle 1024x1024 models when they come out. On and on.
Performance is of course great - everyon
Re: (Score:2)
Re: (Score:2)
Run six screens like a pro for whatever your 2D needs are.
You're still going to be fine with 6GB VRAM.
Re: (Score:2)
Run six screens like a pro for whatever your 2D needs are.
You're still going to be fine with 6GB VRAM.
Not with MSFS. Each screen needs 4 to 6 gigs on the video card.
Re: Apples and Oranges (Score:2)
Re: (Score:2)
The difference is usually around 40-50% here in Northern Europe.
But we live in a really weird period.
is it actually a 20-year low? (Score:5, Interesting)
...or is it just the cry of despair caused and exaggerated by the fall of all the crypto ponzi schemes?
Re: is it actually a 20-year low? (Score:2)
Re: (Score:3)
They probably had record sales to crypto miners last year, so this year was probably a huge shock to them.
For consumers, this was a no-brainer. Buy a used 3080 on eBay for $700 that was priced at twice that a year ago, or spend $1,200 on a 4080 that is only marginally faster in most game titles.
good (Score:3, Interesting)
Re: (Score:2)
I just wish it didn't come at the expense of consumer choice. By all means, NVIDIA can go f*** themselves for what they are charging, but I would still like a new high-end graphics card, and no, AMD doesn't meet my needs (CUDA, AI, ray tracing).
Re: (Score:2)
Dependence on one company's fixed-function hardware (which is what CUDA uses) isn't necessarily a great thing.
AMD cards can do AI things, but PyTorch/TensorFlow not having either OpenCL or Vulkan-compute backends severely limits things.
Strangely enough, NCNN (from Tencent) seems to be one of the most trouble-free libraries I've seen, supporting any backend you choose. I've used it on AMD cards for AI video frame interpolation and some other uses, and it works a treat.
Capability-wise, AMD cards seem to fare better on
News at 10 (Score:4, Informative)
People don't appreciate being ripped off.
I'd LOVE a new GPU (Score:4, Insightful)
Expected (Score:2)
It's almost as if treating your customers like crap, jumping on the crypto craziness, and selling direct to crypto miners while ignoring your traditional customer base has an impact. As a result, people are making do without GPUs by using phones, tablets, and game consoles. The next-gen cards are still crazy expensive; even the new, not-yet-shipping RTX 4070 (the 4th-fastest Nvidia card) is $800 and has the same bus width as an RTX 3060 and *LESS* than a 3060 Ti.
Sure the new cards are faster, ev
Yea, I have a 1070 ti (Score:2)
and it was 60% cheaper than the 4070 Ti, the equivalent tier, is now. I've been waiting to build a new machine anyway, but I'm in no rush, especially with Nvidia's absolutely tone-deaf bullshit.
GPUs reached peak already (Score:3)
They deserve to lose. We could have had capable GPUs for $499, but they decided to play with dogecoins instead, so I will stick with my 2070 for another generation.
What about profits though? (Score:2)
Talking about sales is meaningful for those of us who wish prices were lower, but the real question is whether this is working for nvidia (and to a similar extent, AMD, although they have a less attractive GPU lineup as usual) or it's going to come back to bite 'em. It seems to me like AMD should be selling at cost if necessary to pick up market share, but maybe their driver situation is still dire enough that they don't have the confidence for it.
Re: (Score:2)
but maybe their driver situation is still dire enough that they don't have the confidence for it.
On Linux, AMD has been the go-to for nearly a decade due to the quality of its drivers and things "just working". I've heard the Windows situation is fairly abysmal, but this being Slashdot, that really shouldn't bother too many of us.
Re: (Score:2)
On Linux, AMD has been the go-to for nearly a decade due to the quality of its drivers and things "just working".
Unless you have a brand new card that just came out, and then for some reason the Linux drivers still lag behind the Windows ones. So the Windows users get fucked hardest by never having working drivers, but the Linux users get fucked by never being able to buy the latest card and expect it to work. So AMD has gone from never having good drivers, on any platform, to eventually having good drivers on one platform.
Maybe they should hire some competent driver programmers? It's only been since they were a whole
Price yourself out of a market. (Score:2)
I have always been a mid-range bargain hunter and refused to pay over about $120 US for a new mid-range card. I finally broke that rule and splurged $185 on a mid-range card two years ago. I am not seeing a reason to upgrade at all. But then, I don't play new whiz-bang AAA game sequels.
Pent-up demand has been keeping prices high (Score:2)
It sure would be nice if I got picked up by the cops for something and all I had to do was pay a token amount of money and have zero criminal record as a result. Must be nice
too effing expensive, no benefit (Score:2)
I'm running a card that's about a decade old and it's fine for every single game I own. I would upgrade, but there's not that much benefit for a not-so-small pile of cash. Not only that, I'd have to upgrade my decade-old PC, which can play over 99% of all 2022 game releases. Short of some tyrant at Microsoft shoving in a requirement for new hardware, there's no reason to upgrade anything.
Any CPU from last 10 years is "good enough"... (Score:2)
Re: (Score:2)
If you do streaming or video recording, Nvidia's 2000 series (Turing) made a significant visual-quality improvement in NVENC's real-time video encoding. Apparently NVENC is exactly the same in the 3000 and 4000 series cards.
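A hedged sketch of driving NVENC's HEVC encoder via FFmpeg from Python (assumes an ffmpeg build with hevc_nvenc compiled in and a Turing-or-newer card; the flags shown are illustrative, not a tuned recipe):

    import subprocess

    # Offload HEVC encoding to the GPU's NVENC block instead of the CPU.
    subprocess.run([
        "ffmpeg", "-i", "input.mp4",
        "-c:v", "hevc_nvenc",   # NVENC hardware HEVC encoder
        "-preset", "p5",        # NVENC presets run p1 (fastest) .. p7 (best quality)
        "output.mkv",
    ], check=True)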
Misread the article (Score:2)
Re: (Score:2)
Honestly, this hasn't been true with some of the latest games. I've been CPU-capped on my overclocked 6600K in some of the latest Far Cry games, with weird hitching because the CPU occasionally tilts.
Most games that are properly optimized run fine though.
Re: Any CPU from last 10 years is "good enough"... (Score:2)
Re: (Score:3)
Misread text aside, a CPU more than 5 years old will be unable to run a supported version of Windows receiving security updates within the next 2 years. And that's before we get into actual CPU-bound scenarios. A good GPU doesn't help Photoshop batch-process images, doesn't help compression operations, doesn't help process layers in Premiere, and doesn't help run large Excel sheets or many engineering tools. CPU architectures have changed as well. I/O-bound applications would benefit greatly from high-end PC
Re: Any CPU from last 10 years is "good enough"... (Score:3)
Ryzen effect? (Score:2)
Re: (Score:2)
No, integrated graphics are passable even for gaming. "Good" is a word reserved for something that can actually run games at the highest graphical quality settings.
Re: Ryzen effect? (Score:2)
still waiting (Score:5, Insightful)
I'm still waiting for the market to drop further.
How about selling cards people want? (Score:4, Insightful)
Why is there seemingly nothing between integrated graphics and outrageously power hungry high end cards that occupy multiple slots with enormous fans?
I can't be the only one who wants a modest single-slot GPU that isn't built on an ancient architecture that's even worse than years-old integrated graphics.
Re: (Score:3)
Because for the past 5 years NVIDIA has chased those 40% generational gains and has brute-forced them by doing the equivalent of a 1% gain for $1. The performance per dollar doesn't increase; NVIDIA just raises the cap on the high end and slots their products into that new space. The old $300 mid-range slot doesn't exist anymore, and what they used to call an xx70-series card is now just a label slapped onto what would have been a Titan in earlier generations.
They are finding out that while you can technically
Re: (Score:2)
Single slot really limits your power draw: 75 watts max from the PCIe slot. (Note: the cards that cover 2 slots but only plug into one, such as the 1050, have the same power limitation.) At some point you just need more juice. Not for everyday use, when the cards will power down their fans and consume only 9-18 watts (the 3060 series, as an example), but when you need maximum performance and 200-245 watts.
The real price action now is in 3060s. They've dropped in price by about 25%-30%, so expect more price drops in the fut
Re: (Score:2)
At that point you might as well buy a console.
This saddens me, as I watched the PC rise from a rich kid's elitist thing, where all the real exclusives were on consoles because of DRM and price, to where the PC outdid them and Steam brought games to the masses.
Now it is going backwards, and if I had been born 20 years later I would be buying an Xbox and ignoring the PC. Motherboard makers and case makers, too, are making expensive products that outdo the pace of inflation; it feels like a boat or a high-end sports car modd
Re: (Score:3)
You're not the only one. I want a single-slot card for my DVR that can handle hardware encoding of HEVC. AV1 would be really nice to have, but good luck with that.
The single-slot Nvidia TU117-based PNY T400 series cards can do limited HEVC encoding, but prices are still a bit high.
My gaming desktop is still fine with a (two-slot) GTX 760 bought in 2014.
Re: (Score:2)
That's because integrated graphics have improved greatly. If you aren't doing something where you need one of those high-powered GPUs, then chances are the integrated graphics are plenty good enough.
This isn't like 15-20 years ago, when I'd have to buy some GPU, even if it was a basic $50 card, just to have something that would provide a reasonable desktop experience, or could handle more than one display, or could drive that fancy new LCD with something better than crappy, obsolete VGA.
Re: (Score:2, Flamebait)
Why is there seemingly nothing between integrated graphics and outrageously power hungry high end cards that occupy multiple slots with enormous fans?
Because the iGPUs already have a big heatsink with a powerful fan attached.
I would have bought 2 to 3 GPUs in recent years... (Score:5, Informative)
To hell with nVidia (Score:2)
Re: (Score:2)
While I agree to a point, it was about this time last year when my wife was working at her computer and said, "Do you smell that?" I did; it smelled of electronics burning, and the machine died shortly thereafter.
That was a nice, well-made EVGA 980. Thanks to the high power demand of the cards made in the last few generations, and the ridiculous price of new cards, there's much more run time on cards before they fail at the weakest link. Hers was just a power FET, which could have been repaired, but it toasted and
Re: (Score:3)
Do not worry, my brothers! (Score:2)
Gaming PCs are toys not tools (Score:2)
When money is tight and the customer base takes rightful offense at being shat upon, why buy yet more expensive toys? Miners bought GPUs as tools, but they're no longer worth owning for that purpose.
Tool buyers don't need them and toy buyers don't want them.
Re: (Score:3)
GPUs are used for AI-based algorithms. I've probably done more AI upscaling on my GPU than actual games.
Re: (Score:2)
All the guys I know who do this stuff just rent on AWS or Azure. A PC is too expensive, and Nvidia has crippled things like double precision in CUDA on their consumer-grade, non-data-center GPUs.
What could possibly be the reason? (Score:4, Insightful)
Could it be that people don't want to pay 1,000 bucks for a mid-range GPU and take out a new mortgage for a high-end one?
Come back down to sane prices and we'll talk about buying one.
Re: (Score:2)
But how will you play Quake 2 with ray tracing on 32x32-pixel textures and dozens of polygons per character? The whole RTX series is a solution looking for a problem, and with NVIDIA at least it's that or the 16 series, which still costs as much as a 980 did in 2015.
An oldish card is good enough (Score:3, Interesting)
My desktop has a 1080 Ti. For everything I do (and play!) it is good enough. Sure, a more modern card would allow better 4K gaming, but for the games I play the Ti is good enough. It is also hybrid-cooled and quiet.
So why would I blow a huge pile of cash on something noisy that makes little difference and burns more power?
I bought before the shortage hit... (Score:3)
I bought an MSI GTX 1660 Super Aero ITX 6G OC (which I am still using, and which is more than good enough for what I need it for) back in January 2020, before things went to crap. The same card from the same vendor is more expensive right now than it was when I bought it (this is in Australia).
Re: (Score:2)
A PlayStation will perform much better than that card. This is so opposite of 2016, and crazy to even type. That GPU isn't bad for entry-level games and business graphics, but it's painful as a long-time PC enthusiast.
AMD really needs to get their act together and stop moving in lockstep with Nvidia on prices. If their crappy 7900 XTX were a 7800 and the 7900 XT a 7700, costing $599 and $499 respectively, I would say that would be a positive start.
I think TSMC is simply triple-charging because they ha
Absurdly overpowered ... (Score:2)
About a year back, I finally gave up on my trusty GTX 970 and got an RTX 3060.
Thing is, I could've probably eked another few years out of that card, but I convinced myself I needed an upgrade.
With energy prices having gone through the roof in Europe, with more pain to come, and my PSU consuming 300 watts in some games, it seems a total waste of energy to me.
It's about time we saw some optimisation in this space, both from manufacturers and game developers, because it has reached ridiculous levels.
People l
Purchase motivation (Score:2)