Hardware

Nvidia Announces Next-Gen RTX 4090 and RTX 4080 GPUs (theverge.com) 178

Nvidia is officially announcing its RTX 40-series GPUs today. After months of rumors and some recent teasing from Nvidia, the RTX 4090 and RTX 4080 are now both official. The RTX 4090 arrives on October 12th priced at $1,599, with the RTX 4080 priced starting at $899 and available in November. Both are powered by Nvidia's next-gen Ada Lovelace architecture. From a report: The RTX 4090 is the top-end card for the Lovelace generation. It will ship with a massive 24GB of GDDR6X memory. Nvidia claims it's 2-4x faster than the RTX 3090 Ti, and it will consume the same amount of power as that previous generation card. Nvidia recommends a power supply of at least 850 watts based on a PC with a Ryzen 5900X processor. Inside the giant RTX 4090 there are 16,384 CUDA Cores, a base clock of 2.23GHz that boosts up to 2.52GHz, 1,321 Tensor-TFLOPs, 191 RT-TFLOPs, and 83 Shader-TFLOPs.

Nvidia is actually offering the RTX 4080 in two models, one with 12GB of GDDR6X memory and another with 16GB of GDDR6X memory, and Nvidia claims it's 2-4x faster than the existing RTX 3080 Ti. The 12GB model will start at $899 and include 7,680 CUDA Cores, a 2.31GHz base clock that boosts up to 2.61GHz, 639 Tensor-TFLOPs, 92 RT-TFLOPs, and 40 Shader-TFLOPs. The 16GB model of the RTX 4080 isn't just a bump to memory, though. Priced starting at $1,199, it's more powerful with 9,728 CUDA Cores, a base clock of 2.21GHz that boosts up to 2.51GHz, 780 Tensor-TFLOPs, 113 RT-TFLOPs, and 49 Shader-TFLOPs of power. The 12GB RTX 4080 model will require a 700 watt power supply, with the 16GB model needing at least 750 watts. Both RTX 4080 models will launch in November.
Further reading: Nvidia Puts AI at Center of Latest GeForce Graphics Card Upgrade.
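
The shader-TFLOP figures Nvidia quotes follow from the core counts and boost clocks: FP32 throughput is roughly CUDA cores x boost clock x 2 (one fused multiply-add, i.e. two floating-point operations, per core per clock). A minimal sketch of that arithmetic as a sanity check, using the numbers above:

```python
# Rough sanity check of the quoted shader-TFLOP numbers.
# FP32 TFLOPs ~= CUDA cores * boost clock (GHz) * 2 ops per clock / 1000.
cards = {
    "RTX 4090":      (16_384, 2.52),  # (CUDA cores, boost clock in GHz)
    "RTX 4080 16GB": (9_728,  2.51),
    "RTX 4080 12GB": (7_680,  2.61),
}

for name, (cores, boost_ghz) in cards.items():
    tflops = cores * boost_ghz * 2 / 1000
    print(f"{name}: ~{tflops:.0f} shader-TFLOPs")

# Prints roughly 83, 49, and 40, matching the figures in the announcement.
```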
This discussion has been archived. No new comments can be posted.

    • Probably the same PCB manufacturers that have always made the boards.
    • by mobby_6kl ( 668092 ) on Tuesday September 20, 2022 @12:15PM (#62898427)

      The better question is who the fuck is going to buy them. $1200 for a fucking 4080 and $1600 for the 4090 is an amazing joke.

      • by ArchieBunker ( 132337 ) on Tuesday September 20, 2022 @12:38PM (#62898485)

        Tons of gamers.

        • Not necessarily. While gamers can use the 4090, it, like its predecessor the 3090, is meant for video production. Will gamers with money buy them? Yes. But very few games can take advantage of 24GB of GDDR6X memory. The 4080 might be more for gamers.
          • by Shaitan ( 22585 )

            They'll buy them for the same reason they always have... game makers will buy them and target their max settings to take advantage of the top card.

            At least they increased the memory in the 'midrange' so Half-Life: Alyx won't complain about not having enough video memory. Which highlights the real consumer use case for cards like these: VR. Whether for games or other applications, the ideal in VR is textures/models/surfaces 3D scanned from reality or that are photorealistic, with a resolution that can be blown up

            • would translate into being able to run all the games on max settings

              This is the bit that really seems to have changed, it's a new attitude that I think is detrimental. Max settings aren't supposed to be achievable. Or at least, that's how it used to be. Max settings were for future hardware, not for present day hardware. They add longevity to the game and give you present-day options: no you can't run everything at max on current hardware, but you can choose which limited settings you do want to run at max. Choices.

              Nowadays people throw a fit if they can't run everything

              • by Shaitan ( 22585 )

                That is backwards and was never how it was.

                Software, including games, can always be refreshed to take advantage of new hardware. You don't even know what will come down the line so what is in your 'max settings?' And how would you test your 'max settings' which depend on hardware that doesn't exist?

                Yes, you can tweak individual settings that frankly only a handful of game developers and insiders really understand in the first place but expecting that normal users (such as gamers) are likely to do this corre

                • by drnb ( 2434720 )

                  ... can always be refreshed to take advantage of new hardware ...

                  No, updating art assets to take advantage of new capabilities can be a major effort.

              • by Shaitan ( 22585 )

                Maybe you are thinking max slider rather than max preset. The expectation is that running the latest top-generation hardware means running the top preset (which you should be able to do with recommended hardware, aka the development target). The top preset should select the optimal balance for that hardware, no less tuned than the version released on a console is for that hardware, but it definitely doesn't always max out every available slider.

            • They'll buy them for the same reason they always have...

              For small "theys". And with the ability to mine Ethereum when not gaming to subsidize the card now gone, today's "they" is probably even smaller than the "they" of past years.

              The Steam survey shows that most gamers get x50 and x60 cards. Let's guesstimate about 35% based on a quick look at the data (1000 series and above). x70 looks about 6%, x80 about 3%.
              https://store.steampowered.com... [steampowered.com]

              I suspect some of those gamers with x80/x90 actually bought those cards for computer vision, machine learning, etc a

          • by Xenx ( 2211586 )
            I don't know if 24GB is necessary yet. However, we're definitely to the point where graphics heavy AAA titles can use 12+GB. So, maybe the 16GB 4080 would be enough in terms of VRAM. Aside from just VRAM, the 4090 is packing roughly 2x the TFLOP performance. Now, I know the TFLOP performance isn't a direct indicator of gaming performance. However, it's generally accepted that within the same architecture the relative performance difference should bear out. That is, as long as the rest of your computer can
            • by Xenx ( 2211586 )
              I just want to clarify, I wasn't intending to say the 12GB 4080 wouldn't be enough for people. I'm saying anyone that is playing at higher res, higher settings, RT. Those people are going to want more VRAM.
          • by EvilSS ( 557649 )
            I don't know. The 4090 has almost double the CUDA cores of the 4080 (16384 vs 9728) for an extra $400. I think a lot of people in the market for the 4080 16GB, at $1199, might just opt for the 4090 instead. I suspect it's going to be well out in front of the 4080 when it comes to pure raster capability. Although the real takeaway here is that the 4080 is exorbitantly overpriced, not that the 4090 is some kind of bargain.

            And let's not even talk about that 4070Ti they decided to call a 4080 12GB.
            • No, let's do talk about it. The 1070ti had 95% the shaders/cuda cores of the 1080, but this "4080" doesn't even have 80% the cores of the full 16GB model. I'd expect this wouldn't even measure up to the traditional 70 model. It appears that they're also limiting the 12GB model to only 75% the bus width (192 vs 256bit), and that marginal clock difference is unlikely to do much to make up for either large performance gap.

              I guess relative cost position matters a lot more than model numbers, but it would be

          • Or the semi pro / pros will wait for the TItan, assuming one is released this generation.

        • I doubt it. Most gamers can't afford to spend more than $500 on a video card, which means that they aren't buying anything better than a 3060Ti right now. These expensive 3080 and 3090 series cards were mostly bought by professionals like video editors and crypto miners.

          You can't profitably mine crypto on a GPU anymore, so I'd expect the demand for the 4080 and 4090 cards to be highly limited. You can also get a used 3090 on eBay for $800 right now, which makes an $899 4080 with half as much VRAM a tough sel

      • by ceoyoyo ( 59147 )

        The 4090 is $100 more than the 3090 was released at. So ~10% cheaper, corrected for inflation. When the 3000s were announced everyone was marvelling at what a magnificent deal they were, and Nvidia sold them faster than they could make them.

        • The 3090 was overpriced garbage at the time it came out though [youtu.be]. If anyone bought it, it was either because there was nothing else available due to the miners, or they needed the VRAM for whatever reason.

          • by drnb ( 2434720 )
            Or they mined when not gaming to subsidize the card, which is no longer an option.
        • by Shaitan ( 22585 )

          Just a month or so ago the talk was about having production lines spun up into overdrive due to the pandemic and overstock. I think people were hoping that would actually translate into a break on pricing and not needing more than the cost of a sane gaming rig for just the GPU needed to run new games on decent settings.

          • by ceoyoyo ( 59147 )

            The 4090 is the "I-gotta-have-the-best" card. The 4080 is almost half the price, and the 4070 is likely to be around half the price of that, and the 4060 even cheaper. All of which should run games on decent settings.

            • by Shaitan ( 22585 )

              And halve that price again and set it as the price of the 4090 and you've got something that is still up there but sane. And it is the only one in the lineup likely to run all games on decent (aka max) settings with smooth play for the expected lifetime of the purchase (at least 5 yrs).

            • by Shaitan ( 22585 )

              Also the 4080 isn't half the price, it is $1200. Seriously, between these guys and Intel they've jacked the prices of chips up to the point where each chip costs as much as a decent all-around PC used to cost BEFORE the pricing came down.

              You'd think personal computing was a brand new thing again rather than the latest of 40 years of iterative improvements.

        • by EvilSS ( 557649 )

          When the 3000s were announced everyone was marvelling at what a magnificent deal they were

          Yes, we were: for the 3060, 3070, and 3080. No one was saying that about the 3090. The only reason they sold so many was because of mining and people buying what they could get their hands on. There is a reason the 3090Ti had a $900 price cut last month after the mining collapse. The performance difference between the 3080 (not to mention the 3080ti) and the 3090 is pretty slim and does not justify the massive price difference ($699 vs $1499 at launch).

          • by ceoyoyo ( 59147 )

            All of which is pretty standard for Nvidia. The 4090 is priced like the 3090, which was released first for the early adopters. Then the x080s a little later, and so on down the line. The only difference from what went before is that Nvidia switched from calling the top end x80 ti to x90.

      • by thegarbz ( 1787294 ) on Tuesday September 20, 2022 @02:42PM (#62898943)

        The better question is who the fuck is going to buy them. $1200 for a fucking 4080 and $1600 for the 4090 is an amazing joke.

        Probably everyone who is interested in top quality gaming at high refresh rates and isn't poor. I know it's hard for people to believe, but premium products actually exist on the market, something that was plainly obvious when you saw the previous generation of cards at that price perpetually sold out as well.

        Don't worry about Nvidia, they'll sell plenty.

      • The same people who are buying 8K displays right now will buy a $1600 GPU to drive them. I've always been glad that there are early adopters to subsidize and "beta-test" technology so that the rest of us can enjoy it without the cost or hassle.

      • The better question is who the fuck is going to buy them. $1200 for a fucking 4080 and $1600 for the 4090 is an amazing joke.

        Oh, I am getting the 4090 to upgrade my 2080Ti. As long as I can afford it, I want to take advantage of the best technology that is available to me during my lifetime. I never thought I would be able to fly a study-level FA-18 in my lifetime with the realism that currently exists. It is truly awesome.

        I get that it is expensive, but I guess it is worth it to me. I do not own a Bent

    • Don't worry, there are still plenty of brands that will make Nvidia based video cards. And every one will offer the same chip in 5 different boost frequencies, 3 memory configurations, and 4 cooling fan layouts.
  • by MindPrison ( 864299 ) on Tuesday September 20, 2022 @12:15PM (#62898425) Journal

    Now that mining is not that profitable anymore after ETH 2.0, it's hard to predict how the prices will hold up.

    The prices, like last time, seem "reasonable" but I still kind of doubt it will be launched "locally" in "insert-your-country-here" at a reasonable price.
    For example, here in Sweden the 4090 will cost a whopping 21990 SEK (roughly 2020 USD).

    • There's more to crypto-currencies than Bitcoin and Ethereum. Still plenty of them can be mined with GPUs.

      • by DRJlaw ( 946416 )

        There's more to crypto-currencies than Bitcoin and Ethereum. Still plenty of them can be mined with GPUs.

        Not for nearly the same magnitude of daily profit.

        If you'd like to buy cards that require 9,000 days [tomshardware.com] to generate a net profit (including capex, not just opex-positive cash flow), then I definitely want you as a customer. Please send me your information ASAP.
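
        To make that break-even arithmetic concrete, a minimal sketch is below; the card price, daily revenue, power draw, and electricity rate are placeholder assumptions, not measured figures:

        ```python
        # Hypothetical GPU-mining break-even estimate; every input here is an assumption.
        card_price_usd = 1599.00          # capex: the GPU itself
        daily_revenue_usd = 0.60          # assumed gross mining revenue per day
        power_draw_watts = 300            # assumed average draw while mining
        electricity_usd_per_kwh = 0.15    # assumed power price

        daily_power_cost = power_draw_watts / 1000 * 24 * electricity_usd_per_kwh
        daily_net = daily_revenue_usd - daily_power_cost

        if daily_net <= 0:
            print("Never breaks even: power costs exceed mining revenue.")
        else:
            print(f"~{card_price_usd / daily_net:.0f} days to recoup the card price.")
        ```

        With these placeholder numbers the power bill alone exceeds the revenue; even with free electricity it would still take well over 2,000 days to pay for the card.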

      • Does mining take a lot of VRAM? Offhand I'd have guessed you'd want as many cores as possible but as little VRAM as possible.
        • Ethereum mining was notoriously VRAM-bound. Most people would actually underclock the GPU because it was just generating more heat for the same hashrate due to the VRAM bottleneck.

      • by Xenx ( 2211586 )
        First, Bitcoin isn't efficiently mined with a GPU anyway. As for altcoins that are still GPU-efficient, you'll find that the payouts for mining them dwindled with the flood of existing miners switching over from Ethereum. A quick example I found online: Ethereum Classic, still proof-of-work, went from 70 cents per mined block down to 11 cents per mined block after the changeover. Ravencoin went from $1.77 per block down to $0.05 per block. I'm sure things will stabilize over time. You might still be able to t
      • by EvilSS ( 557649 )

        There's more to crypto-currencies than Bitcoin and Ethereum. Still plenty of them can be mined with GPUs.

        Sure, for less than a dollar a day on the most powerful GPUs currently on the market, and that is if you have power for free or cheap enough that it's not putting you in the negative to try to mine them. Don't know about you, but a potential 6+ year break-even on a 4080 16GB is not really worth it for me.

  • There's a big market for machine learning GPUs. I'd pay extra for a 3090/4090 with 48 GB of RAM, but that'd undermine their enterprise level cards, which are like $10,000. So we're stuck with 24 GB. That sounds like a lot, but not when you start running the shit that everyone will want to be running in the next few years: Stable Diffusion and other seriously impressive (and VRAM heavy) image generation algorithms.
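
    As a rough illustration of why 24 GB disappears quickly, a back-of-the-envelope sketch for whether a model's weights alone fit in VRAM is below (the parameter counts are assumptions for illustration; real usage adds activations, optimizer state, and framework overhead):

    ```python
    import torch  # only needed for the optional VRAM query at the end

    def weights_gib(num_params: float, bytes_per_param: int = 2) -> float:
        """VRAM needed for the weights alone, in GiB (fp16 = 2 bytes/parameter)."""
        return num_params * bytes_per_param / 2**30

    # Assumed parameter counts, purely for illustration.
    for label, params in [("~1B params", 1e9), ("~7B params", 7e9), ("~20B params", 20e9)]:
        print(f"{label}: ~{weights_gib(params):.1f} GiB of fp16 weights")

    if torch.cuda.is_available():
        total_gib = torch.cuda.get_device_properties(0).total_memory / 2**30
        print(f"GPU 0 reports {total_gib:.1f} GiB of VRAM")
    ```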

    • by fazig ( 2909523 )
      The enterprise cards would still have their driver approval and certification going for them, which makes them valuable for professional work in engineering fields.

      I suppose you could always go for a HEDT CPU with a lot of PCIe lanes (with an appropriate mobo of course) and NVLink two or more 3090/4090 together. It certainly won't be as good as a single GPU with twice the VRAM, but for non-time-critical tasks (where latencies don't matter that much) it'll still do its job pretty well and might give you a
      • by Xenx ( 2211586 )

        NVLink two or more 3090/4090 together

        At least with the Founders Edition 4090, there will not be NVLink support. I know there was a leak showing NVLink support, but I don't know if that will be an option for AIBs or has been removed entirely. I can imagine it's not an option.

        • by fazig ( 2909523 )
          That would be super scummy from NVIDIA. But I'm not surprised as they've been cutting back on that for some time.
          • by Xenx ( 2211586 )
            In reality, gaming grew past SLI. These cards are geared towards gamers. I understand why people that want to use them for compute, or extreme benchmarking, care, but that isn't the market they're designing these cards for. Further, while the RTX "Quadro" compute cards do cost a premium, they're not prohibitively expensive compared to their gaming counterpart when talking about MSRP. 3090 vs A5000 is a $500 markup at MSRP, but the A5000 does outperform the 3090 in some areas. The ups and downs of the GPU mar
            • by fazig ( 2909523 )
              Market prices highly depend on your location.
              In Germany the PNY A5000 still costs around ~$2200 here while a Zotac 3090 can be had for ~$1100 (prices for new products).
              So if you don't need the features that are unique to the A5000 (like ECC for the memory, which might be a requirement if you're doing scientific or engineering work) the 3090 is a decent entry level card into GPU compute tasks here in Germany.
              Disregarding the price for the NVLink bridge and likely the more powerful PSU that is required, ev
              • by Xenx ( 2211586 )

                In Germany the PNY A5000 still costs around ~$2200 here while a Zotac 3090 can be had for ~$1100

                I said MSRP, and specifically mentioned the GPU market has changed the pricing for the gaming cards. Prices have dropped below MSRP. The A5000 is $2000, and the 3090 was $1500. Yes, you can get it cheaper now. But that isn't how initial MSRP works. Having an abundance of 3090s on the market doesn't suddenly mean there is an abundance of A5000s on the market. My point of mentioning pricing was that while there is a higher cost for the RTX cards, they're not exactly price gouging those customers. There might

    • I am also wondering at what point (if any) gaming will start to make use of deep neural nets (inference at least) in-game. Perhaps for procedurally-generated architectures and textures, or for NPC's you can converse with in a way that is meaningful to the gameplay?

      Not saying it has been done fruitfully yet or that I would know how to do so, but it also seems kind of inevitable at some point.

      I realize online gaming with other people has totally taken over for the moment, but I don't care for it.

  • by SciCom Luke ( 2739317 ) on Tuesday September 20, 2022 @12:23PM (#62898447)
    The raytracing functionality is still not widely used, but it is paid for.
    Better wait for the RDNA3 cards from the Red Camp.
    • by etash ( 1907284 )
      their ray tracing is and will be WAY behind nvidia's for a long time...
    • The raytracing functionality is still not widely used, but it is paid for.

      Raytracing isn't the only thing that hardware does. Many modern games also run DLSS which will make use of that same hardware. Quite a lot of apps are on the market as well such as image processing tools which use that hardware. And that's before you consider the list of RTX games is already quite large.

      I have a better question for you: Are you buying hardware now to play games of yesterday, or are you buying cards to play games for the next 2 years? Basically every major title which will be cross ported fr

  • Oh those prices... (Score:5, Insightful)

    by achacha ( 139424 ) on Tuesday September 20, 2022 @12:27PM (#62898459) Homepage

    Someone forgot to tell NVidia that Eth mining is not going to prop up ridiculous prices for video cards. May be a good time to short Nvidia, it will be a bloodbath of a quarter.

    • by rsilvergun ( 571051 ) on Tuesday September 20, 2022 @01:58PM (#62898803)
      they sell them to early adopters with deep pockets and YouTubers with patreons paying for them.

      I saw an AMD RX 6600 not too long ago for $220 after rebate. Not exactly comparable but it'll play anything you throw at it at 70fps+ / 1080p. RX 7000 series will be here soon. Yeah, ray tracing sucks on the cards, but all that does is make reflections nicer and give you some scaling options. Point is if you just wanna game you have real options.

      As for crypto the miners are trying to hang on. It's not going to work. The only coins left to mine with GPUs are shit coins. Basically ponzi schemes and rug pulls. The SEC is actively cracking down on those, throwing people in jail and such. This won't stop the scammers but it will stop the Twitch streamers carrying water for them, preventing the scams from getting off the ground.

      I can't blame the miners for trying to hold on. When you go from making millions to pennies it's hard to give up. But eventually they'll need to pay those electricity bills. And there'll be fire sales. And the longer they wait, the less their old inventory of GPUs will be worth.
    • by leonbev ( 111395 )

      Yeah... you can get a used 3090 on eBay right now for $800, which will likely perform just as well as a 4080 in most games unless you have RTX turned up to the maximum settings. No sane person is going to buy one of these at retail prices at the moment.

    • Re: (Score:2, Insightful)

      by thegarbz ( 1787294 )

      Someone forgot to tell NVidia that Eth mining is not going to prop up ridiculous prices for video cards. May be a good time to short Nvidia, it will be a bloodbath of a quarter.

      Someone forgot to tell you that rich gamers exist and were happily buying 3090s at far higher prices than the 4090 launch price because they were too impatient.

      Put your money where your mouth is, short Nvidia and see how far you get, let's see if this company which has literally no competition in the high end goes bankrupt because you don't think premium high end products should exist.

      • Premium high end products from Nvidia used to max out at ~250 USD. The 8800GT was only 5% short of the 8800GTX in performance at 200-250 USD, making it the effective bang/buck top end card.
      • by drnb ( 2434720 )
        It's not just rich gamers, it's professionals doing computer vision or machine learning that might get an x090. Gaming is a secondary consideration that coincidentally benefits.
    • Someone mentioned it previously but I will repeat it: video editing software such as DaVinci Resolve is largely dependent on video cards with large memory.
      https://forum.blackmagicdesign... [blackmagicdesign.com]
  • by thoper ( 838719 ) on Tuesday September 20, 2022 @12:44PM (#62898513)
    While I'm sure there will be games that look better using these cards, we reached diminishing returns in graphics a couple generations ago. Games don't look particularly better on a 3090 versus, let's say, a 1070. Sure, the 3090 will be capable at 4K and over 60fps, but the 1070 will do the same at 1080p.
    • by MachineShedFred ( 621896 ) on Tuesday September 20, 2022 @12:58PM (#62898569) Journal

      Are you really trying to say that there isn't a difference between 1080p and 4k?

      • by Shaitan ( 22585 )

        For many gamers' screens and close viewing distances he is right, but when you throw VR into the mix he is completely off-base. VR is where everything is going, and there is a substantial difference going to 4K, and that difference is still visible beyond 4K.

          • For many gamers' screens and close viewing distances he is right, but when you throw VR into the mix he is completely off-base. VR is where everything is going, and there is a substantial difference going to 4K, and that difference is still visible beyond 4K.

          VR is the newest version of the 3d-tv. Resurrected every few years and shoved down the public's throats, only to fail again.

          • by Shaitan ( 22585 )

            I had the same position. Then I decided that VR is the direction tech is going whether I agree or not, and tried it a few months ago with the intention of being up on the latest. I'm glad I did because I was wrong.

            VR is not a gimmick, it has reached good enough and is mind blowing... especially if you are immersing in VR and not just playing the same old games in VR.

          • by JMZero ( 449047 )

            People have been saying this for years... but it doesn't really correspond with reality. It's kind of getting silly now.

            I mean, VR was tried in the 90s and failed because it, like... didn't work. The next time it was tried (significantly, anyway) was 2012 (when the Kickstarter for the original Oculus Rift started). It's mostly just kind of grown since then. There's a variety of products and headsets - but the original Oculus Rift CV1 still works fine. I don't know where or when we're imagining all these

            • by Shaitan ( 22585 )

              "Oculus Rift CV1 still works fine"

              It functions. It is shit but it functions. The Quest 2 (with the best accessories) brings the experience to par with much improved visuals and with the right accessories the kind of comfort and hot swappable batteries you need for extended use.

              The biggest issues are that MR/AR is far more practical the minute you step away from games to any sort of productive use and the software for PC connection is the same unstable mess from Oculus... it needs a serious overhaul and real

      • by Xenx ( 2211586 )
        I'm not necessarily trying to support their view, but monitor size and distance do play into it. There are cases where the difference between 1080p and 4K is marginal. But there are plenty more where it is definitely an improvement. Further, there are a lot of games designed to be more accessible, so they won't see as much improvement at higher resolutions. I think their viewpoint is likely more affected by circumstance.
    • >Games don't look particularly better on a 3090 versus, let's say, a 1070.

      1070s don't have any ray tracing cores, so that's a pretty big generational gap. There are some games where ray tracing is like a night and day difference in image quality.

      Plus a 3090 is 10x faster than a 1070, so you can either turn the detail up 10x higher or get 10x the frame rate. Either way, it's a huge difference.

      That said, after I bought my 3090 I went and played a hundred hours of Europa Universalis IV, so what do I know? It's a

  • I expect the top end model will likely be able to process about 150 frames/second of yolov7 for object detection. That's a great security system when coupled with StalkedByTheState :-)

    https://github.com/hcfman/sbts... [github.com]

    Might need to buy some more cameras though.
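
    A minimal sketch of the camera budget that a ~150 frames/second detector implies; the per-camera sampling rate is an assumption, since a security pipeline rarely needs to analyze every frame from every camera:

    ```python
    # Assumed figures: ~150 detector inferences/second total, each camera sampled at 5 fps.
    detector_fps_budget = 150
    per_camera_sample_fps = 5   # assumption: plenty for walking-speed intruders

    max_cameras = detector_fps_budget // per_camera_sample_fps
    print(f"One GPU could cover roughly {max_cameras} cameras "
          f"at {per_camera_sample_fps} analyzed frames/s each.")
    # -> roughly 30 cameras under these assumptions
    ```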

    • by Shaitan ( 22585 )

      I'm gonna be honest with you. I'm a reasonably paranoid dude and even my brain didn't immediately leap to "OMFG I can improve my AI driven intruder detection things with this sweet new $1600 GPU!"

      Either your paranoia drastically exceeds the threat you are facing or you are facing a threat so overwhelming that intrusion detection isn't going to help. Either way the 20ms this shaves off vs the current generation card for $600 less isn't going to make the difference.

  • You'll be able to play the same games as a $400 PS5, only you can set grass detail at "extremely high" instead of "very high."

    • The 4090 will probably be able to play the same games as PS5, except in 8K instead of 4K. Do we need all those Ks? I don't know, but the same people who shelled out for an 8K TV are probably pretty desperate to find some content for it.

    • by Xenx ( 2211586 )
      Not enough context to tell if you're only joking (I can tell it's a joke), or if you think a PS5 can at all keep up with a high end gaming computer. I'm not knocking consoles, especially current gen, as they have good cost to performance. They just don't compete on pure performance.
      • They compete exactly with a 3800X Ryzen and somewhere between an RX6800 and RTX3080, which are all in the high end and more than you *need* to run games (a 3600 or 10400F and RX6600 will all do 60+fps in Cyberpunk 2077).
        • by Xenx ( 2211586 )
          The PS5 is not that close to a 6800/3080. It's a low/mid-end "current" gen GPU. The PS5, and its games, are just optimized for the hardware they have. That means you will get better results, with potential shortcuts, from the lesser hardware. Aside from that, there is a huge difference between barely running a game and running it at full settings above 60fps. It's not even remotely the same as the joke about the grass detail that I replied to.
          • by Shaitan ( 22585 )

            Everything you said is true but don't knock this part: "optimized for the hardware they have." Very few PC gamers tune their settings that well for their system. They leave performance on the table on some sliders while turning others up too far, and chase FPS their eye can't even see.

            • by Xenx ( 2211586 )
              Oh, I'm not knocking what the PS5 can do. I'm just saying with a larger budget, a computer can still do a lot better. It's not a negligible difference.
      • Obviously the 4090 is better. I was joking, suggesting they are better in ways that are immaterial to game play and only very slightly better in perceived technical quality.

        • by Xenx ( 2211586 )
          My point is that it's more than just a slight graphical improvement. Sure, it doesn't directly affect gameplay itself as much. But it does affect the gameplay experience. There will of course be diminishing returns, but there is a big enough difference between a top end computer and the PS5 that it cannot just be boiled down to a simple slight improvement.
          • by Shaitan ( 22585 )

            Most people are playing PC games close up on a small screen. The differences look cool until 15 minutes in, when they become normalized by your brain. On the other hand, if you are playing in VR then it makes a real difference.

  • by Babel-17 ( 1087541 ) on Tuesday September 20, 2022 @01:02PM (#62898577)

    And arguably numbed people towards pricing previously seen as unacceptable.

    I think enough people have "the gaming bug" now so that nVidia should have a decent launch, especially if supply is conveniently just a bit constrained. ;) Old habits die hard, and people can go from "Too expensive, I neither need nor want that!" to "I want!" as they read about people battling to score a card as supplies quickly dry up.

    For me, having recently bought an LG C1 OLED to go with an RTX 3080, which is blowing my mind, I have no interest in anticipating 8K panels, and it looks like 8K panels are going to be needed to exploit the kind of performance the highest end of this series is capable of. And there'd need to be games with assets that could natively take advantage of 8K panels.

    Off topic, but supposedly the new Unreal engine allows importation of high resolution photographic assets. So maybe that could be part of the equation that eventually generates a serious demand for these expensive new cards.

    I'm hoping for a quicker than usual refresh/process shrink of this new series of cards from nVidia, one that puts a lot of attention to reducing the "watt per frame" ratio.

  • by satsuke ( 263225 ) on Tuesday September 20, 2022 @01:22PM (#62898677)

    I look forward to the act of showing up to a store and actually walking out with one, rather than the nonsense of the last few years: endless shortages and above-MSRP pricing.

  • by hyperar ( 3992287 ) on Tuesday September 20, 2022 @01:23PM (#62898685)
    Who forgot to tell Jensen that the pandemic and mining are over? Who is going to pay that for a GPU? Let me know when the 4080 16GB is $600.
  • The cost of a single component that vastly outstrips the cost of the entire rest of the rig. Plus inflation is raising prices on everything else. So no thank you.
