Nvidia Announces Next-Gen RTX 4090 and RTX 4080 GPUs (theverge.com) 178
Nvidia is officially announcing its RTX 40-series GPUs today. After months of rumors and some recent teasing from Nvidia, the RTX 4090 and RTX 4080 are now both official. The RTX 4090 arrives on October 12th priced at $1,599, with the RTX 4080 priced starting at $899 and available in November. Both are powered by Nvidia's next-gen Ada Lovelace architecture. From a report: The RTX 4090 is the top-end card for the Lovelace generation. It will ship with a massive 24GB of GDDR6X memory. Nvidia claims it's 2-4x faster than the RTX 3090 Ti, and it will consume the same amount of power as that previous generation card. Nvidia recommends a power supply of at least 850 watts based on a PC with a Ryzen 5900X processor. Inside the giant RTX 4090 there are 16,384 CUDA Cores, a base clock of 2.23GHz that boosts up to 2.52GHz, 1,321 Tensor-TFLOPs, 191 RT-TFLOPs, and 83 Shader-TFLOPs.
Nvidia is actually offering the RTX 4080 in two models, one with 12GB of GDDR6X memory and another with 16GB of GDDR6X memory, and Nvidia claims it's 2-4x faster than the existing RTX 3080 Ti. The 12GB model will start at $899 and include 7,680 CUDA Cores, a 2.31GHz base clock that boosts up to 2.61GHz, 639 Tensor-TFLOPs, 92 RT-TFLOPs, and 40 Shader-TFLOPs. The 16GB model of the RTX 4080 isn't just a bump to memory, though. Priced starting at $1,199, it's more powerful with 9,728 CUDA Cores, a base clock of 2.21GHz that boosts up to 2.51GHz, 780 Tensor-TFLOPs, 113 RT-TFLOPs, and 49 Shader-TFLOPs of power. The 12GB RTX 4080 model will require a 700 watt power supply, with the 16GB model needing at least 750 watts. Both RTX 4080 models will launch in November. Further reading: Nvidia Puts AI at Center of Latest GeForce Graphics Card Upgrade.
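As a rough sanity check on those figures (a sketch, not anything Nvidia publishes): peak FP32 shader throughput is commonly estimated as CUDA cores x boost clock x 2 FMA operations per core per cycle, which lines up with the quoted shader-TFLOP numbers.

```python
# Back-of-the-envelope estimate: CUDA cores x boost clock (GHz) x 2 (one fused
# multiply-add per core per cycle), converted to TFLOPs. Figures are from the
# summary above; the formula is the usual rule of thumb, not an Nvidia spec.
def shader_tflops(cuda_cores: int, boost_ghz: float) -> float:
    return cuda_cores * boost_ghz * 2 / 1000

for name, cores, boost in [
    ("RTX 4090",      16384, 2.52),
    ("RTX 4080 16GB",  9728, 2.51),
    ("RTX 4080 12GB",  7680, 2.61),
]:
    print(f"{name}: ~{shader_tflops(cores, boost):.0f} TFLOPs")
# Prints ~83, ~49, and ~40, matching the shader-TFLOP figures quoted above.
```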
Who's gonna make the boards (Score:2)
Re: (Score:2)
Re: (Score:2)
Not EVGA.
Re: (Score:3)
Re:Who's gonna make the boards (Score:5, Informative)
Good thing there's still:
Asus
MSI
Gigabyte
ASRock
Zotac
Nvidia (Founders Editions)
PNY
etc.
Re: (Score:3)
>Slashdot still doesn't support Unicode after it was added to the HTML standard in 1997.
The real question is why don't you support ASCII?
Re:Who's gonna make the boards (Score:4, Informative)
The better question is who the fuck is going to buy them. $1200 for a fucking 4080 and $1600 for the 4090 is an amazing joke.
Re:Who's gonna make the boards (Score:4, Informative)
Tons of gamers.
Re: (Score:3)
Re: (Score:2)
They'll buy them for the same reason they always have... game makers will buy them and target their max settings to take advantage of the top card.
At least they increased the memory in the 'midrange' so Half-Life Alyx won't complain about not having enough video memory. Which highlights the real consumer use case for cards like these: VR. Whether for games or other applications, the ideal in VR is textures/models/surfaces 3D-scanned from reality, or that are photorealistic, with a resolution that can be blown up
Re: (Score:2)
would translate into being able to run all the games on max settings
This is the bit that really seems to have changed, it's a new attitude that I think is detrimental. Max settings aren't supposed to be achievable. Or at least, that's how it used to be. Max settings were for future hardware, not for present day hardware. They add longevity to the game and give you present-day options: no you can't run everything at max on current hardware, but you can choose which limited settings you do want to run at max. Choices.
Nowadays people throw a fit if they can't run everything
Re: (Score:2)
That is backwards and was never how it was.
Software, including games, can always be refreshed to take advantage of new hardware. You don't even know what will come down the line, so what is in your 'max settings'? And how would you test your 'max settings', which depend on hardware that doesn't exist?
Yes, you can tweak individual settings that frankly only a handful of game developers and insiders really understand in the first place, but expecting that normal users (such as gamers) are likely to do this correctly
Re: (Score:2)
... can always be refreshed to take advantage of new hardware ...
No, updating art assets to take advantage of new capabilities can be a major effort.
Re: (Score:2)
Maybe you are thinking max slider rather than max preset. The expectation is that running the latest top-generation hardware means running the top preset (which you should be able to do with recommended hardware, aka the development target). The top preset should select the optimal balance for that hardware, no less tuned than the version released on a console is for that hardware, but it definitely doesn't always max out every available slider.
For small "theys" (Score:2)
They'll buy them for the same reason they always have...
For small "theys". And with the ability to mine Ethereum to subsidize the card when not gaming now gone, today's "they" is probably even smaller than the "they" of past years.
The Steam survey shows that most gamers get x50 and x60 cards. Let's guesstimate about 35% based on a quick look at the data (1000 series and above). x70 looks like about 6%, x80 about 3%.
https://store.steampowered.com... [steampowered.com]
I suspect some of those gamers with x80/x90 actually bought those cards for computer vision, machine learning, etc.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
And let's not even talk about that 4070Ti they decided to call a 4080 12GB.
Re: (Score:2)
No, let's do talk about it. The 1070 Ti had 95% of the shaders/CUDA cores of the 1080, but this "4080" doesn't even have 80% of the cores of the full 16GB model. I'd expect this wouldn't even measure up to the traditional x70 model. It appears that they're also limiting the 12GB model to only 75% of the bus width (192-bit vs 256-bit), and that marginal clock difference is unlikely to do much to make up for either large performance gap.
I guess relative cost position matters a lot more than model numbers, but it would be
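For reference, the ratios being described here, computed from the core counts in the summary above plus the commonly cited 1070 Ti / 1080 figures (2,432 vs 2,560 cores, a figure I'm assuming rather than taking from the article):

```python
# Core-count and bus-width ratios behind the comparison above. The 4080 numbers
# come from the announcement summary; the GTX 1070 Ti / 1080 core counts (2432
# and 2560) are assumed from memory, so treat them as approximate.
ratios = {
    "GTX 1070 Ti vs GTX 1080 cores":   2432 / 2560,   # ~95%
    "RTX 4080 12GB vs 16GB cores":     7680 / 9728,   # ~79%
    "RTX 4080 12GB vs 16GB bus width":  192 / 256,    # 75%
}
for label, ratio in ratios.items():
    print(f"{label}: {ratio:.0%}")
```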
Re: (Score:2)
Or the semi pro / pros will wait for the TItan, assuming one is released this generation.
Re: (Score:2)
I doubt it. Most gamers can't afford to spend more than $500 on a video card, which means that they aren't buying anything better than a 3060Ti right now. These expensive 3080 and 3090 series cards were mostly bought by professionals like video editors and crypto miners.
You can't profitably mine crypto on a GPU anymore, so I'd expect the demand for the 4080 and 4090 cards to be highly limited. You can also get a used 3090 on eBay for $800 right now, which makes an $899 4080 with half as much VRAM a tough sell
Re: (Score:3)
The 4090 is $100 more than the 3090's launch price, so roughly 10% cheaper corrected for inflation. When the 3000s were announced everyone was marvelling at what a magnificent deal they were, and Nvidia sold them faster than they could make them.
Re: (Score:2)
The 3090 was overpriced garbage at the time it came out though [youtu.be]. If anyone bought it, it was either because there was nothing else available due to the miners, or they needed the VRAM for whatever reason.
Re: (Score:2)
Re: (Score:2)
the very source you post cites "extreme pricing for very modest gains" as the principal con for the 3090.
nvidia changed the performance/cost ratio dramatically with the rtx (20) series, for no reason other than hype, because the new tech wasn't even usable back then. it was a total rip off. the 30 series toned it down a few notches but then there was no stock to buy, so most people got them from sharks and resellers at ridiculous prices, and the move basically allowed nvidia to get rid of their existing ove
Re: (Score:2, Insightful)
well you said cost/performance update was typical, i just showed you it was objectively anything but "typical" or consistent. this has nothing to do with high, medium or low range, but with the whole series and marketing strategy.
i don't see how your infantile assumptions about me are relevant, not to mention you know nothing about me, so i will just ignore them. feel free to embarrass yourself to your heart's content, though.
Re: (Score:2)
Just a month or so ago the talk was about production lines having been spun into overdrive during the pandemic and the resulting overstock. I think people were hoping that would actually translate into a break on pricing, and not needing to spend more than the cost of a sane gaming rig on just the GPU needed to run new games on decent settings.
Re: (Score:2)
The 4090 is the "I-gotta-have-the-best" card. The 4080 is almost half the price, and the 4070 is likely to be around half the price of that, and the 4060 even cheaper. All of which should run games on decent settings.
Re: (Score:2)
And halve that price again and set it as the price of the 4090 and you've got something that is still up there but sane. And it is the only one in the lineup likely to run all games on decent (aka max) settings with smooth play for the expected lifetime of the purchase (at least 5 years).
Re: (Score:2)
While we're wishing, I'd like to have an Embraer Phenom. And a pony.
Re: (Score:2)
The 4090 is more expensive than a pony.
Re: (Score:2)
Also the 4080 isn't half the price, it is $1200. Seriously, between these guys and Intel, they've jacked the prices of chips up to the point where each chip costs as much as a decent all-around PC used to cost BEFORE the pricing came down.
You'd think personal computing was a brand new thing again rather than the latest of 40 years of iterative improvements.
Re: (Score:2)
When the 3000s were announced everyone was marvelling at what a magnificent deal they were
Yes, we were: for the 3060, 3070, and 3080. No one was saying that about the 3090. The only reason they sold so many was because of mining and people buying what they could get their hands on. There is a reason the 3090Ti had a $900 price cut last month after the mining collapse. The performance difference between the 3080 (not to mention the 3080ti) and the 3090 is pretty slim and does not justify the massive price difference ($699 vs $1499 at launch).
Re: (Score:3)
All of which is pretty standard for Nvidia. The 4090 is priced like the 3090, which was released first for the early adopters. Then the x080s a little later, and so on down the line. The only difference from what went before is that Nvidia switched from calling the top end x80 ti to x90.
Re:Who's gonna make the boards (Score:5, Insightful)
The better question is who the fuck is going to buy them. $1200 for a fucking 4080 and $1600 for the 4090 is an amazing joke.
Probably everyone who is interested in top quality gaming at high refresh rates and isn't poor. I know it's hard for people to believe, but premium products actually exist on the market, something that was plainly obvious when you saw the previous generation of cards at that price perpetually sold out as well.
Don't worry about Nvidia, they'll sell plenty.
Re: (Score:3)
to each their own, but ultra extreme shadows aren't worth $1600 to me,
You're right, each to their own. You don't want top quality graphics, fine. I personally don't either. Yet there are people out there who spend far more money for far less.
$1600? Cheap hobby. Try buying a boat if you want to actually set a mountain of cash on fire. I personally don't fish, but I know someone who does and their fishing rod cost more than that.
Rich people exist. Nvidia will sell cards.
inventory issues were due to crypto
And yet people bought cards. The point was not the inventory issues, the point was people bought cards. One o
Re: (Score:3)
The same people who are buying 8K displays right now will buy a $1600 GPU to drive them. I've always been glad that there are early adopters to subsidize and "beta-test" technology so that the rest of us can enjoy it without the cost or hassle.
Re: (Score:2)
The better question is who the fuck is going to buy them. $1200 for a fucking 4080 and $1600 for the 4090 is an amazing joke.
Oh, I am getting the 4090 to upgrade my 2080Ti. As long as I can afford it, I want to take advantage of the best technology that is available to me during my lifetime. I never thought I would be able to fly a study-level F/A-18 in my lifetime with the realism that currently exists. It is truly awesome.
I get that it is expensive, but I guess it is worth it to me. I do not own a Bentley
Re: (Score:3)
Scalpers in 1...2...3 (Score:3)
Now that mining is not that profitable anymore after Ethereum 2.0, it's hard to predict how the prices will hold up.
The prices like last time seem "reasonable" but I still kind of doubt it will be launched "locally" in "insert-your-country-here" at a reasonable price.
For example, here in Sweden the 4090 will cost a whopping 21990 SEK (roughly 2020 USD).
Re: (Score:2)
There's more to crypto-currencies than Bitcoin and Ethereum. Still plenty of them can be mined with GPUs.
Re: (Score:2)
Not for nearly the same magnitude of daily profit.
If you'd like to buy cards that require 9,000 days [tomshardware.com] to generate a net profit (including capex, not just opex-positive cash flow), then I definitely want you as a customer. Please send me your information ASAP.
Re: (Score:2)
Re: (Score:2)
Ethereum mining was notoriously VRAM-bound. Most people would actually underclock the GPU because it was just generating more heat for the same hashrate due to the VRAM bottleneck.
Re: (Score:2)
Re: (Score:2)
There's more to crypto-currencies than Bitcoin and Ethereum. Still plenty of them can be mined with GPUs.
Sure, for less than a dollar a day on the most powerful GPUs currently on the market, and that is if you have power for free or cheap enough that it's not putting you in the negative to try to mine them. Don't know about you, but a potential 6+ year break-even on a 4080 16GB is not really worth it for me.
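A minimal break-even sketch along the lines described above; the revenue, wattage, and electricity figures below are illustrative assumptions, not measurements.

```python
# Payback period = card price / (daily mining revenue - daily power cost).
# All inputs here are hypothetical, chosen only to illustrate the argument.
def breakeven_days(card_price, daily_revenue, watts, usd_per_kwh):
    daily_power_cost = watts / 1000 * 24 * usd_per_kwh
    net = daily_revenue - daily_power_cost
    return float("inf") if net <= 0 else card_price / net

print(breakeven_days(1199, 0.60, 300, 0.00))  # ~2000 days (5.5 years) even with free power
print(breakeven_days(1199, 0.60, 300, 0.12))  # inf: electricity alone exceeds the revenue
```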
ML market (Score:2)
There's a big market for machine learning GPUs. I'd pay extra for a 3090/4090 with 48 GB of RAM, but that'd undermine their enterprise level cards, which are like $10,000. So we're stuck with 24 GB. That sounds like a lot, but not when you start running the shit that everyone will want to be running in the next few years: Stable Diffusion and other seriously impressive (and VRAM heavy) image generation algorithms.
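To put the 24 GB figure in perspective, a back-of-the-envelope estimate of weight memory alone; the parameter counts and precision here are illustrative assumptions, and activations at high resolution typically add a lot on top.

```python
# Memory needed just to hold model weights: parameters x bytes per parameter.
# fp16 (2 bytes/param) is assumed; the parameter counts are round illustrative numbers.
def weight_gib(params_billions: float, bytes_per_param: int = 2) -> float:
    return params_billions * 1e9 * bytes_per_param / 2**30

print(f"{weight_gib(0.9):.1f} GiB")   # ~1.7 GiB: a Stable-Diffusion-sized model fits easily
print(f"{weight_gib(12):.1f} GiB")    # ~22 GiB: a 12B-parameter model already strains 24 GB
```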
Re: (Score:3)
I suppose you could always go for a HEDT CPU with a lot of PCIe lanes (with an appropriate mobo of course) and NVLink two or more 3090/4090 together. It certainly won't be as good as a single GPU with twice the VRAM, but for non time critical tasks (where latencies don't matter that much) it'll still do its job pretty well and might give you a
Re: (Score:2)
NVLink two or more 3090/4090 together
At least with the Founders Edition 4090, there will not be NVLink support. I know there was a leak showing NVLink support, but I don't know if that will be an option for AIBs or has been removed entirely. I can imagine it's not an option.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
In Germany the PNY A5000 still costs around ~$2200 while a Zotac 3090 can be had for ~$1100 (prices for new products).
So if you don't need the features that are unique to the A5000 (like ECC for the memory, which might be a requirement if you're doing scientific or engineering work) the 3090 is a decent entry level card into GPU compute tasks here in Germany.
Disregarding the price for the NVLink bridge and likely the more powerful PSU that is required, ev
Re: (Score:2)
In Germany the PNY A5000 still costs around ~$2200 here while a Zotac 3090 can be had for ~$1100
I said MSRP, and specifically mentioned the GPU market has changed the pricing for the gaming cards. Prices have dropped below MSRP. The A5000 is $2000, and the 3090 was $1500. Yes, you can get it cheaper now. But that isn't how initial MSRP works. Having an abundance of 3090's on the market doesn't suddenly mean there is an abundance of A5000's on the market. My point of mentioning pricing was that while there is a higher cost for the RTX cards they're not exactly price gouging those customers. There might
Re: (Score:2)
Not saying it has been done fruitfully yet or that I would know how to do so, but it also seems kind of inevitable at some point.
I realize online gaming with other people has totally taken over for the moment, but I don't care for it.
Re: (Score:2, Informative)
https://kotaku.com/steam-pc-ai... [kotaku.com]
I'm sure more is to come.
Raytracing still not widely used (Score:3)
Better wait for the RDNA3 cards from the Red Camp.
Re: (Score:2)
Re: (Score:2)
The raytracing functionality is still not widely used, but it is paid for.
Raytracing isn't the only thing that hardware does. Many modern games also run DLSS which will make use of that same hardware. Quite a lot of apps are on the market as well such as image processing tools which use that hardware. And that's before you consider the list of RTX games is already quite large.
I have a better question for you: Are you buying hardware now to play games of yesterday, or are you buying cards to play games for the next 2 years? Basically every major title which will be cross ported fr
Oh those prices... (Score:5, Insightful)
Someone forgot to tell NVidia that Eth mining is not going to prop up ridiculous prices for video cards. May be a good time to short Nvidia, it will be a bloodbath of a quarter.
This is pretty standard for 1st run cards (Score:4, Insightful)
I saw an AMD RX 6600 not too long ago for $220 after rebate. Not exactly comparable but it'll play anything you throw at it at 70fps+ / 1080p. RX 7000 series will be here soon. Yeah, ray tracing sucks on the cards, but all that does is make reflections nicer and give you some scaling options. Point is if you just wanna game you have real options.
As for crypto the miners are trying to hang on. It's not going to work. The only coins left to mine with GPUs are shit coins. Basically ponzi schemes and rug pulls. The SEC is actively cracking down on those, throwing people in jail and such. This won't stop the scammers but it will stop the Twitch streamers carrying water for them, preventing the scams from getting off the ground.
I can't blame the miners for trying to hold on. You go from making millions to pennies, it's hard to give up. But eventually they'll need to pay those electricity bills. And there will be fire sales. And the longer they wait the less their old inventory of GPUs will be worth.
Re: (Score:2)
Yeah... you can get a used 3090 on eBay right now for $800, which will likely perform just as well as a 4080 in most games unless you have RTX turned up to the maximum settings. No sane person is going to buy one of these at retail prices at the moment.
Re: (Score:2, Insightful)
Someone forgot to tell NVidia that Eth mining is not going to prop up ridiculous prices for video cards. May be a good time to short Nvidia, it will be a bloodbath of a quarter.
Someone forgot to tell you that rich gamers exist and were happily buying 3090s at far higher prices than the 4090 launch price because they were too impatient.
Put your money where your mouth is, short Nvidia and see how far you get, let's see if this company which has literally no competition in the high end goes bankrupt because you don't think premium high end products should exist.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
https://forum.blackmagicdesign... [blackmagicdesign.com]
too much power (Score:3, Funny)
Re:too much power (Score:4, Funny)
Are you really trying to say that there isn't a difference between 1080p and 4k?
Re: (Score:2)
For many gamers' screens and close viewing distances he is right, but when you throw VR into the mix he is completely off-base. VR is where everything is going, and there is a substantial difference going to 4K, a difference that is still visible beyond 4K.
Re: (Score:2)
For many gamers' screens and close viewing distances he is right, but when you throw VR into the mix he is completely off-base. VR is where everything is going, and there is a substantial difference going to 4K, a difference that is still visible beyond 4K.
VR is the newest version of the 3D TV. Resurrected every few years and shoved down the public's throats, only to fail again.
Re: (Score:2)
I had the same position. Then I decided that VR is the direction tech is going whether I agree or not, and tried it a few months ago with the intention of being up on the latest. I'm glad I did, because I was wrong.
VR is not a gimmick, it has reached good enough and is mind blowing... especially if you are immersing in VR and not just playing the same old games in VR.
Re: (Score:2)
People have been saying this for years.. but it doesn't really correspond with reality. It's kind of getting silly now.
I mean, VR was tried in the 90s and failed because it, like... didn't work. The next time it was tried (significantly, anyway) was 2012 (when the Kickstarter for the original Oculus Rift started). It's mostly just kind of grown since then. There's a variety of products and headsets - but the original Oculus Rift CV1 still works fine. I don't know where or when we're imagining all these
Re: (Score:2)
"Oculus Rift CV1 still works fine"
It functions. It is shit, but it functions. The Quest 2 brings the experience up to par with much improved visuals, and with the right accessories the kind of comfort and hot-swappable batteries you need for extended use.
The biggest issues are that MR/AR is far more practical the minute you step away from games to any sort of productive use and the software for PC connection is the same unstable mess from Oculus... it needs a serious overhaul and real
Re: (Score:2)
Re: (Score:3)
Your eyes must be pretty bad then, and you're certainly not Nvidia's target audience for their flagship card.
Re: (Score:2)
Your eyes must be pretty bad then, and you're certainly not Nvidia's target audience for their flagship card.
I am no ones target audience for their flagship anything. I am a cheap bastard, and proud of it.
I also game in a rather odd way. I PC game on my 65" 4K TV, with a keyboard and mouse while relaxing on the sofa.
That target audience is not what you think ... (Score:2)
Your eyes must be pretty bad then, and you're certainly not Nvidia's target audience for their flagship card.
Let me explain how AAA video game developers react when Nvidia shows up and delivers the new "flagship" cards. The video game developer points out that their customers are only just now getting to the point that they commonly have a midrange card from two generations ago. That the midrange cards of the flagship's generation won't be commonly used for another three years.
But thanks for the cards. We will certainly target them in the game just starting development and that will ship in three years.
Re: (Score:2)
>games don't look particularly better in a 3090 versus let's say a 1070.
1070s don't have any ray tracing cores, so that's a pretty big generational gap. There are some games where ray tracing is like a night and day difference in image quality.
Plus a 3090 is 10x faster than a 1070, so you can either turn the detail up 10x higher or get 10x the frame rate. Either way, it's a huge difference.
That said, after I bought my 3090 I went and played a hundred hours of Europa Universalis IV, so what do I know? It's a
One word: Yolov7 home security systems (Score:2)
I expect the top end model will likely be able to process about 150 frames/second of yolov7 for object detection. That's a great security system when coupled with StalkedByTheState :-)
https://github.com/hcfman/sbts... [github.com]
Might need to buy some more cameras though.
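A quick capacity sketch of what that ~150 frames/second estimate would buy; the per-camera analysis rates are assumptions, not benchmarks.

```python
# If the GPU sustains ~150 detection frames/second in total (the estimate above),
# the number of cameras it can cover depends on how many frames per second you
# actually need to analyze per camera. Both rates here are illustrative.
def max_cameras(total_fps: int = 150, per_camera_fps: int = 10) -> int:
    return total_fps // per_camera_fps

print(max_cameras())          # 15 cameras at 10 analyzed frames/sec each
print(max_cameras(150, 5))    # 30 cameras if 5 analyzed frames/sec per camera is enough
```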
Re: (Score:2)
I'm gonna be honest with you. I'm a reasonably paranoid dude and even my brain didn't immediately leap to "OMFG I can improve my AI driven intruder detection things with this sweet new $1600 GPU!"
Either your paranoia drastically exceeds the threat you are facing or you are facing a threat so overwhelming that intrusion detection isn't going to help. Either way the 20ms this shaves off vs the current generation card for $600 less isn't going to make the difference.
So impressive (Score:2)
You'll be able to play the same games as a $400 PS5, only you can set grass detail at "extremely high" instead of "very high."
Re: (Score:2)
The 4090 will probably be able to play the same games as PS5, except in 8K instead of 4K. Do we need all those Ks? I don't know, but the same people who shelled out for an 8K TV are probably pretty desperate to find some content for it.
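For a sense of what "all those Ks" cost in raw rendering work, here are the pixel counts at the standard resolutions:

```python
# Pixels per frame at the standard resolutions; each step up is roughly 4x the
# pixels the GPU has to shade per frame.
resolutions = {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} million pixels")
# 1080p: 2.1, 4K: 8.3, 8K: 33.2 million pixels per frame
```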
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Everything you said is true, but don't knock this part: "optimized for the hardware they have." Very few PC gamers tune their settings that well for their system. They leave performance on the table on some sliders while turning others up too far, chasing FPS their eye can't even see.
Re: (Score:2)
Re: (Score:2)
Obviously the 4090 is better. I was joking, suggesting they are better in ways that are immaterial to game play and only very slightly better in perceived technical quality.
Re: (Score:2)
Re: (Score:2)
Most people are playing PC games close up on a small screen. The differences look cool until 15 minutes in, when they become normalized by your brain. On the other hand, if you are playing in VR then it makes a real difference.
The COVID lockdown rejuvenated gaming (Score:3)
And arguably numbed people towards pricing previously seen as unacceptable.
I think enough people have "the gaming bug" now so that nVidia should have a decent launch, especially if supply is conveniently just a bit constrained. ;) Old habits die hard, and people can go from "Too expensive, I neither need nor want that!" to "I want!" as they read about people battling to score a card as supplies quickly dry up.
For me, having recently bought an LG C1 OLED to go with an RTX 3080 that is blowing my mind, I have no interest in anticipating 8K panels, and it looks like 8K panels are going to be needed to exploit the kind of performance the highest end of this series is capable of. And there'd also need to be games with assets that could natively take advantage of 8K panels.
Off topic, but supposedly the new Unreal engine allows importation of high resolution photographic assets. So maybe that could be part of the equation that eventually generates a serious demand for these expensive new cards.
I'm hoping for a quicker than usual refresh/process shrink of this new series of cards from nVidia, one that pays a lot of attention to reducing the "watt per frame" ratio.
Re: (Score:2)
I'm always curious: how many watts would an older GPU require if it were made with the latest node process? Wouldn't it be profitable to make more older/smaller GPUs (in relation to the silicon wafer) versus newer/bigger GPUs? For example, surely the RTX part of the newest GPUs makes them much bigger than the older GTX series.
nVidia sort of did some of that with the GTX 1660 (Score:2)
https://www.pcgamer.com/nvidia... [pcgamer.com]
Actually buying one (Score:4, Funny)
I look forward to the act of showing up to a store and actually walking out with one, rather than the nonsense of the last few years of endless shortages and above-MSRP pricing.
Someone dropped the ball... (Score:3)
For $1600, you can buy a computer (Score:2)
The cost of a single component that vastly outstrips the cost of the entire rest of the rig. Plus inflation is raising prices on everything else. So no thank you.
Re: (Score:2)
What, you mean like how upscaling generates fake pixels?
Re: (Score:2)
This is supposed to effectively alleviate both GPU and CPU bottlenecking. The latter could be a pretty huge deal
But since it only works on the frame buffer end of things and is supposed to be completely decoupled from the game itself, I can imagine that
Re: (Score:2)
Interesting. I do know I see a substantial improvement enabling DLSS in games like No Man's Sky on my RTX 3xxx card; it goes from unplayable with constant tearing on low settings to quite playable on ultra in VR with just that one setting changed.
There really aren't any amazing graphics or physics in that game, nor is every element of the world interactive like Minecraft. Sure, there is a big open field, but lots of games, including much older ones like Eve, handle that with grace, so I have no clue why that game
Re: (Score:2)
Really, DLSS isn't that different from the video conversions we've all been looking at for years on DVD and Blu-ray, which use 30fps or 60fps outputs even though the source film was 24fps. The process used there is referred to as a "2:3 pulldown", where you would end up with some mixed frames to even out the framerate.
DLSS is just doing this in real time. I don't know why someone would just get all pissy about it like the GP post is, because it's been happening in cinema transfers for decades.
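For anyone curious, here is the 2:3 cadence described above sketched out; this is just the classic film-to-video pattern, not anything specific to DLSS.

```python
# 2:3 pulldown: 4 film frames (A, B, C, D) become 10 interlaced fields, i.e.
# 5 video frames, turning 24 fps into ~30 fps. Two of the resulting frames mix
# fields from adjacent film frames, which is the "mixed frames" mentioned above.
film = ["A", "B", "C", "D"]
fields = []
for i, frame in enumerate(film):
    fields += [frame] * (2 if i % 2 == 0 else 3)  # alternate 2 and 3 fields per frame
video_frames = [fields[i:i + 2] for i in range(0, len(fields), 2)]
print(video_frames)  # [['A','A'], ['B','B'], ['B','C'], ['C','D'], ['D','D']]
```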
Re: (Score:2)
Image interpolation works pretty well in space without knowing anything about the physics of whatever image it is. It works pretty well in time too. Old fashioned projectors would move the film very quickly and hold it stationary, effectively doing nearest neighbour interpolation in time. The phosphors in a CRT or the latching in an LCD do the same thing.
DLSS improves on that by not only using higher order interpolation, but also being able to recognize and separately interpolate the background and foreground.
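A toy contrast of the two temporal approaches described above, purely illustrative (single-value "frames", nothing DLSS-specific):

```python
import numpy as np

# Two toy one-pixel "frames": frame_a at time t=0, frame_b at time t=1.
frame_a, frame_b = np.array([0.0]), np.array([1.0])

def nearest(t: float) -> np.ndarray:
    # Nearest-neighbour in time: snap to whichever source frame is closer.
    return frame_a if t < 0.5 else frame_b

def linear(t: float) -> np.ndarray:
    # First-order interpolation in time: blend the two frames by temporal position.
    return (1 - t) * frame_a + t * frame_b

print(nearest(0.25), linear(0.25))  # [0.] vs [0.25]
```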