Hardware

Nvidia Unveils GeForce RTX 40 SUPER Series (nvidia.com)

Nvidia on Monday announced its new GeForce RTX 40 SUPER series GPUs, promising significant performance gains for gaming, creative workflows and artificial intelligence capabilities over previous models. The new lineup includes the GeForce RTX 4080 SUPER, RTX 4070 Ti SUPER and RTX 4070 SUPER GPUs. Nvidia said the chips deliver up to 52 shader teraflops, 121 ray tracing teraflops and 836 AI teraflops.

The top-of-the-line RTX 4080 SUPER model will go on sale starting Jan. 31 priced from $999, while the RTX 4070 Ti SUPER and RTX 4070 SUPER will hit shelves on Jan. 24 and Jan. 17 respectively, priced at $799 and $599. The company said the new GPUs can accelerate ray tracing visuals in games by up to 4 times with Nvidia's Deep Learning Super Sampling (DLSS) technology. DLSS uses AI to boost frame rates in games while maintaining image quality.

Nvidia said the RTX 4080 SUPER is 1.4 times faster at 4K gaming than the RTX 3080 Ti, its previous top gaming GPU, without DLSS enabled. With DLSS Frame Generation switched on, the performance gap widens to 2 times. The new GPU lineup also promises significant gains in AI workloads often used by creative professionals, such as video generation and image upscaling, Nvidia said.
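
To put the claimed multipliers in concrete terms, here is a toy Python calculation applying the stated 1.4x (no DLSS) and 2x (Frame Generation) figures to a hypothetical baseline frame rate; the 60 fps baseline is an assumption for illustration, not a benchmark result.

    # Toy calculation: the claimed speedups applied to an assumed baseline.
    baseline_fps = 60.0        # hypothetical RTX 3080 Ti result at 4K (assumption)
    raster_speedup = 1.4       # claimed gain without DLSS
    framegen_speedup = 2.0     # claimed gain with DLSS Frame Generation

    print(f"Without DLSS: {baseline_fps * raster_speedup:.0f} fps")             # ~84 fps
    print(f"With Frame Generation: {baseline_fps * framegen_speedup:.0f} fps")  # ~120 fps
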
  • by Stolovaya ( 1019922 ) <skingiii&gmail,com> on Monday January 08, 2024 @02:06PM (#64141385)

    I think I'm more looking forward to video cards that don't take up half my damn case.

    • For me, I'd like to see 16GB become more standard. Thanks to low-quality PS5 ports that dump all their textures into RAM, it's becoming mandatory. They just don't bother to program texture streaming....

      It doesn't affect FPS benchmarks, but it can kill frame pacing (see the sketch below).
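
      A minimal sketch of the FPS-versus-frame-pacing point: the two hypothetical frame-time traces below (made-up numbers, not measurements) have nearly the same average frame rate, but the second one's spikes show up as stutter in the 1% lows.

          # Two hypothetical frame-time traces (milliseconds) with ~the same average.
          smooth = [16.7] * 100                     # evenly paced, ~60 fps
          spiky = [12.0] * 90 + [58.6] * 10         # same average, periodic spikes

          def avg_fps(frame_times_ms):
              return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

          def one_percent_low_fps(frame_times_ms):
              worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
              return 1000.0 * len(worst) / sum(worst)

          for name, trace in (("smooth", smooth), ("spiky", spiky)):
              print(f"{name}: avg {avg_fps(trace):.1f} fps, "
                    f"1% low {one_percent_low_fps(trace):.1f} fps")
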
      • by edwdig ( 47888 )

        It's quite the opposite - one of the key design goals of the PS5 is an extremely high-bandwidth, low-latency bus between the SSD and the video memory. You've got 1 frame of latency on the texture load, so the games are designed around constantly streaming textures. PC architecture doesn't have the direct path from the SSD to video memory and can't stream textures as fast. Developers have to compensate for it by preloading more.

        • As I understood it, ReBAR is supposed to help with that, but reading up on it now, I suppose it doesn't?

          ReBAR allows the CPU to access the entire frame buffer memory of a compatible discrete GPU all at once, rather than in smaller chunks.
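
          As a rough back-of-the-envelope on why the storage-to-VRAM path matters for streaming: at 60 fps, each GB/s of sustained bandwidth only buys so many megabytes per frame. The bandwidth figures below are assumed round numbers for illustration, not official platform specs.

              # Texture data available per frame at a given sustained bandwidth.
              # Bandwidth numbers are illustrative assumptions, not official specs.
              FPS = 60.0

              def mb_per_frame(bandwidth_gb_per_s):
                  return bandwidth_gb_per_s * 1024.0 / FPS

              for label, gbps in (("direct SSD-to-VRAM path (console-style)", 5.5),
                                  ("SSD-to-RAM-to-VRAM path (typical PC)", 3.0)):
                  print(f"{label}: ~{mb_per_frame(gbps):.0f} MB per frame at {FPS:.0f} fps")
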

    • And don't generate enough heat to warm up a room, though in the winter that is sort of nice.

    • Then get one of these. They are tiny. The RTX 4090 is one of the smallest cards made in years. Sure, you need to water cool them, or you just need to accept the fact that with incredibly high-end capability comes incredibly high-end heat dissipation.

    • It's not the case, it's the wallet...
    • The 4070 is still offered in reasonable form factors. Mine is quite a bit smaller than the ancient 1070 it replaced. It's plenty powerful unless you insist on 4K with all quality settings maxed out on the most intensive titles. At 1440p, there's nothing that will take it under 60fps.

    • by edwdig ( 47888 )

      Look at the 4070 Founder's Edition. It's the same size as cards from a few generations ago. Way smaller than the 4080/4090.

      Some of the 3rd-party 4070 models are larger with bigger cooling systems, hence the Founder's Edition recommendation.

    • by eepok ( 545733 )

      Then you'd be sticking with Nvidia cards for a while. AMD's cards, while providing better value for money when just considering retail prices, really chew up electricity to get to their performance levels and thus come with larger heatsinks than their similarly performing Nvidia counterparts.

      Nvidia: More power efficient (saves money long-term) and thus smaller heatsinks. Also comes with better upscaling/frame generation abilities.
      AMD: Cheaper retail prices, nicer software suite.
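
      To make the "saves money long-term" point concrete, here is a rough estimate of what a difference in power draw costs over a year; the wattage gap, hours of use, and electricity price are all assumptions for illustration.

          # Rough yearly cost of an assumed 75 W difference in GPU power draw.
          watt_difference = 75        # assumed extra draw of the less efficient card
          hours_per_day = 3           # assumed gaming time
          price_per_kwh = 0.30        # assumed electricity price, USD

          kwh_per_year = watt_difference / 1000.0 * hours_per_day * 365
          print(f"~{kwh_per_year:.0f} kWh/year, roughly ${kwh_per_year * price_per_kwh:.0f}/year")
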

    • I gotta admit I'm still kinda nervous. I had an RX 580 and it was great for 3 years or so... and then it started black screening after a mobo/CPU upgrade.

      I ended up replacing it with a used GTX 1080 last year but as I understand it the problem was Windows installing the default drivers over the AMD ones.

      I had tried 3 or 4 ways to stop that around the time I replaced it, so I never did find out for sure if any of them worked. And it's *very* likely this is a bug in Windows and not the AMD drivers.
      • https://www.guru3d.com/downloa... [guru3d.com]

        I doubt it was a bug in Windows. More like PEBKAC.

        As a long-time PC tech, I've seen it all. Computers just don't crash unless something has failed or the drivers/OS were misconfigured.

        I bet I can fix it. Or at least tell you what is wrong. The event viewer is your friend.

        • My Ryzen laptop's first act when I booted it was the graphics driver crashing.

          That was AS SHIPPED.

          AMD is just shit at drivers. Only the Linux OSS driver is any good, and it doesn't support the newest cards in a timely fashion.

          This is why I got a 4060 for my desktop. I wanted 16GB and it was the only cost effective option from someone who knows how to write drivers.

          That same laptop has been rock solid running Linux, BTW.

        • Dude, I did a completely fresh install of Windows 10 at one point. It wasn't PEBKAC.

          I don't think AMD's drivers are bad in the sense that they're poorly written. I think the problem is that AMD writes their drivers exactly to spec and Windows 10 doesn't even follow its own specs. Also, I know that I've caught Nvidia fixing bugs in games for game developers, which is something AMD has a tough time doing because they don't have anywhere near the profit margins of Nvidia, even today.

          In a perfect world where
      • it started black screening after a mobo/CPU upgrade

        And of course you got the latest driver for it?

        • Fresh install of Windows 10 with the latest drivers. Just before upgrading to the GTX 1080, I had found another hack that was supposed to prevent Windows from installing its default driver. All told, there were four separate places you needed to hack into in order to tell Microsoft to stop installing updated drivers.

          Again, I really do think that this whole mess wasn't AMD's fault, but the only one who was going to fix it was AMD. Ironically, the problem was most likely caused because AMD insists on keeping th
  • HEVC 422 (Score:5, Insightful)

    by slaker ( 53818 ) on Monday January 08, 2024 @02:37PM (#64141491)

    Nearly all consumer cameras made in the last five or so years that are capable of producing RAW video output do so with 10-bit HEVC 4:2:2 chroma subsampling. Pro stuff, like Arri or RED cine cameras, uses 4:4:4 chroma subsampling, and I definitely understand why consumer GPUs don't handle that. If you're buying $75k cameras, needing a $5k accelerator card isn't a big deal.

    Intel and nearly all contemporary ARM SoCs, including Apple Silicon, support HEVC 4:2:2 in hardware. Neither AMD nor nVidia offers 10-bit HEVC 4:2:2. They support 8-bit HEVC 4:2:0, but that doesn't exactly help me when I'm trying to keep my videos in a format I can easily color grade.

    It's hilarious to me that a $1000 RTXwhatever is less valuable for one of my main needs for a GPU than a $100 Arc A380.
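
    For readers unfamiliar with the 4:2:0 / 4:2:2 / 4:4:4 distinction, the sketch below computes the raw, uncompressed data per 10-bit 4K frame for each scheme, just to show how much extra chroma information 4:2:2 and 4:4:4 carry; it says nothing about any particular encoder.

        # Raw data per 4K frame at 10-bit depth for each chroma subsampling scheme.
        WIDTH, HEIGHT, BITS = 3840, 2160, 10

        # Total samples per pixel: luma plane plus two (subsampled) chroma planes.
        samples_per_pixel = {"4:2:0": 1.5, "4:2:2": 2.0, "4:4:4": 3.0}

        for scheme, factor in samples_per_pixel.items():
            megabytes = WIDTH * HEIGHT * factor * BITS / 8 / 1e6
            print(f"{scheme}: ~{megabytes:.0f} MB per uncompressed frame")
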

    • It's hilarious to me that a $1000 RTXwhatever is less valuable for one of my main needs for a GPU than a $100 Arc A380.

      It's hilarious to me that you find it hilarious that a GPU primarily targeted at not-your-use-case isn't suitable for your use case. But more so that you think this is an NVIDIA / AMD thing. Most of Matrox's professional encoders don't support 10-bit 4:2:2, and about half of Maikto's products don't either, despite them praising the shift to 4:2:2 in broadcast video.

      Congrats, you found a niche and are confused why a general-purpose gaming GPU isn't meeting your niche. Intel's ARC relies heavily on other valued

      • by slaker ( 53818 )

        Intel's "gaming" cards supports this, along with a bunch of other creator-friendly nice-to-haves like AV1 hardware codecs.

        I normally use a Threadripper for editing, and it DOES have an Arc card installed, since that plays nicely in Resolve Studio solely as an accelerator card, but my 13900KF + 3070Ti does not. As a desktop platform, it also doesn't have the PCIe lanes to run a second GPU, either.

        It's absolutely idiotic that I'm better off using the SoC in my tablet for certain aspects of video editing than

    • I'm pretty sure NVENC supports 4:2:2 and 4:4:4 10-bit HEVC since the Geforce 10xx series, though they lacked some other HEVC features until the 20xx or 30xx series.
      • by slaker ( 53818 )

        You are mistaken. You get HEVC 420 and, on some chips, HEVC 444.

        There are lots of reasons to want nVidia GPUs for creative applications, especially since a lot of stuff wants CUDA to work properly, but it's missing an important workflow for native camera output. I can transcode everything over to Avid's codecs (or ProRes, were I an Adobe person), which my editing software likes better, but even that's kind of awful since I have to tie up one or the other of my desktop systems just doing that.

  • by Rosco P. Coltrane ( 209368 ) on Monday January 08, 2024 @02:41PM (#64141505)

    artificial intelligence capabilities over previous models.

    Aww God it's been just about impossible to spend a single day without hearing about AI-this or AI-that for the past year. If I ever was excited by AI, all I want now is to stop hearing about it. Enough already!

    Anyway, instead of stuffing artificial intelligence in their graphics cards, Nvidia should spend some of their resources fixing their Linux drivers. But of course fixing bugs won't land them instant PR like sticking "artificial intelligence" in the product release...

    • Anyway, instead of stuffing artificial intelligence in their graphics cards

      How about no, since the "AI" being stuffed into their graphics cards has directly translated to improved gaming performance. Most people don't give a shit what you think about AI or marketing; they care how good the picture looks when DLSS and Frame Generation are used to make silky smooth gameplay on their 4K monitors.

      Nvidia should spend some of their resources fixing their Linux drivers.

      Why? Their target market doesn't care about Linux. Why should NVIDIA dedicate serious effort to do something that appeases a minority of their users? It's not a case of getting them instant P

      • Re:AI nausea (Score:4, Insightful)

        by Rosco P. Coltrane ( 209368 ) on Monday January 08, 2024 @04:39PM (#64141879)

        Why should NVIDIA dedicate serious effort to do something that appeases a minority of their users?

        Because they released a Linux driver.

        Why did they release a Linux driver if they don't care about Linux users?

        You're right: if they don't want to support minority users, they're completely within their right not to. But if they choose to, they should do it right.

        I'd rather they told Linux users upfront they're not worth the effort and pulled the driver entirely than do the half-assed job they've been doing for years supporting Linux.

  • This doesn't sound like a great decision. People have already been complaining about the naming of the 40 series; nothing makes sense. The 4070 is what was supposed to be the 4060. Now Nvidia has further complicated their offerings.

    Not only will this confuse customers more, I wonder what impact it will have on their supply chain & software development, thanks to all these new variants that also have to be supported.
    • by ceoyoyo ( 59147 )

      This is certainly the first time Nvidia has used the Super postfix. Their naming in this generation is definitely totally different than it has ever been before.

    • by edwdig ( 47888 )

      The 4070 gets a price cut. The 4070 Ti and 4080 are getting discontinued and replaced by these new models. In the end it's just one additional model to worry about in the supply chain. It's only going to be confusing to consumers for a couple of weeks while the older models are discounted to clear out the inventory. Nvidia stopped production on the cards in question a few months ago, so there probably isn't a ton of excess inventory. The changes are all in the higher end of the market, so it's mostly going

  • I'm clinging to my RTX 2060, which I have yet to hit the limits of.

    • Hell, I'm still using my trusty GTX960. I have a new 1050 in box to replace the 960 when the day it gives up the ghost finally comes. nVidia can take their insane pricing and shove it up their ...

    • by leonbev ( 111395 )

      Yeah... I was really hoping that we were going to get a "4060 Super" this generation that was twice as fast as my old 2060 at 1080p for around $349.

      NVidia seems to be giving up on mid-range cards and is mostly doing new releases for the high-end market. I guess that I'll have to buy something like a Radeon 7700 instead when they go on sale.

      • by slaker ( 53818 )

        nVidia's strategy is not to backfill the midrange but to continuously release new hardware, so that bargain hunters have to fall back on previous-gen cards for decent bang for the buck. An RTX 3080 is currently a pretty good deal if you can find one, and probably cheaper than a current 4070, for example.

        AMD has pretty decent offerings in the mid range, but if you're using it for AI or content creation tasks, you'll find out pretty quickly that CUDA is the big name in town, and gamers already think AMD's name

  • The 4080 SUPER's predecessor is not the 3080 Ti; it is the 4080. The 4080 SUPER is also not "the top-of-the-line"; that would still be the 4090. Who wrote this ad?
  • by Visarga ( 1071662 ) on Monday January 08, 2024 @05:51PM (#64142155)
    They gave us all the specs except memory size and bandwidth, which affect AI applications the most.
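
    A rough illustration of why memory size matters so much for AI workloads: model weights alone scale with parameter count times bytes per parameter. The model sizes and precisions below are generic assumptions, not tied to any particular card.

        # Approximate VRAM needed just to hold model weights (activations ignored).
        def weight_gb(params_billions, bytes_per_param):
            # 1e9 params * bytes per param / 1e9 bytes per GB cancels out
            return params_billions * bytes_per_param

        for params in (7, 13, 70):                       # assumed model sizes, billions
            for precision, nbytes in (("fp16", 2), ("int4", 0.5)):
                print(f"{params}B @ {precision}: ~{weight_gb(params, nbytes):.0f} GB of weights")
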
  • When I can get a card in the $300 range that is clearly and usefully better than my AMD 6750, I'll probably jump. I operate perpetually at least two generations in the past - that, for me, is the sweet spot for cost-benefit.
