AI Hardware Technology

Nvidia Launches RTX 50 Blackwell GPUs: From the $2,000 RTX 5090 To the $549 RTX 5070 (techspot.com)

"Nvidia has officially introduced its highly anticipated GeForce 50 Series graphics cards, accompanied by the debut of DLSS 4 technology," writes Slashdot reader jjslash. "The lineup includes four premium GPUs: the RTX 5080 and RTX 5090 are slated for release on January 30, with the RTX 5070 and RTX 5070 Ti following in February. TechSpot recount of the Jensen Huang keynote tries to differentiate between dubious performance claims and actual expected raw output": The new RTX 5090 flagship comes packing significantly more hardware over its predecessor. Not only does this GPU use Nvidia's new Blackwell architecture, but it also packs significantly more CUDA cores, greater memory bandwidth, and a higher VRAM capacity. The SM count has increased from 128 with the RTX 4090 to a whopping 170 with the RTX 5090 -- a 33% increase in the core size. The memory subsystem is overhauled, now featuring GDDR7 technology on a massive 512-bit bus. With this GDDR7 memory clocked at 28 Gbps, memory bandwidth reaches 1,792 GB/s -- a near 80% increase over the RTX 4090's bandwidth. It also includes 32GB of VRAM, the most Nvidia has ever provided on a consumer GPU. [...]

As for the performance claims... Nvidia has - as usual - used its marketing to obscure actual gaming performance. RTX 50 GPUs support DLSS 4 multi-frame generation, which previous-generation GPUs lack. This means RTX 50 series GPUs can generate double the frames of previous-gen models in DLSS-supported games, making them appear up to twice as "fast" as RTX 40 series GPUs. But in reality, while FPS numbers will increase with DLSS 4, latency and gameplay feel may not improve as dramatically. [...] The claim that the RTX 5070 matches the RTX 4090 in performance seems dubious. Perhaps it could match in frame rate with DLSS 4, but certainly not in raw, non-DLSS performance. Based on Nvidia's charts, the RTX 5070 seems 20-30% faster than the RTX 4070 at 1440p. This would place the RTX 5070 slightly ahead of the RTX 4070 Super for about $50 less, or alternatively, 20-30% faster than the RTX 4070 for the same price.
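
A toy calculation makes the FPS-versus-latency distinction concrete. The numbers below are hypothetical (60 rendered frames per second and three generated frames per rendered one, which is DLSS 4's multi-frame maximum), and generation overhead and Reflex are ignored:

```python
# Multi-frame generation inserts AI-generated frames between rendered ones, so
# the displayed frame rate climbs while input is still sampled per rendered frame.
rendered_fps = 60            # frames the engine actually simulates (hypothetical)
generated_per_rendered = 3   # DLSS 4 multi-frame generation maximum

displayed_fps = rendered_fps * (1 + generated_per_rendered)
input_interval_ms = 1000 / rendered_fps

print(displayed_fps)                 # 240 "FPS" on the marketing chart
print(round(input_interval_ms, 1))   # 16.7 ms between input samples, unchanged
```
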
The GeForce 50 series wasn't the only announcement Nvidia made at CES 2025. The chipmaker unveiled a $3,000 personal AI supercomputer, capable of running sophisticated AI models with up to 200 billion parameters. It also announced plans to introduce AI-powered autonomous characters in video games this year, starting with a virtual teammate in the battle royale game PUBG.
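
For a sense of why that parameter count gets its own box rather than a gaming card, here is a rough weights-only estimate; the precisions are illustrative assumptions, and activations and KV cache are ignored:

```python
# Approximate memory needed just to hold the weights of a 200-billion-parameter model.
params = 200e9
for precision, bytes_per_param in [("FP16", 2), ("INT8", 1), ("INT4", 0.5)]:
    print(f"{precision}: {params * bytes_per_param / 1e9:.0f} GB")
# FP16: 400 GB, INT8: 200 GB, INT4: 100 GB -- all far beyond the RTX 5090's 32GB,
# so even the top consumer card is not a substitute for a dedicated machine.
```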


Comments Filter:
  • Still not 4k-worthy? (Score:3, Interesting)

    by timeOday ( 582209 ) on Tuesday January 07, 2025 @05:52PM (#65071261)
    Their press release benchmarks are still using 1440p for the RTX 5070, which starts at $549. It's been like 7 years now that I've thought surely the next generation of mid-range card would be good for 4k. I (only) play split-screen games with my son and we sit up close.
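
    For context on why 4k is a much heavier ask than the 1440p Nvidia benchmarked at, a quick pixel-count comparison (plain resolution math, no vendor specifics):

```python
# 4k pushes roughly 2.25x the pixels of 1440p every frame.
pixels_4k = 3840 * 2160      # 8,294,400
pixels_1440p = 2560 * 1440   # 3,686,400

print(pixels_4k / pixels_1440p)   # 2.25
```
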
    • by PsychoSlashDot ( 207849 ) on Tuesday January 07, 2025 @06:34PM (#65071351)

      Their press release benchmarks are still using 1440p for the RTX 5070, which starts at $549. It's been like 7 years now that I've thought surely the next generation of mid-range card would be good for 4k. I (only) play split-screen games with my son and we sit up close.

      I played in 4k @ 60fps using a 1080ti. Admittedly, that was stuff like Destiny 2, which isn't graphically demanding.
      Now I play in 4k @ 60fps using a 3080. That's in things like Horizon Forbidden West or Ghost of Tsushima or Control, which are graphically demanding.
      I also get about 80fps in Cyberpunk 2077 with most settings nearly maxed out.

      4k gaming has been here for generations of cards. It's mostly about whether you need everything set to Ultra, and what your target frame rate is.

      • Ah, when the 3080 was released I thought the time was imminent for affordable 4k gaming. That card is still over $500! Isn't that a bit nuts for a card launched at $699 in 2020!?
      • by Creepy ( 93888 )

        Try Star Wars: Outlaws. Looks FAR worse than Cyberpunk 2077 on my RTX 3080Ti, stutters horrifically at 4k, even though that was the setting nVidia recommended. It is playable at 1920 with those same settings (and higher resolutions if I knock a few things down). I'm not saying the game is bad by any means, but it definitely is dependent on a number of extremely expensive FX. From what I've noticed, hair (which isn't that great IMO) and fur (really well done, have done some close inspections and older techni

    • by bloodhawk ( 813939 ) on Tuesday January 07, 2025 @07:39PM (#65071459)
      The 5070 is still crippled with only 12GB of RAM; it is just an overpriced 1440p card.
  • by williamyf ( 227051 ) on Tuesday January 07, 2025 @06:05PM (#65071283)

    The 5090 and 5080 cost waaaay too much.

    The 5060 is severely crippled.

    The 5070 Ti is where it's at, with the same 16GB as the 5080, but at a saner price and without too much crippling.

    Specs and feeds
    https://www.forbes.com/sites/a... [forbes.com]

    • by sinij ( 911942 ) on Tuesday January 07, 2025 @06:11PM (#65071297)

      The 5070 Ti is where it's at

      Yes, but good luck buying one. Nvidia is all in on miners and AI farms; nothing is going to be left for consumers. Hopefully Intel catches up on graphics cards.

      • Seems like Project Digits is set up to try to divert some of that low-end AI farm attention, especially since performance isn't as important to LLMs as just a high volume of VRAM or the like.
      • by evanh ( 627108 ) on Tuesday January 07, 2025 @06:34PM (#65071355)

        It's the AI craze alone. Crypto miners all but stopped using GPUs when Ethereum changed from proof-of-work to proof-of-stake. Bitcoin miners haven't used GPUs for over a decade.

        • by larryjoe ( 135075 ) on Tuesday January 07, 2025 @06:59PM (#65071411)

          It's the AI craze alone. Crypto miners all but stopped using GPUs when Ethereum changed from proof-of-work to proof-of-stake. Bitcoin miners haven't used GPUs for over a decade.

          RTX is only used for AI by home tinkerers and very small setups. There is no corporate demand for RTX for AI. This is not by accident. Gamers may think that $2k for a 5090 is expensive, but Nvidia is going out of its way to make sure that the $40k+ data center cards are the only ones data centers will consider.

          • RTX is only used for AI by home tinkerers and very small setups. There is no corporate demand for RTX for AI.

            Basically this. You cannot train huge models on single RTX cards because they don't have anywhere near enough memory for that, not even the top-end cards. And if you're thinking of building a Beowulf cluster of them, you'd probably be better off with something like a single H100/H200 anyway.

            For personal AI projects, RTX cards are cool. But I have a peeve with how NVidia always cuts support for older CUDA versions in every new generation of RTX cards. Yep, they are not backward CUDA compatible. I have a handf

      • Yeah, sorry, it's not 2021 anymore. Virtually all cards have been available at RRP without any significant supply restrictions within a couple of months of launch.

        AI farms aren't buying GPUs; they are way underpowered compared to dedicated AI hardware. Miners are not in the buying market right now, as evidenced by the insane number of cards available second hand.

    • Did you check TDP? 1000W woot \o/

      • by tlhIngan ( 30335 ) <slashdot@wo[ ]net ['rf.' in gap]> on Tuesday January 07, 2025 @10:06PM (#65071677)

        No, that's the power supply requirement. Which means if you put in a beefy CPU, you're going to have to run a new electrical circuit to power your PC.

        It won't be long until you're putting in high-powered electrical outlets not just for EV charging, but for your PC as well.

          • Oh yeah, I read it wrong. GPU power is only 545W. Still a fuckton of power.

          • by Creepy ( 93888 )

            The funny thing is, there was a time I needed a 1400W power supply and 1100 of it was for GPU. nVidia (and ATI, which is now AMD, my previous card was from them and required 850W but I grew frustrated with their lack of OpenGL extension support, which we used for parity with Cg and DirectX features) started seriously chopping power requirements after those days, though. I remember building my next machine and it only needed a 900W power supply. Those cards had massive heatsink fans, too, like 20cm x 10cm x

        • that's the power supply requirement. Which means if you put in a beefy CPU, you're going to have to run a new electrical circuit to power your PC.

          You can get 12A continuous from a typical 15A household outlet, so 1440W. It's pretty difficult to build a single-processor PC which draws more than that. Of course, you do have to plug your monitor in somewhere too...
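
          The 1440W figure is just the standard 80% continuous-load derating on a 15A breaker at a nominal 120V (North American outlet assumed):

```python
# Continuous loads are typically limited to 80% of the breaker rating.
breaker_amps = 15
volts = 120                              # nominal North American outlet voltage
continuous_watts = breaker_amps * 0.8 * volts

print(continuous_watts)                  # 1440.0 W
```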

          On the other hand, from what I'm seeing on various networks, most people are already balking at these prices. The cheapest card in this lineup is hundreds of dollars more than the cheapest 4xxx. I think Nvidia is going to lose some fanboys in this generation. Bad timing when AMD's graphics dri

    • by Anonymous Coward
      My 1070 has lasted 7+ years. I don't mind spending $1000+ on something that I can get value out of for the better part of a decade.
      • by Anonymous Coward
        The 1070 was an exceptional card when released. The 5070 is underwhelming. I can't see you getting near a decade out of a 5070 unless you aren't playing the latest games; with 12GB it is already a low-end card for 4k and a very expensive card for 1440p, which is where it will increasingly be forced to live in coming years.
  • by williamyf ( 227051 ) on Tuesday January 07, 2025 @06:22PM (#65071317)

    For USD 250 for a Battlemage you get 12GB of VRAM and decent modern features like DNN upscaling and frame interpolation (XeSS, with better image quality than AMD's FSR 1-3 family), ray tracing (certainly better than the RTX 2000 and 3000 series), as well as a modern encoder/decoder. As an added bonus, Intel GPU compatibility on Linux is superb (AMD's is good too; nVIDIA, not so much -- the older you go, the worse it gets).

    All this for a significantly lower price than anything nVIDIA (or AMD, for that matter) has lying around.

    Use XeSS (or, god forbid, FSR) upscaling and Frame Gen liberally, just like nVIDIA is doing in the RTX 5000 series comparisons.

    Yes, some games do not run so well; that will be fixed in drivers.

    If you are a retro-games enthusiast (like me), just get an AMD CPU with an RDNA3 or RDNA 3.5 iGPU and play the retro games on that, and leave the GPU for the "more modern games."

    Skip the ARC Alchemist 700 series unless you NEED the 16GB Memory.

    If you are buying a whole new system (sans monitor) in the USD 500 to USD 900 range, the B570 is the best choice.

    If you are building with used (or new old stock) components, just make sure your system is compatible:
    Resizable BAR (a.k.a. AMD Smart Access Memory)
    PCIe 4.0 (or higher)
    A 10th-gen Intel processor (or equivalent AMD)

    Happy plays.

    • If I was still just playing games I would seriously consider Intel. But I've been tinkering with LLMs recently. My next card needs ALL the memory and a bunch of CUDA cores, so it's NVidia or nothing for me.
      • Except Intel is showing plenty of horsepower for AI and is releasing a 24GB card later this year aimed at productivity.
        So 24GB for probably $500 or less...

        Plus, CUDA alternatives are becoming more prolific; Gelsinger pointedly said last year, "Nvidia, you better watch out because the industry is coming for CUDA," and Intel appears to be delivering on that.

    • I bought the B580. So far, no problems at all. Way faster than the Radeon 6600 it replaced. It was about $280, I think, after taxes and shipping.

  • The 4090 was a ridiculous size; now we are back to two slots with the 5090.
    • For me the problem with the 4090 was length. I have a full size ATX case and don't have anything else slotted into the motherboard so taking three slots was not an issue but the 4090 is so long that it quite literally rubs up against the support frame for the 5 1/4" drives at the front of the case. If it was just one millimetre longer it wouldn't have fit.
  • $2,000? (Score:4, Interesting)

    by Khyber ( 864651 ) <techkitsune@gmail.com> on Tuesday January 07, 2025 @11:23PM (#65071781) Homepage Journal

    That's not a graphics card. That's not even an AI card.

    That's a business card.

    • Not sure what you're saying. Too expensive for a graphics card? Too cheap for AI? It's not a business card since it will certainly be locked down for CAD and similar purposes like all other NVIDIA GPUs.

      Rich people exist; a rich market exists. This card may not be targeted at your budget but it still very much is a graphics card on account of the fact that it can't do one of the things you list and is useless for the other (NVIDIA GPUs are bare minimum in the AI world, great for the home tinkerer, but litera
