Graphics Hardware

GeForce RTX 3090 Launched: NVIDIA's Biggest, Fastest Gaming GPU Tested (hothardware.com)

MojoKid writes: NVIDIA's GeForce RTX 3090, which just launched this morning, is the single most powerful graphics card money can buy currently (almost). It sits at the top of NVIDIA's product stack, and according to the company, it enables new experiences like smooth 8K gaming and seamless processing of massive content creation workloads, thanks in part to its 24GB of on-board GDDR6X memory. A graphics card like the GeForce RTX 3090 isn't for everyone, however. Though its asking price is about $1,000 lower than that of its previous-gen, Turing-based Titan RTX counterpart, it is still out of reach for most gamers. That said, content creation and workstation rendering professionals can more easily justify its cost.

In performance testing fresh off the NDA lift, versus the GeForce RTX 3080 that arrived last week, the more powerful RTX 3090's gains range from about 4% to 20%. Versus the more expensive previous-generation Titan RTX, though, the GeForce RTX 3090's advantages increase to approximately 6% to 40%. When you factor in complex creator workloads that can leverage the GeForce RTX 3090's additional resources and memory, however, it can be many times faster than either the RTX 3080 or Titan RTX. The GeForce RTX 3090 will be available in limited quantities today, but the company pledges to make more available directly and through OEM board partners as soon as possible.

  • Almost (Score:4, Insightful)

    by RazorSharp ( 1418697 ) on Thursday September 24, 2020 @07:51PM (#60541824)

    the single most powerful graphics card money can buy currently (almost).

    That's almost a big deal. Almost.

  • The card is pretty damn impressive. But at a ~140% premium for a ~10-20% performance increase, you would want to have some very specific work/gaming scenarios to justify it.
    • by EvilSS ( 557649 )
      Now now, it's only a 100% premium (699 vs 1499)! And that 10-20% is at 4K and up. The delta drops significantly below that. So it's either a gaming e-peen, or a workstation card (if you don't need the Titan/Quadro features they driver-neutered off of it).
      • Now now, it's only a 100% premium (699 vs 1499)! And that 10-20% is at 4K and up. The delta drops significantly below that. So it's either a gaming e-peen, or a workstation card (if you don't need the Titan/Quadro features they driver-neutered off of it).

        More likely it's mining alt-coins than either of those. Hell, a 1060 is profitable at the moment. And for people using it for gaming or math, letting it mine when otherwise idle in a gaming machine or workstation offsets the price. Might get that 3090 for effectively the cost of a 3080 or 3070.

        • Comment removed based on user account deletion
          • by drnb ( 2434720 )

            More likely it's mining alt-coins than either of those. Hell, a 1060 is profitable at the moment.

            Sweet Jesus, are people still mining alt-coins?? Hasn't that died yet? How many alt-coins are there now? 10,000? 100,000? 1,000,000??

            Yes, they apparently are. And they are buying hashrate at a price that makes a 1060 6GB profitable (barely) at the moment, between 10pm and 8am when my rate is $0.15/kWh. And this is not even a mining rig, no tweaked firmware or special drivers, just a stock Founders Edition card with regular Nvidia drivers. I find this quite surprising too, but on the other hand the 1060 has recovered 2/3 of its cost during the profitable times of the last few years.

            How many alt-coins? Don't know, don't care. I use a service

            • Name of that service? Asking for a friend.

              Also, that friend is myself.

              • by drnb ( 2434720 )
                NiceHash is one that is oriented towards beginners. They have a bootable Linux image with mining software on it that you can install on a flash drive and boot from. That's probably preferable to installing their Windows-based software and dealing with all of Microsoft's malware warnings. Most mining software is automatically tagged as malware, since malware will sometimes install it and use it to mine for their account using your hardware. Plus I like their bootable flash drive route so my hard drive is n
          • "Sweet Jesus, are people still mining alt-coins?? Hasn't that died yet? How many alt-coins are there now? 10,000? 100,000? 1,000,000??"

            And all to 'mine' imaginary fiat "money" backed by nothing but how much power was wasted with this bullshit.

            At least US fiat dollars are (very allegedly) backed by something as well as what's in Fort Knox.

            Shitcoin and the rest of the virtual fantasy bullcrap will be forgotten and dumped in the same shit pile with the dot-bomb wreckage.

            • I gotta wonder at the carbon being put out by all this. If Bitcoin were to ever graduate out of its current role as fuel for crazed speculators and money laundering for heroin, into actual "currency", that's a LOT of energy blown into solving hashes.

              I mean, granted, many (American; the Europeans tend to be a bit more rational) Libertarians tend to confuse skepticism with incredulity and just effectively declare "physics isn't real in the sky!" and thus carbon is not a worry. But for the rest of us in the reality

        • by EvilSS ( 557649 )

          Now now, it's only a 100% premium (699 vs 1499)! And that 10-20% is at 4K and up. The delta drops significantly below that. So it's either a gaming e-peen, or a workstation card (if you don't need the Titan/Quadro features they driver-neutered off of it).

          More likely it's mining alt-coins than either of those. Hell, a 1060 is profitable at the moment. And for people using it for gaming or math, letting it mine when otherwise idle in a gaming machine or workstation offsets the price. Might get that 3090 for effectively the cost of a 3080 or 3070.

          NVidia cards pale in comparison to AMD for mining though, and you can buy the current gen AMD cards at MSRP right now. At those prices, the 3080 would have a way better ROI than the 3090. I'm sure a few fools will buy some for mining but they will be waiting quite a while to break even on it. Hell, you could get a better ROI flipping it on eBay.

          • by drnb ( 2434720 )

            Now now, it's only a 100% premium (699 vs 1499)! And that 10-20% is at 4K and up. The delta drops significantly below that. So it's either a gaming e-peen, or a workstation card (if you don't need the Titan/Quadro features they driver-neutered off of it).

            More likely it's mining alt-coins than either of those. Hell, a 1060 is profitable at the moment. And for people using it for gaming or math, letting it mine when otherwise idle in a gaming machine or workstation offsets the price. Might get that 3090 for effectively the cost of a 3080 or 3070.

            NVidia cards pale in comparison to AMD for mining though, and you can buy the current gen AMD cards at MSRP right now. At those prices, the 3080 would have a way better ROI than the 3090. I'm sure a few fools will buy some for mining but they will be waiting quite a while to break even on it. Hell, you could get a better ROI flipping it on eBay.

            But my system is not simply for mining. It is a workstation where I may work on CUDA code during the day and may play games after work; mining is only when the computer would otherwise be unused. Hence Nvidia. I've also compared systems with Nvidia and ATI of comparable generation, say 1060 vs 570 (CUDA development system and OpenCL development system). The 570 generates slightly more hash but at a disproportionate amount of power as measured at the wall. So Nvidia wins there too, it's more profitable. Sligh

            • by EvilSS ( 557649 )
              The 3090 is also a power hog. More than any other card ever made. More power for nearly the same hash rate is a poor trade off. If it's part time and you really need it (and if you have a 1060 today you obviously don't), then sure, but buying it with mining even as a 2nd priority is questionable. Maybe if you live in Antarctica and are getting free electricity...

              As for NV vs AMD, you are using something like Nicehash or another pool, yes? If you mine directly (which is more profitable overall), AMD is
              • by drnb ( 2434720 )
                It's not absolute power, it's the ratio of power to hashrate. AMD takes more power to do the same work, well, the 570 did compared to the 1060. 10% more hashrate but 20% more power, approximately. I've mined via NiceHash and directly. Measuring at the wall using a Kill A Watt meter I find the 570 using more power in both cases.

                I'm not buying a 3090. A 3070 is about where my price tolerance tops out even with potential mining subsidies. In my mind I emphasize the "potential" and don't count on it, if it happ
                • by EvilSS ( 557649 )
                  You should be making about 20% more even after power with the 570 versus the 1060. Not to mention the 580 was still an under-$200 card when it launched, cheaper than a 1060, and even faster. It would make you even more, even after power consumption is factored in. But again, you're leaving a ton of money on the table if you're using NiceHash. With NiceHash you're always at the mercy of the orders, and they may not always be picking the most efficient algorithm for your setup. Not to menti
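
                  A rough sketch of the power-versus-hashrate arithmetic this sub-thread is arguing about, with purely illustrative numbers: the hashrates, revenue rate, and wattages below are placeholders rather than either poster's measurements, and only the $0.15/kWh off-peak rate comes from the thread.

                      # Net mining profit = hashrate revenue - electricity cost measured at the wall.
                      def daily_net_profit(hashrate_mh, usd_per_mh_day, wall_watts, usd_per_kwh, hours=10):
                          """Net profit for `hours` of mining per day (e.g. overnight, off-peak)."""
                          revenue = hashrate_mh * usd_per_mh_day * (hours / 24)
                          energy_kwh = wall_watts / 1000 * hours
                          return revenue - energy_kwh * usd_per_kwh

                      rate = 0.15  # $/kWh off-peak, as mentioned upthread
                      # Hypothetical cards: a 1060-class GPU vs. a 570-class GPU that hashes
                      # ~10% faster but draws ~20% more power at the wall.
                      print(round(daily_net_profit(22.0, 0.06, 120, rate), 3))   # 1060-ish
                      print(round(daily_net_profit(24.2, 0.06, 144, rate), 3))   # 570-ish

                  Which card comes out ahead flips entirely with the revenue-per-hash and wattages you plug in, which is why the two posters above reach opposite conclusions from their own measurements.
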
      • I think it's fine actually. The 3080 is great and most people will buy this. I probably will. That means only people who really want to pay way more for a little bit more performance will get the 3090, and that's fine.

        It would bother me more if they didn't release a 3080 and your choice was 1500USD or get an old gen card ;-)

    • by darkain ( 749283 )

      Depends on what you're using it for. COMPUTE is 10-20% faster, but with over 2x the RAM... That RAM isn't free. Professional workloads, like game dev, demand that extra RAM.

      • Purchasing it at this price and configuration would only be justified for sparse matrix math workloads enabled by the Ampere architecture. AI workloads would be better off with Google's TPU v3 or possibly a GA100 chip. The GDDR6X memory increase is deceptive in terms of capacity. Micron has plans for 16Gbit chips, but as-is the memory is the same density as on NVIDIA's RTX 20 series cards, and their GTX 1000 series cards. RTX 3090 gets 24GB of VRAM, but only by using 12 pairs of chips in clamshell mode on a 3
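
        A quick back-of-the-envelope check of that capacity claim, assuming the 8Gbit GDDR6X chips and 384-bit bus the comment describes (a short Python sketch, not anything from the article):

          # 24GB from 8Gbit chips: two chips per 32-bit channel (clamshell), 12 channels.
          chip_density_gbit = 8            # current GDDR6X density cited in the comment
          chips = 12 * 2                   # 12 channels x 2 chips (clamshell mode)
          bus_width_bits = 12 * 32         # each channel is 32 bits wide -> 384-bit bus
          total_gb = chips * chip_density_gbit / 8
          print(total_gb, "GB on a", bus_width_bits, "bit bus")   # 24.0 GB on a 384 bit bus
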
    • There are people who will pay a lot more to have the "best" even if it is only slightly better than the second-best.

      There are people who buy $100K watches. For some people the price simply doesn't matter.

      There may also be a set of problems where it actually makes technical sense.

      • There are definitely scenarios where it makes sense, especially workstation processing or AI, and hell, if money is no object, sure. Most gamers, though, will not get a noticeable increase in performance, just a bigger e-peen. It is also a bloody massive card that won't fit in many cases.
    • I was going to say $800 could be worth it for a pro to increase rendering speed 10% in Blender. But come to think of it, rendering is easily parallelized so you'd seemingly be much better off with 2 of the slightly slower cards for similar money.
    • by ranton ( 36917 )

      The card is pretty damn impressive. But at a ~140% premium for a ~10-20% performance increase, you would want to have some very specific work/gaming scenarios to justify it.

      This isn't much different than any luxury item. You will probably only have one gaming machine, and if you are going all out then what you are really comparing is a $3000 machine with a 3080 vs a $3800 machine with a 3090. So more like 25% more expensive for a machine which renders your game 15% faster.

      If you are using this for computation, then you need to look at the total cost of ownership. You are probably fitting 3 cards into each machine, and let's say the rest of the machine is $2k. So the cost differen
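
      A quick sketch of the whole-system arithmetic being made here; the $3000/$3800 gaming-rig figures and the 15% number are the comment's own, the $699/$1499 card prices are the MSRPs quoted earlier in the thread, and the ~$2k compute-node base is the comment's assumption:

        # The card premium shrinks once you price the whole machine around it.
        card_3080, card_3090 = 699, 1499
        gaming_3080, gaming_3090 = 3000, 3800           # whole-rig prices from the comment
        print(f"gaming rig: {gaming_3090 / gaming_3080 - 1:.0%} more expensive for ~15% more speed")

        base_node = 2000                                # rest of a 3-GPU compute node
        node_3080 = base_node + 3 * card_3080
        node_3090 = base_node + 3 * card_3090
        print(f"compute node: {node_3090 / node_3080 - 1:.0%} more expensive per box")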

      • If you are a hobbyist then you probably aren't the target market for the 3090. This is likely why it doesn't make any sense to you why anyone would buy it.

        Who said it didn't make any sense to me? I fully understand the niche audience it targets; it's just that the niche is getting increasingly smaller as the difference between this and the card at half its price narrows considerably.

        • by ranton ( 36917 )

          Fair enough. I did notice just now that you did mention specific scenarios to justify it in your comment, which I glossed over. I guess it was mostly your title of "overpriced" which launched my response, since your actual comment is quite accurate. I still would argue it isn't overpriced though, since it still gives plenty of reason for people in those niches to buy it. Or at least their market research must show that; how would either of us know for sure?

    • Overpriced is relative. Everything is overpriced in performance per dollar compared to some sweet spot. Always has been, always will be. Not a single component in a PC has ever followed a linear price performance curve. The higher up the curve you go, the more overpriced it is.

    • by Gwala ( 309968 )

      It's only 10-20% because it's being bottlenecked by the CPU/RAM in the benchmarks they're using. In a pure-GPU comparison it's up to 40% faster.

      You'll see the biggest benefit going from 2K/4K to 8K gaming (which in itself is stupid, but I digress...) because the CPU won't be the bottleneck there - and the added fill rate will help.

      • You'll see the biggest benefit going from 2K/4K to 8K gaming

        No, the biggest benefit will be the ability to raytrace at 2K/4K without framerates dropping into the teens. No one is "8K gaming" yet unless they're running VR in 4x4K for each eye (Pimax 8K Plus, etc).

    • by ceoyoyo ( 59147 )

      You're paying for the memory. The 3090 is a lot cheaper than anything else that's got 24 GB.

      Nvidia will sell a bunch to the pro gaming crowd, and a bunch to AI researchers.

  • Fluff piece (Score:5, Insightful)

    by Rockoon ( 1252108 ) on Thursday September 24, 2020 @08:15PM (#60541866)
    You can tell that it's a fluff piece because it doesn't once mention the alarming amount of power the card draws, the alarming amount of heat that is generated as a consequence, or how very, very loud the cards will be.
    • You can tell that it's a fluff piece because it doesn't once mention the alarming amount of power the card draws, the alarming amount of heat that is generated as a consequence, or how very, very loud the cards will be.

      To be fair, if you're in the market for one of these things those probably aren't real concerns. For my computing needs, I use a bunch of old ass shit and speed is almost never an issue. Especially not the GPU. For a video guy I know. . .well, that's another story.

    • Ever read a car review that mentions how much fuel is consumed under full throttle?? Hint: we're paying for the privilege of turning watts into rendered frames; the more the fucking merrier.
      • Ever read a car review that mentions how much fuel is consumed under full throttle?

        Actually I have seen that in reviews of Tesla cars, where power use in various scenarios comes under heavy scrutiny.

        However, the case of this card is kind of different, because essentially in a car you are not going to be driving full throttle on any normal road for any significant length of time.

        Buying a powerful video card pretty much means you are going to be using it to play video games for most people, or many hour

    • by AmiMoJo ( 196126 )

      Benchmarks show it draws a lot of power but isn't too loud. The cooling solution seems to be pretty good. Anyone buying one will have a high airflow case and some form of watercooling for the CPU anyway.

      Their stuff about 8K gaming is impressive but there aren't any 8K monitors for sale. Dell used to make one but discontinued it. Hopefully more will come to market now. Would like one for desktop use; 4K has proven inadequate.

    • You can tell that it's a fluff piece because it doesn't once mention the alarming amount of power the card draws, the alarming amount of heat that is generated as a consequence, or how very, very loud the cards will be.

      That's because none of the stuff you're talking about is remotely relevant. The RTX 3080 is quieter than the previous generation due to the better cooler (which NVIDIA has historically sucked at), and the RTX 3090 has a larger cooler still and only consumes 30W more. Also, none of this is even remotely alarming, as this card draws less power than most rigs which have run in SLI in the past, and actually draws about the same amount of power as a carefully overclocked 1080 Ti.

      Your irrelevant concerns are utterly irrel

    • it doesnt once mention the alarming amount of power the card draws

      Since when have gamers cared about power draw (other than heat waste)? Most gamers I've seen don't care if it requires its own power plant as long as they get a few more FPS in their FPS. Hell, much of the time I get treated like a fool because I want to build a PC with a powerful processor and decent integrated GPU rather than get a power-wasting discrete GPU just to display some productivity apps.

  • And since it's the beginning of a new genre, you need to pay up.

  • The 3090 was IBM's 1980s successor to their System/370 and System/360. I remember working on one when new in the industry. I wonder how many times more FLOPS the nVidia card can do than the IBM 3090's vector unit.

    • Comment removed based on user account deletion
      • by AC-x ( 735297 )

        Wow I had no idea the RTX 3090 performance dropped that much, from 35.58 TFLOPS (fp32) to 556.0 GFLOPS (fp64).

        Guess there is still a use for those Titan V cards after all.
               

      • by sphealey ( 2855 )

        The IBM 3090 could run at that load continuously for 3 years between scheduled maintenance though - what is the nVidia card's sustained high speed performance when 'sustained' is more than 30 minutes?

  • by Luckyo ( 1726890 ) on Thursday September 24, 2020 @10:09PM (#60542032)

    This is a "titan" rebrand. It's a workstation card. You can tell from the fact that the main upgrade is memory and memory bandwidth. You need those for workstations doing things like rendering, where large amount of memory helps expedite things by holding even large renders all in GPU RAM. It's of minimal benefit in gaming.

    I'm going to guess that this is a case of broken telephone going from engineering to marketing. Something like an engineer saying "this is a workstation card, but it can also be used in gaming". Top marketing guy: "Add gaming to it with an asterisk that it's not really all that great for gaming so we can sell a few extra cards to people who don't care about price". Last guy in marketing before press release: "What's this asterisk? We need to streamline the message, remove it".

    • Except that it's not really a proper workstation card, because it has the same driver limitations designed to stop it from running proper workstation applications well that their gaming cards do. It's effectively a very high-end gaming card at a professional workstation card price.

      • by fazig ( 2909523 )
        Depends on how you define "workstation".
        For engineering it would be an entry-level product that doesn't come with all the features you want to keep you safe on the legal side. So maybe they wouldn't use it for professional work in sensitive areas; for example, they might not use it to design parts for an airplane or a spacecraft, where liability is a factor.

        Professionals like 3D artists don't need those features, however. But a lot of memory can be quite beneficial to their work as well as high com
      • by K10W ( 1705114 )

        Except that it's not really a proper workstation card, because it has the same driver limitations designed to stop it from running proper workstation applications well that their gaming cards do. It's effectively a very high-end gaming card at a professional workstation card price.

        That's not true anymore for the most part. You can use the Studio Driver (SD) instead of the Game Ready Driver (GRD), and the limits that were present were removed a while back by Nvidia, albeit only because competition offered solutions for less, forcing their hand, such as 10-bit output for colour-critical grading and design work. Thus you can still get the likes of 10-bit output in non-fullscreen-exclusive mode now on Studio drivers. I use the previous gen in a workstation with such apps and it works fine. You can see the independent benchmarks for such pro ap

    • This is a "titan" rebrand.

      No it's not. That card is still coming.

      You can tell from the fact that the main upgrade is memory and memory bandwidth.

      I'm confused. You think instead the main upgrade should be magical performance pixies? Today's high-end workstation workloads are tomorrow's minimum requirements for games. You're not buying the fastest gaming GPU on the market to play yesterday's titles. The RTX 3000 Titan is expected to have 48GB of RAM.

      If you're scared of high-end components, don't buy them. You can wait, it's okay.

    • > This is a "titan" rebrand.

      Sort of. While both the Titan and 3090 have 24 GB RAM, the Titan has FP64 at 1:32 FP32 performance, while the 3090 only has FP64 at 1:64 FP32 performance, as Tom's Hardware's review [tomshardware.com] points out:

      With the doubled CUDA cores per SM, that equates to 10496 CUDA cores, with two FP64 capable CUDA cores per SM. In other words, FP64 performance is 1/64 the FP32 performance.

      Linus Tech Tips' review [youtu.be] also noted that SPECviewperf 13.0 CATIA was significantly faster, 40%, on the Titan (377 vs 212).
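
      For what it's worth, the FP32/FP64 figures quoted upthread fall straight out of the SM layout. A quick check, assuming the commonly published 82-SM count and ~1.7GHz boost clock for the RTX 3090 (everything else comes from the comment and the Tom's Hardware quote):

        sms = 82                     # GA102 SMs enabled on the RTX 3090 (published spec)
        fp32_per_sm = 128            # "doubled CUDA cores per SM" on Ampere
        fp64_per_sm = 2              # FP64-capable cores per SM, per the quote above
        boost_ghz = 1.695

        fp32_cores = sms * fp32_per_sm                    # 10496 CUDA cores
        fp32_tflops = fp32_cores * 2 * boost_ghz / 1000   # 2 FLOPs per core per clock (FMA)
        fp64_gflops = sms * fp64_per_sm * 2 * boost_ghz
        print(fp32_cores, round(fp32_tflops, 2), round(fp64_gflops, 1))
        # -> 10496 cores, ~35.58 TFLOPS fp32, ~556.0 GFLOPS fp64, i.e. a 1:64 ratio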

      • That open letter to Jensen Huang was downright cringeworthy.

        Linus Sebastian needs to come down from his high horse these days. "Jensen, I know you nerfed this GPU, but I like that about you. It shows passion."

    • No, you need massive amounts of memory for 8K rendering.
  • Where are all these companies that release new products that are slower than their previous generation?

    Ideally I want a card that is faster, smaller, and cheaper. Fast enough for 8K gaming, but small enough to fit on the corner of an Intel CPU's die. And costs $0.

    • by AC-x ( 735297 )

      Wellllllll current top performing integrated graphics seem to have around the performance of high-end cards from 2010, so just wait until 2030 before upgrading and you're golden!

    • by Cederic ( 9623 )

      Well, that's already available. You buy an Intel CPU and it's as good as the cutting edge in 2010. Ten years to go from premium cutting edge gaming performance to 'comes for free with your CPU' doesn't feel unreasonable.

      8K gaming? Wait a couple of years. Five to ten years if you want a usable frame rate.

      • I mostly play point-and-click and text adventures. I suspect 8K gaming is feasible in a matter of months, not years.

        • For what it's worth, 8K was possible years ago (on titles that didn't demand a ton of VRAM and where you could specify the internal res).

          First thing I did with my shiny new 4K TV was play Dark Souls @ 8K.
  • I mean, I guess that's how nVidia wants to market it now.

    This guy, IMHO, nails the complete lunacy surrounding the hype for this new card: https://www.youtube.com/watch?... [youtube.com]

    • Oh I know that guy. It's the same guy who in his review of a heavily overclocked RTX 3080 showed that the card wasn't powerful enough to even cross 100fps much less get to the 144fps max of a modern gaming monitor at max graphics for half of his tests!

      Yes it's a gaming GPU. Just like people who ran 3x 1080 Tis in SLI were also gaming, and those people with 4x 980Tis in SLI were also gaming. Pretending that there isn't a market for extreme high end gaming is just being silly.

      That and few people are buying th

  • by Tough Love ( 215404 ) on Friday September 25, 2020 @03:46PM (#60544218)

    Fuck you NVidia. You can keep your crappy proprietary hardware.
