
NVIDIA GeForce RTX 3080 Tested: a Huge Leap Forward In Gaming Performance (hothardware.com)

MojoKid writes: NVIDIA CEO Jensen Huang officially unveiled the GeForce RTX 30 series, based on the company's new Ampere architecture, a couple of weeks back. According to Huang, the GeForce RTX 30 series represents the greatest generational leap in the company's history, and he claimed the GeForce RTX 3080 would offer double the performance of its predecessor. The embargo for GeForce RTX 3080 reviews just lifted, and it seems NVIDIA was intent on making good on its claims. The GeForce RTX 3080 is the fastest GPU released to date, across the board, regardless of the game, application, or benchmark used. Throughout testing, the GeForce RTX 3080 often put up scores more than double the performance of AMD's current flagship Radeon RX 5700 XT. The RTX 3080 even skunked the NVIDIA Titan RTX and GeForce RTX 2080 Ti by relatively large margins, even though it will retail for almost half the price of a 2080 Ti (at least currently). The bottom line is, NVIDIA has an absolutely stellar-performing GPU on its hands, and the GeForce RTX 3080 isn't even the best Ampere has to offer, with the RTX 3090 waiting in the wings. GeForce RTX 3080 cards will be available from NVIDIA and third-party board partners on 9/17 at an entry-level MSRP of $699.
  • All the criminal bitcoin miners will snatch it up using automated scripts that scrape every single retail outlet selling these things and buy up the stock.
    • by MojoKid ( 1002251 ) * on Wednesday September 16, 2020 @06:42PM (#60512992)
      Really? Is anyone mining on GPUs anymore? Doesn't it cost more to power them than you actually make?
      • Especially these cards, which draw upwards of 450W at the wall all by themselves.
        • The current most efficient mining card is the 2070 Super.
          If a 3080 pulls twice as much power but has 3x the TFLOP throughput, that's even more economical.
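
          A quick sanity check on that efficiency argument, as a minimal sketch: the FP32 and TDP figures below are approximate published specs, and Ethash mining is typically memory-bandwidth bound rather than FP32 bound, so TFLOPS are only a rough proxy for hash rate.

          ```python
          # Perf-per-watt comparison using approximate FP32 specs (illustrative only).
          cards = {
              "RTX 2070 Super": {"tflops": 9.1, "watts": 215},
              "RTX 3080":       {"tflops": 29.8, "watts": 320},
          }
          for name, c in cards.items():
              print(f"{name}: {c['tflops'] / c['watts'] * 1000:.1f} GFLOPS/W")
          # Roughly 42 vs 93 GFLOPS/W: even at ~1.5x the power, efficiency improves.
          ```
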
      • Apparently, some people think it will work:
        https://www.tweaktown.com/news... [tweaktown.com]
        https://www.tomshardware.com/n... [tomshardware.com]
        https://www.techradar.com/news... [techradar.com]
      • by JBMcB ( 73720 )

        Everyone I know who is still mining bitcoin is using ASIC miners that are 10-100x more efficient than GPUs.

        • Exactly. I don't see this thing starting a GPU coin-mining resurgence. They're just too inefficient. Real miners are doing it on ASICs now.
        • Everyone I know who is still mining bitcoin is using ASIC miners that are 10-100x more efficient than GPUs.

          GPU owners are paid in bitcoin to mine altcoins for others. As I describe above in a different post, an NVIDIA GTX 1060 is currently profitable, as is an AMD Radeon 570.

      • Really? Is anyone mining on GPUs anymore? Doesn't it cost more to power them than you actually make?

        At the moment it is profitable with a 1060 6GB, at a residential rate of $0.15/kWh. The break-even at the current difficulty level is around $8700. And this is with a home workstation. This system doesn't have any of the optimizations a dedicated GPU mining rig might have (firmware mods, underclocking, etc.), and it has more components drawing power than a mining rig. Despite being less than optimal, it's still profitable.

        Caveat: I am on a time-of-use billing scheme, and I only mine during my lowest-rate period. (A rough sketch of the math is below.)
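
        To make that back-of-the-envelope reproducible, here is a minimal profitability sketch. The hash rate, power draw, and payout figures are assumptions chosen for illustration; only the $0.15/kWh rate comes from the comment above.

        ```python
        # GPU-mining profitability sketch (hardware and payout figures are assumptions).
        hash_rate_mh = 22.0       # assumed Ethash hash rate for a GTX 1060 6GB, MH/s
        power_watts = 120.0       # assumed card draw while mining
        usd_per_kwh = 0.15        # residential rate cited in the comment
        usd_per_mh_day = 0.075    # assumed payout per MH/s per day; varies with price/difficulty

        revenue = hash_rate_mh * usd_per_mh_day                 # $/day earned
        power_cost = power_watts / 1000.0 * 24 * usd_per_kwh    # $/day of electricity
        print(f"revenue ${revenue:.2f}/day, power ${power_cost:.2f}/day, "
              f"net ${revenue - power_cost:.2f}/day")
        ```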

    • by stikves ( 127823 )

      Nobody mines "bitcoin" on GPUs anymore.

      But yes, there are still mining operations for "altcoins," some of which are fairly big businesses. They probably have direct access to distributors and will gobble up a bunch of the supply.

      On the other hand, this is going to bring lots of cheap second-hand GPUs to the market in a year or so.

      • Those cheap second-hand GPUs from crypto miners have been run at maximum temperature 24/7 since the day they were bought. That's not something you want to buy even at a low price.

    • Not likely. It looks like the Radeon VII is a faster miner. Nobody in their right mind would buy a 3080 for $699 just to do 80 MH/s mining ETH.

      Oh, you thought people use video cards to mine Bitcoin? No, they don't. They might use them for something like NiceHash, but that's indirect mining, and it's not very profitable right now.

    • Are you time travelling from 2017/2018? Most miners sold their GPU rigs years ago. It costs more to mine on a GPU than it's worth. Hell, the second-hand market was flooded with GPUs, which overall was very good for people needing an upgrade at the time.

    • While that would have been valid two or three years ago, the average Joe can't make any money mining now. You would spend a fortune on electricity before sniffing anything close to a coin.
    • Clearly you know nothing about bitcoin, but you like to comment on things you don't understand.
  • If it can do 8K at 120 fps, then it can do dual 5K at 120 fps (two 5K panels have fewer pixels than one 8K panel). If it can do dual 5K, then we may finally be close to VR without blur or the screen-door effect.
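
    The pixel arithmetic behind that claim checks out, assuming the standard 5120x2880 (5K) and 7680x4320 (8K) panel resolutions:

    ```python
    # Two 5K panels vs one 8K panel, by raw pixel count.
    dual_5k = 2 * 5120 * 2880   # 29,491,200 pixels
    one_8k = 7680 * 4320        # 33,177,600 pixels
    print(dual_5k < one_8k)     # True: dual 5K is about 11% fewer pixels than 8K
    ```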

    • Re:8K at 120 fps (Score:4, Interesting)

      by DamnOregonian ( 963763 ) on Wednesday September 16, 2020 @06:42PM (#60512994)
      That was the first place my head went while pondering "Why would anyone need a 3090 if the 3080 is that fast??"

      Answer: to drive high-resolution VR.
      My 2080 Ti struggles with my Valve Index in some games at 90 Hz, and in just about all games at 120.
      It would be awesome to see that smooth as butter.
      • The 3090 has 24GB of onboard RAM.

        This makes it appealing to those who use GPU rendering systems, as onboard RAM (on current-gen non-Quadro cards) is the limitation.
        Large scenes with lots of textures require lots of memory.

      • Answer: to drive high-resolution VR.

        That's one answer, but it's far from the only one. Here are some others:
        - Future development: you need next-gen hardware to design next-gen games.
        - Commercial design: when you're not optimising polygons or faking textures, the PC can get incredibly slow.
        - Simulation and rendering: even simple tasks are greatly improved by offloading to the GPU if they are embarrassingly parallel (a quick sketch follows below), and if your paycheck depends on it...

        Oh, speaking of commercial and simulation: NVLink. The 3080 does not support NVLink, so basically
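
        On the "embarrassingly parallel" point above, a minimal sketch of what that offloading looks like. This assumes CuPy and a CUDA-capable GPU are available; CuPy mirrors the NumPy API, so the swap is a one-line change.

        ```python
        # An element-wise workload: each output depends only on one input element,
        # so it maps trivially onto thousands of GPU cores.
        import numpy as np
        # import cupy as np   # hypothetical drop-in swap to run the same code on a GPU

        x = np.random.rand(10_000_000)
        y = np.sqrt(x) * 2.0 + 1.0   # independent per-element math: embarrassingly parallel
        print(float(y.mean()))
        ```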

      • by psavo ( 162634 )

        Is the Index usable/reasonable as a desktop monitor replacement? I mean for code/dev work, not just games.

      • That was the first place my head went while pondering "Why would anyone need a 3090 if the 3080 is that fast??" Answer: to drive high-resolution VR.

        Wrong answer.

        At 4K, the 3080 manages ~30-120 fps with current-gen games. That's good compared to previous gens, but it's not great.

        And we're already moving into the next gen.

    • From some tests I have seen, it's not a huge leap; it's at most 40% better in games, and even less in some.

      The latest Microsoft Flight Simulator is still not up to 60 fps. Maybe it needs the 3090. Maybe it's inefficient code.

      A huge leap to me is at least a 2x performance increase over the previous generation.

    • by Guspaz ( 556486 )

      The 3090 achieves 8K120 via DLSS from 1440p, so if you want to keep the same number of input pixels to drive dual 5K, you're going to need to upscale from slightly below 1080p per eye. That is perfectly doable; DLSS for 4K uses 1080p as the input resolution in its lowest-quality mode, so you're not stretching it that much. The question would be whether the GPU's tensor cores have the throughput to manage that. You're basically talking about upscaling 240 frames per second, and I don't think I've ever seen DL
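
      The parent's scaling argument, worked through numerically. This takes the reported 1440p-to-8K DLSS mode at face value and assumes a 16:9 input per eye:

      ```python
      import math

      # Keep the DLSS input-pixel budget constant and split it across two 5K eyes.
      budget = 2560 * 1440                  # input pixels of the reported 1440p-to-8K mode
      per_eye = budget / 2                  # same budget shared by two panels
      height = math.sqrt(per_eye * 9 / 16)  # solve w*h = per_eye with w/h = 16/9
      width = height * 16 / 9
      print(f"~{width:.0f} x {height:.0f} per eye")  # ~1810 x 1018: just under 1080p
      ```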

  • Who can afford it? (Score:4, Insightful)

    by Snotnose ( 212196 ) on Wednesday September 16, 2020 @07:31PM (#60513202)
    The card itself costs more than my expected next CPU + motherboard + GPU + case + power supply + flowers for the wife.

    Not to mention that the power requirements will greatly shorten my Tesla home batteries' lifetimes, plus the added air-conditioning costs (some rough numbers below).

    I think I'll wait a year until PS5 prices come down a bit (1/3 the cost of this video card) and there are some decent games for it.
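
    For scale, a minimal sketch of the electricity side of that complaint. The 320W figure is the 3080's rated board power; the daily hours and the $/kWh rate are assumptions:

    ```python
    # Added electricity for an RTX 3080 at full load (usage figures are assumptions).
    card_watts = 320        # rated board power of the RTX 3080
    hours_per_day = 3.0     # assumed daily gaming time
    usd_per_kwh = 0.15      # assumed residential rate

    kwh_per_day = card_watts / 1000.0 * hours_per_day
    print(f"~{kwh_per_day:.2f} kWh/day, ~${kwh_per_day * usd_per_kwh * 30:.2f}/month")
    # ~0.96 kWh/day, roughly $4.32/month before air-conditioning overhead.
    ```
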
    • Lots of people can afford it, and if you take into consideration that this card will be able to play the latest games for 4-5 years without an issue, it's a no-brainer. Heck, I am still using the GTX 1080 I bought back in 2016, and it still plays everything at the highest quality. The only thing it struggles with is VR.

      • by Cederic ( 9623 )

        Yeah, my 1070 is still meeting my needs at 1440p.

        I do turn off AA though, so a more powerful card will be welcome in my next PC.

    • by Arkham ( 10779 )

      I saved over $10,000 by not going on vacations due to COVID. I think I can spare $1000 on computer parts...

    • The card itself costs more than my expected next CPU + motherboard + GPU + case + power supply + flowers for the wife.

      Are you joking? I hate to sound elitist, but this is a device for gamers, and its price point puts it squarely in the mid-tier of the previous generation of equipment while its performance absolutely smashes the top tier. Even the 2080 Ti sold quite well despite its $1200 price point. People are falling over themselves trying to get this card, to the point where pre-orders are being scalped online at a $500 premium.

      So to answer your question: Pretty much the entire target market can afford this card.


    • So you're building a Ryzen 3 + GTX 1660 level PC? Solid, but definitely entry level. There are a fair number of people building $700-1000 PCs for the best value-to-performance ratio, and all those building $1k-$2k PCs could imagine bumping up to a $700 graphics card, especially depending on how much they prioritize having a good gaming PC versus, say, a new TV or couch, or dropping $10k-$15k on a Tesla home battery or whatever. Just because you say you can't afford it doesn't mean that no one can.
  • Whether I am exporting video from Premiere or rendering with the Cycles engine in Rhino 3D, NVIDIA's choice to only offer cards that blow hot air around the case causes problems. Perhaps I will wait out the 3080 until they come out with a blower card, or perhaps I will just buy a pair of used 1080 Tis. BTW, Linus did a review and found that the RTX 3080 is not usually twice as fast as claimed.
    • Edit: lack of blower cards is bad for content creation.
    • BTW, Linus did a review and found that the RTX 3080 is not usually twice as fast as claimed.

      There are a lot of reviews with lots of different answers. The answer, as always, is *it depends*. Many reviews have actually confirmed "twice as fast" or close to it, provided it is used in a situation where the GPU was previously wholly the bottleneck.

      Also agreed on the cooling solution. A blower has a purpose, but this one is especially curious since it ejects hot air from the card directly onto the CPU. It's a more "GPU matters, everyone else can GGF" approach than usual.

      #watercoolingisthebestcooling.

    • by Agripa ( 139780 )

        BTW, Linus did a review and found that the RTX 3080 is not usually twice as fast as claimed.

      And slower for computation applications.

  • https://www.youtube.com/watch?... [youtube.com] It's not really 2 times faster but it is the fastest consumer card on the market.

    • The answer, as always, is "it depends." Honestly, I'm impressed you can get through Linus's videos, but the reality is that in some situations it's twice as fast and in others it's not. Someone should teach Linus how to computer so he stops being disappointed when playing with the fastest consumer card on the market (while at the same time not the most expensive).

    • Oh, it really is by any technical measure.
      Sequentially rendered frames, however, shouldn't be expected to scale linearly.
      They do scale, but they're far more bound by minimum latencies in the pipeline, which change at a different rate than the gross horsepower of the shader cores.

      I.e., "twice as fast" isn't inaccurate, just based on the TFLOP throughputs.
      However, I sympathize with those who considered that to be another way of saying "twice as many frames per second in any app," particularly because that's
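
      That latency point can be made concrete with a toy frame-time model: if part of each frame is fixed overhead that doesn't scale with shader throughput, doubling TFLOPS yields less than double the frame rate. All numbers are illustrative assumptions:

      ```python
      # Toy Amdahl-style frame-time model: fixed latency plus shader-bound work.
      serial_ms = 4.0    # assumed per-frame overhead (sync, memory, driver, ...)
      shader_ms = 12.0   # assumed shader-bound time on the previous generation

      for speedup in (1.0, 2.0):
          frame_ms = serial_ms + shader_ms / speedup
          print(f"{speedup:.0f}x shader throughput -> {1000 / frame_ms:.1f} fps")
      # 1x -> 62.5 fps, 2x -> 100.0 fps: double the TFLOPS, only ~1.6x the frames.
      ```
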
  • I've only been waiting like 25 years for a practical consumer ray tracing graphics card. I, for one, will be buying this the day it's available!

    • I've only been waiting like 25 years for a practical consumer ray tracing graphics card. I, for one, will be buying this the day it's available!

      Unless you were very lucky this morning, no, you won't. They're sold out everywhere.
