Nvidia Unveils Powerful New RTX 2070 and 2080 Graphics Cards (polygon.com)

During a pre-Gamescom 2018 livestream from Cologne, Germany, Nvidia on Monday unveiled new GeForce RTX 2070, RTX 2080 and RTX 2080 Ti high-end graphics cards. These new 20-series cards will succeed Nvidia's current top-of-the-line GPUs, the GeForce GTX 1070, GTX 1080 and GTX 1080 Ti. While the company usually waits to launch the more powerful Ti version of a GPU, this time around, it's releasing the RTX 2080 and RTX 2080 Ti at once. Polygon adds: They won't come cheap. The Nvidia-manufactured Founders Edition versions will cost $599 for the RTX 2070, $799 for the RTX 2080 and $1,199 for the RTX 2080 Ti. The latter two cards are expected to ship "on or around" Sept. 20, while there is no estimated release date for the RTX 2070. Pre-orders are currently available for the RTX 2080 and 2080 Ti. Nvidia CEO Jensen Huang announced different "starting at" prices during the keynote presentation. Huang's presentation said the RTX 2070 will start at $499, the RTX 2080 at $699 and the RTX 2080 Ti at $999. Asked for clarification, an Nvidia representative told Polygon that these amounts reflect retail prices for third-party manufacturers' cards.

The RTX 2070, 2080 and 2080 Ti will be the first consumer-level graphics cards based on Nvidia's next-generation Turing architecture, which the company announced earlier this month at the SIGGRAPH computing conference. At that time, Nvidia also revealed its first Turing-based products: three GPUs in the company's Quadro line, which is geared toward professional applications. All three of the new RTX cards will feature built-in support for real-time ray tracing, a rendering and lighting technique for photorealistic graphics that gaming companies are starting to introduce this year.

This discussion has been archived. No new comments can be posted.

  • Wake me up when the consumer cards can do accelerated 16-bit floating point math.

  • AMD (Score:5, Insightful)

    by MachineShedFred ( 621896 ) on Monday August 20, 2018 @05:19PM (#57162632) Journal

    AMD: your market share is going to be rising with these prices.

    Holy shit. Seriously, Nvidia?

    • Re:AMD (Score:5, Informative)

      by GuB-42 ( 2483988 ) on Monday August 20, 2018 @05:52PM (#57162840)

      AMD currently doesn't have cards that match nVidia on the high end. No competition there, so nVidia is free to overprice.
      The only thing nVidia has to do in order to compete with AMD is drop the price of their 1070s, something they probably won't do because they don't want to compete with themselves just to piss off AMD.

      It is not a new situation: nVidia occupying the high end with high-priced, high-performance cards and AMD occupying the midrange with good-value cards is typical.

      Right now, I am not a fan of AMD's discrete GPUs. Their APUs are great though.

      • Re:AMD (Score:5, Insightful)

        by Tough Love ( 215404 ) on Monday August 20, 2018 @07:14PM (#57163306)

        AMD's strategy makes sense: they are now concentrating resources on exploiting their Ryzen advantage while starting to develop their brand new Epyc server market. On the GPU side they just continue to bang out parts on the mature 14nm process, which gets cheaper the longer they run it. RX 560/570/580 cards remain highly respectable products, giving AMD the luxury of either fattening their gross profit or clawing back more market share. [wccftech.com] They seem to be steering a middle course, with retail prices slowly coming down and market share slowly coming up. The stage is set for a showdown at 7nm in late 2019.

        Personally, AMD vs NVidia is a no brainer because:
        1) the open source AMD drivers are awesome
        2) Vulkan/DX12 are taking over, I don't care about obsolete 3D engines running a bit slower
        3) fuck NVidia.

        • So what AMD card would you recommend for VR? PCMag [pcmag.com] seems to only recommend nvidia chipsets.
          • Vega 64 obviously, but I don't care about VR, so far there is no killer game and the helmet is just too weird. I'll check again 5 years from now.

            • My kids have been asking for VR, and I thought I would be a cool dad. I'm a little curious myself.
              • My brother bought an HTC Vive about 6 months after it came out; he has 4 kids. The kids loved it for 6 months, but they haven't touched it much since. The youngest (5) occasionally still uses it, nobody else. I told him that would happen, but he wanted to be a cool dad :) IMO it's too expensive considering you need the Vive, a good GPU, a decent PSU (to run the GPU at full tilt) and a decent CPU.

            • For me, Fallout VR and Skyrim VR are killer apps. I also like a lot of the smaller indie games for VR.

              I live alone. I don't give a flying fuck how weird the helmet looks. I care that it entertains me, and it does.

              Looking forward to adding an even more powerful Nvidia card to my rig. FSM knows AMD isn't up to the task

              • FSM knows AMD isn't up to the task

                You need to get some of that Vulkan. [anandtech.com] If I was into VR I would not be trying to optimize my hardware for an obsolete rendering model.

                • by GuB-42 ( 2483988 )

                  If you want to do VR right now, get a 1070 (1060-6GB to 1080 depending on your budget). Personally VR is the only reason I am looking into high end GPUs. For regular games, pretty much anything goes nowadays if you are willing to tone down the settings a little bit. AMD's APUs look very appealing in that regard, it will probably be my next purchase for my second PC.

                  Why would you get a Vega 64 that is not cheap and struggles with some of the most demanding current VR titles? I understand that you want to foc

                  • Why would you get a Vega 64 that is not cheap and struggles with some of the most demanding current VR titles?

                    Because Vega is the best GPU for Vulkan/DX12. Why would you invest in hardware optimized for obsolete 3D engines? Obviously, VR is moving heavily to Vulkan and friends; otherwise it sucks too much.

            • > no killer game

              You seriously need to experience Elite:Dangerous in VR.

        • by AmiMoJo ( 196126 )

          Freesync is better than NVidia's proprietary crap too.

          Don't know if it is still true but AMD seemed to be a lot faster for compute tasks a couple of years ago too.

    • unlikely, AMD card prices aren't exactly bargains either.
      • Re:AMD (Score:4, Insightful)

        by Tough Love ( 215404 ) on Monday August 20, 2018 @07:26PM (#57163376)

        AMD card prices aren't exactly bargains either.

        Getting there. RX 560 cards are running $130-140 now and RX 580 around $225. When 580 gets down to $200 it's definitely a bargain, and even as it is, it's hard to complain.

    • Re:AMD (Score:4, Interesting)

      by laffer1 ( 701823 ) <luke@@@foolishgames...com> on Monday August 20, 2018 @06:06PM (#57162938) Homepage Journal

      Blame cryptocurrency. While the demand is dying on that front, it proved to NVIDIA and AMD that they can charge more for graphics cards and get away with it.

    • by mentil ( 1748130 )

      AMD: your market share is going to be rising with these prices.

      Ryzen, surely?

    • for a top end part. Right now AMD doesn't really have anything close to it either. The Vega line isn't bad but even the 1080 beats it on price, performance and even power consumption and heat. AMD does better in the mid range. The RX 580 can be had for $230 on sale with 8gb of RAM and while it's slower than a 1060 (it's nearest competitor) that will change as newer games want that extra 2gb.

      That said, AMD's had a mountain of driver issues for a long, long time. I've heard they're better now, but I still
    • I want to support the underdog but AMD is not delivering in the GPU market at all.

      Also these GPUs are for crazy well off people / young IT kids in their early jobs, living at home and blowing lots of spare money on their PC. I know, I used to do this stuff.

      "Normal" gamers won't consider these cards and "normal" gamers probably still have a 1080p display.

      You could buy a used 1070 card in a few months for probably $400 US and get 60% of the performance of this card for 35% of the price.

      AMD isn't an option, t

    • Doubtful. I have no reason to buy AMD kit until it can surpass the Nvidia stuff I already have. AND work with the games I want to use it with without any SINGLE bit of me having to do something special.

      They'll never get there.

    • AMD: your market share is going to be rising with these prices.

      Holy shit. Seriously, Nvidia?

      Why would it? In the high end the RX Vega 64 is outperformed in both raw speed and price by these offerings. The only thing laughable here is AMD's high-end offerings, and in the low end both companies are price competitive.

    • by mjwx ( 966435 )

      AMD: your market share is going to be rising with these prices.

      Holy shit. Seriously, Nvidia?

      I bought a 970 a bit over 2 years ago and am yet to stress it with anything, so I think I'll wait another year for a cheap 2070 or the next gen, the 2170 or whatever they plan on calling it.

      • by Luckyo ( 1726890 )

        Problem being that the gap between 7xx and 9xx was about two years. 9xx to the 1xxx generation was almost three years. The gap between 1xxx and this is going to be over three years, it seems.

        So your 970 will have to last almost a decade to be able to jump three generations at this rate.

        Notably, I went to 970 from 560Ti, and that took only about five years. Card wasn't overclocked, and I don't do any crypto, and it burned out in normal gaming use. My current 970 is almost three years old, and it's going strong (knock on

  • by JoeyRox ( 2711699 ) on Monday August 20, 2018 @05:20PM (#57162642)
    Now that GPU sales demand for cryptomining has all but disappeared they're looking for a honey pot to replace it with. Good luck.
    • NVidia got their tail kicked in the crypto market long before it imploded because AMD proved to be more power efficient for that load. I'm not sure exactly why, but it seems to have something to do with NVidia optimizing only for 32-bit floating point.

      • by ceoyoyo ( 59147 )

        NVIDIA hobbles non-32 bit floating point support on their gaming cards so there's incentive to buy their workstation cards. It sounds like that's going to change with this next generation, probably due to deep learning rather than cryptocurrency mining.
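        As a side note on what half precision actually trades away: a float16 value carries only an 11-bit significand, so exact integer arithmetic breaks down past 2048. A minimal CPU-side NumPy sketch (NumPy emulates float16 in software here; this illustrates the format, not any GPU code path):

        ```python
        import numpy as np

        # float16: 1 sign bit, 5 exponent bits, 10 fraction bits (11-bit significand).
        # Above 2048 the spacing between representable values is 2, so adding 1
        # rounds straight back to the original value.
        a = np.float16(2048)
        b = np.float16(1)
        print(a + b == a)        # True: 2049 is not representable in half precision

        # The same sum is exact in single precision, which is why FP32-only
        # gaming cards lose no accuracy here -- only potential throughput.
        print(np.float32(2048) + np.float32(1))  # 2049.0
        ```

        Deep learning tolerates this loss of precision well, which is why FP16 throughput matters there far more than for most game shaders.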

    • they're looking for a honey pot to replace it with

      If you think that a market can implode and a few months later a new product is announced, I have a bridge to sell you. It's not there yet, but I can build it for you in a week if you want.

      • I was referring to their pricing of their new boards, not the fact they announced them.
      • by Luckyo ( 1726890 )

        The main thing to remember is that prices are still inflated coming down from the crypto boom, even as demand rapidly fell off. There have been reports of OEMs returning unsold GPUs to nvidia because of it.

        But no one is willing to drop prices, likely because the medium-term financial goals for pricing have been set, and big OEMs and large companies like nvidia lack institutional flexibility. So we're not going to have a "market implosion". But we'll likely have all the people that have been holding out for p

  • On a big pile of money!

    • Who buys a $2000 gaming monitor when you can get a much bigger 4K TV for half that? Granted, the only screen it makes sense to curve is a gaming monitor screen, since one generally sits much closer to it.
      • by mikael ( 484 )

        You can get way lower than that. Some manufacturers of 3D TVs were running promotions and selling 48" screens at $450.

      • by Mal-2 ( 675116 )

        Because latency.

        I bought a cheap 4K TV with the intent of using it as a monitor. Unfortunately, it has a whopping 250 ms of input lag on every input including broadcast TV. I had to dial the delay on the audio all the way up (to 200 ms) and it still has the audio running just a shade ahead of the video. As a monitor, or for console gaming, it is completely unusable. The lag is bad enough to induce not only incorrect inputs, but motion sickness.

        I couldn't return it, as it was deemed to be functioning correct

      • by Z00L00K ( 682162 )

        Most of the curved screens have a pretty low vertical pixel count, rendering them pretty useless for anything but gaming through a letterbox opening.

        And I'm actually a bit disappointed with the new cards, the performance figures don't seem to be a radical improvement over the GTX1080Ti.

        Real time ray tracing might be nice for anyone making a movie though.

  • Poor Value (Score:5, Informative)

    by mentil ( 1748130 ) on Monday August 20, 2018 @05:30PM (#57162716)

    The RTX 2080Ti has 19% more FLOPS than the 1080Ti... and costs 54% more money.
    NVIDIA emphasized the new raytracing performance, presumably to deflect that fact.
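    For anyone who wants to check the math, the two percentages fall out of a few lines of arithmetic. The TFLOPS figures and the $779 price for the 1080 Ti are rough launch-era approximations, assumed here purely to reproduce the quoted numbers:

    ```python
    # Rough launch-era figures (assumptions, not official benchmarks):
    # GTX 1080 Ti: ~11.3 FP32 TFLOPS at roughly a $779 street price
    # RTX 2080 Ti: ~13.4 FP32 TFLOPS at the $1,199 Founders Edition price
    flops_1080ti, price_1080ti = 11.3, 779.0
    flops_2080ti, price_2080ti = 13.4, 1199.0

    flops_gain = flops_2080ti / flops_1080ti - 1  # relative FLOPS increase
    price_gain = price_2080ti / price_1080ti - 1  # relative price increase
    print(f"FLOPS: +{flops_gain:.0%}  price: +{price_gain:.0%}")
    # -> FLOPS: +19%  price: +54%
    ```

    Raw FP32 throughput ignores the new RT and tensor cores, of course, which is exactly the point being argued in the replies below.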

    • The RTX 2080Ti has 19% more FLOPS than the 1080Ti... and costs 54% more money. NVIDIA emphasized the new raytracing performance, presumably to deflect that fact.

      Or perhaps it's not just about FLOPS? The 1080Ti lacks RT cores and Tensor cores but if raw FLOPS is all you care about then yes there are better value options.

      • by MrL0G1C ( 867445 )

        Tell me, why should gamers care about tensor cores? And I doubt the RT cores will be used much outside of a couple of demos for a long long time.

        It seems to me they're shoving a lot of non-gaming silicon into gaming cards. Seems like a waste of space; gamers are paying for a lot of R&D and silicon that they'll never use.

        • Tell me, why should gamers care about tensor cores? And I doubt the RT cores will be used much outside of a couple of demos for a long long time.

          Because the next generation of games will utilise them; developers need silicon and APIs to build these new games. Very often new products include features that you can't effectively utilise on day one but pay for anyway, because they need to solve the chicken-and-egg problem. If you think this is supposed to just be a faster 1080 then you've misinterpreted it. That product simply doesn't exist, regardless of how much you want it, but it does leave the market open for a competitor like AMD to fill t

          • Re:Poor Value (Score:4, Interesting)

            by darkain ( 749283 ) on Monday August 20, 2018 @07:33PM (#57163432) Homepage

            The chicken/egg problem was already solved with this generational upgrade, actually. nVidia worked with Epic to include RTX support directly into the Unreal 4 engine already. Microsoft has already updated DirectX 12 for RTX support. Game studios have had access to the hardware in one form or another for a while now. During the presentation they listed a bunch of games with RTX support; several titles are already on the market and are simply getting an upgrade.

      • by mentil ( 1748130 )

        Older games are unlikely to add raytracing support (a couple like FF15 are, though) so that won't affect those titles. Most indie games/VR titles won't have the budget for adding nvidia-specific graphics options. If the RT cores can be used for audio tracing that might be compelling for VR. It's unclear what the tensor cores would be used for in games aside from antialiasing and raytracing denoising.
        Unless the next consoles support it, it'll likely remain a niche feature only supported by a handful of AAA g

        • It's unclear what the tensor cores would be used for in games aside from antialiasing and raytracing denoising.

          They can be used for inferencing; if you take a look at how they've been training neural networks with raytraced images, that's a big part of how this generation can do realtime raytracing.

      • by Kjella ( 173770 )

        Or perhaps it's not just about FLOPS? The 1080Ti lacks RT cores and Tensor cores but if raw FLOPS is all you care about then yes there are better value options.

        Well nVidia can tout the benefits all they like but these are effectively halo features from their AI/workstation cards and not available at all on any lesser GPUs. So how many games will create unique effects that'll only work on extremely high end 2018+ cards? Are these features worth sacrificing stream processors for mainstream/value cards or will it remain at the high end? Every bit of those TFLOPS on the 1080 Ti is usable today across a wide variety of games. I got mine at launch and it's starting to l

        • Well nVidia can tout the benefits all they like but these are effectively halo features from their AI/workstation cards and not available at all on any lesser GPUs. So how many games will create unique effects that'll only work on extremely high end 2018+ cards?

          Those features are exposed to applications via APIs like DirectX and Vulkan. DirectX's DXR for example has a layer to support using it without hardware support and those features already have application support via DXR in Unity, Unreal and some of EA's engines. Being able to switch from the rasterizer to the raytracer in a game would be pretty awesome though.

    • you don't buy it for price/performance. You buy it because money isn't an object and you just plain _want_ it.
    • The RTX 2080Ti has 19% more FLOPS than the 1080Ti... and costs 54% more money.

      It has 19% more FLOPS. That's the end of it. The top end of the market of ANY PC component has always been a long tail in terms of performance per dollar. 54% more money is irrelevant if you actually need the 19% more performance. It is also how trickle down technology has always worked. You want the 19% performance but don't want to spend the money? Come and ask me about it in 2 years.

  • Have 1080's come down?
    • by ELCouz ( 1338259 )
      Why buy a two-year-old GPU? The 1080's price will not drop much until it's out of stock!
      • Because I don't want to spend 54% more for a new card? I thought 1080's were already over priced.
        • by ELCouz ( 1338259 )
          Bad logic... look at old Intel i5/i7 CPU prices; they even increased!! Older doesn't mean cheaper! Sadly, waiting for old performing parts to show up doesn't mean saving money!
        • Because I don't want to spend 54% more for a new card? I thought 1080's were already over priced.

          So buy an AMD GPU then. The thing you seem to be failing to understand here is this isn't "a 1080, but faster", it's a GPU focussed on raytracing. If all you want is a new card that's like the current crop but faster and priced accordingly then this is not the product for you.

          • I want a card that will be capable of any game on the market for the next year or so and can do VR.
            • Capable of any game on the market with the graphics quality on high.
            • Also I know I've bought 'good graphics cards for the time' in the past and they cost more around $300 not $600.
            • I want a card that will be capable of any game on the market for the next year or so and can do VR.

              Well then a multi-GPU setup with the current generation is probably appropriate. While I'm sure the RTX GPUs will be capable they include things you likely won't need for the next year of games like cores devoted to tensor operations and cores designed for raytracing operations and so by buying one you would be paying for things you don't need. Of course if no products exist on the market then no company is going to produce software to utilise those features so this is how you solve the chicken and egg prob

              • So, out of interest, who would use a card like this?
                • So, out of interest, who would use a card like this?

                  Gamers and graphics enthusiasts who want the best performance despite paying for things they don't need right now, i.e. more concerned about performance than price value. Developers building the next generation of games and 3d applications.

              • I guess I'm confused, because the low end of this card is cheaper than a 1080, yet I'm told that the 1080 is a two-year-old card. When I go to pcmag for good cards for VR they list the 1080, 1070 and 1060. You are saying this card is different somehow and will not lower the price of the 1080?
                • You are saying this card is different somehow

                  Well I don't think I would have to say it, isn't it clear from the announcement and all the articles and documentation that yes indeed it is very very different?

                  and will not lower the price of the 1080?

                  I'm not saying that, I have no idea what pricing effects will occur.

                  • Ray tracing is for lighting in games, so given that I said I wanted to play games with it, why would you tell me these cards aren't for me?
                    • So what is newer and better than a 1080 but doesn't have such prospective features?
                    • Ray tracing is for lighting in games

                      Errr... ok, that's a weird way to describe raytracing. What games are doing is raytraced lighting (more to the point for this product, raytracing with AI inferencing, which is what all those tensor cores are used for). I think you'll find pretty much all games are rasterized, not raytraced, and any raytracing is done in offline steps to build things like lightmaps.

                      so given that I said I wanted to play games with it, why would you tell me these cards aren't for me?

                      Because - other than your misunderstanding of raytracing - you said you don't want to pay 54% more for a new card so clearly it's
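                      For anyone fuzzy on the distinction being argued here: raytracing casts a ray per pixel and tests it against scene geometry, instead of projecting triangles onto the screen as a rasterizer does. A deliberately tiny sketch, with the scene, resolution and shading all made up for illustration:

                      ```python
                      import math

                      def hit_sphere(origin, direction, center, radius):
                          """Return distance to the nearest ray/sphere hit, or None.

                          Solves |origin + t*direction - center|^2 = radius^2 for t >= 0."""
                          oc = [o - c for o, c in zip(origin, center)]
                          a = sum(d * d for d in direction)
                          b = 2.0 * sum(o * d for o, d in zip(oc, direction))
                          c = sum(o * o for o in oc) - radius * radius
                          disc = b * b - 4 * a * c
                          if disc < 0:
                              return None                      # ray misses the sphere
                          t = (-b - math.sqrt(disc)) / (2 * a)
                          return t if t >= 0 else None

                      # Cast one ray per "pixel" of a toy 8x4 image toward a sphere at z=-3.
                      width, height = 8, 4
                      center, radius = (0.0, 0.0, -3.0), 1.0
                      for y in range(height):
                          row = ""
                          for x in range(width):
                              # Map the pixel to a point on a screen plane at z=-1.
                              u = (x + 0.5) / width * 2 - 1
                              v = 1 - (y + 0.5) / height * 2
                              hit = hit_sphere((0, 0, 0), (u, v, -1.0), center, radius)
                              row += "#" if hit is not None else "."
                          print(row)
                      ```

                      Real engines trace many rays per light bounce against full scenes using acceleration structures, then denoise the noisy result, which is where the dedicated RT and tensor cores come in.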

        • by Cederic ( 9623 )

          So buy a 2070, get the shiny new raytracing and superior performance to the 1080?

          I'll stick with my 1070 for another year or so, prices will almost certainly be lower by then.

      • by zlives ( 2009072 )

        because it will make my 5yr old cpu sing?!!

    • Have 1080's come down?

      No, why would they? A 1080 from Nvidia is $550, a 2080 from Nvidia is $800.

      Prices have come down in the sense that crypto-inflated pricing is ending, but prices from Nvidia itself never reflected that inflation, only retail prices did. Nvidia was simply out of stock and new orders were wait listed. At retail a 1080 could have been $1,000 or more. Currently retail is $500-550, maybe slightly higher than pre-crypto mania pricing. Hopefully we get back to pre-crypto, but that's about it, 2080 are so much mo

      • The last time I put a gaming PC together was a long time ago; maybe even 10 years. But a premium video card back then was around $400 CDN. I guess those days are just over.
        • The last time I put a gaming PC together was a long time ago; maybe even 10 years. But a premium video card back then was around $400 CDN. I guess those days are just over.

          Not really. A 1070 is a pretty damn good card and is US$400 from Nvidia, and if we ignore the last year's crypto-inflation, less than that from ASUS, MSI, etc., which should be in the same ballpark after C$ conversion. I'm sure some will violently disagree, but I think the difference between a 1070 and a 1080 isn't worth the additional US$150.

        • by Luckyo ( 1726890 )

          Inflation exists. Count that in.

  • by ffkom ( 3519199 ) on Monday August 20, 2018 @05:45PM (#57162794)
    Pay $$$$ for a gfx card that I can trash as soon as nVidia loses interest in releasing their proprietary closed-source driver? No, thank you. Even if the GPUs from Intel and AMD are slower, I know I will be able to compile a contemporary kernel with a driver for them, today and tomorrow.
    • Re: (Score:2, Flamebait)

      Cool. You stick to those guns. I'll be playing the games I enjoy without giving a shit if the drivers are open sourced.

    • Eh? The 3d portion of the Nvidia card will not be terribly useful if Nvidia stops supporting it. The card itself is not trashed. The non-gaming portion of the card will work fine.

  • Now release a 4K GTA 6 that actually uses the ray tracing! It will probably be a year before the game developers fully utilize this.
    • by Luckyo ( 1726890 )

      Years. Plural. Many of them.

      It's like the much touted "performance functionalities" of DX12 that no one cares about to this day outside "give me the latest and greatest and I don't care if it ever gets used" crowd, and everyone and their grandmother is still on DX11 and DX9 as their main API.

  • Shopping for a video card is more confusing than ever. If a 1080 is "too old", but these cards are overkill and have features games may not even get, what the hell do you buy?
