Nvidia Unveils Powerful New RTX 2070 and 2080 Graphics Cards (polygon.com)
During a pre-Gamescom 2018 livestream from Cologne, Germany, Nvidia on Monday unveiled new GeForce RTX 2070, RTX 2080 and RTX 2080 Ti high-end graphics cards. These new 20-series cards will succeed Nvidia's current top-of-the-line GPUs, the GeForce GTX 1070, GTX 1080 and GTX 1080 Ti. While the company usually waits to launch the more powerful Ti version of a GPU, this time around, it's releasing the RTX 2080 and RTX 2080 Ti at once. Polygon adds: They won't come cheap. The Nvidia-manufactured Founders Edition versions will cost $599 for the RTX 2070, $799 for the RTX 2080 and $1,199 for the RTX 2080 Ti. The latter two cards are expected to ship "on or around" Sept. 20, while there is no estimated release date for the RTX 2070. Pre-orders are currently available for the RTX 2080 and 2080 Ti. Nvidia CEO Jensen Huang announced different "starting at" prices during the keynote presentation. Huang's presentation said the RTX 2070 will start at $499, the RTX 2080 at $699 and the RTX 2080 Ti at $999. Asked for clarification, an Nvidia representative told Polygon that these amounts reflect retail prices for third-party manufacturers' cards.
The RTX 2070, 2080 and 2080 Ti will be the first consumer-level graphics cards based on Nvidia's next-generation Turing architecture, which the company announced earlier this month at the SIGGRAPH computing conference. At that time, Nvidia also revealed its first Turing-based products: three GPUs in the company's Quadro line, which is geared toward professional applications. All three of the new RTX cards will feature built-in support for real-time ray tracing, a rendering and lighting technique for photorealistic graphics that gaming companies are starting to introduce this year.
FP16 support (Score:2)
Wake me up when the consumer cards can do accelerated 16-bit floating point math.
Re:FP16 support (Score:5, Funny)
Wake me up when the consumer cards can do accelerated 16-bit floating point math.
You might want to wake up, because it is part of the announcement.
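For context on what FP16 buys (and costs) you, here's a minimal sketch using Python's standard-library IEEE 754 binary16 ('e') struct format — this illustrates only the precision of the format, not GPU acceleration:

```python
import struct

def to_fp16(x):
    """Round a Python float to the nearest IEEE 754 binary16 (FP16) value."""
    return struct.unpack('e', struct.pack('e', x))[0]

# FP16 has a 10-bit mantissa: roughly 3 decimal digits of precision.
print(to_fp16(1.0001))   # 1.0 -- increments below ~0.0005 vanish near 1.0
print(to_fp16(2049.0))   # 2048.0 -- integer spacing is 2 in [2048, 4096)
print(to_fp16(65504.0))  # 65504.0 -- the largest finite FP16 value
```

The appeal on GPUs is that halving the width can double arithmetic throughput and halve memory traffic, when the workload tolerates the reduced precision.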
Re:FP16 support (Score:4, Funny)
Green Day is for American Idiots.
AMD (Score:5, Insightful)
AMD: your market share is going to be rising with these prices.
Holy shit. Seriously, Nvidia?
Re:AMD (Score:5, Informative)
AMD currently doesn't have cards that match nVidia on the high end. No competition there, nVidia is free to overprice.
The only thing nVidia has to do in order to compete with AMD is to drop the price of their 1070s. Something they probably won't do because they don't want to compete with themselves just to piss off AMD.
It is not a new situation: nVidia occupying the high end with high-priced, high-performance cards and AMD occupying the midrange with good-value cards is typical.
Right now, I am not a fan of AMD's discrete GPUs. Their APUs are great though.
Re:AMD (Score:5, Insightful)
AMD's strategy makes sense: they are now concentrating resources on exploiting their Ryzen advantage while starting to develop their brand-new Epyc server market. On the GPU side they just continue to bang out parts on the mature 14nm process, which gets cheaper the longer they run it. RX 560/570/580 cards remain highly respectable products, giving AMD the luxury of either fattening their gross profit or clawing back more market share. [wccftech.com] They seem to be steering a middle course, with retail prices slowly coming down and market share slowly coming up. The stage is set for a showdown at 7nm in late 2019.
Personally, AMD vs NVidia is a no brainer because:
1) the open source AMD drivers are awesome
2) Vulkan/DX12 are taking over, I don't care about obsolete 3D engines running a bit slower
3) fuck NVidia.
Re: (Score:2)
Vega 64 obviously, but I don't care about VR, so far there is no killer game and the helmet is just too weird. I'll check again 5 years from now.
Re: (Score:2)
My brother bought an HTC Vive about 6 months after it came out; he has 4 kids. The kids loved it for 6 months; they haven't touched it much since. The youngest (5) occasionally still uses it, nobody else. I told him that would happen, but he wanted to be a cool dad :) IMO it's too expensive considering you need the Vive, a good GPU, a decent PSU (to run the GPU at full tilt) and a decent CPU.
Re: (Score:2)
For me, Fallout VR and Skyrim VR are killer apps. I also like a lot of the smaller indie games for VR.
I live alone. I don't give a flying fuck how weird the helmet looks. I care that it entertains me, and it does.
Looking forward to adding an even more powerful Nvidia card to my rig. FSM knows AMD isn't up to the task
Re: (Score:3)
FSM knows AMD isn't up to the task
You need to get some of that Vulkan. [anandtech.com] If I was into VR I would not be trying to optimize my hardware for an obsolete rendering model.
Re: (Score:2)
If you want to do VR right now, get a 1070 (1060-6GB to 1080 depending on your budget). Personally VR is the only reason I am looking into high end GPUs. For regular games, pretty much anything goes nowadays if you are willing to tone down the settings a little bit. AMD's APUs look very appealing in that regard, it will probably be my next purchase for my second PC.
Why would you get a Vega 64 that is not cheap and struggles with some of the most demanding current VR titles? I understand that you want to foc
Re: (Score:2)
Why would you get a Vega 64 that is not cheap and struggles with some of the most demanding current VR titles?
Because Vega is the best GPU for Vulkan/DX12. Why would you invest in hardware optimized for obsolete 3D engines? Obviously, VR is moving heavily to Vulkan and friends; otherwise performance suffers too much.
Re: (Score:3)
> no killer game
You seriously need to experience Elite:Dangerous in VR.
Re: (Score:3)
Freesync is better than NVidia's proprietary crap too.
Don't know if it is still true, but AMD seemed to be a lot faster for compute tasks a couple of years ago too.
Re: (Score:2)
Re:AMD (Score:4, Insightful)
AMD card prices aren't exactly bargains either.
Getting there. RX 560 cards are running $130-140 now and RX 580 around $225. When 580 gets down to $200 it's definitely a bargain, and even as it is, it's hard to complain.
Re:AMD (Score:4, Interesting)
Blame cryptocurrency. While the demand is dying on that front, it proved to NVIDIA and AMD that they can charge more for graphics cards and get away with it.
Re: (Score:2)
No, it just put some money in the bank.
Re: (Score:2)
AMD: your market share is going to be rising with these prices.
Ryzen, surely?
This isn't much above the average (Score:2)
That said, AMD's had a mountain of driver issues for a long, long time. I've heard they're better now, but I still
Re: (Score:2)
I want to support the underdog but AMD is not delivering in the GPU market at all.
Also, these GPUs are for crazy well-off people / young IT kids in their early jobs, living at home and blowing lots of spare money on their PC. I know, I used to do this stuff.
"Normal" gamers won't consider these cards and "normal" gamers probably still have a 1080p display.
You could buy a used 1070 card in a few months for probably $400 US and get 60% of the performance of this card for 35% of the price.
AMD isn't an option, t
Re: (Score:2)
Doubtful. I have no reason to buy AMD kit until it can surpass the Nvidia stuff I already have. AND work with the games I want to use it with without any SINGLE bit of me having to do something special.
They'll never get there.
Re: (Score:2)
AMD: your market share is going to be rising with these prices.
Holy shit. Seriously, Nvidia?
Why would it? In the high end, the RX Vega 64 is outperformed in both raw speed and price by these offerings. The only thing laughable here is AMD's high-end offerings, and in the low end both companies are price competitive.
Re: (Score:2)
AMD: your market share is going to be rising with these prices.
Holy shit. Seriously, Nvidia?
I bought a 970 a bit over 2 years ago and am yet to stress it with anything, so I think I'll wait another year for a cheap 2070 or the next gen, 2170 or whatever they plan on calling it.
Re: (Score:2)
Problem being that the gap between 7xx and 9xx was about two years. 9xx to the 1xxx generation was almost three years. The gap between 1xxx and this is going to be over three years, it seems.
So your 970 will have to last almost a decade to be able to jump three generations at this rate.
Notably, I went to the 970 from a 560 Ti, and that took only about five years. The card wasn't overclocked, and I don't do any crypto, and it burned out in normal gaming use. My current 970 is almost three years old, and it's going strong (knock on wood).
Nvidia must be drunk off that crypto wine (Score:4, Insightful)
Re: (Score:2)
NVidia got their tail kicked in the crypto market long before it imploded, because AMD proved to be more power efficient for that load. Not sure exactly why, but it seems to have something to do with NVidia optimizing only for 32-bit floating point.
Re: (Score:2)
NVIDIA hobbles non-32 bit floating point support on their gaming cards so there's incentive to buy their workstation cards. It sounds like that's going to change with this next generation, probably due to deep learning rather than cryptocurrency mining.
Re: (Score:2)
they're looking for a honey pot to replace it with
If you think a market can implode and a brand-new product can be designed and announced a few months later, I have a bridge to sell you. It's not built yet, but I can build it for you in a week if you want.
Re: (Score:2)
The main thing to remember is that prices are still inflated coming down from the crypto boom, even as demand rapidly fell off. There have been reports of OEMs returning unsold GPUs to nvidia because of it.
But no one is willing to drop prices, likely because the medium-term financial goals for pricing have been set, and big OEMs and a large company like nvidia lack institutional flexibility. So we're not going to have a "market implosion". But we'll likely have all the people that have been holding out for p
Price drop on GTX 10xx: zero (Score:2)
I wonder how much prices will drop on the 1060 and 1070 now that the next series is announced.
Probably zero. Nvidia currently sells the 1070 for $400; the 2070 will supposedly be $600 from Nvidia. Other vendors -- ASUS, MSI, etc. -- sell for less than Nvidia's prices, and a 2070 from these sources is expected to be $500. Pre-crypto mania, they were selling the 1070 for well under $400.
However you look at it these first 20xx cards are expensive enough that they will not be pushing 10xx card prices down. If 10xx card prices go down it will more likely be due to the crypto demand ending. 10xx and the announced
Nvidia sleeps well at night. (Score:2)
On a big pile of money!
Re: (Score:2)
You can get way lower than that. Some manufacturers of 3D TVs were running promotions and selling 48" screens at $450.
Re: (Score:2)
Not all of them support 4:4:4.
Re: (Score:2)
True, a good portion of "4K" TVs are really only 4:2:2.
Re: (Score:2)
Now I'm seeing some 65" 4K TVs for close to that.
Re: (Score:2)
Because latency.
I bought a cheap 4K TV with the intent of using it as a monitor. Unfortunately, it has a whopping 250 ms of input lag on every input including broadcast TV. I had to dial the delay on the audio all the way up (to 200 ms) and it still has the audio running just a shade ahead of the video. As a monitor, or for console gaming, it is completely unusable. The lag is bad enough to induce not only incorrect inputs, but motion sickness.
I couldn't return it, as it was deemed to be functioning correctly.
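The sync problem above is simple arithmetic; a sketch using the figures from this comment (250 ms of display lag, 200 ms maximum audio delay — actual values vary by TV model):

```python
# Figures from the comment above; actual values vary by TV model.
video_lag_ms = 250        # measured input/display lag
max_audio_delay_ms = 200  # the TV's audio-delay setting maxed out
residual_ms = video_lag_ms - max_audio_delay_ms
print(f"audio still leads video by ~{residual_ms} ms")  # ~50 ms
```

A residual in the tens of milliseconds is enough to be noticeable as lip-sync error, which matches the "just a shade ahead" description.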
Re: (Score:2)
Most of the curved screens have a pretty low vertical pixel count, rendering them pretty useless for anything but gaming through a letterbox opening.
And I'm actually a bit disappointed with the new cards, the performance figures don't seem to be a radical improvement over the GTX1080Ti.
Real time ray tracing might be nice for anyone making a movie though.
Poor Value (Score:5, Informative)
The RTX 2080Ti has 19% more FLOPS than the 1080Ti... and costs 54% more money.
NVIDIA emphasized the new raytracing performance, presumably to deflect attention from that fact.
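Taking the two percentages at face value (they are this commenter's figures, not official specs), the perf-per-dollar math works out like this:

```python
# Relative value of the 2080 Ti vs the 1080 Ti, using the figures
# claimed above: 19% more FLOPS at 54% higher price.
flops_ratio = 1.19
price_ratio = 1.54
value_ratio = flops_ratio / price_ratio
print(f"FLOPS per dollar relative to the 1080 Ti: {value_ratio:.2f}")
# ~0.77, i.e. roughly 23% fewer FLOPS per dollar
```

Of course this ignores the RT and tensor cores entirely, which is exactly the point of contention in the replies below.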
Re: (Score:3)
The RTX 2080Ti has 19% more FLOPS than the 1080Ti... and costs 54% more money. NVIDIA emphasized the new raytracing performance, presumably to deflect attention from that fact.
Or perhaps it's not just about FLOPS? The 1080Ti lacks RT cores and Tensor cores but if raw FLOPS is all you care about then yes there are better value options.
Re: (Score:2)
Tell me, why should gamers care about tensor cores? And I doubt the RT cores will be used much outside of a couple of demos for a long long time.
It seems to me they're shoving a lot of non-gaming silicon into gaming cards, seems like a waste of space, gamers are paying for a lot of R&D and silicon that they'll never use.
Re: (Score:2)
Tell me, why should gamers care about tensor cores? And I doubt the RT cores will be used much outside of a couple of demos for a long long time.
Because the next generation of games will utilise them; developers need silicon and APIs to build these new games. Very often new products include new features that you can't effectively utilise on day one but that you'll pay for anyway, because they need to solve the chicken-and-egg problem. If you think this is supposed to just be a faster 1080, then you've misinterpreted it. That product simply doesn't exist, regardless of how much you want it, but it does leave the market open for a competitor like AMD to fill that gap.
Re:Poor Value (Score:4, Interesting)
The chicken/egg problem was already solved with this generational upgrade, actually. nVidia worked with Epic to build RTX support directly into the Unreal 4 engine. Microsoft has already updated DirectX 12 for RTX support. Game studios have had access to the hardware in one form or another for a while now. During the presentation they listed a bunch of games with RTX support; several titles are already on the market and are simply getting an upgrade.
Re: (Score:3)
Older games are unlikely to add raytracing support (a couple like FF15 are, though) so that won't affect those titles. Most indie games/VR titles won't have the budget for adding nvidia-specific graphics options. If the RT cores can be used for audio tracing that might be compelling for VR. It's unclear what the tensor cores would be used for in games aside from antialiasing and raytracing denoising.
Unless the next consoles support it, it'll likely remain a niche feature only supported by a handful of AAA games.
Re: (Score:2)
It's unclear what the tensor cores would be used for in games aside from antialiasing and raytracing denoising.
They can be used for inferencing; if you look at how they've been training neural networks with raytraced images, that's a big part of how this generation can do realtime raytracing.
Re: (Score:2)
Or perhaps it's not just about FLOPS? The 1080Ti lacks RT cores and Tensor cores but if raw FLOPS is all you care about then yes there are better value options.
Well nVidia can tout the benefits all they like but these are effectively halo features from their AI/workstation cards and not available at all on any lesser GPUs. So how many games will create unique effects that'll only work on extremely high end 2018+ cards? Are these features worth sacrificing stream processors for mainstream/value cards or will it remain at the high end? Every bit of those TFLOPS on the 1080 Ti is usable today across a wide variety of games. I got mine at launch and it's starting to l
Re: (Score:2)
Well nVidia can tout the benefits all they like but these are effectively halo features from their AI/workstation cards and not available at all on any lesser GPUs. So how many games will create unique effects that'll only work on extremely high end 2018+ cards?
Those features are exposed to applications via APIs like DirectX and Vulkan. DirectX's DXR, for example, has a fallback layer that supports it without dedicated hardware, and those features already have application support via DXR in Unity, Unreal and some of EA's engines. Being able to switch from the rasterizer to the raytracer in a game would be pretty awesome, though.
It's the top end product (Score:2)
Re: (Score:2)
The RTX 2080Ti has 19% more FLOPS than the 1080Ti... and costs 54% more money.
It has 19% more FLOPS. That's the end of it. The top end of the market of ANY PC component has always been a long tail in terms of performance per dollar. 54% more money is irrelevant if you actually need the 19% more performance. It is also how trickle down technology has always worked. You want the 19% performance but don't want to spend the money? Come and ask me about it in 2 years.
1080s (Score:2)
Re: (Score:2)
You can buy a used one today for less than $299.
Re: (Score:2)
1080ti even
Re: (Score:2)
Because I don't want to spend 54% more for a new card? I thought 1080s were already overpriced.
So buy an AMD GPU then. The thing you seem to be failing to understand here is this isn't "a 1080, but faster", it's a GPU focussed on raytracing. If all you want is a new card that's like the current crop but faster and priced accordingly then this is not the product for you.
Re: (Score:2)
I want a card that will be capable of any game on the market for the next year or so and can do VR.
Well then a multi-GPU setup with the current generation is probably appropriate. While I'm sure the RTX GPUs will be capable, they include things you likely won't need for the next year of games, like cores devoted to tensor operations and cores designed for raytracing, so by buying one you would be paying for things you don't need. Of course, if no products exist on the market then no company is going to produce software to utilise those features; this is how you solve the chicken-and-egg problem.
Re: (Score:2)
So, out of interest, who would use a card like this?
Gamers and graphics enthusiasts who want the best performance despite paying for things they don't need right now, i.e. more concerned about performance than price value. Developers building the next generation of games and 3d applications.
Re: (Score:2)
You are saying this card is somehow different
Well I don't think I would have to say it, isn't it clear from the announcement and all the articles and documentation that yes indeed it is very very different?
and will not lower the price of the 1080?
I'm not saying that, I have no idea what pricing effects will occur.
Re: (Score:2)
Ray tracing is for lighting in games
Errr... ok, that's a weird way to describe raytracing. What games are doing raytraced lighting (more to the point for this product, what games are doing raytracing with AI inferencing, which is what all those tensor cores are used for)? I think you'll find pretty much all games are rasterized, not raytraced, and any raytracing is done in offline steps to build things like lightmaps.
so given that I said I wanted to play games with it, why would you tell me these cards aren't for me?
Because - other than your misunderstanding of raytracing - you said you don't want to pay 54% more for a new card, so clearly it's not for you.
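For anyone unclear on what "raytraced lighting" actually computes, here's a toy sketch (plain Python, no relation to Nvidia's RTX API): a single ray-sphere intersection test, the primitive that shadow and reflection rays are built on.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere hit, or None.

    Toy version: assumes `direction` is normalized and ignores the far root.
    """
    ox = origin[0] - center[0]
    oy = origin[1] - center[1]
    oz = origin[2] - center[2]
    b = 2.0 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c  # quadratic discriminant (a == 1 for unit direction)
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# Shadow test: trace from a surface point toward a light; any hit = in shadow.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))   # 4.0 (blocked)
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, 5), 1.0))  # None (clear)
```

Real engines fire millions of such rays per frame against full scene geometry and then denoise the sparse result — which is where the dedicated RT and tensor cores come in.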
Re: (Score:2)
So buy a 1080. The price isn't going to go down because at this point, Nvidia has no reason to produce any more of them. If you find a retailer that wants to clearance them, great. This is the way it has ALWAYS been for graphic card releases. You can likely find a new 2070 or 2060 when they come out for less than a 1080. Yea they might have features you don't consider important, but that's what Nvidia is making now.
Re: (Score:2)
VR requires a stereo image, so why wouldn't it do better with two separate GPUs rendering left and right eye frames? The only latency would be in making sure they're both updating frames at the same time, and that shouldn't be any worse than conforming to Vsync is now.
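The "two independent eye renders" idea can be sketched minimally — the IPD value and function names below are illustrative, not taken from any VR SDK:

```python
IPD_M = 0.064  # interpupillary distance in metres (typical adult; assumption)

def eye_offset_m(eye):
    """Horizontal camera offset for one eye; left is negative, right positive."""
    half = IPD_M / 2.0
    return -half if eye == "left" else half

# Each eye's frame depends only on its own offset camera, so in principle
# the two renders are independent and could run on separate GPUs,
# synchronizing only when the finished frames are presented.
for eye in ("left", "right"):
    print(f"{eye}: shift camera {eye_offset_m(eye):+.3f} m, then render")
```

In practice the catch is exactly the synchronization the comment mentions: both eyes must present together at the headset's refresh rate, so the slower GPU sets the pace.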
Re: (Score:2)
So buy a 2070, get the shiny new raytracing and superior performance to the 1080?
I'll stick with my 1070 for another year or so, prices will almost certainly be lower by then.
Re: (Score:2)
Because it will make my 5-year-old CPU sing?!!
2080 no threat to 1080 (Score:2)
Have 1080's come down?
No, why would they? A 1080 from Nvidia is $550, a 2080 from Nvidia is $800.
Prices have come down in the sense that crypto-inflated pricing is ending, but prices from Nvidia itself never reflected that inflation; only retail prices did. Nvidia was simply out of stock and new orders were wait-listed. At retail a 1080 could have been $1,000 or more. Currently retail is $500-550, maybe slightly higher than pre-crypto-mania pricing. Hopefully we get back to pre-crypto, but that's about it; the 2080 is so much more expensive.
Re: (Score:2)
The last time I put a gaming PC together was a long time ago; maybe even 10 years. But a premium video card back then was around $400 CDN. I guess those days are just over.
Not really. A 1070 is a pretty damn good card and is US$400 from Nvidia, and if we ignore the last year's crypto-inflation less than that from ASUS, MSI, etc. Which should be about the C$ conversion? I'm sure some will violently disagree but I think the difference between a 1070 and 1080 isn't worth the additional US$150.
Re: (Score:2)
Inflation exists. Count that in.
Without open source drivers, not for me (Score:5, Insightful)
Re: (Score:2, Flamebait)
Cool. You stick to those guns. I'll be playing the games I enjoy without giving a shit if the drivers are open sourced.
Re: (Score:2)
Eh? The 3d portion of the Nvidia card will not be terribly useful if Nvidia stops supporting it. The card itself is not trashed. The non-gaming portion of the card will work fine.
Good (Score:2)
Re: (Score:2)
Years. Plural. Many of them.
It's like the much-touted "performance functionalities" of DX12 that no one cares about to this day outside the "give me the latest and greatest and I don't care if it ever gets used" crowd, and everyone and their grandmother is still on DX11 and DX9 as their main API.
Re: (Score:2)
We had the same thing with "fur simulation" on nvidia cards, and other similar functions. How many games used that again?
That's the point. You're going to have raster-based lighting anyway. And then you have to spend time building additional lighting from the ground up to get ray tracing, even if you use tools provided by others. And there's no real certainty that it will be an improvement.
And so, just like DX12, it would be economically unfeasible for years. A lot of extra work, for a high chance of negative returns.
Confusing (Score:2)
Re: (Score:3)
Re: (Score:2)
Here's the key: don't get advice from Slashdot comments.
Here you will find wealthy enthusiasts who will call you an idiot for buying anything that's not the very latest. And you will find crotchety old men (many of them 13) who will berate you for not picking up a used card from four generations ago. You'll also find lots of trolls who just call you names.
Re: (Score:2)
I felt like it was a little overkill at the time (those were the market prices at that unfortunate moment in time).
I bought it because I wanted to get into VR, and I didn't want to take chances getting into shitty VR.
I don't know what the benchmarks are, but my 1080Ti mops the floor with my best friend's Vega 64, both running first-gen Oculus sets. Mine is buttery smooth, and his is... well, not. In the end? I'm glad I spent the money. For VR, at least.
Re: (Score:3)
I wonder what is possible when chaining all these tensor cores together in a blockchain, supercomputer, or botnet...
Can you imagine a Beowulf cluster of those? More importantly, though, are they Turing complete?
Re: (Score:2)
What are some major issues these major machines can solve?
Spy on you more efficiently.
Re: (Score:2)
Inflation over two decades and then some.
Re: (Score:2)
You need to read about photolithography defects vs die area.