NVIDIA Unveils GeForce GTX 1080, GTX 1070, Faster Than Titan X For a Lot Less (hothardware.com)
MojoKid writes (edited and condensed): NVIDIA has unveiled its next-generation Pascal-based GeForce graphics cards -- the GeForce GTX 1080 and GeForce GTX 1070. NVIDIA's Pascal architecture is built on 16nm FinFET technology, similar to NVIDIA's high-end data center Tesla P100 processing engine, though the GeForce cards are targeted at the consumer gaming market. The GP104 GPU at the heart of the new GeForce cards comprises some 8 billion transistors and features a 256-bit memory interface with 8GB of Micron GDDR5X graphics memory on the GeForce GTX 1080; the GTX 1070 employs standard GDDR5. The core clock speed of the GeForce GTX 1080 hit 2.1GHz at one point during the demonstration, though GTX 1070 clocks were not disclosed. NVIDIA CEO Jen-Hsun Huang claimed the new GeForce GTX 1080 is faster than a pair of GeForce GTX 980 cards in SLI and faster than the company's very expensive Titan X graphics card but at half the price. The GeForce GTX 1080 will be offered in two versions: a standard card with an MSRP of $599 and a highly overclockable Founders Edition for $699. The standard GTX 1070 will arrive at $379, while its Founders Edition will be priced at $449. Availability is slated for May 27 for the GTX 1080 and June 10 for the GTX 1070. AnandTech has more information.
Waiting for big pascal (Score:2)
Looks nice, but within a year I expect a consumerish big Pascal with HBM2 and closer to 300W as the ultimate single card. Not ready to replace my SLI setup for this one, but multi-GPU support is getting more and more niche. One monster card for 4K gaming would be great.
Re: (Score:1)
Re: (Score:2)
If you just want 4k for desktop pixels (as opposed to high-end gaming) then even an AMD R7 260x will do that. Mine even gets me acceptable framerates at 4k in most of the games I play, which tend to be older (e.g. Skyrim, TF2, Kerbal Space Program, Star Trek Online, etc.).
Re: (Score:2)
80°C is about the regular temperature target for most any card. If it's not gazzling too much gas, the fan might be running slow.
Every gen or half gen the power and fan control circuitry gets more aggressive/precise/lower latency, so you get something that tries to run at high temperature, low heat and low noise.
The R7 360 is a bit slower and less power hungry, though for HDR you may need the follow-up generation, if only for the DP 1.3 interface plus HDMI 2.0 for TVs.
Hum, pushing 4K Solitaire and Minesweeper?
Re: Waiting for big pascal (Score:2)
gazzling
That's what, a baby goose??
Re: (Score:2)
Considering that 4K monitors still aren't common, you've still got at least another year to go.
Re: (Score:3)
Something I'm hoping for is smoother offloading of physics between multiple GPUs/cards, such that one GPU handles graphics while the other card, not connected to a display, handles physics with less clunkiness than the current solutions. That would be nice for various simulators, for example.
Re: (Score:2)
Re: (Score:2)
Yeah, but it's still clunky, and doesn't always sync with the graphics properly.
Re: (Score:2, Insightful)
Re: (Score:2)
Re: (Score:2)
In a year in a year in a year.
Something better is always going to come out. You can get a huge improvement over your current kit, or you can wring your hands and worry.
The actual question (Score:1)
Will the drivers finally be stable?
Re: (Score:2)
More stable than AMD's, yes.
Re: The actual question (Score:1)
Re: (Score:3)
More stable than... fuck, why can I only settle for crappy or less crappy? Don't the capitalism preachers constantly tell me just how much capitalism ensures that only what the customer wants gets produced, and how happy we should be that we're not in commie hell where we could only buy what The Party thinks is good enough for us?
What's the difference between The Party and The Corporation deciding what the fuck I can buy?
Re: (Score:1)
They *could* devote the time to developing drivers that crash less often, but they'd pass the cost of that developer time on to the consumer. Apparently they've decided that beyond a certain (frustratingly low) point, increases in stability do not add enough value to entice consumers to spend more.
Consider a different industry: commercial airlines. Just about every aspect of flying is a horrifying cluster-fuck. Can't they implement a system that doesn't lose baggage on a semi-regular basis? Yep, and it would be
Re: (Score:2)
Where that analogy fails is that air transport is temporary, while the use of a graphics card is a much more prolonged experience. It's easier to swallow being treated as freight for those 4-12 hours in the air than it is to constantly fight your computer for the 2-3 years the average person clings to their graphics adapter.
Mixed GPUs (Score:3, Interesting)
Wouldn't it be nice if the promise of mixed GPUs had arrived already? Then we could buy a new GPU and just add it to the stack we already have, and if the stack is full, drop the worst card out.
Without that I'll probably skip this generation; replacing what I already have for a modest increase is too expensive. Still keeping my fingers crossed that multi-GPU is the way of the future, but I'm not holding my breath.
Re: (Score:3)
Re: (Score:1)
And DirectX 12 requires the spyware known as Windows 10.
Re: (Score:2)
Re: (Score:2)
Live long and prosper.
Re: (Score:2)
Ever since AMD and ATI merged, I've been hoping for socketed GPUs that communicate over HyperTransport. But so far, at least, all we've gotten is APUs instead. : (
Re: (Score:2)
Something like that has been announced:
A high-end socket with eight memory channels that takes a 32-core Opteron (16+16 MCM), or a 16-core Opteron plus a GPU with about 2000 units.
Re: Mixed GPUs (Score:2)
Do video card upgrades even matter anymore (Score:1)
Nowadays it seems you can run almost any game at high settings with old cards. And the games still don't look as good as old-school mods, which you need CPU and RAM for. Everything is console level now.
"Muh frames per second are slightly better than your frames per second" is just stat bullshit nowadays when the settings are the same otherwise.
Re: Do video card upgrades even matter anymore (Score:2, Informative)
Think VR
90fps minimum, stereo, and high fov
Re:Do video card upgrades even matter anymore (Score:5, Informative)
> Do video card upgrades even matter anymore?
Yes.
* VR requires 90 Hz minimum (Thank god!)
* 4K Gaming at 120 Hz requires beefy hardware.
* ENB mods [google.com]
If you can't even tell the difference between 24 Hz and 60 Hz ....
OW, my eyes @ 24 fps! [cachefly.net]
Silky smooth @ 60 fps! [cachefly.net]
Re: (Score:2)
I'm "budget", though I get people visiting for the first time who ask if my 720p TV is 4K. Gamers trained for years to recognize glitches may notice; nobody else does. Good lighting and proper setup, and you can wow someone with inferior content.
Re: (Score:2)
I recently bought a nice big 4K Philips TV... and then sent it back. It was rubbish: the contrast was poor, the colours weren't good, it couldn't handle 4K 60fps properly, the menus were a bad joke (it took about 11 button presses to get to the brightness setting), and there was no backlight control. By default all profiles had 'sharpness' on; god, that thing is an abomination that should be banned. Why anyone would want to deliberately screw up their picture with it is beyond me.
Re: (Score:2)
Re: (Score:2)
> My 720p plasma that's 6 years old, gets compliments all the time.
I'm not surprised. Next to OLED, plasma's superior viewing angles kick the shit out of LCDs/LEDs. Combine that with deep blacks, a good gamut, and a physical black border around the display (old contrast trick), and there isn't even any contest.
Note: I'm a plasma man too. I picked up one of the last 1080p Panny's (TC-P60VT60) right before they went out of stock.
Are you on AvsForum by chance?
Re: (Score:2)
Re: (Score:2)
Should be [htcvive.com]; you only need a GTX 970.
Re: (Score:2)
Yeah I really wish RED would add a complete set of framerate demos [red.com] for:
* 24 Hz
* 30 Hz
* 60 Hz
* 120 Hz
* 144 Hz
The nice thing about 120 Hz and 144 Hz refresh rates is that they are exact multiples of 24 Hz (5x and 6x, respectively).
I might have to get a HERO4 or some other cheap 240 fps camera and record this 120 vs 60 vs 30 [appspot.com] fps demo, but then again we already have 60 Hz vs 120 Hz [slashdot.org] comparisons.
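To make those multiples concrete, here's a throwaway Python sketch (just the arithmetic; nothing here is from RED's demos):

    # Which display refresh rates divide evenly into 24 Hz film content?
    # An integer multiple means every film frame is repeated the same
    # number of times; a fractional one forces uneven repeats (judder).
    FILM_HZ = 24
    for display_hz in (30, 60, 120, 144):
        m = display_hz / FILM_HZ
        if m.is_integer():
            print(f"{display_hz} Hz: each frame shown {int(m)}x -- smooth")
        else:
            print(f"{display_hz} Hz: {m}x -- uneven repeats (judder)")

60 Hz comes out at 2.5x, which is exactly why 3:2 pulldown exists.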
Re: (Score:3)
Here you go:
https://frames-per-second.apps... [appspot.com]
That's a simple JS app that lets you compare frame rates while controlling all aspects (I love that it even lets you configure motion blur settings). For me the best comparison (on my 60fps monitor) is 60fps vs 30fps, both without any motion blur; the quality difference is so blatant that I can't imagine how people still defend frame rates lower than 60fps.
Re: (Score:2)
Yes, absolutely they matter. Especially with 4K starting to become a serious contender to 1080p in the PC space.
Re: (Score:2)
GPUs are increasingly being used for general purpose computation and rendering, not just playing games. For these purposes more computing power is always welcome, in the same way as faster CPUs and more/faster RAM and storage. For example, would you rather process this data set in 1 day or 2?
The display part won't benefit from indefinite improvements, as the human eye has its limits. But for everything behind the display, there's always more computing to be done.
Re: (Score:2)
Do video card upgrades even matter anymore?
Nowadays it seems you can run almost any game at high settings with old cards. And the games still don't look as good as old-school mods, which you need CPU and RAM for. Everything is console level now.
"Muh frames per second are slightly better than your frames per second" is just stat bullshit nowadays when the settings are the same otherwise.
The ONLY reason I'm still running 1920x1200 instead of 4k is that driving the monitor at native resolution for gaming would require too much spent on graphics card hardware.
Most likely, even this new card isn't enough for a single-card setup at 4k.
(says the guy who spent the last week playing Dwarf Fortress...)
Re: (Score:2)
You can push 4K with a 970 pretty well; you're still paying more for the 4K screen than for the 970.
Re: (Score:2)
Re: Do video card upgrades even matter anymore (Score:1)
Re: (Score:2)
Sure he can run 60FPS. You just have to turn all the detail down to "complete shit."
Re: (Score:2)
For VR they definitely do. A lot.
Re: Do video card upgrades even matter anymore (Score:2)
Except that a GPU is not 1000 times faster than a CPU.
Re: (Score:2)
Well, sure, technically it's only half as fast, but 2000 times more parallel.
Re: Do video card upgrades even matter anymore (Score:2)
It's neither half as fast nor 2000 times as parallel.
Are you just saying random stuff out of your ass?
For double precision, a single Xeon CPU is 500 Gigaflops, while a Pascal GPU is 4 Teraflops (possibly lower for those cards, which are gamer models optimized for single precision).
In practice though, servers use two Xeon CPUs, so the GPU offers you something that is 4 times faster in peak computing power.
Another complication is that, while it's not too complicated to reach close to the max on CPUs, it's
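For anyone who wants that arithmetic spelled out, here's a back-of-the-envelope sketch using the figures above (theoretical peaks as quoted in this thread, not measured throughput):

    # Peak double-precision throughput, per the numbers quoted above.
    xeon_gflops = 500        # one Xeon CPU, double precision (quoted above)
    pascal_gflops = 4000     # one Pascal GPU, double precision (quoted above)
    dual_socket = 2 * xeon_gflops
    print(f"dual-Xeon server: {dual_socket} GFLOPS")
    print(f"Pascal GPU:       {pascal_gflops} GFLOPS")
    print(f"GPU advantage:    {pascal_gflops / dual_socket:.0f}x peak")

So roughly 4x peak against a dual-socket box -- nowhere near the "1000 times faster" figure that gets thrown around.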
Well crud (Score:1)
Re: (Score:2)
Wait for reviews. Based on the chart they showed, I predict that the GTX 1080 is only 15-20% better than a typical non-reference 980 Ti in the $600-$700 range.
If you have a 980 Ti with high clocks you should just wait for the presumed 1080 Ti / Titan Whatever and AMD's Vega.
Buying the non-flagship part is a sucker's game.
Re: Well crud (Score:2)
Meh, the flagship typically has the worst price/performance. But it's rarely worth trading down in the lineup: if you have last generation's flagship, stick with it or buy a new flagship.
Re: (Score:2)
It'll be interesting to see how much of an improvement Pascal will be relative to the previous gen of Nvidia GPUs, in particular among the flagship models. The 1080 Ti had better be amazing, given what they have been promising.
time (Score:2)
NVIDIA CEO Jen-Hsun Huang claimed the new GeForce GTX 1080 is faster than a pair of GeForce GTX 980 cards in SLI and faster than the company's very expensive Titan X graphics card but at half the price.
That's nice, but it doesn't help the me of 13 months ago. Also, if we're being honest here, this is more of an indicator that they charge too much for their products.
Comment removed (Score:3)
Re: (Score:2)
I don't see the point of having 32GB of RAM; games won't use it. Do you have some special application that will?
Re: (Score:2)
Re:Glad I waited.. (Score:5, Informative)
X-Plane 10 [x-plane.com], with alpilotx's HD Mesh v3 [alpilotx.net] and the Massachusetts Pro VFR Scenery [x-plane.org], will eat up 32 GB of RAM easily.
Flight simulation is one of those areas where I doubt there will ever be such a thing as enough memory. There's always something to model in greater detail.
Re: (Score:2)
I have a lot of apps that more than need it. I do a lot of 3D graphics, and Blender loves memory: the more memory, the happier the renderer. I also do a lot of video editing for my YouTube show; here I would actually love 64 GB of RAM. You can't get enough RAM, seriously. Remember, the video files you guys see are COMPRESSED -- when you edit video live and in REAL time, you have to have the
So no HBM 2 memory for consumers? (Score:2)
Re: (Score:1)
Let's see what AMD will say about that, and whether there will be usable open source drivers (for either manufacturer's 2016 GPU lineup).
I can promise(..) you AMD won't release a single-GPU Polaris 10 graphics card with HBM2 right now.
Both companies will likely release theirs in 2017.
Re: (Score:1)
Or well, with 2017 I mean "later."
Neither of them will release HBM 2 cards this summer.
And AMD Polaris 10 may not keep up with the fastest of the Nvidia cards.
You'll have to wait for the replacement of the Fury cards.
Re: (Score:1)
It seems there won't be much competition in the 2016 GPU lineups:
- AMD is doing small to medium chips; their 2016 Polaris line will cover notebooks and midrange desktop GPUs.
- Nvidia is doing fairly high-end chips at higher prices.
In 2016, some people might ask themselves whether they want "Polaris 10" (the bigger of the AMD chips) or to spend more money on the GTX 1070. But for most the choice will be easy.
2017 will be more interesting, with AMD Vega competing against the just-released GTX 1080.
I don't think HBM2 is available yet (Score:2)
At least not in any commercial quantity, so a company can't release a retail card with it; they just couldn't make it. All they could do is a paper announcement, as nVidia did with their compute Pascal. If AMD wishes to launch a card soon, it will likely have to use either GDDR5(X) or HBM1, since there just aren't the HBM2 modules out there for it.
Remember there's a non-trivial lag time between a company developing a technology and managing to produce it on a commercial scale.
1080 versus 980 TI (Score:2)
I've been saving to buy a 980 Ti, but this is kind of interesting.
So is this 1080 faster for less money than the 980 Ti? I'm looking at one manufacturer's specs for the 1080 versus their 980 Ti (overclocked edition), and it looks like the 1080 has more memory and higher clock speeds but also fewer CUDA cores.
I'll be interested to see what the end-user reviews are when it's available.
Re: (Score:2)
I'm always skeptical about new products and how good they are purported to be before the actual release, but you're saying the 1080 is about the same performance for more money? Yet Engadget seems to be saying "GTX 1080 GPU is faster than Titan X", and I thought the Titan X was faster than the 980 Ti (although not by much). Also, let's take the Gigabyte GTX 980 Ti: its MSRP is like $659.99 and yet Amazon sells it for $594.99, so if the MSRP of a 1080 is $599.99 the retail price should be under $500, that make
Why such power hogs? (Score:2)
In an age where most CPUs have a TDP of 65W or less, why does it seem every add-on graphics card has a TDP that starts at 65W and goes up to 250-400W?
Re: (Score:2)
For example, a 65-watt card may have 640 processing units while a 250-watt card with the same or nearly the same tech might have 3072 processing units.
It's as if Intel sold you a 20-core consumer CPU that uses up to 250 watts -- which they don't, but it would be physically possible.
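Putting rough numbers on that (the unit counts are the hypothetical ones from above, not any specific card):

    # Shader units per watt for the two hypothetical cards above.
    # GPUs spend their power budget on going wider, not clocking higher.
    cards = {"65 W card": (640, 65), "250 W card": (3072, 250)}
    for name, (units, watts) in cards.items():
        print(f"{name}: {units} units, {units / watts:.1f} units/W")

The big card is actually slightly more efficient per unit; it just draws more total power because it's nearly 5x as wide.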
Re: (Score:2)
CPU power dissipation is limited by semiconductor die size, unless junction temperature is increased, which lowers operating life and reliability. Since about the start of the Core 2 series, CPU die sizes have dropped, so power has had to drop as well. GPU makers use a different trade-off, sacrificing operating life and reliability for performance: while they do use larger semiconductor dies, they run even higher power levels. In this respect AMD and nVidia have been in a race to the bottom, with nVidia
Re: (Score:2)
The bottom line is you need a 500+ watt PSU just to service your video card, never mind the rest of the PC. It is crazy.
Re: (Score:2)
For reliability reasons, the power supply should be significantly derated anyway. The manufacturers have gotten really good at designing them so that they fail just out of warranty because the aluminum electrolytic capacitors wear out.
Is it really that fast? (Score:2)
I've seen a bunch of reports that a typically-overclocked GTX 960 is just as fast as the GTX 1080. Are those people just blowing smoke? Or is nVidia just jerking off here?
Great - In 2 years I'll have a Linux driver for it (Score:3)
Re: (Score:1)
They Don't Think It Be Like It Is But It Do
Re:AMD just crapped themselves (Score:5, Informative)
Assuming their "A NEW KING" graph isn't a lie, it's $600-$700 for 1.7x the performance of a single 980. This also jives with their claim about it being faster than 2 980s in SLI.
The graph is missing the 980 Ti though, which is what people paying $600-$700 for a GPU will be comparing it to.
I can understand not including the 980 Ti since we're talking about the 1080 and not a 1080 Ti. But they went ahead and included the Titan X. That's some major bullshit - the Titan X is an expensive piece of shit compared to the 980 Ti. A reference 980 Ti is nearly identical in performance to a Titan X at $400-$500 cheaper. A non-reference 980 Ti will easily beat a Titan X and still save you $200-$300. (The Titan X is only available with the reference cooler.) The non-reference 980 Ti is also at the same $600-$700 price of the GTX 1080 (MSRP).
My guess based on their chart (using the Titan X as a baseline) is that the 1080 is about 30% faster than a reference 980 Ti and 15-20% faster than any of the dozens of non-reference designs out there now priced around $600-$700. (There are dozens more non-reference 980 Ti models priced higher and higher than that, if you've got money to burn.)
Anyone on a current generation card really should wait for the presumed 1080 Ti and AMD's Vega.
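For what it's worth, here's the arithmetic behind those estimates, spelled out; the baselines are my own reading of NVIDIA's chart with the GTX 980 at 1.0x -- assumptions, not published benchmarks:

    # Relative performance estimates read off NVIDIA's chart (assumed).
    gtx_980      = 1.0
    gtx_1080     = 1.7    # NVIDIA's claimed uplift over a single 980
    titan_x      = 1.3    # my chart reading; a reference 980 Ti is ~equal
    nonref_980ti = 1.45   # typical factory-overclocked 980 Ti (assumed)
    print(f"1080 vs reference 980 Ti: +{(gtx_1080 / titan_x - 1) * 100:.0f}%")
    print(f"1080 vs non-ref 980 Ti:   +{(gtx_1080 / nonref_980ti - 1) * 100:.0f}%")

That lands at about +31% and +17%, matching the 30% and 15-20% guesses above.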
Re:AMD just crapped themselves (Score:5, Informative)
> That's some major bullshit - the Titan X is an expensive piece of shit compared to the 980 Ti.
Whoa, hold on. Context is extremely important.
For gaming, yup, that's some serious shenanigans(*)! BUT for rendering [anandtech.com] the Titan X is faster than the 980 Ti.
(*) Obviously, many people don't feel the price/performance of the Titan X vs the 980 Ti is worth it, myself included. Words like "overpriced" and "greedy bastards" come to mind. But if performance is king and money is no object, then for rendering + scientific computing the Titan X was the previous crown holder.
It all depends on context.
Re: (Score:2)
And the context is the GTX 1080 - a gaming GPU.
Re: (Score:2)
Re: (Score:2)
For games the Titan X is (slightly) faster than a reference 980 Ti.
There aren't many reference 980 Tis out there. Most are in the $600-$700 range and are factory overclocked with non-reference coolers on them.
Re: (Score:1)
The GTX 980 Ti is 7/8 the GPU the Titan X is, with half the VRAM.
GTX 980s in SLI beat the GTX 980 Ti.
Neither the GTX 980 Ti nor the Titan X is 70% faster than the GTX 980.
The rumors/claims I've heard from the AMD side are that Polaris 10 almost reaches Fury X and 980 Ti performance but at a lower price, so to me that would put it more in line with the GTX 1070, not on the level of the GTX 1080. Also, it's not intended to replace the Fury cards, and there they just released the dual-GPU one, which DOES beat the 980 Ti.
Re: (Score:2)
Yep. Except if you need VR or other multi-viewport rendering which seem
Re: AMD just crapped themselves (Score:2)
...jives
While it may in fact do just that (example: "honkey-ass cracker-motherfucker") the word you're looking for is actually jibe.
Re: (Score:2)
Ask all the people who bought a 980 (or worse, Titan X) how it felt when the 980 Ti dropped.
Re: (Score:2)
The Titan X was still marketed heavily toward gamers. Nvidia aren't stupid. They sold buttloads of them. Then they trotted out the 980 Ti and people who bought the Titan X were pissed. Many of them ended up buying the 980 Ti however (or two to SLI). Nvidia knows their customers, and many of their customers are suckers.
Re: AMD just crapped themselves (Score:1)
Re: (Score:3)
AMD has been in trouble for a decade now. On one side they've been getting their ass consistently handed to them by Intel. On the other, there's Nvidia.
Re:AMD just crapped themselves (Score:5, Insightful)
Maybe so, but for regular people that just want a laptop to browse cat videos on the internet while running a few windows apps, you can't beat something with an AMD CPU for the price.
Also, every console uses AMD.
Re: AMD just crapped themselves (Score:2)
Every console uses AMD, but if you look at their semi-custom revenue, the profit margins are very slim. It's good for keeping production volume up as the smaller player, but it's not good business. Whether that's because of market realities or because AMD lowballed it is hard to say, but I guess they expected a bigger payoff in the desktop market.
Re: (Score:2)
Without their own fabs, production volume is something AMD can no longer take advantage of.
Re: (Score:3)
They pay for a night at a hotel and domestic airfare for maybe 100-200 people. Say an average airfare of $300 and a $300 room; that puts your total schmoozing cost at $60-120k, plus the smaller cost of renting a big room for the presentation.
For that small cost, you get guaranteed coverage that will fire up social media, and even reach less interested sites like Slashdot, and it all happens SIMULTANEOUSLY from all those who attended, because they want to be first to report back from the exclu
Re: (Score:2)
Maybe so, but for regular people that just want a laptop to browse cat videos on the internet while running a few windows apps, you can't beat something with an AMD CPU for the price.
Sure you can. The last AMD product I touched was a laptop with an AMD APU in it. The drivers on the HP website failed to detect the hardware correctly. AMD's own driver download utility correctly identified the chipset, yet said it wouldn't supply drivers for it and that I needed to go to the vendor. After several hours of screwing around I finally forced the AMD driver to install, which fixed most of the problems, but by that point I was so incredibly pissed off with the entire experience that when I found another adm
Re: (Score:2)
You just described my experience with a Toshiba laptop with built-in INTEL HD video 5 years ago. The "Toshiba" driver had so many bugs and slowdowns, I decided to try to get the stock Intel driver from intel.com. It didn't install automatically from the auto-detect tool Intel has, but I downloaded it manually and it fixed most of my problems.
Re: (Score:2)
Ha. You were able to download it? I had to get the driver from a 3rd-party website. This wasn't a case of autodetect not working; it was a case of: yeah, it may say AMD on it, but no, you get your thing from somewhere else.
Also, with the Intel HD graphics chip I concur. The Surface Pro 3 drivers from MS for the longest time only produced 6-bit colour output, a problem fixed by going to Intel's site and downloading their driver (no autodetect works, but finding the correct driver is trivial). ... Oh and then d
Re: (Score:2)
Read my comment again; it was all about the Intel HD driver that came with the Toshiba laptop. When I had a Radeon I never had issues with drivers, though I am aware that in the 2000s ATI had a bad rep for drivers. Today I am using a GTX 970 because it is the best card for the money.
Re: AMD just crapped themselves (Score:2)
Re: (Score:2)
Every console uses AMD not because it's the performant choice, but because it keeps costs down, and console margins are narrow. Pound for pound, there is no comparison between AMD and Intel: Intel wipes the floor with them.
At the end of the day if performance matters you don't buy AMD.
Re: (Score:2)
Depends on the task, not everything benefits from GPGPU.
Re: (Score:3, Informative)
Re:AMD just crapped themselves (Score:4, Insightful)
The Intel/AMD cross-licensing agreement, in place since the SEVENTIES, says "lol". Further, when it comes to R&D and patents, AMD is a cruel joke in comparison. AMD and Intel are NOT peers; they are not equals; they are not partners. Intel is a Titan of monstrous proportions; AMD is a demi-god at best. People get this idea in their head that AMD is an equal competitor to Intel. They aren't, not even close.
Re: (Score:2)
I suspect what is going to happen is that the PC market will become more like the market that existed before PCs. We had "development systems" running operating systems like CP/M, which some people were using as PCs. The problem for AMD is that there is not enough demand in such a market to support both them and Intel. Intel talks about moving more into the server business, but I do not think that can save AMD when Intel's CPU development costs are also covered by their PC processors.
And of course all it takes is
Re: (Score:2)
It's been indicated that big Pascal is for HPC only, because HBM2 availability is not there yet. I suppose they plan to sell what they can make at very high margins, and that's all.