NVIDIA Launches Maxwell-Based GeForce GTX 980 and GeForce GTX 970 GPUs
MojoKid (1002251) writes: NVIDIA has launched two new high-end graphics cards based on its latest Maxwell architecture. The GeForce GTX 980 and GTX 970 replace NVIDIA's current high-end offerings, the GeForce GTX 780 Ti, GTX 780, and GTX 770. The GTX 980 and GTX 970 are broadly similar: the cards share the same 4GB frame buffer and GM204 GPU, but the GTX 970's GPU is clocked a bit lower and features fewer active Streaming Multiprocessors and CUDA cores. The GeForce GTX 980's GM204 GPU has all of its functional blocks enabled, with a base clock of 1126MHz and a Boost clock of 1216MHz. The GTX 970 clocks in with a base clock of 1050MHz and a Boost clock of 1178MHz. The 4GB of video memory on both cards is clocked at a blisteringly fast 7GHz (effective GDDR5 data rate). NVIDIA also optimized the GM204's power efficiency by tweaking virtually every part of the GPU, and claims that Maxwell SMs (Streaming Multiprocessors) offer double the performance of GK104 and double the performance per watt as well. NVIDIA has also added support for new features, namely Dynamic Super Resolution (DSR), Multi-Frame Sampled Anti-Aliasing (MFAA), and Voxel Global Illumination (VXGI). Performance-wise, the GeForce GTX 980 is the fastest single-GPU graphics card ever tested. The GeForce GTX 970 isn't as dominant overall, but its performance was impressive nonetheless: it typically performed about on par with a GeForce GTX Titan and traded blows with the Radeon R9 290X.
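For context on that 7GHz figure, the raw memory bandwidth is easy to work out. A quick back-of-the-envelope sketch in Python, assuming GM204's 256-bit memory interface (a spec-sheet figure, not stated in the summary):

# Back-of-the-envelope GDDR5 bandwidth for GM204 (GTX 980/970). The
# 256-bit bus width is a spec-sheet figure, not stated in the summary.
effective_rate_gbps = 7.0   # 7 GHz effective GDDR5 data rate, per pin
bus_width_bits = 256        # assumed GM204 memory interface width

bandwidth_gb_s = effective_rate_gbps * bus_width_bits / 8
print(f"Peak memory bandwidth: {bandwidth_gb_s:.0f} GB/s")   # -> 224 GB/s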
What we need here... (Score:4, Insightful)
Is mod points for the articles themselves.
Imagine the possibilities.
Re: (Score:2)
Imagine the possibilities.
Oh, you mean like a karma score for submitters, which would influence the priority of their submissions in the queue. We can only dream ;-)
Slashvertisement for HotHardware! YAY! (Score:1)
Oh joy.
Instead of linking to multiple articles, we just see endless links to the HH blurb.
Re: (Score:2)
Only 4 displays, sticking to AMD. (Score:2, Informative)
Can only drive up to 4 displays, while pretty much any AMD card can drive 6. I don't want to play games but want more screen real estate for software development.
Re:Only 4 displays, sticking to AMD. (Score:5, Funny)
I know what you mean. I don't want to play games but am looking to carry sacks of grain through the Andes, and these cards lack the qualities of a trusty burro.
Re: (Score:2)
I don't think playing games with burros is on-topic on slashdot.
Re: (Score:3)
Re: (Score:3)
Nope: http://www.geforce.com/hardwar... [geforce.com]. Trust me, I have tried, and they don't allow you to drive more than four even in SLI mode.
Finally it has enough monitor ports to imply you could drive 5; nope, only 4.
Re: (Score:2)
What about if you don't have the cards in SLI?
Re:Only 4 displays, sticking to AMD. (Score:4, Informative)
If you want more than 4, you have to pay up for the Quadro series. I had more than four in SLI on GTX 760 cards, until they did a driver update and set the max to 4.
Re: (Score:2)
Re: (Score:2)
He's not actually interested (Score:2)
It is AMD fanboy sour grapes. For some reason some people get really personally invested in their choice of graphics card. So when the other company comes out with a card that is substantially better than what their company has, they get all ass hurt and start trying to make excuses as to why it is bad. The nVidia fans did that back when the AMD 5870 came out and nVidia had no response. Same deal here. The GeForce 900 series is a fair bit faster than the AMD 200 series, and way more power efficient.
Re: (Score:2)
If I had that setup at home, I'd find the fucking postage stamp I'm allocated at work to be insufferable. Actually I already do. If I had that setup at work, I'd have to drop a few grand to duplicate it at the house.
I'm pretty sure I'm not going to find a game I'd want to play that'd allow me to make effective use of that many monitors. Maybe if I were building a realistic VR flight simulator with X-Plane, or something. I guess you could use it fo
Re: (Score:2)
If I had that setup at home, I'd find the fucking postage stamp I'm allocated at work to be insufferable.
Sounds like you work for a crap employer. Most companies nowadays recognise that developers are far more productive with at least 2 monitors. Where I work we all have 2 Dell monitors attached to a docking station for our company-issue laptops, so we can actually use 3 screens if we don't mind one being smaller than the other two.
If I had that setup at work, I'd have to drop a few grand to duplicate it at the house.
Why? Personally I try to avoid working unpaid hours from home; if it were part of my job requirements then I would want the company to buy me the necessary gear.
I don't mind th
Re: (Score:2)
Most companies nowadays recognise that developers are far more productive with at least 2 monitors. Where I work we all have 2 Dell monitors attached to a docking station for our company-issue laptops, so we can actually use 3 screens
Not quite true. Any perceived increase in productivity is greatly offset by my trying to fine-tune the scripts that handle the screens when docking and undocking, and by the docking station exposing all 5 (!) of its video outputs (HDMI, DVI, VGA and 2x DP) as "DP2" and cloning the output between them.
Yes, I am running Linux. No, this is not 1999.
Thank you, Lenovo & Intel.
Re: (Score:2)
Re: (Score:2)
For professional use, yea, that does suck...
For gaming, three displays is the sweet spot, but nVidia's Surround isn't up to the level of Eyefinity yet. I tried it with a pair of 780 Ti cards and it just isn't as good as Eyefinity.
Re: (Score:1)
Then you don't want a super expensive gaming card, but two cheapo nvidia cards in SLI.
Re:Only 4 displays, sticking to AMD. (Score:5, Informative)
No, you are still limited to 4 displays; NVIDIA pulled a dick move and limits it to 4 in the driver. You have to have a Quadro card to drive more than 4 displays. Note that when I first got my 2 GTX 760 cards I could drive 6 displays, and then a driver update limited it to 4 displays even in SLI.
Re: (Score:2)
Someone needs to mod this up.
And also figure out how to hack the driver to increase the limit back up to 6.
Re: (Score:3)
You can http://www.eevblog.com/forum/c... [eevblog.com], but you also need a hack to report your machine as an approved machine to run Quadro in SLI.
Re: (Score:2)
You can hack the drivers and the graphics card's reported name, or you can roll back to the older drivers. The functionality is there, it's just disabled in current drivers.
Re: (Score:2)
Yes, I know that you can hack it, but that NVIDIA does this at all is just a pain in the ass.
Re: Only 4 displays, sticking to AMD. (Score:2)
If you need six displays for software development, something is seriously wrong with you.
Re: (Score:2)
Nope, I don't need 6, but it is helpful: the top displays usually have API documentation open, and the bottom three are usually editor windows.
Re: (Score:2)
Can only drive up to 4 displays, while pretty much any AMD card can drive 6. I don't want to play games but want more screen real estate for software development.
Then why look at this card at all? You must be able to get something FAR cheaper if all you want is 2D real estate for software development. Wouldn't 2 or 3 cheaper cards be a far better purchase, even if you needed to buy a new motherboard to support them?
Tips? (Score:2)
I'm in the market for a GPU this December. My requirements are: less than $250, low power consumption, good compatibility with the Z77 chipset (Intel i5 3570K), no overclocking needed, low noise, the best performance/price ratio, and of course better performance than my current GPU (MSI Twin 7850 2GB).
So far I have seen either an R9 280 or a GTX 760.
Would anyone like to offer some advice?
Re: (Score:2)
I have just seen a 285 review; it's basically a 280 with more features at the same price.
From what I have seen, the 285 beats the 760 on every front, including power draw, and the nVidia response was to lower the price by about $30.
I haven't bought nVidia for years; while they have great drivers and performance, power use and price haven't been competitive.
I will keep looking to see if the prices change or a new board gets introduced by December. I am not happy the 285 is still a 2GB card, while my 22" monito
Re: (Score:2)
nVidia is like Intel: you pay for performance.
Very, very happy with GTX Titan and 780Ti for low noise and performance.
Re: (Score:2)
Unless you're running 50 of these things in your basement 24/7, I doubt you'll notice enough difference on your power bill to justify picking one over the other. I would think driver quality and performance are much more important, as you're not buying the GTX line for power economy.
Re: (Score:2)
In my country, if you go over 1300 kWh/month, your electric bill doubles. Yes, it's insane. Check your power bill and tell me how much you use.
Re:Tips? (Score:4, Informative)
The R9 280 certainly doesn't count as low power (250W); the R9 285 is considerably better in that department (190W) and has some newer features to boot, and with a $249 MSRP it should just barely squeeze inside your budget. To stay within your budget, the nVidia alternative is the GTX 760, but I wouldn't buy a Kepler card today: too hot and too noisy. Unfortunately there's no Maxwell to match your specs; there's a gap between the GTX 750 Ti (which wouldn't be a performance upgrade) and the GTX 970 (which blows your budget at $329).
Personally I was very surprised by the GTX 970 launch price though; the GTX 980 @ $549 was as expected, but the 970 delivers 13/16ths of the processing power with the same memory size and bandwidth for over $200 less. I bought two to use in an SLI setup, and in the games that scale nicely it's a kickass value. I suspect that by December this will have had some market effect at the $250 price point too, so I'd say check again then. Asking for advice 2-3 months out in a market that changes so quickly doesn't really make much sense.
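For anyone checking the math, the 13/16ths figure falls straight out of the shader counts. A rough sketch using spec-sheet core counts and launch prices (not figures from this thread), ignoring the clock differences:

# Sanity check of the value argument above. Core counts (1664 vs 2048
# CUDA cores, i.e. 13 vs 16 SMMs) and launch prices are spec-sheet
# figures, not taken from this thread; clock differences are ignored,
# so shader count is only a crude performance proxy.
cores_970, cores_980 = 1664, 2048
price_970, price_980 = 329, 549

ratio = cores_970 / cores_980
print(f"GTX 970 shader ratio: {ratio:.4f} ({cores_970}/{cores_980} = 13/16)")
print(f"Dollars per unit of relative throughput: 970 ~ ${price_970 / ratio:.0f}, 980 ~ ${price_980}")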
Re: (Score:2)
I was looking at the 970 just before you posted; I can't believe that kind of performance at less than $100 over my budget, with that incredibly low power draw (which is important to me: in my country we are heavily penalized for high power consumption, since we have blackouts every day).
It's tempting... Let's see what happens in December.
Re: (Score:2)
Faster using less power! (Score:3)
What is nice to see is that these cards are slightly faster than the generation they replace, while using less power.
The power use of video cards has been creeping up in recent years, going up to the point where a pair of PCI-E power cables was required for one card.
Nice to see a fast card that can be put into a modest system: the 970 is 20% slower than the 980, while costing 40% less money and using only 165W of power.
That is low enough that it should work with most cheaper prebuilt systems from the likes of Dell/HP/Acer etc.
Re: (Score:2)
Yes, we will start to see more of this as the process technology leaps slow down.
They will have to find improvements in the design of the chips, rather than just depending on making them smaller as they have for years.
Even if no new process technology came out for five years, I suspect they could keep making these better and better
Re: Faster using less power! (Score:2)
Re: (Score:2)
Yes, but it consumes less power than the 770 it replaces.
Also, it only barely needs the second connection, that is likely there for margin and overclocks, it could probably run just fine at stock clocks with a single PCI-E cable.
Re: Faster using less power! (Score:2)
Re: (Score:2)
The connectors aren't the issue, the power output of the power supply and electric bill are.
The 770 needed a stronger power supply than the 970 does. You can get away with perhaps 50-100 fewer watts in your PSU, and it saves money over time on your power bill.
These are good things.
Re: (Score:2)
Not really. If you're burning €400-1000 on a graphics card, you are not going to care about a few extra euros a year on your electric bill.
I can understand this on the low end, but "woo power savings" on the high end is nothing short of amusingly silly. High end has always been and will always be about one thing and one thing alone: raw power.
There's a reason why power supplies that push 1500 watts exist for enthusiasts.
Re: (Score:2)
Actually, I disagree... not everyone directly compares the cost of buying something to the cost of power. That power comes from somewhere, in my case, from coal power plants...
In addition, I'd like to upgrade my son's computer, he has a Dell with a limited power supply. I could of course upgrade that, but if the card needs less power, I can put it in without a bunch of modification.
The other benefit is the mid tower case that Dell provides doesn't have a ton of airflow, a cooler running card needs less a
Re: (Score:2)
I don't think you understand. These cards cost more than an entry-level PC. The 980 in fact costs more than a midrange PC. Alone.
If you're shopping for a card like this, you really aren't going to give a damn about power bill increase from it.
Re: (Score:2)
I consider a mid range PC to be north of $550, so I have to disagree with you there. :)
You continue to miss the point... Heat, noise, and source of power are all concerns, none of which have to do with the price of power.
As it stands, my power is 11 cents per kWh, so it matters less to me. But my power comes from coal, I don't like heat and noise, I have to air-condition my home, and the airflow in my son's computer case isn't great.
So yea, I do like that these use less power.
Re: (Score:2)
I'm not sure what you consider a mid-range PC, but I do know what the suppliers consider one, and it's around $500-600 at most today.
And frankly, the "oh noes coal power" argument is equally silly. You're talking a hundred watts of savings at best, and that's when the machine is under load.
Considering the total share of these top-end cards on Steam (below 1%), these savings are literally less than a rounding error.
Re: (Score:2)
So spend them. The market is chock full of aftermarket water cooling solutions for people like you. Because people who are the target audience for these cards aren't going to balk at the cost of a water cooling addition to their box when it's a tiny fraction of what they paid for said box.
Re: (Score:2)
I get the feeling you're one of those "let them eat cake" types that thinks that *70 and *80 cards are common and not something that rich older men like us grab because we have extra money to spend.
I recommend a look at steam hardware survey for a harsh dose of reality.
Re: (Score:2)
20-year-old kids, and slightly older people working minimum wage or a bit above it, buy these kinds of cards, in some European countries at least (where you get health insurance with any job, and not paying for a car can be an option if you're not e.g. a construction worker).
At around 300 euros it's not a particularly ultra-expensive mass consumer product. It's nice that a 400W PSU and a cheapo vanilla tower will be adequate, plus power bills are only increasing anyway.
Many, many people don't follow the upgrade
Re: (Score:2)
I get a strong suspicion that you're a right-wing American who believes Fox News' fiction about Europe.
In real life, on the other hand, most of Europe has been in a sharp recession for the last five years, with youth unemployment at record highs and those youths who are employed struggling to remain employed while suffering reduced wages.
Re: (Score:2)
Considering that most of the released custom-cooled 970s have an 8-pin and a 6-pin, it's far more likely that dual 6-pin is barely enough for the stock card rather than massive overkill.
Re: (Score:2)
Re: (Score:2)
Depending on your country, the payback time, assuming ten hours of uptime daily under load, is between three and ten years (ten years for mine; I did the math about a week ago).
The chances of a graphics card surviving that long, especially under constant load, are pretty slim.
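The payback arithmetic itself is simple. A rough sketch with clearly made-up inputs (the wattage delta, price premium and electricity rates are illustrative assumptions; actual numbers depend entirely on your cards, your usage and your tariff):

# Rough payback arithmetic for a more power-efficient card. The wattage
# delta, price premium and electricity rates below are illustrative
# assumptions; plug in your own numbers.
watts_saved = 65        # assumed saving under load, in watts
hours_per_day = 10      # uptime under load, as in the comment above
premium_eur = 200       # assumed effective extra cost of the upgrade

kwh_per_year = watts_saved / 1000 * hours_per_day * 365
for rate in (0.10, 0.20, 0.30):   # EUR per kWh
    years = premium_eur / (kwh_per_year * rate)
    print(f"{rate:.2f} EUR/kWh -> break-even after {years:.1f} years")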
Re: (Score:2)
Re: (Score:2)
Which is why you're not the target audience for top-end cards. People who are, if faced with the problem of heat, simply spend another hundred on a water cooling solution that is very quiet, and on a much more efficient PSU that would easily run one or two of those cards with minimal fan activity.
Re: (Score:2)
Since you can sell (or otherwise use) the 770, if you pay a premium rate for power, the savings over 2 years may well pay for the upgrade. :)
This is bad for AMD (Score:1)
The GTX 970 is as fast as AMD's flagship R9 290X, much more power efficient, and $170 cheaper. This means AMD will have to knock down prices by a huge amount, and
they are sort of depending on graphics revenue to break even because of falling CPU market share.
Looking for info on running 4k screens (Score:3)
This might be a candidate for Ask Slashdot, I guess
Re: (Score:2)
He isn't asking to watch movies, and it doesn't sound like games either, if he is willing to consider 30Hz.
For desktop environments, more resolution is usually better.
Re: (Score:2)
What are you talking about? Any desktop environment is perfectly happy running at 4K. Most games made in the past few years will render at 4K (whether your graphics card can handle it is something else entirely). 4K televisions are still a bit of a solution without a problem, but I could take advantage of a 4K screen on my computer immediately.
Have 30" 2560x1600 @ 60Hz now (Score:3)
Re:Looking for info on running 4k screens (Score:4, Informative)
As I understand it, 30p is okay for photo work but a pretty big compromise for general desktop use, so I wouldn't do it. I have a 3840x2160@60p 28" monitor hooked up over DisplayPort 1.2 using SST (single stream transport). It works very well, and I can also hook it up to my 1080p TV at the same time on my GTX 670. Just bought dual GTX 970s to replace it though.
There are three ways to support 4K content:
HDMI 2.0
DisplayPort 1.2+ over SST
DisplayPort 1.2+ over MST
Avoid MST (multiple stream transport), it's not worth the issues. DisplayPort 1.2 has been around for a while, and the screen is usually the blocker on whether you can use SST. My screen (Samsung UD590) can, so I do, and that works great. HDMI 2.0 is brand new; the GTX 970/980 are the first graphics cards to support it, but I suppose it's the only means to hook up 4K to a UHDTV, as I understand most of these don't have a DisplayPort. That's what it's designed to do anyway, but if you jump on HDMI 2.0 now you'll really be the first to test it. For me that's not even an option; I hook it through the sound system and that doesn't support HDMI 2.0 pass-through. I find it's not that essential at couch distance anyway; it's when sitting up real close that you notice it most.
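To see why the interface matters, a rough bandwidth estimate helps; the link limits below are from the published specs, while the blanking overhead is an approximation:

# Why 4K@60 needs DisplayPort 1.2 (SST) or HDMI 2.0. The link limits are
# published spec figures; the blanking overhead is an approximation
# (roughly CVT reduced blanking).
width, height, refresh, bpp = 3840, 2160, 60, 24
blanking_overhead = 1.07

pixel_clock_mhz = width * height * refresh * blanking_overhead / 1e6
payload_gbps = pixel_clock_mhz * bpp / 1000

print(f"~{pixel_clock_mhz:.0f} MHz pixel clock, ~{payload_gbps:.1f} Gbps of pixel data")
print("HDMI 1.4: 340 MHz TMDS max  -> 4K capped at 30 Hz")
print("HDMI 2.0: 600 MHz TMDS max  -> 4K@60 OK")
print("DP 1.2:   17.28 Gbps usable (HBR2, 4 lanes) -> 4K@60 OK over a single stream")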
Re: (Score:2)
Re:Looking for info on running 4k screens (Score:4, Informative)
OK, I have a Seiki 4K 39" running at 3840*2160@30hz, for Autocad and GIS, it's like running a 2*2 19" 1080p monitor array without the bezels in the middle, works fine with my Nvidia GeForce GTX 560Ti, but I had to set a custom resolution in the nvidia control panel to use it at 4k@30hz and 1080p@120hz for gaming (the only TV I know to support this refresh rate without frame interpolation, which it doesn't have), and sometimes i set it at 4k@24hz for watching 1080p movies. Since the card doesn't have official support for 4k sometimes there is some loss of sync @ 4k@30hz or 1080p@120hz, I fix it setting the refresh rate to something lower or higher and back again. Color can be very good after calibration. The sound of the speakers in the TV it's pretty bad.
This TV only has HDMI 1.4a, VGA and component inputs, and only supports 4K on the HDMI inputs.
The newly launched Nvidia GTX 980 and 970 support HDMI 2.0 and DP, so these can run 4K@60Hz with TVs and monitors that support it; I think some Samsung and LG TVs advertise HDMI 2.0 and DP.
The GTX 970 and 980 support multiple displays running 4K@60Hz simultaneously, 4 of them IIRC.
Here are a couple of threads where I found most of the information before I bought it:
http://www.overclock.net/t/1442986/playing-with-my-seiki-se39uy04-got-it-to-do-1440p-at-60hz-playing-with-other-custom-stuff
http://hardforum.com/showthread.php?t=1756171
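For anyone trying the same custom-resolution trick on Linux, here is a minimal sketch of the equivalent setup via xrandr. The output name "HDMI-0" is an assumption, the modeline is the standard CEA timing for 2160p30 (297 MHz pixel clock, within HDMI 1.4's limit), and the NVIDIA driver may additionally need relaxed mode validation, so verify all of this against your own setup:

# Rough Linux equivalent of setting a custom 3840x2160@30 mode with
# xrandr instead of the NVIDIA control panel. Output name and modeline
# are assumptions; adjust for your own card, driver and display.
import subprocess

MODE = "3840x2160_30"
TIMINGS = ["297.00", "3840", "4016", "4104", "4400",
           "2160", "2168", "2178", "2250", "+hsync", "+vsync"]

def set_custom_4k30(output="HDMI-0"):
    # Register the mode, attach it to the output, then switch to it.
    subprocess.run(["xrandr", "--newmode", MODE, *TIMINGS], check=True)
    subprocess.run(["xrandr", "--addmode", output, MODE], check=True)
    subprocess.run(["xrandr", "--output", output, "--mode", MODE], check=True)

set_custom_4k30()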
Thanks (Score:2)
The newly launched Nvidia GTX 980 and 970 support HDMI 2.0 and DP, so these can run 4K@60Hz with TVs and monitors that support it; I think some Samsung and LG TVs advertise HDMI 2.0 and DP.
OK, so it's a somewhat reasonable idea to get 60Hz on a TV without spending the cash for a 4K monitor.
here are a couple threads where I found most of the information before I bought it: http://www.overclock.net/t/144... [overclock.net] http://hardforum.com/showthrea... [hardforum.com]
Thanks, I'll have a look at these.
Samsung UE40HU6900 has HDMI 2.0, cheap enough (Score:2)
Thanks again for the help!
Re: (Score:2)
I just bought and installed an iiyama B2888UHSU-B1 for ~EUR500. It runs great at 60Hz over DisplayPort 1.2 on an AMD 7950. It's a TN panel (by CMO, which apparently is used in most of the 4K monitors at this price point), but it performs quite well in the color department, according to proper tests ( http://nl.hardware.info/tv/802... [hardware.info] - Dutch, but the tables shown at certain points in the video should be intelligible).
The 7950 drives an extra monitor over HDMI (1080p@60Hz) simultaneously without problems.
1. U
Re: (Score:1)
As for connectors, this unit has 3xHDMI(A), 2xUSB, 1xDVI, and a handful of RCA ports for sound & componen
Re: (Score:2, Interesting)
Re: (Score:1)
Except AMD now pulls the same shit. Uncrippled Hawaii has 1/2 rate DP...
Re: (Score:2)
I recently bought 2 used 580's for their 32-bit integer compute performance, for $280 total. Buying GTX 780 Ti's with comparable performance would have cost me 5 times as much.
Really? (Score:1)
Re: (Score:2)
Actually they do. nVidia's desktop cards have been very well supported for as long as I can remember (going back to my first nVidia setup, with a GeForce 3).
Still 28nm (Score:5, Informative)
A lot of what you see happening on the GPU and mobile fronts is being dictated by the failure of TSMC and other fabs to transition to 20nm for processors (memory is a lot easier and reached 16nm in 2013). Intel made the transition from 32nm to 22nm last year with Haswell and Bay Trail. The other fabs were supposed to leapfrog Intel by going from 28nm to 20nm this year. They haven't, which is what's allowed Intel to produce Atom SoCs with power usage approaching that of ARM SoCs. ARM has the lower-power tech, but Intel's smaller lithography is mostly wiping out that advantage. If you see Intel successfully make the transition to 14nm in 2015 while the other fabs can't get 16nm to work, things are going to get really interesting on the mobile SoC front.
The GPU front is bleaker. Both nVidia and AMD use third party fabs like TSMC, so there's no competitive advantage to be had. We've just had to suffer with stagnating product lines and slow product releases because the expected lower power consumption in GPUs from 20nm didn't happen in 2014.
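For a sense of why a full node shrink matters so much, ideal die-area scaling goes with the square of the feature-size ratio. A first-order sketch only (real processes scale worse than this):

# Idealized die-area scaling between process nodes; a first-order
# approximation only, real processes scale worse than this.
def area_scale(old_nm, new_nm):
    return (new_nm / old_nm) ** 2

for old, new in [(28, 20), (22, 14), (28, 16)]:
    print(f"{old}nm -> {new}nm: ~{area_scale(old, new):.0%} of the original area")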
Re:Still 28nm (Score:4, Informative)
I doubt that.
TSMC 20nm will be ready for GPUs a lot sooner than their 16nm process. The only reason there are no 20nm GPUs yet is because the initial ramp was fully booked out by Apple.
Meanwhile, a comparison of Apple's 20nm A8 density versus 14nm Core M, indicates Intel's 14nm may not have such a density advantage as they claim: https://www.semiwiki.com/forum... [semiwiki.com]
Re: (Score:1)
The GPU front is bleaker. Both nVidia and AMD use third party fabs like TSMC, so there's no competitive advantage to be had.
Intel 14nm fab capacity is potentially game changing -- they've held back 2 huge fabs with double the normal capacity, and they still have enough for their own chips. I predict that Intel is going into the 14nm business, and either nVidia or AMD would be interesting clients.
Re: (Score:3)
At this point I think it's safe to write off TSMC's 20nm fab process. It's not gonna happen [...]
Except that it already is shipping. Apple's A8 chip used in the iPhone 6 and iPhone 6 Plus is manufactured using TSMC's 20nm process [macrumors.com]. And given Apple's proclivity for consuming entire manufacturing lines for their products, it's entirely possible that TSMC had to turn away other customers if they wanted to keep Apple, simply because they lacked the capacity to do otherwise. It also makes sense why they haven't been able to talk about the fact that they had a major customer lined up, given how religiously Ap
Re: (Score:2)
Depends on resolution used. At 1440p, you're already going to strain a lot of graphics cards to the limits, especially if you want relatively stable 60fps.
At 2160p, you're pretty much going to need the best of the best.
7xx to 9xx ?? (Score:1)
Re: (Score:2)
Next you'll say Haswell's just a tweaked NetBurst maybe?
Useless comparisons (Score:2)
Blender Cycles Rendering Engine (Score:2)
Reading this makes me behave like a kid in a candy store, seriously.
Re: (Score:1)
You're either using very very wrong terminology to describe a GPU rendering cluster, or you're seriously misinformed.
There is no such thing as a card with 1300 GPUs; though there are cards with 2 GPUs on the same board, it's essentially impossible to fit more. You might have a GPU with ~1300 ALUs, but there's an enormous difference between the two. A GPU is an independent device, roughly speaking. An ALU is not, and in GPUs, they execute in lockstep in large sets (64 in AMD's GCN, 192 in nVidia's Kepler,
Re: (Score:2)
It has 1300+ cores (some of the bigger cards have over 2300), and yes, I realize that they're not full-fledged CPUs but just units highly specialized to perform certain calculations, kind of like FPGAs if you like.
Anyone tried the newer, more expensive Radeons? (Score:2)
Re: (Score:2)