
NVIDIA Launches Maxwell-Based GeForce GTX 980 and GeForce GTX 970 GPUs

MojoKid (1002251) writes NVIDIA has launched two new high-end graphics cards based on their latest Maxwell architecture. The GeForce GTX 980 and GTX 970 are based on Maxwell and replace NVIDIA's current high-end offerings, the GeForce GTX 780 Ti, GTX 780, and GTX 770. NVIDIA's GeForce GTX 980 and GTX 970 are somewhat similar as the cards share the same 4GB frame buffer and GM204 GPU, but the GTX 970's GPU is clocked a bit lower and features fewer active Streaming Multiprocessors and CUDA cores. The GeForce GTX 980's GM204 GPU has all of its functional blocks enabled. The fully-loaded GeForce GTX 980 GM204 GPU has a base clock of 1126MHz and a Boost clock of 1216MHz. The GTX 970 clocks in with a base clock of 1050MHz and Boost clock of 1178MHz. The 4GB of video memory on both cards is clocked at a blisteringly-fast 7GHz (effective GDDR5 data rate). NVIDIA was able to optimize the GM204's power efficiency, however, by tweaking virtually every part of the GPU. NVIDIA claims that Maxwell SMs (Streaming Multiprocessors) offer double the performance of GK104 and double the perf per watt as well. NVIDIA has also added support for new features, namely Dynamic Super Resolution (DSR), Multi-Frame Sampled Anti-Aliasing (MFAA), and Voxel Global Illumination (VXGI). Performance-wise, the GeForce GTX 980 is the fastest single-GPU powered graphics card ever tested. The GeForce GTX 970 isn't as dominant overall, but its performance was impressive nonetheless. The GeForce GTX 970 typically performed about on par with a GeForce GTX Titan and traded blows with the Radeon R9 290X.
This discussion has been archived. No new comments can be posted.

  • by ThomasBHardy ( 827616 ) on Saturday September 20, 2014 @10:46AM (#47953953)

    What we need is mod points for articles themselves.

    Imagine the possibilities.

    • by sribe ( 304414 )

      Imagine the possibilities.

      Oh, you mean like a karma score for submitters, which would influence the priority in the queue of their submissions. We can only dream ;-)

  • Oh joy.

    Instead of linking to multiple articles, we just see endless links to the HH blurb.

  • It can only drive up to 4 displays, while pretty much any AMD card can drive 6. I don't want to play games, but I want more screen real estate for software development.

    • by pushing-robot ( 1037830 ) on Saturday September 20, 2014 @11:03AM (#47954041)

      I know what you mean. I don't want to play games but am looking to carry sacks of grain through the Andes, and these cards lack the qualities of a trusty burro.

    • by Elledan ( 582730 )
      Not true, each DisplayPort 1.2 output on a GTX 980 card can drive up to two monitors daisy-chained, so with a single GTX 980 you could have up to 6 displays with DisplayPort alone, more if the other outputs are independently driven (haven't checked into this yet).
    • Sure, if you can afford that card, you can probably afford a couple of monitors, but honestly, other than a few oddballs, who's really going to have more than 2 or 3 monitors at once? If nothing else, the space they take up really adds up in most home environments. So maybe these cards aren't made for the rich guy with specialty needs, which raises the question: "yeah, so what?"
      • It is AMD fanboy sour grapes. For some reason some people get really personally invested in their choice of graphics card. So when the other company comes out with a card that is substantially better than what their company has, they get all ass hurt and start trying to make excuses as to why it is bad. The nVidia fans did that back when the AMD 5870 came out and nVidia had no response. Same deal here. The GeForce 900 series are a reasonable bit faster than the AMD 200 series, and way more power efficient.

    • by Greyfox ( 87712 )
      So how many Xterms can you have open with 6 30" displays?

      If I had that setup at home, I'd find the fucking postage stamp I'm allocated at work to be insufferable. Actually I already do. If I had that setup at work, I'd have to drop a few grand to duplicate it at the house.

      I'm pretty sure I'm not going to find a game I'd want to play that'd allow me to make effective use of that many monitors. Maybe if I were building a realistic VR flight simulator with X-Plane, or something. I guess you could use it fo

      • If I had that setup at home, I'd find the fucking postage stamp I'm allocated at work to be insufferable.

        Sounds like you work for a crap employer. Most companies nowadays recognise that developers are far more productive with at least 2 monitors. Where I work we all have 2 Dell monitors attached to a docking station for our company-issued laptop, so we can actually use 3 screens if you don't mind one being smaller than the other two.

        If I had that setup at work, I'd have to drop a few grand to duplicate it at the house.

        Why? Personally I try to avoid working unpaid hours from home; if it were part of my job requirements then I would want the company to buy me the necessary gear.

        I don't mind th

          Most companies nowadays recognise that developers are far more productive with at least 2 monitors. Where I work we all have 2 Dell monitors attached to a docking station for our company-issued laptop, so we can actually use 3 screens

          Not quite true. Any perceived increase in productivity is greatly offset by me trying to fine-tune the scripts that handle the screens when docking and undocking, and the docking station showing all 5 (!! HDMI, DVI, VGA and 2xDP) video outputs as "DP2", and cloning the output between them.
          Yes, I am running Linux. No, this is not 1999.
          Thank you, Lenovo & Intel.

        • by Greyfox ( 87712 )
          Oh I do my own little projects at home. Lately I've been generating a fair bit of video from GoPro footage of my skydives. I also do some programming for fun. Traditionally my setup at home has always been a little better than my setup at work. If I got used to working with a huge amount of screen space at work, I'd want something similar at home.
    • For professional use, yea, that does suck...

      For gaming, three displays is the sweet spot, but NVIDIA's Surround isn't up to the level of Eyefinity yet. I tried it with a pair of 780 Ti cards and it just isn't as good.

    • by Luckyo ( 1726890 )

      Then you don't want a super expensive gaming card, but two cheapo nvidia cards in SLI.

    • If you need six displays for software development, something is seriously wrong with you.

      • by bongey ( 974911 )

        Nope, I don't need 6, but it is helpful: the top three displays usually have API documentation open, and the bottom three are usually editor windows.

    • It can only drive up to 4 displays, while pretty much any AMD card can drive 6. I don't want to play games, but I want more screen real estate for software development.

      Then why look at this card at all? You must be able to get something FAR cheaper if all you want is 2D real estate for software development. Wouldn't 2 or 3 cheaper cards be a far better purchase, even if you needed to buy a new motherboard to support them?

  • I'm in the market for a GPU this December. My requirements are: less than $250, low power consumption, good compatibility with the Z77 chipset (Intel i5 3570K), no overclocking needed, low noise, the best performance/price ratio, and of course better performance than my current GPU (MSI Twin 7850 2GB).

    So far I have narrowed it down to either an R9 280 or a GTX 760.
    Would anyone like to offer some advice?

    • Re:Tips? (Score:4, Informative)

      by Kjella ( 173770 ) on Saturday September 20, 2014 @11:28AM (#47954165) Homepage

      The R9 280 certainly doesn't count as low power (250W); the R9 285 is considerably better in that department (190W) and has some newer features to boot, and with a $249 MSRP it should just barely squeeze inside your budget. Within your budget the nVidia alternative is the GTX 760, but I wouldn't buy a Kepler card today: too hot and too noisy. Unfortunately there's no Maxwell card to match your specs; there's a gap between the GTX 750 Ti (which wouldn't be a performance upgrade) and the GTX 970 (which blows your budget at $329).

      Personally I was very surprised by the GTX 970 launch price though; the GTX 980 @ $549 was as expected, but the 970 delivers 13/16ths of the processing power with the same memory size and bandwidth for over $200 less. I bought two to use in an SLI setup; in the games that scale nicely it's a kickass value. I suspect that by December this will have had some market effect at the $250 price point too, so I'd say check again then. Asking for advice 2-3 months out in a market that changes so quickly doesn't really make much sense.
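
      For anyone who wants to sanity-check that value claim, here is a minimal Python sketch using only the figures quoted in this comment (the 13/16 throughput ratio and the $549/$329 launch prices); it illustrates the arithmetic, it is not a benchmark.

          # Value comparison from the launch prices and the 13/16 throughput
          # ratio quoted above; actual game performance varies by title.
          gtx980 = {"price_usd": 549, "relative_perf": 1.0}
          gtx970 = {"price_usd": 329, "relative_perf": 13 / 16}  # ~0.81x

          for name, card in (("GTX 980", gtx980), ("GTX 970", gtx970)):
              perf_per_dollar = card["relative_perf"] / card["price_usd"]
              print(f"{name}: {perf_per_dollar * 1000:.2f} perf per $1000")
          # GTX 980: 1.82 perf per $1000
          # GTX 970: 2.47 perf per $1000  (about 35% more performance per dollar)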

      • I was looking at the 970 just before you posted; I can't believe that kind of performance at less than $100 over my budget, with that incredibly low power draw (which is important to me; in my country we are heavily penalized for high power consumption, since we have blackouts every day).

        It's tempting... Let's see what happens in December.

      • by Nemyst ( 1383049 )
        Kepler, hot and noisy? And then you recommend a GCN card? As much as I'll fully agree that you'll get better performance per dollar from AMD, their cards are vastly less efficient than Nvidia's.
  • by FlyHelicopters ( 1540845 ) on Saturday September 20, 2014 @10:52AM (#47953983)

    What is nice to see is that these cards are slightly faster than the generation they replace, while using less power.

    The power use of video cards has been creeping up in recent years, to the point where a pair of PCI-E power cables is required for a single card.

    Nice to see a fast card that can be put into a modest system: the 970 is 20% slower than the 980, while costing 40% less money and using only 165W of power.

    That is low enough that it should work with most cheaper prebuilt systems from the likes of Dell/HP/Acer etc.
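
    A rough sketch of the PSU arithmetic behind that point, in Python; the supply sizes and the ~150 W figure for the rest of the system are assumptions for illustration, not measurements of any particular Dell/HP/Acer model.

        # Rough headroom check for dropping a GPU into an OEM desktop.
        # All wattages are illustrative assumptions, not measurements.
        def gpu_budget_w(psu_w, rest_of_system_w=150, headroom=0.8):
            """Watts left for a GPU, keeping ~20% of the PSU in reserve."""
            return psu_w * headroom - rest_of_system_w

        for psu in (300, 365, 460):  # common OEM supply sizes (assumed)
            print(f"{psu} W PSU -> ~{gpu_budget_w(psu):.0f} W free for a GPU")
        # 300 W PSU -> ~90 W   (even a 145-165 W card is a stretch)
        # 365 W PSU -> ~142 W  (borderline)
        # 460 W PSU -> ~218 W  (comfortable for this class of card)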

    • Um, on the high end of the spectrum, a pair of 6-pin PCIe power connectors are still needed. Even with the 970.
      • Yes, but it consumes less power than the 770 it replaces.

        Also, it only barely needs the second connector; that is likely there for margin and overclocking, and it could probably run just fine at stock clocks with a single PCI-E cable.

        • But are you going to be foolish enough to risk undervolting the card?
          • The connectors aren't the issue, the power output of the power supply and electric bill are.

            The 770 needed a stronger power supply than the 970 does. You can get away with perhaps 50-100 fewer watts in your PSU, and it saves money over time on your power bill.

            These are good things.

            • by Luckyo ( 1726890 )

              Not really. If you're burning 400-1000€ on a graphics card, you are not going to care about a few extra euro a year in your electric bill.

              I can understand this on the low end, but "woo power savings" on the high end is nothing short of amusingly silly. The high end has always been and will always be about one thing and one thing alone: raw power.
              There's a reason why power supplies that push 1500 watts exist for enthusiasts.

              • Actually, I disagree... not everyone directly compares the cost of buying something to the cost of power. That power comes from somewhere, in my case, from coal power plants...

                In addition, I'd like to upgrade my son's computer, he has a Dell with a limited power supply. I could of course upgrade that, but if the card needs less power, I can put it in without a bunch of modification.

                The other benefit is the mid tower case that Dell provides doesn't have a ton of airflow, a cooler running card needs less a

                • by Luckyo ( 1726890 )

                  I don't think you understand. These cards cost more than an entry-level PC. The 980 in fact costs more than a midrange PC. Alone.

                  If you're shopping for a card like this, you really aren't going to give a damn about power bill increase from it.

                  • I consider a mid range PC to be north of $550, so I have to disagree with you there. :)

                    You continue to miss the point... Heat, noise, and source of power are all concerns, none of which have to do with the price of power.

                    As it stands, my power is 11 cents per kWh, so it matters less to me, but my power comes from coal, I don't like heat and noise, I have to air condition my home, and the airflow in my son's computer case isn't great.

                    So yea, I do like that these use less power.

                    • by Luckyo ( 1726890 )

                      I'm not sure what you consider a midrange PC, but I do know what the suppliers consider one, and it's around $500-600 at most today.

                      And frankly, the "oh noes coal power" argument is equally silly. You're talking a hundred watts of savings at best, and that's when the machine is under load.

                      Considering the total share of these top-end cards on Steam (below 1%), these savings are literally less than a rounding error.

        • by Luckyo ( 1726890 )

          Considering that most of the released custom-cooled 970s have an 8-pin and a 6-pin connector, it's far more likely that dual 6-pin is barely enough for a stock card rather than massive overkill.

    • Yep. I'm upgrading my 770 to a 970 just for the power savings.
      • by Luckyo ( 1726890 )

        Depending on your country, the time to payback, assuming ten hours of uptime daily under load, is between three and ten years (ten years for mine; I did the math about a week ago).

        The chances of a graphics card surviving that long, especially under constant load, are pretty slim.
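
        For the curious, a minimal Python sketch of that payback math, assuming a 770-to-970 swap (230 W vs 145 W TDP), a ~330 EUR card price, and the ten hours under load per day mentioned above; the TDP delta, price and electricity rates are assumptions, and real savings depend on actual load.

            # Payback estimate for a GTX 770 -> GTX 970 swap at 10 h/day load.
            # Card price, TDP delta and electricity rates are assumptions.
            card_cost_eur = 330
            power_saved_w = 230 - 145                        # ~85 W TDP delta
            kwh_per_year = power_saved_w / 1000 * 10 * 365   # ~310 kWh

            for eur_per_kwh in (0.10, 0.20, 0.30):
                years = card_cost_eur / (kwh_per_year * eur_per_kwh)
                print(f"{eur_per_kwh:.2f} EUR/kWh -> payback in {years:.1f} years")
            # 0.10 EUR/kWh -> payback in 10.6 years
            # 0.20 EUR/kWh -> payback in 5.3 years
            # 0.30 EUR/kWh -> payback in 3.5 years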

        • For me, it's not about the money, it's about heat, noise, case size, power draw (as related to PSU sizing), etc. I can't keep the 770 on my desk (noise, heat), but I might be able to keep the 970 on my desk. There are other factors. I will give my 770 to my nephew and get his 750 Ti back (another Maxwell part) to put in my HTPC. When I talk about power consumption, I'm always more concerned with all the other factors besides actual electricity cost vs capital outlay. While it might not pencil out financially, t
          • by Luckyo ( 1726890 )

            Which is why you're not the target audience for top-end cards. People who are, if faced with the problem of heat, simply spend another hundred on a water-cooling solution that is very quiet and a much more efficient PSU that would easily run one or two of those cards with minimal fan activity.

      • Since you can sell (or otherwise use) the 770, if you pay a premium rate for power, the savings over 2 years may well pay for the upgrade. :)

  • The GTX 970 is as fast as AMD's flagship R9 290X, much more power efficient, and $170 cheaper. This means AMD will have to knock down prices by a huge amount, and they are sort of depending on graphics revenue to break even because of falling CPU market share.

  • by grimJester ( 890090 ) on Saturday September 20, 2014 @11:09AM (#47954075)
    I'm thinking about upgrading my monitor to 4k but I'm a bit confused about the current situation with both connectors and screens. How is running 4k@30Hz for normal desktop purposes, with no 3d gaming? Are the cheap 4k 39-40" TVs completely fine for those purposes? What connector(s) do I need from my GPU? How long would I have to wait for GPUs and TVs/monitors to support 4k@60Hz (HDMI 2.0? Displayport x.y?). Can I connect more than one screen to a single GPU card if I want 4k?

    This might be a candidate for Ask Slashdot, I guess
    • by Kjella ( 173770 ) on Saturday September 20, 2014 @11:53AM (#47954297) Homepage

      As I understand it, 30p is okay for photo work but a pretty big compromise for general desktop use, so I wouldn't do it. I have a 3840x2160@60p 28" monitor hooked up over DisplayPort 1.2 using SST (single stream transport). It works very well; I can also hook it up to my 1080p TV at the same time on my GTX 670. Just bought dual GTX 970s to replace it though.

      There are three ways to support 4K content:
      HDMI 2.0
      DisplayPort 1.2+ over SST
      DisplayPort 1.2+ over MST

      Avoid MST (multi-stream transport); it's not worth the issues. DisplayPort 1.2 has been around for a while; the screen is usually the blocker on whether you can use SST. My screen (a Samsung UD590) can, so I do, and that works great. HDMI 2.0 is brand new, and the GTX 970/980 are the first graphics cards to support it, but I suppose it's the only way to hook up 4K to a UHDTV, as I understand most of these don't have a DisplayPort. That's what it's designed for anyway, but if you jump on HDMI 2.0 now you'll really be the first to test it. For me it's not even an option; I route everything through the sound system and that doesn't support HDMI 2.0 pass-through. I find 4K isn't that essential at couch distance anyway; it's when sitting up real close that you notice it most.
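
      A quick way to see why DisplayPort 1.2 and HDMI 2.0 can carry 4K@60Hz while HDMI 1.4 cannot is to compare the required data rate with each link's effective capacity. The figures below (the 594 MHz pixel clock for 3840x2160@60 with CEA blanking, and the post-coding link rates) are the commonly published approximations, so treat this Python sketch as an illustration rather than a spec quotation.

          # Does a given link have enough effective bandwidth for 4K@60, 8-bit RGB?
          pixel_clock_hz = 594e6                    # 3840x2160@60 with CEA blanking
          needed_gbps = pixel_clock_hz * 24 / 1e9   # ~14.3 Gbit/s of pixel data

          links_gbps = {
              "HDMI 1.4 (effective)": 8.16,
              "HDMI 2.0 (effective)": 14.4,
              "DisplayPort 1.2 SST (effective)": 17.28,
          }
          for name, capacity in links_gbps.items():
              verdict = "4K@60 OK" if capacity >= needed_gbps else "4K@30 only"
              print(f"{name}: {capacity} Gbit/s -> {verdict}")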

    • by Anonymous Coward on Saturday September 20, 2014 @12:09PM (#47954371)

      OK, I have a Seiki 4K 39" running at 3840x2160@30Hz for AutoCAD and GIS; it's like running a 2x2 array of 19" 1080p monitors without the bezels in the middle. It works fine with my NVIDIA GeForce GTX 560 Ti, but I had to set a custom resolution in the NVIDIA control panel to use it at 4K@30Hz, plus 1080p@120Hz for gaming (the only TV I know of to support this refresh rate without frame interpolation, which it doesn't have), and sometimes I set it to 4K@24Hz for watching 1080p movies. Since the card doesn't have official support for 4K, there is occasionally some loss of sync at 4K@30Hz or 1080p@120Hz; I fix it by setting the refresh rate to something lower or higher and back again. Color can be very good after calibration. The sound from the TV's built-in speakers is pretty bad.

      This TV only has HDMI 1.4a, VGA and component inputs, and it only supports 4K on the HDMI inputs.

      The newly launched NVIDIA GTX 980 and 970 support HDMI 2.0 and DisplayPort, so they can run 4K@60Hz with TVs and monitors that support it; I think some Samsung and LG TVs advertise HDMI 2.0 and DP.

      The GTX 970 and 980 support multiple displays running 4K@60Hz simultaneously, 4 IIRC.

      Here are a couple of threads where I found most of the information before I bought it:
      http://www.overclock.net/t/1442986/playing-with-my-seiki-se39uy04-got-it-to-do-1440p-at-60hz-playing-with-other-custom-stuff
      http://hardforum.com/showthread.php?t=1756171
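
      The same limit can be expressed as a pixel clock: HDMI 1.4's single TMDS link tops out around 340 MHz, which is why the 4K@30Hz and 1080p@120Hz modes described above fit but 4K@60Hz does not. The clock numbers in this Python sketch use standard CEA blanking and are approximate; a minimal illustration, not a spec table.

          # Which modes fit under HDMI 1.4's ~340 MHz TMDS clock ceiling?
          HDMI_1_4_MAX_MHZ = 340
          modes_mhz = {
              "3840x2160 @ 30 Hz": 297.0,   # CEA timing
              "1920x1080 @ 120 Hz": 297.0,  # 2x the 148.5 MHz 1080p60 clock
              "3840x2160 @ 60 Hz": 594.0,   # needs HDMI 2.0 or DisplayPort 1.2
          }
          for mode, clock in modes_mhz.items():
              ok = "fits" if clock <= HDMI_1_4_MAX_MHZ else "does not fit"
              print(f"{mode}: {clock} MHz -> {ok} on HDMI 1.4")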

    • I just bought and installed an iiyama B2888UHSU-B1 for ~EUR500. It runs great at 60Hz over displayport 1.2 on an AMD7950. It's a TN-panel (by CMO, which apparently is used in most of the 4k monitors at this price point), but it performs quite well in the color department, according to proper tests ( http://nl.hardware.info/tv/802... [hardware.info] - Dutch, but the tables shown at certain points in the video should be intelligible).

      The 7950 drives an extra monitor over HDMI (1080p@60Hz) simultaneously without problems.

      1. U

    • I'm writing this on a 50" Seiki 4K unit, & it's pretty great (once you get past the initial set-up clunkiness). 30fps is fine for desktop work, writing code, web, etc. When I want faster performance (3D games, animation, movies, etc.) I swap the display resolution out for 1920x1080@60-120fps. After all, I'm reading this thread because my video card is old, if not ancient, & I'm in the market.

      As for connectors, this unit has 3xHDMI(A), 2xUSB, 1xDVI, and a handful of RCA ports for sound & componen
  • Still 28nm (Score:5, Informative)

    by Solandri ( 704621 ) on Saturday September 20, 2014 @11:29AM (#47954175)
    At this point I think it's safe to write off TSMC's 20nm fab process. It's not gonna happen, with signs pointing to development being shifted to 16nm instead.

    A lot of what you see going on in the GPU and mobile front is being dictated by the failure of TSMC and other fabs to transition to 20nm for processors (memory is a lot easier and reached 16nm in 2013). Intel made the transition from 32nm to 22nm last year with Haswell and Bay Trail. The other fabs were supposed to leapfrog Intel by going from 28nm to 20nm this year. They haven't, which is what's allowed Intel to produce Atom SoCs with power usage approaching that of ARM SoCs. ARM has the lower power tech, but Intel's smaller lithography is mostly wiping out that advantage. If you see Intel successfully make the transition to 14nm in 2015 while the other fabs can't get 16nm to work, things are going to get really interesting on the mobile SoC front..

    The GPU front is bleaker. Both nVidia and AMD use third party fabs like TSMC, so there's no competitive advantage to be had. We've just had to suffer with stagnating product lines and slow product releases because the expected lower power consumption in GPUs from 20nm didn't happen in 2014.
    • Re:Still 28nm (Score:4, Informative)

      by edxwelch ( 600979 ) on Saturday September 20, 2014 @12:49PM (#47954593)

      I doubt that.
      TSMC 20nm will be ready for GPUs a lot sooner than their 16nm process. The only reason there are no 20nm GPUs yet is because the initial ramp was fully booked out by Apple.
      Meanwhile, a comparison of Apple's 20nm A8 density versus 14nm Core M indicates Intel's 14nm may not have as big a density advantage as they claim: https://www.semiwiki.com/forum... [semiwiki.com]

    • by irq-1 ( 3817029 )

      The GPU front is bleaker. Both nVidia and AMD use third party fabs like TSMC, so there's no competitive advantage to be had.

      Intel 14nm fab capacity is potentially game changing -- they've held back 2 huge fabs with double the normal capacity, and they still have enough for their own chips. I predict that Intel is going into the 14nm business, and either nVidia or AMD would be interesting clients.

    • At this point I think it's safe to write off TSMC's 20nm fab process. It's not gonna happen [...]

      Except that it already is shipping. Apple's A8 chip used in the iPhone 6 and iPhone 6 Plus is manufactured using TSMC's 20nm process [macrumors.com]. And given Apple's proclivity for consuming entire manufacturing lines for their products, it's entirely possible that TSMC had to turn away other customers if they wanted to keep Apple, simply because they lacked the capacity to do otherwise. It also makes sense why they haven't been able to talk about the fact that they had a major customer lined up, given how religiously Ap

  • So it's a tweaked 780/790... and NVIDIA bumps the leading digit by 2? WTF. Shouldn't this have been an 8xx series at most? Pretty f'n misleading.
    • by Nemyst ( 1383049 )
      Yeah, it's a tweaked 780 using a completely different architecture. Sure, tweaked.

      Next you'll say Haswell's just a tweaked NetBurst maybe?
  • I do not want to know the performance in SLI; I want to know the normal performance that a buyer would get from a single card, and how it compares to the cards currently on the market.
  • Oh wow. This will do wonders for the Blender Cycles rendering engine. Thanks to Blender & Brecht (the coder behind Blender's Cycles rendering engine) I've been able to enjoy the power of a thousand computers in one card, thanks to the accelerated power of NVIDIA GPU-based cards with multiple GPUs; mine has about 1300 GPUs and renders like insanity knows no bounds, I love it. YAY, the future looks even better now.

    Reading this makes me behave like a kid in a candy store, seriously.
    • by Anonymous Coward

      You're either using very very wrong terminology to describe a GPU rendering cluster, or you're seriously misinformed.

      There is no such thing as a card with 1300 GPUs; though there are cards with 2 GPUs on the same board, it's essentially impossible to fit more. You might have a GPU with ~1300 ALUs, but there's an enormous difference between the two. A GPU is an independent device, roughly speaking. An ALU is not, and in GPUs, they execute in lockstep in large sets (64 in AMD's GCN, 192 in nVidia's Kepler,

      • I know, but for the sake of simplicity.

        It has 1300+ cores (some of the bigger cards have over 2300), and yes, I realize that they're not full-fledged CPUs, just units highly specialized to perform certain calculations, kind of like FPGAs if you like.
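
        To illustrate the distinction the AC is making: you write one kernel, launch far more logical threads than there are physical ALUs, and the hardware time-slices them in lockstep groups (warps of 32 on NVIDIA). A minimal sketch using Numba's CUDA backend, assuming a CUDA-capable card and the numba/numpy packages installed; the array size and launch configuration are arbitrary.

            import numpy as np
            from numba import cuda

            @cuda.jit
            def scale(out, x, factor):
                i = cuda.grid(1)          # this logical thread's global index
                if i < x.size:            # guard: more threads than elements
                    out[i] = x[i] * factor

            n = 1_000_000                 # a million logical threads...
            x = cuda.to_device(np.arange(n, dtype=np.float32))
            out = cuda.device_array_like(x)

            threads_per_block = 256
            blocks = (n + threads_per_block - 1) // threads_per_block
            scale[blocks, threads_per_block](out, x, np.float32(2.0))  # ...on ~1300 ALUs

            print(out.copy_to_host()[:4])   # [0. 2. 4. 6.]
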
  • And tried them with off-the-wall games? Like Psychonauts, Stranger's Odyssey, older Need for Speed games, or maybe Fallout 3? I'm wondering how stable they are these days. The 4000 series turned me off ATI again. It ran fine on Call of Duty and other big titles, but the smaller ones were really unstable :(... I miss the nicer image quality, better performance and lower prices...
