Graphics Hardware

NVIDIA's GeForce GTX 980 Ti Costs $350 Less Than TITAN X, Performs Similarly

Deathspawner writes: In advance of the rumored launch of AMD's next-generation Radeon graphics cards, NVIDIA has decided to pull no punches and release a seriously tempting GTX 980 Ti at $649. It's tempting for two reasons: the extra $150 it costs over the GTX 980 is more than made up for in performance gained, and it comes really close to the performance of the TITAN X while costing $350 less. AMD's job might just have become a bit harder. Vigile adds: The GTX 980 Ti has 6GB of memory (versus 12GB for the GTX Titan X), but PC Perspective's review shows no negative side effects of the drop. This implementation of the GM200 GPU uses 2,816 CUDA cores rather than the 3,072 cores of the Titan X, but thanks to higher average Boost clocks, performance between the two cards is identical. Hot Hardware has another equally positive, benchmark-laden review.
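
(The shader arithmetic behind the "identical performance" claim: $2816 / 3072 \approx 0.92$, so average Boost clocks roughly $9\%$ higher ($3072 / 2816 \approx 1.09$) are enough to equalize raw shader throughput.)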
  • Meh (Score:3, Insightful)

    by Cedarbridge ( 1964514 ) on Sunday May 31, 2015 @10:17PM (#49812341)
I don't really feel the need to drop $1k on a graphics card. Not when 95% of my needs can be met with an old Radeon 6850. It's not like I need to speed-render the surface of Mars or anything.
    • Re: (Score:2, Insightful)

I use an R9 270. I bought it when my old card started showing its age and acting up. For about $150 it runs every game I play on highest settings without batting an eye. That's with an AMD Athlon X2, btw... The whole race to specs domination doesn't add much.
      • Re:Meh (Score:4, Insightful)

        by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Sunday May 31, 2015 @11:22PM (#49812543) Homepage

I use an R9 270. I bought it when my old card started showing its age and acting up. For about $150 it runs every game I play on highest settings without batting an eye. That's with an AMD Athlon X2, btw... The whole race to specs domination doesn't add much.

        You aren't playing the same games I'm playing. My video card and CPU are considerably faster than yours, and I'm unable to max out my settings in most games without considerable FPS drops.

        • Comment removed based on user account deletion
        • by wallsg ( 58203 )

          It really does depend on your games. What works for one gamer may be completely unsuitable for another.

I'm playing Guild Wars 2 with two crossfired 7870s at the highest graphics settings and I'm CPU-bound (OCed i5-3570K) in most cases (except where there are really, really heavy particle effects). Most MMOs are like that. If I played Tomb Raider or BioShock Infinite (installed and waiting to be played for over a year) on highest settings, then I'd likely be GPU-bound.

Depends on the resolution you use. Here I have two GTX 980s (Gigabyte G1) in SLI and it's still not enough to run my Skyrim at 3840x2160 with ENB (I'm a 4K display user).
        • by Cederic ( 9623 )

This is why I don't own a 4K monitor. As lovely as it would be for photo editing, I'd still end up gaming at 2560x1440.

          • by JazzLad ( 935151 )
My GTX 680 pushes my 4K just fine* in games.

*Obviously for varying degrees of 'just fine' - I set my AA really low (like 2x) or sometimes even off, because after doubling/quadrupling (depending on how you're counting) the resolution over my old 1080p, I find it still looks way better than 1080p + 8-16x AA. My 4K is also 24" (price came down since I bought it, dang it! [amazon.com]) so that helps, I'm sure. Oh, and yes, it is gorgeous for Photoshop :)
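
For reference, the "doubling/quadrupling" arithmetic: 4K doubles 1080p in each dimension, which quadruples the total pixel count.

$3840 \times 2160 = (2 \times 1920) \times (2 \times 1080) = 4 \times (1920 \times 1080)$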
            • by Cederic ( 9623 )

              My borrowed GTX970 doesn't have problems with games at full settings (other than AA) at 2560x1440 but I'm still not confident it'd handle full prettiness at 4k.

              But I'm picky and want decent framerates too.

I have a Radeon 6950 and I can't get 60 fps in Dragon Age: Inquisition at 1080p unless I turn the settings down to medium.

    • Sticker shock was my reaction as well. My old HD5870 is only now starting to have unacceptably low performance in some of the games I play, and spending around $300 will probably keep me set for another 4-5 years easily.

It sounds like you only use a GPU for displaying/rendering graphics, not as a general-purpose (super)computer.
Well, some of us do render the surface of Mars on a regular basis and can't wait to see what this does to the price of the 970.

    • I'm still 'rocking' a Radeon HD 5870. Looking them up on passmark [videocardbenchmark.net], I have about 400 points on you.

      Personally, my standard for upgrading my video card is that the benchmarks would have to double - which just wasn't happening last year, at least affordably. This year it looks like a GTX 960 might be a good choice.

      5 years out of a video card isn't bad at all.

  • by rsilvergun ( 571051 ) on Sunday May 31, 2015 @10:22PM (#49812353)
AMD hasn't been able to offer serious competition for quite some time. Driver stability has been a nightmare for them (partially because of nVidia's shenanigans, but that hasn't made games any more stable...). The trouble is nVidia is in such a strong position they could just drop their pants and buy AMD. What I'm wondering is if they're making enough from the consoles to push back. I'm inclined to say no, since nVidia turned that contract down...
    • by im_thatoneguy ( 819432 ) on Sunday May 31, 2015 @10:25PM (#49812365)

ATI/AMD has had shit drivers since the 90s. This whole "Nvidia is stabbing us in the back!" line doesn't hold water, since they've *never* had reliable products.

      • Re: (Score:3, Informative)

        by Pubstar ( 2525396 )
Funny, I've had zero problems with my drivers from AMD cards for the past 5-6 years. Of course, prior to that, it was a nightmare. Also, go look up GameWorks. It actively fucks AMD in that they can't optimize anything that runs GameWorks.
If AMD hadn't screwed over tessellation performance for OpenCL, they wouldn't be in such a spot. But hey, Nvidia is USING a standard, like AMD fanboys all wanted.
It's not even the tessellation. It's that when AAA games use GameWorks, it screws AMD's performance. They cannot see how the graphics engine is making calls to the APIs. It's also in the licensing agreement between the developers and Nvidia that the code used for the engine CANNOT be shared with outside sources. Well, it can be, in some cases, but it specifically excludes AMD from ever seeing what is really going on behind the curtain.
            • by Dog-Cow ( 21281 )

              What's stopping AMD from buying a few copies of something using the GameWorks engine and profiling it? I am sure AMD developers are quite skilled with kernel-mode debuggers and would have no problem figuring out what the engine is doing. This is not illegal, no matter what crap is in the EULA of the game.

              • by 0123456 ( 636235 ) on Monday June 01, 2015 @12:24AM (#49812715)

                When I worked on 3D drivers, oh, how we used to laugh when some idiot developer put in code that deliberately broke the game when run under a debugger. Yet they still expected it to work well on our cards...

                But, yeah, it wasn't at all unusual for developers of little clue to do completely retarded things that worked on other hardware, but not ours. Often because we actually implemented the feature they were using in hardware, whereas the other drivers simulated it in software.

                • >code that deliberately broke the game when run under a debugger

I'd never heard of this, unless you're talking about SecuROM.

The damage would still be done, as they would have to wait until after the game's launch to do any optimization. All the benchmarking and testing would already be done and published.
      • Re: (Score:3, Informative)

        by Anonymous Coward

        We just finished a project with a high-end ATI/AMD R9 290X so we could use their EyeFinity tech to make a single desktop span over multiple monitors and be adjusted for bezels etc.

While the card itself runs very fast and it finally worked as intended, the Catalyst Control Center (CCC) is an absolute joke. It's embarrassingly bad: some of the settings make almost no sense, and the bezel adjustment is totally broken (on the latest stable drivers) to the point where the on-screen display during setup is meaningless...

        • My memory of Nvidia before I dumped them was of terrible drivers and broken and removed features. CCC is indeed a clusterfuck but Nvidia have rarely been far behind them.

I dropped Nvidia thanks to their eagerness to just drop features in new drivers (like usable PAL TV support years ago) and a really shitty attitude to fixing user-reported problems; you really have to be an AAA developer before they care. If you don't have this year's new card, you can forget about getting support.

        • We just finished a project with a high-end ATI/AMD R9 290X so we could use their EyeFinity tech to make a single desktop span over multiple monitors and be adjusted for bezels etc.

Honest question, but why do you need EyeFinity for that? On X, I can place physical screens pretty much wherever I like on an arbitrary grid with whatever arbitrary gaps I want, and that's spanning multiple physical cards with different chipsets (though not with 3D support in that case). Do other windowing systems not generally support such...

You weren't using an nVidia card in Jan 2007 on a Vista machine then, were you?

Their drivers sucked. AMD wasn't perfect, but their launch drivers were better than nVidia's.

Both companies have had periods of issues; at the moment, both are doing pretty well.

    • Re: (Score:2, Insightful)

      by sg_oneill ( 159032 )

Don't be so eager to chant for AMD's downfall. Competition from AMD (both to Intel in processors and to Nvidia in GFX cards) keeps Intel and Nvidia honest. Without competition they'd have no incentive to innovate and keep prices down. The browser wars prior to Firefox's rise showed what happens when a market is without competition: it stagnates, and that's bad for everyone.

It'll be interesting to see how AMD does this time around. Apparently they're making use of HBM (see here [anandtech.com] for an in-depth description). If they make good use of this technology, which seems likely, they could produce a pretty dramatic increase both in performance and in performance per watt. It seems unlikely that nVidia will be able to compete using older-generation GPUs (nVidia will be adopting similar memory technology next year, so if AMD pulls ahead, it probably won't be for all that long).

Sure, it performs exactly the same until you run out of memory, and then its performance goes to 0% since it can't do anything. You're getting 12GB of RAM with the Titan X vs. the 6GB on the Ti. The Titan is essentially a cheap Tesla for compute-heavy tasks like Adobe Premiere or 3D rendering. I wish, though, that Nvidia would embrace the Titan's position and enable GRID virtualization. AMD has enabled at least GPU pass-through on their entire high-end line. I guess virtual GPUs are still so niche that it's not...

    • by Luckyo ( 1726890 ) on Sunday May 31, 2015 @10:46PM (#49812453)

That's not how GPU memory management works. When you max out the GPU's onboard RAM, the GPU starts paging out to shared system memory located in system RAM. This limits performance a lot, since PCI-E throughput is about 1/10 of GDDR5 speed, but it most certainly is not zero.
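
To put rough numbers on that gap, here is a minimal CUDA timing sketch comparing PCI-E host-to-device transfers against on-device copies (a hand-rolled illustration, not an official benchmark; exact figures vary with the system and PCI-E generation):

    // Rough comparison: PCI-E host->device bandwidth vs. on-device (VRAM) copies.
    #include <cstdio>
    #include <cuda_runtime.h>

    // Time one cudaMemcpy and return its effective bandwidth in GB/s.
    static float copy_gbps(void* dst, const void* src, size_t bytes, cudaMemcpyKind kind) {
        cudaEvent_t start, stop;
        cudaEventCreate(&start);
        cudaEventCreate(&stop);
        cudaMemcpy(dst, src, bytes, kind);  // warm-up, absorbs one-time setup cost
        cudaEventRecord(start);
        cudaMemcpy(dst, src, bytes, kind);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);
        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        cudaEventDestroy(start);
        cudaEventDestroy(stop);
        return (bytes / 1e9f) / (ms / 1e3f);
    }

    int main() {
        const size_t bytes = 256UL << 20;  // 256 MB test buffer
        void *host, *dev_a, *dev_b;
        cudaMallocHost(&host, bytes);  // pinned host memory: best-case PCI-E speed
        cudaMalloc(&dev_a, bytes);
        cudaMalloc(&dev_b, bytes);

        printf("host->device (PCI-E): %6.1f GB/s\n",
               copy_gbps(dev_a, host, bytes, cudaMemcpyHostToDevice));
        printf("device->device (VRAM): %6.1f GB/s\n",
               copy_gbps(dev_b, dev_a, bytes, cudaMemcpyDeviceToDevice));

        cudaFreeHost(host);
        cudaFree(dev_a);
        cudaFree(dev_b);
        return 0;
    }

On a PCI-E 3.0 x16 system the first number tops out around 12 GB/s, while the second is typically well over 100 GB/s on a card in this class: roughly the order-of-magnitude gap described above.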

Regardless, the Titan X, unlike the previous Titan series, is crippled for compute, to the point where Nvidia itself officially recommended the previous Titan cards over the Titan X for compute work. It was clearly aimed at gamers who have more money than sense, and now that they've collected that money, they're releasing a more sensibly priced card in the same weight class.

      • by Scutter ( 18425 ) on Sunday May 31, 2015 @11:43PM (#49812593) Journal

        $650 is "sensibly priced" for a gaming card? That's almost double the cost of a current-gen console and you still have to buy the rest of the computer.

        • by rwa2 ( 4391 ) *

          Yeah, well, you have to pay a premium to get on top of the list:
          http://videocardbenchmark.net/... [videocardbenchmark.net]
          (BTW, very useful resource for making rough sense of the alphanumeric soup)

          For my part, I'd be happy with the $200 GPU that gets me in the top 20... (GTX 960). I'm guessing that will get the Ti treatment next.

Though I'm still pretty happy with my 560 Ti, which is still pretty decently placed at roughly 1/3rd the speed of the top card. I think the last couple of generations have been skippable, though I'm now...

        • by Kjella ( 173770 ) on Monday June 01, 2015 @02:21AM (#49813031) Homepage

          $650 is "sensibly priced" for a gaming card? That's almost double the cost of a current-gen console and you still have to buy the rest of the computer.

And you're playing at most 1920x1080x60 Hz, from what I understand often less. This is the kind of card you want if you're looking for 2560x1440x144 Hz or 3840x2160x60 Hz gaming on say an Acer XB270HU or XB280HK, pushing at least 4x as many pixels. For games that only run at 30 fps or at 720p/900p, make that 6x-8x as many pixels. Sure, it's like comparing a soccer mom car to a $100k+ sports car; it's not "sensibly" priced. It has terrible MPG with a 250W power consumption. But when you put the pedal to the metal, it's seriously fast.
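
For reference, the pixel-rate arithmetic behind that comparison:

$1920 \times 1080 \times 60 \approx 124 \times 10^6$ pixels/s
$2560 \times 1440 \times 144 \approx 531 \times 10^6$ pixels/s ($\approx 4.3\times$)
$3840 \times 2160 \times 60 \approx 498 \times 10^6$ pixels/s ($\approx 4.0\times$)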

The Titan X was clearly a "because we're the fastest, charge double" card. I guess you're always looking at it from your own point of view and saying the others are the insane ones: "Paying $1000 for a graphics card? That's crazy, I'll settle for a $650 GTX 980 Ti." The next guy says "Paying $650 for a graphics card? That's crazy, I'll settle for a $199 GTX 960," and so on. Basically, you spend relative to your interest and the amount of money you can comfortably spend. Don't go to a five-star luxury resort if the budget says a hostel, but if you can afford the resort, do it. YOLO and all that.

          • And you're playing at most 1920x1080x60 Hz, from what I understand often less. This is the kind of card you want if you're looking for 2560x1440x144 Hz or 3840x2160x60 Hz gaming on say an Acer XB270HU or XB280HK, pushing at least 4x as many pixels.

            Both the Xbox One and the PlayStation 4 support 4k output. I don't know what the performance is like, but I can easily play a game at 2560x1440 without any sort of issue with performance. I don't have anything higher than that, so I can't speak to its performance at higher resolutions.

            • by PPalmgren ( 1009823 ) on Monday June 01, 2015 @07:23AM (#49813733)

              All outputs are not created equal. First, most consoles target 30 FPS. Second, like the old consoles at 1080p, this output is likely just an upscale. They simply do not have the horsepower to render content at that resolution. Equivalent GPUs can be had for $100 or less in computers.

              • All outputs are not created equal. First, most consoles target 30 FPS. Second, like the old consoles at 1080p, this output is likely just an upscale. They simply do not have the horsepower to render content at that resolution. Equivalent GPUs can be had for $100 or less in computers.

Well, that is true. I don't know whether or not it scales at that resolution. I have a 5-year-old desktop card that can do Kerbal Space Program at 2560x1440 with no problem. You'd think the latest-gen console could at least do that without scaling, but I don't know.

                • How many years ago was the latest console designed?

                  You realize consoles aren't even competitive with high end graphics when they are brand new?

                  Build a ship with 3 or 4 hundred components and watch the frame rate crash in KSP.

                  • How many years ago was the latest console designed?

                    You realize consoles aren't even competitive with high end graphics when they are brand new?

                    Build a ship with 3 or 4 hundred components and watch the frame rate crash in KSP.

                    I do realize that they do not use the latest and greatest hardware in consoles. The current console generation has been on the market for less than two years. The GPU in there is based on a 2011 model from AMD. I just checked and my desktop card is based on a 2009 product line. I've not built anything with more than about 200 components, but it has not been an issue so far at that resolution.

          • by bongey ( 974911 )
YOLO? Your geek card has been revoked.
        • by MrDoh! ( 71235 )
          One Summer Steam Sale seems to shift the PC > Console price argument.
          • You do know that PSN and the Xbox Marketplace have sales as well, right?

I'm not personally familiar with the Xbox ecosystem, but PSN has WEEKLY sales/discounts, then there are holiday sales and themed sales (there was a Star Wars one a few weeks back), and seasonal sales.

            https://store.playstation.com/... [playstation.com]||price~asc

As a PSPlus subscriber and a Steam fanboy, I can tell you they don't even compare. In summer sale I can get all the AAA titles I missed for 50-75% off, and catch up on DLC for exceptional games for pennies. Now, I do get occasional "free" casual/Sega Genesis/old arcade ports for my $5.99 PS+ monthly subscription, and sometimes big name games are cheap, but it can't even compare to the 2x a year discount salepocalypse that happens on Steam.
              • As a PSPlus subscriber and a Steam fanboy, I can tell you they don't even compare.

                I find it hard to believe you're a PS+ subscriber because:

Now, I do get occasional "free" casual/Sega Genesis/old arcade ports for my $5.99 PS+ monthly subscription,

                It's much more than that. Either you aren't paying attention, or you are intentionally understating the PS+ freebies because you're a PC Master Race type.

This is the master list of PS+ games:

                http://en.wikipedia.org/wiki/L... [wikipedia.org]

There are also serious discounts during the seasonal sales. Some PSone classics have been under a buck.

                but it can't even compare to the 2x a year discount salepocalypse that happens on Steam.
                In summer sale I can get all the AAA titles I missed for 50-75% off, and catch up on DLC for exceptional games for pennies.

                But "is pennies" a good thing. If everyone waits till the game is "pennies" that might make developers less likely to favor the

      • Agreed, in a year or two, the Nvidia card will show its age.

        • by Luckyo ( 1726890 )

          Highly unlikely that it will because of how game development is linked to console generations.

          • We're already two years deep in this generation.

            • by Luckyo ( 1726890 )

More than two, as development for games started at least two years before the console releases. But if you look at the costs sunk into the current consoles, and how long the previous generation lasted, you'll understand that the current generation is highly unlikely to start showing its age as long as it can perform fine in the handful of PC exclusives that will actually put it under serious load, like Star Citizen.

I've been following the videogame industry since the early days. I simply disagree with your assessment of anything beyond a 5-year AAA cycle and a 10-year overall lifecycle.

I have access to the latest in 3D tech; these consoles will be gone just as quickly as the last batch.

                • by Luckyo ( 1726890 )

I don't want to sound dismissive of your insider information, but the trend has been that each console generation lasted longer than the previous one, and the previous generation lasted almost 10 years (the 360 came out in November 2005, the PS3 in November 2006).

Because it's not the "latest 3D tech" that dictates the speed of game development and console technology, but the cost of development (both in terms of games and hardware) and the cost of hardware to the end consumer.

Considering just how little speed improvement current gen...

I can't think of a single application which actually tries to swap memory over PCI-E, because you not only have to load the new memory, you also then have to *reload* the old memory. Especially with something like raytracing, where you never know which memory is going to be called on the next ray, you would be pathetically and uselessly slow. So OK, fine, technically "that's not how GPU memory management works", but back in the real world there are few practical applications where a single task is using...

        • by Luckyo ( 1726890 )

That would probably be because the application doesn't get to make the choice in the first place.

The decision on what goes where is up to the driver, unless you have some weird "down to the metal" GPGPU software. That's why Nvidia made the 970 the way it is: it knows that no matter how shitty the code devs write, it's the driver that decides when to go to the 3.5 GB of fully featured VRAM, when to go to system RAM, and when to go to that crippling 0.5 GB of separately mapped VRAM.
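
That split is observable from user code, incidentally. Community benchmarks exposed the 970's slow segment by allocating VRAM in chunks and bandwidth-testing each one; below is a rough CUDA sketch of that idea (a simplified illustration, not any of the original tools, which measured kernel read bandwidth per chunk and are less noisy):

    // Rough probe: grab VRAM in 128 MB chunks, then bandwidth-test each one.
    // On a GTX 970 the last-allocated chunks (the 0.5 GB segment) test much slower.
    // Note: this tries to allocate nearly all free VRAM; don't run it mid-game.
    #include <cstdio>
    #include <vector>
    #include <cuda_runtime.h>

    int main() {
        const size_t chunk = 128UL << 20;  // 128 MB per chunk
        std::vector<char*> chunks;
        char* p = nullptr;
        // Keep allocating until the driver refuses.
        while (cudaMalloc(&p, chunk) == cudaSuccess)
            chunks.push_back(p);
        cudaGetLastError();  // clear the expected out-of-memory error

        if (chunks.size() < 2) return 1;
        char* scratch = chunks.front();  // first chunk doubles as the copy target

        for (size_t i = 1; i < chunks.size(); ++i) {
            cudaEvent_t s, e;
            cudaEventCreate(&s);
            cudaEventCreate(&e);
            cudaEventRecord(s);
            cudaMemcpy(scratch, chunks[i], chunk, cudaMemcpyDeviceToDevice);
            cudaEventRecord(e);
            cudaEventSynchronize(e);
            float ms = 0.0f;
            cudaEventElapsedTime(&ms, s, e);
            printf("chunk %3zu: %6.1f GB/s\n", i, (chunk / 1e9) / (ms / 1e3));
            cudaEventDestroy(s);
            cudaEventDestroy(e);
        }
        for (char* c : chunks) cudaFree(c);
        return 0;
    }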

Nah... the sweet spot is still the 750 Ti, which is a nice 60W GPU that has plenty of oomph and costs just $180.
Why would you want to heat your office with an extra 0.25 kW heater if you don't live in Alaska?
Top tip: pass on this one and take the 750 Ti.

    • Yeah, I don't actually see why people put so much weight on what the card can do when we live in this age of console ports. For me, the max TDP is a much bigger factor (for fan noise reasons more than electricity). And Nvidia seems to be winning on that score also.
    • I know Slashdot isn't really up to date on gaming, but I'll just drop a hint: if someone's looking at the 980Ti, they are NOT the target market for a 750Ti.
Markets do change though. For the longest time, I was a non-gamer. Then Steam for Linux came out. Two years ago I bought a GTX 760, which was many times the cost of the GT 430 I bought before that. I bought the 760 because it was the best card I could get without upgrading my power supply (which has split 12-volt rails). But I've found that the 760 is underpowered for driving my 1440p display at reasonable framerates (I have the display for productivity purposes). It's time to build a new computer this year, a...

        • by Cederic ( 9623 )

          Price/performance metrics put the GTX970 as a clear winner right now.

It's good enough for 1440p with current-generation games. It may struggle over the next couple of years, so if you're buying for the future then... well, it's a bad time to build. Although you're always only a year away from a new graphics architecture, Nvidia will soon be moving to 16 or 20nm fabrication, which should seriously improve performance/power ratios and opens the door for a noticeable performance boost.

Makes it a tricky choice for...

Why would you want to heat your office with an extra 0.25 kW heater if you don't live in Alaska?

      Electricity prices are high enough up here in Alaska that you don't want to be using a space heater either. Oil/Gas is much cheaper.

  • by malx ( 7723 ) on Monday June 01, 2015 @06:56AM (#49813627)

    Despite advances, these figures show that FPS in 4K is still not ready for prime-time even on top-class cards.

    When there are cards that can handle it, I'll think about upgrading my 1920x1200 monitor. Until then, I'll wait it out, and so can my aging graphics card.

    Part of the problem is that at higher resolutions it becomes more important to use high graphics settings (high res textures, better lighting effects, further draw distance), not less. So if you're interested in 4K gaming, you'll want to do it with everything turned up to 11. The exception to this rule is anti-aliasing, which decreases in value the higher the resolution.

You can do 4K now, but you will need SLI. It depends on the game too: if I don't make any changes in Skyrim, for example, then I can run it in 4K with a single GTX 980 easily.
    • by Nyder ( 754090 )

      Despite advances, these figures show that FPS in 4K is still not ready for prime-time even on top-class cards.

      When there are cards that can handle it, I'll think about upgrading my 1920x1200 monitor. Until then, I'll wait it out, and so can my aging graphics card.

      Part of the problem is that at higher resolutions it becomes more important to use high graphics settings (high res textures, better lighting effects, further draw distance), not less. So if you're interested in 4K gaming, you'll want to do it with everything turned up to 11. The exception to this rule is anti-aliasing, which decreases in value the higher the resolution.

All 4K'ers I know use SLI. (They all have Nvidia cards, 980s.) I wouldn't be surprised if some of them got the new card. Me, I'm cool with my 1080p and my 970.

    • Define "handle."

Developers decide the level of detail and effects they want to be the max settings in their games, and they can always make this higher than the current state of the art, regardless of what that state is. Generally, they're going to be targeting 1920x1080 displays, since that's what the overwhelming majority of users have right now, even those with the best video cards. And this has *always* been the case. Back when I had a 21" CRT with "MultiSync," I would never be able to run the latest games...

1. Those that want to win. That is, those that don't have to buy extremely expensive graphics cards or need ultra-mega-high resolutions with 24 monitors, etc. These use single monitors so they have a full range of view without moving their head, and they use the lowest possible settings in order to accelerate gameplay and responsiveness. They often use other pieces of antiquity like keyboard/mouse cords and ethernet cables. These people often save hundreds of dollars (sometimes more) by using "lesser" graphics capable systems.
    • by zlives ( 2009072 )

Thank you for reiterating why MMOs are to be avoided at all costs.

    • by Cederic ( 9623 )

      These people often save hundreds of dollars (sometimes more) by using "lesser" graphics capable systems.

      No, these people buy as high quality a card as they can afford and then turn the detail levels down anyway.

      Framerates are king and shitty graphics cards don't give you guaranteed top end framerates.

  • NVIDIA has decided to pull no punches and release a seriously tempting GTX 980 Ti at $649

This would appeal to only the slimmest of demographics (pun intended). Let me know in 6-12 months when it gets down to the $100-$150 range.

I'm hoping this spurs some price drops on the rest of the 900 series. I've been itching for a good GPU sale. I'm still running an old 465 and it has become the bottleneck in my system. I'll probably settle for a 960, since that would be a ~400% improvement over my current card (well, maybe not quite, since my current mobo doesn't support PCIe 3.0) and wouldn't break the bank, but I'd really like to see the 970 come down a bit more in price. The 970 is probably overkill for what I do, since I don't think I'll be moving...

"Everything should be made as simple as possible, but not simpler." -- Albert Einstein

Working...