NVIDIA GeForce GTX 690 Benchmarked

MojoKid writes "NVIDIA has lifted the embargo on benchmarks and additional details of their GeForce GTX 690 card today. According to a few folks at NVIDIA, company CEO Jen-Hsun Huang told the team to spare no expense and build the best graphics card they possibly could, using all of the tools at their disposal. As a result, in addition to a pair of NVIDIA GK104 GPUs and 4GB of GDDR5 RAM, the GeForce GTX 690 features laser-etched lighting, a magnesium fan housing, a plated aluminum frame, and a dual vapor-chamber cooler with ducted airflow channels and a tuned axial fan. The sum of these design enhancements results in not only NVIDIA's fastest graphics card to date, but also one of its quietest. In the performance benchmarks, NVIDIA's new dual-GPU powerhouse is easily the fastest graphics card money can buy right now, but of course it's also the most expensive." The GeForce GTX 690 has been reviewed in lots of different places today, Tom's Hardware and AnandTech to name a few.

  • Finally (Score:5, Funny)

    by busyqth ( 2566075 ) on Thursday May 03, 2012 @07:47PM (#39885321)
    Finally I can play Minecraft the way it was meant to be played!
    • Re:Finally (Score:4, Insightful)

      by MrEricSir ( 398214 ) on Thursday May 03, 2012 @07:50PM (#39885353) Homepage

      Not to mention Minesweeper!

    • You kid, but the shaders mod kills the framerate. Even my 9600 GT can only run it at 10 fps with decent settings.

  • WTF (Score:5, Interesting)

    by Billly Gates ( 198444 ) on Thursday May 03, 2012 @07:53PM (#39885369) Journal

    Tom's Hardware is showing the GTX beating ATI by 50-200% in every benchmark, while AnandTech shows the opposite, with ATI still winning in the same games. Has anyone else noticed this?

    Do Tom's Hardware or AnandTech get kickbacks from either company for biased reviews?

    • Re:WTF (Score:4, Interesting)

      by deweyhewson ( 1323623 ) on Thursday May 03, 2012 @08:09PM (#39885495)
      I've seen it rumored in more than a few places that Tom's Hardware is very Intel and Nvidia, shall we say, "friendly". Obviously anecdotal evidence is nothing to base a hard opinion on, but the thought does come into my head whenever I see review discrepancies like this pop up.
      • by Sycraft-fu ( 314770 ) on Thursday May 03, 2012 @09:37PM (#39886081)

        I don't care for Anand's benches much because they seem to like synthetic compute benchmarks. That is really all kinds of not useful information for a game card. I want to see in-game benchmarks. If any compute stuff is going to be benchmarked, let's have it be an actual program doing something useful (like Sony Vegas, which uses GPUs to accelerate a lot of what it does).

        Personally I'm a HardOCP fan when it comes to benchmarks. Not only are they all about game benchmarks, but they are big on actual gameplay benchmarks. As in, they go and play the game; they don't run a canned benchmark file. This does mean that it isn't a perfect, "each card sees precisely the same frames" situation, but it is far more realistic to the task the cards are actually asked to do, and it all averages out over a play session. I find that their claims match up well with what I experience when I buy a card.

        http://hardocp.com/article/2012/05/03/nvidia_geforce_gtx_690_dual_gpu_video_card_review [hardocp.com] is their 690 benchmark. It's a selection of newer games, generally played with triple-head (the game displayed across three monitors at once) on a 690, two 680s SLI'd, and two 7970s CF'd.

        • I don't care for Anand's benches much because they seem to like synthetic compute benchmarks. That is really all kinds of not useful information for a game card. I want to see in-game benchmarks. If any compute stuff is going to be benchmarked, let's have it be an actual program doing something useful (like Sony Vegas, which uses GPUs to accelerate a lot of what it does).

          But... They tested with 10 games, one raytracer, and zero synthetic benchmarks. I don't know what they usually do, but this article was very focused on real-world performance.

      • I was kind of bummed when AMD bought ATI. I have always been an AMD & Nvidia fan.
        • Comment removed based on user account deletion
          • Try running six EVE clients across two 24" 16:10 displays and you will notice the difference between the newer cards.

            My GTX265 starts to choke on four clients running at full speed, but with ISBoxer I am able to run two large screens at 40 FPS and four small screens at 15 FPS.

            Not everyone runs FPS games where everything happens on one screen; some games multi-client very well.

      • Re:WTF (Score:5, Informative)

        by Gadget_Guy ( 627405 ) * on Thursday May 03, 2012 @11:12PM (#39886559)

        I've seen it rumored in more than a few places that Tom's Hardware is very Intel and Nvidia, shall we say, "friendly".

        That would explain why in their most recent Best Graphics Cards For The Money [tomshardware.com] AMD's cards only won 5 categories compared with Nvidia's massive win in 1 category (plus a tie in another and 3 categories with no winners). Basically, if you ignore all the times that they say good things about AMD, then it is obvious that they favour Intel and Nvidia.

        As for the original poster claiming big differences in the rankings, I just don't see it. If you filter out the cards that were not tested on both sites, you get the following rankings:

        Battlefield 3
        Toms: 680GTX-SLI, 690GTX, 7970CF, 6990, 590GTX, 680GTX, 7970, 580GTX
        Anan: 680GTX-SLI, 690GTX, 7970CF, 6990, 590GTX, 680GTX, 7970, 580GTX

        Skyrim
        Toms: 680GTX-SLI, 690GTX, 7970CF, 590GTX, 6990, 680GTX, 7970, 580GTX
        Anan: 680GTX-SLI, 690GTX, 590GTX, 680GTX, 7970, 580GTX, 6990, 7970CF

        DiRT 3
        Toms: 680GTX-SLI, 690GTX, 7970CF, 680GTX, 6990, 590GTX, 7970, 580GTX
        Anan: 680GTX-SLI, 690GTX, 7970CF, 590GTX, 680GTX, 6990, 7970, 580GTX

        Metro 2033
        Toms: 7970CF, 680GTX-SLI, 690GTX, 6990, 590GTX, 7970, 680GTX, 580GTX
        Anan: 7970CF, 680GTX-SLI, 690GTX, 6990, 590GTX, 7970, 680GTX, 580GTX

        Only Skyrim seems to show any major differences, and that was probably due to driver issues, game versions, or different testing methods.
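
        If you'd rather quantify that agreement than eyeball it, a rank correlation does the job. Below is a quick sketch (plain host-side code; the shorthand card labels are mine) that computes Kendall's tau for the two Skyrim orderings above, where +1.0 means identical order and -1.0 means fully reversed:

            #include <cstdio>
            #include <string>
            #include <vector>

            // Kendall rank correlation between two orderings of the same cards.
            double kendallTau(const std::vector<std::string> &a,
                              const std::vector<std::string> &b) {
                // Position of a card in ranking b.
                auto posIn = [&](const std::string &card) {
                    for (size_t i = 0; i < b.size(); ++i)
                        if (b[i] == card) return (int)i;
                    return -1;
                };
                int concordant = 0, discordant = 0;
                for (size_t i = 0; i < a.size(); ++i)
                    for (size_t j = i + 1; j < a.size(); ++j)
                        (posIn(a[i]) < posIn(a[j]) ? concordant : discordant)++;
                return (double)(concordant - discordant) / (concordant + discordant);
            }

            int main() {
                // Skyrim rankings from the two sites, as listed above.
                std::vector<std::string> toms =
                    {"680SLI", "690", "7970CF", "590", "6990", "680", "7970", "580"};
                std::vector<std::string> anan =
                    {"680SLI", "690", "590", "680", "7970", "580", "6990", "7970CF"};
                std::printf("Skyrim tau = %.2f\n", kendallTau(toms, anan)); // 0.43
                return 0;
            }

        Run on the four lists above, Battlefield 3 and Metro 2033 come out at exactly 1.0 and DiRT 3 at about 0.86, so Skyrim's 0.43 really is the lone outlier.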

        • And this is why you're the gadget guy.
          • And this is why you're the gadget guy.

            He he. You made me think back to the days when I first came up with the Gadget moniker. I was probably using a TNT2 Ultra video card then. It was a far cry from the monster cards we are looking at today! Even so, my card had great TV input/output and included LCD 3D glasses. It seems the actual feature set of graphics cards hasn't improved a lot over the years.

    • Well, it depends on what you want things for.

      Basically I don't really see much difference on the graphics OpenGL/DX11 side of things, but this was very interesting to me:

      http://www.anandtech.com/show/5805/nvidia-geforce-gtx-690-review-ultra-expensive-ultra-rare-ultra-fast/15 [anandtech.com]

      regards

      John Jones

      • Well, when you test a gaming card by running GPGPU stuff on it, when nVidia specifically sells GPGPU cards, maybe you are running the wrong test.

    • by Anonymous Coward

      I simply don't trust Tom's Hardware anymore. I bought an A7V mobo (1st-gen Athlon) many years ago based on TH's glowing reviews. The board was buggy as all hell, and it came out months later that Asus had paid for that review, either directly or indirectly through advertising. I now take a fairly sceptical view of all reviews I read, based on that experience.

      • by dbIII ( 701233 )
        I think I've still got a couple of desktop machines around with that board, but it does have an "SE" in the name. It's funny: give people a bigger disk, more memory, and a second monitor, and they are happy with an old PC for years.
    • They seem to have merely omitted the games which favor AMD more strongly. Compare, for example, the Metro 2033 [anandtech.com] benchmarks [tomshardware.com] (or BF3, or Skyrim) and you can see that they are relatively similar. THG did not test Crysis or Total War: Shogun 2, which the AMD cards perform better on.

    • Tom's Whoreware? No, they've never taken a sweetener.

    • Tom's Hardware is a joke. They're moneyhatted by Nvidia and Intel all the time.

    • by westyvw ( 653833 )

      I don't know about that, but I do wonder why Tom's Hardware won't let Archive.org's Wayback Machine show their old pages...

  • by poly_pusher ( 1004145 ) on Thursday May 03, 2012 @07:54PM (#39885381)
    The GTX 680 and 690 have turned out to be pretty spectacular. The most impressive aspect is the relatively low power consumption for a high-performance card.

    I'm still waiting for the GK110-based "Big Kepler" due out Q3. Considering how well the 680 and 690 have performed, the GK110 will be a monster; probably power-hungry, but still a monster. Nvidia really hit gold with their latest generation; it is speculated that the current 680 was intended to be the 660 until it outperformed AMD's top offering. Can't wait to get my hands on a 4GB GK110.
    • by Anonymous Coward

      Nvidia really hit gold with their latest generation,

      But only if you do single-precision FP workloads. If you do integer workloads, the 680 can't even beat Nvidia's own 580. Pass.
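
      Claims like that are easy enough to sanity-check yourself. Here is a minimal sketch of the kind of integer-throughput microbenchmark involved, assuming a CUDA-capable card and toolkit; the kernel name and launch parameters are arbitrary, and a serious comparison would also control clocks and occupancy:

          #include <cstdio>
          #include <cuda_runtime.h>

          // A long chain of dependent 32-bit multiply-adds (one LCG step each),
          // so the compiler cannot optimize the integer work away.
          __global__ void imadChain(int *out, int iters, int seed) {
              int idx = threadIdx.x + blockIdx.x * blockDim.x;
              int x = seed + idx;
              for (int i = 0; i < iters; ++i)
                  x = x * 1664525 + 1013904223;  // 32-bit integer multiply-add
              out[idx] = x;                      // keep the result live
          }

          int main() {
              const int blocks = 1024, threads = 256, iters = 1 << 16;
              int *d_out;
              cudaMalloc(&d_out, blocks * threads * sizeof(int));

              cudaEvent_t start, stop;
              cudaEventCreate(&start);
              cudaEventCreate(&stop);

              imadChain<<<blocks, threads>>>(d_out, 16, 1);  // warm-up
              cudaEventRecord(start);
              imadChain<<<blocks, threads>>>(d_out, iters, 1);
              cudaEventRecord(stop);
              cudaEventSynchronize(stop);

              float ms = 0.0f;
              cudaEventElapsedTime(&ms, start, stop);
              // Each iteration is a multiply plus an add: ~2 integer ops.
              double ops = 2.0 * blocks * threads * (double)iters;
              std::printf("%.1f integer GOPS\n", ops / (ms * 1e6));

              cudaFree(d_out);
              return 0;
          }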

      • by billcopc ( 196330 ) <vrillco@yahoo.com> on Thursday May 03, 2012 @09:04PM (#39885839) Homepage

        If you're doing serious GPGPU stuff, you shouldn't be relying on fickle consumer boards in the first place. This is a gaming card marketed to extreme gamers. I've fooled around with CUDA stuff like raytracing and H.264 encoding, mostly as a curiosity, but the reason I bought this quad-SLI setup years ago was for games and real-time 3D rendering. I couldn't care less about FP performance, and neither does Nvidia's target market for this product line.

        GPGPU on consumer cards is still a novelty at this point. We're getting close to the tipping point, but for most users, as long as it plays their game and can handle 1080p video, they're content. If and when that balance tips in favour of OpenCL and CUDA, both GPU manufacturers will adjust their performance targets accordingly. Their #1 priority is still 3D gaming for now.

        • by Khyber ( 864651 )

          "This is a gaming card marketed to extreme gamers."

          And since games are probably the most resource-intensive fucking thing, you should expect your GAMING CARD to kick major ass at everything else if it has the capability.

          This is why nVidia is losing in the general-purpose GPU arena. AMD just keeps trucking along, upgrading EVERYTHING. NVidia? Gimps your shit.

          • by FyRE666 ( 263011 ) *

            "This is a gaming card marketed to extreme gamers."

            "And since games are probably the most resource-intensive fucking thing, you should expect your GAMING CARD to kick major ass at everything else if it has the capability."

            Did you not understand what he said?! It's a GAMING CARD - it's designed to kick ass when rendering games. Everything else is secondary - it's not a general purpose card, and it's not marketed as anything other than a high end gaming card. If it happens to kick ass as a more general purpos

            • by Khyber ( 864651 )

              "it's not a general purpose card"

              THEN WHY FUCKING INCLUDE CUDA AT FUCKING ALL IN THE HARDWARE?

              Derp, you're not thinking this morning. Go get yourself some coffee and think a little harder.

      • by Sycraft-fu ( 314770 ) on Thursday May 03, 2012 @09:08PM (#39885861)

        Oh that's right: Video games. You know, the thing it was made for.

        The GTX series are nVidia's gaming cards. They are made for high performance when you wanna play 3D games. They aren't made for compute performance. That is not to say they cannot handle compute stuff, just that it isn't what they are primarily designed for. So the kind of compute stuff they are best at will be more related to what games want.

        Their compute products will be the Teslas. They are made for heavy-hitting compute performance of all kinds. If you are after purely GPGPU stuff, they are what you want.

        nVidia seems to be separating their designs for the two to an extent. Still a common overall design, but concentrating on making the desktop GPUs more efficient, at the expense of high-end compute features (like integer and FP64 power), and the workstation/compute cards good at everything, even if they need beefier power and are louder.

        I'm ok with that. I buy a GeForce to play games, not to do high end GPGPU stuff. We buy Teslas at work for that.

        Also, there's a shitload of other things out there GPGPU-wise that are FP32, and the 680 really is killer at that. It does a great job accelerating video encoding and the like.

        • by tyrione ( 134248 )
          If I'm after GPGPU and OpenCL I choose AMD, not Nvidia, whose implementation is weak and who has been kicking and screaming because CUDA isn't what the industry has adopted.
    • by Sycraft-fu ( 314770 ) on Thursday May 03, 2012 @09:00PM (#39885815)

      There is zero actual evidence that there is going to be a "GK110" this year, or that if there is, it will be a high-end part (bigger numbers in their internal code names don't always mean higher-end parts).

      I see people all in a lather about the supposed amazin' graphics card that is up and coming, and lots of furious rumors, but nothing in the way of any proof. I can also see some fairly good arguments as to why nVidia would NOT be releasing a higher-end card later on (excluding things like Teslas and Quadros, which are higher-end in a manner of speaking).

      Speaking of Teslas and Quadros, that may be all that it is: a version of the hardware with a redesigned shader setup to give higher FP64 speed. As it stands, the card is quite slow at FP64 calculations compared to FP32. It could be 50% of the speed, in theory, but is more like 1/24th (some back-of-the-envelope numbers below). Basically it seems to be missing the necessary logic to link the 32-bit shaders together to do 64-bit calculations for all but a fraction of the shaders. Maybe to protect their high-end market, maybe to keep size and heat down (since it does take additional logic). Whatever the case, a Tesla/Quadro version with that in place would have much improved FP64 speed, and thus compute performance for certain things, but be no increase to gaming at all.

      So I think maybe people need to settle down a bit and stop getting so excited about a product that may not even exist or be what they think, and may not launch when they think even if it is. Chill out, see what happens. Don't get this idea that nVidia has something way MOAR BETTAR that is Coming Soon(tm). You don't know that, and may be setting yourself up for a big disappointment.
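
      For the FP64 point, some back-of-the-envelope numbers from the published core counts and shader clocks (theoretical peaks, not measurements):

          \text{GTX 580: } 512 \times 1.544\,\text{GHz} \times 2 \approx 1581\ \text{GFLOPS FP32}, \qquad \tfrac{1}{8} \approx 198\ \text{GFLOPS FP64}

          \text{GTX 680: } 1536 \times 1.006\,\text{GHz} \times 2 \approx 3090\ \text{GFLOPS FP32}, \qquad \tfrac{1}{24} \approx 129\ \text{GFLOPS FP64}

      So the 680 roughly doubles theoretical FP32 throughput over the 580 while losing about a third of its FP64 throughput, which is exactly the gap a Tesla/Quadro variant with full shader linkage would close.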

    • I've been watching my UPS power load meter since I upgraded from a GTX 560 to a GTX 680. I'd estimate the 680 uses a bit less than half the power of the 560 when idle. At peak usage the 680 uses more, but only by a hair.

      I was never happy with the 560 in general. The 3D performance was surprisingly glitchy at 1080p. Even though I wasn't too keen on trying NVIDIA again after that, I gotta admit they won me back with the 680.

    • by cbope ( 130292 )

      Sorry, but NVIDIA have already hinted strongly that there is no "Big Kepler" gaming card coming this year; at least nothing that will eclipse the 690. I'm seriously starting to wonder what happened to Big Kepler, whether it was all just a rumor, or perhaps they are saving it for Quadro/Tesla.

      Still, I can't wait for 680 prices to drop a bit so that I can replace my overclocked 570. AMD hardware is pretty decent these days, but every time I touch their drivers... I go back to NVIDIA. Hate to say it, but NVIDIA's driv

  • by edxwelch ( 600979 ) on Thursday May 03, 2012 @08:05PM (#39885465)

    According to SemiAccurate, there's a mask design flaw in the GK104 which has caused poor yields. Fewer than 10,000 GTX 680s have shipped worldwide, even though the card was released a month ago.
    http://semiaccurate.com/2012/05/01/why-cant-nvidia-supply-keplergk104gtx680/ [semiaccurate.com]

    • by Sycraft-fu ( 314770 ) on Thursday May 03, 2012 @09:25PM (#39885975)

      I would encourage people to look at the site's name before taking anything they say seriously. And then I'd encourage them to look in the archives (if they keep true and accurate archives of their past stuff; I've never checked) to see all the shit they get wrong (and there is a lot of it). Then maybe you'll understand that, like most rumour sites, you don't want to take it too seriously.

      For some overall perspective, consider that Charlie Demerjian, the guy who runs it, was given the boot from The Inquirer, which is not precisely what one would call a bastion of journalistic excellence.

      As an example of one major made-up story from them: in February they claimed that the GTX680 would have a "PhysX block", basically either dedicated hardware to speed up PhysX, or special instructions/optimizations for it at the expense of other things. They said that the card's supposed edge in benchmarks was only because of that, and that the 7970 would outdo it in most games.

      That is not at all the case, it turns out. The GTX680 has nothing particularly special for PhysX, other than a shit-ton of shaders, and it in fact outperforms the 7970 by a bit in nearly all games, including ones without PhysX. HardOCP (http://hardocp.com/article/2012/03/22/nvidia_kepler_gpu_geforce_gtx_680_video_card_review/) has them both tested with real gameplay, as usual.

      So really, don't take anything that site says seriously. It is a blatant rumours site that just makes shit up.

      • I do follow that site and most of the stuff is spot on. It's true a few stories are wild speculation, but this story rings true.

        • "This story rings true"? In other words "I'm an AMD fan that wants to see nVidia fail, so this story sounds true to me because I like it."

  • who cares (Score:2, Interesting)

    by epyT-R ( 613989 )

    Who is going to pay $1000 for a piece of hardware with a half-life of maybe one year? This card is really worth about $400 at most, and the 680 should be $200. What games actually take advantage of this? There are hardly any PC games worth playing nowadays :\. It's too bad too, because I LIKE new graphics hardware. It's always fun to play with, but at $1000 I can't justify it.

    • Re:who cares (Score:5, Insightful)

      by billcopc ( 196330 ) <vrillco@yahoo.com> on Thursday May 03, 2012 @09:24PM (#39885973) Homepage

      Normally I'd have preordered two of these already, but it's too rich for my blood right now. This card is for us nutjobs who want quad-SLI and panoramic "3D Surround", with our custom-built driving cockpits and three large monitors, or the equally obsessive flight-sim crowd. In my case, these displays run at 2560x1440, and that requires a ton of memory bandwidth on each card just to push all those bits around (rough numbers at the end of this comment).

      For almost everyone else, a single $300 GPU is enough to run just about any game at 1920x1080 with very respectable settings.

      As for your suggested prices, you're just talking out of your ass. If you're going to lowball the latest and greatest GPU on the market, maybe you should set games aside for a while and look at your income. Even though I agree the price is a bit high, spending $1000 on a hobby is nothing. You save up for that shit, and it lasts a very long time. My current cards are over 3 years old, so it works out to just over a dollar a day for kickass gaming graphics. Even if I played for just a few hours a week, it's still cheaper than any other form of modern entertainment. Cheaper than renting a movie, cheaper than a single pint at the pub, cheaper than basic cable TV, cheaper than bus fare to get to and from a free goddamned concert. For what I get out of it, having the highest end gaming hardware ends up being a sweet deal.
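
      On the "push all those bits around" point: scanout alone for three 2560x1440 displays at 60 Hz with 32-bit color works out to

          3 \times 2560 \times 1440 \times 4\,\text{B} \times 60\,\text{Hz} \approx 2.65\ \text{GB/s}

      and that is before any rendering, which reads and writes each pixel many times per frame (color, depth, textures, post-processing), so the real bandwidth demand is orders of magnitude higher.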

    • by Kjella ( 173770 )

      Meh, if there were a reasonable (no, a $36,000 Eizo doesn't count) 4K/QFHD monitor, I'd consider it. I don't like triple-screen setups with their bezels and odd aspect ratio with stretching and whatnot; I want it all on one screen. IMO the problem is not the price of the graphics card, it's having something useful to show it on. Even at 2560x1440 I'd have to pay more for a single monitor than for a 680 GTX, which is why I'm still on a good 1920x1200 IPS monitor. Of course it helps that I'm not a FPS junkie but

      • by Smauler ( 915644 )

        Even at 2560x1440 I'd have to pay more for a single monitor than for a 680 GTX

        No you wouldn't. I mentioned this monitor [ebay.com] earlier in the discussion... I'm not trying to sell them, by the way; I was surprised at how cheap it was. Most of the consumer feedback on them has been good, too.

        • by Cederic ( 9623 )

          Ah, nod. A brand I've never heard of, on a site I don't trust, with no UK suppliers providing it.

          Most of the astroturfing has been good though, I agree :)

          Given the resolution you can get on the iPad 2, it's reasonable to expect 2560x1440 for $400, especially without the free iPad 2 thrown in too. It has taken monitor providers several years to really push into that market, though.

          Even the one you're linking to.. 27 inches? Why not 22? Hell, I'll happily use a 15" 1920x1080 screen, so why not a 19" screen with 2

  • So FP64 performance went from 1/8th FP32 performance in the 500 series to 1/24th FP32 in the 600 series? I, for one, would love it if using doubles in OpenGL 4.x didn't suck so much. I write visualization software with planet-sized ranges and having that extra precision on the card would be quite nice.
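
    The usual workaround while consumer FP64 stays capped is to emulate the extra precision with float pairs, e.g. camera-relative rendering with a "double-single" split. A minimal sketch of the idea, as host-side code (in practice the final subtraction happens in the vertex shader; the names here are made up):

        #include <cstdio>

        // "Double-single" trick: carry one double-precision value as two floats.
        // hi holds the leading bits, lo the rounding residual.
        static void split(double v, float &hi, float &lo) {
            hi = static_cast<float>(v);
            lo = static_cast<float>(v - static_cast<double>(hi));
        }

        int main() {
            // A vertex ~64,000 km from the origin with the camera 1.5 m away.
            // At this magnitude a float's spacing is 4 m, so the offset is lost.
            double vertex = 63781370.0;
            double eye    = 63781368.5;

            // Naive single precision: both values round to the same float.
            float naive = static_cast<float>(vertex) - static_cast<float>(eye);

            // Emulated: subtract hi and lo parts separately, then combine.
            // The large hi parts cancel exactly, leaving the residuals intact.
            float vHi, vLo, eHi, eLo;
            split(vertex, vHi, vLo);
            split(eye,    eHi, eLo);
            float emulated = (vHi - eHi) + (vLo - eLo);

            std::printf("exact    : %.6f\n", vertex - eye); // 1.500000
            std::printf("naive    : %.6f\n", naive);        // 0.000000
            std::printf("emulated : %.6f\n", emulated);     // 1.500000
            return 0;
        }
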
  • when getting a new graphics card every few years was like a new console launch to me. I get that they're being used by statisticians, but seriously, what do gamers do with these? The only game that comes close to taxing a $150 graphics card is Crysis 2, and even that's not doing much... Maybe Nvidia could put more effort into making these easy to program for, so we'd get better games cheaper? I think we've hit the limit on graphics quality, if only because it's too much work to do the art assets...
    • Re: (Score:3, Insightful)

      by Anonymous Coward

      Mostly multihead gaming. While a $150 card is plenty at 1080p, at 5400x1920 or 4320x2560 it's a different story.
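
      The raw pixel counts make the point:

          1920 \times 1080 \approx 2.1\,\text{MP}, \qquad 5400 \times 1920 \approx 10.4\,\text{MP}, \qquad 4320 \times 2560 \approx 11.1\,\text{MP}

      so those multihead setups push roughly five times the pixels of a single 1080p screen, every frame.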

  • One question, though: if I can play Skyrim with all settings at max at 1920x1200 with a GTX 560, what is SLI of two GTX 690s needed for?

    • by Smauler ( 915644 )

      According to this page [techspot.com], a GTX 560 _averages_ 25 fps at 1920x1200. That's not that good.

  • Correction: this is the most expensive GAMING card you can buy. The price of a professional card can be up to US$10,000.
