
NVIDIA GeForce GTX TITAN Uses 7.1 Billion Transistor GK110 GPU

Vigile writes "NVIDIA's new GeForce GTX TITAN graphics card is being announced today, and it uses the GK110 GPU first announced in May of 2012 for HPC and supercomputing markets. The GPU touts 4.5 TFLOPS of computing horsepower provided by 2,688 single precision cores and 896 double precision cores, along with a 384-bit memory bus and 6GB of on-board memory, double the frame buffer that AMD's Radeon HD 7970 includes. At 7.1 billion transistors and a 551 mm^2 die size, GK110 is very close to the reticle limit for current lithography technology! The GTX TITAN introduces a new GPU Boost revision based on real-time temperature monitoring and support for monitor refresh rate overclocking that will entice gamers, and with a $999 price tag, the card could be one of the best GPGPU options on the market." HotHardware says the card "will easily be the most powerful single-GPU powered graphics card available when it ships, with relatively quiet operation and lower power consumption than the previous generation GeForce GTX 690 dual-GPU card."
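
For reference, the quoted 4.5 TFLOPS figure is consistent with the core count: single precision throughput is roughly cores x 2 FLOPs (one fused multiply-add) x clock. A minimal sketch of that arithmetic in Python, assuming a base clock of about 837 MHz (the clock is not stated in the summary):

    cuda_cores = 2688              # single precision cores (from the summary)
    flops_per_core_per_cycle = 2   # a fused multiply-add counts as two FLOPs
    base_clock_ghz = 0.837         # assumed base clock; not given in the summary

    tflops = cuda_cores * flops_per_core_per_cycle * base_clock_ghz / 1000
    print(f"~{tflops:.2f} TFLOPS single precision")   # ~4.50 TFLOPS
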
  • GK110 vs. 7970 (Score:3, Interesting)

    by Anonymous Coward on Tuesday February 19, 2013 @11:57AM (#42945171)

    Hmm. $999 for 4.5 TF/s vs. $399 for 4.3 TF/s from AMD Radeon 7970. Hard to choose.
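
Taking the parent's numbers at face value, the gap is easy to put in per-dollar terms; a quick sketch using only the figures quoted above:

    # GFLOPS per dollar, using only the figures quoted in the parent comment.
    cards = {
        "GeForce GTX TITAN": (4.5, 999),   # (TFLOPS, price in USD)
        "Radeon HD 7970":    (4.3, 399),
    }
    for name, (tflops, price) in cards.items():
        print(f"{name}: {tflops * 1000 / price:.1f} GFLOPS per dollar")
    # GeForce GTX TITAN:  4.5 GFLOPS per dollar
    # Radeon HD 7970:    10.8 GFLOPS per dollar

By that crude measure the 7970 delivers well over twice the single precision throughput per dollar; the TITAN's pitch, per the summary, is its double precision rate and 6GB frame buffer rather than raw SP FLOPS per dollar.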

  • Re:What's the point? (Score:5, Interesting)

    by fuzzyfuzzyfungus ( 1223518 ) on Tuesday February 19, 2013 @12:29PM (#42945575) Journal

    All games that have the budget for graphics these days are targeted at console limitations. I can't really see any reason to spend that much on a graphics card, except if you're a game developer yourself.

    Buying the absolute top-of-range card (or CPU) almost never makes any sense, just because such parts are always 'soak-the-enthusiasts' collectors' items; but GPUs are actually one area where (while optional, because console specs haven't budged in years) you actually can get better results by throwing more power at the problem, on all but the shittiest ports:

    First, resolution: 'console' means 1920x1080 maximum, possibly less. If you are in the market for a $250+ graphics card, you may also own a nicer monitor, or two or three running in whatever your vendor calls their 'unified' mode. A 2560x1440 panel is pretty affordable by the standards of enthusiast gear. That is substantially more pixels pushed.

    (Again, on all but the shittiest ports) you usually also have the option to monkey with draw distance, anti-aliasing, and sometimes various other detail levels, particle effects, etc. Because consoles provide such a relatively low floor, even cheap PC graphics will meet minimum specs, and possibly even look good doing it; but if the game allows you to tweak things like that (even in an .ini file somewhere, just as long as it doesn't crash), you can throw serious additional power at the task of looking better.

    It is undeniable that there are some truly dire console ports out there that seem hellbent on actively failing to make use of even basic things like 'a keyboard with more than a dozen buttons'; but graphics are probably the most flexible variable. It is quite unlikely (and would require considerable developer effort) for a game that can only handle X NPCs in the same cell as the player on the PS3 to be substantially modified for a PC release that has access to four times the RAM, or enough CPU cores to handle the AI scripts, or something. That would require having the gameplay guys essentially design and test parallel versions of substantial portions of the gameplay assets, and it could even require re-balancing skill trees and things between platforms.

    In the realm of pure graphics, though, only the brittlest 3D engines freak out horribly at changing viewport resolutions or draw distances, so there can be a real reward for considerably greater power. (For some games, there's also the matter of mods: Skyrim, say, throws enough state around that the PS3 teeters on the brink of falling over at any moment. On a sufficiently punchy PC, however, the actual game engine doesn't start running into stability problems (more serious than usual, anyway) until you throw a substantially more cluttered gameworld at it.)
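
To put the "substantially more pixels pushed" point in numbers, here is a small sketch comparing per-frame pixel counts against the targets mentioned in this thread (the triple-monitor line is just an illustrative assumption):

    # Per-frame pixel counts relative to a 1920x1080 baseline.
    baseline = 1920 * 1080
    targets = {
        "1280x720 (common console render target)": (1280, 720),
        "1920x1080":                                (1920, 1080),
        "2560x1440":                                (2560, 1440),
        "3 x 1920x1080 (surround setup, assumed)":  (3 * 1920, 1080),
    }
    for label, (w, h) in targets.items():
        px = w * h
        print(f"{label}: {px:,} pixels per frame ({px / baseline:.2f}x 1080p)")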

  • Re:What's the point? (Score:3, Interesting)

    by K. S. Kyosuke ( 729550 ) on Tuesday February 19, 2013 @01:18PM (#42946121)

    First, resolution: 'console' means 1920x1080 maximum, possibly less. If you are in the market for a $250+ graphics card, you may also own a nicer monitor, or two or three running in whatever your vendor calls their 'unified' mode. A 2560x1440 panel is pretty affordable by the standards of enthusiast gear. That is substantially more pixels pushed.

    And almost all those pixels go to waste. I'm still waiting for display units that would be able to track which direction you're actually looking and give the appropriate hints to the graphics engine. You'd save a lot of computational power by not rendering the parts of the scene that fall into your peripheral vision at full resolution. Or, alternatively, you could use that computational power to draw the parts you *are* looking at with a greater amount of detail.
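
What the comment describes is essentially foveated rendering: track the gaze point and spend shading resolution only where the eye can actually resolve it. A toy sketch of the idea, with made-up thresholds and a hypothetical gaze input (none of this corresponds to a real eye-tracking API):

    import math

    def shading_scale(tile_center, gaze_point, screen_width):
        """Pick a shading-resolution scale for a screen tile: full detail near
        the gaze point, progressively coarser toward the periphery.
        Thresholds are invented purely for illustration."""
        dist = math.hypot(tile_center[0] - gaze_point[0],
                          tile_center[1] - gaze_point[1]) / screen_width
        if dist < 0.10:      # foveal region: render at full resolution
            return 1.0
        if dist < 0.30:      # near periphery: half resolution
            return 0.5
        return 0.25          # far periphery: quarter resolution

    # Example: gaze roughly at the centre of a 2560-pixel-wide screen.
    print(shading_scale((1300, 720), (1280, 720), 2560))   # 1.0  (foveal)
    print(shading_scale((2400, 100), (1280, 720), 2560))   # 0.25 (periphery)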

  • Wow, 4 games (Score:5, Interesting)

    by Sycraft-fu ( 314770 ) on Tuesday February 19, 2013 @04:43PM (#42948133)

    Seriously man, this isn't a console-fan argument, nor is it one you want to have in relation to PC hardware, because you'll lose. The point is, most games these days are targeted at 1280x720, or lower, at 30fps. The problem is that to target anything higher you have to trade something off. Want 60fps? OK, less detail. Want 1080p? OK, less detail. There are only so many pixels the hardware can push. Crank up the rez and you have to sacrifice things.

    Computers can do more than that, but they need more hardware to do it. The target on my system is 2560x1600 @ 60fps, with no detail loss. My 680 can't handle that all the time; that's the point.
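
The trade-off being described is, at its crudest, pixel throughput. A quick sketch comparing a 1280x720 @ 30fps console-style target with the poster's 2560x1600 @ 60fps target (this ignores per-pixel shading cost, AA, and draw distance, all of which widen the gap further):

    # Pixels per second each target requires the GPU to shade (fill-rate view only).
    console_pps = 1280 * 720 * 30        # ~27.6 million pixels/s
    pc_pps      = 2560 * 1600 * 60       # ~245.8 million pixels/s
    print(f"console target: {console_pps / 1e6:.1f} Mpix/s")
    print(f"PC target:      {pc_pps / 1e6:.1f} Mpix/s ({pc_pps / console_pps:.1f}x)")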
