NVIDIA GeForce GTX TITAN Uses 7.1 Billion Transistor GK110 GPU 176

Vigile writes "NVIDIA's new GeForce GTX TITAN graphics card, announced today, uses the GK110 GPU first unveiled in May 2012 for the HPC and supercomputing markets. The GPU delivers 4.5 TFLOPS of single precision compute from its 2,688 single precision cores (plus 896 double precision cores), backed by a 384-bit memory bus and 6GB of on-board memory, double the frame buffer of AMD's Radeon HD 7970. At 7.1 billion transistors and a 551 mm^2 die, GK110 is very close to the reticle limit of current lithography technology! The GTX TITAN also introduces a new GPU Boost revision based on real-time temperature monitoring, plus support for monitor refresh rate overclocking that will entice gamers; and with a $999 price tag, the card could be one of the best GPGPU options on the market." HotHardware says the card "will easily be the most powerful single-GPU powered graphics card available when it ships, with relatively quiet operation and lower power consumption than the previous generation GeForce GTX 690 dual-GPU card."
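The quoted 4.5 TFLOPS figure can be sanity-checked from the core count. A minimal sketch, assuming a clock of roughly 837 MHz (the summary does not state the clock) and one fused multiply-add, i.e. 2 FLOPs, per core per cycle:

```python
# Rough sanity check of the 4.5 TFLOPS single-precision figure.
# Assumption: ~837 MHz boost clock (not stated in the summary) and
# 2 FLOPs per core per cycle (one fused multiply-add).
cores = 2688
clock_hz = 837e6          # assumed clock
flops_per_core_cycle = 2  # FMA counts as two floating-point operations

tflops = cores * clock_hz * flops_per_core_cycle / 1e12
print(f"{tflops:.2f} TFLOPS")  # ~4.50
```

The arithmetic lands almost exactly on the advertised number, which suggests the marketing figure is simply cores x clock x 2.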
  • Re:GK110 vs. 7970 (Score:2, Insightful)

    by NoNonAlphaCharsHere ( 2201864 ) on Tuesday February 19, 2013 @12:03PM (#42945247)
    Hmm. $999 (2013) for 4.5 TFLOPS vs. $15 million (1984) for 400 MFLOPS from a Cray X-MP. Hard to believe.
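The comparison above can be checked with a quick script, using only the figures from the comment (no inflation adjustment):

```python
# Price per FLOPS: GTX TITAN (2013) vs. Cray X-MP (1984),
# using the numbers quoted in the comment.
titan_flops, titan_price = 4.5e12, 999
cray_flops, cray_price = 400e6, 15e6

titan_cost_per_gflops = titan_price / (titan_flops / 1e9)  # ~$0.22 per GFLOPS
cray_cost_per_gflops = cray_price / (cray_flops / 1e9)     # $37.5 million per GFLOPS

ratio = cray_cost_per_gflops / titan_cost_per_gflops
print(f"~{ratio:,.0f}x cheaper per FLOPS")  # roughly 169 million times
```

In 1984 dollars the gap is even larger, but even nominally it is about eight orders of magnitude.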
  • by dtjohnson ( 102237 ) on Tuesday February 19, 2013 @12:20PM (#42945439)

    Software (other than games) that can actually benefit from this type of hardware is scarce and expensive. This $1000 card will probably be in the $5 bargain box at the local computer recycle shop before there is any significant software in widespread use that could put it to good use.

  • by Anonymous Coward on Tuesday February 19, 2013 @12:31PM (#42945611)

    I have no need for this therefore nobody does.

    Why do people find this argument convincing? It's just dumb.

  • by Sockatume ( 732728 ) on Tuesday February 19, 2013 @12:56PM (#42945881)

    The Next Big Thing is all-real-time lighting. Epic has been demoing a sparse voxel based technique that just eats GPU power.

  • Nope (Score:5, Insightful)

    by Sycraft-fu ( 314770 ) on Tuesday February 19, 2013 @04:55PM (#42948295)

    Let's have a look at some recent non-FPS games:

    Darksiders II: 1152x640
    Dishonored: 1280x720
    Mass Effect 3: 1280x720
    Need For Speed: Most Wanted: 1280x704
    Soul Calibur V: 1280x720
    Sleeping Dogs: 1152x640
    XCOM: Enemy Unknown: 1280x720

    That's just a selection of non-FPS games released last year that render at 1280x720, or lower, on the PS3.

    Most PS3 games don't do 1920x1080. The console doesn't have the fillrate or the VRAM to deal with that without serious quality sacrifices, so most developers choose less rez for more eye candy.

    Remember that the resolution a console outputs at isn't the same as the resolution it renders at. You can upsample any output you like; that's how a DVD player can output a 1080p signal even though the DVD itself is 720x480 anamorphic.
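A quick pixel count shows why rendering below 1080p saves so much work per frame, regardless of what resolution the scaler outputs:

```python
# Pixel counts: rendering at 720p vs. 1080p. The output signal can
# still be 1080p -- the frame is just upsampled, like a DVD player
# upscaling a 720x480 source.
render = (1280, 720)
output = (1920, 1080)

render_px = render[0] * render[1]   # 921,600 pixels per frame
output_px = output[0] * output[1]   # 2,073,600 pixels per frame

print(f"1080p has {output_px / render_px:.2f}x the pixels of 720p")  # 2.25x
```

Shading 2.25x fewer pixels frees fillrate and VRAM for effects, which is exactly the trade-off the comment describes.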

"If it's not loud, it doesn't work!" -- Blank Reg, from "Max Headroom"