NVIDIA Unveils Dual-GPU Powered GeForce GTX 690
MojoKid writes "Today at the GeForce LAN taking place in Shanghai, NVIDIA CEO Jen-Hsun Huang unveiled the company's upcoming dual-GPU flagship graphics card, the GeForce GTX 690. The GeForce GTX 690 will feature a pair of fully functional GK104 "Kepler" GPUs. If you recall, the GK104 is the chip powering the GeForce GTX 680, which debuted just last month. On the upcoming GeForce GTX 690, each GK104 GPU will be paired with its own 2GB of memory (4GB total) via a 256-bit interface, resulting in what is essentially GeForce GTX 680 SLI on a single card. The GPUs on the GTX 690 will be linked to each other via a PCI Express 3.0 switch from PLX, with a full 16 lanes of electrical connectivity between each GPU and the PEG slot. Previous dual-GPU cards from NVIDIA relied on the company's own NF200 bridge chip, but that chip lacks support for PCI Express 3.0, so NVIDIA opted for a third-party solution this time around."
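For the curious, here is a minimal CUDA sketch (mine, not from the article) of how a dual-GPU card like this looks to software: because the PLX switch exposes the two GK104s as separate PCIe endpoints, the runtime should report two devices, each with its own 2GB of memory.

    // enumerate_gpus.cu -- list CUDA devices, their memory, and their PCI location.
    // Build with the CUDA toolkit: nvcc enumerate_gpus.cu -o enumerate_gpus
    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int count = 0;
        if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
            fprintf(stderr, "No CUDA devices found\n");
            return 1;
        }
        printf("CUDA devices: %d\n", count);
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp p;
            cudaGetDeviceProperties(&p, i);
            // totalGlobalMem is reported per GPU, which is why "4GB" on the box
            // means 2GB usable per GK104.
            printf("Device %d: %s, %.1f GB, PCI bus %d device %d\n",
                   i, p.name, p.totalGlobalMem / (1024.0 * 1024.0 * 1024.0),
                   p.pciBusID, p.pciDeviceID);
        }
        return 0;
    }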
Re: (Score:3)
Re:Wake me up when GK110 hits. (Score:5, Interesting)
I do have to hand it to Nvidia. The power requirements for the current 680 are very low and its performance is quite impressive, but GK110 is going to be a monster...
Re: (Score:2)
Amen. I have a GT240 specifically because it's low-power. I'm not installing any graphics card so power-hungry it needs its own magical power connector. I did once and it was a mistake.
Re: (Score:2)
From the TFA: the top right connector is different (Score:3)
The top right connector is different; any idea why? I also have cables that look like that and, in a moment of lazy weakness, with no explanation in the article, I'd love it if someone cleared that up for us.
Re:From the TFA: the top right connector is differ (Score:5, Informative)
Re:From the TFA: the top right connector is differ (Score:4, Informative)
They're all dual-link (at least the connectors are - that doesn't guarantee the hardware behind them is). Single-link connectors have two blocks of nine pins with a gap in between; the middle block of nine pins appears only on dual-link connectors. The top connector is dual-link DVI-D, while the others are dual-link DVI-I. A DVI-D port will not support a VGA adapter.
Dual-link DVI? (Score:1)
Sure... (Score:5, Funny)
But can it mine bitcoins?
Re: (Score:3, Interesting)
Re: (Score:3, Funny)
yes, it can (Score:1)
Mine bitcoins, run Crysis, and ignite your weed.
But can it feel love?
Re:Sure... (Score:5, Funny)
No, but your power bill will surely get high!
Great (Score:2, Informative)
It is pretty much impossible right now to get a GTX 680 unless one wants to get gouged due to the short supply.
When will nVidia get enough chips out there so my searches don't keep coming back out of stock?
Re: (Score:1)
Use auto-notify on Newegg, and if you miss it once or twice, complain. You'll get the next one then. That's how I got an eVGA GeForce GTX 590 Classified last year. No need to upgrade for quite a while. It's still kickass.
Re: (Score:2, Insightful)
nVidia knows exactly what it is doing.
When the demand is filled (Score:3)
It isn't like they are doing this on purpose. The 680 is just a card that a lot of people want. The thing is, they can only have them produced so fast. TSMC is their sole supplier, and it only has one 28nm production line up and running. That line is still having some trouble (TSMC was a bit overambitious with its half-node plans and struggled with them early on), so total yields aren't what they might like.
Then the real problem is just that everyone wants a piece. TSMC has a
Oh man! (Score:5, Funny)
Re: (Score:2)
Re: (Score:2)
Never mind that, Tux Racer will look amazing! Linux gaming is going to rock!
Re: (Score:2)
Man on Dog? Oh wait, he dropped out of the race, right?
Re: (Score:2)
Unless he cooked it over an nVidia GPU, I don't see how that's relevant!
Re: (Score:3, Insightful)
Except Nvidia has had SLI-based multi-GPU boards since at least the 8000 series, whereas 3dfx hit the limits of its Voodoo architecture and required external wall power by the time the Voodoo5 came out. For all the extra hassle, you got a card that performed about as well as a GeForce 256 but also took up a spot on your power strip. That's why 3dfx died, not because of SLI boards.
Re: (Score:1)
If you ask me, wall outlets were a very good idea. GPUs are the number one reason we have to upgrade our power supplies. And the need to get the power requirements right means that bringing your own power supply can be the source of a plethora of bugs and crashes. Consistent power and precise currents are a necessity for a power-hungry 3.5-billion-transistor chip. Pairing the power supply with the board means solving a very real problem most end customers don't know exists.
I disagree, and think it's quite valid to need a new internal power supply when the hardware consumes more power... a separate power supply means another point of failure and a pain in the ass for nothing, really. I wouldn't, however, buy a new power supply just because it doesn't have enough leads or the right type of leads (ones that do the same thing but with a different plug).
On a side note, anonymous coward postings deserve score-ability; without it, most people never see those posts, however valid they might be.
Welcome to slashdot, get an account, it's free.
Re: (Score:2)
You must be stupid, as ACs can get rated/modded up.
4-digit UID my ass. Maybe 7.
Re: (Score:2, Insightful)
Re: (Score:3)
Well, it's also the fastest graphics card on the market... that's enough nerd porn to warrant an article. And it sets a new record as the first kilobuck card, too.
Re: (Score:2)
http://www.newegg.com/Product/Product.aspx?Item=N82E16814133347 [newegg.com]
$3998 for a card. The 690 is just the first non-workstation kilobuck card.
Re: (Score:2)
Ummmmm yeah... no.
http://www.amd.com/us/products/desktop/graphics/ati-radeon-hd-5000/hd-5970/Pages/ati-radeon-hd-5970-overview.aspx [amd.com]
Re: (Score:2)
The last company to get all "multiple cores" and "SLI on a board" happy was 3dfx. Who NVidia bought out when they... oh yeah, crashed and burned.
Whoops.
I'm pretty sure that both ATI and Nvidia (or one of their OEM partners at the time) have kicked out a "logically speaking, this is two cards in SLI/Crossfire, but on one card!" product for basically every generation since the introduction of the respective GPU-linking features.
The hard part is the fancy tricks that make cooperation between two separate GPUs work at all. Once the vendors decided that they did, in fact, want that to work, the rest is constrained largely by the fact that people willing to
Re: (Score:2)
Re: (Score:2)
Tianhe-1A has 7,168 Teslas and is the fastest supercomputer using GPUs. Titan (formerly Jaguar) will have 18,000 GPUs. Amazon probably has quite a few.
The very top HPC projects may buy in lots of 10,000, but most don't.
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
*Whom*
CUDA Double Precision? (Score:2)
Does anyone know if this new card will be capable of taking advantage of double precision under CUDA, as is the case with some of their other high-end Tesla boards?
Re: (Score:1)
Yes.
http://en.wikipedia.org/wiki/CUDA
Re:CUDA Double Precision? (Score:5, Interesting)
They are. However, relative FP64 performance has dropped compared to the previous generation. If I remember correctly, there is now separate silicon to do FP64, rather than just a modified path in the FP32 cores. In the previous architecture we were down to 1/12 of FP32 performance: only a fraction of the cores on some Fermi chips could do FP64, and then only at half speed. In the new chip the FP64 cores run at full speed, but there are only 8 of them per SMX versus 192 conventional cores, giving a 1/24 performance ratio.
However, Ryan Smith at Anandtech [anandtech.com] speculated that the existence of dedicated FP64 cores means a future compute-oriented chip based on Kepler would be a mean beast if they did a tape-out with exclusively FP64 cores. The only thing holding back double precision then would be memory bandwidth (which would be a large enough deterrent in many cases).
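If you would rather measure the ratio on your own card than take the spec sheet's word for it, a rough sketch along these lines will do (an illustration, not a rigorous benchmark): run the same register-resident multiply-add loop in float and in double and compare the timings. On a GK104 part the double run should land somewhere around the 1/24 figure above, give or take clocks and launch overhead.

    // fp64_ratio.cu -- rough FP32 vs FP64 throughput comparison (sketch only).
    // Build: nvcc -arch=sm_30 fp64_ratio.cu -o fp64_ratio
    #include <cstdio>
    #include <cuda_runtime.h>

    template <typename T>
    __global__ void fma_loop(T *out, int iters) {
        T a = (T)threadIdx.x * (T)0.001 + (T)1.0;
        T b = (T)1.000001, c = (T)0.000001;
        for (int i = 0; i < iters; ++i)
            a = a * b + c;                                   // one multiply-add per iteration
        out[blockIdx.x * blockDim.x + threadIdx.x] = a;      // keep the compiler from dropping the loop
    }

    template <typename T>
    static float time_kernel(int iters) {
        const int blocks = 1024, threads = 256;
        T *out;
        cudaMalloc((void **)&out, blocks * threads * sizeof(T));
        cudaEvent_t t0, t1;
        cudaEventCreate(&t0); cudaEventCreate(&t1);
        cudaEventRecord(t0);
        fma_loop<T><<<blocks, threads>>>(out, iters);
        cudaEventRecord(t1);
        cudaEventSynchronize(t1);
        float ms = 0.f;
        cudaEventElapsedTime(&ms, t0, t1);
        cudaEventDestroy(t0); cudaEventDestroy(t1);
        cudaFree(out);
        return ms;
    }

    int main() {
        const int iters = 1 << 16;  // kept modest to stay under any display watchdog
        float f = time_kernel<float>(iters);
        float d = time_kernel<double>(iters);
        printf("float: %.1f ms, double: %.1f ms, slowdown: %.1fx\n", f, d, d / f);
        return 0;
    }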
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Nvidia cripples GPGPU in Geforce GTX 680 [theinquirer.net]
Benchmark Results: Sandra 2012 [tomshardware.com]
NVIDIA GTX 680 Reviewed: A New Hope [brightsideofnews.com]
Re: (Score:2)
The run length on PCIe 3.0 is quite limited (about a foot, I believe, though it's not given). There is an ePCIe spec, and there are external devices that will do external PCIe 2.0 and are used for external card chassis or to host SSD storage, with run lengths up to two meters. While it's theoretically possible to put one of these inside a laptop dock, I don't see that happening anytime soon because there's not enough of a market for it. As the frequencies increase, the distance a workable signal can propagate shrinks.
Re: (Score:2)
Re: (Score:1)
There are cheap solutions that aren't pretty...
Like this example you could use to hook it up to a laptop...
http://www.hwtools.net/adapter/PE4L-EC000A.html [hwtools.net]
As a game developer... (Score:2, Interesting)
As a game developer, I can tell you that the only thing that significantly affects frame rate in a GPU-bound game is GFLOPs. And as the owner of a three-year-old PC with a stock power supply, I'm most interested in the "x40" cards, because those are the highest cards you can install in a machine with a stock 350W power supply.
According to what I see on Wikipedia [wikipedia.org], NVIDIA apparently pulled a fast one this generation and re-branded some 500 series cards as the PCIe 2.0 x16 versions, while all the cards with impressive new specs are PCIe 3.0 parts.
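For anyone wanting to put a number on "GFLOPs": the usual back-of-the-envelope estimate is cores x clock x 2 (one multiply-add per core per clock). Here is a small sketch built on the CUDA device properties; the cores-per-multiprocessor table and the 2-ops-per-clock assumption are mine, not something the API reports.

    // peak_gflops.cu -- back-of-the-envelope peak FP32 GFLOPS estimate (sketch only).
    #include <cstdio>
    #include <cuda_runtime.h>

    // Assumed CUDA cores per multiprocessor for the architectures of this era.
    static int cores_per_sm(int major, int minor) {
        if (major == 2) return (minor == 1) ? 48 : 32;  // Fermi GF10x/GF11x
        if (major == 3) return 192;                     // Kepler GK104/GK110
        return 0;                                       // unknown: don't guess
    }

    int main() {
        int count = 0;
        cudaGetDeviceCount(&count);
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp p;
            cudaGetDeviceProperties(&p, i);
            int cores = cores_per_sm(p.major, p.minor) * p.multiProcessorCount;
            // clockRate is in kHz; 2 ops/clock assumes one multiply-add per core per cycle.
            double gflops = 2.0 * cores * (p.clockRate * 1e3) / 1e9;
            printf("%s: %d cores @ %.0f MHz -> ~%.0f GFLOPS (theoretical FP32)\n",
                   p.name, cores, p.clockRate / 1e3, gflops);
        }
        return 0;
    }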
Re: (Score:2)
Is this optimistic theory a horrible pack of lies in general, are Nvidia products specifically broken in this respect, or do the newer ones make assumptions about bus speed that cause them to underperform on PCIe 2.0 boards?
Re: (Score:2)
Re: (Score:3)
Internet anecdote suggests that this glorious vision may or may not actually be 100% realized, yo
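Rather than relying on anecdote, you can check your own slot: PCIe 2.0 x16 tops out around 8 GB/s and PCIe 3.0 x16 around 15.75 GB/s, and a large pinned-memory copy should land somewhat below whichever rate your board actually negotiated. A rough sketch (mine, not a proper benchmark):

    // pcie_bw.cu -- rough host-to-device bandwidth check over PCIe (sketch only).
    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        const size_t bytes = 256ull * 1024 * 1024;  // one 256 MB transfer
        void *host, *dev;
        cudaMallocHost(&host, bytes);  // pinned host memory, so the copy isn't staged
        cudaMalloc(&dev, bytes);

        cudaEvent_t t0, t1;
        cudaEventCreate(&t0); cudaEventCreate(&t1);
        cudaEventRecord(t0);
        cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);
        cudaEventRecord(t1);
        cudaEventSynchronize(t1);

        float ms = 0.f;
        cudaEventElapsedTime(&ms, t0, t1);
        printf("Host->device: %.2f GB/s\n", (bytes / 1e9) / (ms / 1e3));

        cudaFree(dev);
        cudaFreeHost(host);
        return 0;
    }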
Re: (Score:2)
Re: (Score:2)
...beowulf cluster of these!
Emulated on GPUs...
Can someone explain... (Score:2)
Can someone explain to me why general-purpose CPU-memory interfaces don't have this kind of bandwidth, to keep the newer 6- and 8-core monsters well fed with data and code to crunch?
Re: (Score:2)
Because gamers pay big bucks for a couple more FPS. Office workers won't get one tiny bit of speed out of a faster CPU. Scientists have real computers to use instead of PCs.
Re: (Score:2)
"Scientists have real computers to use instead of PCs."
Really? Then what do they use?
OK, sure, I'm sure somewhere somebody has a Cray... which is powered by a fuckload of x64 CPUs, but really, I bet almost all of them are using Dell laptops with i7s and Nvidia Quadros.
Re: (Score:2)
That super computer/cluster market is precisely why I would have thought there would be a market for super-bandwidth CPUs. Such systems tend to use the highest of the high end processors already, along with custom memory interfaces and backbones to speed up the communications within the cluster.
Some posters seem to have assumed I was talking about PCs. I specifically said CPU because I wasn't concerned about maintaining compatibility with desktop architectures, but the really big data crunching engines
Re: (Score:2)
Plus, your GPU doesn't have to deal with some random manufacturer's memory chips hiding behind plug interfaces. If I take 1/3 of the RAM out of one of my boxes (the furthest of 3 slots), the memory timings magically tighten up.
Re: (Score:2)
Re: (Score:2)
Three things:
1. Data width: CPUs use one channel of 64-bit-wide DIMMs (sometimes two if you are lucky), while you can find high-end GPUs with 12 to 16 32-bit channels to the DRAM chips. It's hard to find that many spare pins on a CPU package. (See the sketch after this list for the numbers.)
2. DIMMs: People who buy CPUs want to plug in memory modules, and the physics of connectors and their electrical limitations limit performance. For example, DDR3 DIMMs need per-bit-lane read/write "leveling" to compensate for clock time-of-flight across the DIMM, whereas GPUs tend to use soldered-down memory.
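The bandwidth gap in point 1 is easy to put numbers on: channels x bus width x effective data rate. A tiny sketch of the arithmetic, with example figures of my own choosing (a dual-channel DDR3-1600 desktop versus a 256-bit, 6 GT/s GDDR5 card in the GTX 680's class):

    // mem_bw.cu -- peak memory bandwidth = channels * bus width * effective data rate.
    #include <cstdio>

    static double peak_gbs(int channels, int bits_per_channel, double gigatransfers_per_s) {
        return channels * (bits_per_channel / 8.0) * gigatransfers_per_s;  // GB/s
    }

    int main() {
        // Example figures, not pulled from any particular vendor spec sheet.
        printf("CPU, 2 x 64-bit DDR3-1600  : %.1f GB/s\n", peak_gbs(2, 64, 1.6));
        printf("GPU, 8 x 32-bit GDDR5 6GT/s: %.1f GB/s\n", peak_gbs(8, 32, 6.0));
        return 0;
    }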
$1000 a pop. (Score:1)
TSMC's yield on 28nm has been really low. They priced it sky high because they simply don't have enough chips to make many of these monsters -- supply and demand I suppose.
The real story in my mind is how the tech press will go gaga over a part few will ever own, and how that will inevitably help frame the entire nVidia 6xx product line and sell parts that are not the GTX 690. I guess it's no different from Chevrolet building a high-performance sports car to improve the perception of the bowtie logo.
I'm not sure I understand all this mumbo-jumbo... (Score:1)
Re: (Score:2)
Re: (Score:2)
About $4000, and $3000 more for the box to put it in.
$3500, you forgot the power supply
whoppie (Score:2)
now I can play my Xbox 360 ports (which would run pretty decently on a GeForce 8 series) at 180fps instead of 120, let me just shit myself
here nvidia, have 1000 bucks!
Re: (Score:1)
how fast can it sequence a human genome? (Score:2)
That's what GPUs are used for these days.