AMD Announces 16 TFLOP Radeon Pro Duo (hothardware.com) 42
MojoKid writes: Remember that Radeon R9 Fury X2 graphics card that AMD CEO Lisa Su showed off months ago? We were previously lead to believe that the dual-GPU graphics card would deliver performance of around 12 TFLOPs. However, the card will actually deliver in excess of 16 TFLOPs. AMD says that this is more than enough to allow developers to "Develop content more rapidly for tomorrow's killer VR experiences while at work, and playing the latest DirectX 12 experiences at maximum fidelity while off work." And the Radeon R9 Fury X2 name? That's dead and buried — the card is now known as the Radeon Pro Duo. Not much is known about the new card at this point but the Radeon Pro Duo will apparently be available during the second quarter with an estimated street price of $1,499.
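Back-of-the-envelope: if the card simply pairs two Fury X-class Fiji GPUs (4096 stream processors each, counting a fused multiply-add as two FLOPS) at around 1 GHz, that works out to 2 × 4096 × 2 × ~1 GHz ≈ 16.4 TFLOPS of single-precision throughput, which would line up with AMD's claim.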
Find out what is and isn't actually news (Score:4, Insightful)
Not news: AMD slapped two of last year's GPUs onto one card in an underclocked, overpriced configuration, as AMD and Nvidia have been doing for years.
News: Even AMD couldn't avoid posting pictures of a nice shiny red AMD developer system that's clearly running an Intel Haswell-E CPU on an LGA 2011 motherboard to make the "X2" or "Pro" or whatever they're calling it functional.
What can I do with it? (Score:1)
Re: (Score:3)
What you can do with it: talk about how awesome your new GPU is, while never actually getting that performance out of it because of the piss-poor state of AMD's drivers.
Re: (Score:2)
Get a game VR-ready for Windows: 4K at 90 fps.
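For scale, that target works out to 3840 × 2160 × 90 ≈ 750 million pixels per second, about six times the pixel throughput of 1080p at 60 fps, and that's before any of the extra per-eye rendering a headset adds.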
A few observations (Score:3)
Going by the pricing, this is clearly being pitched against the Nvidia Titan range. The current entry in that range is something of a white elephant, with performance basically matched by the (much cheaper) 980 Ti. However, past Titans (and their *90 predecessors) have generally carved out a successful enough niche in the super-premium section of the market.
However, and this is where I can speak from personal experience, multi-GPU cards are not always a great use of money, even at the top end of the market. I've owned two of them (both Nvidia): the 7950GX2 and the GTX 590. Both had problems. In both cases, support in individual games was patchy: in some titles you got only limited benefit, in a fairly large number you got no benefit at all, and in some (including various iterations of World of Warcraft) you could get odd performance artefacts and stability problems that made the dual-GPU card actually weaker than the top-end single-GPU card. That situation has not changed; the last twelve months have seen a number of major PC releases with poor, missing, or seriously bugged multi-GPU support.
The other point is that these cards are not necessarily the easiest to live with on a day-to-day basis. While Nvidia have recently made great strides in reducing the heat, noise and power consumption of their high-end cards (the 980 behaves like a low-to-mid-range card from a few generations ago, and even the 980 Ti is reasonably civilised), AMD cards remain louder, hotter and more power-hungry. God only knows what the thermal and acoustic profile of this latest beast is going to look like.
For a lot of users, that may mean PSU and system-cooling upgrades. It might make this card a poor choice for living-room PCs (which are increasingly popular, thanks to Steam Big Picture mode and the like). And it raises lingering worries about longevity; some past dual-GPU cards, like the 7950GX2, were notorious for burning out after 18 months or so.
Re: (Score:2)
However, and this is where I can speak from personal experience, multi-GPU cards are not always a great use of money
Are they ever? It seems like two cards in CF/SLI have pretty much the same performance and drawbacks, at much less of a premium. Now I'd really like a single card that could drive my 4K monitor, but even the 980 Ti/Fury X aren't quite there. I'm guessing I'll stick with what I have until a 14/16nm part with 8GB of HBM2 is an option.
Supercomputing (Score:4, Insightful)
Where AMD really seems to be missing out is supercomputing. If you are building a compute cluster, you always go with Nvidia because of CUDA's overwhelming presence in the ecosystem. (Cracking might be an exception.) For example, all the major deep learning frameworks work only with CUDA. Why doesn't AMD care? It must be losing a lot of sales over this.
If AMD paid three guys full-time to add OpenCL backends to the most popular open-source libraries and built a cuDNN equivalent, the world would be a better place for everyone, but most clearly for AMD.
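For a rough sense of what that porting work looks like (this is just a sketch using the PyOpenCL bindings, not any existing framework's actual backend code), here is the kind of element-wise kernel a CUDA-only library would have to re-express in OpenCL:

import numpy as np
import pyopencl as cl  # assumes the PyOpenCL bindings are available

# Pick whatever OpenCL device is visible (e.g. a Radeon) and make a queue.
ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

# The same SAXPY a CUDA library would ship as a __global__ kernel,
# re-expressed in OpenCL C and compiled at runtime.
prg = cl.Program(ctx, """
__kernel void saxpy(const float a,
                    __global const float *x,
                    __global float *y)
{
    int i = get_global_id(0);
    y[i] = a * x[i] + y[i];
}
""").build()

n = 1 << 20
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)

mf = cl.mem_flags
x_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=x)
y_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=y)

# Launch over a 1-D global work size of n and copy the result back.
prg.saxpy(queue, (n,), None, np.float32(2.0), x_buf, y_buf)
cl.enqueue_copy(queue, y, y_buf)

Now multiply that by every kernel in a cuDNN equivalent (convolutions, pooling, batched GEMM, plus per-architecture tuning) and it's obvious why nobody has done it as a hobby project. For AMD, though, it really is just a few salaries.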
Re: (Score:2)
That would not be sufficient. It is not just about the libraries: powerful AMD cards are usually loud and power-hungry, and they do not have proper Linux/Unix support, which means they are not cluster-friendly.
The message from ATI/AMD has always been clear: they only care about the gaming market, and only on the Windows platform.
Re: (Score:2)
Yes, that looks like a good start. Probably still a long way from getting something complex like Theano working with OpenCL, though. (But if I'm wrong about that, all the better.)
Re: (Score:3)
I'd say there are quite a few OpenCL projects that run rather well on AMD cards: POEM@HOME and Einstein@HOME seem to work better on AMD (and I'm running AMD on Linux too, mind you).
Great, more TFLOPS (Score:5, Insightful)
More TFLOPS are great. But what I'm really interested in from AMD is:
1) Better driver support
2) Something to compete with Nvidia's PhysX, GameWorks, built-in video encoding, etc.
3) Better support from game developers
I recently abandoned the red team because I got sick of waiting for them to get their act together while Nvidia got all the developer and exclusive-feature love.
Re: (Score:3)
More TFLOPS are great. But what I'm really interested in from AMD is:
1) Better driver support 2) Something to compete with Nvidia's PhysX, GameWorks, built-in video encoding, etc. 3) Better support from game developers
I recently abandoned the red team because I got sick of waiting for them to get their act together while Nvidia got all the developer and exclusive-feature love.
This. I don't care about FLOPS; I'm not a mainframe sysadmin. Give me Linux drivers or bust.
Re: (Score:2)
I, like many of us, have used both companies' products, and for a while alternated each generation from one to the other (going back to, I believe, the ATI All-in-Wonder series). But until the three items you've laid out are addressed, I'm sad to say that my future purchases will consist exclusively of Nvidia boards.
Terra flops (Score:2)
I don't think this article is complete, but the teraflops barrier was broken around 1997: https://en.wikipedia.org/wiki/... [wikipedia.org] and those machines were considered "supercomputers."
And now we have off-the-shelf GPUs doing 16 TFLOPS in a PC.
I guess EVE Online needs to change the specs of their ships and shift to petaflops ;D
Overpriced because of Bitcoin/Litecoin (Score:2)
Lead to believe (Score:2)
We were previously lead
As in Pb, the metal?