Air-Cooled AMD Radeon R9 Fury Arrives For $100 Less With Fury X-Like Performance
MojoKid writes: When AMD launched the liquid-cooled Radeon Fury X, it was obvious the company was willing to commit to a new architecture and bleeding-edge technologies (Fiji and High-Bandwidth Memory, respectively). However, it fell shy of the mark enthusiasts hoped it would achieve, unable to quite deliver a definitive victory over NVIDIA's GeForce GTX 980 Ti. Now AMD has launched the Radeon R9 Fury (no "X", and sometimes referred to as "Fury Air"), a graphics card that brings a more compelling value proposition to the table. It's the Fury release that should give AMD a competitive edge against NVIDIA in the $500+ graphics card bracket. The Radeon R9 Fury's basic specs are mostly identical to those of the liquid-cooled flagship Fury X, with two important distinctions: a 50MHz reduction in GPU clock speed, to 1000MHz, and 512 fewer stream processors, for a total of 3584. Here's the interesting news, which the benchmark results demonstrate: in price the Fury veers closer to the NVIDIA GeForce GTX 980, but in performance it sneaks in awfully close to the GTX 980 Ti.
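For a sense of what those two spec cuts mean on paper, here is a quick back-of-the-envelope throughput comparison (a sketch assuming GCN's usual 2 FLOPs per stream processor per clock; the Fury X figures of 4096 SPs at 1050MHz follow from the deltas quoted above):

```python
# Back-of-the-envelope FP32 throughput for Fury vs. Fury X.
# Assumes GCN's 2 FLOPs (one fused multiply-add) per stream processor per clock.

def tflops(stream_processors, clock_mhz, flops_per_clock=2):
    """Theoretical peak FP32 throughput in TFLOPS."""
    return stream_processors * clock_mhz * 1e6 * flops_per_clock / 1e12

fury   = tflops(3584, 1000)  # R9 Fury: 3584 SPs @ 1000 MHz
fury_x = tflops(4096, 1050)  # Fury X: 512 more SPs, 50 MHz higher clock

print(f"R9 Fury: {fury:.2f} TFLOPS")    # ~7.17 TFLOPS
print(f"Fury X:  {fury_x:.2f} TFLOPS")  # ~8.60 TFLOPS
print(f"Fury is {fury / fury_x:.0%} of Fury X on paper")
```

On paper that is roughly a 17% deficit, which makes benchmark results that land close to the GTX 980 Ti all the more notable.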
Re: (Score:3)
Re: (Score:2)
With you on this (Score:3)
Still, I'm happy as larry that the PC world has finally decided to leave 1080 panels behind. I was running higher res than 1080 for years, and then those pesky TV panels turned up everywhere and put us back years.
Re: (Score:2)
1366x768 is an actual improvement over 1024x768, not so much for people who "upgraded" from 1280x1024 though.
A damn shame that 1440x900 isn't the standard.
Re: (Score:3)
Perhaps for current-day games, but the proposed specifications for the commercial Oculus Rift are quite high (and those are just the "recommended" specs): https://www.oculus.com/en-us/b... [oculus.com]
The high-end cards of today will be the mid-high range cards of next year, so I wouldn't be surprised if some of the more demanding VR games make full use of the available power.
Re: (Score:2)
I'm starting to think I'm getting old and am the only person who doesn't give a shit about the Oculus Rift or any other VR setup, at least not yet. Seriously, I've always been a graphics junkie, since the days when CGA was the standard for color, and I really am quite happy right now with 1080p on a flat screen.
Re: (Score:2, Interesting)
My old-ass eyes can barely tell the difference between 1080 and 4k. Give me a nice big monitor, and a game that runs smoothly (which apparently is hard for some companies *arkham knight*) and I don't really need to spend the money on two Titans. Who decided that we need photorealism in games, anyway?
Re: (Score:2)
Fancy ass Arkham City didn't need a $200+ video card.
Re: (Score:2)
Re: (Score:1)
Oh, you're paying for the on-board graphics. You can't NOT pay for it, because Intel welded it to the CPU.
Re: (Score:2)
Who decided that we need photorealism in games, anyway?
The developers did, to make up for not really adding any new gameplay or content. They want to sell 'the next big thing' without really needing to do anything but reiterate the old stuff at higher rez.
Re: (Score:2)
So turn off smoothing. It's a function of the tv, and takes about 10 seconds to turn off. I personally hate the feature, it makes movies look like they were filmed for basic cable tv.
Re: (Score:2)
You should get some cough mixture for that cough. It sounds terrible.
Re: (Score:1)
Next year is the year of the Linux Desktop.
Re: (Score:2)
Wait for Ubuntu 16.04 (then three months after, you'll long for Ubuntu 18.04, but sssh...)
Re: (Score:2)
Apple has thousands of developers, artists and other experts that get paid to work on OSX.
When do you think a Linux desktop company will come close to matching that?
Granted, most of the work that they do at Apple is thrown away before it reaches the consumer, but that is often the nature of product development, when you don't know which features the users will need or want. The same would be true for a Linux desktop OS company.
Re: (Score:1)
I'm just curious what your use case is that it would necessitate using this kind of hardware? Otherwise this ticks off so many boxes in the troll checklist that I can't take it seriously.
Re: (Score:3)
As soon as you write a driver for it.
It's open source, after all. Nothing stopping you.
None (Score:1)
You're a whiny zealot. Companies don't cooperate with you folks because no matter what, you will bitch and cry about something. So in the end it's easier to just not deal with the one-in-a-million Linux users.
Re: (Score:3, Funny)
Sigh. 28nm... (Score:5, Interesting)
Re: (Score:3)
That's because manufacturers run into limits, especially around cost, since Moore's law has reached the end of the line. A transistor on 20nm or 14nm is more expensive than a transistor on 28nm.
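The cost claim is easy to illustrate with a toy model (all numbers below are hypothetical placeholders, not actual foundry pricing): if wafer cost rises faster than transistor density, cost per transistor goes up despite the shrink.

```python
# Toy cost-per-transistor model. All numbers are hypothetical
# placeholders for illustration, not real foundry pricing.

def cost_per_mtransistor(wafer_cost_usd, transistors_per_wafer_b):
    """Dollars per million transistors, given billions per wafer."""
    return wafer_cost_usd / (transistors_per_wafer_b * 1e3)

# Suppose a 20nm wafer packs ~1.9x the transistors of a 28nm wafer,
# but costs ~2.2x as much (hypothetical ratios):
c28 = cost_per_mtransistor(wafer_cost_usd=5000,  transistors_per_wafer_b=100)
c20 = cost_per_mtransistor(wafer_cost_usd=11000, transistors_per_wafer_b=190)

print(f"28nm: ${c28:.3f} / Mtransistor")
print(f"20nm: ${c20:.3f} / Mtransistor")  # higher despite the shrink
```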
Re: (Score:1)
Yeah, if you don't like it, talk to TSMC (and maybe Samsung), and the people that supply them. This isn't NVIDIA's fault, and AMD/ATI is in the exact same boat, so quit whining.
Re: (Score:2)
Re: (Score:2)
I don't really share your want for lower-power graphics cards though. These are *desktop* parts connected to the electrical mains. I don't live in communist germany wh
Re: (Score:1)
Re: (Score:2)
It's not a notebook GPU. It is a desktop GPU. Why would you be worrying about power consumption and heat? This is marketed toward PCs.
BECAUSE I DON'T WANT TO HAVE TO SHOUT OVER ALL THIS FAN NOISE!
Well, the noise issue is mostly solved with aftermarket coolers, but that still leaves power consumption and heat. I guess none of this matters for the occasional gamer, but if you do productive work on GPUs 24/7, and (gasp) pay for your electricity, then these things matter.
(I've been building silent, often fanless computers since about 2003, since I simply don't want any extra noise where I live. Besides, I've never understood why it's OK
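For anyone who thinks power draw doesn't matter on a desktop, the 24/7 arithmetic is easy to run (the wattages and $/kWh rate below are illustrative; plug in your own card's draw and local rate):

```python
# Annual electricity cost of running a GPU around the clock.
# Wattages and the rate below are illustrative examples.

def annual_cost(watts, usd_per_kwh=0.15, hours=24 * 365):
    """Cost in dollars of drawing `watts` continuously for a year."""
    return watts / 1000 * hours * usd_per_kwh

for watts in (75, 175, 275):  # e.g. low-, mid-, high-power cards
    print(f"{watts:>3} W, 24/7: ${annual_cost(watts):.0f}/year")
```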
Re: (Score:1)
"BECAUSE I DON'T WANT TO HAVE TO SHOUT OVER ALL THIS FAN NOISE! "
Son, unless you're running Delta fans, you have no right to say shit about noise.
I've got a single Delta fan louder than a QUAD SLI TITAN setup.
Re: (Score:1)
In recent years, Nvidia has made big strides in reducing power consumption for a given level of performance. You can buy the latest and greatest in performance, which will outperform older cards, OR you can get similar performance in a smaller, cooler, and cheaper package. The 750 Ti comes to mind:
It is "only" a midrange card, but with a power consumption of 60-70W it does not even need an additional PCIe power connector.
Recently, AMD are also getting closer with HBM on the Fury (although they are still fal
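A side note on why that 60-70W figure matters: a PCIe x16 slot supplies up to 75W on its own, so anything under that needs no auxiliary connector at all. A quick sketch of the power-budget arithmetic:

```python
# Why a ~60-70 W card needs no extra power cable: the PCIe x16 slot
# alone can supply up to 75 W. A 6-pin connector adds 75 W, an 8-pin 150 W.

PCIE_SLOT_W = 75
CONNECTOR_W = {"6-pin": 75, "8-pin": 150}

def max_board_power(connectors=()):
    """Maximum in-spec board power for a given set of aux connectors."""
    return PCIE_SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(max_board_power())                    # 75 W  -> covers a 750 Ti
print(max_board_power(("6-pin",)))          # 150 W
print(max_board_power(("8-pin", "6-pin")))  # 300 W -> flagship territory
```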
Re: (Score:1)
I usually go by the performance index of www.3dcenter.org, which gives an average performance value relative to the Radeon HD 5750/6750 GDDR5, which is defined as 100%.
The index is not based on theoretical GFLOPS, but on tests by various review sites (mostly gaming) and calculated for benchmark results at 1920x1080 with 4x multisampling anti-aliasing.
This explains why Nvidia looks better in the 3dcenter.org ranking, as they usually get more gaming performance out of cards with the same GFLOPS.
3dcenter.org a
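For the curious, here is a minimal sketch of how such a relative index can be computed (the aggregation method and fps numbers are my illustrative assumptions, not 3dcenter.org's actual data or methodology):

```python
# Minimal sketch of a relative performance index, normalized so a
# baseline card = 100%. The aggregation method and fps values are
# illustrative assumptions, not 3dcenter.org's actual data or method.
from statistics import geometric_mean

# fps results per card across several game benchmarks (hypothetical)
results = {
    "HD 5750 (baseline)": [30, 25, 40],
    "R9 Fury":            [110, 95, 150],
}

baseline = results["HD 5750 (baseline)"]

def index(card_fps):
    # Geometric mean of per-benchmark ratios vs. the baseline, x100
    ratios = [c / b for c, b in zip(card_fps, baseline)]
    return 100 * geometric_mean(ratios)

for name, fps in results.items():
    print(f"{name:>20}: {index(fps):.0f}%")
```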
Re: (Score:2)
If you're so loaded that you're buying a 50 inch 4k TV and a $500+ graphics card, I'm sure you can shell out $20 for an adapter. http://www.amazon.com/dp/B00E9... [amazon.com]
Re: (Score:2)
I'd like to know where you're buying a 50" 4k TV for $600...
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
" i doubt a regular vga cable can transport 4k at 60 hz.. because i know it allready has problems with full hd.."
Bullshit. I was doing OVER Full HD (That would be 2048x1536) on VGA OVER A DECADE AGO on a 21" Trinitron.
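The bandwidth math actually supports both sides here: 2048x1536@60 fits comfortably under a typical analog RAMDAC ceiling, while 4K@60 does not (a sketch; the ~1.25x blanking overhead and 400MHz ceiling are ballpark assumptions):

```python
# Rough VGA (analog) pixel-clock requirement: active pixels x refresh
# x blanking overhead. The ~1.25x overhead factor and ~400 MHz RAMDAC
# ceiling are ballpark assumptions for illustration.

RAMDAC_LIMIT_MHZ = 400  # typical high-end RAMDAC ceiling

def pixel_clock_mhz(width, height, hz=60, blanking_overhead=1.25):
    return width * height * hz * blanking_overhead / 1e6

for w, h in [(1920, 1080), (2048, 1536), (3840, 2160)]:
    clk = pixel_clock_mhz(w, h)
    verdict = "fits" if clk <= RAMDAC_LIMIT_MHZ else "exceeds"
    print(f"{w}x{h}@60: ~{clk:.0f} MHz ({verdict} a {RAMDAC_LIMIT_MHZ} MHz RAMDAC)")
```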
Re: (Score:2)
as most 50 inch+ 4k TVs don't come with DisplayPort, but with an HDMI 2.0 port, and 4k PC monitors don't come in useful sizes and prices. This card isn't worth looking at for me.
DisplayPort contains HDMI; this is like complaining that your laptop doesn't have a microUSB port and therefore you cannot connect it to a phone. Just get a dumb DP-to-HDMI cable.
Does it have good Linux drivers? (Score:3)
Does it have good Linux drivers? I.e., drivers with the same performance, memory requirements, etc. as their Windows counterparts? They don't have to be free, just good.
No? Then I'm not interested...
Re: (Score:2)
I'm sure they're fine with that. They'll just sell to the millions more Windows users.
Re: (Score:2)
We'll see if they'll be fine with the negative publicity if the Linux drivers are crap.
Re: (Score:2)
It's not news that the drivers for Linux are crap. They have been for many years.
You're acting like this is something new, as if all the old products and their competitors' products had excellent Linux driver support.
Re: (Score:2)
NVidia's Linux drivers are just as bad, or even worse if you had the misfortune of having a laptop with Nvidia "Optimus" integrated graphics over the past few years.
Re: (Score:2)
The open drivers work just fine for everything that's not bleeding edge. We'll see how the amdgpu driver comes along.
Re: (Score:1)
Re: (Score:1)
They're not interested because of the general attitude of the Linux community.
Same reason I don't contribute my driver fixes upstream. You people are never satisfied and your attitude shows it.
I don't suffer ingrates.
Buggy support (Score:1, Informative)
Benchmarks mean nothing when so many games have extra bugs with AMD products. They really need to do something about building up a reputation for stable drivers that offer stable performance, even for new games. Because right now, raw performance means dick to the customers when it comes to crashes and poor FPS.
Heatsink mass vs airflow (Score:1)
What is really interesting to me about these air-cooled Fury cards is that even though the PCB is much shorter than that of a typical flagship GPU card, the heatsinks being used extend the card out to the typical 12" length. Why is that interesting, you say? The power consumption of the card is on par with other AMD GCN cards, and when it comes to dissipating the associated heat, it still requires the same mass of copper and aluminum fins to avoid temperature spikes the associated fan accelerat
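The thermal-mass point can be made concrete with a one-node lumped model, dT/dt = (P - (T - T_amb)/R) / C, where airflow sets R (and thus steady state) and heatsink mass sets C (how fast spikes arrive). A sketch with illustrative values, not measurements of any real cooler:

```python
# One-node lumped thermal model of a heatsink: more thermal mass (C)
# smooths out power spikes, while airflow sets steady-state via R.
# The R and C values below are illustrative, not real measurements.

def simulate(power_w, r_c_per_w, c_j_per_c, t_amb=25.0, dt=0.1, steps=300):
    """Integrate dT/dt = (P - (T - T_amb)/R) / C; return final temp."""
    t = t_amb
    for _ in range(steps):
        t += (power_w - (t - t_amb) / r_c_per_w) / c_j_per_c * dt
    return t

# Same 275 W load, same airflow (R), light vs. heavy heatsink (C),
# 30 seconds into a load spike (300 steps of 0.1 s):
light = simulate(275, r_c_per_w=0.18, c_j_per_c=300)
heavy = simulate(275, r_c_per_w=0.18, c_j_per_c=1200)

print(f"light heatsink after 30 s spike: {light:.0f} C")  # ~46 C
print(f"heavy heatsink after 30 s spike: {heavy:.0f} C")  # ~31 C
# Steady state is identical (t_amb + P*R ~= 74.5 C); mass only buys time.
```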
Performance only par at 4K (Score:5, Interesting)