AMD's New Radeons Revisit Old Silicon, Enable Dormant Features
crookedvulture writes "The first reviews of AMD's Radeon R7 and R9 graphics cards have hit the web, revealing cards based on the same GPU technology used in the existing HD 7000 series. The R9 280X is basically a tweaked variant of the Radeon HD 7970 GHz Edition priced at $300 instead of $400, while the R9 270X is a revised version of the Radeon HD 7870 for $200. Thanks largely to lower prices, the R9 models compare favorably to rival GeForce offerings, even if there's nothing exciting going on at the chip level. There's more intrigue with the Radeon R7 260X, which shares the same GPU silicon as the HD 7790 for only $140. Turns out that graphics chip has some secret functionality that's been exposed by the R7 260X, including advanced shaders, simplified multimonitor support, and a TrueAudio DSP block dedicated to audio processing. AMD's current drivers support the shaders and multimonitor mojo in the 7790 right now, and a future update promises to unlock the DSP. The R7 260X isn't nearly as appealing as the R9 cards, though. It's slower overall than not only GeForce GTX 650 Ti Boost cards from Nvidia, but also AMD's own Radeon HD 7850 1GB. We're still waiting on the Radeon R9 290X, which will be the first graphics card based on AMD's next-gen Hawaii GPU."
More reviews available from AnandTech, Hexus, Hot Hardware, and PC Perspective.
So does that mean... (Score:2)
Re: (Score:1)
Summary for those who missed it and got right to commenting: go ahead and try it, let us know how it goes!
Do the kids still chase the newest video card? (Score:2)
Or have we reached a diminishing return point and/or a point where money is being spent elsewhere (consoles, mobile, tablets, etc)?
Re:Do the kids still chase the newest video card? (Score:5, Informative)
Or have we reached a diminishing return point and/or a point where money is being spent elsewhere (consoles, mobile, tablets, etc)?
The problem is that PC games have been crippled for years by being developed on consoles and ported to PCs. Some do take advantage of the extra power of PC GPUs, but the majority will run fine on a GPU that's several years old, because it's more powerful than the crap in the consoles.
Re: (Score:2)
Just wait until games are being made for the PS4 and Xbox One. I'm looking forward to the optimizations and the fact that they'll be the same architecture as discrete cards. Hopefully that means game developers will allow their games to scale more, since it shouldn't really be much work and they don't need to port them.
Re: (Score:3)
It will certainly be an improvement, but from what I've read they're only comparable to current mid-range PC GPUs. By the time many games are out, a high-end gaming PC will still be several times as powerful.
Re: (Score:2)
You mean a machine that costs 4-5x what the console costs will be more powerful than the console? Shocking!
You mean you can't read and comprehend the thread before replying to it? Shocking!
Re: (Score:2)
Re: (Score:3)
The gripe is not that consoles are less powerful than PCs. The gripe is that many games are designed around the limitations of consoles and don't take advantage of all of the power in a PC. Back in the days of yore, new games would be able to take advantage of cutting edge GPUs. Now they (often) don't.
I'm just restating the OP [slashdot.org], who said it very clearly himself:
The problem is that PC games have been crippled for years by being developed on consoles and ported to PCs. Some do take advantage of the extra power of PC GPUs, but the majority will run fine on a GPU that's several years old, because it's more powerful than the crap in the consoles.
So yes, learn to read.
Re: (Score:2)
The gripe is not that consoles are less powerful than PCs. The gripe is that many games are designed around the limitations of consoles and don't take advantage of all of the power in a PC.
If you would read the very next post in the subthread, (here it is [slashdot.org]), it has a reasonable response to that gripe. Here, I'll quote it for you:
I'm looking forward to the optimizations and the fact that they'll be the same architecture as discrete cards. Hopefully that means game developers will allow their games to scale more since it shouldn't really be much work and they don't need to port them.
If that's hard to understand, I'll explain it. The console GPU architecture is basically PC GPU architecture, even though it's not quite as powerful as the best PC graphics cards. So the effort required by the game developers to use better PC hardware is hopefully low since it should be a pretty natural extension of what they're already doing for the consoles, as op
Re: (Score:2)
I didn't really understand this to be anything other than complaining about the consoles being underpowered...
Yes, you didn't really understand. The OP did not acknowledge that having console architecture closer to PC architecture solves the problem. He acknowledged that it is an improvement, not a solution. As an example, Xbox has always had hardware architecture pretty similar to a PC, but that does not mean games ported from Xbox to PC take good advantage of high-end PC hardware. It still takes more work for the dev team to create higher quality assets (textures, models, etc) and make use of advanced HW features
Re: (Score:2)
He acknowledged that it is an improvement, not a solution.
And then instead of saying any of the things that you said as to why it's not a complete solution (which are reasonable rebuttals), he just complained about underpowered console GPUs and compared them to high-end PC GPUs.
What he really did was read the first line of the response, ignore the rest of it, and assume that the argument the response was making was that games written for the new consoles would be better only because they would be targeting a GPU that is more powerful than the last-gen consoles'.
Re:Do the kids still chase the newest video card? (Score:5, Insightful)
It's true that the OP's comment did not give much explanation, but it at least had a constructive tone to it. Your response, however, was sarcastic and insulting. You have some good insight. Your comment history shows a lot of intelligence, but so much of your energy seems to go into belittling others. If you take a more constructive approach, you'll reach a lot more people. Occasionally a sarcastic remark can be an effective way to make a point, but it usually just turns people away and makes your effort go to waste.
Re: (Score:2)
> from what I've read they're only comparable to current mid-range PC GPUs.
Yeah. But that's still shit-loads better than a 10 year old high-end PC GPU.
Re: (Score:1)
My ass. PC games are notoriously unoptimized because you can throw more hardware at the problem.
Graphics APIs these days are basically just a way of getting shaders into the GPU. Odds are, pretty much the same shaders are running on the PC as on the console, so there's no room to 'unoptimize' them (see the sketch below).
And, on the CPU side, I rarely see mine more than 20% used when playing games. So they're not 'unoptimized' there, either.
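To make the earlier point about graphics APIs concrete, here's a minimal sketch, assuming the Python glfw and PyOpenGL packages (neither of which is mentioned in the thread): the host code does little more than hand GLSL source to the driver, which compiles it for whatever GPU happens to be installed.

import glfw
from OpenGL.GL import (glCreateShader, glShaderSource, glCompileShader,
                       glGetShaderiv, GL_FRAGMENT_SHADER, GL_COMPILE_STATUS)

# Trivial fragment shader; the same source could be fed to a console's GL-like API.
FRAG_SRC = """
void main() { gl_FragColor = vec4(1.0, 0.5, 0.2, 1.0); }
"""

# A hidden 1x1 window exists only to give the driver an OpenGL context.
glfw.init()
glfw.window_hint(glfw.VISIBLE, glfw.FALSE)
window = glfw.create_window(1, 1, "ctx", None, None)
glfw.make_context_current(window)

shader = glCreateShader(GL_FRAGMENT_SHADER)
glShaderSource(shader, FRAG_SRC)   # hand the shader source to the driver
glCompileShader(shader)            # the driver compiles it for the local GPU
print("compiled:", bool(glGetShaderiv(shader, GL_COMPILE_STATUS)))
glfw.terminate()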
Re: (Score:3)
And then there's Star Citizen. Honestly, I'm looking at purchasing my first PC for gaming in over a decade. The last PC I bought primarily for gaming purposes was around 2001. Then the studios stopped producing the flight and space-combat sims and FPSs like the original Rainbow Six and Ghost Recon games I liked to play.
I've been looking around. I have a 3-year-old desktop here, and I'm thinking $150 for a new PSU and a 7xxx AMD card will get me through the beta for Star Citizen.
So I've just started looking
Re: (Score:2)
Or have we reached a diminishing return point and/or a point where money is being spent elsewhere (consoles, mobile, tablets, etc)?
The problem is that PC games have been crippled for years by being developed on consoles and ported to PCs. Some do take advantage of the extra power of PC GPUs, but the majority will run fine on a GPU that's several years old, because it's more powerful than the crap in the consoles.
But most of the nice visual effects like antialiasing are done in the drivers without the game needing to know too much about it, so this is not necessarily true. Also, there are plenty of companies that develop for PC and then port to consoles. Compare Skyrim on the PC on an Nvidia 680 or 780 to running on the Xbox 360 to see the difference. Another example of a game that looked far better on PC than on consoles is BF3.
Maybe you should have caveated your post by saying that a lot of crap studios release crippled
Re: (Score:2)
Re: (Score:2)
They certainly do, beyond all sense. If you have a 1920x1080 monitor, there is only so much GPU power you need for all current games at max detail. That doesn't stop people from spending far too much.
As someone who just went from an Nvidia 480 to an Nvidia 780, I noticed an improvement. Firstly, I could turn on full antialiasing, which made a big difference in things like Skyrim and Black Ops 2. I imagine it will make an even bigger difference in Ghosts, which comes out next month and is what I really bought the card for.
Re: (Score:2)
Once in a while I check Tom's Hardware for the video card roundups, and they all seem priced accordingly. There is no longer a "best bang for your buck" card. A $70 Nvidia card performs as well as a $70 AMD card.
Re: (Score:2)
Once in a while I check Tom's Hardware for the video card roundups, and they all seem priced accordingly. There is no longer a "best bang for your buck" card. A $70 Nvidia card performs as well as a $70 AMD card.
The last bang-for-buck video card that I had was the GeForce4 Ti 4200 64MB. It lasted roughly 3 years before I migrated to a 6600.
Re: (Score:1)
I bought an HD 6970 (used from eBay) just two weeks ago. Really enjoying it so far. The new cards need PCIe 3.0 and this old mobo can't do that : / It seems like a GPU upgrade every two years is good enough. CPU upgrades are super easy too if the socket is long-lived. Just wait until a CPU model goes into the bargain bin, which doesn't take very long at all.
Re: (Score:1)
280x requires PCI Express x16 3.0 : /
Re: (Score:1)
Supports and requires are not the same thing.
Re: (Score:2)
The new cards need PCIe 3.0
no
Re: (Score:1)
All of the 7xxx series required PCIe 3.0: http://www.amd.com/US/PRODUCTS/DESKTOP/GRAPHICS/7000/7730/Pages/radeon-7730.aspx#2 [amd.com]
The new cards are rebranded 7xxx.
Re:Do the kids still chase the newest video card? (Score:4, Informative)
no, they support, not require
Marketing Numbers (Score:5, Insightful)
Why didn't AMD's Marketing team name these 8000 series cards? Do they keep changing the naming scheme to be intentionally confusing?
Re: (Score:2)
At least Dell fixed this recently with *most* of their enterprise laptops.
A 6430, for example, is a series-6 laptop with a 14-inch screen (the "4") in its 3rd revision.
I have no clue what a 7970 is, or how it compares to an R7-260.
Re: (Score:3)
ATI's numbers were sane for quite a while. In the Radeon X and HD series, model numbers were four digits (ABCD), such as the Radeon HD 5770 (decoded in the sketch below):
A: Generation. A 7xxx card is newer than a 5xxx card.
B: Chip series. All chips in a generation with the same B number (x9xx) were based on the same GPU.
C: Performance level. A lower number was clocked slower than a higher one (so a 7750 was slower than a 7770). Exception: the x990 was a dual-GPU card.
D: Always 0.
So, to compare ATI cards, an x770 was slower than an (x+1)770, which was
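A tiny Python sketch of that ABCD breakdown, just to make the decoding concrete (the field names are my own shorthand, not AMD terminology):

def decode_hd_model(model):
    """Split a four-digit Radeon HD model number into the ABCD fields above."""
    a, b, c, d = (int(ch) for ch in f"{model:04d}")
    return {
        "generation": a,   # A: a 7xxx card is newer than a 5xxx card
        "chip": b,         # B: same B within a generation = same GPU
        "performance": c,  # C: higher = higher clocks (x990 = dual GPU)
        "trailing": d,     # D: always 0
    }

print(decode_hd_model(7750))  # slower than the 7770 from the same generation
print(decode_hd_model(5770))  # older generation than a 7770, same segment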
Re:Marketing Numbers (Score:5, Informative)
ATI/AMD has actually been consistent for several years now - they're only just now changing their scheme.
The old system was a four-digit number. The first digit is the generation - a 7950 is newer than a 6870, and way newer than a 4830 or a 2600. The next two digits are how powerful the card is within the generation - roughly, the second digit is the market segment, and the third is the model within that segment. They did tend to inflate numbers over time - the top-end single-GPU cards of each generation were the 2900 XT, the 3870, the 4890, the 5870, the 6970, and the 7970 GHz Edition. Put simply, if you sort by the middle two digits within a generation, you also order by both power and price.
The fourth digit is always a zero. Always. I don't know why they bother.
Sometimes there's a suffix. "X2" used to mean a dual-GPU card, cramming two processors onto one board, but now those get a separate model number (they also only do that for the top GPU now, because they've found it's not worth it to use two weaker processors). "GE" or "GHz Edition" was used on some 7xxx models because Nvidia beat them pretty heavily with their 6xx series release, so AMD had to rush out cards that were essentially overclocked high enough to beat them. "Eyefinity Edition" used to be a thing; mainly it just meant the card had a shitload of mini-DP outputs so you could do 3x2 six-monitor surround setups, which AMD was (and is) trying to push. And there were some "Pro" or "XT" models early on, but those were not significant.
Now forget all that, because they're throwing a new one out.
It's now a two-part thing, rather like what Intel does with their CPUs. "R9" is their "Enthusiast" series, for people with too much money. Within that, you have six models: the 270, 270X, 280, 280X, 290 and 290X. They haven't fully clarified things, but it seems that the X models are the "full" chip, while the non-X models have some cores binned off and slightly lower clocks. Other than that, it's a fairly straightforward list - the 290 beats the 280X, which beats the 280, which beats the 270X, and so on. Under those is the "R7" "gamer" series, which so far has the 240 through 260X, and an R5 230 model is listed on Wikipedia even though I've not seen it mentioned elsewhere.
Sadly, it's still a bit more complicated. See, some of the "new" ones are just the old ones relabeled. They're all the same fundamental "Graphics Core Next" architecture, but some of them have the new audio DSP stuff people are excited about. And it's not even a simple "everything under this is an old one lacking new features" - the 290X and 260X have the new stuff, but the 280X and 270X do not. And it gets worse still, because the 260X actually is a rebadge; it's just that they're enabling some hardware functionality now (the 290X, as far as anyone can tell, actually is a genuine new chip). So far, everything is 2__, so I would assume the first digit in this case is still the generation (there's a rough sketch of all this at the end of this comment).
Oh, and there actually are some 8xxx series cards. There were some mobile models released (forgot to mention - an M suffix means mobile, and you can't directly compare numbers between them; a 7870 and a 7870M are not the same), and it looks like there are some OEM-only models on the desktop.
But yeah, it is a bit daunting at first, especially since they're transitioning to a new scheme very abruptly (people were expecting an 8xxx and 9xxx series before a new scheme). But not much has really changed - you just need to figure out which number is the generation and which is the market segment, and you're good.
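For what it's worth, here's a rough Python sketch of the new two-part names and the rebadges mentioned above; the tier labels and the mapping come from this thread and the story summary, and the R5 label is just a guess.

import re

TIERS = {"R9": "enthusiast", "R7": "gamer", "R5": "entry-level (guess)"}

# New name -> the old HD part it corresponds to, per the summary and this thread.
REBADGES = {
    "R9 280X": "HD 7970 GHz Edition",
    "R9 270X": "HD 7870",
    "R7 260X": "HD 7790 (with previously dormant features enabled)",
}

def decode_r_series(name):
    """Break an R-series name into tier, generation, segment, and X/non-X."""
    tier, model = name.split()
    digits = re.fullmatch(r"(\d)(\d)0(X?)", model)
    return {
        "tier": TIERS[tier],
        "generation": int(digits.group(1)),   # everything so far is 2xx
        "segment": int(digits.group(2)),      # higher beats lower within a tier
        "full_chip": digits.group(3) == "X",  # X = full chip; non-X is cut down
        "rebadge_of": REBADGES.get(name),     # None for genuinely new silicon
    }

print(decode_r_series("R9 290X"))  # new Hawaii chip, so not in REBADGES
print(decode_r_series("R7 260X"))  # same GPU as the HD 7790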
Re: (Score:2)
Re:Marketing Numbers (Score:5, Informative)
Because there already is an 8000 series, which is a rebadge of the 7000 series. They rebadged so much that they ran out of numbers.
Re: (Score:3)
Why didn't AMD's Marketing team name these 8000 series cards? Do they keep changing the naming scheme to be intentionally confusing?
Because there's a psychological barrier to naming a card 10,000 or higher, and as you approach that, the effect starts to show. It diminishes the numbers in your mind and makes them "pop" less. In certain people's minds, going from a 7000 series to an 8000 series means more than going from a 10,000 series to an 11,000 series. The other option was to start using k, but then how do you differentiate different cards in the 10k series? 10k1? 10k2? Now you're in a different area where people don't want
Re: (Score:2)
Actually, the problem was that they caught up to the first Radeons. Those actually started in the 7000's, but I didn't see as many of those around as the later 8000's and 9000's. It would have been way too confusing to have different Radeon 9600's around, even if the old one was a decade-old AGP part, so they went to a new scheme.
Incidentally, after the old 9000 series they went to "X" for 10, such as the X600, then later the "X1" which I guess meant 11, like the X1400. Then they decided it was just sill
Re:Marketing Numbers - A Brief History (Score:2)
Marketing loves dealing with superlatives. ATI started with the Graphics Wonder card. After a while, new cards came out, and more superlatives were required. Combinations of superlatives became the new convention, e.g. the VGA Wonder Plus and the Graphics Ultra Pro. After the 3D Pro Turbo Plus card, no one tried using superlatives again.
ATI then proceeded to start naming Radeon cards 7000, 8000 and 9000 series. After MIPS 10k, no one wanted numbers larger than 10,000. As such, ATI tried the Radeon 300 s
7790 gets no love (Score:5, Interesting)
The HD 7790 never seems to get any love in reviews -- it is always pointed out that it's slower than such and such, or more expensive than such and such... missing the point entirely.
The HD 7790 is only 85 watts. It is often compared against the GTX 650 Ti, which is 110 watts and only marginally better than the 7790 in some benchmarks (the regular GTX 650, however, is actually very competitive in power consumption, but is notably slower than the 7790 in most benchmarks).
Now we see this new R7 260X getting dumped on in the summary for essentially the same ignorant reasons. The R7 260X is supposed to use slightly less power than the 7790, but here it is being compared to cards that use 50%+ more power... essentially cards in a completely different market segment.
Reviewers are fucking retards.
Re: (Score:2)
Because most users don't care about power - they care about cost and performance. The reviewers are comparing to cards of similar cost.
The cost is only comparable if you don't factor in having to buy a new power supply or whatever.
I had a video card die, and a decent portion of the costs to replace it went into a power supply because the best bang-for-buck GPU was just not going to work on the power supply I had in the system (which could only supply a single PCI-E connector - and didn't really have much headroom to use an adapter). I had half-considered downgrading just for that reason.
By all means make the comparisons, but pointing out
wow (Score:2)
Dizzy (Score:2)
AMD/Radeon is dead (Score:1)
AMD/Radeon is dead. I was a big AMD/ATI guy for nearly a decade, but their drivers and compatibility issues just kept getting worse and worse and worse. Their multi-monitor support is terrible. Their support for hardware-accelerated video decoding took far too long to get straightened out. Their Linux drivers dropped support for the majority of their older cards, which is silly as the majority of Linux installs go on older computers. I have had ATI cards literally set 3 different motherboards on FIRE in the past
Re:AMD/Radeon is dead (Score:5, Insightful)
*shrugs* Everybody has their own experiences. I have a Core i5 2500k system with 16GB of RAM, and a Radeon HD 6970, and have never had a problem despite its age. It still runs all of my current games library without breaking a sweat (and that includes recent AAA titles on Steam running under WINE), and I've never had any of the issues you claim happened to yours.
In fact, I'm at a loss to explain how it's even possible for a video card to set your system on fire. You could blow some capacitors, I suppose, if you have a cheap motherboard with cheap caps; you could crater a chipset by sending too much voltage; you could even wreck a cold solder joint. But the flash point of the plastic they use to make motherboards is high enough that the system would have shut down for critical heat *long* before anything ever got hot enough to catch fire....
All of the above would be solved by not having a crap motherboard, btw... I've seen all of the symptoms I've listed in computers, but every single one of them was either a cheap motherboard or a cheap power supply, and not really anything the CPU vendor could have controlled... (I've seen them all in Intel systems as well as AMD)
Interesting, I just went from NVidia to ATI (Score:2)
Your story is interesting to read. I have recently bought an AMD 7870 card for my main desktop system. The main cause for me to switch was the OpenCL support that AMD has in their proprietary drivers. True, I have had trouble with multi-monitor support and stability that was only fixed (for me at least) very recently, and I contemplated switching back. However, with the latest drivers I have had no trouble so far, and the OpenCL performance I get out of the card is way better than a similarly priced Nvidia b
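As a side note, a quick way to check what OpenCL support the installed driver actually exposes is a short enumeration script; this is just a sketch, assuming the pyopencl package (not something the comment above mentions).

import pyopencl as cl

# List every OpenCL platform (e.g., AMD's proprietary driver) and its devices.
for platform in cl.get_platforms():
    print(platform.name, "-", platform.version)
    for device in platform.get_devices():
        print("   ", device.name,
              "| global memory:", device.global_mem_size // (1024 ** 2), "MiB")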