AMD's New Radeons Revisit Old Silicon, Enable Dormant Features

crookedvulture writes "The first reviews of AMD's Radeon R7 and R9 graphics cards have hit the web, revealing cards based on the same GPU technology used in the existing HD 7000 series. The R9 280X is basically a tweaked variant of the Radeon HD 7970 GHz Edition priced at $300 instead of $400, while the R9 270X is a revised version of the Radeon HD 7870 for $200. Thanks largely to lower prices, the R9 models compare favorably to rival GeForce offerings, even if there's nothing exciting going on at the chip level. There's more intrigue with the Radeon R7 260X, which shares the same GPU silicon as the HD 7790 for only $140. Turns out that graphics chip has some secret functionality that's been exposed by the R7 260X, including advanced shaders, simplified multimonitor support, and a TrueAudio DSP block dedicated to audio processing. AMD's current drivers support the shaders and multimonitor mojo in the 7790 right now, and a future update promises to unlock the DSP. The R7 260X isn't nearly as appealing as the R9 cards, though. It's slower overall than not only GeForce GTX 650 Ti Boost cards from Nvidia, but also AMD's own Radeon HD 7850 1GB. We're still waiting on the Radeon R9 290X, which will be the first graphics card based on AMD's next-gen Hawaii GPU." More reviews are available from AnandTech, Hexus, Hot Hardware, and PC Perspective.
  • ... that updating the BIOS on my 7870 might unlock these features?
    • by Anonymous Coward

      Summary for those who missed it and got right to commenting: go ahead and try it, let us know how it goes!

  • Or have we reached a diminishing return point and/or a point where money is being spent elsewhere (consoles, mobile, tablets, etc)?

    • by 0123456 ( 636235 ) on Tuesday October 08, 2013 @06:56PM (#45076297)

      Or have we reached a diminishing return point and/or a point where money is being spent elsewhere (consoles, mobile, tablets, etc)?

      The problem is that PC games have been crippled for years by being developed on consoles and ported to PCs. Some do take advantage of the extra power of PC GPUs, but the majority will run fine on a GPU that's several years old, because it's more powerful than the crap in the consoles.

      • Just wait until games are being made for the PS4 and Xbox One. I'm looking forward to the optimizations, and to the fact that they'll share the same architecture as discrete PC cards. Hopefully that means game developers will let their games scale up more, since it shouldn't really be much work and they won't need to port them.

        • by 0123456 ( 636235 )

          It will certainly be an improvement, but from what I've read they're only comparable to current mid-range PC GPUs. By the time many games are out, a high-end gaming PC will still be several times as powerful.

          • by wisty ( 1335733 )

            > from what I've read they're only comparable to current mid-range PC GPUs.

            Yeah. But that's still shit-loads better than a 10 year old high-end PC GPU.

      • And then there's Star Citizen. Honestly, I'm looking at purchasing my first PC for gaming in over a decade. The last PC I bought primarily for gaming purposes was around 2001. Then the studios stopped producing the flight and space-combat sims and FPSs like the original Rainbow Six and Ghost Recon games I liked to play.

        I've been looking around. I have a 3-year-old desktop here, and I'm thinking $150 for a new PSU and a 7xxx AMD card will get me through the beta for Star Citizen.

        So I've just started looking

      • Or have we reached a diminishing return point and/or a point where money is being spent elsewhere (consoles, mobile, tablets, etc)?

        The problem is that PC games have been crippled for years by being developed on consoles and ported to PCs. Some do take advantage of the extra power of PC GPUs, but the majority will run fine on a GPU that's several years old, because it's more powerful than the crap in the consoles.

        But most of the nice visual effects like antialiasing are done in the drivers without the game needing to know too much about it, so this is not necessarily true. Also, there are plenty of companies that develop for PC and then port to consoles. Compare Skyrim on a PC with an Nvidia 680 or 780 to Skyrim running on the Xbox 360 to see the difference. Another example of a game that looked far better on PC than on consoles is BF3.

        Maybe you should have caveated your post by saying that a lot of crap studios release crippled

    • I think we've hit a temporary lull, but you'll see renewed interest once newer, larger monitors start to enter the market. That is, I'm fine with my rig so long as I can play any game with the settings maxed; the rules will have to change once 4K monitors become the new norm.
    • Once in a while I check Tom's Hardware for the video card roundups, and the cards all seem to be priced according to performance. There is no longer a "best bang for your buck" card. A $70 Nvidia card performs about as well as a $70 AMD card.

      • Once in a while I check Tom's Hardware for the video card roundups, and the cards all seem to be priced according to performance. There is no longer a "best bang for your buck" card. A $70 Nvidia card performs about as well as a $70 AMD card.

        The last bang-for-buck video card that I had was the GeForce 4 Ti 4200 64MB. It lasted roughly 3 years before I migrated to a 6600

    • by tibman ( 623933 )

      I bought an HD 6970 (used from ebay) just two weeks ago. Really enjoying it so far. The new cards need PCIe 3.0 and this old mobo can't do that : / It seems like a GPU upgrade every two years is good enough. CPU upgrades are super easy too if the socket is long-lived. Just wait until a CPU model goes into the bargain bin, which doesn't take very long at all.

  • Marketing Numbers (Score:5, Insightful)

    by ScottCooperDotNet ( 929575 ) on Tuesday October 08, 2013 @06:51PM (#45076253)

    Why didn't AMD's Marketing team name these 8000 series cards? Do they keep changing the naming scheme to be intentionally confusing?

    • At least Dell fixed this recently with *most* of their enterprise laptops.

      A 6430, for example, is a series-6 laptop with a "4"-teen-inch screen in its 3rd revision.

      I have no clue what a 7970 is, or how it compares to an R7-260.

      • by armanox ( 826486 )

        ATI's numbers were sane for quite a while. In the Radeon X and HD series, model numbers were four digits (ABCD), such as the Radeon HD 5770:

        A: Generation name. A 7xxx card is newer than a 5xxx card
        B: Chip series. All chips in a generation with the same B number (x9xx) were based on the same GPU
        C: Performance level. A lower number was clocked slower than a higher one (so a 7750 was slower than a 7770). Exception: the x990 was a dual GPU chip
        D: Always 0

        So, to compare ATI cards, an x770 was slower than an (x+1)770, which was
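
        A minimal sketch of that ABCD decoding in Python (the field names are my own; the scheme is the one described above):

            def decode_radeon(model: int):
                """Decode a four-digit Radeon X/HD model number (ABCD scheme)."""
                generation = model // 1000         # A: higher = newer generation
                chip_series = (model // 100) % 10  # B: same B within a generation = same GPU
                performance = (model // 10) % 10   # C: higher = faster (x990 dual-GPU excepted)
                # D is always 0, so it carries no information
                return (generation, chip_series, performance)

            # The 7750-vs-7770 example from the comment above:
            assert decode_radeon(7750) < decode_radeon(7770)  # same gen/series, lower clock
            assert decode_radeon(5770) < decode_radeon(7770)  # older generation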

      • Re:Marketing Numbers (Score:5, Informative)

        by gman003 ( 1693318 ) on Wednesday October 09, 2013 @12:37AM (#45078567)

        ATI/AMD has actually been consistent for several years now - they're only just now changing their scheme.

        The old system was a four-digit number. First digit is generation - a 7950 is newer than a 6870, and way newer than a 4830 or a 2600. The next two digits are how powerful it is within the generation - roughly, the second digit is the market segment, and the third is which model within that segment, but that's rough. They did tend to inflate numbers over time - the top-end single-GPU cards of each generation were the 2900 XT, the 3870, the 4890, the 5870, the 6970, and the 7970GE. Put simply, if you sort by the middle two digits within a generation, you also order by both power and price.

        The fourth digit is always a zero. Always. I don't know why they bother.

        Sometimes there's a suffix. "X2" used to mean it's a dual-GPU card, cramming two processors onto one board, but now those get a separate model number (they also only do that for the top GPU now, because they've found it's not worth it to use two weaker processors). "GE" or "Gigahertz Edition" was used on some 7xxx models, because Nvidia beat them pretty heavily with their 6xx series release so AMD had to rush out some cards that were essentially overclocked high enough to beat them. "Eyefinity Edition" used to be a thing, mainly it just meant it had a shitload of mini-DP outputs so you could do 3x2 six-monitor surround setups, which AMD was (and is) trying to push. And there were some "Pro" or "XT" models early on, but those were not significant.

        Now forget all that, because they're throwing a new one out.

        It's now a two-part thing, rather like what Intel does with their CPUs. "R9" is their "Enthusiast" series, for people with too much money. Within that, you have six models: the 270, 270X, 280, 280X, 290 and 290X. They haven't fully clarified things, but it seems that the X models are the "full" chip, while the non-X model has some cores binned off and slightly lower clocks. Other than that, it's a fairly straightforward list - the 290 beats the 280X beats the 280 beats the 270X and so on. Under those are the "R7" "gamer" series, which so far has the 240 through 260X, and an R5 230 model is listed on Wikipedia even though I've not seen it mentioned elsewhere.

        Sadly, it's still a bit more complicated. See, some of the "new" ones are just the old ones relabeled. They're all the same fundamental "Graphics Core Next" architecture, but some of them have the new audio DSP stuff people are excited about. And it's not even a simple "everything under this is an old one lacking new features" - the 290X and 260X have the new stuff, but the 280X and 270X do not. And it gets worse still, because the 260X actually is a rebadge, it's just that they're enabling some hardware functionality now (the 290X actually is a genuine new chip as far as anyone can tell). So far, everything is 2__, so I would assume the first digit in this case is still the generation.

        Oh, and there actually are some 8xxx series cards. There were some mobile models released (forgot to mention - an M suffix means mobile, and you can't directly compare numbers between them. A 7870 and 7870M are not the same.), and it looks like some OEM-only models on the desktop.

        But yeah, it is a bit daunting at first, especially since they're transitioning to a new schema very abruptly (people were expecting an 8xxx and 9xxx series before a new schema). But not much has really changed - you just need to figure out which number is the generation, and which is the market segment, and you're good.
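
        For the new two-part names, a quick sketch of the same idea (the regex and sort key are my own; tier R9 > R7 > R5, then the three-digit model, with the X suffix breaking ties):

            import re

            def parse_r_series(name: str):
                """Parse names like 'R9 280X' into a sortable (tier, model, x) tuple."""
                m = re.fullmatch(r"R(\d)\s*(\d{3})(X?)", name.strip(), re.IGNORECASE)
                if not m:
                    raise ValueError("not an R-series name: " + name)
                return (int(m.group(1)), int(m.group(2)), m.group(3).upper() == "X")

            cards = ["R7 260X", "R9 290", "R9 280X", "R7 240", "R9 270X"]
            print(sorted(cards, key=parse_r_series))
            # ['R7 240', 'R7 260X', 'R9 270X', 'R9 280X', 'R9 290']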

    • Re:Marketing Numbers (Score:5, Informative)

      by edxwelch ( 600979 ) on Tuesday October 08, 2013 @07:53PM (#45076729)

      Because there already is an 8000 series, which is a rebadge of the 7000 series. They rebadged so much that they ran out of numbers.

    • Why didn't AMD's Marketing team name these 8000 series cards? Do they keep changing the naming scheme to be intentionally confusing?

      Because there's a psychological barrier to naming a card 10,000 or higher, and as you approach that, the effect starts to show. It diminishes the numbers in your mind and makes them "pop" less. In certain people's minds, going from a 7000 series to an 8000 series means more than going from a 10,000 series to an 11,000 series. The other option was to start using k, but then how do you differentiate different cards in the 10k series? 10k1? 10k2? Now you're in a different area where people don't wan

      • Actually, the problem was that they caught up to the first Radeons. Those actually started in the 7000's, but I didn't see as many of those around as the later 8000's and 9000's. It would have been way too confusing to have different Radeon 9600's around, even if the old one was a decade-old AGP part, so they went to a new scheme.

        Incidentally, after the old 9000 series they went to "X" for 10, such as the X600, then later the "X1" which I guess meant 11, like the X1400. Then they decided it was just sill

    • Marketing loves dealing with superlatives. ATI started with the Graphics Wonder card. After a while, new cards came out, and more superlatives were required. Combinations of superlatives became the new convention, e.g. the VGA Wonder Plus and the Graphics Ultra Pro. After the 3D Pro Turbo Plus card, no one tried using superlatives again.

      ATI then proceeded to start naming Radeon cards 7000, 8000 and 9000 series. After MIPS 10k, no one wanted numbers larger than 10,000. As such, ATI tried the Radeon 300 s

  • 7790 gets no love (Score:5, Interesting)

    by Anonymous Coward on Tuesday October 08, 2013 @06:54PM (#45076279)

    The HD 7790 never seems to get any love in reviews -- it is always pointed out that it's slower than such and such, or more expensive than such and such... missing the point entirely

    The HD 7790 is only 85 watts. It is often compared against the GTX 650 Ti, which is 110 watts and only marginally better than the 7790 in some benchmarks (the regular GTX 650, however, is actually very competitive in power consumption, but is notably slower than the 7790 in most benchmarks)

    Now we see this new R7 260X getting dumped on in the summary for essentially the same ignorant reasons. The R7 260X is supposed to use slightly less power than the 7790, but here it is being compared to cards that use 50%+ more power... essentially cards in a completely different market segment.

    Reviewers are fucking retards.
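
    To put the power figures in perspective (a quick sketch; the 85 W and 110 W numbers are the TDPs cited above, while the ~130 W for a card "in a completely different market segment" is an assumed figure for illustration):

        # Board power relative to the HD 7790's 85 W TDP
        base = 85  # HD 7790
        for card, watts in [("GTX 650 Ti", 110), ("assumed bigger card", 130)]:
            print(f"{card}: {watts} W, {(watts - base) / base:+.0%} vs. the HD 7790")
        # GTX 650 Ti: 110 W, +29% vs. the HD 7790
        # assumed bigger card: 130 W, +53% vs. the HD 7790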

  • Talk about alphabet soup! Or, in this case, alphanumeric soup.
  • Is it just that I'm tired from a long day at work, or is the summary really that incoherent, disorganized and lacking even a rudimentary grasp of proper sentence structure? I hate to be one of those "Nazis" but I shouldn't have to read and re-read the bloody summary to tease meaning out of it.
  • AMD/Radeon is dead. I was a big AMD/ATI guy for nearly a decade, but their drivers and compatibility issues just kept getting worse and worse and worse. Their multi-monitor support is terrible. Their support for hardware-accelerated video decoding took far too long to get straightened out. Their Linux drivers dropped support for the majority of their older cards, which is silly, as the majority of Linux installs go on older computers. I have had ATI cards literally set 3 different motherboards on FIRE in the pas

    • by realityimpaired ( 1668397 ) on Tuesday October 08, 2013 @10:26PM (#45077749)

      *shrugs* Everybody has their own experiences. I have a Core i5 2500k system with 16GB of RAM, and a Radeon HD 6970, and have never had a problem despite its age. It still runs all of my current games library without breaking a sweat (and that includes recent AAA titles on Steam running under WINE), and I've never had any of the issues you claim happened to yours.

      In fact, I'm at a loss to explain how it's even possible for a video card to set your system on fire. You could blow some capacitors, I suppose, if you have a cheap motherboard with cheap caps; you could crater a chipset by sending too much voltage; you could even wreck a cold solder joint; but the flash point on the plastic they use to make motherboards is high enough that the system would have shut down for critical heat *long* before it ever got hot enough to set the silicon on fire....

      All of the above would be solved by not having a crap motherboard, btw... I've seen all of the symptoms I've listed in computers, but every single one of them was either a cheap motherboard or a cheap power supply, and not really anything the CPU vendor could have controlled... (I've seen them all in Intel systems as well as AMD)

    • Your story is interesting to read. I have recently bought an AMD 7870 card for my main desktop system. The main reason for me to switch was the OpenCL support that AMD has in their proprietary drivers. True, I have had trouble with multi-monitor support and stability that was only fixed (for me at least) very recently, and I contemplated switching back. However, with the latest drivers, I have had no trouble so far, and the OpenCL performance I get out of the card is way better than a similarly priced NVidia b
