AMD Unveils New Family of GPUs: Radeon R5, R7, R9 With BF 4 Preorder Bundle

MojoKid writes "AMD has just announced a full suite of new GPUs based on its Graphics Core Next (GCN) architecture. The Radeon R5, R7, and R9 families are the new product lines aimed at mainstream, performance, and high-end gaming, respectively. Specs on the new cards are still limited, but we know that the highest-end R9 290X is a six-billion-transistor GPU with more than 300GB/s of memory bandwidth and prominent support for 4K gaming. The R5 series will start at $89 with 1GB of RAM. The R7 260X will hit $139 with 2GB of RAM; the R9 270X and 280X appear to replace the current Radeon 7950 and 7970, with price points of $199 and $299 and 2GB/3GB of RAM; and the R9 290X, at an unannounced price point, will carry 4GB of RAM. AMD is also offering a limited preorder pack that combines a Battlefield 4 license with the graphics cards, which should go on sale in the very near future. Finally, AMD is debuting a new positional and 3D spatial audio engine, developed in conjunction with GenAudio and dubbed 'AstoundSound,' but they're only making it available on the R9 290X, R9 280X, and R9 270X."
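For a sense of scale on those headline numbers, a rough back-of-the-envelope sketch (the 4-bytes-per-pixel framebuffer format is an illustrative assumption; only the 300GB/s figure comes from the announcement):

```python
# Back-of-the-envelope: how many times per second could 300 GB/s of
# memory bandwidth theoretically touch a full 4K framebuffer?
# The 32-bit (4-byte) pixel format is an assumption for illustration.
width, height = 3840, 2160           # 4K UHD resolution
bytes_per_pixel = 4                  # assumed 32-bit RGBA
frame_bytes = width * height * bytes_per_pixel
bandwidth_bytes_per_s = 300e9        # 300 GB/s, from the summary
touches_per_second = bandwidth_bytes_per_s / frame_bytes
print(f"4K framebuffer: {frame_bytes / 1e6:.1f} MB")
print(f"Full-frame memory touches/sec: {touches_per_second:.0f}")
```

Real rendering reads and writes each pixel many times per frame (plus textures and geometry), which is why raw bandwidth on this order is pitched as a 4K-gaming feature.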
  • What happened to the Radeon R4, R6, R8?
  • by Fwipp ( 1473271 ) on Wednesday September 25, 2013 @08:45PM (#44955035)

    Wow, that $89 R5 actually looks surprisingly attractive. If the benchmarks hold up, I might think about replacing the old power-hungry card I've got in my main desktop machine right now - I'd probably save energy and get better performance to boot.

    • Wow, that $89 R5 actually looks surprisingly attractive

      I'm guessing that /. is the only website where this comment would find general agreement.

    • Maybe you should instead buy a new PC to replace the power-hungry CPU and small storage you have, and gain faster graphics from the GPU built into the CPU? Without giving specs of what you are replacing, your comment sounds awfully like a fake review.
  • by epyT-R ( 613989 )

    that work reliably for more than the current crop of just released games, I don't care how much faster these chips are. I've had too many glitches with radeon drivers over the years to consider them again. Their opengl is horrible, and CCC is a bloated pos.

    • by LesFerg ( 452838 ) on Wednesday September 25, 2013 @09:00PM (#44955145) Homepage

      Yeah I feel the same way about their driver support, couldn't trust them with too much of my limited gaming hardware budget.
      Also, would it be really really difficult for them to hire some decent programmers and produce a new version of Catalyst control center that doesn't have to run on .Net?
      Whatever happened to C++ and fast reliable software?

      • Re: (Score:2, Informative)

        by epyT-R ( 613989 )

        What happened? Point-and-click software 'development.' Visual Basic on steroids (.NET), and huge interpreted runtimes (Python/PHP/Ruby/.NET/ad nauseam) being used to write programs that could be done in a few dozen lines of C or shellcode.

        This disease is everywhere. Basic system software should have as few dependencies as possible. GNU land suffers from this too. Honestly if CCC was the only problem, I could live with it.

        • by Nemyst ( 1383049 )
          Interfaces in a few dozen lines of C? Hahahaha.
        • by MojoMagic ( 669271 ) on Thursday September 26, 2013 @01:54AM (#44956819)
          A couple of problems with this statement:
          - .Net is not a programming language. Your comparison is just silly.
          - In case you meant to refer to C#, no part of this development process is "point-and-click". In this regard, it is no different to C++ (I develop in both).
          - It is not interpreted. Nor has it ever been.
          - I think you'll find that the simple programs of "a few dozen lines" that you mention would likely be smaller (# of lines) in C# than C++. But, again, this is a silly comparison and shouldn't be used in any reasonable argument. If things like this are a problem, you are just using the wrong libraries; in most cases it has little to do with the language directly.
          • - I think you'll find that the simple programs of "a few dozen lines" that you mention would likely be smaller (# of lines) in C# than C++. But, again, this is a silly comparison and shouldn't be used in any reasonable argument. If things like this are a problem, you are just using the wrong libraries; in most cases it has little to do with the language directly.

            I'm also a professional software developer, but if you stop to think about it, I think you'll find that MSVCRT (the Microsoft Visual C++ Runtime) is significantly smaller than the .NET Framework that MSIL applications can't run without. I'm not sure how the Mono runtime compares in size.
            Most of the time this won't be important, especially given that .NET framework is pre-installed on Windows, but depending on your project and targeted runtime environment it could be a factor.

        • You list Python/PHP/Ruby and then talk about shellcode as an alternative? Huh?
          • by Xest ( 935314 )

            This is what happens when non-programmers try and converse about programming whilst having a certain arrogance that prevents them seeing that they're way out of their depth.

            He also listed .NET as interpreted which is flat out wrong (unless you're talking about code executing on the DLR but given his level of understanding I doubt he even knows what that is).

            The idea you can write more concise code in C or Shellcode than the others is also laughable. The fact you have to explicitly write code for dynamic mem

      • by Xest ( 935314 )

        "Whatever happened to C++ and fast reliable software?"

        That makes no sense; C# and .NET let you write fast, reliable software. In fact, the very nature of .NET means it's more reliable by default, because it has better handling of common programming mistakes, and the JIT compiler means you can get equally good performance out of it. .NET even gives you a decent amount of control over the GC, so it is less plagued by GC issues than, say, Java, and even Java can perform equally well otherwise.

        CCC's problems have nothing to do with the development environment, language, and framework used.

        • by LesFerg ( 452838 )

          ... CCC's problems have nothing to do with the development environment, language, and framework used ...

          Well, I seem to remember repeated faults with mismatched .Net library dependencies, and somehow ending up with a CCC installation that would not load its user interface, could not be fixed by uninstalling and reinstalling, and wasted many days of effort. But I guess you are right: it takes a special kind of developer to make such a poor hash-up of a user interface.

    • Wish I had mod points for you. This is absolutely true. Going as far back as 2003, CCC was bloated, buggy, and aggravating to install. Particularly annoying was that .NET was required: that meant having to install updates and .NET in default VGA mode before you could load the drivers on a fresh format/reinstall of XP. Not sure if that's still the case, though. Yuck!

      You can have the best GPU in the world. It doesn't do much good without quality drivers however.

    • NVidia drivers tend to be worse nowadays.

      I hear them on here saying how crappy Windows 7 is because Aero brings their GTX 680s to a crawl. Funny, my parents' Intel GMA 950 integrated 2007-era graphics runs Aero fine. Again, driver issues.

      I only had one bug with my ATI drivers, and if you base your data on 10 years ago, then it is obsolete.

      • by epyT-R ( 613989 )

        I based it starting at 10 years ago.. actually it starts with the ATI rage 128 which came out in 98(?), through to the radeon 5000 series. that 128 used to bsod windows on a regular basis with opengl applications (eg quake2). Years later, a litany of broken scenes, kernel panics due to unhandled exceptions (HANDLE YOUR DAMNED EXCEPTIONS!), tearing in video playback, completely broken support in non-game accelerated applications, etc, have kept me far far away. There's a reason adobe, autodesk et al, (and

        • by Mashiki ( 184564 )

          I do know some were having some issues with post 314.x drivers, but I didn't run into any.

          Some? Anything post-290.x has been complete crap on 400-600 series cards. At best, they might be stable; at worst, you're going to see amazing hardlocks which require a complete powerdown to fix. The last time I looked on the nvidia forums, there was a thread on this issue with nearly 140k views. Funny enough, it was the bad drivers that broke me: I dumped my 560ti for a 7950, and I have no complaints about doing so.

  • Mantle API (Score:5, Interesting)

    by LordMyren ( 15499 ) on Wednesday September 25, 2013 @08:51PM (#44955079) Homepage

    Personally I would've gone for a mention of Mantle, the proprietary API they are introducing that sidesteps OpenGL and DirectX. I don't really know what it does yet, haven't found good coverage, but DICE's Battlefield 4 is mentioned as using it, and the description I've read said it enables a faster rate of issuing draw calls.

    • by tibman ( 623933 )

      Windows only (for now), blah. Still really exciting though! I remember glide being pretty awesome back in the day. It's funny that NVIDIA bought 3dfx and got glide but it is AMD that built a new low-level api. NVIDIA's NVAPI doesn't seem like an openGL or directX replacement but a helper of sorts for managing all kinds of stuff on the card.

    • by PhrostyMcByte ( 589271 ) <> on Wednesday September 25, 2013 @10:25PM (#44955667) Homepage

      The idea is that operating systems introduce a huge amount of overhead in the name of security. Being general purpose, they view their primary role as protecting all the other apps from your unstable app. And, let's face it, even AAA games these days are plagued with issues -- I'm really not sure I want games to have low-level access to my system. Going back to the days of Windows 98's frequent bluescreens isn't on my must-have list of features.

      John Carmack has been complaining about this for years, saying this puts PCs at such a tremendous disadvantage that consoles were able to run circles around PCs when it came to raw draw calls until eventually they simply brute-forced their way past the problem.

      Graphics APIs have largely gone a route that encourages keeping data and processing out of the OS. That's definitely the right call, but there are always things you'll need to touch the CPU for. I'm curious exactly how much of a benefit we'll see in modern games.

      • by epyT-R ( 613989 ) on Wednesday September 25, 2013 @11:23PM (#44956021)

        1. Today's consoles also run protected-mode (or architecture-specific equivalent) operating systems. The userland-to-kernel-to-hardware latencies are still present.

        2. You're complaining about games? Today's operating systems are hardly any better off. There is no way the vendors can vouch for the security of 10GB worth of libraries and executables in Windows 7 or OSX. The same is true for OSS. Best practice is to just assume every application and system you're using is compromised or compromisable, and mitigate accordingly.

        3. IIRC that particular Carmack commentary was done to hype up the new-gen systems. It's largely bogus. I'm sure the latencies between the Intel on-die HD 5000 GPU and CPU are lower, but that doesn't mean it's going to perform better overall. Same thing goes for the AMD Fusion chips used in the new consoles: they're powerful for their size and power draw, but they will not outperform current gaming PC rigs.

        • 3. Sorry, you're completely wrong. Try making 10k calls per frame (30fps) on a modern high-end PC (use the latest Haswell i7 + 7970/780 if you want), and your game will still slow down quite a bit due to single-threaded CPU overhead. On the PS3 (which is over 6 years old now?), you can make hundreds of thousands of draw calls per frame on libGSM and get away with it just fine. Why? Simple: every time you make a draw call on OpenGL/DX, you have to validate/potentially flush/sync all sorts of states. On libGS
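The frame-budget arithmetic behind that argument can be sketched quickly (the 2-microsecond per-draw-call CPU cost is a hypothetical illustrative figure, not a measured number for any API):

```python
# Sketch of the draw-call overhead argument: at 30 fps, how much of the
# frame's CPU budget would 10k draw calls eat if each call cost an
# assumed 2 microseconds of validation/sync overhead?
frame_budget_ms = 1000.0 / 30        # ~33.3 ms per frame at 30 fps
calls_per_frame = 10_000
cost_per_call_us = 2.0               # hypothetical per-call CPU cost
overhead_ms = calls_per_frame * cost_per_call_us / 1000.0
share = 100.0 * overhead_ms / frame_budget_ms
print(f"Frame budget: {frame_budget_ms:.1f} ms")
print(f"Draw-call overhead: {overhead_ms:.1f} ms ({share:.0f}% of frame)")
```

Under those assumed numbers the calls alone consume well over half the frame, which is the kind of single-threaded cost a thinner API like Mantle is claimed to reduce.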
  • I can't find any information on the scrypt hash rate of these cards. Does anybody have any info? Thanks
  • by Anonymous Coward

    AMD has totally ruined the future of Nvidia and Intel in the AAA/console-port gaming space. Working with partners at EA/DICE, AMD has created a 'to-the-metal' API for GPU programming on the PS4, Xbox One, and any PC (Linux or Windows) with AMD's GCN technology. GCN is the AMD architecture in 7000 series cards, 8000 series, and the coming new discrete GPU cards later this year and onwards into the foreseeable future. It is also the GPU design in all future AMD CPUs with integrated graphics.

    GCN is *not* the a

    • by Lawrence_Bird ( 67278 ) on Wednesday September 25, 2013 @09:27PM (#44955311) Homepage

      Using OpenGL or DirectX to 'program' a modern GPU is like using Fortran to program the CPU

      Are you saying that OpenGL and DirectX are the fastest? Because Fortran code sure is.

    • by Guspaz ( 556486 ) on Wednesday September 25, 2013 @09:59PM (#44955485)

      So, you're convinced that the slight improvement in performance brought about by a reduction of software overhead is going to completely cripple nVidia? Yeah, sure.

      Even if Mantle does produce faster performance (and there's no reason to doubt that it will), the advantages will be relatively small, and about all they might cause nVidia to do is adjust their pricing slightly. There won't be anything that you'll be able to accomplish with Mantle that wasn't possible without it; such is the nature of fully programmable graphics processors.

      Game publishers, for their part, will hesitate to ignore the 53% of nVidia owners in favour of the 34% AMD owners. It's highly unlikely that this will cause a repeat of the situation caused by the Radeon 9700, which scooped a big win by essentially having DirectX 9 standardized around it. In that case, ATI managed to capture significant marketshare, but more because nVidia had no competitive products on the market for a year or two after. This time around, both companies have very comparable performance, and minor differences in performance usually just result in price adjustments.

    • I have a completely different prediction: you don't know what the fuck you're talking about. Nothing can kill OpenGL, if DirectX couldn't do it, certainly not this proprietary shit.

      • by xiando ( 770382 )

        Nothing can kill OpenGL, if DirectX couldn't do it, certainly not this proprietary shit.

        Android-based phones, tablets, consoles, and even laptops and desktops (yes, they are coming and they are getting better very, very fast) all use a simple version of OpenGL. You'd have to kill that fast-moving train to kill OpenGL, and that is not going to happen. Yes, AMD got a big win with it being the base of the new XBox and Playstation toys. That would have had major implications a few years back; I'm not convinced it will make such a huge difference today. As I said, there are Android consoles for sale right

    • by TejWC ( 758299 )

      Isn't nVidia's Cg very close to the metal as well? Also, you seem to be implying that making a GPL driver for Mantle should be easy, since it will just send the compiled code directly to the card. Could Linux finally get "release day" open source drivers for AMD cards?

    • by DarkTempes ( 822722 ) on Wednesday September 25, 2013 @10:21PM (#44955643)
      Mantle does sound like good news but they also said it is an open API and so I wouldn't be too worried about Nvidia...they'll just implement it themselves if it's so good.

      And Nvidia has been crushing AMD/ATI in the PC market for a while (the Steam hardware survey shows 52.38% Nvidia to 33.08% AMD/ATI with 14% Intel).
      Hopefully this will even things out some but I don't see it making OpenGL or DirectX obsolete.
      OpenGL and DirectX have so much momentum and market share that game devs are going to have to target and support them for a while yet.

      Also, until we get more solid details about Mantle we won't know how good it really is. I am cautiously optimistic but at most this will cause me to delay my next video card purchase until things shake out.
    • by epyT-R ( 613989 )

      So, how much is amd paying you? I'd like to supplement my income.. OpenGL and D3D aren't going anywhere for the immediate future. We went down this vendor-api route with glide, and while it did run well, it created support issues for the consumer that fragmented the market and made it difficult to make money selling gpus. It would be nice, however, to see better support parity between the vendors' shader compilers.

    • Hello Mr AC,

      Of course, ATI customers with 6000 series cards or earlier (or Zacate, Llano, or Richland APUs) are as out-of-luck as Intel and Nvidia GPU users

      So existing ATI customers are being given the shaft

      With the rise of Mantle, many console games developers are going to desire that the PC market rapidly changes to AMD only

      because games producers are going to ditch 66% of the pc gaming marketplace

      Any PC gamer interested in high-performance would be INSANE to buy any Nvidia product from now on

      in favor of the minority of sane gamers

      Nvidia, on the other hand, will be cursing themselves for ever starting this war

      because the market leader in graphics cards is going to start crying into their huge wads of cash.

      Next time you write a new API, try creating world peace. We need that more than faster fly by wire shooters.


      • because games producers are going to ditch 66% of the pc gaming marketplace

        I'd like to mention that many of them are currently ditching 100% of the pc gaming marketplace, and are doing fine.

    • Low-end models in each family are almost always a rebadge of the high-end models from the previous family. This has been the case for a very long time. It allows the manufacturer to better move inventory that would otherwise go unsold.

      • When you say "a very long time" I assume you only mean a generation or two? Because this tacky shit wasn't done by any of them a while back. Each model number in a 3xxx series was based on the 3xxx tech, or the 2xxx, or whatever. Now, as you state, the top-end part in a series is new while the mid-range parts in the new series are old.

        It's deceptive.

        • by armanox ( 826486 ) <> on Wednesday September 25, 2013 @10:34PM (#44955731) Homepage Journal

          nVidia is famous for rebadging. I'll give an example: the Geforce 8800GTX became the 9800 GTX, and then the GTS 250.

          ATI on the other hand, has followed a different pattern. All cards of a series (HD 2xxx, 3xxx, 4xxx, etc) are based on the same tech. The 6xxx series cards were tuned versions of the 5xxx cards, and I think what's happening is the new R-series cards are tuned versions of the 7xxx series. nVidia does this with their cards now too - the Fermi family (4xx and 5xx) and Kepler family (6xx and 7xx) introduce a chip in the first gen, and refine that chip in the second.

        • Both companies have done this for quite some time. They often introduce new products into emerging markets to sell off old silicon.

          The Radeon 9100 (2003) was a rebadged Radeon 8500 (2001)

          The HD 3410 (2009) was a rebadged HD 2400 (2007), and the HD 3610 (2009) was a rebadged and slightly retuned HD 2600 (2007). In fact, most of the HD 3000 series was a patch for the short-lived HD 2000 series.

          The HD 4580 (2011) was a rebadged HD 3750 (2008).

          With the exception of the HD 6900 series GPUs, the entire HD 6000 famil

    • They also seem to be saying that the flagship R9 290X is going to be based on the new technology.

  • I'll ask what /.ers think of the stability of low end ATI hardware. I've heard once you get into the $250 range it's fine, but everything I've tried below $130 has crashed hard on everything except the biggest titles :(. I miss my super stable 1650...
    • by Khyber ( 864651 )

      I never had a problem with my crap of the line HD4200. Sure, it's not going to run the latest and greatest at any respectable frame rate, but hey, it worked and didn't die.

    • I'm currently running a 7750 (my criteria were basically SC2 and CS:GO at a solid fps-capped 128FPS, so anything more would have been overkill) and haven't had any issues whatsoever. Rock-solid cheap card...

    • The most recent AMD card that I have is the Radeon 5650, embedded in my Vaio E-Series. It certainly won't win any speed contests, but it can comfortably run MOH, BF3, Skyrim, Deus Ex: Human Revolution, and Dragon Age 2 on Windows; TF2, Dota 2, and Strike Suit Zero from Steam's Linux library; and finally Street Fighter x Tekken on Wine on Ubuntu. No crashes whatsoever, at least nothing GPU-related. I started with an ATI Rage Pro, and went Nvidia for several generations (Riva TNT2, Ti 4200, and 6xxx). I returned to AT
    • I've been running a 5670 for a couple years. Other than some minor glitches in Skyrim, no issues.
      • by tibman ( 623933 )

        Same here. I do have a small problem with the driver though. I have to use a special version or the box might lockup. But it is a great card and has been rendering perfectly for years. Just ordered a 6970.. very excited for the upgrade!
