AMD's Radeon R9 290X Launched, Faster Than GeForce GTX 780 For Roughly $100 Less

MojoKid writes "AMD has launched their new top-end Radeon R9 290X graphics card today. The new flagship wasn't ready in time for AMD's recent October 8th launch of midrange products, but their top-of-the-line model, based on the GPU codenamed Hawaii, is ready now. The R9 290 series GPU (Hawaii) comprises up to 44 compute units with a total of 2,816 IEEE 754-2008 compliant shaders. The GPU has four geometry processors (2x the Radeon HD 7970) and can output 64 pixels per clock. The Radeon R9 290X features all 2,816 stream processors and an engine clock of up to 1GHz. The card's 4GB of GDDR5 memory is accessed by the GPU via a wide 512-bit interface, and the R9 290X requires a pair of supplemental PCIe power connectors—one 6-pin and one 8-pin. Save for some minimum frame rate and frame latency issues, the new Radeon R9 290X's performance is impressive overall. AMD still has some obvious driver tuning and optimization to do, but frame rates across the board were very good. And though it wasn't a clean sweep for the Radeon R9 290X versus NVIDIA's flagship GeForce GTX 780 or GeForce GTX Titan cards, AMD's new GPU traded victories depending on the game or application being used, which is to say the cards performed similarly."
This discussion has been archived. No new comments can be posted.

  • by Suiggy ( 1544213 ) on Thursday October 24, 2013 @08:30PM (#45230161)

    That should have been the real headline.

    • by Anonymous Coward on Thursday October 24, 2013 @08:57PM (#45230299)

      Unless you use a Titan to do modelling that requires double precision. The Titan is a super-cheap K20X; it just happens to double as a gaming card.

      • by Suiggy ( 1544213 )

        Many other consumer/gaming cards from both NVIDIA and AMD support double-precision floating point, including all of the AMD R9 2xx cards. Double precision hasn't been exclusive to workstation GPUs for a while now.

        • by etherelithic ( 846901 ) on Thursday October 24, 2013 @10:25PM (#45230761) Homepage
          But NVIDIA's consumer-oriented cards have very slow double-precision processing, something like 1/16 the speed of single precision. And they even artificially hobbled the DP performance of the GTX 780, which is otherwise a slightly cut-down Titan (i.e. big Kepler). All of AMD's 79XX cards (and their rebranded brother, the 280X), and the new 290X card, have 1/4 DP performance. I've consistently bought AMD Radeon cards for my OpenCL applications because their $300 cards are almost as fast as NVIDIA's $1000 card, and in some cases faster, for DP calculations.
        • There's a difference between "supported" and "actually usable".

          The 780 has a theoretical single-precision compute rate of 4.0 TFLOPS, comparable to the Titan's 4.5 TFLOPS. Go up to double precision, and the 780 plummets to 165 GFLOPS (1/24 the rate), while the Titan stays high at 1.5 TFLOPS (1/3 the rate).

          I'm not sure what the exact reason for that discrepancy in performance is, whether it's an actual hardware difference, some hardware being disabled during binning, or even just a driver change (althoug
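
For anyone who wants to sanity-check those figures: the theoretical numbers quoted above follow from the usual cores x clock x 2 (one FMA = two FLOPs) estimate, scaled by whatever DP:SP ratio the vendor enables. A rough sketch in C, using the commonly published core counts and base clocks as approximate inputs:

```c
/* Back-of-the-envelope peak-FLOPS estimate.
 * Assumes the usual "cores x clock x 2 (one FMA = 2 FLOPs)" formula and the
 * vendor-enabled DP:SP ratios quoted in the thread; the core counts and
 * clocks are the published base specs, so treat the output as a rough
 * sanity check, not a benchmark. */
#include <stdio.h>

static void report(const char *name, int cores, double clock_ghz, double dp_ratio)
{
    double sp_gflops = cores * clock_ghz * 2.0;   /* single precision */
    double dp_gflops = sp_gflops * dp_ratio;      /* double precision */
    printf("%-10s  SP: %6.0f GFLOPS   DP: %6.0f GFLOPS (1/%.0f rate)\n",
           name, sp_gflops, dp_gflops, 1.0 / dp_ratio);
}

int main(void)
{
    report("GTX 780",   2304, 0.863, 1.0 / 24.0);  /* DP capped at 1/24    */
    report("GTX Titan", 2688, 0.837, 1.0 / 3.0);   /* full-rate DP enabled */
    return 0;
}
```

The takeaway is that the 1/24 vs. 1/3 ratio, not raw single-precision throughput, is what separates the 780 from the Titan for DP work.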

    • Re: (Score:1, Insightful)

      by Salgat ( 1098063 )
      The Titan was never meant to be competitive based on price/performance. It's not a fair comparison.
    • Re: (Score:2, Insightful)

      by Nemyst ( 1383049 )
      Irrelevant. The Titan was never meant to be a consumer-level card; it's meant to sit between the consumer (sub-$700) market and the professional ($1000+) cards. Its performance is within 10% of the GTX 780, which makes it a bad buy even amongst NVIDIA cards. The real reason to buy one is that it has full-speed double precision, whereas all the consumer cards are significantly slower (an artificial restriction to somewhat justify the cost of professional cards).

      I'm not saying that the
      • by Entropius ( 188861 ) on Thursday October 24, 2013 @10:03PM (#45230655)

        A lot of compute applications are memory bandwidth limited, so single precision will give you only twice as many flop/sec as double.

        There's another thing about the Titans, though: reliability.

        I do lattice gauge theory computations on these cards. We've got a cluster of GTX480's that is a disaster: the damn things crash constantly. We're in the process of replacing them with Titans, which have been rock solid so far, as good as the cluster of K20's I also use. (They're also a bit faster than the K20's.) The 480's are especially bad, but I imagine the Titans are better than (say) GTX580's.

        The Titan doesn't make that much sense as a high-end gaming card, but it makes a great deal of sense as a ghetto compute card for people who don't want to buy the K20's/K40's. (We've benchmarked a K40 and the Titan still beats it, but only barely.)
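
A side note on the bandwidth-limited point above: for a kernel that streams each value once and does a fixed amount of math on it, throughput is set by bytes moved, not by the ALU peak, so single precision buys roughly 2x regardless of the hardware DP ratio. A small illustration (288 GB/s is the Titan's quoted memory bandwidth; the one-FMA-per-element workload is a made-up example):

```c
/* Illustration of why bandwidth-bound kernels see only ~2x going from
 * double to single precision: throughput is elements/second = bytes/s
 * divided by bytes/element, independent of the ALU peak.
 * 288 GB/s is the GTX Titan's quoted memory bandwidth; the "1 FMA per
 * element" workload is a hypothetical streaming kernel. */
#include <stdio.h>

int main(void)
{
    const double bandwidth_gbs  = 288.0;  /* GB/s                         */
    const double flops_per_elem = 2.0;    /* one FMA per element streamed */

    double sp_gflops = bandwidth_gbs / sizeof(float)  * flops_per_elem;
    double dp_gflops = bandwidth_gbs / sizeof(double) * flops_per_elem;

    printf("bandwidth-bound SP: %.0f GFLOPS\n", sp_gflops);  /* ~144 */
    printf("bandwidth-bound DP: %.0f GFLOPS\n", dp_gflops);  /* ~72  */
    printf("ratio: %.1fx (vs. 3x at the Titan's ALU peak)\n",
           sp_gflops / dp_gflops);
    return 0;
}
```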

        • by Arkh89 ( 2870391 ) on Friday October 25, 2013 @01:12AM (#45231351)

          You're lucky, then... We replaced our cluster of 580s with Titans and these things keep crashing for no apparent reason (about 2/3 of the cards will randomly hang on computations that run fine on the remaining cards)...

          • Hm, interesting -- if we're going to get Titans as an upgrade this is worth knowing. What are you doing on them? We're doing a computation that uses a lot of single-precision, somewhat less double precision, and occupies about 70% of the 6GB memory on each (they run in pairs).

            Oh -- make sure you have the new drivers. I kept getting random crashes and it turns out that the old Linux Nvidia driver was at fault since it didn't really support them. I upgraded the drivers and everything was fine.

            • by Arkh89 ( 2870391 )

              I should be more precise about the context...
              We are mainly doing FFTs and linear algebra in both single and double precision. To give a little more detail about the problem: we run the same code on multiple GPUs at the same time (each instance of the program has its own GPU and is not communicating with the other processes). It appears that, after a random number of iterations (it might be 1K, 10K or 100K), a kernel from CuFFT, or CuBLAS, or my own gets stuck and the program is killed by the watchdog of
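
Not the poster's code, but for anyone chasing the same symptom: a minimal sketch of per-iteration status checking around a cuFFT call (the cuFFT C API plus an explicit cudaDeviceSynchronize), so a hang or failure can at least be pinned to a specific call and iteration before whatever watchdog kills the job. The problem size, iteration count, and build line are assumptions for illustration:

```c
/* Sketch of per-iteration error checking around a cuFFT call, so a hang or
 * failure can be localized before an external watchdog kills the whole job.
 * Not the poster's code; the problem size and iteration count are arbitrary.
 * Build with: nvcc check_fft.c -lcufft   (or gcc ... -lcudart -lcufft) */
#include <stdio.h>
#include <stdlib.h>
#include <cuda_runtime.h>
#include <cufft.h>

#define CUDA_CHECK(call)                                            \
    do {                                                            \
        cudaError_t e = (call);                                     \
        if (e != cudaSuccess) {                                     \
            fprintf(stderr, "CUDA error %s at %s:%d\n",             \
                    cudaGetErrorString(e), __FILE__, __LINE__);     \
            exit(EXIT_FAILURE);                                     \
        }                                                           \
    } while (0)

int main(void)
{
    const int n = 1 << 20;          /* arbitrary FFT size   */
    const int iterations = 10000;   /* arbitrary loop count */

    cufftComplex *d_data;
    CUDA_CHECK(cudaMalloc((void **)&d_data, sizeof(cufftComplex) * n));
    CUDA_CHECK(cudaMemset(d_data, 0, sizeof(cufftComplex) * n));

    cufftHandle plan;
    if (cufftPlan1d(&plan, n, CUFFT_C2C, 1) != CUFFT_SUCCESS) {
        fprintf(stderr, "cufftPlan1d failed\n");
        return EXIT_FAILURE;
    }

    for (int i = 0; i < iterations; ++i) {
        if (cufftExecC2C(plan, d_data, d_data, CUFFT_FORWARD) != CUFFT_SUCCESS) {
            fprintf(stderr, "cufftExecC2C failed at iteration %d\n", i);
            return EXIT_FAILURE;
        }
        /* Force completion now so a hung or failed kernel is caught here,
         * not blamed on some later, unrelated call. */
        CUDA_CHECK(cudaDeviceSynchronize());
        CUDA_CHECK(cudaGetLastError());
    }

    cufftDestroy(plan);
    CUDA_CHECK(cudaFree(d_data));
    puts("completed");
    return 0;
}
```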

              • You run your program 3 times on 3 different hardware setups, get 2 complete results, and compare them -- are they the same? The other computation did not complete.

      • by XaXXon ( 202882 )

        What does that even mean? It's for sale and if you wanted that performance you had to pay $1000 for it. Now you don't.

        People who want that performance are sure going to be interested in learning they can pay half as much.

        The alternative is to say that nvidia has nothing with this performance, but then people are going to say "But the Titan does"... so what are you going to do?

      • SLI Titans perform much better than SLI 780s. This is because a 780 actually has dual GPUs on a single card and Titan just has one.
      • Irrelevant. The Titan was never meant to be a consumer-level card, it's something that's meant to be sitting between the consumer (sub-700) market and the professional (1000+) cards.

        Is that marketing bullshit that I'm smelling?

        an artificial restriction to somewhat justify the cost of professional cards.

        I don't think 'justify' is quite the right word here...

    • Re: (Score:2, Informative)

      by Anonymous Coward

      I hate to be that guy, but if the reviews are to be trusted, AMD overclocked this thing to the very limit of the chip's potential just to beat the competition.
      There's almost no headroom for overclocking, and at stock it's ridiculously hot, loud and power-hungry.
      For 400 bucks more, you could have had something 8 months ago that's still better in most games at relevant resolutions.

      • by Anonymous Coward

        Other reviews show the thing can easily be pushed by another 10%, even without lifting the 40% fan speed restriction.

      • AMD designed this chip for maximum performance in minimum die space. They managed to cram Titan-level performance in 435mm^2. That's including a 512-bit memory bus AND 64 ROPs, so they're not exactly cutting corners!

        The Titan uses a 551mm^2 die, and although some of that is fused off, the majority of the difference is because Nvidia designed it wide and slow, for power first and performance second. This is because the part was targeted first and foremost at professionals, where performance/watt and coole

    • by JDG1980 ( 2438906 ) on Thursday October 24, 2013 @10:51PM (#45230873)

      The GTX Titan is a double-precision computing card that happens to do very well at gaming. It's not really a fair comparison. Ever since the GTX 780 was released, pretty much every review site has recommended it over the Titan for gamers on price/performance grounds.

      • by Anonymous Coward

        HD69xx, HD79xx, R9 280 and R9 290 are all double-precision computing cards that happen to do very well at gaming.
        So it is a fair comparison.

    • Now if only AMD would give us back the driver-only install instead of forcing .NET 4 and Catalyst Control Center down our throats. The driver is good, but I don't need the damn CCC app crashing and restarting the video driver when the driver itself didn't crash.

      This is probably the biggest reason I no longer use AMD cards even though they're as good performance-wise as Nvidia's. Of course Nvidia is having driver issues again, so I guess I need to stick with Intel only for stable and open-source drivers.

  • by Suiggy ( 1544213 ) on Thursday October 24, 2013 @08:35PM (#45230197)

    Mantle support, 4GB of VRAM, 512-bit memory bus for fast transfers... we're in heaven.

    With that much VRAM, there should be enough for a rich geometry buffer and room to spare for a decent sized scene represented by a sparse voxel DAG. Ray-cast the voxel DAG into the geometry buffer, then do your polygonal rendering pass, followed by your deferred lighting passes, and final composition.
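
To put rough numbers on the "room to spare" idea: a hypothetical VRAM budget for a deferred setup like the one described above. The resolution, G-buffer layout, and texture-pool size below are invented for illustration; nothing here comes from AMD or the poster:

```c
/* Hypothetical VRAM budget for a deferred renderer on a 4 GB card.
 * The resolution, G-buffer layout and texture/mesh allowance are invented
 * for illustration only. */
#include <stdio.h>

int main(void)
{
    const double total_mb = 4096.0;              /* 4 GB of GDDR5 */
    const int    width    = 2560, height = 1440;
    const int    gbuf_bpp = 20;                  /* e.g. normals + albedo +
                                                    depth + material, bytes/px */

    double gbuffer_mb     = (double)width * height * gbuf_bpp / (1024.0 * 1024.0);
    double backbuffers_mb = (double)width * height * 8 * 2 / (1024.0 * 1024.0);
    double textures_mb    = 1536.0;              /* made-up texture/mesh pool */

    double left_for_dag = total_mb - gbuffer_mb - backbuffers_mb - textures_mb;
    printf("G-buffer:                %6.0f MB\n", gbuffer_mb);
    printf("swap chain (HDR x2):     %6.0f MB\n", backbuffers_mb);
    printf("textures/meshes:         %6.0f MB\n", textures_mb);
    printf("left for voxel DAG etc.: %6.0f MB\n", left_for_dag);
    return 0;
}
```

Even with generous allowances, roughly half the card is still free for the sparse voxel DAG under these made-up assumptions, which is the point being made above.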

    • Re: (Score:2, Insightful)

      by XaXXon ( 202882 )

      I'll be excited about this "mantle" thing when I actually see stuff that benefits from it instead of a bunch of theoretical mumbo jumbo.

      Graphics API overhead this.. 10x more draw calls that..

      Show me a real game and show me how it's actually better than the alternatives that actually exist at that same time. "Look this thing that doesn't really exist (or isn't in use) is faster than stuff that's actually here now and being used". Everyone can win at that game.

      Real numbers on real games. Until then, you ca

      • by aliquis ( 678370 )

        Also redo these benchmarks in Linux...

      • If you're interested, one thing to watch is Star Citizen's development. It will be a 64-bit game built for next-gen systems, with extremely high poly counts and super-realistic physics and shaders coming out the airlocks. There is already a small downloadable demonstration of the engine called "The Hangar Module", but I think you have to be a contributor to get it.

        Trust me, Crysis is dead. The next big question will be, "Does it run SC?"

        • by abies ( 607076 )

          I'm really unimpressed with the demo so far. I don't see any real difference from Eve Online's hangar rendering, for example. Plus, rendering spaceships is probably the easiest thing you can aim for, compared to trees/foliage, water, mossy rocks, a realistic sky, etc.
          If SC requires top-end hardware it will only be because they are lazy, not because it's inherently so demanding to render nicely. High-poly models don't make sense when you end up having 6 polygons for each pixel... and when you can achieve 99% of

          • I didn't check out the Hangar Module myself; my PC is not up to it.
            But the videos they have floating around do look mighty impressive. Look at the massively detailed cockpits and the realistically moving parts on the ships. Obviously, it's still in development, but there is nothing like the features they are promising at the moment: walking around your own spaceship, or walking inside a capital ship and being able to look through a window and watch the battle taking place outside.
            And it will take a

        • Well if it runs Crysis, it certainly handles Supreme Commander (SC) quite well. Get your abbreviations right before "opening your mouth and confirming you're a fool", as President Lincoln once said.

          • Back off newbie. SC, as everyone knows, officially stands for "Star Control", since 1990 and exactly up to the point at which Star Citizen is released.

            However, I came to realize that my claim about Crysis being dead is a little ironic, considering that SC is based on CryEngine 4. :D

        • by aliquis ( 678370 )

          SC is Starcraft.

  • by Chas ( 5144 ) on Thursday October 24, 2013 @08:36PM (#45230201) Homepage Journal

    Now let's hope to god they have their driver situation hashed out.

    AMD/ATI has always put out fairly nice hardware. But more often than not, they fall on their faces because of shoddy drivers.

    • by Anonymous Coward

      They've made major improvements recently with regards to performance consistency in their drivers, especially with multi-GPU and multi-monitor setups.

    • by Knuckx ( 1127339 )

      No one makes decent graphics drivers. Intel's drivers have so many strange oddities it's not funny (random garbage textures/shader faults), AMD's are generally naff, Nvidia's break themselves every so often and need a full reinstall (wiping your configuration out along with it), and Matrox releases updates once every 3 years (if you are lucky).

    • by Nemyst ( 1383049 )
      Exactly. Most reviews don't really cover long-term usage and that's where AMD has issues. I've gone AMD/ATI since the X series and I'm probably going to move to NVIDIA next time because I've had a lot of driver issues across numerous computers (including laptops). It's quite frustrating too because if their drivers were roughly on par with NVIDIA's, AMD would be crushing the competition.
    • by jakobX ( 132504 )

      Never had any problems with AMD drivers, and I've used their cards since the Radeon 8500LE. I've also had Nvidia cards in my machine and didn't have any major problems with those either.

      Sure, sometimes drivers will give you problems, but to say that one company has consistently better drivers is nonsense. At the moment I have no problems with my HD 7850 or HD 4870, and even my Nvidia ION HTPC system stopped giving me problems after two years of "fun" (HDMI-related).

  • I RTFA, and just like the summary says, the 780 and the 290X are pretty close on everything, each leading in different games.
    One thing that was disappointing about the article is the SLI/CrossFire benchmarks. They only compared a couple of games that no one plays, and only pitted it against the 780 in SLI instead of the Titan, which is the real king of SLI. They didn't do any 4K or multi-display testing.
  • by sayfawa ( 1099071 ) on Thursday October 24, 2013 @09:35PM (#45230489)
    Nothing useful to say here, but since I know this will turn into an AMD vs Nvidia thing, I just wanted to share how sick I am of those frickin' unskippable 3-second-long Nvidia promos that play every damn time I start half of my games. That's the only thing I have against them, but it's starting to be really irritating.[/firstworldproblems]
    • Just download the blank .bik files and overwrite the stupid Nvidia video, along with any others!
      • You don't usually even need to download anything extra. Just move the existing file to a backup directory and put an empty text file in its place, renamed to match.

      • Thanks, I had no idea this was possible. Never really thought about it, TBH. Bit of a noob, I guess.
    • On the other hand, watching TV directly at abc.com has annoying commercial breaks :-)

    • What's with all the suggestions in the replies about downloading replacement Bink files or replacing them with empty placeholder files? Just delete or rename the damn things -- most games will simply skip to the next video file, or in the absence of them all, go straight to the menu.

      For me, though, I try to avoid doing this; if there's a way to skip them via a config file (such as with Dishonored or Rage), then that's preferable.
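
For the curious, the back-up-and-placeholder trick described in this subthread is just ordinary file shuffling; a tiny sketch (the filename is hypothetical -- check the game's video/media directory for the actual Bink .bik file):

```c
/* Sketch of the "back up the intro video, drop in an empty placeholder"
 * trick from the comments above. The path below is hypothetical; find the
 * real .bik intro file in your game's install directory first. Some games
 * are also fine with the file simply being deleted or renamed. */
#include <stdio.h>

int main(void)
{
    const char *intro  = "NvidiaLogo.bik";      /* hypothetical filename */
    const char *backup = "NvidiaLogo.bik.bak";

    if (rename(intro, backup) != 0) {           /* keep the original around */
        perror("rename");
        return 1;
    }
    FILE *placeholder = fopen(intro, "wb");     /* zero-byte stand-in */
    if (placeholder == NULL) {
        perror("fopen");
        return 1;
    }
    fclose(placeholder);
    puts("intro video replaced with empty placeholder");
    return 0;
}
```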

  • Last January I went with Nvidia because the AMD graphics card was buggy.

    • by Bigbutt ( 65939 )

      Did the same thing last December.

      [John]

    • I can't speak for the 2XX series, but the 7XXX card I went for, swapped out, and swapped back in again is now much better than it was in January. I purchased it, found out it wasn't stable and wouldn't drive two dual-link DVI screens, put an old NVidia 8600GT in, and felt frustrated for a few months. Then I bought a DisplayPort-to-DVI adapter with dual-link capability and put the 7XXX back in. It was two major driver releases later and the stability problems I had running Linux were gone. I'm
  • I had enough problems with my last Radeons. No thanks.

    [John]

  • AMD pushed the new Hawaii chip pretty hard to get these results. It will usually bump up against the thermal wall (max 95 degrees C) when gaming at full load, and on 'Uber' mode (there's a switch to choose between that and 'Silent'), it's quite loud. Part of the problem is that AMD is using a mediocre blower-style cooler, which can't run at or near 100% fan speed without putting off an unacceptable level of noise, and can't dissipate enough heat to keep the card from running up against the thermal wall. To

  • by rsilvergun ( 571051 ) on Thursday October 24, 2013 @11:40PM (#45231037)
    I buy them for stability. I can't be the only one that's had, and continues to have, trouble with ATI's hardware. Maybe it's different in the $200+ range, but I buy in the $90-$130 range... I can't find it now but my bro was telling me that one of the gaming PC manufacturers dropped ATI because of the support calls :(.

    I miss the color quality from my 1650, but I haven't had any luck with their hardware since then...
    • Funny, the last time I bought an Nvidia card (an 8600GT with the infamous G86 chip), it died within months because of internal solder thermal failure.

  • It's great news that AMD has finally come out faster than NVIDIA, and for $100 less at that. Good work, AMD, but I'm just waiting to see how long they stay faster than NVIDIA. I hope this time their hardware lasts for a longer period, too.
  • Now we have a new AMD card that can generate more OpenGL errors per second!

    Seriously, working with AMD is hard. Their OpenGL implementation never works properly, and we always need workarounds to get the job done.
    NVIDIA, on the other hand, has always worked better for me as a developer.

  • This will be a good deal when prices drop below the MSRP of $549, if you are going to water-cool it. It still uses 50%+ more power at idle and quite a bit more power when gaming, though. It also runs hotter and will stress a water-cooling system that much more, especially in CrossFire mode. Nevertheless, it seems like a good card for a water-cooling setup.

    What bothers me is that you pretty much *have to* water-cool it if you don't want it to sound like a vacuum cleaner. The Nvidia cards are usable with stock

  • Can't seem to find the original Nvidia vs. ATI render-stuttering article, but these will do.
    http://www.anandtech.com/show/6857/amd-stuttering-issues-driver-roadmap-fraps [anandtech.com]
    http://techreport.com/review/24022/does-the-radeon-hd-7950-stumble-in-windows-8/10 [techreport.com]

    I couldn't care less if this is the cheapest/fastest card on the planet.
    Until AMD fixes the core stuttering issues in their drivers, instead of just patching them for a AAA game now and then, I'm really not interested.

    Frame rate isn't everything; stability and consistent

    • You're quoting reviews that are months old. The newer driver updates were designed specifically to fix these problems, and for the most part, they have succeeded. (There are still issues in some specific CrossFire and/or multi-monitor configurations, but these won't affect most users.)

      One of the reviews you cited was from The Tech Report, which did a good job of documenting these frame pacing issues with hard numbers a couple of months back. Well, let's see what they have to say about the R9 290X now [techreport.com]:

      You ca
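
Since frame pacing is the recurring complaint in this subthread: the metric the Tech Report style reviews lean on is the distribution of individual frame times, not the average frame rate. A small sketch of that calculation over synthetic frame-time data (the numbers are invented; real input would be a FRAPS-style frame-time log):

```c
/* Why frame pacing matters: average FPS vs. 99th-percentile frame time.
 * The data here is synthetic (a steady 16.7 ms stream with occasional
 * 60 ms hitches); real numbers would come from a FRAPS-style frame-time
 * log. The hitches only modestly lower the average but dominate the
 * percentile, which is what the stuttering complaints are about. */
#include <stdio.h>
#include <stdlib.h>

#define NFRAMES 1000

static int cmp_double(const void *a, const void *b)
{
    double d = *(const double *)a - *(const double *)b;
    return (d > 0) - (d < 0);
}

int main(void)
{
    double frames_ms[NFRAMES];
    double total = 0.0;

    for (int i = 0; i < NFRAMES; ++i) {
        frames_ms[i] = (i % 66 == 0) ? 60.0 : 16.7;   /* ~1.6% hitches */
        total += frames_ms[i];
    }

    double avg_fps = 1000.0 * NFRAMES / total;

    qsort(frames_ms, NFRAMES, sizeof frames_ms[0], cmp_double);
    double p99_ms = frames_ms[(NFRAMES * 99) / 100];  /* crude 99th percentile */

    printf("average FPS:                %.1f\n", avg_fps);    /* ~57.5 */
    printf("99th percentile frame time: %.1f ms\n", p99_ms);  /* 60.0  */
    return 0;
}
```

A 60 FPS average with 60 ms spikes feels far worse than a flat 40 FPS, which is why the percentile (or a frame-time plot) tells you more than the bar chart.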
