
AMD Radeon HD 7970 GHz Edition: Taking Back the Crown

An anonymous reader writes "The benchmarks are in for the Radeon HD 7970 GHz Edition. Starting at $500, AMD's new single-GPU flagship boosts the original 7970's clock speed from 925 MHz to 1 GHz (1050 MHz with Boost). The GHz Edition also sports 3 GB of faster 1500 MHz GDDR5 memory, pushing 288 GB/s as opposed to 264 GB/s. While the AMD reference board runs hot and loud, retail boards will use different cooling solutions. A simple test of aftermarket GPU coolers shows that any other option will shave degrees and slash decibels. But it's the Catalyst 12.7 beta driver that really steals the show for AMD, pushing FPS scores into overdrive. With the new Catalyst, Nvidia's GeForce GTX 670 can no longer beat the original Radeon HD 7970, and the GHz Edition outmaneuvers the GeForce GTX 680 in most cases. However, when factoring price and possible overclocking into the equation, the original Radeon HD 7970 and GeForce GTX 670 remain the high-end graphics cards to beat."
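
For anyone checking the memory numbers, here is a quick back-of-envelope sketch in Python (it assumes the 7970's 384-bit memory bus and the usual GDDR5 convention of quoting an effective rate four times the memory clock; neither detail is stated in the summary):

    # Bandwidth = bytes moved per transfer x effective transfer rate (GT/s)
    bus_bytes = 384 // 8                  # assumed 384-bit bus -> 48 bytes per transfer
    ghz_edition_gtps = 4 * 1.500          # 1500 MHz GDDR5 -> 6.0 GT/s effective
    original_gtps = 4 * 1.375             # original 7970's 1375 MHz GDDR5 -> 5.5 GT/s
    print(bus_bytes * ghz_edition_gtps)   # 288.0 GB/s
    print(bus_bytes * original_gtps)      # 264.0 GB/s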
  • Where's my X2 edition?

  • Re: (Score:1, Flamebait)

    Comment removed based on user account deletion
    • I don't know about ATI, but AMD's Radeon cards have been competitive for a long time.

    • by Sir_Sri ( 199544 )

      Competitive is an odd word in the hardware business. If you want to spend 1200 dollars on a CPU, does AMD have a competitive offering? How about 500? What's the difference in performance between a 500 dollar part and a 1200 dollar one?

      With GPUs, AMD and nVIDIA are pretty close in rendering performance; for specialized tasks (GPU computing) particular hardware may favour one guy over the other. But if 90% of the market is in GPU parts that cost less than 400 dollars, whether or not you hold the top spot at

  • My x1950 sucks for 1080p video (since, y'know, the HD line was the NEXT card after that), but it's fine for QuakeLive, which is the only game I really care about that much. But I know they have cards that "game good, video bad" as well as "game bad, video good". It's frustrating trying to figure out which product I should buy. My card was $200 in 2007. If I don't really want that much more than what I got then, why would I want to pay $500?

    Now, I'm an ATI man who's been using TV out since 1995, non-stop. But I'm not willing to throw them so much money, especially when I have to change my entire operating system to accommodate their abandonment of "old" OSes like XP. Man, that jump to 64bit required updating so many scripts, and replacing so many utilities. Don't force change on me and I might give you more money, ATI.

    I like stories.

    • Comment removed based on user account deletion
    • I have an ancient x1650 card with a Q6600 CPU running Windows 7, and 1080p video uses minimal CPU. Your card is not the bottleneck. I'd like a newer card, but when you look at the numbers the lowest-end Fermi card is still slower than the old 9800GT series.

      • I also had an x1650 with an E6300 (until recently) - ran video at every resolution fine on both WinXP and Win7 x64. However, I'd recommend spending the $20 and upgrading to a Radeon HD 5450... much better performance in Win7 and only draws 18W.

      • by ClioCJS ( 264898 )
        Card's definitely the bottleneck in my case. Just because you have a lower card doesn't mean much - our systems, software, and hardware are not identical. CPU is nowhere near maxed out. Also, I play a lot of stuff using only software acceleration (not 1080p though, it still gets all kinds of tearing in VLC and Media Player Classic), because I have an ambilight system on my computer that needs to be able to see the pixels to match the light color. Card ran better when I bought it than now, too. It's 5 yrs old
      • by ClioCJS ( 264898 )
        It also depends on the bitrate for me - if it's a 2 hr movie encoded to a 3 GB 1080p MKV, I'm going to be fine. If it's a 2 hr movie encoded to an 8 GB 1080p MKV, I'm not. Still not the CPU maxing out though.
    • If you aren't gaming, just grab a mid range 5xxx series and you will be fine. Should have very little trouble finding one for less than $80.

      If you want more than that, get an HD 6770. You can have 'em for about $115, and it should keep up with any HD video, moderate gaming (as long as you don't expect maxed settings), and should last you another 3-4 years before it's too outdated.

      There is no reason to spend more than $200 on a video card unless you are doing hardcore gaming on multiple or high resolution monitors.

      • by ClioCJS ( 264898 )
        I'm gaming, just - I don't need much gaming power beyond what an x1950 offers. I want to play QuakeLive in 1080p at 60fps, but I'm on an HDTV anyway, at 30Hz usually, so framerate's not TOO pressing of an issue.
      • I would recommend HD7770 as it consumes a lot less power compared to HD5770/6770 and the cheapest ones cost around $110 - $120.
    • Have you thought about putting that money towards something like one of the new integrated graphics options, like Ivy Bridge or the AMD stuff? I can say at the least that Sandy Bridge HD 2000 was sweet, and I can't imagine how sweet IB must be.

      It won't match a mid-to-high-end GPU, but it may do better than what you had. Just something to consider when checking out benchmarks.

      • by ClioCJS ( 264898 )
        I don't plan on upgrading my motherboard anytime soon though... I'm super-picky about motherboards. Quite traumatized by Abit going out of business.
    • That problem went away in the 5000 series, since they made it all one unit. Sure, some cards are faster at one thing than others, but basically anything after the 5000 level of cards is wicked at both, along with OpenCL.

    • I have an old 1950 Pro that, while it might be a bit better than an x1950, shouldn't be all that different. It also should be easily powerful enough for 1080p video, which is not hard at all (less resolution than normal, really). So I would check your driver settings and look for updates or whatever. Modern integrated graphics should handle basic 1080p; a dedicated gaming card, even one from a number of years ago, shouldn't have any problems.

      • by ClioCJS ( 264898 )
        I'm most definitely running the latest driver for the card. They stopped updates a few years ago. Again, ATI did not advertise HD playback until the NEXT card, so it's not a surprise it has trouble with higher-bitrate (but not lower-bitrate) 1080p. What is surprising is that things seem to have gotten worse after 5 years of use - probably dirt in the fan. I have it overclocked about 1% at this point, nothing to lose.
  • All of my expensive fancy video cards have died, usually right after any kind of warranty, and I'm squeaking by on some horribly low res, limited palette and no hardware acceleration for graphics. But at least it's reliable!

    • by Ogi_UnixNut ( 916982 ) on Friday June 22, 2012 @04:37PM (#40416853) Homepage

      Ditto! I kept buying top-end Nvidia cards for CUDA work, only to have them die just after the warranty, usually a year or so. I dug out an old Nvidia Quadro 285 card from the early 2000s, and am using it again. Also the 8400gs I got works just peachy for simple CUDA stuff.

      It is like they engineer their top end cards to fail after a year or so, no matter what. My GTX 280 never went beyond 50 degrees, and was underclocked to boot (I didn't need all the power). Yet it died after a year or so, about as long as the 8800GTX I had beforehand.

      The Quadro has been in use in some form for more than half a decade, and it still does 99% of what I need (Apart from the CUDA stuff, otherwise it would be perfect). Their older stuff seems more solid.

      • by ackthpt ( 218170 )

        Ditto! I kept buying top-end Nvidia cards for CUDA work, only to have them die just after the warranty, usually a year or so. I dug out an old Nvidia Quadro 285 card from the early 2000s, and am using it again. Also the 8400gs I got works just peachy for simple CUDA stuff.

        It is like they engineer their top end cards to fail after a year or so, no matter what. My GTX 280 never went beyond 50 degrees, and was underclocked to boot (I didn't need all the power). Yet it died after a year or so, about as long as the 8800GTX I had beforehand.

        The Quadro has been in use in some form for more than half a decade, and it still does 99% of what I need (Apart from the CUDA stuff, otherwise it would be perfect). Their older stuff seems more solid.

        My suspicion, after looking at a few cards under a loupe, is that the technology is exceeding what the board itself can host: such densely packed, current-hungry and heat-producing electronics. To be able to sell and profit from these units, they are produced rapidly by a robotic assembly line. If they slowed that line down a bit the failure rate would decline, but they'd rather operate with an acceptable rate of failure (early or later), as the assembly line will be tooled for something else after the run.

        Our older card

        • by F34nor ( 321515 )

          You might try heating them in the oven to reflow the solder. It worked for my D820 laptop motherboard.

          • Oooh, could you give me some more info? My GTX280 cost a lot of money to me, and seems to have some memory corruption rendering it useless for anything but simple 2D displaying. I can't use it as is, and nobody wants to buy a faulty GPU.

            What settings to use? How long to keep it in the oven? Do I need to disassemble it? Any instructions on doing it? It would be excellent if I could get it working again!! It would save me having to splash out another few hundred pounds for a new card :/

      • Re: (Score:3, Insightful)

        by Anonymous Coward

        There are differences between the professional lines (Quadro, Tesla, FireGL, FireStream) and the consumer lines (GeForce and Radeon). The professional lines are built for the GPU manufacturers to controlled specs and designed for longer life. The consumer lines are built by OEMs from a reference design, with incentives to push clock speeds and component specs to the limit.

        Your experience likely has more to do with the old card being a Quadro than it does with newer cards being more fragile.

        • The consumer lines are built by OEMs from a reference design with incentives to push clock speeds and component specs to the limit.

          Indeed... and with that in mind, it's not at all silly to buy a card with lower clock rates but the same GPU reference... you probably wouldn't notice the framerate difference, but you WILL notice the temperature difference, as the highest-clocked cards are always maxing out their fans... even on menu screens.

      • Really? I've never had any issues with NVIDIA cards. Would be interested to know which brand of card, mobo and PSU you were using.
      • by JonySuede ( 1908576 ) on Friday June 22, 2012 @07:57PM (#40418083) Journal

        This is a detailed article explaining why things are the way they are: http://www.geeks3d.com/20100504/tutorial-graphics-cards-voltage-regulator-modules-vrm-explained/2/ [geeks3d.com]

      • by anethema ( 99553 )
        There are a few cards which offer lifetime warranties you could check out. Zotac makes one, and EVGA used to; not sure if they're down to a 3-year warranty now. http://ncix.com/products/?sku=71569&vpn=ZT-60301-10P&manufacture=Zotac&promoid=1048 [ncix.com] Etc.
      • by Anonymous Coward

        Your GTX 280 is a fine-pitch BGA, whereas your old Quadro is a gull-wing package.

        FPBGAs have practically zero tolerance for board flex. You probably did the fatal damage when you installed the card, and it laid in wait for the time to fail, as moisture got into the cracked ball and corroded it.

        I got a handful of people at my office to bring me their video cards from home so I could run them through our high-res X-ray imager (which we use for failure analysis). Every single one had cracked solder balls from board fle

    • by malloc ( 30902 )

      What card are you running that has limited palette? I haven't had that since I gave up my ISA Trident 9000 card. (For sake of argument I'm considering a 24-bit RGB signal as "unlimited". Consumers aren't going to go worrying about a 10-bit LUT in their hardware).

    • You are fucking something up, most likely your cooling or power. Or I suppose you could just be really unlucky. Regardless, just get a card that has a lifetime warranty. eVGA will sell you one.

      Not buying ultra high end cards because they cost too much is a good reason not to buy them. You can get near their performance for much less money. Not buying them because you can't be bothered to build a system with proper power and cooling and do a bit of research to get a longer warranty is a silly reason not to b

      • He probably bought the highest end card but then cheaped out and got the one with a bad cooling system.
        I have made that mistake before but now I spend the extra $20 to get the version that runs the coolest and quietest. That way it doesn't die after a year of use and doesn't sound like a vacuum.
  • They need to add a benchmark for Bitcoin, since it makes up a lot of the market of high-end graphics card buyers, and AMD is way faster per watt than Nvidia.

    • Re: (Score:1, Insightful)

      by Anonymous Coward

      For the 3 people that care about shitcoin?

    • by h4rr4r ( 612664 )

      What is this "a lot of the market"?
      You really think more than a tiny percentage of folks use these cards for bitcoin?

      Most people prefer to play games with them, instead of entering into pyramid schemes. Cash out while you still can.

      • I think many of these miners need to L2math. So one of these cards will run you $500. Running it full bore will take around 250 watts, so 1 kWh for every 4 hours it runs. Also have to factor in cooling, if you live in a warmer area. Also factor in computer power (and cooling for that) if it would normally be off during that time.

        Well you need to run the numbers for your own power costs and so on, but that is a lot of mining you have to do to break even depending on what price you can get per bitcoin at a parti
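
        To make that break-even arithmetic concrete, here is a rough sketch in Python. The card price and power draw come from the comment above; the electricity price, hash rate, and payout rate are purely illustrative assumptions, so plug in your own numbers:

          # Rough mining break-even sketch; the last three figures are illustrative assumptions.
          card_cost_usd = 500.0          # card price from the comment above
          card_watts = 250.0             # full-load draw from the comment above
          usd_per_kwh = 0.12             # assumed electricity price
          hash_rate_mhs = 600.0          # assumed hash rate for a card in this class, MH/s
          usd_per_mhs_per_day = 0.01     # assumed payout per MH/s per day

          daily_power_cost = card_watts / 1000 * 24 * usd_per_kwh   # ~$0.72/day
          daily_revenue = hash_rate_mhs * usd_per_mhs_per_day       # ~$6.00/day
          days_to_break_even = card_cost_usd / (daily_revenue - daily_power_cost)
          print(days_to_break_even)      # ~95 days, ignoring cooling and the rest of the PC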

    • Comment removed based on user account deletion
  • by rtkluttz ( 244325 ) on Friday June 22, 2012 @04:25PM (#40416779) Homepage

    Regardless of Torvalds recently getting his feathers ruffled with Nvidia... In most cases Nvidia just works on Linux. I swore off AMD/ATI loooong back because JUST about the time they finally get decent proprietary Linux driver support for one of their chipsets, it drops off the back side of support. I DESPISE forced upgrades and won't get caught in that trap again. All of our perfectly working AMD video laptops still work great, but there's no proprietary driver support and the open source driver is waaaay worse. Nvidia proprietary drivers still support VERY old chipsets.

    • by Anonymous Coward

      Torvalds couldn't go off at AMD because they at least opened the source on part of their drivers, which now has a group working to produce an open source version. It still means the official AMD drivers are a pile of sh*t, but it at least means they can "try" to do something better.

    • The open source Radeon driver works just fine, I'm using it for heavy 3D work right now. Not the case with NVidia. Linus had every reason to flip NVidia the bird, especially considering NVidia's ambition to win bags of gold selling Android chipsets.

      • by Anonymous Coward

        The open source Radeon driver works just fine

        So not supporting OpenGL features which the hardware is capable of and running with abysmal performance is "just fine"? The performance of the open source Radeon drivers is utter crap and something that should actually shame AMD. But they don't give a shit.

        And guess what, Android systems won't be running any open source OpenGL drivers anytime soon, regardless of NVidia.
        It's galling that the only way to get good 3D performance is to run NVidia/Linux instead of GNU/Linux, with a proprietary black blob three t

        • I'm getting 75 million Phong shaded triangles/second at 1920x1200 out of a 6450 running the Xorg Radeon driver. What's not good about that? Note: that's a fanless $50 card.

        • So not supporting OpenGL features which the hardware is capable of and running with abysmal performance is "just fine"? The performance of the open source Radeon drivers is utter crap and something that should actually shame AMD. But they don't give a shit.

          The open source driver is not meant to be better than the blob in terms of performance. It's meant to evolve much slower, but to support older cards much longer and to work more closely with open source projects. Fglrx is often slow to react to newer Xorg ABIs, for example, and in such cases radeon is there to pick up the slack. It isn't perfect when running any modern card, but it gives you a desktop, good video playback and all. Gaming is still far behind, but it's very possible to play a lot of older tit

      • Piss on your open source radeon driver. It has never supported the X1250 graphics in my R690M chipset correctly in spite of the core being ancient. The graphics corruption I've encountered with every try to date is actually worse than before in more recent versions — just tried it today with Precise. And of course, this core is too "old" to be supported by fglrx, which was true the day I bought the system brand new at the store.

        AMD linux support is ass unless you happen to have one of the few cards th

      • by wrook ( 134116 )

        I have an HD6950 which has relatively good horsepower, I guess. I had been using the proprietary driver simply because that's what my distro set up and I've been too lazy to change it. I don't really play games all that often, but I have one or two kicking around. As you mentioned this, I decided to try out the open source driver.

        On the plus side, the performance of 2D and video is actually quite a lot better than the proprietary driver. Everything is quick and smooth and no tearing. I had to fiddle wi

    • by GrumpyOldMan ( 140072 ) on Friday June 22, 2012 @05:14PM (#40417119)

      +1 I had an ATI in my last Linux desktop. Never again.

      The proprietary fglrx drivers tend to have weird bugs and as you say, they drop chips that are old enough to have decent support. On the flip-side, the open-source radeon drivers tend to require various bleeding edge bits and pieces to work correctly, so they are nearly impossible to run on stable distros, like an Ubuntu LTS or a RHEL.

      Nvidia's proprietary drivers just work, once you finally figure out how to blacklist nouveau hard enough that it doesn't get loaded via the initrd. Plus they support VDPAU for projects like MythTV and XBMC.
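
      For reference, "blacklisting hard enough" usually boils down to a modprobe blacklist plus rebuilding the initrd so nouveau isn't pulled in at early boot. A minimal sketch, assuming a Debian/Ubuntu-style setup (commands differ on other distros):

        # /etc/modprobe.d/blacklist-nouveau.conf
        blacklist nouveau
        options nouveau modeset=0
        # then rebuild the initramfs so the blacklist applies at early boot:
        #   sudo update-initramfs -u      (Debian/Ubuntu)
        #   sudo dracut --force           (Fedora/RHEL)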

    • Goodbye Nvidia, I won't miss you. For years I've stuck with Nvidia because everyone said they are the best choice for Linux, but I've hated every minute of it. They were a constant annoyance for the following reasons.
      1. Every time you build a kernel you have to download the latest Nvidia driver and install it separately.
      2. Want to try a release candidate kernel? Nope, forget it. Just to get the damn thing to install you have to reverse engineer their installation scripts, and even after you get that working, t
    • by ianare ( 1132971 )

      Funny you should mention that right when AMD wins a huge order [phoronix.com] of graphics chips precisely because they have open source drivers.

      And anecdotally, I've never had a problem with AMD hardware, generally by the time the proprietary driver loses support, the open source one matches its performance.

  • by Anonymous Coward

    This cycle, the latest Nvidia GPUs have a lot going for them. The Kepler series really is impressive, and generally has much lower power consumption than the AMD parts. (This was flipped last gen, funny enough. Those first Fermi cards were heat machines.)

    I picked up a factory OC'd version of the 670, and it's shocking how fast it is. (And how quiet and cool it is.) Remember that moment your system became good enough to run Oblivion at fully maxed settings with really high framerates? Or Morrowind? Yeah, that ju

    • The GTX 670 is probably one of the best cards I have gotten to date. I am happy I got it instead of the GTX 680. Granted, everyone who got a 680 was kicking themselves in the pants when the 670 turned out to be just slightly less powerful than the 680 at a much cheaper price.
  • Seriously, I can't even directly compare Radeon to GeForce cards because they're functionally crippled: game developers are ignoring them in Windows, and the binary driver is completely borked in Linux, so no kudos THERE either. Then there is the issue that every single time I buy one of these cards they spontaneously flame out. Speed isn't everything, and they've been stomped in drivers and support by NVidia for years. Radeons all fail to render graphics properly even to this day and
    • by Mashiki ( 184564 )

      Might have something to do with developers, you know, having a bug up their ass and still developing for consoles. You know, 10-year-old hardware. Nah, couldn't be...

  • ... will work just fine in my Apple Mac Pro! Oh wait.....

    Seriously, this is the kind of boost Apple *should* have been after, since they're now stretching out the upgrade of the Mac Pro until some time in 2013..... They could at least update OS X with an incremental release and start offering this card for the now 2 year old Mac Pro they did a slight CPU speed bump to and called "updated", so there'd be SOME sane reason for people to order one.

    • by Greyfox ( 87712 )
      That's one of the main reasons I've stopped buying Apple Hardware. Having the choice between one crappy three-generation-old ATI video card and a much crappier four-generation-old nvidia card on a $5000 desktop machine wasn't really much of a choice at all. One of the supposedly-nice things about the platform was that you could find at least SOME games for it, kind of the situation we were in back when Loki Games was still supplying games to the Linux community. Except the video card they provided could bar
  • by Leo Sasquatch ( 977162 ) on Friday June 22, 2012 @05:16PM (#40417125)
    OMG, does this mean I can now run Crysis at 80fps instead of 75? F**k me sideways with a spastic badger, my life is now complete.

    Why does anyone care that the two major card makers are still in their dick-waving war? Is it just to keep the review sites in business? Hey, look, another new top-of-the-range GFX card, not totally dissimilar to the one we reviewed last month, only we got it for free, and you'll have to part with some serious wedge if you want to have the same toys as the cool kids!

    There have been no real, serious differences between any of the last dozen iterations of hardware. Anything made in the last couple of years should run any game on the market at full shiniez at decent resolution. It won't, sadly, make the gameplay any better.
    • Because some of us want to see what it's like to play a game with maximum bling enabled?

      My machine still chokes on Alice: Madness Returns, and that's with a 6970.

      • My machine still chokes on Alice: Madness Returns, and that's with a 6970.

        There's something wrong with your machine, then. My machine has a 6870, and that game runs perfectly at 1920x1080 max graphics settings. (CPU is an i5 2500k oc'd to 4.7GHz, with 16GB of RAM). Aside from some idiotic load cues triggering hard drive access in the middle of some of the puzzles (and the game not caching the loaded results so if you turn around and attempt the puzzle again the cues triggers again), no problems at all, but that's bad game design not bad graphics support, and the problem went away

    • Considering that a 680 outclasses the living hell out of a 280, to the point where a 680 is able to turn up graphics higher than a pair of 280's in SLI, your blanket comparison (involving anything, including stuff down at the 640 level, which is still significantly weaker than a single 280) seems unfounded.
    • > Anything made in the last couple of years should run any game on the market at full shiniez at decent resolution.
      Sigh. Clearly your video game diet is fairly bland. Try one of the DCS combat simulators (A-10C, Ka-50, P-51D), or even Armed Assault II, and you'll quickly notice the difference between a high-end video card and a run-of-the-mill one. Just because the 'mainstream' is designed as twitch games that fight on maps the size of postage stamps doesn't mean all games/simulations are like this. I
    • I agree, and $500? Wow. It is nice to run games at a decent frame rate; unfortunately for Radeon users, their driver support seems to be falling behind their hardware by quite a lot. My Nvidia 560 Ti runs my favorite game, Tribes: Ascend, with full bling, while Radeon users of higher-end cards report all sorts of issues.

      BTW the new Tribes is F2P... Join me https://account.hirezstudios.com/tribesascend/?referral=1207516&utm_campaign=email [hirezstudios.com] for anyone who remembers Tribes 1 and Tribes 2, you know why you should

    • by IorDMUX ( 870522 )

      Why does anyone care that the two major card makers are still in their dick-waving war?

      Because people buy the stuff they make (I dunno who, but their top-line stuff makes money somehow), and because, once they've replaced it with something newer and better, the price drops fast so the rest of us can build a nice, inexpensive gaming machine.

    • Mostly but not entirely. I currently run an old hand-me-down GeForce 8800 GTS, and while its processor is certainly powerful enough to calculate the scene at my native resolution, it has an entirely different problem that makes newer games (say, Far Cry 2) run horribly on higher settings: 320 megabytes of RAM. You can have all the power you want in your GPU core, but it all amounts to nothing if anything but the lowest settings induces noticeable pumping as the core spends most of its time moving around data be
    • A continuous 5% more dick-waving each time means upgrades are worth it after 3 or so years. Assuming the drivers aren't tweaked. It keeps the industry healthy.

  • But how many bitcoin hashes per second can it do???
  • Why would the "reference board [run] hot and loud"? AMD has the engineers to develop this stunning achievement, but not the engineers or the time or money to indicate a recommended cooling solution?
    • ...because with the latest top-end GPUs, "proper" cooling is very expensive... even the mid-range cards are using multiple fans these days...

      From what I am reading, the original 7970 drew 40 amps / 210W TDP at reference clocks and upwards of 100 amps when OC'd, and because this new 7970 is basically OC'd...
  • Wake me up when their drivers are fast AND correct.
  • I enjoy reading the articles posted on SemiAccurate.com [semiaccurate.com] about AMD, nVidia, Intel, etc. Most of the articles are by two writers, and the most entertainingly acerbic ones are by Charlie Demerjian (I'll call him "CD").

    Five months ago, CD thought nVidia was going to crush AMD on the high end:

    http://semiaccurate.com/2012/01/19/nvidia-kepler-vs-amd-gcn-has-a-clear-winner/ [semiaccurate.com]

    However, nVidia seemingly can't produce their high-end chips in any useful quantity. So, CD snipes at nVidia about that in his comments about

  • How does this compare to the multi-thousand dollar CAD video cards?
    • How does this compare to the multi-thousand dollar CAD video cards?

      Consumer GPUs are fast as piss now, but they may not have the ability to drive really fancy high-end monitors.

  • It's embarrassing to see comments about how you should never use ATI-badged video cards in a Linux box, only to go home and watch my creaky-old 4550 run not only just fine, but also play 1080p video and render 3D while driving two monitors. Let me rephrase that: YOU should be embarrassed to make comments about how you should never use ATI-badged video cards yada yada yada...

    Ditto for the "but I need proprietary drivers for Nvidia" crap. Nouveau drivers are getting just as good. Have a POS AGP FX5200? N

  • Best bang for the buck should be the GeForce GTX 560 for now. About $180 and lots faster than the 7770.
