Graphics AMD Games Hardware Technology

AMD Radeon HD 7970 Launched, Fastest GPU Tested

MojoKid writes "Rumors of AMD's Southern Islands family of graphics processors have circulated for some time, though today AMD is officially announcing their latest flagship single-GPU graphics card, the Radeon HD 7970. AMD's new Tahiti GPU is outfitted with 2,048 stream processors at a 925MHz engine clock, features AMD's Graphics Core Next architecture, and is paired with 3GB of GDDR5 memory connected over a 384-bit wide memory bus. And yes, it's crazy fast, as you'd expect, and supports DX11.1 rendering. In the benchmarks, the new Radeon HD 7970 bests NVIDIA's fastest single-GPU card, the GeForce GTX 580, by a comfortable margin of 15 to 20 percent and can even approach some dual-GPU configurations in certain tests." PC Perspective has a similarly positive writeup. There are people who will pay $549 for a video card, and others who are just glad that the technology drags along the low-end offerings, too.
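
As a quick sanity check on those numbers, here is a back-of-envelope sketch (Python) of the card's theoretical peak throughput using only the figures quoted in the summary; the 5.5 Gbps GDDR5 data rate is an assumption based on commonly reported launch specs, not something stated above, so treat the bandwidth line as approximate.

    # Back-of-envelope math from the summary's numbers.
    STREAM_PROCESSORS = 2048        # from the summary
    ENGINE_CLOCK_GHZ = 0.925        # 925MHz engine clock
    FLOPS_PER_SP_PER_CLOCK = 2      # a fused multiply-add counts as two FLOPs
    BUS_WIDTH_BITS = 384            # from the summary
    GDDR5_GBPS_PER_PIN = 5.5        # assumed effective data rate, not stated above

    peak_sp_tflops = STREAM_PROCESSORS * FLOPS_PER_SP_PER_CLOCK * ENGINE_CLOCK_GHZ / 1000
    mem_bandwidth_gbs = BUS_WIDTH_BITS / 8 * GDDR5_GBPS_PER_PIN

    print(f"Peak single-precision: ~{peak_sp_tflops:.2f} TFLOPS")   # ~3.79 TFLOPS
    print(f"Memory bandwidth:      ~{mem_bandwidth_gbs:.0f} GB/s")  # ~264 GB/s
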
  • by Anonymous Coward
    ...if most PC games weren't just shitty console ports these days. If you spend over $150 on a graphics card you're an idiot.
    • by Hatta ( 162192 ) on Thursday December 22, 2011 @12:29PM (#38461080) Journal

      Hush. Those idiots finance the advance of technology.

      • Hopefully the prices on the 5000 and 6000 series start dropping after Christmas. My 4670s are starting to show their age...
        • I was actually hoping they'd come out with a 7850 or similar soon. My 5770 is still pretty strong, but I wouldn't mind an upgrade soon, and I like to have all the newest features (I can do without the top speed).
        • You've been able to get a 6850 for darn near $100 for some time now. It's a pretty good deal unless you only buy high-end GPUs, which honestly seems like a waste these days. My 4-year-old 8800GT is just now starting to feel inadequate. I'm looking for the second coming of the 8800GT to emerge from this generation so I can hold on to it for another 4 years.
        • This [bensoutlet.com] is a pretty good deal for a really good card (5870). Faster than the 6870, but more of an energy hog.
    • by Lumpy ( 12016 ) on Thursday December 22, 2011 @12:37PM (#38461170) Homepage

      Says the idiot that only uses a PC for gaming.

      Adobe After Effects will use the GPU for rendering and image processing.

      • Aren't you way better off with a workstation card for most workstation loads? From what I've read, a GTX or ATI HD makes for a poor CAD or Adobe machine.

        • by Lumpy ( 12016 ) on Thursday December 22, 2011 @12:56PM (#38461390) Homepage

          Nope. Bang for buck, this new card kicks the butt of the workstation cards, hard.

        • by billcopc ( 196330 ) <vrillco@yahoo.com> on Thursday December 22, 2011 @01:24PM (#38461744) Homepage

          Depends on the type of processing. GTX and Radeon cards artificially limit their double-precision performance to 1/4 of their capabilities, to protect the high-margin workstation SKUs. If all you're doing is single-precision math, you're fine with a gaming card.
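
          To make the parent's point concrete, here is a small illustrative sketch (Python); the 1/4 cap is the parent poster's claim and the peak numbers are hypothetical, not official specs for any particular card.

              # Illustrative only: how an artificial DP cap changes effective throughput.
              def effective_dp_tflops(sp_peak_tflops, native_dp_ratio, cap=1.0):
                  # DP rate = SP peak * the chip's native DP ratio, reduced further by any cap
                  return sp_peak_tflops * native_dp_ratio * cap

              sp_peak = 3.8                                              # hypothetical SP peak (TFLOPS)
              workstation = effective_dp_tflops(sp_peak, 1/4)            # full native DP rate
              gaming = effective_dp_tflops(sp_peak, 1/4, cap=1/4)        # throttled to 1/4, per the parent

              print(f"Workstation SKU, DP: ~{workstation:.2f} TFLOPS")   # ~0.95
              print(f"Gaming SKU, DP:      ~{gaming:.2f} TFLOPS")        # ~0.24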

          • by Bengie ( 1121981 )

            For now anyway. MS is looking at Double Precision becoming the standard for some future DirectX. That's probably still a few years off.

          • I'm still working with integer*-based rendering engines, you insensitive clod!


            *16-bit integers. None of this newfangled 32-bit garbage kids play with these days.
          • by Lumpy ( 12016 )

            And even that can be fixed; the limit is in the firmware. I have a PC ATI card in my PPC Mac running workstation firmware that unlocked some serious processing power. This was back when ATI video cards for the quad-core G5 were priced at highway-robbery levels and the exact same hardware for the PC was going for $199.00. That system utterly screamed running Shake and After Effects back in 2007-2008.

    • I think the issue is that most game graphics have reached a peak with the current rendering technology, where you need exponential work (in manpower) to get a linear improvement.

      Black and white text... All fine and good until we need a graph.
      Black and white graphics... Now if only it could do color.
      CGA... What bad colors.
      EGA... Looking a lot better; if only we could get some shading and skin tones.
      VGA... Enough visible colors to make realistic pictures. But a higher resolution will make it better.
      SVGA...
      • by Bengie ( 1121981 ) on Thursday December 22, 2011 @01:47PM (#38462150)

        Ray Tracing!!!

        We're also capped right now because of too many single-threaded game engines. A given thread can only push so many objects to the GPU at a time. Civ5 and BF3, being among the first games to make use of deferred contexts and other DX11 multi-threading abilities, can have lots of objects on the screen with decent FPS.

        The biggest issue I have with nearly all of my games is my ATI6950 is at 20%-60% load and only getting sub 60fps, while my CPU has one core pegged. My FPS isn't CPU limited, it's thread limited.
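
        A toy model of that situation, with completely made-up timings (this is a sketch of the bottleneck, not a measurement of any real engine):

            # Toy model: frame rate when draw-call submission is single-threaded.
            def frame_stats(cpu_submit_ms, gpu_render_ms, submit_threads=1):
                # Assume submission work splits evenly across threads and the GPU
                # overlaps with the CPU, so the slower side sets the frame time.
                cpu_ms = cpu_submit_ms / submit_threads
                frame_ms = max(cpu_ms, gpu_render_ms)
                return 1000.0 / frame_ms, gpu_render_ms / frame_ms

            for threads in (1, 2, 4):
                fps, gpu_load = frame_stats(cpu_submit_ms=25.0, gpu_render_ms=10.0,
                                            submit_threads=threads)
                print(f"{threads} submit thread(s): ~{fps:.0f} fps, GPU ~{gpu_load:.0%} busy")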

    • by durrr ( 1316311 )
      It's the fastest GPU in the known universe! Surely it has to be worth something!
  • I rebuild my machines every two years. My previous rig couldn't do Crysis at max settings, so my latest system has dual 5870s that I got for $400 a piece. I'll never splurge like that on video cards again. Then again, two years later, I still max out the sliders on every game I get. It's great to have that kind of computing power... but maybe I should have waited 6 months? Those cards are going for $150 today.

    • Pretty sure most systems can't run Crysis perfectly at max settings, simply because Crysis is one of the worst-optimized games ever developed.
      • by billcopc ( 196330 ) <vrillco@yahoo.com> on Thursday December 22, 2011 @01:37PM (#38461960) Homepage

        Pretty sure today's mid-range PCs trounce 2007's high-end with ease.

        Just for shits, when I got my current rig a couple of years ago, I played through Crysis again. On a single GTX260, it was butter-smooth at 1680x1050. When I switched to quad-SLI 295s, it was butter-smooth in triple-wide surround.

        People who continue to claim Crysis is an unoptimized mess are:

        - not programmers
        - not owners of high-end hardware

        Could it be improved? Sure. Is it the worst-optimized game of the 21st century? FUCK NO, not even close, and subsequent patches greatly improved the situation.

    • by Luckyo ( 1726890 )

      In perfect honesty, it's better to buy a single powerful card (to avoid early problems in games) in the $200-250 range, and upgrade every couple of years. Cheaper, and you should be able to max or nearly max all games that come out during the lifetime of the card.

      Obvious exceptions are the extreme resolutions, 3D vision and multi-monitor gameplay.

  • Overpowerful. (Score:5, Interesting)

    by unity100 ( 970058 ) on Thursday December 22, 2011 @12:28PM (#38461068) Homepage Journal
    Due to console gaming retarding PCs.

    I'm on a single Radeon 6950 (unlocked to a 6970 by BIOS flash), and I am doing 5040x1050 res (3-monitor Eyefinity) on SWTOR (The Old Republic), all settings full, with 30-40 fps on average and 25 fps+ on Coruscant (Coruscant is waaaaaaay too big). A quick pixel-count sketch of that resolution follows below.

    Same for Skyrim. I even have extra graphics mods on Skyrim: FXAA injector, etc. (injected bloom into the game), this and that.

    So the top GPU of the existing generation (before any idiot jumps in to talk about the 6990 being the top offering from ATI, I'll let you know that the 6990 is 2 6970s in CrossFire, and the 6950 GPU is just the 6970 GPU with 38 or so shaders locked down via BIOS and underclocked - ALL are the same chip) is not only able to play the newest graphics-heavy games at max settings BUT also do it at 3-monitor Eyefinity resolution.

    One word: consoles. Optional word: retarding.
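
    A quick pixel-count comparison (Python) for the Eyefinity setup described above, just to put the workload in perspective:

        eyefinity = 5040 * 1050       # three 1680x1050 panels side by side
        single_1080p = 1920 * 1080

        print(f"5040x1050: {eyefinity / 1e6:.2f} MP")      # ~5.29 MP
        print(f"1920x1080: {single_1080p / 1e6:.2f} MP")   # ~2.07 MP
        print(f"Ratio: ~{eyefinity / single_1080p:.1f}x")  # ~2.6x the pixels of one 1080p screen
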
    • Re:Overpowerful. (Score:5, Insightful)

      by parlancex ( 1322105 ) on Thursday December 22, 2011 @12:45PM (#38461254)

      ... 30-40 fps on average, and 25 fps+ on Coruscant (Coruscant is waaaaaaay too big). Same for Skyrim...

      Looks like PCs aren't the only thing gaming consoles have been retarding. Most PC gamers would have considered 25 fps nearly unplayable, and 30-40 fps highly undesirable, before the proliferation of poor frame rates in modern console games. There are still many of us who are unsatisfied with that level of performance, but are unwilling to compromise graphics quality.

      • Yes, it's really a shame that 30 fps became an acceptable framerate for games nowadays, thanks to crappy underpowered consoles.

        Funny, though, that back in 1999 (Dreamcast days) any console game that didn't run at a solid 60 fps was considered a potential flop.

        This framerate crap is one of the many reasons I'll never go back to console gaming.

        Times change, no?

        • by Bengie ( 1121981 )

          I used to play software-rendered Quake at 320x200, 8-bit color, and sub-20 fps.

          This is what it feels like to play on consoles when coming from a PC: flat lighting, crappy models, poor special effects, and an FPS that makes it feel like I'm roller skating with a strobe light.

      • Depends on the game.

        Back in the '90s, many top-selling PC games ran at 30 or even 15 fps, and were perfectly playable. I played the fuck out of Doom and Quake at then-acceptable framerates, which today would be considered slideshows. I sometimes play WoW on my laptop, where it can drop to 20-25 fps during intense fights, and it's just fine.

        The only places where absurdly high framerates are mandatory are fast-paced shooters like Quake 3/4, Call of Duty, Team Fortress, etc. Racing titles also benefit from a

    • Consoles support 5040x1050? Color me surprised.

      • by Junta ( 36770 )

        His point is that game developers are so conservative about pushing graphical complexity that they don't even produce a workload that remotely challenges modern top-end cards. He attributes this to developers targeting weaker consoles. I think it's just that they want a slightly larger market than those willing to shell out $600 a year on graphics cards *alone*, regardless of game consoles. Now, to push framerates down to a point where it looks like things matter, they have to turn

    • by Warma ( 1220342 )

      This is very much a matter of preference. I feel that 25 fps is just flat-out unplayable and anything under 60 is distracting and annoying. I believe that most gamers would agree. I always cut detail and effects to get 60 fps, even if this means the game will look like shit. Framerate is life.

      So no, based on your description, the top of the previous generation is NOT able to play those games in the environment you defined. You would need around twice the GPU power for that. The benchmarks suggest that 7970 won't

      • by Jeremi ( 14640 )

        This is very much a matter of preference. I feel that 25 fps is just flat-out unplayable and anything under 60 is distracting and annoying.

        It's partially a matter of what you are accustomed to. I remember playing Arcticfox on a 386/EGA at about 4 FPS and thinking it was awesome, because compared to the other games available at the time, it was.

  • and others who are just glad that the technology drags along the low-end offerings, too

    Has the advance of high-end NV and AMD GPUs dragged along the Intel IGP in any way, shape, or form?

    • Not by much, but it does mean that $50 for an AMD or NV card will get you more power.
    • Intel Sandy Bridge graphics are enough for most things, and Ivy Bridge is supposed to increase performance by another 20%.

  • Bitcoin (Score:5, Funny)

    by Anonymous Coward on Thursday December 22, 2011 @12:32PM (#38461122)

    Yes yes.. Rendering yada yada. How many Mhash/s does it average when bitcoin mining? And what is the Mhash/Joule ratio?
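
    For anyone who actually wants the Mhash/Joule figure, it is just hash rate divided by power draw; the numbers below are placeholders, not measurements of this card.

        # Mining efficiency: MH/s divided by watts (J/s) leaves MH per joule.
        def mhash_per_joule(mhash_per_s, watts):
            return mhash_per_s / watts

        print(f"{mhash_per_joule(550.0, 250.0):.2f} MH/J")   # placeholder numbers -> 2.20 MH/J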

    • by Guppy ( 12314 )

      I've never been interested in Bitcoin mining, but as it becomes less worthwhile, I'm hoping it will depress prices on the used graphics card market, as former miners liquidate their rigs.

      • Not to speculate too much but we've probably passed the peak of mining hardware being listed for sale. Due to difficulty decreases and recently increasing prices, you may see increasing prices on GPUs in the next month or two.
    • by Kjella ( 173770 )

      Sadly, I know the answer to this, as apparently someone asked in all seriousness. The new GCN architecture is better for compute in general, but worse for Bitcoin as it switches from VLIW to a SIMD architecture. But please buy one and eBay it for cheap afterwards all the same ;)

  • by chill ( 34294 ) on Thursday December 22, 2011 @12:34PM (#38461134) Journal

    What is the state of Linux drivers for AMD graphics cards? I haven't checked in a few years, since the closed-source nVidia ones provide for excellent 3D performance and I'm happy with that.

    But I'm in the market for a new graphics card and wonder if I can look at AMD/ATI again.

    No, I'm not willing to install Windows for the one or two games I play. For something like Enemy Territory: Quake Wars (modified Quake 3 engine), how does AMD stack up on Linux?
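
    One quick way to see which GL driver a Linux box is actually using is to parse glxinfo (from mesa-utils); a rough Python sketch is below. It only reports the currently active driver and says nothing about how a new card would perform, so take it as a starting point only.

        import subprocess

        def gl_driver_info():
            # Pull out the handful of glxinfo lines that identify the driver in use.
            out = subprocess.run(["glxinfo"], capture_output=True, text=True,
                                 check=True).stdout
            wanted = ("direct rendering", "OpenGL vendor string",
                      "OpenGL renderer string", "OpenGL version string")
            return {k.strip(): v.strip()
                    for k, _, v in (line.partition(":") for line in out.splitlines())
                    if k.strip() in wanted}

        if __name__ == "__main__":
            for key, value in gl_driver_info().items():
                print(f"{key}: {value}")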

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      They suck just like they always have. But don't feel left out, they suck on Windows as well.

      ATI/AMD may at times make the fastest hardware, but their Achilles' heel has been, and apparently always will be, their sucky drivers. The hardware is no good if you can't use it.

      They need to stop letting hardware engineers write their drivers and get some people who know what they are doing in there. They need solid drivers for Windows, Linux, and a good OpenGL implementation. Until then they can never be taken seriousl

    • Quake 3 probably doesn't need a top-of-the-line graphics card. Go with nVidia; for years that has been the best move if you think you may use Linux at some point.

      $30 should get you a card that maxes out anything Quake 3.

    • The almost-but-not-quite-latest card is generally fairly well supported by fglrx. If your card is old enough to be supported by the open-source ati driver then it may work, but it probably won't support all its features. You're far better off with nvidia if you want to do gaming.

      Every third card or so I try another AMD card, and immediately wish I hadn't. Save yourself.

    • How is this modded as insightful?
      What do Linux drivers have to do with this card? How are Linux users in any way the target market for a high-end enthusiast GAMING graphics card?

      Perhaps once you can purchase BF3 or the like for Linux, then ATI and NV will spend more time writing drivers for Linux.

      I cannot imagine that anything more than an older HD48xx series will help you in any way.

      • Re:I don't get it. (Score:5, Insightful)

        by chill ( 34294 ) on Thursday December 22, 2011 @01:47PM (#38462162) Journal

        Because it was a question that people other than just me were curious about?

        Did you read the entire post? Or did your head just explode when seeing "Linux" in a gaming thread?

        nVidia already spends time on quality Linux graphics drivers. They run fine on both 32-bit and native 64-bit Linux systems. I was wondering if the AMD/ATI stuff had matured as well is all.

        Take a valium and go back to getting your ass n00bed by 10-year-olds on BF or MW.

    • by div_2n ( 525075 )

      The closed drivers have serious quality issues with major regressions seemingly every other release.

      The open drivers are making great strides, but the performance isn't there yet for newer cards. If you are using a pre-HD series card, you'll find pretty decent performance that often beats the closed driver.

      Based on the progress I've seen over the last year, I would expect the performance for this new series of cards to be acceptable in a year or so for the simple fact that as they finish the code for older

    • Re: (Score:3, Informative)

      by karolbe ( 1661263 )
      The state of ATI/AMD drivers on Linux is rather poor, much worse than nVidia's. My recommendation is to stay away from AMD GPUs if you plan to use Linux. If you are looking for more details about AMD & Linux, read this article on Phoronix: http://www.phoronix.com/scan.php?page=article&item=amd_ayir_2011&num=1 [phoronix.com]
  • What bugs me most (Score:3, Interesting)

    by Psicopatico ( 1005433 ) on Thursday December 22, 2011 @01:02PM (#38461466)

    Why do card manufacturers (rightfully) adopt new manufacturing processes (28nm transistors) only to push higher performance?

    Why the hell don't they re-issue a, say, 8800GT with the newer technology, getting a fraction of the original power consumption and heat dissipation?
    *That* would be a card I'd buy in a snap.
    Until then, I'm happy with my faithful 2006 card.

    • by jandrese ( 485 ) <kensama@vt.edu> on Thursday December 22, 2011 @01:23PM (#38461724) Homepage Journal
      The problem is that speed is only one part of the equation. That 8800GT only supports DX10.0. DX10.1 games may run, but you'll find them crashing after a while unless the developer was very careful (they were not). DX11 games won't work at all.

      You're much better off with a modern card that just has fewer execution units if you want to save money. They won't be out right away (the first release is always near the top end), but they will eventually show up. Since you're worried about saving money/power, you don't want to be an early adopter anyway. Oftentimes the very first releases will have worse power/performance ratios than the respins of the same board a few months down the road.
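
      To put the compatibility point in code form, here is a rough sketch (Python); the lookup table is illustrative, with the 8800GT's DX10.0 limit taken from the parent and the 7970's DX11.1 support from the summary.

          # A card's maximum Direct3D feature level gates which games run at all,
          # regardless of raw speed. Illustrative table, not vendor data.
          MAX_FEATURE_LEVEL = {
              "GeForce 8800GT": (10, 0),
              "Radeon HD 7970": (11, 1),
          }

          def can_run(card, required_level):
              return MAX_FEATURE_LEVEL[card] >= required_level   # tuple comparison

          print(can_run("GeForce 8800GT", (11, 0)))   # False: no amount of speed helps
          print(can_run("Radeon HD 7970", (11, 0)))   # True
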
  • But can it run Unity on two screens without lag? I suspect that whatever video card I buy, the modern Linux dualhead display will feel slower than it did in 2005 :-/
  • by Rinisari ( 521266 ) on Thursday December 22, 2011 @03:44PM (#38464270) Homepage Journal

    What's the Bitcoin Mhash/sec?
