
AMD Radeon HD 7970 Launched, Fastest GPU Tested

Posted by timothy
from the that's-rent-in-some-towns dept.
MojoKid writes "Rumors of AMD's Southern Islands family of graphics processors have circulated for some time, but today AMD is officially announcing its latest flagship single-GPU graphics card, the Radeon HD 7970. AMD's new Tahiti GPU is outfitted with 2,048 stream processors at a 925MHz engine clock, features AMD's Graphics Core Next architecture, and is paired with 3GB of GDDR5 memory on a 384-bit memory bus. And yes, it's crazy fast, as you'd expect, and supports DX11.1 rendering. In the benchmarks, the new Radeon HD 7970 bests NVIDIA's fastest single-GPU card, the GeForce GTX 580, by a comfortable margin of 15-20 percent, and can even approach some dual-GPU configurations in certain tests." PC Perspective has a similarly positive writeup. There are people who will pay $549 for a video card, and others who are just glad that the technology drags the low-end offerings along, too.
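For a rough sense of scale, the raw compute those specs imply can be worked out directly. This is a back-of-the-envelope sketch, assuming the conventional 2 FLOPs per stream processor per clock (one fused multiply-add), which is how GPU peak-throughput figures are usually quoted:

```python
# Peak single-precision FP32 throughput for the Radeon HD 7970,
# using the specs from the summary: 2,048 stream processors at 925 MHz.
# Assumption: 2 FLOPs per stream processor per clock (fused multiply-add).

stream_processors = 2048
engine_clock_hz = 925e6

peak_flops = stream_processors * engine_clock_hz * 2  # FMA = 2 FLOPs/clock
print(f"Peak single-precision: {peak_flops / 1e12:.2f} TFLOPS")
```

That works out to roughly 3.79 TFLOPS of single-precision throughput, which gives a concrete sense of the jump over the previous generation.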
  • Overpowerful. (Score:5, Interesting)

    by unity100 (970058) on Thursday December 22, 2011 @01:28PM (#38461068) Homepage Journal
    Because console gaming is holding PCs back.

    I'm on a single Radeon 6950 (unlocked to a 6970 by BIOS flash), and I'm running 5040x1050 (3-monitor Eyefinity) in SWTOR (The Old Republic), all settings maxed, at 30-40 fps on average and 25+ fps on Coruscant (Coruscant is waaaaaaay too big).

    Same for Skyrim. I even run extra graphics mods on Skyrim: FXAA injector (injected bloom into the game) and so on.

    So the top GPU of the existing generation (before any idiot jumps in to talk about the 6990 being the top offering from ATI, I'll let you know that the 6990 is two 6970s in CrossFire, and the 6950 is just the 6970 GPU with 38 or so shaders locked down via BIOS and underclocked; ALL are the same chip) is not only able to play the newest graphics-heavy games at max settings BUT can also do it at 3-monitor Eyefinity resolution.

    One word: consoles. Optional word: retarding.
  • by Lumpy (12016) on Thursday December 22, 2011 @01:56PM (#38461390) Homepage

    Nope. Bang for buck, this new card thoroughly kicks the butt of the workstation cards.

  • What bugs me most (Score:3, Interesting)

    by Psicopatico (1005433) <psicopatico AT a ... DOT zzn DOT com> on Thursday December 22, 2011 @02:02PM (#38461466)

    Why do card manufacturers use (rightfully) new manufacturing processes (28nm transistors) only to push higher performance?

    Why the hell don't they re-issue, say, an 8800GT on the newer process, getting a fraction of the original power consumption and heat dissipation?
    *That* would be a card I'd buy in a snap.
    Until then, I'm happy with my faithful card from 2006.

  • Re:Overpowerful. (Score:5, Interesting)

    by anonymov (1768712) on Thursday December 22, 2011 @02:34PM (#38461926)

    You keep talking about "research"; maybe _you_ would care to provide the research that shows "24 fps should be enough for everyone"? (Hint: it isn't, and that's the reason for the current push toward 50p/60p/72p film and television.)

    Why, you can just go to http://frames-per-second.appspot.com/ [appspot.com] and tell us "I don't see any difference." And then we'll just tell you to visit your eye doctor.

  • by billcopc (196330) <vrillco@yahoo.com> on Thursday December 22, 2011 @02:37PM (#38461960) Homepage

    Pretty sure today's mid-range PCs trounce 2007's high-end with ease.

    Just for shits, when I got my current rig a couple of years ago, I played through Crysis again. On a single GTX 260, it was butter smooth at 1680x1050. When I switched to quad-SLI 295s, it was butter smooth in triple-wide surround.

    People who continue to claim Crysis is an unoptimized mess are:

    - not programmers
    - not owners of high-end hardware

    Could it be improved? Sure. Is it the worst-optimized game of the 21st century? FUCK NO, not even close, and subsequent patches greatly improved the situation.

  • Re:Overpowerful. (Score:4, Interesting)

    by ponos (122721) on Thursday December 22, 2011 @04:41PM (#38464236)

    You silly newb. HDMI supports 24fps for compatibility reasons, and the original decision was probably based on a quality-cost tradeoff back in the days when actual film was used and the NTSC/PAL specifications were defined; shooting at 60fps would mean the film stock lasted half as long, for example. There is the famous "notion" that eyes cannot see over 24fps, but in fact eyes are very sensitive to some kinds of motion, color and contrast and less sensitive to others, so you cannot generalise that 24fps is "enough" for all kinds of motion, images and people (yes, people are different too). Furthermore, even if the above were not true, you need an average of at least 50-60 fps in most games to ensure that the MINIMUM does not drop below 30fps, which is not only visible but also implies a between-frame interval of about 33ms (plus ping, plus input lag, plus keyboard lag, etc.). In hardcore-land this means PWNAGE for you and your silly rig.
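The frame-time arithmetic behind that point is simple to spell out. A minimal sketch, using the frame rates mentioned in the thread:

```python
# Frame interval for a given frame rate: interval_ms = 1000 / fps.
def frame_interval_ms(fps):
    return 1000.0 / fps

for fps in (24, 30, 60):
    print(f"{fps} fps -> {frame_interval_ms(fps):.1f} ms between frames")

# 30 fps means about 33.3 ms between frames before any ping or input lag
# is added, which is why a 50-60 fps *average* matters: it leaves headroom
# so that momentary dips still stay at or above 30 fps.
```

Note that the interval scales inversely with the frame rate, so every dropped frame at 30 fps costs proportionally more perceived stutter than one at 60 fps.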
