AMD Radeon HD 7970 GHz Edition: Taking Back the Crown 132
An anonymous reader writes "The benchmarks are in for the Radeon HD 7970 GHz Edition. Starting at $500, AMD's new single-GPU flagship boosts the original 7970's clock speed from 925 MHz to 1 GHz (1050 MHz with Boost). The GHz Edition also sports 3 GB of faster 1500 MHz GDDR5 memory, pushing 288 GB/s as opposed to 264 GB/s. While the AMD reference board runs hot and loud, retail boards will use different cooling solutions. A simple test of aftermarket GPU coolers shows that any other option will shave degrees and slash decibels. But it's the Catalyst 12.7 beta driver that really steals the show for AMD, pushing FPS scores into overdrive. With the new Catalyst, Nvidia's GeForce GTX 670 can no longer beat the original Radeon HD 7970, and the GHz Edition outmaneuvers the GeForce GTX 680 in most cases. However, when factoring price and possible overclocking into the equation, the original Radeon HD 7970 and GeForce GTX 670 remain the high-end graphics cards to beat."
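The quoted bandwidth figures are easy to sanity-check: GDDR5 transfers data four times per memory clock, and the numbers line up if you assume the 7970's reference 384-bit memory bus (an assumption here, but one the 264 GB/s figure itself implies). A quick sketch:

```python
def gddr5_bandwidth_gb_s(mem_clock_mhz, bus_width_bits):
    """Peak bandwidth in GB/s for a GDDR5 memory interface.

    GDDR5 moves data four times per memory clock (quad data rate),
    so the effective transfer rate is clock * 4; divide bits by 8
    to get bytes.
    """
    transfers_per_s = mem_clock_mhz * 1e6 * 4
    return transfers_per_s * bus_width_bits / 8 / 1e9

print(gddr5_bandwidth_gb_s(1500, 384))  # GHz Edition: 288.0
print(gddr5_bandwidth_gb_s(1375, 384))  # original 7970: 264.0
```

The 1375 MHz figure for the original card is back-derived from its quoted 264 GB/s, not stated in the summary.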
X2 (Score:2)
Where's my X2 edition?
Re: (Score:1)
Re:X2 (Score:4, Interesting)
Forget X2. I want an All-In-Wonder version.
Completely unrelated /. trivia, but I just noticed the captcha isn't required at all for posting if logging in at the same time.
Re: (Score:2)
X2 All-In-Wonder edition.
Yes, I think that will do it.
Re: (Score:2)
Lol. HDMI in, HDMI out. You can't explain that! ;-)
Re: (Score:2)
HDMI in is verboten on consumer recording equipment.
Re: (Score:2)
Fine...then I'll take a Displayport in, and this handy HDMI-to-Displayport adapter.
Re: (Score:2)
Verboten, too.
The only things allowed are analog inputs, so you have to buy an HDMI-to-analog converter, then digitize its output again. This is what I do (but my converter is limited to 1080i, so it's still painful).
Re: (Score:1)
Re: (Score:1, Flamebait)
Re: (Score:2)
I don't know about ATI, but AMD's Radeon cards have been competitive for a long time.
Re: (Score:2)
Re: (Score:3)
Competitive is an odd word in the hardware business. If you want to spend $1200 on a CPU, does AMD have a competitive offering? How about $500? What's the difference in performance between a $500 part and a $1200 one?
With GPUs, AMD and nVIDIA are pretty close in rendering performance; for specialized tasks (GPU computing), particular hardware may favour one guy over the other. But if 90% of the market is in GPU parts that cost less than $400, whether or not you hold the top spot at
which OTHER ati card should I buy? (Score:3)
Now, I'm an ATI man who's been using TV out since 1995, non-stop. But I'm not willing to throw them so much money, especially when I have to change my entire operating system to accommodate their abandonment of "old" OSes like XP. Man, that jump to 64-bit required updating so many scripts and replacing so many utilities. Don't force change on me and I might give you more money, ATI.
I like stories.
Re: (Score:3)
Must be your cpu (Score:2)
I have an ancient X1650 card with a Q6600 CPU running Windows 7, and 1080p video uses minimal CPU. Your card is not the bottleneck. I'd like a newer card, but when you look at the numbers, the lowest-end Fermi card is still slower than the old 9800 GT series.
Re: (Score:2)
I also had an x1650 with an E6300 (until recently) - ran video at every resolution fine on both WinXP and Win7 x64. However, I'd recommend spending the $20 and upgrading to a Radeon HD 5450... much better performance in Win7 and only draws 18W.
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
If you aren't gaming, just grab a mid-range 5xxx series and you will be fine. You should have very little trouble finding one for less than $80.
If you want more than that, get an HD 6770. You can have 'em for about $115, and it should keep up with any HD video and moderate gaming (as long as you don't expect maxed settings), and should last you another 3-4 years before it's too outdated.
There is no reason to spend more than $200 on a video card unless you are doing hardcore gaming on multiple or high resolution monitors.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Have you thought about putting that money towards something like one of the new integrated graphics options, like Ivy Bridge or the AMD stuff? I can say at the least that Sandy Bridge HD 2000 was sweet, and I can't imagine how sweet IB must be.
It won't match a mid-to-high-end GPU, but it may do better than what you had. Just something to consider when checking out benchmarks.
Re: (Score:2)
Re: (Score:1)
That problem went away in the 5000 series, since they made it all one unit. Sure, some cards are faster at one thing than others, but basically anything from the 5000 level of cards onward is wicked at both, along with OpenCL.
Re: (Score:2)
Re: (Score:2)
I have an old 1950 Pro that, while it might be a bit better than an X1950, shouldn't be all that different. It should also easily be powerful enough for 1080p video, which is not hard at all (less resolution than normal, really). So I would check your driver settings and look for updates or whatever. Modern integrated graphics should handle basic 1080p; a dedicated gaming card, even one from a number of years ago, shouldn't have any problems.
Re: (Score:2)
I'm done with spendy, top of the line cards (Score:3)
All of my expensive, fancy video cards have died, usually right after any kind of warranty expires, and now I'm squeaking by on something with horribly low res, a limited palette, and no hardware acceleration for graphics. But at least it's reliable!
Re:I'm done with spendy, top of the line cards (Score:5, Informative)
Ditto! I kept buying top-end Nvidia cards for CUDA work, only to have them die just after the warranty, usually a year or so in. I dug out an old Nvidia Quadro 285 card from the early 2000s, and am using it again. Also, the 8400 GS I got works just peachy for simple CUDA stuff.
It is like they engineer their top-end cards to fail after a year or so, no matter what. My GTX 280 never went beyond 50 degrees, and was underclocked to boot (I didn't need all the power). Yet it died after a year or so, about as long as the 8800 GTX I had before it.
The Quadro has been in use in some form for more than half a decade, and it still does 99% of what I need (apart from the CUDA stuff; otherwise it would be perfect). Their older stuff seems more solid.
Re: (Score:3)
Ditto! I kept buying top-end Nvidia cards for CUDA work, only to have them die just after the warranty, usually a year or so in. I dug out an old Nvidia Quadro 285 card from the early 2000s, and am using it again. Also, the 8400 GS I got works just peachy for simple CUDA stuff.
It is like they engineer their top-end cards to fail after a year or so, no matter what. My GTX 280 never went beyond 50 degrees, and was underclocked to boot (I didn't need all the power). Yet it died after a year or so, about as long as the 8800 GTX I had before it.
The Quadro has been in use in some form for more than half a decade, and it still does 99% of what I need (apart from the CUDA stuff; otherwise it would be perfect). Their older stuff seems more solid.
My suspicion, after looking at a few cards under a loupe, is that the technology is exceeding the board's ability to host such densely packed, current-hungry, heat-producing electronics. To be able to sell and profit from these units, they are produced rapidly on a robotic assembly line. If they slowed that line down a bit, the failure rate would decline, but they'd rather operate under an acceptable rate of failure (early or late), since the assembly line will be retooled for something else after the run.
Our older card
Re: (Score:3)
You might try heating them in the oven to reflow the solder. It worked for my D820 laptop motherboard.
Re: (Score:2)
Oooh, could you give me some more info? My GTX280 cost a lot of money to me, and seems to have some memory corruption rendering it useless for anything but simple 2D displaying. I can't use it as is, and nobody wants to buy a faulty GPU.
What settings to use? How long to keep it in the oven? Do I need to disassemble it? Any instructions on doing it? It would be excellent if I could get it working again!! It would save me having to splash out another few hundred pounds for a new card :/
Re: (Score:1)
What settings to use?
Learn to use google. You're asking for an essay when the answer is just seconds away.
Re: (Score:2)
I used a heat gun, just on the GPU chip. The oven trick is a little scarier to me. Also, you might even be able to find a lab to do it for real. I had an iPhone where the solder for the battery was damaged, and they were able to repair it.
Re: (Score:2)
http://www.youtube.com/watch?v=oqmn0sDWOYk [youtube.com]
Re: (Score:3, Insightful)
There are differences between the professional lines (Quadro, Tesla, FireGL, FireStream) and the consumer lines (GeForce and Radeon). The professional lines are built by the GPU manufacturers to controlled specs and designed for longer life. The consumer lines are built by OEMs from a reference design, with incentives to push clock speeds and component specs to the limit.
Your experience likely has more to do with the old card being a Quadro than it does with newer cards being more fragile.
Re: (Score:2)
The consumer lines are built by OEMs from a reference design with incentives to push clock speeds and component specs to the limit.
Indeed... and with that in mind, it's not at all silly to buy a card with lower clock rates but the same GPU reference design. You probably wouldn't notice the framerate difference, but you WILL notice the temperature difference, as the highest-clocked cards are always maxing out their fans... even on menu screens.
Re: (Score:1)
Re:I'm done with spendy, top of the line cards (Score:4, Informative)
This is the detailed article explaining why things are the way they are: http://www.geeks3d.com/20100504/tutorial-graphics-cards-voltage-regulator-modules-vrm-explained/2/ [geeks3d.com]
Re:I'm done with spendy, top of the line cards (Score:4, Informative)
I forgot to add: read this one after reading the page I linked: http://www.geeks3d.com/20091209/geforce-gtx-275-vrm-damaged-by-furmark/ [geeks3d.com]
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Your GTX 280 is a fine-pitch BGA, whereas your old Quadro is a gull-wing package.
FPBGAs have practically zero tolerance for board flex. You probably did the fatal damage when you installed the card, and it lay in wait for the time to fail, as moisture got into the cracked ball and corroded it.
I got a handful of people at my office to bring me their video cards from home so I could run them through our high-res X-ray imager (which we use for failure analysis). Every single one had cracked solder balls from board fle
Re: (Score:2)
What card are you running that has limited palette? I haven't had that since I gave up my ISA Trident 9000 card. (For sake of argument I'm considering a 24-bit RGB signal as "unlimited". Consumers aren't going to go worrying about a 10-bit LUT in their hardware).
Well if that happens often (Score:2)
You are fucking something up, most likely your cooling or power. Or I suppose you could just be really unlucky. Regardless, just get a card that has a lifetime warranty. eVGA will sell you one.
Not buying ultra high end cards because they cost too much is a good reason not to buy them. You can get near their performance for much less money. Not buying them because you can't be bothered to build a system with proper power and cooling and do a bit of research to get a longer warranty is a silly reason not to b
Re: (Score:2)
I have made that mistake before but now I spend the extra $20 to get the version that runs the coolest and quietest. That way it doesn't die after a year of use and doesn't sound like a vacuum.
One word, "BITCOIN" (Score:2)
They need to add a benchmark for Bitcoin, since it makes up a lot of the market of high-end graphics card buyers, and AMD is way faster per watt than Nvidia.
Re: (Score:1, Insightful)
For the 3 people that care about shitcoin?
Re: (Score:3)
What is this "a lot of the market"?
You really think more than a tiny percentage of folks use these cards for bitcoin?
Most people prefer to play games with them, instead of entering into pyramid schemes. Cash out while you still can.
No kidding (Score:2)
I think many of these miners need to L2math. One of these cards will run you $500. Running it full bore will draw around 250 watts, so 1 kWh for every 4 hours it runs. You also have to factor in cooling, if you live in a warmer area, and in computer power (and cooling for that) if the machine would normally be off during that time.
Well you need to run the numbers for your own power costs and so on, but that is a lot of mining you have to do to break even depending on what price you can get per bitcoin at a parti
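The parent's back-of-envelope math can be sketched as a simple break-even calculation. All the dollar figures below are hypothetical placeholders for illustration, not real hardware prices beyond the $500 card or real mining revenue:

```python
def breakeven_days(card_cost, card_watts, usd_per_kwh, revenue_per_day):
    """Days until mining revenue covers the card, net of electricity."""
    kwh_per_day = card_watts / 1000 * 24        # 250 W full bore -> 6 kWh/day
    power_cost_per_day = kwh_per_day * usd_per_kwh
    profit_per_day = revenue_per_day - power_cost_per_day
    if profit_per_day <= 0:
        return float("inf")                     # never breaks even
    return card_cost / profit_per_day

# Hypothetical inputs: $500 card, 250 W, $0.12/kWh, $3/day mined.
print(breakeven_days(500, 250, 0.12, 3.00))     # roughly 219 days
```

Cooling overhead and the host machine's own draw, which the parent mentions, would push the break-even point further out still.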
Re: (Score:3)
AMD Linux support sucks (Score:5, Informative)
Regardless of Torvalds recently getting his feathers ruffled with Nvidia... in most cases Nvidia just works on Linux. I swore off AMD/ATI loooong ago because just about the time they finally get decent proprietary Linux driver support for one of their chipsets, it drops off the back side of support. I DESPISE forced upgrades and won't get caught in that trap again. All of our perfectly working AMD video laptops still work great, but there's no proprietary driver support and the open source driver is waaaay worse. Nvidia's proprietary drivers still support VERY old chipsets.
Re: (Score:1)
Torvalds couldn't go off at AMD because they at least opened the source on part of their drivers, which now has a group working to produce an open source version. It still means the official AMD drivers are a pile of sh*t, but it at least means they can "try" to do something better.
Re: (Score:3)
The open source Radeon driver works just fine, I'm using it for heavy 3D work right now. Not the case with NVidia. Linus had every reason to flip NVidia the bird, especially considering NVidia's ambition to win bags of gold selling Android chipsets.
Re: (Score:1)
The open source Radeon driver works just fine
So not supporting OpenGL features which the hardware is capable of and running with abysmal performance is "just fine"? The performance of the open source Radeon drivers is utter crap and something that should actually shame AMD. But they don't give a shit.
And guess what, Android systems won't be running any open source OpenGL drivers anytime soon, regardless of NVidia.
It's galling that the only way to get good 3D performance is to run NVidia/Linux instead of GNU/Linux, with a proprietary black blob three t
Re: (Score:2)
I'm getting 75 million Phong shaded triangles/second at 1920x1200 out of a 6450 running the Xorg Radeon driver. What's not good about that? Note: that's a fanless $50 card.
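For context, at a typical 60 FPS vsync cap that claimed rate implies a sizable per-frame triangle count; a quick check:

```python
triangles_per_s = 75_000_000   # figure claimed by the parent post
fps = 60                       # common vsync cap
print(triangles_per_s / fps)   # 1250000.0 triangles per frame
```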
Re: (Score:2)
What is good about facts? Oh, never mind, I see you don't let any such thing stand in the way of a perfectly good rant.
Re: (Score:2)
(Phong shaded triangles do not use textures. I suspect that despite the big number of them you get per second, your actual framerate is fairly low).
For your information, Phong shading is quite demanding because of the exponentials involved. No my framerate is not low, it is capped at 60 FPS. Very impressive for this class of card. And at least a single layer of textures does not seem to bother it. See, I'm not throwing crap at the card, I'm using it the way it was meant to be used, that makes quite a difference. Now please crawl back into your hole fudster, and don't come out again until you are armed with some facts.
Re: (Score:2)
High poly count was the benchmark of the mid-to-late 1990s. In the second decade of the 21st century, the benchmark is texture fill rate. Modern graphics usually use Phong shading in addition to textures to give better lighting effects. In simpler terms... good-looking Phong-shaded games showed up in the 1980s, but it was nearly 20 years later before good-looking texture-shaded games showed up.
The numbers you quote aren't bad, and are quite playable for a variety of games, but the truth is that it's about par
Re: (Score:2)
Look, you're rambling. You keep switching between driver performance and card performance. The point is, the open source Radeon driver performance is pretty impressive, contrary to the FUD going around. Sure, it could be better, but it's already not bad, admit it.
Fill rate is just not one of the things that varies a lot depending on driver quality. Triangle setup is. That's why I talked about triangles, and mentioned Phong just so you know the driver is actually doing some work. But apparently you don't kno
Re: (Score:2)
So not supporting OpenGL features which the hardware is capable of and running with abysmal performance is "just fine"? The performance of the open source Radeon drivers is utter crap and something that should actually shame AMD. But they don't give a shit.
The open source driver is not meant to be better than the blob in terms of performance. It's meant to evolve much slower, but to support older cards much longer and to work more closely with open source projects. Fglrx is often slow to react to newer Xorg ABIs, for example, and in such cases radeon is there to pick up the slack. It isn't perfect when running any modern card, but it gives you a desktop, good video playback and all. Gaming is still far behind, but it's very possible to play a lot of older tit
Re: (Score:2)
Well, it could be the aim. Right now it isn't feasible. ATI simply knows much more about its products, and a lot sooner, than open source developers. That's where it has to change for comparable performance to be attainable. And I am in full agreement - Intel is doing it right. As a result, ever since Sandy Bridge's launch, Intel's crappy graphics always have the best possible support. They're still a bit on the crappy side, which is to be expected since Intel has only started taking graphics seriously abou
Re: (Score:1)
Piss on your open source radeon driver. It has never supported the X1250 graphics in my R690M chipset correctly in spite of the core being ancient. The graphics corruption I've encountered with every try to date is actually worse than before in more recent versions — just tried it today with Precise. And of course, this core is too "old" to be supported by fglrx, which was true the day I bought the system brand new at the store.
AMD linux support is ass unless you happen to have one of the few cards th
Re: (Score:2)
I have an HD6950 which has relatively good horsepower, I guess. I had been using the proprietary driver simply because that's what my distro set up and I've been too lazy to change it. I don't really play games all that often, but I have one or two kicking around. As you mentioned this, I decided to try out the open source driver.
On the plus side, the performance of 2D and video is actually quite a lot better than the proprietary driver. Everything is quick and smooth and no tearing. I had to fiddle wi
Re:AMD Linux support sucks (Score:5, Informative)
+1 I had an ATI in my last Linux desktop. Never again.
The proprietary fglrx drivers tend to have weird bugs and as you say, they drop chips that are old enough to have decent support. On the flip-side, the open-source radeon drivers tend to require various bleeding edge bits and pieces to work correctly, so they are nearly impossible to run on stable distros, like an Ubuntu LTS or a RHEL.
Nvidia's proprietary drivers just work, once you finally figure out how to blacklist nouveau hard enough that it doesn't get loaded via the initrd. Plus, they support VDPAU for projects like MythTV and XBMC.
Re: (Score:1)
Re: (Score:3)
Funny you should mention that right when AMD wins a huge order [phoronix.com] of graphic chips precisely because they have open source drivers.
And anecdotally, I've never had a problem with AMD hardware, generally by the time the proprietary driver loses support, the open source one matches its performance.
Re: (Score:2)
Because 3D Unity really taxes a video card... Come back when it works with a multitude of video games, as well as Wine, "without problems"; then you'd be on to something. Running the desktop is the lowest bar you can get.
Nvidia has some strong products. (Score:1)
This cycle, the latest Nvidia GPUs have a lot going for them. The Kepler series really is impressive, and generally has much lower power consumption than the AMD parts. (This was flipped last gen, funny enough. Those first Fermi cards were heat machines.)
I picked up a factory-OC'd version of the 670, and it's shocking how fast it is. (And how quiet and cool it is.) Remember that moment your system became good enough to run Oblivion at fully maxed settings with really high framerates? Or Morrowind? Yeah, that ju
Re: (Score:1)
Half of its functionality STILL won't work! (Score:1)
Re: (Score:2)
Might have something to do with developers, you know, having a bug up their ass and still developing for consoles. You know, 10-year-old hardware. Nah, couldn't be...
And of course the HD 7970Ghz edition ... (Score:2)
... will work just fine in my Apple Mac Pro! Oh wait.....
Seriously, this is the kind of boost Apple *should* have been after, since they're now stretching out the upgrade of the Mac Pro until some time in 2013... They could at least update OS X with an incremental release and start offering this card for the now two-year-old Mac Pro they gave a slight CPU speed bump and called "updated", so there'd be SOME sane reason for people to order one.
Re: (Score:2)
5% more shiniez? I *must* have one! (Score:5, Insightful)
Why does anyone care that the two major card makers are still in their dick-waving war? Is it just to keep the review sites in business? Hey, look, another new top-of-the-range GFX card, not totally dissimilar to the one we reviewed last month, only we got it for free, and you'll have to part with some serious wedge if you want to have the same toys as the cool kids!
There have been no real, serious differences between any of the last dozen iterations of hardware. Anything made in the last couple of years should run any game on the market at full shiniez at decent resolution. It won't, sadly, make the gameplay any better.
Re: (Score:2)
Because some of us want to see what it's like to play a game with maximum bling enabled?
My machine still chokes on Alice: Madness Returns, and that's with a 6970.
Re: (Score:2)
My machine still chokes on Alice: Madness Returns, and that's with a 6970.
There's something wrong with your machine, then. My machine has a 6870, and that game runs perfectly at 1920x1080 with max graphics settings. (CPU is an i5 2500K OC'd to 4.7 GHz, with 16 GB of RAM.) Aside from some idiotic load cues triggering hard drive access in the middle of some of the puzzles (and the game not caching the loaded results, so if you turn around and attempt the puzzle again the cue triggers again), no problems at all. But that's bad game design, not bad graphics support, and the problem went away
Re: (Score:2)
Re: (Score:2)
Sigh. Clearly your video game diet is fairly bland. Try one of the DCS combat simulators (A-10C, Ka-50, P-51D), or even Armed Assault II, and you'll quickly notice the difference between a high-end video card and a run-of-the-mill one. Just because the 'mainstream' consists of twitch games fought on maps the size of postage stamps doesn't mean all games/simulations are like this. I
Re: (Score:2)
I agree, and $500? Wow. It is nice to run games at a decent frame rate; unfortunately for Radeon users, their driver support seems to be falling behind their hardware by quite a lot. My Nvidia 560 Ti runs my favorite game, Tribes: Ascend, with full bling, while Radeon users with higher-end cards report all sorts of issues.
BTW, the new Tribes is F2P. Join me at https://account.hirezstudios.com/tribesascend/?referral=1207516&utm_campaign=email [hirezstudios.com]. For anyone who remembers Tribes 1 and Tribes 2, you know why you should
Re: (Score:2)
Why does anyone care that the two major card makers are still in their dick-waving war?
Because people buy the stuff they make (I dunno who, but their top-line stuff makes money somehow), and because, once they've replaced it with something newer and better, the price drops fast so the rest of us can build a nice, inexpensive gaming machine.
Re: (Score:2)
Re: (Score:1)
Continuous 5% dick-waving over time means upgrades are worth it after 3 or so years, assuming the drivers aren't tweaked. It keeps the industry healthy.
But, but, but... (Score:1)
Hot and loud? (Score:1)
Re: (Score:2)
From what I am reading, the original 7970 drew 40 amps / 210W TDP at reference clocks and upwards of 100 amps when OC'd, and because this new 7970 is basically OC'd...
Re: (Score:2)
Amps? I think you might be wrong there Chief. Typical household wiring circuits are rated at 15 amps.
Thats 15 amps at 120 volts, sparky.
...which is 150 amps at 12 volts.
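The arithmetic in this exchange holds because it's power, not current, that carries over across the voltage conversion; a quick check:

```python
def amps(watts, volts):
    return watts / volts        # Ohm's-law-adjacent: I = P / V

wall_watts = 15 * 120           # a 15 A household circuit at 120 V = 1800 W
print(amps(wall_watts, 12))     # the same 1800 W on a 12 V rail: 150.0 A
```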
Yawn (Score:1)
Semi-Accurate comments on this (Score:2)
I enjoy reading the articles posted on SemiAccurate.com [semiaccurate.com] about AMD, nVidia, Intel, etc. Most of the articles are by two writers, and the most entertainingly acerbic ones are by Charlie Demerjian (I'll call him "CD").
Five months ago, CD thought nVidia was going to crush AMD on the high end:
http://semiaccurate.com/2012/01/19/nvidia-kepler-vs-amd-gcn-has-a-clear-winner/ [semiaccurate.com]
However, nVidia seemingly can't produce their high-end chips in any useful quantity. So, CD snipes at nVidia about that in his comments about
Compared to CAD Cards (Score:2)
Re: (Score:1)
How does this compare to the multi-thousand dollar CAD video cards?
Consumer GPUs are fast as piss now, but they may not have the ability to drive really fancy high-end monitors.
Re: (Score:2)
Re: (Score:2)
Just the ones that they don't sell any more, so not really.
Please stop the "don't use ATI/AMD for Linux" FUD (Score:1)
It's embarrassing to see comments about how you should never use ATI-badged video cards in a Linux box, only to go home and watch my creaky old 4550 not only run just fine, but also play 1080p video and render 3D while driving two monitors. Let me rephrase that: YOU should be embarrassed to make comments about how you should never use ATI-badged video cards, yada yada yada...
Ditto for the "but I need proprietary drivers for Nvidia" crap. Nouveau drivers are getting just as good. Have a POS AGP FX5200? N
GTX 560 (Score:2)
Best bang for the buck right now should be the GeForce GTX 560. About $180 and a lot faster than the 7770.