AMD Radeon HD 7970 Launched, Fastest GPU Tested
MojoKid writes "Rumors of AMD's Southern Islands family of graphics processors have circulated for some time, though today AMD is officially announcing their latest flagship single-GPU graphics card, the Radeon HD 7970. AMD's new Tahiti GPU is outfitted with 2,048 stream processors at a 925MHz engine clock, features AMD's Graphics Core Next architecture, and is paired with 3GB of GDDR5 memory connected over a 384-bit wide memory bus. And yes, it's crazy fast as you'd expect and supports DX11.1 rendering. In the benchmarks, the new Radeon HD 7970 bests NVIDIA's fastest single-GPU GeForce GTX 580 card by a comfortable margin of 15 to 20 percent and can even approach some dual-GPU configurations in certain tests." PC Perspective has a similarly positive writeup. There are people who will pay $549 for a video card, and others who are just glad that the technology drags along the low-end offerings, too.
This would be really cool... (Score:2, Insightful)
Re:This would be really cool... (Score:5, Funny)
Hush. Those idiots finance the advance of technology.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re:This would be really cool... (Score:5, Insightful)
Says the idiot that only uses a PC for gaming.
Adobe After Effects will use the GPU for rendering and image processing.
Re: (Score:3)
Aren't you way better off with a workstation card for most workstation loads? From what I've read, a GTX or ATI HD makes for a poor CAD or Adobe machine.
Re:This would be really cool... (Score:5, Interesting)
Nope. Bang for buck, this new card kicks the workstation cards' butts, hard.
Re:This would be really cool... (Score:5, Informative)
Depends on the type of processing. GTX and Radeon cards artificially limit their double-precision performance to 1/4 of their capabilities, to protect the high-margin workstation SKUs. If all you're doing is single-precision math, you're fine with a gaming card.
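If you want to check what your own card exposes before paying for a workstation SKU, a minimal sketch like the one below works. It assumes the PyOpenCL package and a working OpenCL driver are installed, and it only reports whether double precision is exposed at all; it says nothing about how hard the throughput is capped, which you would only see by benchmarking.

```python
# Hypothetical quick check: list GPUs and whether they advertise FP64 support.
# This does NOT measure the 1/4-rate cap discussed above; that needs a benchmark.
import pyopencl as cl

for platform in cl.get_platforms():
    for dev in platform.get_devices():            # all devices on this platform
        if not (dev.type & cl.device_type.GPU):   # keep only GPUs
            continue
        fp64 = ("cl_khr_fp64" in dev.extensions) or ("cl_amd_fp64" in dev.extensions)
        print(f"{dev.name}: {dev.max_compute_units} CUs @ {dev.max_clock_frequency} MHz, "
              f"double precision exposed: {fp64}")
```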
Re: (Score:2)
For now anyway. MS is looking at Double Precision becoming the standard for some future DirectX. That's probably still a few years off.
Re: (Score:2)
*16-bit integers. None of this newfangled 32-bit garbage kids play with these days.
Re: (Score:2)
And even that can be fixed. The limit is in the firmware. I have a PC ATI card in my PPC Mac that is running a workstation firmware, which unlocked some serious processing power. This was back when ATI video cards for the quad-core G5 were at highway-robbery pricing and the exact same hardware for the PC was going for $199.00. That system utterly screamed running Shake and After Effects back in 2007-2008.
Re: (Score:3)
Black and white text... All fine and good until we need a graph.
Black and white graphics... Now if only it could do color.
CGA... What bad colors.
EGA... Looking a lot better if only we could get some shading and skin tones.
VGA... Enough visible colors to make realistic pictures. But a higher resolution will make it better.
SVGA...
Re:This would be really cool... (Score:5, Informative)
Ray Tracing!!!
We're also capped right now because of too many single-threaded game engines. A given thread can only push so many objects to the GPU at a time. Civ5 and BF3, being the first games to make use of deferred shading and other DX11 multi-threading abilities, can have lots of objects on the screen with decent FPS.
The biggest issue I have with nearly all of my games is that my ATI 6950 is at 20-60% load and only getting sub-60 fps, while my CPU has one core pegged. My FPS isn't CPU limited, it's thread limited.
Re: (Score:3)
Re: (Score:3)
Not true. You just have to find the sweet spot of performance/$. My current card (I think it's a 6870 but I'd have to double-check to be sure) cost less than $150 a couple months ago and runs Witcher 2 quite smoothly with high settings. Haven't tried BF3.
Re: (Score:2)
Witcher 2 is a bad example anyway, because with ubersampling it will crush pretty much any modern graphics card when everything else is on ultra. You'll need SLI or the card above for that workload.
That said, most modern $200-ish cards will handle BF3 on high and Witcher 2 on high just fine at sub-1080p. It's the best possible settings and very high resolutions where you need the high-end offerings.
Re: (Score:2)
Re: (Score:2)
Still is. Granted I can't push the resolution beyond 1080p, but with everything maxed out, the fact that it's a console port is very visible. Because at that point the screen size is big enough that you start seeing the textures for what they really are - dirty smudges. You also start seeing them cheating on geometry in comparison to more PC-optimized offerings like BF3.
Re: (Score:3)
This small step up in texture dimensions won't have much impact on modern hardware, and from a bit of testing Skyrim seems to be mostly CPU/VS limited, not pixel-shading limited, anyway.
Only once have I splurged like that (Score:2)
I rebuild my machines every two years. My previous rig couldn't do Crysis at max settings, so my latest system has dual 5870s that I got for $400 a piece. I'll never splurge like that on video cards again. Then again, 2 years later, I still max out the sliders on every game I get. It's great to have that kind of computing power... but maybe I should have waited 6 months? Those cards are going for $150 today.
Re: (Score:2)
Re:Only once have I splurged like that (Score:4, Interesting)
Pretty sure today's mid-range PCs trounce 2007's high-end with ease.
Just for shits, when I got my current rig just a couple years ago, I played through Crysis again. On a single GTX260, it was butter smooth at 1680x1050. When I switched to quad-SLI 295's, it was butter-smooth in triple-wide surround.
People who continue to claim Crysis is an unoptimized mess are:
- not programmers
- not owners of high-end hardware
Could it be improved? Sure. Is it the worst-optimized game of the 21st century? FUCK NO, not even close, and subsequent patches greatly improved the situation.
Re: (Score:2)
In perfect honesty, it's better to buy a single powerful card (to avoid early problems in games) in the $200-250 range, and upgrade every couple of years. Cheaper, and you should be able to max or nearly max all games that come out during the lifetime of the card.
Obvious exceptions are the extreme resolutions, 3D vision and multi-monitor gameplay.
Overpowerful. (Score:5, Interesting)
I'm on a single Radeon 6950 (unlocked to a 6970 by BIOS flash), and I'm running 5040x1050 (3-monitor Eyefinity) on SWTOR (The Old Republic), all settings maxed, with 30-40 fps on average and 25+ fps on Coruscant (Coruscant is waaaaaaay too big).
Same for Skyrim. I even have extra graphics mods on Skyrim, an FXAA injector (injected bloom into the game) and so on.
So the top GPU of the existing generation (before any idiot jumps in to talk about the 6990 being the top offering from ATI, I'll let you know that the 6990 is two 6970s in CrossFire, and the 6950 GPU is just a 6970 GPU with 38 or so shader units locked down via BIOS and underclocked; ALL are the same chip) is not only able to play the newest graphics-heavy games at max settings BUT can also do it at 3-monitor Eyefinity resolution.
One word: consoles. Optional word: retarding.
Re:Overpowerful. (Score:5, Insightful)
... 30-40 fps on average, and 25 fps+ on coruscant (coruscant is waaaaaaay too big). same for skyrim...
Looks like PCs aren't the only thing gaming consoles have been retarding. Most PC gamers would have considered 25 fps nearly unplayable, and 30-40 fps highly undesirable, before the proliferation of poor frame rates in modern console games. There are still many of us who are unsatisfied with that level of performance but are unwilling to compromise on graphics quality.
Re: (Score:3)
Yes, it's really a shame that 30 fps became an acceptable framerate for games nowadays, thanks to crappy underpowered consoles.
Funnily enough, back in 1999 (Dreamcast days) any console game that didn't run at a solid 60 fps was considered a potential flop.
This framerate crap is one of the many reasons I'll never go back to console gaming.
Times change, no?
Re: (Score:3)
I used to play software-rendered Quake at 320x200, 8-bit color, and sub-20 fps.
This is what it feels like to play on consoles when coming from PC: flat lighting, crappy models, poor special effects, and a frame rate that makes it feel like I'm roller skating under a strobe light.
Re: (Score:2)
Depends on the game.
Back in the 90's, many top-selling PC games ran at 30 or even 15 fps, and were perfectly playable. I played the fuck out of Doom and Quake at then-acceptable framerates, which today would be considered slideshows. I sometimes play WoW on my laptop, where it can drop to 20-25 fps during intense fights, and it's just fine.
The only place where absurdly high framerates are mandatory is fast-paced shooters like Quake 3/4, Call of Duty, Team Fortress etc. Racing titles also benefit from a
Re: (Score:2)
Not really.
First, different people have different perception, so 20 may be enough for some and 60 just right for others.
Second, fast motion on low framerate requires some amount of motion blur to be perceived as "smooth". When shooting a film, this blur is already there thanks to the nature of filming. When rendering, developers have to care about it, and as it can be costly it's often dropped.
Re: (Score:2)
Re:Overpowerful. (Score:5, Interesting)
You keep talking about "research"; maybe _you_ should provide a study that shows "24 fps should be enough for everyone"? (Hint: it's not, and that's the reason for the current push toward 50p/60p/72p film and television.)
Why, you can just go here http://frames-per-second.appspot.com/ [appspot.com] and tell us "I don't see any difference". And then we'll just tell you to visit your eye doctor.
Re: (Score:2)
So it seems you don't have any research on hand to show us that 24 fps is totally enough for synthetic images without temporal smoothing.
And you were talking so confidently, I almost believed you were speaking from knowledge :(
Re: (Score:2)
'synthetic images with temporal smoothing' -> oh yeah.
Re: (Score:3)
Actually the flicker on those monitors was often traceable back to the flicker from the lights; when the 50/60Hz mains '0' coincided with the scan rate, the flicker became obvious. It wasn't so much the refresh rate as the fact that there was an interference pattern. At higher frequencies, the phosphor had less time to fade, so the dip in brightness became less obvious (except at low ambient light levels...). Remember, an LCD has no fading effect, so much lower frequencies are now acceptable.
However, he is wro
Re: (Score:2)
Here's a good one http://www.100fps.com/how_many_frames_can_humans_see.htm [100fps.com]
"So the conclusion is: To make movies/Virtual Reality perfect, you'd have to know what you want. To have a perfect illusion of everything that can flash, blink and move you shouldn't go below 500 fps."
Also from Wikipedia to debunk the 24fps recording thing:
Judder is a real problem in this day[when?] where 46 and 52-inch (1,300 mm) television sets have become the norm. The amount an object moves between frames physically on scr
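To see the judder mechanism concretely, here is a toy calculation (my own illustrative numbers, not figures from the quoted article) of how 24 fps content maps onto a 60Hz display: the refresh count per source frame isn't a whole number, so frames get held for uneven amounts of time.

```python
# Toy illustration of judder: 24 fps content on a 60 Hz display cannot divide
# evenly, so each source frame is held for an uneven number of refreshes
# (the classic 3:2 pulldown cadence).
film_fps = 24
display_hz = 60
refreshes_per_frame = display_hz / film_fps   # 2.5 -- not a whole number

held = []
shown = 0
budget = 0.0
for _ in range(8):                 # first 8 film frames
    budget += refreshes_per_frame
    whole = int(budget)            # whole refreshes available so far
    held.append(whole - shown)
    shown = whole

print(refreshes_per_frame)         # 2.5
print(held)                        # [2, 3, 2, 3, 2, 3, 2, 3] -> uneven hold times
```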
Re: (Score:2)
Re: (Score:2)
So you can't trust your own eyes? You only want to read smart words, experimental proof is too lowly for you?
Re: (Score:2)
http://web.cs.wpi.edu/~claypool/papers/fr/fulltext.pdf
Some dude's master's thesis counts as research now?
A study which polled the other gamer dudes in their department and ASKED them whether they were able to perceive stuff, by the way. Yeah.
If I did the same in an overclocking forum, I can assure you I could come up with more than 100% of testimonies to that effect.
That doesn't make it scientific. It doesn't modify their biology/physiology either.
I'm still waiting for the research to prove the bullshit. Apart from some gamer dudes' pol
Re: (Score:2)
Re: (Score:2)
False. The human eye's focus is indeed typically incapable of sensing more than 24 frames every second. On the other hand, peripheral vision can in some cases distinguish over 100 images every second. This was very visible back in CRT days, when monitors caused headaches as peripheral vision saw the flicker of the crappy 60Hz and sometimes 75Hz monitors, stressing the hell out of your eyes.
The issue dates from the way our eye evolved, the way it processes the image, compresses it and sends it to the brain vi
Re: (Score:2)
just because a percentage of gamers materialized who think they do.
Re: (Score:2)
The fact that your eye can send a 60Hz flickering on/off light signal to your brain does not mean that your brain is able to interpret the picture in front of you at the same rate.
And still, where is the research backing that 60 fps proposition?
Re:Overpowerful. (Score:4, Insightful)
The argument for 60fps isn't about genetically engineered people. It's about spikes. If you are running your game at 30fps, you'll turn a corner or some monsters will spawn in the next room and drop your FPS below that. The reason people want high frame rates is because of these spikes.
Re:Overpowerful. (Score:4, Interesting)
You silly newb. HDMI uses 24fps for compatibility reasons, and the initial decision was probably based on a quality-cost tradeoff back in the days when actual film was used and the NTSC/PAL specifications were defined. Using 60fps would mean that the tape would last half the time, for example. There is the famous "notion" that eyes cannot see over 24fps, but in fact eyes are very sensitive to some kinds of motion, colors and contrast and less sensitive to others, so you cannot generalise that 24fps is "enough" for all kinds of motion, image and people (yes, people are different too). Furthermore, even if the above were not true, you in fact need an average of at least 50-60 fps in most games to ensure that the MINIMUM will not go below 30fps, which is not only visible but also implies a between-frame time of about 33ms (plus ping, plus input lag, plus keyboard lag etc). In hardcore-land this means PWNAGE for you and your silly rig.
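The arithmetic behind that rule of thumb is just the 1000/fps conversion; a minimal sketch, with no benchmark data involved:

```python
# Frame-time arithmetic behind the "average 60 fps so the minimum stays
# above 30 fps" rule of thumb. Plain Python, illustrative numbers only.
def frame_time_ms(fps):
    """Milliseconds between frames at a given frame rate."""
    return 1000.0 / fps

for fps in (24, 30, 60, 120):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms between frames")

# If a game averages 60 fps but heavy scenes dip to half that, the worst
# case is ~30 fps, i.e. ~33 ms between frames before ping and input lag.
print(f"worst case at 30 fps: {frame_time_ms(30):.1f} ms per frame")
```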
Re: (Score:2)
Consoles support 5040x1050? Color me surprised.
Re: (Score:2)
His point being that game developers are conservative about pushing graphical complexity, such that they don't even produce a workload that remotely challenges modern top-end cards. He attributes this to developers targeting weaker consoles. I think it's just because they want a slightly larger market than those willing to shell out $600 a year on graphics cards *alone*, regardless of game consoles. Now to push framerates down to a point where it looks like things matter, they have to turn
Re: (Score:3)
This is very much a matter of preference. I feel that 25fps is just flat-out unplayable and anything under 60 is distracting and annoying. I believe that most gamers would agree. I always cut detail and effects to get 60fps, even if this means the game will look like shit. Framerate is life.
So no, based on your description, the top of the previous generation is NOT able to play those games in the environment you defined. You would need around twice the GPU power for that. The benchmarks suggest that 7970 won't
Re: (Score:2)
This is very much a matter of preference. I feel that 25fps is just flat-out unplayable and anything under 60 is distracting and annoying.
It's partially a matter of what you are accustomed to. I remember playing Arcticfox on a 386/EGA at about 4FPS and thinking it was awesome, because compared to the other games available at the time it was.
yes. 60 fps. (Score:2)
No. The question is rhetorical. There isn't one single study that shows humans are able to perceive a difference between 40 fps and 60 fps. It's total bullshit.
The HDMI specification requires 24 fps, not 60 fps, because 24 fps is scientifically backed, whereas the only thing backing 'I can perceive 60 fps' is the self-propagated bullshit from gamers. n
Re: (Score:2)
If humans can perceive at most 24fps then you need to display at a minimum rate of 48fps (Nyquist rate) in order to reliably convey that information to a human; else you'll jitter.
But humans don't actually take frame-like snapshots. The flicker fusion threshold for black & white is about 60fps with noticeable variation between individuals. Human beings can reliably identify an object flashed in front of them for only 5 milliseconds: effectively 200fps for one frame. Even though they cannot distinguis
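For the curious, the two figures in the parent work out like this; a toy calculation of the unit conversions, not a vision-science result:

```python
# The two figures in the parent comment, worked out explicitly.
# Illustrative unit conversions only.
perceived_fps = 24                       # the oft-quoted "humans see 24 fps"
nyquist_display_fps = 2 * perceived_fps  # sample at twice the signal rate
print(f"Nyquist-style minimum display rate: {nyquist_display_fps} fps")

flash_duration_s = 0.005                 # a 5 ms flash subjects can identify
equivalent_fps = 1 / flash_duration_s
print(f"A 5 ms flash is one frame at {equivalent_fps:.0f} fps")
```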
Re: (Score:2)
The flicker fusion threshold for black & white is about 60fps with noticeable variation between individuals.
FLICKER fusion threshold, not image interpretation threshold.
Which is why we are comfortable with 60Hz LCD/LED monitors, as opposed to being uncomfortable with 60Hz CRT monitors.
Re: (Score:2)
Re: (Score:2)
While I can't "link you papers" as I studied from actual dead-tree books back in university, the structure of the human eye can be looked up pretty easily on Wikipedia. You have the focus area in the center, densely populated with slow, data-heavy color-sensitive receptors, and the periphery, densely populated with fast intensity-sensing receptors. Peripheral vision was shaped by evolution to track movement, and as a result is capable of distinguishing a lot more images per second than the focus area.
Anoth
Re: (Score:2)
Yeah, please link me one of the papers that says people can distinguish between 30 fps, 40 fps and 60 fps, and that therefore 60 is the 'necessary norm' for PC gaming. I'm waiting.
Re: (Score:2)
The intentional blurring is a part of it, yes, but it's just a part. There is also the issue of how we as humans process the visual data (i.e. over 24 pictures per second gets interpreted as continuous movement).
Re: (Score:2)
Can't hear you?! Where?
Nowhere, I guess. Other than subjective propositions and perceptions: 'I can see this, I can feel this,' blah blah.
yes yes. (Score:2)
Oh, one of them linked to a random website/blog that says the same thing, from some dude's mouth though. That's what they understand by 'research'.
Re: (Score:2)
And you have yet to deliver research showing 24 fps is enough, don't forget that.
Re: (Score:3, Insightful)
No need to use strong language and unbacked opinions. You are simply incorrect.
Put two FPS players of similar skill in front of a computer. Configure the computers so that one shows 30fps and the other shows 60fps (easy to do in Quake and most other shooters). No matter what you may hope the truth to be, the player with the 60fps display will have an enormous advantage.
It is extremely easy to test this yourself: just go play a game and record your performance by some metric while alternating the f
Re: (Score:2)
HDMI is for passive entertainment, not twitch action games. The dynamics and requirements are entirely different, yet you're using them just the same. The standards also exists because of legacy reasons concerning existing video material, processing speeds and data storage limitations of existing media. Compromises to provide best image quality with certain limitations for consumer use. Perhaps you would want to investigate what kind of framerates and response times the US army uses in their remote feeds an
Re:Overpowerful. (Score:5, Informative)
The human eye does not see in frames per second. It has a certain data transfer speed, and the way the brain processes the information is also not as discrete as you might wish.
For example, the flicker fusion point (inability to distinguish alternating black and white images) is somewhere around 60fps, and the army has done experiments showing images for very short times to see whether pilots could identify them. The shortest intervals were way less than those implied even by the flicker fusion point. It also matters greatly how far an object moves in your absolute field of vision between frames for your brain to understand the motion and simulate smooth movement. Your brain has interpolation algorithms that piece together information streams to form smooth motion.
This has some information, but not many references: http://www.100fps.com/ [100fps.com]
Wikipedia has more stuff and it's a starting point to look for research. Here's some by BBC on fast-moving objects in sports: http://downloads.bbc.co.uk/rd/pubs/whp/whp-pdf-files/WHP169.pdf [bbc.co.uk]
Also, you can test the difference of 30 and 60 fps here: http://frames-per-second.appspot.com/ [appspot.com]
At least to me, it is blatantly evident.
You were wrong, and acted like an ass over it towards me and other posters. Will you please apologize and shut up?
Re: (Score:2)
Re: (Score:2)
It doesn't rebut the HDMI specification. The reason 24 fps is the typical video speed is hidden in the fact that our brain stops perceiving individual images and starts perceiving consecutive frames as motion around that number.
It doesn't mean that the eye and brain are incapable of noticing that things in motion across the screen are "jerky" (i.e. telling that they're separate frames) without either the significant smoothing effects typically used in the movie industry or other similar methods.
Re: (Score:2)
However, this question also does not have a single straight-forward answer. If the image switches between black and white each frame, the image appears to flicker at frame rates slower than 30 FPS (interlaced). In other words, the flicker fusion point, where the eyes see gray instead of flickering tends to be around 60 FPS (inconsistent).
where is the reference ?
Re: (Score:2)
Why don't you link research showing the contrary? Or at least research validating your opinion of "24 fps, the magic number"?
You gave your answer in your retort to the HDMI point below your post. Rationalizing it as 'because film has been like that forever' is one step away from rationalizing it as 'film has been like that forever because it was the minimum frame rate at which humans were able to interpret moving images'. Not 5 fps, not 10 fps, but 24 fps.
Re: (Score:2)
Re: (Score:2)
Well, I'm sorry you're an idiot. You're somewhat right, actually: the eye only needs 18-20 fps to feel smooth if the scene is motion blurred. Reality doesn't have frames; during that 1/20th of a second everything moves. A rendered screen is not motion blurred and will seem extremely stuttery. Yes, perhaps if you rendered at 60 fps and averaged down to 24 fps you wouldn't notice the difference, but having a graphics card that can only render at 24 fps is clearly insufficient. You should go see an optician i
Does GMA still stand for Graphics My ___? (Score:2)
and others who are just glad that the technology drags along the low-end offerings, too
Has the advance of high-end NV and AMD GPUs dragged along the Intel IGP in any way, shape, or form?
Re: (Score:2)
Re: (Score:2)
Intel Sandy Bridge graphics are enough for most things, and Ivy Bridge is supposed to increase performance by another 20%.
Re: (Score:2)
Bitcoin (Score:5, Funny)
Yes yes.. Rendering yada yada. How many Mhash/s does it average when bitcoin mining? And what is the Mhash/Joule ratio?
Re: (Score:2)
I've never been interested in Bitcoin mining, but as it becomes less worthwhile, I'm hoping it will depress prices on the used graphics card market, as former miners liquidate their rigs.
Re: (Score:2)
Re: (Score:2)
Sadly I know the answer to this, as apparently someone asked in all seriousness. The new GCN architecture is better for compute in general, but worse for Bitcoin, as they switched from VLIW to a SIMD architecture. But please buy one and eBay it for cheap afterwards all the same ;)
Linux Driver State? (Score:5, Insightful)
What is the state of Linux drivers for AMD graphics cards? I haven't checked in a few years, since the closed-source nVidia ones provide for excellent 3D performance and I'm happy with that.
But, I'm in the market for a new graphics card and wonder if I can look at AMD/ATI again.
No, I'm not willing to install Windows for the one or two games I play. For something like Enemy Territory: Quake Wars, (modified Quake 3 engine), how does AMD stack up on Linux?
Re: (Score:3, Insightful)
They suck, just like they always have. But don't feel left out, they suck on Windows as well.
ATI/AMD may at times make the fastest hardware, but their Achilles' heel has been, and apparently always will be, their sucky drivers. The hardware is no good if you can't use it.
They need to stop letting hardware engineers write their drivers and get some people who know what they are doing in there. They need solid drivers for Windows, Linux, and a good OpenGL implementation. Until then they can never be taken seriousl
Re: (Score:2)
Quake 3 probably doesn't need a top-of-the-line graphics card. Go with nVidia; for years that has been the best move if you think you may use Linux at some point.
$30 should get you a card that maxes out anything Quake 3-based.
Re: (Score:2)
Bah! My mistake. It is a heavily modified id Tech 4 engine [modwiki.net], which is Quake 4/Doom 3 -- not Quake 3. No $30 card will max that out.
My fault.
Re: (Score:3)
The almost-but-not-quite-latest card is generally fairly well-supported by fglrx. If your card is old enough to be supported by ati then it may work but it probably won't support all its features. You're far better off with nvidia if you want to do gaming.
Every third card or so I try another AMD card, and immediately wish I hadn't. Save yourself.
I don't get it. (Score:2)
How is this modded as insightful?
What do Linux drivers have to do with this card? How are Linux users in any way the target market for a high-end enthusiast GAMING graphics card?
Perhaps once you can purchase BF3 or the like for Linux, then ATI and NV will spend more time writing drivers for Linux.
I cannot imagine that anything more than an older HD48xx series will help you in any way.
Re:I don't get it. (Score:5, Insightful)
Because it was a question that people other than just me were curious about?
Did you read the entire post? Or did your head just explode when seeing "Linux" in a gaming thread?
nVidia already spends time on quality Linux graphics drivers. They run fine on both 32-bit and native 64-bit Linux systems. I was wondering if the AMD/ATI stuff had matured as well is all.
Take a valium and go back to getting your ass n00bed by 10-year-olds on BF or MW.
Re: (Score:3)
The closed drivers have serious quality issues with major regressions seemingly every other release.
The open drivers are making great strides, but the performance isn't there yet for newer cards. If you are using a pre-HD series card, you'll find pretty decent performance that often beats the closed driver.
Based on the progress I've seen over the last year, I would expect the performance for this new series of cards to be acceptable in a year or so for the simple fact that as they finish the code for older
Re: (Score:3, Informative)
What bugs me most (Score:3, Interesting)
Why do card manufacturers use (rightfully) new manufacturing processes (28nm transistors) only to push higher performance?
Why the hell don't they re-issue, say, an 8800GT with the newer technology, getting a fraction of the original power consumption and heat dissipation?
*That* would be a card I'd buy in a snap.
Until then, I'm happy with my faithful card from 2006.
Re:What bugs me most (Score:5, Insightful)
You're much better off with a modern card that just has fewer execution units if you want to save money. They won't be out right away (the first release is always near the top end), but they will eventually show up. Since you're worried about saving money/power, you don't want to be an early adopter anyway. Oftentimes the very first releases will have worse power/performance ratios than the respins of the same board a few months down the road.
But can it run Unity without lag? (Score:2)
Other Reviews (Score:3)
http://www.overclockers.com/amd-radeon-hd-7970-graphics-card-review/ [overclockers.com]
http://www.madshrimps.be/articles/article/1000250/#axzz1hFPj6oTt [madshrimps.be]
http://www.techpowerup.com/reviews/AMD/HD_7970/ [techpowerup.com]
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/49646-amd-radeon-hd-7970-3gb-review-25.html [hardwarecanucks.com]
Bitcoin (Score:3)
What's the Bitcoin Mhash/sec?
Re: (Score:2)
Re:It's too bad... (Score:5, Insightful)
Re: (Score:2)
When I brought home my R690M-chipset-based netbook, ATI had already abandoned the X1250 graphics in it and dropped them from fglrx (assuming they were ever in there). They apparently haven't given the folks making the ati driver enough information to support it properly, despite their claimed commitment to open source (IME Intel has made good on this in more cases than AMD), so it craps all over my system if I try to run Linux, even with RenderAccel disabled.
ATI Rage Pro stuff is the only ATI stuff that see
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
I paid $100 for one of those (with 1GB) when I built my system almost two years ago. It's 75% of the performance of a GTS 250 at 50% of the power and (at the time) just over 50% of the money. It's a great card, or family thereof.
Re: (Score:3)
I suspect that ATI and nVidia have been able to use the silent-restart feature to sell more defective cards. If your system totally locks up every half hour when playing a game, you're going to return the card. If it freezes but then resumes silently you may b
Re: (Score:3)
I'd recommend using OCCT to run a stability test on the card, and also note the temperatures and system voltages it hits when fully loaded.
What you're describing could also be an overheating issue or a power supply shortage. If the temperatures are approaching boiling, that's a problem. If the voltage drops significantly when your CPU and GPU scale up, that's also a bad sign.
Alternatively, yeah it could be crummy drivers.