AMD's Radeon HD 2900 XT Reviewed 126
J. Dzhugashvili writes "The folks at The Tech Report have whipped up a detailed exposé of the new AMD Radeon HD 2900 XT graphics card's architecture and features, with plenty of benchmarks. While the card dazzles with 320 stream processors, a 512-bit memory bus, and oodles of memory bandwidth, its performance and power consumption seem disappointing in the face of Nvidia's six-month-old GeForce 8800 graphics cards."
Let's hope (Score:2)
Re: (Score:2)
I still want a comparison of every DX10 card available, running a selection of say 10 of the most popular games from the last 3 years with all the pretty options turned up to the max, on XP and on Vista, in both a top-end God Box and a typical 18-month old good-but-not-outstanding-for-its-age PC.
2 tables of aggregate fps scores at the beginning, not the end: one for XP and one for Vista, each with one row per card and 4 columns: normal PC @1280x1024; normal PC @1600x1200; god box @1280x1024; god box @1600x1200.
Re:Let's hope (Score:4, Informative)
First measure FPS in your favorite app at the lowest resolution. That's the measure of your CPU bottleneck. No matter how nice of a GPU you buy, you'll never get higher FPS than that.
Memory is one of those things you can never have enough of. Just don't worry about the bandwidth too much; you're only going to squeeze out a few extra frames per second with top-of-the-line RAM. Just watch to see if your machine is hitting the hard drive much, and consider more if it is.
Most new games are still GPU limited, and this is where you want to focus your attention. Look for benchmarks at the resolutions you play at; that's a good baseline of what to expect. Anything over 60fps average I tend to be happy with, but you may want to consider the minimum too. Right now the only benchmarks I've really been interested in are of Rainbow Six: Vegas. It uses the Unreal 3 Engine, and a lot of upcoming games are going to be using it too. Other benchmarks might be important to you as well, but those games tend to score in the hundreds, so you know performance won't be an issue.
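A rough sketch of that CPU-vs-GPU reasoning in code (the numbers and the function here are made up purely for illustration):

```python
# Toy sketch of the bottleneck reasoning above; all fps numbers are hypothetical.
def diagnose(fps_low_res, fps_play_res, target_fps=60):
    """fps_low_res: fps at a very low resolution (approximates the CPU ceiling).
    fps_play_res: fps at the resolution you actually play at."""
    if fps_low_res < target_fps:
        return "CPU-bound: a faster GPU won't push you past ~%d fps" % fps_low_res
    if fps_play_res < target_fps:
        return "GPU-bound: a faster card (or lower settings) should help"
    return "Already fast enough at your resolution"

print(diagnose(fps_low_res=120, fps_play_res=45))  # GPU-bound
print(diagnose(fps_low_res=40, fps_play_res=38))   # CPU-bound
```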
Re: (Score:1)
Re:Let's hope (Score:5, Informative)
This comes up every single time, so forgive me if I'm not as polite about it as I could be.
The human eye sees ~25-30fps, true, but it does not sample the same way your monitor outputs it. The human eye refreshes at that rate, which means anything that's seen for less than that amount of time leaves a partial imprint. Thus, the motion blur you see when something, even in real life, goes by really fast. Since the monitor is outputting static frames, you don't get that partial imprint, and it looks choppy. Television, on the other hand, does pick up the motion blur, because of the way the cameras work. There are a number of studies showing that we benefit from higher FPS, up to and over 100 sometimes.
Also, there's really no such thing as 32-bit color. I suppose you could put a different number of bits for RGB, and many schemes do, but the 32-bit you're thinking about is RGBA. 24-bit is the exact same thing, without the alpha channel, and we also benefit from far more colors than current hardware outputs, because the current 16.7m colors that are output don't account for much luminosity, and for plenty of other reasons I don't care to look up right now.
When you try to stick your hand through the monitor and pick up a Coke, or catch a running kitten, that's when you can say we've got enough. Until then, please try to at least understand the subject you're discussing, and not try to come off as authoritative when you don't even know what 32-bit color means.
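For what it's worth, the arithmetic behind the "16.7 million colors" figure is just the channel widths; a quick sketch, not tied to any particular card or API:

```python
# 32-bit color is really 8 bits each of R, G and B plus 8 bits of alpha;
# the alpha channel adds no displayable colors, so 32-bit and 24-bit show the same palette.
r_bits, g_bits, b_bits, a_bits = 8, 8, 8, 8
print(2 ** (r_bits + g_bits + b_bits))   # 16777216, the "16.7 million"

# Packing one RGBA pixel into a 32-bit word (the exact layout varies by API/hardware):
def pack_rgba(r, g, b, a):
    return (a << 24) | (r << 16) | (g << 8) | b

print(hex(pack_rgba(255, 128, 0, 255)))  # 0xffff8000
```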
32 bit can= 10+10+10+2 (Score:1)
In many displays you will see 10-bit, 1.07-billion-color capability mentioned as a feature.
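The "1.07 billion" number falls out of the same arithmetic; a sketch of the 10+10+10+2 packing, not any specific display's format:

```python
# A 32-bit pixel split as 10 bits each of R, G and B plus 2 bits of alpha.
r_bits = g_bits = b_bits = 10
print(2 ** (r_bits + g_bits + b_bits))   # 1073741824, i.e. ~1.07 billion colors
```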
Re: (Score:2)
Then you need an extended color gamut to cover the full range of color that the eye can see. There are some projector systems that do that, using more than the usual 3 color filters.
Then you need autostereoscopic displays to get proper 3D (I won't even mention the lack of accommodation depth perception).
Re: (Score:2)
Doesn't it also depend on the display, though? If my LCD is running at 60 Hz, anything over 60 fps won't make it past that bottleneck.
Re: (Score:1)
Whatever knowledge you have, you can't expect the rest of the world to know, but I guess that true geeks are snobs anyway.
Re: (Score:1)
By talking down to anyone you don't know, you're a jerk, and I was not saying anything like it was the absolute truth; you read things between lines that aren't there.
Re: (Score:2)
Chris, I used to think that I wanted to have kids someday, but the risk that I might end up with someone like you as the fruit of my loins terrifies me. I'd like to think that I'd raise them better than that, and that they'd #1 - not parrot "facts" they heard from "everybody" so that they never have to deal with a jerk like me, and #2 - know how to deal with a jerk like me--but most likely, I'd end up raising a mouth-breathing illiterate who cries himself to sleep every night because daddy doesn't even seem
Re: (Score:2)
Not quite true. Do some research before quoting the same myth that everyone quotes.
Check out this [amo.net] link for more.
Re: (Score:1)
The same myth that everyone quotes... I'm part of everyone and used the knowledge I had, or thought was OK; I did not think I needed a college degree.
And if by any chance any human here can see 16.7 million colors, point them out to me and I'll shake their hand.
I'm a gamer myself, and frankly I don't see any difference between 60 fps and 100 fps. The game isn't smoother or better; a higher frame rate to me just means that when things get heavy on the CPU and memory, the lowest FPS I'll get will be higher than 30.
Re: (Score:2)
Expect NVIDIA to make an announcement soon... (Score:2)
Re: (Score:2)
Re: (Score:2)
How many DX10 games are there at the moment? My x800XT runs all of the most popular games from the last 3 years with all the pretty options set to max.
That said, I am getting an 8800 for my next rig.
Re: (Score:1)
Not sure why you were modded as troll, I was clearly wrong.
Strangely it's Nvidia with sucky drivers right now (Score:2, Insightful)
Anyway, while these X2900 cards do not seem to be great performers, I suspect their Vista drivers are better. As a Vista user, the GF8800 is right now out of the question, unless the driver situation has changed recently.
Re:Strangely it's Nvidia with sucky drivers right (Score:2)
I must have missed an announcement. Are there DX10 games now? I've had my head in Supreme Commander (and of course, Eve) so I don't know.
Re: (Score:1)
http://www.hwupgrade.it/articoli/skvideo/1725/ati- radeon-hd-2900-xt-e-il-momento-di-r600_19.html [hwupgrade.it]
Here's linkage to video footage:
http://194.71.11.70/pub/games/PC/guru3d/generic/r6
http://194.71.11.70/pub/games/PC/guru3d/generic/r6
http://194.71.11.70/pub/games/PC/guru3d/generic/r6
Re:Strangely it's Nvidia with sucky drivers right (Score:1)
What has AMD done with ATI (Score:5, Interesting)
AMD/ATI is losing out to Intel on onboard graphics.
nVidia has a better closed source linux driver than ATI.
At the moment the only appeal of ATI is that their mediocre graphics cards have open source 2D+3D drivers on Linux, with R200 (helped by ATI) or R300 (no help from ATI/AMD) drivers.
At the moment AMD's best strategy is to build some fantastic onboard graphics chips for their AMD processors and try to beat nVidia by making an AMD chip + onboard graphics a brilliant combination (i.e. no need to add an aftermarket card).
Re: (Score:2)
AMD's big future problem (Score:4, Insightful)
However, unless AMD sorts all this out over the next couple of years, they are in for a huge amount of very costly trouble, and it may be terminal to their future in the desktop market. The problems ahead lie in the area of CPU-GPU integration.
We are told that AMD purchased ATI because they needed graphics expertise for a projected future in which scalar and vector processing is merged in an extremely parallel multi-core processor architecture. It's easy to see the reasoning here, as tight integration would decrease communication latencies and power consumption simultaneously. The benefits of tight integration are likely to be colossal, and AMD knows this from their success with HyperTransport.
Unfortunately, such tight integration also means that ATI's remarkable incompetence at producing even half-decent drivers will bring AMD down badly, unless something is done about it. And short of firing the whole ex-ATI driver team, it's hard to see how to resolve this issue. You can't resolve it by trying to educate bad software engineers, that's for sure.
AMD have quite a problem on their hands.
Re: (Score:1)
Re: (Score:1)
If the financial reports are accurate, AMD doesn't have a couple of years to sort it out. Their timeframe for sorting it out is significantly shorter.
Re: (Score:3, Interesting)
We all know ATI had really poor driver development in the 90s. However, for at least the past five years or so -- since the introduction of the Radeon 9x00 DirectX9 (R300) generation hardware, their drivers have been at least a
Re: (Score:2)
Re: (Score:2)
I've had a Radeon 9500Pro since 2002, and been very happy with the quality of the hardware and drivers, but I've waited for six months for an ATI high end part to become the centerpiece of my next gaming rig. In the end, R600 clearly has enormous potential, but the 2900XT
Re: (Score:2)
ATI and Linux (Score:2)
Re: (Score:2, Interesting)
Re: (Score:2)
The price of the 8800 right now is what most people pay for a desktop computer and monitor. All this hand-wringing over marginal gains in high-end performance is no different than the "Ford vs Chevy" nonsense the pickup-truck crowd is always going off about.
MOD DOWN! SPOILER TROLL! (Score:1, Insightful)
I opt out of mod points, but someone mod this douchebag down. The "quote" contains a spoiler for something.
Re: (Score:1, Troll)
Bah (Score:2, Interesting)
Re:Bah (Score:5, Funny)
Re:Bah (Score:5, Funny)
A real hacker doesn't even need a screen - they just stick their tongue on the HD15* cable and imagine what the screen looks like from the electrical pulses.
(*) Get off of my LAWN!
Re: (Score:2)
Helllloo Child Support! (Score:1)
Re: (Score:2)
Expensive? They are a deal compared to days gone (Score:2)
You don't need these cards to draw pictures on your screen, you need them to animate the pictures on your screen. Sure you could play all games "Myst" style, but that isn't what people are after.
Plus, no one is forcing you to buy the latest and greatest. Quite a few games benefit from these cards, but many can be very playable by knock
Re:Expensive? They are a deal compared to days gon (Score:1)
I know what you mean, I had to pull my pci based gum ball machine last time I upgraded my video card.
Re: (Score:2, Insightful)
Re: (Score:2)
Ob (Score:1, Redundant)
Re: (Score:1, Funny)
Re: (Score:1)
Re: (Score:3, Funny)
This is why we need open source 3D drivers... (Score:5, Interesting)
The hardware probably screams. But ATI has a reputation for really shitty drivers. Without solid, fast, high-quality drivers, fast hardware doesn't matter as much.
NVidia has typically produced fast drivers. They're not open-source, but they're at least good.
If ATI can't get its shit together and write some decent drivers, the only reasonable option for them would be to open-source their 3D drivers so that the community can fix them properly. And I expect the community would do just that, because a lot of developers are also avid PC gamers, so they have a personal stake in it.
It'll be interesting to see where this heads, given the statements made by ATI about open-sourcing their drivers, but I'm not going to hold my breath over it. For now, it's NVidia for my gaming rigs. That'll change as soon as ATI actually open-sources their full 3D drivers.
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Interesting)
Re: (Score:2)
Why do you think those parts cost the most to produce? Given the specs, I would imagine the parts very close to the hardware cost the least. Why do you think otherwise?
Also, why assume the negative? Why not release the unencumbered parts and see what happens? What is there to lose? If you are right then nothing is lost. If you are wrong we end up with some good open drivers.
Re: (Score:1)
I recently assembled a new rig which I intended to run Linux on. I had previously had some bad experiences with the drivers for an nVidia nForce4 motherboard (SATA-drivers corrupted files on disk and hardware CPU offloading for networking caused corrupt downloads and BSODs), but people were telling me how much better nVidia's Linux support was, so I went for a 7600GS anyway.
I'm not saying I regret going for a 7600GS, but I don't think I've ever had the misfortune
Re: (Score:2)
Re: (Score:1)
A few more reviews (Score:5, Informative)
[H]ardocp's take: http://enthusiast.hardocp.com/article.html?art=MT
techPowerUp (Warning, streaming video at the start >.>): http://www.techpowerup.com/reviews/ATI/HD_2900_XT
The Inquirer's expected vapid coverage: http://www.theinquirer.net/default.aspx?article=3
I think I'll wait for more ATI drivers and some DX10 games before calling this one... Looks a little underwhelming at the moment though. I'm not regretting my 8800GTX purchase yet.
Re: (Score:2)
Unsurprising, given that the reviews point to the HD 2900 XT being slower than the 8800 GTX (and the 8800 Ultra). It does surprise me, though, that ATI are 6 months behind and still couldn't beat NVIDIA for the performance crown - but it's nice to see a real fight again on price/performance, and I'm looking forward to seeing how the HD 2600 XT stacks up against the 8600 GTS (for those of us with a sanity budget restriction in place).
Re: (Score:2)
Folding@home performance (Score:1, Informative)
http://forum.folding-community.org/fpost185371.ht
http://folding.stanford.edu/FAQ-ATI.html [stanford.edu]
idle & load power ratings are scary (Score:5, Informative)
Idle (W)
--------
Radeon 2900XT - 183
GeForce 8800 Ultra - 192
GeForce 8800 GTX SLI - 296
Radeon 2900XT Crossfire - 317
Full Load (W)
-------------
Radeon 2900XT - 312
GeForce 8800 Ultra - 315
GeForce 8800 GTX SLI - 443
Radeon 2900XT Crossfire - 490
This could get very expensive for people that leave their computers running 24/7.
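Back-of-the-envelope on what that idle draw costs over a year, assuming a hypothetical $0.10/kWh electricity rate (your rate will differ):

```python
# Rough yearly cost of leaving a system idling 24/7 at the 183 W figure above.
idle_watts = 183
price_per_kwh = 0.10            # assumed rate, not from the article
hours_per_year = 24 * 365

kwh_per_year = idle_watts * hours_per_year / 1000.0
print("%.0f kWh/yr, roughly $%.0f/yr just idling"
      % (kwh_per_year, kwh_per_year * price_per_kwh))   # ~1603 kWh, ~$160
```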
Re: (Score:1)
Re:idle & load power ratings are scary (Score:5, Informative)
Re: (Score:3, Insightful)
I use a Mac G4/dual 500 (i.e. an OLD old machine) as my 24/7 box - cost about 200 bucks, and does just fine quietly humming away in the corner drawing 75 watts.
If your idle numbers are right, you'd better have a good friend at the power company if you plan on leaving that machine running 24/7.
Re: (Score:2)
I expect most energy saving methods currently being used in laptops to be ported to desktops in the near future, because otherwise the power requirements will become unmanageable. If it continues like this...
Re: (Score:2)
For a $300+ card, I think one solution might be to have a low-power-state video chipset on-board which needs no cooling and draws 90% less power.
Re:idle & load power ratings are scary (Score:4, Interesting)
Now don't get me wrong, I love to see these types of improvements in real time graphics rendering, but you know there's something wrong with the industry if they can ask PC Enthusiasts with a straight face to use power supplies powerful enough for Air Conditioning Units (albeit small ones) in their computers. That being said, I still commend the improvements made and I look forward to the lower end, passively cooled, versions becoming available for my next HTPC.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
"our 700W power supply wasn't up to the task of powering a Radeon HD 2900 XT CrossFire rig."
am I missing something?
Re: (Score:1)
Re: (Score:2, Informative)
Power supply manufacturers typically pick a number close to the maximum possible output the unit can provide across all the different voltages it supplies. So when you look at the sticker on the PSU, it will show you the maximum amps for each voltage. You take all of these numbers, multiply the voltage by the amps to get the watts (watts = volts * amps), then add all of those together to get the total maximum power the PSU can provide. That number should be fairly close to the advertised rating.
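A sketch of that calculation with made-up rail numbers (the rails and amp ratings below are invented for illustration; check the label on your own PSU):

```python
# Total rated wattage = sum over rails of (volts * max amps).
rails = {
    "+3.3V": (3.3, 30.0),   # (volts, max amps) - hypothetical figures
    "+5V":   (5.0, 30.0),
    "+12V1": (12.0, 18.0),
    "+12V2": (12.0, 18.0),
}
total_watts = sum(volts * amps for volts, amps in rails.values())
print("%.0f W" % total_watts)   # ~681 W on these invented numbers
```

Real units also list a combined limit for the 12V rails, so the straight sum tends to overstate what you can actually draw.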
Re: (Score:1)
Re: (Score:2)
I just bought a Radeon X1950 Pro since my previous card wasn't handling Lord of the Rings Online very well, and I never considered that it might increase my power consumption when I'm not using the system.
(Genuinely interested in a response to this if someone really knows; it would change my current behavior of just putting the system in powersave to actually turning it off.)
Re: (Score:2)
Really - I can't believe those numbers. Is there an error? How could they _waste_ 200 watts _at idle_?
Re: (Score:1)
I presume you can't shut off the PCIe slot, which means the card is powered. Which probably means electricity running through a whole bunch of circuits, generating heat (i.e. using power). If the card is generating heat, the fan is on, and fans are never, ever winners in the efficiency game. If the fan isn't designed to step very well, that would add to the problem (i.e. if the settings are 'fast' and 'faster' rather th
Re: (Score:2)
Re: (Score:2)
Sounds like they should sell a
Agreed, 150w+ is too much for a graphics card (Score:2)
I was pretty happy when I picked up my 90nm $300 7900 GT last year - same power consumption as my old 6600 GT, but three times the performance! If I HAD to buy today, I'd get the 8800 GTS, but because Nvidia didn't design it with different 2D and 3D clocks, the 2D idle consumption is higher than it should be. The X2900 XT has a "low" idle power consumption with respect to the 8800 series because it supports a lower 2D clock.
Hopefully, this will be corrected in the 65nm parts.
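The reason a lower 2D clock (and voltage) helps so much is the usual dynamic-power relationship, roughly power ~ capacitance * voltage^2 * frequency. A toy illustration with invented clock and voltage numbers, not the card's actual figures:

```python
# Dynamic power scales roughly with capacitance * voltage^2 * frequency.
def dynamic_power(c, volts, freq_mhz):
    return c * volts ** 2 * freq_mhz

c = 1.0                                  # arbitrary constant, cancels in the ratio
full_3d = dynamic_power(c, 1.2, 750)     # hypothetical 3D clock and voltage
idle_2d = dynamic_power(c, 1.0, 500)     # hypothetical lower 2D clock and voltage
print("2D idle draws about %.0f%% of 3D power" % (100 * idle_2d / full_3d))  # ~46%
```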
Re: (Score:3, Interesting)
At least wait for a June refresh if you're going to buy nVidia.
Re: (Score:2)
Not Direct Competitor to 8800gtx (Score:3, Insightful)
The 2900XT is a competitor to the 2 8800GTS models.
They are avoiding the top end market because more often than not the risk of that market does not meet the reward.
To use a baseball comparison, they are playing little ball: trying to manufacture base hits and runs, not home runs.
Offering three cards starting at less than $100 and going up to $400ish is a good strategy for the mainstream market.
The HDMI dongle innovation (it carries video and audio from the video card, because all of the new cards have an audio processor on them) is a boon for them as well, helping to build an image of media-center-capable video cards for a newer generation of computer users.
These will help push down prices on all of the cards within that price range. And possibly help push innovation in the marketplace.
Re: (Score:1)
They spent a fair amount of focus developing a GPU for the Xbox 360, and that R&D did not translate directly into their GPU offerings here.
It remains to be seen if it'll make that much difference in the long run, but at the moment, it looks like ATi hasn't got a whole lot to offer - of course, until we see some DX10 games and com
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
nVidia has higher quality (Score:1)
From my limited point of view, nVidia sells higher quality cards.
Two weeks ago, I had to replace my ATI 1600 Pro with an nVidia 7600 GS. They are roughly equivalent cards.
In Windows XP running Oblivion, I notice a drop in performance with my new 7600 GS. In Ubuntu Linux I notice a glxgears score 10 times higher! Now I understand that this improvement is because of the better drivers, but its
Re: (Score:1)
Is there any hope? (Score:1)