First R600 Review - The Radeon HD 2900XT
mrneutron2004 writes "Tweaktown seems to have the first review out of the gate on AMD's flagship R600 core. 'Our focus today is solely on the HD 2900 XT 512MB GDDR-3 graphics card – it is the first GPU with a fast 512-bit memory interface, but what does this mean for performance? ... After taking a look at the GPU and the card from PowerColor as well as some new Ruby DX10 screenshots, we will move on to the benchmarks and compare the red hot flaming Radeon monster against Nvidia's GeForce 8800 GTX along with the former ATI GPU king, the Radeon X1950 XTX.'"
Insta-Slashdotted (Score:3, Informative)
Re: (Score:1)
Re: (Score:1)
The article is gone, but the ads remain! (Score:3, Funny)
Re: (Score:2, Interesting)
Re: (Score:2, Insightful)
Re: (Score:2, Insightful)
Re: (Score:1)
Re: (Score:2, Interesting)
Re: (Score:1, Troll)
Another review (Score:1)
Summary: Down due to server issues (Score:3, Insightful)
Well TFA is slashdotted, but I think I can guess what it said. This GPU is super fast, super expensive, and super power hungry.
Re: (Score:2, Funny)
Re:Summary: Down due to server issues (Score:4, Funny)
Re: (Score:2, Funny)
Nah, they couldn't afford the resulting electricity bill.
Re: (Score:2)
Re: (Score:2)
Slashdotted (Score:1, Redundant)
Coral Cache only got up to page four before getting the same.
Nothing in Google Cache.
6 Comments (Score:2)
Well people are really chomping at the bit (Score:2)
So it is no wonder everyone wants to know what is up with
Re:Well people are really chomping at the bit (Score:4, Insightful)
From what I know, ATI was much busier than NVidia with the "next gen" consoles. The GPU inside the Xbox 360 is quite sophisticated, and the Wii doesn't just have a faster variant of the GameCube GPU. ATI spent real research time on these products, and this is when ATI came up with their solution for unified shader units on the GPU. So here we are in May of 2007, and ATI has shipped way more unified shader products than NVidia, simply because their product was inside a console that has sold millions. The 8800 series likely hasn't hit a million. NVidia, meanwhile, went with a GPU design mirrored off their 7x00 series of products for the PlayStation 3 while trying to work out their own unified shader cards.
I think ATI made the better move here. They have been recouping the research money on unified shader GPUs from a much bigger market segment, though it does make it appear they are lagging behind in the PC gaming sector.
The good news for gamers is that neither company is likely to go away anytime soon, because both are in many different markets. That is a lesson 3dfx, and many other now dead or nearly dead graphics providers, never learned.
Re:Well people are really chomping at the bit (Score:2)
Now normally I wouldn't be so concerned, but AMD just got their ass handed to them in the form of the Core 2 Duo, and you can see it in the massive loss they've posted. The last thing they need is problems in their graphics division. Console contracts are all well and good, but currently the computer market is where the real money i
Re: (Score:3, Informative)
You're missing one fact: the PC GPU market is a MUCH LARGER market than the console GPU market.
Here are some recent sales numbers: 76 million units in Q3 2006. [reghardware.co.uk] With ATI holding roughly 1/4 of the market (~18 million), that's more units than ATI sold in the last 6 months on the 360 and Wii comb
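(For what it's worth, that math checks out as a quick sketch. The 76 million total is from the linked Register article; the 1/4 share is the poster's own rough estimate, not an official figure.)

    # Sanity check of the poster's PC GPU market math (Python).
    # 76M total units is from the linked article; the 25% ATI share
    # is the poster's rough estimate, not an official number.
    total_q3_2006 = 76_000_000
    ati_share = 0.25
    ati_units = total_q3_2006 * ati_share
    print(f"ATI: ~{ati_units / 1e6:.0f} million PC GPUs in one quarter")  # ~19M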
Re: (Score:2)
http://en.wikipedia.org/wiki/Xenos [wikipedia.org]
Specifically:
48-way parallel floating-point dynamically-scheduled shader pipelines[3]
Unified shader architecture (each pipeline is capable of running either pixel or vertex shaders)
2 shader ALU operations per pipeline per cycle (1 vector4 and 1 scalar, co-issued)
10 FLOPS per pipeline per cycle
48 billion shader operations per second th
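Those headline figures are at least internally consistent. A back-of-the-envelope check below, assuming the Xenos' commonly cited 500 MHz core clock (the clock is not part of the quoted spec, so treat it as an assumption):

    # Back-of-the-envelope check of the quoted Xenos numbers (Python).
    # Assumption: 500 MHz core clock, which is not in the quoted spec.
    clock_hz = 500e6
    pipelines = 48
    ops_per_pipe_per_cycle = 2     # 1 vector4 + 1 scalar, co-issued
    flops_per_pipe_per_cycle = 10  # (4 + 1) MADD lanes x 2 flops each

    shader_ops = pipelines * ops_per_pipe_per_cycle * clock_hz
    flops = pipelines * flops_per_pipe_per_cycle * clock_hz
    print(f"{shader_ops / 1e9:.0f} billion shader ops/s")  # 48, matching the spec
    print(f"{flops / 1e9:.0f} GFLOPS")                     # 240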
Karma whoring - first four pages (Score:5, Informative)
AMD's long awaited R600 DX10 GPU finally arrives
It has been a long time coming, but today AMD is finally set to release its massively anticipated GPU codenamed R600 XT to the world under the official retail name of ATI Radeon HD 2900 XT. It is a hugely important part for AMD right now, which recently posted massive losses. The company is counting on all these new models, led by the high-end 512MB GDDR-3 DX10 part with its 512-bit memory interface, to kick ass and help raise revenue against the current range from the green GeForce team, which is selling like super hot cakes.
The new R600 range of graphics processing units was set to see a release on March 30 (R600 XTX), but due to production issues and an inability to make any firm decisions, it got delayed and delayed. It was beginning to look like AMD would let down its loyal fan base; some even began suggesting the R600 was vaporware. That would have shaken up the industry immensely, and thankfully for all, it did not happen. AMD is finally able to introduce some competition to Nvidia's GeForce lineup with its new series of DX10 and Windows Vista ready products.
Eventually the folks at AMD got their act together, made some clear-cut decisions and got production under control and underway - the delay was probably due to indecision between using GDDR-3 or GDDR-4 and the associated cost vs. performance concerns. It eventually leaked out that the R600 XTX (the highest end model) would be reserved for system integrators due to its size and heat issues - you may or may not see that GPU in OEM systems from companies like Dell and HP. That model measures a staggering 12 inches long and will not be suitable for every computer case or configuration. It was deemed unacceptable for the consumer retail space and hence was scrapped from retail plans.
Today AMD is launching its enthusiast HD 2900 series with the HD 2900 XT, performance parts with the HD 2600 series (HD 2600 XT and HD 2600 PRO), and value parts with the HD 2400 XT and 2400 PRO. The HD 2600 and 2400 series have had issues of their own, and you will need to wait a little longer (until July 1st) before being able to buy those models on shop shelves. The HD 2900 XT will be available at most of your favorite online resellers as of today. Quantity is "not too bad" but a little on the short side, with most of AMD's partners only getting between 400 and 600 units - not much considering the huge number of ATI fans out there. You may want to get in quick and place your order if you are interested; some AIB companies are not sure when they will get their next shipment, either.
Our focus today is solely on the HD 2900 XT 512MB GDDR-3 graphics card - it is the first GPU with a fast 512-bit memory interface, but what does this mean for performance? While it is AMD's top model right now, it is actually priced aggressively at around the US$350 - US$399 mark in the United States, which puts it, price wise, up against Nvidia's GeForce 8800 GTS 640MB. After taking a look at the GPU and the card from PowerColor as well as some new Ruby DX10 screenshots, we will move on to the benchmarks and compare the red hot flaming Radeon monster against Nvidia's GeForce 8800 GTX along with the former ATI GPU king, the Radeon X1950 XTX.
Page 2 [HD 2900 XT GPU]
Radeon HD 2900 XT GPU
R600 is AMD's first range of top-to-bottom DirectX 10 graphics cards with fully certified support for Microsoft's Windows Vista operating system. While DX10 GPU support might not be very important right at this moment, it will soon be a requirement to experience the best graphics potential of current games, which are awaiting DX10 patches, and of upcoming games such as Crysis, Alan Wake and Unreal Tournament 3. Sadly it is basically impossible for us to provide comparative DX10 benchmark numbers between AMD and Nvidia graphics cards at the moment - AMD gave the press a DX10 benchma
Thanks, but... (Score:2)
84 degrees Celsius actually isn't that bad - my MSI 7950GX2 starts throttling at 122C (never gets above 85 maxed out, less than 60C idle), and the 8800GTX in the system I'm building for a client throttles at 127C (also nev
Re: (Score:3, Interesting)
Re: (Score:3, Informative)
However, if he's nearly burning his fingers on the thing, then I wouldn't want it in my PC.
Re: (Score:1, Interesting)
Re: (Score:2)
That's PRETTY FUCKING HOT.
Re: (Score:1)
Not as hot as Kathleen Fent. Last time I checked, she was well above 100C.
Oh... so that's what the ban stick looks like!
Re: (Score:2)
Re: (Score:2)
I don't know the average running temps at stock speeds on your card, but I guarantee that it idles at less than 99C. Running hotter means it won't last as long, and you're also increasing the case temps, warming up the rest of your computer and shortening those components' lives as well.
Re: (Score:2)
Re: (Score:3, Funny)
Watch out, Ron Jeremy. The graphics cards are catching up!
Re: (Score:2)
Too late... (Score:5, Funny)
Tomorrow?! The GPU will already be obsolete by then...
There are other sites... (Score:5, Informative)
Re: (Score:1)
No 8800 GTS Comparison? (Score:1)
Re:No 8800 GTS Comparison? (Score:5, Insightful)
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Am I the only one who reads this as TruForm II? [wikipedia.org]
I owned the only graphics card to support TruForm in hardware (Radeon 8500), and I played exactly one game with TruForm support (Counterstrike), and boy was it disappointing. Will TruForm II suffer a similar fate?
Installing in a Mac Pro? (Score:1, Offtopic)
Why bother reviewing it? (Score:3, Funny)
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Watt (Score:1, Interesting)
I didn't see that in the article, did I miss it?
I don't understand how CPUs get faster and lower power, yet GFX cards get faster and require new power stations
Re: (Score:1)
Re:Watt (Score:5, Interesting)
CPUs, on the other hand, are driven a large part by servers, which do sit in racks and need to run on as low power as possible, because power = heat = bad.
One important thing to recognise is that power requirements per unit of speed are actually dropping; it's just that raw speed is increasing faster than efficiency is. CPUs have a slower rate of speed increase in terms of what is required of them, so power efficiency (which is also a higher priority there) has a chance to catch up.
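Made-up numbers illustrate the point (a sketch, not measurements of any real card):

    # Illustration with invented numbers: efficiency (perf per watt)
    # can improve even while absolute power draw keeps climbing.
    old_perf, old_watts = 100.0, 50.0    # hypothetical last-gen GPU
    new_perf, new_watts = 400.0, 120.0   # hypothetical current-gen GPU
    print(old_perf / old_watts)   # 2.0 perf/watt
    print(new_perf / new_watts)   # ~3.3 perf/watt: efficiency improved
    print(new_watts - old_watts)  # +70 W: the wall socket still notices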
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Here's the power-and-noise [xbitlabs.com] for the 8600GTS.
Sounds like a fine product ... for a BOYCOTT! (Score:3, Informative)
And it's not only 3D performance that sucks. The 2D performance of their drivers is an ORDER OF MAGNITUDE slower than the open-source driver and nVidia's driver at XRENDER performance (i.e. rendering the webpage you're looking at
Like I said
Re: (Score:2)
Re: (Score:2)
Spouting the word "boycott" is pure drivel.
If you find that the performance of their cards is crap in the environment where you want to use them, THEN DON'T BUY THEM; buy something else... that's what you do with products: buy the thing that best fits your needs.
But to suggest a boycott? Dear god, that's truly over the top and lame, and it's that sort of "You're not supporting this tiny user base who DEMANDS you spend stupidly hu
Re: (Score:2)
Re: (Score:2)
After all, it can't be slower if it doesn't exist
Re: (Score:3, Interesting)
I won't comment on the rest of your rant.
Re: (Score:2)
Let me point you to the complete lack of any actual Open Source drivers released by ATI as of this moment. ATI is known for promising a lot, but their "Commitment to Open Source" (first announced around 1999 and repeated consistently since then) has resulted in nothing of any value at all.
Until they actually release Open Source 2D AND 3D drivers, or (even better) release programming docs for their hardware, a boycott is a damn good idea.
Re: (Score:2)
The ATI installer also sucked badly, generating a corrupt xorg.conf. It got confused because the machine had integrated Intel graphic
Re: (Score:2)
Though installing the ati driver th
Re: (Score:2)
I wonder if these cards will be fast enough to run current games. If so, no doubt they'll be a hit amongst Linux gamers.
Re: (Score:2)
Intel graphics are a bit slow for very recent games, but for the majority of Linux users who aren't hardcore gamers the Intel graphics are way, way better than anything that Nvidia or ATI offer. They provide both 2D and 3D acceleration, they don't have stupid bugs that the community can't fix, and they even work great for older games - Quake 3 and stuff like Wolfenstein: Enemy Territory should run great.
Re: (Score:1)
I'll take the open source Intel drivers any day, and would say it is NVIDIA that is actually "shit."
Re: (Score:1)
What I'm looking for in a graphics cards... (Score:1, Interesting)
Re: (Score:3, Informative)
Same for H.264 decoding.
Beg to differ on beryl.. (Score:2)
Re: (Score:2)
[a] costs next-to-nothing for any motherboard maker to integrate (and many of which do)
[b] unlike my gaming rig which houses an 8800GTS, the i945 integrated chipset does not pull 250 Watts when idle. It pulls something much closer to zero.
[c] Due to [b], my X60 does not make me pay the cost of a high-end GPU every year through the electricity bill.
[d] Due to [b], my X60 can stay afloat on battery for 8 hours. (More like 6-7 running
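For what it's worth, the electricity claim in [c] is easy to ballpark (a sketch; the 24/7 uptime and the $0.10/kWh rate are my assumptions, not the poster's figures):

    # Rough yearly cost of 250 W of idle draw (Python).
    # Assumptions: the box idles 24/7 and power costs $0.10/kWh.
    idle_watts = 250
    kwh_per_year = idle_watts * 24 * 365 / 1000   # 2190 kWh
    cost_per_year = kwh_per_year * 0.10
    print(f"${cost_per_year:.0f}/year")  # ~$219, roughly a mid-range GPU's price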
Re: (Score:1)
Re: (Score:2)
So, if you don't mind the odd thing going slow, then the 945G is fine; otherwise try to get a MB with a 6200 onboard.
If you can't do either, then there are quite a few passive 7300s around - they'll be overkill, but that's your fault for picking an Intel CPU and not getting the 945G
AA? (Score:1)
If only... (Score:1)
Re: (Score:1)
Try this... (Score:1)
If these numbers are real, then AMD is having one hell of a bad year so far...
Re: (Score:1)
Not the first review... (Score:2)
VR-Zone's X2900XT Pre/Review [vr-zone.com]
Oh, they aren't slashdotted either, but they have been getting hit hard by hardware junkies.
HardOCP review is online (Score:2)
Bottom line, the 2900XT is "...a day late and a dollar short."
Re: (Score:2)
The power consumption of silicon is temperature-dependent, but the effect is usually small enough to be ignored. A forty-watt difference is HUGE.
Re: (Score:2)
Hot! (Score:2)
I have a now ancient (3+ year old) Radeon 9800 XT, which is still more than decent for graphics. However, I have to have the case open with a Honeywell tornado fan blowing on the card to kee
Re: (Score:2, Insightful)
Re:Lets see, another graphics card? is it needed? (Score:5, Funny)
Don't feel too badly. When you upgrade, you can pop off the heatsinks like opening a vintage wine and smell the air that was actually trapped there in 2006. Let your nose travel back in time to those heady days of yesteryear!
Re: (Score:3, Funny)
Old? You think that's old?! You make me sick!
I recently upgraded from a GeForce 3 Ti to a 6200, you insensitive clod!
Re: (Score:1, Troll)
Re: (Score:2, Insightful)
And if you haven't bought a new graphics card for several years, I don't think you have the faintest idea what you're talking about. Current graphics cards aren't "just 100 MHz faster" or "just 100 megs more RAM" than the graphics cards of a few years ago. They're an order of magnitude faster a
Re: (Score:1, Interesting)
Re: (Score:1)
Re: (Score:2)
Re:Lets see, another graphics card? is it needed? (Score:5, Insightful)
1) X develops hardware x.
2) Y develops hardware y, where y > x.
3) X develops x2, where x2 > y.
4) And so on...
The good thing is that competition usually gets you better pricing and better products. And when people purchase these products, developers adapt to it and find new ways to improve software that takes advantage of the new hardware.
With your thinking, we could go back to the days when Wolfenstein was the latest in graphics and computing, but we don't want that. We need improvements, just like in every other industry.
Also, if you haven't upgraded your computer in years, then you probably don't care much about games. Well, millions of people do. Is Ferrari bankrupt because you're not buying their cars? No, because other people buy them.
Re: (Score:2)
Few companies ever want to take the blame for messes caused by their own products...