First R600 Review - The Radeon HD 2900XT

mrneutron2004 writes "TweakTown seems to have the first review out of the gate on AMD's flagship R600 core. 'Our focus today is solely on the HD 2900 XT 512MB GDDR-3 graphics card – it is the first GPU with a fast 512-bit memory interface, but what does this mean for performance? ... After taking a look at the GPU and the card from PowerColor, as well as some new Ruby DX10 screenshots, we will move on to the benchmarks and compare the red hot flaming Radeon monster against Nvidia's GeForce 8800 GTX along with the former ATI GPU king, the Radeon X1950 XTX.'"
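
For a rough sense of what that 512-bit memory interface means in raw numbers, here is a minimal back-of-the-envelope sketch. The bus widths and memory clocks below are assumptions taken from commonly cited launch specs, not figures from the review itself:

# Peak theoretical memory bandwidth = bus width in bytes x effective data rate.
# Clock figures are assumptions based on commonly cited launch specs; adjust as needed.
def bandwidth_gb_s(bus_width_bits: int, effective_mt_s: float) -> float:
    """Peak theoretical bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return bus_width_bits / 8 * effective_mt_s * 1e6 / 1e9

cards = {
    "Radeon HD 2900 XT (512-bit, ~1656 MT/s GDDR3)": (512, 1656),
    "GeForce 8800 GTX (384-bit, ~1800 MT/s GDDR3)": (384, 1800),
    "Radeon X1950 XTX (256-bit, ~2000 MT/s GDDR4)": (256, 2000),
}

for name, (width, rate) in cards.items():
    print(f"{name}: {bandwidth_gb_s(width, rate):.1f} GB/s")

On paper the 512-bit bus gives the HD 2900 XT a sizable bandwidth lead under those assumed clocks (~106 GB/s vs ~86 GB/s for the 8800 GTX), which is exactly why the benchmark question matters: raw bandwidth alone doesn't settle it.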
  • Insta-Slashdotted (Score:3, Informative)

    by Jare ( 790431 ) on Sunday May 13, 2007 @03:36PM (#19106249) Homepage
    "Sorry but our servers aren't up to this amount of hits"
    • by ytzombe ( 530215 )
      it's gone in a flash.
    • by Almir ( 1096395 )
      cool, they got it down to two steps. 1. show ads without actual content 2. profit!
    • That was so fast you have to wonder if this isn't blatant self-promotion combined with a hoax. That or a con job. I'm sure they'll drum up a healthy bit of ad revenue from this little tease.
    • Re: (Score:2, Interesting)

      by ruiner13 ( 527499 )
      And yet they have no problem leaving all the ads on the page that says they pulled the article, so they can make money off the people expecting the article to be there, thus getting paid for hosting nothing. What a crock of shit.
      • Re: (Score:2, Insightful)

        Um, yeah, notice how those ads aren't hosted on the page itself. It's not their bandwidth that is being used.
        • Re: (Score:2, Insightful)

          by Anonymous Coward
          The ads, like Ross said, are just links. I could send you to a page that has one word and a hundred ads, or to a page with 500 words and no ads. Even though the 500 words take up less space than the hundred ads, they have more of an effect on the server, because the ads are hosted on random servers across the internet.
          • I realize that. The point I was trying to make is that they had an article, and took it down. It would be common decency to not make money off of people who are not getting what they expected when they visit the site. My comment had nothing to do with bandwidth, it had to do with the ethics of putting up a page that people expect to be one thing and instead get one line and a bunch of ads. Just seems shady.
            • Re: (Score:2, Interesting)

              by heinousjay ( 683506 )
              Exactly how does it hurt you? Or are you just the kind of person who likes to whine whenever someone else makes money?
    • Also slow, but at least working: http://www.vr-zone.com/print.php?i=4946 [vr-zone.com]
  • by mastershake_phd ( 1050150 ) on Sunday May 13, 2007 @03:40PM (#19106297) Homepage
    Summary: Down due to server issues - check back later, sorry!
     
    Well TFA is slashdotted, but I think I can guess what it said. This GPU is super fast, super expensive, and super power hungry.
    • Re: (Score:2, Funny)

      by aichpvee ( 631243 )
      Don't forget it's probably decent competition for a TNT2 in 3D games under Linux!
    • Maybe they should have used the card to power their servers?
      • Re: (Score:2, Funny)

        by Ice Wewe ( 936718 )

        Maybe they should have used the card to power their servers?

        Nah, they couldn't afford the resulting electricity bill.
    • by Kjella ( 173770 )
      According to the VR-Zone review linked further down the page, it's neither super fast nor very expensive, but it sure is super power hungry. It's a match for the 8800 GTS series, but not more. I sure wouldn't want to see the power requirements for a Crossfire rig with these...
      • Read further into the mentioned article and you'll see that apparently the drivers are still being polished (a point-release driver yielded ~30-40% gains in Q4). So, while the card may be quite fast, it's not being used to its full potential yet. Wait for the next driver release (hopefully coming in a few days) before making your ultimate judgment.
  • Slashdotted (Score:1, Redundant)

    by Nimey ( 114278 )
    Mirrordot's got the "can't handle the load" page.

    Coral Cache only got up to page four before getting the same.

    Nothing in Google Cache.
  • And the site is dead. This is a hot piece of hardware though - doesn't really surprise me that it's down already. ATI really needs to hit one out of the park with it to keep AMD afloat in the coming months.
    • I mean, it is almost unthinkable that ATi would have NO response to nVidia for so long. Prior to the 8800 launch, the chatter was about ATi having a card with unified shaders and nVidia having a more classic design (DX10 requires only a unified API, not unified hardware) and so on. Then the 8800 gets dropped on the world and ATi does... nothing. It's been over 6 months, which is essentially a whole cycle in the graphics industry, and there's still nothing.

      So it is no wonder everyone wants to know what is up with
      • by Drakino ( 10965 ) on Sunday May 13, 2007 @05:57PM (#19107119) Journal
        Sure, ATI had no response to NVidia in the gaming computer graphics market, but that isn't the only market that these companies operate in. ATI's lead over NVidia with the Radeon 9700 didn't kill NVidia, and this current situation with the 8800 having such a huge lead won't kill ATI.

        From what I know, ATI was much busier than NVidia was with the "next gen" consoles. The GPU inside the Xbox 360 is quite sophisticated, and the Wii doesn't just have a faster variant of the GameCube GPU. ATI spent real research time on these products, and this is when ATI came up with their solution for unified shader units on the GPU. So here we are in May of 2007, and ATI has shipped way more unified shader products than NVidia, simply because their product was inside a console that has sold millions; the 8800 series likely hasn't hit a million. NVidia, meanwhile, went with a GPU design mirrored off their 7x00 series of products for the PlayStation 3 while trying to work out their own unified shader cards.

        I think ATI made the better move here. They have been recouping the research money on unified shader GPUs from a much bigger market segment, though it does make it appear they are lagging behind in the PC gaming sector.

        The good news for gamers is that neither company is likely to go away anytime soon, because they are both in many different markets. That's a lesson 3dfx never learned, along with many other now dead or nearly dead graphics providers.
        • I hope not, because the 2900 just launched and it fails to impress. It is maybe as good as an 8800 GTS (depending on which review method is used) but uses way more power and costs more.

          Now, normally I wouldn't be so concerned, but AMD just got their ass handed to them in the form of the Core 2 Duo, and you can see it in the massive loss they've posted. The last thing they need is problems in their graphics division. Console contracts are all well and good, but currently the computer market is where the real money i
        • Re: (Score:3, Informative)

          I think ATI made the better move here. They have been recouping the research money on unified shader GPUs from a much bigger market segment, though it does make it appear they are lagging behind in the PC gaming sector.

          You're missing one fact: the PC GPU market is a MUCH LARGER market than the console GPU market.

          Here are some recent sales numbers: 76 million units in Q3 2006. [reghardware.co.uk] With ATI holding roughly 1/4 of the market (~18 million), that's more units than ATI sold in the last 6 months on the 360 and Wii comb
  • by Nimey ( 114278 ) on Sunday May 13, 2007 @03:47PM (#19106349) Homepage Journal
    Page 1 [Introduction]

    AMD's long awaited R600 DX10 GPU finally arrives

    It has been a long time coming, but today AMD is finally set to release its massively anticipated GPU codenamed R600 XT to the world under the official retail name of ATI Radeon HD 2900 XT. It is a hugely important part for AMD, which recently posted massive loss figures. AMD is counting on all these new models, along with the high-end 512MB GDDR-3 DX10 part with its 512-bit memory interface, to kick ass and help raise revenue against the current range from the green GeForce team, which is selling like super hot cakes.

    The new R600 range of graphics processing units was set to see a release on March 30 (R600 XTX), but due to production issues and a lack of firm decisions it got delayed and delayed. It was beginning to look like AMD would let down its loyal fan base; some even began suggesting the R600 was vaporware. That would have shaken up the industry immensely and, thankfully for all, it did not happen. AMD is finally able to introduce some competition to Nvidia's GeForce lineup of cards with its new series of DX10 and Windows Vista ready products.

    Eventually the folks at AMD got their act together, made some clear-cut decisions and got production under control and underway - the delays were probably due to indecision between using GDDR-3 or GDDR-4 and the associated cost vs. performance concerns. It eventually leaked out to the world that the R600 XTX (the highest-end model) would be reserved for system integrators due to its size and heat-related issues - you may or may not see this GPU in OEM systems from companies like Dell and HP. That model measures a staggering 12 inches long and probably will not be suitable for every computer case or configuration. It was deemed unacceptable for the consumer retail space and hence was scrapped from those plans.

    Today AMD is launching an enthusiast part in the HD 2900 series with the HD 2900 XT, performance parts in the HD 2600 series including the HD 2600 XT and HD 2600 PRO, along with value parts including the HD 2400 XT and 2400 PRO. The HD 2600 and 2400 series have had issues of their own, and you will need to wait a little longer (until July 1st) before being able to buy those models on shop shelves. The HD 2900 XT will be available at most of your favorite online resellers as of today. Quantity is "not too bad" but a little on the short side, with most of AMD's partners only getting between 400 - 600 units, which is not that much considering the huge number of ATI fans out there. You may want to get in quick and place your order if you are interested - some AIB companies are not sure when they will get their next order, either.

    Our focus today is solely on the HD 2900 XT 512MB GDDR-3 graphics card - it is the first GPU with a fast 512-bit memory interface, but what does this mean for performance? While it is AMD's top model right now, it is actually priced aggressively at around the US$350 - US$399 mark in the United States, which puts it price-wise up against Nvidia's GeForce 8800 GTS 640MB. After taking a look at the GPU and the card from PowerColor, as well as some new Ruby DX10 screenshots, we will move on to the benchmarks and compare the red hot flaming Radeon monster against Nvidia's GeForce 8800 GTX along with the former ATI GPU king, the Radeon X1950 XTX.


    Page 2 [HD 2900 XT GPU]
    Radeon HD 2900 XT GPU

    R600 is AMD's first range of top to bottom DirectX 10 graphics cards with fully certified support for Microsoft's Windows Vista operating system. While DX10 GPU support might not be very important right at this moment, soon it will be a requirement to experience the best graphics potential from current games, which are awaiting DX10 patches, and upcoming games such as Crysis, Alan Wake and Unreal Tournament 3. Sadly it is basically impossible for us to provide comparative DX10 benchmark numbers between AMD and Nvidia graphics cards at the moment - AMD gave the press a DX10 benchma
    • There isn't really much in the first four pages that we didn't already know from The Inquirer. I also recall them linking to a recent and thorough benchmarking of the HD2900 XT by it-review.net [it-review.net] - don't think that was a hoax, so this wouldn't be the first review of the R600 by any measure.

      84 degrees Celsius actually isn't that bad - my MSI 7950GX2 starts throttling at 122C (never gets above 85 maxed out, less than 60C idle), and the 8800GTX in the system I'm building for a client throttles at 127C (also nev
      • Re: (Score:3, Interesting)

        by Quantam ( 870027 )
        Can anybody verify that this guy knows the difference between Celsius and Fahrenheit? As far as I know CPUs don't usually live past 90 C or so, let alone 127 C.
        • Re: (Score:3, Informative)

          by WoLpH ( 699064 )
          He does know the difference between Celsius and Fahrenheit, and it's indeed "normal" for video cards to run at temperatures like this. The temperature a processor can handle varies greatly: while CPUs usually stop working at around 80-90C die temperature, video cards can take at least 20 degrees above that.

          However, if he's nearly burning his fingers on the thing, then I wouldn't want it in my PC.
        • Re: (Score:1, Interesting)

          by Anonymous Coward
          My "core slowdown threshold" on a 6800GT is 120C. It is currently running at 58C. CPUs get damaged at about 85 or 86C, but video cards seem to handle much more. I thought the max temp was determined by the heat required to crack silicon, but I guess I don't know what I'm talking about either.
      • "84 degrees Celsius = 183.2 degrees Fahrenheit"

        That's PRETTY FUCKING HOT.
        • "84 degrees Celsius = 183.2 degrees Fahrenheit" That's PRETTY FUCKING HOT.

          Not as hot as Kathleen Fent. Last time I checked, she was well above 100C.

          Oh... so that's what the ban stick looks like!

        • That's almost hot enough to boil water. I don't know why anybody would buy something that operated at such high temperatures. It's sure to burn a hole in itself within the first year.
      • by jZnat ( 793348 ) *
        84 C not that bad? You can probably go sterile just looking at the damn thing. :P
    • Re: (Score:3, Funny)

      by llamaxing ( 895844 )

      That model will measure a staggering 12-inches long...


      Watch out, Ron Jeremy. The graphics cards are catching up!
      • by lendude ( 620139 )
        Graphics card length long ago surpassed "The Hedgehog's" equipment - not that I'm saying I've looked, mind you: I only watch RJ pr0n for the humour!
  • Too late... (Score:5, Funny)

    by yacTheFourth ( 1074755 ) on Sunday May 13, 2007 @03:47PM (#19106351) Homepage
    "We will put it online tomorrow when other sites release their reviews to balance the load."

    Tomorrow?! The GPU will already be obsolete by then...
  • by Cave Dweller ( 470644 ) on Sunday May 13, 2007 @03:47PM (#19106355)
    VR-Zone, for example: http://www.vr-zone.com/?i=4946&s=1 [vr-zone.com]
  • It seems odd to me that they don't compare the 2900XT to the 8800 GTS 640MB. Comparing it with the top-end 8800 GTX means comparing it to a much more expensive card, so this review isn't really a fair comparison. If they wanted to include the 8800 GTX for reference, fair enough, but they should be comparing the card to its intended competitor.
    • by CanadaIsCold ( 1079483 ) on Sunday May 13, 2007 @04:08PM (#19106509)
      They do note that they find the card priced aggressively. This is the highest-end graphics card they are producing for the retail space. Direct competitors do not always cost the same thing; they may be trying to undercut the price of the Nvidia card, which is why the review compares the similarly featured rather than the similarly priced.
      • If the card performed similarly to an 8800 GTX I would agree with you, but it loses nearly all the benchmarks to the GTX. Consequently, they should compare it with the GTS too, to show where it lies in the larger scheme of things.
    • From the VR-Zone review [vr-zone.com] linked to in a previous comment by Cave Dweller [slashdot.org]:

      It is slightly off tradition that the GPU company's flagship product sails off not to meet the flagship of its competitor, but one target lower. Then again, the lower we go down the price pyramid, the bigger the audience and the more people with the budget to spend. I'd say that there is no clear winner between the 8800 GTS and X2900XT; the GTS displayed more consistent performance behavior while the X2900XT fluctuates around due to the in-ma

      • Easy win for Nvidia: Linux drivers apparently not programmed by the outsourcing company's tea boy.
      • New programmable Tessellation technology

        Am I the only one who reads this as TruForm II? [wikipedia.org]

        I owned the only graphics card to support TruForm in hardware (Radeon 8500), and I played exactly one game with TruForm support (Counterstrike), and boy was it disappointing. Will TruForm II suffer a similar fate?
  • This is off-topic, but I was curious if anyone's tried to get one of the current higher-end cards working in a Mac Pro? Are there driver issues in doing so?
  • by Anonymous Coward on Sunday May 13, 2007 @04:09PM (#19106515)
    How about a warning instead, something like:

    WARNING: This is an ATI card and requires their sucky drivers.
    • Sucky no more! They'll be "open" (GPL, presumably) soon, remember...
    • by SP33doh ( 930735 )
      Though I'd still always go Nvidia for a computer that'll run Linux, I've heard that ATI's Windows drivers have gotten a lot better.
      • by jZnat ( 793348 ) *
        In my experience, both NVidia's and ATI's offerings on the driver front on Windows suck. Of course, this is based more on my friends' experiences nowadays, since I usually use Intel (notebooks) or NVidia (desktops, Linux) for graphics anyhow.
      • I don't get all the ATI driver bashing either. I will agree that their .NET based configuration program absolutely blows, but otherwise the drivers themselves seem to be pretty stable.
  • Watt (Score:1, Interesting)

    by Anonymous Coward
    Performance per watt is a much more interesting & important figure to release.
    I didn't see that in the article; did I miss it?

    I don't understand how CPUs get faster and lower power, yet GFX cards get faster and require new power stations :-)
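
    A performance-per-watt figure is easy to derive whenever a review publishes both average frame rates and measured board or system power draw. Here is a minimal sketch of the arithmetic; the card names and numbers are hypothetical placeholders for illustration, not measurements from this review:

# Performance per watt = average frames per second / average power draw in watts.
# The sample numbers below are hypothetical placeholders, not review data.
def fps_per_watt(avg_fps: float, avg_watts: float) -> float:
    return avg_fps / avg_watts

samples = {
    "Card A": (85.0, 215.0),  # (average FPS, average power in W), hypothetical
    "Card B": (78.0, 160.0),
}

for card, (fps, watts) in samples.items():
    print(f"{card}: {fps_per_watt(fps, watts):.2f} FPS/W")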
    • There are actually a lot of really good reasons for that. CPUs have only emphasized low power consumption recently, and they sacrifice some performance for it. In the high-end video card market, performance reigns supreme, whereas in the CPU market, server needs reign supreme. Servers don't take a high-end GPU and therefore aren't driving lower GPU power consumption.
    • Re:Watt (Score:5, Interesting)

      by jacksonj04 ( 800021 ) <nick@nickjackson.me> on Sunday May 13, 2007 @05:02PM (#19106849) Homepage
      Because the only people who use performance GPUs are those who want to simply get as much kick as possible, be it graphics designers, gamers or medical imaging. In these situations the machines won't be sat in a rack, and power is irrelevant as long as the PSU can supply it. The heat can quite easily be dissipated from high performance desktop towers either via liquid cooling or just enormous heat sinks and fans.

      CPUs, on the other hand, are driven a large part by servers, which do sit in racks and need to run on as low power as possible, because power = heat = bad.

      One important thing to recognise is that power requirements per unit of speed are actually dropping; it's just that speed increases faster than the increase in efficiency. CPUs have a slower rate of speed increase in terms of what is required of them, so power efficiency (which is also a higher priority) has a chance to catch up.
      • by Kjella ( 173770 )
        Well, maybe I'm the exception to the rule, but I got a Shuttle SD39P2 (one of those small XPCs) and an 8800GTS. There's no competition for me because 1) I doubt the 400W PSU in the Shuttle could handle it, 2) all that extra heat in a small case is hopeless and 3) 12 inches would block the whole air intake (you wouldn't want to put an 8800GTX in one of these either). It's a very nice box to travel around with though; it's hardly a laptop, but as desktops go it's massive power in a tiny package. And I do know tha
      • My 6600GT was pretty cheap and I use it for medical imaging (can't wait till HDR monitors and cards kick into place - 12-bit here we come!). I suppose I can easily cripple it, though, with a large enough DTMRI data set and some module with a high polygon count.
    • X-bit labs always measure power and noise for GPUs, but they don't have a review up yet.
      Here's the power-and-noise [xbitlabs.com] for the 8600GTS.
  • by vandan ( 151516 ) on Sunday May 13, 2007 @04:45PM (#19106735) Homepage
    ATI's own Linux drivers are absolute SHIT. Their latest and greatest 3D offerings are easily outperformed by bargain-basement cards from nVidia. And ATI have broken their word on their plan to 'support' open-source drivers, refusing to give any hardware specifications to developers, leaving them to reverse-engineer everything.

    And it's not only 3D performance that sucks. The 2D performance of their driver is an ORDER OF MAGNITUDE slower than the open-source driver, and than nVidia's driver, at XRENDER operations (i.e. rendering the webpage you're looking at... have you ever wondered why scrolling in Firefox is so fucking slow on an ATI card?). See http://ati.cchtml.com/show_bug.cgi?id=7 [cchtml.com]. The bottom comment says:

    Yes, you read that correctly, the Radeon X1400 is 15x slower than the (now obsolete) Radeon 7500, and 114x slower than the (similarly priced) Geforce 6600. To buy new hardware only to find that it's exponentially slower than the old hardware at the most basic of tasks is insulting. There shouldn't be any problem with making the X1400 *at least* as fast as the 7500, and preferably, competitive with it's similarly priced competition.

    Unless this situation is rectified (either by the fglrx drivers being fixed, or documentation being released so that open source drivers can be developed), I will not buy any more ATI hardware, simply because it is embarrassingly slow.


    Like I said... sounds like a fine product for a boycott. Get an Intel graphics accelerator instead. They have excellent open-source drivers, and are about to release a stand-alone graphics card (previously all have been integrated).
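
    The XRENDER comparisons quoted above come from micro-benchmarks. If you want to reproduce that kind of measurement yourself, x11perf's antialiased-text tests are the usual tool; below is a rough Python wrapper around it. The output-parsing regex is an assumption about x11perf's summary format and may need adjusting for your build:

import re
import subprocess

def x11perf_rate(test: str = "aa10text") -> float:
    """Run one x11perf test and return the reported operations per second.

    Assumes x11perf is installed and a display is available, and that its
    summary line contains a "(<rate>/sec)" figure; adjust the regex if your
    build prints something different.
    """
    out = subprocess.run(["x11perf", f"-{test}"],
                         capture_output=True, text=True).stdout
    match = re.search(r"\(\s*([\d.]+)/sec\)", out)
    if match is None:
        raise RuntimeError("could not parse x11perf output:\n" + out)
    return float(match.group(1))

if __name__ == "__main__":
    # Run the same test under two different drivers (or on two machines)
    # and compare the ratio to get numbers like the 15x / 114x figures above.
    print(f"aa10text: {x11perf_rate():.0f} ops/sec")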
    • I'll save my boycotts for issues that involve life or death situations/consequences.
    • by spoco2 ( 322835 )
      Dear god, could you get a little more irate about something that really DOESN'T MATTER!

      Spouting the word "boycott" is pure drivel.

      If you find that the performance of their cards is crap in the environment where you want to use them, THEN DON'T BUY THEM, buy something else... that's what you do with products: buy the thing that best fits your need.

      But to suggest a boycott? Dear god, that's truly over the top and lame, and it's that sort of "You're not supporting this tiny user base who DEMANDS you spends stupidly hu
      • by jZnat ( 793348 ) *

        THEN DON'T BUY THEM
        Uh, if I'm not mistaken, that's pretty much what a boycott is. Yeah, good advice there, buddy.
    • by Ant P. ( 974313 )
      ATi's linux driver won't be slower on this card for at least half a year.

      After all, it can't be slower if it doesn't exist ;-)
    • Re: (Score:3, Interesting)

      by Hemogoblin ( 982564 )

      And ATI have broken their word on their plan to 'support' open-source drivers, refusing to give any hardware specifications to developers, leaving them to reverse-engineer everything.
      Let me refer you to the Slashdot story posted two hours before this one: AMD Promises Open Source Graphics Drivers [slashdot.org]

      I won't comment on the rest of your rant.
      • Let me point you to the complete lack of any actual Open Source drivers released by ATI as of this moment. ATI is known for promising a lot of stuff, but their "Commitment to Open Source" (first announced around 1999, repeated consistently since then) has resulted in nothing of any value at all.

        Until they actually release Open Source 2D AND 3D drivers, or (even better) release programming docs for their hardware, a boycott is a damn good idea.

    • by AaronW ( 33736 )
      I don't even care if the drivers are slow (though not too slow). I have major problems with just 2D graphics! XEmacs on my ATI X1300 leaves crap all over the window constantly, and the cursor is completely corrupted. I have to force the window to redraw constantly just to get rid of the corruption. I've also seen this problem a few times in other editors, but only rarely.

      The ATI installer also sucked badly, generating a corrupt xorg.conf. It got confused because the machine had integrated Intel graphic
      • ATI budget cards seem to have a history of bad or too-aggressively overclocked RAM. I had an ATI X800GTO that caused the same problems you describe, but in Windows. Dropping the graphics RAM speed by 10-20MHz cleared up the issues, but that was obviously unacceptable. I couldn't get a reply from ATI, so in the end I borrowed an ATI X1900XTX from a mate, and am now replacing it with an nVidia 8800GTS. I now run Linux (Ubuntu Feisty x86-64), so I could do with a card that has fewer issues.

        Though installing the ati driver th
    • Get an Intel graphics accelerator instead. They have excellent open-source drivers, and are about to release a stand-alone graphics card

      I wonder if these cards will be fast enough to run current games. If so, no doubt they'll be a hit amongst Linux gamers.

      • Intel graphics are a bit slow for very recent games, but for the majority of Linux users who aren't hardcore gamers, Intel graphics are way, way better than anything that Nvidia or ATI offer. They provide both 2D and 3D acceleration, they don't have stupid bugs that the community can't fix, and they even work great for older games - Quake 3 and stuff like Wolfenstein: Enemy Territory should run great.

  • by Anonymous Coward
    I'm not a gamer, so I'm just looking for something that can handle aero/compiz/beryl and do accelerated HD video decoding (H.264/XviD/Divx) while using the least amount of power, and with just passive cooling. Having not followed graphics cards for a while, I'm sort of out of the loop. What cards out there fit my needs?
    • Re: (Score:3, Informative)

      Beryl doesn't need a high-end video card at all; you can use a GeForce 6200A and play with any eye candy you want.

      Same for H.264 decoding.
      • I have a Quadro FX 1000 (relatively ancient, but respectable). If I use a handful of windows, yes, it's fast, but once I start opening my typical workload of windows, it quickly crawls to tens of FPS on most operations.
    • Aero? Dude, my laptop (Thinkpad X60) has got an i945 integrated chipset that
      [a] costs next-to-nothing for any motherboard maker to integrate (and many of which do)
      [b] unlike my gaming rig which houses an 8800GTS, the i945 integrated chipset does not pull 250 Watts when idle. It pulls something much closer to zero.
      [c] Due to [b], my X60 does not make me pay the cost of a high-end GPU every year through the electricity bill.
      [d] Due to [b], my X60 can stay afloat on battery for 8 hours. (More like 6-7 running
    • by LIGC ( 974596 )
      According to http://www.anandtech.com/video/showdoc.aspx?i=2977 [anandtech.com], the new Nvidia 8500 and 8600 series cards have 100% H.264 offload acceleration, so it might be the kind of card you are looking for. The 8500 is also relatively cheap, coming in under $100. As for Beryl/Compiz, even a Radeon 7000 and integrated graphics will do, while for Aero Glass most DirectX9 compliant cards will handle it well.
  • by SP33doh ( 930735 )
    are there any image quality comparisons between the R600's CFAA and the G80's CSAA?
  • ...they had hosted the site on the HD 2900 XT.
  • http://it-review.net/index.php?option=com_content&task=view&id=1314&Itemid=1 [it-review.net]

    If these numbers are real, then AMD is having one hell of a bad year so far....

  • VR-Zone had theirs up at 3:51am EST

    VR-Zone's X2900XT Pre/Review [vr-zone.com]

    Oh, they aren't slashdotted either, but have been getting hit hard by hardware junkies.
  • http://enthusiast.hardocp.com/article.html?art=MTM0MSwxNywsaGVudGh1c2lhc3Q= [hardocp.com]

    Bottom line, the 2900XT is "...a day late and a dollar short."
    • I wonder if the highly temperature-dependent power consumption can be blamed on the new digital PWM power regulation chips? Guess we'll never know for sure.

      The power consumption of silicon is temperature-dependent (leakage current rises with die temperature), but it's usually small enough to be ignored. A forty-watt difference is HUGE.
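
      To put a rough number on that: an often-quoted rule of thumb is that leakage power roughly doubles for every ~10 C rise in die temperature. Below is a minimal sketch of what that rule implies between the idle and load core temperatures reported in the review; the baseline leakage figure and the doubling interval are assumptions for illustration, not measured R600 values:

# Rule-of-thumb model: leakage power roughly doubles every ~10 C of die temperature.
# The baseline leakage and doubling interval below are illustrative assumptions only.
def leakage_watts(temp_c: float, base_watts: float = 15.0,
                  base_temp_c: float = 65.0, doubling_interval_c: float = 10.0) -> float:
    return base_watts * 2 ** ((temp_c - base_temp_c) / doubling_interval_c)

idle, load = 65.0, 84.0  # core temperatures reported in the review (Celsius)
delta = leakage_watts(load) - leakage_watts(idle)
print(f"Estimated leakage increase from {idle:.0f} C to {load:.0f} C: ~{delta:.0f} W")

      Under those assumed parameters, a swing on the order of forty watts between idle and load temperatures is plausible from leakage alone.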
      • Yup. Burns more power, performs slower, and is $50 more than the better performing NVidia card. Sure sounds like a mega-flop to me. ;)
  • From TFA:

    Ouch! Using some AMD monitoring software, we noted idle temperatures of between 65 - 70-degrees Celsius from the core die and that is just sitting in Windows. At full load halfway through a 3DMark06 benchmark run, we noted a maximum temperature of 84-degrees Celsius from the core die - in other words, very hot!

    I have a now ancient (3+ year old) Radeon 9800 XT, which is still more than decent for graphics. However, I have to have the case open with a Honeywell tornado fan blowing on the card to kee
