
AMD's Radeon HD 2900 XT Reviewed

J. Dzhugashvili writes "The folks at The Tech Report have whipped up a detailed exposé of the new AMD Radeon HD 2900 XT graphics card's architecture and features, with plenty of benchmarks. While the card dazzles with 320 stream processors, a 512-bit memory bus, and oodles of memory bandwidth, its performance and power consumption seem disappointing in the face of Nvidia's six-month-old GeForce 8800 graphics cards."
  • this drives down prices. I still want an 8800GTX. :)
    • I still want a comparison of every DX10 card available, running a selection of say 10 of the most popular games from the last 3 years with all the pretty options turned up to the max, on XP and on Vista, in both a top-end God Box and a typical 18-month old good-but-not-outstanding-for-its-age PC.

      2 tables of aggregate fps scores at the beginning not the end, one for XP and one for Vista, each with one row per card and 4 columns: normal PC @1280x1024; normal PC @1600x1200; god box @1280x1024; god box @1600x1200.

      • Re:Let's hope (Score:4, Informative)

        by Psiven ( 302490 ) on Tuesday May 15, 2007 @05:35AM (#19127771)
        You can do pretty well estimating your performance if you have a general understanding of how your components work together.

        First measure FPS in your favorite app at the lowest resolution. That's the measure of your CPU bottleneck. No matter how nice of a GPU you buy, you'll never get higher FPS than that.

        Memory is one of those things you can never have enough of. Just don't worry about the bandwidth too much; you're only going to squeeze out a few extra frames per second with top-of-the-line RAM. Just watch to see if your machine is hitting the hard drive much, and consider more if it is.

        Most new games are still GPU-limited, and this is where you want to focus your attention. Look for benchmarks at the resolutions you play at; that's a good baseline of what to expect. Anything over 60fps average I tend to be happy with, but you may want to consider the minimum too. Right now the only benchmarks I've really been interested in are of Rainbow Six: Vegas. It uses Unreal Engine 3, and a lot of upcoming games are going to be using it too. Other benchmarks might be important to you as well, but they tend to rank in the hundreds, so you know performance won't be an issue.
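
        A minimal sketch of that rule of thumb, for anyone who wants to plug in their own numbers (the frame rates below are hypothetical examples, not measurements):

          # Your effective frame rate is capped by whichever side is the bottleneck:
          # the CPU ceiling measured at the lowest resolution, or the GPU-bound
          # figure from a benchmark at the resolution you actually play at.
          def expected_fps(cpu_ceiling_fps, gpu_bound_fps):
              return min(cpu_ceiling_fps, gpu_bound_fps)

          # Hypothetical numbers: 95 fps at 640x480 (CPU ceiling), 58 fps in a
          # review benchmark at 1600x1200 -> GPU-limited at about 58 fps.
          print(expected_fps(cpu_ceiling_fps=95, gpu_bound_fps=58))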

        • Those aren't really good measurable benchmarks you've chosen. The best benchmark is knowing how the hardware works. A big item to look at is the number of pixel pipelines (my GeForce 7800 GS OC can beat the crap out of an 8500 because it has 16 pixel pipelines) and the shading engine support. More and more applications are using less geometry (because it complicates the physics) and more shading and higher-resolution textures to attain graphical nirvana. The better the shader engine and the more pixel pi
      • NVIDIA's current card is six months old and they're obviously just waiting for AMD to show their hand. I wouldn't be the least bit surprised if they announce something with a lot more processors and 25% higher clock speed in the next couple of weeks.

        • They could, but the evidence so far indicates that they don't need to. They've already got the 8800GTS selling for a lower price than the 2900XT and beating the latter in testing. Nvidia can take whatever time they need.
      • "still want a comparison of every DX10 card available, running a selection of say 10 of the most popular games from the last 3 years with all the pretty options turned up to the max, on XP and on Vista, in both a top-end God Box and a typical 18-month old good-but-not-outstanding-for-its-age PC."

        How many DX10 games are there at the moment? My x800XT runs all of the most popular games from the last 3 years with all the pretty options set to max.

        That said, I am getting an 8800 for my next rig.
  • by Anonymous Coward
    At least on Windows. I got a free el-cheapo X1300 which I ended up replacing my GF6600 with. Sure, the latter scored better in 3DMark-whatever, but at the cost of jerky frame rates in non-mainstay games, such as Outrun 2006 (which is a bit odd, since I heard Sega uses the GF6600 in their arcade machines).

    Anyway, while these X2900s do not seem to be great performers, I suspect their Vista drivers are better. As a Vista user, the GF8800 is right now out of the question, unless the driver situation has changed recently.
  • by MountainMan101 ( 714389 ) on Tuesday May 15, 2007 @04:18AM (#19127437)
    AMD/ATI losing out to nVidia in the extreme power cards.
    AMD/ATI losing out to Intel with the onboard graphics.

    nVidia has a better closed source linux driver than ATI.

    At the moment the only appeal of ATI is that their mediocre graphics cards have open source 2D+3D drivers on Linux, with R200 (helped by ATI) or R300 (no help from ATI/AMD) drivers.

    At the moment AMD's best strategy is to build some fantastic onboard graphics chips for their AMD processors and try to beat nVidia by making an AMD chip + onboard graphics a brilliant combination (i.e. no need to add an aftermarket card).
    • I think you need to credit ATI with these accomplishments and not try to blame AMD. Nvidia has had the upper hand for a long time if you're happy with closed source drivers. By the time you're looking at the workstation cards it's even more obvious, as the FireGL line really does suck compared to the Quadros. I was really disappointed when SGI teamed up with ATI for their graphics cards, as there was never much hope of it ending well (and it didn't).
    • by Morgaine ( 4316 ) on Tuesday May 15, 2007 @05:51AM (#19127851)
      At the present time, the problems that AMD inherited when it bought ATI don't really matter greatly (except as a perception), because only enthusiasts buy graphics cards that cost as much as a basic PC. It's not the volume market.

      However, unless AMD sorts all this out over the next couple of years, they are in for a huge amount of very costly trouble, and it may be terminal to their future in the desktop market. The problems ahead lie in the area of CPU-GPU integration.

      We are told that AMD purchased ATI because they needed graphics expertise for a projected future in which scalar and vector processing are merged in an extremely parallel multi-core processor architecture. It's easy to see the reasoning here, as tight integration would decrease communication latencies and power consumption simultaneously. The benefits of tight integration are likely to be colossal, and AMD knows this from their success with HyperTransport.

      Unfortunately, such tight integration also means that ATI's remarkable incompetence at producing even half-decent drivers will bring AMD down badly, unless something is done about it. And short of firing the whole ex-ATI driver team, it's hard to see how to resolve this issue. You can't resolve it by trying to educate bad software engineers, that's for sure.

      AMD have quite a problem on their hands.
      • by tuxicle ( 996538 )

        We are told that AMD purchased ATI because they needed graphics expertise
        I thought it was to gain ATI's experience with chipset design, which AMD has been famously deficient in (AMD760, anyone?)
      • by fitten ( 521191 )

        However, unless AMD sorts all this out over the next couple of years,


        If the financial reports are accurate, AMD doesn't have a couple of years to sort it out. Their timeframe for sorting it out is significantly shorter.
      • Re: (Score:3, Interesting)

        by Gordo_1 ( 256312 )
        You had me for a moment there, but then you went and used the old "ATI makes bad drivers" shtick again. I understand that their Linux support has been more or less non-existent, but believe it or not, ATI drivers have been quite solid on the Windows side of the house for a while now.

        We all know ATI had really poor driver development in the 90s. However, for at least the past five years or so -- since the introduction of the Radeon 9x00 DirectX9 (R300) generation hardware -- their drivers have been at least a
        • Though the silicon may well clock 50-100% higher and blow away nVidia's 8800GTX, it turns out that it eats 600W PSUs for breakfast -- that's the real reason AMD couldn't release a high-end part: Few except for hardcore overclockers have 700W power supplies ready to feed this thing. That, and no one in their right mind wants a computer that uses 400+ watts idling on Microsoft Word. So AMD had to settle for the mid-market, with mediocre performance that's within an acceptable (albeit still very high) power en

          • by Gordo_1 ( 256312 )
            Yup, I'm with you there. 65nm will be their saving grace, if they can get something out to market before the end of the year. Nvidia's getting way too comfortable with their monopoly in the high end to allow this to continue for very long.

            I've had a Radeon 9500Pro since 2002, and been very happy with the quality of the hardware and drivers, but I've waited for six months for an ATI high end part to become the centerpiece of my next gaming rig. In the end, R600 clearly has enormous potential, but the 2900XT
    • I completely agree. I bought an ATI card a few years ago, but it will definitely be my last. Their Linux support sucks and so I will vote with my dollars. Bye ATI, hello NVIDIA.
    • Re: (Score:2, Interesting)

      There are problems at AMD/ATI in addition to falling behind the competition. I have a recurring problem ticket I re-opened recently at ATI Support [ati.com] where I got a little bitchy and suggested I'd be going back to NVIDIA if they couldn't get their act together. (I must admit my ticket was mostly a complaint about sloppy work, since I already hacked my system registry and fixed their issue.) Judging from their response to the ticket, I'd say there might be an attitude problem developing there as well.

      We respe

    • AMD competes on price, not necessarily performance. If they lose the enthusiast market completely they will still remain pretty profitable. Their strategy has always been price and courting OEMs with their cheap chips, and now graphics cards.

      The price of the 8800 right now is what most people pay for a desktop computer and monitor. All this hand-wringing over marginal high-end performance gains is no different from the 'Ford vs Chevy' nonsense the pickup-truck crowd is always going off about.
  • Bah (Score:2, Interesting)

    by drsquare ( 530038 )
    Graphics cards are all too expensive anyway. You shouldn't have to pay more than the actual processor just to draw pictures on the screen.
    • Re:Bah (Score:5, Funny)

      by maxwell demon ( 590494 ) on Tuesday May 15, 2007 @04:25AM (#19127475) Journal
      You don't even need a processor to draw pictures on the screen. A simple permanent marker suffices.
    • When the Voodoo1 and Voodoo2 came out, no one flinched at $300 cards, let alone two of them. Yet you still needed a separate 2D card before you could even use the Voodoo cards.

      You don't need these cards to draw pictures on your screen, you need them to animate the pictures on your screen. Sure you could play all games "Myst" style, but that isn't what people are after.

      Plus, no one is forcing you to buy the latest and greatest. Quite a few games benefit from these cards, but many can be very playable by knock
      • My only issue has been that a lot of the high-end cards eat two slots. I am not concerned about the heat or power, as that's already a given when making a high-end game system.

        I know what you mean; I had to pull my PCI-based gumball machine last time I upgraded my video card.

    • Re: (Score:2, Insightful)

      by Anonymous Coward
      Oh you can. You only need to make a processor-to-VGA converter cable. This should be trivial for someone with your intelligence level.
    • by Lorkki ( 863577 )
      The CPU doesn't have its own on-board memory (besides cache) or a high-speed DAC, and the motherboard provides all of its other auxiliary circuitry. A graphics card bundles all of that onto one board, and there's your explanation for the high cost of high-end cards.
  • Ob (Score:1, Redundant)

    Does it run on Linux?
    • Re: (Score:1, Funny)

      by Anonymous Coward
      Unfortunately it doesn't.
    • by ettlz ( 639203 )
      Let me rephrase. "Hey! AMD! You gonna tell the rest of the world how to program the motherfucker?" There.
    • Re: (Score:3, Funny)

      by wild_berry ( 448019 )
      I'd answer "not yet" -- I'm sure that there's a memory management unit on the chip, so don't be surprised if someone does a port...
  • by kcbrown ( 7426 ) <slashdot@sysexperts.com> on Tuesday May 15, 2007 @04:33AM (#19127503)

    While the card dazzles with 320 stream processors, a 512-bit memory bus, and oodles of memory bandwidth, its performance and power consumption seem disappointing in the face of Nvidia's six-month-old GeForce 8800 graphics cards.

    The hardware probably screams. But ATI has a reputation for really shitty drivers. Without solid, fast, high-quality drivers, fast hardware doesn't matter as much.

    NVidia has typically produced fast drivers. They're not open-source, but they're at least good.

    If ATI can't get its shit together and write some decent drivers, the only reasonable option for them would be to open-source their 3D drivers so that the community can fix them properly. And I expect the community would do just that, because a lot of developers are also avid PC gamers, so they have a personal stake in it.

    It'll be interesting to see where this heads, given the statements made by ATI about open-sourcing their drivers, but I'm not going to hold my breath over it. For now, it's NVidia for my gaming rigs. That'll change as soon as ATI actually open-sources their full 3D drivers.

    • Does make you wonder what sort of company it is that can't really write drivers for its own hardware. The whole idea of them borrowing NV's architecture in the first place was to simplify driver writing. Graphics card + crap drivers = waste of electricity.
    • by dave420 ( 699308 )
      If it's even possible to open-source them. If there's proprietary licensed technology in them, it might be a non-starter. Not all closed-source is so because of ideology...
      • by MartinG ( 52587 )
        They could leave those parts out to be rewritten by the community, just like Sun have done with parts of OpenJDK.
        • Re: (Score:3, Interesting)

          by dave420 ( 699308 )
          You *do* realise those are the bits of the drivers that interface with the hardware? The same bits that cost millions of dollars to produce, due to the sheer amount of raw performance that needs to be squeezed out of them. I doubt the open-source community, regardless of how talented (and I know there's some insane talent out there), could replicate those in a timely fashion. Remember - they're playing catch-up with AMD. AMD will keep bringing out new cards, and these guys will have to keep re-engineering these
          • by MartinG ( 52587 )
            Where is your evidence that the encumbered parts are the parts that interface with the hardware? Have you seen the source?

            Why do you think those parts cost the most to produce? Given the specs, I would imagine the parts very close to the hardware cost the least. Why do you think otherwise?

            Also, why assume the negative? Why not release the unencumbered parts and see what happens? What is there to lose? If you are right then nothing is lost. If you are wrong we end up with some good open drivers.
    • by Flodis ( 998453 )
      Getting a little off-topic here, but...
      I recently assembled a new rig which I intended to run Linux on. I had previously had some bad experiences with the drivers for an nVidia nForce4 motherboard (the SATA drivers corrupted files on disk, and hardware CPU offloading for networking caused corrupt downloads and BSODs), but people were telling me how much better nVidia's Linux support was, so I went for a 7600GS anyway.
      I'm not saying I regret going for a 7600GS, but I don't think I've ever had the misfortune
    • For now, it's NVidia for my gaming rigs. That'll change as soon as ATI actually open-sources their full 3D drivers.

      ... or as soon as Intel releases a decent gaming-grade gfx card !

    • by aka1nas ( 607950 )
      The funny thing is that ATI has had consistently better drivers than Nvidia for the last year or so. Here's hoping that both companies get their act together and release higher quality drivers more frequently.
  • by bad_fx ( 493443 ) on Tuesday May 15, 2007 @04:45AM (#19127565) Journal
    As usual Anandtech is extremely thorough: http://www.anandtech.com/video/showdoc.aspx?i=2988&p=26 [anandtech.com]

    [H]ardocp's take: http://enthusiast.hardocp.com/article.html?art=MTM0MSwxLCxoZW50aHVzaWFzdA== [hardocp.com]

    techPowerUp (Warning, streaming video at the start >.>): http://www.techpowerup.com/reviews/ATI/HD_2900_XT/ [techpowerup.com]

    The Inquirer's expected vapid coverage: http://www.theinquirer.net/default.aspx?article=39580 [theinquirer.net]

    I think I'll wait for more ATI drivers and some DX10 games before calling this one... Looks a little underwhelming at the moment though. I'm not regretting my 8800GTX purchase yet. ;)
    • I'm not regretting my 8800GTX purchase yet.

      Unsurprising, given that the reviews point to the HD 2900 XT being slower than the 8800 GTX (and the 8800 Ultra). It does surprise me, though, that ATI are 6 months behind and still couldn't beat NVIDIA for the performance crown - but it's nice to see a real fight again on price/performance, and I'm looking forward to seeing how the HD 2600 XT stacks up against the 8600 GTS (for those of us with a sanity budget restriction in place).

      • Well, the 8800GTS is just under $300 while the new ATi card is $430, so the budget leans towards nVidia and some extra RAM or a hard drive.
  • by Anonymous Coward
    This board still has more "crunching" performance than older generations, but the power usage is insane:
    http://forum.folding-community.org/fpost185371.html#185371 [folding-community.org]
    http://folding.stanford.edu/FAQ-ATI.html [stanford.edu]
  • by Adult film producer ( 866485 ) <van@i2pmail.org> on Tuesday May 15, 2007 @05:04AM (#19127637)
    Idle (watts, whole system)
    ----
    Radeon 2900XT - 183
    GeForce 8800 Ultra - 192
    GeForce 8800 GTX SLI - 296
    Radeon 2900XT Crossfire - 317


    Full Load (watts, whole system)
    ---------
    Radeon 2900XT - 312
    GeForce 8800 Ultra - 315
    GeForce 8800 GTX SLI - 443
    Radeon 2900XT Crossfire - 490


    This could get very expensive for people that leave their computers running 24/7.
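
    For a rough sense of scale, a back-of-the-envelope sketch using the idle figures above (whole-system watts); the $0.10/kWh electricity price is an assumption, not a quoted rate:

      # Yearly electricity cost of leaving a system idling 24/7.
      PRICE_PER_KWH = 0.10  # USD per kWh -- assumed rate, varies by region

      def yearly_idle_cost(system_watts):
          kwh_per_year = system_watts / 1000.0 * 24 * 365
          return kwh_per_year * PRICE_PER_KWH

      for name, idle_watts in [("Radeon 2900XT", 183), ("Radeon 2900XT Crossfire", 317)]:
          print(f"{name}: about ${yearly_idle_cost(idle_watts):.0f} per year at idle")

    At that assumed rate, the CrossFire pair idling year-round costs on the order of a hundred dollars more than the single card.
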
    • by davetv ( 897037 )
      I suppose that would depend on your screensaver and power saving settings - unless you play 3D games 24/7. At idle it's comparable to an 8800.
    • by bad_fx ( 493443 ) on Tuesday May 15, 2007 @05:33AM (#19127765) Journal
      Just a note, that is for the entire system, rather than just the graphics card. Still high compared to older generations. Just thought I'd point it out, since it may not be clear.
    • Re: (Score:3, Insightful)

      by odoketa ( 1040340 )
      Given most people's expected load during non-gaming periods, it makes a lot more sense to have a second computer for your 24/7 machine.

      I use a Mac G4/dual 500 (i.e. an OLD old machine) as my 24/7 box - it cost about 200 bucks, and does just fine quietly humming away in the corner drawing 75 watts.

      If your idle numbers are right, you'd better have a good friend at the power company if you plan on leaving that machine running 24/7.

      • by DrLex ( 811382 )
        I use a laptop (MacBook Pro C2D 15") as my '24/7' machine (actually more like 15/7). It uses 27W when doing simple stuff like browsing the web, 20W when idle with backlight off, and up to 60W when playing UT2k4. Of course the main reason why it's a laptop, is that it's slightly more portable than my gaming PC.

        I expect most energy saving methods currently being used in laptops to be ported to desktops in the near future, because otherwise the power requirements will become unmanageable. If it continues lik
      • It's not necessarily a bad idea to have a 24/7 machine, as long as you don't buy a brand new machine for it. IMO, the power savings would likely take much longer to recoup the investment than the system will last.

        For a $300+ card, I think one solution might be to have a low-power on-board video chipset which needs no cooling and draws 90% less power.
    • by Abeydoun ( 1096003 ) on Tuesday May 15, 2007 @06:42AM (#19128117)
      Here's a quote from TFA that I also found quite unnerving... "Also, we found that our 700W power supply wasn't up to the task of powering a Radeon HD 2900 XT CrossFire rig. In order to achieve stability, we had to switch to a new Thermaltake 1kW PSU with a pair of eight-pin connectors that AMD supplied."

      Now don't get me wrong, I love to see these types of improvements in real-time graphics rendering, but you know there's something wrong with the industry when they can ask PC enthusiasts with a straight face to use power supplies powerful enough for air conditioning units (albeit small ones) in their computers. That being said, I still commend the improvements made, and I look forward to the lower-end, passively cooled versions becoming available for my next HTPC.
      • by afidel ( 530433 )
        If you want a passively cooled card that can still play games look at the Geforce 8600GTS line. There are even models with HDCP chips to support Bluray/HDDVD playback.
        • by Fweeky ( 41046 )
          XFX makes a nice range of passively cooled 7950GTs too, though the heatsink design means they use two slots in the "wrong" direction.
      • by bad_fx ( 493443 )
        I wonder what brand of PSU that 700W one was, though? All PSUs are definitely not created equal; I've had no problem running a hell of a lot off a 220W Shuttle power supply, while I come across people having no end of trouble running similar things off no-name 300 or 400W power supplies. o.O
      • Funny, what I find unnerving is the maths:
        "our 700W power supply wasn't up to the task of powering a Radeon HD 2900 XT CrossFire rig." ..."Full Load: Radeon 2900XT Crossfire - 490W"
        Am I missing something?
        • I'm guessing it has more to do with consistency of current rather than actual power. The full-load rating is an average, but it still doesn't make sense to need >200 extra watts for peaking. It may be like bad_fx implied and they were using a crappy over-rated 700W PSU (though for a benchmarking site, I would hope that's not the case).
        • Re: (Score:2, Informative)

          by tknd ( 979052 )

          Power supply manufacturers typically pick a number close to the maximum possible consumption the unit can provide, adding up the maxima across all the different voltages it provides. So when you look at the sticker on the PSU, it will show you the maximum amps for each voltage. You take all of these numbers, multiply the voltage by the amps to get the watts (watts = volts * amps), then add all of those numbers together to get the total maximum power the PSU can provide. That number should be fairly close t
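
          In other words, roughly (a small sketch of that arithmetic; the per-rail amperages below are hypothetical, not from any particular PSU's label):

            # Sum the per-rail maxima printed on a (hypothetical) PSU sticker.
            rails = [          # (volts, max amps on the label)
                (3.3, 30.0),
                (5.0, 28.0),
                (12.0, 38.0),  # combined 12V rails
                (-12.0, 0.8),
                (5.0, 2.5),    # 5V standby
            ]
            total_rated_watts = sum(abs(volts) * amps for volts, amps in rails)
            print(f"Sum of per-rail maxima: {total_rated_watts:.0f} W")  # ~717 W

            # The headline "700W" is roughly that sum, but no real load maxes every
            # rail at once, so a card that hammers the 12V rail can trip a PSU well
            # below its sticker rating.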

    • Simple solution... it's called S3 suspend...
    • by muffen ( 321442 )
      Does it still draw power at the same high rate if the screen is blanked?

      I just bought a Radeon X1950 Pro since my previous card wasn't handling Lord of the Rings Online very well, and I never considered that it might increase my power consumption when I'm not using the system.

      (Genuinely interested in a response to this if someone really knows; it would change my current behavior from just putting the system in powersave to actually turning it off.)
    • by ceeam ( 39911 )
      Holy shit. I measured the power consumption of my whole 2GHz AMD64 desktop computer (sans monitor) a while ago and got only 74 watts at the outlet in idle.

      Really - I can't believe those numbers. Is there an error? How could they _waste_ 200 watts _at idle_?
      • If (if!) the numbers are right, there are a number of places the electricity could go:

        I presume you can't shut off the PCIe slot, which means the card is powered. Which probably means electricity running through a whole bunch of circuits, generating heat (i.e. using power). If the card is generating heat, the fan is on, and fans are never, ever winners in the efficiency game. If the fan isn't designed to step very well, that would add to the problem (i.e. if the settings are 'fast' and 'faster' rather th

    • Actually it makes a lot of sense, and is environmentally friendly, if you connect an air-out duct from your PC to your clothes dryer or your oven.
    • I ignored all of the whiz-bang FPS ratings after viewing that power consumption chart. Every few weeks my friend's kid invites his friends around for a small LAN party (about 9 total). They run power cords from other rooms to drive their rigs (everyone brings a desktop + LCD and uses USB headsets), but now I'm imagining the speed the power meter wheel will spin when 9 computers, each with a 1kW PSU, connect in that house. That's assuming his 60+ year old home will even survive that.

      Sounds like they should sell a
    • I'm waiting for the 65nm refresh, myself.

      I was pretty happy when I picked up my 90nm $300 7900 GT last year - same power consumption as my old 6600 GT, but three times the performance! If I HAD to buy today, I'd get the 8800 GTS, but because Nvidia didn't design it with different 2D and 3D clocks, the 2D idle consumption is higher than it should be. The X2900 XT has a "low" idle power consumption relative to the 8800 series because it supports a lower 2D clock.

      Hopefully, this will be corrected in the 65
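
      To illustrate why a separate, lower 2D clock helps at idle (a rough sketch: dynamic power scales roughly with voltage squared times frequency; the ratios below are made up for illustration, not measured values for either card):

        # Dynamic power ~ C * V^2 * f, so dropping both clock and voltage in a
        # dedicated 2D mode cuts idle draw disproportionately.
        def relative_dynamic_power(voltage_ratio, clock_ratio):
            return voltage_ratio ** 2 * clock_ratio

        # Hypothetical 2D mode: 60% of the 3D clock at 90% of the 3D voltage.
        print(relative_dynamic_power(voltage_ratio=0.9, clock_ratio=0.6))  # ~0.49
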
  • by gyranthir ( 995837 ) on Tuesday May 15, 2007 @08:17AM (#19128865)
    AMD has not released, and probably will not release for some time, a direct competitor to the 8800GTX or the 8800 Ultra.
    The 2900XT is a competitor to the two 8800GTS models.
    They are avoiding the top-end market because, more often than not, the risk of that market does not meet the reward.
    They are playing small ball, so to speak: trying to manufacture base hits and runs, not home runs.
    Offering three cards starting at less than $100 and going to $400ish is a good strategy for the mainstream market.
    The HDMI dongle innovation (carrying video and audio over the video card, because all of the new cards have an audio processor on them) is a boon for them as well, helping build the image of media-center-capable video cards for a newer generation of computer users.
    These will help push down prices on all of the cards within that price range, and possibly help push innovation in the marketplace.
    • by Kymri ( 1093149 )
      In addition, their lack of a really impressive, robust, high-end part is probably due to the same thing that kept nVidia out of the game for a while a few years back:
      They spent a fair amount of focus on developing a GPU for the Xbox 360 - and that R&D did not translate directly into their GPU offerings here.

      It remains to be seen if it'll make that much difference in the long run, but at the moment, it looks like ATi hasn't got a whole lot to offer - of course, until we see some DX10 games and com
      • I think you're looking at this the wrong way. The 8800GTX series is about 2% of the market for video cards. The bleeding edge is pretty much not worth the reward, other than for bragging rights. Especially for a company working on getting back on its feet.
    • It seems like this was supposed to be their answer to the GTX, only they couldn't get the clocks high enough. So, they lowered the price to position it against the GTS's and now wait for the process shrink.
        • I really doubt it. Look at their product line: all of it is consumer-oriented, not enthusiast-oriented. Their top-end card cuts out at $400, not $850+ like nVidia's. They are pushing the media center, home user, dabbling user market, not the ultra-hardcore market (which is always a risk and is always very fast moving, where what is best today may not be tomorrow); the risk versus reward of the ultra-hardcore market isn't worth it to a company that just came off a 600 million dollar loss.
        • A $400 video card is not "dabbling".... it is realizing that paying $600 for a few FPS more at resolutions most monitors only dream of in their worst nightmares is insane. To say that ATI's new cards don't compete is to ignore the fact that the majority of the market doesn't buy $600 video cards. Of course, now people will read that "nVidia cards are faster", then go out and overpay for a 7600GS when for the same money they could've had a faster ATI card.
          • By dabbling I mean a dabbling enthusiast, not a power user or hardcore enthusiast... the person, like me, who doesn't have $2500 to throw at a new computer. I may, however, splurge on a $400 video card when I build my new system. But I do not have $600-$1100 to spend on some water-cooled monstrosity of a video card. That is why I agree with ATi's move in the market; it's a great idea for them to push more into the normal user market, not the hyper-bleeding-edge, bragging-rights-only game where the risk is no w
  • I have seen 4 ATI cards bomb out since 2000; all of them needed replacing. I have NEVER seen an nVidia card bomb out.
    From my limited point of view, nVidia sells higher quality cards.

    Two weeks ago, I had to replace my ATI 1600 Pro with an nVidia 7600 GS. They are roughly equivalent cards.
    In Windows XP running Oblivion, I notice a drop in performance with my new 7600GS. In Ubuntu Linux I notice a glxgears score 10 times higher! Now I understand that this improvement is because of the better drivers, but its
