Getting Away With a Cheap Graphics Card

theraindog writes "High-end graphics cards get all the glory, but most folks have a difficult time justifying $300 or more for a single PC component. But what if you could get reasonable performance in all the latest games from a budget card costing as little as $70? With game developers targeting the relatively modest hardware available in current consoles and trickle-down bringing cutting-edge features down to budget price points, today's low-end graphics cards are more capable than ever. To find out which one offers the best value proposition, The Tech Report has rounded up eight graphics cards between $70 and $170, comparing their game performance, Blu-ray playback acceleration, noise levels, and power consumption, with interesting results."
  • by EVil Lawyer ( 947367 ) on Thursday September 25, 2008 @10:14PM (#25161139)
    Um, to me at least, $170 for a graphics card is not "cheap"...
    • by bigstrat2003 ( 1058574 ) * on Thursday September 25, 2008 @10:17PM (#25161153)

      It's the high end of cheap. $170 is going to get you a midrange graphics card, which, while not cheap in an absolute sense, is cheap compared to other graphics cards out there.

      Cheapness always has to be compared to other objects in its class. Would you say a $170 car is not a cheap car? Of course not, because most cars are far more expensive than that. The idea is the same here.

      • by c_forq ( 924234 ) <forquerc+slash@gmail.com> on Thursday September 25, 2008 @10:20PM (#25161195)
        Still bullshit. By using your useless relative scale a new Jaguar is cheap, because it is way less than a Ferrari, Maserati, or Bugatti. (Dang it, I used a car analogy; enter moderation limbo).
        • Re: (Score:3, Informative)

          Well, it's not my fault if you don't understand how this concept works. A Ferrari, Maserati, or Bugatti is so much more expensive than a normal car that they make the price curve look exponential. Graphics cards, by contrast, tend to have a pretty damned linear price curve. Price comparisons against the most expensive member of the class fail when that member is so expensive it completely fucks up the curve.

          • by c_forq ( 924234 ) <forquerc+slash@gmail.com> on Thursday September 25, 2008 @10:44PM (#25161427)
While Ferrari and Bugatti may be out there, the Maserati entry level is comparable with the high end of Jaguar. But my point is that $170 is still a hell of a lot for the budget-minded consumer, substantial for the budgeting consumer, and considerable for the consumer with a flexible budget.
            • by aliquis ( 678370 )

But then it's the best of them as well. You can still get the card I've only seen in the article so far, the HD4670, or if $60 is too much for you, something like a 7600 GT or 8600 GT, maybe, or something used. The HD4850 is really nice, and I guess it may be one of those more expensive cards; I haven't read it through yet, as I said.

The HD4870 is the same GPU but with a higher clock and GDDR5, and the HD4870X2, which beats everything there is (?), has two of the same GPUs on one card.

              And so far from the article it

      • It's the high end of cheap. $170 is going to get you a midrange graphics card, which, while not cheap in an absolute sense, is cheap compared to other graphics cards out there.

        It's also the low end of expensive. Not expensive in an absolute sense, but expensive compared to many other highly capable graphics cards out there.

      • Would you say a $170 car is not a cheap car?

        If a $170 graphics card can handle my 60 mile commute as well as my van, I'll order two tomorrow.

      • Would you say a $170 car is not a cheap car?

        I'd say it was asking for trouble.

This is why graphics for Linux is adequate, but not great: the developers think $300 is a good price for a graphics card, and get tired of them and upgrade when they age to under $200.

      • by aliquis ( 678370 )

More likely, I expect them to code the game so that it will be "playable" technology-wise for as long as possible / for as many people as possible. So they try to make the low, ugly settings still playable on old crap, but still make it so the highest settings can hardly (or can't) be used on even the newest cards. That way someone can buy the game a year later, still enjoy it, and think it looks nice.

If the highest settings were designed for old crap, the hard core gamers and future consumers would diss the game be

        • by Z34107 ( 925136 )

          The other problem is that games can easily take upwards of 4 years to develop. (Studios have been seeking venture capital to start writing a game, for cryin' out loud.)

          But... do you want your game to have 4-year-old graphics? To look dated before it's even released?

          There's not really a good solution to that. People I've talked to (many years ago) and the internets say that you can A) design your game to kick the shit out of everything available today, because that will be "midrange" by the time your gam

          • by Fred_A ( 10934 )

            But... do you want your game to have 4-year-old graphics?

Don't all games have 4-year-old graphics, by definition?

DNF will have, what, 15-year-old graphics?

    • by FuturePastNow ( 836765 ) on Thursday September 25, 2008 @10:32PM (#25161305)

      Luckily, for people like you and me, there are cards closer to $70 than $170.

      I actually read that Tech Report article earlier today, and I've read a couple of other reviews of the 4670. It looks really good, especially considering that it's a small card with no extra power connector.

      Of course, my needs aren't very high- the #1 game I'm looking forward to is Starcraft 2- but I'd still like to be able to play at the native res of my 24" monitor.

      • by aliquis ( 678370 )

Warcraft III has scaled very well. I could play it on my GF2 Pro at low settings in 800x600 or so (and it would lag in tower defense, or maybe when there were lots of casters), and now I have an 8800M GT and play in OS X, so I still have to run it at medium in 1280x1024 or so to make sure it doesn't get too slow in bigger fights. In Windows I could probably run it at high at 1440x900.

I expect Starcraft 2 to be similar, so if you want 1920x1200 with everything at high and be able to handle 200 units fight

Careful, I'd be pretty surprised if a 4670 can do a native 24" without a hiccup. With the settings turned down, sure, but Starcraft 2 is targeted at a very modern market, and Blizzard likes its graphics to be pretty... What IS the native resolution on your monitor? 2560ish? That's a lot to ask of these little cards in this review.

I don't think $70 is cheap for a graphics card, but I'm a tightwad and don't play a lot of games. (I do a bit of graphics programming, but it's all ray tracing and the GPU doesn't help with that.)
      • by aliquis ( 678370 )

A normal new game costs what? $50? More, maybe? Things like Warhammer are subscription-based. If you really do spend some time playing games that would require a half-decent card, how can you NOT afford $70?

13 months of Warhammer over here would cost $300; isn't it worth paying $70 for decent graphics in the game, then?

        • Re: (Score:2, Insightful)

          by Ragzouken ( 943900 )
          This argument doesn't work unless you assume anyone who's serious about gaming plays subscription games.
    • Re: (Score:3, Interesting)

      by purpledinoz ( 573045 )
      $170 used to be cheap, when all other components were quite a lot more expensive. But today $170 would probably make it the most expensive component (maybe next to the CPU).
      • Nah, you can get CPUs for around $100 easily.

        The OS would be the next most expensive component, if you used Windows.

    • by TheLink ( 130905 )
      1) The range is 70 to 170.
      2) If you can't afford any of the reviewed cards, stick to onboard video and wait for stuff to get cheaper (and/or save up some money).
    • Re: (Score:3, Insightful)

      by WDot ( 1286728 )
      Think of it this way--It costs $200 to get the cheapest of the current-gen consoles. Or, you could spend $170 on a video card and put it in the computer you already own, and after about the same amount of work as hooking up and configuring your console, you can play PC games. For $30 less. If you're clever and have some PC-gaming friends who upgrade every new generation, you can pick up that same card as a hand-me-down for less.

      So, I'd say $170 is pretty cheap considering a $170 video card is designed
  • It's subjective, and I can't really justify spending $500 on a video card, but I still want to.

I have bought high-end cards for over a decade, and I've been happy with all of them except the first. I originally ordered an ATI Rage128 card from buy.com before they came out. The product didn't ship on time, so I waited six months (buy.com was happy to take my $160), and I got an obsolete product. After that I got my first GeForce 2 card, and the rest is history. I'm an nVidia fanboy and I'm not ashamed of it.

Those who spend that much money on a single component are usually going to spend a lot more on the rest. There's nothing worse than a Yugo with a Chevy 350 big block in it (to use a car analogy).

If you don't want to spend that much, you will get far less performance than I do. And that makes a lot of difference to the experience of gaming.

    • Vacuum cleaner...Mouse.

      Garbage disposal...Mouse.

      Golf cart...Mouse.

      Lawn mower...Mouse.

      Go Cart...Mouse.

      350SL gull wing...Mouse.

      Mini cooper...Mouse.

      Real Car or truck...Rat.

Granted, you're going to replace the whole drive train.

      It will wind up with RWD like God intended it to be.

  • by GlobalColding ( 1239712 ) on Thursday September 25, 2008 @10:18PM (#25161161) Journal
Prices on graphics cards have been plummeting, both because overall memory prices are dropping fast and because of the huge saturation of inventories in the market. Cards that a few months ago were going for $300+ have been blown out for under $100. So before you compromise, make sure you do your due diligence and check price engines like Google Shopping or Pricewatch; you will be surprised how far your buck travels these days. Also, don't bother with brick-and-mortar retailers: they turn their inventory slower, their best deals are still a month or so behind, and those usually involve some mail-in rebates.
  • Hmmm (Score:4, Informative)

    by ArchieBunker ( 132337 ) on Thursday September 25, 2008 @10:19PM (#25161179)

    My Radeon X1650 has no trouble playing 1920x1080 movies, and it cost around $50.

    • Re:Hmmm (Score:5, Funny)

      by jd ( 1658 ) <imipakNO@SPAMyahoo.com> on Thursday September 25, 2008 @10:21PM (#25161205) Homepage Journal
      Well, not many movies came out in 1920. Even fewer in 1080 - the Norman cameramen could never grasp the fact they needed to hold the camera straight.
      • Even fewer in 1080 - the Norman cameramen could never grasp the fact they needed to hold the camera straight

        Plus all they ever did was shoot remakes of the "Grendel's Mother Project."

I bought an AGP Powercolor ATI 3650 card two months ago. It cost me $90. I can play most modern games at 1280x1024 at good detail with 2x AA. As a really casual gamer, this is more than adequate; I can't tell much of the difference anyway. HD video is pretty good, but some of it is CPU/memory bound, which is my problem (2.4 GHz Xeon, 1 GB RAM). Still not bad when you consider an AGP card can still run most modern games.

      • by aliquis ( 678370 )

        With 2 GB of vram low bus bandwidth isn't as much of a factor as it was back when you had 4 or 8 MB ...

    • no trouble playing 1920x1080 movies

      There isn't a video card on the market today that can't play high-res movies. "Movies" are very, very basic tasks for a video output device.

      In summary, tell us something about your video card that means something.

      P.S. Now, if you said that your monitor had a native resolution of 1920x1080 or higher, I'd be impressed. Mine's native res is 1680x1050, and it's big. Real big.

      • The GP was commenting on the second paragraph of TFA --

        What if, through the magic of technological progress, dropping 80 bucks on a video card could get you a GPU that will slice through the latest games with relative ease? What if it could help decode HD video streams perfectly, even on a slow CPU? If such a beast existed,...

        -- and was pointing out exactly the same thing that you did: that it is difficult to walk five steps without falling over as many such beasts.

        Not everyone feels a deep-seated emotional need to boast of the power of their video card, or the size of their monitor, you know.

      • CRT, my friend.
      • P.S. Now, if you said that your monitor had a native resolution of 1920x1080 or higher, I'd be impressed. Mine's native res is 1680x1050, and it's big. Real big.

        My main machine at home has one monitor running at 1920x1200 and a second running at 1360x768 and I wouldn't mind getting something even more hi-res.

At work however, I have 1680x1050 and it feels really cramped; I'd love to get a second monitor or something with a higher native resolution. 1680x1050 isn't really that big if you multitask and don't run all your windows maximized because you grew up on Win 3.x running at 640x480.

        /Mikael

  • I guess, technically, I should say it's a "512 MiB" card, but I'd rather claw my eye out with a fork.

    Wow, MiB is failing the spork test.

    • Re: (Score:2, Insightful)

      512 MiB would be an awe-inspiring sight. They look so damn bad-ass in those black suits!

      And yeah, MiB is a fucking retarded term for storage capacity. The old way has worked beautifully for forever, and I'm not about to change my habits because some metric purists got upset about it.

      • by Zorque ( 894011 )

That, or hard drive company apologists who are trying to make it okay that we buy 200 gigabytes and get sold 200,000,000,000 bytes.

        Either way, I hate the MiB.

That, or hard drive company apologists who are trying to make it okay that we buy 200 gigabytes and get sold 200,000,000,000 bytes.

          I'm also angry at the people who made my network card. Gigabit ethernet? Nonsense, only 1,000,000,000 bps. Same goes for my CPU, which they said would run at 2.6GHz, but does it hell...
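The marketed-versus-reported gap the grandparent is complaining about is just the ratio between the two prefix conventions. A quick sketch (the 200 GB figure is the one from the thread, used for illustration):

```python
# Drive makers use decimal gigabytes (10^9 bytes); operating systems
# traditionally report binary gibibytes (2^30 bytes). No bytes go
# missing -- the same capacity just carries two different labels.
marketed_gb = 200
capacity_bytes = marketed_gb * 10**9    # what the box means by "200 GB"
reported_gib = capacity_bytes / 2**30   # what the OS displays

print(f"{marketed_gb} GB marketed = {capacity_bytes:,} bytes")
print(f"Reported by the OS: {reported_gib:.2f} GiB")  # ~186.26 GiB
```

So a "200 GB" drive shows up as roughly 186 GiB: about a 7% gap, which grows with each prefix step.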

      • by Cochonou ( 576531 ) on Friday September 26, 2008 @12:36AM (#25162129) Homepage
Be careful: MB might have worked well in retail space, as "everyone except hard drive manufacturers" knew what it was supposed to mean, but it didn't work as well in engineering space as soon as you mixed storage space (powers of 2) with data transfer rates (powers of 10). An MP3 encoded at 128 Kb/s is encoded at 128,000 b/s, not 131,072 b/s.
        So, regardless of the fact they were coined rather abruptly, I find the whole Ki / Mi / etc prefixes to be a rather good move forward.
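The parent's storage-versus-rate distinction comes down to a pair of conversions. A small sketch of both conventions side by side (figures are the ones used in the thread):

```python
# Storage sizes are conventionally powers of two; data rates are
# powers of ten. Reusing one prefix for both invites a mismatch that
# grows from ~2.4% at kilo/kibi to ~7.4% at giga/gibi.
KB, MB = 10**3, 10**6            # decimal SI prefixes
KIB, MIB = 2**10, 2**20          # binary prefixes (kibi, mebi)

mp3_rate_bps = 128 * KB          # a "128 Kb/s" MP3: decimal, 128,000 b/s
card_ram_bytes = 512 * MIB       # a "512 MB" graphics card: binary

print(mp3_rate_bps)              # 128000, not 131072
print(card_ram_bytes)            # 536870912
print(f"Gi vs G mismatch: {2**30 / 10**9 - 1:.1%}")  # 7.4%
```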
        • Re: (Score:2, Insightful)

          I find the whole Ki / Mi / etc prefixes to be a rather good move forward.

          I disagree. If we have a problem with the units of measurement being disparate, we should reconcile them, not split them into two. Not to mention that the Ki/Mi/etc prefixes sound like baby talk, which makes me want to smack whoever came up with them upside the head.

          • by BrentH ( 1154987 ) on Friday September 26, 2008 @02:23AM (#25162697)
The thing is that IT people and computer scientists have this uncanny drive to keep talking of and thinking in powers of two, insisting on starting counting with 0, and generally not caring about the long-standing conventions there already were in the rest of the world. k=1000, M=1000000, etc., period. If you insist on using ridiculous numbers like 1024, 1048576, etc., you're gonna use your own damn prefixes for them. No hijacking please.
Then quit calling them "bytes". I've used computers with 5 to 10 bit characters and 8 to 13 bit "bytes". The correct standardized term for 8 bits of data is an "octet".

              So it's either MB (traditional) or MiO (formal). Never MiB.

            • by ja ( 14684 )

              If you insist on using rediculous numbers like 1024, 1048576, etc, you're gonna use your own damn prefixes for them.

Why so grumpy, my friend? For every kilo you buy we are actually giving you an extra 24 bytes for free, gratis! How can that be bad for consumers? That other vendors may insist on giving you only exactly what they advertise and what you have paid for is really none of our concern.

            • by smoker2 ( 750216 )
              So 0 isn't a number ? (be careful what you say)
          • by renoX ( 11677 )

            >>I find the whole Ki / Mi / etc prefixes to be a rather good move forward.
            >I disagree. If we have a problem with the units of measurement being disparate, we should reconcile them, not split them into two. Not to mention that the Ki/Mi/etc prefixes sound like baby talk, which makes me want to smack whoever came up with them upside the head.

Uh? Using the same prefix for two different measures is dumb, whatever you say about 'reconciliation'.

            As an aside the sounding isn't bad for everyone: in French K

        • by Zorque ( 894011 )

          It would be nice if we had started off that way, but to switch measurement terms 20-30 years in is a little ridiculous. You can't just say "Oh, you know all those measurements you've been using since you were a kid? Yeah, they're not valid anymore. Sorry about the fact that a bunch of your programs and hardware don't work anymore, but we needed to arbitrarily switch those on you."

          I would prefer if they had been 1000 instead of 1024 to start with, but it's a few decades too late for that.

  • by Seriman ( 775126 ) on Thursday September 25, 2008 @10:51PM (#25161491)
    The 8600GT 512 has been available for a while now, I have one myself, and it was ~$120. They're even cheaper these days. That card can handle about anything you care to throw at it, unless you're running Vista, at which point you shouldn't care about the cost, because you're already paying Mistress Xanthia hundreds per month to kick you in the beans.
    • I've been buying GeForce 9500 GT 512s at work for ~$70 each, before rebates. Those things pack a great punch for the price...

  • Those 8800GTs (Score:3, Interesting)

    by iteyoidar ( 972700 ) on Thursday September 25, 2008 @10:53PM (#25161503)
I don't really keep up with video cards except when I'm trying to buy one every 3 or 4 years, but those 8800GTs are like $100 and can run just about anything. $100 isn't cheap, but for a card that will let you play every game out right now, it isn't bad, especially when getting that last 10-20% performance increase bumps your price up a few hundred dollars.
    • Re:Those 8800GTs (Score:5, Interesting)

      by Kargan ( 250092 ) on Friday September 26, 2008 @12:56AM (#25162257) Homepage

      Yep, I just bought a factory overclocked 8800GT (ZOTAC Amp! Edition, to be specific) for $117 a couple of weeks ago. It does indeed run Crysis, COD4, Assassin's Creed, etc. at very high quality and framerates. And NVIDIA just released driver update 178.13 today, with the following changes:

      # WHQL-certified driver for GeForce 6-series, 7-series, 8-series, 9-series, and 200-series GPUs, including the newly released GeForce 9800 GTX+, 9800 GT, 9500 GT, and 9400 GT GPUs.
      # Adds support for NVIDIA PhysX acceleration on all GeForce 8-series, 9-series and 200-series GPUs with a minimum of 256MB dedicated graphics memory (this driver package installs NVIDIA PhysX System Software v8.09.04).
      # Experience GPU PhysX acceleration in several full games and demos today by downloading the GeForce Power Pack.
      # Adds support for 2-way NVIDIA SLI technology with GeForce GTX 200-series GPUs on Intel® D5400XS motherboards.
      # Supports single GPU and NVIDIA SLI(TM) technology* on DirectX 9 and OpenGL.
      # Supports CUDA(TM).
      # Supports Folding@home distributing computing application. Download the high performance client for NVIDIA GPUs here and join the NVIDIA team: #131015.
      # Supports GPU overclocking and temperature monitoring by installing NVIDIA System Tools software.
      # Includes several 3D application performance improvements. The following are examples of improvements measured with v178.13 WHQL versus v175.19 WHQL driver:

              * Single GPU increases up to 11% in 3DMark Vantage (performance preset)
              * Single GPU increases up to 11% in Assassin's Creed DX10
              * Single GPU increases up to 15% in Bioshock DX10
              * Single GPU increases up to 15% in Call of Duty 4
              * Single GPU increases up to 8% in Enemy Territory: Quake Wars
              * 2-way SLI increases up to 7% in Bioshock DX10
              * 2-way SLI increases up to 10% in Company of Heroes: Opposing Fronts DX10
              * 2-way SLI increases up to 12% in Enemy Territory: Quake Wars
              * 2-way SLI increases up to 10% in World in Conflict DX10

      # Includes numerous 3D application compatibility fixes. Please read the release notes for more information on product support, features, driver fixes and known compatibility issues.

      • by Kjella ( 173770 )

        Yep, I just bought a factory overclocked 8800GT (...) It does indeed run Crysis (...) at very high quality and framerates.

        Nothing runs Crysis at high res / high FPS. It wouldn't stand a chance of playing Crysis on my 1920x1200 monitor which I don't consider particularly high resolution anymore, and to play at 2560x1600 @ 30fps you need a GTX 280 SLI or 4870 x2 / 4870 CF solution. If it runs at very high quality and framerates for you, you can't be using a very large monitor.
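The resolutions being thrown around here differ more than the numbers suggest; GPU fill-rate demand scales roughly with pixels per frame. A back-of-the-envelope comparison (resolutions are the ones mentioned in this thread):

```python
# Pixels per frame at the resolutions discussed above. At a fixed
# frame rate, shading work grows roughly in proportion to pixel count,
# so a jump in resolution is a proportional jump in GPU load.
resolutions = {
    "1280x1024": 1280 * 1024,
    "1680x1050": 1680 * 1050,
    "1920x1200": 1920 * 1200,
    "2560x1600": 2560 * 1600,
}
base = resolutions["1920x1200"]
for name, px in resolutions.items():
    print(f"{name}: {px:>9,} pixels ({px / base:.2f}x vs 1920x1200)")
```

2560x1600 pushes about 1.78x the pixels of 1920x1200 every frame, which is why the jump to a 30" panel demands a different class of card.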

        • by Fred_A ( 10934 )

          I don't know, I ran Crysis on my rig at 1920x1200 and while I didn't measure the framerate it wasn't too bad with decent settings. Video was handled by a nVidia 7950GX2 and the CPU is a generic intel 6700.

          Same with Assassin's Creed (more fluid than Crysis though I think). And Solitaire just flies !

    • by Artemis3 ( 85734 )

I agree. Mine was $320 last year in Nov-Dec when there was a rush for them (Anandtech article: 8800 GT, "The only card that matters" [anandtech.com]). They were always out of stock, so I had to rush with a PNY OC edition + state taxes :(

For $100-$120 it's a steal. I play most games at 1600x1200 and they go fine.

  • by mandark1967 ( 630856 ) on Thursday September 25, 2008 @11:00PM (#25161551) Homepage Journal

$700 for a video card solution. Unless I'm going SLI, then it's like $1200 or so for two cards, because you gotta get 'em the day they're released... NOT after the inevitable price drop. Of course, you gotta throw in extra for the water blocks and pump, and tubes, and reservoir and such, so in reality I never spend more than like $850 each... Unless I am buying for my Tri-SLI capable board... then it's like $2450, and add like $250 for a 1200-watt PSU and like $550 for three water blocks and stuff, so it's like close to, but under, $3000 for video cards... wait... why are there only ramen noodles in the cupboard?

    • by Molochi ( 555357 )

      LOL, I hear ya.

      I recently picked up an Ati4850 because...

      It's a single slot card.
      It works on a 500W PSU
      It's quiet.
      It'll probably run every game well for at least 2 years.
      I like eating Ribeyes and drinking good Scotch.

    • Of course, you gotta throw in extra for the water blocks and pump, and tubes, and reservoir and such,

      does it need tubes so you can get the internet on it?

For those who can read Russian, IXBT [ixbt.com] has graphics card roundups updated quite regularly. Among other things, it compares the performance/price and potential longevity of the cards. You do not even need Russian to understand the comparison tables.
  • by Joe The Dragon ( 967727 ) on Thursday September 25, 2008 @11:13PM (#25161653)

The 790GX and 780G with SidePort RAM are good for basic video work and Vista, and you can add a $50 card for a boost as well. They also cost less than Intel boards, which cost more, are slower, and have poor drivers that use system RAM.

Before I switched to a PPC Mac/Xbox 360 a few years ago, I owned a self-built PC with the cheapest functional hardware. What I did was get used parts from eBay. I got a new graphics card for $30 in order to play WoW, because the old one couldn't render 3D graphics, so WoW looked like a mosaic slush. I was never fond of spending too much money on gaming, so I looked for an alternative: the Xbox 360. Cheap. No upgrade required. No installation. Being a busy university student and having a number of part-time jobs going on ,
  • I very much like that they looked at noise in this article.

    Quite simply, most of the cards didn't register above the ~40 dB volume threshold of our sound level meter

One of the things that makes me shy away from the new top-of-the-line graphics cards is the very loud cooling systems they put on them. Lower performance is actually more attractive if it means my computer doesn't sound like a hairdryer.

  • I just bought a GIGABYTE GA-EG45M-DS2H motherboard with built in Intel G45 graphics with the 45nm Intel Core 2 Duo E7200 Wolfdale @ 2.53GHz.

    The price is about $120 each and the system overclocks easily to 3.5GHz.

    It has an HDMI 1080p output and digital surround.

Works just fine for gaming and HD movies. And best of all, with the money saved, I can buy a new computer every 6 months, rather than building an expensive computer and upgrading it in 2 years.

    This setup also works great with no HD receiver and other extern

    • by Molochi ( 555357 )

      I went with the P43 version running a C2D 2.4GHz @3.2GHz and Dual Channel DDR2-800 on a matched 1600MHz Bus. No point in faster memory when you're bottlenecked by the FSB and NOTHING maxes out the CPU. The money I saved went towards an ATi 4850, because without a real videocard you might as well just run a 5 year old computer.
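The "matched bus" point above can be sanity-checked with a little arithmetic; a sketch using nominal peak figures (both buses assumed 64 bits wide):

```python
# Why faster RAM is pointless on this setup: the peak bandwidths of a
# 1600 MT/s FSB and of dual-channel DDR2-800 are already equal, so the
# FSB is the ceiling no matter how fast the memory gets.
BYTES_PER_TRANSFER = 8                 # 64-bit bus, 8 bytes per transfer

fsb_mt_s = 1600                        # quad-pumped FSB, 1600 MT/s
fsb_gb_s = fsb_mt_s * BYTES_PER_TRANSFER / 1000            # 12.8 GB/s

ddr2_mt_s = 800                        # DDR2-800, per channel
channels = 2                           # dual channel
mem_gb_s = ddr2_mt_s * BYTES_PER_TRANSFER * channels / 1000  # 12.8 GB/s

print(fsb_gb_s, mem_gb_s)              # 12.8 12.8 -- perfectly matched
```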

When it comes to games, hardware has been outpacing the ability of software to actually utilize it for several years. That's why, when I started my game, I decided to actually use everything I had; and sure enough, I've overheated a few CPUs and graphics cards of unsuspecting players (not permanently, of course!).

    It's very high poly and is a "big room" game, which takes lotsa gpu and cpu both. And one day, when it's finished (in a decade?) it'll be playable on a modest machine :-)
  • Cooling (Score:5, Insightful)

    by Detritus ( 11846 ) on Friday September 26, 2008 @12:13AM (#25161997) Homepage
    I'd like to see more graphics cards with passive cooling. Every time I see one of these cards with a big honking fan on it, I wonder how long it will last and whether it is even possible to replace the fan if it fails.
    • In most cases it definitely is possible to replace the fan. In fact, with the last 3 graphics cards I bought, one of the first things I did was replace the noisy standard fan with one of Zalman's quieter graphics card fans.

      Right now, I'm using a Geforce 7600GS with the Zalman VF700-Cu, and it's absolutely inaudible unless I open the case and put my ear right up to it.

    • After fighting some obnoxiously loud video card fans for a while I swapped my main desktop system over to a Gigabyte GV-NX86T256H [newegg.com], which is a fanless 8600GT. That seemed to be the cheapest product level capable of dual DVI output, and I have my doubts about whether a more powerful card can run with passive cooling effectively.

      I'm pretty happy with mine, it is hard to get good gaming performance from a fanless design though without the whole thing overheating (as you can tell from the amount of negative co

  • by MostAwesomeDude ( 980382 ) on Friday September 26, 2008 @12:36AM (#25162133) Homepage

    I feel like I'm plugging myself, but the Radeon X1950 is a massively capable card, and is available for as little as $60-70. It's also fully accelerated with the open-source driver stack as of Mesa 7.1. (I'm currently on one, running Compiz Fusion with Xserver 1.5. It's good times.)

  • "Trickle-down?" (Score:2, Insightful)

    by hdon ( 1104251 )

    Am I the only person who found this to be a *really* strange turn of phrase?

    With game developers targeting the relatively modest hardware available in current consoles and trickle-down bringing cutting-edge features down to budget price points, today's low-end graphics cards are more capable than ever.

  • If $170 is cheap... (Score:2, Interesting)

    by Judinous ( 1093945 )
    I managed to pick up an HD4870 from Newegg this week for $200 with a combo deal on a motherboard that I was going to get anyway. If the high end is only $200, I think that they'd be hard-pressed to call $170 a budget card. Then again, maybe it was just a really good deal.
July last year saw me splurge on a new box. At the time I got an MSI GeForce 768MB overclocked 8800GTX. It cost me a decent chunk of the final PC price ($820 AUD).
This card was beautiful. It just ate up every game I threw at it, smoothly and perfectly, running at 1920 resolution on my 26 inch Acer.
A year later my PC died whilst playing HL2. Turns out that between my GFX card and sound card was a small little firewire chip that controlled the 1 firewire port at the back (that I had an external HDD plugged t

I'm out of the house 13 hours a day, 5 days a week. I get home and I have chores and a newborn to deal with.

When I get to play a game, I don't want to play crap games at 640x480. Also, my game genre of choice is flight simulation. Sure, you can get away with a cheap, crappy video card... if you like stick figures.

If you buy a cheaper car, it usually still gets you from A to B. However, you don't buy a cheaper car if you're planning on hauling a large boat. It's not suitable. Likewise, lightweight graphics cards aren

  • by Anonymous Coward

    I wish these reviews would give comparisons to older cards so people thinking about upgrading could tell how much of a performance increase they're getting for their money. As it is these data are pretty meaningless to me.

I don't get it. Just $20 difference, over $170: get a 4850 and enter the high midrange.
