
A $99 Graphics Card Might Be All You Need (618 comments)

Vigile writes "With the release of AMD's latest budget graphics card, the Radeon HD 4770, the GPU giant is bringing a lot of technology to the table. The card sports the world's first 40nm GPU (beating out CPUs to a new process technology for the first time), GDDR5 memory, and 640 stream processors, all for under $100. What is even more interesting is that as PC gaming has evolved it appears that a $99 graphics card is all you really need to play the latest PC titles — as long as you are comfortable with a resolution of 1920x1200 or below. Since so few PC gamers have screens larger than that, could the world of high-end PC graphics simply go away?"
  • by HerculesMO ( 693085 ) on Tuesday April 28, 2009 @04:05PM (#27749953)

    Xbox.

    It's exactly the same principle, that you have a 'standard' set of guidelines.

    The PC world gives you the ability to get deeper textures and other eye candy if you want better graphics, or LESS of it if you want faster framerates. That customizability is nice, and while a $99 graphics card may be all you need to play the titles, the options don't end there... and that's why there will always be a market for higher-end graphics cards, or processors for that matter.

  • Re:Complexity (Score:4, Interesting)

    by Red Flayer ( 890720 ) on Tuesday April 28, 2009 @04:16PM (#27750149) Journal

    As long as artists can dream, we will require more and more power from our graphics renderers.

    You mean, as long as the market supports ever-increasing poly counts etc?

    At some point we hit a point of diminishing returns on better graphics units... the human eye can only distinguish so much.

    Eventually we'll hit the point where there's simply not enough benefit to be gotten out of an expensive GPU. For me, that time is long past. For others, it may come in the next few years. For a small portion, the 'dreamers', it'll never come... but why would any company spend millions and millions developing new and better chips for such a small market?

  • Some people blow hundreds upon hundreds of dollars on super high-end processors, super high-end video cards, and super high-end RAM.

    They will probably never learn that all those super high-end cards are a waste of money. IMO the best thing to do is to aim for mid-range to lower high-end cards at most. SLI is also kind of stupid: you're better off putting your money into one good single card. SLI/Crossfire doesn't double performance; it increases it substantially, of course, but it certainly isn't double.

    Also, you won't see the performance gains from a super-duper high-end card in most games for a while, and by the time you do, it will only be a mid-range card anyway.

    With how fast prices drop, the best thing to do is get decent hardware and upgrade it every 1-2 years depending on your budget. Performance-wise, getting a $200 video card every 2 years is better than getting a $600 SLI set of video cards every 4 years: that works out to $100 a year instead of $150, and the newer card benefits from two extra years of progress.

    And this is why I chose a Clevo when I bought a gaming laptop: I would rather pay a little extra for a solidly built, upgradable laptop with quad-core support, because it will last longer than a slightly cheaper Dell POS.

  • Re:Once upon a time (Score:3, Interesting)

    by Threni ( 635302 ) on Tuesday April 28, 2009 @04:23PM (#27750233)

    Point is, though, there was probably a time when each new generation of scientific calculators came out with more (useful) features or more (meaningful) speed increases, but nowadays what's out there at the lowest price point is probably good enough for practically everyone. Certainly that was the way things were about 10 years ago in the 2D world of graphics cards. The makers kept cranking out drivers and improving the hardware, but I don't think anyone cared, because no one was waiting around for 2D text to be rendered. By the sound of it, it's getting that way for 3D cards now for all but the saddest and richest of gamers. I'm perfectly happy with my on-motherboard graphics when I'm playing OpenArena under Ubuntu (couldn't tell you what it's like on Windows). Sure, the effects are turned down, but I can play the game, which is all I'm really after when I'm playing games. Sounds like graphics cards are becoming a commodity item you buy on price, not features. I suppose the manufacturers should be worried. Of course, they started off being called 'Windows accelerators', didn't they? Now that graphics are fast enough, perhaps more effort could be spent on physics engines, sound (especially latency) and perhaps helping out with AI etc. It might make for slightly less tedious single-player games (if people still play them, that is).

  • by LWATCDR ( 28044 ) on Tuesday April 28, 2009 @04:30PM (#27750345) Homepage Journal

    From the article, the 4830 uses 30 watts more power and runs 20 degrees hotter. It is a very good card for the money, but it isn't much faster than the 4770, and the 4770 is new so it will only come down in price. Both are good choices, but I think the 4770 has more value than you are giving it.

  • by Anonymous Coward on Tuesday April 28, 2009 @04:36PM (#27750473)
  • by PingPongBoy ( 303994 ) on Tuesday April 28, 2009 @04:37PM (#27750485)

    There are many untapped aspects of graphics. Better support for multiple-screen, multiple-angle viewpoints is in immediate demand, but really high DPI (dots per inch) has yet to reach budget PC users. Several years ago, IBM was reported to have monitors with a resolution equivalent to what you find on the printed page. With that kind of resolution, a typical small laptop screen should fit inside 1 square inch with room to spare. I don't know whether this was CRT technology rather than LCD, but higher resolution could be around the corner.

    After 2D, there's 3D, and real time 3D. So keep buying better graphics, and there will be even better graphics coming.

  • Re:Once upon a time (Score:5, Interesting)

    by Idiomatick ( 976696 ) on Tuesday April 28, 2009 @04:38PM (#27750523)
    Yeah, it's a shift that has already happened. Six years ago you were expected to spend more on a video card to game properly than you are now. I'd say the average amount a gamer spends on his video card has halved in that time.
  • by mkettler ( 6309 ) on Tuesday April 28, 2009 @04:38PM (#27750533)

    Well, it's hardly clear cut to call the 4830 higher performing.

    The 4830 may have slightly better memory performance, but the higher core clock gives the 4770 higher processing performance. Also, quite a lot of the detriment of the 128-bit memory bus is made up for by the much higher effective clock rate of the 4770's GDDR5 memory. You really can't look at bus width alone; bandwidth is a better measure.

    In general, the 4770 vs. the 4830 has:

    29.7% more FLOPS (960 vs 740 GFLOPS)
    11.1% less memory bandwidth (51.2 vs 57.6 GB/sec)

    The 4770 also consumes less power and thus puts out less heat (80W vs 110W), but that might not be a problem for you.

    Which will be more critical (memory vs processing) depends a LOT on the game being played. I suspect the 4830 will win out in heavily texture-loaded environments, and the 4770 will win out in shader-intensive environments.
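
    As a rough sanity check, those figures fall straight out of the published reference specs (assumed here: 640 stream processors on both cards, 750 MHz vs 575 MHz core clocks, 800 MHz GDDR5 on a 128-bit bus vs 900 MHz GDDR3 on a 256-bit bus). A minimal back-of-the-envelope sketch in Python:

        # Back-of-the-envelope GPU throughput comparison from assumed reference
        # specs (not measurements). 2 FLOPs per stream processor per clock (MAD).
        def gflops(stream_processors, core_mhz):
            return stream_processors * 2 * core_mhz / 1000.0

        def bandwidth_gbs(bus_width_bits, mem_clock_mhz, transfers_per_clock):
            # bytes/sec = (bus width / 8) * memory clock * data-rate multiplier
            return bus_width_bits / 8 * mem_clock_mhz * transfers_per_clock / 1000.0

        hd4770 = {"flops": gflops(640, 750), "bw": bandwidth_gbs(128, 800, 4)}  # GDDR5: 4x
        hd4830 = {"flops": gflops(640, 575), "bw": bandwidth_gbs(256, 900, 2)}  # GDDR3: 2x

        print(hd4770)  # {'flops': 960.0, 'bw': 51.2}
        print(hd4830)  # {'flops': 736.0, 'bw': 57.6}  (736 rounds to the 740 quoted above)
        print(f"4770 vs 4830: {hd4770['flops'] / hd4830['flops'] - 1:+.1%} FLOPS, "
              f"{hd4770['bw'] / hd4830['bw'] - 1:+.1%} memory bandwidth")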

  • by Nightspirit ( 846159 ) on Tuesday April 28, 2009 @04:42PM (#27750617)

    Like Nvidia's are any better. They haven't had flat panel scaling working for I don't know how long.

  • Uh, no. (Score:3, Interesting)

    by beavis88 ( 25983 ) on Tuesday April 28, 2009 @04:43PM (#27750639)

    Well below 30 FPS average in Crysis at 1920x1200 with only 0xAA and 8xAF? No thanks. Why would I buy a card that's underpowered on today's^H^H^H last year's games at far less than max quality?

  • Re:Complexity (Score:5, Interesting)

    by chill ( 34294 ) on Tuesday April 28, 2009 @04:50PM (#27750783) Journal

    Or until they decide to abandon polys and go to true solid-object geometry. Computing the intersection of a ray and a flat poly is trivial. Computing the intersection and the shading/reflection/refraction/etc. for a ray against an arbitrary curved surface takes significantly more horsepower.

    I remember using a program on the Amiga way back when -- Real3D from RealSoft -- that did this. Excellent rendering, but dog slow compared to Lightwave and some others.
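
    For a sense of why the flat-poly case is considered trivial: the standard Möller-Trumbore ray/triangle test is just a handful of dot and cross products. A small sketch (plain Python, purely illustrative; a GPU or renderer does the equivalent millions of times per frame):

        # Moeller-Trumbore ray/triangle intersection: a few dot and cross products.
        # Returns the distance t along the ray, or None if there is no hit.
        def cross(a, b):
            return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

        def dot(a, b):
            return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

        def sub(a, b):
            return (a[0]-b[0], a[1]-b[1], a[2]-b[2])

        def ray_triangle(origin, direction, v0, v1, v2, eps=1e-9):
            e1, e2 = sub(v1, v0), sub(v2, v0)
            p = cross(direction, e2)
            det = dot(e1, p)
            if abs(det) < eps:                 # ray parallel to the triangle's plane
                return None
            inv = 1.0 / det
            s = sub(origin, v0)
            u = dot(s, p) * inv
            if u < 0.0 or u > 1.0:
                return None
            q = cross(s, e1)
            v = dot(direction, q) * inv
            if v < 0.0 or u + v > 1.0:
                return None
            t = dot(e2, q) * inv
            return t if t > eps else None      # hit distance along the ray

        # A ray shot down the z axis at a triangle lying in the z=5 plane:
        print(ray_triangle((0, 0, 0), (0, 0, 1), (-1, -1, 5), (1, -1, 5), (0, 1, 5)))  # 5.0

    Doing the same against a curved implicit or parametric surface means solving a higher-order, often iterative, root-finding problem per ray, which is where the extra horsepower goes.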

  • Re:Once upon a time (Score:3, Interesting)

    by hattig ( 47930 ) on Tuesday April 28, 2009 @04:51PM (#27750797) Journal

    I remember buying my Radeon 9500 when it first came out because it was the cheap option at the time: it meant I could play then-current games at medium resolution and medium settings. It cost twice as much as this card, which can play most current games at very high resolution and high settings. Even the 9600GT, upon its release two years ago, couldn't achieve that. Whilst TSMC's 40nm process isn't the best, it allows for a big die shrink and hence a competitive price. Relative to the games of its day, this is the best $100 card at launch in a long time, if not ever.

  • Re:Complexity (Score:4, Interesting)

    by je ne sais quoi ( 987177 ) on Tuesday April 28, 2009 @04:53PM (#27750817)

    Eventually we'll hit the point where there's simply not enough benefit to be gotten out of an expensive GPU. For me, that time is long past. For others, it may come in the next few years.

    I agree. I haven't played Crysis, but I'm on my second time through Far Cry 2, and playability issues aside, the game looks just astounding. E.g.: 1) The human models are so realistic that they've descended deep into the Uncanny Valley [msn.com] and creep you out. 2) While the various areas you can go into do have a lot of artificial constraints on where you can walk (cliffs in this case; in the original Doom it was the walls of hallways and rooms), there are plenty of areas that don't have that, and no fog of war or limited sight distance is needed. I remember there were some hacks to remove the limited sight distance in NWN 1, and it looked okay right up until you started moving around, and then it would make the game laggy and crash a lot. 3) Segregated areas. Again, in NWN 1 and a lot of other similar games you had to segregate various areas to keep the number of polygons manageable, but in Far Cry 2 they seem to scale things in the distance to a lower level of detail so that it stays manageable. They do have distinct areas, but they seem to have made the transition between them relatively seamless; you only notice a little stuttering when you cross from one map to the next.

    Anyhow, it seems that these sorts of games are very close to as realistic as you'd really want before you hit diminishing returns for what can physically be portrayed on a 2D screen. Now, FC2 could have been a great game if, in addition to the graphics, they had worked on the soldiers' AI and some decent factions, instead of the 100%-accurate-aim, "x-ray specs" soldiers who are nominally in separate factions but all hate you, instantly recognize you, and shoot you on sight.
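
    The distance scaling mentioned above is just level-of-detail (LOD) selection. A minimal sketch of the idea (the thresholds, mesh names, and triangle counts are made up for illustration; this is not Far Cry 2's actual system):

        # Distance-based level-of-detail selection: far-away objects get cheaper
        # meshes, so the total polygon count stays manageable as the visible
        # world grows. All numbers below are illustrative, not from any engine.
        LOD_LEVELS = [
            (50.0,  "tree_high.mesh",      12000),  # within 50 m: full-detail mesh
            (150.0, "tree_medium.mesh",     2500),
            (400.0, "tree_low.mesh",         400),
            (float("inf"), "tree_billboard",   2),   # beyond 400 m: a flat sprite
        ]

        def pick_lod(distance_m):
            for max_dist, mesh, tris in LOD_LEVELS:
                if distance_m <= max_dist:
                    return mesh, tris

        for d in (10, 120, 300, 2000):
            mesh, tris = pick_lod(d)
            print(f"{d:>5} m -> {mesh} ({tris} triangles)")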

  • not an answer :) (Score:3, Interesting)

    by Xtifr ( 1323 ) on Tuesday April 28, 2009 @04:54PM (#27750851) Homepage

    Technically, "Nethack at full res" would be the GL ports Falcon's Eye [users.tkk.fi] and its successor Vulture's Eye [lighthouseapp.com]. Despite the oddball names and fancy 3d graphics, these are Nethack. And it probably is possible to find a card that would struggle to run these versions of Nethack (though you might have to go to the used market).

    So...your question wasn't actually quite as dumb as you probably intended it to be. Still dumb enough that I won't waste your time or mine by actually answering it, though. :)

    cheers

  • by santiagodraco ( 1254708 ) on Tuesday April 28, 2009 @05:01PM (#27751027)

    This is a link to a Google search for nVidia BSOD in 2008: http://www.google.com/search?hl=en&q=nvidia+bsod+2008&btnG=Search [google.com]

    So I have to agree, if it doesn't work it doesn't work.

    Of course, if you don't know how to use it, you don't know how to use it. You've stuck with a card that's been BSODing since last October? /boggle

    Try turning off Vista AERO.

  • by Anonymous Coward on Tuesday April 28, 2009 @05:09PM (#27751155)

    I work in a university lab that has 2 of those IBM monitors. They have a resolution of 3840x2400, at 204 dpi. Unfortunately, their max refresh rate is ~20Hz. See http://en.wikipedia.org/wiki/IBM_T221
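
    Those numbers are self-consistent. Assuming the T221's 22.2-inch diagonal, the pixel density falls straight out of the resolution; a quick sketch:

        import math

        # Pixel density of a 3840x2400 panel, assuming a 22.2-inch diagonal
        # (the IBM T221's quoted size).
        width_px, height_px, diagonal_in = 3840, 2400, 22.2

        diagonal_px = math.hypot(width_px, height_px)   # pixels along the diagonal
        print(f"{diagonal_px / diagonal_in:.0f} dpi")    # ~204 dpi

        # It also pushes exactly 4x the pixels of a 1920x1200 display per refresh,
        # which is a big part of why the refresh rate tops out so low.
        print(width_px * height_px / (1920 * 1200))      # 4.0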

  • by Warlord88 ( 1065794 ) on Tuesday April 28, 2009 @05:10PM (#27751187)

    I don't think there has been a marked change in the trend in GPU pricing. Crysis is already one and a half years old, so it is no surprise that today's modestly priced cards can run it well enough. Similarly, a budget GPU in the $100-150 segment should be able to run most games at reasonably high settings.

    But what about a few years down the line? Developers will keep churning out more demanding games, and these budget GPUs won't be able to keep up with their requirements. I am pretty sure that today's GTX 295 will not require an upgrade for at least 5 years. Please correct me if I'm wrong. One guy I know bought the 8800 GTS 640 MB right when it was released, for a ridiculously high price. Even today he has no problems playing the latest games at max settings.

  • Re:Agreed! (Score:3, Interesting)

    by aj50 ( 789101 ) on Tuesday April 28, 2009 @05:14PM (#27751253)

    Unfortunately, even with 1920x1200 displays, we're still not there yet. Anti-aliasing at a lower resolution will often look better because it smooths out the sharp edges, essentially blurring the image slightly.
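
    The simplest form of anti-aliasing, supersampling, makes that blurring explicit: render at a higher internal resolution and average blocks of samples down to the display resolution. A toy sketch of the resolve step (assumed 2x2 box filter on a grayscale image; real drivers use fancier sample patterns):

        # Toy 2x supersampling resolve: average each 2x2 block of the high-res
        # render into one output pixel. This is what softens jagged polygon edges.
        def downsample_2x(image):
            """image: list of rows of grayscale values, dimensions divisible by 2."""
            out = []
            for y in range(0, len(image), 2):
                row = []
                for x in range(0, len(image[0]), 2):
                    block = (image[y][x] + image[y][x+1] +
                             image[y+1][x] + image[y+1][x+1])
                    row.append(block / 4.0)
                out.append(row)
            return out

        # A hard diagonal edge at 4x4 becomes a softened edge at 2x2.
        hi_res = [
            [0,   0,   0,   0],
            [255, 0,   0,   0],
            [255, 255, 0,   0],
            [255, 255, 255, 0],
        ]
        print(downsample_2x(hi_res))  # [[63.75, 0.0], [255.0, 63.75]]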

  • by Anonymous Coward on Tuesday April 28, 2009 @05:17PM (#27751315)

    Indeed, it may be an "on the horizon" technology, but I look forward to a true 3D display. Perhaps some sort of holographic / 3D video hybrid.

    But as far as trusty 2D displays go right now, the reason we're seeing fairly powerful hardware at a reasonable price point is simply demand. Sure, the economy for one, but primarily the demands of the software. Right now so few applications really take full advantage of the hardware that there's no need to have the latest and greatest. Back in 1998 you couldn't run Unreal without an unreal PC: we're talking Pentium 166 MHz, 16 MB RAM, 1 MB video card, CD-ROM drive and 100 MB of hard disk space. And for all that you were still getting mediocre performance. For context, at the time I was rocking a 486 SX at 20 MHz overclocked to 33, with 4 MB of RAM, a 2x CD-ROM and 107 MB of total disk space. (Legacy hardware credentials: check.) The point is that nowadays even the benchmark games like Crysis will run at reasonable frame rates with pretty good detail settings on a very large chunk of midrange hardware. As mentioned elsewhere, when the price of decent hardware comes down, more developers will invest their time in making more graphics-intensive apps. A developer wants to reach the largest audience possible, and that usually means targeting the low-to-middle, be it age group, intelligence group, or performance group.

  • Re:Once upon a time (Score:4, Interesting)

    by Bigjeff5 ( 1143585 ) on Tuesday April 28, 2009 @05:33PM (#27751609)

    I remember when "high-end" meant $800-1000, this was particularly true when SLI first came out. At it's peak, maxing out your graphics capabilities meant spending up to $1400. $500 was maybe high side of mid-range, and you could get by with low settings on any game with a $300 card. The $300 level was where I spent most of my money, as I couldn't afford the high end stuff, and there were -always- settings I could not turn on because I didn't have the power. And this was when "High-resolution" Was in the 1600x1400 range (at the time of SLI we started seeing higher), monitors that went higher than that were prohibitively expensive.

    A $100 card that can max out settings in most games on most monitors is pretty significant. Granted, I didn't RTFA, so I don't know if that's exactly what they are saying, but either way it is significant. In fact, a $100 card that can play new games with most settings on high at 1080p is pretty frickin' impressive, even if it can't max everything out. Makes me want to buy a new card.

  • by Anonymous Coward on Tuesday April 28, 2009 @05:34PM (#27751619)

    What makes you think open-source drivers automatically make them suck less? Open source is a concept, and when applied, it quite often does not meet the demands or requirements of the real world.

    That said, yes, I have been ignoring AMD/ATI for the past *4* years: ever since I reported a bug with hardware mouse cursor support on their Radeon cards. Four completely different Radeon cards (one even made by Appian at the time) and four different PCs (including two stock out-of-the-box Dell systems) -- yet ATI's response was "we can't reproduce this, can you send us the **entire PC**?" As far as I know, the bug still exists (reading Radeon driver ChangeLogs indicates no related fixes), and I'm not willing to spend money on a Radeon card just to find out it still exists.

    Thus, all the systems are now nVidia. Different set of bugs, but none so far that affect GDI or hardware mouse cursor support.

    Bottom line: we live in a world where when the axe begins to fall, QA is first on the chopping block.

  • by Anonymous Coward on Tuesday April 28, 2009 @05:35PM (#27751641)

    There are many untapped aspects of graphics. Better support for multiple-screen, multiple-angle viewpoints is in immediate demand, but really high DPI (dots per inch) has yet to reach budget PC users. Several years ago, IBM was reported to have monitors with a resolution equivalent to what you find on the printed page. With that kind of resolution, a typical small laptop screen should fit inside 1 square inch with room to spare. I don't know whether this was CRT technology rather than LCD, but higher resolution could be around the corner.

    After 2D, there's 3D, and real time 3D. So keep buying better graphics, and there will be even better graphics coming.

    The monitor you mention was produced by IBM and ViewSonic up until 2005. It is 22" with a resolution of 3840x2400, which is freaking awesome, except that scrolling from one side of the screen to the other is insanely slow because there are so many pixels to push. The other drawback was that new they cost $7k plus. More details are available at http://en.wikipedia.org/wiki/IBM_T220/T221_LCD_monitors

  • Diminishing returns (Score:5, Interesting)

    by obarthelemy ( 160321 ) on Tuesday April 28, 2009 @05:49PM (#27751871)

    I think we've reached a point where

    - graphics are no longer a limiting factor in a game's enjoyment. Wireframe spaceships sucked; 100,000-polygon ships instead of 10,000-polygon ones probably don't make a huge difference. If anything, too many moving things actually distract. We can go "more lifelike" and blend (pun intended) the boundary between games and films, but still...

    - graphics costs are ballooning, both in creating the resource files and in programming all the eye candy and actions. At the same time, attention is moving to other topics (AI...), and budgets are tight.

    - there's probably a limit on how big a PC screen, and how small the dots on it, can be. Actually, most LCD screens don't even render all that many colors anyway.

    Which explains why nVidia in particular is desperately trying to find other uses for the GPU. They are the only one of the big three without much else in their portfolio.

  • by billcopc ( 196330 ) <vrillco@yahoo.com> on Tuesday April 28, 2009 @06:15PM (#27752215) Homepage

    It's nice that ATI keeps releasing value-conscious products for those cheap gamers, but it is rather short-sighted and sensational to say that "a $99 card is all you need". Ten years ago, a $99 card was enough to play Quake 3 at medium resolution and medium settings. The one thing these graphics companies are good at is marketing. They have figured out how to maximize their sales, and that meant crippling the used resale value of their products to capture the idiotic low-end market. They sell these crippled products in big-box stores to people who don't know better, to get them hooked on the upgrade treadmill. Six months later, Joe Stupid is a budding gamer, wants to play Call of Duty 8 and drops another $99 on that month's cheapo card. After a couple of years, Joe has upgraded 3-4 times, while he could have spent the $300 up front for a good card that would still have some fight left in it.

    I have seen this cycle far too often. I dunno, maybe these people suck at math, but they're clearly not saving money in the end. Some people are happy with their $99 card and keep it for the lifetime of their PC, but those people would have been just as happy with "free" Intel integrated graphics. Gamers always want more.

    That's also why we've seen a ton of movement in the low-end segment, but very little progress at the top end. If you spent $500 on graphics two years ago, you're still within 10-15% of today's $500 graphics solutions, and that's just pathetic.

  • by Targon ( 17348 ) on Tuesday April 28, 2009 @06:33PM (#27752459)

    One thing this does is push game developers to make games with better graphics faster and sooner than they would have in the past.

    Developers need to look at the low end, the high end, and the average CPU and graphics power of their target audience. In the past, we would see a ton of Intel garbage graphics in systems, and that was the baseline that developers had to code for. As time has moved on, more and more systems, even those with integrated graphics, have shown up with NVIDIA graphics on the Intel side, and AMD systems have always had either AMD or NVIDIA graphics, which raises the bar by quite a bit.

    With the level of GPU power in a $99 card, it shouldn't take too long for integrated graphics to show a significant improvement over the Radeon 3300 graphics on the AMD 790GX chipset. The question remains how long it will take, and how good or bad the integrated version ends up being.

    Now, that raises the bar. While resolutions may not increase, the detail and quality we can run at will go up. Yes, a $100 card may run fine at medium graphics settings, but can you really expect a $100 card to run every game at max settings with AA, even at 1024x768? That is the key to why people will buy higher-end cards: so they can see games in their full glory.

  • by Locke2005 ( 849178 ) on Tuesday April 28, 2009 @06:40PM (#27752559)
    Run a video card stress test, like this one [freestone-group.com]. If your computer crashes/reboots/hangs, then your graphics card is overheating. I had an nVidia card with no built-in fan that came pre-installed in a high-end Sony Media PC with no provision made for cooling the graphics card. I always had trouble with it, even after installing an auxiliary fan to cool it, so I replaced it with the best ATI card that would work in my box. Now it has no problem passing stability tests. And I don't buy crap made by Sony anymore either.
  • by Plekto ( 1018050 ) on Tuesday April 28, 2009 @06:52PM (#27752737)

    The specs for DX11, such as they are at this point, call for real-time ray tracing. That will require a massive increase in power that, frankly, this new card is only starting to approach. There's tons of room to grow here. Perhaps it would be the last DX10 card you'd ever need, but it's not even close for future use.

    That said, there should also be a standardized ray tracing test in the video suites. IIRC, there is already a ray traced version of Quake 4 out.

    http://www.idfun.de/temp/q4rt/ [idfun.de]

  • Re:Once upon a time (Score:4, Interesting)

    by Misanthrope ( 49269 ) on Tuesday April 28, 2009 @07:08PM (#27752897)

    I'm from San Jose, I remember when all the semiconductor and tech company buildings were orchards.

  • Re:Once upon a time (Score:3, Interesting)

    by complete loony ( 663508 ) <Jeremy.Lakeman@g ... .com minus punct> on Tuesday April 28, 2009 @08:28PM (#27753705)
    I remember some time ago it was the impending release of Half-Life 2 and Doom 3 that defined GPU purchases. I think the performance benchmark hasn't moved very much since then: GPUs that can play HL2 / Doom 3 well at your maximum resolution can probably play anything released since then reasonably well. I'd say Valve's hardware survey backs this up, showing that not all gamers are upgrading to the bleeding edge.
  • by Bigbutt ( 65939 ) on Tuesday April 28, 2009 @09:11PM (#27754015) Homepage Journal

    Amusing. This guy says the same thing as I did but with Nvidia and he's insightful but I'm marked as a Troll when it's clear he was trolling and I wasn't.

    Meh, the Nvidia fanboys must be out tonight.

    [John]

  • by Nom du Keyboard ( 633989 ) on Tuesday April 28, 2009 @09:14PM (#27754031)

    A raytraced 3d view is much much more realistic than anything a modern GPU can muster.

    Actually, compared to the current raster hacks available, a raytraced 3D view is only A LITTLE more realistic than what a modern GPU can muster. Some very clever hacks have proven able to bring raster graphics close to raytraced results - and still in realtime.

  • by Nom du Keyboard ( 633989 ) on Tuesday April 28, 2009 @09:20PM (#27754059)

    Realtime ray-tracing.

    This post would have been Redundant if it had been First Post.

    Realtime raytracing on the desktop is 5 years away. It has always been 5 years away, and it will always be 5 years away.

    Why? Because monitors will always be much bigger and faster 5 years from now, which keeps multiplying the requirements for realtime raytracing.
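
    To put rough numbers on that scaling: primary rays alone grow with resolution times refresh rate, before you even count shadow, reflection, and refraction rays. A quick sketch (the secondary-ray count is an illustrative assumption, not a real renderer's figure):

        # Primary rays per second needed for realtime raytracing at various
        # display sizes. Secondary rays (shadows, reflections, refractions)
        # multiply this further; 4 per pixel below is just an illustrative guess.
        def rays_per_second(width, height, fps, secondary_per_pixel=0):
            return width * height * fps * (1 + secondary_per_pixel)

        for w, h in [(1280, 1024), (1920, 1200), (2560, 1600), (3840, 2400)]:
            primary = rays_per_second(w, h, 60)
            total = rays_per_second(w, h, 60, secondary_per_pixel=4)
            print(f"{w}x{h} @ 60 Hz: {primary / 1e6:5.0f} M primary rays/s, "
                  f"~{total / 1e6:5.0f} M rays/s with 4 secondary rays per pixel")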
