A $99 Graphics Card Might Be All You Need
Vigile writes "With the release of AMD's latest budget graphics card, the Radeon HD 4770, the GPU giant is bringing a lot of technology to the table. The card sports the world's first 40nm GPU (beating out CPUs to a new process technology for the first time), GDDR5 memory, and 640 stream processors, all for under $100. What is even more interesting is that as PC gaming has evolved it appears that a $99 graphics card is all you really need to play the latest PC titles — as long as you are comfortable with a resolution of 1920x1200 or below. Since so few PC gamers have screens larger than that, could the world of high-end PC graphics simply go away?"
Once upon a time (Score:4, Insightful)
I used to have a top-of-the-line 3dfx graphics card. It was all I ever thought I'd need.
Today, that kind of power is available in my scientific calculator.
Just goes to show that today's technology will become yesterday's technology in a very short period of time.
Complexity (Score:5, Insightful)
A more expensive card == a bigger e-Peen (Score:4, Insightful)
Therefore, no. The high end will not be going away. Some folks will always feel inadequate and seek to compensate.
Could the world of high-end PC graphics go Away? (Score:5, Insightful)
No is the easy answer.
High-end graphics cards are rarely sold because of their real-world in-game performance, which is often insanely high; too high to notice in any game on release anyway. Nope, in my experience $600 graphics cards are all about bragging rights and benchmarks. It's the same category of people that buy water-cooling and RAM chip heat-sinks & fans; they just want to squeeze that last 2% of throughput out of their probably insanely overclocked systems for the highest benchmarks possible.
It's actually good fun if you're into that; what you learn in overclocking is quite astonishing, but the super-high-end graphics cards are all part of that game.
Re:GTA4?? (Score:3, Insightful)
Re:That's why we have... (Score:3, Insightful)
I expect we'll find that Xbox4000, PS4 and standard PC platform (TM) will be just as common in the study as in the lounge room, and vice versa - the upgrade treadmill will be broken.
I expect there will be few complaints, since everyone stands to benefit from that kind of transition. Players will have machines that are smaller and can do more, game devs can target hardware more closely and spend more time actually making games, GPU manufacturers can exploit longer product cycles and broader sales.
The only folks who will suffer are those insufferable people who like to flaunt bleeding edge hardware like it's a technological penis extension.
Re:Could the world of high-end PC graphics go Away (Score:4, Insightful)
Wait...
Where does the heat in the water go?
Re:Once upon a time (Score:4, Insightful)
Fewer games like Crysis? The majority of PC games aren't like Crysis in their demands at the high end anyway. So what are you trying to say, exactly? Crysis has always been the exception, not the rule.
I once had a $300K SGI computer (Score:5, Insightful)
Vacuum your case out... (Score:5, Insightful)
Re:Agreed! (Score:5, Insightful)
Re:Complexity (Score:3, Insightful)
Well we aren't yet to the point where a cheap card can produce completely photorealistic movies in real-time that are completely indistinguishable from real life. Until we get there, I'm sure people will keep pushing those limits.
Once we get there, I'm not sure what will happen. Maybe they'll still want faster cards so they can offload some other kinds of processing (physics? AI?).
High-end what? (Score:5, Insightful)
I've been 'into computing' since a '286/20 was described as 'lightning fast'. I've never, ever spent more than 100 dollars on a video card. I've always bought last year's high flyer for 60-80 dollars, and I've never lacked for fun games to play, nor noticed resolution as a problem.
Last year's CPU on last year's mobo costs 100 dollars for the pair. HDD upgrades for sale at 60 dollars - who isn't happy with this? Your average computer lasts about 4 years; by buying 1 year late you get 3/4 the performance life at 1/4 the cost while staying within the range of the target platform for most of the latest games.
Why is this even a question?
and yet, their drivers still suck (Score:3, Insightful)
All you need? Going away? WTF? (Score:4, Insightful)
Clearly not written by anyone who is very familiar with the graphics requirements of games like Crysis or Far Cry 2. Can you run these games on a budget card? Yes. Can those games be enjoyed at a lower resolution or frame rate? Quite possibly. Can either of those titles be enjoyed at their maximum potential? No.
There are plenty of idiots who say bigger this, bigger that == bigger e-peen. That is really just stupid. There is a large segment of the gaming population who actually enjoy playing their games the way the designers intended, using PhysX, anti-aliasing, etc. to achieve a full cinematic effect.
This goes for any enthusiast niche market. You have your audiophiles, your car guys, musicians, and artists, the list goes on. Why does a musician want a certain amp or guitar? Is it because he wants his peen to go to 11?
Re:and yet, their drivers still suck (Score:3, Insightful)
Then why the hell do you keep buying ATI cards?
Re:But their drivers still suck (Score:4, Insightful)
Probably shouldn't be a troll here. I have a $250 high-end Radeon. Bought it along with a new system back in October. From the beginning, it would blue screen on boot, but only once in a while. Now it's doing it more often (the event log identifies the problem as the ATI driver), it randomly reboots the machine, and currently the machine is in a reboot cycle. Searching on the problem shows it's well known. Suggestions are to upgrade to the newest driver (fails) and disable some feature (fails). Contacting ATI results in "it's Microsoft's fault". Calls to Microsoft result in "it's ATI's fault".
Yea. I agree. No matter the price, if it doesn't work, it doesn't work.
[John]
I can play this too.
Probably shouldn't be a troll here. I have a $250 high-end Geforce. Bought it along with a new system back in October. From the beginning, it would blue screen on boot, but only once in a while. Now it's doing it more often (the event log identifies the problem as the Nvidia driver), it randomly reboots the machine, and currently the machine is in a reboot cycle. Searching on the problem shows it's well known. Suggestions are to upgrade to the newest driver (fails) and disable some feature (fails). Contacting NVidia results in "it's Microsoft's fault". Calls to Microsoft result in "it's NVidia's fault".
Yea. I agree. No matter the price, if it doesn't work, it doesn't work.
Nvidia is known to pay forum users and the like to post FUD like this.
Ever since AMD bought ATI the drivers have been improving by leaps and bounds. With AMD/ATI, you now get a driver release every month. Their drivers have been completely stable for at least a year or two now, and game support has been growing and solidifying as well. The only game that ATI cards struggle with now is UT3; all the others the newest line (4850/70/90) thoroughly trounces the equally priced Nvidia card.
Think of it this way-- would you rather have the Nvidia 285 for $330, or the 4890 for $230? They perform the same, and drivers are not an issue.
Re:But their drivers still suck (Score:2, Insightful)
It would be nice if you actually made a point. I'm surprised you were modded up.
You and I know ATI cards are top performers and SOLID stable cards. The trumping they gave nVidia over the last year speaks volumes.
The release of this card does nothing more than to say they are sticking to a tried and true strategy. While nVidia is forced to sell more costly hardware ATI is able to produce less expensive hardware that outperforms the competition.
Re:Once upon a time (Score:3, Insightful)
I disagree. I'm not saying this is it, but eventually you reach a point of diminishing returns. I'd say sound cards reached it several years ago, such that only real audiophiles buy high-end sound cards nowadays and on-board sound is good enough for most people.
I think it's fair to expect graphics cards to reach a plateau at some point as well, and that point may be sooner rather than later. You can only boost the resolution and push more and more polygons for so long until it stops making much difference.
Re:But their drivers still suck (Score:2, Insightful)
I made the unfortunate error of choosing an ATI product over an nVidia product when building my media center machine. Even though the specs were similar and the ATI board was $5 more, I went with ATI because of the superior scaling options for HD panels. This was prior to nVidia's driver update that kinda threw it together.
The newest Catalyst drivers will not display high-bitrate video. They borked it up sometime after the 8.7 release. Of course I could just use the 8.7 version, right? Except it doesn't have the scaling options...
I stand by my original post, if it doesn't work, it's a waste of money regardless of what they charge. Fine boards, crappy drivers.
Re:parent not really a troll (Score:2, Insightful)
Also, if you're into the whole "Free as in speech, not free as in Beer" thing, Ati should be the hardware of choice, even though their proprietary drivers aren't as good as NV's.
And apart from ATI's support for OSS driver projects, NVidia has pulled off some highly [theinquirer.net] questionable [theinquirer.net] moves in the recent past, comparable on the moral scale to Microsoft business tactics, effectively making them a no-buy in my book as long as ATI puts out competitively priced and performing products.
Re:Complexity (Score:5, Insightful)
At some point we hit a point of diminishing returns on better graphics units... the human eye can only distinguish so much.
But we're nowhere NEAR that argument yet. State of the art movie-quality CG is still not quite there, and you are talking rendering times of hours per frame, not frames per second.
Eventually we'll hit the point where there's simply not enough benefit to be gotten out of an expensive GPU. For me, that time is long past. For others, it may come in the next few years. For a small portion, the 'dreamers', it'll never come... but why would any company spend millions and millions developing new and better chips for such a small market?
Graphics are not the only thing a GPU is used for these days. Game physics on the GPU is still in the early stages, and game AI on the GPU is almost non-existent so far. 3D gaming is still pretty new (and will be niche until display technology improves) and (at least) doubles the GPU requirements.
And who's to say 20-30 years from now we're not projecting stereo images directly onto your retina, or even your optic nerve? I sure hope that is at a better resolution than 1920x1200. We are orders of magnitude away from anything, graphics- and physics-wise, that can fool the human brain.
I can't believe there are so many people here who really think a technology like this is "good enough" today. Have a bit of imagination, and it's pretty obvious (to me at least) that we've barely scratched the surface of 3D computer graphics.
Then it's time for (Score:3, Insightful)
Re:High-end what? (Score:5, Insightful)
I would have said that until 1-2 years ago, the best "value per dollar" for video cards was about at $200. This is how much I spent on my first Voodoo2 card and my Geforce 6800. This past year, I spent less than $100 for a card that is arguably better performance per dollar, relative to the demand of the games on the market. So I would agree, $100 is the old $200 in terms of video cards.
Re:Complexity (Score:4, Insightful)
Then it'll move onto rendering more things on the screen, like games with ten thousand characters on screen at one time, all completely unique, and a landscape with infinite draw depth, nothing popping up but for instance a tree appearing as a single pixel on the horizon, getting closer and bigger until you have a photo-realistic microscopic view of the bark.
Nvidiots are still the same. (Score:4, Insightful)
Sorry, but whether you're an ATidiot or NVidiot, the same is true.
I used an ATi board up until I needed an NVidia board (to get my old VRStandard shutter glasses usable again). Then NVidia fucked me over by making the "new" 3D glasses driver Vista-only and proprietary to their own fucking brand of glasses, forcing me to choose between running an old driver (which won't work for certain games) or buying $500 in new hardware AND infecting my PC with Vista.
Bottom line is, if you're not doing something like that, you don't really care whether you have NVidia or ATi. Buy whatever is at the "sweet spot" in the pricing point. The 4770 for $99 certainly is a great price.
Oh, and one other thing to remember - Are you "Okay" with playing in 1920x1200? Fuck, man, I remember when 640x480 was stellar. When 800x600 at 30 frames was something to goggle at. To this day, I run a 21" CRT monitor that does 120 Hz at 1280x1024, I still have an NVidia 7800GS card (though I'll upgrade in a few months finally... after THREE AND A HALF YEARS on my current rig with no tears shed) and that's all I need.
Does anyone "need" 1920x1200? I doubt it. "High-end" graphics haven't been used by anyone but a few people who look more for bragging rights than fun in gaming for years. Hell, what are you going to play on it anyways - all the MMORPG's are still designed to run on 5 year old hardware, and anything "intensive" like Crysis is more of a fucking tech demo than an actual playable game anyways. The fun games, except for the MMORPG's, now come out on the consoles first and maybe get a PC port if you're lucky a year later.
Re:Nvidiots are still the same. (Score:5, Insightful)
Does anyone "need" 1920x1200? I doubt it. "High-end" graphics haven't been used by anyone but a few people who look more for bragging rights than fun in gaming for years.
I run at 1920x1200, not because I want bragging rights but because that is the native resolution of my monitor, and any non-native resolution looks fuzzy in comparison. The fact that I have a 24" monitor running at a high res may make me a pixel junkie, but that has nothing to do with gaming and everything to do with ordinary apps on my desktop.
Re:It's still under a TeraFLOPS, marginally (Score:3, Insightful)
Unfortunately, that's the way it goes. AGP is obsolete.
The sole advantage of AGP was a faster, dedicated bus for graphics. PCI-Express accomplishes this and much more while being significantly faster than AGP was. AGP has gone the way of the dinosaur, and PCI is the new ISA (potentially useful in increasingly specific, niche applications).
Why would a manufacturer cram the latest technology into an obsolete interface? They probably wouldn't recoup the costs of re-configuring for AGP in sales if they did. Let's face it, if you are stuck with AGP, you are probably not enjoying all the benefits of modern multi-core technology either. An upgrade now would be very significant, and you should still be able to get your graphics card/mobo/cpu upgrade for under $300.
You're not alone though, I'm in the same position.
Re:But their drivers still suck (Score:3, Insightful)
Ah, it's the old "now they're better" argument. My laptop with a Radeon 9600 still can't suspend with the proprietary driver. Sometimes it locks up when I enable an external monitor with their utility (gotta save all my work before trying that one.) Seriously, I hear the same thing about MS and security. If they're living with a reputation they've earned, don't expect that to change overnight. And don't blame users who've gotten bad support, even if their data is a little out of date. If I'm going to get screwed again, at least it won't be by the same company.
My next laptop will have Nvidia based on the experiences with my current one. Maybe after that they'll get another shot.
Re:parent not really a troll (Score:3, Insightful)
Unintended consequences? (Score:4, Insightful)
Re:Nvidiots are still the same. (Score:2, Insightful)
I suppose no one "needs" that resolution any more than anyone "needs" any particular resolution. That said, my primary display is 2560x1600, so I suppose, yes, I "need" more than 19x12. Could I get by with less? Sure. I can "get by" with my 1.6 GHz Atom Netbook opening browser windows sllloooowly. That doesn't mean I don't want to use my 3.0 GHz quad-core Penryn desktop when I can.
Vanguard, Saga of Heroes, a Feb 2007 Sony MMORPG, chokes on anything much less than a bleeding edge video card and PC if you're trying to raid with 24 people and have decent graphics quality. That's not at 19x12, or even 16x12; that's at resolutions like 1280x1024. True, if you just want to play the game and group, you can get by quite happily on a $500 PC. If you want to raid with horrible graphics quality, you can get by on a well under $1000 PC. If you want decent graphics quality, aye, there's the rub.
That's a 2-year-old MMO from a major MMO company that manifestly is not designed to "run well" on 5 year old hardware. There are others like it.
This much is true: cards like this may sharply narrow the need (and hence market) for purchasing (high end) enthusiast cards. Indeed, beyond that, ultimately (2015?) we may well find that integrated graphics are good enough for 95% of the non-casual gaming market. If that happens it will be a startling transformation, especially for NVidia.
Re:Complexity (Score:3, Insightful)
Re:High-end what? (Score:4, Insightful)
Let's use ye olde law of Moore:
Let's assume a 6-month release cycle for hardware and games (pretty close to reality - these things do tend to come in batches around twice a year), and average the performance out over 4 years of ownership.
At 1 year old, your shit is at 71% of the new, hot shit.
At 1.5 years old, your shit is at 59%.
At 2 years old, your shit is at 50%.
At 2.5 years old, your shit is at 42%.
At 3 years old, your shit is at 35%.
At 3.5 years old, your shit is at 30%.
At 4 years old, your shit is at 25%.
At 4.5 years old, your shit is at 21%.
At 5 years old, your shit is at 18%.
That's an average (over your 4 years of ownership) of 39%.
If you buy brand new shit:
Brand new, your shit is at 100% of the new, hot shit.
At .5 years old, your shit is at 84%.
At 1 years old, your shit is at 71%.
At 1.5 years old, your shit is at 59%.
At 2 years old, your shit is at 50%.
At 2.5 years old, your shit is at 42%.
At 3 years old, your shit is at 35%.
At 3.5 years old, your shit is at 30%.
At 4 years old, your shit is at 25%.
That's an average (over your 4 years of ownership) of 55%.
If you plan 4 years of ownership (plus some slight overlap at the end) then waiting a year is beneficial if you can save just 29% on the price.
I chose to use specific points and average them since Moore's law doesn't apply to retail prices smoothly, nor does the desire for performance (that tends to line up with hardware and software releases).
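The figures above can be reproduced with a short sketch. Assumptions (mine, read off the table rather than stated in the comment): relative performance of t-year-old hardware is 2^(-t/2), i.e. performance doubles every two years, sampled every 6 months inclusive of both endpoints; the helper names are hypothetical.

```python
def relative_perf(age_years):
    # Relative performance of hardware aged t years versus the current
    # generation, assuming performance doubles every two years.
    return 2 ** (-age_years / 2)

def average_over_ownership(start_age, years_owned=4.0, step=0.5):
    # Sample every 6 months over the ownership window, inclusive of both
    # endpoints (9 samples over 4 years), and average the relative perf.
    samples = []
    age = start_age
    while age <= start_age + years_owned + 1e-9:
        samples.append(relative_perf(age))
        age += step
    return sum(samples) / len(samples)

buy_new = average_over_ownership(0.0)   # roughly 0.55, the 55% above
buy_late = average_over_ownership(1.0)  # roughly 0.39, the 39% above
break_even_discount = 1 - buy_late / buy_new  # roughly 0.29
```

Note that shifting the whole window one year later scales every sample by the same factor 2^(-1/2), which is why the break-even discount works out to exactly 1 - 1/sqrt(2), about 29%, matching the comment.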
Re:Once upon a time (Score:5, Insightful)
This reminds me of a conversation I once had with some guy at a (rather geeky) birthday party. I asked him about the SLI setup he bought two months ago. He told me that he'll replace it soon because "there are random frame drops when I play a recent game and watch a DVD on the other screen". He was really serious about this. I pretended to be interested for another 3 minutes and left him alone before my urge to punch him in the face became overwhelming.
So in other words: I believe that there will be a market for such cards as long as there are enough clueless people who earn enough money to barely afford them. In my experience this target group is pretty immune to arguments - there is no reason to assume that they'll ever wise up...
Re:I once had a $300K SGI computer (Score:3, Insightful)
Less powerful than these cards.
I'm old enough to remember such things. Your comment should've been modded Insightful.
Re:Nvidiots are still the same. (Score:4, Insightful)
Does anyone "need" 1900x1200? I doubt it.
You don't need a computer at all. You can just sit in a cave and eat whatever crawls below your feet.
Re:But their drivers still suck (Score:3, Insightful)
What's probably going to happen is that the second that the OGP starts to get a decent graphics card, some of the major vendors will start releasing documentation and/or much better Free Software drivers.
While ATI is attempting to do this, frankly, I don't see why you would assume that. It seems to me that the more likely outcome is a patent lawsuit.
Re:Once upon a time (Score:4, Insightful)
Re:Complexity (Score:4, Insightful)
The domination of polygon and SDS workflow in 3D modeling was mostly about the convenience for the artist. NURBS were available long before SDS became common. Subdivided polygons replaced real curved surfaces simply because they are so much easier to work with. CSG models still exist in some markets like CAD because the generally superior polygon workflows are inadequate.
Re:Once upon a time (Score:3, Insightful)
Re:Once upon a time (Score:3, Insightful)
If that were true, it would have been a pretty good job of time traveling...
Re:Once upon a time (Score:4, Insightful)
Crysis today is like Quake and Hexen II when they first came out. It's a game based on a bleeding-edge graphics engine that won't be truly playable (at high quality) on commodity hardware until another generation or two of graphics chipsets come to market.
There are always going to be a few bleeding-edge games that break the rules. Most people who want to play them without breaking the bank will buy the console version. Others will just wait until hardware gets better.
Re:All you need? Going away? WTF? (Score:4, Insightful)
"Can either of those titles be enjoyed at their maximum potential?"
Tetris is way more fun if you turn up the resolution and clipping.
nope (Score:3, Insightful)
as long as they can push more polygons (or rays?!), bigger, badder video cards will still be on the market
Re:parent not really a troll (Score:3, Insightful)
In the specific field of graphics cards, I got burned by nVidia's horrible, horrible "GeForce 4 MX" line. Luckily it was just a spare one at work that I borrowed for a quick upgrade - I brought it back the next day, my then-3-year-old GeForce 2 Ti was significantly faster. There should be some cardinal rule of marketing: "If it's not better you can't put a bigger number on it."
Re:Once upon a time (Score:4, Insightful)
This reminds me of a conversation I once had with some guy at a (rather geeky) birthday party. I asked him about the SLI setup he bought two months ago. He told me that he'll replace it soon because "there are random frame drops when I play a recent game and watch a DVD on the other screen". He was really serious about this. I pretended to be interested for another 3 minutes and left him alone before my urge to punch him in the face became overwhelming ;)
So in other words: I believe that there will be a market for such cards as long as there are enough clueless people who earn enough money to barely afford them. In my experience this target group is pretty immune to arguments - there is no reason to assume that they'll ever wise up...
Don't get mad, and don't try to convince them otherwise, for heaven's sake. Guys like that are paying for the R&D costs of the uber-high-end cards that you and I enjoy for $100 a few years later.
Re:Once upon a time (Score:5, Insightful)
The whole premise is silly and reeks of someone who has no experience in... well, anything really. As I said, every industry has high-end stuff adopted by a few, which eventually becomes standard and adopted by the masses. Welcome to the evolution of technology.
I'd think someone on slashdot would at least realize that.
I think the argument here isn't that "there is always a very expensive 'high end' and a more moderately priced and still quite adequate 'mid range'". It's more along the lines of "As technology advances, there ceases to be a 'high end' market for some products."
Look at it this way - when was the last time you bought a dedicated serial I/O card? When was the last time you bought a dedicated sound card, or network card, or firewire card? All of these are now so trivial that they're ubiquitously built in to midrange motherboards, so there is no "high end" market for them any more. TFA is just saying that video cards are next.
Re:But their drivers still suck (Score:2, Insightful)
Re:Nvidiots are still the same. (Score:3, Insightful)
Oh, and one other thing to remember - Are you "Okay" with playing in 1900x1200? Fuck, man, I remember when 640x480 was stellar. When 800x600 at 30 frames was something to goggle at.
Yup - Starcraft runs at 640x480, Diablo II only runs at the stratospheric 800x600 after the expansion came out. Both are fun, still sell well, and no one has ever complained about the graphics.
Good graphics are not the same as high-res.