A $99 Graphics Card Might Be All You Need
Vigile writes "With the release of AMD's latest budget graphics card, the Radeon HD 4770, the GPU giant is bringing a lot of technology to the table. The card sports the world's first 40nm GPU (beating out CPUs to a new process technology for the first time), GDDR5 memory, and 640 stream processors, all for under $100. What is even more interesting is that as PC gaming has evolved it appears that a $99 graphics card is all you really need to play the latest PC titles — as long as you are comfortable with a resolution of 1920x1200 or below. Since so few PC gamers have screens larger than that, could the world of high-end PC graphics simply go away?"
Once upon a time (Score:4, Insightful)
I used to have a top-of-the-line 3dfx graphics card. It was all I ever thought I'd need.
Today, that kind of power is available in my scientific calculator.
Just goes to show that today's technology will become yesterday's technology in a very short period of time.
Re:Once upon a time (Score:5, Informative)
Personally I think this is true, and I think most game companies have targeted $100-or-less video cards for a while now. But there will always be games like Crysis that let you make use of your cutting-edge $500 card. Games can easily be built to 'work' on a $50 card and still, with a few settings, tax a $500 card. The coding investment is minimal compared to other features, so people will always want it.
Re:Once upon a time (Score:4, Insightful)
Fewer games like Crysis? The majority of PC games aren't like Crysis in their demands at the high end anyway. So what are you trying to say, exactly? Crysis has always been the exception, not the rule.
Re: (Score:3, Interesting)
I remember buying my Radeon 9500 when it first came out because it was the cheap option at the time: it meant I could play then-current games at medium resolution and medium settings. It cost twice as much as this card, which can play most current games at very high resolution and high settings. Even two years ago the 9600GT couldn't achieve that at release. Whilst TSMC's 40nm process isn't the best, it allows for great die shrinkage and hence a competitive price. This is definitely the best $100 card available.
Re:Once upon a time (Score:4, Insightful)
Crysis today is like Quake and Hexen II when they first came out. It's a game based on a bleeding-edge graphics engine that won't be truly playable (at high quality) on commodity hardware until another generation or two of graphics chipsets come to market.
There are always going to be a few bleeding-edge games that break the rules. Most people who want to play them without breaking the bank will buy the console version. Others will just wait until hardware gets better.
Re:Once upon a time (Score:4, Interesting)
I remember when "high-end" meant $800-1000; this was particularly true when SLI first came out. At its peak, maxing out your graphics capabilities meant spending up to $1400. $500 was maybe the high side of mid-range, and you could get by with low settings on any game with a $300 card. The $300 level was where I spent most of my money, as I couldn't afford the high-end stuff, and there were -always- settings I could not turn on because I didn't have the power. And this was when "high-resolution" was in the 1600x1200 range (at the time of SLI we started seeing higher); monitors that went higher than that were prohibitively expensive.
A $100 card that can max settings on most games with most monitors is pretty significant. Granted, I didn't RTFA, so I don't know if that's exactly what they're saying, but either way it is significant. In fact, a $100 card that can play new games with most settings on high at 1080p, even if it can't max them out, is pretty frickin' impressive. Makes me want to buy a new card.
Re: (Score:3, Funny)
Crysis is shit.
It's a shitty game.
It has shitty AI.
It's buggy as fuck.
And it's so demanding because it's so unoptimized.
Benchmarking your computer against Crysis is like seeing how much feces your new blender can handle.
Re:Once upon a time (Score:5, Insightful)
This reminds me of a conversation I once had with some guy at a (rather geeky) birthday party. I asked him about the SLI setup he'd bought two months ago. He told me that he'd replace it soon because "there are random frame drops when I play a recent game and watch a DVD on the other screen". He was really serious about this. I pretended to be interested for another 3 minutes and left him alone before my urge to punch him in the face became overwhelming.
So in other words: I believe that there will be a market for such cards as long as there are enough clueless people who earn enough money to barely afford them. In my experience this target group is pretty immune to arguments - there is no reason to assume that they'll ever wise up...
Re:Once upon a time (Score:4, Insightful)
This reminds me of a conversation I once had with some guy at a (rather geeky) birthday party. I asked him about the SLI setup he'd bought two months ago. He told me that he'd replace it soon because "there are random frame drops when I play a recent game and watch a DVD on the other screen". He was really serious about this. I pretended to be interested for another 3 minutes and left him alone before my urge to punch him in the face became overwhelming ;)
So in other words: I believe that there will be a market for such cards as long as there are enough clueless people who earn enough money to barely afford them. In my experience this target group is pretty immune to arguments - there is no reason to assume that they'll ever wise up...
Don't get mad, and don't try to convince them otherwise, for heaven's sake. Guys like that are paying for the R&D costs of the uber-high-end cards that you and I enjoy for $100 a few years later.
Re:Once upon a time (Score:5, Insightful)
The whole premise is silly and reeks of someone who has no experience in... well, anything really. As I said, every industry has high-end stuff adopted by a few, which eventually becomes standard and is adopted by the masses. Welcome to the evolution of technology.
I'd think someone on slashdot would at least realize that.
I think the argument here isn't that "there is always a very expensive 'high end' and a more moderately priced and still quite adequate 'mid range'". It's more along the lines of "As technology advances, there ceases to be a 'high end' market for some products."
Look at it this way: when was the last time you bought a dedicated serial I/O card? When was the last time you bought a dedicated sound card, or network card, or FireWire card? All of these are now so trivial that they're ubiquitously built into midrange motherboards, so there is no "high end" market for them any more. TFA is just saying that video cards are next.
Re:Once upon a time (Score:5, Funny)
"...today's technology will become yesterday's technology in a very short period of time."
Yeah, in only one day.
Re: (Score:3, Interesting)
Point is, though, there was probably a time when each generation of scientific calculators came out with more (useful) features or more (meaningful) speed increases, but nowadays what's out there at the lowest price point is probably good enough for practically everyone. Certainly that was the way things were about 10 years ago in the 2d world of graphics cards. The makers kept cranking out drivers and improving the hardware, but I don't think anyone cared, because no-one was waiting around for 2d text to draw faster.
I once had a $300K SGI computer (Score:5, Insightful)
High-end what? (Score:5, Insightful)
I've been 'into computing' since a '286/20 was described as 'lightning fast'. I've never, ever spent more than 100 dollars on a video card. I've always bought last year's high flyer for 60-80 dollars, and I've never hurt for lack of fun games to play at resolutions I ever noticed as a problem.
Last year's CPU on last year's mobo costs 100 dollars for the pair. HDD upgrades sell for 60 dollars - who isn't happy with this? Your average computer lasts about 4 years; by buying 1 year late you get 3/4 of the performance life at 1/4 of the cost, while staying within the target platform range for most of the latest games.
Why is this even a question?
Re:High-end what? (Score:5, Insightful)
I would have said that until 1-2 years ago, the best "value per dollar" for video cards was about at $200. This is how much I spent on my first Voodoo2 card and my Geforce 6800. This past year, I spent less than $100 for a card that is arguably better performance per dollar, relative to the demand of the games on the market. So I would agree, $100 is the old $200 in terms of video cards.
Re:High-end what? (Score:5, Funny)
Re:High-end what? (Score:4, Funny)
I heard it wasn't the size alone, but also the width. Make sure it supports USB 2.0 for the better bandwidth.
Re:High-end what? (Score:4, Insightful)
Let's use ye olde law of Moore:
Let's assume a 6-month release cycle for hardware and games (pretty close to reality - these things do tend to come in batches around twice a year), and average the performance out over 4 years of ownership.
At 1 year old, your shit is at 71% of the new, hot shit.
At 1.5 years old, your shit is at 59%.
At 2 years old, your shit is at 50%.
At 2.5 years old, your shit is at 42%.
At 3 years old, your shit is at 35%.
At 3.5 years old, your shit is at 30%.
At 4 years old, your shit is at 25%.
At 4.5 years old, your shit is at 21%.
At 5 years old, your shit is at 18%.
That's an average (over your 4 years of ownership) of 39%.
If you buy brand new shit:
Brand new, your shit is at 100% of the new, hot shit.
At .5 years old, your shit is at 84%.
At 1 year old, your shit is at 71%.
At 1.5 years old, your shit is at 59%.
At 2 years old, your shit is at 50%.
At 2.5 years old, your shit is at 42%.
At 3 years old, your shit is at 35%.
At 3.5 years old, your shit is at 30%.
At 4 years old, your shit is at 25%.
That's an average (over your 4 years of ownership) of 55%.
If you plan 4 years of ownership (plus some slight overlap at the end) then waiting a year is beneficial if you can save just 29% on the price.
I chose to use specific points and average them, since Moore's law doesn't apply smoothly to retail prices, nor does the desire for performance (that tends to line up with hardware and software releases).
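If you want to sanity-check the math, here's a quick Python sketch (names invented; it just assumes relative performance halves every 2 years, which is what the percentages above follow):

# Relative performance vs. the newest hardware, assuming it halves every 2 years.
def relative_perf(age_years):
    return 0.5 ** (age_years / 2.0)

def average_over_ownership(age_at_purchase, years_owned=4.0, step=0.5):
    # Sample every 6 months (one release cycle) and average, as above.
    n = int(years_owned / step) + 1
    ages = [age_at_purchase + step * i for i in range(n)]
    return sum(relative_perf(a) for a in ages) / n

buy_new = average_over_ownership(0.0)      # ~0.55
wait_a_year = average_over_ownership(1.0)  # ~0.39
print(1 - wait_a_year / buy_new)           # ~0.29: the break-even discount for waiting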
Re:High-end what? (Score:4, Funny)
What is it with all these newbies? 286s at 20MHz? Get away from my Vic20 at 1MHz! My glorious tape drive holds all the knowledge I need! Touch my patch cable and I'LL HUNT YOU DOWN!
GET OFF MY LAWN TOO!
Re:Once upon a time (Score:5, Funny)
I used to have a top-of-the-line 3dfx graphics card. It was all I ever thought I'd need.
I remember when this WHOLE website was nothin' but ORCHARDS; as far as the eye could see.
Re:Once upon a time (Score:4, Funny)
I used to have a top-of-the-line 3dfx graphics card. It was all I ever thought I'd need.
I remember when this WHOLE website was nothin' but ORCHARDS; as far as the eye could see.
ORCHARDS? I remember when it was just a patch of dirt. We had to plant the trees first.
(cue up someone saying they remember the molten rock)
Re:Once upon a time (Score:4, Interesting)
I'm from San Jose, I remember when all the semiconductor and tech company buildings were orchards.
Re: (Score:3, Insightful)
I disagree. I'm not saying this is it, but at some point you reach diminishing returns. I'd say sound cards reached that point several years ago, such that only real audiophiles buy high-end sound cards nowadays and on-board sound is good enough for most people.
I think it's fair to expect graphics cards to reach a plateau at some point as well, and that point may be sooner rather than later. You can only boost the resolution and push more and more polygons for so long before it stops making much difference.
Re: (Score:3, Insightful)
Unfortunately, that's the way it goes. AGP is obsolete.
The sole advantage of AGP was a faster, dedicated bus for graphics. PCI-Express accomplishes this and much more while being significantly faster than AGP was. AGP has gone the way of the dinosaur, and PCI is the new ISA (potentially useful in increasingly specific, niche applications).
Why would a manufacturer cram the latest technology into an obsolete interface? They probably wouldn't recoup, in sales, the costs of reconfiguring for AGP if they did.
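A back-of-the-envelope sketch of the published peak figures (Python; these are the standard theoretical rates, not measurements):

# Theoretical peak bus bandwidth, in GB/s.
agp_8x    = 66e6 * 8 * 4 / 1e9   # 66 MHz base, 8x data rate, 32-bit bus: ~2.1
pcie1_x16 = 250e6 * 16 / 1e9     # 250 MB/s per lane, 16 lanes, each direction: 4.0
pcie2_x16 = 500e6 * 16 / 1e9     # PCIe 2.0 doubles the per-lane rate: 8.0
print(agp_8x, pcie1_x16, pcie2_x16)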
Agreed! (Score:4, Informative)
I recently purchased an Nvidia 9800 for around 129 bucks. It came with two Call of Duty games, so I figure the card itself effectively cost significantly less than that.
It runs everything without so much as a single complaint, on max details.
And is it just me, or does FSAA have little real effect on visual quality? I never have it on, and even with it on (such as in WoW), I can't notice a bit of difference on a 19" LCD monitor. Turning off FSAA can save you tons of money (and framerates!)
Re:Agreed! (Score:5, Funny)
Yes, but turning japanese can save you child support payments. I really think so.
Oh... I see... you accidentally the whole thing.
Re: (Score:3, Funny)
You accidentally the whole topic.
Re:Agreed! (Score:4, Funny)
I accidentally your mum.
Re:Agreed! (Score:5, Informative)
Well, I'm not an expert of any kind, but AFAIK the point of antialiasing is pretty much to compensate for low-resolution displays. If you have a high enough DPI, or a big enough display that you can sit far enough away, then FSAA isn't going to make a huge difference anymore.
Re:Agreed! (Score:5, Informative)
It exists to compensate for rendering artifacts due to rendering points on a regular grid; having more pixels per steradian (whether due to higher resolution or greater viewing distance) doesn't eliminate the artifacts, though it will, for most kinds of rendering artifacts, make them less noticeable. AA tries to eliminate the artifacts by sampling additional points around the "real" location on the grid and blending them to create the actual value rendered for the pixel.
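In rough Python pseudocode (shade() here is a hypothetical function returning the scene colour at a point; a regular n x n grid is just one possible sample pattern):

# Supersampling AA sketch: shade several sub-pixel points and average them.
def pixel_color(px, py, shade, n=2):
    samples = []
    for i in range(n):
        for j in range(n):
            # Place an n x n grid of sample points inside the pixel.
            samples.append(shade(px + (i + 0.5) / n, py + (j + 0.5) / n))
    # Average each colour channel over all samples.
    return tuple(sum(c) / len(samples) for c in zip(*samples))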
Re: (Score:3, Interesting)
Unfortunately, even with 1920x1200 displays, we're still not there yet. Anti-aliasing at a lower resolution will often look better because it smooths out the sharp edges, essentially blurring the image slightly.
Complexity (Score:5, Insightful)
Re:Complexity (Score:5, Funny)
You're obviously wrong. This story is about how a $99 graphics card might be all you need.
It's on the internet, so it must be true.
Re:Complexity (Score:5, Funny)
Best argument for shooting artists I've heard all week!
Re:Complexity (Score:4, Interesting)
You mean, as long as the market supports ever-increasing poly counts etc?
At some point we hit a point of diminishing returns on better graphics units... the human eye can only distinguish so much.
Eventually we'll hit the point where there's simply not enough benefit to be gotten out of an expensive GPU. For me, that time is long past. For others, it may come in the next few years. For a small portion, the 'dreamers', it'll never come... but why would any company spend millions and millions developing new and better chips for such a small market?
Re:Complexity (Score:5, Interesting)
Or until they decide to abandon polys and go to true solid-object geometry. Computing the intersection of a ray and a flat poly is trivial. Computing the shading/reflection/refraction/etc. of a ray against an arbitrary curved surface takes significantly more horsepower.
I remember using a program on the Amiga way back when -- Real3D from RealSoft -- that did this. Excellent rendering, but dog slow compared to Lightwave and some others.
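To see how trivial the poly case is, here's the standard Moller-Trumbore ray/triangle test sketched in Python (vectors as 3-tuples); an arbitrary curved surface needs numerical root-finding instead of this handful of dot and cross products:

def sub(a, b):  return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def dot(a, b):  return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def cross(a, b):
    return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])

def ray_triangle(orig, direction, v0, v1, v2, eps=1e-9):
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:            # ray parallel to the triangle's plane
        return None
    inv = 1.0 / det
    s = sub(orig, v0)
    u = dot(s, p) * inv           # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = dot(direction, q) * inv   # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv          # distance along the ray
    return t if t > eps else None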
Re:Complexity (Score:4, Insightful)
The domination of polygon and SDS workflows in 3D modeling was mostly about convenience for the artist. NURBS were available long before SDS became common; subdivided polygons replaced real curved surfaces simply because they are so much easier to work with. CSG models still survive in markets like CAD, where the generally superior polygon workflows are inadequate.
Re:Complexity (Score:4, Interesting)
I agree. I haven't played Crysis, but I'm on my second time through Far Cry 2, and playability issues aside, the game looks just astounding. E.g.: 1) The human models are so realistic they've descended deep into the Uncanny Valley [msn.com] and creep you out. 2) While the various areas you can go into do have a lot of artificial constraints on where you can walk (cliffs in this case; in the original Doom it was the walls of hallways and rooms), there are plenty of areas that don't, and no fog of war or limited sight distance is needed. I remember there were some hacks to remove the limited sight distance in NWN 1, and it looked okay right up until you started moving around, and then it would make the game laggy and crash a lot. 3) Segregated areas. Again, with NWN 1 and a lot of other similar games you had to segregate various areas to keep the number of polygons manageable, but Far Cry 2 seems to scale things in the distance to a lower level of detail so that it stays manageable. It does have distinct areas, but the transition between them is relatively seamless; you only notice a little stuttering when you cross from one map to the next.
Anyhow, it seems these sorts of games are very close to as realistic as you'd really want before you hit diminishing returns for what can physically be portrayed on a 2-D screen. Now, FC2 could have been a great game if, in addition to the graphics, they had worked on the AI of the soldiers and some decent factions, instead of 100%-accurate-aim, x-ray-vision soldiers who are in separate factions but all hate you, instantly recognize you, and shoot you on sight.
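The distance scaling described above is plain level-of-detail selection; conceptually something like this Python sketch, with made-up thresholds:

# Swap in coarser meshes as objects recede so the polygon budget stays flat.
LOD_LEVELS = [
    (50.0,  "high_poly_mesh"),    # nearer than 50 m: full detail
    (200.0, "medium_poly_mesh"),  # 50-200 m: reduced detail
    (800.0, "low_poly_mesh"),     # 200-800 m: coarse proxy
]

def pick_lod(distance):
    for max_dist, mesh in LOD_LEVELS:
        if distance < max_dist:
            return mesh
    return "billboard"            # beyond that, a flat impostor sprite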
Re:Complexity (Score:5, Insightful)
At some point we hit a point of diminishing returns on better graphics units... the human eye can only distinguish so much.
But we're nowhere NEAR that argument yet. State of the art movie-quality CG is still not quite there, and you are talking rendering times of hours per frame, not frames per second.
Eventually we'll hit the point where there's simply not enough benefit to be gotten out of an expensive GPU. For me, that time is long past. For others, it may come in the next few years. For a small portion, the 'dreamers', it'll never come... but why would any company spend millions and millions developing new and better chips for such a small market?
Graphics are not the only thing a GPU is used for these days. Game physics on the GPU is still in the early stages, and game AI on the GPU is almost non-existent so far. 3D gaming is still pretty new (and will be niche until display technology improves) and (at least) doubles the GPU requirements.
And who's to say that 20-30 years from now we won't be projecting stereo images directly onto your retina, or even your optic nerve? I sure hope that happens at a better resolution than 1920x1200. We are orders of magnitude away from anything, graphics- and physics-wise, that can fool the human brain.
I can't believe there are so many people here who really think a technology like this is "good enough" today. Have a bit of imagination, and it's pretty obvious (to me at least) that we've barely scratched the surface of 3D computer graphics.
Re:Complexity (Score:4, Insightful)
Then it'll move on to rendering more things on screen: games with ten thousand characters on screen at one time, all completely unique, and a landscape with infinite draw depth and nothing popping up. For instance, a tree appears as a single pixel on the horizon, then gets closer and bigger until you have a photo-realistic microscopic view of the bark.
Re: (Score:3, Insightful)
Well we aren't yet to the point where a cheap card can produce completely photorealistic movies in real-time that are completely indistinguishable from real life. Until we get there, I'm sure people will keep pushing those limits.
Once we get there, I'm not sure what will happen. Maybe they'll still want faster cards so they can offload some other kinds of processing (physics? AI?).
A more expensive card == a bigger e-Peen (Score:4, Insightful)
Therefore, no. The high end will not be going away. Some folks will always feel inadequate and seek to compensate.
Could the world of high-end PC graphics go Away? (Score:5, Insightful)
No is the easy answer.
High-end graphics cards are rarely sold because of their real-world in-game performance, which is often insanely high; too high to notice in any game on release, anyway. Nope, in my experience $600 graphics cards are all about bragging rights and benchmarks. It's the same category of people that buy water-cooling and RAM-chip heat-sinks & fans; they just want to squeeze that last 2% of throughput out of their probably insanely overclocked systems for the highest benchmarks possible.
It's actually good fun if you're into that; what you learn from overclocking is quite astonishing. But the super-high-end graphics cards are all part of that game.
Re:Could the world of high-end PC graphics go Away (Score:4, Insightful)
Wait...
Where does the heat in the water go?
Re:Could the world of high-end PC graphics go Away (Score:5, Funny)
Reminds me of the guy in my wife's office who kept a window unit AC sitting (and running) on his shelf. His office had no windows.
Re: (Score:3)
Wait...
Where does the heat in the water go?
Water-cooling systems have a radiator-and-pump setup. Budget setups may have a radiator the size of a 120mm case fan, easily keeping it all 'in-case', while more expensive setups have a radiator the size of the case itself, like this [billzilla.org]. High-end cases these days also tend to feature 'holes' for running the water-cooling piping out to external radiators.
*Really* high-end liquid cooling features full refrigeration systems, vapor-compression setups and whatnot, like this [extremeoverclocking.com], which easily sit well below ambient temperatures.
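The underlying heat budget is simple: at steady state, the water's temperature rise per pass is power over (mass flow times specific heat). A back-of-the-envelope Python sketch with illustrative figures:

# dT = P / (m_dot * c_p); water's c_p is ~4186 J/(kg*K), and 1 L ~ 1 kg.
def coolant_delta_t(watts, liters_per_min):
    return watts / (liters_per_min / 60.0 * 4186.0)

print(coolant_delta_t(200, 4.0))  # ~0.7 C rise per pass for a 200 W GPU
# The radiator's job is then dumping those 200 W into the room's air.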
Re: (Score:3, Funny)
How does that work? You're still producing the same amount of heat. Water cooling just moves it away from the electronics and into the room faster.
Easy, he refrigerates the water so it's colder. Don't see how this is so hard for you to understand...
/s
ATI 4830 is a better deal... (Score:3, Informative)
Second, Newegg lists the ATI 4770 at $109 USD [newegg.com] with 128-bit memory.
Third, the ATI 4830 is a better deal at under $99 [newegg.com] with 256-bit memory.
Re:ATI 4830 is a better deal... (Score:5, Informative)
First, the 4770 runs its GDDR5 at approximately the same base clock as the 4830 runs its GDDR3, so with GDDR5's doubled transfer rate offsetting the narrower bus, they have roughly the same effective memory bandwidth.
Second, while they both have 640 universal shaders, the shaders on the 4770 run ~30% faster.
Third, the 4770 therefore has approximately the same or better performance than a 4850, which costs $130-150.
So I think the 4770 is a deal at $109; the price will probably come down after the initial rush, and the 4830 will disappear.
Re: (Score:3, Interesting)
From the reviews, the 4830 uses 30 watts more power and runs 20 degrees hotter. It is a very good card for the money, but it isn't much faster than the 4770, and the 4770 is new, so it will only come down in price. Both are good choices, but I think the 4770 has more value than you are giving it.
Re: (Score:3, Interesting)
Well, it's hardly clear cut to call the 4830 higher performing.
The 4830 may have slightly better memory performance, but the higher core clock gives the 4770 higher processing performance. Also, quite a lot of the detriment of 128bit memory is made up for by much higher effective clockrates on the 4770's GDDR5 memory. You really can't look at bus width alone, bandwidth is a better measure.
In general, the 4770 vs the 4830 has:
29.7% more FLOPS (960 vs 740 GFLOPS)
11.1% less memory bandwidth (51.2 vs 57.6 GB/s)
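Those two figures fall straight out of the published specs; in Python:

# GFLOPS = shaders * core MHz * 2 ops (multiply-add) per cycle / 1000;
# bandwidth = bus width in bytes * effective memory transfer rate (MT/s).
def gflops(shaders, core_mhz):
    return shaders * core_mhz * 2 / 1000.0

def bandwidth_gbs(bus_bits, mem_mtps):
    return bus_bits / 8 * mem_mtps / 1000.0

print(gflops(640, 750), bandwidth_gbs(128, 3200))  # 4770: 960.0 GFLOPS, 51.2 GB/s
print(gflops(640, 575), bandwidth_gbs(256, 1800))  # 4830: 736.0 GFLOPS, 57.6 GB/s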
Rich kids will always... (Score:3, Interesting)
Blow their money on hundreds upon hundreds of dollars' worth of super high-end processors, super high-end video cards, and super high-end RAM.
They will probably never learn that all those super high-end cards are such a waste of money. IMO the best thing to do is to shoot for the middle, or the low high-end cards at most. In addition, SLI is kind of stupid: you're better off using your money to get one high-end video card. SLI/Crossfire doesn't double performance; it increases it substantially, of course, but it certainly isn't double.
Also, you won't see performance gains on most games for a while on your super-duper high-end card, and by the time you do, your card will be a middle-end card.
With how fast prices drop, the best thing to do is get decent stuff and upgrade it every 1-2 years depending on your budget. Performance-wise, getting a $200 video card every 2 years is better than getting a $600 SLI set of video cards every 4 years.
And this is why I chose a Clevo when I got a gaming laptop: I would rather pay a little extra for a solidly built, upgradable laptop with quad-core support, because it will last longer than a slightly cheaper Dell POS.
Re:Rich kids will always... (Score:4, Funny)
We know that cards like that are a waste of money.
We don't care.
Money comes and goes, but owning some little punk with a sniper rifle in glorious realistic detail and hearing them cry about it in your headphones is worth every penny.
Some people spend stupid amounts of money on cars that they won't even take out into the rain. Some people collect stamps. We collect the bitter tears of gamers who are confined to a budget.
In the market (Score:5, Funny)
not an answer :) (Score:3, Interesting)
Technically, "Nethack at full res" would be the GL ports Falcon's Eye [users.tkk.fi] and its successor Vulture's Eye [lighthouseapp.com]. Despite the oddball names and fancy 3d graphics, these are Nethack. And it probably is possible to find a card that would struggle to run these versions of Nethack (though you might have to go to the used market).
So...your question wasn't actually quite as dumb as you probably intended it to be. Still dumb enough that I won't waste your time or mine by actually answering it, though. :)
cheers
Graphics Will Advance (Score:5, Interesting)
There are many untapped aspects of graphics. Driving multiple screens and multiple viewing angles well is in immediate demand, but really high DPI (dots per inch) has yet to be available to budget PC users. Several years ago, IBM was reported to have monitors with a resolution equivalent to what you find on the printed page. With that kind of density, the contents of a typical small laptop screen would fit into a far smaller area. I don't know if this was CRT technology rather than LCD, but higher resolution could be around the corner.
After 2D, there's 3D, and real time 3D. So keep buying better graphics, and there will be even better graphics coming.
Re:Graphics Will Advance (Score:4, Informative)
You're thinking of the T221 [wikipedia.org]. It's a single 22" LCD with a resolution of 3840×2400 and an initial price of ~$20k.
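Which does work out to near print density (Python, using the T221's published specs):

# Pixel density = diagonal resolution in pixels / diagonal size in inches.
diag_px = (3840 ** 2 + 2400 ** 2) ** 0.5   # ~4528 px
print(diag_px / 22.2)                      # ~204 DPI, vs ~96 DPI for a typical LCD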
Uh, no. (Score:3, Interesting)
Well below 30 FPS average in Crysis 1920x1200 with only 0xAA and 8xAF? No thanks. Why would I buy a card that's underpowered on today's^H^H^H last year's games at far less than max quality?
All you need? Going away? WTF? (Score:4, Insightful)
Clearly not written by anyone very familiar with the graphics requirements of games like Crysis or Far Cry 2. Can you run these games on a budget card? Yes. Is it possible to enjoy them at a lower resolution or frame rate? Quite possibly. Can either of those titles be enjoyed at its maximum potential? No.
There are plenty of idiots who say bigger this, bigger that == bigger e-peen. That is really just stupid. There is a large segment of the gaming population who actually enjoy playing their games the way the designers intended, using PhysX, anti-aliasing, etc. to achieve the full cinematic effect.
This goes for any enthusiast niche market. You have your audiophiles, your car guys, musicians, and artists, the list goes on. Why does a musician want a certain amp or guitar? Is it because he wants his peen to go to 11?
Re:All you need? Going away? WTF? (Score:4, Insightful)
"Can either of those titles be enjoyed at their maximum potential?"
Tetris is way more fun if you turn up the resolution and clipping.
640 stream processors... (Score:3, Funny)
640 stream processors ought to be enough for anybody.
Then it's time for (Score:3, Insightful)
High-end graphics cards went away a long time ago. (Score:5, Informative)
The world of high-end graphics cards went away a decade ago. Evans and Sutherland, Dynamic Pictures, and Lockheed all had graphics cards for PCs in the $1000-$5000 range. Ten years ago, I had a $3000 graphics board from Dynamic Pictures. For a while I had something called a Fujitsu Sapphire graphics board on loan; Fujitsu gave up and exited the business before launching a product. And I'm ignoring SGI here.
The high-end guys were run over by the gamer card industry, which had real volume and was "good enough" for high-end animation tools. "High end" today is a few hundred dollars, not a few thousand.
The big headache for the animation community has been insufficient graphics memory. Gamer cards tended to stress fill rate over texture memory. Nobody in animation cares about frame rate once it passes 30FPS. What you need for animation is plenty of space for big textures. Game textures are shrunk to fit, but that happens late in the development pipeline. During content creation (and for movie and TV work) you need much larger texture maps. A few gigabytes of texture memory would not be too much. For most of a decade, you couldn't get that on PCs. Finally, you can.
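The arithmetic behind that complaint is easy to sketch in Python (uncompressed RGBA; the 4/3 factor is the full mipmap chain, 1 + 1/4 + 1/16 + ...):

def texture_mb(width, height, bytes_per_texel=4):
    # Base image plus mipmap chain, in MB.
    return width * height * bytes_per_texel * 4 / 3 / 2**20

print(texture_mb(1024, 1024))  # ~5.3 MB: a typical shipped game texture
print(texture_mb(8192, 8192))  # ~341 MB: a single film-resolution map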
Diminishing returns (Score:5, Interesting)
I think we've reached a point where
- graphics are no longer a limiting factor in a game's enjoyment. Wireframe spaceships sucked; 100,000-polygon ships instead of 10,000-polygon ones probably don't make a huge difference. On the contrary, too many moving things actually distract. We can go "more lifelike" and blend (pun intended) the boundary between games and films, but still...
- graphics costs are ballooning, both in terms of creating the resource files and programming all the candy/actions. At the same time, attention is moving to other topics (AI...), and budgets are tight.
- there's probably a limit on how big a PC screen, and how small the dots on it, can be. Actually, most LCD screens don't even render all that many colors anyway.
Which explains why nVidia in particular is desperately trying to find other uses for the GPU. They are the only one of the big 3 that doesn't have much else in its portfolio.
Unintended consequences? (Score:4, Insightful)
Real cute, but it never ends (Score:3, Interesting)
It's nice that ATI keeps releasing value-conscious products for cheap gamers, but it is rather short-sighted and sensational to say that "a $99 card is all you need". Ten years ago, a $99 card was enough to play Quake 3 at medium resolution with medium settings. The one thing these graphics companies are good at is marketing. They have figured out how to maximize their sales, and that meant crippling the used resale value of their products to capture the idiotic low-end market. They sell these crippled products in big-box stores to people who don't know better, to get them hooked on the upgrade treadmill. Six months later, Joe Stupid is a budding gamer, wants to play Call of Duty 8, and drops another $99 on that month's cheapo card. After a couple of years, Joe has upgraded 3-4 times, when he could have spent $300 up front for a good card that would still have some fight left in it.
I have seen this cycle far too often. I dunno, maybe these people suck at math, but they're clearly not saving money in the end. Some people are happy with their $99 card and keep it for the lifetime of their PC, but those people would have been just as happy with "free" Intel integrated graphics. Gamers always want more.
That's also why we've seen a ton of movement in the low-end segment, but very little progress at the top end. If you spent $500 on graphics two years ago, you're still within 10-15% of today's $500 graphics solutions, and that's just pathetic.
There will always be a good reason for higher end. (Score:3, Interesting)
One thing that this does is push the game developers to make games with better graphics faster/sooner than they would in the past.
Developers need to look at the low end, the high end, and the average for CPU and graphics power for their target audience. In the past, we would see a ton of Intel garbage graphics in systems, and that was the baseline that developers had to code for. As time has moved on, more and more systems, even with integrated graphics have shown up with NVIDIA graphics on the Intel side, and AMD systems have always had either AMD or NVIDIA graphics, which raises the bar by quite a bit.
With the level of GPU power in a $99 card, it shouldn't take too long for integrated graphics to show a significant improvement over the Radeon 3300 graphics on the AMD 790GX chipset. The question remains how long it will take, and how good or bad the integrated version ends up being.
Now, that raises the bar. While resolutions may not increase, the detail and quality we can run at will go up. Yes, a $100 card may run fine with medium graphics settings, but can you really expect a $100 card to run every game at 1024x768 at max settings and AA? That is the key to why people will buy higher end cards, so they can see games in their full glory.
Not Even Close - Ray Tracing is Coming (Score:3, Interesting)
The specs for DX11, such as they are at this point, call for real-time ray tracing. This will require a massive increase in power that frankly this new card is only starting to get close to being capable of. There's tons of new room to grow here. Perhaps it would be the last DX10 card you'd ever need, but not even close for future use.
That said, there should also be a standardized ray-tracing test in the video benchmark suites. IIRC, there is already a ray-traced version of Quake 4 out.
http://www.idfun.de/temp/q4rt/ [idfun.de]
parent not really a troll (Score:5, Informative)
Re:But their drivers still suck (Score:5, Interesting)
Like Nvidia's are any better. They haven't had flat panel scaling working for I don't know how long.
Re: (Score:3, Informative)
Wow, it's not 2001 anymore. ATI/AMD have monthly driver releases, you very rarely hear about issues on the tech websites, and they're opening up the hardware specifications for open source drivers, which will take time to arrive but at least it's a good move for people who want an open source only desktop.
and yet, their drivers still suck (Score:3, Insightful)
Re: (Score:3, Insightful)
Then why the hell do you keep buying ATI cards?
Re: (Score:3, Funny)
Because the Nvidia ones aren't as good, apparently. ;)
Vacuum your case out... (Score:5, Insightful)
Re:Vacuum your case out... (Score:5, Informative)
I had all the same problems with my Nvidia card, and then I looked at NV Monitor and saw that it was running at 92 degrees Celsius. Turns out the slot cooling fan I was using wasn't helping at all. I removed it, and now I'm at a healthy 62.
Of course, it also just sounds like a defective card, or one that's not seated correctly. ATI cards in the past would sort of work even when they weren't seated correctly.
These days it seems AMD/ATI is putting out better drivers than Nvidia. It's a nice change to see given that I remember a time when it was the other way around.
Re:But their drivers still suck (Score:4, Insightful)
Probably shouldn't be a troll here. I have a $250 high-end Radeon. Bought it along with a new system back in October. From the beginning, it would blue-screen on boot, but only once in a while. Now it's doing it more often (the event log identifies the problem as the ATI driver), it randomly reboots the machine, and currently the machine is in a reboot cycle. Searching on the problem shows it's well known. Suggestions are to upgrade to the newest driver (fails) and disable some feature (fails). Reports of contacting ATI result in "it's Microsoft's fault". Calls to Microsoft result in "it's ATI's fault".
Yea. I agree. No matter the price, if it doesn't work, it doesn't work.
[John]
I can play this too.
Probably shouldn't be a troll here. I have a $250 high-end Geforce. Bought it along with a new system back in October. From the beginning, it would blue-screen on boot, but only once in a while. Now it's doing it more often (the event log identifies the problem as the Nvidia driver), it randomly reboots the machine, and currently the machine is in a reboot cycle. Searching on the problem shows it's well known. Suggestions are to upgrade to the newest driver (fails) and disable some feature (fails). Reports of contacting NVidia result in "it's Microsoft's fault". Calls to Microsoft result in "it's NVidia's fault".
Yea. I agree. No matter the price, if it doesn't work, it doesn't work.
Nvidia is known to pay forum users and the like to post FUD like this.
Ever since AMD bought ATI, the drivers have been improving by leaps and bounds. With AMD/ATI you now get a driver release every month. Their drivers have been completely stable for at least a year or two now, and game support has been growing and solidifying as well. The only game ATI cards struggle with now is UT3; in all the others, the newest line (4850/70/90) thoroughly trounces the equally priced Nvidia card.
Think of it this way: would you rather have the Nvidia 285 for $330, or the 4890 for $230? They perform the same, and drivers are not an issue.
Re: (Score:3, Insightful)
Ah, it's the old "now they're better" argument. My laptop with a Radeon 9600 still can't suspend with the proprietary driver. Sometimes it locks up when I enable an external monitor with their utility (gotta save all my work before trying that one.) Seriously, I hear the same thing about MS and security. If they're living with a reputation they've earned, don't expect that to change overnight. And don't blame users who've gotten bad support, even if their data is a little out of date. If I'm going to get sc
Re:But their drivers still suck (Score:5, Informative)
Have you been ignoring AMD/ATI for the past year?
They've been releasing documentation on most of their chips lately, and the open source drivers have been making good use of it. The open-source 3d drivers aren't as good as the proprietary drivers, but if open-source drivers are a must for you, AMD is clearly the way to go, and has been for quite some time.
Nvidiots are still the same. (Score:4, Insightful)
Sorry, but whether you're an ATidiot or NVidiot, the same is true.
I used an ATi board up until I needed an Nvidia card (to get my old VRStandard shutter glasses usable again). Then NVidia fucked me over by making the "new" 3D glasses driver Vista-only and proprietary to their own fucking brand of glasses, forcing me to choose between running an old driver (which won't work for certain games) or buying $500 in new hardware AND infecting my PC with Vista.
Bottom line is, if you're not doing something like that, you don't really care whether you have NVidia or ATi. Buy whatever is at the "sweet spot" price point. The 4770 for $99 certainly is a great price.
Oh, and one other thing to remember: are you "okay" with playing at 1920x1200? Fuck, man, I remember when 640x480 was stellar, when 800x600 at 30 frames was something to goggle at. To this day I run a 21" CRT monitor that does 120 Hz at 1280x1024, I still have an NVidia 7800GS card (though I'll finally upgrade in a few months... after THREE AND A HALF YEARS on my current rig with no tears shed), and that's all I need.
Does anyone "need" 1920x1200? I doubt it. "High-end" graphics haven't been used by anyone but a few people who look more for bragging rights than fun in gaming for years. Hell, what are you going to play on it anyway? All the MMORPGs are still designed to run on 5-year-old hardware, and anything "intensive" like Crysis is more of a fucking tech demo than an actual playable game. The fun games, except for the MMORPGs, now come out on the consoles first and maybe get a PC port a year later if you're lucky.
Re: (Score:3, Informative)
If free drivers are really a concern to you, you might consider helping out with a project that is working to develop a graphics card that itself is open source.
http://www.opengraphics.org
Consider making a donation to help out developers:
http://linuxfund.org/projects/ogd1/
Re:Nvidiots are still the same. (Score:5, Insightful)
Does anyone "need" 1920x1200? I doubt it. "High-end" graphics haven't been used by anyone but a few people who look more for bragging rights than fun in gaming for years.
I run at 1920x1200, not because I want bragging rights but because that is the native resolution of my monitor, and any non-native resolution looks fuzzy in comparison. The fact that I have a 24" monitor running at a high res may make me a pixel junkie, but that has nothing to do with gaming and everything to do with ordinary apps on my desktop.
Re: (Score:3, Informative)
And if you were using agile tech, rather than static LCD, this wouldn't be a problem!
Re:Nvidiots are still the same. (Score:4, Insightful)
Does anyone "need" 1920x1200? I doubt it.
You don't need a computer at all. You can just sit in a cave and eat whatever crawls below your feet.
Re:Try playing older games (Score:5, Funny)
I still have grill marks on my arm from playing Oregon Trail on my George Foreman grill. I think one of the members of my party got bit by a snake, but I am not sure.
Re:Hey, Jealousy (Score:5, Funny)
"could the world of high-end PC graphics simply go away?" I wish it would! I'm tired of carrying around all this envy directed at people with the kind of coin required to buy top-of-the-line graphics cards. I got a wife and kids to support!
Haha tremble before my single childless income.
...so lonely
Re:Hey, Jealousy (Score:5, Funny)
Meh. Go render yourself a girlfriend. It's what the rest of us do.
I tried that but we have nothing in common so she dumped me.
Re: (Score:3, Insightful)
I expect we'll find that Xbox4000, PS4 and standard PC platform (TM) will be just as common in the study as in the lounge room, and vice versa - the upgrade treadmill will be broken.
I expect