Cheap New GeForce 8800 GT Challenges $400 Cards
J. Dzhugashvili writes "What would you say to a video card that performs like a $400 GeForce 8800 GTS for $200-250? Say hello to the GeForce 8800 GT. The Tech Report has tested the new mid-range wonder in Crysis, Unreal Tournament 3, Team Fortress 2, and BioShock. It found that the card keeps up with its $400 big brother overall while drawing significantly less power and — here's the kicker — generating slightly less noise."
Help me understand. (Score:4, Interesting)
The only thing I can think of is that the production costs were higher for the GTS, resulting in less profit per card...
Can anyone clue me in?
Re:Kicker (Score:5, Interesting)
Weird.
Re:Help me understand. (Score:4, Interesting)
Overkill? (Score:3, Interesting)
Re:Yup (Score:3, Interesting)
In other words, if you play PC games at 1600x1200 or above, this is the only choice that you really have now - nothing else makes sense unless you're playing on a 30" monitor or want to throw away money.
Re:$200-250 is NOT cheap! (Score:1, Interesting)
But when I was doing data analysis on a few genomes, I would have killed for *any* discrete graphics card. In the end, I got an ancient PCI card and snuck it in the box when no one was looking. By specs, it couldn't compete with the onboard video, but I didn't get all the delays.
*shrug* I've noticed the issue on Windows, Linux (to a lesser extent) and FreeBSD (to a much lesser extent), so I suspect it has more to do with the drivers. Still, good card + bad drivers + no better drivers = not a good card as far as the user experience goes.
Re:Help me understand. (Score:3, Interesting)
This way, early adopters don't feel like they got screwed into mild feature obsolescence by a card that costs half as much. People wanting the upgrades see more reason to buy the 8850GT because "Hey, it's an xx50 model - new features and still cheaper than the one I already bought!" without the lingering ball-ache of a $150 price drop. And you've still got cards that devastate the existing ATI lineup, with the potential to say "Well, screw ATI, now we're releasing the 8850GTS and GTX with more RAM, more monster cooling, and both higher and lower prices" if ATI/AMD comes out with something that actually competes, unlike the 2900HD.
However, I do believe that nVidia is going to take a few pretty big steps in regards to more powerful cards before Christmas. The only real questions are how much more power, and how much will they cost? The GTS and GTX models that are out now still dominate and command a high price that's out of reach for a lot of people. Are they just going to drop both the existing models like a hot potato as soon as new ones come out, or multi-tier with them somehow?
Re:Help me understand. (Score:3, Interesting)
Re:half price (Score:3, Interesting)
I worked at a newspaper for a while, and they had a number of Pagemaker 7.5 machines running with some really lousy on-board graphics that had been sold as "High-Spec" by the con artist offering tech support for the place for years previously. It took me bringing in my old GeForce 4 MX and dropping it into my workstation for the publisher to realize he'd been scammed, and he ordered three more cards from Newegg to go into all the production machines. The speed increase with these cards was gigantic - it felt to the production manager like she'd been given a whole new computer on her desk. $150 later, the time to flip between pages of broadsheet layout went from 20 seconds to instantaneous on three boxes, probably saving several work hours per week.
Re:Help me understand. (Score:4, Interesting)
Meh, I'm unconvinced SLI is anything more than marketing hot rods to idiots. I think this is like the dual 3dfx Voodoo Monster II all over again. If the next-generation cards can do in a single slot what today's cards need two or more in SLI for, then 99% of consumers will just wait for the next card, and only the twits who need/want the bragging rights of an SLI unit will go for it.
I doubt any games are ever going to require an SLI setup.
In any case, think back to the 3dfx Monster stuff and recall how that panned out. Instead of everyone needing an array of video cards to run the latest games, the entire dual-card thing was rendered obsolete because a single next-gen card could beat a dual Monster setup for half the price.
And look at what's happening with CPUs... virtually nobody has a quad-socket motherboard, and even dual sockets are a rare niche product. Yet we've had support for it on the desktop since 2000. Instead, the trend has been toward multi-core CPUs. The cost-benefit just isn't there for multiple-socket CPUs or multi-card video solutions. However, if they can do "SLI on a single board"... that will be your next-generation solution.
My $0.02 on the subject...
Double Precision Floating Point Support? (Score:3, Interesting)
(the reviews I have seen have been far less technical on new chip features than in previous graphics card launches).