
Cheap New GeForce 8800 GT Challenges $400 Cards (402 comments)

J. Dzhugashvili writes "What would you say to a video card that performs like a $400 GeForce 8800 GTS for $200-250? Say hello to the GeForce 8800 GT. The Tech Report has tested the new mid-range wonder in Crysis, Unreal Tournament 3, Team Fortress 2, and BioShock. It found that the card keeps up with its $400 big brother overall while drawing significantly less power and — here's the kicker — generating slightly less noise."
  • Help me understand. (Score:4, Interesting)

    by IndustrialComplex ( 975015 ) on Monday October 29, 2007 @01:56PM (#21159157)
    I could understand it if this card had been released by a competitor, but why would Nvidia release a card that competes with its own top of the line at such a low price? Who wouldn't want the cheaper card?

    The only thing I can think of is that the production costs were higher for the GTS, resulting in less profit per card...

    Can anyone clue me in?
  • Re:Kicker (Score:5, Interesting)

    by Joce640k ( 829181 ) on Monday October 29, 2007 @02:02PM (#21159245) Homepage
    Half the price and almost the same frame rate is irrelevant?

    Weird.

  • by Lord Ender ( 156273 ) on Monday October 29, 2007 @02:09PM (#21159325) Homepage
    The GTS was to get money from early adopters, and remains on the market to squeeze money out of people who make purchasing decisions based on emotional ("I have the best!") rather than financial considerations. Every other serious gamer will henceforth buy the GT.
  • Overkill? (Score:3, Interesting)

    by lymond01 ( 314120 ) on Monday October 29, 2007 @02:17PM (#21159445)
    I run TF2 at 1680x1050 with a GeForce 6800 GS that came overclocked out of the box. Never skips, never gets busy, no artifacts. My processor is a single-core Athlon (somewhere in the 3.2 GHz range) with 2 GB of memory. It's not a "new" box by any means, but I haven't found a game that doesn't run on full graphics (except FEAR with some of the most advanced features enabled).
  • Re:Yup (Score:3, Interesting)

    by RightSaidFred99 ( 874576 ) on Monday October 29, 2007 @02:25PM (#21159541)
    The news is that this is a leap forward in price/performance. There have only been a few comparable video card releases in history. Typically, it goes "you pay $500 for a high end card, then it goes to $450, then $400, etc...". This is a card that costs $250 (or less) that is almost as fast as a card that costs $400 or more.

    In other words, if you play PC games at 1600x1200 or above, this is the only choice that you really have now - nothing else makes sense unless you're playing on a 30" monitor or want to throw away money.
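
    To put rough numbers on that claim, here's a back-of-the-envelope sketch in C. The prices are the ones from this thread; the 90% frame-rate figure is an assumed placeholder for "almost as fast," so the output is illustrative only:

        /* Back-of-the-envelope frames-per-dollar comparison.
           Prices come from the thread; the frame rates below are
           assumed placeholders, not benchmark results. */
        #include <stdio.h>

        int main(void) {
            const double gts_price = 400.0, gt_price = 250.0;
            const double gts_fps = 60.0;            /* assumed baseline */
            const double gt_fps  = 0.90 * gts_fps;  /* "almost as fast" */

            printf("8800 GTS: %.3f fps per dollar\n", gts_fps / gts_price);
            printf("8800 GT:  %.3f fps per dollar\n", gt_fps / gt_price);
            printf("GT advantage: %.0f%%\n",
                   100.0 * ((gt_fps / gt_price) / (gts_fps / gts_price) - 1.0));
            return 0;
        }

    Even granting the GTS a 10% frame-rate edge, the GT would deliver roughly 44% more frames per dollar at these prices.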

  • by Anonymous Coward on Monday October 29, 2007 @02:25PM (#21159543)
    It depends. With CPU/memory-intensive work, I've found the interface of a machine can get extremely unresponsive with i945 onboard chipsets. I haven't tried any of the ?3x00 chipsets from Intel yet, so I don't know how far across the board this goes.

    But when I was doing data analysis on a few genomes, I would have killed for *any* discrete graphics card. In the end, I got an ancient PCI card and snuck it in the box when no one was looking. By specs, it couldn't compete with the onboard video, but I didn't get all the delays.

    *shrug* I've noticed the issue on Windows, Linux (to a lesser extent), and FreeBSD (to a much lesser extent), so I suspect it has more to do with the drivers. Still, good card + bad drivers + no better drivers = not a good card as far as the user experience goes.
  • by TellarHK ( 159748 ) <tellarhk@NOSPam.hotmail.com> on Monday October 29, 2007 @02:27PM (#21159573) Homepage Journal
    I would have initially assumed that nVidia, fearing a threat from ATI, would have taken a different route to get the big bucks: drop the price on existing cards by a hundred bucks or so, use the G92 that powers the 8800GT in an "8850GT" to differentiate it on price and features, and charge a premium for it in the $350 range.

    This way, early adopters don't feel like they got screwed into mild feature obsolescence by a card that costs half as much; people wanting the upgrade see more reason to buy the 8850GT because "Hey, it's a xx50 model - new features and still cheaper than the one I already bought!" without the lingering ball-ache of a $150 price drop; and you've still got cards that devastate the existing ATI lineup, with the potential to say "Well, screw ATI, now we're releasing the 8850GTS and GTX with more RAM, more monster cooling, and both higher and lower prices" if ATI/AMD comes out with something that actually competes, unlike the HD 2900.

    However, I do believe that nVidia is going to take a few pretty big steps toward more powerful cards before Christmas. The only real question is how much more power, and how much will they cost? The GTS and GTX models that are out now still dominate and command a high price that's out of reach for a lot of people. Are they just going to drop both existing models like a hot potato as soon as the new ones come out, or multi-tier with them somehow?
  • by krelian ( 525362 ) on Monday October 29, 2007 @02:45PM (#21159847)
    I bought an 8800 GTS 320MB about a year after it came out; I don't consider that being an early adopter. Now Nvidia brings out a better card for a lower price. That's very different from the usual price drop on older cards.
  • Re:half price (Score:3, Interesting)

    by TellarHK ( 159748 ) <tellarhk@NOSPam.hotmail.com> on Monday October 29, 2007 @02:50PM (#21159921) Homepage Journal
    I've gotta disagree with that. You want a card with a good memory setup for things like Photoshop, simply because you're pushing such huge images around on your screen. You don't want a $20 card, but a $50 card will likely do the trick.

    I worked at a newspaper for a while, and they had a number of Pagemaker 7.5 machines running with some really lousy on-board graphics that had been sold as "high-spec" by the con artist who'd provided tech support for the place for years. It took me bringing in my old GeForce 4 MX and dropping it into my workstation for the publisher to realize he'd been scammed and order three more cards from Newegg for all the production machines. The speed increase with these cards was gigantic - it felt to the production manager like she'd been given a whole new computer on her desk. $150 later, flipping between pages of broadsheet layout went from 20 seconds to instantaneous on three boxes, probably saving several work hours per week.
  • by vux984 ( 928602 ) on Monday October 29, 2007 @02:56PM (#21160015)
    However, this would be an excellent time for nVidia to start letting Intel use SLI on chipsets.

    Meh, I'm unconvinced SLI is anything more than marketing hot rods to idiots. I think this is the dual 3dfx Voodoo2 (Monster 3D II) all over again. If the next generation of cards can do in a single slot what today's cards need two or more in SLI for, then 99% of consumers will just wait for the next card, and only the twits who need/want the bragging rights of an SLI rig will go for it.

    I doubt any games are ever going to require an SLI setup.

    In any case, think back to the 3dfx Monster stuff and recall how that panned out. Instead of everyone needing an array of video cards to run the latest games, the whole dual-card thing was rendered obsolete because a single next-gen card could beat a dual-Monster setup at half the price.

    And look at what's happening in CPUs: virtually nobody has a quad-socket motherboard, and even dual sockets are a rare niche product, yet we've had support for them on the desktop since 2000. Instead, the trend has been toward multi-core CPUs. The cost-benefit just isn't there for multiple-socket CPUs or multiple-card video solutions. However, if they can do "SLI on a single board"... that will be your next-generation solution.

    My 0.02 on the subject...
  • by Brit_in_the_USA ( 936704 ) on Monday October 29, 2007 @06:01PM (#21162855)
    It was rumored pre-release that the G92 might have double-precision floating-point support. Is there any confirmation or firm denial of this?
    (The reviews I have seen have been far less technical about new chip features than those for previous graphics card launches.)
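
    As it turned out, G92-based boards report CUDA compute capability 1.1, and native double precision only arrived with the later compute 1.3 parts. For anyone who wants to check their own card, here's a minimal sketch using the CUDA runtime API's cudaGetDeviceProperties call (the device index 0 and the error handling are my own assumptions, not something from the reviews):

        /* Minimal sketch: query the compute capability of CUDA device 0.
           NVIDIA ties native double precision to compute capability 1.3
           or higher; G92-based boards report 1.1. */
        #include <stdio.h>
        #include <cuda_runtime.h>

        int main(void) {
            cudaDeviceProp prop;
            if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess) {
                fprintf(stderr, "No CUDA-capable device found.\n");
                return 1;
            }
            printf("%s: compute capability %d.%d\n",
                   prop.name, prop.major, prop.minor);
            printf("Native double precision: %s\n",
                   (prop.major > 1 || (prop.major == 1 && prop.minor >= 3))
                       ? "yes" : "no");
            return 0;
        }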
