512MB GeForce 6800 Ultra Reviewed

Timmus writes "If you thought the $500 GeForce 6800 Ultra and $550 Radeon X850 XT PE were excessive, wait until you see nVidia's GeForce 6800 Ultra 512MB: it officially retails for $999.99! Firingsquad has a review of the card manufactured by BFG. They ran tests with 6 different configurations (including a pair of 512MB cards running in SLI) with widescreen benchmarks at 1980x1200 as well."
This discussion has been archived. No new comments can be posted.

  • I might wait.... (Score:5, Insightful)

    by Anonymous Coward on Tuesday May 17, 2005 @05:43PM (#12560371)
    til next year.

    Then buy a PS3.

  • by winkydink ( 650484 ) * <sv.dude@gmail.com> on Tuesday May 17, 2005 @05:44PM (#12560384) Homepage Journal
    A grand for a video card? A grand? All I can say is some folks have more dollars than sense, but that's just MHO.

    A mirror of the print version is here [networkmirror.com] and a mirror of the full article is here [networkmirror.com]
    • But hey... (Score:3, Insightful)

      by 8086ed ( 876715 )
      That means it's only $2000 for the _graphics cards_ in a top of the line SLI rig... this month.
    • by geeber ( 520231 ) on Tuesday May 17, 2005 @05:56PM (#12560560)
      Yeah, but it comes with a t-shirt! That makes all the difference!
    • All I can say is some folks have more dollars than sense, but that's just MHO.

      Maybe you should be blaming the company for the price, not the consumer. After all, it's the company that set it.
      • Companies set their prices based on what will give them the highest profit. In order for NVidia to profit by selling $1000 video cards, someone has to buy them.
      • If people weren't willing to pay that high a price, they wouldn't retail it at that price, and if nobody buys it, the price will drop substantially in short order. You also need to take into account the cost of R&D and production of the card itself. There's really nobody to 'blame' here. Still, as of this point in time, one thousand dollars for a graphics card is too much. Hell, I spent $150 on a 9700, and it's suiting me very well. It's not top of the line, but it still makes the
    • by mikael ( 484 ) on Tuesday May 17, 2005 @06:31PM (#12560960)
      That's how much graphics accelerator cards used to cost back in the mid-1980s - and they didn't even do texture mapping or 3D.

      Hercules Graphics Station Card = $750

      + 2Mbyte VRAM + PROM chips = $200
    • by Erpo ( 237853 ) on Tuesday May 17, 2005 @08:49PM (#12562096)
      All I can say is some folks have more dollars than sense, but that's just MHO.

      I remember when the "high end" cards were priced around $200, and that wasn't very long ago at all.

      From the article:
      It employs the same six-pin power input you'd expect on any other high-end PCI Express graphics card, and the board sports a very similar active cooler for its graphics processor.

      I also remember when graphics cards didn't require a loud, whining fan to keep from catching on fire, not to mention a secondary power connector direct from the PSU.

      What really gets me, though, is how normal FiringSquad tries to make it sound. It employs the same six-pin power connector and "active cooler" you'd expect. No, I don't expect that. It's bizarre. It's wrong.

      Gaming isn't about faster and faster hardware performance. It's about games.

      As far as I can tell, the only way out of this mess is to buy used hardware and games two or three years after they're released. By that time, the bugs are ironed out and your friends have already emptied their wallets figuring out what's worth playing.
    • Re: (Score:3, Insightful)

      Comment removed based on user account deletion
  • $999? (Score:5, Funny)

    by Anonymous Coward on Tuesday May 17, 2005 @05:44PM (#12560389)
    Do you now buy the computer as something to run the graphics card on, rather than vice-versa?
  • Damn. I'm thinking this is a very small niche market.

    Except for scientific applications and video work, what can use this?
  • by composer777 ( 175489 ) * on Tuesday May 17, 2005 @05:45PM (#12560408)
    A complete waste of money. For an extra $500 you get maybe 1 or 2 fps. What I find strange is that FiringSquad is split over whether or not readers should buy it. The whole review seems to be a better benchmark of how much of an industry shill FiringSquad is than of the graphics card itself.
    • From TFA it appears that you don't even get that much -- in many cases the 512MB card is slower than a considerably cheaper 256MB card.

      It strikes me that the 512MB card may be of use to someone (e.g. scientific visualization?) who can find a use for all that video RAM ... but that would be it.
    • Impartial reviews will never result from vendor-donated hardware. Do not bite the hand that feeds you.

      The parallels between Thresh's FiringSquad and MS / Sun / Red Hat's bought-and-paid-for style reviews are somewhat disturbing.

      -- RLJ

    • Well, the hardware does have a smidgen of future-proofing in it, since you can be fairly sure that 512MB will be enough memory to run games for at least a couple of years, and games that need the full 512MB WILL run worse on the much cheaper 256MB card.

      Of course you could probably buy the 256MB card now and upgrade to a 512MB in a couple of years and end up paying less for both cards combined than you will for this card alone. $1,000 really doesn't make much sense, except that the price will undoubtedly
  • by Some_Llama ( 763766 ) on Tuesday May 17, 2005 @05:46PM (#12560414) Homepage Journal
    Well for Longhorn and Quake4 I think this is now the minimum? Or is it 2 of these in an SLI setup?

    I'm still saving up for the 4way multi-core CPU minimum requirement =/
  • by aliens ( 90441 ) on Tuesday May 17, 2005 @05:47PM (#12560430) Homepage Journal
    This card costs $999 with 512MB of DDR3; someone tell me how much the Xbox 360 comes with?

    See where I'm going with this? Just how big of a loss are Sony and MS willing to take with their consoles this time around? I mean either way the consumer wins out big.

    Even by the time winter rolls around you're not going to see this card or its 256MB version for $50.
  • No one (Score:5, Funny)

    by Mad Ogre ( 564694 ) <ogre@ m a d o gre.com> on Tuesday May 17, 2005 @05:49PM (#12560456) Homepage
    No one needs that much graphics processing... *looks at Longhorn* Nevermind.
  • 3 PS3s (Score:5, Interesting)

    by mnmn ( 145599 ) on Tuesday May 17, 2005 @05:49PM (#12560457) Homepage
    So for that price, I can buy 3 PS3s, or a PS3 with a large TV, or a PS3 with LOTS of titles.

    I have a GeForce4 Ti, and wonder why I would need more GPU power anyway. HL2 and Doom 3 run fine, and seem to be limited more by memory and CPU bandwidth than by triangle-pushing.

    There's a major lack of a dedicated physics processor right now. Given the nice placement of GPU cards... on a high-bandwidth bus off the northbridge, I'd say put the physics chip on the video card. Otherwise on a PCIX card.

    Anyone care to comment where a card like this Geforce will be REQUIRED?
    • So for that price, I can buy 3 PS3s, or a PS3 with a large TV, or a PS3 with LOTS of titles.

      Not without a time machine, you can't.
      • Re:3 PS3s (Score:3, Informative)

        by goneutt ( 694223 )
        Take the $2000 and buy a CD from the bank. I'm not talking John Tesh plays the songs that kill dogs.

        Or better, pay off part of your credit card. Saving 20% interest works better than making 4% interest. Ben Franklin's credit card maxim. (Rough numbers below.)
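        A minimal sketch of that maxim (Python; one year of simple interest on a $2000 lump sum, ignoring compounding, fees, and taxes - the rates are just the ones quoted above):

            # Put $2000 toward a 20% APR credit card balance vs. a 4% bank CD.
            principal = 2000.0
            interest_avoided = principal * 0.20   # card interest you no longer owe
            interest_earned = principal * 0.04    # what the CD would have paid
            print(f"interest avoided by paying down the card: ${interest_avoided:.2f}")
            print(f"interest earned by the CD:                ${interest_earned:.2f}")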
    • Re:3 PS3s (Score:5, Funny)

      by Sponge Bath ( 413667 ) on Tuesday May 17, 2005 @05:55PM (#12560533)
      So for that price, I can buy 3 PS3s, or a PS3 with a large TV, or a PS3 with LOTS of titles.

      Or 3 nice [see note] hookers.

      Note: The kind without a penis.

    • Re:3 PS3s (Score:5, Funny)

      by EulerX07 ( 314098 ) on Tuesday May 17, 2005 @06:01PM (#12560606)
      Around my parts, you could also buy around 50 cases of 24 beers. That's 1200 beers, enough to make your SNES look like the best machine ever for the next 100 days. You'll even get 8X AA/AF at no cost, and tons of gaussian blur.

      Then you'll need a new kidney.
    • Congrats - my GeForce 3 Ti 200 didn't run Doom 3 particularly well. The GeForce 6800GT I upgraded to did, though.

      That said, lottery win notwithstanding, I would never drop a grand on a graphics card just to get more RAM on it.
    • A PS3 that still outputs to a low resolution, low refresh television [okay okay, and possibly a much nicer HD TV].

      The main feature of this card is driving big Apple displays at about 12 times the resolution... Fairly different audiences, even though both will likely be playing games.

      This card will likely be required in the same places the Quadro was required: big rendering houses for animation and LARGE picture work, and for game devs looking to make a game for "common" hardware 3-4 years from n
    • By the time you can actually buy a PS3, these cards won't be $999.
    • You could actually buy, or come close to buying, a new PC (maybe sans monitor) with a video card capable of handling most current games...

      Or you could buy a PS3 and a not-quite-so-bloody-expensive-but-still-damn-good video card...

      Maybe they're just hoping that by offering an obscene initial price the cards will seem really spectacular. A few rich fanboys will buy 'em, then they can dump the price and others will think they've become a good deal...
    • Re:3 PS3s (Score:3, Informative)

      by Some_Llama ( 763766 )
      "Theres a major lackage of a physics processor right now. Given the nice placement of GPU cards... on a high bandwidth bus of the northbridge, I'd say put the physics chip on the video card. Otherwise on a PCIX card."

      Someone is developing something like this, it will be a seperate add-in card, but sounds interesting

      http://www.megagames.com/news/html/hardware/physi c sdedicatedhardwaresoon.shtml [megagames.com]

      Although this article is a bit old, not sure if it is still in the works or not...

      "I have a geforce4ti, and w
  • by stratjakt ( 596332 ) on Tuesday May 17, 2005 @05:50PM (#12560470) Journal
    The price tags just don't justify what you get in return. So in order to make the "bling ding" cards attractive, they quietly drop support for "obsolete" hardware; that is, you don't see any bug fixes or software features being added to ATI's Catalyst drivers for the 9x00 series anymore.

    On top of that, those "obsolete" cards haven't gotten any cheaper as new products usurp them. The 9800 I saw on the shelf last weekend still cost as much as when I bought mine a year ago.

    So far all signs point to the next gen of consoles being pretty much on par, visually, with the greatest crap that ATI and nVidia churn out.

    It's really hard to see the point of PC gaming anymore. What's it got that consoles don't? Online gaming with annoying, mouthy 14-year-olds? Check. Overpriced titles and half-baked content delivery mechanisms? Check. Half-finished products that require patches and updates to work correctly? Check.

    For what this card costs, I could get a jillion-inch widescreen high-def DLP set to hook my PS3 and XBox 360's up to.

    Just posting to keep the "pc gamer" vs "console gamer" wars going strong. It's fun to watch dweebs and simps fight.
    • For what this card costs, I could get a jillion-inch widescreen high-def DLP set to hook my PS3 and XBox 360's up to.

      No, you couldn't. I agree though, consoles are coming in as much better value for money.
    • I thought both ATI and nVidia were supplying chips for the next-gen consoles. They probably don't make as much money per console, but they won't be out that much business unless both console and PC gaming die out.

      Keep in mind that the new consoles won't come out until late this year at the earliest, more likely some time in 2006.
    • PC gaming has mouse control for FPS, real-time strategy, and the option NOT to buy the $999 gfx card...
    • Half finished products that require patches and updates to work correctly?

      This whole "patches are bad" argument sucks for one reason--it assumes that console games are always bug free. But they're not. MVP Baseball 2004 came out with a fairly big bug on Xbox, PS2, and PC (left handed hitters had a serious lack of power). The PC version got patched. The PS2 version never did (I don't know about XBox). So why is the fact that the PS2 version can't get patched a good thing?

      • Yes, but here we're talking about driver patches. If a console needed one, it would probably be a huge pain in the arse (BIOS patch or hardware fix, perhaps). But with a PC, one of the major issues is the sheer number of different hardware options. On a console, games are built towards the hardware... which will always be the same (barring legacy support such as PS1 games on a PS2, etc).

        The game will always *know* what the hardware is, and during testing they can catch more errors. On a console, the vendor can't te
    • This new card is for a small market segment I like to call "suckers". ATI, nVidia, and the publishers of games know this. New games are and will continue to be accessible to anyone who's willing to spend about $1000 every 2 years on computer parts. Why not put out a card for those with more money than sense?

      PC gaming may die off, but it'll be cheap off-the-shelf PC equivalents that resemble the PS3 or 360 that'll kill it. All they need is MS Office 360 edition and the like. Next gen systems are a soft
    • Right, they're killing it... Sure. Whatever you say.

      It might be a pain on the wallet if any titles actually required anything that expensive. But they don't and never will, because, well, a game wouldn't sell if most people couldn't afford the hardware to run it.

      No, what they're doing is capitalizing on the people that for one reason or another just absolutely must have the latest, greatest, and most (expensive), despite all sensibility.

      This is the same type that buys Rolexes, when a Timex would do jus
    • "On top of that, those "obsolete" cards haven't gotten any cheaper as new products usurp them. The 9800 I saw on the shelf last weekend still cost as much as when I bought mine a year ago.
      "

      I've got a Riva TNT2 that still runs the latest drivers as this new $1k card. Still gets performance enhancements from newer drivers too. Not as often, but its not uncommon to see a few more fps after the occasional driver upgrade.

      As for prices coming down, Nvidea GeForce FX 5200 AGP8X 128MB DDR is $60 on froogle. I
      • A GeForce 5200 is $60 for a reason. That POS may support DX9 features in hardware, but the GeForce Ti4400 will outperform it even when emulating those features via cool drivers. I want to get a GeForce 5900XT, because those guys should be roughly $150 CDN right now. I'd love to buy a GeForce 4 Ti4800 or 4400, on the premise they'd be about $100 CDN or $80 CDN. The lowest-priced card I can find that will perform better than a GeForce 5900XT or Ti4800 is a GeForce 6600GT. They are $300 CDN for the AGP ve
    • How about a keyboard and a mouse? And how about the ability to download and play any game you like without the need for a modchip?

      How about the ability for the rest of the family to watch TV whilst you play your video games?

  • It is twice the price, but offers equivalent (and in some cases worse) performance compared to cards with half the memory, because those cards have faster memory.

    Show-offs only need apply, for now.
  • Dual-link DVI (Score:4, Insightful)

    by Anonymous Coward on Tuesday May 17, 2005 @05:51PM (#12560485)
    The idea is that anyone with enough money to buy one or two of these 512MB cards is also planning to use a nice display. Thankfully, BFG had the foresight to employ two dual-link DVI connectors, each of which supports resolutions up to 2048x1536 at 85Hz. You'll get away with up to 1920x1080 at 60 Hz using the single-link port featured on 256MB Ultra cards. But if you really want to go big, Apple's 30-inch Cinema HD display, for instance, requires a dual-link DVI output for operation (BFG's product manager makes the clarification that the 30-inch Cinema HD is not supported in SLI mode, though). Previously, this was a feature only available on high-end Quadro cards, so including it with the GeForce 6800 Ultra is a big deal for graphics professionals.

    I don't think the 30-inch Cinema HD display is supported on this overpriced card's dual-link output either. According to Apple, the optimum resolution [apple.com] of the 30-inch HD display is 2560 x 1600 pixels. The let's-drop-a-grand card supports a maximum of 2048 x 1536 (according to the article). Do the people who spend the money on these things expect blurriness? (Some rough pixel-clock arithmetic below.)
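    A minimal back-of-the-envelope sketch (Python) of where these limits come from, for anyone curious: 165 MHz and 330 MHz are the nominal single-link and dual-link DVI pixel-clock limits, while the blanking-overhead factor is an assumption, so treat the figures as rough.

        # Rough pixel-clock estimates for a few display modes, compared against
        # the nominal DVI single-link (~165 MHz) and dual-link (~330 MHz) limits.
        # The 1.12 factor approximates reduced-blanking overhead (an assumption).
        SINGLE_LINK_MHZ = 165.0
        DUAL_LINK_MHZ = 330.0
        BLANKING = 1.12

        modes = [
            (1920, 1200, 60),  # typical 24" widescreen panel
            (2048, 1536, 85),  # the maximum the article quotes for this card
            (2560, 1600, 60),  # Apple 30" Cinema HD native resolution
        ]

        for width, height, refresh in modes:
            clock = width * height * refresh * BLANKING / 1e6
            if clock <= SINGLE_LINK_MHZ:
                verdict = "fits single-link"
            elif clock <= DUAL_LINK_MHZ:
                verdict = "needs dual-link"
            else:
                verdict = "exceeds dual-link"
            print(f"{width}x{height}@{refresh}Hz -> ~{clock:.0f} MHz ({verdict})")

    By this estimate, 2560x1600 at 60 Hz needs roughly the same pixel clock as 2048x1536 at 85 Hz, so the dual-link ports have the raw bandwidth; whether the driver actually exposes that mode is a separate question.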
  • by TripMaster Monkey ( 862126 ) * on Tuesday May 17, 2005 @05:51PM (#12560490)


    All I can say is that for a grand, this card better blow me and make me toast in the morning.
  • Someone like John Carmack or Pixar might want to tinker around with this kind of bleeding-edge technology, but there are tons of kids out there who will end up buying this card so they can play their Halo 2 and Ultimate Marbles.

    In a few months the price will drop to less than half, and BFG, LeadTek, or Asus will release the same board but with 1GB of RAM.
  • For that price I'd rather get a used [ebay.com] onyx [sgi.com].
  • We all know the VGA, SVGA resolutions. My question is: who comes up with these screen resolution combinations? How far up can you go in pixels on one screen?

    It seems to me the graphics chip guys are pushing the MBs on the cards instead of the resolution they put out. I wonder why?
    • My monitor is a 23'' LG2320, with a native resolution of 1920x1200. I play WoW and Guild Wars at that resolution with excellent results on my 128MB 6600GT.
    • The Dell 2405 LCD is 1920x1200... I have one and it is sweet. Will probably pair it with another one or a 1600x1200 LCD since some games don't scale well to widescreen. For programming, the extra screen space is very useful. Plus a little picture-in-picture from my MythTV box makes it even better.

    • by eddy ( 18759 )

      I think the really interesting question is: didn't FSAA come a little late to the scene, considering the ridiculous resolutions we can now play our games at?

      Everywhere you go you'll see websites benchmarking at 1920x1200 4xFSAA 16-tap and I'll just go... what the hell?

      Anti-Aliasing made a hell of a lot more sense to me back at 320x200 to 800x600... but maybe that's just me. I'm sure we'll have 16x FSAA at 8192x6160 too, and everyone will say it's da bomb! "How can you play without anti-aliasing? Don't y

  • Most Obvious Use. (Score:5, Insightful)

    by Kaenneth ( 82978 ) on Tuesday May 17, 2005 @05:56PM (#12560549) Journal
    Game Developers.

    If you are starting a new, state-of-the-art game now, then by the time you get it out the door, this level of video card will be standard, built into motherboards. Almost every PC game company in the world will need a few of these for testing, if nothing else.
  • Turns out.. (Score:5, Informative)

    by slicenglide ( 735363 ) on Tuesday May 17, 2005 @05:56PM (#12560556)
    I read the article. The card didn't do that great against ASUS's 256MB card; in fact, in most of the tests the ASUS 256MB card did better. ATI got blown away in pretty much all the tests.
    • Re:Turns out.. (Score:3, Interesting)

      I thought entry models always underperform a bit, until they tweak them up with drivers, balance things out, and improve the logic, eventually ending up with the "better and newer" version.

      When the first DirectX 9.0 graphics cards came out, they underperformed and were still 'evolving' (=buggy) compared to their matured DirectX 8.1 end-models. It's the way it goes.

  • Should read 1920x1200, not 1980x1200.
  • by RichM ( 754883 )
    Slashdotted already, anyone got a Mirrordot/Coral link?
  • So who wants to hack this into the Mac Mini mezzanine slot?
  • $1000 for a video card when Dell is selling entire desktop systems [dell.com] for $299 now.
  • Excess (Score:5, Insightful)

    by Gilmoure ( 18428 ) on Tuesday May 17, 2005 @06:34PM (#12560988) Journal
    Now, I drive a big block Chevy. I understand the need for more power and performance than sanity admits. But, with this card, are you actually getting more performance? I know I am with my engine mods. Or is this just a big dick exercise in marketing?
  • 512mb.. (Score:5, Interesting)

    by fenrisjlk ( 841357 ) on Tuesday May 17, 2005 @06:45PM (#12561092)
    What do people not get? Seriously, it's not the amount of VRAM on the card that matters, but the speed of the GPU. I'd rather spend that grand on two equally powerful cards, or a dual-GPU card.
  • by CatOne ( 655161 ) on Tuesday May 17, 2005 @06:57PM (#12561183)
    You can pay an extra $500 for the card, and there is ZERO performance advantage WHATSOEVER.

    None.

    Zero, zilch, nada.

    Their only note is "well, with all that RAM, perhaps tomorrow's games will take advantage of it!"

    Thing is, in 1 year, you'll be able to get a card with 512 MB of RAM, which is 2x as fast as this card, for $399. In 2 years, that same card will be $199. So there is ZERO advantage to getting it now, because nothing can use it, and by the time technology *can* use it, it will be old hat.

    82% Rating? These guys are on the take.
  • by Dread Pirate Shanks ( 860203 ) on Tuesday May 17, 2005 @07:13PM (#12561310)
    The only reason I can see to justify buying a 512MB video card for gaming (the workstation benefits should be far greater, but this is not a workstation card) is to run Doom 3 at the Ultra quality setting without SLI. The textures in Ultra mode total more than 256MB, so a card without that much memory takes drastic performance penalties. If FiringSquad wanted to show off the capabilities of the card, they should have tested Doom 3 at Ultra settings with a single card; there the performance gain for the 512MB card should actually be something to talk about. (Rough texture-memory arithmetic below.)

    Nonetheless, even if you justified buying the card on the grounds that you don't need SLI, chances are you still have to upgrade your motherboard to PCI-E, and you still spend $1000 on video cards without the gain in performance you'd get from two graphics processors.

    But hey, at least you're ready for Half-Life 3.
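    Some rough, purely illustrative numbers on why uncompressed "Ultra quality" textures can blow past a 256MB card while compressed settings fit - a sketch in Python; the texture count and sizes are assumptions for illustration, not actual Doom 3 asset data.

        # Very rough texture-memory estimate: uncompressed RGBA8 vs. ~4:1
        # block-compressed textures, including approximate mip-chain overhead.
        def texture_mb(width, height, bytes_per_pixel):
            base = width * height * bytes_per_pixel
            return base * (4 / 3) / (1024 * 1024)  # full mip chain adds ~1/3

        num_textures = 80  # assumed number of 1024x1024 textures resident at once
        uncompressed = num_textures * texture_mb(1024, 1024, 4)  # RGBA8, 4 B/px
        compressed = num_textures * texture_mb(1024, 1024, 1)    # ~DXT5, 1 B/px

        print(f"uncompressed: ~{uncompressed:.0f} MB")
        print(f"compressed:   ~{compressed:.0f} MB")

    With those made-up but plausible numbers, the uncompressed set lands well over 256MB while the compressed set fits comfortably, which is the general shape of the trade-off the parent describes.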
