
Positive Reviews For Nvidia's GeForce 6800 Ultra 564

Sander Sassen writes "Following months of heated discussion and rumors about the performance of Nvidia's new NV4x architecture, today their new graphics cards based on this architecture got an official introduction. Hardware Analysis has posted a first look at the new GeForce 6800 Ultra and takes it for a spin with all of the latest DirectX 9.0 game titles. The results speak for themselves: the GeForce 6800 Ultra is the new king of the hill, beating ATI's fastest by over 100% in almost every benchmark." Reader egarland adds "Reviews are up on Firing Squad, Tom's Hardware, Anandtech and Hot Hardware." Update: 04/14 16:54 GMT by T: Neophytus writes "HardOCP have their real-life gameplay review available."
  • by Seoulstriker ( 748895 ) on Wednesday April 14, 2004 @12:16PM (#8860693)
    I am really quite impressed with the performance of the 6800. Across the board, the 6800 delivers nearly twice the performance of the current top-of-the-line cards. Going from 4x2 pipes to 16x1 was definitely worth it for nVidia, as their shading performance is simply astounding! Halo actually runs incredibly well on the 6800, getting 2x-3x current performance. (Quick fill-rate math at the end of this comment.)

    Now, as DooM 3 is supposedly being released alongside the 6800, can we expect DooM in mid-May? This is truly an incredible day for PC gaming, as we will have cinematic computing in the near future.

    I'm giddy. :-)
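
    For what it's worth, the "nearly twice" claim above lines up with simple fill-rate arithmetic. Here is a rough sketch in Python (the clock speeds are approximate retail figures, not authoritative specs, and real performance also hinges on memory bandwidth and shader workloads, so treat this as back-of-the-envelope only):

    ```python
    # Theoretical pixel fill rate = pixel pipelines x core clock.
    # Clocks below are approximate retail figures, not authoritative specs.
    cards = {
        "GeForce 6800 Ultra": (16, 400e6),  # 16x1 pipeline configuration
        "GeForce FX 5950":    (4,  475e6),  # 4x2 (4 pipes, 2 TMUs each)
        "Radeon 9800 XT":     (8,  412e6),  # 8x1 pipeline configuration
    }

    for name, (pipes, clock_hz) in cards.items():
        gpixels = pipes * clock_hz / 1e9  # gigapixels per second
        print(f"{name}: {gpixels:.1f} Gpixels/s theoretical")
    ```

    That works out to roughly 6.4 vs. 3.3 Gpixels/s against the 9800 XT, which is about the 2x gap the benchmarks show wherever fill rate is the bottleneck.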
  • It's HUGE (Score:5, Interesting)

    by silas_moeckel ( 234313 ) <silas@@@dsminc-corp...com> on Wednesday April 14, 2004 @12:17PM (#8860703) Homepage
    OK, this card has great specs, etc., but did you look at the thing? It's taking up at least one PCI slot for the fan and another for its intake. This thing should have just come with water cooling out the back. Granted, its specs look great, but I do have to ask: will it drive that IBM T221 LCD display that hits 204 DPI at 22"? That's about the only thing I can think of that really would do the card justice.
  • Impressive! (Score:4, Interesting)

    by cK-Gunslinger ( 443452 ) on Wednesday April 14, 2004 @12:17PM (#8860709) Journal

    I must admit, after looking at the benchmarks from Tom's and Anand's earlier this morning, I am *very* impressed by the results of this chipset. I still have concerns about the cooling and power requirements, as well as the image quality, but that may be partly related to my newfound ATI fanboy-dom. ;-)

    Speaking of which, I can't wait to see what the boys from Canada have coming next week. 16 pipelines? Mmmm....

  • Re:nvidia's back (Score:4, Interesting)

    by ackthpt ( 218170 ) * on Wednesday April 14, 2004 @12:17PM (#8860711) Homepage Journal
    These are the guys that managed to crush every single other player into the ground...

    Is it considered "safe" to buy any of the Nvidia chipset motherboards, or are they still pretty sketchy?

  • by Seoulstriker ( 748895 ) on Wednesday April 14, 2004 @12:19PM (#8860732)
    They are comparing the latest nVidia GPU to the 9800XT, which is several months old. When ATI's next-gen chip comes out (two weeks?), only then will we be able to see who holds the GPU speed crown.

    I don't think so. The first ATi card to be released will be a 12x1-pipe version, while the first nVidia card will be a 16x1-pipe version. ATi seriously underestimated what nVidia was planning and moved the production schedule of its 16x1-pipe version five months ahead. ATi was scared s***less, and for good reason, as we found out today.
  • by Guspaz ( 556486 ) on Wednesday April 14, 2004 @12:21PM (#8860755)
    ATI's next-gen offering is to be launched at about the same time as nVidia's GeForce 6800, and we haven't seen reviews of it yet.

    I'd wait until the Radeon X800 benchmarks are out before crowning a new king. For all we know, ATI's new offering will beat the new GeForce.
  • Re:nvidia's back (Score:3, Interesting)

    by LqqkOut ( 767022 ) on Wednesday April 14, 2004 @12:27PM (#8860824) Journal
    The damn thing still won't fit into a Shuttle case... It'd be nice if they said something about noise. [H] [hardocp.com] is /.'ed too; I wonder what they have to say.

    I've been a hardcore nVidia follower for years, but after last year I was left with a bad taste in my mouth. I'm glad to see another generation of video cards, and I can't wait to see what ATI's got to offer - it's been a while since nVidia has had to play catch-up.

    Yea! More horsepower for Doom ]|[ (only 2 more months!)

  • Re:Fanboyism (Score:4, Interesting)

    by Nogami_Saeko ( 466595 ) on Wednesday April 14, 2004 @12:31PM (#8860866)
    Well said! The amount of "e-penis" bickering that surrounds video cards is legendary, but the fact of the matter for me is that I buy whatever is fastest with the best quality at any given time (assuming relatively stable drivers, of course). Of course, price figures into it as well. I'm not going to pay a huge premium for a card unless it's significantly better than the competition. A few extra percent on a benchmark simply won't open my wallet any wider.

    Had an NVidia GeForce2 when it was at the top of the pile a few years ago, picked up an ATI 9700 Pro when it was released. May go back to Nvidia, may stay with ATI (shrug).

    In the long run, all of us consumers benefit from some healthy competition. Granted, as a Canuck, I'm happy to see ATI do well - but they also earned it. When the 9700 Pro was released, ATI blew Nvidia out of the water. Nvidia had grown a tad complacent, and they paid for it.

    Now we'll see what happens with Nvidia having a fast new card and ATI about to release their new offering in a few more weeks.

    N.
  • by egarland ( 120202 ) on Wednesday April 14, 2004 @12:33PM (#8860890)
    ATI is supposed to announce the R420 soon. They've had some time to redesign too. I switched to ATI in the last round of upgrades and was very happy, so I'll need a good reason to switch back. So far nVidia has given me one, but ATI could take it away with a decent new product.
  • Re:Its HUGE (Score:3, Interesting)

    by afidel ( 530433 ) on Wednesday April 14, 2004 @12:34PM (#8860905)
    I take it you haven't seen some of the games out now with LOTS of eye candy? Silent Storm is absolutely amazing looking, even on crappy settings; I turned on all the eye candy for a while just to look at it, but my lowly GF3 can barely do 1 FPS. People with the newest Radeons and GeForces can't run it at high resolution with everything cranked. This is an engine that Nival obviously designed for a seriously long lifespan. Oh yeah, and the AI processing eats my 1.2GHz Athlon for breakfast. I think this game is going to make me finally upgrade my PC =)
  • by fitten ( 521191 ) on Wednesday April 14, 2004 @12:43PM (#8861010)
    From actually reading TFAs, most reviewers say that the card is surprisingly quiet and has acceptable noise levels, in spite of the large, scary heatsink/fan/heatpipe.
  • 2d Performance (Score:5, Interesting)

    by phorm ( 591458 ) on Wednesday April 14, 2004 @12:49PM (#8861054) Journal
    Which actually brings me to a good question: graphics cards have been improving in fast 3D rendering performance, but are often not that great at crisp 2D rendering (compare an NVidia card to a Matrox and see what I mean).

    How well does this one do at 2D rendering? I play 3D games a lot, but that doesn't mean I want my web browsing and other non-3D activities to be sub-par.
  • Re:2d Performance (Score:5, Interesting)

    by Paladin128 ( 203968 ) <aaron&traas,org> on Wednesday April 14, 2004 @12:53PM (#8861091) Homepage
    That used to be true, but the gap is closing. Most GeForce FX cards have pretty fantastic RAMDACs. Yeah, the Parhelia does look a hair better (but only on your primary monitor, if you're using more than one with it). NVIDIA beats Matrox in price, performance, and driver quality.

    Besides, I'm not going to be using an analog output for too long... DVI kills the whole "2D quality" argument; the color values are passed digitally via a TMDS transmitter. Doesn't matter if...
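
    A toy illustration of that DVI point (this is a made-up noise model, not real signal engineering): over analog VGA each color value becomes a voltage, picks up noise along the cable, and gets re-quantized at the display, while over DVI the 8-bit value arrives bit-exact.

    ```python
    import random

    def analog_roundtrip(value, noise_mv=5.0):
        # 8-bit value -> 0..0.7V VGA level, plus Gaussian noise, re-quantized
        volts = value / 255 * 0.7 + random.gauss(0, noise_mv / 1000)
        return max(0, min(255, round(volts / 0.7 * 255)))

    def digital_roundtrip(value):
        # DVI/TMDS carries the bits themselves; barring link errors, exact
        return value

    pixels = [random.randrange(256) for _ in range(10000)]
    analog_err = sum(abs(v - analog_roundtrip(v)) for v in pixels) / len(pixels)
    digital_err = sum(abs(v - digital_roundtrip(v)) for v in pixels) / len(pixels)
    print(f"mean error per channel: analog={analog_err:.2f}, digital={digital_err:.2f}")
    ```

    The exact numbers are meaningless; the point is that the analog path always has a nonzero error floor, while the digital path reproduces the framebuffer values exactly.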
  • A little early still (Score:1, Interesting)

    by Anonymous Coward on Wednesday April 14, 2004 @12:55PM (#8861114)
    While this is impressive, I expect the architecture to hit a high point when they release a version based on a 90nm process and a NATIVE PCI-Express connection, none of this custom bridge crap. The drivers will also be more mature by then, and any unforeseen performance bottlenecks should be alleviated.

    Neither ATi nor Nvidia is being conservative on power or heat with their high-end graphics cards. Arguing over this point is moot, because you're simply not going to get this performance any other way. The solution, if you really want it, is to have less powerful processors with fewer transistors. I'm not opposed to that, since I use a 100% fanless desktop (external power supply, too), but that's just me.
  • my next computer (Score:3, Interesting)

    by WormholeFiend ( 674934 ) on Wednesday April 14, 2004 @01:04PM (#8861198)
    I now fully expect to have to build my next PC around a video card, with the rest of the hardware being peripheral to the VPU and its board/heatsink.

    Crazy.

    I bet in a few generations more, home PCs will have fans so big, you'll be able to drive them around the house and mow the lawn, too!
  • 16 pipelines. (Score:3, Interesting)

    by flaming-opus ( 8186 ) on Wednesday April 14, 2004 @01:21PM (#8861421)
    The top-of-the-line card is always cool to drool over, and a few people with too much money will undoubtedly run out and buy this monster. However, the mid-range and budget derivatives are generally much more interesting. (Compare the number of GF5600/RA9600 cards sold to the number of GF5950/RA9800 cards sold.)

    They made this thing haul ass by doubling the number of pipes, but the first thing they'll do when they put out a mid-range card is halve or quarter that number. How much has been done to refine this design, and how much impact will it have for those of us with $150 to spend on a video card?
  • by Zathrus ( 232140 ) on Wednesday April 14, 2004 @01:37PM (#8861605) Homepage
    It's not like those other games are using the hardware shaders yet anyway (or are they?).

    They are -- FarCry is probably the most intensive game out there right now, fully utilizing the DX9 specs. Halo is no slouch either, although a lot of its speed issues come from wanting to use hardware that simply isn't present on PCs (it is on the Xbox; why they didn't port away from this is beyond me).

    Aquanox 2, Tomb Raider: Angel of Darkness, Painkiller, UT2k4, BF: Vietnam, and several others utilize DX9 to varying degrees as well. And there are the upcoming games -- Half-Life 2, STALKER, Soldner (with an umlaut on the o), World of Warcraft, Everquest 2, and numerous others.

    Quake 3 simply isn't a reliable benchmark anymore. It utterly fails to exercise the newer features of the cards -- which are really the only features worth upgrading for. If all you're going to do is play Q3-era games, then a GeForce2 is more than sufficient. If you want to run games already out, and those coming out in the next year, with all the graphical options turned up and at high res, then you'll be best served by either the latest nVidia or (probably) ATI card.

    And (most importantly to me, and many others) if you want to get a card that can run new games at reasonable resolutions with most of the graphical bells and whistles on, but at a reasonable price... well, those $400 cards are going to be sub-$200 very quickly now, and the $200 cards are going to drop to around $100.
  • by willy_me ( 212994 ) on Wednesday April 14, 2004 @02:00PM (#8861961)
    The chipsets are all very similar. It's the external components, filters and such, that determine quality. Matrox has a good reputation because they use high-quality components. Same with ATI. NVidia has a poor rep because of all the different card makers trying to undercut each other with cheaper components. The same is true for the clone ATI boards.

    So long as you have a quality graphics card, it really doesn't matter whose chipset is powering it. For example, even though NVidia has a poor rep, there are still high-quality NVidia cards out there.

  • Thoughts (Score:3, Interesting)

    by fullofangst ( 724732 ) on Wednesday April 14, 2004 @02:24PM (#8862280)
    Ahh well, this is nice to see - a new generation of graphics card that will let me play practically any of my games at up to 1600x1200 without gameplay-affecting slowdown. So far, so good.

    I am genuinely happy that Nvidia have released a product that performs 'significantly' better than their current flagship card. As ATi are going to retaliate with their own card, this can only be a good thing, and I hope they actually keep up this large performance jump for the next generation(s).

    One thing to note in some of the benchmarks I've seen so far is that some results give only the maximum framerate of a game. I'd be happier reading either an average or a minimum achievable framerate, as in a frenetic multiplayer game you are usually rendering a lot more than in single player. The minimum framerate is what I'll be watching for, since that is where the most frustration comes from - nothing is quite so annoying as slowdown when something critical happens, or when you're in the middle of a hellishly large battle (which happens quite a bit in UT2004 Onslaught, for example). (A quick sketch of why the max number misleads follows this comment.)

    Unfortunately, I won't be able to use this card in my Shuttle. The card is too big and too power-hungry. As someone else said, noise isn't exactly a problem, since you'd generally be using this card to play loud, fancy games anyway.

    And recommending a 480W power supply? Hmm. Oh well, I wish I were a hardware-site journalist under NDA; I'd have had time to buy some shares in Enermax ;)
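
    The point above about maximum vs. minimum framerate is easy to make concrete. A small sketch with hypothetical frame-time data (the numbers are invented for illustration): a run that is smooth 90% of the time but chokes during a big battle looks fine by its max or average FPS and terrible by its minimum.

    ```python
    # Hypothetical per-frame render times (ms): 900 smooth frames,
    # then a 100-frame spike during a hellishly large battle.
    frame_times_ms = [10] * 900 + [50] * 100

    instant_fps = [1000 / t for t in frame_times_ms]
    avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000)

    print(f"max FPS: {max(instant_fps):.0f}")  # 100 -- the flattering number
    print(f"avg FPS: {avg_fps:.0f}")           # ~71 -- still looks fine
    print(f"min FPS: {min(instant_fps):.0f}")  # 20  -- what you actually feel
    ```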
  • Re:Fanboyism (Score:1, Interesting)

    by Anonymous Coward on Wednesday April 14, 2004 @03:06PM (#8862746)
    There's an OBVIOUS place to go:

    Freaking reduce the power requirements for that kind of performance...

    No WAY am I picking up a 600W+ power supply, with all the power draw that implies, just so I can get 10 more fps at a resolution where I can't see the difference in the first place.

    I have ZERO interest in this product because of that 120W draw. Also the likely $500+ price tag, of course.

    Geez. Do you doinks realize how incredibly dumb this all is?
  • by Skuld-Chan ( 302449 ) on Wednesday April 14, 2004 @03:57PM (#8863216)
    Are you serious? Every ATI video card I've ever had has had serious quality control problems. If it wasn't the drivers, it was the physical hardware.

    I bought a 9500 Pro a year ago and I've only ever been able to use it for a month. I'm on card number 4 now because of a flaw in the way the heatsink/shim was made (something their customer service reps admit to). I was so burned by the 9500 that I could honestly never bring myself to buy another ATI card for as long as I live, much in the same way it would be hard to bring yourself to stick your finger in a light socket. The third card (which came straight from ATI), the one I gave to my brother, was DOA - it had garbage all over the screen long before we even tried to install the drivers.

    It's not the only ATI card I've had problems with - the Rage 128 had the worst drivers on earth, and the Radeon 8500 drivers gave me delayed write failures on my HDD (search Google for this - it's a pretty funny problem, especially if you work in tech support like me).

    I went out and bought an NVidia 5900 and I'll never look back. It's been the most problem-free video card I've ever owned since I got into computers (I haven't had any problems with it at all, actually).
  • by Anonymous Coward on Wednesday April 14, 2004 @06:26PM (#8864091)
    Does anybody know if the built-in video processor (which can do MPEG-2 and MPEG-4 compression) is likely to be usable in Linux? Tom's Hardware mentions that recent Nvidia cards have also had a video processor. Are these supported in Linux?
