Nvidia's GF100 Turns Into GeForce GTX 480 and 470

crazipper writes "After months of talking architecture and functionality, Nvidia is finally going public with the performance of its $500 GeForce GTX 480 and $350 GeForce GTX 470 graphics cards, both derived from the company's first DirectX 11-capable GPU, GF100. Tom's Hardware just posted a comprehensive look at the new cards, including their power requirements and performance attributes. Two GTX 480s in SLI seem to scale impressively well — providing you have $1,000 for graphics, a beefy power supply, and a case with lots of airflow."
  • by NotSoHeavyD3 ( 1400425 ) on Saturday March 27, 2010 @12:38AM (#31636820) Journal
    I mean, look at it like this. You can probably get a card for $120-$150 now that will run every current game well (well, except for Crysis), so there's no point in buying a $500 card for current games. You could get the $500 card hoping that it will run future games well, but it never seems to happen that way: future games are slow no matter what old card you have. Instead you can just buy another $120-$150 card in a few years and that one will run them well; this way you end up spending less money and actually get better performance. So my experience is: just buy a decent card ($120-$150), and in a few years buy another one and do whatever with the old one (sell it, give it to a family member, whatever).
  • I want (Score:2, Interesting)

    by Anonymous Coward on Saturday March 27, 2010 @01:47AM (#31637214)

    a 40nm 9800GT with 80W TDP. The 9800 is fast enough for my needs and has been for 2 years now. Less heat. Less power. Less noise. A 150W video card has absolutely no appeal to me.

  • by Anonymous Coward on Saturday March 27, 2010 @02:16AM (#31637322)

    The only real reason I wanted a Fermi / GTX 480 card was to experiment with GPGPU and finally get reasonable performance from double-precision algorithms, which my 8800 GT can't do. Now I find that they've crippled the double-precision performance to something like 1/4 of the hardware's actual capability, just to price-gouge the developers who want that capability as opposed to just playing video games. So as it stands, the AMD 5870 is roughly 2/3 the price, has about 4x the double-precision performance, runs a fair bit cooler, and has been available for many months as well. I think I'll have to pass on the Fermi / GTX 4xx series cards until they come to their senses and make a product that is fully competitive, in this regard, with the much older and much cheaper AMD 58xx series.

    I don't really see why NVIDIA would think it is reasonable to cripple DP performance for market segmentation reasons, as if DP weren't a mainstream necessity for consumer and small-business computing. Every CPU out there has had a DP FPU for decades now (and wouldn't if it weren't useful for absolutely ordinary tasks), and OpenCL, DirectCompute, HDR, etc. are all technologies that benefit greatly from DP and are being pushed heavily for mainstream multimedia, image processing, and ordinary PC application performance.

    It is hardly esoteric HPC-level stuff these days. Actually, the real question is why it has taken so long to get quad precision / long double / whatever standardized into the languages and compilers (C, C++, the CLR) and into CPUs and GPUs; it would have been a logical progression around the time things went 64-bit, or earlier (for different but analogous reasons).

    Now if only AMD's drivers and OpenCL implementations weren't quite so bad...
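
    To make the double-precision complaint concrete, here is a minimal CUDA sketch of the sort of micro-benchmark one could use to estimate a card's DP-to-SP arithmetic ratio. It is purely illustrative and not from the original poster: kernel names, launch dimensions, and iteration counts are arbitrary choices for this example, and a dependent FMA chain is only a rough proxy for peak throughput.

    // dp_ratio.cu -- rough probe of single- vs. double-precision FMA throughput.
    // Build with: nvcc -O2 dp_ratio.cu -o dp_ratio
    #include <cstdio>
    #include <cuda_runtime.h>

    // Each thread runs a long chain of dependent fused multiply-adds; the result is
    // written out so the compiler cannot eliminate the loop.
    __global__ void fma_float(float *out, int iters) {
        float x = 1.0f + threadIdx.x * 1e-3f;
        for (int i = 0; i < iters; ++i)
            x = x * 1.000001f + 1e-6f;
        out[blockIdx.x * blockDim.x + threadIdx.x] = x;
    }

    __global__ void fma_double(double *out, int iters) {
        double x = 1.0 + threadIdx.x * 1e-3;
        for (int i = 0; i < iters; ++i)
            x = x * 1.000001 + 1e-6;
        out[blockIdx.x * blockDim.x + threadIdx.x] = x;
    }

    int main() {
        const int blocks = 256, threads = 256, iters = 100000;
        float *buf_f;
        double *buf_d;
        cudaMalloc((void **)&buf_f, blocks * threads * sizeof(float));
        cudaMalloc((void **)&buf_d, blocks * threads * sizeof(double));

        // Warm-up launches so clock ramp-up does not skew the timings.
        fma_float<<<blocks, threads>>>(buf_f, 1000);
        fma_double<<<blocks, threads>>>(buf_d, 1000);
        cudaDeviceSynchronize();

        cudaEvent_t start, stop;
        cudaEventCreate(&start);
        cudaEventCreate(&stop);
        float ms_float = 0.0f, ms_double = 0.0f;

        cudaEventRecord(start);
        fma_float<<<blocks, threads>>>(buf_f, iters);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);
        cudaEventElapsedTime(&ms_float, start, stop);

        cudaEventRecord(start);
        fma_double<<<blocks, threads>>>(buf_d, iters);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);
        cudaEventElapsedTime(&ms_double, start, stop);

        printf("float: %.2f ms, double: %.2f ms, DP slowdown: %.2fx\n",
               ms_float, ms_double, ms_double / ms_float);

        cudaFree(buf_f);
        cudaFree(buf_d);
        return 0;
    }

    If the quarter-rate figure above is right, the slowdown this reports on a GTX 480 would come out several times worse than the silicon's native DP rate would suggest (error checking is omitted for brevity).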

  • by tirefire ( 724526 ) on Saturday March 27, 2010 @03:00AM (#31637442)

    I mean, look at it like this. You can probably get a card for $120-$150 now that will run every current game well (well, except for Crysis).

    Crysis came out in Q3 2007. It's not really a current game anymore. Its use as a benchmark for video card performance is frustrating because it's an incredibly inefficient game engine. Don't get me wrong, it looks beautiful... but so do games that will run at twice the frame rate on the same system.

    So my experience is: just buy a decent card ($120-$150), and in a few years buy another one and do whatever with the old one (sell it, give it to a family member, whatever).

    Right on. This is what I used to do until spring of 2007, when I bought an nVidia 8800 GTS 320 MB to play STALKER. That card continues to serve me well with any game I throw at it. I was expecting to need to upgrade it in 2009, but I never did... new games kept running great on it. I've had that card for almost exactly THREE YEARS now and it still amazes me. I've never had any piece of computing hardware that did that.

    Changes in graphics card features and speed were really taking place at a white-hot pace between about 2003 and 2007. Those years saw the introduction of cards like the Radeon 9800, the GeForce 6800, and the GeForce 8800. All of those cards totally smashed their predecessors (from both nVidia AND ATI) in benchmarks. It was even more amazing than the CPU world from 1999 to 2004, when clock rates were shooting through the roof and AMD embarrassed Intel with the introduction of the 64-bit Hammer core (Athlon 64).

  • by bertok ( 226922 ) on Saturday March 27, 2010 @03:14AM (#31637482)

    And you know who buys the top-of-the-line, super expensive cards? Pretty much no one. Everyone else either buys a mid-range card or last year's top of the line. Both of those will last you a few years, and the all-around computer cost is less than a console.

    Don't believe me that consoles are more expensive? I'm a PC gamer (who occasionally plays console games) and a friend of mine is a console gamer (who occasionally plays PC games). He tries to use your argument that upgrading your computer is expensive, yet he ignores the fact that 1) console games virtually never go down in price, whereas PC games drop in price very quickly after the first few months, and 2) consoles nickel-and-dime you to death. We actually sat down and did the math one time and for his Wii, 360, PS3 and enough controllers for 4 players on each, it came out to over $2,500 for just the console hardware. You can easily buy two very good gaming systems for less money over the course of the lifespan of a console generation.

    So no, people don't turn to consoles because they're cheaper, people turn to consoles because they can't do basic math.

    Actually, people do buy the super expensive cards, and it's often not a bad deal.

    I got myself an NVIDIA 8800 GTX when it first came out. It ran super hot and cost me quite a bit, but it was the fastest single-card/single-chip 3D accelerator on the market for something like a year, and even when faster cards came out, the difference was something like 10% for a long time.

    In the end, however, it was cheaper for me to buy a very good card once and keep it for a couple of years than to repeatedly buy older-model cards at a lower price to be able to play the latest games.

    I could play Crysis just fine at 1920x1200 when it was first available, which was pretty much only possible on that card or an SLI system, unless you enjoyed playing the "Crysis slideshow". If I had an older model card, I'd have been forced to upgrade.

  • We actually sat down and did the math one time and for his Wii, 360, PS3 and enough controllers for 4 players on each, it came out to over $2,500 for just the console hardware. You can easily buy two very good gaming systems for less money over the course of the lifespan of a console generation.

    So you can buy two PCs (that can have one, or at most two people playing at once) or you can buy three consoles and enough peripheral hardware to have four people playing at once on each console and... consoles are more expensive?

    Consoles are also more convenient. Turn it on. Put in a disc, or load a game off the hard drive. Play. Turn it off. Easy.

  • by im_thatoneguy ( 819432 ) on Saturday March 27, 2010 @04:04AM (#31637642)

    Maybe. But that assumes that your GPU is just being used to render DX or OpenGL games.

    I think Nvidia made a very wise business decision with Fermi. Right now there is NO DEMAND for a video card on Fermi's level. All of the popular games run at full quality in full HD with AA. There is no "Crysis" that nobody can run at a decent framerate. We've sort of plateaued at "good enough," since most games are cross-developed for consoles (which are running aging video cards) and PC. Both AMD and Nvidia have released gaming cards that are overkill. So Nvidia has decided to take a different tack: they've managed to release a gaming card that is competitive with the very best video card for gaming and also redesigned their cores to be fast GPGPUs.

    In the AnandTech review, the GTX 480 is 2x-10x faster than the GTX 285 or Radeon 5870 in compute workloads.

    That might not do much for Modern Warfare 2, but Modern Warfare 2 already runs great. It will offer huge performance improvements in things like video encoding, Photoshop, or any other CUDA-ready application.

    As OpenCL gets used more in games for things like hair and cloth simulation or ray-traced reflections, Nvidia will have an architecture ready to deliver that as well. At some point AMD is going to need to go through a large re-architecture too, but the longer they wait, the more likely they'll be trying to push out a competing product while the competition is fierce. If there is ever a time to deliver an average product and suffer huge delays, it's during an economic downturn and a period where there is little reason to upgrade.
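
    As a concrete, purely illustrative example of the kind of work being moved onto the GPU here, the sketch below shows a position-Verlet integration step for a particle system, the basic building block of simple cloth and hair simulations. It is written in CUDA rather than OpenCL for brevity; the structure, names, and constants are invented for this sketch, and a real engine would add spring constraints, collision handling, and double buffering.

    // verlet_step.cu -- one GPU physics tick for a simple particle/cloth system.
    // Build with: nvcc -O2 verlet_step.cu -o verlet_step
    #include <cstdio>
    #include <cuda_runtime.h>

    struct Particle {
        float3 pos;       // current position
        float3 prev_pos;  // position from the previous step (Verlet state)
    };

    // Position Verlet: x(t+dt) = 2*x(t) - x(t-dt) + a*dt^2, one thread per particle.
    __global__ void verlet_step(Particle *p, int n, float dt) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;

        const float3 g = make_float3(0.0f, -9.81f, 0.0f);  // gravity only, for brevity
        float3 cur = p[i].pos;
        float3 next;
        next.x = 2.0f * cur.x - p[i].prev_pos.x + g.x * dt * dt;
        next.y = 2.0f * cur.y - p[i].prev_pos.y + g.y * dt * dt;
        next.z = 2.0f * cur.z - p[i].prev_pos.z + g.z * dt * dt;

        p[i].prev_pos = cur;
        p[i].pos = next;
    }

    int main() {
        const int n = 1 << 16;          // 65,536 particles
        const float dt = 1.0f / 60.0f;  // one 60 Hz frame

        // Start every particle at rest on a 256x256 grid, 10 m above the ground.
        Particle *h = new Particle[n];
        for (int i = 0; i < n; ++i) {
            float3 p0 = make_float3((i % 256) * 0.01f, 10.0f, (i / 256) * 0.01f);
            h[i].pos = p0;
            h[i].prev_pos = p0;
        }

        Particle *d;
        cudaMalloc((void **)&d, n * sizeof(Particle));
        cudaMemcpy(d, h, n * sizeof(Particle), cudaMemcpyHostToDevice);

        // Simulate one second of free fall, one kernel launch per frame.
        int threads = 256, blocks = (n + threads - 1) / threads;
        for (int frame = 0; frame < 60; ++frame)
            verlet_step<<<blocks, threads>>>(d, n, dt);

        cudaMemcpy(h, d, n * sizeof(Particle), cudaMemcpyDeviceToHost);
        printf("particle 0 height after 1 s: %.2f m\n", h[0].pos.y);

        cudaFree(d);
        delete[] h;
        return 0;
    }

    The point is simply that each per-particle update is independent, which is exactly the shape of work these wide GPU architectures are built for, whether it is expressed in CUDA, OpenCL, or DirectCompute.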

  • by beleriand ( 22608 ) on Saturday March 27, 2010 @08:48AM (#31638722)
    The previous-gen Nvidia cards don't do DX11 and thus were running this benchmark in DX10 mode, which is kind of misleading of THG. While they explained it in the text, they should have made separate charts for the two modes to make it clear at first glance. There is a comparison of image quality in the Unigine benchmark (DX10 vs. DX11) out there somewhere, and the difference is night and day.
  • by Hadlock ( 143607 ) on Saturday March 27, 2010 @09:09AM (#31638838) Homepage Journal

    I got halfway through the first paragraph before I started looking for the link to the L4D2 benchmarks, which are a pretty good indicator of how well your computer is going to run L4D2, TF2, and, very importantly, Portal 2. None detected, even though it's one of their primary tests in all of their video card shootouts. Another failure for the guys at Tom's Hardware.

  • Folding@Home (Score:2, Interesting)

    by shino_crowell ( 1776802 ) on Saturday March 27, 2010 @03:44PM (#31642000) Homepage
    For most gaming applications, ATI ran away with this round in the price/performance category. For F@H, though, I think this is going to be a very interesting card; Nvidia just folds better than ATI. There are numerous reasons for this, and finger-pointing is futile, but that's the cold, hard fact. The extended time that software-side engineers have had to play around with CUDA seems to have been beneficial. In time, and with work on their OpenCL implementation, I think the current-generation Radeons will catch up, but not for a while. I'm mostly interested in seeing how this card performs against the GTX 295, currently the best single-PCB GPU folder. If the retail price of the GTX 470, with its optimized CUDA cores, stays within the $350-$400 range, I'd love to pick one up to play with. Do not take this as an endorsement of either company; I simply choose the best hardware to fit my specific needs: ATI for gaming, Nvidia for F@H.
