
Nvidia's GF100 Turns Into GeForce GTX 480 and 470

crazipper writes "After months of talking architecture and functionality, Nvidia is finally going public with the performance of its $500 GeForce GTX 480 and $350 GeForce GTX 470 graphics cards, both derived from the company's first DirectX 11-capable GPU, GF100. Tom's Hardware just posted a comprehensive look at the new cards, including their power requirements and performance attributes. Two GTX 480s in SLI seem to scale impressively well — providing you have $1,000 for graphics, a beefy power supply, and a case with lots of airflow."
  • by Artem S. Tashkinov ( 764309 ) on Saturday March 27, 2010 @12:05AM (#31636580) Homepage
    To summarize Fermi paper launch:
    • Fermi is a damn hot and noisy beast
    • Fermi is more expensive and only slightly faster than the respective ATI Radeon cards, so DAAMIT won't be cutting Radeon prices in the near future
    • Punters will have to wait at least two weeks for general availability
    • Fermi desperately needs a reboot/refresh/whatever to attract the masses

    It seems like NVIDIA has fallen into the same trap as it did with the GeForce FX 5xxx generation launch.

  • by RzUpAnmsCwrds ( 262647 ) on Saturday March 27, 2010 @12:25AM (#31636734)

    More power draw than a CPU from the bad old days of Prescott

    Prescott at its hottest (Pentium 4 HT 571) was only 115W, which is about the same as, or in some cases vastly less than, nearly every mid-range to high-end GPU today.

    Radeon 5830 is 175W
    Radeon 5850 is 151W
    Radeon 5770 is 108W

    Prescott at its hottest actually used less power than some of the current high-end Core i7 CPUs (i7-920 is 130W), although of course that's comparing a 1-core CPU to a vastly faster 4-core CPU.

    What's happened is that CPU coolers have gotten much better (thanks in part to heatpipes and larger fins/fans), power supplies have gotten more efficient and larger, and cases are better ventilated. The result is that today a 130W CPU is no big deal, whereas with the Prescott it caused all kinds of thermal nightmares for people building their own PCs (professionally engineered commercial PCs generally fared OK with Prescott).

    Still, 250W on a GPU is stupid. Even with modern efficient air cooling, it's hard to keep such a GPU cool without making a ton of noise. Add the crazy power supply requirements (most people are recommending 550W or more, which means $100+ if you want a quality PSU), and it's a pretty big burden. The real problem is that the ATI card is almost as fast, cheaper, and 80 watts cooler. And it's been on the market for 8 months.
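    As a back-of-the-envelope illustration of where recommendations like "550W or more" come from, here is a quick power-budget sketch; the 250W GPU figure and the 130W CPU figure (the i7-920 mentioned above) come from this thread, while the remaining component wattages are assumptions for the example:

        # Rough power budget for a single GTX 480 build (Python sketch).
        # Only the GPU and CPU wattages come from the discussion above;
        # the other figures are illustrative assumptions.
        components_watts = {
            "GeForce GTX 480 (board power)": 250,
            "quad-core CPU under load (i7-920 class)": 130,
            "motherboard, RAM, fans (assumed)": 50,
            "drives and peripherals (assumed)": 30,
        }

        load_watts = sum(components_watts.values())
        headroom = 1.2  # keep the PSU comfortably below full load
        print(f"Estimated combined load: {load_watts} W")
        print(f"Suggested PSU rating: ~{load_watts * headroom:.0f} W")

    With those assumptions the load lands around 460W, and roughly 20% of headroom puts you right at the 550W class of supply people are recommending.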

  • Anand Tech Review (Score:5, Informative)

    by alvinrod ( 889928 ) on Saturday March 27, 2010 @12:34AM (#31636802)
    There's also an AnandTech [anandtech.com] review which is pretty good and has plenty of different benchmarks. It has the added benefit of testing a 480 SLI configuration, which produces some interesting results, and it presents some benchmarks that show off nVidia's GPGPU performance, something the company has been using to hype these new cards.

    In my opinion, ATI still has a competitive advantage, especially considering that they can always drop their prices if they feel threatened. nVidia is lucky to have ION and Tegra to fall back on, because it seems as though they don't have a pot to piss in right now in terms of high-end desktop graphics offerings. The 480 seems to be about equal to similarly priced ATI offerings and doesn't give them the edge in performance that they're accustomed to having.
  • by afidel ( 530433 ) on Saturday March 27, 2010 @04:07AM (#31637652)
    You're missing the best card on a performance-per-watt basis, the HD5750. The PowerColor Go! Green edition pulls 62W max, 52W in normal gaming. It's so efficient it doesn't even need a PCIe power connector. It will get you 95% of the performance of the HD5770, which pulls twice as much power. Oh, and for HTPCs that are on 24x7, the 14W idle is nice too =) Now if only they would come down from $160 and Newegg would get them back in stock...
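    To put the performance-per-watt point in concrete terms, here is a toy comparison; the frame rates below are hypothetical placeholders chosen to reflect the "95% of the performance" claim, and only the 62W and 108W figures come from this thread:

        # Toy performance-per-watt comparison (Python sketch).
        # Frame rates are hypothetical; the wattages are the ones cited in this thread.
        cards = {
            "HD 5750 Go! Green": {"fps": 57.0, "watts": 62},
            "HD 5770":           {"fps": 60.0, "watts": 108},
        }
        for name, c in cards.items():
            print(f"{name:18s} {c['fps'] / c['watts']:.2f} fps per watt")

    Even granting the HD5770 its lead in raw frame rate, the Go! Green card comes out well ahead once you divide by power draw.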
  • by yoyhed ( 651244 ) on Saturday March 27, 2010 @08:00AM (#31638544)
    Exactly; it's barely on par with the months-old HD 5870, and it gets taken to school by the 5970. I love when AMD wins.
  • by MartinSchou ( 1360093 ) on Saturday March 27, 2010 @08:50AM (#31638728)

    In the AnandTech review the GTX400 is 2x-10x faster than the GTX 285 or Radeon 5870.

    That's overstating it WAY too much.

    In certain benchmarks the GTX480 is quite a bit faster than the 5870, but what you're saying is that it is across the board, which is just not true. From the conclusion of the AnandTech review:

    To wrap things up, let's start with the obvious: NVIDIA has reclaimed their crown - they have the fastest single-GPU card. The GTX 480 is between 10 and 15% faster than the Radeon 5870 depending on the resolution, giving it a comfortable lead over AMD's best single-GPU card.

    There is a massive difference between "10 to 15%" and "2x-10x faster".
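    For anyone skimming, the gap between those two claims is easy to quantify; the 60 fps baseline below is an arbitrary example value, not a benchmark result:

        # What "10-15% faster" versus "2x-10x faster" would mean in practice (Python sketch).
        baseline_fps = 60.0  # arbitrary illustrative baseline for the Radeon 5870
        for label, factor in [("10% faster", 1.10), ("15% faster", 1.15),
                              ("2x faster", 2.0), ("10x faster", 10.0)]:
            print(f"{label:10s} -> {baseline_fps * factor:6.1f} fps")

    A 10-15% lead is the difference between 60 and roughly 66-69 fps; a 2x-10x lead would mean 120-600 fps, an entirely different class of claim.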

  • by guidryp ( 702488 ) on Saturday March 27, 2010 @10:52AM (#31639464)

    http://www.legitreviews.com/article/1258/15/ [legitreviews.com]

    I discovered that the GeForce GTX 480 video card was sitting at 90°C in an idle state since I had two monitors installed on my system. I talked with some of the NVIDIA engineers about this 'issue' I was having and found that it wasn't really an issue per se, as they do it to prevent screen flickering. This is what NVIDIA said in response to our questions:

    "We are currently keeping memory clock high to avoid some screen flicker when changing power states, so for now we are running higher idle power in dual-screen setups. Not sure when/if this will be changed. Also note we're trading off temps for acoustic quality at idle. We could ratchet down the temp, but need to turn up the fan to do so. Our fan control is set to not start increasing fan until we're up near the 80's, so the higher temp is actually by design to keep the acoustics lower." - NVIDIA PR

    Regardless of the reasons behind this, running a two-monitor setup will cause your system to literally bake (see the fan-curve sketch below).

    Yikes!

    I already wasn't impressed, but after reading this it looks more like a fiasco than just a mild disappointment.
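    For what it's worth, the policy NVIDIA's statement describes (hold the fan at a quiet speed until the core nears the low 80s, then ramp) looks roughly like the sketch below; the thresholds and duty-cycle values are assumptions for illustration, not NVIDIA's actual firmware settings:

        # Hypothetical idle-biased fan curve (Python sketch). All numbers are
        # illustrative assumptions, not values published by NVIDIA.
        def fan_duty_percent(gpu_temp_c: float) -> float:
            quiet_duty = 40.0     # low, quiet fan speed held while below the ramp point
            ramp_start_c = 80.0   # fan only starts ramping near the 80s, per the quote
            full_speed_c = 105.0  # temperature at which the fan would hit 100%
            if gpu_temp_c <= ramp_start_c:
                return quiet_duty
            if gpu_temp_c >= full_speed_c:
                return 100.0
            span = full_speed_c - ramp_start_c
            return quiet_duty + (100.0 - quiet_duty) * (gpu_temp_c - ramp_start_c) / span

        for temp in (50, 75, 85, 90, 100):
            print(f"{temp:3d} C -> {fan_duty_percent(temp):5.1f}% fan")

    Under a curve like this, a 90°C dual-monitor idle still sits on the quiet end of the ramp, which is consistent with the behavior the review observed.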

  • by WhatAmIDoingHere ( 742870 ) <sexwithanimals@gmail.com> on Saturday March 27, 2010 @11:07AM (#31639570) Homepage
    I mean the real-world results posted on HardOCP.
  • by ooshna ( 1654125 ) on Saturday March 27, 2010 @12:08PM (#31640044)
    I'm not sure, but I've seen pics, first in DX10 and then in DX11, and it added some nice visuals: a lot more detail in the textures of things like stairs built out of bricks, with some of them misaligned instead of perfect rectangles, and a lot of detail on the dragon statue. In fact, here's a DX9/DX10/DX11 comparison [overclock.net].
  • Re:Anand Tech Review (Score:3, Informative)

    by Terrasque ( 796014 ) on Saturday March 27, 2010 @12:10PM (#31640060) Homepage Journal

    http://forums.anandtech.com/showthread.php?t=2062218 [anandtech.com] has some info about it... or rather, a lot of people reporting the same thing, and nothing from the site admins.

"What man has done, man can aspire to do." -- Jerry Pournelle, about space flight

Working...