NVIDIA's 55nm GeForce GTX 285 Launched
Visceralini writes "NVIDIA is launching yet another high-end 3D graphics offering, an optimized version of their top-shelf GeForce GTX 280 single-GPU card, dubbed the GeForce GTX 285. This new GeForce is a 55nm die shrink of the GTX 280 with lower power consumption, so it no longer needs an 8-pin PCI Express connector, just a pair of more standard 6-pin plugs. Performance metrics are shown here in a number of the latest game titles, including Fallout 3, Left 4 Dead, Far Cry 2 and Mirror's Edge. The new GTX 285 is about on par with or slightly faster than a GTX 280, but with less power draw and some room for overclocking over the reference design."
Power Savings!! (Score:5, Funny)
The new GTX 285 is about on par or slightly faster than a GTX 280 but with less power draw and some room for overclocking over the reference design.
40W less while idle (vs. the 280), at $0.12/kWh, means that if I can pick one up for $400 (I can dream, can't I?), it will have paid for itself - through power savings - in less than 10 years!! I know what I'm spending my tax refund on!!
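For what it's worth, the payback math roughly checks out if the machine sits idle around the clock. A minimal sketch in Python, taking the 40W savings, $0.12/kWh rate and $400 price from the comment above; the 24/7 idle duty cycle is my own assumption:

# Rough payback estimate for the quoted idle power savings.
# Figures from the comment above; running idle 24/7 is an assumption.
idle_savings_w = 40      # watts saved at idle vs. a GTX 280
price_per_kwh = 0.12     # dollars per kWh
card_price = 400.0       # dollars (optimistic, per the comment)

kwh_saved_per_year = idle_savings_w * 24 * 365 / 1000.0
dollars_saved_per_year = kwh_saved_per_year * price_per_kwh
payback_years = card_price / dollars_saved_per_year
print(f"~{kwh_saved_per_year:.0f} kWh/yr, ~${dollars_saved_per_year:.2f}/yr, payback in ~{payback_years:.1f} years")
# -> ~350 kWh/yr, ~$42.05/yr, payback in ~9.5 years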
Re: (Score:2, Informative)
in less than 10 years!! I know what I'm spending my tax refund on!!
You can also use it to crack passwords even faster! [hothardware.com]
Re: (Score:2, Funny)
Re: (Score:2)
Not exactly... but if you were about to spend $250 on the mid-range GTX 260, you can now get the high-end GTX 285, which will pay for the price difference through power savings after a few years.
Re: (Score:3, Insightful)
Most people who rush out to buy this type of cutting-edge hardware replace it every few months or so. The savings from power conservation will never cover the price difference between the two for the crowd these new cards (indeed, all new cards) are targeted at.
Re: (Score:2)
What enthusiast that spends big dollars on the latest cards uses their card for more than a year?
Paid $400 for an 8800 GTX well over a year ago; wish I had caught the 8800 GTSs. But anyway, I plan to keep it for at least another 12 months. We'll see what's out then.
Re: (Score:2)
Re: (Score:2, Interesting)
Re: (Score:2)
Why not throw in an Atom too, so when you're browsing the web etc. you draw even less. That's the thing about Moore's law: you never need to buy a new computer ever again if you don't play games, edit video, steal video, or render shit. 99% of the crap I do could run on my Nokia E71 (including Quake) if I had an external USB keyboard and video card.
Re: (Score:3, Interesting)
The PSU requirement is apparently 550 watts, and you can usually save a lot of money when you drop from 700 watts to 550-600. However, I remember seeing a 700 watt PSU at Newegg for $50 after rebate, which is about what I paid for my 500 watt at Christmas (after rebate - I'd planned for 600+, but the PSU and a few other things fell to the budget axe).
If you're building a system from scratch, you may be able to save additional money with a lower-power-draw card. Also, waste heat from the PSU is lower with smaller PSUs.
Re: (Score:1)
State of the Market (Score:4, Informative)
- The best performance setup was (before this card) a tossup between dual GeForce GTX 295s (quad SLI) and three GeForce GTX 280s (three-way SLI).
- The overclocking potential of the GeForce GTX 285 and its reduced power consumption might make a three-way 285 setup preferable to a dual 295 setup (for enthusiasts)
Re: (Score:2, Informative)
All I care about is best performance under 50W and under $100. Is there anything out there better than the 8600GT?
The Radeon HD 3870 is extremely similar in terms of price and performance. I guess it depends on what you can get the best deal on.
Re: (Score:2)
Are you kidding me? Nvidia drivers are terrible. I installed the latest chipset (nvidia-based) and VGA (8800GT) drivers a few weeks ago on my Windows XP partition. Now every time I go through the drivers to adjust anything from overclocking options to simple fan speed adjustments, on the next restart the picture on the screen gets corrupted and the PC locks up. You have to restart in safe mode, delete the video cards (SLI setup) out of device manager, and re-install them.
These are brand new drivers! My next card...
Re: (Score:3, Informative)
Re: (Score:2)
I've heard NForce motherboards tend to have a fair number of compatibility problems. Don't be so quick to blame the Geforce (VGA) drivers or the Geforce card.
'Anything from overclocking options to simple fan speed adjustments' is handled by the nTune options, not the Nvidia GeForce options. It integrates into the same control panel, but they're different things. There are also usually enough quirks with nTune itself to prefer relying on RivaTuner (driver overclock, direct hardware fan speed control) if you...
Re: (Score:3, Interesting)
I wonder if he has tried NOT overclocking the card or changing the fan speed? :D
Overclocking is the stupidest, stupidest thing people can do on modern hardware. By designing a graphics card or CPU that overclocks you're pandering to the statistics freaks who want to get that extra 1% performance increase and therefore "more bang for their buck".
What a f**king ridiculous market. Processors and graphics chips go through sorting and testing for a good reason; they're not rated to go any higher because there is...
Re: (Score:2)
Overclocking is the stupidest, stupidest thing people can do on modern hardware. By designing a graphics card or CPU that overclocks you're pandering to the statistics freaks who want to get that extra 1% performance increase and therefore "more bang for their buck".
Although you are completely entitled to your opinion, I have to disagree. If you were a racing car driver, would you only put your throttle down to 80% of its capacity just in case you might push the engine too hard? These graphics cards/CPUs/mainboards/etc. are designed to have extra headroom for stability purposes, just like the car you may own. People overclock to get the performance the unit is capable of. Same as people who get aftermarket parts for cars: suspension, exhaust, engine management...
Re: (Score:2)
Overclocking by 1-5% gives you how much of a performance benefit, really? You're far more likely to get a decent speed increase with simply better drivers and more efficiently written games.
As for the car analogy: just because your car's RPM meter goes up to 10,000 doesn't mean you should hammer it at that constantly, or even ever. A racing driver would never keep his car running at full tilt, nor even try to get there, because the simple fact is that when you do, the whole thing becomes unstable and the engine...
Re: (Score:1)
8600GT usually bests the 3870 for a similar price. Recently the 9600GT was priced similarly to the faster/better 8600GTs at ~$85.
The 8600GT can slightly edge it out in a variety of games (including GTA4), whereas the 3870's higher processing power can beat it in a few things that don't use complex shaders at all. The 9600GT utterly demolishes both in virtually everything and was within 10-15% of the 9800 GT at $60 cheaper. Both are slightly more expensive with the big sales season over, though I'm sure there'll...
Re: (Score:1)
I did say quite a bit that it was more of a driver parity issue than anything else.
The 9600 is faster than the 8600. I meant that the $85 price point was for the 'higher quality' 8600s: pre-overclocked, better fans, other things that would supposedly warrant a few dollars extra compared to the $45-60 'standard' 8600GTs. $85 was also a happy price point for decent 8600GTs over the holiday.
The 3870 has better specs, obviously, but specs don't mean much if the driver doesn't take advantage of them. On Nvidia drivers, some...
Re: (Score:2)
Re: (Score:3, Informative)
This lists it at 46W, which is much more in line with my experience. The TDP might be 105W for some crazy reason, but my system with an Athlon 64 X2 4200+ EE, two 7,200rpm HDDs and a 9600GSO uses about 150W when gaming, so there is no way in hell the GPU is pulling down 100W by itself.
Re:State of the Market (Score:4, Interesting)
Re: (Score:2)
TDP is not how much heat the chip generates; it's how much heat the chip's package and cooling are designed to handle.
Most recent E8xxx Intel processors have a 65W TDP, but that's nowhere near how much they really use; my E8500 uses 21W under full load, IIRC.
Check it yourself: http://en.wikipedia.org/wiki/Thermal_Design_Power [wikipedia.org]
And for real people... (Score:5, Insightful)
...who lack unlimited funds, the best buy at the moment is the ATi HD 48x0 series, which has ridiculously good price/performance and will run any current or near-future game easily at high detail.
Re: (Score:2)
Re: (Score:2, Insightful)
Until ATI gets their act together with their Linux drivers, I'm not buying ATI.
Also, Nvidia has added MP4 video acceleration to its Linux drivers, so I can watch full HD with my old P4 @ 2.4GHz. When we have something similar from ATI I'll reconsider.
Re: (Score:2)
... ATI is pretty much there too. Might want to read up on the state of Linux drivers.
Re: (Score:2)
I just bought a GTX260 after comparing its price/performance ratio vs. its nearest ATI competitor. The nVidia card turned out to give more bang for my buck.
Can't speak for low end cards though.
Re:State of the Market (Score:4, Informative)
The overclocking potential of the GeForce GTX 285 and its reduced power consumption might make a three-way 285 setup preferable to a dual 295 setup (for enthusiasts)
You do know that the GeForce GTX 295 has the same overclocking potential and reduced power consumption as the 285, because both use the same chip(s) [wikipedia.org]?
jdb2
Re: (Score:2)
Except each 285 has its own cooler. The 295 shares its cooler between two chips. Sure, the cooler is probably more efficient, but there's a limit to the amount of heat you can remove with just air in a two-slot cooler, and you'll hit that limit a lot sooner with a 295 than a 285.
Yeah, I was going to mention that, but you beat me to the punch. :) Anyway, this is exactly why you need one of these [koolance.com]. Custom-designed for solving the above-mentioned problem.
jdb2
Comment removed (Score:5, Insightful)
Power usage of a gaming rig (Score:1, Troll)
Unless you live way up north or play games only in the winter, dealing with 840 watts of heat is going to be problematic with a dual GTX 295 setup. Summer is worse, since you then have to pump that heat back out through the AC system.
People will often bitch about their cable/DSL bill, but have they ever tried to calculate the monthly cost of electricity their gaming rig racks up on its own?
Some of us don't care. :D
Re: (Score:2)
If you care about GPU memory, the GTX 295 only delivers 896 MB per GPU (it's a dual-GPU card), while the GTX 285 delivers 1024 MB. If you intend to do stuff with CUDA this may be the deciding factor for you -- and, to a limited extent, also for 2560x1600 gaming on 30" displays. The /only/ game where this would actually come into play right now, though, would be Crysis: Warhead.
Power draw of a 3-way 285 SLI will likely be more than a "Quad"-SLI 295. Cooling might be an additional problem (there is not exactly...
Re: (Score:2)
>>If you are going to go with a Tri-SLI-Setup, you will probably need a 1200W power supply,
Baloney. I have an 8800GTX SLI setup. It draws 400W at the wall, which goes up to 500W under full load. Furthermore it's watercooled, so that includes all the pumps etc. Seems to me that an 8800GTX draws at most 180W. That means if I put another 8800GTX in my box to get 3-way, it would still only consume 650W under load.
The GTX285 cards are 55nm, so they draw even less power.
Even allowing for some overhead, an 800W PSU would still...
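A minimal Python sketch of the arithmetic behind that claim, using the 500W wall reading and the ~180W per-card figure from this comment; the ~85% PSU efficiency is my own assumption (a PSU's wattage rating is for its DC output, so the wall figure gets discounted before comparing):

# Back-of-the-envelope check of the "800W would still be fine" claim.
# Wall reading and per-card figure are from the comment above; the ~85%
# PSU efficiency is an illustrative assumption, not a measurement.
wall_load_2way_w = 500
psu_efficiency = 0.85
per_card_dc_w = 180

dc_load_2way_w = wall_load_2way_w * psu_efficiency   # ~425W delivered to components
dc_load_3way_w = dc_load_2way_w + per_card_dc_w      # add one more card on the DC side
print(f"estimated 3-way DC load: ~{dc_load_3way_w:.0f}W vs. an 800W rating")
# -> estimated 3-way DC load: ~605W vs. an 800W rating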
Re: (Score:2)
Baloney.
I call the same on your comment.
I have a 8800GTX SLI setup. It draws 400W at the wall, which goes up to 500W under full load.
Of course you do not cite what other components are in there. But let's go with your 8800GTX SLI non-OC setup. The 8800GTX has a TDP of 145W per card at reference clocks -- so with 2-way SLI that would be 290W right there. Note that this is not the actual maximum power the card could consume, but it is fairly close. If you overclock, you are going to need more. If you had a 3-way SLI setup, you would have to account for 435W from the GeForce cards alone. A reasonably beefy CPU...
Re: (Score:2)
Why do you assume I'm not overclocking? I have a Core 2 Extreme, so of course I overclock. My memory is also overclocked, both video cards are OC versions, and I have two 10k rpm drives. Yet still, I'm drawing only slightly over 500W at the wall under full load (3DMark running a benchmark).
I've checked this with a power monitor and 2 different multimeters and they all agree.
Explain that.
BTW my PSU is a galaxy extreme 1000w.
Re: (Score:2)
I do not need to explain it. I am going by the manufacturer's claims of TDP under full load. This may be an upper bound, but if I can supply it, under full load, for all components, I can be sure that the machine won't brown out when I stress it to the extreme on all components. YMMV, and you are free to follow a different approach.
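As a contrast with the wall-measurement approach above, a minimal Python sketch of the TDP-summing method this comment describes; only the 145W per-card TDP is quoted earlier in the thread, and the other component figures and the 20% headroom are illustrative assumptions, not recommendations:

# Worst-case PSU budget by summing manufacturer TDPs, per the approach above.
# Only the 145W 8800GTX figure appears in the thread; everything else is assumed.
tdp_w = {
    "3x 8800GTX (145W each)": 3 * 145,
    "CPU (overclocked, assumed)": 150,
    "motherboard + RAM (assumed)": 60,
    "drives, fans, pumps (assumed)": 50,
}
total_tdp = sum(tdp_w.values())
recommended_psu_w = total_tdp * 1.2   # assumed ~20% headroom on the DC rating
print(f"summed TDP: {total_tdp}W -> size the PSU for roughly {recommended_psu_w:.0f}W")
# -> summed TDP: 695W -> size the PSU for roughly 834W

This is deliberately pessimistic; measured wall draw (as in the comment above) usually comes in well below the summed TDPs, which is exactly the disagreement in this subthread.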
A false state (Score:1)
Re: (Score:2, Funny)
Barely.
Re: (Score:2)
You just need to write an x86 emulator as a fragment shader. Part of me actually wants to see somebody attempt this and get a PC OS running on a video card. (Larrabee doesn't count; using x86 to start with spoils the fun.)
I just bought one (Score:1, Offtopic)
My 8800 GTS died yesterday, wonderful vertical lines down the screen (lines of characters in text mode) and unexpectedly booting in VGA res.
I looked online for a new card, saw a 285 being sold for cheaper than any 280, and looked it up. I saw that it was basically a 280 v2, so I ordered one. Even at 9:40pm I was offered next day delivery by ebuyer, so I took it. I got the order dispatched email at 10:20pm.
I didn't realise until a little later that its release date was yesterday! That's some crazy timing.
lolZ (Score:1)
Re: (Score:1)