NVIDIA's 55nm GeForce GTX 285 Launched

Visceralini writes "NVIDIA is launching yet another high-end 3D graphics offering, an optimized version of their top-shelf GeForce GTX 280 single-GPU card, dubbed the GeForce GTX 285. This new GeForce is a 55nm die-shrunk version of the GTX 280 with lower power consumption; it no longer requires an 8-pin PCI Express connector, just a pair of more standard 6-pin plugs. Performance metrics are shown here in a number of the latest game titles, including Fallout 3, Left 4 Dead, Far Cry 2 and Mirror's Edge. The new GTX 285 is about on par with, or slightly faster than, a GTX 280, but with less power draw and some room for overclocking over the reference design."
  • by Arthur Grumbine ( 1086397 ) * on Thursday January 15, 2009 @10:30PM (#26477703) Journal

    The new GTX 285 is about on par with, or slightly faster than, a GTX 280, but with less power draw and some room for overclocking over the reference design.

    40W less while idle (vs. the 280), at $0.12/kWh, means if I can pick one up for $400 (I can dream, can't I?), it will have paid for itself - through power savings - in less than 10 years!! I know what I'm spending my tax refund on!! (A rough worked example of this math follows at the end of this thread.)

    • Re: (Score:2, Informative)

      by CDMA_Demo ( 841347 )

      in less than 10 years!! I know what I'm spending my tax refund on!!

      You can also use it to crack passwords even faster! [hothardware.com]

    • Not exactly... but if you were about to spend $250 on the mid-range GTX260, you can now get the high-end GTX285, which will pay for the price difference in power savings after a few years.

      • Re: (Score:3, Insightful)

        What enthusiast who spends big dollars on the latest cards uses their card for more than a year?

        Most people who rush out to buy this type of cutting-edge hardware replace it every few months or so. The savings realized through power conservation will never cover the price difference between the two for the crowd these new cards (indeed, all new cards) are targeted at.
        • What enthusiast who spends big dollars on the latest cards uses their card for more than a year?

          Paid $400 for an 8800 GTX well over a year ago; wish I had caught the 8800 GTSs. But anyway, I plan to keep it for at least another 12 months. We'll see what's out then.

          • by jandrese ( 485 )
            Shoot, I dropped the big bucks on the 8800GTX when it came out and it's still an able performer. It looks like it's going to be another 6 months or maybe even a year before I have to dial back from "ultra extreme" settings to merely "high" settings on new games (although I didn't try Crysis).
    • Re: (Score:2, Interesting)

      by afidel ( 530433 )
      Or, you know, you could go with a 9600GSO, which uses about 45W peak and plays most modern games at decent framerates (Crysis at 1280x1024, medium, 20fps). I use RivaTuner to underclock the card by 50% at Windows startup and run it at about a 20% overclock with game profiles. Since my PC is running MediaPortal and is on 24/7, this saves me quite a bit over the course of the year.
    • Re: (Score:3, Interesting)

      by Creepy ( 93888 )

      The PSU requirement is apparently 550 Watts, and you can usually save a lot of money when you drop from 700 Watts to 550-600. However, I remember seeing a 700 Watt PSU at Newegg for $50 after rebate, which is about what I paid for my 500 Watt at Christmas (after rebate - I'd planned for 600+, but the PSU and a few other things fell to the budget axe).

      If you're building a system from scratch, you may be able to save additional money with a lower-power-draw card. Also, waste heat from the PSU is lower with smaller PSUs.

      • THE PSU ONLY DRAWS WHAT'S ASKED OF IT!!! Seriously, it's quite silly. A 700 watt PSU will only draw 700 watts if the components in the PC ask for 700 watts.
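
    For anyone actually checking the payback math above, here is a minimal Python sketch under the same assumptions the commenter made (a 40W idle difference, $0.12/kWh, a $400 card, and a machine that idles 24/7); none of these figures come from NVIDIA's spec sheet:

        # Rough payback estimate for the idle-power savings discussed above.
        # All inputs are assumptions taken from the comment, not measurements.
        IDLE_SAVINGS_W = 40          # watts saved at idle vs. a GTX 280 (assumed)
        PRICE_PER_KWH = 0.12         # dollars per kWh (assumed)
        CARD_PRICE = 400.0           # dollars (assumed)
        HOURS_PER_YEAR = 24 * 365

        kwh_saved_per_year = IDLE_SAVINGS_W * HOURS_PER_YEAR / 1000   # ~350 kWh
        dollars_saved_per_year = kwh_saved_per_year * PRICE_PER_KWH   # ~$42
        payback_years = CARD_PRICE / dollars_saved_per_year           # ~9.5 years

        print(f"~${dollars_saved_per_year:.0f} saved per year; payback in ~{payback_years:.1f} years")

    At 24/7 idle the card pays for itself in roughly nine and a half years, which is the (sarcastic) point being made in the thread above.
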
  • State of the Market (Score:4, Informative)

    by youknowjack ( 1452161 ) on Thursday January 15, 2009 @11:01PM (#26477921)
    - The current best performing single card is the GeForce GTX 295
    - The best performance setup was (before this card) a tossup between dual GeForce GTX 295s (quad SLI) and three GeForce GTX 280s (three-way SLI).
    - The overclocking potential of the GeForce GTX 285 & reduced power consumption might make a three-way 285 setup preferable to a dual 295 setup (for enthusiasts)
    • by caitsith01 ( 606117 ) on Thursday January 15, 2009 @11:46PM (#26478225) Journal

      ...who lack unlimited funds, the best buys at the moment are the ATi HD 48x0 series cards, which have ridiculously good price/performance and will run any current or near-future game easily at high detail.

      • Not to mention the fact that nVidia have so many faulty chipsets out there right now, and have lied time and time again about the scale and scope of the problem - to the degree that I simply won't touch them any more.
      • Re: (Score:2, Insightful)

        by javilon ( 99157 )

        Until ATI gets their act together with their Linux drivers, I'm not buying ATI.

        Also, Nvidia has added MP4 video acceleration to its Linux drivers, so I can watch full HD with my old P4@2.4GHz. When we have something similar from ATI, I'll reconsider.

      • I just bought a GTX260 after comparing its price/performance ratio against its nearest ATI competitor. The nVidia card turned out to give more bang for my buck.

        Can't speak for low end cards though.
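
        As a minimal sketch of that kind of comparison, here is some Python with made-up prices and average frame rates (neither the numbers nor the card names below come from the article or the comments):

            # Compare hypothetical cards by frames-per-second per dollar.
            # Prices and benchmark figures are placeholders, not real data.
            cards = {
                "Card A": {"price_usd": 250.0, "avg_fps": 55.0},
                "Card B": {"price_usd": 230.0, "avg_fps": 48.0},
            }

            for name, c in cards.items():
                ratio = c["avg_fps"] / c["price_usd"]
                print(f"{name}: {ratio:.3f} fps per dollar")

        Averaging across several titles, at the resolution you actually play at, gives a fairer ratio than any single benchmark.
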

    • by jdb2 ( 800046 ) * on Thursday January 15, 2009 @11:54PM (#26478289) Journal

      The overclocking potential of the GeForce GTX 285 & reduced power consumption might make a three-way 285 setup preferable to a dual 295 setup (for enthusiasts)

      You do know that the GeForce GTX 295 has the same overclocking potential and reduced power consumption as the 285, because both use the same chip(s) [wikipedia.org]?

      jdb2

    • Comment removed (Score:5, Insightful)

      by account_deleted ( 4530225 ) on Friday January 16, 2009 @12:12AM (#26478407)
      Comment removed based on user account deletion
      • Unless you live way up north or play games only in the winter, dealing with 840 Watts of heat is going to be problematic for a dual GTX295 setup. Summer is worse in that you now have to pump out that heat through the AC system.

        People will often bitch about their cable/DSL bill, but have they ever tried to calculate the monthly cost of the electricity their gaming rig alone racks up?

        Some of us don't care. :D
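
        For the curious, a back-of-the-envelope version of that calculation in Python, assuming a hypothetical 840W load, four hours of gaming a day and $0.12/kWh (all of these inputs are illustrative, not measured):

            # Rough monthly electricity cost for a gaming rig under load.
            # Every input below is an assumption for illustration.
            SYSTEM_DRAW_W = 840       # e.g. a heavily loaded dual-GPU rig (assumed)
            HOURS_PER_DAY = 4         # gaming hours per day (assumed)
            DAYS_PER_MONTH = 30
            PRICE_PER_KWH = 0.12      # dollars per kWh (assumed)

            kwh_per_month = SYSTEM_DRAW_W * HOURS_PER_DAY * DAYS_PER_MONTH / 1000
            cost_per_month = kwh_per_month * PRICE_PER_KWH

            print(f"~{kwh_per_month:.0f} kWh/month, roughly ${cost_per_month:.2f}/month")

        With those numbers it works out to about 100 kWh and $12 a month - noticeable, but still less than most cable/DSL bills.
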

    • by mxs ( 42717 )

      If you care about GPU memory, the GTX295 only delivers 896MB per GPU (it's a dual-GPU card), while the GTX285 delivers 1024MB. If you intend to do stuff with CUDA this may be the deciding factor for you -- and, to a limited extent, also for 2560x1600 gaming on 30" displays. The /only/ game where this would actually come into play right now, though, would be Crysis: Warhead.

      Power draw of a 3-way 285 SLI will likely be more than a "Quad"-SLI 295. Cooling might be an additional problem (there is not exa

      • by JustNiz ( 692889 )

        >>If you are going to go with a Tri-SLI-Setup, you will probably need a 1200W power supply,

        Baloney. I have an 8800GTX SLI setup. It draws 400W at the wall, which goes up to 500W under full load. Furthermore it's watercooled, so that includes all the pumps etc. Seems to me that an 8800GTX draws at most 180W. That means if I put another 8800GTX in my box to go 3-way, it would still only consume 650W under load.
        The GTX285 cards are 55nm, so they draw even less power.
        Even allowing for some overhead, an 800W PSU would still be plenty.

        • by mxs ( 42717 )

          Baloney.

          I call the same on your comment.

          I have an 8800GTX SLI setup. It draws 400W at the wall, which goes up to 500W under full load.

          Of course you do not cite what other components are in there, but let's go with your 8800GTX SLI non-OC setup. The 8800GTX has a TDP of 145W per card at reference clocks -- so with 2-way SLI that would be 290W there. Note that this is not the actual maximum power the card could consume, but it is fairly close. If you overclock, you are going to need more. If you had a 3-way SLI setup, you would have to account for 435W from the GeForce cards alone. A reasonably beefy CPU w

          • by JustNiz ( 692889 )

            Why do you assume I'm not overclocking? I have a Core 2 Extreme, so of course I overclock. My memory is also overclocked, both video cards are OC versions, and I have two 10k rpm drives. Yet still, I'm drawing only slightly over 500W at the wall under full load (3DMark running a benchmark).
            I've checked this with a power monitor and two different multimeters, and they all agree.
            Explain that.
            BTW, my PSU is a Galaxy Extreme 1000W.

            • by mxs ( 42717 )

              I do not need to explain it. I am going by the manufacturer's TDP claims under full load. That may be an upper bound, but if I can supply it for every component under full load, I can be sure the machine won't brown out when I stress everything to the extreme. YMMV, and you are free to follow a different approach.
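
              For what it's worth, here is a minimal Python sketch of the TDP-budgeting approach described above. The 145W-per-card figure comes from the comment; every other component wattage and the headroom factor are placeholder assumptions:

                  # Conservative PSU sizing: sum worst-case (TDP) figures per
                  # component, then leave headroom so the supply never runs at its limit.
                  # All wattages except the per-card TDP are illustrative assumptions.
                  components_w = {
                      "3x GPU (145W TDP each)": 3 * 145,   # per-card TDP cited above
                      "CPU (overclocked)": 150,            # assumed
                      "Motherboard + RAM": 60,             # assumed
                      "Drives, fans, pumps": 60,           # assumed
                  }

                  total_tdp_w = sum(components_w.values())
                  HEADROOM = 1.25                          # ~25% margin (assumed)
                  recommended_psu_w = total_tdp_w * HEADROOM

                  print(f"Worst-case draw: {total_tdp_w}W; suggested PSU: ~{recommended_psu_w:.0f}W")

              With these placeholder numbers the budget lands around 700W of worst-case draw and an ~880W supply, which is why wall-meter readings under a typical load can sit well below what a TDP budget suggests.
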

    • The parent post misses the fact that for dual-or-more 280 setups you need a huge case and a kickass power supply. ATI 4870s are much easier to fit in a single standard case, and give similar or better performance.
  • My 8800 GTS died yesterday, wonderful vertical lines down the screen (lines of characters in text mode) and unexpectedly booting in VGA res.
    I looked online for a new card, saw a 285 being sold for cheaper than any 280, and looked it up. I saw that it was basically a 280 v2, so I ordered one. Even at 9:40pm I was offered next-day delivery by Ebuyer, so I took it. I got the order-dispatched email at 10:20pm.

    I didn't realise until a little later that its release date was yesterday! That's some crazy timing.

  • Sorry, but those benchmarks are not even close to right. Since when does the 280 score 2000 points more than the 1GB 4870? Sorry, not a chance. When you guys post benchmarks, at least post correct ones. And 1920x1200 at 4x AA and 16x AF is not a valid benchmark, because those are not 3DMark's default (and online-comparable, for that matter) settings. This is not an NVIDIA fanboy cry - I run NVIDIA myself, lol - but I know what the 4870s can really do.

"Protozoa are small, and bacteria are small, but viruses are smaller than the both put together."

Working...