NVIDIA Unveils GeForce GTX 1080, GTX 1070, Faster Than Titan X For a Lot Less (hothardware.com)

MojoKid writes (edited and condensed): NVIDIA has unveiled its next-generation Pascal-based GeForce graphics cards, the GeForce GTX 1080 and GeForce GTX 1070. NVIDIA's Pascal architecture is built on 16nm FinFET technology, similar to that of NVIDIA's high-end data center Tesla P100 processing engine, though the GeForce cards are targeted at the consumer gaming market. NVIDIA's GP104 GPU at the heart of the new GeForce cards comprises some 7.2 billion transistors and features a 256-bit memory interface, with 8GB of Micron GDDR5X graphics memory on the GeForce GTX 1080; the GTX 1070 employs standard GDDR5. The core clock speed of the GeForce GTX 1080 hit 2.1GHz at one point during the demonstration, though GTX 1070 clocks were not disclosed. NVIDIA CEO Jen-Hsun Huang claimed the new GeForce GTX 1080 is faster than a pair of GeForce GTX 980 cards in SLI, and faster than the company's very expensive Titan X graphics card at half the price. The GeForce GTX 1080 will be offered in two versions: a standard card with an MSRP of $599 and a highly overclockable Founders Edition for $699. The standard GTX 1070 will arrive at $379, while a Founders Edition will be priced at $449. Availability is slated for May 27 for the GTX 1080 and June 10 for the GTX 1070. AnandTech has more information.
  • Looks nice, but within a year I expect a consumerish big Pascal with HBM2 and closer to 300 W as the ultimate single card. Not ready to replace my SLI setup for this one, but multi-GPU support is getting more and more niche. One monster card for 4K gaming would be great.

    • by Gr8Apes ( 679165 )
      I'd just like a low power quiet 4K HDR card for regular work at a $200 price point. :)
      • If you just want 4k for desktop pixels (as opposed to high-end gaming) then even an AMD R7 260x will do that. Mine even gets me acceptable framerates at 4k in most of the games I play, which tend to be older (e.g. Skyrim, TF2, Kerbal Space Program, Star Trek Online, etc.).

      • by armanox ( 826486 )

        Considering that 4K monitors still aren't common, you've got at least another year to go.

    • by Shinobi ( 19308 )

      Something I'm hoping for is smoother offloading of physics between multiple GPUs/cards, such that one GPU handles graphics while a second card, not connected to a display, handles physics with less clunkiness than what is currently in use. It would be nice for various simulators, for example.

    • Re: (Score:2, Insightful)

      by CrashNBrn ( 1143981 )
      With Intel scaling back from Moore's Law, I think you are going to see Nvidia et al. scaling back from Moore's Law very soon as well.
    • It's less niche now that we have DirectX 12 and Vulkan, which allow the developer to pipe command streams into each GPU separately, rather than the driver having to guess how to divvy up the work.
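
      For what "separately" means in practice: under explicit multi-adapter, every GPU shows up as its own device, and the application, not the driver, decides which command streams go where. Here's a minimal C++ sketch of just the enumeration step under Vulkan (assumes the Vulkan SDK is installed; everything past enumeration, i.e. the actual work split, is up to the engine):

      ```cpp
      // Enumerate every physical GPU Vulkan can see. Explicit multi-GPU
      // builds on this: each device gets its own VkDevice, queues, and
      // command buffers, so the engine chooses how to divide the frame.
      #include <vulkan/vulkan.h>
      #include <cstdio>
      #include <vector>

      int main() {
          VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
          app.apiVersion = VK_API_VERSION_1_0;

          VkInstanceCreateInfo ci{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
          ci.pApplicationInfo = &app;

          VkInstance instance;
          if (vkCreateInstance(&ci, nullptr, &instance) != VK_SUCCESS)
              return 1;

          uint32_t count = 0;
          vkEnumeratePhysicalDevices(instance, &count, nullptr);
          std::vector<VkPhysicalDevice> gpus(count);
          vkEnumeratePhysicalDevices(instance, &count, gpus.data());

          for (VkPhysicalDevice gpu : gpus) {
              VkPhysicalDeviceProperties props;
              vkGetPhysicalDeviceProperties(gpu, &props);
              std::printf("GPU: %s\n", props.deviceName);
          }
          vkDestroyInstance(instance, nullptr);
          return 0;
      }
      ```
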
    • In a year, in a year, in a year...

      Something better is always going to come out. You can get a huge improvement over your current kit now, or you can wring your hands and worry.

  • Will the drivers finally be stable?

    • by epyT-R ( 613989 )

      More stable than AMD's, yes.

      • What's wrong with AMD's drivers right now? Are you on XP? Lol
      • More stable than... fuck, why can I only settle for crappy or less crappy? Don't the capitalism preachers constantly tell me just how much capitalism ensures that only what the customer wants gets produced, and how happy we should be that we're not in commie hell where we could only buy what The Party thinks is good enough for us?

        What's the difference between The Party and The Corporation deciding what the fuck I can buy?

        • by Anonymous Coward

          They *could* devote the time to developing drivers that crash less often, but they'd pass the cost of that developer time on to the consumer. Apparently they've decided that beyond a certain (frustratingly low) point, increases in stability do not add enough value to entice consumers to spend more.

          Consider a different industry: commercial airlines. Just about every aspect of flying is a horrifying cluster-fuck. Can't they implement a system that doesn't lose baggage on a semi-regular basis? Yep, and it would be more expensive.

          • Where that analogy fails is that air transport is temporary, while the use of a graphics card is a much more prolonged experience. It's easier to swallow being treated as freight for those 4-12 hours in the air than it is to be constantly fighting your computer for the 2-3 years the average person clings to their graphics adapter.

  • Mixed GPUs (Score:3, Interesting)

    by Anonymous Coward on Saturday May 07, 2016 @01:19PM (#52067253)

    Wouldn't it be nice if the promise of mixed GPUs had arrived already? Then we could buy a new GPU and just add it to the stack we already have, and if the stack is full, drop the worst card out.

    Without that I'll probably skip this generation; replacing what I already have for a modest increase is too expensive. Still keeping my fingers crossed that multi-GPU is the way of the future, but I'm not holding my breath.

  • Do video card upgrades even matter anymore?

    Nowadays it seems you can run almost any game at high settings with old cards. And the games still don't look as good as old Elder Scrolls mods, which you need CPU and RAM for. Everything is console-level now.

    "Muh frames per second are slightly better than your frames per second" is just stat bullshit nowadays when the settings are otherwise the same.

    • by Anonymous Coward

      Think VR:
      90 fps minimum, stereo, and a high FOV
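
      Rough numbers behind that, as a back-of-the-envelope sketch: the 2016 Rift/Vive panels are 2160x1200 across both eyes at 90 Hz, and the often-quoted ~1.4x per-axis render-target oversize for lens distortion is assumed here, not a spec:

      ```cpp
      // Why VR needs far more GPU than 1080p60: pixels per second.
      #include <cstdio>

      int main() {
          const double desktop    = 1920.0 * 1080.0 * 60.0;  // 1080p @ 60 Hz
          const double vrPanel    = 2160.0 * 1200.0 * 90.0;  // both eyes @ 90 Hz
          const double vrRendered = vrPanel * 1.4 * 1.4;     // oversized render target
          std::printf("1080p60:     %.0f Mpix/s\n", desktop / 1e6);
          std::printf("VR panel:    %.0f Mpix/s\n", vrPanel / 1e6);
          std::printf("VR rendered: %.0f Mpix/s (~%.1fx 1080p60)\n",
                      vrRendered / 1e6, vrRendered / desktop);
          return 0;
      }
      ```

      That works out to roughly 457 Mpix/s rendered versus 124 Mpix/s for 1080p60, about 3.7x the pixel throughput, before the "never drop below 90 fps" constraint is even considered.
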

    • by UnknownSoldier ( 67820 ) on Saturday May 07, 2016 @01:55PM (#52067381)

      > Do video card upgrades even matter anymore?

      Yes.

      * VR requires 90 Hz minimum (Thank god!)
      * 4K Gaming at 120 Hz requires beefy hardware.
      * ENB mods [google.com]

      If you can't even tell the difference between 24 Hz and 60 Hz ....

      Ow, my eyes @ 24 fps! [cachefly.net]

      Silky smooth @ 60 fps! [cachefly.net]

      ... let alone 120 Hz [testufo.com] then obviously you don't need a high end GPU. Continue along with your crappy 30 Hz on consoles. [insomniacgames.com] The rest of us will be upgrading.

      • by AK Marc ( 707885 )
        So an article saying that people don't notice or care about framerate is your link for complaints about low framerate?

        I'm "budget". Though I get people visiting for the first time that ask if my 720 TV is 4k.Gamers trained for years to recognize glitches may notice. Nobody else does. Good lighting, proper setup and wow someone with inferior content.
        • by MrL0G1C ( 867445 )

          I recently bought a nice big 4K Philips TV... and then sent it back. It was rubbish: the contrast was poor, the colours weren't good, it couldn't handle 4K 60 fps properly, and the menus were a bad joke; it took about 11 button presses to get to the brightness setting. There was no backlight control. By default all profiles had 'sharpness' on; god, that thing is an abomination that should be banned. Why anyone would want to deliberately screw up their picture with it is beyond me.

          • by AK Marc ( 707885 )
            That's my finding as well: a high-quality unit, well set up, is better than a "better spec" unit that's set up less than optimally. My 720p plasma, six years old, gets compliments all the time, and I never bring it up until someone else does. Whenever I go to the home of a 4K user, I know it: "How do you like my 4K?" If it were any good, you wouldn't have to point it out to everyone.
            • > My 720p plasma that's 6 years old, gets compliments all the time.

              I'm not surprised. Next to OLED, plasma's superior viewing angles kick the shit out of LCDs/LEDs. Combine that with deep blacks, a good gamut, and a physical black border around the display (an old contrast trick), and there isn't even any contest.

              Note: I'm a plasma man too. I picked up one of the last 1080p Panny's (TC-P60VT60) right before they went out of stock.

              Are you on AvsForum by chance?

              • by AK Marc ( 707885 )
                Mine's an LG 55" 3D Plasma. I'm not on AvsForums. I don't really care about it, I just set up my stuff to work the best I can, and noticed the differences in comments between my setup and others'.
    • by kuzb ( 724081 )

      Yes, absolutely they matter, especially with 4K starting to become a serious contender to 1080p in the PC space.

    • GPUs are increasingly being used for general purpose computation and rendering, not just playing games. For these purposes more computing power is always welcome, in the same way as faster CPUs and more/faster RAM and storage. For example, would you rather process this data set in 1 day or 2?

      The display part won't benefit from indefinite improvements, as the human eye has its limits. But for everything behind the display, there's always more computing to be done.

    • by eth1 ( 94901 )

      Do video card upgrades even matter anymore?

      Nowadays it seems you can run almost any game at high settings with old cards. And the games still don't look as good as old Elder Scrolls mods, which you need CPU and RAM for. Everything is console-level now.

      "Muh frames per second are slightly better than your frames per second" is just stat bullshit nowadays when the settings are otherwise the same.

      The ONLY reason I'm still running 1920x1200 instead of 4k is that driving the monitor at native resolution for gaming would require too much spent on graphics card hardware.

      Most likely, even this new card isn't enough for a single-card setup at 4k.

      (says the guy who spent the last week playing Dwarf Fortress...)

    • It's a marketing gimmick; my friend's AMD 5770 can still run most games at 60 fps, haha. IMO the only games that really have problems with old cards are crap GameWorks titles. I hope Cyberpunk doesn't use GameWorks like The Witcher 3 did; it did nothing but cause problems.
    • For VR they definitely do. A lot.

  • Time to upgrade from this now totally obsolete, highly OCed GTX 980 Ti, I guess. The upgrade gods demand their blood sacrifice :)
    • Wait for reviews. Based on the chart they showed, I predict that the GTX 1080 is only 15-20% better than a typical non-reference 980 Ti in the $600-$700 range.
      If you have a 980 Ti with high clocks, you should just wait for the presumed 1080 Ti / Titan Whatever and AMD's Vega.

      Buying the non-flagship part is a sucker's game.

      • Meh, the flagship typically has the worst price/performance. But it's rarely worth trading down in the lineup: if you have last generation's flagship, stick with it or buy the new flagship.

      • by Elledan ( 582730 )
        I got the MSI Seahawk version of the GTX 980 Ti. It's (probably still) the fastest GTX 980 Ti out there, with the integrated watercooling keeping things cool. I think I'll be fine with it for another year or two at least.

        It'll be interesting to see how much of an improvement Pascal will be relative to the previous gen of Nvidia GPUs, in particular among the flagship models. The 1080 Ti better be amazing with what they have been promising :)
  • NVIDIA CEO Jen-Hsun Huang claimed the new GeForce GTX 1080 is faster than a pair of GeForce GTX 980 cards in SLI and faster than the company's very expensive Titan X graphics card but at half the price.

    That's nice, but it doesn't help the me of 13 months ago. Also, if we're being honest here, this is more an indicator that they charge too much for their products.

  • by account_deleted ( 4530225 ) on Saturday May 07, 2016 @02:56PM (#52067599)
    Comment removed based on user account deletion
    • by MrL0G1C ( 867445 )

      I don't see the point of having 32GB of RAM; games won't use it. Do you have some special application that will?

  • Let's see what AMD will say about that. And whether there will be usable open source drivers (for either manufacturer's 2016 GPU lineup).
    • by aliquis ( 678370 )

      Let's see what AMD will say about that. And whether there will be usable open source drivers (for either manufacturer's 2016 GPU lineup).

      I can promise you AMD won't release a single-GPU Polaris 10 graphics card with HBM2 right now.

      Both companies will likely release theirs in 2017.

    • by aliquis ( 678370 )

      Or well, by "2017" I mean "later."
      Neither of them will release HBM2 cards this summer.
      And AMD's Polaris 10 may not keep up with the fastest of the Nvidia cards.

      You'll have to wait for the replacement of the Fury cards.

    • It seems there won't be much competition in the 2016 GPU lineups:
      - AMD is doing small to medium chips, their 2016 Polaris line will cover notebooks and midrange desktop GPUs.
      - Nvidia is doing fairly high end chips at higher prices.
      In 2016, some people might ask themselves whether they want "Polaris 10" (the bigger of the AMD chips) or to spend more money on the GTX 1070. But for most, the choice will be easy.

      2017 will be more interesting, with AMD's Vega competing against the just-released GTX 1080.

    • At least not in any commercial quantity, so a company can't release a retail card with it; they just couldn't make enough. All they could do is a paper announcement, as nVidia did with their compute Pascal. If AMD wishes to launch a card soon, it will likely have to use either GDDR5(X) or HBM1, since there just aren't the HBM2 modules out there for it.

      Remember there's a non-trivial lag time between a company developing a technology and managing to produce it on a commercial scale.

  • I've been saving to buy a 980 Ti, but this is kind of interesting.

    So is this 1080 faster for less money than the 980 Ti? Looking at one manufacturer's specs for the 1080 versus their 980 Ti (overclocked edition), the 1080 has more memory and higher clock speeds but also fewer CUDA cores; a rough throughput comparison is sketched below.

    I'm going to be interested to see what the end-user reviews are when it's available.
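
    As a rough way to compare them before reviews land: shader throughput scales with cores x clock (x2 FLOPs per core per cycle for fused multiply-add), which is how fewer CUDA cores at much higher clocks can still win. The figures below are the public reference specs; real boost clocks vary per card, so treat this as a sketch, not a benchmark:

    ```cpp
    // Rough single-precision throughput: cores x clock x 2 (FMA).
    #include <cstdio>

    int main() {
        struct Gpu { const char* name; int cores; double boostGhz; };
        const Gpu gpus[] = {
            {"GTX 980 Ti", 2816, 1.075},
            {"GTX 1080",   2560, 1.733},
        };
        for (const Gpu& g : gpus) {
            double tflops = g.cores * g.boostGhz * 2.0 / 1000.0;
            std::printf("%-10s: %4d cores @ %.3f GHz -> ~%.1f TFLOPS\n",
                        g.name, g.cores, g.boostGhz, tflops);
        }
        return 0;
    }
    ```

    That comes out to roughly 6.1 TFLOPS for the reference 980 Ti versus 8.9 TFLOPS for the 1080, though an overclocked 980 Ti narrows the gap.
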

  • In an age where most CPUs have a TDP of 65 W or less, why does it seem every add-on graphics card has a TDP that starts at 65 W and goes up to 250-400 W?

    • For example, a 65-watt card may have 640 processing units, while a 250-watt card on the same (or almost the same) tech might have 3072 processing units.
      It's as if Intel sold you a 20-core consumer CPU that uses up to 250 watts; they don't, but it would be physically possible.
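
      Putting rough numbers on that hypothetical (the parent's 640 units at 65 W and 3072 at 250 W; the clocks are made up for illustration): power tracks the number of units you light up, and the wide-and-slow card can even come out ahead per watt:

      ```cpp
      // Illustrative perf-per-watt math; unit counts from the parent post,
      // clocks invented for the example, not measured specs.
      #include <cstdio>

      int main() {
          struct Card { const char* name; int units; double ghz; double watts; };
          const Card cards[] = {
              {"small (65 W-class)",  640,  1.1,  65.0},
              {"large (250 W-class)", 3072, 1.0, 250.0},
          };
          for (const Card& c : cards) {
              double throughput = c.units * c.ghz;  // arbitrary "unit-GHz"
              std::printf("%s: %6.0f unit-GHz, %5.1f unit-GHz/W\n",
                          c.name, throughput, throughput / c.watts);
          }
          return 0;
      }
      ```
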

    • by Agripa ( 139780 )

      CPU power dissipation is limited by semiconductor die size unless junction temperature is increased, which lowers operating life and reliability. Since about the start of the Core 2 series, CPU die sizes have dropped, so power has had to drop as well. GPU makers use a different trade-off, sacrificing operating life and reliability for performance: they use larger semiconductor dies, and they run even higher power levels on top of that. In this respect AMD and nVidia have been in a race to the bottom with nVidia

      • The bottom line is you need a 500+ watt PSU to service your video card, not the rest of the PC. It's crazy.

        • by Agripa ( 139780 )

          For reliability reasons, the power supply should be significantly derated anyway. The manufacturers have gotten really good at designing them so that they fail just out of warranty because the aluminum electrolytic capacitors wear out.
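
          To put numbers on that derating point (the 60% target load is a common rule of thumb, not a spec, and the component wattages are example figures):

          ```cpp
          // PSU sizing sketch: worst-case draw divided by a derating target.
          #include <cstdio>

          int main() {
              const double gpu        = 250.0;  // high-end graphics card
              const double rest       = 150.0;  // CPU, board, drives, fans
              const double targetLoad = 0.60;   // run the PSU well below rating
              double worstCase   = gpu + rest;
              double recommended = worstCase / targetLoad;
              std::printf("Worst-case draw: %.0f W\n", worstCase);
              std::printf("Recommended PSU: ~%.0f W\n", recommended);
              return 0;
          }
          ```

          Which lands at roughly a 670 W unit, exactly the "500+ watt" territory the grandparent complains about.
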

  • I've seen a bunch of reports that a typically-overclocked GTX 980 Ti is just as fast as the GTX 1080. Are those people just blowing smoke? Or is nVidia just jerking off here?

  • I remember when I read that the 980 had just gotten Linux support... heh... that's when I still had a 760. Kinda happy I didn't buy a 980, since it got CUDA support so late. I love the speed of the 1080, but since I use Linux exclusively for 3D rendering and video editing, I'm going to hold off until there's decent support for it.
