Nvidia Says New GPUs Won't Be Available For a 'Long Time' (pcgamer.com)

Nvidia chief executive Jensen Huang said this week at Computex that people should not get their hopes up for any major GPU upgrades in the company's lineup in the foreseeable future. From a report: When asked when the next-gen GeForce would arrive, Jensen quipped, "It will be a long time from now. I'll invite you, and there will be lunch." That was it for discussions of the future Turing graphics cards, but that's hardly a surprise. Nvidia doesn't announce new GPUs months in advance -- it will tell us when it's ready to launch. Indications from other sources, including graphics card manufacturers, are that the Turing GPUs will arrive in late July at the earliest, with August/September for lower-tier cards and custom designs.

Comments Filter:
  • The existing product is selling out as is - why waste time/effort releasing anything new right now? (See also "Intel" until AMD started to kick its ass with the Ryzen chips.)
    • The existing product is selling out as is - why waste time/effort releasing anything new right now?

      They're developing a bunch of new stuff, the issue is that progress isn't as fast paced as it has been in recent years so putting out a new architecture every year just isn't worth it. Presently it's still difficult to get a hold of a TitanV when you're on the inside much less anything Turing-related.

    • by AHuxley ( 892839 )
      To support 4K HDR displays at 144 Hz with G-Sync.
      4K at 144 Hz is the next thing for consumers to buy into: a new display, a new GPU, and a new CPU to keep enjoying games.
  • Please continue to buy our 10xx series cards. Do not wait for the next generation, and definitely don't wait for AMD's next release.
    • Yeah... meanwhile the rumored release date for GeForce GTX 1180 graphics cards is July 30: http://www.guru3d.com/news-story/rumor-geforce-gtx-1180-to-launch-july-30.html
      • Sure, that's the rumored release date. Odds are you won't be able to actually get your hands on one for anywhere near MSRP until October.

        The only people getting them in August will likely be the people who preorder the overpriced "Founders Edition" or "Special Edition" cards, and the crypto miners will probably snap up most of the stock that's available in September.

        If I were Nvidia's CEO, I'd probably say that they're not going to be available for a while, either.

  • by Anonymous Coward

    Graphics have been "good enough" for max settings in 1080p gaming since at least the 7-series and nothing is driving 4K adoption.

    • by Moof123 ( 1292134 ) on Tuesday June 05, 2018 @02:25PM (#56732322)

      The 60 Hz limit at 4K in DP1.2 and HDMI 2.0 has made 4K a real trade-off compared to 1080p gaming.

      There really is some chicken-and-egg stuff going on between having a card good enough to drive 4K and having good monitors that can do 100+ Hz at 4K. Even today, G-Sync capable screens are painfully more expensive than their vanilla counterparts.
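A back-of-the-envelope calculation shows why those links cap out at 60 Hz. This is a sketch, not from the comment; the ~20% blanking overhead is an assumed round figure, and the effective link rates are the published payload rates after 8b/10b coding:

```python
# Why DP1.2 / HDMI 2.0 cap 4K at 60 Hz: raw pixel bandwidth vs. link payload.
def payload_gbps(width, height, hz, bits_per_pixel=24, blanking=1.2):
    """Approximate link payload needed, with ~20% overhead for blanking intervals."""
    return width * height * hz * bits_per_pixel * blanking / 1e9

LINKS = {                     # effective payload after 8b/10b line coding, Gbps
    "HDMI 2.0":    14.4,      # 18.0 Gbps raw
    "DP 1.2 HBR2": 17.28,     # 21.6 Gbps raw
    "DP 1.4 HBR3": 25.92,     # 32.4 Gbps raw
}

for hz in (60, 144):
    need = payload_gbps(3840, 2160, hz)
    fits = [name for name, cap in LINKS.items() if cap >= need]
    print(f"4K @ {hz} Hz needs ~{need:.1f} Gbps -> fits: {fits or 'none'}")
```

4K60 at 8-bit RGB squeaks in at roughly 14.3 Gbps, which is why HDMI 2.0 barely manages it; 4K at 144 Hz needs well over 30 Gbps, beyond even DP 1.4's HBR3 without chroma subsampling or Display Stream Compression.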

    • Stellaris doesn't support 4K yet

    • > Graphics have been "good enough" for max settings in 1080p

      That's debatable.

      At 60 fps, yes.

      At 120 fps, depends on the game. Left 4 Dead, Minecraft: yes. ARK, Dark Souls, etc.: no.

      Too bad PCs still get shitty console 30 fps ports.

    • New GPUs don't necessarily have to drive performance forward or even have a demand for yet more performance. You can use process improvements due to Moore's law and improved designs to produce a smaller GPU capable of the same level of performance as the current parts. This makes your product cheaper and increases profits assuming the price remains fixed. Alternatively it allows you to drop the price as well, which may increase consumption and overall net profit.

      The real reason that NVidia feels no press
      • You can use process improvements due to Moore's law and improved designs to produce a smaller GPU capable of the same level of performance as the current parts.

        And/or get the same performance at lower power consumption.

        OMG, what did I say! I'm a tree-hugging hippy cawmnust!

        • I see nothing wrong with lower power consumption for the same performance, but if it's not better than my current card performance-wise, I'm not shelling out for it. That's what drives improvements: upgrades. There's a finite amount of people willing to pay the equivalent of $800 USD and up for the highest-end cards. Power consumption isn't really on their list of "things I'd pay that kind of money for."
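The die-shrink economics mentioned in this thread (same performance from a smaller die means cheaper parts) can be sketched with a toy yield model. Every number below is hypothetical, chosen only to illustrate the mechanism; the exponential yield formula is the standard Poisson defect model:

```python
import math

# Toy model: cost per good die before and after a die shrink.
def cost_per_good_die(die_mm2, wafer_cost, defects_per_cm2=0.1, wafer_diam_mm=300):
    wafer_area = math.pi * (wafer_diam_mm / 2) ** 2
    dies = int(wafer_area / die_mm2 * 0.9)              # ~10% lost to edge/scribe lines
    y = math.exp(-defects_per_cm2 * die_mm2 / 100)      # Poisson yield model
    return wafer_cost / (dies * y)

big   = cost_per_good_die(die_mm2=471, wafer_cost=7000)  # hypothetical large die
small = cost_per_good_die(die_mm2=235, wafer_cost=7000)  # same design after a shrink
print(f"large die: ${big:.0f}/good die, shrunk die: ${small:.0f}/good die")
```

Halving the die area more than halves the cost per good die, because the smaller die both fits more copies per wafer and yields better. That's the margin Nvidia could bank (or pass on) even without a performance bump.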

  • by RyanFenton ( 230700 ) on Tuesday June 05, 2018 @02:17PM (#56732270)

    If you're making a series of things, each replacing the last in the market, and your current one is selling at a high rate, and there's nothing that's going to cause it to spoil... you don't bring in the next item in the series.

    You save it for when sales drop off, after you've been forced to drop prices, so the new item can be the new high-price thing.

    If prices aren't falling, there's no room for the new replacement.

    So yeah, until the stamp-collecting, I mean the random-number-sifting coin market cools down, any new video cards won't have any actual payoff for Nvidia.

    Which is actually fine for me. Having game developers compete on actual content and ideas more instead of graphics churn is actually more to my liking. Well, except for when the accountants/managers also have time to toy around with recurring payment concepts, or DRM ideas.

    Ryan Fenton

  • What’s the point of new cards when miners will buy them all. Better to wait until the bubble bursts before releasing them.
    • What's the point of making a new product if some group of people (will just buy them all)? That will only result in more money. What's the point of money if all it does is purchase things you want?
    • by mentil ( 1748130 )

      The GPU mining bubble already burst and GPU availability/prices are pretty much back to normal, at least for Nvidia.

  • For now (Score:5, Insightful)

    by Impy the Impiuos Imp ( 442658 ) on Tuesday June 05, 2018 @02:27PM (#56732340) Journal

    "No GPUs for a long time... foreseeable future..."

    "Late July..."

    Consumer electronics is more development cycle-compressed than ever.

    • by zlives ( 2009072 )

      I can only foresee within 30 days, thus late July is definitely the unforeseeable future. Heck, who knows, maybe AMD can have a product out that would change that timeline for nV.

    • They said WW1 would be over by Christmas.

      And in a way, it was.

  • Geforce 8800 GTX (Score:4, Informative)

    by Hadlock ( 143607 ) on Tuesday June 05, 2018 @02:38PM (#56732408) Homepage Journal

    Nvidia released the GeForce 8800 GTX in 2006 and that was effectively the fastest card until around 2010... they just milked the architecture and re-re-released it under different names. I had the 8500, which was released a year later as the 9500.

    Then the 200/300/400 series, the 5/6/7/8/900 series, and finally they're at the 1040/50/60/70/80 series. Expect the cards to be warmed over next spring with the 1140/50/60/70/80... their product cycle is years long, and this has been true for decades.

    • There's nothing wrong with milking the architecture as long as the performance / price point increases. Developing new architectures is not trivial.

  • by Anonymous Coward

    A 1070, which is at best a $150 video card, is selling for $500 on the street right now thanks to bitcoin-mining crackheads. There's no reason to release a new and improved product as long as your existing garbage is selling for 3x what it should cost.

  • by rsilvergun ( 571051 ) on Tuesday June 05, 2018 @03:00PM (#56732584)
    and they're once again, however briefly, forced to compete.
  • Wow, I'm really surprised Intel didn't buy them when they had the chance, considering this is classic Intel behavior.
  • I told them not to outsource manufacturing to Tesla.

  • I really suspect that Nvidia will focus on the high-margin AI / computer vision markets, as they pay much better than consumer GPUs.

    AMD, on the other hand, is already preparing several revolutionary generations of CPUs and GPUs based on TSMC's 7nm process to be released in 2019, with the second-generation 14nm chips coming off the line this summer.

    So it is likely that AMD will dominate market share for the GPU market while Nvidia will continue to report record revenues even while losing GPU market share.

I've finally learned what "upward compatible" means. It means we get to keep all our old mistakes. -- Dennie van Tassel
