Nvidia Says New GPUs Won't Be Available For a 'Long Time' (pcgamer.com) 98
Nvidia chief executive Jensen Huang said this week at Computex that people should not get their hopes up for any major GPU upgrades in the company's lineup in the foreseeable future. From a report: When asked when the next-gen GeForce would arrive, Jensen quipped, "It will be a long time from now. I'll invite you, and there will be lunch." That was it for discussions of the future Turing graphics cards, but that's hardly a surprise. Nvidia doesn't announce new GPUs months in advance -- it will tell us when it's ready to launch. Indications from other sources, including graphics card manufacturers, are that the Turing GPUs will arrive in late July at the earliest, with August/September for lower-tier cards and custom designs.
Re: (Score:3)
Computer, bring up Celery Man.
Why would they want to ship new product? (Score:2)
Re:Why would they want to ship new product? (Score:4, Insightful)
You just failed business 101.
New cards should be held back as long as current cards are selling well and there's no serious competition. Anything else is throwing away R&D money.
(because you'll immediately be forced to start spending money on the next generation card)
Re: (Score:2)
So do you fire all your engineers while you're not shipping new product? Why pay people to sit around and do nothing?
Re:Why would they want to ship new product? (Score:5, Insightful)
Just because you haven't rolled out generation 3 doesn't mean you can't start the work on generation 4. It just means you can take more time to make sure they are GOOD, or you complete the design for gen 4 and start on gen 5. If you CAN be several generations ahead of your competition, and are ready for surprises, you should be. An example of where this did NOT happen was with Intel vs. the most recent generation of AMD chips. AMD came out stronger than anybody expected, and Intel didn't yet have a set of designs to put to the fab that would compete.
Re: (Score:2)
Plus: Each generation is exponentially more difficult to design.
Giving existing engineers more time to work between generations might be a necessary thing, not a luxury.
Re: (Score:2)
Re:Why would they want to ship new product? (Score:4, Interesting)
An example of where this did NOT happen was with Intel vs. the most recent generation of AMD chips. AMD came out stronger than anybody expected
Well that's bullshit. It was known ahead of time that AMD was moving to a smaller node and would thus make up the entire difference. Anybody that didn't know doesn't know shit and should never be listened to on these subjects. Seriously. Apparently that includes you.
Additionally, Intel did move several generations ahead with their Core architecture, even getting to 14nm while the competition was still on 28nm. Intel's failing was not neglecting to seize the moment; it is the very point on the table. Intel's R&D failure was in trying to get their lucky, ahead-of-its-time 14nm 3D tri-gates into the realm of practicality at 10nm, which is so apparently not possible that Intel has completely abandoned their 3D tri-gate effort, the very thing that put them so far ahead to begin with.
Not only is Intel at best at parity, they will be a node behind by this time next year.
And before people start saying "but Intel's 14nm is better than others"... it's bullshit. Intel are the ones that invented lying about process size, and their old 14nm 3D tri-gates are not as good as anyone's current 14nm or 16nm processes, which is why Intel had to switch back to the traditional transistor designs everyone else is using that require fewer lithography steps.
Re: (Score:2)
Well that's bullshit. It was known ahead of time that AMD was moving to a smaller node and would thus make up the entire difference.
No. It was known ahead of time that AMD was moving to a smaller node. That is it. The end of anything that anyone knew ahead of time. AMD has had quite a recent history of miserable product releases barely staying relevant in the face of the competition even with its discounted pricing model.
That they came out with what they did, and wiped the floor with Intel was a surprise to absolutely everyone in the industry.
Re: (Score:2)
What you have admitted is that (a) you didn't know in the past that AMD wasn't moving to a competitive process size, and thus the poor performance was a surprise to you. And also (b) you didn't know in the present that AMD was moving to a competitive process size, and thus the good performance was a surprise to you.
The common factor here is you not knowing what was important to know. Even in the face of you talking about Intel's tick-tock repeatedly in the past, you still didn't
Re: (Score:2)
I don't understand why you're salivating over competitive process size. No one gives a shit about process size, as it is only a small factor in performance. And yeah, it caught me by surprise, as well as all the market analysts and all the technical writers out there. Maybe you're clairvoyant? Quick, give me 6 numbers between 1 and 42.
Even in the face of you talking about Intel's tick-tock repeatedly in the past
Errr right. You must have me confused with that other mythical thegarbz who has talked about tick-tock repeatedly in the past, because I sure as hell haven't, and if I would have
Re: (Score:2)
You do whatever makes most money for the company.
Engineers are easy to keep happy.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
If the production cost of a new card can be reduced then you can most likely reduce the cost of the previous-gen card just as easily. Maybe give a small clock speed boost to gain some sales. This already happens with the "Ti" versions of Nvidia graphics cards, etc.
The point is that you release as few new features as possible, hold back as many as you can for the next-gen.
CPUs have pretty much plateaued, people are starting to say that GPUs are plateauing, and a smart company will do anything that extends sales.
Re: (Score:2)
If the production cost of a new card can be reduced then you can most likely reduce the cost of the previous-gen card just as easily.
This all doesn't follow. The primary driver of production costs at the high end is the cost of fab time. Some fabs are now producing 10nm chips, but time on these fabs costs much more than time on the previous generation's 14/16nm node, while time on much older nodes like 28nm is now dirt cheap. Some of the 28nm fabs have even been shut down for conversion to smaller-node production already. Nvidia's flagships are currently on 14nm / 16nm, and the only way to make them much cheaper is for someone to open
Re: (Score:3)
Having taken Strategic Business Analysis (I think it was Business 630 or something):
Both methods have merit and costs.
Being on the leading edge and keeping a strong lead over your competitors can make sure you are holding a solid lead in your market. However, it could cause a lot of expense in reworking your manufacturing process over and over again. A lot of R&D expense isn't so much about how to make a newer and faster chip, but how to produce it at a price people are willing to pay for it. (This is the m
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Because their friends will laugh at them.
(Posted from a machine using a GeForce 6600. My friends don't laugh at me; I don't have any).
Re: (Score:2)
Re: (Score:2)
You just failed business 101.
New cards should be held back as long as current cards are selling well and there's no serious competition. Anything else is throwing away R&D money.
(because you'll immediately be forced to start spending money on the next generation card)
Business 101 for all the companies wiped out by more nimble newcomers to the market.
Re: (Score:2)
It has a distinct whiff of Kodak about it. Are you picking up traces of sunk cost fallacy too?
Re: (Score:2)
You start working on the new product before you need the new product, which means before the current product's sales go to hell in a handbasket.
Otherwise you'll have this huge lag where you aren't selling much, and you don't have a new product to launch to recover the sales.
The advantage of having a longer possible development time means you can do more quality testing and performance tweaks. You can add new features that would take longer to develop. And here's a biggie, you ca
Re: (Score:2)
The existing product is selling out as is - why waste time/effort releasing anything new right now?
They're developing a bunch of new stuff; the issue is that progress isn't as fast-paced as it has been in recent years, so putting out a new architecture every year just isn't worth it. Presently it's still difficult to get hold of a Titan V when you're on the inside, much less anything Turing-related.
Re: (Score:2)
4K at 144 Hz for the consumer to buy into. New display, new GPU, and a new CPU to keep enjoying games.
Re: We're at the end of Moore's law (Score:2)
Actually, I was in the room when he indicated they would demonstrate Moore's Law - squared - as it applies to GPU capabilities, for the next 5 years.
Sit back and watch fiction become reality.
Re: (Score:2)
Then it will be 5K for people creating 4K content.
Then 8K support.
Then more resolution to work with on 8K content.
Then the 5K, 8K games.
Re: (Score:2)
Fuck everything, we're doing 16!
Party line (Score:2)
Re: (Score:2)
Re: (Score:2)
Sure, that's the rumored release date. Odds are you won't be able to actually get your hands on one for anywhere near MSRP until October.
The only people getting them in August will likely be the people who preorder the overpriced "Founders Edition" or "Special Edition" cards, and the crypto miners will probably snap up most of the stock that's available in September.
If I were Nvidia's CEO, I'd probably say that they're not going to be available for a while, either.
Translation: market penetration of 4K too low (Score:2, Insightful)
Graphics have been "good enough" for max settings in 1080p gaming since at least the 7-series and nothing is driving 4K adoption.
Re:Translation: market penetration of 4K too low (Score:5, Insightful)
The 60 Hz limit at 4K of DP 1.2 and HDMI 2.0 has made 4K a real trade-off compared to 1080p for gaming.
There really is some chicken-and-egg stuff going on between having a card good enough to drive 4K and having good monitors that can do 100+ Hz at 4K. Even today the G-Sync capable screens are painfully more expensive than their vanilla counterparts.
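To put rough numbers on that bandwidth ceiling, here's a back-of-the-envelope sketch (my own arithmetic, not from the thread) comparing the uncompressed RGB data rate of 4K at a few refresh rates against the nominal effective payload rates of HDMI 2.0, DP 1.2, and DP 1.3/1.4. Blanking intervals and protocol overhead are ignored, so the real requirements are somewhat higher:

```python
# Back-of-the-envelope sketch: which links can carry uncompressed 4K RGB
# at a given refresh rate. Nominal payload rates after 8b/10b line coding.
LINKS_GBPS = {
    "HDMI 2.0": 14.4,
    "DP 1.2": 17.28,
    "DP 1.3/1.4": 25.92,
}

def required_gbps(width, height, hz, bits_per_pixel=24):
    """Uncompressed RGB pixel data rate in Gbit/s (active pixels only)."""
    return width * height * hz * bits_per_pixel / 1e9

for hz in (60, 120, 144):
    need = required_gbps(3840, 2160, hz)
    fits = [name for name, cap in LINKS_GBPS.items() if cap >= need]
    print(f"4K @ {hz:3d} Hz needs ~{need:.1f} Gbps; fits on: "
          f"{', '.join(fits) if fits else 'nothing (needs subsampling or DSC)'}")
```

4K at 60 Hz fits the older links (only just on HDMI 2.0 once blanking is added back), while 4K at 144 Hz overshoots even DP 1.3/1.4, which lines up with the first high-refresh 4K monitors falling back to chroma subsampling.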
Re: (Score:2)
Re:Translation: market penetration of 4K too low (Score:4, Informative)
> Your eyes cannot resolve more than about 50fps anyway.
Bullshit. [testufo.com]
And yes, you need a 120+ Hz monitor to see the difference.
Re: (Score:2)
First, all of us who used CRT monitors at > 60Hz agree with you.
Second, that site's seriously cool. It complained that I opened it on a side monitor instead of my main.
Re: (Score:2)
A better example:
https://boallen.com/fps-compar... [boallen.com]
Re: (Score:2)
That's not really a better example. It doesn't show the difference between 60 fps and 120 fps.
For the difference between 24 fps and 60 fps, RED, the maker of high-end cameras, has these two clips:
OWE my eyes @ 24 fps [cachefly.net]
Silky smooth @ 60 fps [cachefly.net]
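The visible difference in those clips mostly comes down to frame time and how far a moving object jumps between consecutive frames. A tiny sketch of that arithmetic, using an assumed 1920-pixel-wide screen and a one-second traversal purely for illustration:

```python
# Illustration only (assumed numbers): an object crossing an assumed
# 1920-pixel-wide screen in one second, rendered at different frame rates.
SCREEN_WIDTH_PX = 1920
CROSSING_TIME_S = 1.0

for fps in (24, 30, 60, 120):
    frame_time_ms = 1000.0 / fps
    jump_px = SCREEN_WIDTH_PX / (CROSSING_TIME_S * fps)
    print(f"{fps:3d} fps: {frame_time_ms:5.1f} ms per frame, "
          f"object jumps ~{jump_px:3.0f} px between frames")
```

The 80-pixel steps at 24 fps are the judder the first clip shows; at 120 fps the steps shrink to 16 pixels, which is why motion keeps looking smoother well past 60 Hz.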
Re: (Score:2)
It is because you can clearly see the choppiness of the falling square against a black background. You may have poor eyesight or poor visual integration, i.e. the filtering your brain is applying may make it look better than it actually appears to those of us with keener vision.
Re: (Score:2)
Your eyes cannot resolve more than about 50fps anyway.
Reality disagrees:
https://boallen.com/fps-compar... [boallen.com]
Re: (Score:2)
People who do more than 60hz are ID:10Ts. Your eyes cannot resolve more than about 50fps anyway.
Your eyes can resolve some 300 fps, but you can't usually resolve a 4K screen at most sizes and distances. Right now screens lag behind human vision in refresh rate and dynamic range, and even 120 Hz HDR is far from the human limits.
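The resolution half of that claim is easy to sanity-check. Assuming (my numbers, for illustration) a 27-inch 16:9 monitor and the common rule of thumb that 20/20 acuity is roughly one arcminute, i.e. about 60 pixels per degree:

```python
import math

# Sanity check under assumed conditions: a 27-inch 16:9 monitor viewed at a
# few desktop distances, against the ~60 pixels/degree acuity rule of thumb.
DIAGONAL_IN = 27.0
WIDTH_CM = DIAGONAL_IN * (16 / math.hypot(16, 9)) * 2.54   # ~59.8 cm wide

def pixels_per_degree(horizontal_pixels, distance_cm):
    """Pixels packed into one degree of visual angle at the screen centre."""
    cm_per_degree = 2 * distance_cm * math.tan(math.radians(0.5))
    return horizontal_pixels / WIDTH_CM * cm_per_degree

for name, h_px in (("1080p", 1920), ("4K", 3840)):
    for dist_cm in (50, 70, 90):
        ppd = pixels_per_degree(h_px, dist_cm)
        verdict = "past the acuity limit" if ppd > 60 else "still resolvable"
        print(f"{name:>5} at {dist_cm} cm: {ppd:5.1f} px/deg ({verdict})")
```

On those assumptions 4K is already at or past the acuity limit at normal desktop distances, while refresh rate and dynamic range are where displays still clearly trail the eye, which is roughly the point being made.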
Re: (Score:2)
Stellaris doesn't have 4K yet
Re: (Score:2)
> Graphics have been "good enough" for max settings in 1080p
That's debatable.
At 60 fps, yes.
At 120 fps, depends on the game. Left 4 Dead, Minecraft, yes. ARK, Dark Souls, etc., no.
Too bad PCs still get shitty console 30 fps ports.
Re: (Score:3)
The real reason that NVidia feels no press
Re: (Score:2)
And/or get the same performance at lower power consumption.
OMG, what did I say! I'm a tree-hugging hippy cawmnust!
Re: (Score:2)
I see nothing wrong with lower power consumption for the same performance, but if it's not better than my current card performance wise, I'm not shelling out for it. That's what drives improvements. Upgrades. There's a finite amount of people willing to pay the equivalent of $800 USD and up for the highest end cards. Power consumption isn't really on their list of "things I'd pay that kind of money for"
Re: (Score:2)
Indeed - the demand curve is broken... (Score:3)
If you're making a series of things, each replacing the last in the market, and your current one is selling at a high rate, and there's nothing that's going to cause it to spoil... you don't bring in the next item in the series.
You save it for when sales drop off, after you've been forced to drop prices, so the new item can be the new high-price thing.
If prices aren't falling, there's no room for the new replacement.
So yeah, until the stamp collecting, I mean the random-number-sifting coin market cools down - any new video cards won't have any actual payoff for NVidia.
Which is fine for me. Having game developers compete more on actual content and ideas instead of graphics churn is more to my liking. Well, except for when the accountants/managers also have time to toy around with recurring payment concepts, or DRM ideas.
Ryan Fenton
Re:Indeed - the demand curve is broken... (Score:4, Insightful)
there is no competition to drive performance...
Re: (Score:2)
Not really, the Volta architecture is available, and while it's a bit difficult you can certainly get hold of a Titan V. The issue really is that its performance for gaming isn't mind-blowing compared to a current Titan X, so there's little reason to upgrade; its performance for workloads that can leverage the tensor cores is incredible, but very few games are going to be able to do much with that.
People doing machine learning (or things that leverage fast inferencing) didn't care about gaming-focuss
It’s because of miners (Score:2)
Re: (Score:3)
Re: (Score:2)
The GPU mining bubble already burst and GPU availability/prices are pretty much back to normal, at least for Nvidia.
For now (Score:5, Insightful)
"No GPUs for a long time. .forseeable future..."
"Late July..."
Consumer electronics is more development cycle-compressed than ever.
Re: (Score:2)
I can only foresee within 30 days, thus late July is definitely the unforeseeable future. Heck, who knows, maybe AMD can have a product out that would change that timeline for nV.
Re: (Score:2)
They said WW1 would be over by Christmas.
And in a way, it was.
Re: (Score:3)
and take a good chunk of the gaming GPU market away from both Nvidia and AMD in 2019/2020
LOL. No.
The fledgling hardware raytracing movement is cool to watch, but it's nowhere near replacing rasterization. Even now it exists as a hybrid solution with enough problems that adoption is nil.
Ultimately, it's game developers that will drive hardware raytracing. Awesome hardware without the game industry targeting your hardware leads to dead ends. How *is* your 3dfx GPU doing, anyway?
Geforce 8800 GTX (Score:4, Informative)
Nvidia released the GeForce 8800 GTX in 2006 and it was effectively the fastest card until around 2010... they just milked the architecture, re-re-releasing it under different names. I had the 8500, which was released a year later as the 9500.
Then the 200/300/400 series, the 5/6/7/8/900 series, and finally they're at the 1040/50/60/70/80 series. Expect the cards to be warmed over next spring with the 1140/50/60/70/80... their product cycle is years long, and this has been true for decades.
Re: (Score:2)
There's nothing wrong with milking the architecture as long as the performance / price point increases. Developing new architectures is not trivial.
No reason to release new stuff (Score:1)
A 1070, which is at best a $150 video card, is selling for $500 on the street right now thanks to bitcoin-mining crackheads. There's no reason to release a new and improved product as long as your existing garbage is selling for 3x what it should cost.
Re: (Score:2)
Well not until AMD releases theirs (Score:3)
Intel should have bought them (Score:2)
I told them (Score:2)
I told them not to outsource manufacturing to Tesla.
Expect Nvidia to distance themselves from GPU (Score:2)
I really suspect that Nvidia will focus on the high-margin AI / computer vision markets, as those pay much better than gaming GPUs.
AMD, on the other hand, is already preparing several revolutionary generations of CPUs and GPUs based on TSMC's 7nm process to be released in 2019, with the second-generation 14nm chips coming off the line this summer.
So it is likely that AMD will dominate market share for the GPU market while Nvidia will continue to report record revenues even while losing GPU market share.