'Moore's Law Is Dead,' Says Nvidia CEO (marketwatch.com) 116
Nvidia Chief Executive Jensen Huang's remarks about Moore's Law from earlier this week: "Moore's Law's dead," Huang said, referring to the standard that the number of transistors on a chip doubles every two years. "And the ability for Moore's Law to deliver twice the performance at the same cost, or at the same performance, half the cost, every year and a half, is over. It's completely over, and so the idea that a chip is going to go down in cost over time, unfortunately, is a story of the past." He added: "Computing is not a chip problem, it's a software and chip problem."
Says Nvidia CEO Says (Score:2)
"Says Nvidia CEO Says"...
needs some copyediting.
Re: Says Nvidia CEO Says (Score:3)
Department of Redundancy Department took Friday off.
Re: (Score:2)
Re: (Score:2)
They twice took it off twice off.
Re: (Score:2)
You should have said "needs some copyediting needs".
Re: (Score:2)
needs some copyediting.
That would be a helpful suggestion - if there were a copy editor. Or indeed an editor of any kind.
Unfortunately, intelligent meat is far too expensive these days.
Re: (Score:1)
Re: (Score:2)
Unfortunately, intelligent meat is far too expensive these days.
No need for meat when silicon will suffice. An automatic grammar checker would have caught the error. E.g., Grammarly flags it.
Re: (Score:2)
"Says" is Jensen's alter ego. Because he always Says things. So now they just call him Says instead.
Re: (Score:1)
https://www.youtube.com/watch?v=SiLY0WAdulI
Pay more, expect less? (Score:2)
that seems to be the takeaway.
I reckon if we had a look at true inflation in the stuff the average person needs, it has to have skyrocketed lately.
Re:Pay more, expect less? (Score:4, Interesting)
Re:Pay more, expect less? (Score:5, Insightful)
Re: (Score:2)
Is it, though?
Wait a little bit. We'll get there soon.
Re: (Score:2)
Wait a little bit. We'll get there soon.
Not likely. There are still billions of underutilized people working in rice paddies and corn fields. We can shift production from China to Africa and Bangladesh.
By the time Africa stops having babies, the robots will be ready.
Re: (Score:3)
That's a perfectly fine price to pay for not drowning and for not driving most of the rest of the species extinct.
The population decline needs to go on for at least two centuries for ecosystems to survive.
Inflation and belt-tightening are a small price.
Re: (Score:2)
Re: (Score:2)
I understand. But most countries have a social safety net now.
There will no doubt be upheavals. It is still a small price to pay for not drowning and not having mass extinctions.
Just as you urge me to look at it from the perspective of the less fortunate, look at it from an ecological perspective and of other species. For them, it is a genocide. We tend to ignore that as long as we have our bread and circuses.
Re: (Score:3)
The social safety net is actually one of the things that are at risk. For instance, I'm in Ontario, Canada, where we have single-payer, government-run healthcare. I don't pay to see a doctor or to go to a hospital. During COVID a whole bunch of doctors and nurses took early retirement, and we're simply not graduating enough doctors and nurses to take their place, so we're now in a position where it's hard to find a family doctor taking new patients, and some rural hospitals have had to occasionally clos
Re: (Score:3)
Or some of the folks who would have made a career flipping burgers and sweeping floors will be trained as skilled nurses while robots do the unskilled labor.
Improvements in health care should help as well, and by that I mean fewer diseases from less pollution and living better for longer. If the population pyramid starts inverting, we just need the younger population to become more productive at taking care of the economy and the older population. At some point humanity will find a good balance point with
Re: (Score:2)
Giant solar arrays? There are already enough around me; acres of what used to be nice grassy fields now hold these hideous angled panels that still have to be mowed around. So instead of a farmer using the field for a crop, some landscaper has to mow it once a week with a gas-powered mower. Or, like around here, they put some sheep in a solar array without any food or water and they starved to death.
Or do you want windmills everywhere? The ugly eyesores that destroy the mountain tops. Kill birds that fl
Re: (Score:2)
There are plenty of medical graduates from developing countries queuing up to become doctors in Canada and elsewhere.
They will need a little retraining, and there may be a small dip in quality, but I don't think the supply of medical professionals is an issue, given modest compromises.
Just addressing the obesity epidemic with nutritional education will reduce large strains on western healthcare.
So far there has not been adequate political will to place the blame where it belongs. I don't think the hope and focus
Re: (Score:2)
First of all, until the countries we're accepting immigrants from start implementing national professional accreditation systems that we can trust in Canada, there's always going to be a very long and painful process to become accredited in Canada. Even in the engineering field there's ample evidence that the vast majority of the universities in some countries appear to be diploma mills. We've had to add a lot more technical testing to our interview process to try to figure out which "electrical engineeri
Re: (Score:2)
I agree with the accreditation problems you list. Yes, the degrees have been watered down and there are many diploma mills.
I don't think healthcare costs will come down through simple automation. I happen to know US healthcare, and the problems aren't as simple as you think. Yes, it's a mess and it's going to stay a mess. Automation tends to introduce new problems in healthcare. For example, with the bar code systems you mentioned, there is much research literature on how those systems did not succeed. Fixing it is not s
Re: (Score:2)
Link did not render.
tinyurl.com/3swyk44y
Re: (Score:2)
The vast majority of people can't handle a 5% drop in compensation without losing it. Imagine when the whole population of most nations on Earth goes through a long, drawn-out decline in the standard of living.
You misunderstand the economic consequences of population decline.
1st World living standards may decline (or grow more slowly) because there are no longer masses of 3rd World workers to exploit.
But 3rd World living standards will RISE as there is a higher demand for the labor they provide.
There will be a leveling of global income, with the rich losing and the poor gaining.
It has happened before. The Black Death in the 14th century caused a population decline. The value of labor soared, and the wealth of the
Re: (Score:2)
Re: Pay more, expect less? (Score:5, Informative)
He says it's not a chip problem but a chip/software problem. In other words, he's saying that in this day and age, Java is not the solution to our problems; Java is the problem.
Re: (Score:2)
I think you meant JavaSCRIPT....
Re: (Score:2)
Basically any interpreted language is a performance problem.
For hardcore performance there's C and assembly programming done by skilled programmers (the ones normal programmers consider wizards).
For the extremists: if they could, they'd be modifying the microcode.
Re: (Score:2)
Basically any interpreted language is a performance problem.
Never mind that Java has GIT compilers that save code as machine instructions and re-use it over and over, so it's just like running C code.
Java isn't a good choice for short-lived program execution in pipes like "ps axw | grep postgres", but for long-lived server processes, once warmed up (GIT has put everything in machine format), performance is at least 95% of C performance and can even be faster than poorly written C.
You may also fully pre-compile a Java program to machine format before starting
Re: (Score:2)
I'm aware of the JIT compiler (not GIT compiler), but even the code it's generating has tradeoffs compared to native compilers.
Java isn't an interpreted language; it's first compiled to byte code, which is then compiled to native code at hotspots during execution. Over time, more and more of it gets compiled to native code.
One drawback with Java and also C# is the memory management, which also costs performance over what you could code in C and assembly, so I'd classify C# and Java as quite a bit better th
Re: Pay more, expect less? (Score:2)
Interpreted languages give you instant feedback on whether the code works; not just that it builds correctly.
Re: (Score:2)
The code may work, but the system it executes in may stop working. With a compiled system, all references are checked for consistency.
Imagine the fun to locate a problem that occurs only at leap years because someone optimized the code in an interpreted language (e.g. removed a field considered redundant). It can take almost four years for the bug to show up because the person changing it didn't check for all callers and run tests for all callers because it was an urgent fix and then that pers
Re: Pay more, expect less? (Score:2)
It would take about 5 seconds to find a bug like that in an interpreted language.
Re: Pay more, expect less? (Score:2)
It sounds like you're arguing for Rust. You get the expressiveness of Java and C#, and even more guarantees against the presence of bugs. C#, and especially Java, have a nasty habit of throwing runtime errors in unexpected places. Less so than interpreted languages, but way more so than Rust.
Re: (Score:2)
You don't have to be a wizard to write good C or ASM code. It just takes non-wizards who are also good programmers a lot longer to do it.
Re: (Score:3)
"Moore's law is the observation that the number of transistors in a dense integrated circuit (IC) doubles about every two years."
It does not necessarily state that performance increases at the same pace. Doubling the number of transistors can have other benefits, like higher precision.
Performance in computing and performance in goods delivery today is basically limited by similar factors - logistics.
Since most computers are generalists and not designed for a specific problem they have limits that a dedi
Re: (Score:1)
Re: Pay more, expect less? (Score:2)
Nah there's a lot of shit in the stack of pretty much everything. I just singled out Java because it's a big pile of shit that just keeps getting bigger as time goes on, with no end in sight.
Re: Pay more, expect less? (Score:2)
Not exactly. Java is a symptom of a much greater problem: Too many people write shitty code using shitty frameworks. It's not just the Java runtime/vm that's shitty, it's the libraries too. The whole thing is just crap. It's planned like shit, and it's implemented like shit.
Take a look at the optional type for example. I was using it for about 30 minutes before realizing just how utterly pointless it is. Good concept that current generation languages make good use of, but Java royally fucked it anyways, jus
Re: (Score:3)
It's not the language that's the problem, it's the developers. You can write fast efficient code in Java, JavaScript or pretty much any language really.
Unfortunately, though, most working developers today think that being a good developer means being familiar with all the esoteric features of abominably heavyweight frameworks such as Spring. They don't think about clock cycles, stack depth, the complexity of the object graphs they are creating, or any of the stuff that developers from the last century had to ca
Re:Pay more, expect less? (Score:5, Informative)
It's not because they feel like making you pay more - the reality is that chip technology has been getting harder to keep pushing faster and faster lately. We're starting to hit up against some real problems with physics.
You're still seeing the average cost of chips go down while seeing their performance go up - it's just the ratio of price decrease and performance increase that has changed.
And honestly most of the interesting work (for me anyways) is going towards chip efficiency rather than simple performance. If chip A can do 60% of the performance of chip B at only 10% of the power consumption, then I'm still more interested in chip A even though it's the slower of the two.
Re: (Score:3)
If chip A can do 60% of the performance of chip B at only 10% of the power consumption, then I'm still more interested in chip A even though it's the slower of the two.
Some people are willing to sacrifice performance for efficiency, especially until battery technology can catch up. Unfortunately, I am not. I upgrade my laptop about every 5 years, which usually means twice the speed. That stopped being the case about 5 years ago, when new laptops became slower than 3-4 year old ones because manufacturers started optimizing for battery life instead. The only way to get a faster computer today than what was sold 5 years ago is to buy a gaming laptop.
Re: (Score:2)
This stopped being the case about 5 years ago, when new laptops became slower than 3-4 year old ones because manufacturers started optimizing for battery life instead. The only way to get a faster computer today than what was sold 5 years ago is to buy a gaming laptop.
They still make laptops optimized for performance today. They just also make laptops optimized for efficiency as well. And no, you don't have to buy a gaming laptop to do it. There are plenty of powerful laptops on the market geared towards business or personal use.
Re: (Score:2)
Re: (Score:2)
Get a rack server and connect to it from any laptop you find somewhere.
That depends on how much you want to pay the mobile Internet provider per month to move data back and forth between the two.
Re: (Score:2)
Re: (Score:2)
Battery technology isn't going to "catch up". There have been no serious advances in technological understanding, merely in industrial capability, for battery tech in the past 40+ years. The capabilities have changed, absolutely - but what we have on the horizon is geared more towards deep cycle storage, not portability.
There is nothing on the horizon for lightweight portable power density increases which really matter.
The solution, currently, to get a faster system is better silicon. ARM based chips are go
Re: (Score:2)
Unless you mean "the invention of literal magic" in which case there have not been and never will be any "advances."
Re: (Score:2)
Unless you mean "the invention of literal magic" in which case there have not been and never will be any "advances."
It wouldn't need to be magic, just something considerably better than what we currently have.
The energy density of a lithium battery is about 0.3 MJ/kg.
On the other hand, gasoline is about 47.5 MJ/kg, hydrogen is about 120 MJ/kg, and uranium is about 5,184,000 MJ/kg.
Those are consumables but there is no reason to believe that a much better battery technology with much better energy densities could not be invented.
It wouldn't have to be what we think of as a battery today to be useful. A micro hydrogen fuel ce
Re: (Score:2)
It's not because they feel like making you pay more - the reality is that chip technology has been getting harder to keep pushing faster and faster lately. We're starting to hit up against some real problems with physics.
Or perhaps we're just hitting an even larger problem with sales and unending greed.
It's not merely hard to sell the next-gen iPhone with the same "bionic" chip in it as the previous model. It's damn near impossible. Same goes for a lot of tech-n-marketing products.
And when sales fall, stock prices fall.
Also known as seemingly the only damn thing that's important in business today.
If a pile of dogshit made stock prices rise, they'd sell a pile of dogshit. Air freshener is extra.
Re: (Score:2)
New iPhones are easy to sell. Give it a new color, establish that color of iPhone as an indicator of social prestige, and your job is done.
The difficulty is how to sell a low-end product that competes on price. Nobody's going to buy a new $100 Chinese phone that doesn't offer better performance or features than the one they still have from a few years ago.
Re: (Score:2)
New iPhones are easy to sell. Give it a new color, establish that color of iPhone as an indicator of social prestige, and your job is done.
Yes. You've addressed the tech demand among 12-17 year-olds. The rest of the market does care about paying more and getting less, especially if you're pushing out hardware far faster than your phone is dying or losing support. Otherwise they wouldn't even bother with Bionic+1 every version.
The difficulty is how to sell a low-end product that competes on price. Nobody's going to buy a new $100 Chinese phone that doesn't offer better performance or features than the one they still have from a few years ago.
Given some rather obscene profit margins with certain tech, there's an easier way to sell a good product, at a lower price. Stop promising shareholders the moon every quarter.
Re: (Score:2)
True, the spectre of physical limits is very close now. But, without violating NDAs, there are smaller processes available than what he's currently using. He's not dumb nor ignorant. They're coming around slower, and they're way more mealy mouthed about the advertised geometry, but we're not there yet. So while what he's saying might be technically true, I think he's still trying to justify his larger take on the pc system cost budget. Certainly most of the audience who cares about the 4090 would rather spe
Re: (Score:2)
It's not because they feel like making you pay more - the reality is that ...
In this case, Nvidia's profit margin went from 25% to 85%.
Re: (Score:2)
The cost going down has a lot to do with economies of scale... we'll still have price decreases as the newer technologies become more widely adopted - but the performance gains between each generation will start to diminish.
Kinda sucks for people like GPU or CPU manufacturers, but hey... maybe game developers will start optimizing titles a little ;)
Re: (Score:2)
It's not because they feel like making you pay more - the reality is that chip technology has been getting harder to keep pushing faster and faster lately. We're starting to hit up against some real problems with physics.
You're still seeing the average cost of chips go down while seeing their performance go up - it's just the ratio of price decrease and performance increase that has changed.
And honestly most of the interesting work (for me anyways) is going towards chip efficiency rather than simple performance. If chip A can do 60% of the performance of chip B at only 10% of the power consumption, then I'm still more interested in chip A even though it's the slower of the two.
This. Given how power hungry the 30xx series was and the 40xx is reportedly even worse, efficiency is something we need more of.
However, I suspect Moore's law is going to be replaced with Huang's law: every 18 months the price of GPUs will double.
Re: (Score:2)
And by saying this, and repeating it more and more, they ensure that when it does happen we will all know it was coming; that is how they are going to justify their cost increase.
1. Not making e
Pay More, Get Less. (Score:3)
"Computing is not a chip problem, it's a software and chip problem."
Oh, you mean like:
"Video cards are not a gaming product, it's a gaming and crypto product."
I highly doubt prices are coming down when we get 2018 bang for 2022 bucks, so this was a polite way of saying pay more, get less.
Re: (Score:2)
The Ethereum merge has taken the wind out of GPU mining, at least for the moment (and forever, I hope, though if the SEC decides that proof-of-stake is a securities-based offering, it could make mining profitable again). Scrounging whatever is left over from scripts ordering everything for months at a time should be a thing of the past.
Re: (Score:2)
The Ethereum merge has taken the wind out of GPU mining
Two years of "but it's not really the miners!!"
Yes, it was. Fuck off now. Thanks.
Also, fuck you Ethereum.
Re: (Score:2)
Re: (Score:3)
I can think of an example by looking at how popular AVX has been among "some" programmers that I've met.
Those "some" outright refuse to work with such instructions because restructuring their logic to at least partially work as SIMD is such a bother.
Thus they hope that more powerful CPUs will execute their scalar logic faster in the future instead of reworking what could make sense to be done in vector logic into vector log
Re: (Score:1)
The Array of Structures strategy is what every goddamn programmer does by default, and when you are lucky it sort of comes close to optimal under a restricted view.
The Structure of Arrays strategy is actually optimal, and quite easy to program for, but it requires rearranging all your data specifically for this, and that has consequences.
In AoS you have a thing class with a position vector and you opportunistically use some SIMD for processing it. In SoA, you cannot ha
Re: Pay More, Get Less. (Score:2)
I'll take tags over data tables any day of the week
GPU's are crazy expensive, yet... (Score:2)
Finally. (Score:5, Interesting)
Next up: Low Power Computing and algorithm-centric optimisation, just like in the good old days when every bit and every cycle was valuable.
When I see how these young whippersnappers waste processing power on yet another layer of docker-node-container-buildup-teardown nonsense just to change the label on a button on the web, it breaks my heart. ...
Now please excuse me, I have to chase some kids off my lawn.
Re: (Score:2)
Re: (Score:2)
There's a job security element, too.
Re:Finally. (Score:5, Insightful)
Why this is a lesson we needed to learn again is beyond me. Waste is bad. Don't be wasteful.
I've been screaming about this for years. Just think how much faster and smoother things could be today if we had even a little discipline.
Re: (Score:2)
Re: (Score:2)
Wasteful projects are generally larger and more complex than their less wasteful equivalents. Consequently, they waste significant amounts of developer time. The more Enterprise Ready [github.com] your code gets, in fact, the more developer time it is likely to waste.
What he really means to say (Score:4, Insightful)
Crypto is dead, and so is consumers' tolerance for inflated prices.
Re: (Score:2)
Strangely enough, the intolerance for high prices isn't stopping NVidia from overcharging for their new mid-tier models.
The MSRP for a 12GB GeForce 3080 is $900, while you can get a used GeForce 3090 with twice as much VRAM for less than that on the used market.
Re: (Score:2)
Crypto is dead, and so is consumers' tolerance for inflated prices.
I'd wait it out before making that claim. There are plenty of wealthy gamers with disposable income, just like there were plenty of gamers who happily dropped $1,000+ on a 3090, not because they needed it but because they wanted a new card and the 3080s were unavailable.
Rich people exist, and gaming even at these prices is a pretty damn cheap hobby compared to most others. My wife's sewing machine cost more than a 4090; so did a friend's fishing rod. And let's not talk about cars or motorbikes.
Insa
One law that should concern Nvidia... (Score:1)
Their biggest market has most of their circuits topped-out at 1800W.
Neat (Score:2)
But I want to see the AMD (and Intel, if they don't screw up) take on this.
Competition is always a pretty fun thing.
It's an excuse (Score:2)
They're really trying to push a narrative here. Sure, their newest products are expensive. There are a lot of people who haven't been able to buy for the last several years. Maybe they shouldn't extort their customer base by only offering the highest performing options. The GTX 1650 is STILL not available for MSRP after several years. It's probably cheap to keep making these by now but they're only going to produce the most profitable and tell potential customers that maybe they just can't afford a GPU
what about 4-5 nm processes? (Score:2)
Moores law is about same for same (Score:2)
focus on power consumption ... (Score:2)
Fact is, for a large percentage of tasks, we reached a perfectly adequate speed probably a decade ago - what we really need now, is a double down on energy saving instead of a continuing focus of more raw processing power.
This fact is borne out by the glaringly obvious stats that people aren't buying new devices as often as they did, simply because the processing power of those devices is vastly underutilised for the majority of tasks.
We can see it pretty much _everywhere_ - that raw processing power now,
... and the art of software optimisation? (Score:2)
Just to add to my little ranty topic, as any old grey beard tech enthusiast software dev will tell you, software has become BLOATED.
The rise of raw processing power has also seen a rise in lazy coding.
The days of software engineers coming up with incredible feats of engineering to squeeze the most out of a limited system? Largely gone.
It's now libraries on top of libraries - layers upon layers of code to ease the burden of programming, at the expense of pure efficiency.
Take a trip back through history, t
Re: (Score:2)
Back to barebone programming in C and Fortran without a gigaton of frameworks.
Re:... and the art of software optimisation? (Score:4, Insightful)
It's now libraries on top of libraries - layers upon layers of code to ease the burden of programming, at the expense of pure efficiency.
That's a good thing. Why spend the scarce and nonrenewable resource of the time of a smart programmer on doing something that a processor can do anyway: make the code run faster? If the code runs "fast enough" (and of course there can be disagreements on what that means) then it's better to put that time to use doing something else such as improving security or adding features.
Re: (Score:2)
Yes, embedded devices are a different story. I was thinking of server, desktop, and mobile applications. For most of those, developer time is far more valuable than processor time.
Re: (Score:1)
when a 10 year old graphics card, is still capable of running most modern games at playable frame rates (with lower quality settings, sure)
Nvidia 600 series was 10 years ago. Please tell me what modern video games you can play on a 600 series card.
One of the major mining companies (Score:2)
Sure thing, Nvidia. You can tell your stockholders whatever you want. Not sure the SEC agrees with that, though. Didn't you already get your hand slapped for understating your dependency on crypto?
Re: (Score:2)
Point of diminishing returns (Score:2)
Dupe is in the headline dupe (Score:1)
Depends on how you define chip... (Score:2)
The traditional definition of chip with respect to Moore's law is a single die.
Moore's law basically implied the transistor density on the wafer would double every 18 months, so either a die of the same size would be twice as performant, or you could get the same performance out of a die half the size (and so at half the cost, assuming constant wafer cost).
If you defined chip as the physical package (not just the die) then you could argue AMD's chiplet process is kind of extending Moore's law. They are sticking mor
Re: (Score:2)
It didn't really. Moore's law is considerably more subtle than that. He basically observed that the price vs. element count curve is U shaped, with a definite optimum, and that optimum moves according to the familiar "Moore's Law."
Assuming Nvidia tends to target their GPUs at that optimum, the 4000 series is pretty close to obeying Moore's law versus the 3000 series. The 4090 was released two years after the 3090 a
Eh... It had a good run. (Score:2)
It's been sketchy for a long time. Anybody defending it had to tie themselves into technical knots to shoehorn it in.
Awesome (Score:1)
So my next GPU purchase will be final?
Haven't felt that way, since my Amiga.
Moore Clickbait (Score:2)
We've known for a while that the exponential nature of Moore's law cannot be sustained forever - but who care
Analog chips? (Score:2)
Does switching to analog chips count as the software part of the problem? Some of the new neural engine chips based on analog circuits dramatically outperform GPUs in calculations per unit of power.
Time to switch to AMD (Score:2)
I don't know, the things that I've been hearing out of NVidia lately kinda make me not want to buy Nvidia products anymore if I can avoid it.
Meanwhile Huang LIVE from 2019 at CES (Score:1)
Eh (Score:2)
I always thought of it as "Moore's Strongly Worded Suggestion" anyway...