Hardware Technology

'Moore's Law Is Dead,' Says Nvidia CEO (marketwatch.com) 116

Nvidia Chief Executive Jensen Huang's remarks about Moore's Law from earlier this week: "Moore's Law's dead," Huang said, referring to the standard that the number of transistors on a chip doubles every two years. "And the ability for Moore's Law to deliver twice the performance at the same cost, or at the same performance, half the cost, every year and a half, is over. It's completely over, and so the idea that a chip is going to go down in cost over time, unfortunately, is a story of the past." He added: "Computing is not a chip problem, it's a software and chip problem."
  • "Says Nvidia CEO Says"...

    needs some copyediting.

  • That seems to be the takeaway.

    I reckon if we had a look at true inflation in the stuff the average person needs, it has to have skyrocketed lately.

    • by RobinH ( 124750 ) on Friday September 23, 2022 @01:14PM (#62908401) Homepage
      For the last 40 years we got used to an extremely rapid increase in computing power at cheaper prices every year, and also to rapidly increasing production capacity in emerging markets, particularly China. Both of those are just over now. The former is simply due to physics, and the latter [wikipedia.org] is due to the one-child policy in combination with the correlation between a rising standard of living and choosing to have fewer children. All you little Thanoses in the audience thought the solution to all our problems was to stop having so many kids. Well, welcome to the consequence: a shrinking working population supporting a bigger retired population, which leads to high inflation. Get used to it.
      • by AmazingRuss ( 555076 ) on Friday September 23, 2022 @01:41PM (#62908521)
        Better that than a starving population fighting over scraps.
        • by CAIMLAS ( 41445 )

          Is it, though?

          Wait a little bit. We'll get there soon.

          • Wait a little bit. We'll get there soon.

            Not likely. There are still billions of underutilized people working in rice paddies and corn fields. We can shift production from China to Africa and Bangladesh.

            By the time Africa stops having babies, the robots will be ready.

      • by jma05 ( 897351 )

        That's a perfectly fine price for not drowning and for not making most of the rest of the species extinct.
        The population declines need to go on for at least two centuries for the survival of ecosystems.
        Inflation and tightening our belts are a small price.

        • by RobinH ( 124750 )
          It doesn't matter that you and I can tolerate it. The vast majority of people can't handle a 5% drop in compensation without losing it. Imagine when the whole population of most nations on Earth goes through a long, drawn-out decline in the standard of living. Yeah, it needs to happen, but coupled with the recent trend of de-globalization, it's going to lead to a lot of violence and wars. You don't bomb your customers, but if your neighbour stops trading with you, why not go take their stuff?
          • by jma05 ( 897351 )

            I understand. But most countries have a social safety net now.
            There will no doubt be upheavals. It is still a small price to pay for not drowning and not having mass extinctions.
            Just as you urge me to look at it from the perspective of the less fortunate, look at it from an ecological perspective and that of other species. For them, it is a genocide. We tend to ignore that as long as we have our bread and circuses.

            • by RobinH ( 124750 )

              The social safety net is actually one of the things that are at risk. For instance, I'm in Ontario, Canada, where we have single-payer, government-run healthcare. I don't pay to see a doctor or to go to a hospital. During COVID a whole bunch of doctors and nurses took early retirement, and we're simply not graduating enough doctors and nurses to take their place, so we're now in a position where it's hard to find a family doctor taking new patients, and some rural hospitals have had to occasionally close...

              • by sodul ( 833177 )

                Or some of the folks who would have made a career flipping burgers and sweeping floors will be trained as skilled nurses while robots do the unskilled labor.

                Improvements in health care should help as well, and by that I mean fewer diseases from less pollution, and living better for longer. If the population pyramid starts inverting, we just need the younger population to become more productive at taking care of the economy and the older population. At some point humanity will find a good balance point with...

              • Giant solar arrays? There already are enough around me; acres that used to be nice grassy fields have these hideous angled panels, which still have to be mowed around. So instead of a farmer using the field for a crop, some landscaper has to mow it once a week with a gas-powered mower. Or, like around here, they put some sheep in a solar array without any food or water and they starved to death.
                Or do you want windmills everywhere? The ugly eyesores that destroy the mountain tops. Kill birds that fly...

              • by jma05 ( 897351 )

                There are plenty of medical graduates from developing countries queuing up to become doctors in Canada and elsewhere.
                They will need a little retraining and there may be a small dip in quality, but I don't think the supply of medical professionals is an issue with modest compromises.
                Just addressing the obesity epidemic with nutritional education will reduce large strains on western healthcare.
                So far there has not been adequate political will to place the blame where it belongs. I don't think the hope and focus...

                • by RobinH ( 124750 )

                  First of all, until the countries we're accepting immigrants from start implementing national professional accreditation systems that we can trust in Canada, there's always going to be a very long and painful process to become accredited in Canada. Even in the engineering field there's ample evidence that the vast majority of the universities in some countries appear to be diploma mills. We've had to add a lot more technical testing to our interview process to try to figure out which "electrical engineering...

                  • by jma05 ( 897351 )

                    I agree with the accreditation problems you list. Yes, the degrees have been watered down and there are many diploma mills.

                    I don't think healthcare costs will come down by simple automation. I happen to know US healthcare, and the problems aren't as simple as you think. Yes, it's a mess and it's going to stay a mess. Automation tends to introduce new problems in healthcare. For example, with the bar code systems you mentioned, there is much research literature on how those systems did not succeed. Fixing it is not simple...

          • The vast majority of people can't handle a 5% drop in compensation without losing it. Imagine when the whole population of most nations on Earth goes through a long, drawn-out decline in the standard of living.

            You misunderstand the economic consequences of population decline.

            1st World living standards may decline (or grow more slowly) because there are no longer masses of 3rd World workers to exploit.

            But 3rd World living standards will RISE as there is a higher demand for the labor they provide.

            There will be a leveling of global income, with the rich losing and the poor gaining.

            It has happened before. The Black Death in the 14th century caused a population decline. The value of labor soared, and the wealth of the...

      • The increase in production capacity doesn't have to end with China (PRC). Sure, China is now the supposed factory of the world. But there are other countries that have a labor pool that could compensate for the decrease in China's working-age population (India, the countries of Africa, etc.). Many production systems are also increasingly automated. So between these two, if there were a way to beat quantum physics, then we'd continue to have cheaper and faster chips.
    • by ArmoredDragon ( 3450605 ) on Friday September 23, 2022 @01:16PM (#62908409)

      He says it's not a chip problem but a chip/software problem. In other words, he's saying that in this day and age, Java is not the solution to our problems, Java is the problem.

      • I think you meant JavaSCRIPT....

        • by Z00L00K ( 682162 )

          Basically any interpreted language is a performance problem.

          For hard-core performance there's C and assembly programming performed by skilled programmers (those who are considered wizards by normal programmers).

          For the extremists - if they could then they'd be modifying the microcode.

          • by ls671 ( 1122017 )

            Basically any interpreted language is a performance problem.

            Never mind that Java has GIT compilers that save code as machine instructions and re-use it over and over, so it's just like running C code.

            Java isn't a good choice for short-lived program execution in pipes like "ps axw | grep postgres", but for long-lived server processes, once warmed up (GIT has put everything in machine format), performance is at least 95% of what C performance is, and it can even be faster than poorly written C.

            You may also fully pre-compile a Java program to machine code before starting...
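
            A minimal sketch of the warm-up effect described above (my illustration, not the poster's code; the method and iteration counts are made up, and for real measurements you'd use a harness such as JMH):

                // WarmupDemo.java -- rough illustration of JIT warm-up; not a rigorous benchmark.
                public class WarmupDemo {
                    // A hot method the JIT will eventually compile down to native code.
                    static long sumOfSquares(int n) {
                        long s = 0;
                        for (int i = 0; i < n; i++) s += (long) i * i;
                        return s;
                    }

                    static long timeOnceNanos(int n) {
                        long t0 = System.nanoTime();
                        long r = sumOfSquares(n);
                        long t1 = System.nanoTime();
                        if (r == 42) System.out.println();  // keep the result observable
                        return t1 - t0;
                    }

                    public static void main(String[] args) {
                        int n = 10_000_000;
                        System.out.printf("cold:   %d us%n", timeOnceNanos(n) / 1_000);
                        for (int i = 0; i < 50; i++) timeOnceNanos(n);  // let the JIT warm up
                        System.out.printf("warmed: %d us%n", timeOnceNanos(n) / 1_000);
                    }
                }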

            • by Z00L00K ( 682162 )

              I'm aware of the JIT compiler (not a GIT compiler), but even the code it's generating has tradeoffs compared to native compilers.

              Java isn't an interpreted language; it's first compiled to bytecode, which is then compiled to native code at hotspots during execution. Over time more and more of it will be compiled to native code.

              One drawback with Java, and also C#, is the memory management, which costs performance compared to what you could code in C and assembly, so I'd classify C# and Java as quite a bit better than...

              • Interpreted languages give you instant feedback on whether the code works, not just whether it builds correctly.

                • by Z00L00K ( 682162 )

                  The code may work, but the system it executes in may stop working. When you have a compiled system, all references are checked for consistency.

                  Imagine the fun of locating a problem that occurs only in leap years because someone optimized the code in an interpreted language (e.g. removed a field considered redundant). It can take almost four years for the bug to show up, because the person changing it didn't check all callers or run tests for all callers, because it was an urgent fix, and then that person...

              • It sounds like you're arguing for Rust. You get the expressiveness of Java and C#, and even more guarantees against the presence of bugs. C#, and especially Java, have a nasty habit of throwing runtime errors in unexpected places. Less so than interpreted languages, but way more so than Rust.

          • You don't have to be a wizard to write good C or ASM code. It just takes non-wizards who are also good programmers a lot longer to do it.

      • by Z00L00K ( 682162 )

        "Moore's law is the observation that the number of transistors in a dense integrated circuit (IC) doubles about every two years."

        It does not necessarily state that performance increases at the same pace. Doubling the number of transistors can have other benefits, like higher precision.

        Performance in computing and performance in goods delivery today are basically limited by a similar factor: logistics.

        Since most computers are generalists and not designed for a specific problem, they have limits that a dedicated...

      • Of course he is talking about graphics cards, so this has nothing to do with Java (unless you are planning to use a 4090 to play Minecraft).
        • Nah there's a lot of shit in the stack of pretty much everything. I just singled out Java because it's a big pile of shit that just keeps getting bigger as time goes on, with no end in sight.

      • by nut ( 19435 )

        It's not the language that's the problem, it's the developers. You can write fast efficient code in Java, JavaScript or pretty much any language really.

        Unfortunately, though, most working developers today think that being a good developer means being familiar with all the esoteric features of abominably heavyweight frameworks such as Spring. They don't think about clock cycles, stack depth, the complexity of the object graphs they are creating, or any of the stuff that developers from the last century had to care about...

    • by MBGMorden ( 803437 ) on Friday September 23, 2022 @01:22PM (#62908453)

      It's not because they feel like making you pay more - the reality is that chip technology has been getting harder to keep pushing faster and faster lately. We're starting to hit up against some real problems with physics.

      You're still seeing the average cost of chips go down while seeing their performance go up - it's just the ratio of price decrease and performance increase that has changed.

      And honestly most of the interesting work (for me anyways) is going towards chip efficiency rather than simple performance. If chip A can do 60% of the performance of chip B at only 10% of the power consumption, then I'm still more interested in chip A even though it's the slower of the two.

      • If chip A can do 60% of the performance of chip B at only 10% of the power consumption, then I'm still more interested in chip A even though it's the slower of the two.

        Some people are willing to sacrifice performance for efficiency, especially until battery technology can catch up. Unfortunately, I am not. I upgrade my laptop about every 5 years. This usually means twice the speed. This stopped being the case about 5 years ago, when new laptops became slower than 3-4 year old laptops because they started optimizing for battery life instead. The only way to get a faster computer today than what was sold 5 years ago is to buy a gaming laptop.

        • by Xenx ( 2211586 )

          This stopped being the case about 5 years ago, when new laptops became slower than 3-4 year old laptops because they started optimizing for battery life instead. The only way to get a faster computer today than what was sold 5 years ago is to buy a gaming laptop.

          They still make laptops optimized for performance today; they just also make laptops optimized for efficiency. And no, you don't have to buy a gaming laptop to get one. There are plenty of powerful laptops on the market geared towards business or personal use.

        • by short ( 66530 )
          Get a rack server and connect to it from any laptop you find somewhere. It is then powerful and silent.
          • by tepples ( 727027 )

            Get a rack server and connect to it from any laptop you find somewhere.

            That depends on how much you want to pay the mobile Internet provider per month to move data back and forth between the two.

            • by short ( 66530 )
              Most of the time I am at home, so no data charges. Otherwise, 99% of my work is over text SSH, so hardly any real data is transferred. And besides, data plans in the Czech Republic are now around USD 25 for unlimited data.
        • by CAIMLAS ( 41445 )

          Battery technology isn't going to "catch up". There have been no serious advances in technological understanding, merely in industrial capability, for battery tech in the past 40+ years. The capabilities have changed, absolutely - but what we have on the horizon is geared more towards deep cycle storage, not portability.

          There is nothing on the horizon for the lightweight, portable power-density increases that really matter.

          The solution, currently, to getting a faster system is better silicon. ARM-based chips are go...

          • Compare the power and energy density of a commercially available battery from 1980 with one from 2020, then come and say again that there haven't been any advances...

            Unless you mean "the invention of literal magic" in which case there have not been and never will be any "advances."
            • Unless you mean "the invention of literal magic" in which case there have not been and never will be any "advances."

              It wouldn't need to be magic just something considerably better than what we currently have.

              The energy density of a lithium battery is about 0.3 MJ/kg.
              On the other hand, gasoline is about 47.5 MJ/kg, hydrogen is about 120 MJ/kg, and uranium is about 5,184,000 MJ/kg.
              Those are consumables, but there is no reason to believe that a much better battery technology with much better energy densities could not be invented.
              It wouldn't have to be what we think of as a battery today to be useful. A micro hydrogen fuel cell...
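
              Taking the figures quoted above at face value (the constants below are the comment's numbers, not vetted values), the gap is easy to quantify:

                  // EnergyDensity.java -- ratios between the energy densities quoted above.
                  public class EnergyDensity {
                      public static void main(String[] args) {
                          double liIon    = 0.3;         // MJ/kg, lithium battery (comment's figure)
                          double gasoline = 47.5;        // MJ/kg
                          double hydrogen = 120.0;       // MJ/kg
                          double uranium  = 5_184_000.0; // MJ/kg
                          System.out.printf("gasoline vs li-ion: %.0fx%n", gasoline / liIon); // ~158x
                          System.out.printf("hydrogen vs li-ion: %.0fx%n", hydrogen / liIon); // ~400x
                          System.out.printf("uranium  vs li-ion: %.2ex%n", uranium / liIon);  // ~1.7e7x
                      }
                  }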

      • It's not because they feel like making you pay more - the reality is that chip technology has been getting harder to keep pushing faster and faster lately. We're starting to hit up against some real problems with physics.

        Or perhaps we're just hitting an even larger problem with sales and unending greed.

        It's not merely hard to sell the next-gen iPhone with the same "bionic" chip in it as the previous model. It's damn near impossible. Same goes for a lot of tech-n-marketing products.

        And when sales fall, stock prices fall.

        Also known as seemingly the only damn thing that's important in business today.

        If a pile of dogshit made stock prices rise, they'd sell a pile of dogshit. Air freshener is extra.

        • New iPhones are easy to sell. Give it a new color, establish that color of iPhone as an indicator of social prestige, and your job is done.

          The difficulty is how to sell a low-end product that competes on price. Nobody's going to buy a new $100 Chinese phone that doesn't offer better performance or features than the one they still have from a few years ago.

          • New iPhones are easy to sell. Give it a new color, establish that color of iPhone as an indicator of social prestige, and your job is done.

            Yes. You've addressed the tech demand among 12-17 year-olds. The rest of the market does care about paying more and getting less, especially if you're pushing out hardware far faster than phones are dying or losing support. Otherwise they wouldn't even bother with Bionic+1 every version.

            The difficulty is how to sell a low-end product that competes on price. Nobody's going to buy a new $100 Chinese phone that doesn't offer better performance or features than the one they still have from a few years ago.

            Given some rather obscene profit margins with certain tech, there's an easier way to sell a good product, at a lower price. Stop promising shareholders the moon every quarter.

      • True, the spectre of physical limits is very close now. But, without violating NDAs, there are smaller processes available than what he's currently using. He's neither dumb nor ignorant. They're coming around more slowly, and vendors are way more mealy-mouthed about the advertised geometry, but we're not there yet. So while what he's saying might be technically true, I think he's still trying to justify his larger take on the PC system cost budget. Certainly most of the audience who cares about the 4090 would rather spend...

      • It's not because they feel like making you pay more - the reality is that ...

        In this case, Nvidia's profit margin went from 25% to 85%.

      • The cost going down has a lot to do with economies of scale... we'll still have price decreases as the newer technologies become more widely adopted - but the performance gains between each generation will start to diminish.

        Kinda sucks for people like GPU or CPU manufacturers, but hey... maybe game developers will start optimizing titles a little ;)

      • by mjwx ( 966435 )

        It's not because they feel like making you pay more - the reality is that chip technology has been getting harder to keep pushing faster and faster lately. We're starting to hit up against some real problems with physics.

        You're still seeing the average cost of chips go down while seeing their performance go up - it's just the ratio of price decrease and performance increase that has changed.

        And honestly most of the interesting work (for me anyways) is going towards chip efficiency rather than simple performance. If chip A can do 60% of the performance of chip B at only 10% of the power consumption, then I'm still more interested in chip A even though it's the slower of the two.

        This. Given how power-hungry the 30xx series was, and that the 40xx is reportedly even worse, efficiency is something we need more of.

        However, I suspect Moore's law is going to be replaced with Huang's law: every 18 months, the price of GPUs will double.

      • by xystren ( 522982 )
        Sounds like a promo to let us know that they will be raising prices - they got so used to the crypto-craze high prices, and they want that to continue. So what better way than to state that something which has been "relatively" accurate to date is going to be invalid, and to say in a roundabout way, "We are going to charge more."

        And in doing this, and repeating it more and more, when it does happen, we will all know it was coming, and that is how they are going to justify their cost increase.

        1. Not making e...
  • by geekmux ( 1040042 ) on Friday September 23, 2022 @01:13PM (#62908397)

    "Computing is a not a chip problem, it's a software and chip problem."

    Oh, you mean like:

    "Video cards are not a gaming product, it's a gaming and crypto product."

    I highly doubt prices are coming down when we get 2018 bang for 2022 bucks, so this was a polite way of saying pay more, get less.

    • The Ethereum merge has taken the wind out of GPU mining, at least for the moment (and forever, I hope, though if the SEC decides that proof-of-stake is a securities-based offering, it could make mining profitable again). Scrounging whatever is left over from scripts ordering everything for months at a time should be a thing of the past.

      • by Tailhook ( 98486 )

        The Ethereum merge has taken the wind out of GPU mining

        Two years of "but it's not really the miners!!"

        Yes, it was. Fuck off now. Thanks.

        Also, fuck you Ethereum.

        • by Xenx ( 2211586 )
          It wasn't just the miners. It was a culmination of factors and miners were definitely one of them. But, the problem wouldn't have existed to nearly that extent if it was the only one. Pretty much all the factors normalizing is why there is now a glut.
    • by fazig ( 2909523 )
      Perhaps that too, but I think the problem lies somewhere else.

      I can think of an example by looking at how popular AVX has been among "some" programmers that I've met.
      Those "some" outright refuse to work with such instructions, because restructuring their logic to at least partially work as SIMD is such a bother.
      Thus they hope that more powerful CPUs will execute their scalar logic faster in the future, instead of reworking what could sensibly be done in vector logic into vector logic...
      • There are two types of SIMD strategy:

        The Array of Structures strategy, which is what every goddamn programmer does by default, and which, when you are lucky, sort of comes close to optimal under a restricted view.
        The Structure of Arrays strategy, which is actually optimal, and quite easy to program for, but requires rearranging all your data specifically for this, and that has consequences.

        In AoS you have a Thing class with a position vector, and you opportunistically use some SIMD for processing it. In SoA, you cannot have...
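
        A minimal Java sketch of the two layouts being contrasted (class and field names are made up for illustration). The SoA version walks flat primitive arrays, the shape an auto-vectorizer, or the jdk.incubator.vector API, can actually turn into SIMD; the AoS version chases object references instead:

            // AosVsSoa.java -- Array-of-Structures vs Structure-of-Arrays, sketched in Java.
            public class AosVsSoa {
                // AoS: the default everyone reaches for -- one object per thing.
                static class Particle { float x, y, vx, vy; }

                static void stepAos(Particle[] ps, float dt) {
                    for (Particle p : ps) { p.x += p.vx * dt; p.y += p.vy * dt; }
                }

                // SoA: one flat array per field; plain loops over these are
                // what the JIT's auto-vectorizer likes.
                static void stepSoa(float[] x, float[] y, float[] vx, float[] vy, float dt) {
                    for (int i = 0; i < x.length; i++) {
                        x[i] += vx[i] * dt;
                        y[i] += vy[i] * dt;
                    }
                }

                public static void main(String[] args) {
                    int n = 1_000_000;
                    Particle[] ps = new Particle[n];
                    for (int i = 0; i < n; i++) ps[i] = new Particle();
                    float[] x = new float[n], y = new float[n], vx = new float[n], vy = new float[n];
                    stepAos(ps, 0.01f);
                    stepSoa(x, y, vx, vy, 0.01f);
                }
            }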
  • It could be worse. I know everyone is bitching about the 4080 and 4090 costs, but Nvidia's H100 80GB costs $36,405. Next-generation AI (such as Stable Diffusion) requires crazy powerful hardware.
  • Finally. (Score:5, Interesting)

    by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Friday September 23, 2022 @01:31PM (#62908483)

    Next up: Low Power Computing and algorithm-centric optimisation, just like in the good old days when every bit and every cycle was valuable.

    When I see how these young whippersnappers waste processing power on yet another layer of docker-node-container-buildup-teardown nonsense just to change the label on a button on the web, it breaks my heart. ...
    Now please excuse me, I have to chase some kids off my lawn.

    • The whole microservice architecture thing seems designed to maximize power usage to me.
    • Re:Finally. (Score:5, Insightful)

      by narcc ( 412956 ) on Friday September 23, 2022 @03:33PM (#62908919) Journal

      Why this is a lesson we needed to learn again is beyond me. Waste is bad. Don't be wasteful.

      I've been screaming about this for years. Just think how much faster and smoother things could be today if we had even a little discipline.

      • by Zuriel ( 1760072 )
        Dev time isn't free, though. The extra six months the dev team takes to produce a less wasteful product is just a different kind of waste.
        • by narcc ( 412956 )

          Wasteful projects are generally larger and more complex than their less wasteful equivalents. Consequently, they waste significant amounts of developer time. The more Enterprise Ready [github.com] your code gets, in fact, the more developer time it is likely to waste.

  • by WaffleMonster ( 969671 ) on Friday September 23, 2022 @01:33PM (#62908489)

    Crypto is dead, and so is consumers' tolerance for inflated prices.

    • Strangely enough, the intolerance for high prices isn't stopping NVidia from overcharging for their new mid-tier models.

      The MSRP for a 12GB GeForce 3080 is $900, while a GeForce 3090 with twice as much VRAM goes for less than that on the used market.

    • Crypto is dead, and so is consumers' tolerance for inflated prices.

      I'd wait it out before making that claim. There are plenty of wealthy gamers with disposable income, just like there were plenty of gamers who happily dropped $1,000+ on a 3090, not because they needed it but because they wanted a new card and the 3080s were unavailable.

      Rich people exist, and gaming even at these prices is a pretty damn cheap hobby compared to most others. My wife's sewing machine cost more than a 4090; so did a friend's fishing rod. And let's not talk about cars or motorbikes.

      Insane...

  • Their biggest market has most of their circuits topped-out at 1800W.

  • by Z80a ( 971949 )

    But I want to see AMD's (and Intel's, if they don't screw up) take on this.
    Competition is always a pretty fun thing.

  • They're really trying to push a narrative here. Sure, their newest products are expensive. There are a lot of people who haven't been able to buy for the last several years. Maybe they shouldn't extort their customer base by only offering the highest-performing options. The GTX 1650 is STILL not available for MSRP after several years. It's probably cheap to keep making these by now, but they're only going to produce the most profitable options and tell potential customers that maybe they just can't afford a GPU.

  • We keep hearing about the manufacturing process going from 12nm to 5nm and lower. That's what gave AMD the edge to dethrone Intel in the CPU business. Same story with Apple silicon going to a smaller process for the M2 chip. What do we make of those?
    • Same for same, on the same-sized piece of silicon real estate. There is nothing there about cost - and Nvidia really means they want a return on pipeline and bus architecture, and pre-execution units. CPUs have beaten the light-wavelength hurdle, and vertically stacked finFETs. Anything smaller than 5nm will have yield concerns. The physics of heat dissipation have not changed. So yes, it has ended, but vertical stacking is still possible - though commercially, etching speed and other factors make this a dumb idea. Thus the...
  • Fact is, for a large percentage of tasks, we reached a perfectly adequate speed probably a decade ago - what we really need now is a doubling down on energy saving instead of a continuing focus on more raw processing power.

    This fact is borne out by the glaringly obvious stats that people aren't buying new devices as often as they did, simply because the processing power of those devices is vastly underutilised for the majority of tasks.

    We can see it pretty much _everywhere_ - that raw processing power now...

    • Just to add to my little ranty topic, as any old grey beard tech enthusiast software dev will tell you, software has become BLOATED.

      The rise of raw processing power has also seen a rise in lazy coding.
      The days of software engineers coming up with incredible feats of engineering to squeeze the most out of a limited system? - largely gone.

      It's now libraries on top of libraries - layers upon layers of code to ease the burden of programming, at the expense of pure efficiency.

      Take a trip back through history...

      • by Z00L00K ( 682162 )

        Back to barebone programming in C and Fortran without a gigaton of frameworks.

      • by nasch ( 598556 ) on Friday September 23, 2022 @03:27PM (#62908897)

        It's now libraries on top of libraries - layers upon layers of code to ease the burden of programming, at the expense of pure efficiency.

        That's a good thing. Why spend the scarce and nonrenewable resource of the time of a smart programmer on doing something that a processor can do anyway: make the code run faster? If the code runs "fast enough" (and of course there can be disagreements on what that means) then it's better to put that time to use doing something else such as improving security or adding features.

    • by Anonymous Coward

      when a 10 year old graphics card, is still capable of running most modern games at playable frame rates (with lower quality settings, sure)

      Nvidia 600 series was 10 years ago. Please tell me what modern video games you can play on a 600 series card.

  • Just filed for bankruptcy. Many more to follow. AMD is selling their cards for about 30-50% less than the Nvidia equivalent, and all Nvidia has to offer me is shiny windows and lighting effects I turn off. $500 gets you a card from AMD that can push native 1440p/60fps/High.

    Sure thing, Nvidia. You can tell your stockholders whatever you want. Not sure the SEC agrees with that, though. Didn't you already get your hand slapped for understating your dependency on crypto?
    • I'm not paying $1600 for the latest Nvidia graphics card, and I envy the spoiled brats that can afford to spend that much on gaming. I choose to spend money on my beach house instead.
  • A lot of computer software is limited by the bandwidth to get data into and out of the CPU, not the speed of the CPU itself. I suspect much of the future improvement will come from making the busses ridiculously wide, not from making the clock speed higher. That being said, integrating the whole system onto a single chip should always be a win for cost and speed. Many of the improvements in newer ARM chips are in power domains, i.e. the ability to switch off regions of the chip not currently needed.
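
    A sketch of what "limited by bandwidth, not the CPU" looks like in practice (my illustration; the array size and single unwarmed pass are arbitrary, so treat the number it prints as rough): the STREAM-style triad below does one multiply-add per 24 bytes moved, so its runtime is set almost entirely by memory traffic rather than clock speed.

        // Triad.java -- a STREAM-style, memory-bandwidth-bound kernel.
        public class Triad {
            public static void main(String[] args) {
                int n = 20_000_000;                   // three ~160 MB arrays (may need -Xmx1g)
                double[] a = new double[n], b = new double[n], c = new double[n];
                java.util.Arrays.fill(b, 1.0);
                java.util.Arrays.fill(c, 2.0);
                long t0 = System.nanoTime();
                for (int i = 0; i < n; i++) a[i] = b[i] + 3.0 * c[i];  // 1 multiply-add per 24 bytes
                double secs = (System.nanoTime() - t0) / 1e9;
                double gigabytes = 24.0 * n / 1e9;    // bytes read from b and c plus written to a
                System.out.printf("~%.1f GB/s (a[0]=%.1f)%n", gigabytes / secs, a[0]);
            }
        }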
  • Tonight we're gonna laugh at the editors tonight.
  • The traditional definition of chip with respect to Moore's law is a single die.

    Moore's law basically implied the transistor density on the wafer would double every 18 months, so either a die of the same size would be twice as performant or you could get the same performance out of a die half the size (so half the cost, assuming constant wafer cost).

    If you define "chip" as the physical package (not just the die), then you could argue AMD's chiplet approach is kind of extending Moore's law. They are sticking more...
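
    Spelling out that arithmetic under its own assumption of an 18-month doubling (the cadence is the comment's, the code is mine): density scales as 2^(t/1.5) after t years, which you can read either as more transistors on the same die or the same count on a proportionally smaller, cheaper die.

        // MooresLaw.java -- density multiplier under an assumed 18-month doubling.
        public class MooresLaw {
            public static void main(String[] args) {
                double doublingYears = 1.5;  // the 18-month cadence used above
                for (int t = 0; t <= 6; t += 3) {
                    double mult = Math.pow(2, t / doublingYears);
                    System.out.printf("after %d years: %.0fx density, or %.4fx the die area for the same count%n",
                                      t, mult, 1.0 / mult);
                }
            }
        }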

    • by ceoyoyo ( 59147 )

      Moore's law basically implied the transistor density on the wafer would double every 18 months

      It didn't, really. Moore's law is considerably more subtle than that. He basically observed that the price vs. element count curve is U-shaped, with a definite optimum, and that the optimum moves according to the familiar "Moore's Law."

      Assuming Nvidia tends to target their GPUs at that optimum, the 4000 series is pretty close to obeying Moore's law versus the 3000 series. The 4090 was released two years after the 3090 a...
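
      For what it's worth, the raw die numbers roughly bear that out. A quick check using published transistor counts, which are my addition here rather than the parent's (28.3B for the 3090's GA102, 76.3B for the 4090's AD102):

          // MooreCheck.java -- does 3090 -> 4090 track a two-year doubling?
          // Transistor counts are public spec-sheet figures; treat them as assumptions.
          public class MooreCheck {
              public static void main(String[] args) {
                  double ga102 = 28.3e9;  // GeForce RTX 3090 (Sept 2020)
                  double ad102 = 76.3e9;  // GeForce RTX 4090 (Oct 2022)
                  double years = 2.0;
                  double expected = ga102 * Math.pow(2, years / 2.0);  // doubling every 2 years
                  System.out.printf("expected ~%.1fB, actual %.1fB (%.2fx in %.0f years)%n",
                                    expected / 1e9, ad102 / 1e9, ad102 / ga102, years);
              }
          }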

  • It's been sketchy for a long time. Anybody defending it had to tie themselves into technical knots to shoehorn it in.

  • So my next GPU purchase will be final?

    Haven't felt that way, since my Amiga.

  • From Wikipedia: "Moore's law is the observation that the number of transistors in a dense integrated circuit (IC) doubles about every two years." Cerebras has the world's largest chip: 2.6 trillion transistors -- that's 2,600,000,000,000 -- with 850,000 cores on TSMC 7nm. We don't have to worry about physics; we can just continue to drive the price down to bring doubled speeds for a while.

    We've known for a while that the exponential nature of Moore's law cannot be sustained forever - but who cares...
  • Is switching to analog chips considered "software"? Some of the new neural-engine chips based on analog circuits dramatically outperform GPUs in calculations per unit of power.

  • I don't know, the things that I've been hearing out of NVidia lately kinda make me not want to buy Nvidia products anymore if I can avoid it.

  • https://www.cnet.com/tech/comp... [cnet.com] ...ALSO just in from the Friday Redundancy Department of Redundancy
  • by chris234 ( 59958 )

    I always thought of it as "Moore's Strongly Worded Suggestion" anyway...
