Hardware

'Huang's Law Is the New Moore's Law' (wsj.com) 55

As chip makers have reached the limits of atomic-scale circuitry and the physics of electrons, Moore's law has slowed, and some say it's over. But a different law, potentially no less consequential for computing's next half century, has arisen. WSJ: I call it Huang's Law, after Nvidia chief executive and co-founder Jensen Huang. It describes how the silicon chips that power artificial intelligence more than double in performance every two years. While the increase can be attributed to both hardware and software, its steady progress makes it a unique enabler of everything from autonomous cars, trucks and ships to the face, voice and object recognition in our personal gadgets. Between November 2012 and this May, performance of Nvidia's chips increased 317 times for an important class of AI calculations, says Bill Dally, chief scientist and senior vice president of research at Nvidia. On average, in other words, the performance of these chips more than doubled every year, a rate of progress that makes Moore's Law pale in comparison.
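
As a quick sanity check on the growth rate implied by those figures, here is a rough sketch using only the numbers quoted in the summary; the 7.5-year span is an approximation of November 2012 to May 2020:

```python
import math

# Figures from the summary: ~317x performance growth for Nvidia's chips
# between November 2012 and May 2020, i.e. roughly 7.5 years.
years = 7.5
speedup = 317

annual_factor = speedup ** (1 / years)                    # ~2.15x per year
doubling_time = years * math.log(2) / math.log(speedup)   # ~0.9 years

# Moore's Law pace (2x every two years) over the same span, for comparison.
moores_pace = 2 ** (years / 2)                            # ~13.5x

print(f"implied annual factor:  {annual_factor:.2f}x")
print(f"implied doubling time:  {doubling_time:.2f} years")
print(f"Moore's-Law pace over {years} years: {moores_pace:.1f}x")
```

That works out to performance a bit more than doubling each year, consistent with Dally's claim, versus roughly 13x over the same span at the classic Moore's Law cadence.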

Nvidia's specialty has long been graphics processing units, or GPUs, which operate efficiently when there are many independent tasks to be done simultaneously. Central processing units, or CPUs, like the kind Intel specializes in, are by contrast much less efficient at such workloads but better at executing a single, serial task very quickly. You can't chop up every computing process so that it can be efficiently handled by a GPU, but for the ones you can -- including many AI applications -- you can perform them many times as fast while expending the same power. Intel was a primary driver of Moore's Law, but it was hardly the only one. Perpetuating it required tens of thousands of engineers and billions of dollars in investment across hundreds of companies around the globe. Similarly, Nvidia isn't alone in driving Huang's Law -- and in fact its own type of AI processing might, in some applications, be losing its appeal. That's probably a major reason it moved this month to acquire chip architect Arm Holdings, another company key to the ongoing improvement in the speed of AI, for $40 billion.
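
To make the GPU-versus-CPU distinction concrete, here is a minimal sketch (illustrative only, not from the article): an element-wise update with no dependencies between elements is the kind of work that maps onto a GPU's many simple cores, while a running accumulation is a serial dependency chain that favors a fast single CPU core. NumPy's vectorized form stands in for the data-parallel case.

```python
import numpy as np

x = np.random.rand(100_000)

# GPU-friendly: 100,000 independent multiply-adds that could all run at once.
y = 3.0 * x + 1.0

# CPU-friendly: each step depends on the previous result, so it must run
# one iteration at a time no matter how many cores are available.
acc = 0.0
for v in x:
    acc = acc * 0.999 + v

print(y[:3], acc)
```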

Comments Filter:
  • Transistor count (Score:5, Insightful)

    by ArchieBunker ( 132337 ) on Monday September 21, 2020 @09:45AM (#60527584)

    Moore's law says nothing of performance. It is transistor count. Why does everyone get this wrong?

    • by groobly ( 6155920 ) on Monday September 21, 2020 @09:56AM (#60527632)

      Because "transistor count" has too many syllables. Oh, and also because "journalists" are generally morons with English degrees.

    • Moore's law says nothing of performance. It is transistor count. Why does everyone get this wrong?

      TFA is paywalled, but TFS does not get it wrong. It doesn't say that Moore's Law is about performance. It only says that Huang's Law is about performance, which it is.

    • As long as Moore's law was active, increasing transistor count meant that the route to more speed was more complex CPUs.
      But we're entering the age where data bandwidth is more important than single-CPU speed.
      If the CPU is never waiting for data to arrive, then it will be faster. That's the key.
      The way we have band-aided this problem up until recently is to invent caches, speculative execution, out-of-order operations, and multi-tasking. This gives the CPU something to do with its idle sili
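
A minimal sketch of the bandwidth point made in this comment (illustrative only; the array size is arbitrary and this is not a rigorous benchmark): the same amount of arithmetic runs much faster when the core is fed by sequential, cache-friendly reads than when it stalls on scattered ones.

```python
import time
import numpy as np

n = 5_000_000
data = np.random.rand(n)
seq_idx = np.arange(n)                 # sequential, prefetcher-friendly
rand_idx = np.random.permutation(n)    # scattered, cache-hostile

t0 = time.perf_counter()
s_seq = data[seq_idx].sum()
t1 = time.perf_counter()
s_rand = data[rand_idx].sum()
t2 = time.perf_counter()

# Same number of loads and adds; the difference is how long the CPU
# spends waiting for data to arrive.
print(f"sequential: {t1 - t0:.3f}s   random: {t2 - t1:.3f}s")
```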

      • Wonder if the Connection Machine will make a comeback :)

      • Intel hasn't been CISC in decades. To grossly simplify things, Intel has a RISC core with a front end that translates x86 into the RISC micro-operations that the hardware actually schedules and executes.
        • So it's CISC. Just think of it as a black box. It looks like CISC from the outside. It uses extra silicon to implement the CISC instructions. So it has the same drawbacks and advantages CISC does. And since you don't send it RISC instructions, it isn't RISC.

          • So it's CISC. Just think of it as a black box. It looks like CISC from the outside. It uses extra silicon to implement the CISC instructions. So it has the same drawbacks and advantages CISC does. And since you don't send it RISC instructions, it isn't RISC.

            It *is* sent RISC instructions (micro-ops) for scheduling and execution. The vast majority of the transistors on the device are there to execute these RISC instructions. The legacy CISC instructions seen by the programmer are not executed directly; they are translated into one or more RISC instructions.

            The programmer's instruction set is CISC.
            The hardware's instruction set is RISC.
            Since we are talking about transistor count here the hardware's reality seems the more appropriate. If we were talking about compilers th
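
A toy sketch of the translation described in this sub-thread, with made-up mnemonics and micro-op names purely for illustration (real x86 decoding is far more involved): one programmer-visible CISC instruction becomes a short sequence of simpler micro-ops that the hardware actually schedules.

```python
# Hypothetical decode table; not real x86 semantics.
DECODE_TABLE = {
    # A memory-operand add splits into load, add, and store micro-ops.
    "ADD [mem], reg": ["uop_load  tmp <- [mem]",
                       "uop_add   tmp <- tmp + reg",
                       "uop_store [mem] <- tmp"],
    # A register-register add is already close to a single micro-op.
    "ADD reg, reg": ["uop_add   reg <- reg + reg"],
}

def decode(cisc_instr):
    """Return the micro-op sequence for a CISC-style instruction."""
    return DECODE_TABLE[cisc_instr]

print(decode("ADD [mem], reg"))
```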

      • by Rei ( 128717 )

        GPUs are not optimal for neural nets; ASICs are.

    • by gweihir ( 88907 )

      Moore's law says nothing of performance. It is transistor count. Why does everyone get this wrong?

      Most people have absolutely no clue about performance, and transistor count is anything but an indicator of it. Especially since complex chips have been interconnect-limited for 20 years or so now.

    • Not only that, but Moore's Law—realized after the fact—is exactly what allowed the growth in AI development today.

      There's nothing special about the field of AI that allows it to outpace Moore's Law. Rather, the reason that machine learning and other AI applications have seemingly been advancing more rapidly than Moore's Law in the last few years is because there was an "overhang" in the industry: technology had advanced—at the pace of Moore's Law—beyond the necessary point to do X, b

    • Moore's law says nothing of performance. It is transistor count. Why does everyone get this wrong?

      I've always believed Moore's law is really about cost per transistor.

      Essentially, how many transistors can you afford to shove into your product in order to provide a level of value at a price people are willing to pay?

      In the end you can call it transistor count, I suppose, yet what really powers the underlying feedback loop is a market based on cost.

  • CPU and graphics chips haven't reached the limits of semiconductors yet; 5 nm and 3 nm node test devices have already been demonstrated. Saying that something that doesn't exist, reliable self-driving vehicles, depends on AI, which depends on small chips, is just making a circular track with 3 stations in the middle of nowhere.

  • We were almost out of reach of Moore's law and immediately you morons had to invent a new one!

    If you would just stop making new laws, technology would have fewer limits and could increase so much faster!

  • Artificial Intelligence doesn't exist.

    Quit calling this kind of stuff "AI".

    • Artificial Intelligence doesn't exist.

      Quit calling this kind of stuff "AI".

      This exactly.
      There is nothing now, nor anything on the horizon, that comes anywhere close to an actual AI. All we have now is advanced pattern recognition and primitive machine learning. The best you could possibly call any of the current "AI" would be an "expert system", and even that is a stretch.

    • Artificial Intelligence doesn't exist.

      Oh really? Then what do you call this [wikipedia.org]?

  • Why name it after the CEO? Did he actually say anything remotely technical or predictive?
  • So you take Moore's Law, apply it to a more specific application, and YOU give it a name. Wow. Can we SCREAM MILLENNIAL!!!!!
  • The law doesn't work.
    AI chips more than double in performance because we basically started from 0. So of course the first increase is close to an infinite percentage.
    Also, like smartphones, if you make them more expensive every two years, you get bigger performance increases than in mature segments (such as computers).

    Intel could also beat (its own) Moore's law if it doubled the price of its chips every two years and people were still buying them.

    So of course, that $5 AI chip is now surpassed by a new $10

    • AI chips more than double in performance because we basically started from 0.

      The field of AI started in the 1950s, long before computers were based on "chips".

      The first CPU on an integrated circuit ("chip") was in 1971.

      • Of course. But they didn't call it an AI chip. From what I understand, they are talking about dedicated AI chips.

        • From what I understand, they are talking about dedicated AI chips.

          Huang is mostly talking about GPUs, which are not dedicated AI chips.

          • OK, but GPUs are still more specialized than CPUs. In the past decade, GPUs began to become useful for tasks other than rendering 3D graphics.
            I doubt the transistor count of GPUs has surpassed Moore's law. And I also doubt that AI performance (whatever their metric is) will continue the current trend of doubling every year for long. Therefore this law won't hold.

  • by Dan East ( 318230 ) on Monday September 21, 2020 @10:26AM (#60527756) Journal

    No, it's not. For starters, no one will be referring to Huang's Law because some WSJ author named it after the CEO of some company. That's ridiculous. You might as well call it Trump's Law because he's the president of the country at the moment. That's inconsequential.

    Next, Moore's Law has nothing to do with processing power, processor speed, MIPS, or any other higher-level concept. It purely refers to the physical density of transistors (and similar components) on silicon. Moore made that near-future observation based on the progress he'd seen from his company and the industry up to that point. He never claimed it to be a "law" or any such thing. It was an observation.

    Finally, AI processing has absolutely nothing to do with this in any way. There is no reason that AI processing has to be done in a single chip, so the physical constraints of chip manufacture are no limitation on what AI can or cannot do, nor do they limit how fast the AI can process. If AI needs more computational power, then you throw more cores at it. Heck, most AI processing for things like speech recognition is done off-device in the "cloud" anyway.

    This whole thing is just stupid and inconsequential.

    • by Tablizer ( 95088 )

      You seem to be assuming social memes must gel with logic. They often don't. A catch phrase that has some truth can live on even if it's full of caveats when carefully dissected. I've learned over the decades that IT is full of buzzwords, hype, and bullshit; and thus one should take such with a grain of salt.

      • by gweihir ( 88907 )

        I've learned over the decades that IT is full of buzzwords, hype, and bullshit; and thus one should take such with a grain of salt.

        Very, very true. And some of it refuses to die, no matter how stupid.

    • If AI needs more computational power then you throw more cores at it.

      Why is that not true of games as well? Why are we still using a GPU at all then?

      Turns out the answer to that is the same as for AI.

      The simple fact is that modern AI is almost all neural networks, with very specific mathematical operations, in the same way that game engines have very specific mathematical operations; both need to do a lot of them, as rapidly as possible.

      That is why modern AI really took off when people started leaning on the GPU t
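
As a minimal sketch of those "very specific mathematical operations" (shapes and values here are arbitrary, assumed only for illustration): a fully connected neural-network layer is essentially one large matrix multiply plus a cheap nonlinearity, and it is this dominance of big, regular matrix products that GPUs, and dedicated accelerators, exploit.

```python
import numpy as np

batch, in_features, out_features = 64, 1024, 512
x = np.random.randn(batch, in_features).astype(np.float32)          # activations
W = np.random.randn(in_features, out_features).astype(np.float32)   # weights
b = np.zeros(out_features, dtype=np.float32)                        # biases

# One dense layer: matrix multiply, bias add, ReLU.
y = np.maximum(x @ W + b, 0.0)
print(y.shape)   # (64, 512)
```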

    • For starters, no one will be referring to Huang's Law because some WSJ author named it after the CEO of some company.

      You mean like some tech publication picked up Moore's Law based on a statement of a CEO of some company?

      Let's see:
      Both Moore and Huang are billionaires,
      Both Moore and Huang were CEOs of a major tech company specialising in the design of microprocessors at the time they made their claim,
      Both Moore and Huang are businessmen,
      Both Moore and Huang are electrical engineers.

      So let's just come out and address the elephant in the room: One isn't a white American. ... Are you racist?

      • You mean like some tech publication picked up Moore's Law based on a statement of a CEO of some company?

        Let's see: ... Both Moore and Huang were CEOs of a major tech company specialising in the design of microprocessors at the time they made their claim

        In 1965 when Gordon Moore made his transistor density "claim", he was at Fairchild, and he wasn't the CEO. In fact, he was never the CEO of Fairchild. When he co-founded Intel, he didn't become their CEO until some 7 years later, 10 years after he wrote the paper [intel.com] that gave birth to his "law".

        Moore's "Law" posits a doubling of transistor density every two years. "Huang's Law" is supposedly based on statements he made at nVidia's GPU conference in 2018. But Huang's statements, and those of his people for

  • Interesting. That last article I remember reading on this subject (I don't recall where - anyone?) basically said that development of "AI" would stall because training it had become too expensive. I wonder which will win out?

  • Journalists do not get to name new laws. And certainly not after phrases from CEOs that use bull sh!t bingo words. This would retroactively blemish Gordon Moore's law. Please don't go along with this. Just say no.
  • Please stop posting links to paywalled sites

  • So basically, shitty half-assed excuses for 'AI' doubling in shittiness every 2 years?
  • I name it the scumbag law, after nVidia's founder.
