
Moore's Law Blowout Sale Is Ending, Says Broadcom CTO

Posted by samzenpus
from the paying-the-price dept.
itwbennett writes "Broadcom Chairman and CTO Henry Samueli has some bad news for you: Moore's Law isn't making chips cheaper anymore because it now requires complicated manufacturing techniques that are so expensive they cancel out the cost savings. Instead of getting more speed, less power consumption and lower cost with each generation, chip makers now have to choose two out of three, Samueli said. He pointed to new techniques such as High-K Metal Gate and FinFET, which have been used in recent years to achieve new so-called process nodes. The most advanced process node on the market, defined by the size of the features on a chip, is due to reach 14 nanometers next year. At levels like that, chip makers need more than traditional manufacturing techniques to achieve the high density, Samueli said. The more dense chips get, the more expensive it will be to make them, he said."
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • by Anonymous Coward on Thursday December 05, 2013 @11:19PM (#45615569)

    It used to be that you had to upgrade every 2 years. Now you really only have to upgrade every 5 or 7 years. Once every 10 years sounds pretty good to me. As the pace of computer innovation slows, less money has to go toward upgrades. Computers are now more like appliances: you run them down until they physically break.

    Of course, if you manufacture computers or work in IT, then such a proposition is horrible, as a long product lifecycle means less money coming to you. As a consumer, I like it because I no longer have to shell out hundreds of dollars every other year to keep my computers usable.

  • by 140Mandak262Jamuna (970587) on Thursday December 05, 2013 @11:22PM (#45615581) Journal
    Well, we had a good run. 99% of the computing needs of 99% of the people can be met by existing chips. For most people, network bandwidth limits their ability to do things, not raw computing power or memory. So Moore's observation (it ain't no law) running out of steam is no big deal. Of course, the tech companies need to transition from selling shiny new things every two years to a more sedate pace of growth.
  • by Anonymous Coward on Thursday December 05, 2013 @11:36PM (#45615653)

    When people say this, I think the person is not being imaginative about the future. Sure, we can meet 99% of current computing needs, but what about uses that we have not yet imagined?

    Image processing and AI are still pretty piss poor, and not at all bound by network and bandwidth limits. Watch a Roomba crash into the wall as it randomly cleans your room. Dark Ages!

  • by CapOblivious2010 (1731402) on Thursday December 05, 2013 @11:36PM (#45615655)
    If that's true, we can only hope that the exponential bloating of software stops as well. Software has been eating the free lunch Moore was providing before it got to the users; the sad reality is that the typical end-user hasn't seen much in the way of performance improvements - in some cases, common tasks are even slower now than 10 years ago.

    Oh sure, we defend it by claiming that the software is "good enough" (or will be on tomorrow's computers, anyway), and we justify the bloat by claiming that the software is better in so many other areas like maintainability (it's not), re-usability (it's not), adherence to "design patterns" (regardless of whether they help or hurt), or just "newer software technologies" (I'm looking at you, XAML&WPF), as if the old ones were rusting away.
  • by VTBlue (600055) on Thursday December 05, 2013 @11:50PM (#45615739)

    Hold the boat, a return to C or C++ would be a HUGE boost; no need to throw the baby out with the bath water.

    ASM programmers could never build the rich content apps the world relies on today. The code would be ridiculous; think the worst COBOL app times 1000, for every application used today.

    No, compiling dynamic languages down to optimized C, or carving the high-level performance-critical code out into optimized C/C++, is what every major web service is focusing on today. Facebook, for example, is realizing well over 50% gains just by replacing some PHP components with unmanaged code.
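The gain from pushing a hot path into unmanaged code can be seen in miniature inside CPython itself, where sum() is implemented in C. This micro-benchmark only illustrates the principle, not Facebook's actual stack, and the timings will vary by machine:

```python
import time

data = list(range(1_000_000))

# Interpreted hot loop: every iteration goes through the bytecode dispatcher.
start = time.perf_counter()
total = 0
for x in data:
    total += x
t_loop = time.perf_counter() - start

# The same reduction via sum(), which runs as compiled C inside CPython --
# the "move the hot path into unmanaged code" idea in miniature.
start = time.perf_counter()
total_c = sum(data)
t_c = time.perf_counter() - start

assert total == total_c
print(f"python loop: {t_loop:.3f}s  C-implemented sum(): {t_c:.3f}s")
```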

  • by Xicor (2738029) on Friday December 06, 2013 @12:00AM (#45615795)
    More transistors per unit area on a chip is of limited worth at the moment. You can have a million cores on a processor, but it will still be slowed down dramatically by the limits of parallelism. Someone needs to find a way to increase parallel processing speed.
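The ceiling the parent describes is usually formalized as Amdahl's Law: speedup is capped by the serial fraction of the work, no matter the core count. A quick sketch (the 5% serial fraction is purely illustrative):

```python
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Maximum speedup of a task whose serial part cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# A task that is just 5% serial tops out near 20x -- even on a million cores.
for cores in (4, 64, 1_000_000):
    print(f"{cores:>9} cores -> {amdahl_speedup(0.05, cores):.1f}x")
```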
  • by mcrbids (148650) on Friday December 06, 2013 @12:08AM (#45615843) Journal

    Software has been eating the free lunch Moore was providing before it got to the users; the sad reality is that the typical end-user hasn't seen much in the way of performance improvements - in some cases, common tasks are even slower now than 10 years ago.

    This point of view is common, even though its odd disparity with reality makes it seem almost anachronistic. Software isn't bloating anywhere near as much as expectations are.

    Oh, sure, it's true that much software is slower than its predecessor. Windows 7 is considerably slower, on the same hardware, than Windows XP, which is itself a dog compared to Windows 95 on the same hardware. But the truth is that we aren't running on the same hardware, and our expectations have risen dramatically. In actual fact, most implementations of compilers and algorithms show consistent improvements in speed. More recent compilers are considerably faster than older ones. Newer compression software is faster (often by orders of magnitude!) than earlier versions. Software processes such as voice recognition, facial pattern matching, and lossy compression for video and audio, among far too many other things to name, have all improved consistently over time. For a good example of this type of improvement, take a look at the recent work on "faster than fast" Fourier transforms.

    So why does it seem that software gets slower and slower? I remember when my Dell Inspiron 600m was a slick, fast machine. I was amazed at all the power in this little package! And yet, even running its original install of Windows XP, I can't watch Hulu on it: it simply doesn't have the power to play full-screen, full-motion, compressed video in real time. I was stunned at how long (a full minute?) the old copy of OpenOffice took to load, even though I remember running it on the same machine! (On my i7 laptop with an SSD and 8 GB of RAM, OpenOffice loads in about 2 seconds.)

    Expectations are what changed more than the software.
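The point above that better algorithms, not just faster hardware, drive much of the improvement shows up even in toy code. A hypothetical micro-benchmark comparing a naive O(n²) prefix-sum with the O(n) single-pass version (timings will vary by machine):

```python
import time
from itertools import accumulate

data = list(range(5_000))

# Naive: recompute the sum from scratch for every prefix -> O(n^2).
start = time.perf_counter()
naive = [sum(data[:i + 1]) for i in range(len(data))]
t_naive = time.perf_counter() - start

# Better algorithm, same hardware: one running total -> O(n).
start = time.perf_counter()
fast = list(accumulate(data))
t_fast = time.perf_counter() - start

assert naive == fast  # identical results, very different costs
print(f"O(n^2): {t_naive:.3f}s  O(n): {t_fast:.3f}s")
```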

  • by Opportunist (166417) on Friday December 06, 2013 @12:13AM (#45615865)

    Considering the quality of contemporary components, you'll still be upgrading every 2-3 years. Or however long the warranty in your country is.

  • by adri (173121) on Friday December 06, 2013 @12:53AM (#45616055) Homepage Journal

    Go get Windows 3.1 and Works. Stick it in a vmware VM. Cry at how fast the VM is.


  • The problem is that, for the vast majority, they have more power than they can possibly use. To use a car analogy, it's like taking a Top Fuel funny car to the store for milk.

    Take my dad for example; he is the perfect "Joe Average" user. He uses social media, watches videos, and runs his bookkeeping software: the kind of everyday tasks the majority do daily. When the price dropped on the Phenom IIs to make way for the FX, I thought "Well, it has been a while since I built him that Phenom I quad, so maybe it's time for an upgrade," and ran a usage monitor for a week to see how hard he was hitting the CPU. What did I find? 35%. That was the average load on that quad. Sure, he'd occasionally get over 50%, but only for a few seconds.

    And THAT is why it's really not gonna matter to Joe and Jane Average: their systems already idle more than they run, and the prices are already crazy cheap. I mean, I just got dad a quad-core Android tablet for Xmas... think he'll EVER come up with enough to do to peg all 4 cores enough that an upgrade would help? Not likely. Hell, I was the guy that built a new system every other year with a major overhaul in the odd years; now? My system is 4 years old and I have zero reason to upgrade to a new one. Why should I? I have a hexacore, 8 GB of RAM, and 3 TB of HDD; the only thing I upgraded was my HD4850 for an HD7750, and even that was about lowering heat, not performance.

    Let's face it, Moore's Law made systems several orders of magnitude more powerful than the work the masses can come up with for them to do. Who cares if Moore's Law finally winds down when the systems are so powerful they spend more time idling than anything else?

  • by Katatsumuri (1137173) on Friday December 06, 2013 @05:30AM (#45617049)

    I see many emerging technologies that promise further great progress in computing. Here are some of them. I wish some industry people here could post updates on their progress toward market. They may not literally prolong Moore's Law in terms of transistor counts, but they promise great performance gains, which is what really matters.

    3D chips. As materials science and manufacturing precision advance, we will soon have multi-layered chips (starting with the few layers Samsung already ships, but scaling up to thousands) or even fully 3D chips with efficient heat dissipation. This puts the components closer together and streamlines the short-range interconnects. It also increases "computation per rack unit of volume", simplifying some space-related aspects of scaling.

    Memristors. HP is ready to produce the first memristor chips but is delaying them for business reasons (how sad is that!). Others are also preparing products. Memristor technology enables a new approach to computing, combining memory and computation in one place. Memristors are also quite fast (competitive with current RAM) and energy-efficient, which means easier cooling and makes a 3D layout possible.

    Photonics. Optical buses are finding their way into computers, and network hardware manufacturers are looking for ways to perform some basic switching directly with light. Some day these two trends may converge to produce an optical computer chip free from the limitations of electrical resistance, heat, and EM interference, which could thus operate at a higher clock speed. It would be more energy efficient, too.

    Spintronics. Probably further in the future, but a potentially very high-density, low-power technology actively developed by IBM, Hynix, and a bunch of others. This one would push our computation-density and power-efficiency limits to another level, as it allows performing some computation using magnetic fields, without electrons actually moving as electrical current (excuse my layman's understanding).

    Quantum computing. This could qualitatively speed up whole classes of tasks, potentially bringing AI and simulation applications to new levels of performance. The only commercial offering so far is D-Wave, and it's not a general-purpose quantum computer, but so many labs are working on the problem that results are bound to come soon.

  • by Lumpy (12016) on Friday December 06, 2013 @07:39AM (#45617443) Homepage

    "For decades, low skilled software developers have been able to play fast and loose,"


    Embedded system programmers are the only real programmers anymore.

  • Re:Yawn (Score:2, Insightful)

    by Anonymous Coward on Friday December 06, 2013 @09:42AM (#45617933)
    Only if they can work in parallel without needing information from each other.
