
Moore's Law Is Becoming Irrelevant, Says ARM's Boss (236 comments)

holy_calamity writes "PCs will inevitably shift over to ARM-based chips because efficiency now matters more than gains in raw performance, the CEO of chip designer ARM tells MIT Technology Review. He also says the growing number of suppliers building ARM-based chips is good for innovation (and for prices) because it spurs a competitive environment. 'There’s been a lot more innovation in the world of mobile phones over the last 15-20 years than there has been in the world of PCs.'"
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • Duh (Score:5, Insightful)

    by Anonymous Coward on Friday November 09, 2012 @04:06PM (#41935659)

    CEO of a company that makes more efficient CPUs than the competition says the future is in efficient CPUs. News at 11.

    • CEO of a company that makes more efficient CPUs than the competition says the future is in efficient CPUs. News at 11.

      OR, it's the other way round: the people currently at the helm thought around 1990 that the future would be in efficient CPUs and so they formed an efficient CPU company. Then, they just kept their point of view.

  • by ackthpt ( 218170 ) on Friday November 09, 2012 @04:09PM (#41935689) Homepage Journal

    But every new version of an operating system has more bloat than ever. There must be some corollary to Moore's Law which states that successive operating systems will still require higher performance, but users will simply become accustomed to slower response times.

    We could call it the Blort Law.

    • by GrumpySteen ( 1250194 ) on Friday November 09, 2012 @04:17PM (#41935759)

      Wirth's Law [techopedia.com]:
      Software is getting slower more rapidly than hardware is getting faster.

      • Tell me about it. I have a nominally-1.5GHz quad-core Android phone that, when running Graffiti, can barely distinguish a "G" drawn like a "6" on the letter side from the letter "O" with better than 90% accuracy unless I use SetCPU to lock it to full speed (with devastating impact upon battery life) whenever the screen is on, yet somehow... SOMEHOW... a slow, lowly 16MHz Dragonball m68k could do the same thing with nearly perfect, flawless accuracy. The biggest single reason, as far as I can

    • Gates' Law (Score:5, Funny)

      by Citizen of Earth ( 569446 ) on Friday November 09, 2012 @04:19PM (#41935777)
      It's called Gates' Law: Every 18 months, the speed of software halves.
    • Actually, the problem is software that's not built for concurrency on multiple cores/CPUs: Amdahl's Law: http://en.wikipedia.org/wiki/Amdahl's_law [wikipedia.org]
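
      A quick sketch of what Amdahl's Law implies, for reference. This is just the textbook speedup formula; the parallel fractions and core count below are made-up illustrative values, not measurements:

        # Amdahl's Law: speedup = 1 / ((1 - p) + p / n),
        # where p is the parallelizable fraction of the work and n is the core count.
        def amdahl_speedup(p, n):
            return 1.0 / ((1.0 - p) + p / n)

        # Even with 16 cores, a program that is only 50% parallel tops out below 2x.
        for p in (0.5, 0.9, 0.99):
            print(p, round(amdahl_speedup(p, 16), 2))   # 1.88, 6.4, 13.91
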
    • by nurb432 ( 527695 )

      But every new version of an operating system has more bloat than ever.

      Which is the core of the problem. We did so much more with less not all that long ago.

  • Comment removed (Score:5, Informative)

    by account_deleted ( 4530225 ) on Friday November 09, 2012 @04:09PM (#41935691)
    Comment removed based on user account deletion
  • Efficiency! (Score:5, Interesting)

    by CajunArson ( 465943 ) on Friday November 09, 2012 @04:13PM (#41935735) Journal

    " efficiency now matters more than gains in raw performance"

    Sure, so why don't you start off by telling us why an Exynos Cortex A-15 chip running a web benchmark is using about 8 watts of power, with the display turned off so only SoC power is being measured, while Intel has already demoed a full-blown Haswell running Unigine Heaven at... 8 watts.

    So when the miraculous Cortex A-15 uses the same amount of power as the supposedly "bloated" x86 Haswell, while Haswell is running a benchmark that is massively more intensive than a web-browser test, who is really making the most "efficient" platform?

    Exynos Source: http://www.anandtech.com/show/6422/samsung-chromebook-xe303-review-testing-arms-cortex-a15/7 [anandtech.com]
    Haswell Demo Video: http://www.youtube.com/watch?v=cKvVdhkgAxg [youtube.com]

    • Re:Efficiency! (Score:4, Informative)

      by dgatwood ( 11270 ) on Friday November 09, 2012 @04:34PM (#41935947) Homepage Journal

      That's a false comparison, though. If users mostly ran benchmarks 24x7, that would be a good test of efficiency. The reality, however, is that CPUs mostly sit idle, so to compute average efficiency, you have to factor that in.

      Granted, a faster CPU that can reach an idle state sooner can be more efficient than a slower CPU that runs at full bore for a longer period of time, but only if the idle wattage is fairly similar.
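
      To make that concrete, here is a back-of-the-envelope sketch. Every wattage and duty cycle below is invented for illustration, not a measurement of any real chip:

        # Average power over a workload that is mostly idle (hypothetical numbers).
        def average_power(load_w, idle_w, busy_fraction):
            return busy_fraction * load_w + (1.0 - busy_fraction) * idle_w

        # A chip that burns 8 W while busy but races to a 0.1 W idle can average
        # less than a 4 W chip that stays busy twice as long and idles at 0.5 W.
        fast = average_power(load_w=8.0, idle_w=0.1, busy_fraction=0.05)
        slow = average_power(load_w=4.0, idle_w=0.5, busy_fraction=0.10)
        print(fast, slow)   # 0.495 vs 0.85 in this made-up case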

      • I agree, and I'd put my money on Intel reducing idle wattage faster than ARM increasing performance.
      • Good thing then that Haswell's idle power draw is 20x better than Ivy Bridge's, meaning that it is probably about the same as the Cortex A-15 (or maybe even better).

        I'm not saying that Haswell belongs in a smartphone. I'm also saying that unless you downclock that Exynos, you don't want it in a smartphone either. I *am* saying that the blind assumption that ARM == efficiency tends to disintegrate when confronted with facts. I'm also saying that if Haswell can run at 8 watts, the whole "x86 wastes powar!"

      • by Kjella ( 173770 )

        That's a false comparison, though. If users mostly ran benchmarks 24x7, that would be a good test of efficiency. The reality, however, is that CPUs mostly sit idle, so to compute average efficiency, you have to factor that in.

        Your efficiency of not doing work is like measuring the MPG you get idling in your driveway. Laptops have been either off or in sleep/suspend; they haven't had an "active idle" mode like cell phones waiting for calls/texts/emails because they've never needed one. It's like a huge office building with nothing but per-floor light switches: cell phones have had a single light on for the night receptionist, and laptop chips haven't, because it's been lights out when they sleep. Now they need one and will get one with Haswell

    • by Pulzar ( 81031 )

      Sure, so why don't you start off by telling us why an Exynos Cortex A-15 chip running a web benchmark is using about 8 watts of power, with the display turned off so only SoC power is being measured, while Intel has already demoed a full-blown Haswell running Unigine Heaven at... 8 watts

      Wait, wait... are you trying to say that in a notebook system doing wireless web surfing, the only things drawing power are the CPU and the display?

      If so, you are way off.

      • No, I'm saying that on a chromebook with a SoC (that stands for "system on a chip" you know...) the total power consumption of the SoC running a web benchmark that likely requires little or no wireless network power due to caching is equivalent to the power consumption of a low-power Haswell part (that is similar to a SoC but with a separate south-bridge MCM).

        Oh, and if the Kraken benchmark is anything remotely similar to any other web browser benchmark I've ever seen, the CPU/GPU on the SoC are not being t

    • Wonder what the cost difference is between those two. If you're putting it in a low-cost device, a difference in price can be rather significant (i.e., do I spend $30 more on each CPU, or go with the cheaper CPU and $20 worth of extra battery?).

      Oh, wait... one is out in production, the other has no firm release date. So a brand-new, not-yet-actually-in-use chip is faster and uses less power than one that has been around a while. Fascinating...

      • Re:Efficiency! (Score:5, Insightful)

        by CajunArson ( 465943 ) on Friday November 09, 2012 @06:26PM (#41937225) Journal

        Haswell is (probably) a ~1.6 billion transistor chip that obviously costs more than a SoC really designed for tablets. Interesting, then, that a ~1.6 billion transistor chip with similar functionality to the SoC uses about the same amount of power as that tablet SoC while delivering vastly more performance.

        If you want cheap, Atoms are already out now that are quite cost competitive with ARM chips, and 22nm Atoms will be out next year.

        Oh, and as for "release dates": the Exynos has only just begun to reach the market, and Haswell will be out at around the same time that most Cortex A-15s really come to market as well. Considering I've had to listen to "A15 will kill Intel!!!!" for over 2 years, as if they were already coming out of faucets like water, I'm not too worried about part availability.

        So here we are in the ARM vs. Intel Evolution:
        2008: ARM is superior, Intel can NEVER scale its power consumption down below 100 watts!!

        2009-2010: ARM is still superior! Atom sucks at performance and uses 10 WHOLE WATTS, that's more than 10X ARM! The Cortex A9 will annihilate Intel!

        2011: ARM performance dominance is just around the corner! Ignore those useless benchmarks of Cortex A9 vs. Atom! So what if Atom has higher performance, IT SUCKS DOWN MORE POWER AND POWER CONSUMPTION IS ALL THAT MATTERS!

        2012: Medfield sucks! Who cares if it gets better battery life than a dual-core 28nm Krait when put into Motorola RAZRs with the exact same! See, we have benchmarks where the higher-clocked Krait gets 10% better performance (in some benchmarks, while losing in others that we ignore)! WHO CARES THAT ATOM IS MORE POWER EFFICIENT, THE ONLY THING THAT MATTERS IS MORE PERFORMANCE!
        INTEL IS STILL OVERPRICED EVEN THOUGH THE RAZR i AND RAZR M HAVE THE SAME PRICE!

        2013: Uh... at least ARMs are cheap when you intentionally compare chips designed for cellphones to Intel's desktop chips and pretend that Atom doesn't exist. ARM WILL DESTROY INTEL!

  • by Anonymous Coward on Friday November 09, 2012 @04:15PM (#41935739)

    As a geek I love a powerful general-purpose machine that can do all the things an ebook reader/music player/web browser can do AND a whole lot more: play 3D games, run a math or science simulation, record and edit video, and handle memory- and processor-intensive image editing. To me a tablet is little more than a crippled PC with the keyboard removed (fantastic, why did I learn to type at 90wpm again??) and a smudge-screen interface (hate viewing photos through finger marks!!!). It's really awesome that we have dumbed down our computers to the point of mediocrity. Even finding a decent e-book reading or music playing app - the things these pieces of shit are touted as making easier - is a nightmare. So many book readers don't even let you zoom on images. And browsing the web without Flash support is like trying to surf with one leg. I don't mind that there are dumbed-down idiot boxes for those who like to post pictures of food on Facebook, but I really resent the impact on general-purpose computing.

    • buy a raspberry pi, if you really are a geek.
    • by afidel ( 530433 )

      Really, you hate the fact that the mobile Core i5 is more powerful than the previous generation while allowing all-day battery life? Because that's the biggest way that tablets have affected general-purpose computing that I can see. Sure, the current mobile i5 isn't going to transcode video as fast as a current desktop i7, but it'll do it considerably faster than a Core 2-era desktop. Plus, optimizing idle power is good for the environment; replacing P4-era desktops with current-era machines will save you ton

    • I pretty much prefer to browse the web without Flash support. Steve Jobs was right. Also, I type too fast for Slashdot.
    • Please Intel, keep making those big, inefficient chips.
  • Makes no sense! (Score:3, Insightful)

    by Anonymous Coward on Friday November 09, 2012 @04:19PM (#41935773)

    Moore's law just predicts transistor density - it says absolutely nothing about computational power. Increases in transistor density can make electronics more efficient per watt, but that is still in line with Moore's law.

    The title is stupid, and the actual article says almost nothing like it.
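
    For reference, the law as commonly stated is just exponential growth in on-chip transistor count, doubling roughly every two years; it says nothing about clock speed or performance, which is the point above. A minimal sketch (the 2012 baseline figure is a ballpark, not a quoted spec):

      # Moore's law as commonly stated: transistor count doubles roughly every 2 years.
      def transistors(start_count, years, doubling_period=2.0):
          return start_count * 2 ** (years / doubling_period)

      # Ballpark: ~1.4 billion transistors in a 2012 desktop CPU -> ~11 billion by 2018.
      print(f"{transistors(1.4e9, 6):.2e}")   # 1.12e+10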

    • by ebunga ( 95613 )

      Actually, this means that the CEO of ARM doesn't know what a transistor is and why you would want more transistors in a tiny space.

  • Power (Score:4, Insightful)

    by rossdee ( 243626 ) on Friday November 09, 2012 @04:20PM (#41935783)

    Sure, efficiency matters, but only in portable devices. Desktops and other computers connected to the mains don't have a problem.

    Hey, it's winter already: a watt used by your CPU is one watt less that has to come from your radiant or convective heater.

    • This is only the case if your heat is electric. Otherwise you're comparing apples and oranges.
      • by Cinder6 ( 894572 )

        Now to expose my woeful lack of understanding of the topic!

        Is it even apples to apples with electric heaters? I'm not sure how much power my PC is currently drawing, but its exhaust isn't particularly warm--in fact, it feels perceptibly cooler than the ambient temperature. I have no doubt there's a sort of "wind chill" factor going on (it's not a magic PC, so far as I know), but it seems like a damned inefficient heating appliance all the same, especially if I consider space heaters I've used in the past

        • The vast majority of power consumed by the computer ends up as heat. Computers produce heat, light (EM), and sound. Sound is mostly absorbed by the walls of your house. The amount of EM that leaks out of the case and, after that, past the walls of your house is pretty negligible. Note that "negligible" doesn't mean "not detectable": you can easily detect it, it just doesn't amount to much.
          • So is the PC just an inefficient heater, then? Even my aluminum case is cold to the touch. If I didn't have so many fans (10 in total), would it make the room hotter?

            I'm asking because I often see it claimed that PCs make great space heaters, but in my experience, this one plain doesn't. Under full load, it should draw quite a bit of power, but it outputs much, much less heat than lower-energy dedicated space heaters. I'm tempted to find my Kill-A-Watt and see what it says.

            • So is the PC just an inefficient heater, then? Even my aluminum case is cold to the touch. If I didn't have so many fans (10 in total), would it make the room hotter?

              I'm asking because I often see it claimed that PCs make great space heaters, but in my experience, this one plain doesn't. Under full load, it should draw quite a bit of power, but it outputs much, much less heat than lower-energy dedicated space heaters. I'm tempted to find my Kill-A-Watt and see what it says.

              There's no such thing as an inefficient heater. All the energy your computer uses must end up somewhere, and that somewhere can only be sound or heat. The sound output is usually very low and, as GP explained, is absorbed by walls and converted into heat as well. The exceptions are any long-range EM emitters, like WiFi and Bluetooth, which are still converted into heat, just not always in the same room or house. So it is only the case and fan design that causes a difference in perceived heat.

              Also, I doubt your ded
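
              A rough way to see why the exhaust can feel cool even though the PC dumps essentially its whole electrical draw into the room as heat: the heat is spread over a large airflow. A sketch with illustrative numbers, not measurements of any real case:

                # delta_T = P / (mass_flow * c_p); c_p of air ~ 1005 J/(kg*K).
                def exhaust_temp_rise(power_w, airflow_m3_per_h,
                                      air_density=1.2, cp=1005.0):
                    mass_flow = airflow_m3_per_h / 3600.0 * air_density  # kg/s
                    return power_w / (mass_flow * cp)

                # 300 W spread over 200 m^3/h of case airflow warms that air by
                # only ~4.5 C, yet it is the same 300 W a space heater would emit.
                print(round(exhaust_temp_rise(300, 200), 1))   # 4.5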

      • by rossdee ( 243626 )

        The heat that I can control is electric. The furnace is controlled by a thermostat that is upstairs.
        And we live on the north side of the building

    • My basement is not heated so well, even though the furnace is down there. My computer keeps me warm.
    • Re:Power (Score:5, Insightful)

      by Chewbacon ( 797801 ) on Friday November 09, 2012 @04:45PM (#41936099)
      Efficiency matters to people who have many desktops around the home or office. Datacenters are focusing on efficient servers. Yeah, it does matter. Just because you're plugged into the wall doesn't mean that energy is infinite.
    • Re:Power (Score:5, Insightful)

      by xlsior ( 524145 ) on Friday November 09, 2012 @05:02PM (#41936317)
      Hey, it's winter already: a watt used by your CPU is one watt less that has to come from your radiant or convective heater.

      Except in the summer every watt used by your CPU requires your air conditioner to use more energy to counteract it.
    • by geekoid ( 135745 )

      Wrong.

      You assume one watt of electric being converted to heat is the same as one watt converted by a heater. There are different devices with different inefficiencies.

      • You assume one watt of electric being converted to heat is the same as one watt converted by a heater.

        I think he's assuming that one dollar of electric is converted to as much heat as one dollar of something else.

        When that's true, then CPUs are good heaters.

        When that's false, then CPUs are second-rate heaters but OTOH you get some other kind of work out of them at that same time they heat, so maybe they're still ok. Or maybe they're not, depending on the cost difference and the value of the work.

        An
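
        A sketch of that dollar-for-dollar comparison. The energy prices, furnace efficiency, and heat-pump COP below are assumptions for illustration; plug in local rates:

          # Cost per kWh of heat delivered into the room, under assumed prices.
          ELECTRICITY_PER_KWH = 0.12   # assumed $/kWh
          GAS_PER_KWH = 0.04           # assumed $/kWh of fuel energy

          cpu_or_resistance = ELECTRICITY_PER_KWH / 1.0   # ~100% of input becomes heat
          gas_furnace = GAS_PER_KWH / 0.90                # assumed 90%-efficient furnace
          heat_pump = ELECTRICITY_PER_KWH / 3.0           # assumed COP of 3

          # ~0.12 vs ~0.044 vs 0.04 $/kWh of heat: the CPU's "free" heat only beats
          # other electric resistance heating, not gas or a heat pump, at these rates.
          print(cpu_or_resistance, round(gas_furnace, 3), round(heat_pump, 3))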

    • To an extent. Try selling a desktop that sucks down two kilowatts under full load and see how well it sells. Now look at the sales data and see that Intel's best-selling processors have dropped from 100W+ down to 77W, because, it seems, given two processors of similar price, both with sufficient processing power for the user's needs, consumers prefer the one that uses less power.

  • by AcidPenguin9873 ( 911493 ) on Friday November 09, 2012 @04:30PM (#41935903)

    Sigh. It seems there is a new, hip, propaganda trend on Slashdot: pro-ARM articles are posted, and a bunch of ARM zombies come out saying how anything ARM makes will (magically) be lower-power or more power-efficient than anything x86.

    So I'll start a tradition of posting this same response every time (originally posted by me here [slashdot.org]):

    "ARM isn't magic; there is nothing in the ARM ISA that makes it inherently lower power than x86. Yes, I'm counting all the decode hardware and microcode that x86 chips need to support legacy ISA. There just isn't much power burned there compared to modern cache sizes, execution resources, and queue/buffer depths which all high-performance cores need regardless of ISA. If you have an x86 processor that targets A9 performance levels, it will burn A9 power (or less if Intel makes it, given Intel's manufacturing advantage). If you have a ARM processor that targets Sandy Bridge performance levels, it will burn Sandy Bridge (or more) power."

    • Re: (Score:3, Funny)

      by ArcadeMan ( 2766669 )

      Aaaaaaaarrrrrrrmmsssss!!

  • by Weaselmancer ( 533834 ) on Friday November 09, 2012 @04:35PM (#41935963)

    It [Moore's Law] is just expressing itself differently as we begin to hit the wall on process-size decreases and speed increases. If wattage of the cpu goes down, you can pack more cores into the same area. Computing power is still going up.

    • If wattage of the cpu goes down, you can pack more cores into the same area.

      Most of those 64 cores will sit idle until programming techniques for making extreme parallelism reliable are taught in universities and vocational schools.
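
      For what it's worth, the basic pattern for spreading independent work across cores is already in the standard library; the hard part is work that isn't independent. A minimal sketch (the work function is a stand-in, not anyone's real workload):

        # Data-parallel sketch: spread independent work items across cores.
        from concurrent.futures import ProcessPoolExecutor

        def work(item):
            # Stand-in for a CPU-bound task on one independent chunk of data.
            return sum(i * i for i in range(item))

        if __name__ == "__main__":
            items = [100_000 + n for n in range(64)]
            with ProcessPoolExecutor() as pool:   # defaults to one worker per CPU
                results = list(pool.map(work, items))
            print(len(results))   # 64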

  • by medv4380 ( 1604309 ) on Friday November 09, 2012 @04:42PM (#41936059)
    Efficiency only really matters when supply is limited. On a cell phone or any portable system, power is limited, and an improvement in power efficiency will extend battery life. ARM is a good option when it comes to things like a tablet, but when you start to do everything an Intel-style chip can do, things get trickier. Sure, ARM probably has a lower floor, so its minimum power usage is a lot lower, but when you have it do everything in the same time span as an x86_64 does, it starts to look too similar to actually matter. Unless using an ARM processor can save me well over 500 bucks a year on my power bill, I don't see their efficiency as that much of an advantage.
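
    Rough numbers behind that power-bill claim; the electricity rate below is an assumption for illustration:

      # Annual cost of a sustained power difference at an assumed electricity rate.
      def annual_cost(delta_watts, dollars_per_kwh=0.12, hours_per_year=8760):
          return delta_watts / 1000.0 * hours_per_year * dollars_per_kwh

      # A 50 W difference running 24/7 is about $53/year at $0.12/kWh, so saving
      # "well over 500 bucks a year" would take a sustained gap of roughly 475 W.
      print(round(annual_cost(50), 2), round(annual_cost(475), 2))   # 52.56 499.32
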
  • I think at the end of the day, what really matters whenever Moore's law is invoked is the underlying issue of cost per transistor... I don't see cost ever being relegated to irrelevance.

    As transistors get cheaper you can take any combination of two paths:

    1. Build cheaper gear with same capabilities.

    2. Cram more into the same device to increase capabilities while maintaining price.

    Either way, Moore's law is still critically important to the industry no matter who wins a CPU architecture war.

    With regards to ARM vs

  • by Jackie_Chan_Fan ( 730745 ) on Friday November 09, 2012 @05:15PM (#41936505)

    Mobile chips are shit to people who need renderfarms, simulation farms etc. People still do real work out there.

    People who keep crapping on workstations and servers seem to think everyone just needs a computer for texting, facebook and angry birds.

  • by gr8_phk ( 621180 ) on Friday November 09, 2012 @05:29PM (#41936685)
    He misses another point (though his reference to competition hints at it). With Apple's switch from PowerPC to x86, and now this move to ARM, and Linux going mainstream via Android on ARM, software developers are getting ever better at making things portable and hence making the underlying CPU architecture irrelevant. Also notice that Android tried to make this explicit by running most stuff on the Dalvik VM.

    Sure, power efficiency and die-area are important in many places, but don't think ARM is somehow going to have a lock on that.
  • The display (monitor), input (keyboard), and sound have all increased in relative value. I think processor speed is not irrelevant, but it is less relevant than flat/touch screens, keyboard/voice recognition, and sound quality. Displays have never followed Moore's law, which is probably why they are now glued inseparably to the chip in tablets, so you have to replace the display when the chip goes.
  • by epine ( 68316 ) on Friday November 09, 2012 @06:15PM (#41937145)

    To me a PC is really just a smartphone in another form factor.

    I think we need some expert analysis on this one.

    All this was inspired by the principle--which is quite true within itself--that in the big lie there is always a certain force of credibility; because the broad masses of a nation are always more easily corrupted in the deeper strata of their emotional nature than consciously or voluntarily; and thus in the primitive simplicity of their minds they more readily fall victims to the big lie than the small lie, since they themselves often tell small lies in little matters but would be ashamed to resort to large-scale falsehoods. It would never come into their heads to fabricate colossal untruths, and they would not believe that others could have the impudence to distort the truth so infamously. Even though the facts which prove this to be so may be brought clearly to their minds, they will still doubt and waver and will continue to think that there may be some other explanation. For the grossly impudent lie always leaves traces behind it, even after it has been nailed down, a fact which is known to all expert liars in this world and to all who conspire together in the art of lying.

    The PC is used to create content. A smartphone is used to consume content. The PC functions autonomously (in a pinch). The smartphone is permanently welded to its cloud-nipple. The PC brings you smart ideas in shabby attire. The smartphone brings you shabby ideas in smart attire. The PC discourages walled gardens. A smartphone never leaves home without one.

    Wake me up when my smartphone comes with a holographic projector capable of conjuring up 40" of viewing pleasure at a comfortable focal plane, and either a haptic keyboard (gravitational hologram?) or a brainstem feed a million times better than Swype.

    Next we'll declare that mopeds and Harleys are the same form factor because there are more Asians than balding fat men. Clearly a moped is more like a Harley than a smartphone is like a PC.

  • The real issue here is whether ARM can lock up the market before Intel's offerings become highly competitive. The answer to that is clearly NO, they can't. Intel wants to compete in the mobile SOC market and they clearly have enough of a technology edge with their Fabs to jam their foot in the door before ARM can lock it. Intel doesn't need to blow away ARM here, they only need to make sufficient progress on power consumption to put themselves on near-equal ground. They've already shown that progress.
