Intel Power Stats

Intel Challenges ARM On Power Consumption... And Ties 163

Posted by Unknown Lamer
from the not-too-shabby dept.
GhostX9 writes "Tom's Hardware just published a detailed look at the Intel Atom Z2760 in the Acer Iconia W510 and compared it to the NVIDIA Tegra 3 in the Microsoft Surface. They demonstrate how the full Windows 8 tablet beats the Windows RT machine on power consumption, breaking consumption down into the contributions of the CPU, GPU, memory controller and display. AnandTech is also reporting similar findings, but only reports CPU and GPU utilization." Despite repeated claims that x86 is beating ARM here, they look neck and neck. Assuming you can make a meaningful comparison.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Neck AND Neck (Score:5, Informative)

    by Anonymous Coward on Monday December 24, 2012 @11:05PM (#42385351)

    Despite repeated claims that x86 is beating ARM here, they look neck in neck.

    It's neck and neck [thefreedictionary.com].

    • by Chemisor (97276)

      How about "equal"? A nice short word that is far more informative than an analogy to horse races, an event that no slashdotter has ever attended. Horses haven't been in use in a hundred years, it's time to get rid of horsey verbiage.

  • by magarity (164372) on Monday December 24, 2012 @11:05PM (#42385353)

    It's "neck and neck" as in a pair of horses very close together at the finish line.

    Sigh

    • Re: (Score:3, Interesting)

      by ohnocitizen (1951674)
      Neck in Neck seems like a more internet appropriate version. As in a series of images tucked away in a dark corner of imgur, briefly referenced on reddit before being removed by admins. Neck in Neck - "A filthy, gritty internet version of Neck and Neck."
      • Neck in Neck seems like a more internet appropriate version. As in a series of images tucked away in a dark corner of imgur, briefly referenced on reddit before being removed by admins. Neck in Neck - "A filthy, gritty internet version of Neck and Neck."

        Yes, but this just begs the question as to who we make the escape goat here on /.

    • The author must have written the summary while standing online.

    • neck 'n' neck

  • 'neck in neck'? (Score:4, Informative)

    by drainbramage (588291) on Monday December 24, 2012 @11:07PM (#42385355)

    Oh for crying out loud: Neck and Neck.
    Often used when describing two racers that are nearly even in position.

  • by Anonymous Coward on Monday December 24, 2012 @11:20PM (#42385391)

    If two processors are neck and neck in power consumption and one of them is x86, it means x86 is ahead. It's got better clock speeds and it's got more software going for it than ARM. Yes, we have a lot of Android apps, but I would rather have my Windows applications than those "apps" and their private internet. Unless neck and neck is for a processor Intel does not produce any more, it's clearly advantage Intel.

    • Re: (Score:3, Interesting)

      by Anonymous Coward

      Two processors are neck and neck. One costs $120 and the other costs $20.

      Which one has a brighter future?

      Especially now since people don't need to run all sorts of software. They just need android.

      • Re: (Score:3, Informative)

        by Ocker3 (1232550)
        What people? Enterprise IT staff are going to buy huge numbers of Win8 mobile devices that can authenticate to their networks at the OS level, removing the need for every app to authenticate itself. We have iPads that refuse to forget wireless accounts, meaning a user can get locked out (hitting the bad login limit quite fast) in a few minutes, and the iOS on them doesn't prompt the user for a corrected username/pw. Apple's support for Enterprise environments has been late and shoddy, especially if you don'
        • by iserlohn (49556) on Tuesday December 25, 2012 @06:35AM (#42386355) Homepage

          If the execs and the sales guys want their Apple devices, or Android devices for that matter, what the IT organization thinks is 100% irrelevant. I've seen this happening already in quite a few large organizations that aren't particularly famous for being early adopters of new tech. The next things to go are the standard Windows images - corporate images are normally of such poor quality that people complain about them constantly.

        • by Lennie (16154)

          Those won't be buying ARM that is for sure.

          Because Windows RT does not support any of these things. Only the Intel version does.

          That also means, if it's Bring-Your-Own-Device situation they'll be bringing the ARM-version.

          This is going to be fun to watch.

    • by poetmatt (793785)

      No, it doesn't.

      Why doesn't it mean x86 is ahead? Because x86 has had years of development ahead of ARM. Also because x86 uses proprietary microcode.

      So having them equal means ARM is a significant benefit.

      • by JimCanuck (2474366) on Tuesday December 25, 2012 @12:28AM (#42385625)

        No, it doesn't.

        Why doesn't it mean x86 is ahead? Because x86 has had years of development ahead of ARM. Also because x86 uses proprietary microcode.

        So having them equal means ARM is a significant benefit.

        The original x86 was introduced in 1978.

        The original ARM was introduced in 1985.

        That is just 7 years more than ARM, which has had 27 years of development since its first implementation. Plus all of the /. crowd and other self-described "experts" have been saying for years that a neck and neck tie between them for power consumption would never happen. Well, it did, so obviously this is a win for the x86.

      • x86 is a pyramid of kludges; ARM is alleged to be a clean design. That the designs are so close in effectiveness indicates that there really isn't a great difference in the system-level value of the designs, at least in the tests performed.

        General-purpose processor design is a heavily studied and fairly well understood body of information, and comparing 40 years of development with 30, 20, or 10 is irrelevant.

        • The ARM is supposed to be a cleaner design and the x86 is a kludge with backwards compatibility going back well over 30 years, but a tie with a processor that's a process node behind isn't too hot. The Intel is 32nm, the ARM is 28nm. The ARM should be better. They're doing the same thing.

    • it's got more software going for it than arm

      This product comparison [microsoft.com] from Microsoft leads me to believe that applications have to be rewritten to behave correctly on Windows 8 Pro. Notice the blurb about downloading apps from the Microsoft store. It does not say you can download any plain old exe file. The mention of Windows 7 applications could refer to those that have already been rewritten to be compatible with the tablet.

      If that's the case, iOS and Android apps written for the ARM architecture greatly outnumber those for x86.

    • by makomk (752139)

      Intel actually had to write an ARM emulator for their Android stuff because ARM has a decisive software advantage over x86 there. Sure, there are lots of x86 desktop applications, but how many of them are usable on a tablet? On a phone? For that matter, how many of them can be used without adding the substantial cost and system resource usage of a full Windows install?

    • by hazydave (96747)

      For Windows 8 vs Windows RT, maybe the tie goes to Intel. On Android, the tie certainly goes to ARM. But keep in mind, you're comparing a quad-core ARM to a dual-core x86, and this isn't even the best ARM for comparison anymore... plus, the Surface doesn't even use the faster T33 version. The Atom in question doesn't have a CPU speed advantage, but it has a huge memory bus advantage over the Tegra 3: nVidia's single 32-bit DDR3 bus versus Intel's dual 64-bit DDR3 bus. A comparison of the Nexus 10 might be mor

  • Doesn't mean a thing (Score:3, Interesting)

    by Tough Love (215404) on Monday December 24, 2012 @11:22PM (#42385395)

    Even if true (watch out for cognitive dissonance with respect to Intel power efficiency claims) it does not mean a thing if Intel cannot match the price. Currently something like $1 goes to ARM Holdings per chip. Let's see a bloated old monopolist get by on that.

    • Re: (Score:2, Interesting)

      Nope, they get MORE than $1 a chip, which means they have more to plow back into R&D. Truthfully though, it's an interesting question, but all told, unless the price is substantially different we're not talking a big deal. If you pay $5 more for your x86 tablet you won't really care, assuming it works at least as well and happens to have the features you wanted/be the brand you like/etc.

      I think the question is whether Intel will be able to push the x86 design down to EXTREME low cycles/watt levels. x86 ha

      • all told unless the price is substantially different we're not talking a big deal. If you pay $5 more for your x86 tablet you won't really care

        You're in outer space. Intel can't get by on $5/tablet, they need at least $50 or they will soon need to sell their head office. There is no way Intel can compete with ARM's royalty structure while continuing to live in the manner to which they have become accustomed.

        • There's more to it than that. Embedded chips are small and cheap, and sell in great numbers. Of course a shift in the market will bring changes in everyone's business, but I'm not at all sure Intel can't still bring in tons of money. It's a complex situation, going to a new level of commoditization.

    • by Patch86 (1465427)

      Correct me if I'm wrong, but ARM may only take $1 a chip because they're only the designer. The manufacturer must be taking a cut too, including a cut big enough to cover the manufacturing costs.

      Intel are vertically integrated, so their prices include the full cost of designing and making chips. To get a comparable cost, you'd need to add the costs together for ARM and, say, Qualcomm.

      Not that I'm saying your point is wrong; I've no idea what the figures are.

  • by arbiter1 (1204146) on Monday December 24, 2012 @11:26PM (#42385407)
    http://www.tomshardware.com/news/intel-arm-processor-soc-atom,17476.html [tomshardware.com] When that story was posted I said that all ARM was doing was poking the bear. It didn't take long for Intel to get there either. Just shows you shouldn't piss off a company with a lot of $ for R&D.
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Samsung will be presenting at the ISSCC on their 28nm "big-little".
      http://www.eetimes.com/electronics-news/4401645/Samsung-big-little--no-Haswell--Project-Denver-at-ISSCC
      >Samsung will detail a 28-nm SoC with two quad-core clusters. One cluster runs at 1.8 GHz, has a 2 MByte L2 cache and is geared for high performance apps; the other runs at 1.2 GHz and is tuned for energy efficiency.

      Need to see how it matches up to Samsung's latest 14nm proto.
      http://www.eetimes.com/electronics-news/4403838/Samsung-14nm-F

  • by obarthelemy (160321) on Monday December 24, 2012 @11:34PM (#42385435)

    First, those articles are very interesting, thanks to Intel for making them happen.

    Second, it's a good thing that Intel is catching up. I'm not a great Intel fan (rooting for the underdogs and all that), but still, I'm impressed.

    Third, isn't the OS choice biasing the results a bit? Would ARM fare better under a more ARM-oriented OS such as Android? Or is the power consumption profile, in the end, fully OS-independent?

    • by Anonymous Coward

      Windows RT still runs a Windows subsystem.

      Android's apps are really fragments of apps; the GUI is a different fragment from the service (the thing that does any grunt work if needed), etc. If you don't use a GUI bit, then that GUI bit never loads. If a service bit is running, its GUI bit can be, and usually is, closed.
      The broadcast intents mean apps that appear to be running actually aren't always running or even in memory. The broadcast intent fires (e.g. a minute timer, particular network events, lots of other ev

      • by Anonymous Coward

        ARM draws 10% of the power of Atom at idle, and Android runs mostly at idle even when you're using it to do stuff, because it's designed from day one that way. Windows uses a lot more processing power, and 'idle' on Windows literally means not using it at all, and even when you're not using it, the Atom is still drawing > 1W.

        • There you have it: a 1.5 mW-445 mW [semiaccurate.com] superscalar x86 processor.
      • by gl4ss (559668)

        android apps are as real apps as windows8rt /windows phone apps - not fragments as such really. this is done for permission isolation and other advantages, like not crashing the entire thing if something goes awry.

        you are aware that android apps run as their own user? how about you just go suck it in a ditch.

        happy xmas!(rt and wp still blow more than android though!)

        Android's apps are really fragments of apps; the GUI is a different fragment from the service (the thing that does any grunt work if needed), etc. If you don't use a GUI bit, then that GUI bit never loads. If a service bit is running, its GUI bit can be, and usually is, closed.
        The broadcast intents mean apps that appear to be running actually aren't always running or even in memory. The broadcast intent fires (e.g. a minute timer, particular network events, lots of other events...), wakes up the bit of code to handle it, executes, then returns, ending the fragment if necessary.
        Apps can be killed at any time, and are designed that way. Hence code is already written to handle it.
        Widgets on Android aren't anything special, just bitmaps; if the widget changes, it can be because an intent fired, and the tiny bit of code needed to redraw the fragment was loaded, executed, then discarded. They're not code constantly running.
        Apps are memory constrained on Android; on Windows they can grow beyond RAM, which unfortunately means paging to disk or flash. You can see why Android keeps the memory usage of apps down to a minimum given this limit; paging is no longer a fix when only flash is there, since writing to flash eats battery.

        All of the above is also true for Windows Store apps - which, to remind, are the only thing you can run on RT tablets, except for Explorer, desktop IE and Office. The whole point of that Metro thingy was to come up with not just a UI, but a whole application model that works well on mobile devices - meaning good battery life. To do that, it borrowed a lot of ideas and techniques from iOS and Android, including app lifecycle management.

        And yes, it does actually work. My Asus VivoTab RT has battery life jus

    • I think the underlying intent of the article is to show that the Microsoft Surface is a waste of time, and so it was Windows 8 focussed. They compared a Microsoft Surface with an Acer W510, and the Acer tied on power and won on performance. But also the Acer runs all Windows apps, so why would you buy the Microsoft Surface over the Acer W510?
      • what is the price difference?

        Beyond that... I think the Surface Pro type devices will win the day only because intel is moving to a super efficient design and you might as well get a full windows experience if you can....but again....only if the prices are close.

    • would probably be a much better comparison.
    • by AmiMoJo (196126) *

      Power consumption certainly does depend quite a bit on the OS, but more so on drivers I think. For example, MacBooks run longer on OSX than on Windows, but similar-spec laptops from other manufacturers match or outperform a MacBook running Windows. I doubt Apple puts much effort into their Windows drivers, whereas everyone else highly optimizes their systems for it.

      Windows RT is very new so probably isn't a fair comparison at this point. Maybe a few years down the line when it is more mature a fairer comparison c

  • technology node (Score:5, Insightful)

    by blackC0pter (1013737) on Monday December 24, 2012 @11:36PM (#42385441)
    The only issue here is that this is not an apples-for-apples comparison: 40nm vs. 32nm should give a huge benefit to the 32nm Atom. We need to compare the same technology node for this to make any sense. Also, looking at the idle CPU power consumption in the AnandTech article, the Atom SoC used 10x more power.
    So the real question is: what do most tablets spend the majority of their time doing? Running a benchmark at full or half speed, or sitting with the SoC idle?
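The idle-versus-load question above can be put into rough numbers. A minimal duty-cycle sketch, with purely hypothetical power figures (only the roughly 10x idle gap comes from the figures quoted in the comment; everything else is made up for illustration):

```python
# Time-weighted average power for a device that is mostly idle.
# Hypothetical figures: both SoCs draw ~2 W under load, but the Atom
# idles at ~10x the ARM's idle draw, echoing the quoted idle gap.

def average_power(idle_w, load_w, load_fraction):
    """Average draw when the device is under load `load_fraction` of the time."""
    return load_fraction * load_w + (1 - load_fraction) * idle_w

arm_avg = average_power(0.02, 2.0, 0.05)   # hypothetical ARM SoC: 20 mW idle
atom_avg = average_power(0.20, 2.0, 0.05)  # hypothetical Atom SoC: 200 mW idle
print(f"ARM ~{arm_avg:.3f} W, Atom ~{atom_avg:.3f} W")
# prints: ARM ~0.119 W, Atom ~0.290 W
```

With a 5% duty cycle the idle draw dominates the average, which is why a 10x idle gap can matter more than a tie at full load.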
    • Re:technology node (Score:5, Insightful)

      by jiteo (964572) on Tuesday December 25, 2012 @12:18AM (#42385585)
      One of Intel's weapons has always been process size. So while it's not a fair comparison if you're doing science, it's a fair comparison if you're wondering what tablet to buy.
      • by Sycraft-fu (314770) on Tuesday December 25, 2012 @12:52AM (#42385689)

        Why are they a node ahead all the time? Because they spend billions in R&D. When the downturn hit everyone in the fab business cut R&D, except Intel. So now they have a 22nm fab that has been running for awhile, another that just came fully online, and two 14nm fabs that'll be done soon (one on 450mm wafers).

        They do precisely what geeks harp on companies to do: Invest money in R&D, invest in tech. They also don't outsource production, they own their own fabs and make their own chips. Most of them are even in the United States (8 of the 11).

        The payoff is that they are ahead of people in terms of node size, and that their yields tend to be good (because the designers and fab people can work closely).

        If other companies don't like it, well the only option is to throw in heavy on the R&D front. In ARM's case being not only fabless but actually chipless, just licensing cores to other companies, they can't do that. They are at the mercy of Samsung, TSMC, Global Foundries, and so on.

        • by rrohbeck (944847)

          And even though they're way in front technology wise, they keep pissing everybody off with artificial market segmentation. Why?

        • They also don't outsource production, they own their own fabs and make their own chips

          Outsourcing production is not necessarily a bad thing, as it allows specialisation. Intel can afford it because they are a big player, but for other companies it makes sense to share the fab R&D costs with others, including with their competitors. They then compete based on their strengths (chip design), and the manufacturers compete based on their process technology.

        • I'd love to see reviews start taking into account cost and also DMIPS/watt. Cost is a major driving factor for OEMs to consider ARM, but I imagine NDAs are going to keep a lid on this advantage. And if Intel starts matching ARM pricing, will they make enough profit to keep investing so much money to stay 1-2 years ahead with their foundries? On a side note, I meant to cite the Tom's Hardware review and not the AnandTech review in my original post.

          The second battle to watch is the upcoming server CPU/SOC
      • by Anonymous Coward
        Nobody is buying Windows tablets, so it's a pointless comparison. Like arguing over whether Rosie O'Donnell or Roseanne Barr has a tighter pussy.
    • Unfortunately this article lacks detail, but it seems that Bernstein Research considers Intel's latest smartphone designs to be as energy efficient as competitors'. [mercurynews.com] Intel's latest smartphone chip is Medfield, which is 32nm. Unfortunately the article does not say which chips they compared... but it would be surprising if they didn't include Qualcomm (Snapdragon S4 @ 28nm node) in their comparison. So we at least have some indirect evidence that even when they are at the same technology node, Intel's design
    • by Lennie (16154)

      Of course this wasn't an apples for apples comparison, there was no iPad ;-)

    • by Patch86 (1465427)

      The comparisons that matter are dollars, watts and performance benchmarking. If the processors are a similar price, similar power consumption, but one is a much better performance, you have a winner. Same goes for the other variations.

      That one of the competitors is made of magic pixie dust is neither here nor there to the consumer. If Intel have achieved a victory by using a more advanced technology, then more power to them; it's hardly "cheating" the comparison.

  • by Anonymous Coward

    Example numbers: ARM CPU 0.0038 W vs. Atom 0.02 W.
    NVidia GPU 0.21 W vs. Imagination 0.11 W.
    The part that wins isn't from Intel, and it is available for ARM and it probably is the part that would lose badly in any benchmark.
    Yay for biased benchmarking.
    So far Intel wins by undersizing the GPU.
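Taking the comment's own example numbers at face value (and assuming they are comparable measurements of the same workload), a quick sum makes the point concrete: the Atom platform's lower total comes from its GPU, not its CPU:

```python
# CPU + GPU totals (watts) from the example numbers quoted above.
arm_platform = 0.0038 + 0.21   # ARM CPU + NVidia (Tegra) GPU
atom_platform = 0.02 + 0.11    # Atom CPU + Imagination GPU

# The ARM CPU draws less than the Atom, but the Imagination GPU's lower
# draw is what pulls the Atom platform's total below the ARM platform's.
print(f"ARM platform {arm_platform:.4f} W, Atom platform {atom_platform:.2f} W")
# prints: ARM platform 0.2138 W, Atom platform 0.13 W
```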

  • by steveha (103154) on Monday December 24, 2012 @11:51PM (#42385501) Homepage

    I have said it before [slashdot.org]: with ARM, you can choose from multiple, competing chip vendors, or you can license the ARM technology yourself and make your own chips if you are big enough; with x86, you would be chaining yourself to Intel and hoping they treat you well. So, if low-power x86 is neck and neck with ARM, that's not good enough.

    Intel is used to high margins on CPUs, much higher than ARM chip makers collect. Intel won't want to give up on collecting those high margins. If Intel can get the market hooked on their chips, they will then ratchet up the margins just as high as they think they can.

    The companies making mobile products know this, and will not lightly tie themselves to Intel. So long as ARM is viable, Intel is fighting an uphill battle.

    • by ikaruga (2725453)
      Not to mention backward software compatibility: recompiling and debugging all those iOS and NDK-based Android apps all over again doesn't sound like something developers will like.
      • It shouldn't be an issue in this day and age.

        • by ikaruga (2725453)
          Lazy devs are an issue in all ages. Even something as simple as changing the target device and maybe changing a couple of parameters in a project can make people moan. Plus, resource-intensive apps may still require some low-level code. "Luckily" such apps seem to be very rare in the consumer mobile app market.
      • by CODiNE (27417) on Tuesday December 25, 2012 @01:43AM (#42385791) Homepage

        Actually all those iOS apps already run on Intel; the Xcode simulator runs Intel code, not ARM code. Android also runs on Intel, but I believe most apps are emulated during development, so they might need slightly more tweaking than an iOS app to get running on Intel.

      • Native ARM apps can run on Intel Android thanks to libhoudini. Which is actually really good performance-wise - it is probably JITing ARM code to Intel. Unfortunately it is only legally usable on Medfield chips.

        http://grokbase.com/p/gg/android-x86/12a35ssv8e/commercial-application-testing [grokbase.com]

  • Poor comparison (Score:5, Interesting)

    by markdavis (642305) on Monday December 24, 2012 @11:58PM (#42385521)

    Interesting that they are not comparing to a *modern* ARM chip (Cortex-A15), like the Exynos 5 (5250) or even a Qualcomm Krait S4 (perhaps MSM8960).

    So the news is that Intel has mostly caught up to an old ARM-based chip, based on designs/specs years older still, and only running under MS-Windows. Yawn....

  • The main problem is likely the compiler.
  • by Morgaine (4316) on Tuesday December 25, 2012 @01:22AM (#42385747)

    One area in which Intel is significantly more open than any manufacturer in the ARM ecosystem is in graphics hardware. Although Intel hasn't opened all their GPUs fully yet (from what I've read), this seems to be mostly because providing all the documentation takes time, not because they are against making everything open.

    This contrasts dramatically with every single ARM licensee in existence. ARM's own Mali GPU is tightly closed (probably because Mali was a licensed technology), so the Lima team is having to reverse engineer a Linux driver. The ARM licensees who provide GPUs all seem to be either unable to open their GPU information because their GPU core has been licensed from a 3rd party, or simply uninterested in doing so, or vehemently opposed to it for alleged commercial reasons in at least a couple of cases. So the prospect of open documentation on SoC GPUs appearing from ARM manufacturers is vanishingly small.

    This gives Intel at least one possible opening through which they can be fairly certain that the competition will not follow. Although that may be worth a lot to us in this community, the commercial payback from community support tends to be very slow in coming. Still, it's something that Intel might consider an advantage worth seizing in the mobile race where they're a rank outsider.

  • by EmperorOfCanada (1332175) on Tuesday December 25, 2012 @01:53AM (#42385807)
    One thing to keep in mind is that ARM is much more general purpose, while Intel chips tend to have a more complex assembly instruction set. So for adding one number to another (x=y+z), I suspect the simpler ARM architecture is going to win on power consumption. But many Intel chips have assembly instructions specifically for crazy things like AES encryption. AES is used as the basis of many encryption protocols, hashing, and random number generation. So if a machine is basically serving up all encrypted data, then it is possible that an Intel chip will be much faster and consume much less power while performing these operations. Depending on whether the software will take advantage

    So I think this is a case where you really have to look at the significantly broken-down performance results to see if your use case fits one chip better than the other. A normal consumer example would be if your OS is encrypting your file system and using these cool Intel instructions. I suspect that it would then be a night and day difference in battery drain. But the drag is that you probably have to pretty well buy a device with both chips, set up your standard configuration, and then test it out. This is generally only something an IT person about to provision a department might be expected to do.

    I guess the overall benchmark is all we really have to go by, which really doesn't tell the whole story.
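The encryption argument above is really about energy per unit of work rather than instantaneous power. A hedged sketch with made-up throughput and power figures (hardware AES speedups of roughly this order are commonly reported, but none of these numbers come from the articles):

```python
# Energy per unit of work, not instantaneous power, is what drains a battery.
# Hypothetical figures: a hardware AES path may draw slightly more power yet
# finish so much faster that it uses far less energy per megabyte.

def joules_per_mb(power_w, throughput_mb_s):
    """Energy spent encrypting one megabyte at a given power and throughput."""
    return power_w / throughput_mb_s

software_aes = joules_per_mb(1.0, 50)     # no AES instructions: 1 W at 50 MB/s
hardware_aes = joules_per_mb(1.2, 1000)   # AES instructions: 1.2 W at 1000 MB/s
print(f"software {software_aes:.4f} J/MB, hardware {hardware_aes:.4f} J/MB")
# prints: software 0.0200 J/MB, hardware 0.0012 J/MB
```

Under these assumed numbers the hardware path draws 20% more power but spends well under a tenth of the energy per byte, which is the sense in which dedicated instructions can mean "faster and less power" at once.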
    • by willy_me (212994) on Tuesday December 25, 2012 @02:59AM (#42385965)

      One thing to keep in mind is that ARM is much more general purpose, while Intel chips tend to have a more complex assembly instruction set. So for adding one number to another (x=y+z), I suspect the simpler ARM architecture is going to win on power consumption. But many Intel chips have assembly instructions specifically for crazy things like AES encryption. AES is used as the basis of many encryption protocols, hashing, and random number generation. So if a machine is basically serving up all encrypted data, then it is possible that an Intel chip will be much faster and consume much less power while performing these operations.

      Not really important. The Intel chips convert assembly instructions into microcode - how they implement them internally (either dedicated hardware or reusing existing silicon) is up to them. You can't make a blanket statement like that unless Intel has specifically stated that hardware support is included. But in general, the Atom series trims as much off the CPU core as possible, so don't be surprised if hardware support for some of those exotic instructions is lacking. And many ARM cores include instructions that are just as interesting - mostly for the embedded DSP market. A manufacturer with the appropriate license can include whatever instructions and dedicated hardware they want.

      What likely matters more than the instructions is the included memory and cache. Intel likely includes a larger cache - which will drive up the price. Cache is usually static and has a very low power draw when not in use. By including a large cache, Intel can minimize expensive requests to memory. Also note that DIMMs have a significant constant current draw. Low-power DIMMs are available but more expensive. You can bet that Intel used the latest and greatest for their demo, while others might opt for the cheaper and slightly more power hungry DIMMs.

      This demo shows how having a process one step more advanced than the competition can make a big difference wrt power consumption. But newer ARMs will be available soon - I believe Samsung is scheduled to roll out 28nm in the near future. Intel still has a long way to go to convince manufacturers that they should pay more for what ARM can do for less.

      • You must mean RAM chips, and even those are often on-chip in these SoC systems. The main thing here is price point, and since Intel is the only manufacturer and uses a very expensive fab at 32nm, their system is far more expensive to buy than a "generic" 40nm-fab ARM chip. You are right that the Intel is, under the hood, just as RISC as the ARM chip is. The point seems to be that by using a more expensive smaller fab, Intel can sort-of offset the extra power required for the on-the-fly translation of x86 in

        • by nojayuk (567177)

          "You must mean RAM chips and even those are often on-chip on these SoC systems."

          Nope. A DRAM of any significant capacity (256MB or better) has a similar die size to an SoC chip. An SoC will usually have some RAM on board for buffers, cache, maybe low-end graphics support, but the main memory in tablets, phones etc. resides on separate DRAM chips. A typical 2Gb DDR3 die is about 30 sq. mm, whereas the Tegra 3, with 5 cores and over a MB of cache, is 80 sq. mm.

          Devices like the Raspberry Pi use package-on-pack

    • But many Intel chips have assembly instructions specifically for crazy things like AES encryption.

      You picked a pretty poor example, as ARMv8 includes instructions for AES. You should also look at the NEON instruction set on ARM, which has a number of fairly complex floating point operations. The advantage of the microcode on an x86 chip is greater instruction density, meaning less instruction cache usage, so you can have a smaller instruction cache, which means less power consumption. The disadvantage is that you have a significantly more complex instruction decoder, which means more power consumption. T

    • by Pinhedd (1661735)

      Intel chips are nothing more than dressed up RISC processors. The high level CISC instructions are converted into RISC micro-ops before execution. Similarly, no one in their right mind would call ARMv7/ARMv8 "reduced"

  • Battery life is not the reason we don't want Windows tablets. Windows tablets suck. Might as well evaluate which one makes a better skateboard.
  • by Anonymous Coward

    They are comparing old ARM vs. new Intel!

  • by Anonymous Coward

    Tom's Hardware was bought outright by Intel years ago, and has since written only glowing reviews of trending Intel products. What else did you expect to come out of that now-defunct propaganda machine?

  • Sounds like Intel is leading ARM along...give them a bit of a head start, then catch up, then...<snuff>

"Trust me. I know what I'm doing." -- Sledge Hammer

Working...