Intel Hardware

Intel Details Silvermont Microarchitecture For Next-Gen Atoms 82

Posted by samzenpus
from the check-it-out dept.
crookedvulture writes "Since their debut five years ago, Intel's low-power Atom microprocessors have relied on the same basic CPU core. That changes with the next generation, which will employ an all-new Silvermont microarchitecture built using a customized version of Intel's tri-gate, 22-nm fabrication process. Silvermont ditches the in-order design of previous Atoms in favor of an out-of-order approach based on a dual-core module equipped with 1MB of shared L2 cache. The design boasts improved power sharing between the CPU and integrated graphics, allowing the CPU cores to scale up to higher speeds depending on system load and platform thermals. Individual cores can be shut down completely to provide additional clock headroom or to conserve power. Intel claims Silvermont doubles the single-threaded performance of its Saltwell predecessor at the same power level, and that dual-core variants have lower peak power draw and higher performance than quad-core ARM SoCs. Silvermont also marks the Atom's adoption of the 'tick-tock' update cadence that guides the development of Intel's Core processors. The successor to Silvermont will be built on 14-nm process tech, and an updated microarchitecture is due after that."
This discussion has been archived. No new comments can be posted.

  • by spamchang (302052) on Monday May 06, 2013 @02:07PM (#43644937) Journal

    Silvermont is just a core (CPU). It sits inside an SoC (system on chip), and your final power figures will still depend on the efficiency of the rest of the SoC (the GPU, the IO interfaces, the memory interfaces, any other dedicated hardware, etc.). And even then, the integration of technology is getting to the point where the SoC's power consumption is only one of the factors limiting battery life. During lower power states and standby states, the comms units, the display, etc. can all consume way more power than the core.

    • by Amouth (879122) on Monday May 06, 2013 @02:25PM (#43645115)

      During lower power states and standby states, the comms units, the display, etc. can all consume way more power than the core.

      Which is great really, because only a few years ago it was at the top of the list for power consumption. Once one component gets to the bottom, we can start picking off the next heavy hitter in power consumption. It makes sense to work on what is hurting the most, and the CPU was hurting the most; now we can shift focus onto the next big one. That doesn't mean the CPU group should slow down, though, or they will soon be back at the top of that list.

      • by SJHillman (1966756) on Monday May 06, 2013 @02:27PM (#43645135)

        If we keep this up, then eventually we'll have computers with negative power consumption and I can start using it as an air conditioner rather than a space heater.

        • by ArcadeMan (2766669) on Monday May 06, 2013 @03:03PM (#43645569)

          I'm in Canada. I use AMD in the winter and Intel in the summer.

          • by Anonymous Coward

            All fun aside, the largest heat generators in my setup are the screens, not the CPU, HD, or the rest. While the computer uses 50-60W (AMD + nVidia 650 + 16GB + 2 HDs) at idle, the 3 LCD screens are well over 200W. And even if I upgraded to all LED-backlit LCDs, they would still draw more than 2x the computer.

            The largest improvement in heat reduction from the computer has been replacing a regular power supply with an active PFC, 80-90% efficient power supply.

            Inefficient power supplies are by far the largest waste of power.

            • by wagnerrp (1305589)
              80Plus ratings only measure PSU efficiency down to 20% capacity. Chances are at a mere 50W idle, you're running well below that level.
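A rough sketch of that arithmetic (the 450 W rating and both efficiency figures below are illustrative guesses, not measured values):

```python
# 80 Plus certification only specifies efficiency at 20%, 50%, and
# 100% of rated capacity, so deep-idle efficiency is uncharted.
# The 450 W rating and the efficiency values are assumptions.

def wall_draw(dc_load_w, efficiency):
    """Power pulled from the wall to deliver dc_load_w to the system."""
    return dc_load_w / efficiency

psu_rating_w = 450.0   # hypothetical mid-size PSU
idle_load_w = 50.0     # the idle draw quoted in the thread

load_fraction = idle_load_w / psu_rating_w
print(f"idle load fraction: {load_fraction:.0%}")  # ~11%, below the 20% floor

# Efficiency often sags below the lowest certified point; compare a
# guessed 70% at deep idle against a certified-range 85%:
for eff in (0.70, 0.85):
    wall = wall_draw(idle_load_w, eff)
    print(f"at {eff:.0%} efficiency: {wall:.1f} W from the wall, "
          f"{wall - idle_load_w:.1f} W lost as heat")
```

The gap between those two cases (roughly 12 W of extra heat) is the kind of difference the grandparent attributes to the PSU swap.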
            • your *PFC power supply doesn't do that much if the house you live in is already doing power factor correction (it should be) at the different circuits around your house
              • by amorsen (7485)

                your *PFC power supply doesn't do that much if the house you live in is already doing power factor correction (it should be) at the different circuits around your house

                That makes no sense. How would the house cure phase distortion?

          • by operagost (62405)
            Intel used to make a nice winter space heater: Pentium 4.
        • by Anonymous Coward

          But wouldn't your work get undone? I'd rather not have negative fps when gaming.

      • by evilviper (135110)

        Which is great really, because only a few years ago it was top of the list for power consumption.

        That's utter nonsense. Displays (backlights in particular) have always consumed several times as much as the CPU being used. This is true at least as far back as 386 laptops, and I haven't ever seen an exception... I suppose some idiot, somewhere, might have crammed a Pentium 4 Extreme Edition into a tiny laptop, but I doubt you can find a salable device anywhere where the CPU was the biggest power consumer.

    • Damn you, Gene Amdahl!

    • To be fair though - even current Clover Trail Atom SoCs are astoundingly low power. It's one of the few good things about the Win8 tablet I bought. The (30Wh) battery lasts surprisingly long... I haven't gotten below 50% in a day (and that's with extremely heavy use, with nearly permanent inking in OneNote). I'd say I'm averaging less than 2W total power consumption (that's including the display and network connections).
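The parent's sub-2W estimate is easy to sanity-check. The battery capacity and remaining charge come from the comment; the 8-hour usage window is an assumed figure:

```python
# Back-of-the-envelope average power draw for the Clover Trail tablet
# described above. hours_of_use is an assumption (one heavy working
# day); the other two numbers are as stated in the comment.

battery_wh = 30.0       # stated battery capacity
fraction_used = 0.50    # "haven't gotten below 50% in a day"
hours_of_use = 8.0      # assumption, not from the comment

energy_used_wh = battery_wh * fraction_used
avg_power_w = energy_used_wh / hours_of_use
print(f"average draw: {avg_power_w:.2f} W")  # ~1.88 W, under the 2 W claim
```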

  • peak power lower (Score:4, Informative)

    by Anonymous Coward on Monday May 06, 2013 @02:20PM (#43645061)

    If power consumption when lightly loaded is competitive with ARM, then Intel may have something. Peak power consumption isn't as important for devices where the CPU is never pegged, or only pegged for a tiny fraction of the total time it is running.

    I have one ARM dev board with an Exynos 4 on it that has a huge heatsink on top. Pull the heatsink, and you never get anywhere close to the speed or power consumption you see with the heatsink at 100% CPU. I have yet to see a phone with a heatsink as big as the phone, so I suspect these phones *never* see 100% CPU, or only see it for such a short period of time (before thermal throttling takes place) that peak power usage is meaningless for most devices using ARM SoCs.

    I hope Intel pulls it off. It would be nice if power consumption factored larger in their other offerings too.

    • Got an ODROID also?
    • I have yet to see a phone with a heatsink as big as the phone

      Why not? Seriously. Why can't a phone chassis made of aluminum be the heatsink at the same time? There have been a few silent computer cases that have done this using heat pipes. No reason the chassis can't be affixed to the CPU via a thermal pad. At the very least, it makes for a nifty hand warmer in the winter (j/k).

    • Have you taken a look at the Atom Z2760? Running full Windows 8, it feels noticeably faster than most mainstream ARM SoCs... definitely faster than my Galaxy Nexus and Nexus 7. That may be down to the RAM though.

      • by spamchang (302052)

        Might also be that magic 24 fps framerate that UX designers have pegged as the gold standard for smoothness :) But Clover Trail SoCs can have a max CPU frequency of 2GHz.

    • by spamchang (302052)

      It's possible--Intel and ARM both have SoCs in mobile phones right now, and none of those phones have heatsinks as you've described :) You can run the processors fairly hot, but when you trip a certain thermal limit, CPU throttling will kick in. For the amount of time you can run a processor at 100% speed without throttling, you ought to be able to finish whatever it was that you needed to do...don't loop Dhrystone all day!

  • AMD (Score:3, Interesting)

    by chevelleSS (594683) on Monday May 06, 2013 @02:25PM (#43645125) Homepage
    It's going to be interesting to compare this to AMD's new G-series low-power processors. The G-series will have a GPU attached, similar to what will go into the PS4 and the Xbox 720.
  • by alen (225700) on Monday May 06, 2013 @02:28PM (#43645147)

    If they cost the $649 that the iPhone 5 or Galaxy S4 costs, what is the point in switching?

    I'd rather buy something that has market share unless there is a compelling reason to buy something else.

    • by Solandri (704621) on Monday May 06, 2013 @03:36PM (#43645987)
      Intel's Atom processors typically retail for $30-$80, with some being more, some less. OEM pricing is lower. That's just the CPU, so it's not directly comparable to ARM-based SoCs, which I hear cost about $15-$25. So Intel is substantially higher priced, but not ruinously so from an end user's purchase standpoint. Certainly not $650.

      The more interesting thing to watch will be how this impacts the broader computing market. Intel has managed to stay ahead of the competition buoyed by the enormous profits it generates from its Core CPUs, which typically sell for $100-$400. As CPUs get faster, the general population can get by with something lower down the product chain. I've already been recommending i3s to most of my customers for the last couple years. I'm very close to dropping the bar to high-end Atom or AMD CPUs. As more and more of Intel's sales shift towards these lower-end CPUs, their overall profit margin will start to dry up. It's going to be interesting to watch how they'll react to that.
      • i3 for low-profile notebook users, i5 for desktops and laptops that use a docking station, and i7 if you're doing multimedia or other workstation-class work (CAD, geophysics, etc.).

        For in-office desktop computers such as a Dell OptiPlex 3010 or 7010 series, I recommend an i5, as anti-virus software and Windows Updates including .NET updates (trustedinstaller.exe and mscoree.dll) can take a substantial amount of processing power. An i5 also helps if you plan on backing up to the cloud or providing remote IT assistance.

  • I love reading articles from back in December that call out Intel's bs. http://www.electronicsweekly.com/mannerisms/markets/intel-has-no-process-advantage-2012-10/ [electronicsweekly.com]

    • by Algae_94 (2017070)
      I love seeing articles that scream about their bias in the headline.

      "Intel Has No Process Advantage In Mobile, says ARM CEO"
  • Silvermont looks pretty good. The only weak spot is the graphics. It only has 4 EUs compared to the 16 EUs in the HD 4000. The article says "I wouldn't be too surprised to see something at or around where the iPad 4's GPU is today". That's pretty unlikely: the iPad 4 has 76.8 GFLOPS, so the Silvermont GPU would have to be clocked at 1200 MHz to achieve the same performance (only the top-end Ivy Bridge parts are clocked that high).
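For what it's worth, the 1200 MHz figure above is easy to reproduce if you assume Gen7-style EU throughput of 16 FLOPS per EU per clock (two 4-wide ALUs, each retiring a multiply-add every cycle); that per-clock number is an assumption about the hardware, not a published spec:

```python
# Working the clock-speed arithmetic from the comment above.
# flops_per_eu_per_clock assumes Gen7-style EUs; treat it as a guess.

eu_count = 4                 # EUs in the Silvermont-generation GPU
flops_per_eu_per_clock = 16  # assumed Gen7-style EU throughput
target_gflops = 76.8         # iPad 4 GPU figure quoted in the comment

required_ghz = target_gflops / (eu_count * flops_per_eu_per_clock)
print(f"required clock: {required_ghz * 1000:.0f} MHz")  # 1200 MHz
```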

  • Always hard to read the tea leaves, but I predict new netbooks will catch the market by surprise. I believe a wave of $350 netbooks running Bay Trail and Windows 8.1 will prove pretty popular. This will, of course, cannibalize $1000 ultrabook sales, so it won't necessarily be a revenue success. But Bay Trail would definitely make Netbook 2.0 pretty compelling.

  • by hsa (598343) on Tuesday May 07, 2013 @07:13AM (#43651679)

    * Numbers may be subject to change once verified with the actual parts.

    http://images.anandtech.com/doci/6936/Screen%20Shot%202013-05-06%20at%2011.16.42%20AM.png [anandtech.com]

    So this is marketing pulling figures out of somewhere and posting them as the Ultimate Truth, without actually having the hardware to test them with?

  • Is 'Tick' when they add more DRM and 'Tock' when they add the backdoors for the state security organs, or is it the other way around?
