Intel Details Silvermont Microarchitecture For Next-Gen Atoms
crookedvulture writes "Since their debut five years ago, Intel's low-power Atom microprocessors have relied on the same basic CPU core. That changes with the next generation, which will employ an all-new Silvermont microarchitecture built using a customized version of Intel's tri-gate, 22-nm fabrication process. Silvermont ditches the in-order design of previous Atoms in favor of an out-of-order approach based on a dual-core module equipped with 1MB of shared L2 cache. The design boasts improved power sharing between the CPU and integrated graphics, allowing the CPU cores to scale up to higher speeds depending on system load and platform thermals. Individual cores can be shut down completely to provide additional clock headroom or to conserve power. Intel claims Silvermont doubles the single-threaded performance of its Saltwell predecessor at the same power level, and that dual-core variants have lower peak power draw and higher performance than quad-core ARM SoCs. Silvermont also marks the Atom's adoption of the 'tick-tock' update cadence that guides the development of Intel's Core processors. The successor to Silvermont will be built on 14-nm process tech, and an updated microarchitecture is due after that."
Chips with 5x lower power consumption? (Score:5, Insightful)
Silvermont is just a core (CPU). It sits inside an SoC (system on chip), and your final power figures will still depend on the efficiency of the rest of the SoC (the GPU, the I/O interfaces, the memory interfaces, any other dedicated hardware, etc.). And even then, integration has reached the point where the SoC's power consumption is only one of the factors limiting battery life. During low-power and standby states, the comms units, the display, etc. can all consume far more power than the CPU core.
Re:Chips with 5x lower power consumption? (Score:5, Insightful)
During low-power and standby states, the comms units, the display, etc. can all consume far more power than the CPU core.
Which is great, really, because only a few years ago the CPU was at the top of the list for power consumption. Once it reaches the bottom, we can start picking off the next heavy hitter. It makes sense to work on whatever is hurting the most, and the CPU was hurting the most; now the focus can shift to the next big consumer. That doesn't mean the CPU group should slow down, though, or they'll soon be back at the top of that list.
Re:Chips with 5x lower power consumption? (Score:5, Funny)
If we keep this up, then eventually we'll have computers with negative power consumption, and I can start using mine as an air conditioner rather than a space heater.
Re:Chips with 5x lower power consumption? (Score:5, Funny)
I'm in Canada. I use AMD in the winter and Intel in the summer.
Re: (Score:1)
All fun aside, the largest heat generators in my setup are the screens, not the CPU, the hard drives, or the rest. While the computer draws 50-60W at idle (AMD CPU + Nvidia 650 + 16GB + 2 HDs), the three LCD screens are well over 200W. And even if I upgraded to all LED-backlit LCDs, they would still draw more than twice what the computer does.
The largest improvement in heat reduction from the computer has been replacing a regular power supply with an active power-factor-correcting (PFC), 80-90% efficient power supply.
Inefficient power supplies are by far the largest waste of pow
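For a sense of scale on the efficiency point, here is a rough Python sketch; the 60 W DC-side load and the 70% figure for an older supply are illustrative assumptions, not measurements from the post:

    # Wall draw and waste heat for a given DC-side load at two PSU efficiencies.
    # All numbers here are illustrative assumptions.
    dc_load_w = 60.0                      # assumed DC-side load
    for efficiency in (0.70, 0.90):       # assumed old vs. new PSU efficiency
        wall_w = dc_load_w / efficiency   # power drawn from the wall
        waste_w = wall_w - dc_load_w      # dissipated in the PSU as heat
        print(f"{efficiency:.0%}: {wall_w:.1f} W at the wall, {waste_w:.1f} W of heat")
    # 70% -> 85.7 W at the wall, 25.7 W of heat
    # 90% -> 66.7 W at the wall,  6.7 W of heat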
Re: (Score:2)
Your PFC power supply doesn't do that much if the house you live in is already (as it should be) doing this for the different outlets around your house.
That makes no sense. How would the house cure phase distortion?
Re: (Score:1)
But wouldn't your work get undone? I'd rather not have negative fps when gaming.
Re: (Score:2)
That's utter nonsense. Displays (backlights in particular) have always consumed several times as much power as the CPU. This is true at least as far back as 386 laptops, and I haven't ever seen an exception... I suppose some idiot, somewhere, might have crammed a Pentium 4 Extreme Edition into a tiny laptop, but I doubt you can find a salable device anywhere in which the CPU is the biggest power consumer.
Re: (Score:2)
Well, a quick Google search for "laptop power consumption by component" turns up a PDF as the first link:
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.87.5604&rep=rep1&type=pdf [psu.edu]
It's a fairly well-done research paper. Sure, at idle the screen consumes the most, but under load the CPU dominates, and that is still true in a lot of newer laptops.
Re: (Score:2)
Mobile Atoms on the market today (i.e., Medfield) are competitive with any ARM processor under idle/sleep conditions.
Re: (Score:1)
Damn you, Gene Amdahl!
Re: (Score:2)
To be fair though, even current Clover Trail Atom SoCs are astoundingly low power. It's one of the few good things about the Win8 tablet I bought. The (30Wh) battery lasts surprisingly long... I haven't gotten below 50% in a day (and that's with extremely heavy use, with near-constant inking in OneNote). I'd say I'm averaging less than 2W total power consumption (and that's including the display and network connections).
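A rough back-of-the-envelope check on that estimate; the hours of use per day are my assumption, not from the post:

    # Average power implied by draining half of a 30 Wh battery in a day of use.
    battery_wh = 30.0        # pack capacity, per the post
    fraction_used = 0.5      # "haven't gotten below 50% in a day"
    hours_of_use = 8.0       # assumed hours of heavy use per day
    avg_watts = battery_wh * fraction_used / hours_of_use
    print(avg_watts)         # -> 1.875 W, consistent with "less than 2 W"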
peak power lower (Score:4, Informative)
If power consumption when lightly loaded is competitive with ARM, then Intel may have something. Peak power consumption isn't as important for devices where the CPU is never pegged, or is only pegged for a tiny fraction of its total running time.
I have one ARM dev board with an Exynos 4 on it that has a huge heatsink on top. Pull the heatsink, and you never get anywhere close to the speed (or power draw) you see at 100% CPU with the heatsink attached. I have yet to see a phone with a heatsink as big as the phone, so I suspect that these phones *never* see 100% CPU, or only see it for such a short period (before thermal throttling kicks in) that peak power usage is meaningless for most devices using ARM SoCs.
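As a toy illustration of why duty cycle matters more than peak draw, a quick sketch; all numbers below are made-up assumptions, not measurements:

    # Average core power for a workload that pegs the CPU only occasionally.
    peak_w, idle_w = 4.0, 0.1       # assumed peak and idle core power
    duty_cycle = 0.02               # assumed: pegged 2% of the time
    avg_w = duty_cycle * peak_w + (1 - duty_cycle) * idle_w
    print(round(avg_w, 3))          # -> 0.178 W; the idle figure dominates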
I hope Intel pulls it off. It would be nice if power consumption factored larger in their other offerings too.
Re: (Score:2)
Have you taken a look at the Atom Z2760? Running full Windows 8, it feels noticeably faster than most mainstream ARM SoCs... definitely faster than my Galaxy Nexus and Nexus 7. That may be down to the RAM though.
Re: (Score:2)
Might also be that magic 24 fps framerate that UX designers have pegged as the gold standard for smoothness :) But Clover Trail SoCs can have a max CPU frequency of 2GHz.
Re: (Score:2)
It's possible--Intel and ARM both have SoCs in mobile phones right now, and none of those phones have heatsinks like the one you describe :) You can run the processors fairly hot, but once you trip a certain thermal limit, CPU throttling kicks in. In the time you can run a processor at 100% speed before throttling, you ought to be able to finish whatever you needed to do... just don't loop Dhrystone all day!
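A purely illustrative sketch of that kind of throttling behaviour, in Python; read_temp_c() and set_freq_mhz() are hypothetical placeholders, and the trip point and frequency steps are assumptions, not any vendor's actual policy:

    import time

    TEMP_LIMIT_C = 85                     # assumed thermal trip point
    FREQS_MHZ = [2000, 1600, 1200, 800]   # assumed frequency steps

    def throttle_loop(read_temp_c, set_freq_mhz):
        """Step the clock down when too hot, back up once things cool off."""
        level = 0                              # index into FREQS_MHZ
        while True:
            temp = read_temp_c()
            if temp > TEMP_LIMIT_C and level < len(FREQS_MHZ) - 1:
                level += 1                     # too hot: drop to a lower clock
            elif temp < TEMP_LIMIT_C - 10 and level > 0:
                level -= 1                     # cooled down: step back up
            set_freq_mhz(FREQS_MHZ[level])
            time.sleep(0.1)                    # poll the sensor periodically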
AMD (Score:3, Interesting)
Re:AMD (Score:4, Informative)
If by similar you mean 1/18th the performance.
Re: (Score:1)
Hopefully, they are more successful than the Z-60, a product that had so few design wins that it almost never existed:
http://semiaccurate.com/2013/05/06/where-are-amds-z-60-tablets/ [semiaccurate.com]
how much will these cost? (Score:3, Insightful)
If they cost the $649 that the iPhone 5 or Galaxy S4 costs, what is the point in switching?
I'd rather buy something that has market share unless there is a compelling reason to buy something else.
Re:how much will these cost? (Score:5, Informative)
The more interesting thing to watch will be how this impacts the broader computing market. Intel has managed to stay ahead of the competition, buoyed by the enormous profits it generates from its Core CPUs, which typically sell for $100-$400. As CPUs get faster, the general population can get by with something lower down the product chain. I've already been recommending i3s to most of my customers for the last couple of years, and I'm very close to dropping the bar to high-end Atom or AMD CPUs. As more and more of Intel's sales shift toward these lower-end CPUs, their overall profit margins will start to shrink. It's going to be interesting to watch how they react to that.
Re: (Score:1)
However, I don't see the Asian phone makers switching to Atom, thanks to Qualcomm and Samsung. I do, however, see Nokia and maybe all of Motorola switching to Intel.
Re: (Score:1)
Not quite. Atom is still good for non-gaming, low-power workstations and laptops. That covers pretty much every computer I currently use, except the i5 at work, where my boss pays the electric bill.
More than one thing on the screen (Score:2)
Atom became irrelevant when netbooks gave way to tablets and phablets.
ARM tablets and phablets failed to make showing more than one thing on the screen at the same time a standard feature. If a tablet's screen is as big as three phones' screens, why can't it run three phone apps side by side? The only tablets that ship with multi-window multitasking as a standard feature of the operating system are Surface Pro and other Windows 8 tablets, and these use x86.
Re: (Score:2)
Samsung Galaxy Tab 2 has multi-window support.
Re: (Score:2)
Samsung Galaxy Tab 2 has multi-window support.
The problem is that the feature you mention is specific to Samsung products as opposed to being a standard feature of Android, and I've read that it only works with a few applications because Android applications are normally allowed to assume that the screen size never changes after installation [slashdot.org]. Do you expect every developer of an Android application to buy a Galaxy Tab 2 in order to certify the application for multi-window mode?
Re: (Score:2)
No offering from ARM or Samsung could do what I do with my Atom machine
If by "what I do with my Atom machine", you mean running x86 code, then perhaps you're right, but why exactly do you think that a 4-core, 1.5-2 GHz ARM solution with appropriate peripherals wouldn't be able to do the same thing?
Re: (Score:1)
Is there an ARM-based board on the market with PCIe, at least 4 SATA ports, eSATA, HDMI 1.4, dual-link DVI, S/PDIF, and 7.1 audio? If there is, it's hard to find, whereas there are multiple Atom offerings. I require and use all of these things, but I like a relatively low-power package. I'm not willing to solder up my own board either.
Re: (Score:2)
I agree (speaking from 2009) and (even here in 2013) still use an Atom/ION HTPC. I love it, as far as hardware-I-already-have goes. But you need to check out the lowest-power Ivy Bridge and upcoming Haswell parts. They are seriously encroaching on what used to be Atom's power usage, except much, much faster and without the need for any Nvidia chips or drivers. I am not kidding: think carefully and look at what's available
Re: (Score:2)
Not on price, though it appears that this generation of Atom and Core are not very different. I suspect they will continue to converge in future generations until there are various flavors in a wider range of the same base technology.
Re: (Score:2)
They will converge until one cannibalizes much of the other's market on the power consumption spectrum (guess which).
Re: (Score:2)
Atom is headed for more than just consumer phablet market segments. While you laugh, the roadmap is being laid down well outside the scope you just described.
Re: (Score:2)
When on earth did China manufacture Alphas? They were originally fabbed only by DEC, then Mitsubishi and Samsung got into the act, and finally DEC's fabs were sold to Intel and Compaq/HP ended the processor line. China was never involved in its manufacture.
Even for MIPS, China was never involved. Loongson was a Chinese company licensing a subset of the MIPS instruction set and making a CPU based on that. It is, however, different from MIPS in that it supports certain x86 instructions on-chip, which of cour
22nm my rear end (Score:1)
Re: (Score:3)
"Intel Has No Process Advantage In Mobile, says ARM CEO"
Graphics weak (Score:1)
Silvermont looks pretty good. The only weak spot is the graphics: it has only 4 EUs compared to the 16 EUs in the HD 4000. The article says, "I wouldn't be too surprised to see something at or around where the iPad 4's GPU is today." That's pretty unlikely. Considering that the iPad 4 does 76.8 GFLOPS, the Silvermont GPU would have to be clocked at 1200 MHz to achieve the same performance (only the top-end Ivy Bridge parts are clocked that high).
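The arithmetic behind that 1200 MHz figure, assuming the EUs are Gen7-style units doing roughly 16 FLOPS per clock (2 ALUs x 4-wide SIMD x 2 ops for a multiply-add); that per-EU rate is my assumption, not something stated in the article:

    # Required GPU clock for 4 EUs to match a 76.8 GFLOPS target.
    eus = 4
    flops_per_eu_per_clock = 16            # assumed Gen7-style throughput
    target_gflops = 76.8                   # commonly cited iPad 4 GPU figure
    required_ghz = target_gflops / (eus * flops_per_eu_per_clock)
    print(required_ghz)                    # -> 1.2, i.e. a 1200 MHz GPU clock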
Re: (Score:2)
Looks like AMD's decision to prioritize its marketing budget at the expense of its engineering staff is paying off.
Netbook 2.0 is coming (Score:2)
It's always hard to read the tea leaves, but I predict a wave of new netbooks that will catch the market by surprise. I believe $350 netbooks running Bay Trail and Windows 8.1 will prove pretty popular. This will, of course, cannibalize $1000 ultrabook sales, so that isn't to say it will be a revenue success. But Bay Trail would definitely make Netbook 2.0 pretty compelling.
Numbers may be subject to change.. (Score:3)
* Numbers may be subject to change once verified with the actual parts.
http://images.anandtech.com/doci/6936/Screen%20Shot%202013-05-06%20at%2011.16.42%20AM.png [anandtech.com]
So this is marketing pulling figures out of somewhere and posting them as the Ultimate Truth, without actually having the hardware to test them with?
I can never remember (Score:1)