Info On Intel Bay Trail 22nm Atom Platform Shows Out-of-Order Design 107
MojoKid writes "New leaked info from Intel sheds light on how the company's 2014 platforms will challenge ARM products in the ultra light, low power market. At present, the company's efforts in the segment are anchored by Cedar Trail, the 32nm dual-core platform that launched a year ago. To date, all of Intel's platform updates for Atom have focused on lowering power consumption and ramping SoC integration rather than focusing on performance — but Bay Trail will change that. Bay Trail moves Atom to a quad-core, 22nm, out-of-order design. It significantly accelerates the CPU core with burst modes of up to 2.7GHz, and it'll be the first Atom to feature Intel's own graphics processor instead of a licensed core from Imagination Technologies."
Hackintosh (Score:0, Offtopic)
I'm looking forward to another Atom-based hackintosh. A low-power file server/media server with iTunes!!
Re:Hackintosh (Score:5, Funny)
I'm not sure I've ever seen iTunes earn that many exclamation points since the days when Apple announced that iPods on Windows would no longer depend on 'Musicmatch Jukebox' for file transfer...
Re:Hackintosh (Score:2)
best or best available? (Score:2)
'Nice' compared to Amarok, Banshee, etc? Or just nice in that there aren't better alternatives?
Last I checked, iTunes was a contender for the best media library available for Windows. Personally I have always found it to be rather lacking, with a small feature list and limited configuration options. I understand that might actually be a feature in itself for a majority of users. In any case, it's been some time since I've used it.
It is Always Reassuring When .... (Score:5, Funny)
Re:It is Always Reassuring When .... (Score:0)
Looking forward to seeing these new processors in the new Thinkpad W540. My Thinkpad W500 is doing great with a CPU that has 6MB of L2 and a 25-watt TDP, but I would not upgrade to the W510, W520, or W530 because those i7 CPUs ran at something like 45 watts (burn your system from the inside out).
Re:It is Always Reassuring When .... (Score:2)
Buying decisions should be made based on requirements. If what you buy meets your requirements (until the lifetime of the device is over and you want to upgrade), you should not regret your decision.
That applies to smartphones, DSLR and normal cameras, PCs, tablets, etc. These devices advance at a great pace, and you will always have regrets if you try to keep up with the market.
About bloody time... (Score:5, Interesting)
I, for one, will be overjoyed to see the last of Imagination's 'PowerVR' shit, especially on x86, and hope we'll never see the likes of the "GMA500" again.
On the other hand, this report has me wondering exactly what the Atom team is up to. Back when Intel started the whole 'Atom' business, the whole point of having a substantially different architecture, in-order, was to have something that could scale down to lower power in a way that their flagship designs couldn't. Since then, the ULV Core i3/i5/i7 chips have continued to improve on power consumption, and the Atoms have apparently been sprouting additional complexity and computational power. How much room do they have to do that before 'Atom' evolves itself right out of its power envelope, or Core ULV parts start hitting the same TDPs as higher-power Atoms, but with much more headroom?
Re:About bloody time... (Score:5, Informative)
How much room do they have to do that before 'Atom' evolves itself right out of its power envelope
That's why they reduce the gate size (22nm). You get a less power-demanding product, and at the same time you gain additional room for extra features.
or Core ULV parts start hitting the same TDPs as higher-power Atoms; but with much more headroom?
If you consider current Atoms and performance-per-watt, a latest-gen Core is probably more efficient than Atom. But on the other hand, they are way more complex processors, usually with bigger on-die cache, and way more expensive. There may be some overlap over "budget" processors (such as Celeron and the old Pentium D) on the new versions, but even then I don't think they will be direct competitors (as an example, how many easily upgradable Atom boards with ZIF-style socket have you seen?).
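The shrink argument above can be put in rough numbers with the classic CMOS dynamic-power relation P ≈ C·V²·f. The capacitance and voltage figures in this sketch are illustrative assumptions, not Intel data:

```python
# Back-of-envelope CMOS dynamic power: P ~ C * V^2 * f.
# All numbers below are illustrative assumptions, not Intel figures.

def dynamic_power(cap, volts, freq_hz):
    """Relative switching power of a CMOS circuit."""
    return cap * volts**2 * freq_hz

# A die shrink roughly lowers switched capacitance and lets the
# supply voltage drop a little; voltage counts quadratically.
p_32nm = dynamic_power(cap=1.00, volts=1.10, freq_hz=1.6e9)
p_22nm = dynamic_power(cap=0.70, volts=1.00, freq_hz=1.6e9)

print(p_22nm / p_32nm)  # ~0.58: same clock, ~40% less switching power
```

The leftover power budget is what lets a smaller process carry "extra features" at the same envelope, which is the trade-off the post describes.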
Re:About bloody time... (Score:0)
Modern Celerons and Pentium chips use the same basic architecture as the Core chips, just stripped of some of the more expensive features like large caches and high clock speeds, and with fewer cores. At equal clock speeds, a modern "Sandy Bridge"-based Pentium will perform nearly the same as an i3 2000 series.
Broadwell will not come in a package having pins (Score:2)
as an example, how many easily upgradable Atom boards with ZIF-style socket have you seen?
What makes you think Intel won't do away with upgradable socketed boards on desktops too [slashdot.org]?
Re:Broadwell will not come in a package having pin (Score:2)
Re:Broadwell will not come in a package having pin (Score:2)
Also, unless the signal integrity issues are truly brutal, it wouldn't be terribly difficult to produce a CPU that is designed to be 'zillion-little-BGA-balls-permanently-attached' for volume constrained embedded applications and also produce a little PCB card that has an array of BGA pads on top and an array of LGA lands on the bottom, allowing you to turn your BGA-only CPU into a socketed CPU at modest additional expense.
Given the uptick in tablets, ultrathin laptops, and 'every CPU manufactured in the past 5 years is faster than I need' cheapy desktops, I certainly wouldn't bet on CPU sockets getting any more common; but it seems unlikely that sockets would be killed entirely in the more expensive areas.
Re:About bloody time... (Score:3)
Remember that when the NetBurst (P4) architecture turned out not to have the legs they hoped for, and AMD was beating up on them, it was Intel's mobile architecture (the Pentium M, developed somewhat independently in Israel as a follow-on to the P3 rather than NetBurst) that became the basis for the Core architecture, which brought Intel back into the lead on desktops. Secondly, consider Itanium: what if they had completely committed to that and burned their bridges on x86? If I were in the corner office at Intel, I would let Atom and Core compete until and unless one has no advantages over the other.
Re:About bloody time... (Score:4, Insightful)
They will probably not need to compete dollar for dollar on price as long as they can deliver superior performance, but they will have to close the gap somewhat. ARM SoCs, etc. aren't going away any time soon, especially on the lower end (ain't gonna see any $79 Intel tablets), but I think Intel are finally getting their shiz together to challenge the likes of the Tegra line, at least.
If you want to see Intel push the envelope with Core or a successor, they might need some competition. There is no one to push them to innovate there, and no excitement (i.e. $$$ rolling in).
Re:About bloody time... (Score:5, Insightful)
AMD wasn't defeated, they committed suicide by laying off engineers to help the bottom line in the short term. Naturally they're finding that is deadly in the long term.
Re:About bloody time... (Score:1)
AMD wasn't defeated; they intentionally gave up the desktop. They're now focusing on the low end and the high end. AMD dominates in data centers, simply because they're so much cheaper. Rackspace, for example, is an AMD-only shop. AMD does well on the appliance side as well, although there are a ton of players there, including even MIPS-based products.
Basically, AMD decided it was too costly to beat Intel at their own game. So AMD crept back into the shadows. They'll live on as one of the myriad B2B companies. (Ok, technically they were always B2B, but I mean they're not going to compete in the public sphere anymore.)
Re:About bloody time... (Score:2)
Engineers are fungible. Lay them off when you do not need them and hire new ones when you do. What could possibly go wrong?
Signed,
MBA
Wildly Over Optimistic For Intel (Score:0)
The PC market looks like it is down 20 percent from the same quarter last year. And there is no sign that there is anything that is going to change the collapse of the desktop x86 PC market.
On the booming cellphone and tablet markets Intel is effectively a non-entity. Intel has nothing to offer other than hotter, hugely more power hungry, and ridiculously more expensive chips.
Intel's PR campaign that has been going on the past few weeks isn't impressing anyone but existing Intel fans. So far all Intel has been able to demonstrate is they are somewhat competitive versus year old previous gen fab ARM solutions with higher cost and higher power requirements when done by cherry picked friendly people in the computing press.
There is no sign that Intel is going to somehow miraculously transform Atom into a solution that matches the needs of cellphone and tablet manufacturers.
Re:Wildly Over Optimistic For Intel (Score:2)
In the last Intel Atom Slashvertisement it was the Atom that had the larger gate size, with the comparison being fairly equal but slightly in Atom's favour for performance and load power, and in ARM's favour for idle power.
Where Are The Customers? (Score:0)
If Intel's Atom was actually competitive with ARM in the real world Intel wouldn't be wasting time with this latest PR effort and instead would be putting out press releases with announcements of new customers dumping ARM for Atom.
They aren't.
Intel's absurd boasts about Atom are as believable as their old hilariously fake SPEC compiler scores.
Re:Wildly Over Optimistic For Intel (Score:0)
Really? So this [anandtech.com] comparison is somehow stacked? Because in it the latest Cortex-A15-based Exynos in the Nexus 10 draws more power than the Z-series Atom while being slower.
Re:Wildly Over Optimistic For Intel (Score:0)
Slower? Interesting, because I read that article and it said it was faster, albeit using more power (and it's a smartbook/large tablet optimised design, not a smartphone optimised design).
Nevermind the fact that the entire article is sponsored by Intel, they supplied everything, and the testing methodology.
Anandtech doesn't even bother to mask the fact that it's really Inteltech these days.
Re:Wildly Over Optimistic For Intel (Score:2)
They don't need to. Worst comes to worst, Intel begins making ARM SoCs and applies its superior process technology (Intel is almost always at least one node ahead of the curve). They don't even need a better-designed processor, just one good 'nuff to beat the competition with their lithography advantage.
Re:About bloody time... (Score:4, Interesting)
It's more like ARM could eat Intel's breakfast if it isn't careful. ARM processors are already good enough for 95% of what people do, even on the desktop. Just look at Chromebooks and the near console level gaming available on high end tablets.
ARM's biggest advantage is that there are so many people making them. Any shape or size you like, desktop-style CPU or fully integrated SoC, any price bracket. The fact that Chinese manufacturers like Allwinner make them is a big deal too, because just as the west doesn't seem to like Chinese parts, the Chinese prefer to avoid western manufacturers where possible (language and supply chains probably have a lot to do with it). On top of that, big companies like Samsung and Apple make their own CPUs anyway, and since they own the top end of the market it will be very hard for Intel to get in.
Nintendo 64 to Nintendo DS (Score:2)
ARM processors are already good enough for 95% of what people do, even on the desktop.
But everybody has a different 5 percent that isn't yet ported to ARM.
Just look at Chromebooks and the near console level gaming available on high end tablets.
Given that the Xbox 360 is seven years old, "near console level" is not saying much. Seven years is just one year less than the gap between the Nintendo 64 and Nintendo DS, which offered near Nintendo 64-level graphics.
Re:About bloody time... (Score:3)
Beats every competition in every metric, bar price? They're not quite like Microsoft.
Re:About bloody time... (Score:0)
Every metric, ridiculously skewed in their favour, you mean?
All test software is written specifically to perform better on Intel if the code can't be written agnostically. Sometimes it gives intel preferential treatment anyway even if it could be done in a neutral way.
All test software, practically, is compiled with the Intel compiler, which has already been caught at least once doing naughty stuff if the code was run on an AMD.
The people doing the tests seem to all pretty intently focus on measuring the parts where Intel CPUs are strong, and skipping the parts where they are weak, or at least glossing them over. And even IF they lose, most "testers" usually manage to spin it as a "win" somehow, because they all seem to be Intel fanboys -- if not outright "sponsored".
"Testers" lost their relevance and credibility a long time ago.
Re:About bloody time... (Score:0)
do you have any proof of any of this?
All test software is written specifically to perform better on Intel if the code can't be written agnostically.
what does this mean?
All test software, practically, is compiled with the Intel compiler,
the intel compiler doesn't even run on arm, so i have no idea how this could be true.
The people doing the tests seem to all pretty intently focus on measuring the
parts where Intel cpus are strong, and skipping the parts were they are weak
exactly what should have been tested and wasn't?
i'd have to say that your post isn't credible. you've made a bunch of claims, and have
nothing to back them up.
Re:About bloody time... (Score:0)
It means that there's more than one way to skin a cat, to use a disgusting proverb, and all CPUs are not made alike. "Tests" tend to do things "the Intel way".
If you look at the GP, you'll find he was using words like "Core", "Atom" and "Netburst (P4)", a rather powerful hint he was talking about x86/x64. ARM is a red herring in this context.
As for backing stuff up, if you're interested, just start reading these "tests" and look at how the results are presented. It's pretty obvious, if you look for it. As for the rest, Google lists http://www.agner.org/optimize/blog/read.php?i=49#49 [agner.org] pretty high.
Re:About bloody time... (Score:0)
Oh, not that anyone will care, but these "tests" tend to use a lot of games as "benchmarks". Guess whose cpu those are "optimized" for.
EOT.
Re:About bloody time... (Score:2)
Atom and ARM are now direct competitors in the mobile market.
Atom and Core seem to be converging. Atom is upping its performance, Core is lowering its power consumption.
Netburst is the odd one out.
Re:About bloody time... (Score:0)
sos ur mum
Re:About bloody time... (Score:3, Interesting)
Let's be clear about this - the Imagination GPUs are excellent, the problem is that Intel decided to write their own drivers, badly. Very badly. Okay, they outsourced it, but the end responsibility was theirs. Imagination's own drivers, which by all accounts are good, were not used.
So put the blame where it should be directed - Intel.
Re:About bloody time... (Score:3)
Same thing they've always been up to, competing with ARM.
At first they needed to be low power, when top-of-the-line ARM was 650MHz on a single core. Within 3 years, ARM got quad cores running at 1.5GHz and other enhancements.
What changed was the competition.
If you're looking for a cheap no-frills x86 SoC, try an AMD Geode.
Today's devices require more, however.
Re:About bloody time... (Score:3)
If you're looking for a cheap no-frills x86 SoC, try an AMD Geode.
No frills like acceptable performance. Anyway, AMD replaced Geode with their Fusion line.
Re:About bloody time... (Score:3)
I think the in-order architecture was just as much based on the other key feature of Atom that Intel didn't talk so much about to consumers - die size, and therefore cost, for Intel. If we compare the early 230 and 330 to contemporary 45nm processors, a single-core Atom was 25 mm^2, a dual core 2x25 mm^2, a Wolfdale dual core 107 mm^2 and a quad core 2x107 mm^2. On top of that comes better edge utilization of wafers and a lower defect rate, since each chip is smaller. In practice Intel could probably produce 5 single-core Atoms for the cost of one Wolfdale dual core, allowing Intel to sell a $29 CPU in a market where they'd otherwise charge $100+.
I think that even if Atom and Haswell start to overlap, they'll belong to two quite different markets for Intel: one is the low-performance, low-cost market and the other the high-performance, high-cost market, even if they're in the same power envelope. And if the Atoms are smaller than the Haswells, well, Intel can have high margins on both. Besides, I doubt Intel has forgotten that the Atoms are their SoC solution for smartphones and such; Anandtech did a pretty solid power analysis [anandtech.com] of the Clover Trail platform, and the Atom CPU peaked at <1W, the platform at <5W. Haswell has a long way to go to reach those levels, even if a turbocharged Atom and a ULV Haswell could intersect at 10W.
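A rough sketch of the dies-per-wafer arithmetic behind the "5 Atoms per Wolfdale" estimate, using a simple area model with an edge-loss correction (yield is ignored, so the real ratio would favour the small die even more):

```python
import math

# Dies per 300 mm wafer, using the die sizes quoted above:
# 25 mm^2 single-core Atom vs 107 mm^2 Wolfdale dual core.
# Classic approximation: gross area divided by die area,
# minus a correction for partial dies lost at the wafer edge.

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

atom = dies_per_wafer(25)       # ~2700 candidate dies
wolfdale = dies_per_wafer(107)  # ~600 candidate dies
print(atom, wolfdale, atom / wolfdale)
```

The ratio comes out around 4.5 before yield; since smaller dies also suffer fewer defects per die, the post's "roughly 5x" figure is plausible.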
Re:About bloody time... (Score:0)
I figure that's the point.. In the end, one approach or the other will come out on top, so why not attack the problem from both ends?
They have the resources to bet on two horses, and win either way.
Re:About bloody time... (Score:2)
I, for one, will be overjoyed to see the last of Imagination's 'PowerVR' shit, especially on x86, and hope we'll never see the likes of the "GMA500" again.
Yeah, thank goodness. Folks, this is how bad it is: the FreeBSD folks had to patch the VGA text console code to be compatible with the current Intel Atom boards, and the first full release of that code was just a week ago. From the beginning to the present day, nobody had ever managed to implement VGA in a way that would fail when data was written a byte at a time to the VGA buffer, but PowerVR did, on a 2012 board release. To be completely fair, the original 1980s VGA spec does call for half-word writes to memory, but as far as testing or design goes, they broke 80-column text mode on an embedded platform, on one of the most popular embedded OSes. Intel didn't catch this, or didn't care, either. At least Intel now seems to be taking responsibility for the mistake, but boy did it cause consternation in the field. It's not that text mode is the flagship feature of any device people are making today, but it's something you really expect not to have to fight.
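For anyone curious what "writing a byte at a time" means here, a minimal sketch with a plain bytearray standing in for the memory-mapped text buffer; the actual bug is in hardware, so this only illustrates the two write patterns a console driver can use:

```python
import struct

# Simulated 80x25 VGA text buffer: each cell is 2 bytes
# (low byte = ASCII character, high byte = colour attribute).
vga = bytearray(80 * 25 * 2)

def put_cell_word(buf, cell, ch, attr):
    # One 16-bit store per cell -- the access pattern the buggy
    # framebuffer reportedly required.
    struct.pack_into('<H', buf, cell * 2, (attr << 8) | ord(ch))

def put_cell_bytes(buf, cell, ch, attr):
    # Two 8-bit stores per cell -- fine on every other VGA
    # implementation, but the writes these boards dropped.
    buf[cell * 2] = ord(ch)
    buf[cell * 2 + 1] = attr

put_cell_word(vga, 0, 'A', 0x07)   # grey-on-black 'A' at row 0, col 0
put_cell_bytes(vga, 1, 'B', 0x07)  # identical result on working hardware
```

On working hardware both helpers leave the same bytes in the buffer; the FreeBSD fix amounts to making the console use the first pattern.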
What's with the memory controller? (Score:1)
I know that it's DDR3 SODIMM, but is there any particular reason they're limiting it to DDR3-1333?
Would there be a performance gain if it could utilize DDR3-1600, the way the AMD Fusion processors show decent performance gains with higher-speed memory? I'm pretty sure DDR3-1600 SODIMMs are out there.
Re:What's with the memory controller? (Score:1)
It doesn't make sense for a processor two years out to have memory speeds below the current generation of products. It would be okay if the product were available in Q1 2013, but you've got to be designing for the next generation of DDR3 speeds already.
I can see that it is DDR3L and not DDR3. That is probably geared toward power saving, but there is no reason not to go for a faster memory clock, as you've got to keep four cores fed.
Re:What's with the memory controller? (Score:2)
Because they don't want memory performance to be comparable with the higher margin Core line. Atom is cheap - and they want to ensure that it's not cheap to the point where it eats into their bread and butter.
And anyhow, DDR3 will be old hat by the time this processor comes out - DDR4 is just coming out.
Re:What's with the memory controller? (Score:2)
Because higher speeds create more heat. More power to feed the RAM and more power to run the bus.
win8/linux tablets? (Score:4, Informative)
Runs all your x86 binaries.
By MS's own definition, UEFI will support other OS options (not guaranteed under ARM).
Has mature, supported FOSS GPU drivers, unlike every Android-only ARM SoC.
THE platform for that budget linux tablet that dual boots into MS Office?
Re:win8/linux tablets? (Score:0)
Why would you want to use apps that were written for x86? For a good experience you need to rewrite code so your apps are optimized for touch, not mouse. Then you need to optimize for performance, RAM usage, and power, since mobiles have much lower specs and will continue to be less powerful for the near future.
Of course, if you like the hybrid laptop/tablet concept MS is pushing I guess x86 is for you.
I'm not looking forward to half-assed x86 apps for tablets.
Re:win8/linux tablets? (Score:2)
For a good experience you need to rewrite code so your apps are optimized for touch not mouse.
My apps are optimized for keyboard, thank you very much. Mice and touch are just extensions to the keyboard interface.
Discontinuation of small laptops (Score:2)
My apps are optimized for keyboard, thank you very much. Mice and touch are just extensions to the keyboard interface.
Unfortunately, all the new 10" computers are tablets, not laptops, and these aren't guaranteed to come with a keyboard. How do you plan to adapt your applications to the discontinuation of small laptops [slashdot.org]?
Re:win8/linux tablets? (Score:2)
Yes, I'm fine with a tablet 'cover' that doubles as a keyboard and trackpad solution.
Hybrids are the technology for 2013. Witness this week's CES for examples. Some rotate the screen like the Lenovo Twist, some flip like the Dell XPS Convertible, some detach like the HP Envy. Then there's the Surface Pro, as you mentioned. These things are in Ultrabook territory, but prices will come down as the novelty of a touchscreen laptop becomes the norm.
But by x86 binaries I mean legacy win32 stuff that won't run on ARM linux. e.g. my government's tax software. Can an iPad run that?
By 'half-assed x86 apps for tablets' I'm sure you're aware that Android-x86 runs on Intel, as does KDE Plasma Active and probably without too much tweaking webOS and Firefox OS.
If, as MS promises, the firmware for x86 devices isn't locked down for Windows-only, you can triple boot to your heart's content. On an ARM based system, you're at the mercy of a manufacturer that ships Android-only kernel blobs.
A note to the submitter and editors (Score:1, Offtopic)
Re:A note to the submitter and editors (Score:0)
Most useless and vapid comment ever.
Re:A note to the submitter and editors (Score:0)
You must be REALLY new here.
Re:A note to the submitter and editors (Score:0)
PowerShell user detected.
Not using imagination tech is a good news (Score:1)
Re:Not using imagination tech is a good news (Score:3, Insightful)
The Imagination Technologies drivers aren't open source, which was a big issue. Moving to an Intel video core means the driver will be released as free software (unless Intel changes its policy, which is very unlikely). That's very good news for the open source platforms!
Call me cynical about this. Intel's Clover Trail is not Windows 7 compatible, and the previous generation is not Windows 8 compatible [neowin.net]. If Intel is blowing off Windows 7 without working drivers for its newest chipsets, what makes you think they will support Linux either?
They want you to blow extra $$$ on a Core i5 that you do not need, and are trying to aim this at tablets and phones only to stop ARM.
Re:Not using imagination tech is a good news (Score:0)
I'm thinking they almost have to produce drivers for Linux, or at least Android. Who else is going to buy these otherwise? It is possible they could backpedal on the free software front and not release the code, but I'm not so sure that is going to happen. Some of the statements made have been framed as Intel not supporting Linux; what that means exactly is unclear. HP's printers are the best supported under Linux, and HP states something similar. What they mean is that they don't support end-users. This is an odd usage of words for Intel, given that the end-user to them would be Dell, HP, etc., but it could merely mean they won't produce drivers even though they will release specs. We could still see 100% support despite not being "supported".
Re:Not using imagination tech is a good news (Score:2)
If intel is blowing off Windows 7 without working drivers for their newest chipsets what makes you think they will support Linux either?
They want you to blow extra $$$ for an icore5 that you do not need, and are trying to make this for tablets and phones only to stop ARM.
I'm convinced that Intel will not support older kernel versions. They never do anyway, they always target current, or next, so it can get upstreamed. Probably, they did the same with windows, I don't know (and frankly, I don't care). But what we're talking about here is support for the video driver, which is already in kernel.org (unless they integrate something completely new, but that's also very unlikely).
Re:Not using imagination tech is a good news (Score:0)
>what makes you think they will support Linux
The fact that before running Windows or anything else, every CPU design at Intel first runs an in-house Linux variant on which various verification tools are run. So Linux has to run.
Re:Not using imagination tech is a good news (Score:2)
These processors are intended for tablet devices with touchscreens; Windows 7 is not really suited to such devices, and tablets running Windows 7 or earlier have always sold very poorly in the past.
Linux on the other hand, has sold well on tablets in the form of android so it makes more sense to support.
Also if Intel release enough of the hardware specs, they don't need to explicitly support linux, someone else will do it if they don't. Windows users typically don't write their own drivers while linux users do.
It may be more than drivers, too: Windows 7 and earlier expect an IBM PC compatible system, so if the new processors eliminate some of the backwards-compatibility cruft in order to save power, then the platform will be different enough that Windows cannot boot, and the only organisation capable of making the changes necessary for it to work wants to sell the latest version and has no incentive to do so for older versions.
Linux on the other hand could have any necessary changes made by all manner of people.
I believe this has already been the case with at least one model of atom processor, which despite being x86 was unable to boot windows but did have a modified linux kernel working on it.
Re:Not using imagination tech is a good news (Score:3)
Re:Not using imagination tech is a good news (Score:2)
They'll also have ULX Haswells that go down to 10W and support 4K, I think both price and performance-wise they'd be a better match. Besides, going from 132 ppi on the iPad 2 to 264 ppi on the iPad 3 was huge and the Nexus 10 tops that with 299 ppi, but I don't see that race going much further since they are hitting the limits on human vision. I just hope we'll see reasonably priced 4K desktop monitors soon, they're also good for huge TVs but really serve no point on a 50" TV.
Baytrail is the follow on to Clovertrail... (Score:0)
Interesting that the submission incorrectly asserts that Bay Trail follows Cedar Trail. Clover Trail is the more direct preceding product in the roadmaps, and it was released about 3 months ago.
Competing against LAST YEAR'S ARM (Score:0)
The trouble with Intel is that it's competing against LAST YEAR'S ARM quad-core chip; by the time they get this one out, the 8- and 16-core ARM chips will be out.
Not only that, their main problem isn't processing power, it's the power draw! They keep talking about idle at less than 10 watts, in a market where idle is less than 100mW. Defining idle as 'what Windows 8 does at idle' doesn't work in a market dominated by Android, which idles a lot deeper.
So we seem to have these endless puff pieces from them, promises of how amazing the next generation will be. It doesn't look like an attack on ARM (which requires weapons, AKA competitive chips); rather, it looks like defense, to keep the server market from switching over any more than it already has.
It's more about keeping companies locked into Intel from switching, on the promise of a fix real-soon-now.
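To see why the idle figures argued about in this subthread matter so much more than TDP on phones, compare run-times against a typical battery. The numbers below are round illustrative figures, not measurements of any specific device:

```python
# Why idle draw dominates mobile battery life: a phone battery
# holds only a few watt-hours, so hours at idle = Wh / W.

battery_wh = 6.0  # illustrative ~2012 smartphone battery capacity

def idle_hours(idle_watts):
    return battery_wh / idle_watts

print(idle_hours(10))    # 0.6 h  -- a hypothetical "10 W idle" part
print(idle_hours(0.1))   # ~60 h  -- a 100 mW idle SoC
```

A two-order-of-magnitude gap in idle draw is the difference between a phone that dies over lunch and one that lasts a weekend in standby.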
Re:Competing against LAST YEARS Arm (Score:0)
They keep talking about idle less than 10 watts,
I'm pretty sure you'll find that's under 10 watts when active, for any low-power offering.
Anandtech did a surprisingly decent comparison [anandtech.com] of the power usage of Clover Trail (Atom Z2760) vs. Cortex-A15 (Exynos 5250).
There are nits to be picked, but they do a good job of showing the power usage, including the energy benefits of race-to-idle, and the low idle draw of all tested platforms.
These are both recent releases; I think comparing them is entirely fair, and they do appear close enough to trade blows.
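The race-to-idle point can be illustrated with a toy energy model: a part that burns more power while active can still use less energy per task if it finishes sooner and falls back to a low idle state. All figures below are made up for illustration:

```python
# Race-to-idle: energy over a fixed window containing one task is
# active_power * active_time + idle_power * remaining_time.

def task_energy(active_w, active_s, idle_w, window_s):
    """Joules consumed over a fixed time window containing one task."""
    return active_w * active_s + idle_w * (window_s - active_s)

# Hypothetical chips: "fast" draws more while busy but finishes in 1 s;
# "slow" draws less but stays busy for 3 s. Same 50 mW idle floor.
fast = task_energy(active_w=2.0, active_s=1.0, idle_w=0.05, window_s=10)
slow = task_energy(active_w=1.2, active_s=3.0, idle_w=0.05, window_s=10)

print(fast, slow)  # ~2.45 J vs ~3.95 J: the faster chip wins on energy
```

This is the "energy benefits of race-to-idle" that the Anandtech piece measures: peak power alone does not decide battery life.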
Medfield is 2.6 Watt at idle, Tegra 3 100mW (Score:0)
http://www.extremetech.com/computing/110563-intel-medfield-32nm-atom-soc-power-consumption-specs-and-benchmarks-leak
"Anandtech did a surprisingly decent comparison [anandtech.com] of the power usage of Clover Trail (Atom Z2760) vs.Cortex-A15 (Exynos 5250)."
The actual 'favorable' test claimed was the Tegra 3 running Windows RT vs a Medfield running Windows 8. With Wifi turned off and the machine left to do nothing, nothing at all, not stream a movie, not run a GPS app, nothing.
This is probably the only case where Medfield can shut down all the silicon blocks needed to compete with Tegra's low power single core.
But you may as well have compared the 'off' state of both devices and declared it even. If they'd streamed a movie on Android on a Tegra 3 vs. Windows RT on a Medfield, the true nature of this would be revealed:
The Tegra 3 can run on its single low-power core quite happily and stream video over Wifi. The Medfield has no such core; its silicon blocks are on or off, so as soon as you do anything it's sucking down the juice. This is why the RAZR has terrible battery life. Google [battery life android razr] to see the complaints!
Re:Medfield is 2.6 Watt at idle, Tegra 3 100mW (Score:0)
I imported a Motorola RAZR i which is a Medfield based Android phone.
With wifi active and 3G data disabled, it has incredible battery life, often coming home at the end of the day with 85% remaining, since I'm actually working during the day and not playing games. I accidentally left a GPS tracking app active all night w/o a charger recently, and the battery ran from around 75% when I went to bed to 35% when I woke in the morning. According to the system battery monitoring graphs, it is the display that uses the most power, not other software nor radios.
ARM based Android phones I've tested previously never had anywhere near this stamina with the same use. This includes several Samsung Galaxy S and Nexus family models. At the same time, the RAZR i feels very responsive to me with Android 4.0.4, better than most previous phones except perhaps a Galaxy Nexus which had Android 4.1 (which made it far more responsive than its original Android 4.0).
Re:Medfield is 2.6 Watt at idle, Tegra 3 100mW (Score:2)
Wow, so much misinformation across the board. Both the ARM based Razr M and the Intel based Razr I have excellent battery life. They are both excellent phones. I have the M myself, but comparing against friends with the I... No real difference.
-Matt
Re:Competing against LAST YEARS Arm (Score:0)
"They keep talking about idle less than 10 watts"
No they do not. TDP is the maximum power draw from the socket. Intel has demonstrated 20mW idle at 32nm with Medfield, and expects to drop this by a factor of four at 22nm.
Same old, same old... (Score:5, Insightful)
New leaked info from Intel sheds light on how the company's 2014 platforms will challenge ARM products in the ultra light, low power market.
Intel is using the tactic Microsoft perfected: compare your product plans from two or so years in the future with your competitor's current products, then say how much better your envisioned products are.
Intel is behind the eight ball in the low-power market space, and this is nothing short of a move of desperation on Intel's part.
Re:Same old, same old... (Score:2)
Uh... so you're saying ARM is copying Intel's strategy with the never-ending harping about how great the A57 cores will be? I seem to recall sitting through three years of ARM hype about how the Cortex-A15 was going to permanently destroy Intel, and here we are with real systems running real tests that show it isn't even insanely better than 32nm Atom parts. How come ARM hasn't completely taken over yet? I've been promised miracles!
Intel is dead in the water (Score:4, Interesting)
Dual ARM A15 chips destroy any current dual Atom from Intel. The coming quad A15 parts will destroy any Intel ULV i3 part (Intel's crown jewel CPU) that competes in the same space.
However, the A15 design is now years old. ARM is replacing it with a fully 64-bit part that uses only 60% of the die area on the same process. This means the ARM part that replaces the A15 in early 2014 delivers either more performance or less energy use: a total nightmare for Intel.
Meanwhile, it is impossible for Intel to 'repeal' the Intel Tax. Intel is addicted to massive profits per chip, and cannot function on the margins made by those that manufacture the ARM SoC parts. Example: Intel is boasting support for 4K video on its next generation CPUs, but 4K support already exists on one of the cheapest ARM chips you can find in a tablet, the Allwinner A10.
When Atom goes 'out of order', it ceases to be an Atom and is, instead, a renamed version of Intel's current 'Core' architecture. Intel going quad-core with the Atom makes zero sense when the targeted low-power devices try to keep all but one core idle for power-saving reasons. The earlier-mentioned ULV dual-core i3 part, as used in the latest Chromebook, already thrashes Intel's own future Atom.
It gets worse. AMD and ARM are fully unifying the external memory space shared by the GPU cores and the CPU cores. Intel is going in the opposite direction, attempting to build on-die RAM blocks for the exclusive use of the GPU on versions of its chips aimed at high-end notebooks. This project is dying on its feet, as notebook manufacturers cannot believe the money Intel wants for this version of Haswell. They know that if their notebook customers pay a lot for the product, they demand decent graphics from Nvidia or AMD, not half-working, slow graphics rubbish from Intel.
It gets worse. Apple is on the verge of dumping Intel completely for its own ARM SoC designs. The high-end Apple desktop systems that would struggle with current ARM chips hardly make money for Apple anyway, compared with the phones, tablets, and MacBook Airs.
It gets worse. Weak demand in the traditional PC marketplace means Intel has growing spare capacity at its insanely expensive fabs. It has tried to find customers for this spare capacity, but Intel fabs are massively customised for Intel's own CPUs and lack the technical support for other kinds of chips. Intel uses its outdated equipment to make other kinds of parts (like the dreadful Atoms, or the dreadful motherboard chipsets), but potential customers hardly want to make their new chips on these very old lines.
It gets worse. GlobalFoundries (AMD's chip production facility that pretends to be independent) is making incredible strides in attracting business from many companies designing the most cutting-edge ARM parts. Samsung's chip business is going from strength to strength. Apple is making massive investments at TSMC. The Chinese fabs are coming along in leaps and bounds.
It gets worse. The GPU is becoming by far the most important part of the modern SoC (system on a chip). Intel's GPU design is a distant fifth to the SoC GPUs from AMD, Nvidia, PowerVR, and ARM itself. Of the five, only Intel's GPU still doesn't work properly, and it is NOT compatible with modern graphics APIs. Intel has to hack its drivers to get even a handful of the most popular games running with minimal glitches. Intel GPU = massive compatibility issues.
Where is the Z80 today? The same question will be asked of x86/x64 tomorrow.
Re:Intel is dead in the water (Score:3)
Re:Intel is dead in the water (Score:1)
He asked where it was, he didn't say it was gone. Running a fricking graphing calculator is a far cry from ruling the PC. You fail, fanboy.
Re:Intel is dead in the water (Score:0)
If I had an account and mod points, I'd mod you down. As for the fanboyism, the pot is calling the kettle black! The GP is correct, and you are trying to put words in his mouth by jabbering about desktops.
Re:Intel is dead in the water (Score:0)
I bet I know why you don't have an account: You're too bleeding stupid to get one, you mouthbreather. Fanboy, me? From that post? Pointing out glaring errors? Thanks for the laugh.
Finally, your reading comprehension really sucks. This is what the parent said:
All in the context of the growing importance of the SoC, where Intel does not rule, as opposed to the desktop, where they do.
He finally asks the question:
which is clearly another reference to the desktop, since that's where the Z80 used to be king, albeit obviously before you were born, going by your arrogant, ignorant, shitty-brat attitude.
Re:Intel is dead in the water (Score:5, Interesting)
Sorry, wrong. Google the AnandTech benchmark of the current Medfield in the RAZR i vs. Krait. It's competitive NOW, and that's without a process advantage.
Re:Intel is dead in the water (Score:0)
Yeah, but who cares if it costs more? Intel likes its profits, and an extra $10 over ARM isn't enough to keep its business alive. Even if Atom cost $50 more for better performance, it still wouldn't be enough to sustain Intel's process lead. Not to mention it would make the CPU the most expensive component of the device. Battery life would be no different, since the display and wireless take more power. Samsung and Apple will be the last manufacturers to switch to x86; they'll use their own in-house chips out of pride. Also, the price of mobiles will drop, so there will be even less incentive to use expensive Atoms.
The war just started but Intel lost years ago.
Re:Intel is dead in the water (Score:0)
And yet, Razr already contains an intel SoC. You can't explain that.
Re:Intel is dead in the water (Score:3, Interesting)
I don't know why this is being modded down but AC is right on the money with the 'Intel tax'. Intel are addicted to 60%+ average margins on their CPUs and it's going to be hell for them to give them up.
People can tout supposed superior performance figures for Intel's offerings, but it simply doesn't matter. Even if their parts offer 30% better performance, unless they can get them down to no more than $20 per part, the tablet and mobile manufacturers will simply not be interested.
Another issue is Intel's lack of flexibility. ARM is the 'Have It Your Way' CPU designer. You can license entire SOC designs, or you can license the ISA or just pick and choose what you want to incorporate into your own SOC. With Intel it's all or nothing.
Re:Intel is dead in the water (Score:1, Insightful)
Also, the CPU is growing less important. A lot of people like to play games on their tablets, so GPU performance matters more, as does hardware-accelerated video decode. Neither requires a fast CPU; neither does a lighter tablet or a better display. These days battery life depends more on wireless performance and the kind of display you have. And as tablets slowly replace desktops as the main computing device, people will demand more storage and RAM. So people are demanding better GPUs, displays, wireless, RAM, NAND, weight, battery life, and price over CPU performance. I forgot to mention screen size.
I have an iPad. Personally, I'd like to see larger screens, more storage, lower prices, and GPS in the Wi-Fi model. I also wish Apple wouldn't block competing browsers from using their own JavaScript engines over BS security reasons. CPU performance? It's better than my old laptop's.
Look at how well the iPad mini sold. All because it's so light and the price is low. Nobody gave a shit that it had a slower CPU than the iPad.
Intel is fucked.
Re:Intel is dead in the water (Score:0)
And this is exactly what ARM designs are good for - not allowing Intel to rest. They will be what AMD was (and hopefully will be) for desktop processors - much needed competition. But actually beating Intel in any technical metric - unlikely.
Re:Intel is dead in the water (Score:0)
We could listen to you, or we could listen to experienced people who have actually designed x86 and other processors. Those who have admit that x86 has overheads, but they are so small they can mostly be ignored. Some of those overheads have in practice increased performance, either by forcing the engineers to find a better solution or by making programmers' lives easier, like the efficient load/store mechanisms or the microcode that enables fast REP MOVSQ (block move) execution.
At worst, x86 has ~10% overhead (which doesn't mean -10% performance) with all other things equal, but all other things aren't equal! x86 has the best processor engineers and the best process to build processors with. Wake me up when ARM has fabrication at the same level as Intel, and I'll admit ARM has an advantage.
There are many ways to create an out-of-order processor, and claiming that an OoO Atom is automatically the same design as a Core i-whatever just shows you're completely clueless in this area and there's no reason to listen to you. Speculative execution can be done a number of ways too.
The Core series since Sandy Bridge (Core i3/5/7 2xxx) handles the major OoO structures differently from the Pentium Pro line: a physical register file with no data stored in the reorder buffer (ROB), versus storing data in the ROB. The AMD Athlon used yet another method, two register files to store data (the physical file and the future file) combined with distributed reservation stations. The PowerPC G4 used distributed reservation stations with a different mechanism.
Shall I continue with other vital parts of an OoO execution engine? Instruction schedulers alone can use a multitude of solutions: CAMs (content-addressable memory), dependency-lookup designs, bit matrices...
So an OoO design can be done in a lot of different ways and be tweaked for different goals. A low-power design would use fewer power-hungry resources, so CAMs would either be reduced in size or removed altogether. But sure, if you want to, you can keep the idea that the PPC G4 is essentially the same as a Pentium 4...
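To make the point concrete, here's a toy sketch of why issue order alone changes performance. This is plain Python and resembles no real microarchitecture (no rename, no ROB, single-issue, made-up latencies); it only shows the core scheduling idea: an instruction may issue once its source values are ready.

```python
# Toy in-order vs. out-of-order issue model.
# Each instruction is (dest_reg, source_regs, latency_in_cycles).
# Single issue per cycle; registers not written by the program
# (like r0) are treated as ready at cycle 0.

def completion_time(instrs, out_of_order):
    """Return the cycle at which the last result becomes available."""
    ready = {}              # register -> cycle its value is available
    pending = list(instrs)  # program order is preserved in this list
    cycle = 0
    while pending:
        for i, (dest, srcs, lat) in enumerate(pending):
            if all(ready.get(s, 0) <= cycle for s in srcs):
                ready[dest] = cycle + lat   # issue this instruction
                pending.pop(i)
                break
            if not out_of_order:
                break       # in-order: stall on the oldest instruction
        cycle += 1
    return max(ready.values())

prog = [
    ("r1", ["r0"], 3),   # long-latency load
    ("r2", ["r1"], 1),   # depends on the load
    ("r3", ["r0"], 1),   # independent work
    ("r4", ["r0"], 1),   # independent work
]

print(completion_time(prog, out_of_order=True))   # OoO hides the load latency
print(completion_time(prog, out_of_order=False))  # in-order stalls behind it
```

Under this model the out-of-order machine finishes in 4 cycles and the in-order one in 6, because the OoO version issues the independent work while waiting for the load. Everything the comment above lists (ROB vs. physical register file, reservation stations, CAM vs. matrix schedulers) is about *how* real hardware tracks that "sources ready" condition cheaply.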
GlobalFoundries is a wholly separate entity, which you'd know if you followed tech news in any way. E.g., AMD recently paid GF a hefty penalty for not needing previously reserved production capacity.
Zilog still keeps trucking BTW, the Z80 and the more modern eZ80 is used in a lot of designs.
Re:Intel is dead in the water (Score:0)
You are overlooking the server market. SPARC got smashed pretty bad by AMD, and now Intel is easily smashing AMD.
Data center operations are focusing on huge iron for databases and huge iron to host VMs. Xeon fits the bill perfectly, and the newer Xeons are just getting started. A general workhorse server will have 8 Xeon CPUs, and machines with 256 CPUs are becoming more common. Power and network bandwidth issues have been neatly addressed in the data center; now the drive is toward more processing power, and Intel is clearly the future.
Re:Intel is dead in the water (Score:0)
"When Atom goes 'out of order',"
Almost every CPU made today executes out of order internally; it is transparent to the layers above. The Intel Atom is in-order only because that saves power, at a great cost to performance.
The resurrection of Netbooks (Score:2)
My main issue with netbooks was the horrible resolution and the sluggishness.
If, by the end of 2013, they can slim down a Bay Trail-based netbook to 3/4", banish the absolutely awful 1024x600 resolution for 1366x768 or even 1600x900, rev to Windows 8.5, and keep it at $350, I will buy 3 for the price of a Macbook Air.
Re:The resurrection of Netbooks (Score:2)
Why wait? My latest netbook (or they call it a netbook anyway) has an 11.6" display at 1366x768, is pretty close to 3/4" thick, has an AMD processor and Windows 8, and cost $200. Though maybe that was a special for the holidays ...
Re:The resurrection of Netbooks (Score:0)
AMD E-series processors are horrible. The CPU part is slower than a same-clocked T-series Core 2 Duo from 5 years ago. I can't even understand Intel with this Atom crap. I rarely play games (Defcon and Darwinia), but my AutoCAD and Netflix need CPU, not a measly GPU. However, Windows 8 does work wonderfully on Atoms where Windows 7 struggles. Anyway, I prefer to get a second-hand 5-year-old laptop that cost upwards of 2k back then, instead of a budget $200 piece of crap from today, and I'll get that high-quality machine for much the same price.
Re:The resurrection of Netbooks (Score:2)
On a side note: I seriously hope resolutions will get better. 1366x768 is at the very low end of my tolerance.
challenge arm? (Score:0)
We are still talking about 5 to 10 watts...
Is it really a challenge to ARM?
This is Kodak (Score:0)
Intel is like Kodak: stubbornly continuing a business based on old tech no matter what happens in the real world. But there are still lots of fans of old PC stuff, so Intel doesn't die. Yet.
The first Atom with an Intel GPU? (Score:2)
the first Atom to feature Intel's own graphics processor
I have an Atom D510 with integrated Intel graphics [wikipedia.org].
GMA 3150 GPU and memory controller are integrated into the processor.
Does that count? I bought the motherboard in 2010.
Secret Intel trick (Score:0)
I propose that Intel license ARM technology and then blow everyone off the map with 22 nm ARM chips.
Re:Not Windows 7 compatible (Score:5, Interesting)
The link is here. [neowin.net] Basically, some Atoms cannot run Windows 8, and Clover Trail is specifically designed not to be Windows 7 compatible.
Unless the information has changed, the suspicious part of me feels Intel is threatened by the low margins and is trying to make sure this stays only in tablets, not in servers or desktops, which is a shame. Backporting WDDM 1.2 to WDDM 1.1 so the graphics work with the Windows 7 kernel would take only a tiny portion of R&D.
Re:Not Windows 7 compatible (Score:2)
Re:Not Windows 7 compatible (Score:2)
Is this a joke, FUD, or idiocy?
A little of each, methinks.
I perused the NeoWin "article" linked elsewhere, and followed a few other links. It appears the problem is not (specifically) with the chip; rather it appears the driver for the integrated graphics is crap and doesn't like running Metro apps, at least that's what folks on the Intel support forum are complaining about.
Re:Not Windows 7 compatible (Score:2)
With netbooks declared dead during the week, so dies Windows 7 Starter with them.
The market here is the $400 Windows 8 Tablet, allowing Intel to compete with Win RT but allowing OEMs to produce high-end Core i7 convertibles at triple the price.
Re:Not Windows 7 compatible (Score:1)
With netbooks declared dead during the week, so dies Windows 7 Starter with them.
The market here is the $400 Windows 8 Tablet, allowing Intel to compete with Win RT but allowing OEMs to produce high-end Core i7 convertibles at triple the price.
Some people want Windows 7 on tablets and tiny laptops too. The market is not dead; it's just no longer a cost saver when you can get a bigger screen and keyboard as well. Walmart has lots of larger notebooks with such crappy processors in them.
I can also see a business case in a few years where the energy savings from these could be miraculous for desktops, especially at a really big company with 40,000+ computers! That could equal millions in energy costs.
But no corp will touch Windows 8, nor will most users. If all you need is to run some crappy IE8 intranet app, type a document in Word, and read email in Outlook, you do not need a Core i7. This is perfect for most users.
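For scale, here's a back-of-the-envelope version of that fleet energy estimate. Every figure below is an assumption I've picked for illustration, not a vendor number or a measurement:

```python
# Rough fleet energy-savings estimate. All inputs are assumed figures.

fleet = 40_000          # desktops in the company
watts_saved = 40        # assumed average saving per machine vs. a desktop CPU
hours_per_year = 8760   # running 24/7 -- a generous upper bound
price_per_kwh = 0.12    # assumed USD commercial electricity rate

kwh_saved = fleet * watts_saved / 1000 * hours_per_year
savings = kwh_saved * price_per_kwh
print(f"${savings:,.0f} per year")
```

Even with machines left on around the clock, these assumptions land in the low single-digit millions of dollars per year for a 40,000-seat fleet, so the savings are real money but closer to millions than tens of millions.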
Re:Not Windows 7 compatible (Score:2)
Some people want Windows 7 on tablets and tiny laptops too.
But not enough people want to run PC applications on tiny, affordable laptops to make them profitable to manufacture.
Just no more a cost saver if you can get a bigger screen and keyboard as well.
A bigger screen and keyboard would actually be more of a cost to me, as I'd have to carry them in a larger bag that screams "mug me". My current 10" netbook fits in a modest messenger bag, and I'll be disappointed when it finally dies on me.
Re:Not Windows 7 compatible (Score:0)
There is nothing architectural that limits Atom to Windows 8, and since the same basic core and chipset designs persist until Bay Trail, this is purely a certification and business decision, not a technology decision.
In Bay Trail's timeframe, it is unlikely that MS will be granting any driver certifications for Windows 7, and all OxMs will have been pushed to move to Windows 8.
Re:First (Score:5, Funny)