Intel Hardware

Info On Intel Bay Trail 22nm Atom Platform Shows Out-of-Order Design

MojoKid writes "New leaked info from Intel sheds light on how the company's 2014 platforms will challenge ARM products in the ultra light, low power market. At present, the company's efforts in the segment are anchored by Cedar Trail, the 32nm dual-core platform that launched a year ago. To date, all of Intel's platform updates for Atom have focused on lowering power consumption and ramping SoC integration rather than focusing on performance — but Bay Trail will change that. Bay Trail moves Atom to a quad-core, 22nm, out-of-order design. It significantly accelerates the CPU core with burst modes of up to 2.7GHz, and it'll be the first Atom to feature Intel's own graphics processor instead of a licensed core from Imagination Technologies."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Hackintosh (Score:0, Offtopic)

    by Anonymous Coward on Saturday January 05, 2013 @11:22PM (#42492745)

    I'm looking forward to another Atom-based hackintosh. Low-power file server/media server with iTunes!

  • by pollarda ( 632730 ) on Saturday January 05, 2013 @11:24PM (#42492753)
    It is Always Reassuring When .... You spend a bunch of money on a new processor and they tell you it is already "Out of Order" from the get-go.
    • by Anonymous Coward on Saturday January 05, 2013 @11:35PM (#42492815)

      Looking forward to seeing these new processors in the new ThinkPad W540. My ThinkPad W500 is doing great with a CPU that has a 6MB L2 cache and a 25-watt rating, but I would not upgrade to the W510, W520, or W530 because the i7's wattage was like 45 watts (burn your system from the inside out).

    • by wmac1 ( 2478314 ) on Sunday January 06, 2013 @01:00PM (#42496775)

      Buying decisions should be made based on requirements. If what you buy meets your requirements (until the lifetime of the device is over and you want to upgrade), you should not regret your decision.

      That applies to smartphones, DSLR and regular cameras, PCs, tablets, etc. These devices advance at a great pace, and you will always have regrets if you try to keep pace with the market.

  • About bloody time... (Score:5, Interesting)

    by fuzzyfuzzyfungus ( 1223518 ) on Saturday January 05, 2013 @11:32PM (#42492801) Journal

    I, for one, will be overjoyed to see the last of Imagination's 'PowerVR' shit, especially on x86, and hope we'll never see the likes of the "GMA500" again.

    On the other hand, this report has me wondering exactly what the Atom team is up to. Back when Intel started the whole 'Atom' business, the whole point of having a substantially different architecture, in-order, was to have something that could scale down to lower power in a way that their flagship designs couldn't. Since then, the ULV Core i3/5/7 chips have continued to improve on power consumption, and the Atoms have apparently been sprouting additional complexity and computational power. How much room do they have to do that before 'Atom' evolves itself right out of its power envelope, or Core ULV parts start hitting the same TDPs as higher-power Atoms; but with much more headroom?

    • by rev0lt ( 1950662 ) on Saturday January 05, 2013 @11:52PM (#42492909)

      How much room do they have to do that before 'Atom' evolves itself right out of its power envelope

      That's why they reduce the gate size (22nm). You get a less power-demanding product, and at the same time you gain additional room for extra features.
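
      As a rough first-order sketch of why (standard CMOS scaling hand-waving, not Intel's actual figures): dynamic power per transistor goes roughly as

      \[ P_{\mathrm{dyn}} \approx \alpha \, C \, V_{dd}^{2} \, f \]

      and a shrink from 32nm to 22nm lowers the switched capacitance C per transistor and usually allows a slightly lower supply voltage, so the same logic burns less at a given frequency - that's the budget that gets spent on extra cores, cache, or higher burst clocks.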

      or Core ULV parts start hitting the same TDPs as higher-power Atoms; but with much more headroom?

      If you consider current Atoms and performance-per-watt, a latest-gen Core is probably more efficient than Atom. But on the other hand, they are way more complex processors, usually with bigger on-die caches, and way more expensive. There may be some overlap with "budget" processors (such as the Celeron and the old Pentium D) in the new versions, but even then I don't think they will be direct competitors (as an example, how many easily upgradable Atom boards with ZIF-style socket have you seen?).

      • by Anonymous Coward on Sunday January 06, 2013 @12:23AM (#42493057)

        Modern Celeron and Pentium chips use the same basic architecture as the Core chips, just stripped of some of the more expensive features: smaller caches, lower clock speeds, and fewer cores. At equal clock speeds, a modern "Sandy Bridge"-based Pentium will perform nearly the same as an i3 2000-series.

      • as an example, how many easily upgradable Atom boards with ZIF-style socket have you seen?

        What makes you think Intel won't do away with upgradable socketed boards on desktops too [slashdot.org]?

        • by rev0lt ( 1950662 ) on Monday January 07, 2013 @01:47PM (#42507699)
          Because it is not a good business decision. That would imply that the whole board would have the shelf life of the (expensive) CPU. That would also imply that for each N board configurations and Y CPUs, you'd have N*Y products, instead of a good set of N generics and a small set of Y specifics. The unsold boards that became obsolete would also have the additional cost of carrying a valuable, but otherwise useless (because it is soldered in), CPU. Intel's premium market isn't the embedded segment where SBCs are the norm - it's highly specialized integrated circuits (such as CPUs) and generic boards. Having them combined by solder would reduce their potential profit, not increase it.
          • Also, unless the signal integrity issues are truly brutal, it wouldn't be terribly difficult to produce a CPU that is designed to be 'zillion-little-BGA-balls-permanently-attached' for volume constrained embedded applications and also produce a little PCB card that has an array of BGA pads on top and an array of LGA lands on the bottom, allowing you to turn your BGA-only CPU into a socketed CPU at modest additional expense.

            Given the uptick in tablets, ultrathin laptops, and 'every CPU manufactured in the past 5 years is faster than I need' cheapy desktops, I certainly wouldn't bet on CPU sockets getting any more common; but it seems unlikely that sockets would be killed entirely in the more expensive areas.

    • by timeOday ( 582209 ) on Sunday January 06, 2013 @12:09AM (#42492991)
      Intel is a big, rich company, so why place all their bets on Core or Atom exclusively instead of stacking the deck with both?

      Remember when the NetBurst (P4) architecture turned out not to have the legs that they hoped for and AMD was beating up on them? It was Intel's mobile architecture (the Pentium M, developed somewhat independently in Israel, following on from the P3 rather than NetBurst) that became the basis for the Core architecture, which brought Intel back into the lead on desktops. Secondly, consider Itanium - what if they had completely committed to that and burned their bridges on x86? If I were in the corner office at Intel I would allow Atom and Core to compete until and unless one has no advantages over the other.

      • by uvajed_ekil ( 914487 ) on Sunday January 06, 2013 @12:42AM (#42493129)
        I think it is clear that with the Core line Intel has finally ended and won the x86 war, so it is only logical for them to begin to focus more on ARM's market. AMD is all but defeated, I am sad to say, and demand for ever-faster desktop and traditional laptop uber-CPUs has died off. I think I speak for a lot of slashdotters when I say we still enjoy the ultimate app performance, immersive gaming experience, and ridiculous storage and networking options that desktops can deliver, and don't mind lugging around a "huge" 6-pound laptop with a 15"-17" screen. But that is not where the greatest demand lies right now. Intel is lagging in the tablet and ultra-mobile market segments so their continued Atom progress is not unexpected and, to be honest, it looks pretty intriguing (which is high praise coming from a longtime Intel hater!).

        They will probably not need to compete dollar for dollar on price as long as they can deliver superior performance, but they will have to close the gap somewhat. ARM SoC's, etc. aren't going away any time soon, especially on the lower end (ain't gonna see any $79 Intel tablets), but I think Intel are finally getting their shiz together to challenge the likes of the Tegra line, at least.

        If you want to see Intel push the envelope with Core or a successor, they might need some competition. There is no one to push them to innovate there, and no excitement (i.e. $$$ rolling in).
        • by Anonymous Coward on Sunday January 06, 2013 @12:59AM (#42493175)

          AMD wasn't defeated; they committed suicide by laying off engineers to help the bottom line in the short term. Naturally, they're finding that is deadly in the long term.

          • by Anonymous Coward on Sunday January 06, 2013 @07:46PM (#42499617)

            AMD wasn't defeated; they intentionally gave up the desktop. They're now focusing on the low end and the high end. AMD dominates in data centers, simply because they're so much cheaper. Rackspace, for example, is an AMD-only shop. AMD does well on the appliance side as well, although there are a ton of players, including even MIPS-based products.

            Basically, AMD decided it was too costly to beat Intel at their own game. So AMD crept back into the shadows. They'll live on as one of the myriad B2B companies. (Ok, technically they were always B2B, but I mean they're not going to compete in the public sphere anymore.)

          • by strikethree ( 811449 ) on Monday January 07, 2013 @06:12AM (#42503217) Journal

            Engineers are fungible. Lay them off when you do not need them and hire new ones when you do. What could possibly go wrong?

            Signed,
            MBA

        • by Anonymous Coward on Sunday January 06, 2013 @01:49AM (#42493353)

          The PC market looks like it is down 20 percent from the same quarter last year. And there is no sign that anything is going to change the collapse of the desktop x86 PC market.

          In the booming cellphone and tablet markets Intel is effectively a non-entity. Intel has nothing to offer other than hotter, hugely more power-hungry, and ridiculously more expensive chips.

          Intel's PR campaign that has been going on for the past few weeks isn't impressing anyone but existing Intel fans. So far all Intel has been able to demonstrate is that they are somewhat competitive with year-old, previous-generation-fab ARM solutions, at higher cost and with higher power requirements, in tests done by cherry-picked friendly people in the computing press.

          There is no sign that Intel is going to somehow miraculously transform Atom into a solution that matches the needs of cellphone and tablet manufacturers.

          • by viperidaenz ( 2515578 ) on Sunday January 06, 2013 @03:07AM (#42493661)

            In the last Intel Atom Slashvertisement it was the Atom that had the larger gate size, with the comparison being fairly equal but slightly in Atom's favour for performance and load power and in ARM's for idle power.

            • by Anonymous Coward on Sunday January 06, 2013 @03:27AM (#42493753)

              If Intel's Atom was actually competitive with ARM in the real world Intel wouldn't be wasting time with this latest PR effort and instead would be putting out press releases with announcements of new customers dumping ARM for Atom.

              They aren't.

              Intel's absurd boasts about Atom are as believable as their old hilariously fake SPEC compiler scores.

          • by Anonymous Coward on Sunday January 06, 2013 @03:13AM (#42493695)

            Really? So this [anandtech.com] comparison is somehow stacked? Because in it, the latest Cortex-A15-based Exynos in the Nexus 10 draws more power than the Z-series Atom while being slower.

            • by Anonymous Coward on Sunday January 06, 2013 @04:49AM (#42494039)

              Slower? Interesting, because I read that article and it said it was faster, albeit using more power (and it's a smartbook/large tablet optimised design, not a smartphone optimised design).

              Never mind the fact that the entire article is sponsored by Intel: they supplied everything, including the testing methodology.

              Anandtech doesn't even bother to mask the fact that it's really Inteltech these days.

          • by FishTankX ( 1539069 ) on Sunday January 06, 2013 @07:08AM (#42494517)

            They don't need to. If worst comes to worst, Intel begins to make ARM SoCs and applies their superior process technology (Intel is almost always at least one node ahead of the curve). They don't even need a better-designed processor, just one good 'nuff to beat the competition with their lithography advantage.

        • by AmiMoJo ( 196126 ) * on Sunday January 06, 2013 @07:20AM (#42494577) Homepage Journal

          It's more like ARM could eat Intel's breakfast if it isn't careful. ARM processors are already good enough for 95% of what people do, even on the desktop. Just look at Chromebooks and the near console level gaming available on high end tablets.

          ARM's biggest advantage is that there are so many people making them. Any shape or size you like, desktop-style CPU or fully integrated SoC, any price bracket. The fact that Chinese manufacturers like Allwinner make them is a big deal too, because just as the West doesn't seem to like Chinese parts, the Chinese prefer to avoid Western manufacturers where possible (language and supply chains probably have a lot to do with it). On top of that, big companies like Samsung and Apple make their own CPUs anyway, and since they own the top end of the market it will be very hard for Intel to get in.

          • ARM processors are already good enough for 95% of what people do, even on the desktop.

            But everybody has a different 5 percent that isn't yet ported to ARM.

            Just look at Chromebooks and the near console level gaming available on high end tablets.

            Given that the Xbox 360 is seven years old, "near console level" is not saying much. Seven years is just one year less than the gap between the Nintendo 64 and Nintendo DS, which offered near Nintendo 64-level graphics.

    • by Anonymous Coward on Sunday January 06, 2013 @04:46AM (#42494013)

      Let's be clear about this - the Imagination GPUs are excellent; the problem is that Intel decided to write their own drivers, badly. Very badly. Okay, they outsourced it, but the end responsibility was theirs. Imagination's own drivers, which by all accounts are good, were not used.

      So put the blame where it should be directed - Intel.

    • by davydagger ( 2566757 ) on Sunday January 06, 2013 @05:06AM (#42494091)
      "On the other hand, this report has me wondering exactly what the Atom team is up to."

      Same thing they've always been up to, competing with ARM.

      At first they needed to be low power, when top-of-the-line ARM was 650MHz on a single core. Within 3 years, ARM got quad-cores running at 1.5GHz and other enhancements.

      What changed was the competition.

      If you're looking for a cheap, no-frills x86 SoC, try an AMD Geode.

      Today's devices require more, however.
    • by Kjella ( 173770 ) on Sunday January 06, 2013 @09:24AM (#42495141) Homepage

      I think the in-order architecture was just as much based on the other key feature of Atom that Intel didn't talk so much about to consumers - die size and cost for Intel. If we compare the early 230 and 330 to contemporary 45nm processors, then a single-core Atom was 25 mm^2, a dual core 2x25 mm^2, a Wolfdale dual-core 107 mm^2 and a quad core 2x107 mm^2. On top of that comes better edge utilization of wafers and a lower defect rate, since each chip is smaller. In practice Intel could probably produce five single-core Atoms for the cost of one Wolfdale dual core, allowing Intel to sell a $29 CPU in a market where they'd otherwise charge $100+.
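
      To put rough numbers on that claim, here's a hedged back-of-the-envelope sketch (the 300mm wafer, the defect density, and the simple Poisson yield model are illustrative assumptions, not Intel's real cost model):

      #include <math.h>
      #include <stdio.h>

      /* Rough dies-per-wafer and yield estimate for a ~25 mm^2 Atom die vs. a
       * ~107 mm^2 Wolfdale die.  All inputs are illustrative assumptions. */
      static void estimate(const char *name, double die_mm2)
      {
          const double pi      = 3.14159265358979;
          const double wafer_d = 300.0;   /* wafer diameter in mm (assumed) */
          const double d0      = 0.002;   /* defects per mm^2 (assumed)     */
          double wafer_area    = pi * wafer_d * wafer_d / 4.0;

          /* Classic gross-die approximation: area term minus an edge-loss term
           * that hurts big dies more (the "edge utilization" effect).          */
          double gross = wafer_area / die_mm2 - pi * wafer_d / sqrt(2.0 * die_mm2);
          double yield = exp(-d0 * die_mm2);          /* Poisson yield model    */

          printf("%-8s %6.0f gross dies, %4.1f%% yield, %6.0f good dies/wafer\n",
                 name, gross, 100.0 * yield, gross * yield);
      }

      int main(void)
      {
          estimate("Atom", 25.0);      /* single-core Diamondville-class die */
          estimate("Wolfdale", 107.0); /* 45nm Core 2 dual-core die          */
          return 0;
      }

      With these made-up inputs the small die comes out a bit more than five good dies for every good big one, which is the kind of ratio being described.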

      I think that even if Atom and Haswell start to overlap, they'll belong to two quite different markets for Intel: one the low-performance, low-cost market and the other the high-performance, high-cost market, even if they're in the same power envelope. And if the Atoms are smaller than the Haswells, well, Intel can have high margins on both. Besides, I doubt Intel has forgotten that the Atoms are their SoC solution for smartphones and such; Anandtech did a pretty solid power analysis [anandtech.com] of their Clover Trail platform and the Atom CPU peaked at <1W, the platform at <5W. Haswell has a long way to go to reach those levels, even if a turbocharged Atom and ULV Haswell could intersect at 10W.

    • by Anonymous Coward on Sunday January 06, 2013 @09:30AM (#42495183)

      I figure that's the point. In the end, one approach or the other will come out on top, so why not attack the problem from both ends?

      They have the resources to bet on two horses, and win either way.

    • by bill_mcgonigle ( 4333 ) * on Monday January 07, 2013 @12:48PM (#42506901) Homepage Journal

      I, for one, will be overjoyed to see the last of Imagination's 'PowerVR' shit, especially on x86, and hope we'll never see the likes of the "GMA500" again.

      Yeah, thank goodness. Folks, this is how bad it is: the FreeBSD folks had to patch the VGA text console code to be compatible with the current Intel Atom boards, and the first full release of that code was just a week ago. From origin to present day, nobody had ever managed to implement VGA in a way that would fail when data was written a whole byte at a time to the VGA buffer, but PowerVR did, for a 2012 board release. To be completely fair, the original 1980's VGA spec does call for half-word writes to memory, but as far as testing or design goes, they broke 80-column text mode on an embedded platform, on one of the most popular embedded OS's. Intel didn't catch this or didn't care either. At least Intel now seems to be taking responsibility for their mistake, but boy did it cause a bit of consternation in the field. It's not that text mode is the flagship feature of any device people are making today, but boy, it's something you really expect not to have to fight.
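
      For anyone who hasn't poked at VGA text mode: the distinction at issue is just byte stores vs. 16-bit stores into the text buffer. A minimal freestanding sketch of the two styles (illustrative only, not the actual FreeBSD patch, and it assumes the buffer is identity-mapped):

      #include <stdint.h>

      /* 80x25 colour text mode: the buffer lives at physical 0xB8000 and each
       * cell is a 16-bit (character, attribute) pair. */
      #define VGA_TEXT_BASE 0xB8000UL

      static void putc_bytewise(int row, int col, char ch, uint8_t attr)
      {
          /* Two 8-bit stores per cell - the access pattern that broke on the
           * board being discussed. */
          volatile uint8_t *cell =
              (volatile uint8_t *)VGA_TEXT_BASE + 2 * (row * 80 + col);
          cell[0] = (uint8_t)ch;   /* character byte */
          cell[1] = attr;          /* attribute byte */
      }

      static void putc_wordwise(int row, int col, char ch, uint8_t attr)
      {
          /* One 16-bit ("half-word") store per cell - the style the hardware
           * apparently required. */
          volatile uint16_t *cell =
              (volatile uint16_t *)VGA_TEXT_BASE + (row * 80 + col);
          *cell = (uint16_t)(((uint16_t)attr << 8) | (uint8_t)ch);
      }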

  • by L473ncy ( 987793 ) on Saturday January 05, 2013 @11:40PM (#42492847)

    I know that it's DDR3 SODIMM but is there any particular reason they're limiting it to DDR3-1333?

    Would there be a performance gain if it could utilize DDR3-1600 like how the AMD fusion processors show decent performance gains using higher speed memory? I'm pretty sure that DDR3-1600 SODIMM's are out there.

    • by Anonymous Coward on Saturday January 05, 2013 @11:57PM (#42492931)

      It doesn't make sense for a processor 2 years out to have memory speed below the current generation of products. It would be okay if the product were to be available in Q1 2013, but you've got to be designing for the next generation of DDR3 speeds already.

        I can see that it is DDR3L and not DDR3. That is probably geared toward power saving, but there is no reason not to get a faster memory clock speed, as you've got to keep quad cores fed.

    • by tlhIngan ( 30335 ) <slashdot.worf@net> on Sunday January 06, 2013 @02:54AM (#42493621)

      I know that it's DDR3 SODIMM but is there any particular reason they're limiting it to DDR3-1333?

      Would there be a performance gain if it could utilize DDR3-1600 like how the AMD fusion processors show decent performance gains using higher speed memory? I'm pretty sure that DDR3-1600 SODIMM's are out there.

      Because they don't want memory performance to be comparable with the higher margin Core line. Atom is cheap - and they want to ensure that it's not cheap to the point where it eats into their bread and butter.

      And anyhow, DDR3 will be old hat by the time this processor comes out - DDR4 is just coming out.

    • by viperidaenz ( 2515578 ) on Sunday January 06, 2013 @03:09AM (#42493665)

      Because higher speeds create more heat. More power to feed the RAM and more power to run the bus.

  • win8/linux tablets? (Score:4, Informative)

    by ChunderDownunder ( 709234 ) on Saturday January 05, 2013 @11:43PM (#42492865)

    Runs all your x86 binaries.

    By MS' own definition, UEFI will support other OS options (not guaranteed under ARM).

    Has mature, supported FOSS GPU drivers, unlike every Android-only ARM SoC.

    THE platform for that budget linux tablet that dual boots into MS Office?

    • by Anonymous Coward on Sunday January 06, 2013 @06:53AM (#42494469)

      Why would you want to use apps that were written for x86? For a good experience you need to rewrite code so your apps are optimized for touch, not mouse. Then you need to optimize for performance, RAM usage, and power, since mobiles have much lower specs and will continue to be less powerful for the near future.

      Of course, if you like the hybrid laptop/tablet concept MS is pushing I guess x86 is for you.

      I'm not looking forward to half-assed x86 apps for tablets.

      • by Rockoon ( 1252108 ) on Sunday January 06, 2013 @08:53AM (#42494959)

        For a good experience you need to rewrite code so your apps are optimized for touch, not mouse.

        My apps are optimized for keyboard, thank you very much. Mice and touch are just extensions to the keyboard interface.

      • by ChunderDownunder ( 709234 ) on Sunday January 06, 2013 @09:05PM (#42500073)

        Yes, I'm fine with a tablet 'cover' that doubles as a keyboard and trackpad solution.

        Hybrids are the technology for 2013. Witness this week's CES for examples. Some rotate the screen like the Lenovo Twist, some flip like the Dell XPS Convertible, some detach like the HP Envy. Then there's the Surface Pro, as you mentioned. These things are in Ultrabook territory, but prices will come down as the novelty of a touchscreen laptop becomes the norm.

        But by x86 binaries I mean legacy win32 stuff that won't run on ARM linux. e.g. my government's tax software. Can an iPad run that?

        As for 'half-assed x86 apps for tablets', I'm sure you're aware that Android-x86 runs on Intel, as does KDE Plasma Active, and probably, without too much tweaking, webOS and Firefox OS.

        If, as MS promises, the firmware for x86 devices isn't locked down for Windows-only, you can triple boot to your heart's content. On an ARM based system, you're at the mercy of a manufacturer that ships Android-only kernel blobs.

  • by flargleblarg ( 685368 ) on Sunday January 06, 2013 @12:13AM (#42493007)
    What, you can properly hyphenate dual-core, but you can't be bothered to properly hyphenate ultra-light and low-power when used as adjectives?
  • by GPLHost-Thomas ( 1330431 ) on Sunday January 06, 2013 @12:19AM (#42493027)
    The Imagination Technologies drivers aren't open source, which was a big issue. Moving to an Intel video board means that it will be released as free software (unless Intel changes its policy, which is very unlikely). That's very good news for open source platforms!
    • by Billly Gates ( 198444 ) on Sunday January 06, 2013 @12:25AM (#42493065) Journal

      The Imagination Technologies drivers aren't open source, which was a big issue. Moving to an Intel video board means that it will be released as free software (unless Intel changes its policy, which is very unlikely). That's very good news for open source platforms!

      Call me cynical about these. Intel's Clover Trail is not Windows 7 compatible, and the previous generation is not Windows 8 compatible [neowin.net]. If Intel is blowing off Windows 7 without working drivers for their newest chipsets, what makes you think they will support Linux either?

      They want you to blow extra $$$ on a Core i5 that you do not need, and are trying to make this for tablets and phones only to stop ARM.

      • by Anonymous Coward on Sunday January 06, 2013 @01:00AM (#42493187)

        I'm thinking they almost have to produce drivers for Linux, or at least Android. Who else is going to buy these otherwise? It is possible they could backpedal on the free software front and not release the code, but I'm not so sure that is going to happen. That said, some of the statements made have been framed such that Intel is not supporting Linux. What that means exactly is unclear. HP's printers are the best supported under Linux, and HP states something similar. What they mean is they don't support end users. While this is an odd usage of words for Intel, given that the end user to them would be Dell, HP, etc., it could merely mean they won't produce drivers even though they will release specs. We could still see 100% support despite not being "supported".

      • by GPLHost-Thomas ( 1330431 ) on Sunday January 06, 2013 @01:35AM (#42493317)

        If Intel is blowing off Windows 7 without working drivers for their newest chipsets, what makes you think they will support Linux either?

        They want you to blow extra $$$ on a Core i5 that you do not need, and are trying to make this for tablets and phones only to stop ARM.

        I'm convinced that Intel will not support older kernel versions. They never do anyway; they always target current, or next, so it can get upstreamed. Probably they did the same with Windows, I don't know (and frankly, I don't care). But what we're talking about here is support for the video driver, which is already in kernel.org (unless they integrate something completely new, but that's also very unlikely).

      • by Anonymous Coward on Sunday January 06, 2013 @02:45AM (#42493577)

        >what makes you think they will support Linux

        The fact that before running anything Windows, every CPU design at Intel first runs an in-house Linux variant on which various verification tools are run. So Linux has to run.

      • These processors are intended for tablet devices with touchscreens; Windows 7 is not really suited to such devices, and Windows 7-and-earlier tablets have always sold very poorly in the past.
        Linux, on the other hand, has sold well on tablets in the form of Android, so it makes more sense to support.

        Also, if Intel releases enough of the hardware specs, they don't need to explicitly support Linux; someone else will do it if they don't. Windows users typically don't write their own drivers, while Linux users do.

        It may be more than drivers, too: Windows 7 and earlier expect an IBM PC-compatible system, so if the new processors eliminate some of the backwards-compatibility cruft in order to save power, then the platform will be different enough that Windows cannot boot, and the only organisation capable of making the changes necessary for it to work wants to sell the latest version and has no incentive to do so for older versions.
        Linux, on the other hand, could have any necessary changes made by all manner of people.
        I believe this has already been the case with at least one model of Atom processor, which despite being x86 was unable to boot Windows but did have a modified Linux kernel working on it.

    • by symbolset ( 646467 ) * on Sunday January 06, 2013 @02:22AM (#42493455) Journal
      The GPU on these chips, due to be released the year after next, maxes out at the resolution of the 10" Nexus 10 from last year. That is a strategic error.
      • by Kjella ( 173770 ) on Sunday January 06, 2013 @04:31AM (#42493957) Homepage

        They'll also have ULX Haswells that go down to 10W and support 4K; I think both price- and performance-wise they'd be a better match. Besides, going from 132 ppi on the iPad 2 to 264 ppi on the iPad 3 was huge, and the Nexus 10 tops that with 299 ppi, but I don't see that race going much further since they are hitting the limits of human vision. I just hope we'll see reasonably priced 4K desktop monitors soon; 4K is also good for huge TVs but really serves no point on a 50" TV.
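
        (For reference, those ppi figures are just panel arithmetic, assuming the commonly quoted 9.7" and 10.055" diagonals:

        \[ \mathrm{ppi} = \frac{\sqrt{w^2 + h^2}}{d}, \qquad \frac{\sqrt{2048^2 + 1536^2}}{9.7} \approx 264, \qquad \frac{\sqrt{2560^2 + 1600^2}}{10.055} \approx 300 \]

        which matches the iPad 3 figure and is within rounding of the ~299 quoted for the Nexus 10.)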

  • by Anonymous Coward on Sunday January 06, 2013 @12:25AM (#42493067)

    Interesting that the submission incorrectly asserts that Bay Trail follows Cedar Trail. Clover Trail is the more direct preceding product on the roadmaps and was released about 3 months ago.

  • by Anonymous Coward on Sunday January 06, 2013 @12:42AM (#42493131)

    The trouble with Intel is it's competing against LAST YEAR'S ARM quad-core chip; by the time they get it out, the 8- and 16-core ARM chips will be out.

    Not only that, their main problem isn't the processing power, it's the power draw! They keep talking about idle less than 10 watts, in a market where idle is less than 100mW. Defining idle as 'what Windows 8 does in idle' doesn't work in a market dominated by Android, which idles a lot deeper.

    So we seem to have these endless puff pieces from them, promises of how amazing the next generation will be. It doesn't look like an attack on ARM (which requires weapons, AKA competitive chips); rather it looks like defense, to keep the server market from switching over any more than it has already.

    It's more about keeping companies that are locked into Intel from switching, on the promise of a fix real-soon-now.

    • by Anonymous Coward on Sunday January 06, 2013 @02:04AM (#42493395)

      They keep talking about idle less than 10 watts,

      I'm pretty sure you'll find that's under 10 watts when active, for any low-power offering.

      Anandtech did a surprisingly decent comparison [anandtech.com] of the power usage of Clover Trail (Atom Z2760) vs. Cortex-A15 (Exynos 5250).

      There are nits to be picked, but they do a good job of showing the power usage, including the energy benefits of race-to-idle, and the low idle draw of all tested platforms.

      These are both recent releases; I think comparing them is entirely fair, and they do appear close enough to trade blows.
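
      The race-to-idle argument is just energy arithmetic; a sketch with invented numbers (not measurements from the article) shows the shape of it:

      #include <stdio.h>

      /* Energy for a fixed task over a fixed window: finish fast at high power
       * then idle, vs. run slowly at lower power the whole time.  All numbers
       * here are made up for illustration. */
      int main(void)
      {
          double window_s = 10.0;   /* total time window, seconds           */
          double idle_w   = 0.05;   /* platform idle power, watts (assumed) */

          /* "Race": 2 W active for 2 s, then idle for the remaining 8 s.   */
          double race_j   = 2.0 * 2.0 + idle_w * (window_s - 2.0);

          /* "Crawl": 0.6 W active for the full 10 s.                       */
          double crawl_j  = 0.6 * window_s;

          printf("race-to-idle: %.2f J, slow-and-steady: %.2f J\n", race_j, crawl_j);
          /* Prints 4.40 J vs 6.00 J: racing wins here, but only because the
           * idle draw is genuinely low - which is why idle numbers matter.  */
          return 0;
      }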

      • by Anonymous Coward on Sunday January 06, 2013 @06:13AM (#42494317)

        http://www.extremetech.com/computing/110563-intel-medfield-32nm-atom-soc-power-consumption-specs-and-benchmarks-leak

        "Anandtech did a surprisingly decent comparison [anandtech.com] of the power usage of Clover Trail (Atom Z2760) vs.Cortex-A15 (Exynos 5250)."

        The actual 'favorable' test claimed was the Tegra 3 running Windows RT vs a Medfield running Windows 8. With Wifi turned off and the machine left to do nothing, nothing at all, not stream a movie, not run a GPS app, nothing.

        This is probably the only case where Medfield can shut down all the silicon blocks needed to compete with Tegra's low power single core.

        But you may as well have compared the 'off' state of both devices and declared it even. If they'd streamed a movie on Android on a Tegra 3 vs. Windows RT on a Medfield, the true nature of this would be revealed:
        The Tegra 3 can run on its single low-power core quite happily and stream video across Wi-Fi. The Medfield has no such core; its silicon blocks are on or off, and as soon as you do anything it's sucking down the juice. This is why the Razr has terrible battery life. Google [battery life android razr] to see the complaints!

        • by Anonymous Coward on Sunday January 06, 2013 @12:42PM (#42496615)

          I imported a Motorola RAZR i which is a Medfield based Android phone.

          With wifi active and 3G data disabled, it has incredible battery life, often coming home at the end of the day with 85% remaining, since I'm actually working during the day and not playing games. I accidentally left a GPS tracking app active all night w/o a charger recently, and the battery ran from around 75% when I went to bed to 35% when I woke in the morning. According to the system battery monitoring graphs, it is the display that uses the most power, not other software nor radios.

          ARM based Android phones I've tested previously never had anywhere near this stamina with the same use. This includes several Samsung Galaxy S and Nexus family models. At the same time, the RAZR i feels very responsive to me with Android 4.0.4, better than most previous phones except perhaps a Galaxy Nexus which had Android 4.1 (which made it far more responsive than its original Android 4.0).

        • Wow, so much misinformation across the board. Both the ARM based Razr M and the Intel based Razr I have excellent battery life. They are both excellent phones. I have the M myself, but comparing against friends with the I... No real difference.

          -Matt

    • by Anonymous Coward on Sunday January 06, 2013 @02:05AM (#42493401)

      "They keep talking about idle less than 10 watts"

      No they do not. TDP is the maximum power draw from the socket. Intel has demonstrated 20mW idle at 32nm with Medfield, and expects to drop this by a factor of four at 22nm.

  • by QuietLagoon ( 813062 ) on Sunday January 06, 2013 @01:53AM (#42493367)

    New leaked info from Intel sheds light on how the company's 2014 platforms will challenge ARM products in the ultra light, low power market.

    Intel is using the tactic perfected by Microsoft, i.e., compare your product plans from two or so years in the future with the current products of your competitor, and then say how much better your envisioned products are.

    Intel is behind the 8-ball in the low power market space, and this is nothing less than a move of desperation on Intel's part.

    • by CajunArson ( 465943 ) on Sunday January 06, 2013 @05:42PM (#42498807) Journal

      Uh... so you are saying that ARM is copying Intel's strategy with the never-ending harping about how great the A57 cores will be? I seem to recall sitting through 3 years of ARM hype about how the Cortex-A15 was going to permanently destroy Intel, and here we are with real systems running real tests that show that it isn't even insanely better than 32nm Atom parts. How come ARM hasn't completely taken over yet? I've been promised miracles!

  • by Anonymous Coward on Sunday January 06, 2013 @02:23AM (#42493457)

    Dual-core ARM A15 chips destroy any current dual-core Atom from Intel. The coming quad-core A15 parts will destroy any Intel ULV i3 part (Intel's crown-jewel CPU) that competes in the same space.

    However, the A15 design is now years old. ARM is replacing it with a fully 64-bit part that uses only 60% of the die space in the same process. This means that the ARM part that replaces the A15 in early 2014 has either more performance or less energy use - a total nightmare for Intel.

    Meanwhile, it is impossible for Intel to 'repeal' the Intel Tax. Intel is addicted to massive profits per chip, and cannot function on the margins made by those that manufacture the ARM SoC parts. Example: Intel is boasting support for 4K video on its next generation CPUs, but 4K support already exists on one of the cheapest ARM chips you can find in a tablet, the Allwinner A10.

    When Atom goes 'out of order', it ceases to be an Atom and is, instead, a renamed version of Intel's current 'Core' architecture. Intel going quad-core with the Atom makes zero sense, when the targeted low-power devices try to keep all but one core idle for power-saving reasons. Intel can already thrash its own future Atom with the earlier-mentioned ULV dual-core i3 part, as used in the latest Chromebook.

    It gets worse. AMD and ARM are fully unifying the memory space of external memory as used by either the GPU cores or the CPU cores. Intel is going in the opposite direction, attempting to build on-die RAM blocks for the exclusive use of the GPU on versions of its chips aimed at high-end notebooks. This project is dying on its feet, as notebook manufacturers cannot believe the money Intel wants for this version of Haswell - they know that if their notebook customers pay a lot for the product, they demand decent graphics from Nvidia or AMD, not half-working slow graphics rubbish from Intel.

    It gets worse. Apple is on the verge of dumping Intel completely for their own ARM SoC designs. The high-end Apple desktop systems that would struggle with current ARM chips hardly make money for Apple anyway compared with the phones, tablets, and MacBook Airs.

    It gets worse. Weak demand in the traditional PC marketplace means that Intel has growing spare capacity at its insanely expensive fabs. It tried to find customers for this free capacity, but Intel fabs are massively customised for Intel's own CPUs, and lack the technical support for other kinds of chips. Intel uses its outdated equipment to make other kinds of parts (like the dreadful Atoms, or the dreadful MB chipsets), but potential customers hardly want to make their new chips on these very old lines.

    It gets worse. Global Foundries (AMD's chip production facility that pretends to be independent) is making incredible strides in attracting business from many companies designing the most cutting-edge ARM parts. Samsung's chip business is going from strength to strength. Apple is making massive investments at TSMC. The Chinese fabs are coming along in leaps and bounds.

    It gets worse. The GPU is becoming by far the most important part of the modern SoC (system on a chip). Intel's GPU design is a distant fifth to the SoC GPUs from AMD, Nvidia, PowerVR and ARM itself. Of the five, only Intel's GPU still doesn't work properly, and is NOT compatible with modern graphics APIs. Intel has to hack its drivers to get even a handful of the most popular games running with minimal glitches. Intel GPU = you will have massive compatibility issues.

    Where is the Z80 today? The same question will be asked of x86/x64 tomorrow.

    • by asm2750 ( 1124425 ) on Sunday January 06, 2013 @04:02AM (#42493867)
      Actually the popular 8-bit cores of yore are still here. The Z80 still lurks around. The 6502 is still alive and kicking. Didn't you know? Zilog and WDC are still around.
      • by Anonymous Coward on Sunday January 06, 2013 @06:31AM (#42494395)

        He asked where it was; he didn't say it was gone. Running a fricking graphing calculator is a far cry from ruling the PC. You fail, fanboy.

        • by Anonymous Coward on Sunday January 06, 2013 @09:43AM (#42495257)

          If I had an account and mod points, I'd mod you down. As for the fanboiism, the pot is calling the kettle black! The GP is correct and you are trying to put words in his mouth by jabbering about desktops.

          • by Anonymous Coward on Sunday January 06, 2013 @11:28AM (#42496013)

            I bet I know why you don't have an account: You're too bleeding stupid to get one, you mouthbreather. Fanboy, me? From that post? Pointing out glaring errors? Thanks for the laugh.

            Finally, your reading comprehension really sucks. This is what the parent said:

            It gets worse. The GPU is becoming by far the most important part of the modern SoC[...]

            All in the context of the growing importance of the SoC, where Intel does not rule, as opposed to the desktop, where they do.

            He finally asks the question:

            Where is the Z80 today?

            which is clearly another reference to the desktop since that's where the Z80 used to be king, albeit obviously before you were born, going by your arrogant, ignorant shitty brat attitude.

    • by wintermute000 ( 928348 ) <(bender) (at) (planetexpress.com.au)> on Sunday January 06, 2013 @05:06AM (#42494089)

      Sorry, wrong. Google the AnandTech benchmark of the current Medfield in the RAZR i vs. Krait. It's competitive NOW, and that's without a process advantage.

      • by Anonymous Coward on Sunday January 06, 2013 @08:25AM (#42494807)

        Ya, but who cares if it costs more? Intel likes its profits, and an extra $10 over ARM isn't enough to keep their business alive. If Atom costs $50 more for better performance, it still isn't enough to sustain Intel's process lead. Not to mention it would make the CPU the most expensive component of the device. Battery life wouldn't be much different, since the display and wireless take more power. Samsung and Apple will be the last manufacturers to switch to x86; they'll use their own in-house chips out of pride. Also, the price of mobiles will drop, so there will be even less of an incentive to use expensive Atoms.

        The war just started but Intel lost years ago.

    • by leathered ( 780018 ) on Sunday January 06, 2013 @08:37AM (#42494875)

      I don't know why this is being modded down but AC is right on the money with the 'Intel tax'. Intel are addicted to 60%+ average margins on their CPUs and it's going to be hell for them to give them up.

      People can tout supposedly superior performance figures for Intel's offerings, but it simply doesn't matter. Even if their parts offer 30% better performance, unless they can get them down to no more than $20 per part, the tablet and mobile manufacturers will simply not be interested.

      Another issue is Intel's lack of flexibility. ARM is the 'Have It Your Way' CPU designer. You can license entire SoC designs, or you can license the ISA, or just pick and choose what you want to incorporate into your own SoC. With Intel it's all or nothing.

      • by Anonymous Coward on Sunday January 06, 2013 @11:08AM (#42495853)

        Also, the CPU is growing less important. Strangely, a lot of people like to play games on their tablets, so GPU performance is more important. So is hardware-accelerated video decode. Neither requires a fast CPU. Neither does having a lighter tablet or a better display. These days battery life is more dependent on wireless performance and the kind of display you have. Oh, and as tablets slowly replace desktops as the main computing device, people will demand more storage and RAM. So people are demanding better GPUs, displays, wireless, RAM, NAND, weight, battery life and price over CPU performance. I forgot to mention screen size.

        I have an iPad. Personally, I'd like to see larger screens, more storage, lower prices, and GPS in the Wi-Fi model. I also wish Apple wouldn't cripple competing browsers by denying them their own JavaScript engine over BS security reasons. CPU performance? It's better than my old laptop.

        Look at how well the iPad mini sold. All because it's so light and the price is low. Nobody gave a shit that it had a slower CPU than the iPad.

        Intel is fucked.

      • by Anonymous Coward on Sunday January 06, 2013 @12:43PM (#42496629)

        And this is exactly what ARM designs are good for - not allowing Intel to rest. They will be what AMD was (and hopefully will be) for desktop processors - much needed competition. But actually beating Intel in any technical metric - unlikely.

    • by Anonymous Coward on Sunday January 06, 2013 @09:14AM (#42495083)

      We could listen to you, or we could listen to experienced people who have designed x86 and other processors. Those who have done that admit that x86 has overheads, but those are so small that they can be mostly ignored. Some of those overheads have in practice been shown to increase performance, by either forcing the engineers to make a better solution or by making the life of programmers easier, like the efficient load/store mechanisms or the microcode that enables fast REP MOVSQ (=block move) execution.
      At worst x86 has ~10% overheads (that doesn't mean -10% performance) with all other things equal, but all other things aren't equal! x86 has the best processor engineers and the best process to build processors with. Wake me up when ARM has fabrication at the same level as Intel and I'll admit ARM has an advantage.

      There are many ways to create an out-of-order processor, and claiming that an OoO Atom is automatically the same design as a Core i-whatever just shows that you are completely clueless in this area and there's no reason to listen to you.
      The Core series since Sandy Bridge (Core i3/5/7 2xxx) uses a different approach for the major OoO structures than the Pentium Pro line (a physical register file with no data stored in the reorder buffer (ROB) vs. storing data in the ROB). The AMD Athlon used another method, with two register files to store data in combination with distributed reservation stations: the physical file and the future file. The PowerPC G4 used distributed reservation stations with a different mechanism.
      Shall I continue with other vital parts of an OoO execution engine? Instruction schedulers can use a multitude of solutions: CAMs (Content Addressable Memory), dependency-lookup designs, bit matrices... Speculative execution can be done a number of ways too.

      So an OoO design can be done in a lot of different ways and be tweaked for different goals. A low-power design would use fewer power-expensive resources, so CAMs would either be of reduced size or removed altogether. But sure; if you want to, you can keep the idea that the PPC G4 is essentially the same as a Pentium 4...
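
      To make the "data in the ROB vs. physical register file" split concrete, here is a toy sketch of what a reorder-buffer entry tracks in each style (field names invented for illustration; real entries carry far more state):

      #include <stdbool.h>
      #include <stdint.h>

      /* P6/Pentium Pro style: the speculative result value is parked in the ROB
       * entry itself and copied to the architectural register file at retire.  */
      struct rob_entry_data_in_rob {
          bool     done;        /* has the uop executed yet?          */
          uint8_t  dest_areg;   /* architectural destination register */
          uint64_t value;       /* result lives here until retirement */
      };

      /* Physical-register-file style (Sandy Bridge and later): results go
       * straight into a large PRF; the ROB entry only records which physical
       * register the destination was renamed to.                             */
      struct rob_entry_prf {
          bool     done;
          uint8_t  dest_areg;
          uint16_t dest_preg;   /* index into the physical register file */
      };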

      Global Foundries is a wholly separate entity, which you'd know if you followed tech news in any way. E.g., AMD recently paid GF a hefty fine for not needing previously reserved production capacity.

      Zilog still keeps trucking, BTW; the Z80 and the more modern eZ80 are used in a lot of designs.

    • by Anonymous Coward on Sunday January 06, 2013 @04:36PM (#42498421)

      You are overlooking the server market. SPARC got smashed pretty bad by AMD, and now Intel is easily smashing AMD.

      Data center operations are focusing on huge iron for databases, and huge iron to host VMs. Xeon is fitting the bill perfectly, and the newer Xeons are just getting started. A general workhorse server will have 8 Xeon CPUs, and machines with 256 CPUs are becoming more common. Power and network bandwidth issues have been neatly addressed in the data center; now the drive is toward more processing power, and Intel is clearly the future.

    • by Anonymous Coward on Sunday January 06, 2013 @07:20PM (#42499465)

      "When Atom goes 'out of order',"

      Almost every CPU made today uses out-of-order processing at the internal ISA level; it is transparent to the layers above. The Intel Atom uses in-order only because it saves power, but at a great cost to performance.

  • by caywen ( 942955 ) on Sunday January 06, 2013 @02:47AM (#42493587)

    My main issue with netbooks was the horrible resolution and the sluggishness.

    If, by the end of 2013, they can slim down a Bay Trail-based netbook to 3/4", banish the absolutely awful 1024x600 resolution for 1366x768 or even 1600x900, rev to Windows 8.5, and keep it at $350, I will buy 3 for the price of a MacBook Air.

    • by sgunhouse ( 1050564 ) on Sunday January 06, 2013 @03:19AM (#42493717)

      Why wait? My latest netbook (or they call it a netbook anyway) has an 11.6" display at 1366x768, is pretty close to 3/4" thick, has an AMD processor and Windows 8, and cost $200. Though maybe that was a special for the holidays ...

      • by Anonymous Coward on Sunday January 06, 2013 @05:57AM (#42494259)

        AMD E-series processors are horrible. The CPU part is slower than a same-clocked T-series Core 2 Duo from 5 years ago. I can't even understand Intel with this Atom crap. I rarely play games (Defcon and Darwinia) but my AutoCAD and Netflix need CPU, not a measly GPU. However, Windows 8 does work wonderfully on Atoms where Windows 7 struggles. Anyway, I prefer to get a second-hand, 5-year-old ultrabook that cost upwards of 2k back then, instead of getting budget 200-buck crap from today, and I'll get that high-quality machine for much the same price.

    • Currently, I have started seeing Celeron 867-based laptops advertised as "netbooks". They do have 1366x768, come with 4GB RAM, a hard disk (typically 500GB) and Windows 8... They cost just under 400€ (remember, in the tech world $1=1€, at least in the current exchange rate situation). They don't have an optical drive and are not Core iN based. So they have everything netbooks have, except for an Atom CPU, but everything Ultrabooks have, except for an iN CPU. If I needed a new laptop right now, I'd get myself one.

      On a side note: I seriously hope resolutions will get better. 1366x768 is at the very low end of my tolerance.

  • by Anonymous Coward on Sunday January 06, 2013 @03:20AM (#42493731)

    We are still talking about 5 to 10 watts...
    Is it really a challenge for ARM?

  • by Anonymous Coward on Sunday January 06, 2013 @04:20AM (#42493917)

    Intel is like Kodak: stubbornly continuing to do business based on old tech no matter what happens in the real world. But there are still lots of fans of old PC stuff, so Intel doesn't die. Yet.

  • by TeknoHog ( 164938 ) on Sunday January 06, 2013 @07:08AM (#42494519) Homepage Journal

    the first Atom to feature Intel's own graphics processor

    I have an Atom D510 with integrated Intel graphics [wikipedia.org].

    GMA 3150 GPU and memory controller are integrated into the processor.

    Does that count? I bought the motherboard in 2010.

  • by Anonymous Coward on Sunday January 06, 2013 @05:17PM (#42498665)

    I propose that Intel license ARM technology and then blow everyone off the map with 22 nm ARM chips.
