
Info On Intel Bay Trail 22nm Atom Platform Shows Out-of-Order Design

MojoKid writes "New leaked info from Intel sheds light on how the company's 2014 platforms will challenge ARM products in the ultra light, low power market. At present, the company's efforts in the segment are anchored by Cedar Trail, the 32nm dual-core platform that launched a year ago. To date, all of Intel's platform updates for Atom have focused on lowering power consumption and ramping SoC integration rather than focusing on performance — but Bay Trail will change that. Bay Trail moves Atom to a quad-core, 22nm, out-of-order design. It significantly accelerates the CPU core with burst modes of up to 2.7GHz, and it'll be the first Atom to feature Intel's own graphics processor instead of a licensed core from Imagination Technologies."
This discussion has been archived. No new comments can be posted.


  • by pollarda ( 632730 ) on Sunday January 06, 2013 @12:24AM (#42492753)
    It is Always Reassuring When .... You spend a bunch of money on a new processor and they tell you it is already "Out of Order" from the get-go.
    • by wmac1 ( 2478314 )

      Buying decisions should be made based on requirements. If what you buy meets your requirements (until the device's lifetime is over and you want to upgrade), you should not regret your decision.

      That applies to smartphones, DSLR and normal cameras, PCs, tablets etc. These devices advance at a rapid pace, and you will always have regrets if you try to keep up with the market.

  • About bloody time... (Score:5, Interesting)

    by fuzzyfuzzyfungus ( 1223518 ) on Sunday January 06, 2013 @12:32AM (#42492801) Journal

    I, for one, will be overjoyed to see the last of Imagination's 'PowerVR' shit, especially on x86, and hope we'll never see the likes of the "GMA500" again.

    On the other hand, this report has me wondering exactly what the Atom team is up to. Back when Intel started the whole 'Atom' business, the whole point of having a substantially different, in-order architecture was to have something that could scale down to lower power in a way that their flagship designs couldn't. Since then, the ULV Core i3/5/7 chips have continued to improve on power consumption, and the Atoms have apparently been sprouting additional complexity and computational power. How much room do they have to do that before 'Atom' evolves itself right out of its power envelope, or Core ULV parts start hitting the same TDPs as higher-power Atoms, but with much more headroom?
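
    To make the in-order/out-of-order distinction concrete, here is a toy single-issue scheduling model in Python. The instruction mix and latencies are invented for illustration and claim no resemblance to any real Atom or Core pipeline; the point is just that an out-of-order core hides a load-use stall by starting independent work early:

        # Toy model: the same instruction stream on an in-order core (stalls at
        # the first instruction whose operands aren't ready) vs. an out-of-order
        # core (issues any instruction whose inputs are ready, oldest first).
        # One instruction issued per cycle; latencies are made-up cycle counts.
        instrs = [
            ("a", [],    3),  # load a (3-cycle load)
            ("b", ["a"], 1),  # add using a -> depends on the load
            ("c", [],    3),  # independent load
            ("d", ["c"], 1),  # add using c
        ]

        def in_order_cycles(instrs):
            ready_at, t = {}, 0
            for dest, srcs, lat in instrs:
                start = max([t] + [ready_at[s] for s in srcs])  # stall for operands
                ready_at[dest] = start + lat
                t = start + 1  # next instruction can't issue before this one
            return max(ready_at.values())

        def out_of_order_cycles(instrs):
            ready_at, pending, t = {}, list(instrs), 0
            while pending:
                # issue at most one ready instruction per cycle, oldest first
                for i, (dest, srcs, lat) in enumerate(pending):
                    if all(ready_at.get(s, 10**9) <= t for s in srcs):
                        ready_at[dest] = t + lat
                        pending.pop(i)
                        break
                t += 1
            return max(ready_at.values())

        print("in-order:    ", in_order_cycles(instrs), "cycles")   # 8
        print("out-of-order:", out_of_order_cycles(instrs), "cycles")  # 5

    Same four instructions, 8 cycles vs. 5; that gap is what Bay Trail is buying, at the cost of the extra scheduling hardware and its power draw.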

    • by rev0lt ( 1950662 ) on Sunday January 06, 2013 @12:52AM (#42492909)

      How much room do they have to do that before 'Atom' evolves itself right out of its power envelope

      That's why they reduce the gate size (22nm). You get a less power-demanding product, and at the same time you gain additional room for extra features (a first-order sketch of the arithmetic is at the end of this comment).

      or Core ULV parts start hitting the same TDPs as higher-power Atoms, but with much more headroom?

      If you consider current Atoms and performance-per-watt, a latest-gen Core is probably more efficient than an Atom. But on the other hand, they are far more complex processors, usually with bigger on-die caches, and far more expensive. There may be some overlap with "budget" processors (such as the Celeron and the old Pentium D) in the new versions, but even then I don't think they will be direct competitors (as an example, how many easily upgradeable Atom boards with a ZIF-style socket have you seen?).
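
      The first-order sketch of the shrink arithmetic mentioned above, using the classic dynamic-power relation P ~ C * V^2 * f. The capacitance and voltage figures below are invented assumptions for illustration, not published numbers for any Intel process:

          # Relative dynamic power: P ~ C * V^2 * f (switching activity folded
          # into C). Illustrative numbers only -- not 32nm/22nm datasheet values.
          def dyn_power(c_rel, volts, f_ghz):
              return c_rel * volts**2 * f_ghz

          p_old = dyn_power(1.00, 1.00, 1.8)  # hypothetical core on the old node
          p_new = dyn_power(0.75, 0.90, 1.8)  # shrink: ~25% less C, slightly lower V
          print(p_new / p_old)                # ~0.61 -> roughly 40% less dynamic power

      That saved power is the "additional room" in question: you can spend it on more cores, higher burst clocks, or out-of-order machinery and still land in the same envelope.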

      • as an example, how many easily upgradeable Atom boards with a ZIF-style socket have you seen?

        What makes you think Intel won't do away with upgradable socketed boards on desktops too [slashdot.org]?

        • by rev0lt ( 1950662 )
          Because it is not a good business decision. That would imply that the whole board has the shelf life of the (expensive) CPU. It would also imply that for N board configurations and Y CPUs, you'd have N*Y products, instead of a good set of N generics and a small set of Y specifics. The unsold boards that became obsolete would also carry the additional cost of a valuable but otherwise useless (because it is soldered on) CPU. Intel's premium market isn't the embedded segment where Sb
          • Also, unless the signal integrity issues are truly brutal, it wouldn't be terribly difficult to produce a CPU that is designed to be 'zillion-little-BGA-balls-permanently-attached' for volume constrained embedded applications and also produce a little PCB card that has an array of BGA pads on top and an array of LGA lands on the bottom, allowing you to turn your BGA-only CPU into a socketed CPU at modest additional expense.

            Given the uptick in tablets, ultrathin laptops, and 'every CPU manufactured in the pa

    • Intel is a big, rich company, so why place all their bets on Core or Atom exclusively instead of stacking the deck with both?

      Remember when the NetBurst (P4) architecture turned out not to have the legs they hoped for and AMD was beating up on them: it was Intel's mobile architecture (the Pentium M, developed somewhat independently in Israel as a follow-on to the P3 rather than NetBurst) that became the basis for the Core architecture, which brought Intel back into the lead on desktops. Secondly, consider Itanium - w

      • by uvajed_ekil ( 914487 ) on Sunday January 06, 2013 @01:42AM (#42493129)
        I think it is clear that with the Core line Intel has finally ended and won the x86 war, so it is only logical for them to begin to focus more on ARM's market. AMD is all but defeated, I am sad to say, and demand for ever-faster desktop and traditional laptop uber-CPUs has died off. I think I speak for a lot of slashdotters when I say we still enjoy the ultimate app performance, immersive gaming experience, and ridiculous storage and networking options that desktops can deliver, and don't mind lugging around a "huge" 6-pound laptop with a 15"-17" screen. But that is not where the greatest demand lies right now. Intel is lagging in the tablet and ultra-mobile market segments so their continued Atom progress is not unexpected and, to be honest, it looks pretty intriguing (which is high praise coming from a longtime Intel hater!).

        They will probably not need to compete dollar for dollar on price as long as they can deliver superior performance, but they will have to close the gap somewhat. ARM SoCs etc. aren't going away any time soon, especially on the lower end (ain't gonna see any $79 Intel tablets), but I think Intel is finally getting its shiz together to challenge the likes of the Tegra line, at least.

        If you want to see Intel push the envelope with Core or a successor, they might need some competition. There is no one to push them to innovate there, and no excitement (i.e. $$$ rolling in).
        • by Anonymous Coward on Sunday January 06, 2013 @01:59AM (#42493175)

          AMD wasn't defeated, they committed suicide by laying off engineers to help the bottom line in the short term. Naturally they're finding that is deadly in the long term.

          • by Anonymous Coward

            AMD wasn't defeated; they intentionally gave up the desktop. They're now focusing on the low end and the high end. AMD dominates in data centers, simply because they're so much cheaper. Rackspace, for example, is an AMD-only shop. AMD does well on the appliance side as well, although there are a ton of players, including even MIPS-based products.

            Basically, AMD decided it was too costly to beat Intel at their own game. So AMD crept back into the shadows. They'll live on as one of the myriad B2B companies.

          • Engineers are fungible. Lay them off when you do not need them and hire new ones when you do. What could possibly go wrong?

            Signed,
            MBA

        • by AmiMoJo ( 196126 ) * on Sunday January 06, 2013 @08:20AM (#42494577) Homepage Journal

          It's more like ARM could eat Intel's breakfast if it isn't careful. ARM processors are already good enough for 95% of what people do, even on the desktop. Just look at Chromebooks and the near console level gaming available on high end tablets.

          ARM's biggest advantage is that there are so many people making them. Any shape or size you like, desktop-style CPU or fully integrated SoC, any price bracket. The fact that Chinese manufacturers like Allwinner make them is a big deal too, because just as the West doesn't seem to like Chinese parts, the Chinese prefer to avoid Western manufacturers where possible (language and supply chains probably have a lot to do with it). On top of that, big companies like Samsung and Apple make their own CPUs anyway, and since they own the top end of the market it will be very hard for Intel to get in.

          • ARM processors are already good enough for 95% of what people do, even on the desktop.

            But everybody has a different 5 percent that isn't yet ported to ARM.

            Just look at Chromebooks and the near console level gaming available on high end tablets.

            Given that the Xbox 360 is seven years old, "near console level" is not saying much. Seven years is just one year less than the gap between the Nintendo 64 and Nintendo DS, which offered near Nintendo 64-level graphics.

    • Re: (Score:3, Interesting)

      by Anonymous Coward

      Let's be clear about this - the Imagination GPUs are excellent, the problem is that Intel decided to write their own drivers, badly. Very badly. Okay, they outsourced it, but the end responsibility was theirs. Imagination's own drivers, which by all accounts are good, were not used.

      So put the blame where it should be directed - Intel.

    • "On the other hand, this report has me wondering exactly what the Atom team is up to."

      Same thing they've always been up to, competing with ARM.

      At first they needed to be low power, when top-of-the-line ARM was 650MHz on a single core. Within three years, ARM got quad cores running at 1.5GHz, plus other enhancements.

      What changed was the competition.

      If you're looking for a cheap, no-frills x86 SoC, try an AMD Geode.

      Today's devices require more, however.
      • If you're looking for a cheap, no-frills x86 SoC, try an AMD Geode.

        No frills like acceptable performance. Anyway, AMD replaced Geode with their Fusion line.

    • by Kjella ( 173770 )

      I think the in-order architecture was based just as much on the other key feature of Atom that Intel didn't talk so much about to consumers - die size and cost for Intel. If we compare the early 230 and 330 to contemporary 45nm processors, a single-core Atom was 25 mm^2 and a dual-core 2x25 mm^2, while a Wolfdale dual-core was 107 mm^2 and a quad-core 2x107 mm^2. On top of that comes better edge utilization of wafers and a lower defect rate, since each chip is smaller. In practice Intel could probably produce 5 single-cores
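
      The back-of-the-envelope version of that die arithmetic, assuming 300mm wafers and ignoring edge loss, scribe lines and defect yield (so treat the ratios, not the absolute counts, as the point):

          import math

          WAFER_DIAMETER_MM = 300
          wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2  # ~70,686 mm^2

          for name, die_mm2 in [("Atom (single core)", 25),
                                ("Atom (dual core)",   50),
                                ("Wolfdale dual core", 107)]:
              print(f"{name:20s} ~{wafer_area / die_mm2:6.0f} dies/wafer")
          # Atom single: ~2827/wafer, Wolfdale dual: ~661/wafer -- roughly
          # 4-5 single-core Atoms per Wolfdale, before even counting the
          # yield advantage of the much smaller die.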

    • I, for one, will be overjoyed to see the last of Imagination's 'PowerVR' shit, especially on x86, and hope we'll never see the likes of the "GMA500" again.

      Yeah, thank goodness. Folks, this is how bad it is: the FreeBSD folks had to patch the VGA text console code to be compatible with the current Intel Atom boards, and the first full release of that code was just a week ago. From origin to present day, nobody had ever managed to implement VGA in a way that would fail when data was written a whole byte at

  • I know that it's DDR3 SODIMM but is there any particular reason they're limiting it to DDR3-1333?

    Would there be a performance gain if it could utilize DDR3-1600, like how the AMD Fusion processors show decent performance gains using higher-speed memory? I'm pretty sure that DDR3-1600 SODIMMs are out there.

    • by Anonymous Coward

      It doesn't make sense for a processor two years out to have memory speeds below the current generation of products. It would be okay if the product were available in Q1 2013, but you've got to be designing for the next generation of DDR3 speeds already.

      I can see that it is DDR3L and not DDR3. That is probably geared toward power saving, but there is no reason not to get a faster memory clock, as you've got to keep four cores fed.

    • by tlhIngan ( 30335 )

      I know that it's DDR3 SODIMM but is there any particular reason they're limiting it to DDR3-1333?

      Would there be a performance gain if it could utilize DDR3-1600, like how the AMD Fusion processors show decent performance gains using higher-speed memory? I'm pretty sure that DDR3-1600 SODIMMs are out there.

      Because they don't want memory performance to be comparable with the higher-margin Core line. Atom is cheap - and they want to ensure that it's not cheap to the point where it eats into their bread and butter.

    • Because higher speeds create more heat: more power to feed the RAM and more power to run the bus.
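
      For scale, the raw numbers at stake (peak theoretical figures: transfer rate times 8 bytes per 64-bit channel; whether a Bay Trail-class SoC could actually use the extra bandwidth is exactly the open question):

          # Peak theoretical bandwidth per 64-bit DDR3 channel.
          for grade, mt_s in [("DDR3-1333", 1333), ("DDR3-1600", 1600)]:
              print(f"{grade}: {mt_s * 8 / 1000:.1f} GB/s per channel")
          # DDR3-1333: 10.7 GB/s, DDR3-1600: 12.8 GB/s -> ~20% more peak
          # bandwidth, bought with higher RAM and bus power.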

  • win8/linux tablets? (Score:4, Informative)

    by ChunderDownunder ( 709234 ) on Sunday January 06, 2013 @12:43AM (#42492865)

    Runs all your x86 binaries.

    By MS's own definition, UEFI will support other OS options (not guaranteed under ARM).

    Has mature, supported FOSS GPU drivers, unlike every Android-only ARM SoC.

    THE platform for that budget linux tablet that dual boots into MS Office?

  • What, you can properly hyphenate dual-core, but you can't be bothered to properly hyphenate ultra-light and low-power when used as adjectives?
  • The Imagination Technologies drivers aren't open source, which was a big issue. Moving to an Intel video core means that it will be released as free software (unless Intel changes its policy, which is very unlikely). That's very good news for the open source platforms!
    • Re: (Score:3, Insightful)

      The Imagination Technologies drivers aren't open source, which was a big issue. Moving to an Intel video core means that it will be released as free software (unless Intel changes its policy, which is very unlikely). That's very good news for the open source platforms!

      Call me cynical about these. Intel's Clover Trail is not Windows 7 compatible, and the previous generation is not Windows 8 compatible [neowin.net]. If Intel is blowing off Windows 7, without working drivers for their newest chipsets, what makes you think they will support Linux either?

      They want you to blow extra $$$ on a Core i5 that you do not need, and are trying to make this for tablets and phones only to stop ARM.

      • If Intel is blowing off Windows 7, without working drivers for their newest chipsets, what makes you think they will support Linux either?

        They want you to blow extra $$$ on a Core i5 that you do not need, and are trying to make this for tablets and phones only to stop ARM.

        I'm convinced that Intel will not support older kernel versions. They never do anyway; they always target the current or next kernel so the work can get upstreamed. They probably did the same with Windows, I don't know (and frankly, I don't care). But what we're talking about here is support for the video driver, which is already in kernel.org (unless they integrate something completely new, but that's also very unlikely).

      • by Bert64 ( 520050 )

        These processors are intended for tablet devices with touchscreens; Windows 7 is not really suited to such devices, and Windows 7-and-earlier tablets have always sold very poorly in the past.
        Linux, on the other hand, has sold well on tablets in the form of Android, so it makes more sense to support.

        Also, if Intel releases enough of the hardware specs, they don't need to explicitly support Linux; someone else will do it if they don't. Windows users typically don't write their own drivers, while Linux users do.

        It may be more

    • The GPU on these chips, due to be released year-after-next, maxes out at the resolution of the 10" Nexus 10 from last year. That is a strategic error.
      • by Kjella ( 173770 )

        They'll also have ULX Haswells that go down to 10W and support 4K; I think both price- and performance-wise they'd be a better match. Besides, going from 132 ppi on the iPad 2 to 264 ppi on the iPad 3 was huge, and the Nexus 10 tops that with 299 ppi, but I don't see that race going much further, since they are hitting the limits of human vision. I just hope we'll see reasonably priced 4K desktop monitors soon; 4K is also good for huge TVs but really serves no point on a 50" TV.
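
        Those ppi figures check out against the standard formula, pixel density = diagonal resolution in pixels / diagonal in inches (published panel specs; the Nexus 10 diagonal is 10.055"):

            import math

            def ppi(w, h, diag_in):
                # diagonal pixel count over diagonal size
                return math.hypot(w, h) / diag_in

            for name, w, h, d in [("iPad 2",   1024,  768,  9.7),
                                  ("iPad 3",   2048, 1536,  9.7),
                                  ("Nexus 10", 2560, 1600, 10.055)]:
                print(f"{name:9s} {ppi(w, h, d):5.0f} ppi")
            # ~132, ~264, ~300 -- matching the figures quoted above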

  • by QuietLagoon ( 813062 ) on Sunday January 06, 2013 @02:53AM (#42493367)

    New leaked info from Intel sheds light on how the company's 2014 platforms will challenge ARM products in the ultra light, low power market.

    Intel is using the tactic perfected by Microsoft, i.e., compare your product plans from two or so years in the future with the current products of your competitor, and then say how much better your envisioned products are.

    Intel is behind the 8-ball in the low power market space, and this is nothing less than a move of desperation on Intel's part.

    • Uh... so you are saying that ARM is copying Intel's strategy, with the never-ending harping about how great the A57 cores will be? I seem to recall sitting through three years of ARM hype about how the Cortex-A15 was going to permanently destroy Intel, and here we are with real systems running real tests that show it isn't even insanely better than 32nm Atom parts. How come ARM hasn't completely taken over yet? I've been promised miracles!

  • by Anonymous Coward on Sunday January 06, 2013 @03:23AM (#42493457)

    Dual ARM A15 chips destroy any current dual Atom from Intel. The coming quad A15 parts will destroy any Intel ULV i3 part (Intel's crown jewel CPU) that competes in the same space.

    However, the A15 design is now years old. ARM is replacing it with a fully 64-bit part that uses only 60% of the same die space in the same process. This means that the ARM part that replaces the A15 early 2014 has either more performance or less energy use- a total nightmare for Intel.

    Meanwhile, it is impossible for Intel to 'repeal' the Intel Tax. Intel is addicted to massive profits per chip, and cannot function on the margins made by those that manufacture the ARM SoC parts. Example: Intel is boasting support for 4K video on its next generation CPUs, but 4K support already exists on one of the cheapest ARM chips you can find in a tablet, the Allwinner A10.

    When Atom goes 'out of order', it ceases to be an Atom and is, instead, a renamed version of Intel's current 'Core' architecture. Intel going quad with the Atom makes zero sense when the targeted low-power devices try to keep all but one core idle for power-saving reasons. Intel can already thrash its own future Atom with the earlier-mentioned ULV dual-core i3 part, as used in the latest Chromebook.

    It gets worse. AMD and ARM are fully unifying the memory space of external memory as used by either the GPU cores or the CPU cores. Intel is going in the opposite direction, attempting to build on-die RAM blocks for the exclusive use of the GPU on versions of its chips aimed at high-end notebooks. This project is dying on its feet, as notebook manufacturers cannot believe the money Intel wants for this version of Haswell - they know that if their notebook customers pay a lot for the product, they demand decent graphics from Nvidia or AMD, not half-working slow graphics rubbish from Intel.

    It gets worse. Apple is on the verge of dumping Intel completely for its own ARM SoC designs. The high-end Apple desktop systems that would struggle with current ARM chips hardly make money for Apple anyway compared with the phones, tablets, and MacBook Airs.

    It gets worse. Weak demand in the traditional PC marketplace means that Intel has growing spare capacity at its insanely expensive fabs. It has tried to find customers for this spare capacity, but Intel fabs are massively customised for Intel's own CPUs and lack the technical support for other kinds of chips. Intel uses its outdated equipment to make other kinds of parts (like the dreadful Atoms, or the dreadful motherboard chipsets), but potential customers hardly want to make their new chips on these very old lines.

    It gets worse. GlobalFoundries (AMD's chip-production spin-off, which pretends to be independent) is making incredible strides in attracting business from many companies designing the most cutting-edge ARM parts. Samsung's chip business is going from strength to strength. Apple is making massive investments at TSMC. The Chinese fabs are coming along in leaps and bounds.

    It gets worse. The GPU is becoming by far the most important part of the modern SoC (system on a chip). Intel's GPU design is a distant fifth to the SoC GPUs from AMD, Nvidia, PowerVR and ARM itself. Of the five, only Intel's GPU still doesn't work properly, and it is NOT compatible with modern graphics APIs. Intel has to hack its drivers just to get a handful of the most popular games running with minimal glitches. Intel GPU = you will have massive compatibility issues.

    Where is the Z80 today? The same question will be asked of x86/x64 tomorrow.

    • Actually the popular 8-bit cores of yore are still here. The Z80 still lurks around. The 6502 is still alive and kicking. Didn't you know? Zilog and WDC are still around.
      • by Anonymous Coward

        He asked where it was, he didn't say it was gone. Running a fricking graphing calculator is a far cry from ruling the PC. You fail, fanboy.

    • by wintermute000 ( 928348 ) <{ua.moc.sserpxetenalp} {ta} {redneb}> on Sunday January 06, 2013 @06:06AM (#42494089)

      Sorry, wrong. Google the AnandTech benchmark of the current Medfield in the RAZR i vs. Krait. It's competitive NOW, and that's without a process advantage.

    • Re: (Score:3, Interesting)

      by leathered ( 780018 )

      I don't know why this is being modded down, but the AC is right on the money with the 'Intel tax'. Intel is addicted to 60%+ average margins on its CPUs, and it's going to be hell for them to give those up.

      People can tout supposedly superior performance figures for Intel's offerings, but it simply doesn't matter. Even if their parts offer 30% better performance, unless they can get them down to no more than $20 per part, the tablet and mobile manufacturers will simply not be interested.

      Another issue is Intel's lack of f

      • Re: (Score:1, Insightful)

        by Anonymous Coward

        Also, the CPU is growing less important. Strangely, a lot of people like to play games on their tablets, so GPU performance is more important, as is hardware-accelerated video decode. Neither requires a fast CPU. Neither does having a lighter tablet or a better display. These days battery life is more dependent on wireless performance and the kind of display you have. Oh, and as tablets slowly replace desktops as the main computing device, people will demand more storage and RAM. So people are willing demand

  • My main issue with netbooks was the horrible resolution and the sluggishness.

    If, by the end of 2013, they can slim a Bay Trail-based netbook down to 3/4", banish the absolutely awful 1024x600 resolution in favor of 1366x768 or even 1600x900, rev to Windows 8.5, and keep it at $350, I will buy 3 for the price of a MacBook Air.

    • Why wait? My latest netbook (or they call it a netbook anyway) has an 11.6" display at 1366x768, is pretty close to 3/4" thick, has an AMD processor and Windows 8, and cost $200. Though maybe that was a special for the holidays ...

    • Currently, I have started seeing Celeron 867-based laptops advertised as "netbooks". They do have 1366x768, come with 4GB RAM, a hard disk (typically 500GB) and Windows 8... They cost just under 400€ (remember, in the tech world $1 = 1€, at least in the current exchange-rate situation). They don't have an optical drive and are not Core iN based. So they have everything netbooks have, except for an Atom CPU, and everything Ultrabooks have, except for an iN CPU. If I needed a new laptop righ
  • the first Atom to feature Intel's own graphics processor

    I have an Atom D510 with integrated Intel graphics [wikipedia.org].

    GMA 3150 GPU and memory controller are integrated into the processor.

    Does that count? I bought the motherboard in 2010.
