Intel Cellphones Hardware

Intel Eyes Smartphone Chip Market 84

Posted by kdawson
from the make-room-for-atom dept.
MojoKid writes "Intel has been rather successful at carving out a large percentage of the netbook market with their low-power Atom processor. Moving forward, Intel's executives believe there's good potential to increase Atom's traction in adjacent markets by targeting its low-cost, energy-efficient chips at various multifunctional consumer gadgets, including smartphones and other portable devices that access the Internet. Code-named Moorestown, a new version of the chip will offer a 50x power reduction at idle and reportedly will deliver enough horsepower to handle 720p video recording and 1080p-quality playback. It is with this upcoming chip that Intel will begin targeting the smartphone market in 2011. Intel also plans to introduce an even smaller, less power-hungry version of the chip known as Medfield, which will be built on a 32nm process with a full solution comprising a PCB area of about half the size of a credit card."
This discussion has been archived. No new comments can be posted.


  • Really.... (Score:3, Interesting)

    by Darkness404 (1287218) on Sunday June 14, 2009 @04:49PM (#28329085)
    Really, Intel could excel in the smartphone chip market in a way they can't in the netbook market, because of MS and their speed/power restrictions on netbooks. The problem I see with the smartphone market is that x86 is terribly hard to make power-efficient enough while still being fast. Could Intel do it? Sure, but unlike with desktop CPUs they can't just increase the clock speed to get faster CPUs; they have to work at it.
    • by symbolset (646467) on Sunday June 14, 2009 @05:23PM (#28329373) Journal

      The Microsoft definition is driven by Intel [gizmodo.com]. It's dumb of both of them, as it defines "premium netbook" as one that doesn't have either of their products in it but which has a bigger screen, more memory, more storage or a faster processor. It's a "loser mentality [cnet.com]" that tries to protect the notebook market that's already in "race to the bottom" mode.

      Since neither of them can prevent other manufacturers from innovating outside of this specification, that just makes it easier for an up-and-coming manufacturer to create a new market without them, and enjoy the benefit of not having to compete with them in that market.

      So of course after that happens the restrictions will go away and it will be a free-for-all again.

      • It's a "loser mentality" that tries to protect the notebook market that's already in "race to the bottom" mode.

        I seem to recall the geek saying that Linux had a lock on the netbook market.

        Until XP and the Atom started kicking butt.

        How about - this time - we wait and see how well the next generation "mini laptop" sells.

        In a deep recession the market for the $99 gadget - the Blue Light special on Aisle 3 - often just dies.

      • by TheLink (130905)
        > but which has a bigger screen, more memory, more storage or a faster processor

        I thought that was typically called a laptop ;).

        The netbook spec just allows Microsoft to sell a cheaper Windows O/S for netbooks without affecting their pricing for the laptop market. I don't see how that netbook spec would keep Intel out of a premium netbook market. Linux runs fine on netbook/laptops with Intel CPUs. OSX runs fine too.

        Even your link itself and this article show that Intel is going into more markets, whether
        • by symbolset (646467)
          If they disappear the snapdragon [gottabemobile.com], there'll be hell to pay. Both of them had best be checking the logs to see who had contact during CeBit.
    • by Julie188 (991243)
      So, are you saying that all the chips that Intel makes, like the IXP4XX family (network processors), are all based on x86? Seems as if they already work at it. And if they are smart (and they are smart), they would see that in the next few years the PC becomes the netbook and the netbook merges with the smartphone. Some company or another is going to make a lot of money from those new devices.
  • Can't wait to (Score:5, Interesting)

    by msgmonkey (599753) on Sunday June 14, 2009 @04:54PM (#28329125)

    watch those 1080p movies on my smart phone screen.

    But on a more serious note, Intel will always be able to leverage their advanced fabrication processes to reduce power consumption. Most ARM chips I've seen use older (in terms of desktop CPU) process technology but the good architecture still gives you excellent power consumption.

    • Re:Can't wait to (Score:5, Informative)

      by supersat (639745) on Sunday June 14, 2009 @05:23PM (#28329369)
      Intel already had an ARM processor for smartphones -- the XScale PXA family. They decided to sell it off to Marvell a few years ago as part of their cost-cutting strategy. We'll see if that was a wise thing to do.
      • Re: (Score:3, Informative)

        by msgmonkey (599753)

        Well, as far as I remember they got XScale when they acquired DEC, so it probably wasn't a division that was taken very seriously. Whilst they did make some improvements, other manufacturers started producing ARM-based chips that were as good as if not better, so they got rid of it. I suspect the problem for Intel was that they didn't own the ARM architecture, so for them it was better to sell off what they had, since they would always be competing with other ARM licensees.

      • They sold it because it competed with their aspiration to have x86 enter the smartphone market. I don't think profitability had anything to do with it.
        • Re: (Score:1, Interesting)

          by Anonymous Coward
          Not exactly, I've seen their roadmap from before they decided that UMPCs were the future, and they had the XScale family moving up into the low end tablet/laptop space. At some point they changed their marketing from targeting smart phones to UMPCs, then sold off the XScale business. This was before the iPhone came out and gave the smartphone market the kick in the ass it needed.
    • [Can't wait to] watch those 1080p movies on my smart phone screen.

      There are already phones which can play 720p (and record too). Why the sarcasm? Would you rather watch a lower-quality movie?

      • Re:Can't wait to (Score:4, Insightful)

        by vlm (69642) on Sunday June 14, 2009 @05:54PM (#28329535)

        [Can't wait to] watch those 1080p movies on my smart phone screen.

        There are already phones which can play 720p (and record too). Why the sarcasm? Would you rather watch a lower-quality movie?

        I don't have a brick-sized smartphone, I have a tiny flip-phone. The screen is the size of a postage stamp, and the speakerphone sounds like a broken CB radio, which is plenty good enough for phone use. I will never be able to tell the difference between 320x240 and mono sound vs 1080p and 5.1 surround. Even on a slightly larger brick-sized smartphone, I don't think it would be a noticeable difference, other than the dramatic decrease in battery life and maybe waves of heat wafting off the CPU.

        At this time, can the average smartphone battery survive a low-res feature-length movie, and how much does it cost at five cents per kilobyte? Then extrapolate to ten times the data transferred (equals ten times the profit) plus ten times the processing, which equals roughly a tenth the battery life.
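The data-cost question above has a blunt back-of-envelope answer. A sketch, using the five-cents-per-kilobyte rate quoted in the comment; the 300 MB figure for a low-res feature-length movie is an assumption for illustration, not from the post:

```python
# Back-of-envelope data cost for a movie at $0.05/KB over the air.
# The 300 MB low-res movie size is an assumed illustrative figure.
cost_per_kb = 0.05            # dollars per kilobyte, as quoted in the comment
movie_kb = 300 * 1024         # ~300 MB movie expressed in KB
low_res_cost = movie_kb * cost_per_kb
hd_cost = low_res_cost * 10   # "ten times the data transferred"
print(f"low-res: ${low_res_cost:,.0f}  1080p: ${hd_cost:,.0f}")
# → low-res: $15,360  1080p: $153,600
```

At those per-kilobyte rates the bandwidth bill dwarfs the battery question entirely, which is presumably the commenter's point.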

        The other problem is that the past decade has been spent trying to convince mindless consumers that nirvana is buying the largest big-screen TV with the most surround speakers, and that then even the stupidest, most formulaic movie is great. They have had some success with this sales pitch. Now all the marketers have to do is convince them they were just kidding, and that nirvana is using the world's highest-resolution tiny phone, where even the stupidest, most formulaic movie is great. Good luck! They'll need it!

    • Re:Can't wait to (Score:5, Informative)

      by BikeHelmet (1437881) on Sunday June 14, 2009 @05:34PM (#28329439) Journal

      good architecture

      Don't you mean ludicrously good architecture?

      I'm thinking Cortex A8's, which have been out for over a year. Stuff like the OMAP 3530(present in the Beagleboard [beagleboard.org], upcoming Pandora Handheld [openpandora.org], and Palm Pre [slashdot.org]) consumes remarkably small amounts of power.

      The Pandora developers said their device consumes around or just over 1 watt. Most of that is from the LCD. They did experiments completely shutting off certain hardware, to measure power consumption, and concluded...

      CPU - about 20-40 mW
      DSP - about 30-60 mW
      SGX GPU - about 30-60 mW

      (Hard to get exact measurements due to the nature of how components interact. Anything loading the CPU probably loads up the memory as well. Anything hitting the GPU will hit the CPU, and DSP load varies greatly depending on the codec and video being decoded.)

      The entire SoC uses a ludicrously small amount of power; something like 0.2-0.4w. Then add another 0.6w for the LCD, and a bunch more for wireless.

      Now, compare that to the current Atoms, with 6+ watts just for the CPU/chipset, another 2+ for the HDD/SSD, at least 6-15w for the LCD, etc...

      If any company can drive down their power consumption, Intel can, but that doesn't mean it'll be easy to catch ARM!

      I just can't wait for Cortex A9's. Quad-core ARM in the exact same power envelope!
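The power figures above translate into battery life pretty directly. A rough sketch; the battery capacities (a ~14.8 Wh pack for the Pandora, a ~32 Wh netbook pack) are my assumptions for typical 2009 hardware, not numbers from the comment:

```python
# Rough battery-life comparison built on the power figures quoted above.
# Battery capacities are assumed typical-for-2009 values, not measured.
def runtime_hours(battery_wh: float, draw_w: float) -> float:
    return battery_wh / draw_w

pandora_draw_w = 0.3 + 0.6 + 0.3   # SoC + LCD + wireless (midpoint guesses)
netbook_draw_w = 6.0 + 2.0 + 6.0   # Atom CPU/chipset + storage + LCD (low end)

print(f"Pandora-class: {runtime_hours(14.8, pandora_draw_w):.1f} h")  # ~12 h
print(f"Atom netbook:  {runtime_hours(32.0, netbook_draw_w):.1f} h")  # ~2 h
```

Even with generous rounding, the order-of-magnitude gap between an ARM SoC platform and an Atom platform is what the comment is getting at.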

      • Re: (Score:3, Interesting)

        by BikeHelmet (1437881)

        Wish there was an edit button. :)

        Found the link: http://www.gp32x.com/board/index.php?s=&showtopic=48259&view=findpost&p=733993 [gp32x.com]

        If interested, you can search the forums for more info, and look up the Palm Pre battery life.

      • by msgmonkey (599753)

        Yes, I agree Intel won't be able to compete with the ARM because, as you rightly point out, the ARM is just too well designed an architecture.

        If you read between the lines though, I expect that the GPU and video decoding/encoding will be competitive if not better due to the fabrication process, and if you notice it says "50X less power consumption at idle", so what I suspect is that as long as you're not doing anything that pushes the CPU you will get OK power consumption overall, but I guess we will have to wait and

        • Yeah, Intel will be able to get a 50X power savings at idle, but the real problem is the 2 watts used when talking. That'll kill any phone battery in less than 10 minutes, unless it's one of the old analog bricks that weigh 2 kilos.

      • Re:Can't wait to (Score:4, Insightful)

        by ciroknight (601098) on Sunday June 14, 2009 @05:51PM (#28329519)

        good architecture

        Don't you mean ludicrously good architecture?

        I'm thinking Cortex A8's, which have been out for over a year. Stuff like the OMAP 3530(present in the Beagleboard [beagleboard.org], upcoming Pandora Handheld [openpandora.org], and Palm Pre [slashdot.org]) consumes remarkably small amounts of power.

        The Pandora developers said their device consumes around or just over 1 watt. Most of that is from the LCD. They did experiments completely shutting off certain hardware, to measure power consumption, and concluded...

        CPU - about 20-40 mW
        DSP - about 30-60 mW
        SGX GPU - about 30-60 mW

        (Hard to get exact measurements due to the nature of how components interact. Anything loading the CPU probably loads up the memory as well. Anything hitting the GPU will hit the CPU, and DSP load varies greatly depending on the codec and video being decoded.)

        The entire SoC uses a ludicrously small amount of power; something like 0.2-0.4w. Then add another 0.6w for the LCD, and a bunch more for wireless.

        Now, compare that to the current Atoms, with 6+ watts just for the CPU/chipset, another 2+ for the HDD/SSD, at least 6-15w for the LCD, etc...

        If any company can drive down their power consumption, Intel can, but that doesn't mean it'll be easy to catch ARM!

        I just can't wait for Cortex A9's. Quad-core ARM in the exact same power envelope!

        To be fair, the Atom runs at 6 watts max, where average power can go down to as little as 0.4W. The problem with Atom, as you say, is all of the other hardware needed to make it work. Its current chipset is incredibly power hungry, but they're working on that (integrating more and doing even deeper clock gating). Future Atoms will likely use even less power, with Intel already shipping chips with a max 2.4W threshold.

        And yes, you are being unfair comparing a device which has a hard drive with hundreds of gigabytes of space and a WXSVGA screen to a handheld device with a couple of gigs of flash memory and a HVGA screen. Nobody's stopping you from making an Atom device with those components (though it will take more power right now, it'll be vastly faster than the Cortex A8, and you won't have to recompile or use highly specialized toolkits, which is a huge Intel advantage).

        • Re:Can't wait to (Score:5, Interesting)

          by BikeHelmet (1437881) on Sunday June 14, 2009 @06:48PM (#28329863) Journal

          To be fair, the Atom runs at 6 watts max, where average power can go down to as little as 0.4W. The problem with Atom, as you say, is all of the other hardware needed to make it work. Its current chipset is incredibly power hungry, but they're working on that (integrating more and doing even deeper clock gating). Future Atoms will likely use even less power, with Intel already shipping chips with a max 2.4W threshold.

          Right, so if you're actually doing something, you don't get to use your computer as long.

          And yes, you are being unfair comparing a device which has a hard drive with hundreds of gigabytes of space and a WXSVGA screen to a handheld device with a couple of gigs of flash memory and a HVGA screen.

          The Pandora has dual-SDHC slots, so you could have 64GB of space. (More if bigger SDHC cards were actually made)

          Fine, an HDD is unfair, but SSD vs dual-SDHC is a valid comparison. The EEE PCs with SSDs had about 25MB/sec read/write speed. High end SDHC cards are slightly below that, and you can have two.

          Now that better SSDs are available(like the Vertex), it changes things - but the Vertex is also a whole other price range.

          Nobody's stopping you from making an Atom device with those components (though it will take more power right now, it'll be vastly faster than the Cortex A8, and you won't have to recompile or use highly specialized toolkits, which is a huge Intel advantage).

          Nobody makes x86 programs that work on such tiny screens. I would cite the "highly specialized toolkits" as an advantage for ARM in this case... you will need Linux and those fancy toolkits. Maemo, Android, etc. all work very well on tiny screens.

          And by the time Intel has an x86 Atom chip that will work in a fanless tiny device like a Pandora, ARM will have quad-core A9's available, so your point regarding performance is moot...

          After all, there is no Atom that will fit in a device that small... yet. I have news for you though - my Phenom II in a cellphone (lol) beats your Atom in a cellphone. ;)

        • Nobody's stopping you from making an Atom device with those components (though it will take more power right now, it'll be vastly faster than the Cortex A8, and you won't have to recompile or use highly specialized toolkits, which is a huge Intel advantage).

          Actually, go check the benchmarks and power draws on the chips and chipsets again. The Atom is most certainly not vastly faster than the Cortex A8 (particularly for equivalent clock and number of cores), and while Intel may be able to work the power draw down from the tens of watts that the chip + chipset + graphics require right now, that's a much harder task than what ARM has to do putting together the quad core Cortex A9 package that already has extremely low power graphics, etc. Though you do have it th

          • by Svartalf (2997)

            If you're already sitting on Linux, it's less of a port and more of a recompile if it's clean code. Seriously.

        • The problem with Atom, as you say, is all of the other hardware to make it work. Its current chipset is incredibly power hungry, but they're working on that (integrating more and doing even deeper clock gating).

          The ratio can already be seen getting better. The older 945GC chipset used a max of 22W, while the newer 945GSE tops out at 6W.

        • by Svartalf (2997)

          Even then, you're still consuming 2-3 times the juice of the current comparable ARM parts already shipping.

          "Vastly faster" is a relative concept, mind. Clock-for-clock, they're showing to be rather close in performance right at the moment. Most of the Cortex-A8 parts are clocked down to 500-600MHz to save further on juice.

          Don't get me wrong, Atom's VERY nice (I've got one machine right now, getting more...) but as a smartphone platform, it's not as compelling as ARM is. The only real reason you'd really EV

      • by hattig (47930)
        Damn right. Every few months this story about Intel driving their products down into the smartphone arena comes along.

        Last year it was laughable, with a CPU + Chipset that needed more PCB space than your average brickphone's footprint, never mind other components. They've reduced that for 2011 to something that's still 10x the footprint of an ARM SoC. I can't see Intel getting anything competitive until 2013, but it's not as if ARM is standing still.

        Cortex A9 brings out-of-order capability and multi-core (u
      • by PhireN (916388)
        The LCD on my EEE PC 1000HE uses way less than 6-15W.
        With Linux, a stripped-down window manager (awesome), the screen at ~30% brightness, and Bluetooth, wifi, SD slot and webcam powered down, it idles at 8.1-8.3W. (I think the HDD is spun down at this point.)
        Turning the backlight off only gets me down to 7.9W, and if I put the screen at full brightness the entire power load only increases to ~9.2W.
        Clearly the screen isn't using the bulk of the power.
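Those measurements actually bound the backlight's share directly; it's just subtraction on the figures quoted above:

```python
# Bounding the backlight's contribution from the wall-power measurements above.
backlight_off_w = 7.9     # total system draw with backlight off
full_brightness_w = 9.2   # total system draw at full brightness
backlight_max_w = full_brightness_w - backlight_off_w
share = backlight_max_w / full_brightness_w
print(f"backlight worst case: {backlight_max_w:.1f} W ({share:.0%} of total)")
# → backlight worst case: 1.3 W (14% of total)
```

So even at full brightness the backlight is around a seventh of this netbook's draw, supporting the commenter's conclusion that the screen isn't the bulk of it.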
        • Oh? Interesting.

          All I had available for reference was a smaller/older EEE PC.

          I assumed the bigger LCDs would use more juice, but it looks like the newer ones may use less.

          I take it you measured with a device like the Kill-A-Watt?

      • by mgblst (80109)

        Apple have done some research into this area, and concluded that the best power-saving technique is to ramp the CPU up for complex tasks, then hit idle as soon as you can, rather than dragging out the process. This sounds like what Intel is going for, with their 50x reduction in idle draw.

          • Apple have done some research into this area, and concluded that the best power-saving technique is to ramp the CPU up for complex tasks, then hit idle as soon as you can, rather than dragging out the process. This sounds like what Intel is going for, with their 50x reduction in idle draw.

          Doesn't really matter. A Cortex A8 isn't that much slower than an Atom. Certainly not enough that the idle savings will offset the load power usage.
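The "race to idle" tradeoff is just energy = power × time over a fixed window, so it's easy to put toy numbers on it. All wattages and task durations below are made-up illustrative assumptions, not measurements of either chip:

```python
# "Race to idle" in numbers: total energy over a fixed 10-second window
# in which a task must complete. All figures are illustrative assumptions.
def energy_joules(active_w, active_s, idle_w, window_s):
    # Energy = active burst + idling for the remainder of the window.
    return active_w * active_s + idle_w * (window_s - active_s)

fast_chip = energy_joules(active_w=2.0, active_s=1.0, idle_w=0.02, window_s=10)
slow_chip = energy_joules(active_w=0.4, active_s=8.0, idle_w=0.02, window_s=10)
print(f"fast-then-idle: {fast_chip:.2f} J, slow-and-steady: {slow_chip:.2f} J")
# → fast-then-idle: 2.18 J, slow-and-steady: 3.24 J
```

Flip the active-power ratio and the conclusion flips with it, which is the parent's point: the idle savings only dominate if the competing chips' active power and speed are actually close.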

      • Instead of replying to myself, I thought I'd add it here. Here's a Linuxdevices article [linuxdevices.com] on Intel's upcoming lower-power Atoms. This reduces the ridiculous power draw of the chipsets by combining the package into two chips. Quoting:

        "Even more important, the Pine Trail platform will have a seven-Watt TDP and require an average of just two Watts"

        That's after the improvements on an upcoming chip release. The article goes on to say the setup will cost more for Intel to produce. Good luck to them though; I'm still rooting for the race to the greatest performance out of milliwatts.

    • But on a more serious note, Intel will always be able to leverage their advanced fabrication processes to reduce power consumption. Most ARM chips I've seen use older (in terms of desktop CPU) process technology but the good architecture still gives you excellent power consumption.

      I think that may actually be on purpose. Moving to a smaller process offers many benefits, such as increased speed and circuit density. However, it also tends to increase the leakage power of a chip. Leakage used to be almost nonexistent; these days, somewhere on the order of 50% of the power dissipation in a chip is just leakage.

      For those not familiar with (semiconductor) leakage, here's a quick explanation: Transistors, as they are used in digital logic, have two states; 'on' and 'off'. When you make thos

    • Perhaps. But what if your phone had a USB socket (mine already does) and a HDMI socket? Carry your "laptop" around with you everywhere and use it as it should be - a communication device... BUT when you want a bigger screen, find the nearest 1080p panel and bam, big screen. Plug in a keyboard and mouse when you want to use it for office / school work.

      In my mind a device the size of a cell phone but as powerful as today's netbooks is something of a holy grail. Make a decent universal cradle so everyone has on

      • Exactly. This wasn't really possible before since the performance of the low power chips just wasn't high enough to do a broad enough range of tasks. But it really is within reach now. All of the necessary components are small enough, it would really just take some packaging and somebody decent behind it. Heck the current smartphones basically are this, just minus the HDMI and keyboard/mouse sockets. Actually the touchscreen on the smartphone could be the trackpad, so you'd just need a decent keyboard. Then
  • wrong info (Score:3, Informative)

    by Anonymous Coward on Sunday June 14, 2009 @04:57PM (#28329161)

    Intel talked at the press release about 50% reduction, not 50 times...

    • Re: (Score:3, Informative)

      by caladine (1290184)
      TFA has it here [hothardware.com] and here [hothardware.com] as a 50x reduction.
      • Yeah, for the standby power of just the chip. The chipset is still a hog, and that lower standby power doesn't matter if you can't keep it in standby long enough. The chip still draws way more power than the ARM chips when it's running. The Atom at 2 watts, plus a multiple-watt chipset, has a long way to go to get down to the 300 milliwatts or less that the whole Cortex A8 SoC runs at. It'll be interesting, but I don't see how they'll do it. It kind of seems like a me-too, now that ARM have been able to drive
        • by caladine (1290184)
          Yeah, I was just pointing out what Intel was claiming. Even a 50x reduction in standby power from the 1.6W (see my second link) is still 32mW in standby. That's considerably higher than other offerings in the marketplace. Also, as you point out, this doesn't include the power used by the chipset. Intel has a ridiculous amount of catch-up to do, and they know it. Given Intel's track record with the marketplace, watch for underhanded dealings with the Smartbook/Netbook/MID/whatever manufacturers.
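The standby arithmetic in that comment, spelled out (the 1.6 W figure is the one cited from the linked article):

```python
# Standby-power arithmetic from the comment above: a 50x cut from the
# current Atom standby figure cited in TFA.
atom_standby_w = 1.6                        # current Atom standby, per TFA
moorestown_standby_w = atom_standby_w / 50  # the claimed 50x reduction
print(f"{moorestown_standby_w * 1000:.0f} mW")
# → 32 mW
```

32 mW of pure standby is still an order of magnitude above what contemporary ARM SoCs draw in their deepest sleep states, which is why the commenter calls it catch-up.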
  • "It's a loser mentality to not develop one segment because you're worried about the other," he said. "I think we have several years ahead of us where we can innovate the heck out of any of these categories without getting defensive about the other one. You just need to unleash innovation in all of the segments and see what happens." - Sean Maloney [cnet.com]

    It's interesting to see Intel expanding out of their traditional markets and unleashing innovation in every direction. Since they're also staying pretty open about interfaces, people are going to do some pretty amazing stuff with their new products.

    • by Tycho (11893)

      It is, however, a failure when a design team attempts to design a product after the CEO and executives have set idiotic, fixed, immutable design requirements for a product that could never be competitive. The blame for this will not go to the CEO and execs; the design team will take the fall. This is the story with the Atom, Larrabee, and the eight-core Nehalem server processors with memory expansion controller chips, nearly all of which seem, to me, to be really stupid ideas. Though who knows, maybe if Intel tw

      • by symbolset (646467)

        This is the story with the Atom, Larrabee, and the eight-core Nehalem server processors with memory expansion controller chips, nearly all of which seem, to me, to be really stupid ideas.

        If you were a subscriber here you could see that I predicted all of these things years before they happened. I disagree that they're stupid ideas because to me they're my ideas. Itanium? Let's agree about that. That was a stupid idea that has ripened into a vile stench. Another $10B from Intel and HP isn't going to make this dog profitable.

        Like any big company, Intel has various factions that don't necessarily agree with one another. It has to: that's the price of progress. As a whole I think they're

  • "Intel also plans to introduce an even smaller, less power hungry version of the chip known as Medfield, which will be built on a 32nm process with a full solution comprising a PCB area of about half the size of a credit card."

    Selma [tvacres.com], is that you?

  • Intel vs. ARM (Score:5, Interesting)

    by moon3 (1530265) on Sunday June 14, 2009 @05:18PM (#28329325)
    It would be interesting to see what Intel will pitch against ARM's current superior offering. ARM is cheap and already has a PowerVR OpenGL accelerator and other stuff integrated, while being very power efficient. A bundled GPU and power efficiency are make-or-break in the mobile arena. Intel doesn't have an integrated GPU, nor a track record of being very power efficient.
    • by CAIMLAS (41445)

      Pretty simple, really. It doesn't take much looking to find info (on wikipedia, even) that the next generation Atom (due out end of 2009) will tentatively be a dual core SoC with integrated next-gen (for Intel, anyway) GPU.

  • Xscale? (Score:4, Interesting)

    by areusche (1297613) on Sunday June 14, 2009 @05:21PM (#28329357)
    Seriously, what happened to Intel's XScale processor? After they sold it to Marvell, it went into the abyss of forgotten tech. That ARM processor had the entire Palm and Pocket PC market by the balls a couple of years ago, since every device worth its weight was using it! They left that market and now want to reenter it? Last I checked, every smartphone still uses ARM.
    • Re: (Score:3, Informative)

      Intel had to license ARM technology from ARM, Ltd. This was not a viable long-term business strategy.
    • Re: (Score:2, Informative)

      by hattig (47930)
      It's in the SheevaPlug device from Marvell - that's a 1.2GHz ARMv5 device (1.2GHz StrongARM / XScale effectively).
  • I hope the Intel employees don't get too distracted by random visits from USB co-inventor Ajay Bhatt.

  • Intel is trying to cut down power, and ARM is trying to enter the multicore, superscalar field. So far, ARM is way ahead in the smartphone/mobile market. There, battery life is king, and Intel lags behind.
  • Good luck with that (Score:5, Interesting)

    by Taxman415a (863020) on Sunday June 14, 2009 @05:38PM (#28329461) Homepage Journal
    The current Atoms run about 2 watts, way too much for a smartphone even if they are able to cut that in half, and that's not even counting the power-hog chipsets needed for the Atom that require 5-12+ watts. By comparison, the current Cortex A8 packages with video etc. that are able to do 1080p are able to come in under the 300-milliwatt line smartphone manufacturers are looking for.

    And even better, if you're talking about Intel's chips two generations out, then consider the Cortex A9 quad core chips that are claiming to be ready to go and at reasonable power consumption in the same time frame if not sooner than Intel's offering. That article is actually claiming dual core Cortex A9 phones within a year that use about the same power as current chips with much better performance.

    So as noted it looks like ARM is going to have a much easier time scaling up performance at the smartphone power draw level than Intel is going to have getting anywhere near it. And the Cortex A9 will probably spank the Atom. The race should benefit everyone though. Maybe we'll actually get some decent performing netbook, laptop, and desktop chips out of it that run on extremely low power.
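A quick ratio on the figures in this comment shows the size of the gap. These are the comment's numbers, taking the low end of the quoted chipset range, not datasheet values:

```python
# How far the quoted Atom platform is from the smartphone power budget,
# using the figures from the comment above (not datasheet values).
atom_platform_w = 2.0 + 5.0   # Atom CPU + low end of the 5-12W chipset range
budget_w = 0.3                # the ~300 mW line smartphone makers target
print(f"{atom_platform_w / budget_w:.0f}x over budget")
# → 23x over budget
```

Even halving the CPU draw and integrating the chipset leaves Intel needing roughly an order-of-magnitude reduction, which is why the ARM side looks like the easier scaling problem here.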
    • The current Atoms run about 2 watts, way too much for a smartphone even if they are able to cut that in half, and that's not even counting the power-hog chipsets needed for the Atom that require 5-12+ watts. By comparison, the current Cortex A8 packages with video etc. that are able to do 1080p are able to come in under the 300-milliwatt line smartphone manufacturers are looking for.

      And even better, if you're talking about Intel's chips two generations out, then consider the Cortex A9 quad core chips that are claiming to be ready to go and at reasonable power consumption in the same time frame if not sooner than Intel's offering. That article is actually claiming dual core Cortex A9 phones within a year that use about the same power as current chips with much better performance.

      So as noted it looks like ARM is going to have a much easier time scaling up performance at the smartphone power draw level than Intel is going to have getting anywhere near it. And the Cortex A9 will probably spank the Atom. The race should benefit everyone though. Maybe we'll actually get some decent performing netbook, laptop, and desktop chips out of it that run on extremely low power.

      http://m.news.com/2166-12_3-10263278-64.html [news.com]
      http://www.liliputing.com/tag/arm-cortex-a9 [liliputing.com]
      http://www.pcmag.com/article2/0,2817,2341032,00.asp [pcmag.com]
      Crap, missed the link the first time. A couple more for good measure.

  • by Akir (878284)
    This is a day of rejoicing! Now even our most simple embedded devices can have decades of backward-compatibility baggage and buggy code!
  • Push for (potentially) standardized low-power/decent-performance mobile platform that might actually result in a handheld general purpose computer that isn't an iPhone? Yes, please.

    (Yes, I know all about the Palm Pre, Blackberries, and others. Quiet, you in the peanut gallery.)

    If it doesn't work, a competitive push for other makers (ARM, etc.) to do better? Yes please to that, as well.

    If this thing is supposed to be based on x86-ish architecture, though, I wonder how (or if) they've licked the bus and chips

  • I'd love to have an x86 processor powering my smartphone; this way I can run all the amazing x86-only apps and be in synergy with the x86 world. I'd dump my iPhone for one in a heartbeat.

    x86 shall prevail! die ARM die!

  • Somehow the flurry of upcoming ARM Cortex-based netbook and MID launches this summer has escaped the Slashdot crowd's attention:
    http://www.engadget.com/tag/arm [engadget.com]
    Intel is gonna be so dead in this segment.

    • by Svartalf (2997)

      I don't think it escaped anyone, really.

      It's almost as if we've got Intel or Windows fans posting the "pro" postings.

      ARM's already IN this space and nearly as fast per clock as Atom, and in 6-12 months will be nipping at the heels of Core's performance profile with the Cortex-A9, with nearly the same power/performance profile the A8's are already showing to have. This is not saying they're "OMG FAST!" at this stuff - but then, neither is the Atom, really. What I've had the pleasure of seeing was a machine that
