Intel Eyes Smartphone Chip Market
MojoKid writes "Intel has been rather successful at carving out a large percentage of the netbook market with its low-power Atom processor. Moving forward, Intel's executives believe there's good potential to increase Atom's traction in adjacent markets by targeting its low-cost, energy-efficient chips at various multifunctional consumer gadgets, including smartphones and other portable devices that access the Internet. Code-named Moorestown, a new version of the chip will offer a 50x power reduction at idle and reportedly will deliver enough horsepower to handle 720p video recording and 1080p-quality playback. It is with this upcoming chip that Intel will begin targeting the smartphone market in 2011. Intel also plans to introduce an even smaller, less power-hungry version of the chip known as Medfield, which will be built on a 32nm process with a full solution comprising a PCB area of about half the size of a credit card."
Really.... (Score:3, Interesting)
It's the other way around (Score:5, Insightful)
The Microsoft definition is driven by Intel [gizmodo.com]. It's dumb of both of them, as it defines "premium netbook" as one that doesn't have either of their products in it but which has a bigger screen, more memory, more storage or a faster processor. It's a "loser mentality [cnet.com]" that tries to protect the notebook market that's already in "race to the bottom" mode.
Since neither of them can prevent other manufacturers from innovating outside of this specification, that just makes it easier for an up-and-coming manufacturer to create a new market without them, and to enjoy the benefit of not having to compete with them in that market.
So of course after that happens the restrictions will go away and it will be a free-for-all again.
It ain't over till it's over (Score:2)
It's a "loser mentality" that tries to protect the notebook market that's already in "race to the bottom" mode.
I seem to recall the geeks saying that Linux had a lock on the netbook market.
Until XP and the Atom started kicking butt.
How about - this time - we wait and see how well the next generation "mini laptop" sells.
In a deep recession the market for the $99 gadget - the Blue Light special on Aisle 3 - often just dies.
Re: (Score:1)
I'm guessing you didn't click the link to see who was calling that a loser mentality.
Re: (Score:2)
I thought that was typically called a laptop
The netbook spec just allows Microsoft to sell a cheaper Windows O/S for netbooks without affecting their pricing for the laptop market. I don't see how that netbook spec would keep Intel out of a premium netbook market. Linux runs fine on netbook/laptops with Intel CPUs. OSX runs fine too.
Even your link itself and this article show that Intel is going into more markets, whether
Re: (Score:1)
Re: (Score:2)
Can't wait to (Score:5, Interesting)
watch those 1080p movies on my smart phone screen.
But on a more serious note, Intel will always be able to leverage their advanced fabrication processes to reduce power consumption. Most ARM chips I've seen use older (in terms of desktop CPU) process technology but the good architecture still gives you excellent power consumption.
Re: (Score:3, Informative)
Re: (Score:3, Insightful)
Yes, that was my point, but another point is that 1080p playback doesn't really mean anything as a CPU metric either, because I seriously doubt it would be the CPU doing the decoding. Most likely, like the ARM-based SoCs used in today's smartphones, there would be dedicated hardware to assist with the video encoding/decoding.
Re: (Score:1)
Re: (Score:2)
A Beagleboard already does 720p; it uses dedicated hardware for it, if I'm not mistaken.
Re:Can't wait to (Score:5, Informative)
Re: (Score:3, Informative)
Well, as far as I remember they got XScale when they acquired DEC, so it probably wasn't a division that was taken very seriously. Whilst they did make some improvements, other manufacturers started producing ARM-based chips that were as good as theirs, if not better, so they got rid of it. I suspect the problem for Intel was that they didn't own the ARM architecture, so for them it was better to sell off what they had, since they would always be competing with other ARM licensees.
Re: (Score:3, Informative)
IIRC, Intel got the rights to all of Digital Semiconductor's design portfolio, bar AXP, as part of the DEC v Intel lawsuit settlement [berkeley.edu] in about 1997. This included things like the 21x4x tulip NICs, the 21x5x PCI-PCI bridges, the SA-110 StrongARM.
Re: (Score:1)
Re: (Score:1, Interesting)
Re: (Score:2)
[Can't wait to] watch those 1080p movies on my smart phone screen.
There are already phones which can play 720p (and record it too). Why the sarcasm? Would you rather watch a lower-quality movie?
Re:Can't wait to (Score:4, Insightful)
[Can't wait to] watch those 1080p movies on my smart phone screen.
There are already phones which can play 720p (and record it too). Why the sarcasm? Would you rather watch a lower-quality movie?
I don't have a brick-sized smartphone, I have a tiny flip-phone. The screen is the size of a postage stamp, and the speakerphone sounds like a broken CB radio, which is plenty good enough for phone use. I will never be able to tell the difference between 320x240 with mono sound and 1080p with 5.1 surround. Even on a slightly larger brick-sized smartphone, I don't think it would be a noticeable difference, other than the dramatic decrease in battery life and maybe waves of heat wafting off the CPU.
At this time, can the average smartphone battery survive a low-res feature-length movie, and how much does it cost at five cents per kilobyte? Then extrapolate to ten times the data transferred (equals ten times the profit) plus ten times the processing, which equals roughly a tenth the battery life.
The other problem is that the past decade has been spent trying to convince mindless consumers that nirvana is buying the largest big-screen TV with the most surround speakers, so that even the stupidest, most formulaic movie is great. They have had some success with this sales pitch. Now all the marketers have to do is convince them they were just kidding, and nirvana is using the world's highest-resolution tiny phone, so that even the stupidest, most formulaic movie is great. Good luck! They'll need it!
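The five-cents-per-kilobyte question above answers itself with a little arithmetic; the 90-minute length and 500 kbit/s stream below are assumptions for illustration, not figures from the thread:

```python
# Rough cost of streaming a feature-length movie over a metered connection.
bitrate_kbps = 500            # assumed low-res stream, kilobits per second
minutes = 90                  # assumed feature length
kilobytes = bitrate_kbps * 60 * minutes / 8   # kilobits -> kilobytes
cost_per_kb = 0.05            # five cents per kilobyte, per the comment
cost = kilobytes * cost_per_kb
print(f"{kilobytes:,.0f} KB -> ${cost:,.2f}")
```

Even a single low-res movie runs to five figures at that rate, which is the commenter's point; tenfold the data makes it tenfold worse.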
Re: (Score:2)
What about 720p while connected to a TV?
http://www.youtube.com/watch?v=cXr-D1wROfQ [youtube.com]
Re: (Score:2)
Considering that most people WON'T be doing that, and that the Cortex-A8 based chips already appear to be able to DO 1080p...
Re:Can't wait to (Score:5, Informative)
good architecture
Don't you mean ludicrously good architecture?
I'm thinking Cortex A8's, which have been out for over a year. Stuff like the OMAP 3530(present in the Beagleboard [beagleboard.org], upcoming Pandora Handheld [openpandora.org], and Palm Pre [slashdot.org]) consumes remarkably small amounts of power.
The Pandora developers said their device consumes around or just over 1 watt. Most of that is from the LCD. They did experiments completely shutting off certain hardware, to measure power consumption, and concluded...
CPU - about 20-40mW
DSP - about 30-60mW
SGX GPU - about 30-60mW
(Hard to get exact measurements due to the nature of how components interact. Anything loading the CPU probably loads up the memory as well. Anything hitting the GPU will hit the CPU, and DSP load varies greatly depending on the codec and video being decoded.)
The entire SoC uses a ludicrously small amount of power; something like 0.2-0.4W. Then add another 0.6W for the LCD, and a bunch more for wireless.
Now, compare that to the current Atoms, with 6+ watts just for the CPU/chipset, another 2+ for the HDD/SSD, at least 6-15W for the LCD, etc...
If any company can drive down their power consumption, Intel can, but that doesn't mean it'll be easy to catch ARM!
I just can't wait for Cortex A9's. Quad-core ARM in the exact same power envelope!
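Those figures tally up neatly, which is worth checking. The component values below are the midpoints of the quoted ranges; the "rest of SoC" and wireless lines are inferred remainders needed to reach the quoted totals, not measured numbers:

```python
# Tally the Pandora power budget from the figures in the comment (watts).
soc = {
    "CPU": 0.030,         # quoted 20-40 mW, midpoint
    "DSP": 0.045,         # quoted 30-60 mW, midpoint
    "SGX GPU": 0.045,     # quoted 30-60 mW, midpoint
    "rest of SoC": 0.18,  # inferred remainder to reach the ~0.3 W SoC total
}
soc_total = sum(soc.values())
# + LCD (~0.6 W quoted) + an assumed ~0.1 W for wireless etc.
device_total = soc_total + 0.6 + 0.1
print(f"SoC {soc_total:.2f} W, device ~{device_total:.1f} W")
```

The sum lands right at the "around or just over 1 watt" the Pandora developers reported, so the component figures are at least internally consistent.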
Re: (Score:3, Interesting)
Wish there was an edit button. :)
Found the link: http://www.gp32x.com/board/index.php?s=&showtopic=48259&view=findpost&p=733993 [gp32x.com]
If interested, you can search the forums for more info, and look up the Palm Pre battery life.
Re: (Score:2)
Yes, I agree Intel won't be able to compete with ARM because, as you rightly point out, ARM is just too well-designed an architecture.
If you read between the lines, though, I expect that the GPU and video decoding/encoding will be competitive if not better due to the fabrication process, and if you notice it says "50X less power consumption at idle," so what I suspect is that as long as you're not doing anything that pushes the CPU you will get OK power consumption overall, but I guess we will have to wait and
Re: (Score:2)
Yeah, Intel will be able to get a 50X power savings at idle, but the real problem is the 2 watts used when talking. That'll kill any phone battery in less than 10 minutes, unless it's one of the old analog bricks that weigh 2 kilos.
Re: (Score:2)
No phone uses the CPU when you talk, it is all handled by the baseband DSP.
Re:Can't wait to (Score:4, Insightful)
good architecture
Don't you mean ludicrously good architecture?
I'm thinking Cortex A8's, which have been out for over a year. Stuff like the OMAP 3530(present in the Beagleboard [beagleboard.org], upcoming Pandora Handheld [openpandora.org], and Palm Pre [slashdot.org]) consumes remarkably small amounts of power.
The Pandora developers said their device consumes around or just over 1 watt. Most of that is from the LCD. They did experiments completely shutting off certain hardware, to measure power consumption, and concluded...
CPU - about 20-40mW
DSP - about 30-60mW
SGX GPU - about 30-60mW
(Hard to get exact measurements due to the nature of how components interact. Anything loading the CPU probably loads up the memory as well. Anything hitting the GPU will hit the CPU, and DSP load varies greatly depending on the codec and video being decoded.)
The entire SoC uses a ludicrously small amount of power; something like 0.2-0.4W. Then add another 0.6W for the LCD, and a bunch more for wireless.
Now, compare that to the current Atoms, with 6+ watts just for the CPU/chipset, another 2+ for the HDD/SSD, at least 6-15W for the LCD, etc...
If any company can drive down their power consumption, Intel can, but that doesn't mean it'll be easy to catch ARM!
I just can't wait for Cortex A9's. Quad-core ARM in the exact same power envelope!
To be fair, the Atom runs at 6 watts max, where average TDP can go down to as little as 0.4W. The problem with Atom, as you say, is all of the other hardware needed to make it work. Its current chipset is incredibly power-hungry, but they're working on that (integrating more and doing even deeper clock gating). Future Atoms will likely use even less power, with Intel already shipping chips with a max 2.4W threshold.
And yes, you are being unfair comparing a device which has a hard drive with hundreds of gigabytes of space and a WXSVGA screen to a handheld device with a couple of gigs of flash memory and a HVGA screen. Nobody's stopping you from making an Atom device with those components (though it will take more power right now, it'll be vastly faster than the Cortex A8, and you won't have to recompile or use highly specialized toolkits, which is a huge Intel advantage).
Re:Can't wait to (Score:5, Interesting)
To be fair, the Atom runs at 6 watts max, where average TDP can go down to as little as 0.4W. The problem with Atom, as you say, is all of the other hardware needed to make it work. Its current chipset is incredibly power-hungry, but they're working on that (integrating more and doing even deeper clock gating). Future Atoms will likely use even less power, with Intel already shipping chips with a max 2.4W threshold.
Right, so if you're actually doing something, you don't get to use your computer as long.
And yes, you are being unfair comparing a device which has a hard drive with hundreds of gigabytes of space and a WXSVGA screen to a handheld device with a couple of gigs of flash memory and a HVGA screen.
The Pandora has dual-SDHC slots, so you could have 64GB of space. (More if bigger SDHC cards were actually made)
Fine, an HDD is unfair, but SSD vs dual-SDHC is a valid comparison. The Eee PCs with SSDs had about 25MB/sec read/write speed. High-end SDHC cards are slightly below that, and you can have two.
Now that better SSDs are available(like the Vertex), it changes things - but the Vertex is also a whole other price range.
Nobody's stopping you from making an Atom device with those components (though it will take more power right now, it'll be vastly faster than the Cortex A8, and you won't have to recompile or use highly specialized toolkits, which is a huge Intel advantage).
Nobody makes x86 programs that work on such tiny screens. I would cite the "highly specialized toolkits" as an advantage for ARM in this case... you will need Linux and those fancy toolkits. Maemo, Android, etc. all work very well on tiny screens.
And by the time Intel has an x86 Atom chip that will work in a fanless tiny device like a Pandora, ARM will have quad-core A9's available, so your point regarding performance is moot...
After all, there is no Atom that will fit in a device that small... yet. I have news for you though - my Phenom II in a cellphone (lol) beats your Atom in a cellphone. ;)
Re: (Score:2)
Nobody's stopping you from making an Atom device with those components (though it will take more power right now, it'll be vastly faster than the Cortex A8, and you won't have to recompile or use highly specialized toolkits, which is a huge Intel advantage).
Actually, go check the benchmarks and power draws on the chips and chipsets again. The Atom is most certainly not vastly faster than the Cortex A8 (particularly for equivalent clock and number of cores), and while Intel may be able to work the power draw down from the tens of watts that the chip + chipset + graphics require right now, that's a much harder task than what ARM has to do putting together the quad core Cortex A9 package that already has extremely low power graphics, etc. Though you do have it th
Re: (Score:2)
If you're already sitting on Linux, it's less of a port and more of a recompile if it's clean code. Seriously.
Re: (Score:1)
The ratio can already be seen getting better. The older 945GC used a max of 22W, while the newer 945GSE tops out at 6W.
Re: (Score:2)
Even then, you're still consuming 2-3 times the juice of the current comparable ARM parts already shipping.
"Vastly faster" is a relative concept, mind. Clock-for-clock, they're showing to be rather close in performance right at the moment. Most of the Cortex-A8 parts are clocked down to 500-600MHz to save further on juice.
Don't get me wrong, Atom's VERY nice (I've got one machine right now, getting more...) but as a smartphone platform, it's not as compelling as ARM is. The only real reason you'd really EV
Re: (Score:1)
Last year it was laughable, with a CPU + Chipset that needed more PCB space than your average brickphone's footprint, never mind other components. They've reduced that for 2011 to something that's still 10x the footprint of an ARM SoC. I can't see Intel getting anything competitive until 2013, but it's not as if ARM is standing still.
Cortex A9 brings out-of-order capability and multi-core (u
Re: (Score:1)
With Linux, a stripped-down window manager (awesome), the screen at ~30% brightness, and Bluetooth, wifi, SD slot and webcam powered down, it idles at 8.1-8.3W. (I think the HDD is spun down at this point.)
Turning the backlight off only gets me down to 7.9W, and if I put the screen at full brightness the entire power load only increases to ~9.2W.
Clearly the screen isn't using the bulk of the power.
Re: (Score:2)
Oh? Interesting.
All I had available for reference was a smaller/older EEE PC.
I assumed the bigger LCDs would use more juice, but it looks like the newer ones may use less.
I take it you measured with a device like the Kill-A-Watt?
Re: (Score:2)
Apple has done some research into this area, and concluded that the best power-saving technique is to ramp the CPU up for complex tasks, then hit idle as soon as you can, rather than dragging out the process. This sounds like what Intel is going for, with their 50x reduction in idle draw.
Re: (Score:2)
Apple has done some research into this area, and concluded that the best power-saving technique is to ramp the CPU up for complex tasks, then hit idle as soon as you can, rather than dragging out the process. This sounds like what Intel is going for, with their 50x reduction in idle draw.
Doesn't really matter. A Cortex A8 isn't that much slower than an Atom. Certainly not enough that the idle savings will offset the load power usage.
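The "race to idle" argument, and the objection to it, both reduce to E = P × t over a fixed window. A sketch with invented power figures shows when each side wins:

```python
# Compare two ways to finish one unit of work within a 10-second window.
# All power figures are invented for illustration.
def energy(active_w, active_s, idle_w, window_s=10.0):
    """Joules consumed over the window: an active burst, then idle."""
    return active_w * active_s + idle_w * (window_s - active_s)

# Race to idle: a 2 W burst for 1 s, then a deep 0.05 W idle.
race = energy(active_w=2.0, active_s=1.0, idle_w=0.05)
# Slow and steady: 0.4 W for the whole 10 s, never idling.
steady = energy(active_w=0.4, active_s=10.0, idle_w=0.0)
print(race, steady)
```

With a deep idle floor the burst strategy consumes less total energy; raise the idle draw, or shrink the load-power gap between the two chips (the A8-vs-Atom objection above), and the advantage evaporates.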
More evidence Intel can't get low enough power (Score:2)
"Even more important, the Pine Trail platform will have a seven-Watt TDP and require an average of just two Watts"
That's after the improvements in an upcoming chip release. The article goes on to say the setup will cost more for Intel to produce. Good luck to them, though; I'm still rooting for the race to the greatest performance out of milliwatts.
Re: (Score:2)
I think I'll just bide my time and wait for a 2 watt ARM quad-core. ;)
Re: (Score:2)
But on a more serious note, Intel will always be able to leverage their advanced fabrication processes to reduce power consumption. Most ARM chips I've seen use older (in terms of desktop CPU) process technology but the good architecture still gives you excellent power consumption.
I think that may actually be on purpose. Moving to a smaller process offers many benefits, such as increased speed and circuit density. However, it also tends to increase the leakage power of a chip. Leakage used to be almost nonexistent; these days, somewhere on the order of 50% of the power dissipation in a chip is just leakage.
For those not familiar with (semiconductor) leakage, here's a quick explanation: Transistors, as they are used in digital logic, have two states; 'on' and 'off'. When you make thos
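The dynamic-versus-leakage split described above can be made concrete with the standard switching-power expression P_dyn = αCV²f plus a constant leakage term; every number below is invented purely for illustration:

```python
# Toy split of chip power into dynamic (switching) and leakage parts.
def chip_power(alpha, cap_f, volts, freq_hz, leak_w):
    """Return (dynamic, leakage, total) power in watts.

    dynamic = alpha * C * V^2 * f; leakage modeled as a constant draw.
    """
    dynamic = alpha * cap_f * volts**2 * freq_hz
    return dynamic, leak_w, dynamic + leak_w

# Invented numbers, roughly in a mobile-CPU ballpark:
dyn, leak, total = chip_power(alpha=0.1, cap_f=1e-9, volts=1.1,
                              freq_hz=6e8, leak_w=0.07)
print(f"dynamic {dyn:.3f} W, leakage {leak:.3f} W "
      f"({100 * leak / total:.0f}% of total)")
```

Shrinking the process lowers C and V, helping the dynamic term, but tends to raise the leakage term, which is how roughly half of a modern chip's dissipation can end up being leakage.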
Re: (Score:1)
Perhaps. But what if your phone had a USB socket (mine already does) and a HDMI socket? Carry your "laptop" around with you everywhere and use it as it should be - a communication device... BUT when you want a bigger screen, find the nearest 1080p panel and bam, big screen. Plug in a keyboard and mouse when you want to use it for office / school work.
In my mind a device the size of a cell phone but as powerful as today's netbooks is something of a holy grail. Make a decent universal cradle so everyone has on
Re: (Score:2)
Re: (Score:3, Funny)
In other words, you own one?
Re: (Score:2)
The iPhone costs $2800. Of course the owners are going to tend to be more affluent. In fact, I'd suggest that if you earn less than $70k and you bought an iPhone for personal use, you're a dumb-ass.
wrong info (Score:3, Informative)
Intel talked in the press release about a 50% reduction, not 50 times...
Re: (Score:3, Informative)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Being tiny is also a major selling point.
The primary reasons for my GF to get one (MSI Wind) were weight and battery life. The other choices were around 1000 euros, which was the price of a 13" laptop capable of handling Vista. But still, the weight was higher and the battery life worse. Same for the ~1000-euro Mac we saw.
Innovation in all market segments (Score:2, Insightful)
"It's a loser mentality to not develop one segment because you're worried about the other," he said. "I think we have several years ahead of us where we can innovate the heck out of any of these categories without getting defensive about the other one. You just need to unleash innovation in all of the segments and see what happens." - Sean Maloney [cnet.com]
It's interesting to see Intel expanding out of their traditional markets and unleashing innovation in every direction. Since they're also staying pretty open about interfaces, people are going to do some pretty amazing stuff with their new products.
Re: (Score:2)
It is, however, fail when a design team attempts to design a product after the CEO and executives have made idiotic, fixed, immutable design requirements for a product that could never be competitive. The blame for this will not go to the CEO and execs; the design team will take the fall. This is the story with the Atom, Larrabee, and the eight-core Nehalem server processors with memory expansion controller chips, nearly all of which seem, to me, to be really stupid ideas. Though who knows, maybe if Intel tw
Re: (Score:1)
This is the story with the Atom, Larrabee, and the eight-core Nehalem server processors with memory expansion controller chips, nearly all of which seem, to me, to be really stupid ideas.
If you were a subscriber here you could see that I predicted all of these things years before they happened. I disagree that they're stupid ideas because to me they're my ideas. Itanium? Let's agree about that. That was a stupid idea that has ripened into a vile stench. Another $10B from Intel and HP isn't going to make this dog profitable.
Like any big company, Intel has various factions that don't necessarily agree with one another. It has to: that's the price of progress. As a whole I think they're
Time Trax (Score:2)
"Intel also plans to introduce an even smaller, less power-hungry version of the chip known as Medfield, which will be built on a 32nm process with a full solution comprising a PCB area of about half the size of a credit card."
Selma [tvacres.com], is that you?
Intel vs. ARM (Score:5, Interesting)
Re: (Score:2)
Pretty simple, really. It doesn't take much looking to find info (on Wikipedia, even) that the next-generation Atom (due out at the end of 2009) will tentatively be a dual-core SoC with an integrated next-gen (for Intel, anyway) GPU.
Xscale? (Score:4, Interesting)
Re: (Score:3, Informative)
Re: (Score:2, Informative)
Stay Focused (Score:1)
I hope the Intel employees don't get too distracted by random visits from USB co-inventor Ajay Bhatt.
The Two Ways Race (Score:1)
Good luck with that (Score:5, Interesting)
And even better, if you're talking about Intel's chips two generations out, then consider the Cortex A9 quad-core chips that are claimed to be ready to go, at reasonable power consumption, in the same time frame as Intel's offering if not sooner. That article is actually claiming dual-core Cortex A9 phones within a year that use about the same power as current chips with much better performance.
So as noted it looks like ARM is going to have a much easier time scaling up performance at the smartphone power draw level than Intel is going to have getting anywhere near it. And the Cortex A9 will probably spank the Atom. The race should benefit everyone though. Maybe we'll actually get some decent performing netbook, laptop, and desktop chips out of it that run on extremely low power.
Replying to own post with link (Score:2)
The current Atoms run about 2 watts, way too much for a smartphone even if they are able to cut that in half, and that's not even counting the power-hog chipsets needed for the Atom that require 5-12+ watts. By comparison, the current Cortex A8 packages with video etc. that are able to do 1080p can make it under the 300-milliwatt line smartphone manufacturers are looking for.
And even better, if you're talking about Intel's chips two generations out, then consider the Cortex A9 quad-core chips that are claimed to be ready to go, at reasonable power consumption, in the same time frame as Intel's offering if not sooner. That article is actually claiming dual-core Cortex A9 phones within a year that use about the same power as current chips with much better performance.
So as noted it looks like ARM is going to have a much easier time scaling up performance at the smartphone power draw level than Intel is going to have getting anywhere near it. And the Cortex A9 will probably spank the Atom. The race should benefit everyone though. Maybe we'll actually get some decent performing netbook, laptop, and desktop chips out of it that run on extremely low power.
http://m.news.com/2166-12_3-10263278-64.html [news.com]
http://www.liliputing.com/tag/arm-cortex-a9 [liliputing.com]
http://www.pcmag.com/article2/0,2817,2341032,00.asp [pcmag.com]
Crap, missed the link the first time. A couple more for good measure.
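The gap between a 300 mW target and a 2 W draw is starkest when converted to runtime: hours = battery watt-hours / draw. A quick sketch (the 5 Wh battery capacity is an assumed round number for a 2009-era smartphone, not a figure from any article):

```python
# Convert constant power draw to runtime on an assumed ~5 Wh phone battery.
BATTERY_WH = 5.0  # assumed capacity, not a measured spec

def runtime_hours(draw_watts):
    """Hours of runtime at a constant draw, ignoring everything else."""
    return BATTERY_WH / draw_watts

for name, watts in [("ARM SoC at the ~300 mW target", 0.3),
                    ("Atom CPU alone at ~2 W", 2.0)]:
    print(f"{name}: {runtime_hours(watts):.1f} h")
```

Under these assumptions the ARM part lasts the better part of a day while the Atom empties the battery in an afternoon, before either chipset or screen is counted.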
Hooray! (Score:1)
Three words come to mind (Score:2)
Re: (Score:1)
WTF?
Hmm... (Score:2)
Push for (potentially) standardized low-power/decent-performance mobile platform that might actually result in a handheld general purpose computer that isn't an iPhone? Yes, please.
(Yes, I know all about the Palm Pre, Blackberries, and others. Quiet, you in the peanut gallery.)
If it doesn't work, a competitive push for other makers (ARM, etc.) to do better? Yes please to that, as well.
If this thing is supposed to be based on x86-ish architecture, though, I wonder how (or if) they've licked the bus and chips
x86 please (Score:1)
I'd love to have an x86 processor powering my smartphone; this way I can run all the amazing x86-only apps and be in synergy with the x86 world. I'd dump my iPhone for one in a heartbeat.
x86 shall prevail! die ARM die!
ARM netbooks (Score:2)
Somehow the flurry of upcoming ARM Cortex-based netbook and MID launches this summer has escaped the Slashdot crowd's attention.
http://www.engadget.com/tag/arm [engadget.com]
Intel is gonna be so dead in this segment.
Re: (Score:2)
I don't think it escaped anyone, really.
It's almost as if we've got Intel or Windows fans posting the "pro" postings.
ARM's already IN this space and nearly as fast per clock as Atom, and in 6-12 months will be nipping at the heels of Core's performance profile with the Cortex-A9, with nearly the same power/performance profile the A8's are already showing to have. This is not saying they're "OMG FAST!" at this stuff- but then, neither is the Atom, really. What I've had the pleasure of seeing was a machine that