Intel Demos Phone and Tablet In New Mobile Chip Push
holy_calamity writes "Intel is making another assault on the mobile processor market, showing off a prototype phone and a tablet that use its newest mobile processor, Medfield. The company claims that products based on the chips will appear in the first half of next year. There's reason to believe that Intel might get somewhere this time. Its chipsets traditionally comprise three separate chips, a design that guzzles power. Medfield introduces an all-in-one chip, mirroring the power-efficient design of the ARM-based chips that run the smartphones and tablets on the market today."
Intel brand fading? (Score:4, Insightful)
Surely, the "Intel Inside" mantra no longer has the presence it did when we were so used to seeing and hearing it in the late 90s and early 2000s. Agree?
Re: (Score:2)
Re: (Score:3)
Dunno, I remember Centrino being a very good mobile processor line back in the day. I'm more surprised they didn't enter the market until now; maybe it's because they've been dominating the desktop market pretty hard? I have a hard time recommending AMD with a straight face nowadays for desktops... I haven't read too much about what's come out in the past few months; I know AMD released something decent, but all they're doing is joining in on the party, not starting one there.
Re:Find a new market! (Score:5, Insightful)
Dunno, I remember Centrino being a very good mobile processor line back in the day. I'm more surprised they didn't enter the market until now; maybe it's because they've been dominating the desktop market pretty hard? I have a hard time recommending AMD with a straight face nowadays for desktops... I haven't read too much about what's come out in the past few months; I know AMD released something decent, but all they're doing is joining in on the party, not starting one there.
They didn't enter the market because they rested on their laurels, as they often do, and also got completely blindsided by how quickly smartphones and tablet computing took over the world. Intel is a great company in many respects, but too often relies on a kick in the pants to get moving. Traditionally, AMD has done the kicking, as it did with x64 and the Athlon, which is why Intel got blindsided when the whipping came from ARM instead. Intel eventually responded to AMD with Centrino, Merom/Conroe/Woodcrest, and later with QuickPath and Nehalem, and AMD is still recovering.
They are finding it harder to do the same with ARM because the two companies are moving in different directions - ARM has an extremely low-power, low-performance architecture while Intel's x86 is extremely high-power and high-performance. Plus, Intel has to carry legacy support into every new "tock," which is why x86 improvement will always remain evolutionary in nature. ARM has also found it much easier to scale up its performance within a similar power envelope, while Intel has found it much harder to scale down its power consumption while maintaining adequate performance.
Atom was probably the first x86 redesign that targeted power consumption first and performance only second. Even with this design goal, it only managed to scale down to single-digit wattage, while ARM operates in the sub-watt to milliwatt range. This is still a crucial difference - it is the difference between the weight and size of a netbook-sized laptop and a handheld device. On top of this, ARM designs have been steadily integrating more and more peripheral chips onto the chip itself while keeping the same power envelope, which makes them even simpler and more attractive to device manufacturers.
Anyway, rambling aside, I suspect that Intel gave up the race for a brief period and instead waited for its manufacturing process to shrink to a level (22nm) where it could finally combine its process-node lead with the Atom architecture to reach the sub-watt power level. It still hasn't got there, but it will - by 2013. Don't count them out, and I say this mainly because Intel is the only surviving company that both designs AND manufactures its own chips. The advantages of this kind of vertical integration are huge. Companies love to talk about outsourcing everything, but there are significant advantages to being vertically integrated as well. To digress slightly, look at how mainframes continue to survive and thrive in this age of commodity computing.
It is also interesting to reflect that this fortuitously coincides with Microsoft's Windows 8 release and Microsoft's own struggle to compete in tablet and handheld computing. Again, their truly credible answer will be Windows 9, if not Windows 8. I suspect that at least on the tablet playing field, Win 8 will be a very credible competitor, and Win 9 will probably merge back almost fully with the x86 architecture. The allure of x86 and its backward compatibility should not be underestimated. Legacy app support is extremely attractive for enterprise IT, even if it is less so for ordinary consumers.
Re: (Score:2)
I have a hard time recommending AMD with a straight face nowadays for desktops...
Sure, they're getting their asses handed to them on the bleeding edge... but what kind of imbecile recommends bleeding edge, anyhow?! A four- or six-core Phenom II coupled with a cheap AM3 board BLOWS THE DOORS off anything from Intel in terms of value...
Re: (Score:3)
The $ value of my time for dev work > the $ of the most expensive hardware; that's an IT philosophy, and anybody who says otherwise is blatantly ignorant (want != get here). My experience isn't based on hard numbers from benchmark tools, but on building and watching people use both Intel and AMD machines at around the same price points at around the same time (within 6 months). The Intel processor rips AMD apart at number crunching; they seem about even on multi-tasking. Intel has Hyper-Threading, doubling the logical cores, allowing
Re: (Score:2)
Hmm, I hate to correct you, but you bought a desktop replacement; the i7 is probably the worst processor on the market for power consumption, and the Extreme Edition pretty much requires a dedicated connector off a rail (not available in laptops). I have an Alienware with an i7, but that's with heavy mobile IT processing in mind, and I carry the cord in the backpack as a result. If I wanted a laptop, I'd buy an ASUS / Toshiba with an i5/i3 (Intel's Turbo Boost or whatever again lets it crunch faster than AMD o
Re: (Score:2)
Centrino (i.e., the Pentium M) was a leaps-and-bounds improvement over the P4 "mobile" chips (I use the word "mobile" very loosely).
I seem to think they started at around 25W TDP and worked down to as little as 5 or 10W with die shrinks and lower voltage cores.
However, ARM stuff is usually much less than that still, and those numbers don't account for chipset and video, which is generally integrated on the ARM devices.
Justify? (Score:4, Informative)
I'm just wondering... do you really believe that the inefficiency you're talking about is tied to an instruction set? Really? That's like saying that people are less efficient if they speak Mandarin as opposed to English, though maybe more efficient if they speak Spanish. There are obviously advantages and disadvantages to each instruction set, but in truth, all a processor really is, is a device that executes a list of instructions.
Now, to compare it to something more intelligent... if you can cope with intelligence... I hope so.
It is possible to write directions to your house for someone. Assuming that you have a starting point, you can provide instructions at multiple levels of detail. You can put a great deal of effort into every minute detail and even overcompensate. One option you have is:
"Try every single road in this town until you find a blue house with a purple roof and a green football flag on the lawn".
The alternative would be
"Take this road to this road. You'll recognize this road because of the gas station at the corner on your left side. Turn right at this intersection." and so forth.
Which is more efficient? It doesn't matter what language or instruction set you write it in... what matters is the quality of the instructions you provide.
Desktop operating systems, which are generally what run on x86 processors, tend to be written by people who know they have access to what feels like unlimited CPU power. The operating systems focus entirely on user experience and not specifically on efficiency. If an OS is designed to be both efficient and usable, there's much to be gained. At this point in time, we're limited to pathetic half-breeds: versions of Linux so covered in band-aids to make them run on small devices that you almost want to cry; Symbian, which wasn't functional but was power efficient; Windows CE, which was the same; a few others; and soon a version of Windows 8 that will be similar to how Linux has been bastardized to run on small devices.
Android has had some of the greatest work done on it to make it more efficient. To compensate for the obvious shortcomings of the Linux kernel here, Android implements itself on top of something similar to Java, which in effect makes it more power efficient. It's not that Java itself or Java programmers are more efficient. It's that by having a virtual machine layer that performs more "traffic control" on the system, power efficiency can be more easily achieved. This would be true of an MSIL or LLVM virtual machine as well. The extra layer means the virtual machine can do things like shut down or decrease the priority of a given virtual processor as a software function, so that software developers don't have to instrument their apps to achieve it.
The next really big difference between Android/iOS and a desktop OS is simple. All the applications written for these OSes are designed to run on telephones or "power-efficient devices." You could in theory put these operating systems on any processor and their power performance would be quite good.
So... while I'd like to hear from you why you think Intel can't do power efficient, I doubt you'd have much to offer other than stupid buzzy wordy kinda snips.
Try learning something
Re: (Score:2)
Now, to compare it to something more intelligent... if you can cope with intelligence... I hope so.
Something snarky this way comes.
Re: (Score:2)
Re:Intel brand fading? (Score:5, Informative)
Disagree! From their 3rd quarter financials...
"Intel managed to exceed analyst predictions, posting record revenue of $14.3 billion -- up $3.2 billion, or 29 percent year-over-year. The company also set new records for microprocessor units shipped, and expects further growth over the next quarter, with notebook computer sales driving $14.7 billion in predicted Q4 revenue."
gasmonso ReligiousFreaks.com [religiousfreaks.com]
Every time I read about a new Intel chip ... (Score:4, Funny)
... I eagerly scan the article to see if the predictions here were true: http://www.tealdragon.net/humor/startrek/power.htm [tealdragon.net]
"It's that miserable 80986 with the 512K bit bus multiplexed down to one pin."
That's so Intel...
"mirroring the power efficient design of the ARM" (Score:5, Interesting)
Re: (Score:2, Interesting)
That's a bit of a cheap shot. Increased component integration has been a driving force for longer than Intel has been a company, and Intel has been as much of a driving force as anybody else. In fact Intel should excel at system-on-a-chip, since it's all about getting lots of transistors on a small piece of silicon, something they happen to be pretty good at.
Except that full system-on-a-chip designs, including x86 versions courtesy of AMD's Geode, have been around for quite a while; Intel has just seemed hesitant to actually go there.
Though I will point out that it's cyclical... at some point, people will start saying, "Hey, why don't we make that bit there modular, so we can upgrade it separately without having to redesign the rest of the chip?"
Re: (Score:2)
Though it's interesting that Intel hasn't done much in the way of SoC until recently - they've traditionally kept the CPU quite separate from support chips. From a fab perspective, this makes a certain amount of sense, since the CPU needs transistors that perform quite differently (voltage, drive, frequency, etc.), and the limited SoC integration present in, say, Sandy Bridge is mixed in performance (good memory and PCIe controllers, mediocre GPU).
The real question with Intel MID SoCs is whether they can dr
Why we might possibly care (Score:5, Informative)
Apart from just rooting for different companies as if they were in a horse race, which seems to be a popular pastime in the press and blogosphere, the summary omits any reason why we might care about Intel's new offering. In what way is it different from the prevailing ARM chip? The answer is buried on page 2 of TFA:
and
So it looks like a bit of incremental leapfrog (if that), not some kind of breakthrough. Meh.
Re:Why we might possibly care (Score:5, Insightful)
The prevailing difference is it isn't an ARM chip. It is an x86 chip, meaning off-the-shelf x86 programs and OSes should run on it. Getting an x86 processor below the performance/power threshold of an ARM chip (while keeping it small enough to fit in a phone) is a pretty major breakthrough.
Re:Why we might possibly care (Score:5, Informative)
Who is going to run current desktop software/OS on a mobile device that has a drastically different spec in other areas (memory, screen size, touchscreen, etc.)?
Intel getting a better performance/power ratio than ARM is a great selling point, but x86 compatibility, especially for off-the-shelf programs, isn't.
Re: (Score:3)
Who is going to run current desktop software/OS on a mobile device that has a drastically different spec in other areas (memory, screen size, touchscreen, etc.)?
It looks like sgt scrub, Baloroth, and Gothmolly so far.
Re: (Score:3, Insightful)
It's not that they are going to run the desktop software unaltered, but that they can recycle most of the code and simply rewrite the GUI rather than rewriting from scratch.
Re: (Score:1)
Because the other portions were written in x86 assembly? I just don't get it.
Re: (Score:1)
No, but just because something is written in a higher-level language does not make it portable, despite that being the original intent of things like C and Java, as well as many other languages.
Re:Why we might possibly care (Score:4, Interesting)
ARM-level performance and power consumption with x86 compatibility would open up a whole new world for netbook-type devices. Think 48-hour battery life with your average 50 Wh laptop 6-cell...
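(For the arithmetic-minded: 50 Wh spread over 48 hours works out to an average system draw of only about 1 W.)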
Re: (Score:2)
Except that Intel doesn't want to improve netbook-type devices because that would cut into their sales of more profitable x86 chips.
Re: (Score:3)
Hmmm, I don't see a problem with charging a premium for this tech. Say the processing power of a mid-level Core 2 Duo (I'm running a P8400 right now, and that's pretty much the sweet spot for mobile computing power as far as I'm concerned), but a 3 W power draw for the whole system running at full tilt, including screen and radios (which is roughly similar to the iPad and other tablets)... with a 100 Wh battery, you'd be looking at over 30 hours of heavy use, and far longer runtimes with more realistic use.
I'd pay upward
Re: (Score:2)
Smartphones and tablets are currently adjacent to (rather than replacing) PCs. But if I could have one tiny portable computer for everything and just plug it into larger peripherals for heavy-duty work, I would.
Re: (Score:2)
Re: (Score:2)
That is Win8 in a nutshell.
I have a feeling (having played with it for several months now) that it will be OK, but that MS will learn a lot on this generation, and Win9 will be the next WinXP Embedded while Win7 will stay mainstream in the enterprise.
-nB
Re: (Score:2)
Who is going to run current desktop software/OS on a mobile device that has a drastically different spec in other areas (memory, screen size, touchscreen, etc.)?
Hence Win8, Unity, Gnome3 etc.
Arguably, x86 support would be most important for Win8, since on Linux you'd just recompile the apps. But there are still some binary-only packages there.
Re: (Score:1)
Mod parent up. Low power x86 = win.
Re: (Score:3)
Intel has tested its reference handset against a handful of the leading phones on sale today. It says these tests show that Medfield offers faster browsing and graphics performance and lower power consumption than the top three, says Smith.
But have they done something like take three of the fastest Android phones out there, benchmark them against Medfield on iOS, and then perform tasks that cause Android's lack of a hardware-accelerated UI to suck the battery and speed out of the chip?
Re: (Score:2)
I imagine they've tested Android on x86, since they've had it running for a while now and that's the main target for 2012. iOS does NOT run on x86 at this point. The stated goal is to double the pace of Moore's law for mobile processors in the next few generations. They have the room to do this for 2-3 more cycles, which would imply intercepting ARM pretty soon.
Re: (Score:2)
Actually, iOS probably does run on x86. The iOS simulator runs on x86, and runs iOS programs compiled for x86.
There's no reason Intel would try to run it on their chip and compare to Android though.
Re: (Score:2)
iOS does NOT run on x86 at this point.
Doesn't it? Funny, I ran the iOS simulator (iOS on x86) on my Mac earlier today.
Re: (Score:2)
Saying that iOS runs on x86 because of the iOS simulator is like saying that Windows runs on PowerPC because of WINE. It's just an implementation of the iOS APIs, not the OS itself. The simulator just makes a set of iOS-like APIs available to Mac applications.
Re: (Score:2)
Re: (Score:2)
Given that the APIs and apps are pretty much the only things that differ between Mac OS and iOS, that's pretty much iOS running on x86. I'm sure it would take *very* little for Apple to provide a full x86 build of iOS to Intel – especially if Intel is sitting there saying "we have chips we could sell you".
Re: (Score:2)
True, a larger concern is that even if existing applications CAN run on x86 with just a recompile, you still need all those developers to recompile their apps. I've got more than a few apps on my phone that are no longer actively updated. I'd lose them all in a switch to x86.
Fundamentally, it comes back to the basic question: why should Apple switch? ARM's licensing fees are cheap, Apple makes their own SoCs (and they've consistently moved in the direction of doing *more* of the design themselves, rather th
Re: (Score:1)
You might possibly care because this will create additional competition in the mobile processor market which is a good thing(tm). Like all competition it will spur innovation on both sides and provide you with better, faster, cheaper products. Unless, you know, you don't like that kind of thing.
Re: (Score:1)
The only reason ARM is more power efficient is that it's a SoC: everything is on one chip, so less electricity gets used traveling around the motherboard and PCI bus.
For years Intel treated Atom like a bastard child, relegating it to the oldest manufacturing lines when they should have put it on their newest processes. They would have killed ARM a long time ago.
Re: (Score:3, Insightful)
This is not true.
ARM is more power efficient because it has a simpler ISA, which requires much less logic to execute. Intel chips take CISC instructions and have a pile of optimization and translation logic to turn those instructions into smaller, RISC-like micro-ops. All of that is necessary just to support a legacy ISA. ARM chips don't have this problem, at least not to the same extreme.
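To make that concrete, here's a rough sketch; the function name is made up and the mnemonics are only representative, not output from any real compiler:

    /* Illustrative only: how one C statement typically lowers on each ISA.
       The mnemonics are representative, not an actual compiler dump. */
    void bump(int *counter)
    {
        *counter += 1;
        /* x86 (CISC): a single read-modify-write instruction with a memory operand,
           e.g.   add dword ptr [rdi], 1
           which the decoder then cracks into load/add/store micro-ops internally.

           ARM (RISC): the load, add, and store are separate architectural instructions,
           e.g.   ldr r1, [r0]
                  add r1, r1, #1
                  str r1, [r0]
           so there is no comparable CISC-to-micro-op translation layer to power. */
    }

The point isn't that one sequence is shorter; it's that the x86 front end has to carry that cracking and optimization logic for every instruction it has ever supported, and that logic burns power whether or not your code needs it.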
Re:Why we might possibly care (Score:4, Informative)
Re: (Score:2)
No, there's a hell of a lot of other reasons ARM is more power efficient - the core designs themselves are fundamentally more efficient than any x86 core I've ever seen.
Re: (Score:3, Insightful)
Apart from just rooting for different companies as if they were in a horse race, which seems to be a popular pastime in the press and blogosphere,
I'm always up for making fun of fanbois. But on reflection, I think rooting for companies is a better pastime than rooting for professional sports. When it comes down to brass tacks, multi-billion-dollar organizations like the NFL, NBA, etc. are nothing more than bread and circuses. At least what these companies do has the potential to make a significant difference in people's daily lives.
Why I care as a developer (Score:4, Insightful)
If you have ever touched the Android SDK, you learn very quickly that doing development on a desktop is rather painful. The main reason is that the dev test environment completely emulates an ARM processor (on top of your desktop x86 system), which is extremely CPU-intensive. If we get Android running on x86 (there are already a number of people out there working towards this), we can then do our testing in an x86-based simulator, which will be much easier on a desktop system.
Re: (Score:2)
This is from an application developer's point of view. The x86 Android emulator would be much faster than the one doing ARM emulation, but is the ARM emulation speed really that much of a hindrance for everyday development?
What about actual application usage by end users? Will the x86 Android phone come with an ARM emulator to run applications that have native ARM libraries (at least until there are enough generic or x86-specific Android apps)?
Re: (Score:2)
I imagine that, since Google is partnering with Intel on this, we'll have a sort of fat binary like during the Mac's transition era from PowerPC to x86. An added benefit would be that developers stick to bytecode libraries if they want to be able to run anywhere, and everyone will benefit from the added portability.
Re: (Score:2)
What about the actual application usage by end-users? Will the x86 Android phone come with an ARM emulator to run applications that has native ARM libraries (at least until there are enough generic or x86-specific Android apps)?
Android apps run on a platform-independent, Java-esque virtual machine. All that would need porting is Dalvik, not the individual apps.
Re: (Score:2)
Unless that app was written with the Android NDK, or links to a library built with the NDK. Then it's running native code, and is not platform independent.
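For example (hypothetical package and function names; only jni.h and the JNI calling convention are real), this is the kind of NDK-built code that ends up in a per-ABI shared library:

    /* Hypothetical NDK-built JNI function. It gets compiled into a per-ABI shared
       library (e.g. libnative-demo.so), so an APK that only bundles an ARM build
       of it cannot run this part on an x86 Android device without a rebuild. */
    #include <jni.h>

    JNIEXPORT jint JNICALL
    Java_com_example_demo_NativeDemo_fibonacci(JNIEnv *env, jclass clazz, jint n)
    {
        jint a = 0, b = 1;
        for (jint i = 0; i < n; i++) {
            jint next = a + b;
            a = b;
            b = next;
        }
        return a;
    }

The Dalvik side of such an app ports for free; this bit doesn't.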
Re: (Score:2)
Android-on-ARM development on an ARM-based laptop would give the best of both worlds. Remember that ARM had no motivation to build high-performance chips until recently. The ARM11 was fast enough for almost all phone and embedded applications. Within two years there will be eight-core, 64-bit SoCs with high-performance GPGPUs.
Re: (Score:2)
If we get Android running on x86 (there are already a number of people out there working towards this), we can then do our testing in an x86-based simulator, which will be much easier on a desktop system.
Very interesting comment. It might be possible to create a VServer instance and install the SDK. There wouldn't be any emulation needed.
Re: (Score:3)
But why use the emulator when you can debug directly on a physical device attached to the PC's USB port? It works much better and is easier to test with, as some sensors cannot be emulated easily. Use a physical device for development and the emulator for automated unit testing.
Re: (Score:2)
If we get Android running on x86 (there are already a number of people out there working towards this)
Actually, I am running Android on my x86 tablet [android-x86.org]. The speakers and touch screen don't work, so you have to plug in a mouse, and it's unstable, but it does run already.
Re: (Score:1)
Or, alternatively, we need an ARM desktop machine.
Re: (Score:1)
Apart from just rooting for different companies as if they were in a horse race, which seems to be a popular pastime in the press and blogosphere, the summary omits any reason why we might care about Intel's new offering. In what way is it different from the prevailing ARM chip? The answer is buried on page 2 of TFA:
and
So it looks like a bit of incremental leapfrog (if that), not some kind of breakthrough. Meh.
The statement about faster graphics is bullshit. ARM doesn't make GPGPUs; ImgTec does, and sorry, but Intel's OpenCL and OpenGL graphics are nowhere near ImgTec's Series 6 tech, the 600 series, with its full OpenCL 1.1/2.x and OpenGL 3.2 support. That's right up there with Intel claiming its graphics support compares to Nvidia and AMD. It's a crock of shit.
as the article says (Score:2)
Re: (Score:2)
Today: 1.5GHz 45nm dual-core Cortex A9 or 1.3GHz 40nm quad-core Cortex A9
3-6 months: 2.0GHz 32nm Cortex A15.
Samsung expects to get that out in 2012Q2, which would be in less than 6 months. Inside of 2012, we should also see 28nm Cortex A15, as well as quad-core parts.
So, the problem is that Intel is touting these performance advantages for its next-gen part but comparing it to current-gen ARM stuff. An accurate comparison would be between the next-gen Atom and the next-gen ARM, since that's what it'l
Maybe we will see Tizen on this . . . (Score:2)
Since Intel and Samsung are the driving forces behind Tizen, a new "open" Linux project for phones and tablets, maybe we will see Tizen running on this processor next year? When I say "open" I mean as in door: full access without jailbreaking. Although the details about Tizen are still murky, at best.
Or maybe Mer, the folks picking up where MeeGo left off, could use this. But they need to get a hardware manufacturer on board.
Re: (Score:2)
Intel was also the driving force behind MeeGo and Moblin and Maemo, and lost interest in all of them.
Intel is terrible at seeing software through to completion, other than its compiler.
Re: (Score:2)
Err, no. Intel has continued to be a driving force behind them, despite the setbacks. Moblin merged with Maemo because Nokia approached them with the goal of creating something that wasn't so tied to one vendor, and the result was MeeGo. Then Nokia went all-in with Microsoft, damaging the relationship and catching MeeGo in the middle. So Intel has gone off looking for another partner and found Samsung.
And since it looks unlikely that Samsung is going to drop everything and go Microsoft-only, I doubt that we'll se
Re: (Score:2)
should read
But what OS will it run? (Score:2)
If it's Windows 8, penetration may be in the single digits. In the tablet marketplace, Microsoft, and by extension Intel-based processors, are not major players. It's not just the power consumption. I've heard a rumor that Android will be ported to x86, but how will that work, I wonder? Would there be a separate marketplace? Development requirements to compile and test on two different architectures?
Re: (Score:1)
I've heard a rumor that Android would be ported to X86, but how will that work, I wonder?
Did you read TFA?
From page one of the article:
The phone prototype seen by Technology Review was similar in dimensions to the iPhone 4 but noticeably lighter, probably because the case was made with more plastic and less glass and metal. It was running the version of Google's operating system shipping with most Android phones today, known as Gingerbread; a newer version, Ice Cream Sandwich, was released by Google only about a month ago.
From page two of the article:
Intel's reference tablet, which used the same Medfield chip as the phone, was running the latest version of Android, Ice Cream Sandwich. It had a slightly larger screen than the iPad 2 but was about the same in thickness and weight. A limited trial suggested that it was noticeably nicer to use than older tablets based on the abandoned Honeycomb version of Android.
Re: (Score:2)
Re: (Score:2)
I can tell you Android runs on x86 already [android-x86.org], albeit in an unstable state. I expect with Intel behind this, things will develop faster. Regarding the compatibility issues, Android is bytecode with only specific libraries compiled natively, and they're being ported. I imagine we'll see some sort of fat binary support for both architectures on the Android Market.
Re: (Score:2)
Google TV currently runs on Atom (x86) hardware. I'm guessing the main issues for broader x86 support are drivers and performance.
Re: (Score:2)
> I think both Microsoft and Intel are working to stay relevant in a future of one device / OS that does it all.
Yeah, that's the thing I don't want any part of at all, and I don't think I'm alone. M$ is pretty much a given on the PC, but Winders on every other platform has been, let's say, an acquired taste. I had to live through Windows Mobile 5 and 6 on a phone, and it sucketh mightily. I've worked with Windows Phone 7, and it's just not relevant and not anything I'd ever want to own. I've had the
Re: (Score:2)
iOS apps virtually always compile for both x86 and ARM, with no distinction. Why shouldn't Google be able to do the same thing?
Re: (Score:2)
I have no idea why Google shouldn't be able to do the same thing. But before I touch an Intel tablet, I'd want to know what I'm getting into. If it isn't completely interoperable with ARM devices in the Android universe, it's not interesting. And Windows 8 is a complete non-starter.
Re: (Score:2)
Microsoft, and by extension Intel based processors, are not major players. It's not just the power consumption.
Yes, it's the fact that Windows was simply not designed for a touch-centric mode of use. Not even the various "Tablet PC" editions, which really required a stylus to be useful.
However, the whole point of Win8 is that it is designed for touch - that's what the new Metro stuff is all about. And yet it will still run old x86 apps if you need them. And if it'll also have a 12-hour battery life, like Transformer Prime does? pray tell, why wouldn't it be a major player?
Re: (Score:2)
> However, the whole point of Win8 is that it is designed for touch - that's what the new Metro stuff is all about. And yet it will still run old x86 apps if you need them. And if it'll also have a 12-hour battery life, like Transformer Prime does? pray tell, why wouldn't it be a major player?
I've looked at Metro, and it looks like repurposed Windows 7 Mobile and Windows Media Center stuff. Indications are, Windows 8 was a small amount of "designed for" and a much larger amount of "rebrand and reuse the
Re: (Score:2)
Something to think about: The various *nix solutions tend to fall into a pattern -- a GUI for KVM (OSX, Ubuntu) and a different GUI for touch devices (iOS, Android). This is because touch is a whole different paradigm from KVM and trying to make one GUI do both leads to incompatible design decisions. Mind you, the underlying OS can be the same (if it's decent, and supports resources required by both paradigms). But the actual interface design choices are radically different. In the past, Microsoft has
Re: (Score:2)
Something to think about: The various *nix solutions tend to fall into a pattern -- a GUI for KVM (OSX, Ubuntu) and a different GUI for touch devices (iOS, Android).
True. This is exactly what Win8 does, too, except it lets you switch between two modes as you go.
What I see in Windows 8 is not so much a different interface as a different marketing campaign.
Are you seriously claiming that Win8 Metro is "not so much a different interface" compared to classic desktop?
Re: (Score:2)
> True. This is exactly what Win8 does, too, except it lets you switch between two modes as you go.
Re: (Score:2)
...which means that as a practical matter you'll be able to play music from the Metro interface but will always have to drop back into the KVM interface to get work done.
The same goes for iPad and Android, except that you have to "drop back" to a different device in that case.
Your basic argument is flawed from the get-go because you assume that Metro has something in common with WMC. Not only does it not, but the far more important point is that it's extensible in the same way iOS is - you install Metro apps that let you do the things you need to get done.
Microsoft, on the other hand, appears to see the touch interface as an additional feature that might control simple things
Not true either. If anything, the classic desktop in Win8 feels more like an "additional feature", what with Start menu's replacemen
Re: (Score:2)
>> ...which means that as a practical matter you'll be able to play music from the Metro interface but will always have to drop back into the KVM interface to get work done.
> The same goes for iPad and Android, except that you have to "drop back" to a different device in that case.
At the moment. Adobe has some ports coming out that promise to sever my last connection to the PC. And I'll let you in on a secret -- I don't own a tablet yet, of any kind. I own an Android phone because Microsoft make
Battery life? (Score:2)
When someone shows it has battery life comparable to the current dual-core ARM A9 SoCs, then they will have something to talk about. Until then, it's just a PR pipe dream.
Fifth time? (Score:2)
How many times has Intel tried to compete against RISC?
First, there was the iAPX 432 [wikipedia.org]. I never saw any use of it.
Then there was the i80860 [wikipedia.org], today remembered for being the demonstration vehicle for Microsoft's OS/2 3.0 NT.
The next try, the i80960 [wikipedia.org], was actually successful — in printers, network and I/O controllers.
Then we had Merced, later named Itanium [wikipedia.org], AKA Itanic, the biggest flop around.
Intel actually ditched the perfectly fine StrongARM [wikipedia.org] and Alpha [wikipedia.org] architectures it bought
Re: (Score:2)
How many times has Intel tried to compete against RISC?
[...]
Forgive me, but colour me skeptic this time around.
It's true that Intel hasn't achieved great success with its own RISC designs, but what about the times that Intel competed using its CISC designs against:
It's also worth noting that all of the modern ARM-based SoCs that Medfield will compete against are CISC designs, not RISC, so I guess my list doesn't even matter :-/
Re:Fifth time? (Score:2)
> It's true that Intel hasn't achieved great success with its own RISC designs, but what about the times that Intel competed using its CISC designs against:
[]
> It's also worth noting that all of the modern ARM-based SoCs that Medfield will compete against are CISC designs, not RISC, so I guess my list doesn't even matter :-/
Yes, but all of these were hampered on the desktop by the prevalence of binary, proprietary software. While binary, proprietary software also dominates the mobile market, there it is compiled against iOS and Android, where it is Intel, not RISC, that fights an uphill battle.
That, and talking about a process-derived advantage in a not-yet-there product is easy. Most probably ARM (and MIPS) will already be there if and when Intel hits the shelves.
Now, how is ARM CISC? Last time
Re: (Score:2)
Now, how is ARM CISC? Last time I checked, it stood for Advanced RISC Machines; has technology subverted the acronym?
ARM chips since ARMv7 have supported the Thumb-2 instruction set, which has 32-bit instructions with CISC-like features, such as making an optional left shift available to most instructions and allowing each comparison to be followed by up to four conditionally executed instructions. It's what most JITs and compilers target now, IIRC.
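For example (a made-up function; the assembly in the comments is just the sort of Thumb-2 sequence a compiler may emit, not a real dump):

    /* Illustrative C, annotated with the kind of Thumb-2 a compiler may emit. */
    int clamp_add(int x, int y)
    {
        int sum = x + (y << 2);   /* the shift folds into the add:
                                        add  r0, r0, r1, lsl #2        */
        if (sum < 0)              /* compare, then conditionally executed
                                     instructions via an IT block:
                                        cmp  r0, #0
                                        it   lt
                                        movlt r0, #0                   */
            sum = 0;
        return sum;
    }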
While binary, proprietary software also dominates the mobile market, there it is compiled against iOS and Android, where it is Intel, not RISC, that fights an uphill battle.
It's absolutely true that it's Intel who must fight the uphill battle here. The fact that many Android applications are compiled to DEX, and the emergence of HTML5 runtimes, offer some relief. I stil
Re: (Score:2)
ARM chips since ARMv7 have supported the Thumb-2 instruction set, which has 32-bit instructions with CISC-like features, such as making an optional left shift available to most instructions and allowing each comparison to be followed by up to four conditionally executed instructions. It's what most JITs and compilers target now, IIRC.
That is quite different from being a CISC processor; in the mobile market, being able to choose some CISC-style instructions without carrying the full CISC legacy is an advantage. And ARM chose the Thumb-2 instructions not to go back to CISC, but to achieve a code density which is a further advantage over anything Intel can do without throwing out its only real competitive advantage, which is that people are familiar with the x86 instruction set. Not that it buys