Why Mobile Innovation Outpaces PC Innovation 231

Posted by Soulskill
from the pc-aren't-trendy dept.
Sandrina sends in an opinion piece from TechCrunch that discusses why mobile systems are developing so much faster than the PC market. The article blames Intel for the stagnation of PC hardware innovation, and points out how much more competitive the component vendor market is for smartphones. Quoting: "In PCs, Intel dictates the pace of hardware releases — OEMs essentially wait for CPU updates, then differentiate through inventory control, channel / distribution and branding. Intel and Microsoft win no matter which PC makers excel — they literally don't care if it's Asus, Dell or HP. In the smartphone world, it's the opposite. Dozens of component vendors fight each other to the death to win designs at smartphone OEMs. This competitive dynamic forms an entirely different basis for how component vendors approach system integration and support. Consider Infineon, which supplies the 3G wireless chipset in the iPhone. In order to stay in Apple's graces, Infineon must do everything necessary to help the hardware and software play well together, including staffing permanent engineers in Cupertino or sending a team overnight from Germany. Do you think Intel does this for Dell?"
This discussion has been archived. No new comments can be posted.


  • by symbolset (646467) on Monday June 21, 2010 @01:07PM (#32643832) Journal

    It's called cannibalization. When there's an established monopoly any possible invention "cannibalizes" the markets of established product groups and must be suppressed. It takes a long time because monopoly is tremendously profitable, but ultimately this is a stagnant path that goes extinct in much the same form as it existed when it achieved monopoly.

  • It's About Time (Score:5, Insightful)

    by WrongSizeGlass (838941) on Monday June 21, 2010 @01:14PM (#32643918)
    Mobile innovation is outpacing desktop innovation because desktop innovation has been going on for 20+ years and mobile innovation has been stuck in its infancy for too long.
  • by Animats (122034) on Monday June 21, 2010 @01:16PM (#32643930) Homepage

    If Intel was holding everyone back with your proposed CPU and Chipset conspiracy, don't you think that would just prime the market for AMD to pair up with VIA or someone and just wreck Intel?

    AMD tried hard. They introduced 64-bit x86-compatible CPUs, and Microsoft wouldn't support them until Intel caught up. On the other hand, Microsoft supported the Itanium until 2004.

  • Good Enough (Score:3, Insightful)

    by Fuseboy (414663) on Monday June 21, 2010 @01:17PM (#32643950) Homepage

    The PC isn't innovating because it doesn't need to - it's already perceived as "good enough" by its users. Advances in computing power generally get absorbed by the ever-increasing needs of the OS and office applications. Smart phones, on the other hand, are so constrained by their form factor and their tiny user interface that innovations in UI, usability, battery life, etc. are very meaningful. Merely making a different set of trade-offs can produce real wins.

  • Easy answer (Score:3, Insightful)

    by VincenzoRomano (881055) on Monday June 21, 2010 @01:17PM (#32643952) Homepage Journal
    Because there's more money! In the handsets first (look how much the iPhone 4 will cost!), then in voice services and texting, and finally in data plans.
    Are you really able to check the bills they send you?
    Are you really willing to?
    Or do you simply PAY?
    This is why!
  • by pwilli (1102893) on Monday June 21, 2010 @01:18PM (#32643964)
    Because PCs have a headstart of decades?

    It's like asking why China can have growth rates of over 10% while "Western" countries only get 1-3%. It is very hard to improve if you're already close to technical and physical limits, and any improvement you do make won't look as impressive. Handhelds will soon enough hit the same walls that desktop systems are currently trying to tear down.
  • by $RANDOMLUSER (804576) on Monday June 21, 2010 @01:19PM (#32643996)
    The "dominance" is the x86 instruction set. Intel and Microsoft have locked us in; AMD is just a second source for chips that use that instruction set.
  • by Anonymous Coward on Monday June 21, 2010 @01:21PM (#32644012)
    Also, weren't all those "Vista Ready", "Vista Capable", and "Vista capable but not really" stickers Microsoft helping Intel because its hardware was not ready to handle Aero Glass?
  • by petes_PoV (912422) on Monday June 21, 2010 @01:27PM (#32644080)
    Before IBM created the standard platform there was a plethora of competing chips, architectures, "operating system" approaches, price-points and failures. The phone market is in the same situation now. As soon as some manufacturer starts to dominate and everything becomes standardised, two things will happen: the software will become much more important, and the hardware will start the spiral down to commodity status.

    The car market has gone the same way - they all look pretty much the same, dictated by the laws of aerodynamics. It means that other features have been developed to differentiate - things like economy, safety, electronics. While this is not necessarily good for the manufacturers - the number of players shrinks as the market consolidates - it is good for the consumers. So it will be with phones (or whatever they evolve into; they're the equivalent of an Atari, today). We have yet to see the major benefits emerge, despite what Apple may tell us.

  • Re:It's About Time (Score:3, Insightful)

    by petes_PoV (912422) on Monday June 21, 2010 @01:30PM (#32644114)
    Mobiles have been around for over 20 years. I got my first one in 1988 and they *have* come a long way since. However, unlike PCs, mobile phones have always been more restricted by size and battery capacity. Constraints that never applied to PCs.
  • by Anonymous Coward on Monday June 21, 2010 @01:31PM (#32644132)

    Incorrect.

    AMD introduced a hybrid 64-bit/32-bit CPU as a competitor to the Itanium in the server market. Opterons were, and still are, quite successful in that market, especially with the new G34 socket and 12-core processors (up to 48 cores per server and no tier-BS: all processors can run in 1-4 socket SMP configurations). Microsoft viewed AMD's technology as *superior* to Itanium's because it allowed for seamless migration from a 32-bit to a 64-bit platform. Microsoft essentially *told* Intel that it would only support *one* 64-bit instruction set, and that would be AMD's. Intel had no choice but to incorporate AMD's instruction set into its processors.

    Microsoft doesn't care whether AMD or Intel catch up to each other, as long as their software runs on those processors. They didn't "wait" for Intel to catch up. It simply took many years to migrate Windows from 32-bit code to 64-bit-clean code. There was XP 64-bit, but how many people used that? Hell, lots of people didn't even get 64-bit Vista because of the perception that if you don't use more than 4G of RAM you don't need it. Actually, all modern machines should be running a 64-bit OS only - simplified address space management and the increased register count make it a no-brainer.

    If you want an example of a company that still fails and fails hard at 64-bit software, it would be Adobe. They recently dropped support of the 64-bit plugin. Not sure, maybe they are still "waiting for Intel to catch up"?

  • by SQLGuru (980662) on Monday June 21, 2010 @01:32PM (#32644148) Journal

    Don't forget that the mobile market gets to take advantage of knowledge and research done for the server/desktop market. Sure, there's new tech going on in there, but it's the whole trickle-down approach, too. The mobile market is *catching up* to the desktop market, so there's a lot of acceleration just from using all of the prior knowledge. Building multi-core processors isn't easy, and how many mobile phones do you know of that are sporting them? Zero that I know of. And what about Intel's Turbo Boost (shutting down cores and overclocking the remaining ones when fewer are needed)? How long do you think it will be before a mobile phone has that technology?

    The innovation in a lagging area (mobile) seems faster only because the innovation has already been researched in the leading area (servers first and consumer second). It takes longer to figure out something the first time than it does to figure out how to make it "smaller" (smaller in the sense that it is for the mobile market, it may be a smaller die footprint or power footprint or whatever).

  • Re:It's About Time (Score:5, Insightful)

    by CAIMLAS (41445) on Monday June 21, 2010 @01:32PM (#32644154) Homepage

    Of course, TFA completely overlooks Intel's newer Moorestown line of Atom processors.

    It also ignores the fact that cell phones are a throw-away market. There isn't nearly the 'data lock-in' that the x86 architecture has. Where smartphones can have their software sized to the hardware, Intel (and AMD) are forced to size to the software. Not only does this limit what Intel can do, it limits how fast they can do it.

  • by Tridus (79566) on Monday June 21, 2010 @01:40PM (#32644278) Homepage

    "Mobile" in terms of dumb phones actually isn't moving very quickly. Dumb phones have existed for a couple of decades, and strictly speaking call quality was better in the 90s than it is today. In terms of voice in remote places and durability, every phone on the market today is straight up worse than the Nokia 6160 I had 10 years ago. Voice is more of an afterthought these days.

    The smartphone market on the other hand is pretty young, and is acting like a new market with rapid improvements and cut throat competition. It's also a market subject to fashion trends and full of users who will change phones as often as their contracts allow, which really isn't the case in say the PC market (where average users will buy a new computer when the old one dies and these days even gamers don't need frequent upgrades like they used to).

  • by vlueboy (1799360) on Monday June 21, 2010 @01:50PM (#32644388)

    Try your statement again on a Bean counter test (TM):

    Hell, lots of people didn't even get 64-bit Vista because of perception that if you don't use more than 4G of RAM you don't need it.

    Bean counter: Alright! Since we skipped Vista, none of our corporate PCs ever needed even 3GB. Money saved!

    Actually, all modern machines should be running 64-bit OS only

    Bean counter: Tell me more and I'll put in an order so we can stay competitive in this "modern" market. I'm curious.

    simplified address space management

    Bean counter: Huh?

    and increased register count

    Bean counter: Useless. More technobabble that only programmers need. I'll recommend keeping XP on our single-core Pentium 4. I'll also get a raise for saving the PHB a ton on this year's budget.

    makes it a no-brainer.

    Bean counter: I fully agree. I'll even grin all the way to the bank!

  • by cheesybagel (670288) on Monday June 21, 2010 @01:53PM (#32644414)
    SPARC didn't get scuttled because of Itanium. Sun merely bungled up enough times with chip design that they did not have much of a product to compete with Itanium. UltraSPARC V was late, buggy, and canned. Rock, about the same thing. They managed to finish Niagara, but Niagara was mostly good for low end boxes which did web serving: it has lots of threads for doing integer processing, but lousy floating point, and lousy single threaded performance.

    Sun fumbled so much with SPARC chip design they had to ask Fujitsu to sell them their SPARC64 IV processors, so they could actually have a high end SPARC server product to sell.

  • by turbidostato (878842) on Monday June 21, 2010 @01:55PM (#32644430)

    "The "dominance" is the x86 instruction set."

    And the "dominance" behind that "dominance factor" is that it's 30-year-old, mature, established technology.

    Oh, well, why don't we see much innovation in the VHS world? Companies should be urged on! VHS is not only stagnating, it's even disappearing!

  • Re:Chip juggling (Score:1, Insightful)

    by Anonymous Coward on Monday June 21, 2010 @02:08PM (#32644592)

    Enormous chunks of Windows are not written in x86 assembly code. The NT kernel [wikipedia.org] was written from the start [wikipedia.org] to be portable across architectures.

  • by petes_PoV (912422) on Monday June 21, 2010 @02:21PM (#32644772)

    Laws of marketing, definitely not laws of aerodynamics.

    The biggest driver in car design since the oil crises of the 70s has been miles per gallon. That has improved engine technology and made car shapes more slippery. There's only one way to reduce drag: be aerodynamically efficient. And there are only a small number of solutions to the laws of laminar flow. That's why all cars look the same.

  • by icebraining (1313345) on Monday June 21, 2010 @02:23PM (#32644804) Homepage

    lots of people didn't even get 64-bit Vista because of perception that if you don't use more than 4G of RAM you don't need it. Actually, all modern machines should be running 64-bit OS only - simplified address space management and increased register count makes it a no-brainer.

    And they're right. Those are fine technical arguments, but the end result is the same. The performance gain is negligible, and you run into compatibility problems like the aforementioned Adobe plugins.
    I just switched to 64-bit on my AMD Neo with 2GB of RAM and I haven't noticed any improvement whatsoever.

  • by improfane (855034) on Monday June 21, 2010 @02:25PM (#32644840) Journal

    RSI sufferers would disagree. I love my trackball and recommend it to anyone. Seriously, use one, you won't want to go back to a mouse.

    It might not be that common as it's a niche. Many disabled people need them too.

  • by sexconker (1179573) on Monday June 21, 2010 @02:28PM (#32644868)

    Because PCs sit at home while mobile devices, being mobile, get trotted out in public. They are a fashion accessory and fucktards will pay gobs and gobs of money, every fucking year, for useless, backwards shit and not give a crap about the actual good shit.

    Because when you start from zero, you've got nowhere to go but up. All the useful innovation is simply copied over from the PC realm when mobile devices can handle it (size, performance, battery).

    Because when you've got a "new" market, there is no status quo with regards to who owns that shit. Companies will scramble to get a good seat as a top supplier. When the smaller players get wiped out, the big players will become Intel & MS & IBM - actively attacking innovation until someone scrapes up enough money (debt) and effort (3rd-world / open-source slaves) to challenge them. The small players will enjoy some success until the big players finally react with the budget of a million SUNs and make them irrelevant again.

    There's something I don't get, though: Who are the fucks that see innovation in the market? I see shitty devices that can't do a tenth of the shit that my PC can, and I see them following the same pattern as PCs did decades ago, for better (features, performance) and worse (players getting big, competitors dying off, innovation being choked). There is no innovation here. Everything has been completely predictable, completely shitty compared to existing offerings, and completely expensive.

  • Re:Good Enough (Score:3, Insightful)

    by Hatta (162192) on Monday June 21, 2010 @02:42PM (#32645058) Journal

    That's a software problem, not a hardware problem. And to the extent that it can be blamed on hardware, there's better hardware available to fix it. Multiple cores enable you to do many things at once without slowing any of them down to an appreciable extent. SSDs allow you to drastically reduce load times for your applications. But in the end, if you want a responsive system you need to use software that's designed for responsiveness.

  • by Rene S. Hollan (1943) on Monday June 21, 2010 @02:46PM (#32645094)

    I know, I could write that every decade or so.

    When I started with computers, processing audio was hard and clunky, and video unheard of. But, increasingly, non-computer devices are getting more intelligent (in terms of really being computers under the hood), to the point where they look and feel like computers, with different peripherals.

    When I first viewed video on a computer monitor, it was clunky, and in a window. Even in full-screen mode, one would eventually escape back to the windowing UI, which made the TV stop looking like a TV and start looking like a computer. 10-foot interfaces have changed all this, of course. And yet, if one does want to switch from a video entertainment "mode" to an "internet browsing" mode to view YouTube videos, for example, the computer UI looks normal and not out of place. We are getting used to the browser being our interface to the world around us.

    The point is that computers are becoming ubiquitous. From TVs to phones, to ebook readers, to netbooks, and iPads, we are using computers to present content as well as organize it. If I were to desire a "universal" remote control, I would seriously consider a netbook for the purpose, because it could add so much more functionality over a universal "remote", and actually costs less than many of them! Why we still have 38 kHz IR remote controls instead of web-based UIs available over 802.11b/g/n escapes me, but I am sure that will start to change with the first "networked" remote, and "IR hubs" with 802.11b/g/n in and IR blasters out for legacy equipment. Why can't I use my smartphone as a remote? Oh wait! I can!

    Just look at how UPnP has shaken out into DLNA-based equipment.

    I just retired a 400-disc CD/DVD changer and replaced it with a MythTV box. I had tried that before, but with false starts, and things weren't smooth enough to really retire the changer. Now, the MythTV box is quiet enough, and powerful enough, to make the thought of actually handling media for anything more than one-off playback archaic.

    Look at HDMI, at least in its latest incarnations. Not only does it integrate uncompressed video and audio in a single cable, 100 Mb/s data-link-layer Ethernet and S/PDIF "back channels" are included. Literally, "one cable to link them all". And it's not an expensive interface found only on high-end equipment: it is becoming the standard for computer monitors and televisions (the difference really becoming blurred).

    So, certainly because of competition and "technology catch-up", phones and consumer electronics are evolving at a dizzying pace, whereas computers have stagnated. But perhaps we've reached the point where computers already do everything we want them to: compute, process, store, and retrieve data. As far as presentation of entertainment content goes, a traditional computer offers little more than storage and second-rate display: it is non-portable, and its display and audio capabilities are poor compared to the alternatives: smaller displays but complete mobility in phones, netbooks, and iPads, and massive displays in flat-screen TVs. And these are the areas where we are seeing advances.

  • Re:Good Enough (Score:3, Insightful)

    by nyctopterus (717502) on Monday June 21, 2010 @02:54PM (#32645186) Homepage

    It's both, I think. Sure, you can argue that there are better hardware components around, but the reality is that, as sold, most hardware packages are contributing to the problem. My iMac here, for example, has all the processing power I need, but clearly has an I/O bottleneck. The processor mostly sits idle and the RAM unused while the disk grinds. Yes, an SSD would improve the situation, but it wasn't sold with one. The dual core was a disappointment; I thought it would drastically improve multitasking, but it's not noticeably better than multitasking on a single-core G4 (loaded with software from its day).

    I guess my point is that hardware needs to be better balanced. Yeah you can do this yourself, but eh.

  • by Anonymous Coward on Monday June 21, 2010 @04:27PM (#32646308)

    Yes and No.
    DEC made the same mistake that Intel repeated (much later) with the Itanium: there _must_ be software, lots of it.

    It's not enough to just say "oh we've got $some unix$ and perhaps a special NT version coming up, you can always just build your own application stacks, yadda yadda".

    There _must_ be a web server, a database, a file server, a mail server, CAD applications, audio engineering, video post-production, circuit capture, compilers, engineering tools, mathematics tools (symbolic and numerical), etc.

    The arrogance of chip firms never ceases to surprise me. If a [not a drop-in replacement] chip isn't _radically_ faster/cheaper/more featureful than what's already out there, what OEM or end-user in their right mind would sign up to rewrite/redeploy/repurchase the needed software and IP?

  • by Zaphod The 42nd (1205578) on Monday June 21, 2010 @07:41PM (#32648098)
    I understand and agree, but if you use Linux and don't give a flying flip about Windows and hate everything that windows restricts (yeah, I hear you, and I agree) then what the hell do you care about how M$ handles its drivers? Trying to find things to complain about that don't even bother you? :P
  • by hazydave (96747) on Wednesday June 23, 2010 @10:22AM (#32665458)

    The mobile market was pretty boring until recently. One Blackberry was pretty much like another, same with Palm and Microsoft WinCE/PocketPC/WinMo.

    It was really Apple legitimizing the "Consumer Smart Phone" that's got everyone out there now scrambling for position in this space. Which, curiously, is exactly what happened in the 70s, 80s, and into the 90s in the world of personal computers. Back in the 70s, there were dozens of companies making proprietary hardware, operating systems, etc. You could have something come along, like the Apple Macintosh or the Commodore Amiga, that entirely changed the market in one shot.

    Since then, PCs have more or less grown up. The level of complexity is such that it's very difficult to do anything interesting at the system level... it has to be part of a new chip design. That raises the risk threshold significantly, as well as the time between new generations of CPU, GPU, or PC system chips and architectures. Even Intel is slow-moving on these things. As a result, most of the stuff that gets called "innovative" in the PC marketplace is little more than "same old, same old" in fancy casework (Apple), or increasingly small incremental improvements on what was pretty damn fine last year (Intel, AMD, nVidia, etc).

    The powers that be are pretty settled... Intel rules in CPUs, and is only likely to move that forward fast enough to keep AMD stumbling along... they don't benefit from delivering new CPU technology any faster. This summer's $1000 CPU becomes next year's $200 bargain, but that only works if they can make a suitable replacement by next year. Without sufficient challenge, it's actually best for the company to keep this pace at something they can optimize... one reason why the kinds of parts shortages we used to see, say, around the 1GHz mark, rarely if ever occur these days.

    Software too... we're so used to waiting years for Microsoft to properly support new hardware standards (USB, Firewire, AGP, 32-bit, 64-bit, etc.) that not much attention is really given to new hardware ideas. Microsoft, largely, gets to decide when something is "mainstream", and until they declare it so, it effectively isn't. This is a stupid way to manage an OS... the very existence of the OS as a hardware abstraction layer is supposed to make adopting new hardware faster, not slower. But MS always needs a carrot to dangle for upgrades, and they use hardware for that wherever possible.

    The hand-held market is booming for several reasons. One is simply that the opportunity is now undeniably real, but the powers that will be are not entirely settled yet. This means everyone in the PC, telco, and CE markets can jockey for a position in the new order. This happens every so often in tech... digital cameras are a good example. The pace of the film camera market was pretty settled: Nikon and Canon accounted for 80+% of all SLRs, Kodak and Fujifilm made most of the film, etc. But enter digital, and now film companies have to become sensor and camera companies, traditional camera companies have to get digital and electronic very fast if they haven't already (or team up with CE companies, like Leica-Panasonic and Zeiss-Sony), and PC companies look at this as Yet Another Electronic Device, as well as a PC peripheral, so you have them in the mix too (Epson, HP, etc). The dust from that is settling, but for handhelds, it's just getting to the fun parts.

    And as with cameras, companies are looking at their futures in new ways. Motorola never cared all that much about smart phones when it was just business people buying them, but as soon as it looked like everyone would be involved, they had to think intelligently about where they'd be in 5 years if they kept selling largely dumb and "feature" phones. Palm finally woke up - a bit late, but they did. Android seems to be in the position held by MS-DOS in the PC days, only implemented better (open source, a decent enough design, Linux roots). And Apple's been making a fortune on this stuff, though still concentrating on form over function. It's not exactly the wild and woolly days of the PC industry.
