
Why Mobile Innovation Outpaces PC Innovation

Sandrina sends in an opinion piece from TechCrunch that discusses why mobile systems are developing so much faster than the PC market. The article credits Intel with allowing hardware innovation to stagnate, and points out how much more competitive the component vendor market is for smartphones. Quoting: "In PCs, Intel dictates the pace of hardware releases — OEMs essentially wait for CPU updates, then differentiate through inventory control, channel / distribution and branding. Intel and Microsoft win no matter which PC makers excel — they literally don't care if it's Asus, Dell or HP. In the smartphone world, it's the opposite. Dozens of component vendors fight each other to the death to win designs at smartphone OEMs. This competitive dynamic forms an entirely different basis for how component vendors approach system integration and support. Consider Infineon, which supplies the 3G wireless chipset in the iPhone. In order to stay in Apple's graces, Infineon must do everything necessary to help the hardware and software play well together, including staffing permanent engineers in Cupertino or sending a team overnight from Germany. Do you think Intel does this for Dell?"
This discussion has been archived. No new comments can be posted.

  • by symbolset ( 646467 ) on Monday June 21, 2010 @01:07PM (#32643832) Journal

    It's called cannibalization. When there's an established monopoly, any possible invention "cannibalizes" the markets of established product groups and must be suppressed. It takes a long time because monopoly is tremendously profitable, but ultimately this is a stagnant path: the monopoly goes extinct in much the same form it had when it was achieved.

  • I See It Differently (Score:3, Interesting)

    by eldavojohn ( 898314 ) * <eldavojohn@gm a i l . com> on Monday June 21, 2010 @01:09PM (#32643858) Journal
    Man, complaining about Intel's market dominance and not even one mention of AMD? If Intel was holding everyone back with your proposed CPU and Chipset conspiracy, don't you think that would just prime the market for AMD to pair up with VIA or someone and just wreck Intel?

    I'm no market expert but I think the author of this opinion piece overlooked a lot of things. For example, when you make a chip or chipset that is sold to Dell or HP or whomever to be put into another device, you're not directly fleecing the customer. You get smaller margins that way than you would if you were the manufacturer, marketer and distributor simply because Dell takes a cut otherwise. There's more money to be had in making complete phones because not only are you fleecing the customer but the carrier is willing to subsidize you to get the customer into a juicy two year data plan deal to the tune of $70/mo (at least in the US). I would assume this money spurs more rapid development and innovation.

    Quite frankly, I'm curious: how does Intel decide the "bundling" of my AM2+ motherboard running my cheaper quad-core AMD chip? And if they don't, why isn't my AMD motherboard outpacing Intel and "keeping up" with mobile devices?
    • by Animats ( 122034 ) on Monday June 21, 2010 @01:16PM (#32643930) Homepage

      If Intel was holding everyone back with your proposed CPU and Chipset conspiracy, don't you think that would just prime the market for AMD to pair up with VIA or someone and just wreck Intel?

      AMD tried hard. They introduced 64-bit x86-compatible CPUs. And Microsoft wouldn't support them until Intel caught up. On the other hand, Microsoft supported the Inanium until 2004.

      • Microsoft supported the Inanium [princeton.edu] until 2004.

        That's because they didn't know that it was asinine.

      • by Anonymous Coward on Monday June 21, 2010 @01:31PM (#32644132)

        Incorrect.

        AMD introduced a 64-bit/32-bit hybrid CPU as a competitor to the Itanium for the server market. Opterons were and still are quite successful in that market, especially with the new G34 socket and 12-core processors (up to 48 cores per server and no tier BS - all processors can run in 1- to 4-socket SMP configurations). Microsoft viewed AMD's technology as *superior* to Itanium because it allowed for seamless migration from a 32-bit to a 64-bit platform. Microsoft essentially *told* Intel that they will only support *one* 64-bit CPU and that will be the AMD instruction set. Intel had no choice but to incorporate AMD's instruction set into their processors.

        Microsoft doesn't care if AMD or Intel catch up to each other as long as their software runs on those processors. They didn't "wait" for Intel to catch up. It simply took many years to migrate Windows from 32-bit code to 64-bit clean code. There was XP 64-bit, but how many people used that? Hell, lots of people didn't even get 64-bit Vista because of the perception that if you don't use more than 4G of RAM you don't need it. Actually, all modern machines should be running 64-bit OS only - simplified address space management and increased register count makes it a no-brainer. (A quick compile-time sketch at the end of this comment illustrates the register-count point.)

        If you want an example of a company that still fails and fails hard at 64-bit software, it would be Adobe. They recently dropped support for the 64-bit plugin. Not sure, maybe they are still "waiting for Intel to catch up"?
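
        For illustration, here is a minimal C sketch of the 32- vs 64-bit point being argued (it relies only on the standard GCC/Clang and MSVC predefined macros; nothing in it is specific to Vista or the plugin above):

            #include <stdio.h>

            int main(void) {
                /* The pointer width shows which address space the binary was built for. */
                printf("pointer size: %zu bits\n", sizeof(void *) * 8);
            #if defined(__x86_64__) || defined(_M_X64)
                /* x86-64 exposes 16 general-purpose registers (R8-R15 added) versus 8
                   on 32-bit x86 -- the "increased register count" argued for above. */
                puts("compiled as a 64-bit x86-64 binary");
            #else
                puts("compiled as a 32-bit (or non-x86-64) binary");
            #endif
                return 0;
            }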

        • Actually, all modern machines should be running 64-bit OS only - simplified address space management and increased register count makes it a no-brainer.

          As of Windows Vista and Windows 7, Microsoft has severely tightened its requirements for digital signatures on kernel-mode device drivers. So if you have connected a home-built or low-volume peripheral to your PC, the only way to run self-signed drivers without "Test Mode" always on top in all four corners of the screen is to run Linux on the bare hardware and Windows in a virtual machine. But how well do virtual machines support x86-64?

          • But how well do virtual machines support x86-64?

            Pretty well considering I am running the regular Ubuntu 64-bit distro inside a virtual machine.

          • But how well do virtual machines support x86-64?

            VirtualBox runs 64-bit just fine, as an example. The better question is, "How well do virtual machines support hardware acceleration?" Progress is being made, but running things like 3D games in a virtual machine is an exercise in frustration (if it works at all).

          • As of Windows Vista and Windows 7, Microsoft has severely tightened its requirements for digital signatures on kernel-mode device drivers. So if you have connected a home-built or low-volume peripheral to your PC, the only way to run self-signed drivers without "Test Mode" always on top in all four corners of the screen is to run Linux on the bare hardware and Windows in a virtual machine. But how well do virtual machines support x86-64?

            This response is just begging for this question:

            Why does your device do something that requires it to have a kernel-mode driver?

            • by h4rr4r ( 612664 )

              Maybe I want to do something evil like copy movies out of RAM? Or whatever the hell I want?
              I thought this was my computer, not MS's. Since when did we start leasing computers?

              • You don't own the software, you license it. Use your own software, or software without copy protection, and copy away.
                • by h4rr4r ( 612664 )

                  And this is why I do not use Windows.

                  • Re: (Score:3, Insightful)

                    I understand and agree, but if you use Linux and don't give a flying flip about Windows and hate everything that Windows restricts (yeah, I hear you, and I agree), then what the hell do you care about how M$ handles its drivers? Trying to find things to complain about that don't even bother you? :P
            • Why does your device do something that requires it to have a kernel-mode driver?

              From UMDF FAQ [microsoft.com]:

              A user-mode driver cannot have kernel-mode clients because Windows does not allow calls from kernel mode to user mode. The majority of drivers for input, display, and most network and storage devices cannot be migrated to user mode because they have kernel-mode clients. For the same reason, user-mode drivers must be at the top of the device stack; they cannot attach to the middle of the stack. However, a stack can contain more than one user-mode driver; that is, a user-mode driver can have use

          • Um..... no. Anyone can get a cert and sign the driver. It's documented here: http://www.microsoft.com/whdc/driver/install/drvsign/kmsigning.mspx [microsoft.com] (word doc attached has all the details). Essentially you just get a cert from a CA. What is that $100.00? If you don't want to do that, then just run in test mode. What's wrong with that?

            • by tepples ( 727027 )

              Essentially you just get a cert from a CA. What is that $100.00?

              SPCs expire. And last time I looked into getting an SPC, the CA didn't offer SPCs to individuals, only to companies.

        • Re: (Score:3, Insightful)

          by vlueboy ( 1799360 )

          Try your statement again on a Bean Counter Test (TM):

          Hell, lots of people didn't even get 64-bit Vista because of the perception that if you don't use more than 4G of RAM you don't need it.

          Bean counter: Alright! Since we skipped Vista, none of our corporate PCs ever needed even 3 GB. Money saved!

          Actually, all modern machines should be running 64-bit OS only

          Bean counter: Tell me more and I'll put in an order so we can stay competitive in this "modern" market. I'm curious.

          simplified address space management

          Bean counter: Huh?

          and increased register count

          Bean counter: Useless. More technobabble that only programmers need. I'll recommend keeping XP on our single-core Pentium 4. I'll also get a raise for saving the PHB a ton on this year's budget.

          makes it a no-brainer.

          Bean counter: I fully a

        • by yuhong ( 1378501 )

          Microsoft essentially *told* Intel that they will only support *one* 64-bit CPU and that will be the AMD instruction set. Intel had no choice but to incorporate AMD's instruction set into their processors

          Really? The reason Itanium support was scaled back over time was, I think, because it was a low-volume niche market, not that MS wasn't willing to support two 64-bit architectures.

          It simply took many years to migrate Windows from 32-bit code to 64-bit clean code.

          Well, I read that most of the work was done in the year 2000, then in 2001 they released Itanium Windows XP. From there, porting to AMD64 was as simple as developing an AMD64 compiler, kernel and WOW64 and a few other things.

        • Re: (Score:3, Insightful)

          lots of people didn't even get 64-bit Vista because of the perception that if you don't use more than 4G of RAM you don't need it. Actually, all modern machines should be running 64-bit OS only - simplified address space management and increased register count makes it a no-brainer.

          And they're right. Those are fine technical arguments, but the end result is the same. The performance gain is negligible, and you get compatibility problems like the Adobe plugins mentioned above.
          I just switched to 64-bit on my AMD Neo

    • by $RANDOMLUSER ( 804576 ) on Monday June 21, 2010 @01:19PM (#32643996)
      The "dominance" is the x86 instruction set. Intel and Microsoft have locked us in; AMD is just a second source for chips that use that instruction set.
      • Re: (Score:3, Insightful)

        "The "dominance" is the x86 instruction set."

        And the "dominance" of the "dominance factor" is that's 30 year old, mature, stablished technology.

        Oh, well, why we don't see so much innovation on the VHS world? Companies should be urged! VHS is not only stagnating, is even dispearing!

      • by overlordofmu ( 1422163 ) <overlordofmu@gmail.com> on Monday June 21, 2010 @03:14PM (#32645374)
        turbudostato is missing the point.

        I shit you not, my mod point expire and then I see this post that needs an insightful mod.

        In 1995 there was a beautiful CPU called the Alpha. It was faster than anything offered by Intel. It was RISC and not CISC. It didn't boot into 16-bit mode and then require the OS to do work to access 32-bit registers. It was a 64-bit CPU when all the Intel and AMD processors were 32-bit. It had 32 registers for both floating point and integer arithmetic. That is 64 registers for data, people. Even today's Intel CPUs don't have a data register count like that. It was a shining example of a beautiful CPU that was not based on old tech and trying to be compatible with something from 1981. It was good. It was right. It was the future. It was the best, fastest general-purpose CPU on the fucking planet.

        And what happened? That is right! It fucking died because Intel's crappy Pentium had all the market share and there was no volume on Alpha sales. The monopoly's shit tech won and the better CPU disappeared down the hole. Mature, "established" means good-old-boy in this context. In the tech world, we pick tech because it works better, not because it is the kind your daddy used back in the day. Your comment is that of an asshat, turbidastato, an asshat.

        Randomluser, thank you for bringing wisdom to the unwashed masses of Intel ass-lickers. LONG LIVE THE ALPHA! GET OFF OF MY LAWN!!!
    • by SQLGuru ( 980662 ) on Monday June 21, 2010 @01:32PM (#32644148) Homepage Journal

      Don't forget that the mobile market gets to take advantage of knowledge and research done for the server/desktop market. Sure, there's new tech going on in there, but it's the whole trickle-down approach, too. The mobile market is *catching up* to the desktop market, so there's a lot of acceleration just from using all of the prior knowledge. Building multi-core processors isn't easy, and how many mobile phones do you know that are sporting them? Zero that I know of. And what about Intel's turbo processing (powering down idle cores and overclocking the remaining ones); how long do you think it will be before a mobile phone has that technology?

      The innovation in a lagging area (mobile) seems faster only because the innovation has already been researched in the leading area (servers first and consumer second). It takes longer to figure out something the first time than it does to figure out how to make it "smaller" (smaller in the sense that it is for the mobile market, it may be a smaller die footprint or power footprint or whatever).

      • Basically, all the mobile market needs is further miniaturization, which is hard. As they get better, though, you see big jumps because they are able to take advantage of more advanced technologies.

        Also, more money in the market always, always helps innovation, and right now the mobile market is absolutely brimming with cash, thanks in no small part to the iPhone.

    • by Bert64 ( 520050 )

      The problem in the PC market is more to do with Microsoft than Intel...
      Intel would certainly prefer to stagnate, but when they've done this in the past competitors (most notably AMD) have taken market share away from them. Perhaps not much, but enough to force Intel to compete. These days I would imagine processor innovation proceeds at the speed of AMD... Intel want to stay ahead, but not too far ahead.

      In fact, Intel would love to be where ARM are in the smartphone market, sure ARM don't manufacture process

  • It's About Time (Score:5, Insightful)

    by WrongSizeGlass ( 838941 ) on Monday June 21, 2010 @01:14PM (#32643918)
    Mobile innovation is outpacing desktop innovation because desktop innovation has been going on for 20+ years and mobile innovation has been stuck in its infancy for too long.
    • Re: (Score:3, Insightful)

      by petes_PoV ( 912422 )
      Mobiles have been around for over 20 years. I got my first one in 1988 and they *have* come a long way since. However, unlike PCs, mobile phones have always been more restricted by size and battery capacity. Constraints that never applied to PCs.
      • Laptops (Score:4, Informative)

        by tepples ( 727027 ) <.tepples. .at. .gmail.com.> on Monday June 21, 2010 @01:39PM (#32644274) Homepage Journal
        If you think "size and battery capacity" are "constraints that never applied to PCs", then I highly doubt that you have ever owned a laptop.
      • Re: (Score:3, Informative)

        Mobiles have been around for over 20 years. I got my first one in 1988 and they *have* come a long way since.

        Yes, they have come a long way but a huge chunk of it has been the last few years. There weren't that many "breakthroughs" after the Palm & Newtons until the mobile handsets started trying to resurrect their functionality.

        However, unlike PCs, mobile phones have always been more restricted by size and battery capacity. Constraints that never applied to PCs.

        These are some of the most important hurdles for mobile computing to clear. It's a mishmash of extended battery life, supported by CPU efficiency, supported by OSes that treat power conservation as a priority, all to get more out of smaller batteries ...

        The smaller sizes

      • Depends. Those restrictions aren't necessarily as clear cut as you might imagine. Take the iPad for example. It's effectively a giant mobile device with a bigger battery. Effectively they took the mobile phone idea and tossed out the notion that you HAD to keep it small.

        And the things are selling like hotcakes. Sometimes conventional wisdom is a handicap.

    • Re:It's About Time (Score:5, Insightful)

      by CAIMLAS ( 41445 ) on Monday June 21, 2010 @01:32PM (#32644154)

      Of course, TFA completely overlooks the newer line of Moorestown Atom processors from Intel.

      It also ignores the fact that cell phones are a throw-away market. There isn't nearly the 'data lock-in' that the x86 architecture has. Where smartphones can have their software sized to the hardware, Intel (and AMD) are forced to size to the software. Not only does this limit what Intel can do, it limits how fast they can do it.

      • One thing to also consider is that how we interact with our PCs is pretty entrenched, so new methods are slow to enter the market and gain acceptance. With mobiles the field is wide open and the means of their use still has plenty of openings. Consider that mobiles are much more "personal" in their interaction than PCs ever were. We hold them in our hands; that and their size require new ways of thinking. I expect some of the usability available through mobiles to move to PCs but be interpreted in slightly d

      • by h4rr4r ( 612664 )

        Only if you demand binary-only apps. I run the same software on my desktop and my phone; they are not the same instruction set.

    • I think the rate at which people buy new desktops/laptops and new phones is important too. My desktop lasted years with only minor improvements. My laptop is a year old, and I will probably get another year or two out of it. I get a new cell phone every year, and I know people who get one more often than that.
    • Re: (Score:3, Interesting)

      by dkleinsc ( 563838 )

      In other words, we know what works well on a desktop. And more to the point, we know what doesn't work on a desktop, which is why we'll probably never see another trackball ever again.

      In mobile, we're only collectively beginning to understand what we should be trying to build. There have been some real dead ends too - Palm handwriting, anyone?

      • by improfane ( 855034 )

        RSI sufferers would disagree. I love my trackball and recommend it to anyone. Seriously, use one, you won't want to go back to a mouse.

        It might not be that common as it's a niche. Many disabled people need them too.

        • by avm ( 660 )

          I prefer a trackball myself. Why move your arm when a finger will suffice? (Though I expect CLI aficionados can also use that line.) Definitely tickles the carpal tunnels less for me.

          • by gorzek ( 647352 )

            I'll third the trackball love. I invested in a Kensington Expert Mouse about a year ago and my wrist issues have all but disappeared. I also find I can make more precise movements by rolling a ball with the tip of my finger than I ever could pushing a mouse around.

            Trackballs probably aren't for everyone, but I wouldn't knock them--there is definitely a market for them and they are very good at what they do.

    • Also: desktops are for work and mobiles are for entertainment. (Forget the details, this is the bottom line.)

  • In simple numeric terms, any platform or group of platforms that is not very well established is going to appear to experience explosive growth in its own terms. The numbers are so small and the features so immature that the new tech simply needs to keep up.

    While mobile devices certainly have some unique, interesting features and they have the virtue of being mobile, they still lag non-mobile devices in some key areas that are key features of those devices.

    It's a lot easier to seem innovative when your predecessor i

  • Good Enough (Score:3, Insightful)

    by Fuseboy ( 414663 ) on Monday June 21, 2010 @01:17PM (#32643950) Homepage

    The PC isn't innovating because it doesn't need to - it's already perceived as "good enough" by its users. Advances in computing power generally get absorbed by the ever-increasing needs of the OS and office applications. Smart phones, on the other hand, are so constrained by their form factor and their tiny user interface that innovations in UI, usability, battery life, etc. are very meaningful. Merely making a different set of trade-offs can produce real wins.

    • Re:Good Enough (Score:4, Interesting)

      by 0123456 ( 636235 ) on Monday June 21, 2010 @01:23PM (#32644038)

      The PC isn't innovating because it doesn't need to - it's already perceived as "good enough" by its users. Advances in computing power generally get absorbed by the ever-increasing needs of the OS and office applications.

      I bought a laptop for $1000 in 2007. I just replaced it with a 2010 model $1000 laptop... the CPU is 5x faster, the GPU is immensely faster, and it plays all my games at medium to high quality settings with no problems when the old one had problems playing anything more sophisticated than Pacman.

      So while I'm not sure that providing vastly greater power for the same price counts as 'innovation', I'd hardly say that the PC market is stagnant. I'd agree though, that if you don't play games or edit video or some other performance-intensive task then even the cheapest PC is generally 'good enough'.... probably much of the real 'innovation' in the PC market over the last few years has been getting usable performance at lower and lower power consumption (e.g. my Ion system takes 30W to play HD video that my 300W Pentium-4 system can't play at all).

      • Re:Good Enough (Score:4, Interesting)

        by nyctopterus ( 717502 ) on Monday June 21, 2010 @02:05PM (#32644554) Homepage

        PCs are failing hard at something the same vendors have figured out is really important for mobile computing, and that is UI responsiveness.

        My experience is this: I upgrade on a 4-year average, and I usually do so because I can no longer run a recent Adobe CS at a usable speed. Every upgrade allows me to work on more complex and bigger files, for sure, but the responsiveness of the UI has definitely gone down. Illustrator CS5 feels slower on my 2.8 GHz Core 2 Duo with 4 GB of RAM than Illustrator 9 did on a 500 MHz G4 with 256 MB of RAM. This is true even working on very simple stuff. Launch times are absolutely atrocious, and cancelling a mistakenly invoked operation (like, say, applying a texture) is still virtually impossible (why the hell do they even bother with the "cancel" button on progress bars?). It's not just Adobe: Apple's never managed to claw back the responsiveness of the classic Mac OS, and Microsoft Office... well, it's got seriously nasty.

        Big-ticket software has made using a modern computer like wading through molasses. Yeah, it gives you a lot of speed for some things that are processor intensive, but pressing a button, opening a menu, or bringing up a dialogue are all going to be slower. In some cases, much slower. This is EXACTLY the opposite of what I want. I don't care if a filter that was going to take two minutes takes four, if I can go and do something else without everything being as slow as fuck. Even as I type this, the computer is occasionally failing to keep up. I mean really, typing words into a web browser while playing an MP3: I was doing this in 1998 with no lag.

        If I really believed there was still innovation in PCs I would say that instant-response UIs--where cancel buttons worked and background processes just got slower rather than destroying responsiveness--were going to be the next big thing (a rough sketch of what I mean follows below). However, I don't think anyone gives a shit, because all the software vendors have gone down this road.
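
        To make the point about cancel buttons that actually work concrete, here is a rough C/pthreads sketch of the pattern (my own toy example, not anything from Adobe or Apple): the long-running "filter" runs off the UI thread and polls a cancel flag between chunks, so the interface never has to block.

            #include <pthread.h>
            #include <stdatomic.h>
            #include <stdbool.h>
            #include <stdio.h>
            #include <unistd.h>

            /* Zero-initialized: no cancel requested yet. */
            static atomic_bool cancel_requested;

            /* Simulated long-running filter: works in small chunks and checks the
               cancel flag between chunks instead of hogging the machine until done. */
            static void *apply_filter(void *arg) {
                (void)arg;
                for (int step = 0; step < 100; step++) {
                    if (atomic_load(&cancel_requested)) {
                        puts("filter cancelled cleanly");
                        return NULL;
                    }
                    usleep(20000); /* stand-in for one chunk of real work */
                }
                puts("filter finished");
                return NULL;
            }

            int main(void) {
                pthread_t worker;
                pthread_create(&worker, NULL, apply_filter, NULL);

                /* The "UI thread" stays responsive; here it just decides to cancel. */
                usleep(500000);
                atomic_store(&cancel_requested, true);

                pthread_join(worker, NULL);
                return 0;
            }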

        • by yuhong ( 1378501 )
          I think part of it is the emphasis on benchmarks. User responsiveness is not that easy to measure.
          • by yuhong ( 1378501 )
            As an example, anyone remember Con Kolivas of Linux kernel fame?
          • I agree. Benchmarks seem to consist of things like filters that take over a minute. I'm a digital artist, and I work in Illustrator and Photoshop all day long. I very rarely run a filter that takes anything like that long. I do, however, switch layers on and off, change tools and look through menus thousands of times a day. I really, really don't care that a filter I would never run is going to be twice as quick (under ideal conditions presumably, without all the other stuff that tends to be running on a re

        • Re: (Score:3, Insightful)

          by Hatta ( 162192 )

          That's a software problem, not a hardware problem. And to the extent that it can be blamed on hardware, there's better hardware available to fix it. Multiple cores enable you to do many things at once without slowing any of them down to an appreciable extent. SSDs allow you to drastically reduce load times for your applications. But in the end, if you want a responsive system you need to use software that's designed for responsiveness.

          • Re: (Score:3, Insightful)

            by nyctopterus ( 717502 )

            It's both, I think. Sure you can argue that there are better hardware components around, but the reality is that, as sold, most hardware packages are contributing to the problem. My iMac here, for example, has all the processing power I need, but clearly has an I/O bottleneck. The processor mostly sits pretty idle and the RAM unused while the disk grinds. Yes, an SSD would improve the situation, but it wasn't sold with one. The dual core was a disappointment, I thought it would drastically improve multitaski

      • So while I'm not sure that providing vastly greater power for the same price counts as 'innovation', I'd hardly say that the PC market is stagnant.

        It is stagnant in the sense that most people now buy replacement computers when the old one breaks, instead of buying new hardware and software to do new (presumably exciting) things, or buying a computer for the first time. Replacement level sales means no growth, which means Wall Street slaps you down, which means you can't easily raise capital to innovate, whi

  • Easy answer (Score:3, Insightful)

    by VincenzoRomano ( 881055 ) on Monday June 21, 2010 @01:17PM (#32643952) Homepage Journal
    Because there's more money! In the handsets first (look how much the iPhone 4 will cost!), then in voice services and texting, and finally in data plans.
    Are you really able to check the bills they send to you?
    Are you really willing to do it?
    Or do you simply PAY?
    This is why!
  • by pwilli ( 1102893 ) on Monday June 21, 2010 @01:18PM (#32643964)
    Because PCs have a head start of decades?

    It's like asking why China can have growth rates of over 10% while "Western" countries only get 1-3%. It is very hard to improve if you're already close to technical and physical limits, and any improvement you do make won't look as impressive. Handhelds will soon enough hit the same walls that desktop systems are currently trying to tear down.
    • Even more than that: you don't want rapid "innovation" in established products. When I buy a new computer, I want it to be better than my last computer, but I specifically want a lot of things to be the same. I'm used to a certain UI, and I have a variety of peripherals already that I might want to plug into it. I want to be able to perform essentially the same tasks in the same way.

      Basically, the smartphone market had a distinct shift a couple of years ago (when the iPhone was released) where vendors started offering a new kind of product. They were starting with a clean slate, and you can draw whatever you want on a clean slate. Once you've established a new product that way, you have a relatively brief period of time to refine that vision before people's expectations become established. Then people want everything to work "as expected", and they want legacy support more than they want new features.

      Don't get me wrong, I'd love to see more innovation in the desktop/laptop market. But if someone did conceive of a new and interesting vision for the computer, they'd have a lot of inertia to overcome.

    • It is very hard to improve if you're already close to technical and physical limits, and any improvement you do make won't look as impressive.

      "Everything that can be invented has been invented." Apparently misattributed as being said by Charles H. Duell of the US Patent Office in 1899, but the point stands.

      IMHO the mobile market's not particularly far behind the PC; for instance, my phone's 400 MHz and so is my laptop, and they're both a few years old now (Freerunner and XO-1). They're both running Linux, Enlightenment 0.17, Pidgin, Midori, etc.

      I think the major problem with mobiles is the software, based on the fact that very few people think of

  • I don't agree with the premise at all. It's just that it has only recently become possible to make screens that were good enough, and mobile CPUs that were fast enough, and memory that was small and cheap enough to push mobile devices into a large consumer market. Now that it's possible to make these new things that work reasonably well in a way they didn't just 5 years ago, of course lots of different companies are going to be experimenting to see what they do better than anyone else. That will likely continue
  • Intel engineers will go out of their way to get a "design win", i.e. to get the developer of a new product to commit to using Intel parts as a fundamental part of the design. It is only once they get the design win that they no longer care about their customers. It is hard to be customer-driven when you've got a 5-year road map documenting the planned obsolescence of your CPUs for the next several years, but Intel marketing does try to be responsive to its higher-volume customers' needs... but AMD is much
    • by $RANDOMLUSER ( 804576 ) on Monday June 21, 2010 @01:30PM (#32644116)
      That's only partially true. It happened again (in a big way) with the switch from 16 to 32 bits, and it is happening again (in a much smaller way) with the switch from 32 to 64 bits. Picture what the computing world would be like today if Alpha (and maybe Unix) had been adopted instead of everybody waiting for the Itanic to come in. Just the THREAT of Itanic was enough to scuttle SPARC, PA-RISC, MIPS, ALPHA...
      • by cheesybagel ( 670288 ) on Monday June 21, 2010 @01:53PM (#32644414)
        SPARC didn't get scuttled because of Itanium. Sun merely bungled up enough times with chip design that they did not have much of a product to compete with Itanium. UltraSPARC V was late, buggy, and canned. Rock, about the same thing. They managed to finish Niagara, but Niagara was mostly good for low end boxes which did web serving: it has lots of threads for doing integer processing, but lousy floating point, and lousy single threaded performance.

        Sun fumbled so much with SPARC chip design they had to ask Fujitsu to sell them their SPARC64 IV processors, so they could actually have a high end SPARC server product to sell.

  • This is an interesting observation about competitiveness and innovation, because I always feel like I get more value from Intel CPUs ($200-300) and Windows operating systems ($200-300) than I do from smartphones ($300-500).

    And not just by a little.

    It could be because of the small screen, balky UI, limited data storage, and limited connectivity.

    It could be because I'm somewhat ignoring the OEM contribution ($200 mobo, $60 case with silent power supply, $200 gigundo HD with raid striping for speed, $300 billboard-s

    • Re: (Score:3, Informative)

      by gbjbaanb ( 229885 )

      Ah, but think how much they get from you.

      iPhone: $$$ plus monthly voice, text and data tariffs, and then you go and buy another one in 1-2 years' time.

      Dell: $300 for a desktop PC. One-off payment.

      There's money to be made in the mobile marketplace, whereas the desktop one is saturated with lowest-possible-price units.

      • by blair1q ( 305137 )

        Which means, as I was trying to imply, that the Wintel model of little competition actually serves the consumer better than the phone-market model of cage-fight competition.

  • by petes_PoV ( 912422 ) on Monday June 21, 2010 @01:27PM (#32644080)
    Before IBM created the standard platform there were a plethora of competing chips, architectures, "operating systems" approaches, price-points and failures. The phone market is in the same situation now. Just as soon as some manufacturer starts to dominate and everything becomes standardised two things will happen: the software will become much more important and the hardware will start the spiral down to commodity status.

    The car market has gone the same way - they all look pretty much the same - dictated by the laws of aerodynamics. It means that other features have been developed to differentiate - things like economy, safety, electronics. While this is not necessarily good for the manufacturers - the number of players shrinks as the market consolidates - it is good for the consumers. So it will be with phones (or whatever they evolve into; they're the equivalent of an Atari today). We have yet to see the major benefits emerge, despite what Apple may tell us.

    • by vlm ( 69642 )

      The car market has gone the same way - they all look pretty much the same - dictated by the laws of aerodynamics

      Laws of marketing, definitely not laws of aerodynamics. Combined with a desperate desire for conformity, same end result, so it doesn't matter too much.

      But don't make the mistake of thinking that changing marketing trends over the years means the laws of aerodynamics are evolving or something.

      • by petes_PoV ( 912422 ) on Monday June 21, 2010 @02:21PM (#32644772)

        Laws of marketing, definitely not laws of aerodynamics.

        The biggest driver in car design since the oil crises of the 70's has been miles per gallon. That has improved engine technology and made car shapes more slippery. There's only one way to reduce drag, that's to be aerodynamically efficient. There are only a small number of solutions to the laws of laminar flow (see the drag equation below). That's why all cars look the same.
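
        For reference, the textbook drag equation behind this argument (standard physics, not something from the thread):

            F_d = \frac{1}{2} \rho v^2 C_d A

        where \rho is the air density, v the speed, C_d the drag coefficient and A the frontal area. At a given speed and air density the only levers a designer has are C_d and A, which is why low-drag body shapes converge on the same few forms.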

        • by Yvan256 ( 722131 )

          There's only one way to reduce drag, that's to be aerodynamically efficient.

          You forgot the most obvious way: remove the air around the car.

        • by h4rr4r ( 612664 )

          Then why do they all have huge, draggy grilles?
          Cars do not need that much airflow; people are already replacing these grilles or blocking off part of them to improve mileage.

        • by Hatta ( 162192 )

          The biggest driver in car design since the oil crises of the 70's has been miles per gallon.

          Then explain SUVs. There has been a lot of work on aerodynamics and engine efficiency, but the result of that work hasn't been to improve MPG until recently. It's been to make bigger vehicles.

    • Before IBM created the standard platform there were a plethora of competing chips, architectures, "operating systems" approaches, price-points and failures. The phone market is in the same situation now.

      Correct.

      Just as soon as some manufacturer starts to dominate and everything becomes standardised two things will happen: the software will become much more important and the hardware will start the spiral down to commodity status.

      WRONG... for the following reasons.

      1) Consolidation is always bad for the con

  • Maybe it's because people don't want desktops so much anymore and the market is shifting to mobile devices and the technology companies want to keep making money?

    • Maybe it's because people don't want desktops so much anymore and the market is shifting to mobile devices and the technology companies want to keep making money?

      Demand is only part of it, I think. People buy a PC and use it for around 4-ish years before they look to upgrade it; mobile devices (mostly in terms of smartphones) are upgraded every year or 2 on average (not considering them breaking from misuse, something you'll find in smartphones much more often than PCs/laptops). This means you'll sell, at the longest stretch, twice as many smartphones as PCs in the same time frame, making the smartphone a higher 'demand' market even though I think PCs/laptops

  • Despite the fact that PCs are 20+ years old, the development cost of a new PC is substantially larger than that of a mobile device. The BIOS development alone is a substantial part of the NRE cost. Mobile devices use open-source bootloaders or run natively, and so such NRE costs aren't applicable. Then add prototyping costs for the hardware and things get very expensive in a hurry.

    The use and availability of operating systems is an additional burden the PC must bear. There's an acceptance in the mobile mar

  • by sootman ( 158191 ) on Monday June 21, 2010 @01:38PM (#32644246) Homepage Journal

    Maybe it's because the PC market has already gone so far? In the last five years, handhelds have been gaining things--large color screens, powerful web browsers, built-in wireless--that desktops have had for years. This stuff was physically impossible to do at small sizes five years ago.

    Also, everyone in the world already has a PC, but people are just now buying large numbers of (only recently existing) mobile devices.

    TechCrunch headline, June 2015: "Why implant innovation is blowing away handhelds"

  • by vlm ( 69642 ) on Monday June 21, 2010 @01:38PM (#32644250)

    In order to stay in Apple's graces, Infineon must do everything necessary to help the hardware and software play well together, including staffing permanent engineers in Cupertino or sending a team overnight from Germany. Do you think Intel does this for Dell?"

    To the best of my knowledge, Dell is at most an assembler of parts; at the least, they're a rebrander. I would agree there is utterly no point in stationing VLSI engineers and RF analysts at Dell, because those guys belong at the board-level designers and board manufacturers.

    http://en.wikipedia.org/wiki/Dell#Manufacturing [wikipedia.org]

    It would be pointless overkill; like GM stationing a permanent automotive engineer at my local car dealership to oversee oil changes.

    I also thought it interesting that Dell is closing the last of their assembly plants in the USA. Kind of hard to call it an American company if everything they do is overseas, except the expensive overhead of upper management. I would not anticipate a bright future for Dell, because their only differentiation against their foreign competition would be extremely expensive upper management.

    • Re: (Score:3, Interesting)

      by swb ( 14022 )

      It would be pointless overkill; like GM stationing a permanent automotive engineer at my local car dealership to oversee oil changes.

      Ha! They may soon have to, given the complexity level of cars and the lack of sophistication in the repair department.

      My Volvo actually required a software patch only the factory engineers knew about (unique to a subset of ECMs in my model year) and I've run into other people who have had problems the "shop couldn't solve" and that actually required an engineer from the factory

    • by rrhal ( 88665 )
      I'm pretty sure that Dells are assembled by contractors in China. All the major assemblies are built there and it costs roughly the same amount to ship an assembled computer as it does to ship an empty case. I have a good friend who was on the Intel product team that was supporting Dell when the bad caps problem hit the GX260s. Those guys were located in Hillsboro, OR and had long conference calls with Dell engineers.
  • by cheesybagel ( 670288 ) on Monday June 21, 2010 @01:39PM (#32644268)

    I point to this fallacy:

    Consider Infineon, which supplies the 3G wireless chipset in the iPhone. In order to stay in Apple’s graces, Infineon must do everything necessary to help the hardware and software play well together, including staffing permanent engineers in Cupertino or sending a team overnight from Germany. Do you think Intel does this for Dell?

    Dell is not comparable with Apple in this case. Apple develops the operating system software for the iPhone. Intel also has permanent engineers at Microsoft, just like Infineon has engineers at Apple. Microsoft develops the operating system software for the PC. Intel also funds many Linux driver developers, and has staff working specifically on Linux support.

    There are multiple x86 vendors, including Intel, AMD, and VIA. The reason there is not more competition is that Intel exploits network effects leveraged by their market monopoly, which led to the current situation. There was a time when the chipset was manufactured by different vendors than the CPU. This enabled more rapid progress in some cases (e.g. ALI and VIA had a chipset with onboard 3D graphics long before other vendors). This is no longer the case. In fact it seems chipsets are becoming increasingly irrelevant as more things get integrated into the same chip. Intel is starting to include the graphics card and high speed I/O in the processor chip. Eventually the chipset will be today's equivalent of a slow I/O south bridge. Perhaps it will even vanish completely.

    Another reason that mobile devices will not leave the PC industry behind is that Intel has superior manufacturing prowess. Historically Intel has had inferior chip design capabilities: the 8086 was inferior to the 68000, the 486 was inferior to many RISC processors, the Pentium Pro was inferior to the Alpha, etc. None of this mattered because Intel had the ability to deliver in volume and price where its competitors could not. The Pentium Pro, for example, had similar integer performance to Alpha because it had superior manufacturing, even if the hardware design was worse. Today Intel enjoys a healthy manufacturing process lead over all their competitors. It is a matter of time until they develop a specific chip to attack the smartphone market, like they developed Atom to counter the rising MID market, or Centrino to counter Transmeta years before.

  • by Tridus ( 79566 ) on Monday June 21, 2010 @01:40PM (#32644278) Homepage

    "Mobile" in terms of dumb phones actually isn't moving very quickly. Dumb phones have existed for a couple of decades, and strictly speaking call quality was better in the 90s then it is today. In terms of voice in remote places and durability, every phone on the market today is straight up worse then the Nokia 6160 I had 10 years ago. Voice is more of an afterthought these days.

    The smartphone market on the other hand is pretty young, and is acting like a new market with rapid improvements and cut throat competition. It's also a market subject to fashion trends and full of users who will change phones as often as their contracts allow, which really isn't the case in say the PC market (where average users will buy a new computer when the old one dies and these days even gamers don't need frequent upgrades like they used to).

  • "A great example of this [stagnation] is the notable lack of GPS chips in laptops."

    Or maybe it's because Intel did some research and found that 99% of people use their laptops indoors 99% of the time.

    "Today's 3G wireless chipsets integrate GPS, Bluetooth, and 802.11n on a single chip."

    And they do so at great expense because size and power consumption are an order of magnitude more important in a handheld than on a desktop. And single chips cost more to revise than individual components. But speaking of desk

  • It's going to be interesting to see how "tablets" go. Will they come downward from Windows PCs, as Microsoft wants, or up from phones, as Apple is doing? Or will an accepted interface not from either world be developed for them?

    It's going to be interesting to see how tablets develop as business tools. Tablet machines for special purposes, like the one every UPS delivery person has, have been around for decades. Tablets for doctors, cops, and others who need info in the field are coming along. The ta

    • Up from phones. Windows makes for very poor input on a touchscreen device, as well as being somewhat limited in the resolutions it supports. Therefore any existing Windows apps would need to be completely redesigned and rewritten to be really usable on a tablet, in which case there is little advantage in basing a tablet on Windows.
  • Stagnation (Score:3, Interesting)

    by QuietLagoon ( 813062 ) on Monday June 21, 2010 @02:18PM (#32644740)
    The article credits Intel with allowing hardware innovation to stagnate

    The stagnation in the PC industry has far more to do with Microsoft's monopoly-maintaining innovation-stifling policies than anything else. At least Intel had some marginal competition in the form of AMD. Microsoft had no real competition for over a decade, and the entire PC industry and its customers suffered.

  • Because mobile processing a la smartphones hardly existed until 2000, and when you suck as hard as those gadgets from a decade ago, it's hard not to significantly improve.

  • by Rene S. Hollan ( 1943 ) on Monday June 21, 2010 @02:46PM (#32645094)

    I know, I could write that every decade or so.

    When I started with computers, processing audio was hard and clunky, and video unheard of. But, increasingly, non-computer devices are getting more intelligent (in terms of really being computers under the hood), to the point where they look and feel like computers, with different peripherals.

    When I first viewed video on a computer monitor, it was clunky, and in a window. Even in full screen mode, one would eventually escape back to the windowing UI, that made the TV stop looking like one, and more like a computer. 10 foot interfaces have changed all this, of course. And yet, if one does want to switch from a video entertainment device "mode" to an "internet browsing" mode to view YouTube videos, for example, the computer UI looks normal and not out of place. We are getting used to the browser being our interface to the world around us.

    The point is that computers are becoming ubiquitous. From TVs to phones, to ebook readers, to netbooks, and iPads, we are using computers to present content as well as organize it. If I were to desire a "universal" remote control, I would seriously consider a netbook for the purpose because it could add so much more functionality over a universal "remote", and actually costs less than many of them! Why we still have 38 kHz IR remote controls instead of web-based UIs available over 802.11b/g/n escapes me, but I am sure that will start to change with the first "networked" remote, and "IR hubs" with 802.11b/g/n in and IR blasters "out" for legacy equipment (a toy sketch of such a hub follows at the end of this comment). Why can't I use my smartphone as a remote? Oh wait! I can!

    Just look at how UPnP has shaken out into DLNA-based equipment.

    I just retired a 400-disc CD/DVD changer and replaced it with a MythTV box. I had done that before, but with false starts, and things weren't smooth enough to really retire the changer. Now, the MythTV box is quiet enough, and powerful enough, to make the thought of actually handling media for anything more than one-off playback archaic.

    Look at HDMI, at least the latest incarnations. Not only does it integrate uncompressed video and audio in a single cable; 100 Mb/s data-link-layer Ethernet and S/PDIF "back channels" are included as well. Literally, "one cable to link them all". And it's not an expensive interface found only on high-end equipment: it is becoming the standard for computer monitors and televisions (with the difference really becoming blurred).

    So, certainly because of competition and "technology catch-up", phones and consumer electronics are evolving at a dizzying pace, whereas computers have stagnated. But perhaps we've reached the point where computers already do everything we want them to: compute, process, store, and retrieve data. As far as presentation of entertainment content goes, a traditional computer offers little more than storage and second-rate display: it is non-portable, and its display and audio capabilities are poor compared to the alternatives: a smaller display but complete mobility in phones, netbooks, and iPads, and massive displays in flat-screen TVs. And these are the areas where we are seeing advances.
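
    On the network-remote idea a few paragraphs up, here is a bare-bones C sketch of what the receiving end of such an "IR hub" might look like (purely illustrative: the port number and command names are made up, the IR-blaster call is a stub, and error handling is omitted):

        #include <arpa/inet.h>
        #include <netinet/in.h>
        #include <stdio.h>
        #include <string.h>
        #include <sys/socket.h>
        #include <unistd.h>

        /* Stub: a real hub would key a 38 kHz IR blaster with the right code here. */
        static void send_ir(const char *cmd) {
            printf("would blast IR code for: %s\n", cmd);
        }

        int main(void) {
            int srv = socket(AF_INET, SOCK_STREAM, 0);
            struct sockaddr_in addr = {0};
            addr.sin_family = AF_INET;
            addr.sin_addr.s_addr = htonl(INADDR_ANY);
            addr.sin_port = htons(8123);          /* arbitrary, made-up port */

            bind(srv, (struct sockaddr *)&addr, sizeof addr);
            listen(srv, 4);

            for (;;) {
                int cli = accept(srv, NULL, NULL);
                char buf[64] = {0};
                ssize_t n = read(cli, buf, sizeof buf - 1);
                if (n > 0) {
                    buf[strcspn(buf, "\r\n")] = '\0'; /* e.g. "power" or "vol_up" */
                    send_ir(buf);
                }
                close(cli);
            }
        }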

  • Rapid advancement in mobile is often attributed to the natural disruption by which emerging industries innovate quickly, while established markets like PCs follow a slower, more sustained trajectory.

    But there are deeper fundamentals driving the breathtaking pace of smartphone advancement.

    Rapid advancement in mobile is often attributed to the natural disruption by which emerging industries innovate quickly, while established markets like PCs follow a slower, more sustained trajectory.

    But there
    had to be some way for me to create buzz for my blog so I came up with some convoluted explanation.

  • Just maybe, it's that circuitry is miniaturizing with the advances in 45 nm and smaller processes, such that the amount of capability available to tiny ICs like the ones used for cell phones is increasing so fast that what you could not do in cell phones 9 months ago is now trivial. Graphics rendering on cell phones has just about caught up to where PCs were 15 years ago. What will be interesting is seeing how long it will take cell phones to catch up to where PCs are today (in terms of processing/rendering p

  • Innovation in a new technology outpaces innovation in a multi-decade old technology. This is news? Say it ain't so!

    The "innovations" taking place in the PC world are innovations of software. The chips are powerful enough to run pretty much whatever anyone can throw at them. At this point the instruction set has been pretty well defined. Developers are focused on developing applications. Look at OSX versus Windows. Both are running on x86 hardware. They deliver different user experiences, while doing

  • Everything you need to know from TFA...

    "Guest author Steve Cheney is an entrepreneur and formerly an engineer & programmer specializing in web and mobile technologies."

    "it seems like mobile devices and platforms are innovating at about five times the pace of personal computers."

    "Intel's monopoly in PC processors and peripheral chipsets has caused PC innovation to stagnate."

    "A great example of this is the notable lack of GPS chips in laptops."
    "Sure, PC makers could add a separate GPS chip to the motherbo

  • by hazydave ( 96747 ) on Wednesday June 23, 2010 @10:22AM (#32665458)

    The mobile market was pretty boring until recently. One Blackberry was pretty much like another, same with Palm and Microsoft WinCE/PocketPC/WinMo.

    It was really Apple legitimizing the "Consumer Smart Phone" that's got everyone out there now scrambling for position in this space. Which, curiously, is exactly what happened in the 70s, 80s, and into the 90s in the world of personal computers. Back in the 70s, there were dozens of companies making proprietary hardware, operating systems, etc. You could have something come along, like the Apple Macintosh or the Commodore Amiga, that entirely changed the market in one shot.

    Since then, PCs have more or less grown up. The level of complexity is such that it's very difficult to do anything interesting at the system level... it has to be part of a new chip design. That raises the risk threshold significantly, as well as the time between new generations of CPU, GPU, or PC system chips and architectures. Even Intel is slow moving on these things. As a result, most of the stuff that gets called "innovative" in the PC marketplace is little more than "same old, same old" in fancy casework (Apple), or increasingly small incremental improvements to what was pretty damn fine last year (Intel, AMD, nVidia, etc).

    The powers that be are pretty settled... Intel rules in CPUs, and is only likely to move that forward fast enough to keep AMD stumbling along... they don't benefit from delivering new CPU technology any faster. This summer's $1000 CPU becomes next year's $200 bargain, but that only works if they can make a suitable replacement by next year. Without sufficient challenge, it's actually best for the company to keep this pace at something they can optimize... one reason why the kind of shortages of parts we used to see, say, around the 1 GHz mark, rarely if ever occurs these days.

    Software too... we're so used to waiting years for Microsoft to properly support new hardware standards (USB, Firewire, AGP, 32-bit, 64-bit, etc.) that not much attention is really given to new hardware ideas. Microsoft largely gets to declare when they're "mainstream", and until it does so, they effectively aren't. This is a stupid way to manage an OS... the very existence of the OS as a hardware abstraction layer is supposed to make adopting new hardware faster, not slower. But MS always needs a carrot to dangle for upgrades, and it uses hardware for that wherever possible.

    The hand-held market is booming for several reasons. One is simply that the opportunity is now undeniably real, but the powers that will be are not entirely settled yet. This means everyone in the PC, telco, and CE markets can jockey for a position in the new order. This happens every so often in tech... digital cameras are a good example. The pace of the film camera market was pretty settled: Nikon and Canon accounted for 80+% of all SLRs, Kodak and Fujifilm made most of the film, etc. But enter digital, and now film companies have to become sensor and camera companies, traditional camera companies have to get digital and electronic very fast, if they haven't already (or team up with CE companies, like Leica-Panasonic and Zeiss-Sony), and PC companies look at this as Yet Another Electronic Device, as well as a PC peripheral, so you have them in the mix (Epson, HP, etc). The dust from that is settling, but for handhelds, it's just getting to the fun parts.

    And as with cameras, companies are looking at their future in new ways. Motorola never cared all that much about smart phones when it was just business people buying them, but as soon as it looked like everyone would be involved, they had to think intelligently about where they'd be in 5 years if they were still selling largely only dumb and "feature" phones. Palm finally woke up, a bit late, but they did. Android seems to be in the position held by MS-DOS in the PC days, only implemented better (open source, a decent enough design, Linux roots). And Apple's been making a fortune on this stuff, though still concentrating on form over function. It's not exactly the wild and woolly days of the PC indu
