
Why Can't Intel Kill x86? 605

Posted by Soulskill
from the keeps-getting-lucky-on-the-saving-throws dept.
jfruh writes "As tablets and cell phones become more and more important to the computing landscape, Intel is increasingly having a hard time keeping its chips on the forefront of the industry, with x86 architecture failing to find much success in mobile. The question that arises: Why is Intel so wedded to x86 chips? Well, over the past thirty years, Intel has tried and failed to move away from the x86 architecture on multiple occasions, with each attempt undone by technical, organizational, and short-term market factors."
  • by colin_faber (1083673) on Tuesday March 05, 2013 @02:38PM (#43081393)
    Really? I mean the Atom line processors are pretty great. The technology is well developed both for hardware and software and Intel basically owns that market. Why would they want to kill it off when they're still making money hand over fist with it?
  • by cait56 (677299) * on Tuesday March 05, 2013 @02:39PM (#43081419) Homepage
    This has been true for decades. Technology wants to evolve from CISC to RISC. The x86 brilliantly hid this by translating CISC to RISC internally.
    But once you lose the x86 tag, Intel would just be one of many vendors. The closest thing to competition they have had for x86 has been AMD.
  • by Jeremiah Cornelius (137) on Tuesday March 05, 2013 @02:40PM (#43081425) Homepage Journal

    Never forget! i960

  • by hedwards (940851) on Tuesday March 05, 2013 @02:58PM (#43081671)

    Pretty great? Atom sucks balls compared with AMD's offerings. And it's not even close. Intel offers them so that AMD has some competition in that space, but Intel doesn't have any reason for them to be good as that would take away from their business of selling the more expensive processors.

  • by Anonymous Coward on Tuesday March 05, 2013 @03:00PM (#43081697)

    Not to mention they are the fastest general-purpose processors in the world right now. Yet somehow that means they aren't staying on the forefront?

  • by WilliamGeorge (816305) on Tuesday March 05, 2013 @03:05PM (#43081747)

    "Computers long ago reached the point where they were fast enough..."

    For you, maybe - but not for everyone. I work with people daily who need more computing power, and in fact would benefit even further if processors were faster even than they are today. "Fast enough" is a fallacy - there is always, and will always be, room for improvement. Folks doing media editing, 3D animation, scientific research, financial calculations, and a whole host of other things need more power from their computers - not to move away to a less capable platform.

    Heck, even in games this is apparent. A lot of new games simply will not play well on processors from 2006 - that is seven years ago now, before quad-core processors were widely available! So please, don't take your one case and assume that means no one else has different needs for their computers.

  • Re:It will (Score:5, Insightful)

    by rsmith-mac (639075) on Tuesday March 05, 2013 @03:10PM (#43081819)

    You don't seriously think that modern Intel processors are actually CISC, right? The underlying instruction set is closer to a DEC Alpha than it is to an 80x86 processor....

    And that's really why the story question is misguided. The underlying architecture has nothing to do with the ISA; Intel can build whatever they want and throw an x86 decoder frontend on it and have a suitable x86 CPU. Killing the x86 ISA doesn't do anything for Intel or their customers.

  • by overshoot (39700) on Tuesday March 05, 2013 @03:10PM (#43081829)

    Intel is still the major manufacturer of laptop, desktop, workstation and server chips... What if they're not the main provider for cheap toys?

    If you weren't around for IBM's reaction to the arrival of minicomputers, or for Digital Equipment's reaction to microcomputers, you wouldn't understand why I'm cleaning up the coffee I just spewed all over my desk. Let's just say that last sentence isn't exactly new.

  • by Anonymous Coward on Tuesday March 05, 2013 @03:15PM (#43081889)

    general purpose means not a GPU, FPGA, etc.

  • Re:Legacy (Score:5, Insightful)

    by Cro Magnon (467622) on Tuesday March 05, 2013 @03:15PM (#43081893) Homepage Journal

    Until you can build a chip that can emulate x86 and support a different architecture, and do so more cost-effectively than just an x86 chip, x86 will live. You can't kill it, Intel can't kill it, AMD can't kill it, Microsoft can't kill it and you sure as hell can't nuke it from orbit. It's embedded in billions of computers and software programs worldwide, and that is a zombie army that you just can't fight.

    That, in fact, is how Apple switched processors. Twice. The PowerPC Macs were so much faster than the old 68K that they could emulate the old stuff as fast as the 68K machines, and the native PPC software blew the older machines away. When they switched to (ugh) Intel, the PPC had fallen behind and there was a similar performance gap.

    IIRC, early versions of Windows NT could run emulated x86 software at decent speed on the DEC Alpha, but that machine was too pricey for the mass market.

    So, to kill the x86, we need a machine that is enough faster than the x86 to run legacy software at comparable speed, with native software that's faster than anything on x86, and a price low enough for the average consumer.

  • by JDAustin (468180) on Tuesday March 05, 2013 @03:16PM (#43081903)

    The Core 2 Quad Q6600 was released in Jan 2007. The chip is such a workhorse that it will run any of the new games out there. The limiter is the video card's capabilities.

  • by pulski (126566) on Tuesday March 05, 2013 @03:22PM (#43081981)

    There's a lot more to life than gaming. A fast video card won't do a thing to speed up the work I do every day.

  • by PRMan (959735) on Tuesday March 05, 2013 @03:27PM (#43082077)

    David Packard (of HP) used to say, "We're trying to put ourselves out of business every six months. Because if we don't, someone else will."

    Back then, they came out with the LaserJet and DeskJet series and made tons of money. And every new printer was WAY better than the last one. But then he died and they decided that they should lock their ink cartridges and sue refillers instead of innovating. Now, companies like Brother and Canon are eating their lunch, by...wait for it...putting themselves out of business every 6 months...

  • by KingMotley (944240) on Tuesday March 05, 2013 @03:27PM (#43082083) Journal

    There is apparently a whole set of folks who don't understand that the CPU doesn't have an execution engine that can process "REPNE SCASB" directly. "REPNE SCASB" gets translated into a small set of RISC-like instructions internally, and those get executed.

    Or are you trying to say that RISC computers can't possibly run C, because they don't have those complex instructions either? Do you think that RISC assembly can't possibly have a REPNE SCASB macro? Are you confused because the translation happens inside the CPU instead of the assembler?

  • by jitterman (987991) on Tuesday March 05, 2013 @03:45PM (#43082357)
    I'll support you on this. I look at processing power as analogous to income: the more most people have, the more ways we find to use all of it, and eventually we find we could certainly use more.
  • by TsuruchiBrian (2731979) on Tuesday March 05, 2013 @03:51PM (#43082475)

    I don't think things will ever reach a point of "fast enough" in an absolute sense either, but I can see where CastrTroy is coming from.

    I got my first computer in 1992, and it was the most expensive computer I (well, my parents) have ever purchased. Since then I built computers from parts every year (each time becoming cheaper) until about 2001. The computer I built in 2001 lasted 2 years. The computer I built in 2003 lasted 3 years. The computer I built in 2006 lasted 6 years, until 2012.

    Yes new applications are constantly coming out that demand faster computers for personal use, but it seems to be slowing down to me. It's not that technology is slowing down, but that the new technology seems more able to run on 6 year old technology than it used to.

    My Core 2 Duo from 2006 is now the processor for my 20 TB RAID5 NAS, and it's doing great. I didn't even really need an upgrade back in 2012; I just wanted to have a NAS and build a new computer for fun (I hadn't built one in 6 years). My new computer is definitely faster, but all I do on it is play FTL, which I can also do on my crappy laptop from 2006.

  • by nitehawk214 (222219) on Tuesday March 05, 2013 @03:59PM (#43082587)

    GPU acceleration might come in handy if you do any sort of video editing.

    There is a lot more to GPUs than video and bitcoins.

    The ever-increasing power of commodity processors is what makes my business of inexpensive data crunching possible. 10 years ago the kinds of things we do would have required a supercomputer. Today it requires a moderately priced server-class machine.

    However, I am drooling at the thought of using something like PG-Strom [postgresql.org]: GPU-based database queries.

  • by Yunzil (181064) on Tuesday March 05, 2013 @04:12PM (#43082761) Homepage

    They consciously made a profit-seeking management decision that shackled their ability to engineer radically.

    Oh come on. Do you honestly think there have been no major innovations in Intel processors since the 8086?

    they'd cut of all the old baggage that keeps them weighed down

    Except all that stuff that keeps them "weighed down" is the same stuff that generates them millions in profits.

  • by Chris Burke (6130) on Tuesday March 05, 2013 @04:15PM (#43082807) Homepage

    ARM is a really nice design, very extensible and very RISC

    It has a fixed instruction length and a load/store architecture, the two crucial components of RISC imo, but it doesn't rate a "very", imo. The more I learn about ARM, the more delirious my laughter gets as I think that this of all RISC ISAs is the one that is poised to overturn x86.

    For example, it has a flags register. A flags register! Oh man, I cackled when I heard that. I must have sounded very disturbed. Which I was, since only moments before I was envisioning life without that particular albatross hanging around my neck. But I guess x86 wasn't the only architecture built around tradeoffs for scalar minimally-pipelined in-order machines.

    Well whatever. The long and short of it is that ISA doesn't matter all that much. It wasn't the ISA that made those Acorn boxes faster than x86 chips. The ISA limits x86 only in that the amount of energy spent decoding is non-negligible at the lowest power envelopes. In anything even slightly less constrained, it does just fine.

    Oh and on the topic of Intel killing x86 -- they don't really want to kill x86. x86 has done great things for them, with both patents and its generally insane difficulty to implement creating huge barriers to entry for others, helping them maintain their monopoly. Their only serious move to ditch x86 in the markets where x86 was making them tons of money (as opposed to dabbling in embedded markets) was IA-64, and the whole reason for that was that AMD and Via wouldn't have licenses to make compatible chips.

  • by Macman408 (1308925) on Tuesday March 05, 2013 @04:19PM (#43082863)

    And they've also demonstrated several times that even when they can't beat their competitors on technical merits, they can still use their monopolistic footprint to stomp all over them anyway.

    Don't get me wrong; Intel has a huge R&D budget, which buys them a lot of progress when they decide to focus on something that somebody else is currently better than them at. But sometimes, they use that money to just undercut their competitors (eg by selling chips at a loss), so smaller companies have no hope of surviving. Either they sell at a loss too and go out of business; or they maintain their price, nobody buys their chips, and they go out of business. Because of this, they've been sued by numerous companies and governments, and fined or settled for billions of dollars multiple times.

  • by cusco (717999) <brian@bixby.gmail@com> on Tuesday March 05, 2013 @04:19PM (#43082865)
    Great product my ass. DEC chips, including the Alpha, could do all the memory management and protection necessary to keep the system stable in the early 1990s, while Intel x86 chips STILL cannot do the same thing. Pretty much every BSOD that you've experienced is directly attributable to that lack. The dumbest thing Compaq ever did was discontinue the Alpha chip. DEC had 64-bit CPUs in production years before Intel or AMD had even laid out their basic architecture (and they managed that mostly by hiring away DEC talent). In 1997, when our fastest Intel server was a P133, our database server was an Alpha 550, and DEC and Microsoft were porting Win2k to run on the Alpha when Compaq shut the effort down.
  • by Jeremiah Cornelius (137) on Tuesday March 05, 2013 @04:29PM (#43083039) Homepage Journal

    Yes. DEC Alpha, which originally ran Slashdot on a 166 MHz Multia, and the great 64-bit MIPS III chips: the R4000 and its descendants.

  • by WilliamGeorge (816305) on Tuesday March 05, 2013 @04:41PM (#43083199)

    I don't see many people buying tablets or smartphones *instead* of a PC / laptop - they are usually purchased (at least in my experience) to augment them, or to fill a new and unique role. Further, mobile sales like that have picked up - but desktop and laptop sales have not yet *dropped* substantially; their growth has slowed, but unless they stop selling altogether I think there is still plenty of market for Intel's processors.

    Further, the modern Atom chips from Intel are increasingly capable and viable compared to ARM - and yet they are also full x86. This gives them more flexibility in terms of what they can run, without loss of battery life... and that will only get better in the future, as the Atom line is improved.

  • by hypergreatthing (254983) on Tuesday March 05, 2013 @04:42PM (#43083209)

    and since then, there have been a whole lot of improvements. Sure, chips are still packaged as quad core since that seems to be the best bang for the buck in terms of processing power, but efficiency has improved, cache has increased, and speed has increased a lot. Sure, games aren't pushing CPUs as much, because there's not much left to push. Back in the day video cards didn't have GPUs, and the ones that were out were really strained and relied a lot on the CPU. So of course, as video cards have been getting better, the work the CPU has to do has been decreasing.
    Would you still buy a Q6600 today? No.
    Does that mean you need a new i5-3570K? Depends: do you need a new computer? You can probably get away with your Q6600 for a while. But if you were in the market for a new one, you'd probably get today's equivalent, and you would probably notice the difference in speed.

  • by adri (173121) on Tuesday March 05, 2013 @04:52PM (#43083435) Homepage Journal

    Seriously? You think the BSOD thing is because of the CPU architecture, versus the operating system architecture?

    Please provide more information. I think you're getting it wrong here.

    The Alpha architecture was nice, but it was expensive, niche, and single-vendor. It had floating point performance that smoked the i387/i487 of the day. It was 64-bit internally far before the PC architecture was. But none of those prevent BSODs.

    BSOD is because of poor driver writing, poor system architecture and crappy hardware quality. Not because of the CPU architecture.

  • by leandrod (17766) <l.dutras@org> on Tuesday March 05, 2013 @05:15PM (#43083815) Homepage Journal

    There are loads of proprietary, binary software around. Some people even run OS/2 because they won’t port their software to something newer. FreeDOS is around and used in production. Alpha emulated x86 quite competently, and current x86 processors are actually RISC chips with an x86 translation unit.

    Until most software is based on open standards and free components that can be trivially recompiled, all platforms will live much longer than people would like them to.

  • It's more that NT for Alpha had a far more limited, and thus far better tested, set of drivers, and the machines were only mid- to high-end: no low-end questionable hardware to worry about.
    The same reason Apple has a reputation for stability, despite these days being based on mostly the same components as any other x86 vendor.
