
Oracle Claims Intel Is Looking To Sink the Itanic 235

Posted by samzenpus
from the left-behind dept.
Blacklaw writes "Intel's ill-fated Itanium line has lost another supporter, with Oracle announcing that it is to immediately stop all software development on the platform. 'After multiple conversations with Intel senior management Oracle has decided to discontinue all software development on the Intel Itanium microprocessor,' a company spokesperson claimed. 'Intel management made it clear that their strategic focus is on their x86 microprocessor and that Itanium was nearing the end of its life.'"
  • Re:Sparc (Score:5, Interesting)

    by blair1q (305137) on Wednesday March 23, 2011 @08:36PM (#35594012) Journal

    x86 is a small part of what's in a modern x86 CPU.

    There's hardly any good reason to choose anything else over it, either. You can't beat it on performance the way Alpha did. PPC lost its simplicity long ago (and comes with some annoyances that make me wish it would just die).

    Intel's latest stuff is the best that ever was. Nobody else comes close, or ever has.

  • by Mysticalfruit (533341) on Wednesday March 23, 2011 @09:02PM (#35594208) Journal
    I've long argued that Itanium was Intel's vehicle to kill PA-RISC and get HP out of the high performance computing market, and it worked. Intel let that CPU die a death of a thousand committee compromises while simultaneously plundering all the technology they could out of Alpha and rolling out their Xeon CPUs at much higher clock speeds and with features that weren't in Itanium.

    I worked at a computer company that built servers using PA-RISC CPUs at the time. We got our hands on some Itanium samples and, needless to say, decided to migrate the platform to Xeon instead.
  • Ah well (Score:4, Interesting)

    by Mr Z (6791) on Wednesday March 23, 2011 @09:05PM (#35594226) Homepage Journal

    I work directly with a VLIW architecture myself (the TI C6000 family of DSPs). From that perspective, I'm a little sad to see Itanium go. I realize EPIC isn't exactly VLIW, but they had an awful lot in common. Much of HP's and Intel's compiler research helps us other VLIW folks too.

    I think EPIC tried to live up to its name a little too much. The original Merced overreached, and so it ended up shipping far too late for its performance to be compelling. Everybody always zooms in on the lackluster x86 performance, but x86 wasn't at all interesting in the spaces Itanium wanted to play in originally. It wanted to go after the markets dominated by non-x86 architectures such as Alpha, PA-RISC, MIPS and SPARC. And had it come out about three years earlier, it might have had a chance there by thinning the field and consolidating the high-end server space behind EPIC.

    Instead, it launched late as a room-heating yawner. And putting crappy x86 emulation on board only tempted comparisons to the native x86 line. That it made it all the way to Poulson is rather impressive, but smells more like contractual obligation than anything else.

    Rest in peace, Itanium.

  • Re:Sparc (Score:5, Interesting)

    by hairyfeet (841228) <bassbeast1968.gmail@com> on Wednesday March 23, 2011 @09:56PM (#35594486) Journal

    Well, ARM uses a hell of a lot less power, but it is also a hell of a lot less powerful clock for clock, so it evens out, doesn't it? Sure, in a cell phone, where its main job is running a highly specialized OS with tons of little support chips to help it out, it does great, but I wouldn't want to do my day-to-day desktop computing on it.

    I never did understand the Intel vs. ARM comparisons, because they made as much sense to me as comparing a Peterbilt and a Kia. Sure, the Kia is gonna get a hell of a lot better gas mileage, but I sure wouldn't want to try to move into an apartment using only a Kia to haul my furniture. Try one of those AMD or Intel ULV netbooks and compare it to the little ARM netbooks: it's night and day. I could easily see myself doing most of my day-to-day work on the x86 and not getting frustrated, whereas anything not expressly thought up and prepared for by the ARM netbook OEM is welcome to slow town.

    So while the Itanic will go down as just another failed Intel experiment, like the iAPX 432 they tried to get everyone to switch to in the '80s, I really can't see x86 going anywhere, especially once AMD broke the 4GB barrier with the x64 extensions. The little specialized devices will stay ARM while general computing will stay x86.

    I'm sure there will be a few crossover niches, such as ARM for specialized servers which stress low power over everything else, but for the rest of the jobs where performance matters I just don't see ARM stepping up to AMD or Intel quad levels of performance, not without killing the low power selling point. It is just one of those things you can't get around, faster equals hotter and more power usage, whereas slow chips with less complexity use less power.

  • by Anonymous Coward on Thursday March 24, 2011 @12:19AM (#35595274)

    I agree with Oracle that it is close to over for the chip. Intel lost every good engineer working on it to AMD in Fort Collins, CO, and can't (even with massive financial incentives) coax anybody on their x86 teams to transfer over. Itanium is considered the kiss of death on a resume, so they are having a hard time even finding people willing to work on it. Work on Itanium is about 6 years behind original schedules! Originally designed and marketed as a performance leader over the Xeon series, it has fallen so far behind that it had to be re-marketed with FUD about quality, scalability, and stability. While I agree it has better quality and stability than the i3/i5/i7 series, Intel has a hard time explaining how it is better in those terms compared to their higher-end Xeon series.

  • by BLToday (1777712) on Thursday March 24, 2011 @01:09AM (#35595450)

    My old college roommate was offered a job at Intel Itanium's unit after finishing his PhD in compiler theory. He turned it down because "life's too short to spend it fixing Itanium."

  • Re:Sparc (Score:5, Interesting)

    by TheRaven64 (641858) on Thursday March 24, 2011 @07:43AM (#35596944) Journal

    Since this is an article about Itanium, it's worth noting that Itanium copies the predicated instruction model from ARM. This doesn't just make the code denser; it also meant that ARM could get away without a branch predictor for a very long time (newer ARM chips have one). It works very nicely with superscalar architectures, because the instructions are always executed, and the results are only retired if the condition is met. You always know the state of the condition flag by the time the predicated instructions emerge from the pipeline, so it's trivial to implement in comparison with the kind of speculative execution required for predicted branches on x86.
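    The predication idea above can be sketched in plain C: both arms of the computation are always evaluated and a mask merely selects the result, so there is no branch to predict. This is an editorial illustration of the concept, not ARM's actual encoding, and the function names are made up.

    ```c
    #include <assert.h>
    #include <stdint.h>

    /* Branchy version: the CPU has to predict the jump. */
    static int32_t abs_branch(int32_t x) {
        if (x < 0)
            return -x;
        return x;
    }

    /* Predicated-style version: always executed, the condition only
     * selects which result is kept -- roughly what ARM's conditional
     * instructions buy you at the ISA level. */
    static int32_t abs_predicated(int32_t x) {
        int32_t mask = x >> 31;    /* arithmetic shift assumed: all ones
                                      if negative, else zero            */
        return (x ^ mask) - mask;  /* conditional negate, no branch     */
    }

    int main(void) {
        for (int32_t x = -1000; x <= 1000; x++)
            assert(abs_branch(x) == abs_predicated(x));
        return 0;
    }
    ```

    Compilers emit exactly this kind of branchless select (or a conditional-move instruction) when the target makes branches expensive.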

    Lots of people seem to assume that x86 is translated into RISC and then x86 has no impact on the rest of the execution pipeline. This is absolutely not the case. The x86 instruction set is horrible. Lots of things have side effects like setting condition registers, which cause complex interactions between instructions in a pipelined implementation, and insanely complex interactions in an out-of-order design. This complexity all has to be replicated in the micro-ops. Something like Xeon then has a pass that tries to simplify the micro-ops. You effectively have an optimising JIT, implemented in hardware, which does things like transforming operations that generate side effects into ones that don't if the relevant condition flags are guaranteed to be replaced by something else before they are accessed. All of this adds to complexity and adds to the power requirement.
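    That "optimising JIT in hardware" step can be mimicked with a toy micro-op pass in C: a backward liveness scan demotes a flag-setting op to a flag-free one whenever the flags are guaranteed to be overwritten before anyone reads them. The micro-op names and structure here are invented for illustration; real decoders are vastly more involved.

    ```c
    #include <assert.h>
    #include <stdbool.h>
    #include <stddef.h>

    typedef enum { UOP_ADD_F, UOP_ADD, UOP_JCC } uop_kind;

    typedef struct {
        uop_kind kind;
        bool reads_flags;    /* e.g. a conditional jump        */
        bool writes_flags;   /* e.g. x86 ADD always sets flags */
    } uop;

    /* Scan backwards; if the flags are dead at a flag-writing ADD,
     * rewrite it as a plain ADD with no side effects. */
    static void kill_dead_flag_writes(uop *u, size_t n) {
        bool flags_live = true;            /* conservative at block end */
        for (size_t i = n; i-- > 0; ) {
            if (u[i].kind == UOP_ADD_F && !flags_live) {
                u[i].kind = UOP_ADD;
                u[i].writes_flags = false;
            }
            if (u[i].writes_flags) flags_live = false;
            if (u[i].reads_flags)  flags_live = true;
        }
    }

    int main(void) {
        uop seq[] = {
            { UOP_ADD_F, false, true },  /* flags clobbered below: dead */
            { UOP_ADD_F, false, true },  /* flags consumed by the jump  */
            { UOP_JCC,   true,  false },
        };
        kill_dead_flag_writes(seq, 3);
        assert(seq[0].kind == UOP_ADD);     /* demoted   */
        assert(seq[1].kind == UOP_ADD_F);   /* preserved */
        return 0;
    }
    ```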

    Oh, and some of these interactions are not even intentional. Some of the old Intel guys tell a story about the first test simulations of the Pentium. It implemented all of the documented logic, but then they found that most of the games that they tried running on it failed. On the 486, one of the instructions was accidentally setting a condition flag due to a bug in the design. Game designers found that they could shorten some instruction sequences by taking advantage of this. In the Pentium, they didn't recreate this bug, and software broke. After the first phase of testing, they had to go back and recreate it (adding some horrible hacks in the Pentium design in the process), because if games suddenly crashed when people upgraded to a Pentium then people would blame Intel (Windows 95 had a hacky work-around to prevent SimCity crashing on a use-after-free bug, for the same reason). All of these things add to complexity and in hardware complexity equals power consumption.

    Or, if you are that way inclined, you could argue that Java/.NET bytecode compiled at run time achieves the same thing.

    And, if you are, then Thumb-2EE is a much nicer target than x86 for running this code. It has instructions for things like bounds-checked array access, which really help performance in JIT'd VM code.

  • by hairyfeet (841228) <bassbeast1968.gmail@com> on Thursday March 24, 2011 @09:42AM (#35597726) Journal

    But nobody ever asks the important questions, such as: Why did Windows win (hint: it is NOT a conspiracy)? Why hasn't everyone ditched their desktops for Linux on ARM? Hell, why hasn't Linux on x86 gotten more than the margin of error even though it has been free for 15 years?

    I'll tell you why, and it'll be the same reason Windows on x86 will ultimately win against ARM on the desktop: elitism. Before WinNT, server OSes were a CLI-heavy, know-shitloads-of-arcane-commands good-old-boys club where only the "true" admins were allowed to play, and that is how they liked it. Linux today has that same problem: anything that makes things easier for the user automatically gets accused of "dumbing down" for the "noobs". Hell, some of the developers are so anti-user I'm shocked they don't have a pic of Johnny Cash flipping the bird as a loading screen before dropping you into a shell and demanding all code in brainfuck.

    But Windows came along with "let's be friendly to users!" as a mantra, where any middle manager could set up file and print serving, and it totally changed the game because it took away a LOT of frustration. Hell, I could teach my 15-year-old how to run a basic Windows domain in less than two weeks, easy.

    Another example of user hostility and elitism is in this very thread, where so many are screaming about how backwards compatibility is bad and evil. Well, you know what? On behalf of the users of the world, allow me to say "go fuck yourselves, because we don't care what you think, developers!" That's right, we LIKE backwards compatibility! My mom LIKES being able to play her AoE I and ancient match-3 games on her new PC, I LIKE being able to run all my old games and programs, I LIKE being able to stick with Office 2K even on my new x64 quad running Win 7.

    The developers here seem to forget we the users are the ones buying the hardware, and while developers may not give a shit or have anything invested in their software (hell, as long as they have an IDE and a compiler they're happy), we the users have time, money, and experience invested in our programs, and we do NOT care whether you think our shit isn't cool anymore; we LIKE it and we WILL keep it!

    So I would say the whole ARM vs. x86 debate is just a microcosm of the whole developers-vs-users debate, and I'm betting once again the users win over the developers. Not everyone (in fact I would say damned few) wants to get on the Apple "toss everything every couple of years" treadmill when it comes to their work and play programs. ARM can get away with that shit in its current niche because cell phones and pads are considered disposable items by the users, who frankly don't have shit invested in them. Like their cell, they just toss it and get another one, and never expect any of their programs to work.

    The reason ARM (and Linux) is doomed to fail on the desktop is that people DO expect to be able to keep their programs, and with both, unless you have compiling skills (which knocks out a good 85% of the public), you WILL take the new hotness or be SOL, because the developers only care about the new hotness, NOT whether they break everything that came before it. But the public simply won't go along with that on the desktop, thanks to Windows showing them programs CAN be run year after year, OS after OS, and it'll all "just work" for the most part.

    To me it all smacks of developer elitism, and that attitude lost before and WILL lose again. The public simply won't go along.
