
AMD Confirms Commitment To x86

Posted by Unknown Lamer
from the amd-versus-world dept.
MrSeb writes with an excerpt from an Extreme Tech story on the recent wild speculation about AMD abandoning x86: "Recent subpar CPU launches and product cancellations have left AMD in an ugly position, but reports that the company is preparing to jettison its x86 business are greatly exaggerated and wildly off base. Yesterday, Mercury News ran a report on AMD's struggles to reinvent itself and included this quote from company spokesperson Mike Silverman: 'We're at an inflection point. We will all need to let go of the old 'AMD versus Intel' mind-set, because it won't be about that anymore.' When we contacted Silverman, he confirmed that the original statement has been taken somewhat out of context and provided additional clarification. 'AMD is a leader in x86 microprocessor design, and we remain committed to the x86 market. Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud.' The larger truth behind Silverman's statement is that no matter what AMD does, it's not going to be 'AMD versus Intel' anymore — it's going to be AMD vs. Qualcomm, TI, Nvidia, and Intel."

  • by LordNimon (85072) on Wednesday November 30, 2011 @11:27AM (#38215282)

    AMD is a leader in x86 microprocessor design, and we remain committed to the x86 market. Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud.

    This is a completely meaningless statement. You could say the same exact thing about any microprocessor company. For example:

    "Freescale is a leader in PowerPC microprocessor design, and we remain committed to the PowerPC market. Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud."

    This statement is true even though AMD and Freescale aren't competitors.

    This is the kind of garbage that makes employees think that their managers are clueless and don't know how to fix the company.

  • by ackthpt (218170) on Wednesday November 30, 2011 @11:30AM (#38215330) Homepage Journal

    that had to google "inflection point"? From a marketing standpoint it might be good to have a CEO who isn't an engineer :P.

    or a CEO who picks up a word or phrase from an engineer and thinks, 'Hey, that sounds good, I'll use it in my next meeting or press statement!'

  • by Skarecrow77 (1714214) on Wednesday November 30, 2011 @11:30AM (#38215336)

    Meaningless marketing spin is the only kind of public statement that:
    A. doesn't cause controversy and anger among the investors/stockholders
    B. you aren't forced to walk back 12 months down the line when you find out you were too optimistic and/or out of touch.

  • by 93 Escort Wagon (326346) on Wednesday November 30, 2011 @11:35AM (#38215408)

    Let's say AMD is planning - or at least thinking about - stopping the manufacture of x86 processors. What's a responsible company spokesperson going to say? "Yes, we're working on an exit strategy and hope to be out of the business by 2014"? Does anyone believe that would ever be stated? If it were, their x86 business would tank immediately, and every employee working on x86 would update their resume and get out while the getting is good.

    Several years ago, we had an important faculty member accept a dean-ship at another university. The lead time was going to be a bit more than a year. In the meantime, this faculty member still had research projects going full bore. So what did he do? He told his staff that the research projects were going to continue, and would remain at our university for the foreseeable future. Guess what happened a year later? Yup - the "foreseeable future" he spoke of 12 months before turned out to be almost exactly 12 months long.

  • by arth1 (260657) on Wednesday November 30, 2011 @11:44AM (#38215500) Homepage Journal

    Indeed. Which is why words and phrases like "pushing the envelope" and "quantum leap" are so often used incorrectly, marking the CEO (who reflects on the company) as a dummy.

  • by Hatta (162192) on Wednesday November 30, 2011 @12:21PM (#38215988) Journal

    Meaningless marketing spin should cause controversy and anger among the stockholders. If I'm investing my money in a company, I want to know they have real plans, not just platitudes. Buzzwords are a sign that they have no idea what they're doing. Take your money and run.

  • Shame... (Score:4, Insightful)

    by Oswald McWeany (2428506) on Wednesday November 30, 2011 @12:27PM (#38216056)

    Shame - I usually support the underdog, and I always wanted AMD to be able to run neck-and-neck with Intel.

    Nowadays, though, AMD seems to stand for A Mediocre Design.

    I hope they can recapture their mojo and challenge Intel again - if for no other reason than to give Intel an incentive to lower its prices.

  • Re:x86 (Score:5, Insightful)

    by TheRaven64 (641858) on Wednesday November 30, 2011 @12:30PM (#38216114) Journal

    When the Pentium 4 came out, it was frequently called the "7th generation", but it was never called the 786 or 80786, either formally or informally

    But they are all x86 compatible, because they can all run code compiled for 8086, 80186, 80286, 80386 and 486 processors.

    My new hobby will be referring to processors as having x87 architecture, as a distinction to indicate they support floating point instructions.

    People do refer to x87 when talking about the FPU on x86 chips. It's commonly used when differentiating it from SSE - modern compilers will emit SSE instructions instead of x87 ones unless you specify a backwards compatible target architecture (PII or earlier).
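    The x87-versus-SSE split described above is purely a code-generation choice: the same C source can be compiled to either instruction set. A minimal sketch (the gcc flags named in the comments are real, but the filename and invocations are illustrative):

    ```c
    #include <stdio.h>

    /* The same C source below can target either floating-point unit.
     * With gcc on x86, the choice is made at compile time, e.g.:
     *
     *   gcc -m32 -mfpmath=387 dot.c        # legacy x87 stack-based FPU
     *   gcc -m32 -msse2 -mfpmath=sse dot.c # scalar SSE2 (the default on x86-64)
     *
     * Targeting a pre-SSE CPU (e.g. -march=pentium2) forces x87 output.
     */

    /* Plain dot product; the generated FP instructions depend on the flags. */
    static double dot(const double *a, const double *b, int n) {
        double acc = 0.0;
        for (int i = 0; i < n; i++)
            acc += a[i] * b[i];
        return acc;
    }

    int main(void) {
        double a[] = {1.0, 2.0, 3.0};
        double b[] = {4.0, 5.0, 6.0};
        /* 1*4 + 2*5 + 3*6 = 32 */
        printf("%g\n", dot(a, b, 3));
        return 0;
    }
    ```

    One practical consequence of the difference: x87 registers are 80 bits wide, so intermediate results computed on the x87 stack can carry extra precision and produce slightly different rounding than the 64-bit SSE path, which is one reason compilers let you pin the choice down explicitly.
    
    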

  • by 0123456 (636235) on Wednesday November 30, 2011 @12:42PM (#38216264)

    If AMD can offer a single chip that does both, and does it well (key factor here)

    You can't put a 300W GPU and a 125W CPU on the same die. At least not if you're sane.

    The only uses for graphics integrated on the CPU are cheap low-end systems, or extra performance if you can offload some processing to the GPU cores. Putting a high-performance GPU there makes no sense: you need insane cooling to get the heat out, and it would be crippled by the slow, shared memory interface anyway.

    Plus, of course, you can't just upgrade the GPU in two years when the CPU is still fast enough for current games but the GPU isn't; you have to replace both. CPU manufacturers might love that, but users won't.

  • by AmiMoJo (196126) <{ten.3dlrow} {ta} {ojom}> on Wednesday November 30, 2011 @01:22PM (#38216826) Homepage

    The problems with Bulldozer are more than can be fixed by a few revisions or software patches I'm afraid.

    I honestly can't figure out what AMD was thinking. Every demanding desktop task these days makes heavy use of floating-point maths. In fact that has been the case for a decade or more, going back to the P4 and Athlon eras, when more FPU resources were being added to single cores. So what does AMD do? Let's have more cores and fewer FPUs!

    I can only assume they were hoping that more of the heavy floating-point computation would be handled by the GPU. Meanwhile, Intel's current generation has added new instructions that outperform GPUs in tasks like video transcoding. It breaks my heart, because I was really looking forward to Bulldozer, as I have always favoured AMD. Their sockets last much longer than Intel's, who seem to dream up a new one for every CPU revision, and you get all the features that Intel charges extra for, like ECC RAM support.

    I think the best thing they can do now is revise the design and release the next generation as early as possible because this one is going nowhere.

  • by TheRaven64 (641858) on Wednesday November 30, 2011 @02:14PM (#38217448) Journal
    Not really. AMD has really crap proprietary drivers, nVidia has slightly crap proprietary drivers. AMD's open source drivers are poor, nVidia's are nonexistent. If you're willing to run a blob, nVidia's support is better. If you aren't, they both suck.
