AMD Confirms Commitment To x86

MrSeb writes with an excerpt from an ExtremeTech story on the recent wild speculation about AMD abandoning x86: "Recent subpar CPU launches and product cancellations have left AMD in an ugly position, but reports that the company is preparing to jettison its x86 business are greatly exaggerated and wildly off base. Yesterday, the Mercury News ran a report on AMD's struggles to reinvent itself and included this quote from company spokesperson Mike Silverman: 'We're at an inflection point. We will all need to let go of the old 'AMD versus Intel' mind-set, because it won't be about that anymore.' When we contacted Silverman, he confirmed that the original statement had been taken somewhat out of context and provided additional clarification: 'AMD is a leader in x86 microprocessor design, and we remain committed to the x86 market. Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud.' The larger truth behind Silverman's statement is that no matter what AMD does, it's not going to be 'AMD versus Intel' anymore — it's going to be AMD vs. Qualcomm, TI, Nvidia, and Intel."
Comments Filter:
  • by ackthpt ( 218170 ) on Wednesday November 30, 2011 @12:10PM (#38215118) Homepage Journal

    The larger truth behind Silverman's statement is that no matter what AMD does, it's not going to be 'AMD versus Intel' anymore — it's going to be AMD vs. Qualcomm, TI, Nvidia, and Intel."

    Considering the execution of Bulldozer, you could possibly add AMD to the vs. list.

    • Granted, Bulldozer is... painful to look at. However, I am willing to give AMD the benefit of the doubt and allow them one upgrade cycle to fix the bugs in their design before considering the competition. They claim that this design will ramp up better than the previous stuff, and others have claimed that a few software patches are needed for various OSes like Windows to take advantage of the change in architecture.

      Mind you, it does kind of feel like Intel with the Itanium (the Itanic), but thankfully this d…
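
      On the OS-patch point: a minimal sketch of the idea behind those scheduler fixes, assuming a simplified Bulldozer layout in which logical cores 2n and 2n+1 share a module (and its FPU). Pinning one FP-heavy thread per module keeps two threads from contending for the same shared floating-point hardware. Linux-specific; the core numbering here is an assumption, not queried from the real topology.

      #define _GNU_SOURCE            /* for pthread_setaffinity_np */
      #include <pthread.h>
      #include <sched.h>
      #include <stdio.h>

      /* Pin the calling thread to a single logical core. */
      static void pin_to_core(int core)
      {
          cpu_set_t set;
          CPU_ZERO(&set);
          CPU_SET(core, &set);
          pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
      }

      static void *fp_worker(void *arg)
      {
          int module = *(int *)arg;
          pin_to_core(module * 2);   /* assumed: first core of each two-core module */
          /* ...FP-heavy work would go here... */
          return NULL;
      }

      int main(void)
      {
          pthread_t t[4];
          int ids[4] = {0, 1, 2, 3};
          for (int i = 0; i < 4; i++)
              pthread_create(&t[i], NULL, fp_worker, &ids[i]);
          for (int i = 0; i < 4; i++)
              pthread_join(t[i], NULL);
          return 0;
      }

      Compile with gcc -pthread; on an 8-core FX chip this spreads four FP threads across four modules instead of letting two land on one.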

      • by AmiMoJo ( 196126 ) on Wednesday November 30, 2011 @02:22PM (#38216826) Homepage Journal

        The problems with Bulldozer are more than can be fixed by a few revisions or software patches I'm afraid.

        I honestly can't figure out what AMD was thinking. Every demanding desktop task these days makes heavy use of floating point maths. In fact that has been the case for a decade or more, going back to the P4 and Athlon eras, when they were adding more FPUs to single cores. So what does AMD do? Let's have more cores and fewer FPUs!

        I can only assume they were hoping that more of the heavy floating point computation would be handled by the GPU. Meanwhile, Intel's current generation has added new instructions that outperform GPUs in tasks like video transcoding. It breaks my heart, because I was really looking forward to Bulldozer, as I have always favoured AMD. Their sockets last much longer than Intel's (Intel seems to dream up a new one for every CPU revision), and you get all the features that Intel charges extra for, like ECC RAM support.

        I think the best thing they can do now is revise the design and release the next generation as early as possible because this one is going nowhere.

          I honestly can't figure out what AMD was thinking. Every demanding desktop task these days makes heavy use of floating point maths. In fact that has been the case for a decade or more, going back to the P4 and Athlon eras, when they were adding more FPUs to single cores. So what does AMD do? Let's have more cores and fewer FPUs!

          I thought that the standard 128-bit FPUs were independent between the modules as before. The only sharing that happens is when an AVX instruction is issued and they get merged to act as a single 256-bit unit.
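
          To make the merging concrete, here's a small sketch of the two cases from the software side, assuming a compiler with AVX support (gcc -mavx); the intrinsics are the standard ones from immintrin.h. On Bulldozer the 256-bit form occupies both of a module's 128-bit FMAC pipes at once, while the two 128-bit forms can issue independently.

          #include <immintrin.h>
          #include <stdio.h>

          int main(void)
          {
              float a[8] = {1,2,3,4,5,6,7,8}, b[8] = {8,7,6,5,4,3,2,1}, c[8];

              /* Two 128-bit SSE additions: each can go to its own FMAC pipe. */
              __m128 lo = _mm_add_ps(_mm_loadu_ps(a),     _mm_loadu_ps(b));
              __m128 hi = _mm_add_ps(_mm_loadu_ps(a + 4), _mm_loadu_ps(b + 4));
              _mm_storeu_ps(c,     lo);
              _mm_storeu_ps(c + 4, hi);

              /* One 256-bit AVX addition: on Bulldozer this ties up both pipes. */
              __m256 v = _mm256_add_ps(_mm256_loadu_ps(a), _mm256_loadu_ps(b));
              _mm256_storeu_ps(c, v);

              printf("%f\n", c[0]);
              return 0;
          }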

        • Comment removed based on user account deletion
      • by Kjella ( 173770 )

        Unfortunately, the 2nd-gen Bulldozer chips promise no more performance improvement than Ivy Bridge does. Intel's internal documents are leaking [hwbot.org]: expect a 10-15% performance gain, 20% lower power consumption (95W -> 77W), and an HD 4000 about 50% faster than the HD 3000, which will take another chunk of the discrete graphics market. Their 22nm 3D transistors are a real kick in the nuts for AMD; it looks like the Core equivalent of die shrinks. Not exactly the competition they needed right now.

  • by LordNimon ( 85072 ) on Wednesday November 30, 2011 @12:27PM (#38215282)

    AMD is a leader in x86 microprocessor design, and we remain committed to the x86 market. Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud.

    This is a completely meaningless statement. You could say the same exact thing about any microprocessor company. For example:

    "Freescale is a leader in PowerPC microprocessor design, and we remain committed to the PowerPC market. Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud."

    This statement is true even though AMD and Freescale aren't competitors.

    This is the kind of garbage that makes employees think that their managers are clueless and don't know how to fix the company.

    • Re: (Score:2, Funny)

      by Anonymous Coward
      "PRUNEJUICE is a leader in DRINK YO PRUNE JUICE design, and we remain committed to the DRINK YO PRUNE JUICE market. Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud."
      • by nitehawk214 ( 222219 ) on Wednesday November 30, 2011 @03:59PM (#38218000)

        "PRUNEJUICE is a leader in DRINK YO PRUNE JUICE design, and we remain committed to the DRINK YO PRUNE JUICE market. Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud."

        A warrior's drink.

    • by Skarecrow77 ( 1714214 ) on Wednesday November 30, 2011 @12:30PM (#38215336)

      Meaningless marketing spin is the only kind of public statement that:
      A. doesn't cause controversy and anger among the investors/stockholders, and
      B. you aren't forced to walk back 12 months down the line when you find out you were too optimistic and/or out of touch.

      • Re: (Score:3, Insightful)

        by Hatta ( 162192 )

        Meaningless marketing spin should cause controversy and anger among the stockholders. If I'm investing my money in a company, I want to know they have real plans, not just platitudes. Buzzwords are a sign that they have no idea what they're doing. Take your money and run.

        • That's what board meetings are for, not conversations with the press.

        • by epine ( 68316 )

          Take your money and run.

          Stores have figured out how to dump the unprofitable customers (long live data mining). Now they've figured out how to dump the skittish investors. You weren't wanted in the first place. Actually, the game was played this way all along.

          Reason is a short leash. The receiving side will take blind faith any time they can get it.

        • by Shatrat ( 855151 )
          Reporters deserve meaningless drivel, because if you give them any real information they'll just butcher it, twisting it into a cure for cancer or the latest iPhone exploit to grab headlines.
          I'd look for firm dates and roadmaps provided to customers, partners, and investor relations as a sign that the company knows what they're doing.
    • by arth1 ( 260657 ) on Wednesday November 30, 2011 @12:51PM (#38215602) Homepage Journal

      This is a completely meaningless statement. You could say the same exact thing about any microprocessor company. For example:

      "Freescale is a leader in PowerPC microprocessor design, and we remain committed to the PowerPC market. Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud."

      This statement is true even though AMD and Freescale aren't competitors.

      Freescale commits to the cloud? That is BIG news. Time to run out and adjust my stock portfolio!

    • It's not completely meaningless, but you have to know how to read this kind of business-speak. He's saying, "We're either not planning on dropping x86 or we're not prepared to announce that we're dropping it. We do have hopes of moving into additional markets, including mobile, low-end, and scalable architectures, but we don't have any specific plans that we'd like to announce about that either."
  • In the past, I always advocated for, and deployed, AMD-based systems. I was once burned by my advocacy when I lost several AMD mobos after they all got fried!

    This contributed to my getting fired, though a poorly written application was partly responsible. My employer wouldn't listen, because other AMD systems had survived; they survived only because they were slated to run the application next.

    What is the experience of slashdotters using these systems? Do they still consume lots of power or overheat?

    • by 0123456 ( 636235 )

      AMD chips seem to consume more power than comparable Intel chips, but I'm pretty sure they have thermal throttling these days.

      I was impressed with one of the old P4 systems in my previous job because the fan was just lying on top of the heat-sink, and every once in a while someone would knock it off and the CPU would just throttle down until someone got around to putting it back (yeah, I don't know why we never spent a few dollars to buy a fan that could actually be screwed into place). In those days an AMD chip would have just cooked itself.

      • by yuhong ( 1378501 )

        I remember reading an old Tom's Hardware article dating back to the Athlon XP days about this.

    • That problem only affected earlier chips with inadequate cooling; the stock cooler made certain assumptions with regard to "typical" usage. They fixed this long ago (has it been a decade?) by adding temp sensors and automatic clock throttling.
      • It probably has been. Cool 'n' Quiet has been around for nearly a decade, so I'd bet the temp sensors and throttling were added at about the same time. IIRC that was also about when Intel introduced similar technology.
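
        Those sensors are easy to peek at today. Here's a minimal sketch that reads the Linux thermal sysfs interface; the zone path is an assumption (on many systems it's thermal_zone0; check /sys/class/thermal on your own box):

        #include <stdio.h>

        int main(void)
        {
            /* The kernel reports the zone temperature in millidegrees C. */
            FILE *f = fopen("/sys/class/thermal/thermal_zone0/temp", "r");
            if (!f) {
                perror("open thermal zone");
                return 1;
            }
            long milli;
            if (fscanf(f, "%ld", &milli) == 1)
                printf("CPU temperature: %.1f C\n", milli / 1000.0);
            fclose(f);
            return 0;
        }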

  • by 93 Escort Wagon ( 326346 ) on Wednesday November 30, 2011 @12:35PM (#38215408)

    Let's say AMD is planning - or thinking about, at least - stopping the manufacture of x86 processors. What's a responsible company spokesperson going to say? "Yes, we're working on an exit strategy and are hoping to be out of the business by 2014" - does anyone believe that would be stated? If it was, their x86 business would tank immediately, and all employees working on x86 now would update their resumes and get while the getting is good.

    Several years ago, we had an important faculty member accept a dean-ship at another university. The lead time was going to be a bit more than a year. In the meantime, this faculty member still had research projects going full bore. So what did he do? He told his staff that the research projects were going to continue, and would remain at our university for the foreseeable future. Guess what happened a year later? Yup - the "foreseeable future" he spoke of 12 months before turned out to be almost exactly 12 months long.

    • Or it could just as easily be someone floating a trial balloon: a rumor is reported through various sources, and AMD gets a preview of how the market might react. Depending on the reaction, they might go one way or the other.

    • by tlhIngan ( 30335 ) <slashdot.worf@net> on Wednesday November 30, 2011 @02:19PM (#38216800)

      Let's say AMD is planning - or thinking about, at least - stopping the manufacture of x86 processors. What's a responsible company spokesperson going to say? "Yes, we're working on an exit strategy and are hoping to be out of the business by 2014" - does anyone believe that would be stated? If it was, their x86 business would tank immediately, and all employees working on x86 now would update their resumes and get while the getting is good.

      I'd be willing to bet that one of AMD's investors is Intel, and while AMD may want to get rid of the x86 business, Intel won't let it.

      Intel needs AMD. And AMD's weakened state is ideal for Intel. However, if AMD dies, Intel also suffers (think anti-trust). But with AMD alive, Intel's scrutiny is lowered and they can sell more chips easily.

      Heck, I'm willing to bet Intel has next-gen chips ready, but they want to keep AMD viable and are holding off the release. There's no benefit to Intel other than a few percent of market share if AMD dies, and there's a huge downside: EU regulators, US regulators, and very close scrutiny.

      • AMD is already using next-gen chips. They would love for x86 compatibility to no longer be a line item for those chips, because it represents significant wasted silicon. I don't think we're there yet, though, and I don't think AMD thinks so either. Only when Windows XP is gone, and the machines that run it along with it, will we truly be ready to move Windows to the 64-bit era. From what I can tell, most of the machines that have come with it are 64-bit anyway, whether they came with 64-bit Windows or not, b…
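
        On that last point: a small sketch of how a 32-bit program can check at runtime whether it is sitting on 64-bit Windows (i.e. the hardware is x86-64 even if the install isn't). IsWow64Process is the standard Win32 call for this; on very old Windows versions you'd look it up with GetProcAddress first.

        #include <windows.h>
        #include <stdio.h>

        int main(void)
        {
            BOOL wow64 = FALSE;
            /* TRUE => this 32-bit process is running under 64-bit Windows. */
            IsWow64Process(GetCurrentProcess(), &wow64);
            printf(wow64 ? "32-bit process on 64-bit Windows\n"
                         : "native process (32-bit OS, or this is a 64-bit build)\n");
            return 0;
        }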

  • Translation (Score:5, Interesting)

    by DarthVain ( 724186 ) on Wednesday November 30, 2011 @12:38PM (#38215442)

    "Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud."

    We will continue to make chips for servers and low-end crap. We can't compete with Intel for the consumer market in the short to medium term; however, we are still relevant in business circles.

    Consumers, prepare to be gouged by Intel as soon as they figure this out. Also, other than to just "say it", this has been the truth for some time, years in fact. I don't know if it is AMD stumbling or Intel just continuing to hit home runs, but there hasn't exactly been a whole lot of competition since the days of ye olde Athlon 64 series of processors. Ever since Intel came out with the Core 2 Duo, AMD has been unable to come up with an answer. Perhaps it had something to do with diversifying by buying up ATI, diverting capital or focus away from the core business. Ironically, the AMD/ATI brand of video cards has a better reputation than the AMD CPU division, if only in my opinion...

    • We will continue to make chips for servers and low-end crap. We can't compete with Intel for the consumer market in the short to medium term; however, we are still relevant in business circles.

      Consumers, prepare to be gouged by Intel as soon as they figure this out.

      Intel really can't gouge customers too hard, or it will hasten the transition away from x86 that they fear. ARM will be a much more serious competitor once Windows 8 is released with support for it. Yes, it requires everything to be recompiled…
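
      A tiny illustration of why the recompile is unavoidable: the same C source produces entirely different machine code per target, so an x86 binary simply is not ARM machine code. The preprocessor macros below are the real ones GCC defines; cross-compiling with something like arm-linux-gnueabi-gcc yields a binary no x86 chip can execute.

      #include <stdio.h>

      int main(void)
      {
      #if defined(__x86_64__) || defined(__i386__)
          puts("Built for x86: runs on Intel/AMD, not on ARM.");
      #elif defined(__arm__) || defined(__aarch64__)
          puts("Built for ARM: runs on ARM, not on x86.");
      #else
          puts("Built for some other architecture.");
      #endif
          return 0;
      }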

      • by 0123456 ( 636235 )

        Indeed. Intel's real competition hasn't been AMD for a few years now, it's been ARM.

        • Let Intel turn its full attention to ARM for a few cycles, and see if AMD doesn't punish them.

          • by 0123456 ( 636235 )

            Let Intel turn its full attention to ARM for a few cycles, and see if AMD doesn't punish them.

            Intel don't need to, because they're big enough to have different teams doing both. The problem is that no-one can really push the x86 architecture down to ARM-level power consumption because it's such a complex beast in comparison.

        • by Andy Dodd ( 701 )

          I wouldn't be surprised if Intel is REALLY regretting selling off XScale to Marvell - Intel had an ARM business for a while, but it just didn't do particularly well, so they sold it.

          Probably 1-2 years later, the ARM market started exploding.

          I would not be surprised if Intel is quietly working on getting back into the ARM business.

      • Intel is pushing forward because it's beneficial to them at the moment not to rest on their laurels.
        AMD is underperforming, yes, but not so much that Intel is given any real leeway to slack off;
        That is to say, if the i5/i7 lines were only a 5% increase over C2D performance for 1/3 higher price, AMD would have destroyed them, so while AMD hasn't been "real" competition for Intel for quite some time now, they've been good enough to keep the industry trudging along.

        If AMD outright left the market, there would…

        • by Shatrat ( 855151 )
          There would still be an incentive to keep performance/price increasing enough to keep the upgrade cycle going, it just wouldn't be as strong as if they had someone else pushing that same metric in competition. They could also control exactly what rate it grows at, and hold back their outstanding new designs until demand tapered off, then release them in order to obsolete the last generation and start the next wave of upgrades.
      • by Andy Dodd ( 701 )

        This is pretty much what happened in the P4 days. Intel got complacent and started gouging customers, and that allowed AMD to gain HUGE amounts of market share.

        • Re: (Score:2, Informative)

          AMD SHOULD have gained huge amounts of market share, but thanks to some anti-competitive behavior by Intel that wasn't the case.
      • Yeah "gouge" might be too strong a word. Elevated prices due to limited competition is likely a better way to put it. At best when AMD was even at its height and enthusiasts argued hotly which was better, AMD only had a marginal market share, mostly due to the big box stores such as Dell, Gateway (remember them, whatever happened to them), and the rest being reluctant to move away from Intel (I also recall some shady trade practices by Intel at the time also).

        In any case, even though limited, the competition…

  • My understanding is that Radeon cards are still competing neck-and-neck with Nvidia's offerings these days, especially per-dollar. I may be mistaken, though, as my video card is still an 18-month-old ATI Radeon 5850 (back before Nvidia even had a DirectX 11 card on the market, and before the AMD-ATI buyout), which can still play everything I've thrown at it on full settings at 1920x1080.

    Even if their CPUs are lack-luster (even at the lower price point, it would seem, where they used to be quite competitive)…

    • Until ATI has good Linux support I'll only be buying NVidia. Actually I've generally bought AMD CPUs and NVidia GPUs, but if AMD starts slipping then I'll be forced to go Intel.
    • I'm a Radeon guy too... always buy AMD/Radeon, but even I have to admit: Radeon cards have a lot of problems that NVidia cards just don't have. You go to nearly every major game release's support forums, and what's stickied at the top? "Radeon owners' issues click here"

      Add to that Nvidia's clearly superior support for hardware-accelerated HD decoding, and really, my favorite card has some catching up to do. I spent months trying to get a Radeon card to work in my HTPC, and I think I got the hardware decoding to…
      • by jandrese ( 485 )
        It has always been the case that AMD/ATI has generally had the better hardware (although it does shift whenever new chips drop), but nVidia has the better drivers. People will occasionally tell you that "yeah, that was true in the past, but ATI drivers are pretty good now", but then some new game comes out and they're crap again.
        • by Andy Dodd ( 701 )

          Yup, I have yet to ever see evidence that ATI has learned the concept of regression testing.

          It seems like on a regular basis, Game X needs driver revision M or lower, and Game Y needs driver revision N or higher with ATI cards. So you're screwed if you want to play both games.

          Every time I have had the misfortune of dealing with an ATI video chipset, it's been utter driver hell. NVidia does a much better job of regression testing, and they also do a MUCH better job of long term support of older chipsets.

          • Let us also not forget that AMD will not give you mobile drivers, which used to also be true of nVidia, but not any more. Now you can download the Quadro FX drivers (for example) direct from nVidia. But AMD still expects you to get mobile graphics drivers from the OEM. That's a convenient way to avoid supporting older graphics chips like the integrated graphics in my R690M-based "netbook" (at 11" with megachiclets it's more like a subnotebook) which only works properly in Windows Vista 32-bit. There's no Wi…

    • Hmm. AMD has been trying to topple Intel by merging the CPU and the GPU into a single unit.

      Intel tends to be better at single-threaded CPU performance, while AMD has been better at offering more cores. Where ATI comes in is graphics: Intel's graphics options are something of a terrible joke (played on corporate and value customers), while ATI's video cards are sought after just as much as Nvidia's.

      If AMD can offer a single chip that does both, and does it well (key factor here), with compilers that…

      • by 0123456 ( 636235 ) on Wednesday November 30, 2011 @01:42PM (#38216264)

        If AMD can offer a single chip that does both, and does it well (key factor here)

        You can't put a 300W GPU and a 125W CPU on the same die. At least not if you're sane.

        The only use for graphics integrated on the CPU is for cheap low-end systems, or for extra performance if you can offload some processing to the GPU cores. Putting a high-performance GPU there makes no sense, because you need insane cooling to get the heat out and it will be crippled by the slow, shared memory interface anyway.

        Plus, of course, you can't just upgrade the GPU in two years when the CPU is still fast enough for current games but the GPU isn't; you have to replace both. CPU manufacturers might love that, but users won't.
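
        On the offload point: a minimal sketch of how software picks such a GPU target through OpenCL, the API AMD was pushing for exactly this CPU+GPU split. Error handling is trimmed; link with -lOpenCL.

        #include <CL/cl.h>
        #include <stdio.h>

        int main(void)
        {
            cl_platform_id platform;
            clGetPlatformIDs(1, &platform, NULL);

            cl_device_id dev;
            char name[256];

            /* Prefer a GPU for FP-heavy work; fall back to the CPU if none. */
            if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &dev, NULL) != CL_SUCCESS)
                clGetDeviceIDs(platform, CL_DEVICE_TYPE_CPU, 1, &dev, NULL);

            clGetDeviceInfo(dev, CL_DEVICE_NAME, sizeof(name), name, NULL);
            printf("Offload target: %s\n", name);
            return 0;
        }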

        • by Andy Dodd ( 701 )

          It doesn't help that Intel integrated graphics made GREAT leaps forward in Sandy Bridge.

    • The AMD buyout of ATI happened in 2006. Your 5850 was made by AMD.

  • I just upgraded my PC from an Intel E6600 to an AMD Phenom II X6 1100T. I chose AMD for one reason: how the heat sink/fan attaches to the motherboard.

    I have dogs and kids, and my PC doesn't reside protected under a desk. It gets bumped all the time from them playing, and those stupid plastic plug brackets that Intel uses to attach the heat sink and fan to the motherboard were absolute garbage. Someone would bump my PC and the heat sink would hang off and cause the CPU to overheat. Not to mention after re…

    • Why not just buy a third-party heatsink that comes with its own backplate? You don't have to use the default mounting hardware.
    • Comment removed based on user account deletion
  • by Vyse of Arcadia ( 1220278 ) on Wednesday November 30, 2011 @12:52PM (#38215620)

    When we contacted Silverman, he confirmed that the original statement has been taken somewhat out of context and provided additional clarification. 'AMD is a leader in x86 microprocessor design, and we remain committed to the x86 market. Our strategy is to buzzwords buzzwords buzzwords buzzwords buzzwords buzzwords.'

  • I love you guys, but recently I have only been buying Intel i5 and i7 because your math coprocessor still stinks badly compared to Intel's. For video compression and really heavy maths, I really wanted to use your 6-core processors, but they were slower than the 4-core i7 I bought instead.

    Give me a 6-core that runs like a raped ape and has a really good math coprocessor and I'll be back. Give me an 8-core that can also do multi-chip on the same motherboard so I can build a 16-core for a cheap price, and I'll…

  • Shame... (Score:4, Insightful)

    by Oswald McWeany ( 2428506 ) on Wednesday November 30, 2011 @01:27PM (#38216056)

    Shame- I usually support the underdog- and always wanted AMD to be able to run Intel neck-and-neck.

    Nowadays, though, AMD seems to stand for A Mediocre Design.

    I hope they can recapture their mojo and challenge Intel again, if for no other reason than to give Intel an incentive to keep its prices down.

    • by Shatrat ( 855151 )
      I think the problem is that they are competing with Intel to beat them in massively parallel and IO-limited applications. This doesn't look good on your average review site benchmarking the processor in Crysis or some PC-oriented synthetic benchmarks.
      I think they're still going to be very popular for servers and supercomputers for a long time.
    • I support the underdog too!

      I can remember the days when Intel had the whole market to themselves; the 486 CPU was over $1,000, worth more than its weight in gold. Then along came Cyrix, who started making a cheaper alternative, and the price soon dropped to less than $200 per CPU. The manufacturers were still making a profit and the consumers were better off.

      As long as the performance is not too far off the mark I will continue to buy AMD.

  • China needs a processor company, and even without AMD being leading-edge, if their products are sold inexpensively there is a huge potential market worldwide.

  • Comment removed based on user account deletion
    • The Bulldozer release showed AMD's commitment to low-end computers.

      In what way? The Bulldozer architecture is transistor-heavy and uses lots of power, just the opposite of what you want in a low-end computer.

      If anything shows AMD's commitment to low-end x86 computers, it's Bobcat.
