
AMD Preparing To Give Intel a Run For Its Money 345

Posted by Soulskill
from the saddle-up dept.
jfruh writes: "AMD has never been able to match Intel for profits or scale, but a decade ago it was in front on innovation: the first to 1GHz, the first to 64-bit, the first to dual core. A lack of capital has kept the company barely holding on with cheap mid-range chips since then; but now AMD is flush with cash from its profitable gaming-console business, and is preparing an ambitious new architecture for 2016, one distinct from the x86/ARM hybrid already announced."
This discussion has been archived. No new comments can be posted.


  • Buh? (Score:5, Interesting)

    by drinkypoo (153816) <martin.espinoza@gmail.com> on Friday May 16, 2014 @03:18PM (#47020151) Homepage Journal

    But the real fight of a decade ago, when AMD was first to 1GHz, the first to 64-bit, the first to dual core, seemed missing. It's not surprising since the company was facing a real threat to its survival. But with a gravy train from the gaming consoles, it looks like the company is ready for a fresh battle, with a familiar face at the helm.

    Uh, wait. No. It was surprising when AMD was the performance leader. It was surprising because they were broke. It's not surprising to see AMD pushing out a new architecture now that they have money. It takes a lot of money to do that. So we start out completely ass-backwards here.

    Much elided, then

    The most logical move for Keller would be to dump the CMT design in favor of a design with simultaneous multi-threading (SMT), which is what Intel does (and IBM's Power and Oracle's Sparc line).

    Wait, what? Why? Why wouldn't it make more sense to just fix the lack of FP performance, perhaps by adding more FP units? Why would it make more sense for them to go to a completely different design? It might well, but there is no supporting evidence for that in the article.

    • It wasn't that surprising that AMD was king around 2003-2004, the problem was that Intel was playing very dirty, signing deals with OEMs like Dell to specifically NOT use AMD chips. The fines Intel got from the EU are never going to do as much to help AMD as actually gaining more profits during that period (and who knows, they may not have sold their mobile Radeon group to Qualcomm in an effort to raise cash). It's the domino effect of unknowns that hurts the most.
    • Re:Buh? (Score:4, Insightful)

      by jandrese (485) <kensama@vt.edu> on Friday May 16, 2014 @03:38PM (#47020363) Homepage Journal
      AMD was dominant while Intel was chasing dead ends (Netburst and Itanium). Once Intel woke up and started working on sane chip designs again AMD's goose was cooked. They just can't compete with Intel's R&D budget. Plus, AMD made some boneheaded decisions of their own, like firing a bunch of their R&D staff in the belief that computer automated chip layout would prove superior to human designed layouts.
      • by drinkypoo (153816)

        AMD was dominant while Intel was chasing dead ends (Netburst and Itanium). Once Intel woke up and started working on sane chip designs again AMD's goose was cooked. They just can't compete with Intel's R&D budget.

        Well, that was my point: AMD can afford to have an R&D budget right now. But you're right, Intel spent a lot of time dicking around with nonsensical architectures, time it might well have spent crushing AMD sooner. On the flip side of that, though, is the question of whether they could have actually been more effective. Too many cooks, and all that. Spending more money doesn't necessarily result in getting where you want to go sooner. You tend to go somewhere, but not necessarily in your chosen direction.

      • To be fair to Intel (Score:4, Interesting)

        by Sycraft-fu (314770) on Friday May 16, 2014 @04:39PM (#47021025)

        Netburst did seem like a reasonable idea, in testing. While it was low IPC, it looked like it would scale bigtime in the speed area. They had test ALUs running at 10GHz.

        So I can see the logic: You make an architecture that can scale to high frequencies easily, and that gets you the speed.

        Obviously it didn't scale, and wasn't a good idea, but I can see what they were going for. It wasn't like it was completely nuts.
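Netburst's bet reduces to simple arithmetic: delivered throughput is roughly IPC × clock, and the design traded the first factor for the second. A toy sketch of that logic (all numbers invented for illustration, not real measurements):

```python
# Throughput ~ IPC * clock. Netburst accepted lower IPC, betting the
# clock would scale far enough to more than make up for it.
def throughput(ipc: float, clock_ghz: float) -> float:
    """Rough instructions-per-second estimate (illustrative only)."""
    return ipc * clock_ghz * 1e9

netburst_style = throughput(ipc=0.9, clock_ghz=3.8)  # low IPC, high clock
k8_style = throughput(ipc=1.6, clock_ghz=2.4)        # higher IPC, lower clock

# The bet fails once the clock stops scaling: the low-IPC design loses.
print(netburst_style < k8_style)  # True
```

Had the clock actually scaled toward the 10GHz those test ALUs hinted at, the low-IPC design would have won easily; the failure was that the clock term hit a power wall instead.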

        • by Grishnakh (216268)

          Well, they completely forgot that higher clock frequencies translate to higher power consumption and higher thermal dissipation, and you can only remove heat from a chip in a consumer computer so fast, since resorting to submerging it in coolant obviously isn't feasible for a consumer device. I remember some Intel presentations (I worked there at the time) where they were proudly bragging about how much power these chips consumed. Did they not think people might not like their computers generating that much heat?

  • by Jaborandy (96182) on Friday May 16, 2014 @03:19PM (#47020175)
    I was so proud of them when they kicked IA64's ass with their amd64 architecture, beating Intel at their own game by choosing to be x86-compatible when even Intel didn't go that way. Then I was sad when amd64 started getting called x64, since it stripped AMD of the credit they deserved. Go AMD! A world without strong competition for Intel would be very bad for consumers.
    • by jandrese (485)
      I have to admit, I found the AMD64 moniker a little confusing when I first read it. I had to Google around to make sure my Core 2 chip would support it and that it wasn't some AMD proprietary extension on top of the 64-bit extension. x86_64 is less confusing, even if it is more awkward to type out.
  • Drivers? (Score:5, Insightful)

    by Bigbutt (65939) on Friday May 16, 2014 @03:19PM (#47020179) Homepage Journal

    Honestly they need a better team writing the drivers. You can have the best CPU/GPU in the industry but if the drivers suck, no one will want to buy them.

    [John]

    • I've had plenty of issues with AMD video drivers, though those problems seem to be behind them now. But is there really a problem with their CPU/chipset drivers?
      • Not really, as far as I know. The CPU power management drivers and chipset drivers work well.
    • This is incredibly true and has been for over a decade. They don't seem to realize that they have better hardware for the same price, yet people refuse to buy it because it's hard to appreciate the advantage without proper drivers.
      I'm sure the FLOSS crowd would also start embracing AMD if they shipped decent OSS drivers like Intel does.

  • by Baldrson (78598) * on Friday May 16, 2014 @03:19PM (#47020181) Homepage Journal

    This was their opportunity to dominate the CPU market with the Mill CPU architecture [millcomputing.com] and they blew it.

  • wrong (Score:4, Interesting)

    by Charliemopps (1157495) on Friday May 16, 2014 @03:27PM (#47020255)

    Sorry AMD, you're heading in the completely wrong direction. CPUs are already plenty fast. They have been for years. 3D gaming is starting to look like just another "Gold plated speaker wire" guy hobby as everyone moves to mobile devices.

    The real winners in the future are going to be the very cheap, very efficient chips. Do you want one very powerful computer to run everything in your house? Or do you want everything in your house to have its own dedicated, highly efficient CPU that does just what that device needs?

    • by Bruinwar (1034968)
      Why do I still wait if my CPU is "plenty fast"? My dream is to some day work on a machine that waits for my input rather than me always waiting for it. That means faster everything, including the CPU.
    • by mlts (1038732)

      What AMD should consider are FPGAs and different power cores on the same die. This isn't anything new, but done right, it can go a long way in the server room.

      The FPGAs can be used for almost anything. Need a virtual CPU for AES array shifting? Got it. Need something specialized for FFT work? Easily done. Different power-utilization cores would be ideal for a server room where most hosts see peak load during the day and then sit idle after quitting time.

      • Anything with an FPGA is always going to be in a niche market.
        It won't let you multi-task, since you can't reprogram it whenever you context switch.

        Users don't want messages saying they need to wait for X to finish before they start Y.

        They're also very expensive because they use a lot of silicon, and they consume a lot of power too.

    • by Loki_1929 (550940)

      CPUs are already plenty fast. They have been for years.

      Incorrect. CPUs are plenty fast, and have been for years, for many common tasks. The fact is that they aren't nearly fast enough (particularly for single-threaded work) and almost certainly won't be for another decade or more. There's a limit to what and how much you can multi-thread, and even then you're still limited by single-thread performance × number of threads.

      So yes, for grandma playing Blackjack on Yahoo, today's CPUs are plenty fast. For me and many others? The fastest stuff available is 100
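The limit the parent is pointing at is Amdahl's law: the serial fraction of a workload caps the speedup no matter how many threads you add. A minimal sketch (function name is mine):

```python
def amdahl_speedup(serial_fraction: float, threads: int) -> float:
    """Upper bound on speedup when only (1 - serial_fraction) parallelizes."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / threads)

# Even with 90% of the work parallelized, 16 threads top out well short of 16x:
print(round(amdahl_speedup(0.10, 16), 2))  # 6.4

# No thread count escapes the serial cap; the limit is 1 / 0.10 = 10x:
print(round(amdahl_speedup(0.10, 1_000_000), 2))  # 10.0
```

This is why faster single-thread performance still matters: it shrinks the serial term that every extra core is stuck waiting on.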

    • by jbolden (176878)

      I don't think CPUs are plenty fast at all. The stagnation in CPU speeds over the last 15 years has been dreadful for PC applications and the industry. Prices of computers have fallen, but software isn't much more capable than it was in 2000. If we had seen growth in CPUs like 1985-2000 during the 2000-2015 period, it would be awe-inspiring how terrific our machines would be today.

    • by Solandri (704621)

      3D gaming is starting to look like just another "Gold plated speaker wire" guy hobby as everyone moves to mobile devices.

      Only for the short term. In a 25-year timeframe, the tech needed for 3D gaming is going to become the most important branch in the computer industry. Why? Because we'll need it to drive holographic display technology. To generate a hologram in real time, you need to convert a virtual 3D scene into a 2D interference pattern, transmit that pattern to a display, and shine the appropriate light through it.

    • 3D gaming is starting to look like just another "Gold plated speaker wire" guy hobby as everyone moves to mobile devices.

      Let me know when a substantial number of people start buying MOGA clip-on gamepads for their mobile devices. Until then, even the smartphone or tablet with the strongest CPU and GPU will be limited by its touch input. Mega Man 2 and Castlevania ran comfortably on 1.8 MHz CPUs, yet not even a 1.8 GHz CPU can add buttons to a device that doesn't have them.

  • I'm looking at the new Intel G3240 with Intel HD 4000 graphics, and I was wondering if something in the same price range ($70 CAD) from AMD had an equivalent CPU with a better GPU.

    • I'm not even sure the G3240 comes with the HD 4000 because Intel makes it near impossible to know which GPU is used inside a lot of their CPUs, listing only "Intel HD".

    • by jandrese (485)
      I don't know about your case specifically, but the rule of thumb is that for low to mid range stuff you can get an AMD solution for about the same price that is going to have a slower CPU and faster GPU. It's pretty easy to beat a HD4000 GPU in any case. Of course this shoehorns you in a bit. If you went with the Intel solution, then you could drop a discrete GPU in later. If you go with a CPU that is too slow you often have to change more of the base system to upgrade (memory and motherboard).
      • (reply for both jandrese and washu_k)

        Thank you for your comments. I guess I'll go with the G3240 since it's the better CPU, endure the Intel HD GPU for now, and add a GTX 750 Ti later.

    • by washu_k (1628007)
      No, there really isn't an equivalent. Which is more important, CPU power or GPU power?

      The closest AMD part in price with a GPU is the A6-6400K. It would be quite a bit better in the GPU department, but MASSIVELY worse in the CPU department. Not even close in CPU power. To get something that won't cripple you on CPU, you would need to go up to the A8-6600K, but that is over $110 at the Canadian stores I checked and would still be way worse in single-threaded CPU performance.

      There are also the new Kabini CPUs and the top end
  • by Kartu (1490911) on Friday May 16, 2014 @03:29PM (#47020267)

    Compaq was afraid to use AMD chips given out for free, because Intel would "retaliate", ok?
    What kept AMD's market share low was not its competitor's "clever marketing"; it was crime.

    Back in P4 Prescott times, Intel's more expensive, more power-hungry, yet slower chips outsold AMD's 3 or 4 to 1.
    Given that AMD couldn't profit even with superior products, it's really astonishing to see the company still afloat.

    • Compaq was afraid to use AMD chips given out for free, because Intel would "retaliate", ok?
      What kept AMD's market share low was not its competitor's "clever marketing"; it was crime.

      Back in P4 Prescott times, Intel's more expensive, more power-hungry, yet slower chips outsold AMD's 3 or 4 to 1.
      Given that AMD couldn't profit even with superior products, it's really astonishing to see the company still afloat.

      Intel's Payola [1] (which basically kept Dell profitable for several quarters of the past decade) is something you have to factor in when looking at these "deals". I'm just sad that Intel didn't pay a bigger price for their purely anticompetitive corrupt practices.

      [1] http://www.theatlantic.com/tec... [theatlantic.com]

    • by Kjella (173770) on Friday May 16, 2014 @06:33PM (#47021981) Homepage

      While Intel did a lot of shady things, part of it was also that AMD didn't have nearly enough fab capacity to supply the market. Those kinds of decisions are made years in advance; you don't just pop up a sub-100nm processing plant on demand. So when AMD had a huge winner, they surely produced everything they could and got a nice premium on their products, but the remaining demand had to go to Intel. It's just not the sort of market battle you can win quickly.

    • Compaq was afraid to use AMD chips given out for free, because Intel would "retaliate", ok?

      Let's be honest though, Compaq wasn't known for making good business decisions.

  • 2016 (Score:4, Funny)

    by Anonymous Coward on Friday May 16, 2014 @04:16PM (#47020785)

    By 2016 AMD will have a CPU that beats the sh*t out of Intel's 2014 best offerings.

  • One could argue that the reason Intel's products have advanced as far as they have is because AMD was there to keep them on their toes. The game has changed since then, with mobile and whatnot, but I am still rooting for a comeback. Rory Read has played his cards well so far; and with Jim Keller back, it will be interesting to see what they have in store for us.
  • AMD has pissed away massive leads over Intel in the past.

    AMD single-handedly created the x86-64 market from NOTHING.

    Then they rested on their laurels.

    Then they bought a graphics company.

    Their last effort in the market was basically a fizzle. Instead of a custom chip designed to eke out maximum efficiency and performance, they went with a crappy computer-designed monstrosity that was basically the worst of all worlds, and a flame-throwing power hog to boot.

    Sure, they can kick out a processor th

  • It would be nice to see AMD offer a 4-8 processor chipset that would allow you to highly parallelize their chips. Intel can do it, but the premium for Xeon silicon is outrageous. Not sure if AMD has enough business in that market that they're willing to chuck it in hopes of getting a leg up, but I sure as hell wish I could drop a second CPU into my desktop so I don't have to chuck the entire thing and buy a whole new board/CPU from Intel just to get a 50% boost in performance every 3-4 years.
