AMD Hardware

The Chip That Changed the World: AMD's 64-bit FX-51, Ten Years Later

Dputiger writes "It's been a decade since AMD's Athlon 64 FX-51 debuted — and launched the 64-bit x86 extensions that power the desktop and laptop world today. After a year of being bludgeoned by the P4, AMD roared back with a vengeance, kicking off a brief golden age for its own products, and seizing significant market share in desktops and servers." Although the Opteron was around before, it cost a pretty penny. I'm not sure it's fair to say that the P4 was really bludgeoning the Athlon XP though (higher clock speeds, but NetBurst is everyone's favorite Intel microarchitecture to hate). Check out the Athlon 64 FX review roundup from 2003.
  • The old days (Score:5, Insightful)

    by Thanshin ( 1188877 ) on Wednesday September 25, 2013 @11:10AM (#44949179)

    Those were the good old days. How I miss when it took me one day at most to learn about all the options I had to build a gaming computer, with enough detail to make an informed decision about what bits and pieces to build it with.

    Nowadays, just piercing the veil of lies, half-truths, false reports and bought reviews makes the entire process incredibly boring and frustrating.

    • by ElementOfDestruction ( 2024308 ) on Wednesday September 25, 2013 @11:16AM (#44949227)
      It seems we've lost a lot of quality in the comment fields in the last 10 years. Expertise used to be carefully modded up; now opinion pieces get moderated up by whichever group happens to be awake at the time, and the real expertise is hidden at +2 or below.
    • Re:The old days (Score:5, Insightful)

      by Dunbal ( 464142 ) * on Wednesday September 25, 2013 @11:19AM (#44949277)
      It's still pretty much common sense. You want a fast CPU, so not the top of the line $1000 chip, take a step back or two and go for the one selling in the $300-$500 range. Motherboard for that chip from someone you trust - ASUS, Gigabyte, etc. Again never the $500 "gamer" board, take a step back, there are some really nice ones for $200 or so. Latest generation graphics card, or top end from last generation (assuming the prices have come down), plenty of memory on the card. Power supply that can feed the card what it needs and then some. Plenty of system RAM. SSD hard drive. Water/Air cooling system for your CPU type. And you're set! Shouldn't take a whole "day" to check those out. An hour or two would suffice.
      • I'd need a day before I felt that I actually knew enough to start buying rather than just THINKING I knew enough.
      • Re:The old days (Score:5, Informative)

        by Dagger2 ( 1177377 ) on Wednesday September 25, 2013 @12:13PM (#44949991)
        And then you end up either with an i7 4770 which has a locked multiplier, or a 4770K which doesn't do VT-d. Then you realize that there is no Intel CPU that'll do both. So then you start looking at AMD, in the hope that they don't pull shit like that with their CPU models. And then you're way over your hour or two budget.
        • by armanox ( 826486 )

          Unless you're running Hyper-V or Xen, VT-d doesn't matter.

          • Unless you're running Hyper-V or Xen, VT-d doesn't matter.

            Or if you want to virtualise a random piece of hardware that your primary OS doesn't have drivers for. Like the heaps of hardware with XP only drivers, for example.
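
            (As a rough sketch only: on an existing Linux box you can get a quick read on both features before spending anything. This assumes the usual /proc/cpuinfo flags and the /sys/kernel/iommu_groups directory that recent kernels expose when the IOMMU is enabled; it is an illustration, not a definitive test.)

                import os

                def cpu_virt_flags():
                    # CPU-level virtualization: "vmx" (Intel VT-x) or "svm" (AMD-V) in /proc/cpuinfo.
                    with open("/proc/cpuinfo") as f:
                        return {"vmx", "svm"} & set(f.read().split())

                def iommu_active():
                    # Device passthrough (VT-d / AMD-Vi) shows up as populated IOMMU groups.
                    groups = "/sys/kernel/iommu_groups"
                    return os.path.isdir(groups) and bool(os.listdir(groups))

                print("CPU virtualization flags:", cpu_virt_flags() or "none")
                print("IOMMU (VT-d/AMD-Vi) active:", iommu_active())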

          • by vux984 ( 928602 )

            I tend to make this year's gaming rig next year's home-office server.

            So the core2quad Q6600 I used to use in my gaming PC is now running XenServer 6.

            I realize I'm in the very distinct minority here, but still ... it would be nice if I could buy a product that does both.

        • by NJRoadfan ( 1254248 ) on Wednesday September 25, 2013 @01:37PM (#44951131)
          I miss the old days when I didn't have to consult Intel's website to figure out what the model numbers mean. It used to be easy: the CPU had a name and a speed rating that told you how fast the chip was and how many cores it had at a glance. Now we get a jumble of numbers to decipher.
      • It's still pretty much common sense. You want a fast CPU, so not the top of the line $1000 chip, take a step back or two and go for the one selling in the $300-$500 range. Motherboard for that chip from someone you trust - ASUS, Gigabyte, etc. Again never the $500 "gamer" board, take a step back, there are some really nice ones for $200 or so. Latest generation graphics card, or top end from last generation (assuming the prices have come down), plenty of memory on the card. Power supply that can feed the card what it needs and then some. Plenty of system RAM. SSD hard drive. Water/Air cooling system for your CPU type. And you're set! Shouldn't take a whole "day" to check those out. An hour or two would suffice.

        I would agree with you except for one thing: new technology. It can take a few days to get up to speed on the newest technology. I built a new system this past winter, and it had been three years since I built my old one. It took time to research SSDs (brands, price, reliability, best practices, etc.), which were fairly new tech at the time, plus CPU and socket types, triple-channel memory, video cards, and so on. On top of that, anyone concerned about the best bang for their buck will shop around a bit and look for deals.

    • Re:The old days (Score:5, Insightful)

      by ArcadeMan ( 2766669 ) on Wednesday September 25, 2013 @11:20AM (#44949291)

      The good old days were the 286 era, when all you needed to know was the clock speed of the CPU, that EGA was four times better than CGA, and that SoundBlaster was AdLib compatible.

      Of course, you had to deal with XMS and EMS memory settings, loading your mouse driver into high memory and solving IRQ and DMA conflicts between your ISA add-on cards.

      Screw that, the good old days are today. Take the iMac out of the box, plug it into the wall socket and start using it right away.

      • by SB9876 ( 723368 )

        I'll second that. You haven't known pain until you try to get Ultima 7 to run on a system with a Proaudio Spectrum 16 sound card.

        • by adolf ( 21054 )

          You think that was pain? Try it with a Gravis UltraSound.

        • by bored ( 40072 )

          You haven't known pain until you try to get Ultima 7 to run on a system with a Proaudio Spectrum 16

          IIRC, with that version of Ultima it wasn't your PAS that was the problem, it was the game. That darn game was a buggy POS even a couple of years after release.

          My game machines from that era always had the latest Sound Blaster (even though I also owned a PAS and a Gravis (actually still have the Gravis)) because they tended to "just work". That is, until PCI came out, in which case nothing really worked for a cou

      • by yurtinus ( 1590157 ) on Wednesday September 25, 2013 @12:41PM (#44950365)
        Y'know, I was enjoying reading all the little nuggets of wisdom (video cards that could use as much as 512MB of address space, $700 for 2GB of RAM). Then I was thinking "hey, the computer I had before this one was an Athlon 64, it wasn't *that* long ago!" Then I realized it was. Then I felt old. Now I'm crying.
        • by jandrese ( 485 )
          It's ridiculous just how long a properly built machine will last, even if you are a gamer. I'm still using my 2.4GHz C2D from 6 years ago, and it's only now starting to fall below the minimum requirements for some games.
    • by Nadaka ( 224565 )

      These days, aim for a price point of $1k with competitively priced components and you are almost certain to get a decent gaming rig. PC hardware is far ahead of the curve, thanks in part to the extreme production costs of high-quality graphics, and in part to console hardware holding back the standard quality settings for multi-platform releases. That will give you medium or better settings at 1080p on all current and foreseeable games.

      • by h4rr4r ( 612664 )

        $1000? You can beat the consoles for less than $500.
        If you keep your old case and dvd/bluray drive you can do even better. I tend to swap out MOBO, RAM, CPU and GPU in one shot every few years. I have not had to do this since I picked up my GTX465.

        The consoles are holding gaming so far back there is no point in spending even $1000.

    • Re:The old days (Score:4, Informative)

      by asmkm22 ( 1902712 ) on Wednesday September 25, 2013 @11:51AM (#44949693)

      Not really sure what you're smoking. It's much easier to put together a computer (including a gaming computer) these days than it was 10 years ago. We don't really have to worry about whether we need PC-133, PC-2700, DDR1, DDR2, etc. There's no need to choose between AGP, PCI, or that new-fangled PCI-Express, much less whatever multiplier is involved. Hard drives are straight-up SATA now, and it doesn't matter whether you choose a spinning disk or an SSD. The graphics cards themselves aren't even as important, since the console cycle has pretty much bottlenecked as a result of developers focusing on those consoles first and foremost. We don't need to do much more than make sure the motherboard has the right Intel or AMD socket.

      In fact, about the only real difficult decision you might need to make these days is finding a computer case that has enough room to use a modern video card.

    • To be honest, I'm loving the ease of putting together a decent system these days. I actually owned an Athlon 64-based system back in the day (with an expensive high-end SLI nForce-based board), and that sucker was never completely stable. Same thing with the AthlonXP generation, and K7 (Athlon/Duron) before that...

      These days, I just pick the Intel chip that fits my needs, buy the cheapest name-brand board that fits my needs, slap it together and it's rock solid. Celerons, Pentiums, Core iX, whatever... hell, e

  • by MightyYar ( 622222 ) on Wednesday September 25, 2013 @11:16AM (#44949235)

    I only just replaced my Athlon 64 motherboard and processor this spring. It was a good product, but not quite up to running Windows 8 IMHO.

  • by Anonymous Coward on Wednesday September 25, 2013 @11:21AM (#44949299)

    10 years later and we're still running games and applications that are 32-bit and only use a single core.

    • by alen ( 225700 )

      For gaming, the GPU took over most of the work, which is the way it should have happened.

      For applications, most don't really need 2 cores. Even running multiple apps at the same time you don't really need 2 cores. I was playing MP3s on a computer in the 1990s with minimal CPU usage. There is no way you need to dedicate a whole core to music while surfing the internet, or some of the other idiotic use cases people make up.

      • On high end games, the CPU gets hit hard. AI, physics, etc, all need a lot of power. Battlefield 3 will hit pretty hard on a quad core CPU, while hitting hard on a high end GPU at the same time.

      • by adolf ( 21054 )

        When I'm waiting for an application to do whatever that application is doing, and that application is only using one core, then yes, I really do need it to use more than one core.

        To suggest otherwise is also to suggest that computers are fast enough, and that general-purpose computing is a solved problem.

        I don't think we're anywhere near that point just yet.

      • Most don't really need two cores, but that's not a reason not to want two cores.

        I fell in love with multiple-core processors when I first got one, not because my computer in general became faster (I'll bet that all but one of my cores are idling most of the time) but because my computer wouldn't become unresponsive when I was doing computationally heavy tasks (or when a program crashed).
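
        (As a toy illustration of that point, nothing more: the usual trick is to push the heavy computation onto another core so the foreground work stays responsive. A minimal Python sketch with a made-up workload:)

            from multiprocessing import Process, Queue
            import time

            def heavy_task(results):
                # Stand-in for any computationally heavy job (encoding, compiling, number crunching).
                results.put(sum(i * i for i in range(20_000_000)))

            if __name__ == "__main__":
                results = Queue()
                worker = Process(target=heavy_task, args=(results,))
                worker.start()
                # The main process keeps handling "interactive" work while another core grinds away.
                while results.empty():
                    print("still responsive...")
                    time.sleep(0.5)
                print("background result:", results.get())
                worker.join()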

    • Fair point, and the follow-up is: WHY?
    • 10 years later and we're still running games and applications that are 32-bit and only use a single core.

      At least 64-bit OSes are widespread now.

      Almost ten years after the 80386 was introduced, most people were still running "OSes" which were little more than GUI shells running on 16-bit DOS.

      • by yuhong ( 1378501 )

        In my blog article on the OS/2 2.0 fiasco, I mentioned that Caldera actually sued MS based on the fact that Win9x was still based on DOS, whereas OS/2 never depended on DOS.

    • Damnit people, do not give Adobe's Flash developers any ideas!!
    • The need for computers that can run multiple programs concurrently with a total of more than 4GB of RAM is greater than the need for any single program to consume multiple cores or more than 4GB of RAM.
  • Too bad AMD was just resting on its laurels after that. Incidentally, in 2 more years, you can start making your own Pentium Pro-compatible processor without violating any patents (assuming you're using the same patents that went into the Pentium Pro).

  • Wow. Ten years. And here I am still dealing with 64-bit incompatibility issues every six months or so.

    Out of curiosity, how long did 16-bit library problems linger after the 32-bit move?

    • Wow. Ten years. And here I am still dealing with 64-bit incompatibility issues

      10 years? Some of us are still waiting to reap the benefits of MMX extensions. Ha..

      • And in AMD64, MMX (along with 3DNow!) is officially deprecated in favor of the SSE instructions. Like a twisting of the knife.
    • by KliX ( 164895 )

      They're still here.

    • Wow. Ten years. And here I am still dealing with 64-bit incompatibility issues every six months or so.

      Out of curiosity, how long did 16-bit library problems linger after the 32-bit move?

      16 to 32 was a much more radical change. Segments to flat. In the Unix line, this happened in the early 80's (late 70's?) when few systems were deployed.

      In the Wintel line, it was also cooperative to preemptive. Very painful. Very manual. It took 10 years just to let go of 16-bit device drivers, and many were never ported.

      Classic Mac, Amiga, and Atari ST had an easier time since their "16-bit" systems were already 32-bit internally. Even then you had a few years of dealing with geniuses who stored data

    • I've got a rule of thumb for how long it will take to move completely to 64-bit. Basically, every time we double the number of bits, the conversion takes twice as long. I'm sure someone could refine that, but it makes a tiny bit of sense.

      Rectally extracted numbers:
      4 to 8 : 2 years
      8 to 16 : 5 years
      16 to 32 : 10 years
      32 to 64 : 20 years?
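
      (Purely as a toy rendering of that doubling rule, seeded with the guesses above; nothing here is measured data:)

          # Each word-size doubling is assumed to take roughly twice as long as the previous one.
          years = {"4->8": 2, "8->16": 5}
          years["16->32"] = 2 * years["8->16"]   # 10 years
          years["32->64"] = 2 * years["16->32"]  # 20 years
          for transition, n in years.items():
              print(f"{transition} bit: ~{n} years")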

    • The "problem" is that 32 bit is still enough for mostly everyone, while 32 bit gained quick popularity after windows 95 (18 years old)

      Also, there is less incentive to upgrade your current working machine today than before.

  • Still better IMHO (Score:3, Insightful)

    by s.petry ( 762400 ) on Wednesday September 25, 2013 @11:34AM (#44949503)
    AMD still makes a better chip for many FP-intensive applications, and the price is still better than Intel's to boot. Intel always made a big deal about clock speed, while AMD worked on actual performance. It is really a shame that people pay more attention to marketing than to real performance.
    • by Anaerin ( 905998 )
      Maybe you should look at the actual performance numbers [bit-tech.net]. Intel is performing better than AMD, and at a cheaper price point. And unfortunately, I'm an AMD fan, running a Hexcore Bulldozer here.
      • by s.petry ( 762400 )
        Instead of relying on a web site which does testing based on their funding, perform actual benchmarks yourself. I do, and I see mixed results today. 5 years ago, for math-intensive apps, AMD was the hands-down winner. Apps like Muses, Nastran, Abaqus, Dyna, etc...
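
        (If you do roll your own comparison, a minimal sketch of the idea in Python: time the same toy floating-point kernel on each box, best of several runs. A stand-in only; real apps like Nastran or Abaqus are obviously the better test.)

            import timeit

            def fp_kernel(n=500_000):
                # Simple floating-point workload as a stand-in for a real solver run.
                s = 0.0
                for i in range(1, n):
                    s += (i * 1.000001) / (i + 0.5)
                return s

            # Best of five timings to reduce noise from other processes.
            best = min(timeit.repeat(fp_kernel, number=1, repeat=5))
            print(f"fp_kernel best of 5: {best:.3f} s")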
        • by Anaerin ( 905998 ) on Wednesday September 25, 2013 @01:03PM (#44950729)
          Unfortunately, not everyone has the ready cash available to buy every CPU, set up a complete system for each, and benchmark them. We plebeians have to rely on other people to do that kind of testing for us. And when a great many websites, all doing independent benchmarks and reviews, show AMD getting their collective asses handed to them on a regular basis at the moment, I tend to lend those reviews some weight.
    • Aren't Intel's chips faster clock for clock right now? Not to mention much more efficient?

  • I was hoping to find a current review of the processor against current CPUs....

    However, in AnandTech Bench you can compare an AMD Athlon X2 4450e (2.3GHz - 1MB L2) with current CPUs. If you compare it to an Intel Core i7 4770K (3.5GHz - 1MB L2 - 8MB L3, one of the best CPUs right now), you can find that the Intel CPU is between 3 and 9 times faster. Most of the time it is about 6-7 times faster.

    See http://www.anandtech.com/bench/product/37?vs=836 [anandtech.com]

    However, if you could compare an AMD FX-51 with a

    • The Athlon X2 4450e was released in April of 2008, so we are only looking at 5 years' difference, not 10. I think the more interesting comparison would be how the Athlon FX-51 and the new Apple A7 chip stack up, given that they are the first 64-bit chips of their class.

  • by Anonymous Coward on Wednesday September 25, 2013 @11:55AM (#44949769)

    AMD, as most of you have forgotten, purchased a CPU design company not long after it lost the right to clone Intel CPU designs. The people from this company gave AMD a world-beating x86 architecture that became the Athlon XP and then the Athlon 64 (and the first true x86 dual core), thrashing Intel even though AMD was spending less than ONE-HUNDREDTH of Intel's R&D spend.

    What happened? AMD top management sabotaged ALL future progress on new AMD CPUs, in order to maximise salaries, bonuses and pensions. A tiny clique of cynical self-serving scumbags ruined every advantage AMD had gained over Intel for more than 5 years afterwards. Eventually AMD replaced its top management, but by that time it was too late for the CPU. Obviously, AMD had far more success on the GPU side after buying ATI. (PS note that ATI had an identical rise to success, when that company also bought a GPU design team that became responsible for ALL of ATI's world-beating GPU designs. Neither AMD nor ATI initially had in-house talent good enough to produce first rate designs.)

    Today, AMD is ALMOST back on track. Its Kaveri chip (2014) will be the most compelling part for all mains-powered PCs below high-end/serious gaming. In the mobile space, Intel seems likely to have the power-consumption advantage (for x86) across the next 1.5 years at least. However, even this is complicated by the fact that Nvidia is ARM, and AMD is following Nvidia and is soon to combine its world-beating GPU with ARM CPU cores.

    At this exact moment, AMD can only compete on price in the CPU market. Under load, its chips use TWICE the power of Intel parts. In heavy gaming, average Intel i5 chips (4-core) usually wallop AMD's best 8-cores. In other heavy apps, AMD at best draws equal, but just as commonly lags Intel.

    Where AMD currently exterminates Intel is with SoC designs. AMD won total control of the console market, providing the chips for Nintendo, Sony and Microsoft. Intel (and Nvidia) were literally NOT in the running for these contracts, having nothing usable to offer, even at higher prices or lower performance.

    AMD is currently improving the 'bulldozer' CPU architecture once again for the Kaveri 4-core (+ massive integrated GPU and 256-bit bus) parts of 2014. There is every reason to think this new CPU design will be at rough parity with Intel's Sandybridge, in which case Intel will be in serious trouble in the mains-powered desktop market.

    Intel is in a slow but fatal decline. Intel is currently selling its new 'Atom' chips below cost (illegal, but Intel just swallows the court fines) in an attempt to take on ARM, but even though Intel's 'Atom' chips are actually Sandybridge class and have a process advantage, they are slaughtered by Apple's new A7 ARM chip found in the latest iPhones. The A7 uses the latest 64-bit ARM design, known as ARMv8, making the A7 an excellent point of comparison with the original Athlon 64 from years back.

    Again, AMD is now x86 *and* ARM. AMD has two completely distinct and good x86 architectures ('stars-class' and 'bulldozer-class'). Intel is only x86, and now with the latest 'Atom' has only ONE x86 architecture in its worthwhile future lineup. Intel has other x86 architectures, but they are complete no-hopers like the original Atom family, the hilariously awful Larrabee family, and the putrid new micro-controller family. Only Intel's current Sandybridge/Ivybridge/Haswell/new-Atom architecture has any value.

    • by 0123456 ( 636235 )

      AMD is not ARM. ARM is ARM. Anyone can buy an ARM license and start releasing ARM chips. AMD are producing ARM chips because they can't compete with Intel in the x86 market.

      Nothing stops Intel releasing ARM chips, as they have in the past, except the margins would probably be awful compared to their x86 lineup.

    • by jonwil ( 467024 )

      The most telling thing about AMD is that their first generation Bulldozer-architecture CPUs were getting their pants creamed not just by their Intel competitors but by the last-generation AMD parts.

  • So I was already feeling stoked about finally getting around to finding a matched pair of the fastest CPUs I can put into this board that's been sitting in a box for SIX YEARS; they boot, and now I read this. /me - does the peacock strut happy dance thing.

    I was always a fan of AMD, going back to the 8x300 bit-slice stuff. They're clever boys.

  • by LodCrappo ( 705968 ) on Wednesday September 25, 2013 @12:16PM (#44950033)

    Apple just released a 64-bit processor, and now AMD is copying it TEN YEARS ago?!?

    Can the industry please do something original and quit just following wherever Apple leads it?

  • by unixisc ( 2429386 ) on Wednesday September 25, 2013 @04:27PM (#44953401)

    The instruction set itself was a yawner - I was looking forward to 64-bit being the point where all CPUs became RISC, and where Windows NT could go from being Wintel-only to NT/RISC.

    However, one delicious piece of irony that I love about the Opteron/Athlon 64 is that this was the architecture that sank the Itanic. The Itanium had sunk far worthier chips before it - PA-RISC, DEC Alpha and MIPS V - but this architecture brought out the Itanic in the Itanium. Originally, the Itanium was supposed to be the 64-bit replacement for x86, but thanks to this gag from AMD, it never happened. Instead, AMD started stealing the market, and to add insult to injury, when Intel tried entering with 64-bit extensions of its own, Microsoft forced them to be AMD compatible. So Intel was ultimately forced to let x64 be the successor to x86 and let the Itanium wither on the vine.

    Once that happened, the Itanium followed the same path as the better CPUs it had killed. Microsoft dropped support for it after Server 2008, and XP and Vista were never supported; Project Monterey collapsed; and to add insult to injury, even Linux - the OS that boasts about being ported everywhere - didn't want to remain supported on the Itanic. Today, the Itanic has as many OSes as the DEC Alpha had at its peak - 3: HP-UX, Debian Linux and FreeBSD.

    So no, the x64 didn't change the world. But it sure sunk the Itanic!
