The Chip That Changed the World: AMD's 64-bit FX-51, Ten Years Later

Dputiger writes "It's been a decade since AMD's Athlon 64 FX-51 debuted — and launched the 64-bit x86 extensions that power the desktop and laptop world today. After a year of being bludgeoned by the P4, AMD roared back with a vengeance, kicking off a brief golden age for its own products, and seizing significant market share in desktops and servers." Although the Opteron was around before, it cost a pretty penny. I'm not sure it's fair to say that the P4 was really bludgeoning the Athlon XP though (higher clock speeds, but NetBurst is everyone's favorite Intel microarchitecture to hate). Check out the Athlon 64 FX review roundup from 2003.

  • Wow. Ten years. And here I am still dealing with 64-bit incompatibility issues every six months or so.

    Out of curiosity, how long did 16-bit library problems linger after the 32-bit move?
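
    A classic example of that kind of breakage is old code that assumes a pointer fits in an int (or that int and long are always the same size). A minimal C sketch of the pattern, purely as an illustration:

        #include <stdint.h>
        #include <stdio.h>

        int main(void)
        {
            int value = 42;

            /* 32-bit-era habit: stashing a pointer in an int.
               This round-trips when pointers are 32 bits wide and
               silently truncates the upper half on a 64-bit build. */
            int truncated = (int)(intptr_t)&value;

            /* Portable version: intptr_t is defined to be wide enough
               to hold any object pointer. */
            intptr_t safe = (intptr_t)&value;

            printf("sizeof(void *) = %zu, sizeof(int) = %zu\n",
                   sizeof(void *), sizeof(int));
            printf("original %p, via int %p, via intptr_t %p\n",
                   (void *)&value, (void *)(intptr_t)truncated, (void *)safe);
            return 0;
        }

    Built as a 32-bit binary, all three printed pointers match; built as a 64-bit binary, the int-laundered one usually doesn't, which is exactly the sort of bug that keeps resurfacing.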

  • Re:The old days (Score:4, Interesting)

    by OolimPhon ( 1120895 ) on Wednesday September 25, 2013 @12:48PM (#44949641)

    You claim to be a geek and you're contemplating getting rid of an old computer?

    All my old computers ended up being used for something else. I only get rid of them when the architecture is so old that &lt;OS of choice&gt; won't run on it any more (or when the smoke comes out!). For me, device drivers are what ultimately limit how long a machine stays usable.

  • Re:P4 vs Athlon XP (Score:5, Interesting)

    by Dputiger ( 561114 ) on Wednesday September 25, 2013 @12:54PM (#44949753)

    As the author of the article:

    In 2000 - 2001, the Athlon / Athlon XP were far ahead of the P4. But from January 2002 to March 2003, Intel increased the P4's clock speed by 60% and introduced Hyper-Threading, and SSE2 adoption grew over the same period. As a result, by spring 2003 the P4 was far ahead of the Athlon XP in most content creation and business workloads, and definitely in 3D rendering. Now it's true that an awful lot of benchmark shenanigans were going on at the same time, and the difference between the two cores was much smaller in high-end gaming. But if you wanted the best 'all around' CPU, the P4 Northwood + HT at 2.8 - 3.2GHz was the way to go. Northwoods were also good overclockers -- it was common to pick up a 2.4GHz P4 and clock it to 3 - 3.2GHz with HT.

    Athlon 64 kicked off the process of changing that, but what really did the trick was 1) Prescott's slide backwards in IPC and thermals, and 2) the introduction of dual-core. It really was a one-two punch -- Intel couldn't put two 3.8GHz Pentium 4 chips on a die together, so the dual-core Pentium D 820 ran at just 2.8GHz. AMD, meanwhile, *could* put a pair of 2.4GHz Athlon 64 cores on a single chip. Combine that with Prescott's terrible efficiency, and suddenly the Athlon 64 was hammering the P4 in every workload.

  • by Anonymous Coward on Wednesday September 25, 2013 @12:55PM (#44949769)

    As most of you have forgotten, AMD purchased a CPU design company (NexGen) not long after it lost the right to clone Intel CPU designs. The people from that company gave AMD a world-beating x86 architecture that became the Athlon XP and then the Athlon 64 (and the first true x86 dual core), thrashing Intel even though AMD was spending less than ONE-HUNDREDTH of Intel's R&D budget.

    What happened? AMD top management sabotaged ALL future progress on new AMD CPUs in order to maximise salaries, bonuses and pensions. A tiny clique of cynical, self-serving scumbags squandered every advantage AMD had gained over Intel for more than 5 years afterwards. Eventually AMD replaced its top management, but by that time it was too late for the CPU side. Obviously, AMD had far more success on the GPU side after buying ATI. (PS: note that ATI had an identical rise to success; that company also bought a GPU design team that became responsible for ALL of ATI's world-beating GPU designs. Neither AMD nor ATI initially had in-house talent good enough to produce first-rate designs.)

    Today, AMD is ALMOST back on track. Its Kaveri chip (2014) will be the most compelling part for all mains-powered PCs below high-end/serious gaming. In the mobile space, Intel seems likely to have the power-consumption advantage (for x86) across the next 1.5 years at least. However, even this is complicated by the fact that Nvidia has gone ARM, and AMD is following Nvidia's lead: it will soon combine its world-beating GPUs with ARM CPU cores.

    At this exact moment, AMD can only compete on price in the CPU market. Under load, its chips use TWICE the power of Intel parts. In heavy gaming, average Intel i5 chips (4-core) usually wallop AMD's best 8-cores. In other heavy apps, AMD at best draws equal, but just as commonly lags Intel.

    Where AMD currently exterminates Intel is with SoC designs. AMD won total control of the console market, providing the chips for Nintendo, Sony and Microsoft. Intel (and Nvidia) were literally NOT in the running for these contracts; they had nothing usable to offer at any price or performance point.

    AMD is currently improving the 'bulldozer' CPU architecture once again for the Kaveri 4-core (+ massive integrated GPU and 256-bit bus) parts of 2014. There is every reason to think this new CPU design will be at rough parity with Intel's Sandybridge, in which case Intel will be in serious trouble in the mains-powered desktop market.

    Intel is in a slow but fatal decline. Intel is currently selling its new 'atom' chips below cost (illegal, but Intel just swallows the court fines) in an attempt to take on ARM, but even though Intel's 'atom' chips are actually Sandybridge-class and have a process advantage, they are slaughtered by Apple's new A7 ARM chip found in the latest iPhones. The A7 uses the latest 64-bit ARM design, ARMv8, which makes it an excellent point of comparison with the original Athlon 64 from years back.

    Again, AMD is now x86 *and* ARM. AMD has two completely distinct and good x86 architectures ('stars-class' and 'bulldozer-class'). Intel is only x86, and now, with the latest 'Atom', has only ONE x86 architecture in its worthwhile future lineup. Intel has other x86 architectures, but they are complete no-hopers like the original Atom family, the hilariously awful Larrabee family, and the putrid new microcontroller family. Only Intel's current sandybridge/ivybridge/haswell/new-atom architecture has any value.

  • by iroll ( 717924 ) on Wednesday September 25, 2013 @01:10PM (#44949957) Homepage

    You must have read different articles than I did, because 10 years ago it was "Micro$oft $hills," "Apple Fanboys," etc. You do know that this was the origin of "No wireless. Less space than a Nomad. Lame," right? And that was 2001.

  • by unixisc ( 2429386 ) on Wednesday September 25, 2013 @05:27PM (#44953401)

    The instruction set itself was a yawner - I was looking forward to 64-bit being the point where all CPUs would become RISC, and where Windows NT could go from being Wintel-only to NT/RISC.

    However, one delicious piece of irony that I love about the Opteron/Athlon 64 is that it was the architecture that sank the Itanic. The Itanium had sunk far worthier chips before it (PA-RISC, DEC Alpha and MIPS V), but this architecture brought out the Itanic in Itanium. Originally, the Itanium was supposed to be the 64-bit replacement for x86, but thanks to this gag from AMD, it never happened. Instead, AMD started stealing the market, and to add insult to injury, when Intel tried entering with 64-bit extensions of its own, Microsoft forced them to be AMD-compatible. So Intel was ultimately forced to let x64 be the successor to x86 and let Itanium wither on the vine.

    Once that happened, Itanium followed the same path as the better CPUs it had killed. Microsoft dropped support for it after Server 2008, XP and Vista never supported it, Project Monterey collapsed, and to add insult to injury even Linux - the OS that boasts about being ported everywhere - didn't want to keep supporting the Itanic. Today, the Itanic has as many OSes as the DEC Alpha had at its peak - three: HP-UX, Debian Linux and FreeBSD.

    So no, the x64 didn't change the world. But it sure sunk the Itanic!
