The Chip That Changed the World: AMD's 64-bit FX-51, Ten Years Later
Dputiger writes "It's been a decade since AMD's Athlon 64 FX-51 debuted — and launched the 64-bit x86 extensions that power the desktop and laptop world today. After a year of being bludgeoned by the P4, AMD roared back with a vengeance, kicking off a brief golden age for its own products, and seizing significant market share in desktops and servers."
Although the Opteron was around before, it cost a pretty penny. I'm not sure it's fair to say that the P4 was really bludgeoning the Athlon XP though (higher clock speeds, but NetBurst is everyone's favorite Intel microarchitecture to hate). Check out the Athlon 64 FX review roundup from 2003.
The old days (Score:5, Insightful)
Those were the good old days. How I miss when it took me one day at most to learn about all the options I had for building a gaming computer, with enough detail to make an informed decision about what bits and pieces to build it with.
Nowadays, just piercing the veil of lies, half-truths, false reports and bought reviews makes the entire process incredibly boring and frustrating.
Made me miss the old Slashdot (Score:4, Insightful)
Re:Made me miss the old Slashdot (Score:5, Insightful)
Yeah, like the hot grits down your pants, Natalie Portman naked and petrified, the GNAA trolls, penis bird posts in ASCII, and of course Goatse, which nobody could forget (I mean LITERALLY could not forget). Shoot, one Goatse troll got modded up to +3 by a moderator trying to be funny and drew ninety "MY EYES?!" replies.
No, I don't miss those days; we just tend to remember only the good ones.
Re: (Score:2)
CHOCOLATE CHIP COOKIE RECIPE
It was a nice break from the usual trolling. The recipe was legit, too.
Re:Made me miss the old Slashdot (Score:4, Interesting)
You must have read different articles than I did, because 10 years ago it was "Micro$oft $hills," "Apple Fanboys," etc. You do know that this was the origin of "No wireless. Less space than a Nomad. Lame," right? And that was 2001.
Re:The old days (Score:5, Insightful)
Re: (Score:2)
Re:The old days (Score:5, Informative)
Re: (Score:2)
Unless you're running Hyper-V or Xen, VT-d doesn't matter.
Re: (Score:3)
Unless you're running Hyper-V or Xen, VT-d doesn't matter.
Or if you want to virtualise a random piece of hardware that your primary OS doesn't have drivers for. Like the heaps of hardware with XP-only drivers, for example.
Re:The old days (Score:4, Insightful)
Or you can realize that we're talking about building a *new* computer, with *new* hardware.
Which doesn't do him a lick of good if he wants the new computer to run his old $5000 data acquisition hardware that only has XP drivers. Or any of dozens of other pieces of hardware that may have newer versions supported by newer OSes, but whose replacement cost is significantly more than the computer's.
Re: (Score:3)
Why exactly would you exclude ever using any older hardware in a new computer...?
Re: (Score:2)
I tend to make this year's gaming rig next year's home-office server.
So the core2quad Q6600 I used to use in my gaming PC is now running XenServer 6.
I realize I'm in the very distinct minority here, but still ... it would be nice if I could buy a product that does both.
Re:The old days (Score:4)
Re: (Score:3)
It's still pretty much common sense. You want a fast CPU, so not the top-of-the-line $1000 chip; take a step back or two and go for the one selling in the $300-$500 range. A motherboard for that chip from someone you trust - ASUS, Gigabyte, etc. Again, never the $500 "gamer" board; take a step back, there are some really nice ones for $200 or so. Latest-generation graphics card, or top end from last generation (assuming the prices have come down), with plenty of memory on the card. A power supply that can feed the card what it needs and then some. Plenty of system RAM. An SSD. A water or air cooling system suited to your CPU. And you're set! Shouldn't take a whole "day" to check those out. An hour or two would suffice.
I would agree with you except for one thing: new technology. It can take a few days to get up to speed on the newest technology. I built a new system this past winter, and it had been three years since I built my old one. It took time to research SSDs (brands, price, reliability, best practices, etc.) as they were fairly new tech at the time, plus CPU and socket types, triple-channel memory, video cards, etc. On top of that, anyone concerned about best bang for their buck will shop around a bit and look for deals.
Re:The old days (Score:4, Interesting)
You claim to be a geek and you're contemplating getting rid of an old computer?
All my old computers ended up being used for something else. I only get rid of them when the architecture is so old that <OS of choice> won't run on it any more (or when the smoke comes out!). Device drivers are what limit usability for me.
Re: (Score:3)
I did that for a bit but when I got to seven computers sitting idle in the closet, I took them down to the electronics recycling bin. Heck, I'm even looking at my old Sun box and considering punting that one as well. That will leave me with 4 computers that are regularly in use plus the tablet and phone.
[John]
Re: (Score:3)
I've got 2 SGI Octanes and an O2 sitting in my room now. Without an actual purpose. I might be restoring one of the Octane boxes just to annoy a friend of mine on support (hey, you guys don't have a build for IRIX!).
Re: (Score:3)
Two of the ones I punted this time were Ultra 60's. Back in 2004 when I moved, I got rid of an SLC I had for several years. This one is an Enterprise 250. I have to remove the power supplies in order to pick it up. It's fricking loud though. :)
[John]
Re: (Score:3)
Would have been nice if you'd put a TL;DR at the top that this is an Apple propaganda piece.
The specs you listed above are for a gaming computer. Your Mac is a nice machine and it can certainly play some games, but it wouldn't be ideal for that purpose.
Re: (Score:2)
Nah, forget it. As a computer software nerd, but not a PC building nerd, I'll just go with a 27" iMac for $1999. Granted only an i5 CPU and 8 GB, but comes with a great OS and a gorgeous 27" monitor. (BTO with i7, 16GB and 256GB SSD bumps the price to $2599.) It has a GTX 775M instead of GTX 660 - no idea which is faster. At least I know all the components will work together, and they're properly supported by the OS.
I built a new games machine last year. That had the second-fastest i7 at the time, 32GB of RAM, the GTX660 GPU you mentioned, a 200GB-ish SSD, 3TB hard drive and a few other bells and whistles. Even including $100 for Windows, it only cost $1500.
So that had better be a really, really gorgeous monitor.
Re:The old days (Score:5, Insightful)
The good old days were the 286 era, when all you needed to know was the clock speed of the CPU, that EGA was four times better than CGA, and that SoundBlaster was AdLib-compatible.
Of course, you had to deal with XMS and EMS memory settings, loading your mouse driver into high memory and solving IRQ and DMA conflicts between your ISA add-on cards.
Screw that, the good old days are today. Take the iMac out of the box, plug it into the wall socket, and start using it right away.
Re: (Score:3)
I'll second that. You haven't known pain until you try to get Ultima 7 to run on a system with a Proaudio Spectrum 16 sound card.
Re: (Score:2)
You think that was pain? Try it with a Gravis UltraSound.
Re: (Score:3)
You haven't known pain until you try to get Ultima 7 to run on a system with a Proaudio Spectrum 16
IIRC, with that version of Ultima it wasn't your PAS that was the problem, it was the game. That darn game was a buggy POS even a couple years after release.
My game machines from that era always had the latest Sound Blaster (even though I also owned a PAS and a Gravis (actually still have the Gravis)) because they tended to "just work". That is, until PCI came out, in which case nothing really worked for a cou
Re:The old days (Score:5, Funny)
Re: (Score:3)
Re: (Score:2)
These days, aim for a price point of $1k with competitively priced components and you are almost certain to get a decent gaming rig. PC hardware is far ahead of the curve, thanks in part to the extreme production costs of high-quality graphics, and in part to console hardware holding back the standard quality settings for multi-platform releases. That will give you medium or better settings at 1080p on all current and foreseeable games.
Re: (Score:2)
$1000? You can beat the consoles for less than $500.
If you keep your old case and dvd/bluray drive you can do even better. I tend to swap out MOBO, RAM, CPU and GPU in one shot every few years. I have not had to do this since I picked up my GTX465.
The consoles are holding gaming so far back there is no point in spending even $1000.
Re: (Score:2)
My machine also runs KVM, and various database software. I have not upgraded since that video card, which is more than a year old.
Re: (Score:2)
$1000 includes all parts of a computer, including monitor. Reusing previously bought parts does not greatly reduce the actual cost of the computer as you lose the opportunity to use that other hardware for other purposes.
And that $1000 will likely play all current games at high settings, and have the ability to play foreseeable future games with at least medium settings.
A new generation of consoles is coming out, and the base quality of graphics will rise to meet the abilities of that new hardware, and that
Re: (Score:2)
How many monitors do you need?
If you had no other use for that hardware, it sure does. SSDs, monitors, and good cases are expensive.
We have seen the new consoles already, and no, they will not exceed a $500 PC at this point.
Re: (Score:2)
Or if you don't want to upgrade every year, and want a machine that will last for a decade.
...like an Athlon 64!?!?!
Re:The old days (Score:4, Informative)
Not really sure what you're smoking. It's much easier to put together a computer (including a gaming computer) these days than it was 10 years ago. We don't really have to worry about whether we need PC-133, PC-2700, DDR1, DDR2, etc. There's no need to choose between AGP, PCI, or that new-fangled PCI-Express, much less whatever multiplier is involved. Hard drives are straight-up SATA now, and it doesn't matter whether you choose a spinning disk or an SSD. The graphics cards themselves aren't even as important, since the console cycle has pretty much bottlenecked things as a result of developers focusing on those consoles first and foremost. We don't need to do much more than make sure the motherboard has the right Intel or AMD socket.
In fact, about the only real difficult decision you might need to make these days is finding a computer case that has enough room to use a modern video card.
Re: (Score:2)
To be honest, I'm loving the ease of putting together a decent system these days. I actually owned an Athlon64 based system back in the day (with an expensive high-end SLI nForce-based board), and that sucker was never completely stable. Same thing with the AthlonXP generation, and K7 (Athlon/Duron) beforehand...
These days, I just pick the Intel chip that fits my needs, buy the cheapest name-brand board that fits my needs, slap it together and it's rock solid. Celerons, Pentiums, Core iX, whatever... hell, e
Just Replaced (Score:3)
I only just replaced my Athlon 64 motherboard and processor this spring. It was a good product, but not quite up to running Windows 8 IMHO.
10 years later and applications are still 32bit. (Score:4, Insightful)
10 years later and we're still running games and applications that are 32-bit and only use a single core.
Re: (Score:3)
For gaming, the GPU took over most of the work, which is the way it should have happened.
For applications, most don't really need 2 cores. Even running multiple apps at the same time you don't really need 2 cores. I was playing MP3s on a computer in the 1990s with minimal CPU usage; there is no way you need to dedicate a whole core to music while surfing the internet, or to some of the other idiotic use cases people make up.
No not really (Score:2)
On high-end games, the CPU gets hit hard. AI, physics, etc. all need a lot of power. Battlefield 3 will hit a quad-core CPU pretty hard while hammering a high-end GPU at the same time.
Re: (Score:2)
When I'm waiting for an application to do whatever that application is doing, and that application is only using one core, then yes, I really do need it to use more than one core.
To suggest otherwise is also to suggest that computers are fast enough, and that general-purpose computing is a solved problem.
I don't think we're anywhere near that point just yet.
Re: (Score:3)
Most don't really need two cores, but that's not a reason not to want two cores.
I fell in love with multi-core processors when I first got one, not because my computer in general became faster (I'll bet all but one of my cores are idling most of the time), but because my computer wouldn't become unresponsive when I was doing computationally heavy tasks (or when a program crashed).
Re: (Score:2)
Re: (Score:2)
10 years later and we're still running games and applications that are 32-bit and only use a single core.
At least 64-bit OSes are widespread now.
Almost ten years after the 80386 was introduced, most people were still running "OSes" which were little more than GUI shells running on 16-bit DOS.
Re: (Score:2)
In my blog article on the OS/2 2.0 fiasco, I mentioned that Caldera actually sued MS over the fact that Win9x was still based on DOS, whereas OS/2 never depended on DOS.
Re: (Score:2)
Re: (Score:2)
Great processor (Score:2)
Too bad AMD just rested on its laurels after that. Incidentally, in 2 more years you can start making your own Pentium Pro-compatible processor without violating any patents (assuming you're using only the same patents that went into the Pentium Pro).
Error in 32/64 bit libraries. Please reinstall (Score:4, Interesting)
Wow. Ten years. And here I am still dealing with 64-bit incompatibility issues every six months or so.
Out of curiosity, how long did 16-bit library problems linger after the 32-bit move?
Re: (Score:2)
Wow. Ten years. And here I am still dealing with 64-bit incompatibility issues
10 years? Some of us are still waiting to reap the benefits of MMX extensions. Ha..
Re: (Score:2)
Re: (Score:2)
They're still here.
16 to 32 was mostly incompatible (Score:2)
Wow. Ten years. And here I am still dealing with 64-bit incompatibility issues every six months or so.
Out of curiosity, how long did 16-bit library problems linger after the 32-bit move?
16 to 32 was a much more radical change: segmented to flat memory. In the Unix line, this happened in the early '80s (late '70s?), when few systems were deployed.
In the Wintel line, it was also a move from cooperative to preemptive multitasking. Very painful. Very manual. It took 10 years just to let go of 16-bit device drivers, and many were never ported.
Classic Mac, Amiga, and Atari ST had an easier time, since their "16-bit" systems were already 32-bit internally. Even then you had a few years of dealing with geniuses who stored data
Re: (Score:2)
I've got a rule of thumb for how long it will take to move completely to 64-bit. Basically, every time we double the number of bits, the conversion takes twice as long. I'm sure someone could refine that, but it makes a tiny bit of sense (rough extrapolation sketched below).
Rectally extracted numbers:
4 to 8 : 2 years
8 to 16 : 5 years
16 to 32 : 10 years
32 to 64 : 20 years?
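For what it's worth, here's a minimal C sketch of that extrapolation, using the parent's own admittedly made-up figures; the doubling rule and all the numbers are the parent's guesses, not measured data.

```c
#include <stdio.h>

int main(void)
{
    /* The parent's own (self-described "rectally extracted") figures. */
    const int from_bits[] = { 4, 8, 16 };
    const int years[]     = { 2, 5, 10 };

    /* Rule of thumb: each doubling of word size roughly doubles the
     * transition time, so extrapolate 32 -> 64 from the 16 -> 32 figure. */
    const int predicted_32_to_64 = years[2] * 2;

    for (int i = 0; i < 3; i++)
        printf("%2d -> %2d bits: %d years\n", from_bits[i], 2 * from_bits[i], years[i]);
    printf("32 -> 64 bits: ~%d years (extrapolated)\n", predicted_32_to_64);
    return 0;
}
```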
Re: (Score:2)
The "problem" is that 32 bit is still enough for mostly everyone, while 32 bit gained quick popularity after windows 95 (18 years old)
Also, there is less incentive to upgrade your current working machine today than before.
Re: (Score:2)
You don't really think that all those 16-bit Windows apps statically linked in every Windows library, do you?
Re: (Score:2)
Library problems? Were there ever 16-bit programs that were not statically linked to their libraries?
Yes, and it was a largely manual process. I suggest you find an old 16-bit Windows programming textbook and learn about the hoops people used to have to jump through to implement dynamic linking. When software versions changed, then as now, what could possibly go wrong?
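For anyone who never had the pleasure, the runtime-linking dance looked roughly like the sketch below. This is the Win32 spelling of it (the 16-bit version used FAR PASCAL exports, and LoadLibrary signalled failure with numeric codes below 32 rather than NULL); the DLL name and the exported function are made up purely for illustration.

```c
#include <windows.h>
#include <stdio.h>

/* Hypothetical routine exported by a hypothetical SOMELIB.DLL. */
typedef int (CALLBACK *SOMEFUNC)(int);

int main(void)
{
    /* Load the library by name at run time... */
    HINSTANCE hLib = LoadLibraryA("SOMELIB.DLL");
    if (hLib == NULL) {                     /* Win16 returned codes < 32 here */
        fprintf(stderr, "could not load SOMELIB.DLL\n");
        return 1;
    }

    /* ...then look the routine up by its exported name and hope that the
     * DLL version installed on this machine actually still exports it. */
    SOMEFUNC pfn = (SOMEFUNC)GetProcAddress(hLib, "SomeFunc");
    if (pfn == NULL) {
        fprintf(stderr, "wrong SOMELIB.DLL version: SomeFunc missing\n");
        FreeLibrary(hLib);
        return 1;
    }

    printf("SomeFunc(42) = %d\n", pfn(42));
    FreeLibrary(hLib);
    return 0;
}
```

When the DLL on the user's machine was a different version than the one you developed against, exactly the "what could possibly go wrong" scenario the parent describes played out at run time rather than at link time.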
Still better IMHO (Score:3, Insightful)
Re: (Score:3)
Re: (Score:2)
Re:Still better IMHO (Score:5, Insightful)
Re: (Score:2)
Aren't Intel's chips faster clock for clock right now? Not to mention much more efficient?
Re: (Score:2)
Hmmm, I was actually using the word "faster" in a "get shit done more quickly" sense - not higher clock speeds. I.e. a 3GHz Haswell i5 being faster than a 3GHz Bulldozer (or whatever the latest generation is called) for a purely CPU-limited single-threaded workload. That's what I'm asking - is this not still the case?
The fact that I can crank an unlocked i5/i7 to 4.5-5.0 GHz without any issues whatsoever is just icing on the cake.
Re:Still better IMHO (Score:4, Insightful)
See, this is why I asked. Looking at benchmark lists (things like Cinebench) would lead me to believe that this is nowhere near the case, with the fastest AMD chip (with a 4.4GHz single-threaded turbo vs. 3.9GHz on the fastest Intel chip in this benchmark) barely keeping up with good old 1st-gen Core i5/i7 chips.
http://www.tomshardware.de/charts/cpu-charts-2012/-01-Cinebench-11.5,3142.html [tomshardware.de]
Multithreaded workloads are a different story, of course, what with AMD having consumer octacores on the market: http://www.tomshardware.de/charts/cpu-charts-2012/-02-Cinebench-11.5,3143.html [tomshardware.de]
Comparison with current CPUs (Score:2)
I was hoping to find a current review of the processor against current CPUs....
However, in AnandTech's Bench you can compare an AMD Athlon X2 4450e (2.3GHz - 1MB L2) with current CPUs. If you compare it to an Intel Core i7 4770K (3.5GHz - 1MB L2 - 8MB L3, one of the best CPUs right now), you find that the Intel CPU is between 3 and 9 times faster. Most of the time it is about 6-7 times faster.
See http://www.anandtech.com/bench/product/37?vs=836 [anandtech.com]
However, if you could compare an AMD FX-51 with a
Re: (Score:2)
The Athlon X2 4450e was released in April 2008, so we are only looking at 5 years' difference, not 10. I think the more interesting comparison would be how the Athlon 64 FX-51 and the new Apple A7 chip stack up, given that they are the first 64-bit chips of their class.
AMD was king of the hill, but... (Score:4, Interesting)
AMD, as most of you have forgotten, purchased a CPU design company not long after it lost the right to clone Intel CPU designs. The people from this company gave AMD a world-beating x86 architecture that became the Athlon XP and then the Athlon 64 (and the first true x86 dual core), thrashing Intel even though AMD was spending less than ONE-HUNDREDTH of Intel's R&D budget.
What happened? AMD top management sabotaged ALL future progress on new AMD CPUs in order to maximise salaries, bonuses and pensions. A tiny clique of cynical, self-serving scumbags ruined every advantage AMD had gained over Intel for more than 5 years afterwards. Eventually AMD replaced its top management, but by that time it was too late for the CPU. Obviously, AMD had far more success on the GPU side after buying ATI. (PS: note that ATI had an identical rise to success, when that company also bought a GPU design team that became responsible for ALL of ATI's world-beating GPU designs. Neither AMD nor ATI initially had in-house talent good enough to produce first-rate designs.)
Today, AMD is ALMOST back on track. Its Kaveri chip (2014) will be the most compelling part for all mains-powered PCs below high-end/serious gaming. In the mobile space, Intel seems likely to have the power-consumption advantage (for x86) across the next 1.5 years at least. However, even this is complicated by the fact that Nvidia is already on ARM, and AMD is following Nvidia, soon to combine its world-beating GPUs with ARM CPU cores.
At this exact moment, AMD can only compete on price in the CPU market. Under load, its chips use TWICE the power of Intel parts. In heavy gaming, average Intel i5 chips (4-core) usually wallop AMD's best 8-cores. In other heavy apps, AMD at best draws equal, but just as commonly lags Intel.
Where AMD currently exterminates Intel is with SoC designs. AMD won total control of the console market, providing the chips for Nintendo, Sony and Microsoft. Intel (and Nvidia) were literally NOT in the running for these contracts, having nothing usable to offer, even at higher prices or lower performance.
AMD is currently improving the 'Bulldozer' CPU architecture once again for the Kaveri 4-core (plus massive integrated GPU and 256-bit bus) parts of 2014. There is every reason to think this new CPU design will be at rough parity with Intel's Sandy Bridge, in which case Intel will be in serious trouble in the mains-powered desktop market.
Intel is in a slow but fatal decline. Intel is currently selling its new 'Atom' chips below cost (illegal, but Intel just swallows the court fines) in an attempt to take on ARM, but even though the new 'Atom' chips are actually Sandy Bridge class, and have a process advantage, they are slaughtered by Apple's new A7 ARM chip found in the latest iPhones. The A7 uses the latest 64-bit ARM design, ARMv8, making the A7 an excellent point of comparison with the original Athlon 64 from years back.
Again, AMD is now x86 *and* ARM. AMD has two completely distinct and good x86 architectures ('Stars-class' and 'Bulldozer-class'). Intel is only x86, and now with the latest 'Atom' has only ONE x86 architecture in its worthwhile future lineup. Intel has other x86 architectures, but they are complete no-hopers like the original Atom family, the hilariously awful Larrabee family, and the putrid new microcontroller family. Only Intel's current Sandy Bridge/Ivy Bridge/Haswell/new-Atom architecture has any value.
Re: (Score:2)
AMD is not ARM. ARM is ARM. Anyone can buy an ARM license and start releasing ARM chips. AMD are producing ARM chips because they can't compete with Intel in the x86 market.
Nothing stops Intel releasing ARM chips, as they have in the past, except the margins would probably be awful compared to their x86 lineup.
Re: (Score:3)
The most telling thing about AMD is that their first-generation Bulldozer-architecture CPUs were getting their pants creamed not just by their Intel competitors but by last-generation AMD parts.
that first tyan dual cpu board (Score:2)
So I was already feeling stoked about finally getting around to finding a matched pair of the fastest CPUs I can put into this board that's been sitting in a box for SIX YEARS (they boot!), and now I read this. /me does the peacock-strut happy-dance thing.
I was always a fan of AMD, going back to the 8x300 bit-slice stuff. They're clever boys.
Those bastards (Score:5, Funny)
Apple just released a 64-bit processor, and now AMD is copying it TEN YEARS ago?!?
Can the industry please do something original and quit just following wherever Apple leads it?
Re: (Score:2)
First: yes, it's clearly a joke. How can you copy something done ten years earlier? :P
Second, Apple does design their own ARM CPUs these days. They design and ship licensed ARM CPUs for their iOS devices, which include the Apple TV, iPhones, iPads, and iPods, but for their Mac / OS X business they are still 100% Intel. Their latest design is starting to turn some heads. [wikipedia.org]
Re: (Score:3)
The chip that sunk the Itanic (Score:4, Interesting)
The instruction set itself was a yawner - I was looking forward to 64-bit being the point where all CPUs become RISC, and where Windows NT could go from being Wintel-only to NT/RISC.
However, one delicious piece of irony that I love about the Opteron/Athlon 64 is that this was the architecture that sunk the Itanic. The Itanium had sunk far worthier chips before it - PA-RISC, DEC Alpha and MIPS V - but this architecture brought out the Itanic in Itanium. Originally, the Itanium was supposed to be the 64-bit replacement for x86, but thanks to this move from AMD, it never happened. Instead, AMD started stealing the market, and to add insult to injury, when Intel tried entering with 64-bit extensions of its own, Microsoft forced them to be AMD-compatible. So Intel was ultimately forced to let x64 be the successor to x86, and to let Itanium wither on the vine.
Once that happened, Itanium followed the same path as the better CPUs it had killed. Microsoft dropped support for it after Server 2008 (XP and Vista were never supported), Project Monterey collapsed, and to add insult to injury, even Linux - the OS that boasts about being ported everywhere - didn't want to remain supported on the Itanic. Today, the Itanic has as many OSes as the DEC Alpha had at its peak - 3: HP-UX, Debian Linux and FreeBSD.
So no, x64 didn't change the world. But it sure sank the Itanic!
Re:Before AMD committed suicide (Score:5, Insightful)
AMD is very competitive for many-core workloads. Getting an equivalent core count from Intel can cost as much as a second AMD system. AMD has gone wide, Intel has gone deep. Both have their applications.
Re: (Score:3, Insightful)
Tell that to Tom's Hardware and others, whose x87 benchmarks and games like Skyrim show an AMD 8-core being handed a smackdown by an i3.
No one believes in AMD anymore.
Re:Before AMD committed suicide (Score:5, Insightful)
For games sure, but there are lots of workloads that are not games.
Re: (Score:3)
and many games are not made by Bugthesda.
Re: (Score:2)
Re: (Score:2)
My desktop is used to run KVM. Please tell me all about how a single-core Intel would be good enough. I shall wait.
Re: (Score:2, Insightful)
Re: (Score:2)
Re: (Score:3)
Re: (Score:3)
I go by Cinebench myself. It seems completely neutral.
http://www.bit-tech.net/hardware/2013/06/12/intel-core-i5-4670k-haswell-cpu-review/6 [bit-tech.net]
Re: (Score:3)
Tell that to Tom's Hardware and others, whose x87 benchmarks and games like Skyrim show an AMD 8-core being handed a smackdown by an i3.
True, it's horrible that review sites benchmark CPUs using the kind of programs people actually run on them.
I remember back when I bought my P4, the only thing a similarly priced Athlon XP really beat it on was x87-intensive games. Professional 3D apps using SSE were significantly faster on the P4, which is why I ended up buying it instead.
Re: (Score:3)
True, it's horrible that review sites benchmark CPUs using the kind of programs people actually run on them.
Yes and no. If you're a gamer, obviously having a CPU that the games are optimized for is a big win. But don't extrapolate general performance from a single benchmark, especially when one of the CPU vendors is providing "free" performance help for the game/application.
At this point, its pretty clear that choosing the Intel is the correct choice for big name games.
We will see if this changes over the ne
Benchmark what people use? Which people? (Score:5, Insightful)
Also, most benchmarks are synthetic (compiled with the Intel compiler), or 3D games, or video transcoding. I do none of that, and what I do do, I don't do on Windows. I have yet to see a site that benchmarks Java compilers, Java IDEs, databases, application servers or Java web servers, etc. on Intel vs. AMD vs. ARM platforms on Linux. The only site that comes close is Phoronix. And if you look at their Linux benchmarks, the difference between AMD and Intel CPUs is much smaller than on Anandtech or Tom's Hardware. Intel is still making faster CPUs, but not that much faster.
--Coder
Re: (Score:2)
Meanwhile, in the real world, most people's primary reason for buying a faster CPU is... to play games on Windows.
If AMD suck at gaming on Windows, they want to know that.
Now, if a site is only using games as a benchmark, yes, that's a problem. But Tom's Hardware, at least, usually covers a range of different types of application benchmarks, for the minority who aren't looking at the CPU as something to run games on.
Re: (Score:3)
Tell that to Tom's Hardware and others, whose x87 benchmarks and games like Skyrim show an AMD 8-core being handed a smackdown by an i3.
Awesome, you found a workload where deeper is better. Now go try costing out a cluster with hardware virtualization and ECC RAM to support several thousand SMP virtual machines and see what you come up with.
Re:Before AMD committed suicide (Score:5, Insightful)
The SPEC benchmarks tell a different story, and tend to be more representative, because each vendor does their best rather than Intel/Nvidia providing "free" performance-enhancement advice to game companies.
So, from my own experience, the AMD/Intel story is a lot closer than some of these benchmarks might lead you to believe. Especially for server applications.
It's pretty easy with modern CPUs to make fairly small tweaks that favour one CPU or another. We have a bunch of microbenchmarks for our application, and things like memcpy performance can be swung 2x-4x. The same goes for the depth of loop unrolling: in one loop the Intel may like a 2x unroll and the AMD a 4x unroll. With each version tuned to run best on its platform the bottom-line performance is often quite similar, but run the AMD-optimized build on the Intel, or the reverse, and suddenly one or the other CPU appears to be trouncing the other.
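To make the unroll-depth point concrete, here's a generic C sketch (not the poster's actual microbenchmark) of the same reduction written with a 2x and a 4x unroll; which depth wins depends on the CPU's issue width, latencies and register pressure, which is exactly why the two vendors' parts can prefer different versions.

```c
#include <stddef.h>

/* 2x unroll: two independent accumulators per iteration. */
double sum_unroll2(const double *a, size_t n)
{
    double s0 = 0.0, s1 = 0.0;
    size_t i = 0;
    for (; i + 2 <= n; i += 2) {
        s0 += a[i];
        s1 += a[i + 1];
    }
    for (; i < n; i++)          /* leftover element */
        s0 += a[i];
    return s0 + s1;
}

/* 4x unroll: more instruction-level parallelism, but also more
 * register pressure and a bigger loop body for the front end to feed. */
double sum_unroll4(const double *a, size_t n)
{
    double s0 = 0.0, s1 = 0.0, s2 = 0.0, s3 = 0.0;
    size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        s0 += a[i];
        s1 += a[i + 1];
        s2 += a[i + 2];
        s3 += a[i + 3];
    }
    for (; i < n; i++)          /* leftovers */
        s0 += a[i];
    return (s0 + s1) + (s2 + s3);
}
```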
Re: (Score:2)
Games typically don't benefit from horizontal scaling. Although many games have gone multi-threaded, there are only so many tasks you can hand off to additional cores.
What kills AMD is per-core licensing. (Score:4, Informative)
Oracle's Enterprise database costs $47,500 [oracle.com] per processor core. There is no way in heck that I'd choose AMD over Intel when I have to run more cores to get the same performance.
Microsoft SQL Server Enterprise costs $6,874 [microsoft.com] per processor core.
AMD has a heavy investment in the server space. They should negotiate lower per-core license costs in these cases; license parity with Intel is throwing them out of the data center.
As the developers of x86-64, they should have a patent portfolio to do serious damage to 64-bit x86 systems vendors. Use it.
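Back-of-the-envelope, using the per-core list prices quoted above: the core counts below are purely hypothetical (an assumed 8-core box vs. an assumed 16-core box delivering similar throughput), but they show why the per-core multiplier, not the hardware price, dominates the math.

```c
#include <stdio.h>

int main(void)
{
    /* Per-core list prices quoted in the parent post. */
    const double oracle_per_core    = 47500.0;
    const double sqlserver_per_core = 6874.0;

    /* Hypothetical core counts for roughly equal throughput --
     * illustrative assumptions only, not benchmark results. */
    const int cores_deep = 8;   /* fewer, faster cores */
    const int cores_wide = 16;  /* more, slower cores  */

    printf("Oracle EE:  %2d cores = $%9.0f   vs  %2d cores = $%9.0f\n",
           cores_deep, cores_deep * oracle_per_core,
           cores_wide, cores_wide * oracle_per_core);
    printf("SQL Server: %2d cores = $%9.0f   vs  %2d cores = $%9.0f\n",
           cores_deep, cores_deep * sqlserver_per_core,
           cores_wide, cores_wide * sqlserver_per_core);

    /* Real license terms add core-factor discounts (see the reply below),
     * but core count still multiplies straight into the bill. */
    return 0;
}
```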
Re: (Score:3)
Microsoft SQL Server Enterprise costs $6,874 [microsoft.com] per processor core.
Ha! Did you know that Microsoft reduces "per core" licensing costs for SQL Server on AMD processors, because otherwise nobody could justify buying them?
Re: (Score:2)
Need parity files on the disk itself as well as at least mirrored disks.
You need ZFS. :) No, really, it checksums all the writes, which reflects the modern reality. I've got a machine at home in the basement that is effectively just ECC RAM and a bunch of disks (RAID-Z on that one I think, RAID-Z2 at work) to store our home data. It's still cheaper to do it in one spot and then run non-ECC hardware elsewhere, accessing the reliable data over gigabit.
On my laptop, where I have many fewer options, I've ju
Re: (Score:3)
They are still competitive on a performance-per-dollar basis, and provide CPUs adequate for almost all standard needs.
Re:Before AMD committed suicide (Score:5, Informative)
They swooped in when Intel was being stupid, made the best chips in the world... then committed suicide and
If, by committed suicide, you mean they suffered when Intel bribed people like Dell not to use the clearly superior products, and so lost out on many billions in sales and hence the crucial R&D advantage, then yeah, sure, suicide.
Assisted suicide.
Assisted like throwing a healthy, happy person off a cliff.
haven't built a competitive chip in 3 years. Sad times...
Depends what for. For games, Intel seems to be better IF you're prepared to buy a separate GPU. If you look at Linux rather than Windows-centric benchmarks, the top-end AMD chips often lie somewhere between the top-end i5 and the top-end i7.
Sometimes they lose out, sometimes they beat the i7s.
For the kind of stuff I do, they're very competitive.
But if you play Skyrim, then no. Buy an i5 and a discrete graphics card.
Re:P4 vs Athlon XP (Score:5, Interesting)
As the author of the article:
In 2000-2001, the Athlon / Athlon XP were far ahead of the P4. But from January 2002 to March 2003, Intel increased the P4's clock speed by 60% and introduced Hyper-Threading. SSE2 became more popular during the same period. As a result, the P4 was far ahead of the Athlon XP by that spring in most content creation, business, and definitely 3D rendering workloads. Now, it's true that an awful lot of benchmark shenanigans were going on at the same time, and the difference between the two cores was much smaller in high-end gaming. But if you wanted the best 'all around' CPU, the P4 Northwood + HT at 2.8 - 3.2GHz was the way to go. Northwoods were also good overclockers -- it was common to pick up a 2.4GHz P4 and clock it to 3 - 3.2GHz with HT.
Athlon 64 kicked off the process of changing that, but what really did the trick was 1) Prescott's slide backwards in IPC and thermals, and 2) the introduction of dual-core. It really was a one-two punch -- Intel couldn't put two 3.8GHz Pentium 4 chips on a die together, so the 820 Prescott was just 3.2GHz. AMD, meanwhile, *could* put a pair of 2.4GHz Athlon 64s on a single chip. Combine that with Prescott's terrible efficiency, and suddenly the Athlon 64 was hammering the P4 in every workload.
Re: (Score:2)
Re: (Score:3)
Intel has their grand, big-iron-class, future-of-enterprise-computing 64-bit architecture, then AMD pops up "Hi guys, who wants a 64-bit CPU, fully backwards compatible with your 32-bit x86 code and pretty damn fast at that, for only slightly more than the price of a nice desktop CPU?"
Boom. Headshot.
Re: (Score:2)
Today, I do not see any apps in general use that need or require access to a contiguous block of memory beyond 4GB (frankly far, far less).
Then you're not looking very hard. There are plenty of games that fall over as soon as you install enough mods to go over 2GB (the maximum address space most 32-bit apps are allowed in Windows). There are also games which install crappy versions of textures in 32-bit mode because they would otherwise hit the 2GB limit.
PAE is a disgusting kludge. There is simply no reason not to run 64-bit apps on a 64-bit x86 OS... unless you're stuck with a proprietary antique like Windows.
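A quick way to see the ceiling being described: build the little probe below once as a 32-bit Windows binary (without the large-address-aware flag) and once as a 64-bit binary. It deliberately leaks its allocations, and how far it gets also depends on available RAM and pagefile, so treat it as a rough demonstration, not a precise measurement.

```c
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* Grab 64 MB blocks until malloc gives up.  A default 32-bit process
     * typically stops short of 2 GB of address space; the same code built
     * as a 64-bit binary keeps going until RAM and swap are exhausted. */
    const size_t block_size = 64u * 1024 * 1024;
    size_t total = 0;

    for (;;) {
        void *p = malloc(block_size);
        if (p == NULL)
            break;
        total += block_size;    /* blocks are intentionally never freed */
    }
    printf("allocated ~%zu MB before failure\n", total / (1024 * 1024));
    return 0;
}
```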
Re: (Score:2)
Today, I do not see any apps in general use that need or require access to a contiguous block of memory beyond 4GB (frankly far, far less).
Lacking inside information, I could be wrong, but I believe you'll find Photoshop, Premiere Pro, After Effects, SolidWorks, PTC Creo, various FEA packages, and CAM applications all use blocks of RAM beyond 4GB. And that's just a sample from MY machine.
More knowledgeable folks (i.e. coders for the above apps) can correct me if I've got it wrong. But there is clearly a need for a >32-bit memory address space.