AMD Preparing To Give Intel a Run For Its Money
jfruh writes: "AMD has never been able to match Intel for profits or scale, but a decade ago it was in front on innovation — the first to 1GHz, the first to 64-bit, the first to dual core. A lack of capital has kept the company barely holding on with cheap mid-range chips since; but now AMD is flush with cash from its profitable business with gaming consoles, and is preparing an ambitious new architecture for 2016, one that's distinct from the x86/ARM hybrid already announced."
Buh? (Score:5, Interesting)
But the real fight of a decade ago, when AMD was first to 1GHz, the first to 64-bit, the first to dual core, seemed missing. It's not surprising since the company was facing a real threat to its survival. But with a gravy train from the gaming consoles, it looks like the company is ready for a fresh battle, with a familiar face at the helm.
Uh, wait. No. It was surprising when AMD was the performance leader. It was surprising because they were broke. It's not surprising to see AMD pushing out a new architecture now that they have money. It takes a lot of money to do that. So we start out completely ass-backwards here.
Much elided, then
The most logical move for Keller would be to dump the CMT design in favor of a design with simultaneous multi-threading (SMT), which is what Intel does (as do IBM's Power and Oracle's SPARC lines).
Wait, what? Why? Why wouldn't it make more sense to just fix the lack of FP performance, perhaps by adding more FP units? Why would it make more sense for them to go to a completely different design? It might well, but there is no supporting evidence for that in the article.
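For what it's worth, the SMT-versus-real-cores distinction is easy to see on your own box. Here's a rough Python sketch (it assumes the third-party psutil package is installed, which is not a given) that just compares logical and physical CPU counts:

    # Rough illustration: on an SMT chip (e.g. Intel with Hyper-Threading), the OS
    # reports more logical CPUs than physical cores, because each core exposes two
    # hardware threads that share its execution units. Requires the third-party
    # 'psutil' package for the physical-core count.
    import os
    import psutil

    logical = os.cpu_count()                    # hardware threads the OS schedules on
    physical = psutil.cpu_count(logical=False)  # actual cores

    print(f"logical CPUs:   {logical}")
    print(f"physical cores: {physical}")
    if logical and physical and logical > physical:
        print("SMT appears to be enabled (threads share core resources).")
    else:
        print("No SMT detected (one hardware thread per core).")

On a CMT design like Bulldozer the picture is muddier, since what the OS calls a "core" is really half a module sharing an FP unit with its sibling.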
Re: (Score:2)
Re: (Score:2)
Re:Buh? (Score:4, Insightful)
Re: (Score:2)
AMD was dominant while Intel was chasing dead ends (Netburst and Itanium). Once Intel woke up and started working on sane chip designs again AMD's goose was cooked. They just can't compete with Intel's R&D budget.
Well, that was my point: AMD can afford to have an R&D budget right now. But you're right, Intel spent a lot of time dicking around with nonsensical architectures that it might well have spent crushing AMD sooner. On the flip side of that, though, is the question of whether they could have actually been more effective. Too many cooks, and all that. Spending more money doesn't necessarily result in getting where you want to go sooner. You tend to go somewhere, but not necessarily in your c
To be fair to Intel (Score:4, Interesting)
Netburst did seem like a reasonable idea, in testing. While it was low IPC, it looked like it would scale bigtime in the speed area. They had test ALUs running at 10GHz.
So I can see the logic: You make an architecture that can scale to high frequencies easily, and that gets you the speed.
Obviously it didn't scale, and wasn't a good idea, but I can see what they were going for. It wasn't like it was completely nuts.
Re: (Score:2)
Well, they completely forgot that higher clock frequencies translate to higher power consumption and higher thermal dissipation, and you can only remove heat from a chip so fast, since submerging it in coolant obviously isn't feasible for a consumer device. I remember some Intel presentations (I worked there at the time) where they were proudly bragging about how much power these chips consumed. Did they not think people might not like their computers generating that much
Re: (Score:2)
The longtime complaint about P4s is that they were a marketing ploy. People buy the chips with the most Hz, so we're going to give them the biggest numbers ever! Oh, and they'll look amazing in benchmarks. But then AMD's marketing guys came up with the crazy notion of just making up numbers to compare against Intel's mostly meaningless GHz figure, and the marketing advantage never materialized. Plus
Re: (Score:3)
No. Itanium was completely insane. Itanium took everything ever invented in computer architecture and tried to fit it onto one chip. At the same time it added every feature from x86 and PA-RISC, apart from the actual ISA of course, in order to simplify porting operating systems and other software. Making e.g. 10W Itanium chips is out of the question unless you software-emulate the whole thing at 10MHz on a sane architecture.
Anyway, Itanium succeeded in killing off the Unix market for MIPS and PA-RISC and al
I'm Still Rooting for AMD (Score:5, Interesting)
Re: (Score:2)
Re: (Score:2, Informative)
As others have pointed out, AMD have historically beaten Intel when Intel fscks up. Intel needed the P4 to keep ramping up clock speed because it had sucky IPC, and it hit a brick wall, so AMD beat them because they had significantly better IPC at similar clock speeds. Intel wanted everyone to switch to Itanium, so that was their 64-bit push, while AMD pushed 64-bit into the x86.
As soon as Intel realized they needed good IPC on a 64-bit x86 you couldn't fry eggs on, AMD was back to second place, and have be
Drivers? (Score:5, Insightful)
Honestly they need a better team writing the drivers. You can have the best CPU/GPU in the industry but if the drivers suck, no one will want to buy them.
[John]
Re: (Score:2)
Re: (Score:3)
Re: (Score:3)
This is incredibly true and has been for over a decade. They don't seem to realize that they have better hardware for the same price, but people refuse to buy it because it's hard to appreciate the betterness without proper drivers.
I'm sure the FLOSS crowd would also start embracing AMD if they shipped decent OSS drivers like Intel does.
They're fools (Score:3)
This was their opportunity to dominate the CPU market with the Mill CPU architecture [millcomputing.com] and they blew it.
wrong (Score:4, Interesting)
Sorry AMD, you're heading in the completely wrong direction. CPUs are already plenty fast. They have been for years. 3D gaming is starting to look like just another "Gold plated speaker wire" guy hobby as everyone moves to mobile devices.
The real winners in the future are going to be the very cheap, very efficient chips. Do you want one very powerful computer to run everything in your house? Or do you want everything in your house to have its own dedicated, highly efficient CPU that does just what that device needs?
Re: (Score:2)
Re: (Score:2)
What AMD should consider are FPGAs and different power cores on the same die. This isn't anything new, but done right, it can go a long way in the server room.
The FPGAs can be used for almost anything. Need a virtual CPU for AES array shifting? Got it. Need something specialized for FFT work? Easily said, easily done. Different power-utilization cores would be ideal for a server room where most of the hosts see peak load during the day, then end up idle after quitting time.
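To put a rough number on the kind of kernel an on-die FPGA would be absorbing, here's a toy Python/NumPy timing of a batch of FFTs done on the CPU (NumPy assumed installed; the figure is purely illustrative and will vary wildly by machine):

    # Toy benchmark of the sort of kernel an on-die FPGA could take over:
    # a batch of 1024-point FFTs done on the CPU with NumPy.
    import time
    import numpy as np

    signals = np.random.randn(4096, 1024)   # 4096 signals, 1024 samples each

    start = time.perf_counter()
    spectra = np.fft.fft(signals, axis=1)    # FFT along each row
    elapsed = time.perf_counter() - start

    print(f"{signals.shape[0]} FFTs of {signals.shape[1]} points in {elapsed * 1000:.1f} ms")

The point isn't the absolute number; it's that this is a fixed, regular dataflow, which is exactly what maps well onto FPGA fabric, while the branchy general-purpose stuff stays on the CPU cores.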
Re: (Score:2)
Anything with an FPGA is always going to be in a niche market.
It won't let you multi-task, since you can't reprogram it whenever you context switch.
Users don't want messages saying they need to wait for X to finish before they start Y.
They're also very expensive because they use a lot of silicon.
And they consume a lot of power too.
Re: (Score:2)
CPUs are already plenty fast. They have been for years.
Incorrect. CPUs are plenty fast and have been for years for doing many common tasks. The fact is that they aren't nearly fast enough (particularly for single-threaded items) and almost certainly won't be for another decade or more. There's a limit to what and how much you can multi-thread, and even then, you're still limited by single-thread performance x number of threads.
So yes, for grandma playing Blackjack on Yahoo, today's CPUs are plenty fast. For me and many others? The fastest stuff available is 100
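To make the single-thread point concrete, here's a quick Amdahl's-law sketch in Python (the 90% parallel fraction is just an illustrative assumption, not a measurement):

    # Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the fraction of
    # the work that parallelizes and n is the number of cores. Even at p = 0.9,
    # sixteen cores only buy about a 6.4x speedup; the serial part dominates,
    # and only faster single-thread performance can shrink it.
    def amdahl_speedup(parallel_fraction, cores):
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

    for cores in (2, 4, 8, 16, 64):
        print(f"{cores:3d} cores: {amdahl_speedup(0.9, cores):.2f}x speedup")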
Re: (Score:2)
I don't think CPUs are plenty fast at all. The stagnation in CPU speeds for the last 15 years has been dreadful for PC applications and the industry. Prices of computers have fallen, and software isn't much more capable than it was in 2000. If we had had growth in CPUs like 1985-2000 during the 2000-2015 period ... it would be awe-inspiring how terrific our machines would be today.
Re: (Score:3)
Only for the short-term. In a 25-year timeframe, the tech needed for 3D gaming is going to become the most important branch in the computer industry. Why? Because we'll need it to drive holographic display technology. To generate a hologram in real-time, you need to convert a virtual 3D scene to a 2D interference pattern. Transmit that pattern to a display, shine the appropriate ligh
Input is another key part of the problem (Score:3)
3D gaming is starting to look like just another "Gold plated speaker wire" guy hobby as everyone moves to mobile devices.
Let me know when a substantial number of people start buying MOGA clip-on gamepads for their mobile devices. Until then, even the smartphone or tablet with the strongest CPU and GPU will be limited by its touch input. Mega Man 2 and Castlevania ran comfortably on 1.8 MHz CPUs, yet not even a 1.8 GHz CPU can add buttons to a device that doesn't have them.
Best low-cost CPU with half-decent GPU? (Score:2)
I'm looking at the new Intel G3240 with Intel HD 4000, and I was wondering if something in the same price range ($70 CAD) from AMD had an equivalent CPU with a better GPU.
Re: (Score:2)
I'm not even sure the G3240 comes with the HD 4000 because Intel makes it near impossible to know which GPU is used inside a lot of their CPUs, listing only "Intel HD".
Re: (Score:2)
Re: (Score:2)
(reply for both jandrese and washu_k)
Thank you for your comments. I guess I'll go with the G3240 since it's a better CPU, endure the Intel HD GPU for now and add a GTX 750TI later.
Re: (Score:2)
The closest AMD in price with a GPU is the A6-6400K. It would be quite a bit better in the GPU department, but MASSIVELY worse in the CPU department. Not even close in CPU power. To get something that won't cripple you on CPU you would need to go up to the A8-6600K, but that is over $110 at the CAD stores I checked and would still be way worse in single-thread CPU.
There are also the new Kabini CPUs and the top end
Compaq was afraid to use AMD chips FOR FREE (Score:4, Interesting)
Compaq was afraid to use AMD chips given out for free, because Intel would "retaliate", ok?
What kept AMD's market share low was not its competitor's "clever marketing"; it was crime.
Back in P4 Prescott times, Intel's more expensive, more power-hungry, yet slower chips outsold AMD's 3 or 4 to 1.
Given that AMD couldn't profit even with superior products, it's really astonishing to see the company still afloat.
Free? Keep in mind they'd lose Intel Payola (Score:2)
Compaq was afraid to use AMD chips given out for free, because Intel would "retaliate", ok?
What kept AMD's market share low was not its competitor's "clever marketing"; it was crime.
Back in P4 Prescott times, Intel's more expensive, more power-hungry, yet slower chips outsold AMD's 3 or 4 to 1.
Given that AMD couldn't profit even with superior products, it's really astonishing to see the company still afloat.
Intel's Payola [1] (which basically kept Dell profitable for several quarters of the past decade) is something you have to factor in when looking at these "deals". I'm just sad that Intel didn't pay a bigger price for their purely anticompetitive corrupt practices.
[1] http://www.theatlantic.com/tec... [theatlantic.com]
Re:Compaq was afraid to use AMD chips FOR FREE (Score:4, Interesting)
While Intel did a lot of shady things, part of it was also that AMD didn't have nearly enough fab capacity to supply the market. Those kinds of decisions are made years in advance; you don't just pop up a sub-100nm processing plant on demand. So when AMD had a huge winner, they surely produced everything they could and got a nice premium on their products, but the remaining demand had to go to Intel. It's just not the sort of market battle you can win quickly.
Re: (Score:3)
Compaq was afraid to use AMD chips given out for free, because Intel would "retaliate", ok?
Let's be honest though, Compaq wasn't known for making good business decisions.
2016 (Score:4, Funny)
By 2016 AMD will have a CPU that beats the sh*t out of Intel's 2014 best offerings.
Fight the good fight (Score:2)
I'll believe it when I see it. (Score:2)
AMD has pissed away massive leads over Intel in the past.
AMD single-handedly created the x86-64 market from NOTHING.
Then they fell back on their laurels.
Then they bought a graphics company.
Their last effort in the market was basically a fizzle. Forgoing a custom chip designed to eke the maximum efficiency and power from the device, they went with a crappy computer-designed monstrosity that was basically the worst of all worlds, and a flame-throwing power hog to boot.
Sure, they can kick out a processor th
SMP? (Score:2)
It would be nice to see AMD offer a 4-8 processor chipset that would allow you to highly parallelize their chips. Intel can do it, but the premium for Xeon silicon is outrageous. Not sure if AMD has enough business in that market that they're willing to chuck it in hopes of getting a leg up, but I sure as hell wish I could drop a second CPU into my desktop so I don't have to chuck the entire thing and buy a whole new board/CPU from Intel just to get a 50% boost in performance every 3-4 years.
Re:Only the great Master of Paper can save AMD (Score:4, Insightful)
I stick with Intel because they make the best CPUs you can buy right now.
But I'd love to see AMD back in the game. I bought the first X2 Athlon series, what a beast that was.
Sadly that was also the last AMD CPU I've purchased.
Re:Only the great Master of Paper can save AMD (Score:5, Informative)
I bought the first X2 Athlon series, what a beast that was.
Sadly that was also the last AMD CPU I've purchased.
The Phenom II X3 was also an absolute monster for the price, as was the Phenom II X6.
Re: Only the great Master of Paper can save AMD (Score:2, Informative)
Got a Phenom II x4 as it was the best bang for the watts and I was building an always on multi-purpose rig.
Re: (Score:3)
AMD made it easy to upgrade incrementally; not sure if the same would have been true of Intel as I've not had an Intel desktop in over 10 yrs.
Bought an Athlon X2 with an nForce-based mainboard & DDR2 RAM in 2006.
Maxed out the RAM & upgraded to an Athlon II X4 in 2009 while keeping the same mainboard.
In 2011, bought a new 990FX-based board to get SATA 3 / USB 3 & DDR3 RAM but kept the same 4-core CPU.
Just last week, got an 8320 Black Edition 8-core at a good price and might soon get my first "AMD" videocard a
Re:Only the great Master of Paper can save AMD (Score:5, Informative)
AMD made it easy to upgrade incrementally; not sure if the same would have been true of Intel as I've not had an Intel desktop in over 10 yrs.
No, intel changed sockets more than AMD did in that period. I got an AM3+ board, so I went from Phenom II X3 720 to Phenom II X6 1045T, which I still have. If you're not expecting massive single-thread performance, it is still a fairly beefy CPU. I mean, sure, half as much as an intel chip, but I paid a hundred bucks for this (and for my original CPU) and you'd have to spend $200 to get an intel chip with this much horsepower today. AMD-chipset motherboards are cheaper than intel-chipset motherboards as well, so the total savings was at least $200 if not more. Today, I still have more than enough CPU for anything I want to do; It's the 240GT that's holding me back now. Been thinking about a modest upgrade to a newer nvidia card pulled from a Dell on ebay for $60.
Re: (Score:2)
I'd advise against it unless it's a very new card or you're not intending to game; Nvidia has just dropped support for all pre-Fermi cards and will be dropping support for Fermi (400 and 500 series cards) with their next hardware revision.
That's OK, I'm looking at a 640GT. I do game, but I mostly play older games and I don't demand full detail at 1920x1200, my display res.
The 240GT is enough to play many older games at full res with full options, but of course, no newer games. And it won't even play most newer games with even half the options turned on. Even Star Trek Online taxes my 240GT.
I chose the card on the basis of power consumption. The 250GT was the hot card when I bought it (I've had it long enough to replace the GPU fan!) and the
Re:Only the great Master of Paper can save AMD (Score:4, Informative)
Re: (Score:2)
* just happened to be the generation I was on when I was ready to upgrade my 1st-gen Phenom
Re: (Score:2)
I do wonder what the future has in store for the humble CPU. With a huge market shift towards tablets and phones in the consumer area, where power savings are more important than raw oomph, as well as a similar shift in a good portion of the server market, are we starting to reach an era of CPUs being "good enough" for most people, with performance beginning to stagnate?
Hopefully some good competition between AMD and Intel will keep things fresh and fast.
Re:Just like Bulldozer? (Score:5, Informative)
RTFA (Score:3, Insightful)
Did you RTFA?
More of the same? Probably not.
Re: (Score:2)
What a maroon.
No kidding (Score:5, Insightful)
I would -love- to see AMD truly competitive with Intel on every level because it is only good for us consumers. It would be great if both companies made chips so fast, efficient, stable, and capable that you didn't buy AMD or Intel based on anything but who had the better deal that week.
However I'm not interested in hype and bullshit. As you say, "put up or shut up." I get tired of hearing about how great your shit will be in the future. Guess what? Intel's shit will be great in the future too, probably. It is great right now.
So less with the hype, more with the making a good CPU.
Re:Just like Bulldozer? (Score:5, Insightful)
Well, not so long as it used to be. I recently got a Macbook Pro and under "About This Mac / Processor" it says "2.3 GHz Intel Core i7" - the same thing it says on a Macbook Pro I got 3 years ago. The CPU is not actually identical of course - it has much-improved battery life, which is good. But the performance increase, if any, is not noticeable. Times really have changed.
Re:Just like Bulldozer? (Score:5, Interesting)
Last I looked, Intel's R&D budget was larger than AMD's revenue
That certainly was true (and probably still is), but it's misleading. AMD no longer owns fabs, and the majority of Intel's R&D spending is on process technology. By spinning off GlobalFoundries, AMD is able to share that R&D cost with other SoC makers and go to other companies if they happen to be able to do it better at a specific time.
Re: (Score:2)
how is spinning off your fabrication capability 'good' in the long run? (not trying to be flippant, it's a serious question)
Re:Just like Bulldozer? (Score:5, Interesting)
how is spinning off your fabrication capability 'good' in the long run?
I don't work at AMD, but I do work at another company that relies partly on foundries.
Basically, it's economies of scale and competition. Semiconductor fabrication processes keep getting more expensive. Foundries specialize in process development and spread the R&D across many, many customers. Unless you're willing to spend a fortune keeping up (as Intel is), have special requirements, or need a ton of volume, you have little to gain and a lot to lose from rolling your own process. Remember, you don't just have to make transistors; you also have to have good enough yield to turn a profit and good enough reliability to keep your customers. If you fail, you have to spend even more money to fix the fab on top of the money you're losing on the stuff you manufacture. Meanwhile, TSMC is cheerfully cranking out wafers for your competitors.
Re: (Score:2)
I would also add that unless AMD plans to migrate its cheaper chip business onto the older fabs, it might well find itself having to manage fabs, i.e. run a foundry business, just to recoup the investment. Since AMD doesn't really make cheap-as-chips chips (the kind of stuff that Broadcom makes), it shouldn't be in the business of fabs. It ought to leave that to the likes of TSMC, who can manage that migration much better than AMD.
Re: (Score:2)
That's a good question. After all, Intel was able to use their vastly superior fab capabilities to fend off AMD's enhanced tech for years until they released their Nehalem architecture to definitively take back the desktop CPU performance crown.
They're not called ChipZilla for nothing.
Re: (Score:2)
Intel does that by spending massive scads of money on process technology. They are able to spend that because they massively overcharge for their product if you take only the design, manufacturing, and distribution costs into account; the price you pay also has to cover their process development.
AMD let a bunch of designers go; it's not clear whether Intel would have eaten their lunch so aggressively otherwise. Guess we'll have some inkling here soon.
Re:Just like Bulldozer? (Score:5, Informative)
It helps to bribe system builders to keep AMD out of most consumers' machines.
http://hardware.slashdot.org/s... [slashdot.org]
Re: (Score:2)
Re:Just like Bulldozer? (Score:5, Insightful)
Yup, and the BS about them being first to 64-bit...maybe in the consumer sector, but Intel, IBM and DEC all had 64-bit chips before the Athlon was even designed let alone shipped.
They invented the architecture that you probably typed your post on. That was the point. Heck, on my linux distro it is still called amd64...
Re:Just like Bulldozer? (Score:5, Informative)
Not the architecture; that belongs to Intel. AMD extended it to support 64 bits.
What are you on about? amd64 is not an architecture, nor is x86. They are instruction sets. The underlying architecture may be informed by the instruction set, but it's also only loosely coupled in modern CPUs.
Re: (Score:2)
Not the architecture; that belongs to Intel. AMD extended it to support 64 bits.
What are you on about? amd64 is not an architecture, nor is x86. They are instruction sets. The underlying architecture may be informed by the instruction set, but it's also only loosely coupled in modern CPUs.
This is why the term "microarchitecture" is useful. In addition to the term "instruction set", the term "instruction set architecture" is also used, so "architecture" is used for both.
Re: (Score:2)
Yeah, but ISAs are all but over, and they are over in x86-land. We haven't had an x86 core defined by its instruction set since the 80486. Even the Am586 was internally RISCy.
Re: (Score:2)
Yeah, but ISAs are all but over, and they are over in x86-land.
They're not "over" to compiler writers and assembler-language programmers.
We haven't had an x86 core defined by its instruction set since the 80486. Even the Am586 was internally RISCy.
So? They (and the latest z/Architecture chips) might translate native instructions into micro-ops and schedule and execute those micro-ops, but the only way in which those micro-ops, or other implementation details of the processor, are visible to code, or to the people and software that generate code, is that they may affect the performance of particular sequences of instructions, so that, for example, a compiler might optimize differently fo
Re: (Score:2)
Yeah, but ISAs are all but over, and they are over in x86-land.
They're not "over" to compiler writers and assembler-language programmers.
IS != ISA, HTH HAND
Re: (Score:2)
Yeah, but ISAs are all but over, and they are over in x86-land.
They're not "over" to compiler writers and assembler-language programmers.
IS != ISA, HTH HAND
It would only help if it were true. What are your definitions of "instruction set" and "instruction set architecture", and what citations can you give that would make those definitions worth taking seriously, as opposed to, for example, Intel's use of "instruction set" [intel.com] and Intel's use of "instruction set architecture" [intel.com]?
Re: (Score:2)
Instruction Set Architecture (ISA) is also a common name for it. That and tons of linux distributions and people refer to it as "arch" or "architecture". GNU coreutils even has a program named arch that prints out the "machine hardware name" (ew...).
The fact is lots of people call it architecture and it can reasonably be called architecture in the given context. I'll allow it!
Not denying it's important to know the difference, though!
Re: (Score:3)
What are you on about? amd64 is not an architecture, nor is x86. They are instruction sets. The underlying architecture may be informed by the instruction set, but it's also only loosely coupled in modern CPUs.
AMD64 and x86 most certainly are architectures. Have you heard the term "instruction set architecture", i.e., ISA? The underlying implementation you refer to is usually referred to as the "microarchitecture" to distinguish it from the ISA.
The term "architecture" is often tossed around to refer more
Re: (Score:2)
AMD64 and x86 most certainly are architectures.
Nope. I will correct you one time and then I'm done with this stupid thread.
AMD64 and x86 are instruction sets; strictly speaking, x86 isn't even a single instruction set, it's a name for a family of instruction sets. Once upon a time, instruction sets were related directly to architectures. In PC-land, that time ended with the 80486. All x86-compatible processors since have been some other kind of core internally, with decode and encode on the way in and out of the CPU to make it look like an x86 processor.
We no longer have instru
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re:Just like Bulldozer? (Score:4, Interesting)
That is true. However, AMD was the first to make a 64-bit architecture that was x86 compatible. It was also the first 64-bit CPU in a price range acceptable to the average consumer. But most importantly, AMD designed an architecture so successful that Intel decided to make its own AMD-compatible CPUs. Today Intel probably earns most of its money on CPUs using AMD's 64-bit design.
But if AMD now wants to go and build an entirely new design that is nothing like x86, it may very well be repeating the exact same mistake Intel made that let AMD64 take the lead.
By now it might be safe to ditch all 8-, 16-, and 32-bit backwards compatibility with the x86 family. But AMD64 compatibility is too important to ignore.
Re:Just like Bulldozer? (Score:5, Insightful)
" Itanium failed to make significant inroads against IA-32 or RISC, and then suffered from the successful introduction of x86-64 based systems into the high-end server market, systems which were more compatible with the older x86 applications." http://en.wikipedia.org/wiki/I... [wikipedia.org]
So the point is that AMD was more than capable of producing a chip to beat Intel.
Re:Just like Bulldozer? (Score:4, Insightful)
I think the point was even with Intel's massive cash and infrastructure they couldn't bring 64 bit to the desktop
Wrong. They could have if they had wanted to, but they didn't want to. They wanted 64-bit to remain in the realm of big-iron, so they could sell their big, overpriced Itanic chips. Whenever anyone asked about 64-bit chips, Intel said "buy our Itanic!". When anyone complained about the 4G memory limitation inherent with 32-bit chips, they pointed to their crappy PAE extension.
Then AMD came out with the X86-64 ISA, and then suddenly Intel looked stupid. They tried to say things like "people don't need 64 bits on desktop systems", "you can use PAE to use more than 4G", "no one needs more than 4G", until they trotted out their hastily-made "EM64T" version.
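For anyone who wasn't around for it, the arithmetic behind the PAE complaint is simple; here's a quick Python sketch of the address-space numbers (36 bits is the original PAE spec, and 48 bits is what early x86-64 parts actually implemented):

    # Why PAE was a stopgap rather than a fix: it widens *physical* addresses to
    # 36 bits (64 GiB of RAM for the kernel to map), but each 32-bit process is
    # still stuck with a 4 GiB virtual address space. x86-64 lifts both limits.
    GIB = 2 ** 30

    virtual_32bit  = 2 ** 32   # per-process virtual space, with or without PAE
    physical_pae   = 2 ** 36   # physical RAM addressable via PAE
    virtual_x86_64 = 2 ** 48   # virtual address bits implemented by early x86-64

    print(f"32-bit virtual space: {virtual_32bit // GIB} GiB")           # 4 GiB
    print(f"PAE physical limit:   {physical_pae // GIB} GiB")            # 64 GiB
    print(f"x86-64 virtual space: {virtual_x86_64 // GIB // 1024} TiB")  # 256 TiB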
Re: (Score:2, Insightful)
EM64T was in the labs for several years before it was released to market, just like "Jackson Technology", or "Hyper-Threading", was in the labs several years before its introduction to market, so "hastily-made" is demonstrably false.
As someone stated earlier, EM64T was already cooking in the labs at the same time Itanium (IA-64) was cooking. Intel gambled on IA-64 and wanted to start moving away from IA-32, but fortunately for the rest of the world, AMD forced their hand.
I know a lot of this first hand, as I a
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
First in the sense that Apple made the first tablet. ;-)
Re: (Score:2)
Re:First to 64-bit (Score:4, Informative)
Re:First to 64-bit (Score:5, Insightful)
You mean first to x86-64. Intel had a 64-bit processor before that (Itanium). 13 years later, Itanium is dead and x86 is holding us back, so much that servers are turning towards ARMv8 (inferior design to Itanium, but tons of momentum from mobile/embedded).
You do realize that this run towards ARM is not a full stampede; it is driven by price and operating costs, and it's only useful for Unix/Linux systems, as Windows Server isn't really interested in supporting ARM yet. This is more like a trickle of some large specialized systems off onto Red Hat (or similar) systems, where one can afford to just change processors and recompile everything in an effort to save a bit of operating power and hardware cost. But you have to be looking at enough servers to make this worth the labor cost.
So, while I don't care for the x86 family and would love everybody to switch to ARM, I know it's not going to happen in my career without that "killer" app that pushes everybody off of Windows. Right now, with "Office" being the "killer, got to have" application of all time, and that generally only running on Windows, guess what? x86 is here to stay.
First to 64-bit (Score:2)
x86 is holding us back, so much that servers are turning towards ARMv8 (inferior design to Itanium, but tons of momentum from mobile/embedded).
The x86 ISA is not holding us back. IMO, the only thing that motivates people to turn to ARM for servers is that AMD is not giving Intel sufficient competition in the server space. No one wants an Intel monopoly, and if AMD is not going to be an effective alternative, then people are forced to look beyond x86 for one. But that has nothing to do with the relative
Re: (Score:3)
I've been hearing "x86 is holding us back" for, oh, over 20 years now.
Re: (Score:3)
Liquidation isn't entertaining...
Re:target foot acquired! (Score:4, Informative)
Way to shoot yourself in the foot, AMD. I don't want or need a new architecture. I want x86 (and x64) for my PC and laptop, the end.
Another reason to avoid the unqualified term "architecture" when speaking either of instruction sets or chip designs; person A may read "architecture" as "instruction set architecture" and person B may read it as "microarchitecture". I suspect they're talking about a new microarchitecture, implementing the x86-64 instruction set architecture, here.
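As a concrete illustration of the two senses, here's a small Linux-only Python sketch (it reads /proc/cpuinfo, so availability and output will vary by system): the machine string names the instruction set architecture, while the model name tells you which microarchitecture happens to implement it.

    # The same ISA string ("x86_64") covers wildly different microarchitectures;
    # the model name is what distinguishes, say, a Haswell from a Piledriver.
    # Linux-only: /proc/cpuinfo does not exist on other operating systems.
    import platform

    print("ISA (what code is compiled for):", platform.machine())  # e.g. 'x86_64'

    try:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("model name"):
                    print("Implementation:", line.split(":", 1)[1].strip())
                    break
    except OSError:
        print("No /proc/cpuinfo here; this sketch is Linux-only.")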
Re: (Score:2)
Windows NT used to run on DEC Alpha.
Re: (Score:2)
Re: (Score:2)
Why do they need to make better drivers? Their graphics division makes all its money selling GPUs to Microsoft and Sony. No driver issues there.
Re: (Score:2)
K6, the Celeron killer!
Re: (Score:2)
The original K6 was a bit of a turd because of its 24-bit FPU. That was a horrible, terrible mistake.
The K6/2 was a peach of a processor; clock for clock it would beat a Pentium II at many operations, not least because the low-end P2s it was competing with had crippled cache. Sadly, most of the motherboards it was coupled with were pure shit.
By the time that was fixed and the K6/3 came out with onboard L2 cache (and the motherboard cache, if any, became L3), it was too little, too late. The K6 was known
Re: (Score:2)
I've been in the AMD camp for ages, but I have to admit that even the K6-2 was not all that. It was decent, but it still suffered a weak FPU, which became apparent when MP3 made its big hit.
Re: (Score:2)
I've been in the AMD camp for ages, but I have to admit that even the K6-2 was not all that. It was decent, but it still suffered a weak FPU, which became apparent when MP3 made its big hit.
MP3 doesn't tax the K6. What does is 3D gaming, which was just becoming a major thing at that time, and which is pretty much all fp math. The K6 had badass int performance, but not so much fp. The K7 had great fp.
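To illustrate why 3D work hammers the FPU in a way MP3 playback doesn't, every vertex in a scene goes through something like the 4x4 transform below, thousands of times per frame. A toy Python sketch (plain floats, no SIMD, purely to show where the fp operations pile up):

    # One vertex transform: 16 float multiplies and 12 float adds, repeated for
    # every vertex of every frame. Sustained fp throughput like this is what the
    # K6 line lacked and the K7 delivered.
    def transform(matrix, vertex):
        x, y, z, w = vertex
        return tuple(
            row[0] * x + row[1] * y + row[2] * z + row[3] * w
            for row in matrix
        )

    identity = ((1.0, 0.0, 0.0, 0.0),
                (0.0, 1.0, 0.0, 0.0),
                (0.0, 0.0, 1.0, 0.0),
                (0.0, 0.0, 0.0, 1.0))

    print(transform(identity, (1.0, 2.0, 3.0, 1.0)))  # -> (1.0, 2.0, 3.0, 1.0)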
Re: (Score:2)
You could argue it's all anecdotal, but the evidence says you're incorrect. A similarly clocked PII would encode faster than the AMD counterpart. Same for video, gaming, etc. It's not like it was terrible, but it wasn't as good as its opponent either.
Re: (Score:2)
You could argue it's all anecdotal, but the evidence says you're incorrect. A similarly clocked PII would encode faster than the AMD counterpart
Sure, but it's not like it took aeons to do on the AMD. But what really didn't work was 3D gaming. It kinda worked, for a while. But at first the 24-bit FPU caused problems which required AMD-specific patches that were often an afterthought, and later the lack of FPU grunt for 3D gaming caused a decisive shift away from the K6. If your MP3 encode takes longer, so what? You're encoding from a batch, right? But if your CPU doesn't have enough FPU to play a game at a decent resolution, it just doesn't have e
Re: (Score:2)
Need to reconsider your hardware news sites... (Score:3)
According to xbitlabs [xbitlabs.com], Kaveri has worse CPU performance than its predecessor.
AMD got lucky. It's found a dependable stream of revenue in game consoles. Better yet, no matter whether Microsoft or Sony wins the next generation console wars, both have AMD under the hood. Now that's hedging your bets. Whoever at AMD was in charge of negotiating these deals deserves a paid vacation to Necker Island [virgin.com] with all the trimmings.
But let's get serious. AMD's current processors suck. And I hate saying that. A decade