The Ups and Downs of AMD (hackaday.com) 225
szczys writes: In 2003 AMD was on top of the world. Now they're not, but they're also still in business. AMD continues to produce inexpensive, well-engineered semiconductors. The fall over the last 10 years is due to Intel, who used illegal practices and ethically questionable engineering decisions to knock AMD off their roost while still keeping them in business. The latter prevents the finger of antitrust from being pointed at Intel the way it was at Ma Bell.
AMD settled (Score:5, Informative)
AMD settled their entirely valid lawsuit:
http://www.cnet.com/news/intel... [cnet.com]
Intel's actions were shocking and absurd, and they seem willing to stay within legal limits only when failing to do so would visibly get them hammered with monopoly lawsuits. It was a poor resolution to a very real issue. The other part? It frees Intel from having to do anything rash or aggressive with their chips, because by neutering their only competitor they were able to focus more on profitability and less on performance and perception. In my *opinion*, this is a big part of why chips have mostly stagnated compared to years prior: Intel is deliberately staying within range of what AMD is capable of. They are holding back.
Re:AMD settled (Score:5, Insightful)
In the past Intel did them dirty and there's no argument about that.
AMD's current problems are entirely their own fault. They fired the development team that made the K8 (and then the K10), the processor family that completely destroyed all of Intel's products from desktop to enterprise.
Intel had the NetBurst CPUs, AKA the Pentium 4. Power hungry, low IPC, stuck with the FSB, and hamstrung because they were developed around another failed Intel venture: the RDRAM debacle. The arch was utterly unable to go multicore (Pentium D was one of the worst processors ever made, and was a multi-chip package).
And let's not forget fucking Itanium. Intel fucked that up so hard they had to backpedal and introduce the 64-bit tech that AMD pushed.
Enter the K8: scalable chip interconnect, 64-bit, later developed into the first true multi-core CPU available to consumers. Took over the server space completely. For a time, Xeon was dead. Not even kidding.
And then AMD threw it all away. A bunch of fucking MBAs decided they didn't really need to pay a bunch of expensive chip designers to make chips, and that it would be a better idea financially to sell off the fab so their remaining development team could be isolated from the fabrication process. Brilliant plan.
That's the shit that gave us Bulldozer, and that is why AMD sucks today.
The rest is history. Intel cleaned up their act, released the Core 2, and AMD has been irrelevant ever since.
Intel has learned. They have not slowed down. AMD almost killed them. Every iteration is faster, lower power, cheaper. They're 2 generations ahead of everyone else in fabrication tech. Skylake CPUs are CRAZY fast and sip power.
Re:AMD settled (Score:5, Interesting)
While, yes, AMD management totally did destroy the company, the bit about selling the fab happened later, after the Barcelona disaster, and after they threw away all their money on ATI.
The fab was not competitive (as GlobalFoundries' performance showed for the next few years), and they absolutely had to get rid of it to survive. Not having the cost of maintaining that thing is the reason they are not bankrupt (yet).
Re:AMD settled (Score:4, Informative)
I wouldn't say that ATI was a bad purchase; arguably it's the only reason AMD is still competitive, and they can leverage that design work into making better desktop chips.
Re: (Score:2)
System on chip is great for embedded solutions, but for performance solutions it's better to have more specialized designs that can be combined in different ways depending on the kind of performance you're after. Running a web server is quite different from running a graphics-intensive game.
Re: (Score:3)
And the reality is that by the time AMD had gotten both its and ATI's house in order enough to build Bobcat, they could have licensed a core from PowerVR like Intel did for their lowest-power Atom.
So your solution to AMD's problems is to do what Intel did when they produced one of their most wretched chips ever? That's utterly bonkers. I used that chip in an otherwise rather neat Toughbook CF-U1. It was sodding awful, because not only were the graphics deeply anaemic, but Intel didn't give a flying fuck ab
Re: (Score:3)
And then AMD threw it all away. A bunch of fucking MBAs decided they didn't really need to pay a bunch of expensive chip designers to make chips, and that it would be a better idea financially to sell off the fab so their remaining development team could be isolated from the fabrication process. Brilliant plan.
Well, it'll be Intel's chance to gain again, since for the last couple of years Intel has been hiring a bunch of MBAs and slapping them into high positions within the company, and it's starting to show already.
Re: (Score:2)
faster, well sorta. (Score:3)
I pretty much agree with your timeline, and wasn't really aware of the business plan, but that sounds about right. The results were the same.
As for Intel learning, I am not as optimistic. AMD hasn't been competitive, meaning Intel hasn't had to do much, really. They have come out with several generations of solid CPUs, however the increase in computational power year over year isn't what it used to be. You could chalk it up to physical limitations, or even lack of demand, or is it lack of competition? About t
Re: (Score:3)
Having worked in the HPC/supercomputer world during the rise & fall of AMD, I really wish I could mod you up further.
TFA makes much of the Intel compiler & benchmarks compiled with the Intel compiler for Intel processors.
I call BS. Nobody in HPC was dumb enough to be fooled by benchmarks using the Intel compiler & Intel chips. There are (and were) commercial, highly optimizing alternatives to Intel's compiler, each with similar speed boasts over GCC: PathScale and PGI come to mind.
Re: (Score:2)
Frankly, it was a perfect storm of both:
Intel being colossal dicks and doing crap like buying off Dell to deny AMD a market, rigging compilers and benchmarks, etc.
AMD's management being colossally stupid and shooting themselves in the foot multiple times. Never let MBAs run the show...
If only one thing had happened they probably would have muscled through and kept real competition going. But together AMD lost enough momentum that the best they can do now is try to keep just a few steps behind. Which they
Re: (Score:2)
Once it becomes more difficult to get money, the very short-term, money-focused types (MBAs and accountants) get far more say in doing things than the long-term, product-oriented types. An attitude of "we have to do something now" is the enemy of long-term growth and a stable business. Buying ATI for existing technology was seen as better value than retaining staff that could develop whatever ATI had, or better.
Re:AMD settled (Score:5, Informative)
AMD settled one of their entirely valid lawsuits:
Fixed that for you.
In another lawsuit [europa.eu], Intel was convicted of anti-trust violations.
The European Commission has imposed a fine of €1 060 000 000 on Intel Corporation for violating EC Treaty antitrust rules on the abuse of a dominant market position (Article 82) by engaging in illegal anticompetitive practices to exclude competitors from the market for computer chips called x86 central processing units (CPUs). The Commission has also ordered Intel to cease the illegal practices immediately to the extent that they are still ongoing. Throughout the period October 2002-December 2007, Intel had a dominant position in the worldwide x86 CPU market (at least 70% market share). The Commission found that Intel engaged in two specific forms of illegal practice. First, Intel gave wholly or partially hidden rebates to computer manufacturers on condition that they bought all, or almost all, their x86 CPUs from Intel. Intel also made direct payments to a major retailer on condition it stock only computers with Intel x86 CPUs.
Re: (Score:3)
Re: (Score:2)
I think this is a big part of why we saw chips mostly become stagnant compared to in years prior
Nope. CPU power increases have slowed down because the mainstream market isn't demanding faster CPUs. It's not the bottleneck for the vast majority of users. Even serious gamers only need a decent CPU and then put all of their money into video cards. The market pressure has been on price and power usage, not performance. Intel is just responding to the market.
Re: (Score:2)
For numerical work, modern CPUs have gotten MUCH MUCH faster than older CPUs. Things like FMA, more vector ops, loading and storing two cache lines per cycle, etc. These features are hard to take advantage of in higher-level languages, but modern CPUs are vastly faster than older ones. For normal users, modern CPUs are fast enough. If you need higher performance in games and simulation software, you can write your code to use the CPU more effectively.
In the end a GPU is really not any faster than a CPU but a GPU
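To make the point about FMA and wider vector ops concrete, here is a minimal sketch (mine, not the poster's) of the kind of hand-vectorized code being described, assuming an x86 CPU with AVX2/FMA and a build along the lines of gcc -O2 -mavx2 -mfma; the function name is just illustrative:

/* y[i] += a * x[i], eight floats per iteration via fused multiply-add. */
#include <immintrin.h>
#include <stddef.h>

void saxpy_fma(float a, const float *x, float *y, size_t n)
{
    __m256 va = _mm256_set1_ps(a);
    size_t i = 0;
    for (; i + 8 <= n; i += 8) {
        __m256 vx = _mm256_loadu_ps(x + i);
        __m256 vy = _mm256_loadu_ps(y + i);
        vy = _mm256_fmadd_ps(va, vx, vy);   /* one instruction computes a*x + y */
        _mm256_storeu_ps(y + i, vy);
    }
    for (; i < n; i++)                       /* scalar tail */
        y[i] += a * x[i];
}

The fused multiply-add there does a*x+y over eight floats in a single rounded operation, which is exactly the kind of throughput that higher-level code rarely reaches automatically.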
Re:Read the settlement (Score:2, Interesting)
The article is repeating a lie. The actual settlement and case do not contain the lie.
The lie is that Intel sold below cost.
Due to a fixed cost to operate a fab and process wafers, the cost per die is greatly impacted by line yield.
Due to the competitor's line yield of about 50% at the time, it was assumed Intel had to be selling below cost. This was investigated and found to be false based on the number of raw wafers purchased and the number of die shipped. If two identical companies manufacture identical chi
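To put the yield point in concrete terms, a toy calculation (every number below is invented for illustration, not real fab data):

/* Toy illustration of why line yield dominates per-die cost. */
#include <stdio.h>

int main(void)
{
    double wafer_cost = 5000.0;   /* fixed cost to process one wafer (hypothetical) */
    int dies_per_wafer = 400;     /* gross dies per wafer (hypothetical)            */
    double yields[] = { 0.50, 0.70, 0.90 };

    for (int i = 0; i < 3; i++) {
        double cost_per_good_die = wafer_cost / (dies_per_wafer * yields[i]);
        printf("yield %.0f%% -> $%.2f per good die\n",
               yields[i] * 100.0, cost_per_good_die);
    }
    return 0;   /* prints: 50% -> $25.00, 70% -> $17.86, 90% -> $13.89 */
}

With wafer cost fixed, a higher line yield cuts the cost of each good die sharply, which is why a competitor's ~50% yield tells you nothing about whether Intel was pricing below its own cost.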
Re:AMD settled (Score:5, Informative)
Remember when a story on this site would bring down servers? I'll bet that C|Net article barely tweaks the bandwidth meter.
That says as much about the march of progress as it does about the decline of Slashdot. Even if Slashdot were at its peak, times two, the capacity of the hardware and the internet has grown many times that, plus dynamic load balancing and scaling and content delivery networks...
These days even trending on Facebook and Twitter won't bring anybody significant down.
Re: (Score:2)
The first web server I administered was a Pentium 75 with well under a gig of RAM (can't remember; very early 2000s). It was already a redundant piece of hardware, dumpster-dived for a cash-strapped student union. It ran on Slackware,
Re:AMD settled (Score:4, Interesting)
Re: (Score:2)
16-32 MB RAM sounds about right for the era. IIRC, my 200 MHz PII desktop had 8-16 MB.
As for comparing the P75 to your smart watch... Sure, but don't forget to compare the power and heat envelopes of your watch vs. that P75. Your watch does all that and sips, what, 1 watt-hour per 72-hour period? A watt-hour would probably be enough to power the P75 for the 16-millisecond hold-up time on its PSU after cold booting it.
Re: (Score:2)
Remember when a story on this site would bring down servers? I'll bet that C|Net article barely tweaks the bandwidth meter.
That says as much about the march of progress as it does about the decline of Slashdot. Even if Slashdot were at its peak, times two, the capacity of the hardware and the internet has grown many times that, plus dynamic load balancing and scaling and content delivery networks...
These days even trending on Facebook and Twitter won't bring anybody significant down.
Granted it was 5 years ago, but Gizmodo buying a stolen iPhone 4 prototype, and then blackmailing Steve Jobs, created a news article trending so much that Gizmodo (and maybe all of Gawker) reverted to a very basic page layout, free of excess Javascript.
AMD was their own worst enemy (Score:5, Informative)
Read Ars Technica's history of AMD, the issue was with spectacular mismanagement more than with Intel's practices.
http://arstechnica.com/business/2013/04/the-rise-and-fall-of-amd-how-an-underdog-stuck-it-to-intel/
Re: (Score:2)
Re:AMD was their own worst enemy (Score:5, Interesting)
didn't know your father, but i was there for a "long time" up till 2013, and mismanagement is about the only thing AMD had going on at the top. it was comically bad. and it still is... i get a chuckle out of fanbois hyping Lisa-this, Raja-that, whatever. i never met Raja, so can't comment on him; but Lisa is not terribly impressive technically, and seemed to be planning for her golden parachute from the moment she walked into our office.
she also, apparently/allegedly, told teams (who had dependencies on other internal teams) that different projects were "top priority". so you'd have a weird deadlock case of project A being held up by people who were working on project B (being told it was top priority), being held up by a different set of people working on project A (being told it was top priority). was a way of bullshitting paying customers, best i could tell. that was a sign that it was time to move on...
Re: (Score:2)
Thank you. The summary was pretty biased. I've never been a fan one way or the other, I mainly just try to get the best processors and video cards that I can with the money that I have. I've had AMD machines and ATI cards plenty in the past, and I still feel they deliver good low end chips and solutions. But with the introduction of the Core 2 Duo, Intel really started to shape up, and lately they've been blowing it out of the ballpark (for the most part). AMD, on the other hand... hasn't been. Now, y
Re: (Score:2)
I fully agree w/ this. I was a fan of AMD after they acquired a part of the ex DEC Alpha team, and for a while, they were doing well (even though as a RISC purist, I hated the idea of the x86 instruction set going 64-bit). But they failed to keep up w/ Intel due to a lot of their own shortcomings.
Main one, from what I could tell, was that AMD's process practices were way behind Intel's, and as process shrinks became more difficult, that magnified the gap b/w the 2. Couple that w/ the fact that AMD, in
Permanently disabling? (Score:4, Interesting)
The article mentions Intel "Permanently disabling AMD CPUs through compiler optimizations". Am I reading this right, did they find a way to brick AMD processors? It doesn't say anything else about it in the article that I can see, if so, and I'm really curious.
Re:Permanently disabling? (Score:5, Informative)
The article mentions Intel "Permanently disabling AMD CPUs through compiler optimizations". Am I reading this right, did they find a way to brick AMD processors? It doesn't say anything else about it in the article that I can see, if so, and I'm really curious.
No. TFA explains that Intel's compilers were written to ignore certain optimization-friendly parts of the instruction set if they were compiling for a non-Intel CPU. AMD actually supported the instructions, but Intel's compilers just pretended that AMD didn't. And surprise! Intel's processors beat the crap out of AMD's in benchmarks. Really shitty of Intel to do that.
Re: (Score:2)
AMD actually supported the instructions, but Intel's compilers just pretended that AMD didn't.
I'm not apologizing for Intel because they've definitely got some shady dealings, but if we've got our facts straight, their compiler is not pretending AMD doesn't support SSE.
Intel's compiler does not target instruction capabilities. It targets specific CPU architectures with intimate knowledge of their pipeline. Even if your CPU supports a fancy new instruction, for what you need it for it might perform worse in aggregate than some alternative.
So less about SSE, AVX, etc. and more about Sandy Bridge, Haswell...
Re: (Score:2)
There is a reason why GNU/Linux users have favored AMD processors and there is a reason CPU benchmarks give somewhat different results with GCC code vs Intel compiler code.
Re: (Score:2)
To give a toy example, consider two implementations of the same vector instruction set. One provides a one-cycle add and a 5-cycle multiply, but microcodes the fused multiply-add and it takes 10 cycles. The other provides a 2-cycle add and a 6-cycle multiply, but a 7-cycle fused multiply-add.
Re: (Score:2)
Sorry, but that is bullshit. Intel started doing this very early on (in version 8 of their compiler), and none of their CPU capability checks looked at the specific architecture at all. The only thing they checked was the CPU capability flag, and they deliberately skipped that check unless the chip was from Intel.
They even cocked this up with their first iteration, such that instead of producing binaries that ran slowly on AMD chips it produced binaries that segfaulted on AMD chips. See http://www.swallowta [swallowtail.org]
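For anyone wondering what that capability check actually involves, here is a small sketch of the two different questions a dispatcher can ask: the vendor string versus the feature bit. It uses GCC/Clang's <cpuid.h> and is only an illustration, not Intel's actual dispatcher code:

#include <cpuid.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};

    /* Leaf 0: vendor string, assembled from EBX, EDX, ECX
     * ("GenuineIntel", "AuthenticAMD", ...) */
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 1;
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);

    /* Leaf 1: feature flags; EDX bit 26 is SSE2, regardless of vendor */
    __get_cpuid(1, &eax, &ebx, &ecx, &edx);
    int has_sse2 = (edx >> 26) & 1;

    printf("vendor = %s, SSE2 = %s\n", vendor, has_sse2 ? "yes" : "no");

    /* The complaint in this thread: dispatch keyed on the vendor string
     * ("is it GenuineIntel?") instead of on the feature flag ("does it
     * report SSE2?"), so a capable AMD chip got the slow path anyway. */
    return 0;
}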
Re: (Score:2)
I've not yet seen a valid and reasonable argument for why an Intel compiler should support a non-Intel product at all, let alone to the same level as an Intel product - care to give one?
Re: (Score:2)
Sucks.
I really don't know why they insist on the Intel compiler in that place, but I think some marketing drone has got into the chain with the developers.
Re:Permanently disabling? (Score:4, Insightful)
So, why didn't AMD write their own compilers that would use all those instructions on AMD processors? It isn't as if the instruction set was secret.
An excellent question.
If I were AMD, I'd devote effort and resources to GCC development. (Maybe they have?)
Unfortunately, as others have mentioned in this thread, for the past decade AMD hasn't been well-known for acting in its own best interest.
Re:Permanently disabling? (Score:5, Informative)
If I were AMD, I'd devote effort and resources to GCC development. (Maybe they have?)
It appears they have indeed. [amd.com]
Re: (Score:2)
Or even better, devoting effort to clang/llvm. In any case it's probably not a great idea to allow your competitor's compiler to be a part of how your processor is benchmarked.
And despite all the underhanded stuff Intel did in the past, their integrated GPUs are better supported under Linux than AMD's, which is very important to me. I am still hopeful and waiting for AMD's open source drivers to be good, hopefully by the time Wayland is standard.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Evas (part of Enlightenment) may be what you are looking for and is Wayland compatible.
The "pretty exciting" stuff is the surrounding projects that try to make a very cut-down bit of X, with cut-and-pasted X code, almost as capable as X. If they actua
Wayland only uses the 2D driver, but ... (Score:2)
However, some software that writes to Wayland, such as Evas, uses OpenGL and can use the 3D rendering hardware to do its work.
As for "more efficient" - there's not a lot of X that gets in the way between OpenGL and the graphics card anyway. I was using a Pentium 60 in 2000 with a cheap 3D graphics card to display stuff th
Re: (Score:2)
I should have answered the question more directly - it's not doing the 3D rendering at all and dumps a 2D canvas to the screen so the current drivers are as good as it gets for that step.
I don't know why you still feel as though this is not clear.
As for "more efficient" - there's not a lot of X that gets in the way between OpenGL and the graphics card anyway.
The X extension is getting in the way. Clearly you feel like this is "not a lot", but all these little things add up. As a software engineer, I can appreciate the benefits of good design. Those benefits can be large or small, and they can be not only in terms of performance but also in terms of maintainability or usability. Providing a better API for application developers leads to better software and ultimately a better user experience.
I am saying is th
Re: (Score:2)
Pentium 60 back then, and still done the same way now. I quantified the sum of all those little things, didn't I - and it doesn't add up to much at all.
It's a dumb framebuffer designed to be less complicated than X. Useful maybe, really nice when the project really gets going maybe, but "next generation"? Who told you that? The entire point is to be a few generations back as a framebuffer
Re: (Score:2)
Actually that's one of the current problems, the developer has to do just about everything then dump to video as if it's an MSDOS display or a game console. Things are improving with other projects such as Enlightenment supporting Wayland so that you actually have some widgets and stuff instead of having to do them all yourself.
Re: (Score:2)
Re: (Score:2)
To get back to your point, it's not "making the 3d rendering chain more efficient" because it's ignoring 3D by design.
Re: (Score:2)
That's what I mean. Until enlightenment got behind it there were no widget toolkits that worked with wayland so the developer had to do that as well - rendering their own icons, menus etc onto the 2D canvas, trapping their own events etc.
So before there was any support for wayland, there wasn't any support for wayland?
That's like saying that before company X made the first LGA 1155 motherboard, you had to make your own motherboard for Intel Sandy Bridge processors.
To get back to your point, it's not "making the 3d rendering chain more efficient" because it's ignoring 3D by design.
Cutting out a middle man is a perfectly legitimate way to make something more efficient. Wayland not only "doesn't do 3d", it establishes a model where 3d rendering is more efficient.
Imagine if X were developed after Wayland. Would it make sense to say "X handles 3d more efficiently than W
Re: (Score:2)
Yes, about six months ago there weren't even widgets - are you getting the idea yet that it is a framebuffer and other people have to do the other stuff, or do I need another six posts to keep on writing what I put in the first?
There is no Wayland 3D model.
Re: (Score:2)
FFS stop imagining and read the docs instead of trying to tell me about something you've never used and know nothing about.
Re: (Score:2)
Re: (Score:2)
Yes, about six months ago there weren't even widgets - are you getting the idea yet that it is a framebuffer and other people have to do the other stuff, or do I need another six posts to keep on writing what I put in the first?
I think you are confusing "not getting it" with "finding what you keep repeating to be irrelevant".
There is no Wayland 3D model.
There is a Wayland 3D model (i.e. the way in which 3D is handled if you are running Wayland), which is to have 3D bypass Wayland. As opposed to the X 3D model.
Re: (Score:2)
So not having a model is a model? It appears that I've been played by someone having a joke at my expense. There is no way that you are actually as stupid as you are pretending to be.
Re: (Score:2)
If I were AMD, I'd devote effort and resources to GCC development. (Maybe they have?)
It really doesn't matter much when most people use this Windows thing, and guess what they use to compile that and nearly all the software for it... As for GCC, that thing usually supports new AMD CPUs/APUs before they are released.
Re: (Score:2)
Ive always prefered AMD (Score:2)
Re:Ive always prefered AMD (Score:5, Interesting)
Re: (Score:2)
It Goes Deeper (Score:2, Interesting)
Attorney here. In the late '90s I worked on contracts between clients and Intel. Intel was offering payments if you put a banner on your website that said it was optimized for the Pentium II. They also helpfully provided code to slow your website down if it detected any non-Intel processor.
Re:It Goes Deeper (Score:4, Funny)
How does a web site know what processor you're running?
Perform a floating point calculation. If the answer is wrong you've got Intel Inside (TM) :-)
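Presumably that's a nod to the 1994 Pentium FDIV bug. The classic check, as a small C sketch - on a correct FPU the expression is exactly zero, while on the affected Pentiums it famously came out as 256:

#include <stdio.h>

int main(void)
{
    /* The famous FDIV-bug operands; volatile keeps the compiler from
     * folding the whole thing away at compile time. */
    volatile double x = 4195835.0, y = 3145727.0;
    double r = x - (x / y) * y;   /* exactly 0.0 with correct division */
    printf("%g -> %s\n", r, (r == 0.0) ? "FPU looks fine" : "Intel Inside(TM)?");
    return 0;
}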
Re: (Score:2)
https://msdn.microsoft.com/en-... [microsoft.com]
Intel has reasons to let them live (Score:5, Insightful)
Intel knows they have to let AMD live for at least 4 reasons:
1. Avoid anti-trust lawsuits over x86 chips.
2. Have a second-source option so that vendors don't switch to ARM. Contracting practices for critical equipment often require more than one part source (vendor).
3. Keep the x86 market viable. Without producer competition, x86 may die a slow death.
4. Have someone to steal ideas from.
Re: (Score:2)
While all of those might be good reasons to keep AMD alive, I don't think any of them are strong enough to ease off in the competition with ARM. Intel needs to push mobile chips that very directly compete with AMD and at this point the collateral damage might be better than the alternative. With lawyers you can stay #1 for years, #2 is marginal, #3 takes forever and #4 they probably hire any smart people AMD has to let go. Intel could join the ARM pack using their production process and low-level design kno
Re: (Score:2)
That is right - Intel is under no compulsion to keep AMD alive. And AMD is failing all on its own.
For Intel to get into mobile, they may have to come up w/ a chip that beats ARM on both performance and price/performance. It's not clear that that's achievable w/ x86, so they might try one of the other instruction sets that they have cross-patenting agreements on.
But that raises the question of why Intel, w/ their high-end fabs and processes, would want the low-end market, which is what the chea
Re: (Score:2)
It's three things: they have to beat ARM on price, performance and power consumption simultaneously. They can lower all three, which is a fail, or raise all three, which is also a fail for mobile but OK for high-end servers. They have to keep price down (less R&D) and power consumption down while keeping performance up (more R&D).
Not the whole story (Score:4, Insightful)
- AMD was on top of the world with Opteron / AMD64
- Intel was losing everywhere it went. You'd be hard-pressed to find an Internet / financial shop *not* buying AMD
- But Intel responded with Merom / Core2Duo. That mostly closed the gap, though initially the memory subsystem was still inferior
- Had AMD met expectations with the follow-on part (Bulldozer), there is no reason they could not have continued to win
- But in my mind, their ATi acquisition initiated their downfall. They became schizophrenic.
To beat Intel (like most market leaders) you have to have a non-trivial advantage. When AMD had one, they kicked Intel's ass to the point that they severely altered Intel's roadmap. When they no longer had one, they lost.
Re: (Score:2, Interesting)
as an ex-AMD-red guy, i can tell you that no one at the company was happy about AMD buying ATI. the AMD-green folks saw their stock price drop.. the red side saw ineffective leadership, weird internal politics, exceptionally-poor design methodologies, and a loss of a cool corporate culture. both saw tons of abandoned pre-pre-llano projects and strange re-orgs. i don't think the ATI acquisition was the downfall of AMD, they were already on that trajectory; but it didn't help anyone, that's for sure...
Re: (Score:2)
AMD lost my interest when they started their new naming schemes; before, you had Sempron, Opteron, Phenom, etc. Then they went to the A series, C series and E series - trying to make heads or tails of which chip was better was not as easy anymore.
The second thing was AMD buying ATI and bundling their GPUs with their CPUs more frequently - and that was a problem only because ATI just generally sucks under Linux.
So it became way easier to spec out an Intel i series and also find one bundled with an nVidia GPU.
Wasn't my c
While Intel played dirty, Core was a killer (Score:5, Informative)
July 24, 2006: AMD buys ATI, stretching their credit to the limit
July 27, 2006: Intel launches Core 2 Duo (Conroe)
To get an idea of how quickly AMD was in trouble, here's Anandtech [anandtech.com] in November 2007 at the launch of Phenom:
If you were looking for a changing of the guard today it's just not going to happen. Phenom is, clock for clock, slower than Core 2 and the chips aren't yet yielding well enough to boost clock speeds above what Intel is capable of. While AMD just introduced its first 2.2GHz and 2.3GHz quad-core CPUs today, Intel previewed its first 3.2GHz quad-core chips. (...) Inevitably some of these Phenoms will sell, even though Intel is currently faster and offers better overall price-performance (does anyone else feel weird reading that?). Honestly the only reason we can see to purchase a Phenom is if you currently own a Socket-AM2 motherboard; you may not get the same performance as a Core 2 Quad, but it won't cost as much since you should be able to just drop in a Phenom if you have BIOS support.
Up to July 2006: K8 > Netburst
July 2006 - November 2007: K8 < Core (AMD sales tank)
November 2007 - October 2011 K10 < Core (successor lagging behind)
October 2011-2016? Bulldozer < Sandy Bridge (late and underperforming)
Why didn't AMD have the cash to burn in 2006-2009 to come up with something better? Oh, a $5.4 billion purchase of ATI. It sucked all the R&D out of CPUs and into APUs and "synergies", but even today you see no major differences between an APU and pairing a CPU + dGPU unless you've written very special code for just that situation.
Re: (Score:2)
Let's be honest, the ATI purchase was a bold move that had the potential to pay off big time in an era when we were seeing convergence between GPUs and CPUs with GPGPUs and such.
Unfortunately, it was a gamble that they lost; the market they foresaw never came to fruition to the degree they were expecting, in large part because everyone got distracted by mobile, which became the new thing and the new focus. Had the iPhone and Android never happened, a completely different set of chip designs may have beco
Intel not the only factor... (Score:4, Interesting)
Sure, lots of controversy over their actions in the late 90s and early 2000s, but by 2005 Intel had recovered from the mistakes made in NetBurst. Starting with the Core microarchitecture, Intel made some very strong advances in process and gains in their CPU architectures in the consumer and server spaces. AMD got distracted with the APU designs and made a huge misstep with the Bulldozer line. I think the ATI acquisition was a distraction as well. Meanwhile, Sandy Bridge was in place and allowed Intel to make gains all around. By the time Haswell was in place, their entire lineup was solid. They had the core counts to match the high-end Opterons, they were pushing ahead on virtualization (VT-d, APICv), and AMD was and is in a rough spot.
Zen needs to have good parity with Skylake for AMD to regain market share, and that's a tough task. Also, Intel has major process advantages. They are at 14nm already, which helps keep yields up as transistor counts rise (core counts). They do have an advantage in the all-in-one market and do very well in the budget segments. We will see if their ARM-based assets play out, but it's going to be tough going for AMD with Intel on one side and NVidia on the other.
Re: (Score:2)
I disagree. AMD played the largest role, failing to press the advantages they had in growing markets up to the K10 microarchitecture (2006/2007). The ATI acquisition split their focus and reduced their resources in CPU microarchitecture, as well as tying up a lot of cash. The APU idea didn't play out nearly as well as it needed to. Then the spinoff of Global Foundries and some process misses. Then Bulldozer. Intel killed NetBurst pretty fast. They had to; AMD was beating them on all fronts. But they got Co
Can I get a woooooo Bundy (Score:2)
for ZEN! Come on AMD!
The Intel compiler still anti-competitive (Score:5, Informative)
Intel's compilers still emit code that uses the CPUID instruction at run time to decide whether to take the efficient code paths or not. Intel has an official notice to this effect. Charmingly, the notice is only available as an image file. I presume this is to make it harder to search for the notice.
https://software.intel.com/en-us/articles/optimization-notice/ [intel.com]
Every time I see benchmarks now, I wonder whether the results were affected by the use of an Intel compiler.
I try very hard to not buy Intel products.
Re: (Score:2)
I've tested version 13 of ICC using Povray [povray.org] on several of my AMD and Intel systems. I can tell you that the dispatch code works as you would expect: the AVX (but not AVX2) code path does work on the FX-8350, but not on the Phenom II X6 1090T, which switches to the SSE2 path. Of course, compiling without dispatch will cause segfaults on processors that don't support AVX or AVX2. But that doesn't necessarily mean ICC is the fastest compiler for AMD. For Povray at least, that would be GCC.
Re: (Score:3)
How good would the Intel compiler have to be at optimizing for AMD products before people would no longer claim that Intel was deliberately crippling the optimizations? I submit that there is no limit, and therefore there is no reason for Intel to try.
Re: (Score:2)
In my case it was a CPU-bound, trivially parallel thing - 64 AMD cores running flat out took only half the time of an i5 laptop when both used the Intel compiler. With GCC the same sort of thing is clock-for-clock and core-for-core, with the AMD machine finishing 16 times faster, as it should. I forget how many days it took for the short run, but it was days.
Re: (Score:2)
Re: (Score:2)
I'll see if I've still got time on the licence for the Intel compiler and give it a go.
Re:The Intel compiler still anti-competitive (Score:5, Informative)
if (Intel chip) then
    if (supports SSE2) then
        run SSE2 code
    else
        run non-SSE2 code
else
    run non-SSE2 code
endif

All people are saying is that the code path should be:

if (supports SSE2) then
    run SSE2 code
else
    run non-SSE2 code
endif

See the difference?
Yes, you can get extra speed by ordering the instructions differently for different architectures, and Intel's compiler quite rightly does that to produce Nehalem-optimised code or Skylake-optimised code. I don't expect the compiler to produce Bulldozer-optimised code, but I expect it to allow me to run the Nehalem-optimised code on a Bulldozer. Where does this meme that this request is "forcing Intel to optimise for the competition" come from? I want Intel to do *less* work, not more - all they need to do is *remove* a small amount of code from their compiler and I'd be happy.
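And for the record, the vendor-agnostic branch people are asking for is trivial to express today. A sketch using GCC/Clang's __builtin_cpu_supports, which has nothing to do with ICC's real dispatcher:

#include <stdio.h>

static void work_sse2(void)    { puts("SSE2 code path"); }
static void work_generic(void) { puts("generic code path"); }

int main(void)
{
    __builtin_cpu_init();                 /* harmless here; only strictly needed very early in startup */
    if (__builtin_cpu_supports("sse2"))   /* keys on the capability bit, not the vendor string */
        work_sse2();
    else
        work_generic();
    return 0;
}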
Re:The Intel compiler still anti-competitive (Score:5, Informative)
Re: (Score:2)
Well, actually, no. An optimal compiler and libraries would indeed be able to squeeze out some gains by knowing exactly what microarchitecture was being targeted versus just instruction sets.
Let's help the readers of /. ;-) (Score:5, Informative)
Optimization Notice
Intel’s compiler may or may not optimize to the same degree for non-Intel microprocessors for optimisations that are not unique to Intel microprocessors. These optimisations include SSE2, SSE3, and SSSE3 instruction sets and other optimisations. Intel does not guarantee the availability, functionality, or effectiveness of any optimisation on microprocessors not manufactured by Intel. Microprocessor-dependant optimisations in this product are intended for use with Intel microprocessors. Certain optimisations not specific to Intel microarchitecture are reserved for Intel microprocessors. Please refer to the applicable product User and Reference Guides for more information regarding the specific instruction sets covered by this notice.
Notice revision #20110804
As written by Intel [intel.com], but written in text for the convenience of visually impaired slash-dotters with screen readers. Highlights mine.
Why not GCC? (Score:3)
I kind of don't get the defeatured compiler hack.
It seems like all AMD needs to do is contribute the appropriate code generators to GCC.
Intel was CONVICTED of monopoly abuse. (Score:5, Informative)
So here it is again:
E.U. Commission press release detailing their conviction of Intel. [europa.eu]
The European Commission has imposed a fine of €1 060 000 000 on Intel Corporation for violating EC Treaty antitrust rules on the abuse of a dominant market position (Article 82) by engaging in illegal anticompetitive practices to exclude competitors from the market for computer chips called x86 central processing units (CPUs). The Commission has also ordered Intel to cease the illegal practices immediately to the extent that they are still ongoing. Throughout the period October 2002-December 2007, Intel had a dominant position in the worldwide x86 CPU market (at least 70% market share).
Intel was CONVICTED of monopoly abuse. This is an irrefutable fact. There are a lot of people here either claiming that they were never convicted or downmodding those that are revealing the truth. The site I linked to is the official press release site of the E.U. Commission.
AMD are a joke (Score:2)
Why not just buy a cheap nVidia (Score:2)
This isn't the whole story - AMD dropped the ball (Score:2)
The article does not cover the whole story, missing the important parts of the last 10 years. AMD dropped the ball completely with the Athlon 64 - anyone else remember the Sempron and Opteron? Phenom was meant to redeem them, but Intel's Core2 architecture completely obliterated AMD, taking the entire high end of the market and beating them on bang per buck in the middle range as well. AMD were relegated to competing (relatively successfully) for the low end. Bulldozer only compounded this, again unable
Re: Fair and Balanced? (Score:2)
Competing with criminals is very hard.
Re: (Score:2)
I've had mixed luck with ATI/AMD GPUs, but I do buy them for one big reason - OpenCL. CPU-wise, my last AMD build was an FX-8120, which felt like a step backwards from a Phenom II x4 for a lot of things. Not that I wouldn't build AMD again - if I build a new virtualization server (my current ESX box is a PowerEdge 2950 that was given to me, fully loaded with RAM, SAS drives, and dual Xeons, can't argue with that) I'd go with AMD, and I'm likely to give AMD another shot when the Zen architecture hits
Re: (Score:2)
> my last AMD build was a FX-8120, which felt like a step backwards from a Phenom II x4 for a lot of things.
Could you go into more details please?
I upgraded from a Phenom II x4 965 BE o/c @ 3.5 GHz to an i7-4770K o/c @ 4.0 GHz. I still need to pick up a FX-8350 BE for testing my game so your comments about the FX-8120 are intriguing.
Re: (Score:2)
Re: (Score:3)
I don't think Microsoft did that out of the goodness of their heart. AMD64 happened right in the middle of the Microsoft/Linux server wars when competition was really really stiff, and PCs were bumping up against 4G of RAM. AMD dumped a bunch of resources into GCC and so Linux distributions compiled and ran on the CPUs before they were even released to the public.
Microsoft saw that their #1 competitor was about to get access to much better CPUs at the same price, so they *had* to support AMD to stay competi
Re: (Score:2)
I agree that Microsoft wasn't and isn't a benevolent company, but Linux wasn't the reason they did what they did. After all, they took their time in coming out w/ Windows 95, and IBM could do nothing to make OS/2 the market leader. While at the time Intel risked losing to AMD due to the initial Itanium strategy, which was to obsolete x86 altogether, Microsoft had no such challenge. Linux wasn't it, the AIM alliance had come unravelled due to Apple ending the cloning program, and OS X took years before it
Re: (Score:2, Insightful)
That is what a fanboy would say, but it ignores the fact that AMD, when they got the lead, sat on said lead and got beat down.
Beat down by illegal practices that Intel was convicted of.
The fanboy here seems to be... you. Don't project your failings on others.
On top of last 3-4 years...
You mean the years that Intel failed to comply with the court ruling and pay AMD the damages it owed? Intel didn't pay AMD damages until late 2014, for a conviction in early 2009. Intel literally blew through an unprecedented amount of money on lawyers (several hundred million dollars), after being convicted, just to delay AMD getting the money owed to them.
You are the fa
Re:Um writer of this an AMD fanboy? (Score:5, Informative)
No, they weren't convicted.
Yes they were, and here is the court's press release [europa.eu] of the conviction.
A settlement is not a conviction
A settlement is not a conviction, for sure, and the fact that there was an unrelated settlement doesn't negate the fact that Intel was convicted of flagrant monopoly abuse and ordered to (among other things) "cease illegal practices" (a direct quote).
Why are Intel shills such lying fucks?
Re: (Score:3, Insightful)
Re: (Score:3)
AMD bet on 64-bit, and won (64-bit). Intel's effort was Itanium; it is pretty much done for.
Intel can compete better in low power, where owning its own fab, which is also top-notch, gives it a considerable advantage, even against superior architectures. AMD really fucked that up, along with everyone else who thought manufacturing should be done overseas. As a result their chips always run a bit hotter and can't run quite as fast, so they have to sell them cheaper with lower margins... and AMD is spread a bit thinner...
Re: (Score:2)
More to the point, how do cows feel about cars with leather interiors?!?
Re: (Score:2)
Prior to the leather being installed, they don't care; afterwards -- they care even less.