AMD Withdraws From High-Density Server Business
An anonymous reader sends word that AMD has pulled out of the market for high-density servers. "AMD has pulled out of the market for high-density servers, reversing a strategy it embarked on three years ago with its acquisition of SeaMicro. AMD delivered the news Thursday as it announced financial results for the quarter. Its revenue slumped 26 percent from this time last year to $1.03 billion, and its net loss increased to $180 million, the company said. AMD paid $334 million to buy SeaMicro, which developed a new type of high-density server aimed at large-scale cloud and Internet service providers."
Late to the market....need to be special (Score:5, Interesting)
Looks like they're focusing on ARM chips:
"AMD still sees growth potential in the server market, but not from selling complete systems. It's returned its focus to x86 chips and to the development of its first ARM server processor, code-named Seattle."
8 core 64 bit ARM chips with GPU built in are fairly common and 10 core chips already announced (Mediatek), with 16-48 core vaguely hinted at for servers by other vendors. So if AMD plan on entering the ARM processor market they'd better get something special out, and fast, and be prepared to stick at it, upgrade it, and take the initial losses. Because they're unlikely to win companies over at first, until those companies are confident AMD are in it for the long run and won't leave them hanging without a supplier.
On the other hand they could focus on x86 chips where Intel is already deep discounting at the low end, and likely will have to do that all the way up the range to compete.
AMD face a tough time either way.
Re: (Score:3)
8 core 64 bit ARM chips with GPU built in are fairly common and 10 core chips already announced (Mediatek), with 16-48 core vaguely hinted at for servers by other vendors
A bit more than hinting: Cavium is selling 24-48 core ThunderX (ARMv8) chips [cavium.com]. I think the first one shipped a month or two ago.
Re:Late to the market....need to be special (Score:4, Interesting)
Re: (Score:3)
Be aware that some vendors list a product on their web site as if it were a current production product when really it's at the "we have a few samples and will let you have one if we like you and/or you pay us a load of money" stage.
Re: (Score:2)
Re: (Score:1)
Re: (Score:3, Interesting)
Actually AMD defined the 64 bit extensions to the 32 bit x86 architecture, and Intel had to follow and is letting the Itanic sink.
Of course Intel does not even remotely admit it (and even Linus ranted on this fact in a mail a decade or so ago), but they have still not gotten over the NIH syndrome that it caused them.
This said, amd64 (as Debian calls it) could and should have been better designed.
The trouble is that Intel really needs some competition, this will take years, but it may come now that they put MB
Re: (Score:3, Interesting)
Re: (Score:3)
There were lots of practical reasons for IBM to use the 8086 in the form of the 8088. It had compatibility with the existing base of 8080 CP/M software, the 8 bit external bus could use 8080 peripherals and halved the memory granularity, and Intel was willing to allow alternate sources. The prime alternative was
Re: AMD is on the road to nowhere (Score:2)
uh there is a 68008.... yes an 8 bit external 68000
68008 came out in 1982 (Score:1)
The IBM PC was released in 1981. An 8-bit bus 68K was not an option for the PC team.
Re: (Score:2)
The 68008 became available years (3?) after the 8088. I do not remember it even being planned or announced at the time.
Re: (Score:2)
I ran across this link to a transcript about the history of the 68000 after my post here. There is a discussion thread on the Real World Technology forums about why IBM chose the 8088 where this transcript was linked:
http://archive.computerhistory... [computerhistory.org]
http://www.realworldtech.com/f... [realworldtech.com]
The additional cost of the 68000 over the 8088 was an even larger factor than I remember.
Re: (Score:2)
Actually AMD defined the 64 bit extensions to the 32 bit x86 architecture, and Intel had to follow and is letting the Itanic sink.
Of course Intel does not even remotely admit it (and even Linus ranted on this fact in a mail a decade or so ago), but they have still not come over the NIH syndrome that it caused them.
That was the brief window when AMD had the upper hand. On the processor architecture side, Intel was splitting resources between hoping the non-backward-compatible Itanium was the future, and the disaster known as Netburst in the Pentium 4. Netburst had terrible performance and too-high power consumption, but it let Intel claim higher GHz. All Intel had was name recognition, but technologically AMD was leading with the K7 and K8, and on this they were able to launch 64 bit. Even though it'd be the better part
Re: (Score:1)
Re: (Score:3)
I've personally played around with the Moonshot, and being able to squeeze 45 blades into 5U of rack space (the specs say 4.3U...) is a nice thing. Each blade has two DIMM slots and an SSD, which is good enough to load a hypervisor, then use the onboard bus to reach a storage array.
I wouldn't say that each blade is as powerful as a blade in HP's conventional 16 blade enclosure (which takes 10 rack units), nor as powerful as a 1U standalone server... but you can choose what goes in, from a low end Xeon on the m71
AMD Withdraws From Business (Score:2)
Sadly, I don't see an "out" for AMD (Score:5, Informative)
Sadly, I don't see an "out" for AMD. Their x86/amd64 chips don't perform as well as Intel's. The ARM market is saturated. They don't have their own foundry.
What does modern day AMD bring to the table that anyone wants? Even at cut-rate pricing, they've saturated their channels with chips and can't even manufacture and ship new inventory until the backlog clears.
It's a shame, but I think they're on their last legs. :(
Comment removed (Score:5, Insightful)
Re: (Score:2)
I certainly don't agree with him on Windows and I have no strong feelings about Steam, but he is quite right about the CPUs.
When the story broke on the cheat embedded in the Intel compiler, I actually gave it a spin and looked at the assembly code. I also actually patched it out and saw the difference on an AMD compute node. I routinely see AMD perform on-par with Intel in compute intensive jobs.
The very top end Intel processors are faster than the top end AMD, but unless your constraints include "must fit
Re: (Score:1)
Video rendering or kernel compiling, for instance, are a different story.
But yeah, the guy's annoying. It's like the little kid whose parents can only buy him one videogame, so he needs to be a fanboy and try to convince everyone that he owns the best videogame.
Re: (Score:1)
Truth be told, Intel chips are a lot faster than AMD chips, but games don't benefit at all from the i5's or i7's extra oomph. You're just wasting power and draining notebook battery.
So one can just use an i3 instead?
Re: (Score:2)
Re: (Score:1)
I'm not "hairyfeet" but I agree with him. Testing with PCMARK 8 Work Accelerated 2.0 I have two benchmark results to share:
HP Probook 640 with Intel Core i5-4330m with Intel HD Graphics 4600 // 8 GB RAM // SSD 180 GB Intel // Score = 3486
HP Probook 645 with AMD A10-5750m with AMD HD8650G // 8 GB RAM // SSD 240 GB MX500 Crucial // Score = 3423
Virtually equivalent to me, I don't buy the Intel hype.
Re: (Score:2)
Yes. Because embedded graphics on a ULV mobile CPU is EXACTLY what people are talking about when they talk about comparisons...
Le sigh...
Re: (Score:2)
AMD still has too low single-thread performance, and if you care about that, Intel came out with the Celeron G1620 and Pentium G2020 (since updated with Haswell equivalents) and has ruled the low end too.
AMD ironically requires a more expensive motherboard and an aftermarket heatsink/fan if you go for that old six-core CPU. (Though in my opinion, games are the workload that most needs a CPU with four, six, or more cores, unless you're a professional who works all the time with lots of big pictures or video.)
I would ge
Re: (Score:3)
The A10 CPU is very good!
It's not the fastest CPU, that's true, but it's fast enough!
Then you have the internal GPU, which will eat Intel's alive. Leaving aside the hardcore gamers, normal users (home users, casual gamers, office work, etc.) will get a very good machine for a lower price. Everyone likes to have the most powerful rig in the neighborhood, but that is just ego talking; most people will not use it.
Hardcore gamers will always choose top CPUs and GPUs and will pay huge amounts of money to get them... b
Re:Sadly, I don't see an "out" for AMD (Score:5, Insightful)
What are you guys using that even sees the difference between these CPUs? I use several computers with different CPUs (cores, speeds, and brands) and I see almost no difference at all. Most systems are idle, waiting for user input or HD access. Of course I'm ignoring video editing and some small set of very CPU-hungry single-threaded apps, but most people don't use them anyway. Most people will get better performance by buying an SSD, not a CPU.
I think this is just a matter of "whose is bigger", not real performance differences.
What I like in AMD CPUs is that they have all the features, not bullshit capped CPUs like Intel's, where they remove features from lower CPUs to force you to buy higher (and much more expensive) ones.
Re: (Score:1)
Development.
Running compilers, tests, IDE's, DBs, several servers, etc, all chew up processes and threads.
And then there's Handbrake, which will eat all your available CPU if your memory is fast enough. Handbrake is an awesome multi-threaded test for CPUs. Take a known HD source and time the conversion. IIRC it's generally about 40 fps with 1080p sources and near 200 fps with DVD sources, which sounds about right - roughly 5 times more info in 1080p than 480i. I could squeeze another 20% by OC'ing the CPU/RAM some more.
Re: (Score:2)
So heavy multi-threaded usage is close, or even faster, on AMD...
For video encoding it's better to use the GPU... and again, AMD's APU is faster than anything Intel sells... If you buy a real GPU, well, the CPU will not matter much.
Re: (Score:2)
For myself, it's not uncommon for me to have many machines running under very high loads (often BOINC, sometimes games). And the boost in gaming performance I saw going from an AMD FX-8120 to an i5 3570K (both were equal priced at the time) was incredible. AMD's newer FX chips (which are very old at this point) don't even try to compete with the i7 - not even in AMD's market
Re: (Score:2)
Every CPU company with several CPUs does that.
If a batch yields CPUs that have some problem, they disable the broken parts and sell the silicon as the remaining working CPU. Silicon is expensive, the build process is expensive; if they didn't do this, all CPUs would be more expensive too.
Re: (Score:3)
I like AMD, I really do. They've gotten the short end of the stick over and over again. But even I have to admit that the Tek Syndicate benchmarks are poor proof of value right now, and for 2 reasons.
Re: (Score:1)
XSplit has been rendered functionally obsolete by newer software that uses the on-board H.264 encoders provided by AMD/NVIDIA/Intel. H.264 encoding is now a virtually free operation (with a 5% perf hit)
You have any citations for this? I'd love to see where H.264 encoding has a less than 50% perf hit, as my current workflow uses nearly 100% of my system for significant portions of time.
Sure (Score:3)
Look up "Shadowplay" by nVidia. That is their software that uses the "nvenc" feature of their new GPUs. It has near zero CPU and GPU load, just load on the disk. All encoding is done by a special dedicated encoder on the chip. It's a fast encoder too, it can do 2560x1600@60fps.
The downside is it is not as good looking per bit as some of the software encoders (particularly X264) so if the target is something low bitrate you may wish to capture high bitrate and then reencode to a lower bitrate with other soft
Re: (Score:2)
The downside is it is not as good looking per bit as some of the software encoders (particularly X264)
That's going to do me no good then. The entire purpose of encoding for me is to shrink the size while maintaining quality.
In that case (Score:2)
You'd want to look at a 5960X, if you can afford it. Particularly when overclocked (and they OC well with good cooling) they are the unquestioned champs for that kind of thing. They have plenty of power to be able to run a game well, plus have cores left over for good quality encoding.
Re: (Score:1)
Not sure, you'd have to check tests (Score:2)
Part of it would depend on the relative OCs, of course. Also it would depend on if your encoder could use AVX2/FMA3 and if so, how much speedup it provides. For things that it matters on, there have been near 2X speed gains, but I don't know how applicable the instructions are to H.264 encoding.
Another option is if you can find an encoder you like that has a CUDA version, you could give it a video card to run on. However you'd want to check the implementation to make sure its quality is comparable. Also you
Re: (Score:1)
Re: (Score:2)
Re:Sadly, I don't see an "out" for AMD (Score:5, Insightful)
Well, try the Phoronix benchmarks then.
http://www.phoronix.com/scan.p... [phoronix.com]
In this one the FX8350 is basically comparable to the i7 3770, the contemporary Intel processor. Sometimes a fair bit faster, sometimes a fair bit slower; on average about the same.
Now pull up a benchmark from the other sites from a similar era. You'll find the AMD processor getting stomped all over. Given Phoronix used open source software and GCC, I'm somewhat more inclined to trust it.
It also matches my experience that certain software is easily as fast on AMD as on Intel, but then again I run Linux too.
Re: (Score:3)
Re: (Score:2)
Hairy let's say AMD has a theoretical superior architecture?
AMD has .28 nm chips. Intel is down to .17 nm and skylark with .14 nm is just around the corner! Worse power requirements are now the new rage too. Tell me how can AMD compete?
They can't. Lower size increases speed and power requirements. Only advantage AMD has is cost ... oh wait another chip fabrication is needed and they want a cut :-(
Only saving grace is ATI graphics. If nvidia gets a hold of .17 nm chips then it's game over too.
I was a loyal
Re: (Score:2)
Sigh ... stupid Android auto correct thought x86 = xp 6. Slashdot please allow editing of posts?
Re: (Score:2)
Sigh, where to begin.
AMD has .28 nm chips. Intel is down to .17 nm and skylark with .14 nm is just around the corner!
Not .28nm, just 28nm and Broadwell is made on the same 14nm process as Skylake.
Only saving grace is ATI graphics. If nvidia gets a hold of .17 nm chips then it's game over too.
They haven't called it ATI graphics for 5 years, but now I'm quibbling. What's important is that both AMD and nVidia make their GPUs at TSMC and so have access to the exact same technology if they pay.
I was a loyal AMD user too. I tried and stayed til last year. It is frustrating, but an i7 4 core with 8 virtuals with hyperthreading really sped up my games compared to the 6 core.
Hyperthreading has little to do with it, the step down with pure quad-core (i5-2500k, i5-3570k, i5-4690k) has usually been far more cost effective for gaming. Four Intel cores simply beat eight AMD Bulldozer cores.
AMD needs to leave [x86] and go all ATI to stay solvent.
They're in the same boat on graphics, the last major new architecture was GCN in 2011 and it's way overdue for a replacement. So that depends, have they actually invested in a new architecture? With their R&D money going everywhere else, I don't see how.
Re: (Score:2)
Re: (Score:2)
Microsoft SQL Server
MSSQL is:
1. Compiled using Microsoft's compiler, not Intel's. No "cheating" there.
2. A fairly integer heavy workload which in theory would benefit AMD's module architecture.
The reality is Intel processors absolutely destroy AMDs in MSSQL performance. By more than 2 to 1. A mere Westmere based 2X Xeon X5690 12 core system beats a 32 core 2X Opteron 6282SE in TPC-E. Intel has 3 ge
Re: (Score:2)
This is exactly correct. I myself replaced a SQL Server cluster that was using boxes with dual 12-core AMD procs with one using dual 4-core Xeons a couple years ago. Performance and responsiveness went way up while the bill to Microsoft dropped massively.
I was a solid AMD enthusiast from the original Athlons all the way up until about 5 years ago. They went from huge underdog to reigning champion for a long time while the marketing guys ran Intel's product offering into the ground with everything from North
Re: (Score:1)
Anecdote time: I have two gaming systems that are closely similar in spec, both running Win7x64.
One runs an i5-2500K, the other a Phenom II X4 955 BE. According to everything you can read online, the i5-2500K should slash, crush, destroy, and humiliate the Phenom II in every possible scenario, and then nuke it from orbit just to be sure.
In actuality: There is NO noticeable difference in gaming performance between the two systems.
Pricewise, the Phenom II was purchased new back in 2010 and still cost half
Re: (Score:2)
Re: (Score:2)
Cite an article, not YouTube videos.
And I'm concerned about single-threaded compute performance, not embedded graphics. I never use the embedded graphics on a processor except on a notebook that has no slot for a video card.
Re: (Score:2)
Re: (Score:2)
Oh please. What Intel did to AMD is anti-competitive and horrible, but please don't tell us the fairy tale that all benchmarks have been rigged. There exist dozens of benchmarks, as well as game and application based benches. They all show that AMD CPUs get slaughtered when it comes to single core IPC, while also being pretty poor in the power consumption department.
Intel did nasty things to AMD, but AMD dug its own grave when it designed the current Bulldozer-based architectures with the goal of maximizin
Their hardware is very good (Score:4, Interesting)
I am writing my own (multi-threaded) software and recently I had a chance to do a test run on an intel i7 processor (8-core, 2.67GHz) to compare it with my old Athlon II X4 (3GHz). Both programs compiled with the same version of GCC (4.6.1), both compiled with -O3 optimization. Running 8 threads on the Intel machine was only marginally faster than running 4 threads on the old Athlon. The threads were independent, so no threads were inactive while waiting for something else to finish.
Where Intel have the lead is in the compiler business. Back in 2003 or so they released their ICC 8.0 for free for Linux users. I was writing only single-threaded software at the time, and simply re-compiling it with ICC made it run about 5 times faster than the version compiled with GCC 2.96. And that was on a 2GHz Athlon XP.
What AMD have done right is the integration of the CPU and GPU, allowing them to gobble up the console market. However, their bet that all developers would jump on the heterogeneous computing bandwagon did not pan out. But with HSA 1.0 coming up, their lead will be too large and neither Nvidia nor Intel will have a competitor ready for the next console refresh. All that Nvidia will do is continue to pay game developers to optimize their engines for GeForce cards, and refuse to optimize for Radeon. AMD's resources are so limited that they will be forced to have a desktop version of their console processor, and maybe an ARM core for good measure.
Exiting the "dense server" area makes perfect sense, as the market is very limited. Running across many small cores is hard and developers will avoid it. It is the same story as taking advantage of the GPU, which also provides many simple cores.
So no, they are not dead; they are simply adapting to market realities and accepting that they made a mistake when they jumped on the dense-server bandwagon. Unlike Intel, who even now refuse to let go of the Itanium.
Perhaps not. (Score:2)
While I agree in part, they have a few outs.
One thing I don't understand about the post is the "high-density" bit. I am not sure if that is technobabble for some super-duper specialized server construct; however, server chips have been one of the few places AMD has excelled in the last number of years. Another place they do well is the budget segment, where cost is more of a feature than actual processing speed. The difficulty with that segment is that the margins are likely very low, so you have to mak
Re: (Score:3)
AMD chips run all three latest-gen gaming consoles...
Re: (Score:2)
All PS4 and XBones combined = less than 40 million units total, since fall of 2013.
Compare that to 130 million desktops, 200 million laptops annually. Not quite, but the latest gen consoles over 18 months represent about 10% of 12 months' worth of PC + Laptop sales.
Add to that, tablets, of which 15% are Intel powered (and this will trend upwards over the next three years) which sell about 200 million a year. I wouldn't imagine the console contracts are particularly valuable, as they had to
AMD has played losing strategy for too long (Score:5, Interesting)
AMD has played a losing strategy for as long as I can remember. It is sad, but I remember my first few PCs were all AMD machines. I bought AMD on principle, and because they were price/performance leaders. They were even outright leaders for a while, but failed to capitalise on that. I think, however, that the whole Sledgehammer/Clawhammer phase ultimately ruined them. Obviously, those processors were streets ahead of the Intel offerings at the time, but it was always a long-term losing strategy, in particular if they were depending on selling CPUs to make money. Their obsession with OEM deals also hurt them.
AMD could have done one of a few things, in my opinion, to reinvent themselves.
- They could have become a whole-hog PC builder, using their own chips and pricing their laptops and desktops accordingly.
- When Android happened, AMD, without as much baggage as Intel, could have produced an Android phone and Android tablets, and gone to market with that, using their chip making expertise to develop offerings that would have been more competitive than Qualcomm, Samsung etc.
AMD was obsessed with being a mini Intel, which was never going to work out for them.
AMD should have taken a page out of Apple's playbook. At best, they might be taken over by a Chinese company, otherwise they are doomed to irrelevance.
Re:AMD has played losing strategy for too long (Score:5, Insightful)
They were even outright leaders for a while, but failed to capitalise on that.
Wow, that is the understatement of the century. AMD at one point did decide not to be a "mini Intel" and became a technology leader. Do you realize that while AMD had a far superior product for several years, Intel threw money (and threats, as was proved) at every retailer/integrator/etc. out there to not carry AMD (and did other "interesting" things, such as rigging their industry-standard compilers)? Intel was allowed to use strong-arm tactics that scream anti-trust, and after many years an almost bankrupt AMD was made to accept a small payment while Intel went scot-free.
If you have a product that is far ahead of the competition, you should be allowed to capitalize on that. If you are illegally prevented from doing so by the powerful players, there should be some sort of protection, before it is too late. But I guess the DoJ was asleep at the wheel...
You have to remember, the Athlon was getting a firm lead on the P3 and Intel got out the P4 as a "response". The P4, the processor now universally known as the biggest "dog" by virtually everyone (even in its final and much, much improved incarnations), eventually abandoned even by Intel to go back to a saner P3-derived architecture, was actually welcomed with laurels, both by (most of) the press and the integrators. AMD put in all this R&D effort and got nothing out of it; instead they were bleeding money for years, while Intel was making money, with the current situation being a very weak AMD next to a behemoth. It is too bad for us, because the sole reason Intel CPUs are affordable is AMD - I won't remind you how much Intel charged per CPU before there was competition. The sole reason Intel CPUs are this fast (or even that their consumer products are 64-bit) is AMD. I only hope for some miracle for AMD to survive and get some competition going, otherwise there will be no one left to keep Intel in check and consumers will pay for it...
So, yeah, the greatest industrial robbery of all time has been largely forgotten. AMD just "failed to capitalize", they were "obsessed with being a mini Intel"...
Re: (Score:2)
AMD could never capitalise on their lead for long enough because Intel ultimately had more money, could spend more on R&D and would eventually catch up to and surpass. They were also leaders in process technology - AMD never caught up with them in that department, and were able to squeeze out more performance from what was a worse architecture. Ultimately, the likes of Dell, although they might have, of their own volition, used AMD, were always going to be Intel shops. AMD was always one step away from
Re: (Score:1)
Are you kidding me?
Intel was outselling AMD 4 to 1 with slower, pricier, power-hungry chips.
Compaq refused to take AMDs chips FOR FREE.
R&D, my ass...
Re: (Score:2)
Ultimately, the likes of Dell, although they might have, of their own volition, used AMD, were always going to be Intel shops.
So, Intel was paying Dell essentially up to $1 Billion a year to not carry AMD just for fun? They were not going to go AMD anyway, even though they were so much faster, cooler and even cheaper?
Back in 2003-2004 we wanted to buy a few dozen servers for our lab at my University. My professor who had gotten the grant had gotten offers from various companies, Dell offering Xeon-based ones and others (HP and Sun I think?) offering Opteron-based. I was given remote access to a sample Dell server and a sample Opte
Re: (Score:1)
While Intel did pull a whole bunch of shady crap, AMD's downfall was not (entirely) Intel's doing.
AMD has been serially mismanaged for almost a decade now, and the blame falls squarely on the shoulders of "Business minded" execs plundering the company for bonuses and golden parachutes.
AMD's big fuck-ups:
Spinning off their fabs - This was done purely to jiggle and manipulate stocks so a few connected organizations could make a lot of money. Becoming abstracted from your process tech is a stupid idea when mak
Re: (Score:2)
"The core2 was introduced in 2006. - Almost a decade ago and core2 based computers are still quite damn fast today."
Agreed. My hand-me-down i3-2100 is faster than the 3.2 C2D it replaced, yes, but I didn't fall off my chair. Anything more recent than a P4 will be usable for everyday tasks for most people.
As for AMD going down, I really don't want to go back to paying a thousand dollars for a CPU...
Ummmm.... no (Score:2)
Sorry, but you have a selective memory. AMD was actually only a performance leader for a very brief period of time, that being the P4 days. That was also not because of anything great they did, but rather because the P4 ended up being a bad design that did not scale as Intel thought it would. Outside of that they were competitive during the P3 days, but behind other than that.
They also had serious problems outside of any business practices from Intel. The three big ones that really screwed the
Re:AMD has played losing strategy for too long (Score:4, Informative)
They were even outright leaders for a while, but failed to capitalise on that.
Because of Intel's illegal business practices, for which they didn't receive any criminal sanctions. All they had to do was pay $1e9 to AMD, which is far less than they've profited by it.
Re: (Score:2)
I don't disagree that Intel had illegal business practices. But as you also point out, it was better for Intel to strong arm its "partners" to not deal with AMD in the long run, and they bet on AMD continuing to take them on in a game they could not win.
If AMD, by making its own computers, had been able to get an additional (completely made up) $25 per PC sold, they might have been making a billion or so dollars extra a year, which would have been a big deal for them, and might have given them the revenue t
Re: (Score:2)
The Athlon64 days were their peak.
Not only did they have chips that were faster and lower powered, they also had 64-bit support well before Intel (granted, way before 64-bit had any really useful purpose).
Then Intel came out with Core 2 Duo, which beat them in every category, to which AMD had no answer, then Intel refined the design even better, to which AMD again had really nothing, and it has been that way ever since.
They should not have become a whole hog builder. Margins suck. They would have been a Chine
We all need to realize... (Score:5, Insightful)
If AMD goes the way of the dodo bird, so do our cheap processors. Moreover, we'll likely lose a great deal of software freedom as what Intel says becomes law across the board. UEFI and TPM? They're Disneyland compared to what Intel could demand under the guise of "security" from every future computer.
Re: (Score:2, Informative)
Meh.. this meme has been copy & pasted onto Slashdot over & over again since the 90s.
Guess what:
1. I can tell you exactly how much Intel chips will cost if AMD is noncompetitive or goes away entirely... they'll cost exactly what they cost now because AMD is already effectively out of the game.
People forget that Intel is not only in heavy competition with ARM, but Intel is in perpetual competition with its own parts from last year and if Intel really jacks up prices they will simply lose busi
Re: (Score:2)
Nah, you just invent some new feature and make sure marketing plants it into everybody's head that they need it (hyperthreading).
Re: (Score:2)
...we need AMD. Because if AMD goes away, Intel has zero competitors in the x86/64 market.
AMD gave up on the markets I care about in 2012 so I don't really care, what's worse it that without AMD there's really no competitor to nVidia in the high end GPU market either.
If AMD goes the way of the dodo bird, so do our cheap processors.
That's what smartphones and tablets are for, you only need x86 if you're doing anything CPU intensive and anything CPU intensive you shouldn't be doing on a cheap CPU in the first place.
Moreover, we'll likely lose a great deal of software freedom as what Intel says becomes law across the whole board. UEFI and TPM?
AMD supports all the same DRM standards as Intel.
What used to be the "traditional" AMD has already imploded, if anything they'll exit the consumer
Re: (Score:2)
AMD GPUs are very competitive. If I were the CEO I would sell off the CPU business. Keep ATI.
The reason AMD sucks is that they no longer have the economies of scale for chips below 28 nm, while Qualcomm and Intel are down to 22 nm and heading towards 14 nm with Skylake.
Nvidia is stuck at 28 nm too.
If AMD hadn't spun off GlobalFoundries and also had a leading-edge process, it could compete and throw Nvidia out of business too.
Re: (Score:2)
...we need AMD. Because if AMD goes away, Intel has zero competitors in the x86/64 market.
I used to think this too, but I'm not so sure it's totally true today. There are more CPU makers than Intel and AMD; although these are the two players in the PC/server market, that market is starting to show a decline. People are moving away from the desktop/laptop in favor of their smartphones and handheld devices, and the CPUs in those devices are usually not AMD or Intel made.
I used to think that Intel had to keep AMD going to avoid anti-trust problems, but these days that issue is reall
...and? (Score:2)
What is your proposal, people should purchase AMD chips as a charity?
Nobody other than Intel zealots wants to see AMD go away. However, if AMD's products are not competitive for what people want, why should they buy them? Trying to argue for charity buying is a non-starter and a very bad strategy.
AMD has been really screwing up on their processors as of late. Their performance is not that good in most things and their performance per watt is even worse. So for a great many tasks, they are not a great choice. Their
Re: (Score:2)
We need other competitors too. Remember Cyrix? :(
Re: (Score:2)
I'm cheering for AMD for much the same reasons. I'm also hedging my bet with ARM.
Re: (Score:2)
I was under the impression that Intel cannot survive in a non-AMD world because of anti-monopoly laws. At least not without breaking in half.
There is no law against being a monopoly; you just can't abuse that monopoly status unfairly.
Re: (Score:2)
Re: (Score:2)
Run Windows VMs and keep adding them until the boxes are under some level of resource contention (3:1, 4:1 vCPU:pCPU). If you don't see a difference, I'd be highly curious of your workloads and configuration.
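The contention test described above boils down to simple arithmetic. A sketch (the VM counts and core counts below are hypothetical examples, not figures from the thread):

```python
# Illustrative sketch: compute the vCPU:pCPU overcommit ratio for a
# virtualization host. All numbers are made up for illustration.

def overcommit_ratio(vms, vcpus_per_vm, physical_cores):
    """Return the vCPU:pCPU ratio for a host running identical VMs."""
    total_vcpus = vms * vcpus_per_vm
    return total_vcpus / physical_cores

# A hypothetical 16-core host running 16 four-vCPU Windows VMs sits at
# 4:1 -- the upper end of the contention range mentioned above.
ratio = overcommit_ratio(vms=16, vcpus_per_vm=4, physical_cores=16)
print(f"{ratio:.0f}:1")  # prints "4:1"
```

Once the ratio climbs past roughly 3:1 or 4:1, CPU-bound guests start queueing for physical cores, which is where scheduling differences between processors would show up.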
Re: (Score:2)
No they didn't (Score:2)
The basic 1U 4-socket machines with 64 cores and 512G of RAM were denser than anything SeaMicro ever made.
The "dense server" companies were working off the myth that servers were still one CPU in a 4U box. That stopped being the case years ago. Commodity stuff is already really dense.
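The density argument above is a quick cores-per-rack-unit calculation. A sketch (the 8-core figure for the legacy box is an assumption for illustration):

```python
# Hypothetical core-density comparison in cores per rack unit (U),
# echoing the comment above: a 1U 4-socket box with 64 cores versus
# the old assumption of a single CPU (here assumed 8 cores) in 4U.

def cores_per_u(total_cores, rack_units):
    """Return compute density as cores per rack unit."""
    return total_cores / rack_units

commodity_1u = cores_per_u(total_cores=64, rack_units=1)  # 64 cores/U
legacy_4u = cores_per_u(total_cores=8, rack_units=4)      # 2 cores/U

print(f"commodity 1U: {commodity_1u:.0f} cores/U")
print(f"legacy 4U: {legacy_4u:.0f} cores/U")
```

By this arithmetic a commodity 1U four-socket box is over an order of magnitude denser than the old-style server, which is the poster's point: the density gap SeaMicro targeted had already closed.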
Re: (Score:2)
It's not all about density, though; power consumption is becoming the issue of the day. Anything to compete with the other guy and make a name in a market that really doesn't have any differences in product...
But as another poster noted, there isn't any difference in how AMD and Intel processors operate in a data center. If the machine runs the software you want, most of us who are buying servers don't really care all that much. Today's machines are faster, smaller, and consume less power so they fit
Re: (Score:2)
So density is way overrated as a differentiator in the server market. It doesn't really matter to the bulk of customers anyway, and people who fall for the whole blade-server thing but buy server chassis that are not totally full to start are nuts. If you cannot afford the blades now, trust me, you won't be able to get them in two years after they are EOL'ed by the vendor. Just buy separate servers and keep the upgrade path as simple as possible. So none of this has anything to do with the CPU vendor in
Re: (Score:2)
The SeaMicro stuff wasn't just density. It was power use, the switching fabric, and lots of other stuff. It was designed as a sort of mini-mainframe with higher I/O throughput than the dense high-compute stuff you can get in commodity hardware. Power use was in fact one of their main selling points. They were offering the same compute power at like 10% of the power by using low-power (and low-compute) processors stacked on a fabric that eliminated their weaknesses. At the same time their custom networking fabric
Re: (Score:2)
They were offering the same compute power at like 10% of the power by using low power (and low compute) processors stacked on a fabric that eliminated their weaknesses.
Except that never worked, because the low-power processors didn't get great ops/watt, so even that wasn't much of an advantage, if any. They were using Atom processors which, while low power, were less efficient than the server processors at many of the workloads.
The core market of this was the datacenter virtual machine market. Their servers
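The ops/watt objection above can be made concrete: a part drawing 10% of the power only wins on efficiency if it delivers more than 10% of the throughput. A sketch with invented numbers:

```python
# Hypothetical efficiency comparison. All figures are made up purely
# to illustrate the ops/watt argument from the comment above; they are
# not measurements of any real Atom or server part.

def ops_per_watt(ops_per_sec, watts):
    """Return throughput per watt, the efficiency metric at issue."""
    return ops_per_sec / watts

server_chip = ops_per_watt(ops_per_sec=1000, watts=100)  # 10.0 ops/W
atom_chip = ops_per_watt(ops_per_sec=80, watts=10)       #  8.0 ops/W

# Despite drawing only 10% of the power, the low-power chip is less
# efficient here because it delivers only 8% of the throughput.
print(server_chip > atom_chip)  # prints "True"
```

This is the trap the poster describes: "10% of the power" sounds compelling, but the metric that matters for a datacenter is work done per watt, not watts alone.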
Re: (Score:2)
I'm not saying it was perfect. Back when they were Intel-only, they supposedly had some secret sauce to shut off the parts of the Atom CPU that weren't needed (much of the northbridge). The expectation was that when Intel got Rangeley and Avoton out the door, they'd have something that blew everything else out of the water. IMO they would have been right. AMD purchasing them and restricting their products to AMD chips, particularly with AMD's refreshed CPU core with low IPC, destroyed their game plan
Seamicro wa
Re: (Score:2)
SeaMicro was well positioned and had some neat tech. I think they would have been moderately successful in the data center had AMD not bought them.
Maybe. The fabric was interesting, but it's a bit meh. Don't forget that commodity 1U servers have full lights-out management and properly implemented WOL, so you can power them up and down remotely to scale with demand quite cleanly. While you don't have the same degree of fine-grained control, it's a decent enough approximation.
I mean, it's possible. Back when seamicro was a thi
Next Gen Console CPU Supplier (Score:1)
Re: (Score:2)
It's curious that they're having money problems, since as I understand it they provide CPUs for both the Xbox One and the PS4. So that combined with the PC market is still not enough, huh?
Margins will be tiny. They probably need to sell a hundred consoles to make as much profit as Intel makes from a single server CPU.
Intel will prop them up (Score:1)
I suspect Intel will find a way to keep AMD alive through pricing games, to avoid both anti-trust accusations and the appearance that x86 CPUs are a dying business. Intel and AMD need each other whether they like it or not.
Re: (Score:2)