AMD Hardware

AMD Withdraws From High-Density Server Business

An anonymous reader sends word that AMD has pulled out of the market for high-density servers. "AMD has pulled out of the market for high-density servers, reversing a strategy it embarked on three years ago with its acquisition of SeaMicro. AMD delivered the news Thursday as it announced financial results for the quarter. Its revenue slumped 26 percent from this time last year to $1.03 billion, and its net loss increased to $180 million, the company said. AMD paid $334 million to buy SeaMicro, which developed a new type of high-density server aimed at large-scale cloud and Internet service providers."

  • by Anonymous Coward on Friday April 17, 2015 @05:48AM (#49491899)

    Looks like they're focusing on ARM chips:

    "AMD still sees growth potential in the server market, but not from selling complete systems. It's returned its focus to x86 chips and to the development of its first ARM server processor, code-named Seattle."

    8-core 64-bit ARM chips with a built-in GPU are fairly common, 10-core chips have already been announced (MediaTek), and 16-48 core parts have been vaguely hinted at for servers by other vendors. So if AMD plans to enter the ARM processor market, it had better get something special out fast, and be prepared to stick with it, upgrade it, and absorb the initial losses, because companies are unlikely to commit until they're confident AMD is in it for the long run and won't leave them hanging without a supplier.

    On the other hand they could focus on x86 chips, where Intel is already deep-discounting at the low end and will likely have to do that all the way up the range to compete.

    AMD face a tough time either way.

    • 8-core 64-bit ARM chips with a built-in GPU are fairly common, 10-core chips have already been announced (MediaTek), and 16-48 core parts have been vaguely hinted at for servers by other vendors

      A bit more than hinting: Cavium is selling 24-48 core ThunderX (ARMv8) chips [cavium.com]. I think the first one shipped a month or two ago.

      • Be aware that some vendors list a product on their web site as if it were a current production product when really it's at the "we have a few samples and will let you have one if we like you and/or you pay us a load of money" stage.

        • I am aware of that. I'm also aware of the ThunderX box that we have on the FreeBSD test lab network, but I don't remember exactly how long it's been there.
    • I agree... what AMD should do (and I have proposed this numerous times) is build a server chip that combines ARM and x86_64 with seamless switching. Yes, I know it would require OS support, but it would be incredibly flexible: run many low-end ARM-based services on a box that could be switched in an instant to high-performance x86 tasks. And not only that, make it so it can run ARM/x86 side by side in the OS using two cooperating kernels with an option to move task data
  • FTFY... in the future.
  • by msobkow ( 48369 ) on Friday April 17, 2015 @06:22AM (#49491957) Homepage Journal

    Sadly, I don't see an "out" for AMD. Their x86/amd64 chips don't perform as well as Intel's. The ARM market is saturated. They don't have their own foundry.

    What does modern day AMD bring to the table that anyone wants? Even at cut-rate pricing, they've saturated their channels with chips and can't even manufacture and ship new inventory until the backlog clears.

    It's a shame, but I think they're on their last legs. :(

    • Comment removed (Score:5, Insightful)

      by account_deleted ( 4530225 ) on Friday April 17, 2015 @06:41AM (#49491993)
      Comment removed based on user account deletion
      • AMD's single-thread performance is still too low, and if you care about that, Intel came out with the Celeron G1620 and Pentium G2020 (since updated with Haswell equivalents) and has ruled the low end too.

        AMD ironically requires a more expensive motherboard and an aftermarket heatsink/fan if you go for that old six-core CPU. (Though in my opinion a CPU with four, six, or more cores is mostly needed for games, unless you're a professional who works with lots of large images or video all the time.)
        I would ge

        • by higuita ( 129722 )

          The A10 CPU is very good!

          It's not the fastest CPU, that's true, but it's fast enough!
          Then you have the internal GPU, which will eat Intel's alive. Hardcore gamers aside, normal users (home users, casual gamers, office workers, etc.) will get a very good machine for a lower price. Everyone likes to have the most powerful rig in the neighborhood, but that's just ego talking; most people will never use it.

          Hardcore gamers will always choose top CPUs and GPUs and will pay huge amounts of money to get them... b

      • I like AMD, I really do. They've gotten the short end of the stick over and over again. But even I have to admit that the Tek Syndicate benchmarks are poor proof of value right now, for two reasons.

        1. They were specifically structured to make the AMD processors look good by running a high CPU load H.264 encoding task (XSplit) while also running a game, which leads us to...
        2. XSplit has been rendered functionally obsolete by newer software that uses the on-board H.264 encoders provided by AMD/NVIDIA/Intel. H.26
        • by Gr8Apes ( 679165 )

          XSplit has been rendered functionally obsolete by newer software that uses the on-board H.264 encoders provided by AMD/NVIDIA/Intel. H.264 encoding is now a virtually free operation (with a 5% perf hit)

          You have any citations for this? I'd love to see where H.264 encoding has a less than 50% perf hit, as my current workflow uses nearly 100% of my system for significant portions of time.

          • Look up "Shadowplay" by nVidia. That is their software that uses the "nvenc" feature of their new GPUs. It has near zero CPU and GPU load, just load on the disk. All encoding is done by a special dedicated encoder on the chip. It's a fast encoder too, it can do 2560x1600@60fps.

            The downside is it is not as good looking per bit as some of the software encoders (particularly X264) so if the target is something low bitrate you may wish to capture high bitrate and then reencode to a lower bitrate with other soft
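
            A minimal sketch of that two-pass idea (capture with the hardware encoder, then shrink with a software encoder), assuming an ffmpeg build with NVENC support; the file names and bitrates are made up:

                import subprocess

                SRC = "capture_input.mp4"  # hypothetical source clip

                # Hardware encode via NVENC: near-zero CPU load, the work is
                # done by the GPU's dedicated encoder block.
                subprocess.run(["ffmpeg", "-y", "-i", SRC, "-c:v", "h264_nvenc",
                                "-b:v", "50M", "capture_hw.mp4"], check=True)

                # Software re-encode via x264: heavy CPU load, but better
                # quality per bit, so it suits the low-bitrate final pass.
                subprocess.run(["ffmpeg", "-y", "-i", "capture_hw.mp4",
                                "-c:v", "libx264", "-crf", "22",
                                "-preset", "slow", "final_sw.mp4"], check=True)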

            • by Gr8Apes ( 679165 )

              The downside is it is not as good looking per bit as some of the software encoders (particularly X264)

              That's going to do me no good then. The entire purpose of encoding for me is to shrink the size while maintaining quality.

              • You'd want to look at a 5960X, if you can afford it. Particularly when overclocked (and they OC well with good cooling) they are the unquestioned champs for that kind of thing. They have plenty of power to be able to run a game well, plus have cores left over for good quality encoding.

                • by Gr8Apes ( 679165 )
                  I'm running a 980X OC'd now, that would be about 35% faster, I'm guessing, based on 2 extra cores and a few newer generations.
                  • Part of it would depend on the relative OCs, of course. Also it would depend on if your encoder could use AVX2/FMA3 and if so, how much speedup it provides. For things that it matters on, there have been near 2X speed gains, but I don't know how applicable the instructions are to H.264 encoding.

                    Another option is if you can find an encoder you like that has a CUDA version, you could give it a video card to run on. However you'd want to check the implementation to make sure its quality is comparable. Also you
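
                    A quick way to check the AVX2/FMA3 point on a Linux box: the kernel lists the supported extensions in /proc/cpuinfo (FMA3 shows up as plain "fma"). A minimal sketch:

                        # Check whether the CPU advertises AVX2 and FMA3 (Linux only).
                        def cpu_flags():
                            with open("/proc/cpuinfo") as f:
                                for line in f:
                                    if line.startswith("flags"):
                                        return set(line.split(":", 1)[1].split())
                            return set()

                        flags = cpu_flags()
                        for ext in ("avx2", "fma"):
                            print(ext, "yes" if ext in flags else "no")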

                    • by Gr8Apes ( 679165 )
                    I meant to reply to this earlier - if I were to go this route, I'd hit a 12 or even 16 core Xeon. If you can afford the 5960X, there's no reason you can't pick up a Xeon with a much higher core count. The 18-core one is just ridiculous in price (heck, all of them are a bit overpriced IMNSHO). Much better to just wait 6-9 months. Or buy a dual-CPU motherboard and drop 2 Xeon hexcores in it for $500 or so and get more than double my current performance. Not sure I want the extra 140+W of heat though.
          • Look up XSplit. The newest version of XSplit Broadcaster natively supports x264 software encoding as well as Intel QuickSync, NVIDIA NVENC, AMD VCE, and AVerMedia H.264 hardware encoding. What I couldn't tell from a quick look around the site is if XSplit Gamecaster also supports the same hardware encoding options as XSplit Broadcaster.
        • by serviscope_minor ( 664417 ) on Friday April 17, 2015 @09:38AM (#49492779) Journal

          Well, try the Phoronix benchmarks then.

          http://www.phoronix.com/scan.p... [phoronix.com]

          In this one the FX8350 is basically comparable to the i7 3770, its contemporary Intel processor: sometimes a fair bit faster, sometimes a fair bit slower, on average about the same.

          Now pull up a benchmark from the other sites from a similar era. You'll find the AMD processor getting stomped all over. Given that Phoronix used open source software and GCC, I'm somewhat more inclined to trust it.

          It also matches my experience that certain software is easily as fast on AMD as on Intel, but then again I run Linux too.

      • Hairy let's say AMD has a theoretical superior architecture?

        AMD has .28 nm chips. Intel is down to .17 nm and skylark with .14 nm is just around the corner! Worse power requirements are now the new rage too. Tell me how can AMD compete?

        They can't. Lower size increases speed and power requirements. Only advantage AMD has is cost ... oh wait another chip fabrication is needed and they want a cut :-(

        Only saving grace is ATI graphics. If nvidia gets a hold of .17 nm chips then it's game over too.

        I was a loyal

        • Sigh ... stupid Android auto correct thought x86 = xp 6. Slashdot please allow editing of posts?

        • by Kjella ( 173770 )

          Sigh, where to begin.

          AMD has .28 nm chips. Intel is down to .17 nm and skylark with .14 nm is just around the corner!

          Not .28nm, just 28nm and Broadwell is made on the same 14nm process as Skylake.

          Only saving grace is ATI graphics. If nvidia gets a hold of .17 nm chips then it's game over too.

          They haven't called it ATI graphics for 5 years, but now I'm quibbling. What's important is that both AMD and nVidia make their GPUs at TSMC and so have access to the exact same technology if they pay.

          I was a loyal AMD user too. I tried and stayed til last year. It is frustrating but an i7 4 core with 8 virtuals with hyperthreading really sped up my games compared to the 6 core.

          Hyperthreading has little to do with it, the step down with pure quad-core (i5-2500k, i5-3570k, i5-4690k) has usually been far more cost effective for gaming. Four Intel cores simply beat eight AMD Bulldozer cores.

          AMD needs to leave [x86] and go all ATI to stay solvent.

          They're in the same boat on graphics, the last major new architecture was GCN in 2011 and it's way overdue for a replacement. So that depends, have they actually invested in a new architecture? With their R&D money going everywhere else, I don't see how.

          • by armanox ( 826486 )
            I'd like to see a major update in the Radeon line, but they still perform well for certain workloads (like OpenCL). AMD could make a killing with a major GPU update, as long as it's not a flop like Bulldozer.
      • Since you are such an MS fanboy, here is the biggest MS example of how you are completely wrong:

        Microsoft SQL Server

        MSSQL is:
        1. Compiled using Microsoft's compiler, not Intel's. No "cheating" there.
        2. A fairly integer-heavy workload which in theory should benefit AMD's module architecture.

        The reality is Intel processors absolutely destroy AMD's in MSSQL performance. By more than 2 to 1. A mere Westmere-based 2x Xeon X5690 12-core system beats a 32-core 2x Opteron 6282SE in TPC-E. Intel has 3 ge
        • This is exactly correct. I myself replaced a SQL Server cluster that was using boxes with dual 12-core AMD procs with one using dual 4-core Xeons a couple years ago. Performance and responsiveness went way up while the bill to Microsoft dropped massively.

          I was a solid AMD enthusiast from the original Athlons all the way up until about 5 years ago. They went from huge underdog to reigning champion for a long time while the marketing guys ran Intel's product offering into the ground with everything from North

      • by Anonymous Coward

        Anecdote time: I have two gaming systems that are closely similar in spec, both running Win7x64.

        One runs an i5-2500K, the other a Phenom II X4 955 BE. According to everything you can read online, the i5-2500K should slash, crush, destroy, and humiliate the Phenom II in every possible scenario, and then nuke it from orbit just to be sure.

        In actuality: There is NO noticeable difference in gaming performance between the two systems.

        Pricewise, the Phenom II was purchased new back in 2010 and still cost half

        • by armanox ( 826486 )
          You missed the part where AMD chip design changed after the Phenom II (which was a nice processor in its day and did compete against the Intel offerings). The Bulldozer-based processors were a step backwards - a Phenom II X6 ran circles around any FX 81xx processor. The Bulldozer design is as big a failure as Itanium.
      • by msobkow ( 48369 )

        Cite an article, not YouTube videos.

        And I'm concerned about single-threaded compute performance, not embedded graphics. I never use the embedded graphics on a processor except on a notebook that has no slot for a video card.

      • Oh please. What Intel did to AMD is anti-competitive and horrible, but please don't tell us the fairy tale that all benchmarks have been rigged. There exist dozens of benchmarks, as well as game and application based benches. They all show that AMD CPUs get slaughtered when it comes to single core IPC, while also being pretty poor in the power consumption department.

        Intel did nasty things to AMD, but AMD dug its own grave when it designed the current Bulldozer-based architectures with the goal of maximizin

    • by Laxator2 ( 973549 ) on Friday April 17, 2015 @10:19AM (#49493145)

      I am writing my own (multi-threaded) software and recently I had a chance to do a test run on an Intel i7 processor (8-core, 2.67GHz) to compare it with my old Athlon II X4 (3GHz). Both versions were compiled with the same version of GCC (4.6.1) and with -O3 optimization. Running 8 threads on the Intel machine was only marginally faster than running 4 threads on the old Athlon. The threads were independent, so no thread sat idle waiting for another to finish.
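
      A minimal sketch of that kind of test, in Python rather than the poster's compiled code: independent CPU-bound jobs, no shared state, timed at 4 and 8 workers (processes rather than threads, so the interpreter lock doesn't serialize the work). All names and sizes here are made up:

          import time
          from multiprocessing import Pool

          def work(n):
              # Independent CPU-bound job: no locks, nothing waits on anything else.
              total = 0
              for i in range(n):
                  total += i * i
              return total

          def bench(workers, jobs=8, n=5_000_000):
              start = time.perf_counter()
              with Pool(workers) as pool:
                  pool.map(work, [n] * jobs)
              return time.perf_counter() - start

          if __name__ == "__main__":
              for w in (4, 8):
                  print(f"{w} workers: {bench(w):.2f}s")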

      Where Intel have the lead is in the compiler business. Back in 2003 or so they released their ICC 8.0 for free for Linux users. I was writing only single-threaded software at the time, and simply re-compiling it with ICC made it run about 5 times faster than the version compiled with GCC 2.96. And that was on a 2GHz Athlon XP.

      What AMD have done right is the integration of the CPU and GPU, allowing them to gobble up the console market. However, their bet that all developers would jump on the heterogeneous computing bandwagon did not pan out. But with HSA 1.0 coming up, their lead will be too large and neither Nvidia nor Intel will have a competitor ready for the next console refresh. All Nvidia will do is continue to pay game developers to optimize their engines for GeForce cards and refuse to optimize for Radeon. AMD's resources are so limited that they will be forced to have a desktop version of their console processor, and maybe an ARM core for good measure.

      Exiting the "dense server" are makes perfect sense, as the market is very limited. Running across many small cores is hard and developers will avoid it. It is the same story as taking advantage of the GPU, which also provides many simple cores.

      So no, they are not dead, they are simply adapting to market realities and accept that they made a mistake when they jumped in the dense server bandwagon. Unlike Intel, who even now refuse to let go of the Itanium.

    • While I agree in part, they have a few outs.

      One thing I don't understand about the post is the "High-Density" bit. I am not sure if that is technobabble for some super-duper specialized server construct; however, server chips have been one of the few places AMD has excelled in the last number of years. Another place they do well is the budget segment, where cost is more of a feature than actual processing speed. The difficulty with that segment is that the margins are likely very low, so you have to mak

  • by vakuona ( 788200 ) on Friday April 17, 2015 @07:47AM (#49492153)

    AMD has played a losing strategy for as long as I can remember. It is sad, but I remember my first few PCs were all AMD machines. I bought AMD on principle, and because they were price/performance leaders. They were even outright leaders for a while, but failed to capitalise on that. I think, however, that the whole Sledgehammer/Clawhammer phase ultimately ruined them. Obviously, those processors were streets ahead of the Intel offerings at the time, but it was always a long-term losing strategy, particularly if they were depending on selling CPUs to make money. Their obsession with OEM deals also hurt them.

    AMD could have done one of a few things, in my opinion, to reinvent themselves.
      - They could have become a whole-hog PC builder, using their own chips and pricing their laptops and desktops accordingly.
      - When Android happened, AMD, without as much baggage as Intel, could have produced an Android phone and Android tablets, and gone to market with that, using their chip making expertise to develop offerings that would have been more competitive than Qualcomm, Samsung etc.

    AMD was obsessed with being a mini Intel, which was never going to work out for them.

    AMD should have taken a page out of Apple's playbook. At best, they might be taken over by a Chinese company; otherwise they are doomed to irrelevance.

    • by Ecuador ( 740021 ) on Friday April 17, 2015 @08:44AM (#49492369) Homepage

      They were even outright leaders for a while, but failed to capitalise on that.

      Wow, that is the understatement of the century. AMD at one point did decide not to be a "mini Intel" and become a technology leader. Do you realize that while AMD had a far superior product for several years, Intel threw money (and threats, as was later proved) at every retailer/integrator/etc. out there to not carry AMD, and did other "interesting" things such as rigging their industry-standard compilers?
      If you have a product that is far ahead of the competition, you should be allowed to capitalize on that. If you are illegally prevented from doing so by the powerful players, there should be some sort of protection for that before it is too late. But I guess the DoJ was asleep at the wheel...
      You have to remember, the Athlon was getting a firm lead on the P3 and Intel got out the P4 as a "response". The P4, the processor now universally known as the biggest "dog" by virtually everyone (even in its final and much, much improved incarnations), eventually abandoned even by Intel to go back to a saner P3-derived architecture, was actually welcomed with laurels, both by (most of) the press and the integrators. AMD put in all this R&D effort and got nothing out of it; instead they were bleeding money for years while Intel was making money, with the current situation being a very weak AMD next to a behemoth. It is too bad for us, because the sole reason Intel CPUs are affordable is AMD - I won't remind you how much Intel charged per CPU before there was competition. The sole reason Intel CPUs are this fast (or even that their consumer products are 64-bit) is AMD. I only hope for some miracle for AMD to survive and get some competition going, otherwise there will be no-one left to keep Intel in check and consumers will pay for it...
      So, yeah, the greatest industrial robbery of all time has been largely forgotten. AMD just "failed to capitalize", they were "obsessed with being a mini Intel"...

      • by vakuona ( 788200 )

        AMD could never capitalise on their lead for long enough because Intel ultimately had more money, could spend more on R&D, and would eventually catch up and surpass them. Intel were also the leaders in process technology (AMD never caught up with them in that department) and were able to squeeze out more performance from what was a worse architecture. Ultimately, the likes of Dell, although they might have, of their own volition, used AMD, were always going to be Intel shops. AMD was always one step away from

        • by Kartu ( 1490911 )

          Are you kidding me?
          Intel was outselling AMD 4 to 1 with slower, pricier, power-hungry chips.
          Compaq refused to take AMD's chips FOR FREE.

          R&D, my ass...

        • by Ecuador ( 740021 )

          Ultimately, the likes of Dell, although they might have, of their own volition, used AMD, were always going to be Intel shops.

          So, Intel was paying Dell up to $1 billion a year essentially to not carry AMD, just for fun? Dell was not going to go AMD anyway, even though AMD's chips were so much faster, cooler, and even cheaper?
          Back in 2003-2004 we wanted to buy a few dozen servers for our lab at my university. My professor, who had gotten the grant, had offers from various companies: Dell offering Xeon-based servers and others (HP and Sun, I think?) offering Opteron-based ones. I was given remote access to a sample Dell server and a sample Opte

      • by Anonymous Coward

        While Intel did pull a whole bunch of shady crap, AMD's downfall was not (entirely) Intel's doing.

        AMD has been serially mismanaged for almost a decade now, and the blame falls squarely on the shoulders of "Business minded" execs plundering the company for bonuses and golden parachutes.

        AMD's big fuck-ups:

        Spinning off their fabs - This was done purely to jiggle and manipulate stocks so a few connected organizations could make a lot of money. Becoming abstracted from your process tech is a stupid idea when mak

        • "The core2 was introduced in 2006. - Almost a decade ago and core2 based computers are still quite damn fast today."

          Agreed. My hand-me-down i3-2100 is faster than the 3.2GHz C2D it replaced, yes, but I didn't fall off my chair. Anything more recent than a P4 will be usable for everyday tasks for most people.

          As for AMD going down, I really don't want to go back to paying a thousand dollars for a CPU...

      • Sorry, but you have some selective memory. AMD actually was only a performance leader for a very brief period of time, that being the P4 days. That was not because of anything great they did, but rather because the P4 ended up being a bad design that did not scale as Intel thought it would. Outside of that they were competitive during the P3 days, but behind other than that.

        They also had serious problems outside of any business practices from Intel. The three big ones that really screwed the

    • by serviscope_minor ( 664417 ) on Friday April 17, 2015 @10:01AM (#49492975) Journal

      They were even outright leaders for a while, but failed to capitalise on that.

      Because of Intel's illegal business practices, for which they didn't receive any criminal sanctions. All they had to do was pay $1e9 to AMD, which is far less than they profited by it.

      • by vakuona ( 788200 )

        I don't disagree that Intel had illegal business practices. But as you also point out, it was better for Intel to strong-arm its "partners" into not dealing with AMD in the long run, and they bet on AMD continuing to take them on in a game they could not win.

        If AMD, by making its own computers, had been able to get an additional (completely made up) $25 per PC sold, they might have been making a billion or so dollars extra a year, which would have been a big deal for them, and might have given them the revenue t

    • The Athlon 64 days were their peak.

      Not only were their chips faster and lower-powered, they also had 64-bit support well before Intel (granted, well before 64-bit served any useful purpose).

      Then Intel came out with the Core 2 Duo, which beat them in every category, to which AMD had no answer. Then Intel refined the design even further, to which AMD again had really nothing, and it has been that way ever since.

      They should not have become a whole-hog builder. Margins suck. They would have been a Chine

  • by Chris Katko ( 2923353 ) on Friday April 17, 2015 @08:01AM (#49492179)
    ...we need AMD. Because if AMD goes away, Intel has zero competitors in the x86/64 market. Most people here probably aren't old enough to remember that CPUs used to cost an arm and a leg in margins, and then when a bunch of hot shots like the 6502 came out, prices dropped literally overnight. How could they drop so much? Because it was nothing but margin to begin with.

    If AMD goes the way of the dodo bird, so do our cheap processors. Moreover, we'll likely lose a great deal of software freedom as what Intel says becomes law across the whole board. UEFI and TPM? Disneyland compared to what Intel could demand under the guise of "security" from every future computer.
    • Re: (Score:2, Informative)

      by CajunArson ( 465943 )

      Meh.. this meme has been copy & pasted onto Slashdot over & over again since the 90s.

      Guess what:
      1. I can tell you exactly how much Intel chips will cost if AMD is noncompetitive or goes away entirely... they'll cost exactly what they cost now because AMD is already effectively out of the game.

      People forget that Intel is not only in heavy competition with ARM, but Intel is in perpetual competition with its own parts from last year and if Intel really jacks up prices they will simply lose busi

      • People forget that Intel is not only in heavy competition with ARM, but Intel is in perpetual competition with its own parts from last year and if Intel really jacks up prices they will simply lose business from people who don't upgrade.

        Nah, you just invent some new feature and make sure marketing plants it in everybody's head that they need it (hyperthreading).

        AMD isn't some angel, it just doesn't have the opportunity to be the big dog very often. Additionally, even when AMD isn't the top dog they've

    • by Kjella ( 173770 )

      ...we need AMD. Because if AMD goes away, Intel has zero competitors in the x86/64 market.

      AMD gave up on the markets I care about in 2012, so I don't really care. What's worse is that without AMD there's really no competitor to nVidia in the high-end GPU market either.

      If AMD goes the way of the dodo bird, so do our cheap processors.

      That's what smartphones and tablets are for; you only need x86 if you're doing anything CPU-intensive, and anything CPU-intensive you shouldn't be doing on a cheap CPU in the first place.

      Moreover, we'll likely lose a great deal of software freedom as what Intel says becomes law across the whole board. UEFI and TPM?

      AMD supports all the same DRM standards as Intel.

      What used to be the "traditional" AMD has already imploded, if anything they'll exit the consumer

      • AMD GPUs are very competitive. If I were the CEO I would sell off the CPU business. Keep ATI.

        The reason AMD sucks is that they no longer have the economies of scale for chips below 28 nm, while Qualcomm and Intel are down to 22 nm and heading towards 14 nm with Skylake.

        Nvidia is stuck at 28 nm too.

        If AMD hadn't spun off GlobalFoundries and also had 14 nm, it could compete and throw Nvidia out of business too.

    • ...we need AMD. Because if AMD goes away, Intel has zero competitors in the x86/64 market.

      I used to think this too, but I'm not so sure it's totally true today. There are more CPU makers than Intel and AMD; although these are the two players in the PC/server market, that market is starting to show a decline. People are moving away from the desktop/laptop in favor of their smartphones and handheld devices, and the CPUs in those devices are usually not AMD or Intel made.

      I used to think that Intel had to keep AMD going to avoid anti-trust problems, but these days that issue is reall

    • What is your proposal, that people should purchase AMD chips as charity?

      Nobody other than Intel zealots wants to see AMD go away. However if AMD's products are not competitive for what they want, why should they buy them? Trying to argue charity buying is a non-starter and a very bad strategy.

      AMD has been really screwing up on their processors as of late. Their performance is not that good in most things and their performance per watt is even worse. So for a great many tasks, they are not a great choice. Their

    • by antdude ( 79039 )

      We need other competitors too. Remember Cyrix? :(

    • by sjames ( 1099 )

      I'm cheering for AMD for much the same reasons. I'm also hedging my bet with ARM.

  • Comment removed based on user account deletion
    • Run Windows VMs and keep adding them until the boxes are under some level of resource contention (3:1, 4:1 vCPU:pCPU). If you don't see a difference, I'd be highly curious about your workloads and configuration.
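
      A quick worked example of what those overcommit ratios mean; the host core counts here are made up for illustration:

          # vCPUs a host can hand out at a given vCPU:pCPU overcommit ratio.
          sockets, cores_per_socket = 2, 8            # hypothetical host
          pcpus = sockets * cores_per_socket
          for ratio in (3, 4):
              print(f"{ratio}:1 -> {ratio * pcpus} vCPUs on {pcpus} physical cores")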

    • If AMD performed the same as Intel, VMware wouldn't offer a 50% discount for using AMD. See my comment above about MSSQL. Like MS, VMware is not going to offer discounts unless they have to. AMD CPUs perform so poorly that VMware has to offer a discount just to make them even remotely viable. Oracle also offers a discount for using AMD. If any company loves money it's Oracle, and even they realize that performance on AMD sucks without a discount.
  • The basic 1U 4-socket machines with 64 cores and 512GB of RAM were denser than anything SeaMicro ever made.

    The "dense server" companies were trading on the myth that servers were still one CPU in a 4U box. That stopped being the case years ago. Commodity stuff is already really dense.

    • It's not all about density, though; power consumption is becoming the issue of the day. Anything to compete with the other guy and make a name in a market that really doesn't have any differences in product...

      But as another poster noted, there isn't any difference in how AMD and Intel processors operate in a data center. If the machine runs the software you want, most of us who are buying servers don't really care all that much. Today's machines are faster, smaller, and consume less power so they fit

      • So density is way overrated as a differentiator in the server market. It doesn't really matter to the bulk of the customers anyway, and people who fall for the whole blade-server thing but buy server chassis that are not totally full to start are nuts. If you cannot afford the blades now, trust me, you won't be able to get them in 2 years after they are EOL'ed by the vendor. Just buy separate servers and keep the upgrade path as simple as possible. So none of this has anything to do with the CPU vendor in

    • The SeaMicro stuff wasn't just density. It was power use, the switching fabric, and lots of other stuff. It was designed as a sort of mini-mainframe with higher I/O throughput than the dense high-compute stuff you can get in commodity hardware. Power use was in fact one of their main selling points. They were offering the same compute power at like 10% of the power by using low power (and low compute) processors stacked on a fabric that eliminated their weaknesses. At the same time their custom networking fabric

      • They were offering the same compute power at like 10% of the power by using low power (and low compute) processors stacked on a fabric that eliminated their weaknesses.

        Except that never worked, because the low-power processors didn't get great ops/watt, so even that wasn't much of an advantage, if any. They were using Atom processors, which, while low-power, were less efficient than the server processors at many of the workloads.

        The core market of this was the datacenter virtual machine market. Their servers

        • I'm not saying it was perfect. Back when they were Intel-only they supposedly had some secret sauce to shut off the parts of the Atom CPU that weren't needed (lots of the northbridge). The expectation was that when Intel got Rangeley and Avoton out the door they'd have something that blew everything else out of the water. IMO they would have been right. AMD purchasing them and restricting their products to AMD chips, particularly with AMD's refreshed low-IPC CPU core, destroyed their game plan

          Seamicro wa

          • Seamicro was well positioned and had some neat tech, I think they would have been moderately successful in the data center had AMD not bought them.

            Maybe. The fabric was interesting, but it's a bit meh. Don't forget that commodity 1U servers have full ILM and properly implemented WOL, so you can power them up and down remotely to scale with demand quite cleanly. While you don't have the same degree of fine-grained control, it's a decent enough approximation.

            I mean, it's possible. Back when seamicro was a thi

  • It's curious that they're having money problems, since as I understand it they provide the CPUs for both the Xbox One and the PS4. So that combined with the PC market is still not enough, huh?
    • by 0123456 ( 636235 )

      It's curious that they're having money problems, since as I understand it they provide the CPUs for both the Xbox One and the PS4. So that combined with the PC market is still not enough, huh?

      Margins will be tiny. They probably need to sell a hundred consoles to make as much profit as Intel make from a single server CPU.
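
        The arithmetic behind that guess, with purely illustrative margins (nothing here is from AMD's or Intel's actual books):

            # Hypothetical per-unit profits in USD: a console APU vs a server CPU.
            console_profit = 10
            server_cpu_profit = 1000
            print(server_cpu_profit / console_profit, "consoles per server CPU")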

  • I suspect Intel will find a way to keep AMD alive using pricing games, to avoid both anti-trust accusations and the appearance that x86 CPUs are a dying business. Intel and AMD need each other whether they like it or not.
