AMD Reveals Roadmap For ARM and X86 SoCs
DeviceGuru writes "On the eve of the Intel Developer Forum in San Francisco, AMD unveiled what it calls an ambidextrous embedded roadmap, based on a series of new system-on-chip (SoC) and accelerated processing unit (APU) products built from both ARM and x86 CPU cores. Planned for launch in 2014 are an ARM Cortex A57-based 'Hierofalcon' SoC, a 'Bald Eagle' APU using a new 'Steamroller' x86 CPU, a multi-core x86 'Steppe Eagle' APU, and an 'Adelaar' discrete Embedded Radeon GPU. 'There are different customer needs in different segments of this market, from low-power to high-performance, Linux to Windows, and x86 to ARM,' commented Arun Iyengar, VP and general manager of the AMD Embedded Solutions division." Update: 09/10 16:54 GMT by T : As Slash DataCenter notes, this roadmap includes an SoC aimed specifically at datacenters.
Hopefully they talked to Linus first (Score:5, Funny)
Otherwise they might be next to die in a fire.
Good: APUs. Not so good: Server ARM (Score:5, Insightful)
The Kaveri-based APUs in servers are certainly not going to be great for every workload, but for servers that can take advantage of GPU compute, they give AMD a unique advantage in a competitive server environment.
Those ARM parts on the other hand have proven one thing: Just because ARM (and more importantly, Qualcomm) make good chips for smartphones doesn't mean that ARM is magic and can avoid physics.
The 8 core Cortex A57 parts on AMD's roadmap for late 2014 have a 50% higher power envelope than the high-end 8-core Avoton parts that Intel has on sale *this year* (30 watts vs. 20 watts). By the time they launch, Intel will either have launched or be on the verge of launching 14nm microserver parts. These things are a nice prototype, and AMD is easily the best vendor for ARM servers since it has experience in the server world, but ARM ain't about to take over the server room at this pace.
Re:Good: APUs. Not so good: Server ARM (Score:4, Informative)
So what if the power envelope is larger if they spend more time in a lower power state? What exactly is included in the SoC that Avoton does not have? Intel loves to report only CPU watts and ignore the rest. The first Atoms were almost hilarious in that the northbridge/southbridge drew more power than the CPU, yet Intel only reported CPU power in its advertising.
Re: (Score:2)
ARM fanboy quote from 2013: "So what if the [ARM server part] power envelope is larger if they spend more time in a lower power state?"
Intel fanboy quote from 2008: "So what if the [Intel Atom parts] power envelope is larger if they spend more time in a lower power state?"
Watch the wheel o' time turn & turn.
Re: (Score:2)
I am not a fanboy for either.
I want to see them compete.
The 2008 Intel fanboy was a liar. His Atom was low power, but the rest of what an SoC integrates was not on the chip and drew far more power, so the system never got to a lower power state. Its idle usage, motherboard included, was never low enough. I had one.
Re: (Score:3)
Actually, Atom has had extremely competitive idle power draw going back several generations... the trick was that the idle power draw was best on the embedded platforms that were not widely released prior to Clovertrail and *not* on the desktop platforms where the separate chipset alone used more power than the CPU.
That's not my point, however. At best I'd expect those A57 parts to have performance parity with Avoton under load... and Intel has already solved the idle power draw issues, especially when it c
Re: (Score:2)
I agree.
I just meant neither number is trustworthy. Both will lie about it, or say it in a way that looks better.
Re:Good: APUs. Not so good: Server ARM (Score:4, Informative)
I actually have one of those Atoms. It's on a D945GCLF2 MoBo. The CPU is passively cooled, but there's a more-or-less-standard 40mm CPU fan on the Northbridge.
Despite being dual-core, the performance is not very good. I have a similarly-clocked single-core AMD Athlon II that runs circles around it. The Athlon II machine uses less power in toto (monitor included) than the Atom desktop alone (monitor excluded).
Re: (Score:2)
At least your Atom has Intel graphics and not that sucky PowerVR with no drivers, so I would like it over some alternatives for a 40%-decent desktop ('cos I don't want to say half-decent)
I wonder how a desktop with a Pentium 3 733 compares (with ATI Rage Pro and 440BX, or with Intel graphics)
Re: (Score:2)
Funny you should say that; take a look at the following and see how many use PowerVR graphics. http://en.wikipedia.org/wiki/Atom_(system_on_chip) [wikipedia.org]
For quite a while there were Atom-powered Linux machines that could not be updated and keep graphics working, because of those bastards.
Re: (Score:2)
Yeah, I looked at replacing our netbook recently, but all the current generation of Atom netbooks have unsupported PowerVR chipsets so they can't run Linux.
Re: (Score:3)
Get an AMD one. Their CPU part is a bit crappy, but the GPU is top notch. Many games are actually playable on it because of that.
Re: (Score:2)
They can: a buddy has a dual-core Atom and Xubuntu 13.04. You get 2D display at the correct resolution, and even software OpenGL with llvmpipe (Google Earth launches and runs rather than giving you a black screen or an error message).
But it's all raw unaccelerated X11: in VLC you have to choose the X11 output (xv glitches if the video is scaled). Youtube video is choppy but somewhat usable, at least for 360p.
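For the curious, a minimal sketch of the setup described above: forcing Mesa's llvmpipe software rasterizer and VLC's plain X11 output. This assumes a Mesa build with llvmpipe and a VLC version that still ships the legacy X11 output module (the module name varies by VLC version), so treat it as illustrative rather than exact.

```shell
# Force software OpenGL rendering through Mesa's llvmpipe rasterizer.
# LIBGL_ALWAYS_SOFTWARE is a standard Mesa environment variable.
export LIBGL_ALWAYS_SOFTWARE=1

# Verify which renderer is in use (should report llvmpipe):
glxinfo | grep "OpenGL renderer"

# Tell VLC to use plain, unaccelerated X11 output instead of XVideo,
# since xv reportedly glitches when the video is scaled.
# (On newer VLC builds the module may be named xcb_x11 instead of x11.)
vlc --vout x11 video.mp4
```

The trade-off is exactly as the poster describes: everything renders, but all scaling and colorspace conversion happens on the CPU, so high-resolution video gets choppy.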
Re: (Score:2)
I don't really need the graphics capability beyond enough to run a GUI. I use that machine to DJ. As long as it can keep the waveform displays on screen reasonably up to date, it's good enough.
Re: (Score:2)
The Athlon II machine uses less power in toto (monitor included) than the Atom desktop alone (monitor excluded).
That's because the standard Intel north bridge sucks, and takes about 4x as much power as the CPU (which is why it needs the fan).
My dual-core Ion system with the same CPU and a more powerful north bridge takes about 25W from the wall when playing HD video.
Re: (Score:2)
... but ARM ain't about to take over the server room at this pace.
For servers that run distributions with a good-quality arm64 port, the transition could be as easy as the transition from the i386 to the amd64 architecture. Linaro, Debian and Ubuntu seem to be working to make this a reality as soon as the hardware hits the market (and maybe even before, in Linaro's case). I don't know about Fedora, but they will certainly not miss the opportunity.
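The Debian-family mechanics behind such a transition can be sketched with standard multiarch commands. Note that `arm64` as an archive architecture is an assumption here, since the port did not yet exist in the stable archive at the time this was written:

```shell
# On an existing Debian/Ubuntu system, declare a foreign architecture
# so packages built for it can be installed side by side (multiarch).
sudo dpkg --add-architecture arm64
sudo apt-get update

# Individual packages can then be pulled per-architecture, e.g. the
# C library for the foreign arch:
sudo apt-get install libc6:arm64

# List which architectures the package manager currently knows about:
dpkg --print-architecture
dpkg --print-foreign-architectures
```

This is the same mechanism that smoothed the i386-to-amd64 transition: the kernel and bootloader are architecture-specific, but the packaging layer above them is largely shared.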
Re: (Score:2)
Yeah.. as a raspberry pi early adopter, lemme tell you something: You're full of it. It's irrelevant if some flavor of Linux has a particular version targeted at particular ARM platforms because AMD's ARM platform will have its own requirements that will require quite a bit of software work to ensure that a full, production-quality software stack can get running on it from day zero.
Trust me, if everything ARM + Linux was completely perfect, then Torvalds wouldn't be on his high-horse about ARM SoCs. ARM's
Re: (Score:2)
Don't compare the Raspberry Pi's obsolete armv6k architecture, which is a burden to support from a distribution's point of view, to the new arm64 architecture, which will without any doubt gain very wide support in the future. The machine-specific code will be limited to the kernel. Linus's expectations for the arm* architectures will ensure that this is integrated far better than it used to be in the past.
ARM was really not designed from the start to be a coherent ecosystem targeted to run a standar
Re: (Score:1)
It's all academic anyway, because the micro-server market is tiny at the moment. Only 100,000 micro servers were sold last year - that's about 1% of the entire server market.
Re: (Score:2)
Humm... Once upon a time Intel was making a very similar claim about the AMD64 architecture, asserting that their own 64-bit architecture would rule the market in a few years anyway. Today I can't even remember the name of that Intel architecture, but all my PCs and servers run AMD64 distributions.
Re: (Score:1)
yes, but was once said to be a big thing and now it isn't, therefore neither will
Re: (Score:1)
grr, slashdot strips out anything in angle brackets
Re: (Score:3)
Slight difference: the AMD parts embed 10-gigabit Ethernet controllers (four or two, I dunno), while the Intel part embeds four gigabit ones. That may turn the power efficiency around, if you needed such fast controllers for networking / I/O.
AMD may be open to doing custom SoCs for some big customers too, with other specialized units.
Seems AMD would go in places Avoton won't, like high-speed network appliances, while still being usable for web hosting and the like. Avoton is just easier for low-cost VPS re
Radion? (Score:2)
AMD is dying (Score:2, Interesting)
So basically, AMD has given up going for the top of the line and has decided to focus on commodity hardware, because it just can't upgrade its fab plants, because earlier management decided profit was more important than investment... and now they're going to lose out on both. And this is just a consolidation move in that direction... downward.
Enjoy your slow fade to obscurity, AMD. If I could just open up a chasm and drop your fab plants and senior management into the center of the Earth, I would give serious c
Re: (Score:2)
How much top of the line stuff is sold vs the rest of it?
CPUs for most purposes, outside of mobile, are good enough. I think they already sold off their fabs anyway.
Re: (Score:3)
Large-scale computing in the enterprise is not the top end. They buy lots of midrange Xeons and Opterons. I buy these all the time. I am not your typical consumer. The top-end CPUs don't even come as Xeons; Xeons lag behind.
Competing for top performance keeps you relevant (Score:2)
If you're just making mid-tier or lower gear, releasing months after everyone else with mediocre specs, then you're going to fade into obscurity. This means you get less FREE ADVERTISING, because everyone ignores your press releases, so you are stuck charging lower prices for devices.
Just look at a company like VIA Technologies: they used to be relevant, producing competitive chipsets for Intel and AMD. But they were more complacent in their other "visible" product categories (x86 CPUs, GPUs) so they mad
Re: (Score:2)
Intel beat AMD decades ago when they decided to concentrate on their production capability.
The CPUs have always been about the same, and most people don't care about the minor differences. Intel was able to manufacture enough CPUs to meet demand from all the top customers, who were afraid to sign with AMD because of their production issues.
You can announce and hype products for months like Nvidia does, but revenue is made by selling real manufactured products to customers writing checks. And you need to manu
Too bad Intel sucks in some ways (Score:5, Interesting)
Like forcing you to buy top-of-the-line $300+ CPUs just to get more than x16 PCIe lanes without switches.
No Thunderbolt add-in cards; the demo used a Mac Pro with an add-in card.
Poor on-board video.
Trying to kill sockets, which will drive up prices for OEMs and limit choice.
If AMD dies, Intel's prices will go up, and they already do all kinds of stuff to make you pay.
Re:Too bad Intel sucks in some ways (Score:4, Informative)
Mod parent up.
I just sold my almost-new i5-4670K to replace it with an A10-6800K. With the i5, it's simply impossible to get a working machine using the new Debian Wheezy: no audio, no accelerated 3D, no fluid video, screen instability on the HDMI output, and a high price. On the contrary, the A10 works perfectly well: audio, accelerated 3D, glitch-free 1080p full-screen video, rock-stable HDMI output, and half the i5's price.
Re: (Score:2)
You're doing it wrong.
Signed, Arch Linux user who has multiple Haswell machines running perfectly... oh and with open source drivers too.
Re: (Score:2)
Still twice the price compared to the A10, even using Arch Linux...
Re: (Score:2)
If all you care about is the IGP and don't care about power consumption and don't mind using closed-source drivers if you need real 3D performance, then the A10 is nice.
If, like me, you care about CPU, then the very-high end Haswell parts are about twice the price but deliver more than twice the performance in a lower power envelope. You can also buy Haswell parts for lower prices that are still comfortably ahead of the A10 at any CPU-bound load and have even lower power envelopes.
Oh and if the GPU is real
Re: (Score:2)
Why would I pay more for a CPU plus an additional GPU card to get what I can get with an APU for half the price of the CPU? Oh, and that GPU card will probably need a closed-source driver anyway.
Re: (Score:2)
First, I never insulted you. I just shared my experience. I am not a liar. Two weeks ago my five-year-old main machine's motherboard failed and I needed a new one quickly to finish my work. My nearest PC shop did not have many parts in stock: some underpowered AMD processors and only a few mid-to-high-end Intel processors. This is why I ended up with the i5-4670K. I used it for a few days with the configuration of my old machine, as I had no time left. So I worked without sound, video, or 3D, and with some display glit
Re: (Score:1)
Another reason why it's morally superior to buy Intel if you care about open source instead of just being some shill who acts like he is a God because he made it halfway through an Ubuntu install once: AMD has dumped some out of date documentation on the internet for third parties to do their Linux driver development for free... Intel *pays* people to develop the entire Linux graphics stack.. and yes, that includes pretty much the entire infrastructure that makes it possible for any AMD gpu to run in Linux. If you want to be such a purist do this: Take out all the code that bad-old Intel wrote and see how well your amazing AMD graphics work on Linux, now do the reverse with AMD & Intel: guess what still runs fine because AMD doesn't do squat for the Linux graphics stack?
This isn't true. While AMD's main focus is on their proprietary driver, they do pay developers to work on an open-source driver. Important recent driver work includes enabling UVD, better power management, and continued work on enabling the 7000+ series. Intel does seem to get more done, but AMD is working with the community too.
Also, Intel still refuses to use the Gallium3D architecture. Gallium is supposed to centralise various pieces of writing a graphics driver, making less work for everyone.
Re: (Score:2)
I just sold my almost-new i5-4670K to replace it with an A10-6800K. With the i5, it's simply impossible to get a working machine using the new Debian Wheezy: no audio, no accelerated 3D, no fluid video, screen instability on the HDMI output, and a high price. On the contrary, the A10 works perfectly well: audio, accelerated 3D, glitch-free 1080p full-screen video, rock-stable HDMI output, and half the i5's price.
So let me get this straight: because you can't get a working computer running with your operating system of choice, that's the fault of the hardware? This seems like a case of misplaced blame, especially when another operating system handles these things just fine. Now, if you want to argue that the real problem here is that the necessary documentation of the hardware isn't available to the developers of your operating system of choice, or is restricted by patents, etc., I will grant that you m
Re: (Score:2)
Ok, the Debian Wheezy argument is a bit off topic and beside my point.
But still, compared to the A10-6800K, the i5-4670K is no nirvana:
* The HDMI output was not stable.
* The GPU performance was way inferior.
* The price was twice as high.
Re: (Score:2)
For me the price was CHF 252 for the i5-4670K and CHF 162 for the A10-6800K. Ok, it's not twice the price; it's 55% more, to be exact. The DDR3 RAM was the same and the motherboard prices were nearly identical.
I really did own an i5-4670K for two weeks. I sold it this weekend to a young boy who will use it for gaming with an additional HD 8970 card.
I have been running Linux since 1995 and building embedded Linux systems since 1999. I really prefer to run Debian stable on my main machine because I like the stability it granted to
Re: (Score:2)
Are you aware that Ubuntu is a Debian variant? They use exactly the same package management: apt and dpkg. I tried Debian Sid without much success on the i5-4670K, so backports would not have solved the problem anyway.
You can blame my decision if you like; you can blame Debian if you like. I think that expecting a new machine to work without trouble with the latest stable revision of one of the leading distributions is not so insane. I personally blame Intel for their overpriced processors with l
Re: (Score:1)
I had a bad experience with Intel, not with you, so why are you so emotional? My expectations were perfectly realistic: the A10-6800K meets the goal very well for a lower price. It's that simple.
Your English needs work, bud. That said... if low price is your only concern, AMD might be fine. That is, after all, the market they're going for. But if you want performance, especially for games, or for running VMs, etc., forget it. And as far as this "HDMI not stable, blah blah blah"... that's not a hardware problem, that's a software problem. Stop blaming the platform for shitty drivers. It works fine on Windows.
If you want to say Intel isn't providing the specs or reference implementations necessary for
Re: (Score:2)
English is a difficult task for me. Would you like to continue in French?
My first concern was a new stable machine, not the price. If price were a big concern, I would not have bought the i5-4670K in the first place. That said, I observe that the lower-priced A10-6800K does the job just fine. Actually, there is no processor in Intel's inventory that can match the GPU performance of the top AMD processors (and this is also true for discrete cards, as Intel doesn't make any). That might change in the fu
Re: (Score:2)
English is a difficult task for me. Would you like to continue in French?
If you feel more comfortable with it, sure. But alas, English is the apparent default language of the internet, at least everywhere but China... which is weird because more people speak Mandarin than English, but I digress...
My first concern was a new stable machine, not the price. If price were a big concern, I would not have bought the i5-4670K in the first place.
I suppose after all these exchanges, now is a bad time to point out that the 'K' at the end signifies it is meant for overclocking, and I'm wondering if maybe you tried this and the system became unstable, and so you concluded that Intel was shit at processors... rather than the rather
Re: (Score:2)
So you have understood exactly why I make the effort to speak to you in English.
Since I had to quickly replace a defective machine, I went to the nearest PC shop, and they only had the "K" version in stock. The non-"K" would have shown exactly the same result, as I didn't even try to overclock the processor. I only used the BIOS's default conservative configuration. I did not overclock the A10-6800K either. So this point is not a concern for this exchange.
Note: I never overclocked, but I sometimes underclock or
Re: (Score:2)
So you bought brand new hardware, and expected it to work with an OS/drivers that entered feature freeze almost a year ago, and which was released slightly before the hardware was? I'm sorry, but that is no one's fault but your own. Haswell works fine in distros that were released after the hardware was. Even Debian Testing has Haswell support (as of a week or so ago).
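The usual Debian answer to "new hardware on a stable release" is to pull a newer kernel from the backports repository rather than switch distributions. A hedged sketch, using standard Debian conventions (the mirror URL and metapackage name below are typical examples, not verified against any specific setup):

```shell
# Add the wheezy-backports repository, which carries newer kernels
# rebuilt for the stable release:
echo "deb http://http.debian.net/debian wheezy-backports main" | \
    sudo tee /etc/apt/sources.list.d/backports.list

sudo apt-get update

# Backports are never installed by default; they must be requested
# explicitly with -t. This pulls the newest backported kernel:
sudo apt-get -t wheezy-backports install linux-image-amd64
```

Whether this would have fixed the HDMI and 3D issues in question depends on the graphics stack (kernel, Mesa, X driver) all being new enough, not the kernel alone.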
Re: (Score:2)
The A10-6800K was released the same month as the i5-4670K, June 2013, so the two are equally brand-new hardware.
I tested Debian Sid on the i5-4670K last week without success. I can't repeat the experience this week, as I have sold it.
And I doubt that the distro was the cause of the HDMI instability.
Re: (Score:2)
Actually the A10-6800K is a re-release of the 5800K, but with more complex power management and new sensors bolted onto the chip; the motherboard/chipset are the same too, and as far as I can tell the chipset is the same as with former APUs on socket FM1. The GPU has the same architecture as the Radeon 6970, released in December 2010.
Re: (Score:2)
AMD does not own a fab anymore. It is just another priority customer of GlobalFoundries. It is sad that they are on the path to vanishing (if they cannot reinvent themselves in another market). They cannot compete forever with a company 50 times bigger than themselves.
If it wasn't for ARM, we would now have no option other than Intel. I think every attempt to create competition, even something as futile as Windows RT (but open for native programming), is good for consumers.
What about console architectures in PCs? (Score:2)
Re: (Score:2)
Was wondering that myself. I would have thought "Bald Eagle" would have been a likely candidate, but it looks like the roadmap:
* Hierofalcon CPU SoC, Q2 2014
* Bald Eagle APU/CPU, Q1/Q2 2014 - the next-generation high-performance x86-based embedded (???) processor
* Steppe Eagle APU SoC, Q1/Q2 2014
... doesn't include any mention.
-- :-/
With all these "bird" names does that mean AMD has flown the coop?
FFS (Score:2)
Re: (Score:3)
Athlon: Medium Power, medium price
A-Series: APU with built in graphics
FX: High end, lots of cores, high price
They're adding an ARM processor; if you can't figure out four tiers of processors, this stuff isn't for you.
Re: (Score:2)
Re: (Score:2)
Dear AMD (Score:3)
AMD, please please please offer a socketed version of these chips and it would be even better if you offered an ARM processor only socketed chip that can plug into a full motherboard. I really want a full and snappy and upgradable PC in a small(er) form factor that does not need to crank up a fan to "OMG! I THINK I'M MELTING!"-speed because the CPU is running under a full load. If you insist on making it an APU, I can live with that. x86 is dying fast and Windows 8 runs on ARM which makes it the perfect opportunity to change and the time for change is now. Please AMD, do me this one favor.
Love,
A loyal customer
Hardware encryption (Score:1)