AMD Says It's 'Ambidextrous,' Hints It May Offer ARM Chips
J. Dzhugashvili writes "Today at its Financial Analyst Day, AMD made statements that strongly suggest it plans to offer ARM-based chips alongside its x86 CPUs and APUs. According to coverage of the event, top executives including CEO Rory Read talked up an 'ambidextrous' approach to instruction-set architectures. One executive went even further: 'She said AMD will not be "religious" about architectures and touted AMD's "flexibility" as one of its key strategic advantages for the future.' The roadmaps the execs showed focused on x86 offerings, but it seems AMD is overtly setting the stage for a collaboration with ARM."
let's hope that... (Score:1, Interesting)
this means less Intel in the market and more AMD!!!!
though seriously, how good is the ARM architecture today? I haven't tried it yet; does it provide comparable performance to an Intel processor with a similar price tag?
Re:let's hope that... (Score:5, Informative)
Re:let's hope that... (Score:5, Interesting)
It's also worth noting that ARM was never about performance until the fairly recent smartphone (mobile computing) surge. And even today, performance takes a back seat to power consumption, and it is here that ARM has always led the way. ARM vs Intel: ARM provides a better price, better power consumption, and very competitive performance, albeit in second place. But given the market ARM is primarily focused on, ARM easily scores the win, in spite of Intel's best efforts.
For those doing more traditional embedded development, Intel's offerings are likely the front runners. For those participating in the mobile computing segment, ARM is by far the clear winner.
Re: (Score:3)
Seems some are working on bringing ARM into the server rack, and we can see the reason when we read about the kinds of power and cooling issues there are around some of the larger server farms.
Re: (Score:3)
I never understood why file servers didn't use low-power processors. Recently we've seen more and more ARM NAS devices, but I figured FTP servers and the like would use these "lower end" processors, simply because they only need to perform minimal computation to validate users and serve files.
Re: (Score:2)
ARM NAS boxes are a nightmare. Slow as all hell. That's not the kind of performance you want from your FTP server. And FTP servers have generally been replaced by HTTP servers, plus a lot of dynamic pages which use up lots of CPU time. But even if that weren't the case, it's only on Windows that there's a drive to single-task. On any Unix se
Re: (Score:2, Informative)
It's also worth noting that ARM was never about performance until the fairly recent smartphone (mobile computing) surge. And even today, performance takes a back seat to power consumption.
It was a long time ago [wikipedia.org], but not "never", when ARM was about performance and running circles around the 80286 and 68000 CPUs.
Comment removed (Score:5, Insightful)
Re:let's hope that... (Score:5, Insightful)
"The problem with ARM is there are literally millions of x86 programs that have become an integral part of peoples lives"
Not really. There are many ARM programs that have become an integral part of people's lives. Android and iOS are two big examples, not to mention the apps that run on them.
Software is not as locked to an ISA as it once was. Microsoft and Apple have shown that with the move of Windows to ARM and the move of OS X to x86.
Applications are not written in assembly anymore; they are written in C++ or another high-level language. Take your example of Photoshop: moving Photoshop from Windows to Windows on ARM is probably a much simpler project than maintaining both a Windows and an OS X version. The same is true of Office.
I do think that AMD's Fusion is interesting, but your reasoning on why people will keep using x86 is not valid. They will only keep using x86 for as long as it is the best solution. IMHO x86 is in danger of being the next PDP-11 or VAX unless it can scale down to mobile, and fast.
Re: (Score:2)
I'll just stop you there.
The problem with ARM is that it's not x86.
Yet Linux is x86 and it's not making any gains.
I think you might be trying to say that anything that's not Windows on x86 is going to be a failure?
I wonder, do you have the same attitude to Windows 8's much touted ARM versi
Re: (Score:3)
I can't say I've seen the driver issues you talk about.
Things tend (for me) to either work in Linux, because the driver is supplied as a kernel module, or there's just no driver. In fact, for me, it's now considerably easier than Windows. You don't even have to think about installing or rolling back drivers, because they're either just there already, or not available.
That's just me though, and what you're used to is a large part of it. I certainly do fall into the 1% here though.
Was 2004 the last time you t
Re:let's hope that... (Score:4, Interesting)
I told you already: Debian is the best at updates, even across multiple versions. Also, openSUSE and Fedora are where experiments happen, and I doubt an "experienced Linux admin" would be using them for something really important, unless she has the requisite depth to deal with a little breakage during an upgrade.
And posting again and again and again the same rant about an ABI isn't going to change the FACT that a stable ABI for drivers would make things worse for everyone. You keep trying to make comparisons to Windows in areas where it makes no sense at all to compare the two.
What's it going to take for you to realize that nobody cares if Linux gains market share on the desktop? You're making arguments (for years now) based on the assumption that market share is a goal when in fact no one has such a goal.
Your perspective on this is so wrong and you've been corrected so many times by so many people I can't help but wonder if you've got some kind of learning disability or OCD or something. Seriously, not trying to be insulting at all. You've apparently started down this road in 2004 with some wrong assumptions that you just can't let go of, and that's really why you're so frustrated now.
Re: (Score:2)
"Finally, Linux is ready for the desktop" is almost always said in a sarcastic fashion. It's a joke, and has been for quite some time. It's great for workstations, but it's been a while since a significant number of people believed that Linux could overtake even the Mac for desktop marketshare.
Personally, I use Linux and Solaris for servers and workstations at work, Linux for my home server, and Windows for my desktop and my laptop. If I didn't want to play games, then I would consider replacing my Windo
Re: (Score:3)
I agree w/ this. ARM is an overcrowded market already, and why would someone prefer AMD to an established vendor who's been making ARM chips for years, such as TI, Qualcomm, nVidia, or Freescale? Just about every other big name in the semiconductor industry is in this market. Why would anyone prefer AMD to those guys? AMD did a good thing when it first went to x64, and it can make that a more RISCy CPU over time: when memory is never less than 4GB and there are 64-bit versions of most apps, then it can start dropping 32-bit
Re: (Score:2)
Really? Millions? You know, using hyperbole is all well and good, but ou
Re: (Score:2)
Heh... Clock for clock, they're neck and neck right at the moment with the A9. Remind yourself that most A9 devices are clocked at 1.0-1.2 GHz where the Atoms are all clocked at 1.6-1.8 GHz. There's your speed difference. Power consumption... heh... they're not comparable right now. ARM consumes quite a bit less at clocks comparable to the Atoms'. And the A15 changes the name of the game. It kind of pastes the current and the claimed next generation of Atoms in overall performance- and it still keeps
Re: (Score:2)
A few months ago, I compared an ARM Cortex-A8 and an Atom N450 running the same Linux application using OpenCV to process images. I was surprised that the speed was about the same if I reduced the processing to a single thread and set the two processors to the same clock frequency. This shows that the two architectures, while very different, yield about the same processing speed. The Atom N450 still has dual cores and higher clock capability, which explains a higher power drain if fully used.
Probably, ARM a
Re: (Score:3)
First off, Intel was selling ARM chips up until a few years ago. They snagged the famous "StrongArm" series off of DEC and rebranded it "XScale".
Second, ARM only recently established itself as THE x86 competitor. Go look up all the RISC architectures out there which were competing for dominance. If you needed h
Re: (Score:2)
Antitrust legislation (and rightly so).
Re: (Score:2)
I am not an expert, but from what I hear ARM has much more speed per dollar, though ARM can't match x86 in parallelism.
Re: (Score:2)
this means less Intel in the market and more AMD!!!!
though seriously, how good is the ARM architecture today? I haven't tried it yet; does it provide comparable performance to an Intel processor with a similar price tag?
The appeal of ARM is not measured in performance/$, it's about flipflops/wigwam.
Re:let's hope that... (Score:5, Informative)
The price tag is directly comparable, because ARM doesn't make processors, they sell licenses to designs. The only relevant metric is really performance at a given power point.
The closest competitor is Intel's Atom chips. At comparable power points, the current ARM chips seem to substantially outperform Atom chips, and the ARM chips scale far lower than Intel's do. It becomes a bit murkier at higher power levels, since until recently nobody was really making ARM chips that high, but we'll see a lot more competition in this field in the future with the ARM Cortex A15, which is intended to be a lot more scalable. The current design is planned to go from 1.0GHz single-core up to 2.5GHz eight-core, depending on what the integrator wants. On top of that, they've got the new Cortex A7, designed as an ultra-low-power chip with a much simpler architecture that's still ISA-compatible with the A15. The intention is actually to put an A7 and A15 in the same SoC, so that the SoC can entirely turn off the A15 cores when only low performance is needed (like playing audio or video, since that's done almost entirely on a DSP). This is similar to what nVidia did with the Tegra 3, just taken even farther.
Re: (Score:2)
That was supposed to be "the price tag ISN'T directly comparable"
Re:let's hope that... (Score:4, Informative)
Much of this is a change of focus... Instead of beefy desktop CPUs running a bloated OS, the focus is shifting to portable devices.
Basically, this is "We're hanging in there in the desktop/laptop market, but rather than hang on to our piece of a shrinking pie, we want to get in on the pie that's getting bigger".
ARM is superior in low-power applications. Its highest-end CPUs maybe match the Intel Atom, but often have far more peripherals (such as a fairly decent GPU and 1080p multi-format video decoding, all on a tiny chip about the size of your thumbnail. Seriously, I can almost completely cover an OMAP4 with my thumb.)
Re: (Score:2)
this means less Intel in the market and more AMD!!!!
though seriously, how good is the ARM architecture today? I haven't tried it yet; does it provide comparable performance to an Intel processor with a similar price tag?
To answer you directly, no. Not even close. I've read a few articles where folks are hopeful AMD could try to change that. Time will tell, I suppose.
PowerPC (Score:2, Funny)
Apparently they are bringing back the PowerPC for the new Amiga.
Re:PowerPC (Score:4, Informative)
The PPC used in the AmigaOne X1000 is a PA Semi PA6T: not very fast, designed as a low-power chip, and long dead. Apple bought the company a few years ago, and I'm pretty sure new PA6Ts are not being made. I suppose that speaks volumes about how many X1000s they reasonably expect to sell...
RAD6000 / RSC / POWER1 (Score:4, Insightful)
RSC(POWER1) is the most popular CPU architecture on Mars, and possibly in the solar system outside of Earth.
Could we have a hybrid? (Score:4, Interesting)
So, when you switch to a high-requirement program (gaming, encoding, VS, etc.), the x86 cores turn on like a coprocessor and the work is handed to them.
The ARM cores handle the UI and other stuff.
Re:Could we have a hybrid? (Score:4, Informative)
That's tough enough to do when all the processors use the same instruction set, but if the system has processors with different instruction sets, it makes it much harder to have the OS/system switch from a lower powered mode where it's running on the ARM processors to a high performance mode where it's running on the x86 processors. It's not impossible, it's just very complicated and I don't see companies lining up to do the work to implement something like that.
Re: (Score:3)
We manage to do it for graphics in laptops (like Nvidia Optimus, which shifts to the dedicated GPU when required, and the integrated one otherwise)
That is for just one app, with one bit of specialized code that runs better on the GPU. And it's there to do just one thing (arithmetic that the GPU is good at). Finding which operations work most efficiently on ARM vs x86 would be a whole project in itself.
You would basically need to convince Microsoft (or whoever is the prevalent OS vendor in this fantasy) along with ALL of their partners, to switch to ARM as the primary architecture, and THEN convince them to include additional code types if their apps want t
Re: (Score:2)
I'll put my $1M in on this one.
What you need is the ARM core providing the hypervisor/UEFI/BIOS access, with the x86 cores being VMs. You then get the best of both worlds and can easily ensure that the best chip handles the appropriate load. Audio and video get handled by the ARM core and its DSPs, while the x86 cores handle all of the x86-based software.
Re: (Score:2)
Sorta like this [cnet.com]?
It's not currently available though, and I'm not sure how long it was really available for...
Re: (Score:3)
But which of Microsoft's divergent, self-serving rules regarding Secure Boot apply to a hybrid x86_64/ARM system?
Re: (Score:1)
Ooh, someone's hurt because I brought up what would be an entirely valid question in the environment the GP quoted, and it put MS in a bad light!
Re: (Score:3)
I recall that this was tried once in the 90s by Apple: they had a special accelerator card w/ an AMD CPU on it, which would plug into a PCI slot on the motherboard. So one could run native PPC apps on the Mac, but if one needed to run any Windows apps, they could simply be run on the AMD. (I think it was a K5 or something; don't think the Athlons were out by then.) Of course, today Apple uses x86 itself, but any other workstation maker could use something similar.
Dunno that it would work for tablet
sub-45nm ARM? (Score:3)
Wondering if a big state-of-the-art chip fab like AMD getting into ARM processors might make sub-45nm ARM processors a possibility? AFAIK, only x86 chips are made like this just now. Could lead to fantastic performance-per-watt chips coming off the line.
Ambidextrous? (Score:5, Funny)
Does that mean it's using two ARMs at once?
(duck)
Re: (Score:2)
Wouldn't that be ARMbidextrous?
Re: (Score:2)
Amdibextrous?
Re: (Score:2)
Yes, but they cost an arm and a leg.
Re: (Score:2)
It means they mix big-endian and little-endian in the same architecture.
As a complete NOOB on the subject... (Score:3)
... would it be possible (or, I guess more importantly, worthwhile) to put x86 cores WITH ARM cores on a single chip?
In addition to offering dual boot capabilities, it might be useful to run "Virtual" (or sort of virtual) machines at full speed. I've often thought it would be nice to run some of the thousands(!) of cellphone Apps that I have on my laptop. Although it might be tricky to implement multi-touch correctly, still I'd think there might be some utility.
Or maybe all CPUs today are very generalized RISCy architectures with everything taken care of in microcode (or maybe nowadays it's nanocode)? That would make it (comparatively) really easy to do, right?
Re: (Score:3, Informative)
Or maybe all CPUs today are very generalized RISCy architectures with everything taken care of in microcode (or maybe nowadays it's nanocode)? That would make it (comparatively) really easy to do, right?
Sounds like you are reinventing the Crusoe processor [wikipedia.org].
Re: (Score:2)
It'd be far easier to do Apple-style "universal binaries" [wikipedia.org] (bundles that contain executables for more than one architecture) than it is to create this kind of hybrid hardware. Apple could already create iOS/OS X universal binaries in Xcode if they wanted to, since it already compiles for both x86 and ARM for the "emulator" and device respectively. The biggest hurdle is the fact that the main control interface (touchscreen) is missing on the desktop.
Re: (Score:2)
What the hell are you talking about? Android is a slightly modified Java-based platform. 95% of the Android apps out there are completely CPU-agnostic, because that's just the default state of affairs.
The exceptions are the CPU-intensive multimedia apps... Adobe Flash, video players, and some games. Actually Firefox is on the list as well for no good reason.
You can't get Flash for any other
ambidextrous (Score:1)
Re: (Score:3)
The real question is: Will the left hand know what the right hand is doing?
Modern architectures usually don't do that. There is a solution to this problem, but it's kind of MESI.
choice of words (Score:4, Funny)
Since they have no products using that other architecture I think the word they were looking for is "Bicurious".
Why it's called "trinity" (Score:2)
1) They integrate CPU, GPU, and "system" on a chip - not really worthy of the name
2) They integrate 3 distinct CPU architectures in APUs. Bulldozer, Bobcat, Power. Or x86, Power, ARM.
3) They are aiming for PC, Apple, and Console markets with the stuff in #2 (consoles require Power arch for backward compatibility).
My bet is that Wii U will have an IBM CPU and A
Competitor for Tegra? (Score:4, Informative)
What I really want to know is if they can ... (Score:2)
... upgrade the ARM architecture to 64 bit (hopefully, they have some experience in that), put 64 cores of it on one die, and crank the speed up to 4 GHz.
Re: (Score:2)
64 cores at 6.4GHz running 64-bit code... we'll call it the AMD 262144 processor
Re: (Score:2)
For many tasks, 64-bit is overrated. Unless you are doing something that needs a HUGE memory space and/or 64-bit ints, 64-bit code takes up more room and is slower than 32-bit code... if the ISA isn't brain-dead and starved for GP registers in 32-bit mode.
Re: (Score:2)
"For many tasks, 64-bit is overrated."
And as time goes on, that 'many' turns into 'some' and eventually into 'once in a blue moon'. That's the nature of progress.
The thing is, many of us actually do need 'HUGE' memory space and/or 64 bit ints.
This is 2012, and I just need more than 4GB of RAM in my computer.
- My flight combat simulator gobbles RAM like a crack whore gobbles crack. (DCS A-10) 4GB is simply not enough for this one application.
- Photoshop CS5 / Lightroom just runs better 64 bit
Re: (Score:2)
You can have more than 2/4GB of space with a 32-bit CPU. The limit is per process.
I am also into flight sims and they do tend to fit that category.
Lightroom/photoshop. You bet.
Now, the idea that there is no reason not to run a 64-bit version of the app... if the app will never need the memory space, I disagree. Your GTalk client will never need that much space. Your word processor hopefully will never need that much memory. If it does, then bloat has gotten out of hand.
Now on X86 things are different. In th
Idea... (Score:2)
Works for me. (Score:2)
I will so buy a bagful of these chips if AMD follows through on this smart thing. 28 nm multicore ARMs. Booya! Also looking forward to the integrated low-power GPU.
a*r*mbidextrous (Score:2)
x86 and GPU, not x86 and ARM (Score:3)
AMD is clearly talking about using both x86 and GPU for compute work vs. focusing on x86 only... the ARM thing is just a wild speculation, or wishful thinking.
Not about instruction set... (Score:2)
I think a lot of people mistakenly believe ARM's success is because the instruction set is just better. AMD implementing the ARM instruction set does not, by itself, suddenly make AMD more compelling.
The ARM architecture's licensing has allowed a larger variety of companies to get in the game with their own ideas around implementation. This has led to exceedingly low prices compared to the levels the x86 vendors have been willing to go, and energy-optimized designs targeting specifically the mobile device mar
Re: (Score:3)
http://www.arm.com/files/downloads/ARMv8_Architecture.pdf [arm.com]
Re:Where does AMD come into the picture? (Score:5, Informative)
AIUI ARM do HDL design of processor cores, then they pass that HDL on to other companies who make complete chip designs based on it. Those companies in turn pass the designs onto fabs (which may be in-house or external) for manufacture. IIRC some vendors also do their own HDL work and only license the basic architectural design from ARM.
Re: (Score:3)
AMD bleeds money to Intel for the x86 instruction set. At one point this was mandatory, since all the programs out there that could be run by comparatively inexperienced computer users were written for the computers they could find at Radio Shack et al. Now that Microsoft and Google are popular and platform-agnostic (Linux/Android vs Win8), AMD has a window of opportunity to start from scratch and just offer a kernel patch to have your apps run on their chips. This new direction is
Re: (Score:2)
So what does AMD bring to the table in the ARM game? They do have a pretty nice GPU, and they have some familiarity with optimization, not to mention the ability to merge x86 with ARM if they want to, i.e. 2 x86 cores and 2 ARM cores, so you could have blazing performance at the cost of power, or boot
Re:StrongARM (Score:4, Informative)
Re:at last! (Score:4, Insightful)
*BUT*, there comes a massive performance penalty which is that the clock rate now has to be twice as fast as a RISC processor in order to achieve the same results.
That's just complete bollocks.
A modern x86 processor (meaning... since the Pentium Pro in the mid 90s) is, internally, a RISC-like core with full OoO execution and so on and so forth.
Variable instruction decode is a pain in the ass and does add latency in the front end. This isn't great, but it is nowhere near a 50% reduction in IPC. Try more like 1-2% (measured via correlated cycle-accurate performance simulator), depending on how clever you get and in any case easily made up for by a clever widget or two.
Basically, predictions of RISC eating x86 for breakfast were made over 15 years ago and never came to pass, mostly because x86 morphed until the difference was essentially irrelevant.
Your talk about northbridges sounds woefully out of date, too. This has nothing to do with ISA, and both major x86 vendors now have integrated northbridges.
You're closer to reality when talking about power. Regardless of the small IPC penalty, those decoders burn up a lot of power. There are ways to get around this, too, and for moderate perf moderate low power x86 does just fine. At the very low end of power, though, going to something like ARM makes sense.
Re: (Score:2)
It's not just the variable instruction decode (incidentally, just the bit that works out the length of the next instruction is the size of an entire ARM execution core); it also makes things like branch prediction and out-of-order execution much more complex to implement compared with a more straightforward ISA encoding.
The predictions that RISC would eat x86 for breakfast DID come to pass. ARM outsells all other CPU architectures *put together*.
Re: (Score:2)
Given that most of the hit that x86 takes is as a result of the instruction length decoders and microcode, isn't it possible to have a 2 chip solution where:
That way, the first chip can be optimized for maximum power savings, RISC performance a
Re: (Score:2)
I'd go for hoping x86 will die. It's an outdated piece of junk that doesn't even work natively anymore; Intel's and AMD's chips simply convert that crap to RISC-ish internal operations. The reason they don't make the CPU ARM-instruction compatible is that the internal instructions change every time, just to get x86 apps to work faster. This has been said by John Bridgman, AMD's GPU driver manager, so the info must be correct.
It won't hurt open source and Microsoft has an internal port of Office and Windows already runn
Re: (Score:2)
Sounds strange, given that until recently, their CEO Dirk Meyer was the same guy who led DEC's Alpha team. At that time, MIPS was a more crowded market than it is now. However, in retrospect, AMD's move to do the x64 was fantastically successful - even being used in some super-computers. While Intel's Itanium project - their 3rd attempt to come up w/ a successful non-CISC architecture - bombed.
Intel's i860 had moderate success, like in the Paragon, while the 960 as well as AMD's 29k was used in periphe