AMD: We Stand Ready To Make Arm Chips (tomshardware.com) 95
AMD's CFO Devinder Kumar has commented that AMD stands ready to manufacture Arm chips if needed, noting that the company's customers want to work with AMD on Arm-based solutions. From a report: Kumar's remarks came during last week's Deutsche Bank Technology Conference, building on comments from AMD CEO Lisa Su earlier in the year that underscored the company's willingness to create custom silicon solutions for its customers, be they based on x86 or Arm architectures. Intel intends to produce Arm and RISC-V chips, too, meaning that the rise of non-x86 architectures will be partially fueled by the stewards of the dominant x86 ecosystem. "But I'll tell you from my standpoint, when you look at compute solutions, whether it's x86 or ARM or even other areas, that is an area for our focus on investment for us," AMD CFO Devinder Kumar responded to a question about the company's view of competing Arm chips. "We know compute really well. Even ARM, as you referenced, we have a very good relationship with ARM. And we understand that our customers want to work with us with that particular product to deliver the solutions. We stand ready to go ahead and do that even though it's not x86, although we believe x86 is a dominant strength in that area."
ARM Wrestling. (Score:2)
Not so much the "who can make ARM", but "who has expertise in ARM"?
Arms and the ARM (Score:5, Funny)
They were talking about Arm chips. These, presumably, are chips that you use to control your arm.
ARM chips, on the other hand, set your Adjustable Rate Mortgage.
Simple.
Re: (Score:2)
"who has expertise in ARM"?
Well, AMD does [wikipedia.org], for one.
I'd go straight to RISC-V (Score:1)
And use ARM only until the RISC-V chip is ready to be revealed.
Maybe that's what they're doing.
Re:I'd go straight to RISC-V (Score:5, Interesting)
I'd make whichever chip customers are going to pay me to make.
Re: (Score:1)
And I'll drive you out of the market because you didn't see what was coming.
Keep making those buggy whips...
Re:I'd go straight to RISC-V (Score:4, Informative)
I'm big enough that I can wait and see. Then acquire the small companies that win in the market.
I think switching over to RISC-V before the market is ready for it is perhaps worse than being too late to market. Because, like I said, you can just buy whoever took the gamble and won. Dumping capital into something that you can't turn into positive cash flow within a few years is problematic for investors and shareholders.
Ultimately, a mature company like AMD can make x86, ARM64, and RISC-V, and push out a new chip in any of those architectures in 18 months. A fair bit of reuse of internal architecture is possible even with different processor designs, control planes, busses, etc. Look at Nvidia's Denver project: it started off as a Transmeta-style VLIW targeting x86, but it ended up being one of the first few 64-bit ARMs to ship. Is it the best ARM on the market? Hard to say; it had some early problems with inconsistent CPU benchmarks. But it is on the market, and mostly because of the on-chip peripherals and software support, not necessarily because of the CPU architecture.
Re: (Score:2)
You aren't driving anyone out of any market. You aren't making a damn thing. Do you even know about any of its problems which have been widely discussed [ycombinator.com] for some time now? Get a grip.
Show us then (Score:3)
If you're ready to make more chips, show us you can make the chips you already have by making RX cards.
Re: (Score:3)
lol, found the justifiably angry gamer.
Re: (Score:3)
Wow, what? Are you here to be an attack dog for AMD? They have a wildly popular line of video cards with enormous unfilled demand, and they won't even publicly discuss their plans to fill that market's needs. Criticism of AMD is fair game.
Re: (Score:2)
A large share of the GPU chip market was "mining equipment for Ethereum". This is about to change, because Ethereum is in the transition phase towards proof of stake. Expect GPU boards to become cheaper and much more available in the coming year, but no longer expect huge annual performance boosts, because those were partially financed by the miners.
Since AMD is well aware of all this, they'd be awfully silly to boost their GPU chip output now.
M1 (Score:4, Interesting)
Re: (Score:2)
Will that be the death of the Wintel duopoly [wikipedia.org], and the rise of the WinARM duopoly?
Re: (Score:2)
https://www.statista.com/stati... [statista.com]
Re: (Score:2)
Generally, Windows-on-ARM is called "WARM". It already exists on Qualcomm-based machines.
WinARM Duopoly (Score:2)
Well, by my count MSFT has tried to make Windows work on RISC processors many times, and all were failures.
-WinNT on MIPS
-WinNT on Alpha
-Windows on various ARM cpus for Surface
-Windows on Qualcomm ARM procs
None ever lasted or achieved significant market share. Unlike Apple, Microsoft doesn't want to strangle the big revenue stream it has by committing to a complete move to ARM, out of the (historically justifiable) fear that it would flop. Gotta give Apple credit for risk-taking on that.
Re: (Score:2)
And how exactly were WinNT on Mips and Alpha failures?
And how is Windows on a Surface a failure?
You must have a silly definition of failure. They all work great.
Gotta give Apple credit for risk-taking on that.
They did not take a risk. They sell their own hardware. No one cares what CPU is inside.
Re: (Score:2)
Also... they were failures in that they wouldn't run a lot of the programs that people wanted to run, except perhaps slowly under emulation. Windows for Alpha had x86 emulation which worked pretty well, but as always it was a performance compromise. (x86-64 was not yet a thing.) Windows for MIPS and PowerPC never had emulation at all, so they were useless for most desktop users. Windows RT (the Windows that the Surface 1 and Surface 2 ran) only supported what are now known as UWP applications, which exclude
Re: (Score:2)
That's the failure of the third parties. It's the developers of such programs who never bothered to compile for the other architecture.
With FOSS this is not such a problem, since the code is available, but what do you do with proprietary software when the owner doesn't care for "minor" platforms?
Same reason for the Itanium flop; it wasn't the architecture's fault. Thing is, Android smartphones pushed ARM CPU adoption, and the real-world usage is now very real, arguably the most popular architecture. Even the Nin
Re: (Score:2)
Microsoft has never made a full commitment to making Windows work on RISC processors. Native application support is almost nonexistent. Microsoft hasn't even ported all of its OWN applications; Office only runs under emulation at present (a native version is finally coming and will supposedly be ready in time for Windows 11), and you can't run Visual Studio at all on the released version of Windows-on-Arm (you can run it on an Insider build with x86-64 emulation).
If Microsoft wants Windows-on-Arm to succeed
Re: (Score:1)
Not at all, my Linux desktop of choice beats Apple's and Microsoft's into the ground. After using something good I can see what a piece of shit alternatives are, is all.
Times are changing and Microsoft and Apple users are being left behind.
Re: (Score:2)
"faggot hip appeal." Why don't you stop teasing and come out already. You'll feel better.
Re:M1 (Score:5, Insightful)
er, no. Hardly anyone has M1 chips. And Apple has only been making OSX more cumbersome [...]
You've entirely missed the point if you think that matters here. The OP was talking about what the M1 has demonstrated in terms of hardware capability, which is what it is regardless of how many people have the M1 in their hands or what anyone's opinions are on macOS. Your complaints may be valid, but they have nothing to do with the topic at hand.
The M1 is an entry-level chip that already outperforms nearly all of the entry-level and many of the mid-level x86 chips available today across a variety of metrics, despite Apple's choice to maintain tight constraints on its thermal performance and performance per watt. But there aren't any technical barriers keeping Apple from lifting those constraints—and, in fact, they've even announced their intent to do so over the next 1-2 years—which is terrifying to people heavily invested in x86. Apple has plenty of headroom left to move upmarket whenever it wants to: the architecture can easily support more cores, power budgets can easily be increased, and (better) cooling can easily be provided, a combination of which would easily put them ahead in most of the areas where x86 still maintains a lead (e.g. the M1 lags many mid-level chips in multi-core performance today, but in every case I've seen that's because it has fewer cores, which is an easy problem to solve). Meanwhile, AMD and Intel already lifted those constraints years ago, so they don't have the headroom left that Apple does.
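The core-count point above is easy to sanity-check with arithmetic. A minimal sketch, using entirely made-up benchmark numbers (not real scores for any chip) and an assumed parallel-scaling factor:

```python
# Hypothetical illustration: if per-core performance is comparable,
# aggregate multi-core throughput scales roughly with core count.
def multicore_score(per_core_score: float, cores: int, scaling: float = 0.9) -> float:
    """Naive aggregate score: per-core score times core count,
    discounted by an assumed parallel-scaling efficiency."""
    return per_core_score * cores * scaling

m1_like = multicore_score(per_core_score=1700, cores=8)        # stronger cores, fewer of them
midrange_x86 = multicore_score(per_core_score=1500, cores=12)  # weaker cores, more of them

# The chip with the stronger cores still posts the lower aggregate
# score, purely because it has fewer of them.
print(m1_like < midrange_x86)  # True with these made-up numbers
```

The point being that a multi-core deficit driven only by core count disappears as soon as more cores are added, whereas a per-core deficit does not.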
"But wait!", a straw man says. "Apple doesn't compete with AMD or Intel, and even if the M1 was decent, it's still hamstrung by [macOS/iOS/Apple]." Again, those are missing the point. What the M1 has done is demonstrate what's possible with ARM. It shows that ARM is not a toy. It shows that ARM can be scaled up to a point where it competes with the incumbents. Apple may not compete against Intel or AMD, but others will, and when they do—not if—it is likely to result in a market disruption as the chips move upmarket over the course of several years, eventually displacing x86. Qualcomm's ARM designs are a few years behind Apple's and are trending in the same direction at about the same rate, so they're in a prime position to start competing against Intel and AMD in the not-too-distant future. Likewise, Samsung has deep experience designing ARM chips with its Exynos line. Now that they've been shown a path forward, they'll likely move in that direction as well.
If you're at the top of Intel or AMD and you don't have double-digit performance improvements planned for the next several years straight, this is when you tell the board that it may be prudent to start spinning up expertise lest a nascent competitor disrupt the entire market, leaving you a relic of a bygone age. And what would that look like to an outsider like you or me? Well, if I had to hazard a hypothesis, it might resemble AMD saying "we're willing to design ARM chips too!" or Intel announcing its willingness to work with designs that aren't their own.
Hypothetically speaking, of course.
Re: (Score:2)
silly rabbit, trix are for kids.
M1 doesn't demonstrate anything; as you said, it's an entry-level chip. To push it to the limit would mean doing the same things other architectures have done, with all the power consumption and other issues that Intel and AMD struggle with.
You do know the underpinnings of an Intel or AMD chip, a very different thing than x86-64 (which is a kind of skin), could be turned into an ARM or Sparc or Power running thing.
Re: (Score:2)
To push it to the limit would mean doing the same thing other architectures have done, with all the power consumption and other issues that make Intel and AMD struggle with.
Yes and no. Just because Apple will do the same things does not mean that Apple will get stuck at the same place. Quite the contrary: all indications point to Apple having a higher performance ceiling because of the choices they made to get where they are. Apple prioritized performance per watt over all other metrics, and the M1's performance per watt is head and shoulders above what AMD/Intel have to offer. The result of that choice is that as Apple has been scaling up from making smartphone chips to PC ch
Re: (Score:2)
"underpinnings" are modular and can be adapted and optimized to any instruction set whatsoever, and Intel and AMD have superior tools for doing so. They could make a better ARM than Apple if they so chose. We know both companies are, in fact, experimenting with it.
Re: (Score:2)
What are these "underpinnings" you're getting at? As with most CE/CS majors, I've built my own CPU before, so speak plainly.
Re: (Score:2)
Funny, I've got the standard arch books from my master's degree too. YOU were the one who said "underpinnings".
Re: (Score:2)
I did? https://slashdot.org/comments.... [slashdot.org]
Re: (Score:2)
Also, my point in bringing up my background wasn’t to say I know more than you. My point was to say that I know enough for you to speak in specifics, rather than the vague terms you’ve been using.
Re: (Score:2)
yes, you did
*the distinction between an instruction set (the "skin") and the actual lines that are laid down in silicon (the "underpinnings")*
Re: (Score:2)
yes, you did, hoist by your own petard again.
https://slashdot.org/comments.... [slashdot.org]
Re: (Score:2)
I was quoting you from your immediately prior post, where you said “You do know the underpinnings of an Intel or AMD chip”, hence the quotation marks I used around “underpinnings” and my linking you to your previous post where you used the term first.
Re: (Score:2)
I think that the enormous success of Apple's M1 has awoken some sentiment that x86 is an also-ran.
I think that's guilding the lily somewhat.
It's an impressive CPU, with competitive but not leading performance. It does have rather lower power draw, and it uses tightly integrated on-package memory; we've not seen an x86 CPU with that memory arrangement yet, but there's nothing about x86 that precludes it.
Re: (Score:1)
I'd also like to point out that Apple's chip is completely dominated by application-specific accelerators which give their platform the illusion of a fast general purpose computer but it's in fact only capable of running those specific applications efficiently. As such, you can't even install alternative web browsers or even do things like play certain types of media in high resolution.
This requires the use of cloud computing for more demanding tasks which precludes a rental/service model for use of the de
Re: M1 (Score:2)
As such, you can't even install alternative web browsers or even do things like play certain types of media in high resolution.
WTF are you smoking?
You are either stoned, or stupid. I'm guessing the latter.
Re: (Score:1)
Not stupid, I can just afford an iPad Pro
Re: (Score:2)
Not stupid, I can just afford an iPad Pro
If you have an M1-based iPad Pro, then you already know it is plenty fast at general-purpose computing, even with tasks that don't particularly leverage its many fine on-chip subsystems.
Re: (Score:2)
guilding the lily
For future reference, the phrase is "gilding the lily."
Re:M1 (Score:4, Informative)
I think that the enormous success of Apple's M1 has awoken some sentiment that x86 is an also-ran.
So it looks like Apple shipped about 5.4 million M1 PCs in the 2nd quarter of this year [patentlyapple.com], out of a total worldwide market of 71.6 million units. [gartner.com]
I mean, that's certainly a shift, but if x86 is outselling M1 by a factor of 13-to-1, that hardly makes x86 an "also ran" does it?
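For what it's worth, the exact ratio depends on whether you divide total shipments or non-M1 shipments by M1 units. A quick check of the figures quoted above:

```python
# Q2 shipment figures quoted above, in millions of units.
m1_units = 5.4
total_units = 71.6
x86_units = total_units - m1_units  # assuming essentially all non-M1 PCs are x86

share = m1_units / total_units  # M1 share of the quarter's shipments
ratio = x86_units / m1_units    # x86-to-M1 shipment ratio

print(f"M1 share: {share:.1%}")           # ~7.5%
print(f"x86:M1 ratio: {ratio:.1f} to 1")  # ~12.3 to 1 (~13.3 if you divide total by M1)
```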
Re: (Score:2)
I'm disagreeing with your use of the words "also ran" and nothing more.
If you said "this could be the start of a general shift from x86 to ARM in the desktop space" I wouldn't have disagreed. Calling something that still holds the vast majority of the market an "also ran" is just plain inaccurate.
Re: (Score:2)
Not an also-ran, but I think this is the beginning of the end for x86.
Let's face it, CISC instruction sets are cumbersome and outdated, and everyone has known that for decades. Even Intel knew that when they tried to strong-arm everyone onto IA-64. The industry has been waiting for a viable RISC-like ISA to arise and challenge x86 for a long time. I've heard and read conjecture many times about ARM being that challenger. The question was always whether an ISA designed for low power could scale to desktop
Re:M1 (Score:4, Interesting)
So it looks like Apple shipped about 5.4 million M1 PCs in the 2nd quarter of this year [patentlyapple.com], out of a total worldwide market of 71.6 million units. [gartner.com]
You say those numbers as if they prove your point, but I'd argue they actually do the opposite: what you're saying is that, despite all of the issues that come with switching architectures, something went from being a rounding error to owning 8% of the market in less than a year. That's terrifying if you're an incumbent. The fact that it's coming from the low end of the market with aggressive (and entirely feasible) plans to move upmarket sounds like a textbook example of what a disruptive technology [wikipedia.org] looks like: first it's dismissed as a toy, then it's accepted as good enough for many people, then pretty soon it's the only thing anyone is using.
Granted, Apple doesn't compete directly with Intel and AMD, but Qualcomm already competes in adjacent markets, their ARM designs are only a few years behind Apple's, and their performance has been on a similar trajectory to Apple. Likewise, Samsung competes in adjacent markets too, and they have deep experience designing ARM chips for their Exynos line.
Barring an unprecedented x86 breakthrough, Apple has proven that x86 is an also-ran, but it'll be the ones who follow them that make it actually happen.
Re: (Score:2)
You say those numbers as if they prove your point, but I'd argue they actually do the opposite: what you're saying is that, despite all of the issues that come with switching architectures, something went from being a rounding error to owning 8% of the market in less than a year. That's terrifying if you're an incumbent.
That's a fair point (and one I agree with, FWIW) but we're not quibbling about whether or not this is a significant shift (I said as much in my comment) but rather if this makes x86 an "also ran." "Also ran" is defined as "a loser in a race or other contest, especially by a large margin" or, alternatively, "an undistinguished or unsuccessful person or thing." You're talking about something that has dominated desktop computing for almost four decades, and has flat owned the server market for more than two
Re: (Score:2)
Fair enough. Rather than calling it an "also ran", would it be more agreeable to say that we have every reason to believe we'll be able to look back in a few years and see that this was the inflection point when x86 began its decline?
The fact that it's coming from the low end of the market
This is the first time I've heard Apple products referred to as "the low end of the market" since the original iMac.
Sorry, I rewrote parts of that post a few times and I apparently cut the part where I made it clear that "it" is ARM and "the market" is the CPU market. Other than the very beginning, my post was intended to focus on ARM in general, rather than the M1 in particular, so it may m
Re: (Score:2)
No. AMD has a semi-custom business that they still nurture. It kept them alive in the dark ages before 2017. Their semi-custom partners are likely asking for ARM-based solutions, and AMD still has K12 sitting on the shelf proving that they are able to produce ARM designs (even if they decided not to bring K12 to full production).
M1 has nothing to do with it directly. Maybe some of AMD's customers are interested in similar silicon, but there's no guarantee that any AMD-produced ARM solution would be as p
Manufacture? (Score:1)
AMD's CFO Devinder Kumar has commented that AMD stands ready to manufacture Arm chips if needed
Doesn't AMD outsource all their manufacturing to Taiwan Semiconductor?
If so, then wouldn't the correct word be design or create?
Re: (Score:2)
Yes, though that's quibbling. If they put in an order to TSMC for wafers, it's up to AMD what they etch into those wafers.
Re: Manufacture? (Score:2)
Yes, though that's quibbling. If they put in an order to TSMC for wafers, it's up to AMD what they etch into those wafers.
Sorry. AMD doesn't get a pass on that one.
There is no end to the Apple Hater posts on here that rush to point out that Apple doesn't "make" their own chips; so the same applies to AMD.
Microsoft could accelerate this rapidly (Score:3)
Since Apple already has a bunch of ARM hardware out now, an official M1 port of Windows could well mean a raft of third-party ARM Windows laptops would not be far behind... Microsoft has to be seeing the limitations caused by the aging Intel architecture as well...
It'll be interesting to see if there's enough interest for custom Linux related ARM hardware to make it happen earlier.
Re: (Score:2)
Apple is a closed platform, they want it that way. There is no reason for Microsoft, or anyone else, to fight against it.
Re: (Score:2)
And there's no reason for Apple to undermine their own business model by selling their hardware to other software vendors.
Re: (Score:2)
Your post assumes that Apple is willing to provide "M1" to third parties (which is laughable) or that Microsoft is in a position to make an "official M1 port" which they likely are not. M1 != ARM.
Of course, this post isn't about Microsoft or ARM, it's about Apple and Apple's interest, because you are SuperKendall. "Third party ARM Windows laptops" are possible now, M1 support is completely irrelevant to that.
What are the "limitations" being "caused" by the "age of the Intel architecture", SuperKendall? C
Apple says it's up to Microsoft (Score:3)
Your post assumes that Apple is willing to provide "M1" to third parties
I'm not assuming anything, I'm going by what Apple said [9to5mac.com]...
Maybe true of Linux on M1, but if Microsoft wants Boot Camp to work on the M1, they could support it. Or, at least, virtualized Windows instances running on M1 (which they do not officially support today, even though people have gotten Windows ARM builds working in them).
Re: (Score:2)
Up to MS if they want to pay whatever exorbitant fees Apple would charge to license M1, M1x, M2, etc.
Re: (Score:2)
You do not need a license for an M1 to port Windows to M1 ...
Re: (Score:2)
I think times have changed. Most everything people need now is a website. All Windows on Arm really needs is a good web browser and Office.
Microsoft is reportedly working on a translation layer. My understanding is that x86 32-bit apps work now, but 64-bit compatibility is in progress.
https://www.techrepublic.com/a... [techrepublic.com]
https://docs.microsoft.com/en-... [microsoft.com]
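For anyone curious what a translation layer means conceptually: guest instructions are rewritten into host instructions before execution. Real translators, such as the one in Windows on ARM, operate on machine code and cache translated blocks; this toy with made-up mnemonics only shows the shape of the problem:

```python
# Toy sketch of binary translation: map guest (x86-like) operations to
# host (ARM-like) ones. Many map 1:1; some expand into several host ops.
# All mnemonics here are illustrative, not real encodings.
GUEST_TO_HOST = {
    "MOV": "MOV",
    "ADD": "ADD",
    "PUSH": ["SUB SP, SP, #8", "STR {0}, [SP]"],  # one guest op -> two host ops
}

def translate(guest_ops):
    host_ops = []
    for op, *args in guest_ops:
        mapping = GUEST_TO_HOST[op]
        if isinstance(mapping, list):
            host_ops.extend(m.format(*args) for m in mapping)
        else:
            host_ops.append(f"{mapping} {', '.join(args)}".strip())
    return host_ops

print(translate([("PUSH", "X0"), ("ADD", "X0", "X1")]))
# ['SUB SP, SP, #8', 'STR X0, [SP]', 'ADD X0, X1']
```

In practice the hard parts are things like x86 flag semantics and the stronger x86 memory-ordering model, not the mnemonic mapping, which is part of why 64-bit support took longer.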
Re: (Score:2)
It already exists. Windows on ARM is already a thing. M1 Macs can run it in virtualization.
https://9to5mac.com/2021/05/03... [9to5mac.com]
But, that is mostly a parlor trick. The bigger question is if and when more powerful ARM processors will be available from other CPU manufacturers (AMD, Qualcomm, Samsung, heck even Intel), and when Dell and Lenovo start making laptops with them.
AMD? Now? (Score:2)
I recall that AMD made ARM server CPUs several years ago. Not a huge success, but they had them.
Today, these have to be pretty obsolete.
Also, advanced manufacturing plants (7nm and smaller) are booked solid, I don't think there is much left for new products in the short term.
Finally, AMD can currently sell everything they make, demand exceeds supply. There is no pressure to move to ARM.
Re: (Score:2)
Yes, they were called the Opteron A1100 [wikipedia.org], and there were a few servers built with them -- with 14 SATA3 ports and dual 10Gb Ethernet, they were perfect for storage appliances. But it used ARM Cortex-A57 cores, while its successor [wikipedia.org] was planned to be an in-house design.
Re: (Score:1)
Yeah, what Lisa is really saying here is if you're prepared to pay AMD then they'll happily make your inefficient CPU which will never be able to compete with them.
Re: (Score:1)
That is, if AMD had free capacity. For their current 7nm products, they are using every wafer they can get from TSMC.
There may still be free capacity at Global Foundries, AMD's former manufacturing arm. But they do 12nm at best, putting new designs at a disadvantage right away.
Re: (Score:2)
Zen was supposedly originally designed as an X86 core which could be repurposed for ARM later.
Allegedly Jim Keller left AMD because the ARM core was cancelled.
Re: (Score:2)
Not entirely true. Zen and K12 (ARM-based) were designed simultaneously by some of the same people. K12 never made it off the shelf. But yes, Keller has been quoted as saying he was disappointed with K12 being shelved like that.
Re: (Score:3)
The big advantage in the server room would be motherboards, or chips, with both. AMD has recent patents that I've been speculating are for improving IO integration with FPGA or other chiplets. "Everybody" else thinks it is just a manufacturing increment thing, or a separate product.
In embedded there are lots of SOCs with a large application ARM chip plus two tiny M-series ARM chips, with shared memory and DMA support. This means you can put the IO protocol code in the microcontroller, feed it into memory fo
How about ARM Cortex? (Score:2)
I'm not really sure what the backlog of system/phone processors is, but I do know that there's a huge waiting list for ARM Cortex MCUs, with everybody scrambling to see what's available, whether or not the person claiming to have the chips is legitimate, and whether they can be used in current products.
If AMD or Intel could help with that backlog, it would probably do more to ensure a smooth recovery from Covid than having more systems and phones (many of which use Cortex and other MCUs).
Re: (Score:2)
AMD outsources the actual manufacturing of the chips (typically to TSMC). They just design them. Intel does own their own fab facilities but quite likely already has them at capacity making chips for their existing orders - it would be pointless to accommodate other people's jobs unless they had spare capacity going unused.
The stark truth is that at the moment fab capacity is reduced due to COVID (impacting both the plants themselves and the availability of raw materials), and demand is up, partially fuel
That would be new... (Score:2)
...since AMD doesn't "make" chips now, just designs and packages them. Are they going to buy GloFo back to make these new ARM chips?
Another ARM design out there does nothing for the shortage of fab capacity.
Didn't they promise ARM chips before? (Score:3)
Where's the K12? Where are the A series Opterons? Still can't find a Seattle board anywhere.
Re: (Score:3)
Yes, K12 was developed alongside Zen. At the time, there was (apparently) no market for them. If you look at the landscape of modern ARM server architecture, perhaps you'll see that they were mostly correct. Amazon, MS, and Google may kill the server ARM market before it can mature with their in-house products. It'll be interesting to see if Ampere can make it as an at-large designer of server ARM solutions. Nearly everyone else has exited the market or (in the case of HiSilicon/Huawei) been bottled up
Re: (Score:2)
I see your point, but I would say that we haven't had a 'major player' enter the ARM server market--AMD, Intel, etc. Given how AMD has been willing to work with companies to design chips to their needs (Sony PlayStations, Xboxes), they would have been in a good position to meet Google, MS, and Amazon's needs for custom silicon. As it is, there's no player big enough for any of those companies to be interested in partnering with.
I guess we'll never know since they killed it.
Re: (Score:3)
The ARMy has been trying - and failing - to break into the server sector for over a decade. You may remember Cavium and Calxeda? Or Qualcomm's Centriq? All failures. Qualcomm bought out ARM server startup Nuvia and killed Nuvia's entry to the ARM server market (what Qualcomm intends to do with Nuvia's IP in the future is unknown, though it is widely thought that Qualcomm will simply fold Nuvia's tech into phone/mobile SoCs). AMD cancelled their own ARM server lineup - Seattle. They let K12 die on the vi
So, when do we get an ARM linux laptop? (Score:3)
Re: (Score:2)
Open source UEFI? Good luck. But I'm pretty sure you can run Linux on an 8cx or SQ1 laptop, albeit with some hiccups. Wifi and GPU acceleration are allegedly broken (thanks Qualcomm!). Might also have problems with touchpads.
Re: (Score:3)
What about the PineBook Pro?
Not sure about OSS firmware, and too lazy to search now.
https://www.pine64.org/pineboo... [pine64.org]