Qualcomm's M1-Class Laptop Chips Will Be Ready For PCs In 'Late 2023' (arstechnica.com) 46
An anonymous reader quotes a report from Ars Technica: Qualcomm bought a chipmaking startup called Nuvia back in March of 2021, and later that year, the company said it would be using Nuvia's talent and technology to create high-performance custom-designed ARM chips to compete with Apple's processor designs. But if you're waiting for a truly high-performance Windows PC with anything other than an Intel or AMD chip in it, you'll still be waiting for a bit. Qualcomm CEO Cristiano Amon mentioned during the company's most recent earnings call that its high-performance chips were on track to land in consumer devices "in late 2023."
Qualcomm still plans to sample chips to its partners later in 2022, a timeframe it has mentioned previously and has managed to stick to. A gap between sampling and mass production is typical, giving Qualcomm time to work out bugs and improve chip yields, and giving PC manufacturers more time to design and build finished products that incorporate the chips. [...] Like Apple's processors, Nuvia's support the ARM instruction set but don't use off-the-shelf ARM Cortex CPU designs. Those Cortex cores have been phenomenally successful in commodity SoCs that power everything from Android phones to smart TVs, and they helped popularize the practice of combining large, high-performance CPU cores and small, high-efficiency CPU cores in the same design. But they rarely manage to top the performance charts, something that's especially noticeable when they're running x86 code on Windows with a performance penalty.
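A side note for the technically curious: on an ARM Linux box you can see that big/little split from userspace, because arm64 kernels expose a per-core capacity value in sysfs. A minimal Python sketch (assuming the cpu_capacity attribute is present, as it is on most modern arm64 kernels; the kernel normalizes the fastest core type to 1024):

    # Group CPU cores by the relative capacity the kernel reports.
    # Assumes /sys/devices/system/cpu/cpu*/cpu_capacity exists (arm64).
    from collections import defaultdict
    from pathlib import Path

    def cores_by_capacity():
        groups = defaultdict(list)
        for f in sorted(Path("/sys/devices/system/cpu").glob("cpu[0-9]*/cpu_capacity")):
            groups[int(f.read_text())].append(f.parent.name)  # e.g. "cpu0"
        return dict(groups)

    for capacity, cpus in sorted(cores_by_capacity().items(), reverse=True):
        print(f"capacity {capacity}: {', '.join(cpus)}")

On a heterogeneous SoC this prints two or more capacity groups; on a homogeneous design, just one.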
Can’t wait for ARM? (Score:2)
Buy yourself a Talos system with POWER9 cpus and open source secure boot. https://www.raptorcs.com/TALOS... [raptorcs.com]
Re: (Score:2)
Yeh, Power is pretty good stuff... and you could also run AIX as well as other IBM operating systems in VMs on that system which is pretty cool for the sake of novelty.
But here's the catch... value for money.
That system offers such incredibly poor value for money that it would be just short of a miracle if it came in at only 100% more expensive than a comparable AMD system.
Oh, and I use Power quite often. I'm a huge fan. But if you're looking for a suit
Nuvia = Apple Axx-series chips (Score:4, Informative)
Re: (Score:1)
Gaaaak...it's 2022 and Slashdot still won't let you fix your typos.
Perhaps we need to suggest that Musk should buy Slashdot...?
Re: (Score:1)
TFA doesn't take pains to point out that Nuvia was founded by a bunch of ex-Apple engineers - the ones who originally designed the guts of the ARM-based A-series chips in the iPhone / iPad. Some serious ARM chops, plus plenty of low-power, high-performance design experience.
And said IP is now Apple's; so neither Nuvia, its Traitorous Engineers, nor Qualcomm dare ever incorporate it into one of their Designs. For over two decades.
This should be fun to watch.
Re: (Score:2)
That assumes that there's anything in there that wasn't already covered by other patents that Apple themselves may have had to license, and which they can license from the same source.
Last I looked Apple was themselves being sued by patent trolls over the M1 CPU. I dislike patent trolls as much as any of you but that doesn't change whether they own a patent.
Re: (Score:2)
That assumes that there's anything in there that wasn't already covered by other patents that Apple themselves may have had to license, and which they can license from the same source.
Last I looked Apple was themselves being sued by patent trolls over the M1 CPU. I dislike patent trolls as much as any of you but that doesn't change whether they own a patent.
I truly hadn't heard about any Patent Trolling over the M1. Can you provide a Citation?
Re: (Score:2)
Try googling "patent troll apple m1"
TL;DR: I looked and it looks like the lawsuit has already been dismissed
P.S. You should always try asking google, you're going to get your answer a lot faster
Re: (Score:2)
Try googling "patent troll apple m1"
TL;DR: I looked and it looks like the lawsuit has already been dismissed
P.S. You should always try asking google, you're going to get your answer a lot faster
Sorry.
I did check right after I replied to you. I wasn't thinking too clearly, as I had JUST woken up when I read your Post!
Interestingly, I didn't see that it had been dismissed.
But are you sure that's true? Were you talking about this case?
https://appleinsider.com/artic... [appleinsider.com]
Because I still don't see that being dismissed. It should be, however, because it was filed in East Texas in 2021, LONG AFTER the Supreme Court Unanimously stopped that District Court's little Patent Troll scam back in 2017 (Pa
Re: (Score:2)
Hmm, yeah. You're right. So that one hasn't been dismissed yet.
I'm against patent trolling in all its forms, but it's not yet clear that Apple's engineers invented all the stuff in their CPU that they claim they did.
Anyway I'm content to sit and wait, I don't have any emotional investment in the outcome of that battle.
Re: (Score:2)
Hmm, yeah. You're right. So that one hasn't been dismissed yet.
I'm against patent trolling in all its forms, but it's not yet clear that Apple's engineers invented all the stuff in their CPU that they claim they did.
Anyway I'm content to sit and wait, I don't have any emotional investment in the outcome of that battle.
Nobody ever invents everything in anything. It just isn't practical.
But I still can't imagine how any Patent Dispute Lawsuit against Apple filed in East Texas after 2017 can survive even a bare-bones Motion to Dismiss for lack of Jurisdiction. WTF is going on there?!?
That, my friend, is the real story here!
Late 2023 (Score:4, Insightful)
So by late 2023, they'll release a chip matching the one released in early 2021. Meanwhile, Apple will have released the M3.
Re: (Score:2)
But how much better will an M3 be than an M1? And who says this new processor merely "matches" an M1? It's not like Apple has an M2 yet.
One thing it will have over Apple processors is actual Windows support.
Re: (Score:3)
The A15 is between 8 and 30% faster than the A14, depending on the operation.
So the assumption that the M2 will be an A15 reconfigured the same way the M1 reconfigured the A14 seems reasonable.
Re: (Score:2)
But how much better will an M3 be than an M1? And who says this new processor merely "matches" an M1? It's not like Apple has an M2 yet.
One thing it will have over Apple processors is actual Windows support.
Everyone knows, and Apple has already signalled, that with the release of the "Ultra" variant the M1 cycle is complete. So why even embarrass yourself with a soon-to-be-laughable comment like "It's not like Apple has an M2 yet"?
And if MS doesn't get its shit together with their pathetic excuse for x86-64 Translation/Emulation (which Apple already stomps all over with Rosetta2), "official" WoA capability will hardly even matter.
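For what it's worth, on the Mac side there is a documented way to tell whether the current process is actually running under Rosetta 2 translation: the sysctl.proc_translated sysctl. A minimal Python sketch (ctypes against libSystem; prints a plain-language status):

    # Query Apple's documented sysctl.proc_translated OID for this process:
    # 1 = running x86-64 code under Rosetta 2, 0 = native; the OID does not
    # exist on Intel Macs (or non-macOS), in which case the call fails.
    import ctypes

    def rosetta_status():
        libc = ctypes.CDLL(None)  # current process / libSystem on macOS
        val = ctypes.c_int(0)
        size = ctypes.c_size_t(ctypes.sizeof(val))
        rc = libc.sysctlbyname(b"sysctl.proc_translated",
                               ctypes.byref(val), ctypes.byref(size), None, 0)
        if rc != 0:
            return "no Rosetta here (Intel Mac or not macOS)"
        return "translated by Rosetta 2" if val.value == 1 else "native arm64"

    print(rosetta_status())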
Re: (Score:2)
So by late 2023, they'll release a chip matching the one released in early 2021. Meanwhile, Apple will have released the M3.
Indeed. But I guess you think Apple came from nowhere and produced an M1 first go, without, say, many years of sub-par, slow mobile processors before it?
Or more to the point, are you suggesting that just because someone can't come to market with the absolute best of the best, they shouldn't bother at all?
Re: (Score:3)
So by late 2023, they'll release a chip matching the one released in early 2021. Meanwhile, Apple will have released the M3.
Indeed. But I guess you think Apple came from nowhere and produced an M1 first go, without, say, many years of sub-par, slow mobile processors before it?
Or more to the point, are you suggesting that just because someone can't come to market with the absolute best of the best, they shouldn't bother at all?
Sub-par Mobile processors that just so happen to consistently outperform every other Mobile Processor by a Generation or more.
Apple is an ARM God, both in hardware and software design. It is so far ahead of the competition that it isn't even funny.
And of course, if Apple's Benedict Arnold Design Team even thinks of incorporating any of the IP they Developed for Apple, their current Employer (Qualcomm), as well as them personally, will be instantaneously C&D-ed and sued into the stone age; as well
Re: (Score:3)
Apple is an ARM God, both in hardware and software design.
Agree. They also have the not-insignificant advantage of being able to design to their specific needs. Qualcomm, like Intel, has to cater to a bunch of different manufacturers.
And there's that other advantage: Apple can relentlessly optimize both software and hardware into one impossible-to-beat combination. Even MS working with Qualcomm on ARM SoCs for Surface products is a far, far distant second.
Add to that the fact that Apple has the M4 or M5 on the drawing board already, and I don't think John Ternus' Team, Johnny Srouji's Team, nor Craig Federighi's Team are sweating any of this. It is as the buzzing of flies. . .
https://www.apple.com/leadersh... [apple.com]
https://www.apple.co [apple.com]
Re: (Score:2)
Apple can relentlessly optimize both software and hardware into one impossible-to-beat combination
In theory, they could do that. In practice, it's not even close. Linux beats OSX in many benchmarks on the M1 (and most of the rest come down to lack of GPU support, because Apple is not giving out the specs needed to support it properly; a la Nvidia, except Nvidia can't and Apple simply won't). Apple is nowhere near as competent at software as you imagine they are. Remember, they tried to make their own next-generation OS twice and failed both times. Then they had to go back to old-ass NeXTStep to improve. Though
Re: (Score:2)
Apple can relentlessly optimize both software and hardware into one impossible-to-beat combination
In theory, they could do that. In practice, it's not even close. Linux beats OSX in many benchmarks on the M1 (and most of the rest come down to lack of GPU support, because Apple is not giving out the specs needed to support it properly; a la Nvidia, except Nvidia can't and Apple simply won't). Apple is nowhere near as competent at software as you imagine they are. Remember, they tried to make their own next-generation OS twice and failed both times. Then they had to go back to old-ass NeXTStep to improve. Though they would have been better off with BeOS, frankly. In terms of responsiveness, old BeOS still beats the pants off of new OSX.
Then why haven't the Linux folks beaten a path to BeOS?
The OSes that Apple tried to create were simply too complex for anyone to have pulled off. End of story. Copland and Taligent just died from Creeping Elegance.
And, back then, BeOS was a great idea that just wasn't a real OS yet. Purchasing NeXT was simply an expedient; and it has turned out to have been a great choice for them.
Linux is faster under the hood at some things, due to its monolithic kernel architecture; but in 1997, it wasn't ready for pri
Re: (Score:2)
In terms of responsiveness, old BeOS still beats the pants off of new OSX.
Then why haven't the Linux folks beaten a path to BeOS?
Because Linux is more practical.
back then, BeOS was a great idea that just wasn't a real OS yet
Having used it, I know it was working quite well. Apple passed on it not for technical reasons, but for Steve Jobs reasons.
Re: (Score:2)
In terms of responsiveness, old BeOS still beats the pants off of new OSX.
Then why haven't the Linux folks beaten a path to BeOS?
Because Linux is more practical.
Yet you recommended BeOS instead of Linux for Apple. Right!
back then, BeOS was a great idea that just wasn't a real OS yet
Having used it, I know it was working quite well. Apple passed on it not for technical reasons, but for Steve Jobs reasons.
As I said before: NeXTStep seems to have worked out fairly well for Apple, and it only took a little over 2 years to hammer into OS X 10.0.0 (with a minor stop at OS X Server 1.0 along the way), including some rather ingenious changes to accommodate MacOS (Classic)'s Resource Fork and more-advanced Extensionless File Associations. Who knows how Be would have handled those changes? From what I can tell, after acquiring BeOS, Palm simply buried it in the
Re: (Score:2)
Because Linux is more practical.
Yet you recommended BeOS instead of Linux for Apple. Right!
If you actually understand the history involved, then this should make perfect sense. At the time, Apple was shifting their operating system from the classic, super-antiquated MacOS. It didn't really matter what they went to — as long as development wasn't too much of a PITA, their users and developers were sure to follow. As long as whatever they moved to was essentially compatible with the rest of the world, with similar-looking APIs, they were guaranteed a base. Backwards compatibility was necessar
Re: (Score:2)
But I guess you think Apple came from nowhere and produced an M1 first go, without, say, many years of sub-par, slow mobile processors before it?
No, but I have seen Apple's "subpar" mobile processors consistently beat Qualcomm's best processors. I have also seen the M1 running Windows 10 in a VM handily beat Qualcomm's SQ2 CPU running Windows on ARM natively in the MS Surface Pro X [youtu.be]. And those were the tests that Linus could get the Surface Pro X to run, as Windows compatibility on the Surface Pro X is very much a work in progress. While the M1 was not Apple's first CPU ever, this new chip is not Qualcomm's first PC CPU and the previous attemp
Re: (Score:2)
It doesn't say that.
Also, performance isn't the only thing to compete on. The M1 needs memory on the same package, such are the massive bandwidth requirements needed to get its mid-to-low-end performance relative to x86 chips. If Qualcomm can produce a design that allows for memory upgrades, that would be a huge advantage.
The M2 is going to be very interesting, when Apple eventually releases it. If you look at their current line up they have just been throwing more and more cores at those machines, and in benc
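For a rough sense of the bandwidth argument above: peak DRAM bandwidth is just transfers per second times bus width in bytes. A back-of-envelope Python sketch (the M1's LPDDR4X-4266 on a 128-bit bus is its published configuration; the dual-channel DDR4-3200 line is only an illustrative socketed-PC comparison, not a claim about any particular Qualcomm design):

    # Peak theoretical DRAM bandwidth = transfers/sec * bus width in bytes.
    def peak_gb_per_s(mega_transfers, bus_bits):
        return mega_transfers * 1e6 * (bus_bits / 8) / 1e9

    # M1: LPDDR4X-4266 on a 128-bit unified-memory bus (published spec).
    print(f"M1 unified memory: {peak_gb_per_s(4266, 128):.1f} GB/s")       # ~68.3

    # Illustrative comparison: dual-channel socketed DDR4-3200 (2 x 64-bit).
    print(f"Dual-channel DDR4-3200: {peak_gb_per_s(3200, 128):.1f} GB/s")  # ~51.2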
Re: (Score:2)
The M1 needs memory on the same package, such are the massive bandwidth requirements needed to get its mid-to-low-end performance relative to x86 chips. If Qualcomm can produce a design that allows for memory upgrades, that would be a huge advantage.
Apple could always do the same thing, but the performance would suffer. I do not see sacrificing performance for upgradeability as a huge advantage so much as a compromise. Given that Qualcomm has not shown they are ahead of Apple when it comes to mobile processor performance, I do not think they have any secret technology that will make their ARM processors that much better when it comes to desktop performance.
Re: (Score:2)
Modern CPU performance is determined not just by the CPU core, but by things like the thermal envelope and special-purpose peripherals like video encoders. Apple combines the CPU and GPU, but Qualcomm doesn't have to.
The M1 GPU is rather weak for modern desktop games. A Qualcomm chip paired with an AMD or Nvidia GPU would give better performance.
Qualcomm also has better support for things like video codecs. Qualcomm chips can encode and decode many more formats than the M1.
That all said, I'm not in any hurry
Re: (Score:2)
Modern CPU performance is determined not just by the CPU core, but by things like the thermal envelope and special-purpose peripherals like video encoders. Apple combines the CPU and GPU, but Qualcomm doesn't have to.
Again, in terms of raw CPU performance, Qualcomm has not shown they can significantly beat Apple.
The M1 GPU is rather weak for modern desktop games. A Qualcomm chip paired with an AMD or Nvidia GPU would give better performance.
1) Again, Qualcomm CPU performance is lacking regardless of which GPU is used; it will be lacking compared to x86. 2) A separate GPU requires more power and cost. For laptops these are compromises; for a desktop, why wouldn't the consumer just use an x86 instead?
Qualcomm also has better support for things like video codecs. Qualcomm chips can encode and decode many more formats than the M1.
Which codecs? As far as I know, both Qualcomm chips and the M1 support H.265, H.264, and MP2 encoding.
Re: (Score:2)
So by late 2023, they'll release a chip matching the one released in early 2021.
The M1 actually launched in late 2020, not early 2021, so we’re talking a full three years, by which time we do indeed, as you said, expect that the M2 (and its variations) and possibly the M3 will have been announced.
Re: (Score:3)
That's the penalty for having a general purpose machine. If you want a single purpose machine that doesn't need abstraction layers, caches, and multiple video modes, then yes, you can probably power on to a stable state in under two seconds. Think about applying power to your alarm clock. It will start flashing "12:00" immediately. If you want it to automatically set the time by sync'ing to a GPS or WWV signal, then be prepared to wait.
Re: (Score:2)
Think about applying power to your alarm clock. It will start flashing "12:00" immediately. If you want it to automatically set the time by sync'ing to a GPS or WWV signal, then be prepared to wait.
What I want is for the system to estimate whether the RTC is close enough to correct to do stuff before syncing the time, based on time elapsed since last boot, and measured RTC drift for my particular system.
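That heuristic is simple enough to sketch: estimate the accumulated error as measured drift rate times time since the last good sync, and only block on a sync when the estimate exceeds your tolerance. A minimal Python illustration (the drift rate and tolerance are made-up example values; a real system would derive them from its own sync history):

    # Decide whether the RTC is probably still close enough to correct
    # to proceed before a network/GPS time sync completes.
    import time

    DRIFT_PPM = 20.0     # assumed measured RTC drift, parts per million
    TOLERANCE_S = 2.0    # how wrong the clock may be before we must sync

    def rtc_probably_good(last_sync_epoch, now_epoch):
        elapsed = abs(now_epoch - last_sync_epoch)
        estimated_error_s = elapsed * DRIFT_PPM / 1_000_000
        return estimated_error_s <= TOLERANCE_S

    last_sync = time.time() - 30 * 24 * 3600  # pretend: synced 30 days ago
    if rtc_probably_good(last_sync, time.time()):
        print("RTC close enough; boot on and sync in the background")
    else:
        print("RTC too stale; wait for the time sync")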
Good news for embedded (Score:2)
Really, this is great news for embedded - because they'll have a real high-performance low-power chip. It doesn't matter if it's 3 years behind the M1. What matters is it'll be a huge jump from whatever they've got now, which is probably some MIPS chip.
not really (Score:2)
In the world of embedded, how many designs require an application-class processor, and how many of those would benefit from an M1-class design? Not so many. ARM makes embedded cores for a reason, and this isn't even news for them.
Re: (Score:2)
this is great news for embedded - because they'll have a real high-performance low-power chip. It doesn't matter if it's 3 years behind the M1. What matters is it'll be a huge jump from whatever they've got now, which is probably some MIPS chip.
That's nonsense. The generally available ARM designs left MIPS behind a long time ago. The only people still using MIPS don't care about performance, only cost matters. They will absolutely not move to this CPU.
Define "PC" (Score:1)
If it's a mobile device of any kind (and thus battery-powered - laptops included), then yes, ARM's big.LITTLE [wikipedia.org] setup makes sense.
But if it's a small AC-powered box like a mini desktop PC, who cares whether it idles at 1, 0.1 or 0.01 W. Just add some mix of [more cores] and [higher-performance cores] up to whatever thermal or cost envelope is targeted. Although personally I'm hoping to see some RISC-V options in this space.
Re: (Score:1)
The defining trait of a PC is the ability to make and use software without interference, permission, or continued support of the vendor. In practice, the platform must comply with an open standard, or the vendor can lock you out with an update, or as in the case of Qualcomm, by abandoning support of their proprietary blobs needed to use "your" device.
The PC is on the verge of death with UEFI, which is ever one update of "secure boot" away from locking it to run proprietary software only. Even now, UEFI neve
Re: (Score:3)
The PC is on the verge of death with UEFI, which is ever one update of "secure boot" away from locking it to run proprietary software only. [...] It is increasingly looking like a foolish hope that RISC-V will ever fix this
It's even more foolish to hope that ARM will ever fix this.
It's noteworthy because of what it isn't (Score:2)
The M1 came out in late 2020, a full 3 years earlier than what's still just a plan.
By late 2023, there'll probably be an M3, M3 Pro, M3 Max and M3 Ultra.