Apple Announces M4 With More CPU Cores and AI Focus (arstechnica.com) 66
An anonymous reader quotes a report from Ars Technica: In a major shake-up of its chip roadmap, Apple has announced a new M4 processor for today's iPad Pro refresh, barely six months after releasing the first MacBook Pros with the M3 and not even two months after updating the MacBook Air with the M3. Apple says the M4 includes "up to" four high-performance CPU cores, six high-efficiency cores, and a 10-core GPU. Apple's high-level performance estimates say that the M4 has 50 percent faster CPU performance and four times as much graphics performance as the M2. Like the GPU in the M3, the M4's GPU also supports hardware-accelerated ray-tracing to enable more advanced lighting effects in games and other apps. Due partly to its "second-generation" 3 nm manufacturing process, Apple says the M4 can match the performance of the M2 while using just half the power.
As with so much else in the tech industry right now, the M4 also has an AI focus; Apple says it's beefing up the 16-core Neural Engine (Apple's equivalent of the Neural Processing Unit that companies like Qualcomm, Intel, AMD, and Microsoft have been pushing lately). Apple says the M4 runs up to 38 trillion operations per second (TOPS), considerably ahead of Intel's Meteor Lake platform, though a bit short of the 45 TOPS that Qualcomm is promising with the Snapdragon X Elite and Plus series. The M3's Neural Engine is only capable of 18 TOPS, so that's a major step up for Apple's hardware. Apple's chips since 2017 have included some version of the Neural Engine, though to date, those have mostly been used to enhance and categorize photos, perform optical character recognition, enable offline dictation, and do other oddities. But it may be that Apple needs something faster for the kinds of on-device large language model-backed generative AI that it's expected to introduce in iOS and iPadOS 18 at WWDC next month. A separate report from the Wall Street Journal says Apple is developing a custom chip to run AI software in datacenters. "Apple's server chip will likely be focused on running AI models, also known as inference, rather than in training AI models, where Nvidia is dominant," reports Reuters.
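For third-party apps, the Neural Engine isn't programmed directly; work reaches it through Core ML, which decides per model whether the CPU, GPU, or Neural Engine runs each layer. A minimal Swift sketch of opting in (the model path here is hypothetical, and actual Neural Engine use depends on the model's ops and precision):

    import Foundation
    import CoreML

    // Hypothetical compiled Core ML model; any .mlmodelc bundle would do here.
    let modelURL = URL(fileURLWithPath: "/path/to/SomeModel.mlmodelc")

    // Ask Core ML to keep inference on the CPU and Neural Engine, leaving the GPU
    // free for rendering. The default (.all) lets the framework pick for itself.
    let config = MLModelConfiguration()
    config.computeUnits = .cpuAndNeuralEngine  // newer OS releases; fall back to .all otherwise

    do {
        let model = try MLModel(contentsOf: modelURL, configuration: config)
        print("Loaded:", model.modelDescription)
    } catch {
        print("Failed to load model:", error)
    }

Even with .cpuAndNeuralEngine set, Core ML may still run layers the Neural Engine can't handle on the CPU, which is part of why raw TOPS numbers don't translate directly into app-level speedups.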
Further reading: Apple Quietly Kills the Old-school iPad and Its Headphone Jack
Re: (Score:2)
NoMoreACs and ArchieBunker can't wait to see Derek Zoolander's latest look.
https://www.youtube.com/watch?... [youtube.com]
Sure, they're all the same, but they don't care.
Re: (Score:2)
NPUs are not new or novel.
Only the marketing of it is.
It's very telling that the iPad is first in line (Score:4, Insightful)
The fact that they've put their fastest and most capable CPU in a device running iOS before anything more substantial is yet another sign that Apple isn't really caring much about laptops or desktops anymore. Not to speculate too much, but for about five years many of us have theorized that the writing was on the wall and that OS X and iOS will eventually be merged... somehow. Clunky file system and all the simplified rest included.
Even the announcement, where they're hinting that with the addition of a Magic Keyboard it'll feel like a MacBook Air, is prepping the ground for getting users gently accustomed to the idea that laptops are a thing of yesteryear. Classic Apple, really.
As for me personally, these premonitions slowly becoming reality were what made me move away from the platform some years ago, because I still depend on large screens, lots of RAM, and compatibility with some older peripherals that can sometimes cost more than the computers themselves. Exactly the things Apple charges an insane premium for, while they seem to expect customers to constantly refresh all of their gear, never mind the landfill waste it creates....
Re: (Score:2, Insightful)
Don't do yourself down. I think you hit "too much" pretty much dead-centre.
Re:It's very telling that the iPad is first in lin (Score:4, Insightful)
The M4 isn't just the fastest and most capable chip; it's simply the most recent launch of a new generation. The baseline M models were always destined for iPad Pros.
Whatever point you think you were making is lost given the existence of the "Pro," "Max," and "Ultra" versions of the M-series CPUs, which are designed for laptops and desktops, and the fact that no iPad got an M3 chip.
What alternative do you suggest? That Apple sit around with their thumbs up their asses, delaying a product just so they can launch a laptop first? The iPad Pro was the stalest product in their lineup and the one most in need of a hardware refresh, having skipped an entire generation of CPU releases.
Re: (Score:3)
Moreover, it's been reported that the M3 was an especially expensive part for TSMC to fabricate, with extraordinarily low yields, which is why it was only ever used in low-volume, high-margin lines. Supply chain rumors point to a rapid adoption of the M4 across all lines, and it's likely that this was simply the earliest it was available, which is typically when new iPads are announced in the calendar year. Macs typically arrive at WWDC or after the Back to School sales wrap up, so we'll almost certainly see M4 Macs then.
Re: (Score:2)
It may also be an indicator that the Mac-oriented M4s will be even more capable, with this being the initial yield?
Until the next generation of laptops are released, I am going to reserve judgement.
Re: (Score:2)
M3 MBP was announced 5 fucking months ago. They're not going to do a 5 month refresh for the new CPU.
M4 MBP will be announced in the fall.
Re: (Score:2)
The fact that they've put their fastest and most capable CPU in a device running iOS before anything more substantial is yet another sign that Apple isn't really caring much about laptops or desktops anymore. Not to speculate too much, but for about five years many of us have theorized that the writing was on the wall and that OS X and iOS will eventually be merged... somehow. Clunky file system and all the simplified rest included.
Even the announcement, where they're hinting that with the addition of a Magic Keyboard it'll feel like a MacBook Air, is prepping the ground for getting users gently accustomed to the idea that laptops are a thing of yesteryear. Classic Apple, really.
As for me personally, these premonitions slowly becoming reality were what made me move away from the platform some years ago, because I still depend on large screens, lots of RAM, and compatibility with some older peripherals that can sometimes cost more than the computers themselves. Exactly the things Apple charges an insane premium for, while they seem to expect customers to constantly refresh all of their gear, never mind the landfill waste it creates....
Insane premium?
Find me an equivalent Tablet to the new iPad Pro.
Also, Apple has been moving new iPad introductions to the spring for a few years now. They needed to get this out before WWDC in June, so they could make a big splash for iOS 18.
As for landfill waste: all these new iPads are made with 100% recycled aluminum.
And I'll believe they're thinking of axing MacBooks when the iPads are available with macOS.
or yield is not great on TSMC's N3E process (Score:2)
One thing that must not be forgotten is that iPads are relatively low volume products.
The M4 seems to use TSMC's N3E, which is a new process node, and there's a fair chance the yield is currently good but not great.
Re: (Score:2)
Re: (Score:1)
Re: A lot of focus on AI... (Score:2)
- Searching photo libraries
- Editing photos
- Taking photos that aren't shit
- Making text selectable and copyable in images
- Text to speech
- Speech to text
- Much better autocorrect
- Much better antialiasing in games
- Raytracing in games (which is only possible via the use of AI to fill in the blanks in the low-sample-size stochastic rendering)
There's tons of uses of machine learning out there already, you just haven't noticed them get quietly integrated into things, and working perfectly.
Re: (Score:2)
- Searching photo libraries
- Editing photos
- Taking photos that aren't shit
- Making text selectable and copyable in images
- Text to speech
- Speech to text
- Much better autocorrect
- Much better antialiasing in games
- Raytracing in games (which is only possible via the use of AI to fill in the blanks in the low-sample-size stochastic rendering)
There's tons of uses of machine learning out there already, you just haven't noticed them get quietly integrated into things, and working perfectly.
Another thing is quietly using AI: Swipe-to-Type.
It is simply amazing how perfectly it determines what word you are forming. The only thing it gets tripped up on is acronyms, which is kind of understandable. It is absolutely incredible! And it's just quietly sitting there, waiting to be used.
Re: (Score:2)
Another thing is quietly using AI: Swipe-to-Type.
I still have the paid-for Swype app from well over a decade ago. It's not really a heavy user of AI. It's a shame it was discontinued ages ago, though, as bugs and Android differences are eroding its usefulness.
Gboard has finally caught up with the basic swipe-to-word matching, but is still a pale shadow of the OG app.
I do wish they'd just outright steal the UI from Swype.
Re: (Score:2)
Another thing is quietly using AI: Swipe-to-Type.
I still have the paid-for Swype app from well over a decade ago. It's not really a heavy user of AI. It's a shame it was discontinued ages ago, though, as bugs and Android differences are eroding its usefulness.
Gboard has finally caught up with the basic swipe-to-word matching, but is still a pale shadow of the OG app.
I do wish they'd just outright steal the UI from Swype.
I'm sorry, I was unclear: I was talking about iOS' built-in Swipe-to-Type.
Re: (Score:2)
Oh right I haven't used that one.
Here's what I love about Swype:
QWERTY Keyboard with small symbols next to each letter. Swipe from key to the "symbol" button to insert a symbol. This is great for punctuation.
Swiping from the swype button to c, v, or x does copy, paste, cut.
Swiping up high after a letter capitalises that letter.
When you type in an unrecognized word, you have the option to add it to the dictionary (it's not automatic), and likewise you can remove ones you don't want.
Overall it's really smooth.
Re: (Score:2)
Oh right I haven't used that one.
Here's what I love about Swype:
QWERTY Keyboard with small symbols next to each letter. Swipe from key to the "symbol" button to insert a symbol. This is great for punctuation.
Swiping from the swype button to c, v, or x does copy, paste, cut.
Swiping up high after a letter capitalises that letter.
When you type in an unrecognized word, you have the option to add it to the dictionary (it's not automatic), and likewise you can remove ones you don't want.
Overall it's really smooth.
Thanks! I'll have to try that.
I just sort of ran into the setting, and was floored at how well it guessed which word I was forming (it doesn't guess until you lift your finger); far better than auto-complete.
Re: (Score:2)
Thanks! I'll have to try that.
Unfortunately you cannot. The app got cancelled years ago, but since I have a paid-for version I can apparently keep getting it from the app store even though it's long gone.
I just sort of ran into the setting, and was floored at how well it guessed which word I was forming (it doesn't guess until you lift your finger); far better than auto-complete.
Swiping is excellent in general.
I got into it about 12 or 13 years ago: I got a Samsung smartphone that had Swype preloaded as par
Re: (Score:2)
...but not much consumer application for it outside of touching up photos and making memes. I wonder how they will convince the average user to upgrade?
Without understanding the type of math the neural engine is processing, it's hard to know what type of AI-based solutions will benefit from this.
Re: (Score:2)
Without understanding the type of math the neural engine is processing
The same as all others. Matrix multiplication.
it's hard to know what type of AI-based solutions will benefit from this.
Any inference at precisions below fp32.
The NPU isn't new, however, and hardly anything uses it as it is.
Nothing really uses it in all the Android phones that have one, either.
The primary consumer will continue to be baked-in features of the operating system: taking pictures, etc.
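To make the "it's just matrix multiplication at reduced precision" point concrete, here is a toy Swift sketch (purely illustrative, nothing to do with Apple's actual hardware or frameworks) of an int8 matrix multiply with 32-bit accumulation; the multiply-accumulate in the inner loop is the operation that TOPS figures count:

    // Naive int8 matrix multiply with int32 accumulation. NPUs do the same math
    // in wide arrays of multiply-accumulate units; this scalar loop just shows
    // what "low-precision inference" boils down to.
    func matmulInt8(_ a: [[Int8]], _ b: [[Int8]]) -> [[Int32]] {
        let rows = a.count, inner = b.count, cols = b[0].count
        var c = Array(repeating: Array(repeating: Int32(0), count: cols), count: rows)
        for i in 0..<rows {
            for j in 0..<cols {
                var acc: Int32 = 0
                for k in 0..<inner {
                    acc += Int32(a[i][k]) * Int32(b[k][j])  // one multiply-accumulate = 2 "ops"
                }
                c[i][j] = acc
            }
        }
        return c
    }

    let a: [[Int8]] = [[1, 2], [3, 4]]
    let b: [[Int8]] = [[5, 6], [7, 8]]
    print(matmulInt8(a, b))  // [[19, 22], [43, 50]]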
Re: (Score:2)
Without understanding the type of math the neural engine is processing
The same as all others. Matrix multiplication.
it's hard to know what type of AI-based solutions will benefit from this.
Any inference at precisions below fp32.
The NPU isn't new, however, and hardly anything uses it as it is.
Nothing really uses it in all the Android phones that have one, either.
The primary consumer will continue to be baked-in features of the operating system: taking pictures, etc.
But whose fault is that? Apple provides the SDKs and frameworks; it's up to developers to come up with the Applications.
I'm pretty sure that FCP and Logic Pro on iPad are using some AI Special Sauce.
Re: (Score:2)
If Google added an API for calculating Pi to arbitrary precision, and nobody used it, would you say that was the fault of developers?
AI is the same thing. Lack of API isn't the problem. Apple's CoreML and Android's NNAPI are older than dirt at this point. Nobody wants to use them.
Long before them, ML libraries existed for both Metal and OpenCL for Apple and Androids as well.
You don't blame developers for not being able to contrive a need for something you built.
Re: (Score:2)
I don't see what fault has to do with anything.
If Google added an API for calculating Pi to arbitrary precision, and nobody used it, would you say that was the fault of developers?
AI is the same thing. Lack of API isn't the problem. Apple's CoreML and Android's NNAPI are older than dirt at this point. Nobody wants to use them.
Long before them, ML libraries existed for both Metal and OpenCL for Apple and Androids as well.
You don't blame developers for not being able to contrive a need for something you built.
As for FCP and LP: yeah, they say they've "enhanced them with AI." Both are Apple products, and they're obviously trying to push the tech right now.
I'm not saying AI isn't used at all- it is. But the primary consumer of AI processing on phones, be it NPU or GPU, will continue to be Apple and Google themselves.
So, if Apple is doing this stuff with AI and ML all over the place in their OSes (which they are), and in certain Applications (which they are), what Frameworks are they using?
If the answer is CoreML, then my original statement is true. If the answer is "Some Private Apple Framework", then Apple needs to Publish and Document it for Third Party Devs.
Re: (Score:2)
So, if Apple is doing this stuff with AI and ML all over the place in their OSes (which they are), and in certain Applications (which they are), what Frameworks are they using?
The ones that are publicly available.... I think you missed the point.
If the answer is CoreML, then my original statement is true. If the answer is "Some Private Apple Framework", then Apple needs to Publish and Document it for Third Party Devs.
The problem with your statement was the assignment of "fault".
You are saying people are at fault for not finding a use they care about for a thing. That's not a "fault".
That was the point with my CalculatePiAPI example.
There's nothing wrong with the ML APIs. The wrongness here is the belief that many people have much use for them outside of the areas Apple and Google have already monopolized on their platforms.
Re: (Score:2)
So, if Apple is doing this stuff with AI and ML all over the place in their OSes (which they are), and in certain Applications (which they are), what Frameworks are they using?
The ones that are publicly available.... I think you missed the point.
If the answer is CoreML, then my original statement is true. If the answer is "Some Private Apple Framework", then Apple needs to Publish and Document it for Third Party Devs.
The problem with your statement was the assignment of "fault".
You are saying people are at fault for not finding a use they care about for a thing. That's not a "fault".
That was the point with my CalculatePiAPI example.
There's nothing wrong with the ML APIs. The wrongness here is the belief that many people have much use for them outside of the areas Apple and Google have already monopolized on their platforms.
OK, then not "Fault." How about "Lack of Imagination"?
You just said Apple is using the same Frameworks as any Apple Dev can use. So it obviously isn't the Frameworks that are deficient.
Re: (Score:2)
So it obviously isn't the Frameworks that are deficient.
Nobody said they were.
There is no fault for the current status quo of AI use. It merely is that: the status quo.
As I said, long before we had CoreML and NNAPI on android, we had various ML and TF libs via Metal and OpenCL.
Making a new API wasn't going to solve the problem that already existed: That they're trying to market something that just doesn't penetrate the ecosystem that deeply.
The new 38 TOPS NPU is a really big upgrade, but it's not *that* big for realistic workloads.
Most workloads aren't using INT8 models. They're using FP16 or FP32.
Re: (Score:2)
So it obviously isn't the Frameworks that are deficient.
Nobody said they were.
There is no fault for the current status quo of AI use. It merely is that: the status quo.
As I said, long before we had CoreML and NNAPI on android, we had various ML and TF libs via Metal and OpenCL.
Making a new API wasn't going to solve the problem that already existed: That they're trying to market something that just doesn't penetrate the ecosystem that deeply.
The new 38 TOPS NPU is a really big upgrade, but it's not *that* big for realistic workloads.
Most workloads aren't using INT8 models. They're using FP16 or FP32.
On FP16, the perf of the NPU is half: 19 TFLOPS. Which isn't bad, by any means, but it's not huge.
FP32 is not eligible for scheduling on the NPU at all.
But ultimately- none of that will really solve the problem of "a microscopic fraction of all applications use AI, and the primary consumer is going to continue to be your mobile OS."
This isn't a criticism of the new NPU (which, unlike the old one, is actually fucking useful) or the APIs.
Person asked a question, I answered with realistic impact.
I actually have no business opining on this subject at all.
Re: (Score:1)
Re: (Score:2)
apple hates developers
Of course. They have that WWDC and those Workshops every year just to spew hate at them.
Think about who brings in the App Store Revenue.
You are a Genius. Not.
Re: (Score:2)
apple hates developers
Please explain your reasoning.
Re: (Score:1)
...but not much consumer application for it outside of touching up photos and making memes. I wonder how they will convince the average user to upgrade?
You're full of shit.
Re: (Score:2)
sounds like an apple user to me, just by the shit content
Fuck off.
AND Die.
Re: (Score:2)
Why feed a troll apparently specifically created to troll you?
I know, I know. . .
Re: (Score:2)
8 GB of memory (Score:1, Funny)
Re: (Score:2)
I understand that planning ahead takes brain cells you may not have, but I have faith that you can figure it out.
This surprises anyone because?...... (Score:1)
1) more cores. Hey, they're pretty much free and everyone knows moar is better
2) Yo, Katie! 'member those additional GPUs I promised you next spin? Don't kill the messenger, but....
3) Yo, Joe! How hard would it be to convert a GPU core to an AI core? Can you do it by Tuesday?
Re: (Score:2)
It IS successful?
How's that future telling going?
Pretty sure success isn't defined by zero sales.
Re: (Score:3)
What I wonder is: which is better for ML applications?
1. The general-purpose CPU cores
2. The GPU
3. The Neural Engine
Because if it's the Neural Engine, there won't be much difference between the Pro version and the Max version (except the Max is super expensive). At least that's true with the M1-M3.
The question is whether the M4 will do this better. I'd almost want them to share Neural Engine / GPU resources so that both applications are better, rather than having two separate sections that are so-so.
Re: This surprises anyone because?...... (Score:4, Informative)
GPU and neural engine hardware have rapidly diverged. While they both do shit tons of matrix multiplies, the GPU needs to do them with a reasonable (usually fp16-32) level of precision. Neural network processors often get away with 4 bit fixed point. My bet is that the neural engine is substantially faster, and more importantly, consumes much less power than the GPU.
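For anyone wondering how a network ends up at 8-bit or 4-bit weights in the first place, here is a rough Swift sketch of symmetric linear quantization (int8 rather than 4-bit just to keep it readable, and not any particular vendor's scheme):

    // Map float weights onto [-127, 127] with a single scale factor, then
    // recover approximate floats on the way back. Smaller integers mean less
    // memory traffic and cheaper multipliers, which is where NPUs save power.
    func quantize(_ weights: [Float]) -> (values: [Int8], scale: Float) {
        let maxAbs = weights.map { abs($0) }.max() ?? 1
        let scale = maxAbs / 127
        let q = weights.map { x -> Int8 in
            let r = (x / scale).rounded()
            return Int8(max(-127, min(127, r)))  // clamp, then narrow to 8 bits
        }
        return (q, scale)
    }

    func dequantize(_ values: [Int8], scale: Float) -> [Float] {
        values.map { Float($0) * scale }
    }

    let w: [Float] = [0.8, -0.32, 0.05, -1.6]
    let (q, s) = quantize(w)
    print(q, s)                     // [64, -25, 4, -127] and a scale of about 0.0126
    print(dequantize(q, scale: s))  // close to the originals, with small rounding error

The dequantized values land near, but not exactly on, the originals; that rounding error is the accuracy-for-efficiency trade the parent comment is describing.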
Re: (Score:2, Insightful)
GPU and neural engine hardware have rapidly diverged. While they both do shit tons of matrix multiplies, the GPU needs to do them with a reasonable (usually fp16-32) level of precision. Neural network processors often get away with 4 bit fixed point. My bet is that the neural engine is substantially faster, and more importantly, consumes much less power than the GPU.
This.
Re: (Score:2)
Good explanation. Thanks.
But this also means that aside from RAM, there's no advantage in buying an M3 Max over an M3 Pro since the Neural Engines are the same.
Apple lost direction (Score:2)
After all the stuff Jobs had in the pipeline was exhausted, Apple lost direction.
What new and exciting stuff came out in the last 3 years? VR? What feature in the iPhone 15 is so exciting that it's worth upgrading from an iPhone 12/13/14? Just a titanium case?
Now the only exciting thing Apple can say is just "more CPU"?! How dull; it reminds me of the PC market in the decade around 2000: just more CPU, more RAM, more HD, year after year.
If this continues for another 3-5 years, Apple will go the way of Nokia.
Re: Apple lost direction (Score:2)
Who do you envision replacing Apple in 3-5 years?
Re: (Score:2)
It's true that Tim Cook doesn't seem to have much vision.
The M1-M4 chips are cool. But they aren't revolutionary like the iPod or the smartphone.
The Google Glass / Oculus Rift reboot doesn't seem much better either. They need to go much further.
They also reportedly wasted billions on a car that never came out.
Re: (Score:2)
The M1-M4 chips are cool. But they aren't revolutionary like the iPod or the SmartPhone.
The iPod? No wireless. Less space than a Nomad. Lame.
Re: (Score:1)
Big Whoosh (Score:1)
What new and exciting stuff came out in the last 3 years? VR?
Totally missed the point there. Spatial Computing is not VR. It's also not AR as you knew it. You'll understand someday.
What feature in the iPhone 15 is so exciting that it's worth upgrading from an iPhone 12
From a 14? Maybe not much except the ability to take spatial video. From a 12 though? Holy shit all around it's a huge upgrade. Yeah each year may seem like a minor step but wait even two years for an upgrade and you'll notice many things l
Re: (Score:2)
Re: (Score:2)
What feature in the iPhone 15 is so exciting that it's worth upgrading from an iPhone 12/13/14?
The iPhone 12 is only 3 years old; considering the price of a decent phone, you'd hope they'd last longer than that.
Re: (Score:2)
Yeah, I was perfectly happy with my 6s but was forced by work to replace it when it stopped getting the latest iOS (despite it still getting security updates). I definitely prefer the picture quality from the 14 Pro I replaced it with, though. I imagine I'll upgrade again in another 7 or 8 years.
Re: (Score:2)
https://www.zdnet.com/article/... [zdnet.com]
61% of iPhone users keep their previous iPhones for two years or more, while only 41% of Android users do the same.
Applenorexia (Score:2, Insightful)
If the M4 can match the M2's performance on half the power, the slightly chunky M2 iPad Pro's form factor could have been kept and given significantly longer battery life with the M4. But no. Again, they made the new iPad Pro thinner.
Re: (Score:2)
If they give it more battery, then it will have a longer useful life (as the battery degrades and OS updates cause more power consumption), and then you won't have to buy a new product as soon. They are specifically and only targeting sales frequency with their battery sizing, across all of their battery-powered devices.
Re: (Score:2)
No need to surmise — Apple states that they kept battery life constant in the move from M2 to M4, and early reviewers report that this appears to be true.
Now update the Mac Studio to M4 this year..... (Score:2)
That makes me nervous (Score:2)
Next one will be the M5 [imdb.com]