CPU Benchmarks: Pre-Release Intel Alder Lake Chip Beats Apple's M1 Max (zdnet.com) 137
An anonymous reader quotes a report from ZDNet: The reign of Apple's M1 SoC at the top of the Geekbench speed benchmarks may soon be over with the impending arrival of Intel's 12th-generation Alder Lake mobile processors. Hardware site Wccftech appears to have obtained Intel's upcoming Core i9-12900HK mobile CPU, and has now revealed the first benchmarks. The results show Intel's mobile CPU narrowly outperforming Apple's flagship 10-core M1 Max, which also integrates a 32-core GPU and 64GB of unified memory.
In these latest tests, the Core i9-12900HK outperforms the M1 Max on both single-core and multi-core benchmarks. The margin is slim, but is important for Intel since Apple ditched its CPUs for its own designs in new MacBooks. Intel's Alder Lake CPU didn't beat the M1 Max by much, with respective single-core scores of 1851 and 1785. It beat the Core i9-11980HK and AMD's top mobile CPU, the Ryzen 5980HX, by a bigger margin: the latter two CPUs saw scores of 1616 and 1506, respectively. In the multi-core benchmark, the Core i9-12900HK scored 13256 versus the M1 Max's score of 12753. Again, it trounced AMD's 5980HX, which scored 8217. Wccftech's Alder Lake benchmarks were run using Windows 11, so it's possible Thread Director's hardware scheduling influenced the results.
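For reference, here is a minimal Python sketch of what those margins work out to; the Geekbench scores are the ones quoted above, while the percentage arithmetic is just illustrative.

# Scores quoted in the article; the percentage math is only illustrative.
single = {"i9-12900HK": 1851, "M1 Max": 1785, "i9-11980HK": 1616, "Ryzen 5980HX": 1506}
multi = {"i9-12900HK": 13256, "M1 Max": 12753, "Ryzen 5980HX": 8217}

def lead_pct(scores, a="i9-12900HK", b="M1 Max"):
    # Percentage by which chip a leads chip b on the same benchmark.
    return 100.0 * (scores[a] - scores[b]) / scores[b]

print(f"Single-core lead over the M1 Max: {lead_pct(single):.1f}%")  # ~3.7%
print(f"Multi-core lead over the M1 Max:  {lead_pct(multi):.1f}%")   # ~3.9%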
I patiently await... (Score:5, Funny)
Re: (Score:2)
Apple vs Intel, who does slashdot hate more?
Re: (Score:2)
Since there is also this Apple vs. Epic thing, Apple has become a darling.
Because you know -- There can only be one!
Re: (Score:2)
It doesn't matter, just wait until the Sony batteries in both types of laptop explode.
Re: (Score:2)
Re: (Score:2)
Re: Of course it did. (Score:3, Insightful)
Re: Of course it did. (Score:4, Interesting)
What is important to note here is that we used the highest scores we could find in the entire Geekbench database, which is important as both HK and HX parts are overclockable.
So, we really do not know if it is faster under the same power profile yet.
Re: (Score:2)
"So, we really do not know if it is faster under the same power profile yet."
A standard that has never been applied before, but is suddenly important to you. Wonder why?
People who care about ultimate performance are not so concerned about "the same power profile". Besides, it is the machine that matters, not a single chip.
BTW, can the M1 Max run industry standard Windows software? Can it be purchased in a machine offered by more than one vendor?
Re: (Score:2)
The thing is, Apple chips are not recognized mostly for their raw performance, but for their performance per watt. And of course, Intel's new flagship will be barely faster all the while consuming 3X the electrical power. Apple is still on top, by quite a margin.
Re: (Score:2)
"The thing is, Apple chips are not mostly recognized for their raw performance..."
False. Raw performance has been the primary message about Apple's processors. The focus is now shifting to other metrics because it isn't destroying the competition like the fanboys expected.
Re: Of course it did. (Score:2)
The Apple includes a 5000-core GPU with 400GB/sec (Score:2)
The Apple processor has a built-in GPU that is never less than a factor of 2 of an NVIDIA dedicated graphics card, and does so at a fraction of the energy. Likewise the AnandTech bench showed that even when the Intel beats the Apple by a nose in multiprocessing, it does so at 4 times the power consumption. And that comparison doesn't include heterogeneous processing using the GPU or any of the other Apple processor specialization units.
Intel is cooked -- literally and figuratively.
Re: (Score:2)
Indeed, Apple's processor looks best when measured using benchmarks specifically designed to suit Apple's strengths. Just as has always been so, just as Apple and its fanboys have always asserted. When computing is measured precisely how Apple defines it, Apple always looks good.
"The apple processor has a built in GPU that is never less than a factor of 2 of an NVIDIA deicated graphics card..."
Sure, "never less than". For eternity I'm sure.
Intel has been "cooked" many times, and people like you have been
first posts (Score:2)
Boy, kids sure say “hot grits” differently these days
Need more info (Score:4, Interesting)
Re:Need more info (Score:5, Interesting)
While true, it's probably still nothing surprising. The reputation of ARM as low-power friendly stems from two things:
-The willingness to make extremely low-performance, and by extension low-cost, chips when Intel wouldn't even bother with anything under 25W
-Emphasis on sophisticated sleep states, to let a device that is effectively suspended wake up periodically to handle asynchronous input like calls or messages. Even when x86 decided to tackle it, its OS and application ecosystem may never reasonably adopt mobile-style sleep management, versus the ARM-focused OSes forcing applications to deal with being suspended (with well-maintained wake scenarios) and/or just outright killed.
Thus far, I have not seen an ARM platform demonstrate being particularly more efficient than an x86 when you get up over the 25W area and you don't put the device into something resembling a sleep state. The motivations for ARM interest are more about business reasons (larger ecosystem of vendors, or for Apple the ability to take it inhouse and achieve a straightforward path to consistency between desktop and phone platforms).
It would be nice if the processor vendors all were conforming to a common ISA (e.g. RISC-V) so that migrating from one competitor to another would be more straightforward and they couldn't use the arbitrary differences in ISA to limit competition, but x86 is not fundamentally incapable of being competitive and it shouldn't be shocking to see an x86 part come out on top, nor should it be a shock to see an ARM on top at any given moment.
Re: (Score:2)
Thus far, I have not seen an ARM platform demonstrate being particularly more efficient than an x86 when you get up over the 25W area and you don't put the device into something resembling a sleep state. The motivations for ARM interest are more about business reasons (larger ecosystem of vendors, or for Apple the ability to take it inhouse and achieve a straightforward path to consistency between desktop and phone platforms).
Then watch Apple's release video of the M1 Max and M1 Pro.
It would be nice if the
Re: (Score:2)
Java and .NET so hot right now
Re: (Score:2)
Don't forget Delphi, Cobol and Fortran!
Re: (Score:2)
While developers may outsource platform support to the JVM or .NET, Go and Rust continue to demonstrate that you can deliver modern language features without the penalty of a VM layer and come out ahead in performance. JIT works wonders and in some cases can fare better than an ahead-of-time compiler, but overwhelmingly you will see a hit for targeting a managed runtime instead of the actual processor.
I don't think arbitrary differences in CPU ISA is perfectly cool just because, in theory, a cooperative developer can target a VM i
Re: (Score:2)
except x86 is a hot mess of an ISA which needs a lot of untangling even before the actual execution begins
Re: (Score:2)
Re: (Score:2)
An Intel-designed ARM chip could be as fast as an M1 Max... maybe a bit quicker, and have a similar power profile.
After all, part of the heritage of Apple Silicon is the team that built the StrongARM that, for a while, was owned by Intel.
Arguably Intel should never have sold that tech - but keeping it would have required them to admit that x86 was flawed, and they can't possibly do that. :D
Re: (Score:2)
These days TDP comparisons are meaningless. Performance, both CPU processing power and power consumption, depends on the particular laptop that the part is installed in.
For a great example of this, look at Apple's Intel based MacBooks. They have high end CPUs but they perform way below what those same CPUs do in other laptops, because when they are loaded they hit their thermal ceiling of 99C in a second or two and then throttle.
Looking at performance benchmarks the M1 and new Pro/Max variants are slow
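If you want to watch that throttling behavior yourself, here is a rough sketch using Python's psutil package (my own illustration, not anything Apple or Intel ship; temperature sensors are only exposed on some platforms, e.g. Linux). Run it while a benchmark loads the CPU and watch the frequency fall once the thermal limit is hit.

import psutil

# Sample roughly once per second; cpu_percent(interval=1) blocks for the interval.
for _ in range(120):
    freq = psutil.cpu_freq()                    # may be None on some platforms
    cur_mhz = freq.current if freq else float("nan")
    line = "load=%5.1f%%  freq=%7.1f MHz" % (psutil.cpu_percent(interval=1), cur_mhz)
    temps = psutil.sensors_temperatures() if hasattr(psutil, "sensors_temperatures") else {}
    if temps:                                   # empty where the platform is unsupported
        sensor = next(iter(temps.values()))[0]  # first reading of the first sensor group
        line += "  temp=%.1f C" % sensor.current
    print(line)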
Re: (Score:2)
Intel makes parts with a TDP down to under 1W, FWIW.
Not much since with Intel you never get the lowest TDP per MIPS or what have you, and you also never get a low-power processor with any decent power. IOW you can have performance or efficiency, and frankly never competitive efficiency. There was a brief moment when they were leaders there in mobile processors because of their superior process technology, but that moment has passed.
Re: (Score:2)
TDP of a product family at least tells you where the vendor's mind is with respect to efficiency.
At the time that Android and iOS came into their own, Intel didn't make any viable processors for handheld, with their idea of a mobile processor being 25W TDP. Intel eventually reacted and started trying to support the use case, but I don't think they keep that part of their portfolio current anymore, after realizing they had unambiguously lost that market and they weren't making inroads despite throwing a la
Re: (Score:2)
Intel Atom processors were indeed crappy. Intel's main problem now though is their aging process node.
It's largely irrelevant anyway, if you are looking to buy a laptop you are probably not comparing a MacBook to other machines. You either want one or you don't, MacOS or Windows/ChromeOS.
Re: (Score:3)
Re: (Score:2)
It's the Pentium 4 all over again. Just keep throwing more power and megahertz at it to get the performance you want.
The max power draw from AC they saw was 120W, although there is no indication if it was also drawing power from the battery.
Strangely they didn't offer any temperature figures. My guess would be that the machines can't sustain that kind of load for very long.
Re: (Score:2)
It would be nice if the processor vendors all were conforming to a common ISA (e.g. RISC-V) so that migrating from one competitor to another would be more straightforward and they couldn't use the arbitrary differences in ISA to limit competition, but x86 is not fundamentally incapable of being competitive and it shouldn't be shocking to see an x86 part come out on top, nor should it be a shock to see an ARM on top at any given moment.
I'm extremely skeptical that a bare ARM-based CPU would be able to outperform a top x86 CPU. The RISC vs CISC wars are over and CISC won when it comes to top performance. However, there's something in between which I suspect is what Apple is developing here. If they add more complex instructions on top of the default ARM instructions, but are also able to stay away from the mess of the x86 legacy, then I think they could have a huge competitive advantage. So much for conforming to a common ISA, but when has
Even if Alder Lake is superior to Zen 3 (Score:5, Insightful)
In other words... (Score:5, Interesting)
Re: In other words... (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
Re: (Score:2)
"The leak likely is overclocked..."
Boy you are desperate.
"It is only the one CPU at the top of the price range that beats it."
Talking about "top of the price range", have you seen the prices of M1 Max laptops? No one buys a mobile processor on its own, they buy an entire computer. M1 Max's are at the "top of the price range".
Any news that doesn't glorify Apple really sticks in your throat, huh? One would think that competition would be welcomed against the new Apple processors, after all it was lack of c
Re: In other words... (Score:2)
This reminds me of when BlackBerry announced how their upcoming tablet was better than the iPad that Apple hadn't quite released yet. Problem is, the iPad was much better than the BlackBerry by the time it came out.
I am not saying that this will be the case here, but comparing prototypes with current products doesn't mean much. If anything it gives the competition a target to beat.
Intel has in recent times promised more than it delivered, though whatever the reality Windows users probabl
Re: (Score:2)
I remember a time that Intel's processors absolutely crushed the field.
When was that? I don't remember that ever being true. There's always been much more powerful processors from someone else, and that's often been true even in the PC market (as it is today, where AMD is offering processors with higher core counts.)
Re: (Score:2)
"Intel's next generation of product beats Apple's current one. Um, Is that a win for Intel?"
Yes, because "Apple's current one" is so new no one has them yet. It is literally a comparison of two brand new processors. It is amazing how /. posters can be so disingenuous.
Comment removed (Score:5, Interesting)
Re: (Score:2)
It's hardly current when you can't have one yet.
Re: (Score:2)
What are you talking about, you can get one as of today.
Re:This seems like the bare minimum (Score:5, Insightful)
Re: (Score:2)
It's also numbers on Intel's top of the line chip. To me, that's actually impressive - I always considered the M1 and ARM to be about middle of the road chips - something to use in place of say, an Intel i5 or so.
The fact that Apple's top end chip is only surpassed by Intel's top end chip, the i9, is very impressive. It means the M1 Max is probably outpacing the i7, something I never expected ARM to do.
It's also impressive because Intel has been putting out x86 chips since the 70s and has done many things
Re: (Score:2)
Re: (Score:2)
Sure, it's because Apple has outmaneuvered Intel. But Apple has outmaneuvered a lot of companies - all of them in fact. It's the most valuable company on earth, ever. The day Apple quits being your customer to become your competitor is a very, very bad day for any company.
In this case, Apple was almost forced into competing with Intel. After almost 5 years of little progress in CPUs from Intel, Apple ran out of patience and had to design their own chips or be left behind. What is forgotten is that Apple tried to use Intel for iPad prototypes and found the power efficiency abysmal. They have tried to make Intel work many different times.
Re: (Score:2)
Sure, it's because Apple has outmaneuvered Intel. But Apple has outmaneuvered a lot of companies - all of them in fact. It's the most valuable company on earth, ever. The day Apple quits being your customer to become your competitor is a very, very bad day for any company.
In this case, Apple was almost forced into competing with Intel. After almost 5 years of little progress in CPUs from Intel, Apple ran out of patience and had to design their own chips or be left behind. What is forgotten is that Apple tried to use Intel for iPad prototypes and found the power efficiency abysmal. They have tried to make Intel work many different times.
It was a sad reprise of the corner they got backed-into with the PowerPC.
IBM was not willing to put the effort into developing a low-power, fast, G5; and Motorola's semiconductor division was busy self-destructing from within; so Apple did two things:
1. Cast their lot with the only other viable game in town (Intel).
2. Purchased PASemi.
Time passes.
Now it's Intel that is too power-hungry, and too slow to improve.
Now what?
And the rest is history in the making!
Re: (Score:2)
IBM had no use internally for a lower-power POWER, or at least not enough of one to justify the cost. People who buy POWER systems aren't cross-shopping anything from Apple, especially after their killing-all-the-servers debacle.
Re: (Score:2)
Re: (Score:2)
Fascinating!
Re: This seems like the bare minimum (Score:2)
In the 486 days, I really doubt anybody gave a shit about wattage. That was barely in the days when you started seeing heat sinks on CPUs, and even then they were tiny ones. You didn't even start to see fans until the first Pentiums exceeded 75 MHz.
Re: (Score:2)
In the 486 days, I really doubt anybody gave a shit about wattage.
Acorn did (well that was maybe slightly before the 486 days). The reason being they were very very concerned about cost and the ARM had to be cheap. To get it to be cheap they designed it to be low power so that they could use cheap plastic packages, rather than the expensive ceramic ones which contemporary high performance CPUs had.
Re: (Score:2)
Re: (Score:2)
You didn't even start to see fans until the first Pentiums exceeded 75 MHz.
https://www.ebay.com/itm/23336... [ebay.com]
https://picclick.com/486-80486... [picclick.com]
Re: (Score:2)
So Intel's next generation is slightly more powerful than Apple's current line up
Calling the M1 Max a "current line up", while calling Intel's CPU to be released in 2 weeks "next gen" just smells of Apple fanboism.
Re: (Score:2)
"So Intel's next generation is slightly more powerful than Apple's current line up, Apple being a company that isn't a dedicated CPU manufacturer.
Gosh. Wowee. Well done Intel!"
Copied a comment not even a page up? I guess Apple fanbois all think alike. "Apple's current lineup", lol, for how long? You think anyone is going to be fooled by you implying that the M1 Max is an older Apple design? It is literally brand new.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
And that was done WITHOUT the benefit of HBM (Score:3)
With only DDR4 or DDR5. Nice!
Re: (Score:2)
Because the RAM is on the same package as the CPU and GPU. They can clock it higher and optimize the memory controller for particular chips.
The downsides are that you can't upgrade the RAM, and the GPU shares it. Their headline memory bandwidth figures are only valid with the GPU idle, and of course it's not GDDR RAM either.
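A back-of-the-envelope Python sketch of that shared-bandwidth point; the 400 GB/s headline figure is the one cited elsewhere in this thread, and the GPU shares are illustrative assumptions, not measurements.

HEADLINE_BW_GBS = 400.0  # headline unified-memory bandwidth (GB/s) cited in the thread

def cpu_bandwidth_left(gpu_share):
    # gpu_share: assumed fraction of bandwidth the GPU is consuming (illustrative only)
    return HEADLINE_BW_GBS * (1.0 - gpu_share)

for share in (0.0, 0.25, 0.5, 0.75):
    print(f"GPU using {share:.0%} of bandwidth -> ~{cpu_bandwidth_left(share):.0f} GB/s left for the CPU")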
Silly benchmark (Score:5, Insightful)
These results are an embarrassment for Intel... just look at performance per watt. The Apple CPU has the same performance as the Intel CPU but uses a lot less energy. That translates to increased battery life and a cooler laptop (literally). It also means that Apple could just add cores and beat Intel. It looks like x86 is F'd, so Intel needs a new CPU architecture ASAP... if not ARM or RISC-V, something else.
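As a rough illustration of the performance-per-watt comparison being argued for here: the multi-core scores below are the ones from the article, while the wattages are placeholder assumptions (real sustained package power varies by laptop and load), so treat the output as the shape of the argument, not a measurement.

# Multi-core Geekbench scores are from the article; the wattages are
# placeholder assumptions for illustration only, not measured figures.
chips = {
    "Core i9-12900HK": {"score": 13256, "watts": 90.0},  # assumed sustained package power
    "M1 Max":          {"score": 12753, "watts": 35.0},  # assumed sustained package power
}

for name, d in chips.items():
    print(f"{name}: {d['score'] / d['watts']:.0f} Geekbench points per assumed watt")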
Re:Silly benchmark (Score:5, Insightful)
Re: (Score:3, Insightful)
Ok.. so I'm supposed to buy their CPU because of that? Give them a participation trophy?
Re: (Score:2)
Re: (Score:2)
you get a pretty well gimped result if you purchase Apple unless you want to live in the Apple ecosystem.
Yes, this is why you buy Intel or AMD over Apple's silicon. That said, with so many cloud-based services these days it might actually work out in the long term. As long as your compute-intensive applications that have to be local are well supported on Apple silicon.
Re: (Score:2)
Intel has tried several times to put out an alternate architecture, like with the i960 and for that matter Itanic, and they have failed every time. They are not afraid to put out a non-x86 processor, they just aren't any good at it.
Re: (Score:2)
Intel is not "stuck with an ancient architecture", this is just what ignorant people say. They have an old ISA, that is a very different thing.
Re: (Score:2)
It looks like x86 is F'd,
How? It's not like Apple sell CPUs.
process, legacy (Score:2)
First, I'm assuming Intel's is a 10-nm part, which I understand is roughly equivalent to TSMC 7nm. I believe that Apple is producing a 5nm part. When Intel's designs start rolling out of TSMC's 3nm production line, the wattage questions may shift somewhat, but Apple and Intel will be on (more) equal physical footing.
Second, Intel has much more legacy hardware to support than Apple. Fujitsu doesn't support anything outside of AArch64 for their supercomputer, for example, but I am assuming that Apple retains
Re: (Score:2)
It looks like x86 is F'd
Based on the idea that shoppers shop based on energy efficiency alone? Honestly the M1 can come with free blowjobs and be 6 times faster, a lot of Windows people won't give a shit, and vice versa as well.
x86 is completely unaffected by Apple's M1. No one is going to change platforms for marginal efficiency gains when their laptops already comfortably get them through their 10-hour day. It's the laptop equivalent of range anxiety.
Re: (Score:2)
"t also means that Apple could just add cores and beat Intel."
It doesn't really mean that though, since Apple just got through doing just that.
"It looks like x86 is F'd, so Intel needs a new CPU architecture ASAP .. if not ARM or RISC V something else."
That sounds like a fair an honest appraisal from a dispassionate industry expert. Too bad Intel has never faced a challenge to its CPU architecture.
Imagine what the news would be like had Intel not actually won the benchmarks. You wouldn't have to make up n
See Intel (Score:2)
Seriously, the lack of serious competition in the processor space has held back advancement by 10-15 years.
Re: (Score:2)
It is not really competition I think. It is mindset.
Being conservative, or call it archaic. Why is anyone still doing their embedded stuff on 8088/8086-based hardware? Because they are masochists.
It can not be so hard to make a sane, orthogonal 8/16-bit mini RISC processor which is actually fun to program (if you need to do assembly).
But no. They shell out 50 year old designs, as "industry standard". Somehow - during the CORONA crisis - I have the impression the CPUs are viruses, and the factories are infecte
Wishing Intel the best! (Score:2)
Good CPU competition is not a bad thing for the rest of us...
Re: (Score:2)
Good CPU competition is not a bad thing for the rest of us...
But what does that have to do with Intel? And what is the performance like with inherent architectural security hole mitigations enabled?
Units please for the numbers. (Score:2)
Mandatory XKCD https://xkcd.com/833/ [xkcd.com]
Re: (Score:2)
Indistinguishable from a rigged demo? (Score:3)
Let's see how the actual released chips perform in systems at similar price points to the MBP.
Re: (Score:2)
It will be hard to find one that expensive, and if you do there will be another reason to bias the comparison in Apple's favor.
I counted... (Score:3)
...two "may" and a "could" on this page. I won't get excited until I see how it performs in real world tests. What kind of power will it draw? How much throttling is there when the laptop is unplugged? How loud are the fans, and how long do they run? How long does it take to export ProRes and how much battery is left afterwards? We "may" soon have answers.
Re: (Score:2)
Regardless of all that, Intel got the headline they were looking for, so I guess their marketing team did their job.
Re: (Score:2)
"I won't get excited until I see how it performs in real world tests. "
Did you say that about the M1 Max?
"How long does it take to export ProRes and how much battery is left afterwards?"
Maybe we can compare how long it takes to boot OS X too? There has to be some benchmark that the M1 Max wins on after all. Let's find it then insist it is the most important measure ever, in true fanboy fashion.
And the i9 does it at what wattage? (Score:2)
How's the power draw and pricing on that part?
How will it perform in a real system I wonder (Score:2)
Yes the chip by itself in the most optimal bench situation may be slightly ahead.
But what happens when you put real laptop hardware around that chip? Will it fare as well as the "raw" benchmark presented here, even connected to power?
To say nothing of performance on battery - not only will the Intel chip eat up way more power, shortening the battery life, but it will probably be outright slower on battery, unlike the M1 Max [yiningkarlli.com] which maintains full performance on battery or AC power for heavy loads.
Re: (Score:2)
Citations please. Oh yeah, you just make shit up.
I guess Intel has never "put real laptop hardware around a chip" before. If only they were as experienced as Apple at making processors that work in the real world.
I wonder if you would be so flippant about "raw" benchmark results a couple weeks ago when they favored Apple? Seems to me you weren't when the M1 was introduced.
"...which maintains full performance on battery or power, for heavy loads."
If that were true, Apple wouldn't make a special mode for "
performance per watt? (Score:2)
That's the important factor to consider, because, ultimately, that is one of the standout features of Apple silicon.
It has insane performance, using a fraction of the energy.
This requires less cooling and for mobile devices, uses less battery life.
But, sure, the other factor is fairly obvious - Apple silicon is for Apple hardware only - and I guess that's the main reason WHY the architecture works so well. Apple control absolutely every aspect of the products they deliver - the hardware AND the software.
Power draw (Score:2)
I would have liked to see comparison per watt as well.
And that isn't just about battery as some surmise. Some people want to draw less power in general for any given workload. It's good for the earth, the wallet and solar setups. :)
News at 11 - Fixed Function Hardware is Faster (Score:2)
Re: (Score:2)
The tests people have been running have not, by and large, been of the fixed-function areas (e.g. machine learning inference or media playback); they have been tests that stress workloads like compiling, memory operations, or matrix math. Areas that are still specialized, but are the building blocks of general computation.
This means in real-world usage (where those fixed-function blocks come into play) the new additions to the M1 family are going to do even better than the recent round of results.
Not just speed. (Score:2)
And of course, the fact that they lack the native ability to run iOS apps.
Performance on battery power... (Score:2)
Please unplug both laptops and rerun the tests. It's one thing to build a fast laptop that needs to be plugged in and gets too hot to sit on your lap -- it's another to build a fast laptop you can use on a park bench.
Intel? Who cares? (Score:2)
They are a has-been. May still take some time to become really obvious.
Re: (Score:2)
Re: (Score:2)
Remember when Apple used to focus?
They're overextending and they're about to get ganked by Intel and AMD.
Hmmm. The M1 team is overseen by a guy who spent a dozen years working in Intel's largest development center [wikipedia.org], in between a couple stints doing R&D and chip design at IBM, before joining Apple about a dozen years ago. Intel looked at him as a CEO candidate [axios.com] after Krzanich got grabby. And Intel's current CEO just told a Yahoo Finance summit [yahoo.com] that "his job" is to get Apple's business back.
But do go on.
Re: (Score:2)
Uh, because Intel hasn't released the power spectrum for that chip yet?
Re: (Score:3)
They didn't need a new generation to beat the M1 on absolute performance; the last-generation Tiger Lake already did that. The article is claiming that the leaked-for-months but unreleased Alder Lake is beating the M1 Max, which near as I can tell is only available in Apple products starting from $3300.
Given the M1 Max uses up to 120W, it's highly unlikely that the Intel part will draw 3x the power, but much of the power budget for Apple's chip is likely being "wasted" on pure CPU benchmarks given how many tra