Apple Introduces M1 Pro and M1 Max (apple.com) 201
Apple today announced M1 Pro and M1 Max, its new chips for the Mac. Apple: M1 Pro and M1 Max introduce a system-on-a-chip (SoC) architecture to pro systems for the first time. The chips feature fast unified memory, industry-leading performance per watt, and incredible power efficiency, along with increased memory bandwidth and capacity. M1 Pro offers up to 200GB/s of memory bandwidth with support for up to 32GB of unified memory. M1 Max delivers up to 400GB/s of memory bandwidth -- 2x that of M1 Pro and nearly 6x that of M1 -- and support for up to 64GB of unified memory. And while the latest PC laptops top out at 16GB of graphics memory, having this huge amount of memory enables graphics-intensive workflows previously unimaginable on a notebook. The efficient architecture of M1 Pro and M1 Max means they deliver the same level of performance whether MacBook Pro is plugged in or using the battery. M1 Pro and M1 Max also feature enhanced media engines with dedicated ProRes accelerators specifically for pro video processing. M1 Pro and M1 Max are by far the most powerful chips Apple has ever built.
Utilizing the industry-leading 5-nanometer process technology, M1 Pro packs in 33.7 billion transistors, more than 2x the amount in M1. A new 10-core CPU, including eight high-performance cores and two high-efficiency cores, is up to 70 percent faster than M1, resulting in unbelievable pro CPU performance. Compared with the latest 8-core PC laptop chip, M1 Pro delivers up to 1.7x more CPU performance at the same power level and achieves the PC chip's peak performance using up to 70 percent less power. Even the most demanding tasks, like high-resolution photo editing, are handled with ease by M1 Pro. M1 Pro has an up-to-16-core GPU that is up to 2x faster than M1 and up to 7x faster than the integrated graphics on the latest 8-core PC laptop chip.1 Compared to a powerful discrete GPU for PC notebooks, M1 Pro delivers more performance while using up to 70 percent less power. And M1 Pro can be configured with up to 32GB of fast unified memory, with up to 200GB/s of memory bandwidth, enabling creatives like 3D artists and game developers to do more on the go than ever before.
M1 Max features the same powerful 10-core CPU as M1 Pro and adds a massive 32-core GPU for up to 4x faster graphics performance than M1. With 57 billion transistors -- 70 percent more than M1 Pro and 3.5x more than M1 -- M1 Max is the largest chip Apple has ever built. In addition, the GPU delivers performance comparable to a high-end GPU in a compact pro PC laptop while consuming up to 40 percent less power, and performance similar to that of the highest-end GPU in the largest PC laptops while using up to 100 watts less power.2 This means less heat is generated, fans run quietly and less often, and battery life is amazing in the new MacBook Pro. M1 Max transforms graphics-intensive workflows, including up to 13x faster complex timeline rendering in Final Cut Pro compared to the previous-generation 13-inch MacBook Pro. M1 Max also offers a higher-bandwidth on-chip fabric, and doubles the memory interface compared with M1 Pro for up to 400GB/s, or nearly 6x the memory bandwidth of M1. This allows M1 Max to be configured with up to 64GB of fast unified memory. With its unparalleled performance, M1 Max is the most powerful chip ever built for a pro notebook.
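A quick sanity check on those bandwidth multipliers (a minimal sketch in Python; the original M1's ~68.25GB/s is Apple's published spec, the other figures come straight from the summary above):

m1, m1_pro, m1_max = 68.25, 200.0, 400.0  # GB/s of unified memory bandwidth

# "2x that of M1 Pro and nearly 6x that of M1"
print(f"M1 Max vs M1 Pro: {m1_max / m1_pro:.1f}x")  # 2.0x
print(f"M1 Max vs M1:     {m1_max / m1:.1f}x")      # ~5.9x, i.e. "nearly 6x"
print(f"M1 Pro vs M1:     {m1_pro / m1:.1f}x")      # ~2.9x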
x86 is dead (Score:2, Funny)
Re: (Score:2)
RISC-V says hi.
Re: (Score:2)
And goes back to sulk in the corner; it's only allowed to return once it reaches a 2 GHz clock speed and is at least barely usable for running any kind of desktop app.
64GB max video + cpu is not mac pro level or even (Score:2)
64GB max combined video + CPU RAM is not Mac Pro level, or even Mac mini pro level.
Also, can it drive more than 2 displays (DisplayLink / USB-powered ones do not count)?
Re:x86 is dead (Score:5, Interesting)
Said like a true believer, if not just a fool.
Only a true believer would post something like you did. Others would see that neither Intel nor AMD _STILL_ have an answer even to the M1 when looking at it as a total price/performance/noise level/power draw/battery life package. If the claims of 1.7x M1 are true, things are getting outright embarrassing for x86.
Re:x86 is dead (Score:4, Insightful)
Low-end Ryzen's performance at the time the M1 launched was about double the Apple's. In machines at a similar price point to a MacBook, the gap was even larger. Of course, Ryzen continues to improve.
AMD has announced its strategy, as has Intel. Intel is adopting performance and efficiency cores, but that needs software support. AMD is just increasing the dynamic range to get both the efficiency and the performance.
The other thing to compare is upgradeability. Apple doesn't have any, RAM is integrated. That matters to some people, especially since Apple charges a lot for more RAM.
Re:x86 is dead (Score:5, Insightful)
Ryzens still have something like 2-3x the power draw at high performance levels. It's not that the Ryzens aren't good, it's that the statement:
...total price/performance/noise level/power draw/battery life package
still falls in Apple's favour here. The price is reasonable, the performance is great, and the computers are basically silent because the total power envelope is so low, so there's no need for big fans to cool them. And the laptops are physically cooler as well, which means you can actually use them on your lap.
It may be that AMD has something coming that matches this, but they don't have it YET. All of them have loud fans, run hot, and throttle when they run for a long time. Some don't even perform at their top level unless they're plugged in.
As always, the claims about the M1 aren't necessarily about the best performance, they're about performance per watt, or performance in a small package. It's not that CPUs don't exist that outperform the M1--Apple still sells some of those CPUs--it's that they pack a lot into a SoC that only draws something like 14W.
I've watched plenty of head-to-head comparisons of the M1 laptops vs. PC laptops, and while the M1 doesn't always win in benchmarks and large data processing, it's always straight up flattened everything else in heat, power, noise and battery life.
Re: (Score:2)
Total power draw is irrelevant, the only meaningful metrics are performance per Watt, where Ryzen was ahead of the M1 and I expect will be ahead of these new parts now, and maximum performance.
Performance per watt determines battery life, max performance determines how long you have to wait for it to do something.
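To make that distinction concrete, here's a toy model (all numbers are hypothetical placeholders, not measurements of any real chip):

battery_wh = 70.0        # hypothetical battery capacity, watt-hours
load_power_w = 14.0      # hypothetical average package power under load
throughput = 100.0       # hypothetical work units completed per second
job_size = 50_000        # hypothetical size of one job, in work units

# Performance per watt drives the first number; max performance drives the second.
print(f"battery life under load: {battery_wh / load_power_w:.1f} h")  # 5.0 h
print(f"wait time per job:       {job_size / throughput:.0f} s")      # 500 s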
Re: (Score:2)
ARM as well. The fastest Snapdragon seems to manage only about half the performance of an M1, from what I've read. It would be nice if some ARM (or RISC-V) CPU maker could at least get into the same ball game, especially if they could do a UEFI board.
University ASM classes switching to ARM (Score:2)
Said like a true believer, if not just a fool.
Only a true believer would post something like you did. Others would see that neither Intel nor AMD _STILL_ have an answer even to the M1 when looking at it as a total price/performance/noise level/power draw/battery life package. If the claims of 1.7x M1 are true, things are getting outright embarrassing for x86.
Look at the universities, many of their respective assembly language classes have switched to ARM.
The venerable Art of Assembly Language book which started out in x86 16-bit days and is now in its 64-bit incarnation, has a 64-bit ARM version under development.
https://randallhyde.com/Forum/... [randallhyde.com]
Re:x86 is dead (Score:5, Interesting)
Strange statement. I've been using open source software on my M1 Mac for a couple of months now, as well as writing code. Java and C/C++ all seem to work just fine, and the battery life is just bloody amazing. Best of all, the command prompt is actual BSD Unix, so I have a powerful processor and the CLI interface I enjoy the most and am the most productive in. I have my old Dell laptop if I need to do anything Windows-specific, but honestly, other than to use it for video conferencing, I rarely even turn it on anymore.
Re: (Score:2)
I'm in the same boat, and pretty happy with the M1 Mac all around. With Homebrew, pretty much any utility one wants is available, and if one does want to run full virtual machines, Docker can easily run Linux in a container, which will do OK for most things except GUI related stuff.
It will be interesting to see how well the M1 architecture does with security over the long haul. Apple did a decent job with their ARM platform, and it definitely runs rings around anything else ARM-based.
Re: (Score:3)
I've been using open source software on my M1 Mac for a couple of months now
Did you get accelerated graphics working on whatever FLOSS replacement for Mac OS X you're using?
Re: (Score:2)
He is using Mac OS X. Did you read his post?
Re: (Score:2)
He wrote "Open Source Software" not "Open Source Operating System", so I see no contradiction.
Or do you somehow believe OSS becomes not so if run on a proprietary operating system?
Re: (Score:2, Insightful)
1) Operating system is software. If I want to use open source software, I logically want to use an open source operating system. (In fact the *overwhelming* majority of the open source software I'm using is *already* a part of the system distribution!)
2) If someone is in prison in Norway, but gets weekends off, does that mean that he's not in prison?
Re: (Score:2)
At work I run PuTTY on (*gasp*) Windows. Do I not use open source software? What if I run PuTTY on an open source operating system, but on a CPU with closed source microcode? I find that you're drawing a somewhat arbitrary line for what it means to be a true user of open source.
Re:x86 is dead (Score:5, Insightful)
I find that you're drawing a somewhat arbitrary line for what it means to be a true user of open source.
Perhaps it's arbitrary, but I believe there is a case to be made that simple majority is a reasonable criterion here (just like when you say "I commute by bus", you probably don't mean that you take the bus once a month and drive your car the rest of the time). Running everything open source except for some small amount of firmware should qualify as using open source software, whereas running PuTTY on Windows most likely doesn't cut it, especially when you consider that Windows itself today probably includes orders of magnitude more open source code than PuTTY contains - Edge alone is millions of lines of FLOSS code, then there's at least WTL, .NET, and who knows what else is in the unholy mix known as MS Windows. I hope you realize that this would make the claim of "running open source software" meaningless, since EVERYONE could say it, so why bother with it?
Re: x86 is dead (Score:5, Funny)
You give the impression of a person who, on seeing a sign advertising 'hot girls', would complain bitterly on being thrown out for attempting to take their temperature.
Re:x86 is !dead (Score:3)
Re: (Score:2)
Decades you say?
That depends on if they implement SVE, which is a quite respectable instruction set and is proven for HPC.
Re: (Score:3)
Look, stop being a twat, OK? You made a mistake, taking my post out of context but now you're doubling down. Making a mistake is fine. Acting like a tosser, not so much.
The OP said this:
> The x86 are amazing chips and ARM is decades away from getting there for high end computing,
This is clearly not true since ARM is currently #1 in high end computing. That is not a disputable fact.
ARM itself is capable of high end computing. That's what I was addressing. ARM isn't decades away, it got there already. Whe
Re: (Score:2)
macOS is genuine, certified UNIX®.
case-sensitivity--so what.
Re: (Score:2)
Easily rectified by e.g. brew, but I hate how basic many of the BSD command line tools are compared to the GNU ones we get with Linux.
Re: (Score:2)
agreed, every time I use the macOS command line I feel like I'm trapped in 90s Linux.
Re: (Score:3)
I've been switching between BSD and Linux for 24 years. Most of my development these days is on back end stuff, with some web interface glue where needed. WSL, even WSL2 just isn't as good.
Re: (Score:2)
lol, how can this be modded "trolling"? definitely anything but fanboyism, suuuure ... X'D
thanks for the laughs :)
Re: (Score:3, Informative)
The performance has already been demonstrated, so I'm not sure what you think they need to prove here. These are the same cores used in the M1 models shipping today, but more of them, with higher clock speeds, and better cooling. Given the massive performance gains that many of us have been enjoying for a year already, his claims seem downright modest, and if you don't think they are too, you haven't been paying enough attention to this space.
Not for laptops (Score:3)
Graphics, while a huge boost compared to the crappy M1, are still a long, long way behind offerings available from Nvidia and AMD.
If you watched the presentation they compared it against "the most expensive gaming laptop" and the performance of the M1 Max was about equal in terms of GPU...
Except that the performance of the Apple laptop doesn't drop on battery only.
And it used I think 100w less power?
Yes there are higher end GPU's more powerful. Apple will get around to trouncing those next year with the M1 Ul
Re: (Score:3, Interesting)
There's no hardware raytracing support, it's all handled in "software" via compute shaders so the result is you end up with very poor performance for any cases where you want to do realtime raytracing. Now in games that's really just a quality thing and realistically most people won't bother for any fast-paced action games but for 3D content creation having a raytraced viewport is just the standard way of working these days.
So looking at the M1 versus the available mid/high end discrete GPU of the time (RTX
Yes raytracing an interesting point (Score:3, Interesting)
There's no hardware raytracing support
That is an interesting point about the raytracing. I wonder if not having specific hardware devoted to raytracing would be made up for at all by Metal optimizations for the GPU it contains, which may have something like hardware support for raytracing calculations we just don't know about yet...
Apple does have a guide [apple.com] on how to use Metal to accelerate ray tracing.
I have no reason to expect Apple has yet caught up to that level of GPU support though. Maybe in the deskto
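For anyone wondering what "raytracing in software via compute shaders" actually means: the intersection math runs as ordinary GPU ALU work instead of in dedicated units. A minimal sketch of that per-ray math, in plain Python standing in for a compute kernel (this is textbook ray/sphere intersection, not Apple's Metal API):

import math

def ray_sphere_hit(origin, direction, center, radius):
    # Solve |o + t*d - c|^2 = r^2 for t; a compute shader evaluates
    # exactly this kind of math per ray when there is no RT hardware.
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None                           # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2.0 * a)    # nearest hit distance
    return t if t > 0 else None

# One ray straight down -z against a unit sphere centered at z = -5:
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # 4.0

Dedicated RT hardware accelerates the surrounding BVH traversal and triangle tests; without it, all of that competes with shading for the same compute throughput.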
Re: (Score:2)
There's no hardware raytracing support, it's all handled in "software" via compute shaders so the result is you end up with very poor performance for any cases where you want to do realtime raytracing.
Citation?
IIRC, the Apple GPU in the iPhone has had hardware Ray Tracing for a while now; so, IMHO, there is no reason to believe the GPU in the Mx SoC would be any less well-endowed.
Re: (Score:2)
Sorry to Reply to my own Post; but I need to Retract the Parent.
I was mistaken. The M1 Series has no hardware Ray Tracing.
Sorry.
Re: (Score:2)
I already retracted my post and apologized.
Re: (Score:2)
You get your benchmarks from Apple presentations? Of course you do.
Since when do people who care about graphics performance operate on battery only? Oh yeah, since Apple wants to promote it.
"Yes there are higher end GPU's more powerful. Apple will get around to trouncing those next year with the M1 Ultimate Pro Max chip or whatever they choose to call the Mac Pro desktop chip."
Sounds objective. At least you aren't claiming the generation in the next 3 months, like happened here last time with the M1.
I gu
Re: (Score:2)
Since when do people who care about graphics performance operate on battery only? Oh yeah, since Apple wants to promote it.
Yes because all electronic devices should always be plugged in.
Re: (Score:2)
If you watched the presentation they compared it against "the most expensive gaming laptop" and the performance of the M1 Max was about equal in terms of GPU...
Do you have any idea what laptop they might have used?
Re:Not for laptops (Score:4, Insightful)
I believe they did actually say it was the most "powerful" laptop they could find.
They didn't say what it was and their graph's vertical scale is meaningless (it's "relative performance" and it looks like the M1 Max caps out at "375" while the "most powerful gaming laptop" caps at over "400") so who freaking knows what that means.
Plus, how are they even comparing things? Did they use a benchmark? What benchmark? Relative to what?
Who knows. They didn't give exact figures and they didn't say what they were comparing against.
It'll be interesting to see these things benchmarked when they ship. I expect that they really will have some impressive 3D performance, but none of that really matters because it's not like anyone uses Macs for anything anyway. (Even creative types have mostly moved over to Windows tools thanks to Apple Silicon breaking almost every piece of software professionals use. Sure, the pro-sumer stuff has been ported, but not the pro stuff.)
Re: (Score:3)
From elsewhere in the inter-tubes:
CPU comparisons:
MSI Prestige 14 EVO A11M-220 (i7-1185G7)
MSI GP66 Leopard 11UG-018 (i7-11800H)
GPU comparisons:
MSI GP66 Leopard 11UG-018 (Intel UHD Graphics Xe, 32 EUs)
Lenovo Legion 5 82JW0012US (RTX 3050 Ti 85W)
Razer Blade 15 Advanced (RZ09-0409CE53) (RTX 3080 95W)
MSI GE76 Raider 11UH-053 (RTX 3080 155W)
Re: (Score:2)
...and MSI run Linux out of the box (no fiddling around with drivers), well, mine does anyway. You still can't realistically run Linux on an M1, not any well-supported mainstream flavour, anyway. Plus, MSI, Lenovo, Razer, et al. all sell 17" laptops if that's important. M1s max out at 16".
I used to run Linux on an old MacBook Pro but I had to cover the back-lit Apple logo with foil & a Linux sticker so that I didn't get approached by Apple weirdos all the time. Seriously, it's like some kind of cult.
Re: (Score:2)
In the end you buy a computer because it suits your needs, not because you want to having a pissing contest.
If you want a mature and stable distribution of Linux, then you need to go with something in the Intel platform. If you have an M1 computer, then you'll get the Linux distribution that best runs there, knowing there is still a lot of work to be done.
At the same time, while probably still green around the ears, Ubuntu does have a target build for M1 Macs:
https://arstechnica.com/gadget... [arstechnica.com]
There is very little software that is Linux only (Score:2)
Re: (Score:2)
I'm not too fond of the whole OS X UI and the ecosystem lock-in. An Ubuntu build that works OOTB and provides the same incredible battery life would go a long way to making an M1 Mac an option for me.
Re: (Score:2)
macOS is a Mach microkernel pretending to be BSD Unix.
Re: (Score:2)
Because Linux is a better operating system with a better GUI.
Re: (Score:2)
From elsewhere in the inter-tubes:
CPU comparisons:
MSI Prestige 14 EVO A11M-220 (i7-1185G7)
MSI GP66 Leopard 11UG-018 (i7-11800H)
GPU comparisons:
MSI GP66 Leopard 11UG-018 (Intel UHD Graphics Xe, 32 EUs)
Lenovo Legion 5 82JW0012US (RTX 3050 Ti 85W)
Razer Blade 15 Advanced (RZ09-0409CE53) (RTX 3080 95W)
MSI GE76 Raider 11UH-053 (RTX 3080 155W)
You are saying those are the laptop configs Apple compared to?
If so, that's fairly impressive.
Re: (Score:2)
They didn't say what it was and their graph's vertical scale is meaningless (it's "relative performance" and it looks like the M1 Max caps out at "375" while the "most powerful gaming laptop" caps at over "400") so who freaking knows what that means.
Plus, how are they even comparing things? Did they use a benchmark? What benchmark? Relative to what?
Who knows. They didn't give exact figures and they didn't say what they were comparing against.
It'll be interesting to see these things benchmarked when they ship.
I think they did mention the Benchmark they used; but I will have to check.
And you're right: Now that the press embargoes have been lifted, and units soon to ship to the public, they will be benchmarked 32 ways to Sunday, to within a Nanometer of their lives!
Re: (Score:2)
With M1, the x86 apps running on Rosetta were already running as fast as on a fast Windows laptop. No doubt you'll have a hard time finding an x86 app running slower than on a Windows laptop with these CPUs.
That said, if you don't want to break anything, you're just stuck with x86 for your entire life.
I find that very sad.
Re: (Score:2)
... I expect that they really will have some impressive 3D performance, but none of that really matters because it's not like anyone uses Macs for anything anyway. (Even creative types have mostly moved over to Windows tools thanks to Apple Silicon breaking almost every piece of software professionals use. Sure, the pro-sumer stuff has been ported, but not the pro stuff.)
Wow! - now there's a bold statement.
Bold, but incredibly inaccurate.
So, in your world, it's just consumers and creatives that use a Mac ... because you don't.
And even though you don't use a Mac, you claim Apple Silicon has broken every piece of software professionals use?
So, go on, name them. I'm waiting.
Seriously, if this was the case, the creative industry would be up in ARMs (pun intended) - you'd never hear the last of it.
I've heard nothing.
Also, as both a creative and a coder, I've yet to find anything
Re: (Score:2)
If you watched the presentation they compared it against "the most expensive gaming laptop"
Why didn't they compare it against the most powerful gaming laptop?
They said "Most Powerful".
Re: x86 is dead (Score:2)
People are guessing from the Apple charts that the GPU performance will be comparable to a mobile GeForce 3060. It will be interesting to see if the actual benchmarks will meet those goals.
If it's true, it's basically the first APU in history where the integrated graphics do not suck!
Re: (Score:2)
If it's true, it's basically the first APU in history where the integrated graphics do not suck!
PS4/PS5/XB1/XSX would all disagree.
Re: (Score:2)
They're for 3D rendering and CAD applications (double precision performance).
If you're targeting the professional creative 3D art crowd or the engineers, their integrated graphics don't cut it. But if they support external graphics cards, that's fine.
Alternatively they could offer some remote SaaS rendering and computing farms. That can work for people with good internet connections for a final render. But it won't help with viewport performan
Re: (Score:2)
Because gamers don't use Apple. When that changes, game makers may support the platform.
Re: x86 is dead (Score:2, Insightful)
Re: (Score:2)
Why not play on Apple?
I just went to the Steam homepage (in Incognito mode, so no risk that it was somehow skewed by my own browsing habits) and looked at the "New and Trending", "Top Sellers," and "Popular Upcoming" tabs. Of the 30 games across those three top-10 lists, only 4 have Mac support. I bet your PlayStation supports more than 15% of popular console games.
I've used my Mac laptop for light gaming while away from home (Factorio!) but the current reality is that Windows is still a no brainer for PC gaming.
Re: (Score:2)
Except that gamers don't use Macs
Times change, my friend.
Re: (Score:2)
thanks for the update. then it's just a matter of apple providing a sensible usage license and other software vendors working on compilers and platforms. sounds good.
Re: (Score:3, Informative)
the dumber you are, the more you are willing to believe. See SuperKendall...and PCM2. It runs what it runs, and Apple fanboys will claim that anything it can't run is not "anything".
Re: (Score:2)
I can think of individual examples of stuff that runs funky (GIMP) but that's always going to be true for a while.
Re: (Score:2)
So in your view, until they have a computer that runs 100.00000000000% of the previous existing software, they should not release anything?
Good luck being stuck to x86 for the rest of your life. I find that perspective really sad, especially after an event like this one.
ffs (Score:2)
Hate to say it, but as an Android fanboy who utterly dislikes Apples products I'm a bit jealous.
Google has money, why aren't they doing this?
Re: (Score:2)
Google's CPU is launching with the Pixel 6... I think this month, certainly soon.
If you want a high performance laptop though, Ryzen is still king.
nice euphemism (Score:2)
Re:nice euphemism (Score:5, Interesting)
Technically it is shared memory, but there is a 'switched fabric' memory controller between the CPU and GPU.
What is the significance?
It means the GPU doesn't have to bother the CPU to fetch memory for it, thus slowing down the CPU and tying up the memory bus.
Right now, in most shared memory setups the CPU is the default memory controller. On Intel, for example, the only path to memory is through the DDR memory bus, which is connected to the CPU.
https://www.intel.com/content/... [intel.com]
In the M1 architecture, both the CPU and GPU can talk directly to memory through the switch (as far as I understand it), which acts as a memory controller.
The big downside is that nothing is upgradable, but it is fast!
Intel is currently looking at similar arrangements with other vendors where they can share silicon on the CPU die.
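A back-of-the-envelope model of why this layout helps workloads that bounce data between CPU and GPU (the PCIe number is the rough theoretical per-direction figure for a Gen4 x16 link; the working-set size is made up):

working_set_gb = 8.0      # hypothetical scene/texture data the GPU needs
pcie4_x16_gbps = 32.0     # ~theoretical per-direction bandwidth of PCIe 4.0 x16
unified_gbps = 400.0      # M1 Max unified memory bandwidth

# Discrete GPU: the data must cross PCIe into VRAM before work starts.
# Unified memory: CPU and GPU address the same pages, so no upload step.
print(f"discrete upload: {working_set_gb / pcie4_x16_gbps:.2f} s")  # 0.25 s
print("unified 'upload': 0 s (shared address space)")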
Re: (Score:3)
It still ties up the RAM when fetching data. Just not using the CPU's memory controller doesn't make much difference; the bottom line is that RAM is unavailable to the CPU while the GPU is using it.
Another issue is that the RAM is not GDDR, so it's not optimised for GPU use.
Gaming and workstation laptops have dedicated GPU RAM for these reasons.
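Worth putting numbers on that, though: in aggregate bandwidth the unified LPDDR5 setup is in the same league as mobile GDDR6 (the RTX figure below is the commonly cited spec for the 256-bit laptop RTX 3080; treat both as approximations):

m1_max_unified = 400.0     # GB/s, 512-bit LPDDR5, shared by CPU and GPU
rtx3080_laptop = 448.0     # GB/s, 256-bit GDDR6 @ 14 Gbps, GPU-only

print(f"M1 Max has {m1_max_unified / rtx3080_laptop:.0%} of the laptop 3080's bandwidth")
# -> ~89%, though the M1 Max pool also has to feed the CPU

The catch the parent describes is real: that 400GB/s is shared, whereas dedicated VRAM serves only the GPU.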
Re:nice euphemism (Score:4, Informative)
I've done comparisons between my M1 MacBook Air, a Windows AMD 3700X box, and a Linux quad-core i7. I can tell you without any BS: for many daily programming activities, the M1 is faster. For example, starting a ReactJS project on the M1 MacBook Air is usually 4x faster and more responsive. That unified memory makes a big difference. My AMD 3700X system has 64GB of memory and 8 full-power cores, yet the M1 with 4 performance cores outperforms it.
I love being able to replace or upgrade components on my windows workstation, but unified memory makes a real difference.
Re: nice euphemism (Score:2)
How do you know the unified memory has anything to do with it? Perhaps the M1 CPU itself is just fast.
Re: (Score:2)
Exactly, the single core speed appears to be the most likely explanation.
Re: (Score:2)
yes, that used to be a slam, but now Apple makes it so it is revolutionary
Re: (Score:2)
Assuming they're using "Unified" in the same way you see in game consoles, then yeah, it's a big difference.
Shared memory on your typical Intel integrated GPU means your GPU's memory bus is wired directly to the CPU. All GPU memory requests get sent to the CPU, then they get processed like any memory access from the CPU would, including going thru the CPU's cache.
On game consoles with unified memory, there's often multiple memory busses. You've got one bus that goes RAM -> CPU -> GPU, and another that
Kinda makes sense, but it is terrible (Score:2)
I am writing this on one of the weirdest collaborations ever designed: an Intel NUC with an embedded AMD GPU on the SoC. It has an i7-8809G, with an i7 core and a Radeon RX Vega combined on the same chip. It actually performs quite well, and can play many games at 1080p.
So I can see the appeal.
That being said, the Apple chip has many downsides for the public. First of all, lack of OS choice. Even though Linux is being hacked to run on their platform, many things do not work. Then lack of proper PCIe extensibility,
Re: (Score:2)
I dunno, if you go back to the pre "PC-compatible" days, the variety of hardware and unique takes on what an OS should be was astoundingly diverse compared to what we saw during the Wintel-dominated eras of "flexible standards". Sure, the Wintel days meant cheap interchangeable hardware, but that also meant a very generic definition of physical form factors and lowest-common-denominator interoperability between software and hardware.
Re: (Score:2)
Well, we had Linux, FreeBSD, ReactOS, FreeDOS, Haiku, MorphOS/AROS thanks to that open platform. And we had all kinds of weird form factors.
Today on ARM, unfortunately even running Linux requires many hacks, and the end results are not reliable. To this day, there is no single ARM workstation available on the market (yes, I know about Ampere, I said "available").
Re: (Score:2)
I'm talking about earlier than that - before Linux and the various free *NIX OS, before Compaq created the "IBM Compatible" there was an explosion of form factors and ideas about what an OS should be. Arguably the ARM space on mobile and hybrid tablet/laptop devices is the closest we're getting to recreating that golden age of unique hardware that depended on unique operating systems.
Re: (Score:3)
Well, we had Linux, FreeBSD, ReactOS, FreeDOS, Haiku, MorphOS/AROS thanks to that open platform. And we had all kinds of weird form factors.
Today on ARM, unfortunately even running Linux requires many hacks, and the end results are not reliable. To this day, there is no single ARM workstation available on the market (yes, I know about Ampere, I said "available").
I have to ask a serious question (Honestly not Trolling) :
Since macOS on Apple Silicon is already a certified Unix, who gives a flip if it runs fake Unix (Linux)?
https://www.opengroup.org/open... [opengroup.org]
I know macOS doesn't include some of the usual scripts that most Linuces include; but those are easily added for those who need them.
So why aren't we all just one big, happy Posix family?
Re: (Score:2)
Intel/AMD/Nvidia/Google/Microsoft/Samsung/others have serious resources and talent, especially if they decided to cooperate. Until now, they had no strong incentive to shake up profitable status quo. Apple now gave them a needed push, innovation over the next decade should be interesting to watch. For example, Apple's neglect of enterprise market leaves a big opening for an ultra-upgradable/compatible/customizable platform.
The posts here so far are pretty predictable (Score:2, Troll)
Let's take it up a notch, shall we?
Anandtech has great article on chips (Score:5, Insightful)
Even though they don't have access to the chips yet, Anandtech has a great article [anandtech.com] up going through what details they can surmise, and giving more details on the Intel chips Apple was comparing against in the presentation.
One astounding point - the M1 Max has 57 BILLION transistors, built on a 5nm process... also from the article: "AMD advertises 26.8bn transistors for the Navi 21 GPU design at 520mm² on TSMC's 7nm process". So wow.
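Putting those two quotes side by side as a density comparison (the M1 Max die area below is an outside estimate of roughly 432mm² that circulated after the announcement, not an Apple figure, so treat the result as approximate):

# Transistor density, transistors per mm^2 (M1 Max area is an estimate)
m1_max = 57e9 / 432.0     # TSMC 5nm  -> ~132 million / mm^2
navi21 = 26.8e9 / 520.0   # TSMC 7nm  -> ~52 million / mm^2
print(f"M1 Max: {m1_max / 1e6:.0f} M/mm^2, Navi 21: {navi21 / 1e6:.0f} M/mm^2")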
Hipsters (Score:2)
Apple, thanks for fucking enterprise users. (Score:2)
Re: (Score:2)
Well, it looks like if you keep having specific constraints such as upgradeable RAM and SSD, you will not follow along with Apple. Now, you could source two types of machines, the "regular one" and the "fast one", and swap machines when the user's needs change, instead of just adding RAM/SSD.
Your choice. But if you keep presenting the problem as "I want to add an SSD", stop asking yourself questions.
Re: (Score:2)
Have you done any long-term management of Macs? I've seen even the non-upgradable 2012 Retina MacBook Pros still ticking along 7-8 years later. Apple hardware is generally very long-lived, provided it isn't physically abused, and the initial investment in extra RAM/storage will pay off in a machine you don't have to replace for a long, long time.
Granted, we haven't had the ARM machine around long enough to know their eventual useful lifespan, but I have no particular reason to imagine that it will be much d
Re: (Score:2)
I have tight budgets and need to make machines last 3-5 years
If you're on a three year refresh cycle (of Apple products, no less) you don't have tight budgets.
Nice but expensive (Score:2)
So, so pricey (Score:2)
The 14" starts at $1,999. At that price, you're not going to have a lot of non-Mac users switch to Mac.
Re: 32GB? (Score:2)
Re: (Score:2)
I just hope the competition starts to ramp up soon. I'm still waiting on a Thinkpad with true all-day battery life... now an ARM version of the X1 Carbon or X1 Nano, optimized for Ubuntu out of the box - that would be pretty incredible.