
M2 Max Is Basically An M1 Ultra, and M2 Ultra Nearly Doubles the Performance (9to5mac.com)

The new Mac Studio started shipping to customers this week, giving product reviewers a chance to test Apple's "most capable chip ever." According to new benchmarks by YouTuber Luke Miani, the M2 Ultra features nearly double the GPU performance of last year's M1 Ultra, with notable performance improvements in other areas. 9to5Mac reports: While the M1 Max and M1 Ultra are blazing fast, the difference between the two wasn't as notable as some expected. In many tasks, the much cheaper M1 Max wasn't too far off from the top-end M1 Ultra variant, especially in video editing, photo editing, and 3D rendering. Despite the M1 Ultra literally being two M1 Max chips fused together, the performance was never doubled. For the M2 series, Apple has made some significant changes under the hood, especially in GPU scaling. In Luke's testing, he found that in some GPU-heavy applications, like Blender 3D and 3DMark, the M2 Ultra sometimes delivered precisely twice the performance of the M2 Max -- perfect GPU scaling! In Final Cut Pro exports, it nearly doubled again. He also found that the M2 Ultra doubled the GPU performance of the M1 Ultra in these same benchmarks -- a genuinely remarkable year-over-year upgrade.

The reason for the massive performance improvement is that Apple added a memory controller chip to the M2 generation that balances the load across all of the M2 Ultra's cores -- the M1 Ultra required the RAM to be maxed out before using all cores. The M1 Ultra was very good at doing many tasks simultaneously but struggled to do one task, such as benchmarking or rendering, faster than the M1 Max. With the M2 Ultra, because of this new memory controller, Apple can now achieve the same incredible performance without the memory buffer needing to be maxed out. It's important to note that some applications cannot take full advantage of the M2 Ultra, and in non-optimized applications you should not expect double the performance.
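For a concrete sense of what "perfect scaling" means here: divide the Ultra's benchmark score by the Max's and you should get close to 2.0. A minimal sketch of that arithmetic in Python, using made-up scores rather than the video's actual numbers:

```python
# Hypothetical benchmark scores -- illustrative only, not Luke Miani's measurements.
scores = {
    "Blender 3D":    {"M2 Max": 1000, "M2 Ultra": 2000},
    "3DMark":        {"M2 Max": 1200, "M2 Ultra": 2380},
    "Final Cut Pro": {"M2 Max":  900, "M2 Ultra": 1740},
}

for bench, s in scores.items():
    scaling = s["M2 Ultra"] / s["M2 Max"]
    # 2.0x would be perfect scaling: two fused Max dies, twice the throughput.
    print(f"{bench}: {scaling:.2f}x ({scaling / 2:.0%} scaling efficiency)")
```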

Despite this incredible efficiency and performance, the better deal might be the M2 Max. In Luke's testing, the M2 Max performed very similarly to, or outperformed, last year's M1 Ultra. In Blender, Final Cut Pro, 3DMark, and Rise of the Tomb Raider, the M2 Max consistently performed the same as or better than the M1 Ultra. Instead of finding an M1 Ultra on eBay, it might be best to save money and get the M2 Max if you're planning on doing tasks that heavily utilize the GPU. While the GPU performance is similar, the M1 Ultra still has the advantage of far more CPU cores and will outperform the M2 Max in CPU-heavy workloads.



  • How many times can I stuff "M1 Max", "M1 Ultra" and "M2 Ultra" into a single article? SEO Arms Race Has Left Google and the Web Drowning in Garbage Text [slashdot.org]. The irony is that this is bad SEO; Google will punish it.
  • No big deal (Score:4, Funny)

    by backslashdot ( 95548 ) on Friday June 16, 2023 @09:01PM (#63609438)

    My uncle, Sam, got me the MK ULTRA. It's so fast everything's a blur, and I never get a false answer.

  • Oh, then...

    Genius marketing, they're hardly going to issue an April First press release that says their product is 15% slower than the previous year's model.

  • As someone who is not into Macs (I use Linux and Windows), I am left bewildered by the "Max" and "Ultra" monikers. To my mind, "Max" is short for "Maximum," and to quote a dictionary definition I just found, "maximum" is an adjective meaning "as great, high, or intense as possible or permitted." So shouldn't "Max" be the model that cannot, by definition, be bested, at least in the current line-up? To me, a better hierarchy would be something like "core", "pro", "ultra", then "max". But then again, what would I
    • Ultra is pushing *beyond* the limits, from the warning to flat-earth sailors that there was nothing beyond the Straits of Gibraltar.

      https://en.wikipedia.org/wiki/... [wikipedia.org]

    • I am confused by Intel chips. Shouldn't an i3, i5, i7 or i9 have three, five, seven, or nine cores?

      Seriously, any normal person realises these are marketing names. There is the annoying thing that there is an M1 "nothing" and an M2 "nothing", so if you read "M1" you don't know if it means just the low-end chip or the family of four chips. Apart from that, they're marketing names and nothing to get excited about.
    • by Ed_1024 ( 744566 )

      Compared to most naming conventions, I think the Apple one makes sense in terms of product and linguistics. Ultra means beyond or on the other side of, so Ultra > Max is not hard to grasp: beyond the maximum. It is also what you get when you physically couple two Max chips, which are the most potent single pieces of silicon they make, to produce an Ultra. The Max is still as far as they can take it in one monolithic die.

      Anyway, Mx, Mx Pro, Mx Max and Mx Ultra is not the most difficult progression to remember.

  • Where do the M2 GPUs stand compared to Nvidia or Radeon? Do they reach RTX 40-series levels of performance? What about the power envelope?

    • Conveniently missing. It's easy to double your performance when you only have one core and double it to two. There's a reason why the Steam Deck and ROG Ally are pretty bad compared to their desktop or even high-end laptop counterparts
      • True, conveniently missing comparisons, but I don't think this is what Apple is going for. I suspect that they will go after the gaming market more like a console and not try to dwell on specs but rather output. Yes, we all know that Xbox and PlayStation HAVE specs, but we don't talk about that in regards to performance like we do with gaming PCs. I grabbed No Man's Sky when they released it for Apple silicon and Metal, and guess what, it's butter on an M2 Max on my MBP 16" at native resolution. It's probably s

    • There's no information out there, except that the different processors come with anywhere from 7 to 76 GPU cores, 100 to 800 GB/s of memory bandwidth, and up to 48 MB of L3 cache.
    • The Ultra should be around the speed of a 4070 Ti at around 60-70 watts.
  • Silly naming ("max", "ultra", what's next?) aside, a new generation matching the previous generation's higher tier is pretty usual. It's how the M2 can compete with the M1 Pro (it's faster in single-core, slower in heavily threaded [dev.to] workloads, which makes e.g. the new 15" M2 Air a possible substitute for the 14"/16" 2021 MacBook Pros).

    It's more impressive to me how the older M1 Ultra apparently wasted half its power in many scenarios. Like the name (going one further over something called "max"), it does seem like the whole design was an afterthought? And they got it right on the second iteration.

    • I suspect that Apple isn't planning on a 'what's next' in the naming. There will not be an M2 ultra max iridium whatever; the M2 Ultra is the permanent top of the line for this generation.

      Next year, the M3 Ultra will again be that top of the line, being two to four M3 Max chips or whatever, and the M3 Max will be the mid-tier and the M3 will be the standard.

      I wouldn't say any of this stuff is an afterthought, just a concession made to get the M1 out the door. I suspect that M2 is the bigger jump we'll see and M3 will be a

      • I don't think that is the case. M2 was a minor process improvement; M3 will be 3nm instead of 5nm.

        If you look at the differences between M1 and M2, M2 had a tiny bit more hardware in the same package (two more efficiency cores, three more GPU cores, another video encoder), and a little bit of power savings. Actually, either slightly less power at the same clock speed, or slightly more speed at slightly more power. So you got overall 10% or a bit more extra performance.

        And something seems to have happened with the
        • You're too hung up on specs and benchmarks. As an owner of an M1 MBP and Mini and an M2 MBP and Mini, it's not a tiny difference in use. Those improvements are a few points here or there on a benchmark, but they aggregate up in actual use.

          My M1 MBP was awesome because it was fast enough and had a great battery. My M2 MBP is the fastest laptop I've ever owned, and what I mean is that the time it takes to 'do' what I tell it to do is the least I've ever seen by a good margin. I open the screen and I'm in,

  • Slow computer, only fast in contrived situations. Don't buy the marketing hype, they are proprietary junk
    • Re: (Score:3, Informative)

      by Anonymous Coward

      Slow computer, only fast in contrived situations. Don't buy the marketing hype, they are proprietary junk

      I've got one (M2 Max) right here. For my purposes, development, it's wicked fast. And silent. And cool to the touch. I'm very happy with it.

        • A Raspberry Pi would do that just as well. In most cases, developing is just fancy text editing.

        I say this as a developer.

          • A Raspberry Pi would do that just as well. In most cases, developing is just fancy text editing.

          I say this as a developer.

          Maxing out all the cores right now (analyzing a chess position). CPU temps are steady at 63°C and the fans are at 28%, and I still can't hear them even though it's sitting on my desk. The case is slightly above room temperature. Current system power consumption is 55W.
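          For anyone who wants to reproduce that kind of all-core load, here's a minimal sketch (a pure-Python busy loop per logical core; a real chess engine stresses the silicon harder, but this will peg every core while you watch the temps and fans):

```python
import multiprocessing as mp

def burn(_):
    # Tight integer loop to keep one logical core fully busy for a while.
    total = 0
    for i in range(10**8):
        total += i
    return total

if __name__ == "__main__":
    # One worker per logical core; check a temperature/fan monitor while it runs.
    with mp.Pool(mp.cpu_count()) as pool:
        pool.map(burn, range(mp.cpu_count()))
```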

        • Re: Avoid (Score:5, Insightful)

          by Moridineas ( 213502 ) on Saturday June 17, 2023 @10:45AM (#63610370) Journal

          Isn't that a pretty limited definition of "developer"?

          Compiling? Machine learning? Graphics processing? Virtual machine testing? Video processing? Etc.

          I'm not a pure professional developer, but I do most of those things pretty regularly in addition to fancy text editing. There are many reasons developers might need more CPU power (I'm a vi guy, FWIW).

        • by berj ( 754323 )

          I have no doubt that I could write a fair portion of my code on a Pi: I do pretty much everything with Vim and Make. However, I like to be able to actually *run* the applications I'm developing for. Last I checked, I can't run Maya or Houdini or Nuke on a Pi. That means I need a workstation: Linux, or Windows, or Mac. Windows is right out for me (don't like the user experience, and it's not Unix, though I think that gap is closing over the years). And between Mac and Linux I'll choose a Mac every day of the week.

      • by KlomDark ( 6370 )
        Developing WHAT? Like shitty Angular pages, or some goofy Mac-only language?
        • Developing WHAT? Like shitty Angular pages, or some goofy Mac-only language?

          For someone as old as you are, why do you still carry such juvenile opinions?

    • “Contrived situations,” yeah; like who in the real world is even editing video, making models, coding, or doing audio and photo work.

      • by noodler ( 724788 )

        I agree that the uses referred to are practical, but make no mistake: the M2 Ultra is still significantly slower at rendering CGI than even last-gen Nvidia mobile GPUs.

        • Nvidia has the GPU-assisted-everything market sewn up tight. Just as the M1/M2 can render ProRes video faster than an Nvidia card, Nvidia can do OptiX faster than anyone else. Comparing performance in proprietary workflows is only useful as a warning: "if you work in X, then keep buying Y, because Y is the only thing that supports it fully."

    • While the fastest Intel and AMD Ryzen chips (e.g., i9s) are indeed faster, Apple's chips are absolutely competitive with them in the vast majority of workflows. Furthermore, where M-series chips really shine is in their performance per watt.

      If you disagree with this, I'd love to see what information you are reading, because pretty much everything I've read since the M1 was announced has been consistent (roughly "fast but not the fastest, and extremely energy/heat efficient").
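      "Performance per watt" is just a benchmark score divided by sustained power draw; a toy comparison with invented figures (neither the scores nor the wattages below are measurements):

```python
# (benchmark score, sustained package watts) -- invented figures for illustration.
chips = {
    "M2 Max":     (14000, 40),
    "Desktop i9": (18000, 250),
}

for name, (score, watts) in chips.items():
    print(f"{name}: {score / watts:.0f} points per watt")
```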

    • You seriously don't know what you are talking about.
  • Why would Ultra be better than Max? How is the Max chip Max?
    • I'm not defending the naming, because I also have to stop and think about which is which, but "ultra" means beyond the limit, beyond the expected. So in this context "max" would be the maximum within the limit, and ultra would be beyond that. So yeah, it's kind of silly, but it does make sense.

    • An Ultra is two Max chips connected. Literally. So Max = the biggest and bestest, Ultra = two of them.

