Apple Partnered With Blackmagic On An External GPU For MacBooks (techcrunch.com)

Apple has worked with cinema company Blackmagic on an external GPU based around an AMD Radeon Pro 580 graphics card with 8GB of DDR5 RAM. The Blackmagic eGPU features "an HDMI port, four USB 3.1s and three Thunderbolt 3s, the latter of which makes it unique among these peripherals," reports TechCrunch. From the report: The company says the on-board cooling system operates pretty quietly, which should fit nicely alongside those new, quieter MacBook keyboards. Many developers will no doubt prefer to configure their own, but for those who want an easier solution for playing resource-intensive games or graphics rendering with a MacBook, this is a fairly simple solution. The [$699] eGPU is available now through Apple's retail channels.

  • Why would you try to game on a Mac?

    Serious question. I do a lot of dev on a Mac but my personal gaming rig is still a PC with a high-end internal video card (in a separate room to cut down fan noise, etc.).
    • by omnichad ( 1198475 ) on Friday July 13, 2018 @10:14AM (#56940668) Homepage

      Blackmagic makes hardware for video editing. I guess Apple is trying to keep FCP X relevant even while they hobble their actual hardware. This seems to be proof that the Mac Pro really is going to get killed off in favor of a laptop/all-in-one with an eGPU.

      • Blackmagic makes hardware for video editing. I guess Apple is trying to keep FCP X relevant even while they hobble their actual hardware. This seems to be proof that the Mac Pro really is going to get killed off in favor of a laptop/all-in-one with an eGPU.

        Or that the eGPU is part of the "Modular" approach for the new Mac Pro hinted at by Uncle Craig.

      • by tlhIngan ( 30335 )

        Blackmagic makes hardware for video editing. I guess Apple is trying to keep FCP X relevant even while they hobble their actual hardware. This seems to be proof that the Mac Pro really is going to get killed off in favor of a laptop/all-in-one with an eGPU.

        The Mac Pro and the Mac Mini are the worst-selling Macs in the entire lineup. And not because they are completely outdated, either - they have historically been bad sellers even when they were first released. People just didn't want them. Even when the

        • you take your laptop around to the shoots, and then dock it at home to the eGPU and do your editing there

          Hollywood still uses FCP - not talking about hobbyists. And they have Fibre Channel RAID arrays for storage. Huge control surfaces for editing. You don't pick up and move around that workstation.

          • by Anonymous Coward

            I see more Avid in the actual industry than FCPX, but FCP was the darling of all the indies. There was a brief exodus to Premiere after the X launch debacle, but most (and many more) have returned as FCPX has returned to basic feature parity. I'm a Premiere user, but I'm making the switch to FCPX once I finally make the jump to Mac, mostly because of the subscription. I used to be all gung ho about the subscription (always have the latest version/features, big suite to use), but realized I haven't made a film in o

            • Oh, sure - Avid was entrenched long before Apple released anything. I'm an FCP 7 user myself, but I haven't touched it often enough to ever want to upgrade. Keeping a Hackintosh around on Sierra because it's broken in later OS releases.

      • by mikael ( 484 )

        They ran a cloud computing demo some time ago where a video artist could do some editing on one frame, press a button to apply it to all frames, and then the cloud server would do that in real time, allowing the video to be streamed straight back to the Apple machine. Having one of these would let someone run a personal cloud server.

    • Why Not? (Score:4, Interesting)

      by SuperKendall ( 25149 ) on Friday July 13, 2018 @10:16AM (#56940674)

      Why would you not?

      Some Macs (iMac Pro) have powerful video cards now. And having one computer beats having to own and maintain two...

      Beyond that, I was scarred for life trying to keep a gaming PC running Windows operational for many years. A possible slight drop in performance is worth it for my sanity.

      • Why would you not?

        Because:
        a) There are no Mac games available that would require that thing, and
        b) No company is going to develop them because not enough people will ever own one of those to make it worthwhile.

        • a) There are no Mac games available that would require that thing, and

          There are quite a few. For the few that are not on the Mac, you can at least use Boot Camp if you are desperate - you are still configuring Windows, but at least you are not dealing with flaky, barely compatible hardware.

          Also, Wine/Steam may be a possibility; I haven't tried it, as I've not needed to...

          b) No company is going to develop them because not enough people will ever own one of those to make it worthwhile.

          Since most of them are you seem p

          • Have you actually used a PC? (As far as compatibility goes.)

            As for what to use, getting an Apple laptop is like twice the cost of a Dell, and this external GPU is like twice the cost of a graphics card too.

            • Have you actually used a PC? (As far as compatibility goes.)

              I used them for many years which is why I do not any longer.

              As for what to use, getting an Apple laptop is like twice the cost of a Dell

              I am happy to buy laptops for 1.5x the price that last 5x as long, are better built, and require less fuckery with the OS. My 17" MacBook Pro from 2010 is still in heavy daily use, for example (and can still run the latest OS). Because the case is solid it still looks fantastic, except of course for a scuff in the corner

              • by aliquis ( 678370 )

                My MacBook Pro cost at least 50% more than a Dell and came with half the VRAM and a lower-resolution screen.
                It kinda worked for a year, but even though the 8600M GT was capable of decoding H.264 video, Apple's software wasn't (or whatever), and when they did add Flash acceleration they only did so for the 9600M GT, so it ran at like 70 degrees all the time and the battery wore out very quickly; it wasn't all that useful then. I used it in the home with the power cord connected and moved it around and as

    • by AmazingRuss ( 555076 ) on Friday July 13, 2018 @10:21AM (#56940718)

      Because my work is done on a Mac, I have a MacBook. It dual-boots Windows and has a GTX 1080 in an eGPU. Works great for regular games and VR. It's a nice-looking setup too. Only one wire into the MacBook.

      • Because my work is done on a Mac, I have a MacBook. It dual-boots Windows and has a GTX 1080 in an eGPU. Works great for regular games and VR. It's a nice-looking setup too. Only one wire into the MacBook.

        There you go!

        You're truly embracing the future of Macdom!!!

      • Only one wire into the MacBook.

        Funny how mac guys clearly brags the one expansion/accessories port they have is totally useless now.

        It's like "You shouldn't game on a Mac anyway!" all over again.

        • I don't even know what you tried to say.

          MacBook Pro has always had more than one USB-C port. Plugging this in would take one. You still have more. And the first person that said anything remotely close to "you shouldn't game on a Mac" is you. There was one other guy talking about how there are no games on Mac, which is false, and completely forgets about the ability to install Windows as a dual-boot OS anyway.

          Stop building straw men and knocking the hell out of them.

          • by aliquis ( 678370 )

            I wanted to say that Macs only have USB-C ports nowadays, and since the world hasn't really switched to it beyond the charging and data port on phones and tablets, I could understand how they were all unused :D ... well, except for adapters which let you hook up your gear ;D

            It was just a joke about the one cable and having only USB Type-C ports.

            Then again, a bunch of motherboards have front USB 3.1 Gen 2 / Type-C headers on the board but very few PC cases actually have them, so... You kinda get 1 on the motherboard ther

            • You would be surprised. I'm typing this on a Dell XPS 15 that has two USB-C / Thunderbolt 3 ports, which I use to plug in a Thunderbolt dock that keeps two displays, keyboard, mouse attached. One cable when I walk into my home office, everything fires up. Works with Linux too, as long as you can get past the abomination that is X.org support for HiDPI displays.

              Apple did fuck up when they decided that nobody needed any of the perfectly fine legacy ports though, which would still fit (USB-A, HDMI, Mini Dis

      • by Khyber ( 864651 )

        "Only one wire into the macbook."

        And how many going into or out of the eGPU? Last I checked, you needed an external monitor for Mac eGPU usage if you wanted any actual decent performance. Oh, so let's add how many additional cords going to and from said monitor?

        One cable, hah!

        • So your argument is that this is a useless product because you would have to plug a single DisplayPort cable into it from a display, and never do anything else with it?

          Really? That sounds like a fucking useability shit show. Oh my god. You have to plug a display into it. THE HORROR.

          Never mind that if you wanted to use that display with literally any laptop ever, you would have to either plug it into a docking station, and plug the docking station into the laptop (exactly the same as this), or plug it in

          • by Khyber ( 864651 )

            You obviously aren't FUCKING READING you brain-damaged fucktard.

            YOU CAN *NOT* GET ****PERFORMANCE**** WITHOUT AN EXTERNAL MONITOR ATTACHED. LOOPBACK VIA THUNDERBOLT DEGRADES PERFORMANCE.

            You RETARD. Learn how to read and comprehend you fucking middle-school failure.

            • You obviously aren't FUCKING READING you brain-damaged fucktard.

              Jesus man, take it easy. You're going to give yourself a stroke.

              You RETARD. Learn how to read and comprehend you fucking middle-school failure.

              Look, go outside. Take a minute to reflect on whether it's really worth it to get that worked up over a guy's comment on Slashdot. When you're done with that, I'll buy you a beer and we can enjoy the rest of the day.

      • I previously tried the eGPU developer kit, but rarely used it since it wouldn't connect to my 5K monitor and was noisy.

        I wanted something I could leave connected all the time and still have my MBP just connect up to everything with 1 cable.

        I'm pretty happy with it: it made my MBP much faster for 3D, it works with my 5K monitor and it's very quiet. If anything, it actually makes my overall setup a little quieter since the Mac's fans no longer spin up due to the 5K monitor load.

        I have a PC I use for VR which

    • Serious question. I do a lot of dev on a Mac but my personal gaming rig is still a PC with a high-end internal video card (in a separate room to cut down fan noise, etc.).

      Other, more basic question: If you're going to buy that thing, then why not buy a console instead? For less money.

      Bonus: You'll have a much wider selection of games to play, too.

    • by AHuxley ( 892839 )
      So the creative music and graphics people can get a feel for parts of the new game as it's getting made, on their own Macs.
      When it's all ready, the resulting game will be great on Windows 10 with any good consumer GPU.
    • by 605dave ( 722736 )

      I'm actually excited for this tech because I play around with 3D modeling and animation. Many renderers are available only for Nvidia solutions, which are now going to be able to support the Mac. Additional video cards could come in handy for video as well as scientific tasks. Gaming is just a side benefit for me.

    • OS X for work, Windows dual-boot for play.

      A Windows laptop cannot (legally) do the first bit.

  • by volodymyrbiryuk ( 4780959 ) on Friday July 13, 2018 @10:14AM (#56940666)

    for playing resource-intensive games or graphics rendering with a MacBook, this is a fairly simple solution

    The simple solution is to just use cloud-based GPUs.

    • I don't think there are too many people who would buy this for gaming. For the $700 price on this thing, you could almost build your own gaming PC with comparable specs. I'd say that this mainly looks to be for people who do video production and want to be able to have additional performance when at their desk. They just tack on the extra bit about gaming in the hopes that people assign more value to the product even though it's not something that they'll really use it for. The number of people who would bu
    • Comment removed based on user account deletion
  • So is this for external monitors only, I assume? It's not like you could just plug a Thunderbolt cable into your MacBook and have the graphics capability on the laptop display.

    • Yep.

    • "For best results with applications like 3D games, set a display that's attached to the eGPU as the primary system display"
      https://support.apple.com/en-u... [apple.com]

      Figured. But you wouldn't need an external monitor for GPU-accelerated applications, like Photoshop or Final Cut.

      • "For best results with applications like 3D games, set a display that's attached to the eGPU as the primary system display"
        https://support.apple.com/en-u... [apple.com]

        Figured. But you wouldn't need an external monitor for GPU-accelerated applications, like Photoshop or Final Cut.

        From what I understand, it will "loop back" video to the Mac's internal display, as well.

    • I don't know specifically about the linked eGPU, but it is possible to use other eGPUs on the internal display. The trade-off is that it's a pain in the ass to set up and you do lose a little bit of graphics power, as you're sending video data back across the same Thunderbolt 3 link.
    • You could, if there were some tricky drivers built for it. This is essentially how Nvidia Optimus works in equipped notebooks - the discrete GPU is only used for frame rendering, and the frame is handed off to the Intel GPU that draws it to the display. I'm sure they have patented the living balls out of it and wouldn't hesitate to fire off many lawsuits at Apple if they tried to recreate it using AMD GPUs, but it could be technically possible. (A minimal Metal sketch of how an app picks an eGPU appears after the thread.)

  • Blackmagic also charges high prices for their gear, just as Apple does. Need an HDMI to USB 3 capture device? Blackmagic's is $300. A generic company's is $50.
  • They now require a giant box on the side. I still repeat my suggestion for a fat MacBook.
    • Yeah, because Thunderbolt-attached external GPUs are brand new, and this is the first one on the market. No wait, they've been around for years, practically since Thunderbolt existed.

  • Cool, another innovation for Apple!

    First they invent the MP3 player
    then the smartphone
    then an app store
    then the tablet
    and now external GPUs??

    Real men of genius indeed.

    • Any sufficiently advanced sarcasm is indistinguishable from lying.

      Apple didn't invent any of the things on your list, but they did market them as "innovations".
    • by AHuxley ( 892839 )
      It's like the days of the Radio Shack Expansion Interface for the TRS-80.
      A new box next to the new computer to do more computer things with.
    • It takes courage not to bother putting a GPU in your unupgradable computer.

      Next step is obviously making it wireless.
      You haven't lived until you've had 40Gbps piped through the air near your head and nether regions.

  • by Anonymous Coward

    GDDR5 != DDR5

  • It's pretty cool and all, but not really new.

    https://www.dell.com/en-us/sho... [dell.com]

    I've actually played with external PCI Express boxes that connect over Thunderbolt 2 and that you can put graphics cards in - granted, not at this same performance level.

  • Apple is really lagging as a VR development platform, which needs a substantial video card. This eGPU will allow Apple to finally get into the VR development world. You can't get the following specs in a skinny laptop with a sad little fan and butterfly keyboard:

    Newegg Recommended VR PC Specs:

    i5-6500 or Greater CPU
    NVIDIA GTX 980 or AMD R9 390 GPU or greater
    16GB+ RAM
    SSD (PCIe NVMe recommended)
    Check out our Newegg approved VR systems

    Official Oculus Rift Recommended Specs:

    Intel i5-4590 equivalent or greater
    NVI

  • Not sure what led TechCrunch to report otherwise, but there are only two Thunderbolt ports... one of which is used to connect to the host Mac, so it merely allows you to daisy-chain other Thunderbolt devices.
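
On the internal-display question a few comments up: macOS exposes every attached GPU, including a Thunderbolt eGPU like this one, through Metal, and an app can opt into whichever device it prefers. The sketch below is only an illustration built from standard Metal calls (macOS 10.13 or later), not anything taken from Blackmagic or from Apple's eGPU documentation; it lists the visible GPUs and prefers a removable (external) one when present.

    import Metal

    // Enumerate every GPU macOS can see; a Thunderbolt eGPU reports itself as removable.
    let devices = MTLCopyAllDevices()

    for device in devices {
        print("\(device.name)  removable: \(device.isRemovable)  low-power: \(device.isLowPower)")
    }

    // Prefer an eGPU when one is attached, otherwise fall back to the default GPU.
    let preferred = devices.first(where: { $0.isRemovable }) ?? MTLCreateSystemDefaultDevice()

    if let gpu = preferred {
        print("Rendering on: \(gpu.name)")
    }

Whether frames rendered this way end up on a display plugged into the eGPU or loop back over Thunderbolt to the laptop's own panel is exactly the trade-off the commenters describe.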
