
AMD Wants To Standardize the External GPU (arstechnica.com) 172

Soulskill writes: In a recent Facebook post, AMD's Robert Hallock hinted that the company is working on a standardized solution for external GPUs. When people are looking to buy laptops, they often want light, portable machines — but smaller devices often don't have the horsepower to effectively run games. Hallock says, "External GPUs are the answer. External GPUs with standardized connectors, cables, drivers, plug'n'play, OS support, etc." The article points out that the Thunderbolt 3 connector already (kinda) solves this problem, providing up to 40Gbps of bandwidth over a single connector. Still, I find external GPUs intriguing. I like the idea of having a light laptop when I'm moving around, but a capable one when I sit down at home to play a game. It'd also be nice to grab my desktop's GPU when I want to game on my laptop in the living room. Standardization may turn out to be important for GPU-makers if VR ends up taking off. The hardware requirements for those devices are fairly steep, and it'd facilitate adoption if graphics power were more easily expandable.

AMD Wants To Standardize the External GPU

  • Seems like trying to solve a problem that doesn't exist. The weight of a GPU chip and a couple of extra VRAM chips isn't going to break anyone's back. The extra weight on a "gaming laptop" usually comes from the extra battery capacity (to support the power-sucking GPU), and the fact that the screen itself is usually on the larger side. Plus whatever "bling" they put on to make the case look all cool... Any intelligently designed laptop is going to have a shared heat

    • by Anonymous Coward

      It's not weight but battery life and upgradeability that this would improve.

    • by Anonymous Coward

      There are a lot of solutions looking for problems, but this isn't one of them. The MacBook is fanless, for God's sake! That is so awesome. I want a fanless machine that I can plug into a graphics accelerator hooked up to my TV or monitor at home, where I can play AAA games. Why do I need a whole second computer just to house a graphics card or, God help me, an abominable "gaming laptop"?

      • Watch that CPU throttle to uselessness in the process of gaming.
      • by jedidiah ( 1196 )

        ...except quite often by the time you deliver something to do that, you might as well deliver an entirely separate machine to go with it.

        The GPU isn't the only limitation of Apple laptops. My main motivation for having a monster laptop myself is not the superior GPU, or more RAM, or a faster CPU. It's the drive bays. Only monster machines come with the amount of storage I want in a laptop.

        The storage needs of PC gaming will likely outstrip the netbook you're trying to plug into the external GPU.

      • You want to game on a MacBook?

        You'll need a real GPU.
        You're going to want a real keyboard and mouse.
        And a bigger and better display. Or multiple displays.
        You'll want real speakers (or headphones for the retards).
        Multiplayer? You need a good mic to talk to people without them hearing everything in your game looping back to them.

        At this point you've got so much shit on your desk hooked up to the laptop (docking station or not) that it's easier to just get a real desktop.
        You'll get much better CPU performance

    • by Anonymous Coward

      Have you ever taken a close look at a modern high-performance graphics card? Get a good feel for their size and weight. These things can pull hundreds of watts and need a large heatsink to dissipate said watts.

      Mobile GPUs don't come close to desktop GPUs because the power/heat budget is much larger for a discrete card in a well-ventilated case. Yes, power efficiency is constantly improving, but that just means desktop GPUs will keep cranking up performance for a given TDP; the power draw isn't going away.

    • by cfalcon ( 779563 )

      > The weight of a GPU chip and a couple of extra VRAM chips isn't going to break anyone's back.

      I disagree. Along with a GPU comes cooling requirements, and those definitely add both weight and size.

      An external GPU would let you:

      1)- Easily replace a wonky GPU, which is a gamble right now. "Al, lemme try your graphicbox. Ok, see, mine must be bad. I'll buy a new one."
      2)- Easily upgrade a GPU, which is *almost impossible* now. "Huh, the new card just went on sale. It's been a couple

      • Twenty years' experience tells me that an [easily replaced] external GPU isn't likely to help if the onboard GPU in your laptop goes out; you'll almost certainly need it functional just to boot up...
      • by tnk1 ( 899206 )

        It's an awesome idea that won't get traction because:

        a) It will increase the complexity of the design (which may or may not be a problem for expensive gaming laptops)
        b) It will increase the life of the laptop before a new one is purchased, and thus reduce return business and profit. Gaming laptops are a niche where you get more turnover since they have to follow the leading edge more closely.
        c) The external connector will definitely need the standardization, or the laptop might find itself compatible with

        • by bondsbw ( 888959 )

          b) It will increase the life of the laptop before a new one is purchased

          Not sure if I agree with this one. I get your point, but on the other hand it's a lot less expensive to upgrade from one integrated graphics laptop to another.

        • That depends a lot on your use cases. I will preface this by saying I'm part of the target market here because I have already spoken with my dollars and have a laptop with an external GPU box; specifically an Alienware 15 with the Graphics Amp.

          a- Increased complexity? Sure, and there's no doubt that there will be teething troubles with drivers. I know I had them early on because of effectively having three GPUs (the integrated Intel, the laptop's GTX 970M and the external GTX 980). However I think what AMD is aiming

    • Actually, I think an external GPU and power source is a fairly elegant setup. Rather than limiting the GPU's capabilities by trying to cram the card into the laptop format, they can use full desktop GPUs with the associated power supplies and just plug in where you need that power. Then you could have something that serves as both a nicely portable laptop and a gaming rig, without unnecessary duplication of CPU and RAM or having to manage two separate machines.

      • Only problem is you've still handicapped the CPU by "trying to cram the cards into the laptop format".

        Maybe they can put the CPU in the external enclosure too.

        • I'm not a gamer, but from what I've heard most modern CPUs are capable of handling just about any game out there. Even laptops come with multi-core processors and tens of GBs of RAM these days. The only limitation I see on the laptop format would be disk space. Maybe attaching a fast external storage array to the dock would be a useful add-on so you can keep the cost of the on-board SSD down?

          • +1 to this... I have no mod points or I'd give it. Other solutions include having a home server or NAS you can dump bulk data to for archival storage. This is what I do, and have Windows File History set up to back up to that NAS as well.

    • Heat is a problem that doesn't exist? Good, we can all put all that climate change hysteria to rest.
    • > Seems like trying to solve a problem that doesn't exist.

      Maybe not to you, but when I have a GTX 980 Ti in my Windows box and a (weak) GeForce 750M in the MacBook Pro, the ability to use an external GPU in a standardized way would be a godsend to us graphics / shader guys. I guess you never play around with ShaderToy [shadertoy.com] on a laptop.

      Anyways, you're missing the fundamental problem:

      GPUs in laptops suck (for high performance).

      I understand the heat + space + energy concerns but when you have to resort to

  • >> It'd also be nice to grab my desktop's GPU when I want to game on my laptop in the living room.

    Congratulations: you just invented the home graphics mainframe!
    • by Anonymous Coward

      >> It'd also be nice to grab my desktop's GPU when I want to game on my laptop in the living room.

      It's called streaming. All of the graphics players are trying to do this as we speak.

    • We had this technology at my university in the early 90s: X terminals. Log in to a computer running Ultrix or SunOS, and graphical programs would render locally on a network-connected screen attached to a keyboard and mouse.

  • by CajunArson ( 465943 ) on Friday March 04, 2016 @01:29PM (#51638125) Journal

    Intel has already done the heavy lifting by giving us the Thunderbolt standard, which exposes a 40Gbps (or more if you gang connectors) external interface that can transport PCIe to a GPU in a seamless manner.

    If AMD wants to work on making the enclosures, cooling, and power supplies more standardized to make plugging in a wide range of GPUs easy then that's great. If they get all NIH and think they can gin up some proprietary connector instead of just using Thunderbolt then you can forget about this entire announcement right now.

    • Re: (Score:2, Interesting)

      by arbiter1 ( 1204146 )
      To start using TB means, I guess, paying $ to Intel. On top of that, AMD is already 400 miles behind in a 500-mile race with TB as it is, so by making their own connection and pushing it as the standard they'd have a leg up versus Intel-based laptops. So go figure why AMD wants to make up their own connection at this point.
    • by PPalmgren ( 1009823 ) on Friday March 04, 2016 @02:48PM (#51638855)

      Of the companies I'd worry about making a proprietary connector, AMD isn't one of them. AMD tends to make open standards that can be used on either company's GPUs without licensing requirements or proprietary hardware. However, if they did make one, I fully expect that NVidia would not use it and would make their own implementation.

    • 1: External PCIe exists and has been around for ages. No one uses it.

      2: Thunderbolt doesn't transport PCIe, PCIe transports Thunderbolt which transports whateverthefuck (and gives everything DMA access because lol).

      2a: USB C is a physical connector that can be backed by USB 3.1, USB 3, Thunderbolt 3/2/1, etc. controllers, all of which run over PCIe.

      Thunderbolt is an expensive solution to a problem that doesn't really exist, developed in the hopes of hooking people into it for their really expensive opti

    • Not enough bandwidth. TB is four PCIe lanes; your graphics card uses 16. Trying to use TB would handicap the GPU.
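
      For rough numbers, here's a back-of-the-envelope sketch using PCIe 3.0 rates (note, too, that Thunderbolt 3's 40Gbps is shared with DisplayPort and USB traffic, so the usable PCIe share is lower still):

      ```python
      # Back-of-the-envelope PCIe 3.0 bandwidth: 8 GT/s per lane with
      # 128b/130b encoding leaves ~7.88 Gbps usable per lane, per direction.
      GBPS_PER_LANE = 8 * 128 / 130

      x4 = 4 * GBPS_PER_LANE    # ~31.5 Gbps: the four lanes Thunderbolt 3 tunnels
      x16 = 16 * GBPS_PER_LANE  # ~126 Gbps: a full-length desktop graphics slot

      print(f"PCIe 3.0 x4:  {x4:.1f} Gbps")
      print(f"PCIe 3.0 x16: {x16:.1f} Gbps")
      print(f"x4 link = {x4 / x16:.0%} of an x16 slot")
      ```

      How much that gap actually hurts depends on the workload; a game that keeps its textures resident in VRAM suffers far less than one streaming data across the bus every frame.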

  • by pecosdave ( 536896 ) on Friday March 04, 2016 @01:36PM (#51638235) Homepage Journal

    One of my users was on a big gray Mac Pro, with a fiber card to access the SAN and an AJA card that puts video on the preview/client preview monitors - it's a video card, but a really strange one that acts more as a codec than a traditional video card.

    When that machine became a crash-fest, I moved him over to a newer Mac Pro trashcan. That fiber card and AJA card can't be put in the trashcan, as it lacks PCIe slots. So I got this Magma Thunderbolt PCIe housing [amzn.com]. That AJA card works in there beautifully. I doubt the Quadro Pro from his old system would work in that thing (it might - I may have to experiment one day), but I have little doubt a budget GeForce card would work in there.

    I could totally plug my ThinkPad W540 into that box, and just about any of the newer MacBooks in the building, accomplishing what this article is all about.

    Still - intentional and standardized would be nice. Especially with all these Mac people in my building - it would be nice to have GPUs in the Thunderbolt monitors we have floating around. It could save us money when buying laptops if we didn't have to worry about which laptop went to whom, as long as the monitor could handle the job.

    • by Soulskill ( 1459 )

      That's really awesome. I hadn't even considered the IT deployment possibilities.

  • So what this sounds like to me is a standardized docking station.

    Just put a standard connector in a standard location that passes through the VESA Local Bus (or whatever newfangled thing is popular these days). Then have a docking station with a card slot, install a standard desktop video card, and you're all set. This lets AMD (and others) sell video cards to end users of laptops just like they have always done for desktops.

    Now where this could get really interesting is if they do this right, and create

  • Proprietary tech under Apple lock and key is not - and frankly should never be even proposed as - a solution to any question or challenge regarding PC design, especially when you're talking about specifying new standards: you might as well suggest using code SCO thinks it wrote while you're at it.

    No, the answer when specifying new standards is... to develop NEW standards the whole industry can use, and as the developing body, you get to benefit from leading the charge and being on top. Proprietary hardwar
  • Lately they're going for all these crazy niches and "next big things" that usually work out to either being a flop or, if it's big, nVidia just strolling in from behind with a product once the market is mature. Like an ITX-size 175W graphics card, and so on. Even when they "win", like with Mantle, nobody really cares until it becomes a standard like DirectX 12 or Vulkan. Like this, I'm sure AMD will spend a ton of money on the standardization effort, then nVidia will come and say "that's neat, here's Maxwel

    • Now consumers have mostly rejected it

      You say that as if it were, in fact, actually true. I really respect your willingness to demonstrate such a high level of "flexibility." ;)

      • by Kjella ( 173770 )

        Now consumers have mostly rejected it

        You say that as if it were, in fact, actually true. I really respect your willingness to demonstrate such a high level of "flexibility." ;)

        Vizio announces its first consumer 4K TVs, kills all 3D support [theverge.com]
        Sky drops 3D channel [advanced-television.com]
        BBC drops 3D programmes due to lack of interest [bit-tech.net]
        The End Of 3D? ESPN Drops 3D Channel [ipglab.com]
        DirecTV scales back 3D content due to lack of demand [digitaltrends.com]
        Poll: Is 3D TV dead? Do you care? [cnet.com]

        A quote from the last one:

        3D's biggest issue has always been lack of 3D movies and TV shows, however, and they're only getting more scarce. ESPN's highly hyped 3D channel quietly got put to rest two years ago. Many other 3D-only channels, like 3net, Xfinity 3D, Foxtel 3D, Sky 3D and more, are also gone.

        Some download services, like Vudu, still offer 3D, but the total number of 3D Blu-ray movies has dropped off significantly. They peaked in 2013 at 77, up from 66 and 68 the two years previous. Last year? 44, and only 22 so far this year. There will certainly be more in the second half, but I doubt we'll break 40.

        Maybe you liked it; I'm not going to argue with personal taste. But it's barely been mentioned as a feature for a couple of years now, there are no plans for 4K 3D in the new Blu-ray standard, and nobody really seems to care. It works for most

    • by armanox ( 826486 )

      I've got high hopes for Zen when it comes out personally.

      • I've got high hopes for Zen when it comes out personally.

        I have a thin thread of hope for Zen this year. Having any more than that seems excessive.

  • Thunderbolt 3 is fierce [thunderbol...nology.net] and could do it. The issue is always the market, even with standardization.
    Meanwhile we have morons like Palmer Luckey attacking Apple [theverge.com], which is basically the kingmaker in pushing modular, externalized resources like Thunderbolt 3 / USB-C to market.

  • I see lots of people here commenting and bitching that this is a horrible idea. I, however, am apparently the target audience for this very device.

    Right now I'm typing this up on my tiny little 10 inch netbook. I travel around the country very frequently with this thing for casual browsing from hotel to hotel. However, when I'm at the office, I have a full keyboard, mouse, and 22" monitor hooked up to this thing. Am I carrying a bulky monitor around the country? Nope. But when I'm in the office and d

  • Having one of these would be great for training / running your own personal neural network. Instead of beaming all of your data to a 3rd party, you have the work done locally (or on a series of GPUs, even...)
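
    The "local GPU if present" part is already trivial in most frameworks; here's a minimal PyTorch-style sketch (assuming the torch package, and that an eGPU enclosure shows up to the driver as an ordinary CUDA device):

    ```python
    import torch

    # Use whatever GPU the driver exposes (an eGPU, once the drivers
    # cooperate, is just another CUDA device); otherwise fall back to CPU.
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    model = torch.nn.Linear(128, 10).to(device)  # toy model for illustration
    batch = torch.randn(32, 128, device=device)  # data never leaves the machine
    print(f"forward pass ran on: {model(batch).device}")
    ```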

  • AMD promoting a specialized connector for a third-party GPU reminds me of the short-lived VESA Local Bus connector in the early 1990s. It became unnecessary as soon as a general-purpose expansion bus (PCI) that was fast enough to support gaming GPUs became available.

    With the arrival of Thunderbolt 3, it looks like AMD's idea is pretty much dead on arrival.

  • I'm sure it's been considered, but at least from a programming perspective I'd be more concerned about latency on the port as regards the ability to push realtime, high-framerate graphics through it. When I was doing CUDA programming, the most difficult (that is, time-consuming) part was getting data from main memory to the graphics card. Would the Thunderbolt interface be as fast at shuttling data from main memory to an external graphics card? 40Gbps is great and all, but is the latency low enoug
    • AFAIK, current solutions for external GPUs are simply different electromechanical formats for PCI Express, so this shouldn't be an issue. (A rough transfer-cost sketch follows below.)
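
      A crude way to reason about the parent's latency question is the usual "time = latency + bytes / bandwidth" model of a host-to-device copy; here's a sketch with illustrative (not measured) numbers:

      ```python
      # Cost of a host-to-GPU copy, modeled as: latency + bytes / bandwidth.
      def transfer_ms(megabytes, latency_us, gbps):
          seconds = latency_us * 1e-6 + (megabytes * 8e6) / (gbps * 1e9)
          return seconds * 1e3

      # Hypothetical figures: an internal x16 slot vs. a 40 Gbps external
      # link (only ~32 Gbps of which is PCIe) with extra controller latency.
      for label, lat_us, gbps in [("PCIe 3.0 x16", 1, 126), ("external link", 5, 32)]:
          for mb in (1, 64):
              print(f"{label}: {mb:>3} MB -> {transfer_ms(mb, lat_us, gbps):6.2f} ms")
      ```

      For large copies, bandwidth dominates and the extra latency is noise; it's the many-small-transfers pattern common in naive CUDA code where an added hop would sting.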
  • There are a lot of people out there with laptops, All-in-Ones, and small form factor desktops who are stuck with crummy integrated graphics. They have no way to add a bigger power supply or a giant two-slot PCI-E graphics card, so a solution like this would be a godsend to them! Plug it in when you want to play PC games, and leave it disconnected when you want to be portable.

    So, where do I buy one?

  • You want to make it a standard?
    Don't encumber it with patents.

  • ...AMD can't manage driver support for the life of the laptop.
