AMD Wants To Standardize the External GPU (arstechnica.com) 172
Soulskill writes: In a recent Facebook post, AMD's Robert Hallock hinted that the company is working on a standardized solution for external GPUs. When people are looking to buy laptops, they often want light, portable machines — but smaller devices often don't have the horsepower to effectively run games. Hallock says, "External GPUs are the answer. External GPUs with standardized connectors, cables, drivers, plug'n'play, OS support, etc." The article points out that the Thunderbolt 3 connector already (kinda) solves this problem, providing up to 40Gbps of bandwidth over a single connector. Still, I find external GPUs intriguing. I like the idea of having a light laptop when I'm moving around, but a capable one when I sit down at home to play a game. It'd also be nice to grab my desktop's GPU when I want to game on my laptop in the living room. Standardization may turn out to be important for GPU-makers if VR ends up taking off. The hardware requirements for those devices are fairly steep, and it'd facilitate adoption if graphics power were more easily expandable.
Solution looking for a problem? (Score:1)
Seems like trying to solve a problem that doesn't exist. The weight of a GPU chip and a couple of extra VRAM chips isn't going to break anyone's back. The extra weight on a "gaming laptop" usually comes from the extra battery capacity (to support the power-sucking GPU), and the fact that the screen itself is usually on the larger side. Plus whatever "bling" they put on to make the case look all cool... Any intelligently designed laptop is going to have a shared heatsink
Re: (Score:1)
It's not weight but battery life and upgradeability that this would improve.
Re: (Score:1)
There are a lot of solutions looking for problems, but this isn't one of them. The MacBook is fanless, for God's sake! That is so awesome. I want a fanless machine that I can plug into a graphics accelerator hooked up to my TV or monitor at home, where I can play AAA games. Why do I need a whole second computer just to house a graphics card, or God help me, an abominable "gaming laptop"?
Re: (Score:2)
...except quite often by the time you deliver something to do that, you might as well deliver an entirely separate machine to go with it.
The GPU isn't the only limitation of Apple laptops. My main motivation for having a monster laptop myself is not the superior GPU, or more RAM, or a faster CPU. It's the drive bays. Only monster machines come with the amount of storage I want in a laptop.
PC gaming's storage demands will likely outstrip the netbook you're trying to plug into the external GPU.
Re: (Score:3)
You want to game on a MacBook?
You'll need a real GPU.
You're going to want a real keyboard and mouse.
And a bigger and better display. Or multiple displays.
You'll want real speakers (or headphones for the retards).
Multiplayer? You need a good mic to talk to people without them hearing everything in your game looping back to them.
At this point you've got so much shit on your desk hooked up to the laptop (docking station or not) that it's easier to just get a real desktop.
You'll get much better CPU performance
Re: Solution looking for a problem? (Score:1)
Have you ever taken a close look at a modern high-performance graphics card? Get a good feel for their size and weight. These things can pull hundreds of watts and need a large heatsink to dissipate said watts.
Mobile GPUs don't come close to desktop GPUs because the power/heat budget is much larger for a discrete card in a well-ventilated case. Yes, power efficiency is constantly improving, but that just means desktop GPUs will keep cranking up performance for a given TDP; the power draw isn't going away.
Re: (Score:2)
> The weight of a GPU chip and a couple of extra VRAM chips isn't going to break anyone's back.
I disagree. Along with a GPU comes cooling requirements, and those definitely add both weight and size.
An external GPU would let you:
1)- Easily replace a wonky GPU, which is a gamble right now. "Al, lemme try your graphicbox. Ok, see, mine must be bad. I'll buy a new one."
2)- Easily upgrade a GPU, which is *almost impossible* now. "Huh, the new card just went on sale. It's been a couple
Re: (Score:2)
It's an awesome idea that won't get traction because:
a) It will increase the complexity of the design (which may or may not be a problem for expensive gaming laptops)
b) It will increase the life of the laptop before a new one is purchased, and thus reduce return business and profit. Gaming laptops are a niche where you get more turnover since they have to follow the leading edge more closely.
c) The external connector will definitely need the standardization, or the laptop might find itself compatible with
Re: (Score:2)
b) It will increase the life of the laptop before a new one is purchased
Not sure if I agree with this one. I get your point, but on the other hand it's a lot less expensive to upgrade from one integrated graphics laptop to another.
Re: (Score:2)
That depends a lot on your use cases. I will preface this by saying I'm part of the target market here because I have already spoken with my dollars and have a laptop with an external GPU box; specifically an Alienware 15 with the Graphics Amp.
a- Increased complexity? Sure, and there's no doubt that there will be teething troubles with drivers. I know I had them early on because of effectively having three GPUs (integrated Intel, internal GTX-970m and external GTX-980). However, I think what AMD is aiming
Re: (Score:3)
Actually, I think an external GPU and power source is a fairly elegant setup. Rather than limiting the GPU capabilities by trying to cram the cards into the laptop format, they can use full desktop GPUs with the associated power supplies and just plug in where you need that power. Then you could have something that performs both tasks of being a very nicely portable laptop and a gaming rig without unnecessary duplication of CPU and RAM or having to manage two separate machines.
Re: (Score:2)
Only problem is you've still handicapped the CPU by "trying to cram the cards into the laptop format".
Maybe they can put the CPU in the external enclosure too.
Re: (Score:2)
I'm not a gamer, but from what I've heard most modern CPUs are capable of handling just about any game out there. Even laptops come with multi-core processors and tens of GBs of RAM these days. The only limitation I see on the laptop format would be disk space. Maybe attaching a fast external storage array to the dock would be a useful add-on so you can keep the cost of the on-board SSD down?
Re: (Score:2)
+1 to this... I have no mod points or I'd give it. Other solutions include having a home server or NAS you can dump bulk data to for archival storage. This is what I do, and have Windows File History set up to back up to that NAS as well.
Re: (Score:2)
> Seems like trying to solve a problem that doesn't exist.
Maybe not to you, but when I have a GTX 980 Ti in my Windows box and a (weak) GeForce 750M in the MacBook Pro, the ability to use an external GPU in a standardized way would be a godsend to us graphics / shader guys. I guess you never play around with ShaderToy [shadertoy.com] on a laptop.
Anyways, you're missing the fundamental problem:
GPUs in laptops suck (for high performance).
I understand the heat + space + energy concerns but when you have to resort to
You just invented the home graphics mainframe! (Score:3)
Congratulations: you just invented the home graphics mainframe!
Re: (Score:1)
>> It'd also be nice to grab my desktop's GPU when I want to game on my laptop in the living room.
It's called streaming. All of the graphics players are trying to do this as we speak.
Re: (Score:2)
I've got an (aging) Mac Mini Server at home that I use for some recording work (Garageband for quick knock out ideas, Reaper for more involved projects). Due to less time with a guitar in hand/sitting at the keys, I thought I'd play around with Steam's streaming solution (using the Mac as the delivery mechanism for the beastly desktop workstation sitting in the home office).
It works very well. Enough that I wound up picking up their dedicated streaming box. I'm wired Cat6 everywhere it counts, so I didn
Re: (Score:2)
We had this technology at my university in the early '90s: X terminals. You'd log in to a computer running Ultrix or SunOS, and graphical programs would render locally on a network-connected screen with a keyboard and mouse attached.
Use Thunderbolt or forget it AMD (Score:5, Insightful)
Intel has already done the heavy lifting by giving us the Thunderbolt standard, which exposes a 40Gbit/s (or more if you gang connectors) external interface that can transport PCIe to a GPU in a seamless manner.
If AMD wants to work on making the enclosures, cooling, and power supplies more standardized to make plugging in a wide range of GPUs easy then that's great. If they get all NIH and think they can gin up some proprietary connector instead of just using Thunderbolt then you can forget about this entire announcement right now.
Re:Use Thunderbolt or forget it AMD (Score:5, Insightful)
Of the companies I'd worry about making a proprietary connector, AMD isn't one of them. AMD tends to make open standards that can be used on either company's GPUs without licensing requirements or proprietary hardware. However, if they did make one, I fully expect that NVidia would not use it and would make their own implementation.
Re: (Score:2)
1: External PCIe exists and has been around for ages. No one uses it.
2: Thunderbolt doesn't transport PCIe, PCIe transports Thunderbolt which transports whateverthefuck (and gives everything DMA access because lol).
2a: USB C is a physical connector that can be backed by USB 3.1, USB 3, Thunderbolt 3/2/1, etc. controllers, all of which run over PCIe.
Thunderbolt is an expensive solution to a problem that doesn't really exist, developed in the hopes of hooking people into it for their really expensive optical cables
Re: (Score:2)
Not enough bandwidth. TB is 4 PCIe lanes. Your graphics card uses 16 of those lanes. Trying to use TB would handicap the GPU.
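To put rough numbers on that lane math, here is a back-of-the-envelope sketch (assuming PCIe 3.0 signaling of 8 GT/s per lane with 128b/130b encoding; these are raw link rates, not achieved throughput):

    # Rough PCIe 3.0 link-rate math (8 GT/s per lane, 128b/130b encoding).
    # These figures are assumptions about raw signaling, not real-world throughput.
    GTRANSFERS_PER_LANE = 8e9        # transfers per second, per lane
    ENCODING_EFFICIENCY = 128 / 130  # 128b/130b line-code overhead

    def pcie3_gbit_per_s(lanes):
        return GTRANSFERS_PER_LANE * ENCODING_EFFICIENCY * lanes / 1e9

    print("x4  (Thunderbolt 3): %.1f Gbit/s" % pcie3_gbit_per_s(4))   # ~31.5
    print("x16 (desktop slot):  %.1f Gbit/s" % pcie3_gbit_per_s(16))  # ~126.0

So a Thunderbolt-attached card sees roughly a quarter of the host bandwidth of a desktop x16 slot; how much that hurts in practice depends on the workload.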
Re: (Score:2)
There's no proper external PCIe because of electrical limitations. PCIe is a high-bandwidth, low-latency connection with limited reach; it was never designed to be used over cables. I can't recall the distance limit, but it's something like a total trace length of less than 15 inches.
Intel uses Thunderbolt and a couple of custom chips to convert and transmit only 4 lanes across a cable, and the whole thing is ridiculously expensive. Your graphics card needs 16 lanes, or 4 separate TB connections all with
I've already sort-of done this. (Score:5, Interesting)
One of my users was on a big gray Mac Pro, with a fiber card to access the SAN and an AJA card that puts video on the preview/client preview monitors - it's a video card, but a really strange one that acts more like a codec than a traditional video card.
When that machine became a crash-fest I moved him over to a newer Mac Pro trashcan. That fiber card and AJA card can't be put in the trashcan as it lacks PCIe slots. So I got this Magma Thunderbolt PCIe housing [amzn.com]. That AJA card works in there beautifully. I doubt the Quadro Pro from his old system would work in that thing (it might - I may have to experiment one day) but I have little doubt a budget GeForce card would work in there.
I could totally plug my ThinkPad W540 into that box, and just about any of the newer MacBooks in the building, accomplishing what this article is all about.
Still - intentional and standardized would be nice. Especially with all these Mac people in my building - it would be nice to have GPUs in the Thunderbolt monitors we have floating around - it could save us money when buying laptops if we didn't have to worry about which laptop went to whom as long as the monitor was able to handle the job.
Re: (Score:2)
That's really awesome. I hadn't even considered the IT deployment possibilities.
Re: (Score:2)
The chassis I got was not a cheap solution, no. There's a huge price gap between the single-card chassis and the three. The single card here is only $220 [amzn.com] (it's marketed as a dual, but one slot is permanently taken by the Thunderbolt card). Don't think I didn't consider getting multiple single-card boxes and just chaining them together. It actually would have cost less up front, but I would have had to get more cables - which are insanely expensive - worry about having more power, and worry about keeping it all plugged in w
Standard docking station (Score:2)
So what this sounds like to me is a standardized docking station.
Just put a standard connector in a standard location that passes through the VESA Local Bus (or whatever newfangled thing is popular these days). Then have a docking station with a card slot, install a standard desktop video card, and you're all set. This lets AMD (and others) sell video cards to end users of laptops just like they have always done for desktops.
Now where this could get really interesting is if they do this right, and create
Thunderbolt not a solution. (Score:1)
No, the answer when specifying new standards is... to develop NEW standards the whole industry can use, and as the developing body, you get to benefit from leading the charge and being on top. Proprietary hardware
Re: (Score:1)
Intel owns Thunderbolt, not Apple.
Scaramouche, Scaramouche, will you do the Fandango (Score:2)
Apple do own the IP to the Lightning connector though.
Sigh.. poor AMD (Score:2)
Lately they're going for all these crazy niches and "next big things" that usually work out to be either a flop or, if it's big, something nVidia can just stroll in from behind with a product for once the market is mature. Like an ITX-size 175W graphics card and so on. Even when they "win," like with Mantle, nobody really cares until it becomes a standard like DirectX 12 or Vulkan. Like this: I'm sure AMD will spend a ton of money on the standardization effort, then nVidia will come and say "that's neat, here's Maxwell"
Re: Sigh.. poor AMD (Score:2)
Now consumers have mostly rejected it
You say that as if it were, in fact, actually true. I really respect your willingness to demonstrate such a high level of "flexibility." ;)
Re: (Score:2)
Now consumers have mostly rejected it
You say that as if it were, in fact, actually true. I really respect your willingness to demonstrate such a high level of "flexibility." ;)
Vizio announces its first consumer 4K TVs, kills all 3D support [theverge.com]
Sky drops 3D channel [advanced-television.com]
BBC drops 3D programmes due to lack of interest [bit-tech.net]
The End Of 3D? ESPN Drops 3D Channel [ipglab.com]
DirecTV scales back 3D content due to lack of demand [digitaltrends.com]
Poll: Is 3D TV dead? Do you care? [cnet.com]
A quote from the last one:
3D's biggest issue has always been lack of 3D movies and TV shows, however, and they're only getting more scarce. ESPN's highly hyped 3D channel quietly got put to rest two years ago. Many other 3D-only channels, like 3net, Xfinity 3D, Foxtel 3D, Sky 3D and more, are also gone.
Some download services, like Vudu, still offer 3D, but the total number of 3D Blu-ray movies has dropped off significantly. They peaked in 2013 at 77, up from 66 and 68 the two years previous. Last year? 44, and only 22 so far this year. There will certainly be more in the second half, but I doubt we'll break 40.
Maybe you liked it; I'm not going to argue with personal taste. But it's barely been mentioned as a feature for a couple years now, there are no plans for 4K in 3D in the new Blu-ray standard, and nobody really seems to care. It works for most
Re: (Score:2)
I've got high hopes for Zen when it comes out personally.
Re: (Score:2)
I've got high hopes for Zen when it comes out personally.
I have a thin thread of hope for Zen this year. Having any more than that seems excessive.
Developers need to embrace external buses (Score:2, Insightful)
Thunderbolt 3 is fierce [thunderbol...nology.net] and could do it. The issue is always market, even with standardization.
Meanwhile we have morons like Palmer Luckey attacking Apple [theverge.com]; basically the kingmaker in pushing to market modular, externalized resources like Thunderbolt 3 / USB-C.
Loving the idea (Score:2)
I see lots of people here commenting and bitching that this is a horrible idea. I, however, am apparently the target audience for this very device.
Right now I'm typing this up on my tiny little 10 inch netbook. I travel around the country very frequently with this thing for casual browsing from hotel to hotel. However, when I'm at the office, I have a full keyboard, mouse, and 22" monitor hooked up to this thing. Am I carrying a bulky monitor around the country? Nope. But when I'm in the office and d
Neural Network / ML Training (Score:2)
Having one of these would be great for training / running your own personal neural network. Instead of beaming all of your data to a 3rd party, you have the work done locally (or on a series of GPUs, even...)
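As a minimal sketch of what that local workflow could look like (assuming a CUDA-capable card in the external box and the PyTorch library; the model and batch below are placeholder stand-ins, not anyone's real setup):

    import torch
    import torch.nn as nn

    # Use the external GPU if the OS sees one; otherwise fall back to the CPU.
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    model = nn.Linear(128, 10).to(device)  # toy stand-in for a real network
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()

    # Fake batch; the point is that the data never leaves your machine.
    x = torch.randn(64, 128, device=device)
    y = torch.randint(0, 10, (64,), device=device)

    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    print("one training step on %s, loss=%.3f" % (device, loss.item()))

The same script would drive a desktop card, an external box, or a plain CPU without changes, which is the appeal of the setup.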
What was old is new again (Score:2)
AMD promoting a specialized connector for a third-party GPU reminds me of the short-lived VESA Local Bus connector in the early 1990s. It became unnecessary as soon as a general-purpose expansion bus (PCI) came along that was fast enough to support gaming GPUs.
With the arrival of Thunderbolt 3, it looks like AMD's idea is pretty much dead on arrival.
Latency? (Score:2)
Small PC's NEED this! (Score:2)
There are a lot of people out there with laptops, All-In-Ones, and small form factor desktops who are stuck with crummy integrated graphics. They have no way to add a bigger power supply or a giant two-slot PCI-E graphics card, so a solution like this would be a godsend to them! Plug it in when you want to play PC games, and leave it disconnected when you want to be portable.
So, where do I buy one?
No to Patents. (Score:2)
You want to make it a standard?
Don't encumber it with patents.
Pointless if... (Score:2)
Re: (Score:3)
What does it take to get canned as a /. editor?
Did he RTFA or go a week without posting a dup?
Re: (Score:2)
He was part of Dice; he came in with them and left with them.
Re: (Score:3)
Negative; I started at Slashdot in December 2007. Dice didn't buy it until Fall 2012.
Re: (Score:2)
He has a law degree according to his profile. That explains a lot.
Re: (Score:2)
Apparently, he is 10 years old.
http://fairlyoddparents.wikia.... [wikia.com]
But I don't think that is the Timmy we are speaking of.
Re:Soulskill, didn't you get canned as a /. editor (Score:5, Interesting)
When BizX bought Slashdot, they brought only a portion of the existing Slashdot staff with them. That included one of the three editors and one of the four engineers. I'm not sure about the other departments. I'd guess they intended to fill those roles with people from their own organization, but I don't know anything about how they're going about it.
I never met or talked with any of the BizX folks, so I can't tell you much more than that. We editors were the bottom of the decision-making totem pole for the site, so I didn't know about the acquisition until it was done.
Even if I'm no longer affiliated, I still love the site and the community. I'll keep contributing until I see good reason not to.
Yes, I've found another job -- I start on Monday, actually. Really looking forward to it. :)
Re: Soulskill, didn't you get canned as a /. edito (Score:2, Informative)
Good luck and best wishes
Re: Soulskill, didn't you get canned as a /. edito (Score:4)
Wow, someone being nice on Slashdot. :).
My hat is off to you, sir
Re: (Score:2)
Thanks for posting! Congrats on the new job!
Re:No one plays games any more (Score:5, Informative)
I haven't bought one in over a decade, and even the most hardcore gaming friends I have don't own one. Also, other than Microsoft employees, I have never met someone that has one of those XBox things. They just aren't selling. How about improving your mobile CPUs before working on something that no one wants now. As usual, AMD is stuck in the past.
As I type this there are 11 million users logged in to Steam, the primary source for PC games. There are nearly 2 million players actually in-game right now between the top 10 titles alone.
A market of millions is nothing to sneeze at. I personally would love external graphics to become a proper supportable thing rather than the occasional one-off proprietary setup I can't expect to use with the next model. I have a desktop for gaming and a laptop for portability, but with a proper external GPU option I could just have the laptop and pair it with a GPU-equipped dock for when I'm at home.
Re: (Score:2)
Indeed. Also very convenient for the increasingly rare LAN party. Man I miss those.
Re: (Score:2)
It's almost like somebody, somewhere, might want to only buy one system. How absurd!
Re: (Score:2)
Hardcore gamers do this so that they can bring the laptop without having to bring the GPU. They bring the GPU to the LAN parties, but not to the coffee house, or to work, or the living room, or the back yard. The alternative is to have a laptop + a desktop, but then they must keep the two in sync.
Re: (Score:2)
What's to keep in sync if the two machines serve two mostly orthogonal purposes? In the age of the cloud, that sort of thing should be a total non-problem. You shouldn't worry about the state of your two PCs any more than you would worry about the state of n+1 mobiles and streamers.
Re: (Score:2)
You could certainly do it that way - dedicated PC for games, laptop for everything else. Some people find that inconvenient to maintain multiple computers. And your UID is low enough that I am guessing you can recognize that not everything is in the cloud. :-)
I'm at a LAN party, and I wanna show something I am working on - my recent Photoshop projects, something I am coding in Visual Studio, an object I am designing in Blender... I don't wanna install all that on 2 machines and keep them in sync. Maybe I
Re: (Score:3)
Your experience does not match the overall trend.
PC Gaming Market is Estimated to Grow To $35 Billion by 2018 [REPORT] [dazeinfo.com] There's a nice graph showing how the PC games industry has doubled since 2009.
Re: (Score:2)
Data from 2009 to 2015 is not a future prediction.
Re: (Score:2)
3 consoles = Xbox, PS4, Wii U.
The graphic isn't clipped for me (I'm using Firefox 44.0.2). The axis is labeled from 2009 to 2018. Here is the raw image [netdna-cdn.com] if you want to look at it that way. Your confusion probably stems from not seeing the axis. The graph shows both historical data and projected data. Since the discussion is about the historical data, you can debate whether the future predictions are "made up" or not, but it makes no difference to the point at hand. The AC I replied to was saying that PC gaming
It's called "Old Age" (Score:2)
Re: (Score:2)
I haven't bought one in over a decade, and even the most hardcore gaming friends I have don't own one. Also, other than Microsoft employees, I have never met someone that has one of those XBox things.
Okay, but the PS4 is doing great. Maybe you should buy one of those. I hear there are roughly 20 good games for it by now.
Re: (Score:2)
> Also, other than Microsoft employees, I have never met someone that has one of those XBox things.
Really? Do you only know boring people? MS sold about 35 million original Xboxes, 85 million Xbox 360s, and 20 million Xbox Ones. [1]
That figure includes 12 million Xbox Ones in North America. That's about 10% of US households, so in all likelihood there are several on your block.
[1] http://www.vgchartz.com/analys... [vgchartz.com]
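That percentage roughly checks out; a quick sanity check (assuming the commonly cited ballpark of about 125 million US households):

    # Back-of-the-envelope check; 125 million US households is an assumed figure.
    xbox_one_na_sales = 12e6
    us_households = 125e6
    print("%.0f%%" % (100 * xbox_one_na_sales / us_households))  # ~10%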
Re: (Score:2)
This is true as far as it goes, but misses the point that the GPU connection there is PCI-E. You can pick and choose your GPU in the Dell/Alienware solution and it works really well. Yes, the connector is proprietary, but that's because there were no standards for external, pluggable PCI-E.
For the record, I have an Alienware 15 that is my primary box and I love it. I have the external box (Amplifier) with a GTX-980 in it right now for heavy lifting on games. It's really nice when I'm on the road to
Re: (Score:1)
Quake 3 was the best FPS
LOL
Re:No one plays games any more (Score:4, Funny)
>> Quake 3 was the best FPS
> LOL
No, that's the best moba.
Re: (Score:2)
You're confusing single-player games with multiplayer games like Quake.
Many FPS have single-player and multiplayer game modes. If the game doesn't have a good single-player mode, I'm not interested. I rarely ever play multiplayer mode unless it has a sniper rifle and I can snipe as I run around. Nothing infuriates an opponent more than a headshot on the run.
Re:No one plays games any more (Score:4, Interesting)
I call bullshit. Your friends may not have bought console games in years because, as you argue, console gameplay went comatose once the controllers stopped growing their input count, putting a stop to gameplay advancement,
but there is still plenty of PC gaming innovation happening thanks to the keyboard. Mount & Blade games, space sims, and RTT games have had great releases in the past 3 or 4 years, plus the continuation of great RTS storytelling and some innovation with the ex-Relic team coordinating the Homeworld successors,
FPS games that advance beyond the quantitative and response-time capacity of console controllers (hybrid building/FPS or crazy shit like Planetstorm),
etc. etc. etc.
The only comatose things are consoles, as they are now mostly copypasta of games we already played, as you say, except with a focus on improving graphics and dicking around with "cinematic experience" because gameplay advancement is down the toilet.
PC gaming is just fine and dandy, with indie devs slowly piling up revenue from their initial smaller games on the road towards AAA conglomeration, without any publishers like EA fucking them up with the contractual "innovation is too risky" BS.
The only thing that's missing is Valve creating a marketing push with SteamOS like Microsoft did in the years it was focused on spreading Windows to every household via gaming: fully focusing on helping devs with development teaching, bug-fixing help, and development tools, and stuffing their SteamOS brand on every video game start screen;
and AMD getting their shit back together by hopefully being the first to implement graphene and giving Intel a giant competitive nudge.
Re:No one plays games any more (Score:5, Insightful)
They've sold 20 million Xbox Ones. You're drawing the wrong conclusions from your evidence. From 1982 to 2012, sales moved first away from mom & pop stores, and then toward online purchases.
Also, I think you've just gotten older. You get sentimental over games from when you were a kid... Kids don't play Quake anymore, though of course they could [kongregate.com]. The game is dated; the model has been improved upon.
Re: (Score:3)
You get sentimental over games from when you were a kid.
As I've gotten older, I'm less inclined to pay $60 for a PC game. I can wait a few years to buy the same title for 20 bucks or less on Steam. I may have even replaced the video card to play the game in its AAA glory.
Re: (Score:2)
While I disagree in general that the gaming market is dead — certainly the decline of brick & mortar stores tracks to some degree with the decline of games being distributed on physical media — I do agree that Quake 3 is the best FPS.
Re: (Score:2)
"Also, you're correct about the XBox not being popular. My parents would go weeks without selling an XBox game."
What a dumb statement. Of course they'd go weeks - brick and mortar game shops are dying. People who own XBox One and PS4 systems are using digital distribution for the most part.
Re: (Score:2)
In that era, the originality of a best-selling game was driven by doing something completely different with the latest technology of the time (VGA, SVGA/Mode X, SoundBlaster cards, MIDI). And speed was important, so code optimization took priority.
People were doing casual games (simple platform games), but those were written on top of the Windows API.
Re: (Score:2)
"high school comp sci teacher"
You can stop talking now.
Re: (Score:1)
don't forget to also manufacture a giant red vibrating horse dildo
I believe it uses the new Intel Thunderbutt interface
Re: (Score:2)
Tenderbutt
Re: (Score:2)
How do you play games without buttons? How do you write an essay on a virtual keyboard? Yes, a phone does it all - very, very poorly.
Re: (Score:3)
>> How do you...?
Easy: this guy is a manager. If all you do is schedule things, have meetings and delegate things with initials (e.g., "JR, can you handle this?"), you can live on a mobile device with a fairly large screen. You only need a computer if you actually have to do work.