Wireless PCIe To Enable Remote Graphics Cards 181
J. Dzhugashvili writes "If you read Slashdot, odds are you already know about WiGig and the 7Gbps wireless networking it promises. The people at Atheros and Wilocity are now working on an interesting application for the spec: wireless PCI Express. In a nutshell, wPCIe enables a PCI Express switch with local and remote components linked by a 60GHz connection. The first applications, which will start sampling next year, will let you connect your laptop to a base station with all kinds of storage controllers, networking controllers, and yes, an external graphics processor. wPCIe works transparently to the operating system, which only sees additional devices connected over PCI Express. And as icing on the cake, wPCIe controllers will let you connect to standard Wi-Fi networks, too."
I must admit... (Score:2)
Re: (Score:2)
Re:I must admit... (Score:5, Informative)
You're right, and the summary is wrong and the article's a bit misleading.
"... will let you connect your laptop to a base station with all kinds of storage controllers, networking controllers, and yes, an external graphics processor."
Sorta... PCIe 16x is 16 GB/s [wikipedia.org], and that's with a big B, for bytes. They're hoping for 7Gbps, or 875 MB/s [techreport.com]: "the spec should move "quickly" to 7Gbps (875MB/s)." That's roughly 1/18th the speed of 16x PCIe. They might be able to do PCIe x1, but that's it.
If they had read the whitepaper, that is all explained [wilocity.com]:
"A reliable wPCIe connection can be maintained with a relatively low data rate channel. However, to achieve meaningful performance between local and remote devices, the data rate needs to be on the order of 2 Gbps, or that of a single lane of PCIe. The only practical wireless channel that can support this capacity is 60 GHz."
So basically this can transfer wirelessly at ~500+ MB/s, which means you can have wireless BD-ROM drives, wireless hard drives, and yes, even wireless displays, since it's fast enough to transfer 1080i without any compression. [ttop.com] But I'm sorry to dash the hopes of anyone who thought they could someday upgrade their laptop's video card by simply buying a wireless external Radeon HD 5970 or GeForce GTX 480: you will still need a GPU connected by 16x PCIe to process the video and then stream it, similar to what the OnLive remote gaming service offers now. [slashdot.org]
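If you want to sanity-check those numbers yourself, here's a rough back-of-the-envelope script. It assumes PCIe 2.0 lane rates (about 500 MB/s per lane, per direction; the 16 GB/s figure above is the x16 bidirectional aggregate) and the 7 Gbps WiGig target, so treat it as arithmetic, not gospel:

```python
# Rough bandwidth arithmetic. Assumes PCIe 2.0: 5 GT/s with 8b/10b encoding
# gives ~500 MB/s of usable bandwidth per lane, per direction.
PCIE2_LANE_MBS = 500                       # MB/s, one lane, one direction
wigig_mbs = 7_000 / 8                      # 7 Gbps -> 875 MB/s

print(f"wPCIe target: {wigig_mbs:.0f} MB/s, "
      f"i.e. ~{wigig_mbs / PCIE2_LANE_MBS:.2f} PCIe 2.0 lanes' worth")
print(f"PCIe 2.0 x16: {16 * PCIE2_LANE_MBS} MB/s each way "
      f"({2 * 16 * PCIE2_LANE_MBS} MB/s aggregate)")

# Uncompressed 1080i60: 60 fields of 1920x540, i.e. 30 full frames/s at 24 bpp.
video_mbs = 1920 * 1080 * 30 * 3 / 1e6     # ~187 MB/s
print(f"Uncompressed 1080i: ~{video_mbs:.0f} MB/s, so it fits with room to spare")
```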
Re:I must admit... (Score:5, Interesting)
What are the possibilities of channel bonding, though? WiFi has 11 channels, is it possible to build a sender/receiver pair that can move data over multiple channels at once? Perhaps soon there will be 7Gbit, then 14Gbit, then 21Gbit, etc implementations. Need more bandwidth? Add more radios.
Re: (Score:2)
What are the possibilities of channel bonding, though? WiFi has 11 channels, is it possible to build a sender/receiver pair that can move data over multiple channels at once? Perhaps soon there will be 7Gbit, then 14Gbit, then 21Gbit, etc implementations. Need more bandwidth? Add more radios.
We're already crowding the spectrum enough with wireless standards. Do we need more waste crowding in on an already crowded space?
Re:I must admit... (Score:4, Informative)
Re: (Score:2)
The 11 WiFi channels (in the 2.4 GHz band) overlap each other such that there are really only 3 non-overlapping channels possible (1, 6, 11). Remember that WiFi (actually pre-WiFi 802.11) started out with 1 Mbit/s transmission speeds. At that time, 5 MHz channel spacing allowed 11 non-overlapping channels. But with 802.11b (which was the first "WiFi" version) and ever since, channel widths have been at least 22 MHz - hence only 3 usable channels.
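A quick way to convince yourself of the "only 3 usable channels" point, assuming the usual 2.4 GHz channel plan (channel 1 centered at 2412 MHz, 5 MHz spacing, ~22 MHz occupied bandwidth):

```python
# 2.4 GHz Wi-Fi channel math: centers are 5 MHz apart but each channel
# occupies ~22 MHz, so only channels whose centers are at least 22 MHz
# apart (i.e. 5 channel numbers apart) avoid overlapping.
center = {ch: 2412 + 5 * (ch - 1) for ch in range(1, 12)}  # MHz, channels 1..11
WIDTH = 22  # MHz

def overlap(a, b):
    return abs(center[a] - center[b]) < WIDTH

clear = [1, 6, 11]
assert not any(overlap(a, b) for a in clear for b in clear if a != b)
print("Channels 1, 6 and 11 are 25 MHz apart -> no overlap; any closer pair overlaps.")
```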
At 60 GHz, I think they're going for speed in a single chan
Re: (Score:2)
The FCC has set aside 57-64GHz for high speed unlicensed wireless use. If the IEEE 802.15.3c standard is followed, then this is divided up into 4 channels.
Re: (Score:2)
WiFi has 11 channels, but of those, only 3 don't actually overlap.
Re: (Score:2)
Good analysis, though one caveat: fast enough to do 1080i is really just fast enough to do 720p. And 720p is really only adequate for very small laptops. Anyone attempting to do 1080i on a PC (instead of 1080p) either has no idea what they're doing or doesn't need my validation.
Wireless disk readers and HDD's, though, are an interesting application. Inter-device networking at 500 MB/s might be fast enough for Avatar-style fast swapping of files between base stations and laptops. And 500 MB/s might al
Re:I must admit... (Score:5, Interesting)
True, but consider this possibility:
Right now everyone's looking at the traditional model. That is, a portable CPU connected to a GPU connected to a display, and adding in a wireless form factor to it.
What if, instead, the base station contained the CPU AND the GPU connected directly together, much like a desktop system now, to do all the hard math and 3D rendering, and then output a wireless PCIe signal that is picked up by the portable device? Something like a netbook, with a basic GPU, a small processor, and little to no HD space. Its only job, much like a thin client's, would be to provide you access to the computing power in the "main" section of the house.
It would be like having a docking station for your netbook that turns it into a desktop powerhouse - only you could walk around the house with it. And, when the time comes that you want to take it outside, you still have the basic capabilities of a netbook.
That might be a product worth selling to, say, a family of four. "You can pay for four notebooks, or four netbooks and this powerful base station".
Re: (Score:2)
Something like a netbook, with a basic GPU, a small processor, and little to no HD space. Its only job, much like a thin client's, would be to provide you access to the computing power in the "main" section of the house.
You mean like you can do right now by using the netbook as an X terminal?
That might be a product worth selling to, say, a family of four. "You can pay for four notebooks, or four netbooks and this powerful base station".
Much as people use old PCs as X-terminals (I have done it in a small office), except with the advantage of portability. I was thinking of buying a netbook to use in just this way.
Re: (Score:2, Informative)
Re: (Score:2)
Re: (Score:2)
Wireless hard drives sound cool, but what about latency and lag?
I'm on the internet wirelessly, and I do get latency problems from time to time due to the extra "hop".
As far as I'm concerned the internet is like a wireless hard drive to me. :) Ok, a wireless network of networks.
Graphics at a decent frame rate does require huge amounts of bandwidth at a decent screen resolution. Just look at how online video (e.g. YouTube) is buffered before it starts.
A wireless network at speeds whereby wireless hard drives and wir
Retro Tech (Score:2, Interesting)
Re:I must admit... (Score:5, Funny)
You'd better get used to your computer experience looking like thaaaaaaaaat if your display has to be sennnnnnnnt over a wireless linnnnnk.
Re:I must admit... (Score:5, Informative)
Not to mention security. I mean, you thought Tempest [wikipedia.org] was bad before, now I can wirelessly sniff and alter PCI traffic, which is a direct conduit into the RAM.
Re: (Score:2)
Does the PCI bus really work that way? Are you sure that the device controls where the data goes into memory? I would have thought that the destination is safely set up in software to point somewhere harmless like a raw data buffer, and then the device dumps into that spot.
Re:I must admit... (Score:5, Interesting)
Some recent systems have IOMMUs, which provide privilege separation between hardware devices much like normal MMUs govern software. However, unless this sort of IOMMU is active, PCI and PCIe hardware is generally capable of transferring data to or from any other connected device, including any area of system RAM. Sometimes this can even extend to external interfaces; for example, people have been known to take advantage of the DMA capabilities of the FireWire protocol to read the contents of RAM on an active system.
In general, non-hotpluggable hardware has been granted the same level of trust as the OS kernel, so no one worried very much about it. IOMMUs were more about protecting against faulty or corrupted software (device drivers) than malicious hardware. However, more and more hardware is hotpluggable these days. Also, some software interfaces are becoming too complex to really trust—consider, for example, the interface to a modern GPU, which must transfer data to and from RAM, and perhaps other GPUs, under the control of code provided by user-level application software (shaders, GPGPU). Without an IOMMU it is up to the driver software to prove that such code is perfectly safe, which is an inherently hard problem.
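As a side note, if you're curious whether your own (reasonably modern) Linux box actually has an IOMMU mediating any of this, a quick sysfs check works. These are the standard paths, nothing specific to wPCIe; just a sketch:

```python
# List IOMMU groups and the PCI devices in each, via Linux sysfs.
# An empty /sys/kernel/iommu_groups means no IOMMU is active, so any
# bus-mastering device can in principle DMA to arbitrary physical RAM.
import os

base = "/sys/kernel/iommu_groups"
groups = sorted(os.listdir(base), key=int) if os.path.isdir(base) else []

if not groups:
    print("No IOMMU groups found: DMA is unrestricted.")
for group in groups:
    devices = os.listdir(os.path.join(base, group, "devices"))
    print(f"group {group}: {', '.join(devices)}")
```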
Re: (Score:2)
Yes. All low level devices are wired to the CPU by the memory bus. Writing data to a PCI card is simply a matter of writing to a certain memory address. The PCI card will see a specific address on the memory address bus and know the data on the data bus is intended for it.
It's not like x86 CPUs have one bus for devices and one for memory.
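To make the "it's just a memory address" point concrete: on Linux you can map a device's BAR from userspace and read its registers directly. The device address 0000:01:00.0 and the register offset below are made-up placeholders, this needs root, and writing random registers can hang real hardware, so take it purely as an illustration:

```python
# Map BAR0 of a (hypothetical) PCI device and read one 32-bit register.
# Every load/store through this mapping becomes a PCIe read/write to the card.
import mmap, os, struct

bar0 = "/sys/bus/pci/devices/0000:01:00.0/resource0"   # placeholder BDF
fd = os.open(bar0, os.O_RDWR | os.O_SYNC)
size = os.fstat(fd).st_size
regs = mmap.mmap(fd, size)

value = struct.unpack_from("<I", regs, 0x0)[0]          # register at offset 0x0
print(f"BAR0 is {size} bytes; register 0x0 = 0x{value:08x}")

regs.close()
os.close(fd)
```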
Re: (Score:2)
If the device only has 32-bit PCI then it is limited to the lower 4G of memory space, if it is 64-bit PCI then it can go anywhere.
More correct: if the device only supports Single Address Cycle, then it is limited to the lower 4G of memory space; if it supports Dual Address Cycle, then it can go anywhere.
Re:I must admit... (Score:5, Insightful)
You're unlikely to be able to *alter* PCI traffic, though you could perhaps *insert* PCI traffic.
Still, people figured out how to properly encrypt wireless links some time ago. Tempest is primarily interesting because the signals you're looking at are unintentional (and often unknown) side effects, and they often involve links that are impossible or unreasonable to encrypt.
Re: (Score:2)
Not to mention security. I mean, you thought Tempest [wikipedia.org] was bad before, now I can wirelessly sniff and alter PCI traffic, which is a direct conduit into the RAM.
Yep. Can't wait. Endless fun and games at the next Powerpoint presentation by Corporate.
Re: (Score:2)
Re:I must admit... (Score:5, Informative)
It's not that bad -- I've done it before.
X Windows over plain old wifi.
Re: (Score:2)
What would that other e stand for?
Express PCI Express?
Re: (Score:3, Informative)
Re: (Score:2)
Mostly because ExpressCard is entirely too slow to match what we have in a desktop machine.
Which is a shame -- I have wanted that product pretty much since I saw it, though I'd much rather have it integrated into a standardized docking station of some sort.
Every time I think that through, though, I basically decide that I want my docking station to be either a desktop or a server -- spare CPUs and RAM, extra disks, separate monitor/keyboard/mouse, and of course, PCI, PCIe, USB, FireWire, etc.
I think next ti
Re: (Score:2)
Is it? From the specification, you can read:
So, the standard range is less than 10 meters? This is anything but awe-inspiring.
Maybe the text is misleading, and it is not a standard 10m range. But that is the impression I get ...
Re: (Score:2)
I just hope Apple isn't behind this technology.
You'll get a degraded signal if you hold your computer or sit next to it. And if someone is using a microwave within 100 meters, you lose the signal completely. But that is a feature, so you know when your food is done cooking.
Re: (Score:2)
What did you expect at 60Ghz?
This signal will not penetrate a sheet of paper.
Re: (Score:2)
That was my first thought as well.
My second thought was "I wonder how they're going to handle security and authentication?" Which rather took the shine off my first thought, I'm afraid.
Re: (Score:2)
No thank you. I can see their argument: WiGig does a few cm and is a cable-replacement tech. You can't even send across the room with it. It is, however, yet another "microwave your brain" device in the house.
I'd rather have PCIe over optical and a standard dockable optical connector that can tolerate at least a few thousand dock/undock cycles on my laptop. Is it that difficult?
Question (Score:3, Insightful)
To those in the know, why will this succeed where UWB/wireless USB failed in the market?
Remote graphics seems like an even more esoteric need than the remote mass storage, printing, and cameras that UWB would have offered.
Re:Question (Score:4, Insightful)
The performance of early wireless USB hardware was pretty shit, and it was uncommon and ill-standardized, so you usually still had to plug in a dongle just to get performance worse than plugging in a cable. Plus, basic NAS/print server boxes had become really cheap and fairly easy to use. Anybody who wasn't a technophobe or living in a box (and thus not the target market for pricey and sometimes flaky wireless USB) already had his mass storage and printers shared over a network, wired or wireless, and his human interface devices wireless via Bluetooth or proprietary RF, if he cared about that. Wireless USB didn't really enable any novel use cases that anybody cared about.
On the other hand, there is basically no way of plugging in "internal" expansion cards over a network (in the home context; I'm sure that some quite clever things have been done with I/O virtualization over InfiniBand, or whatever). Particularly with the rise of various "switchable graphics" technologies, I assume that the use case is basically this: the user has a nice thin, light, long-running laptop. They come home, sit within a dozen meters of a little box (containing a graphics card or two, with one head connected to their TV), and suddenly their laptop has a screaming gamer's graphics card supplementing the onboard card, either usable on the built-in screen, or via the second head connected to the TV, or both. (Analogs could be imagined for professional mobile workstation applications, where simply sitting near your desk connects you to a quartet of CUDA cards and an SAS controller with 4U's worth of drives hanging off it.)
Will the market care enough to bring the volume up and the price down? I have no idea. However, it at least has the advantage of allowing things not otherwise possible, unlike wireless USB, which pretty much covered the same ground as a mixture of Bluetooth peripherals and resource-sharing protocols over TCP/IP, but years later and without the standardization.
Re: (Score:2)
The best use case I can think of for this is when my laptop finally shrinks into my phone. I don't have to attach a zillion wires to the phone to get it to operate as my computer.
-l
Re: (Score:2)
Re: (Score:2)
Yes, but remote graphics is much more difficult to do over WiFi. When you already have a router, why buy wireless USB or UWB devices, which you need a special dongle or card for, when you can just buy one with WiFi and be done with it? Plus, if you're doing wireless, you're likely using it for multiple PCs, which is even more reason to go with something you already have. Who's going to buy a separate $20-$50 dongle for every computer they want to print from, for example, when they don't need to? 802.11g off
Re: (Score:2)
Re: (Score:2)
Not completely sure, but I think one problem with UWB is that the power levels had to be set SO low to appease those *licensed* operators of the spectrum it overlaps that performance just ain't all that great.
Then again, it could be that UWB was torpedoed by the Osborne Effect of having 60 GHz coming "in the near future." Honestly, electronics evolve so fast it's hard to understand how anybody makes a buck in "latest/greatest technology" anymore. I was looking at hard drives this morning. Seems the new p
Good news, everyone! (Score:5, Insightful)
Re: (Score:2)
Yeah, this'll be awesome for early adopters and then start to suck as soon as their neighbors get it, kind of like how 802.11g sucks but N is still OK (for now).
Also, people getting pwned in online games will stop saying "fucking lag!" and start saying "goddamn microwave!"
Re: (Score:2)
Thankfully, the 60 GHz spectrum is unlikely to get that polluted. It has severe penetration problems. If you have a cup of water (or a screaming child) between your laptop and a base station, the signal would likely be interrupted. Having a few walls between you and your neighbors should be fine.
This is in stark contrast to 802.11*, which can pollute for 1/2 a block.
Re: (Score:2)
As if it were not already embarrassingly simple to obtain.
Re: (Score:2)
"Band"-aid (Score:2, Interesting)
Nice, but what's the range, and is the spectrum licensed, or will we end up dealing with a "tragedy of the commons" much like the 2.4 GHz band?
Re:"Band"-aid (Score:4, Informative)
Re: (Score:2)
It's unlicensed. If it were wider, wireless phones and stuff would just use the entire wider band. We've seen this before with 802.11n: "Why let different carriers broadcast simultaneously on different bands when we can just take the entire spectrum and make our network super fast?"
Great Breakthrough, Limited Performance (Score:2)
That sounds like a wonderful idea and the thought of having a wireless graphics card for a laptop is very tempting.
But how much performance can we really squeeze out of it? I mean, for a power user who wants a higher resolution than his integrated card can offer, it's a godsend. But for gaming? No way.
Also, I'll admit I'm not very wise on the technical details of PCIe, but if you're putting all of the above-mentioned devices in contention for 7Gbps of bandwidth, there's really not a lot you can milk from
Re: (Score:2)
I also forgot: you can get laptops, right now, that are comparable to a desktop. I consider them a complete waste, but that's beside the point.
Re: (Score:2)
I only paid around $1k for my laptop recently and there are no games currently available that I can't run on at least medium settings. And that's even opting for a better screen than graphics card when I made my purchase. Can I game at super hacker leet graphics levels? No, but I can play all modern games with decent settings and a decent framerate.
Re: (Score:2)
I bought one of those a couple years ago, mostly because I planned to do video editing on it; gaming was a nice bonus.
It's a piece of shit. Runs so hot that everyone who uses it comments on the heat, but if you throttle it any it feels crazy-slow. The heat's so bad that if I don't make some sort of special arrangement for it to sit up where it can get airflow, it'll overheat and shut down during games (or sometimes just while playing back a video!). The damn rear vent points down and to the back at a 45
Re: (Score:2)
A screaming gamer laptop is actually pretty reasonably priced these days, and only a bit slower than the screaming gamer desktop. However, it is still hot, heavy, and loud, and doesn't get thrilling battery life.
The convenience would be being able to buy a thin-and-light with excellent battery life, that c
Re: (Score:2)
It helps a lot that recently PC game developers have been targeting the Xbox 360 and PS3 as their main platforms. These platforms are unchanging, effectively locking developers to the state-of-the-art circa 2005. This keeps PC builds to a lower visual standard, which gives mainstream laptops a chance of actually keeping up.
Once we see new consoles launched (possibly 2012 or 2013), we'll see developers targeting Poly / RAM budgets of those machines. The PC build requirements will then go through the roof,
Pci-e x1 is to slow for all of that video will suc (Score:3, Informative)
PCIe x1 is too slow for all of that; video will suck at that speed, and then you want to add more I/O to it?
Very practical (Score:5, Funny)
The best feature of this proposed standard is that if you place a ceramic mug directly between your CPU and the external graphic processor, it will keep your (coffee, soup or whatever) steaming hot, all day long! Those days of long WoW raids with only cold beverages and snacks are over!
Re: (Score:2)
I “read the book” (as they used to say), and that will only work if you put it on a rotating platter. Or just use a spoon.
Hmm, from what I know, this should actually work (using a spoon to make the fluid rotate in the mug in the field). But I doubt you can buy an 800W wireless transmitter at your normal electronics shop. ;)
Something wireless I might not hate? (Score:3, Interesting)
Very cool stuff if it materializes.
Imagine a small lightweight machine with, say, an ULV i3 or i5 CPU, a small-ish screen, and weak-ass integrated graphics. Place the machine on its docking pad (no connectors to get bent or boards to break) and suddenly it's got (wireless?) juice and access to kick-ass graphics and a big monitor, as well as whatever else is in the base station.
A desktop replacement that remains light and portable for road warriors, with none of the fragility associated with docking connectors. With those transmission speeds I presume this is going to be a point-blank-range affair, so snooping shouldn't be (much of?) a problem.
Not exactly... (Score:2)
Wilocity told us that wPCIe can push bits at up to 5Gbps (625MB/s), and that the spec should move "quickly" to 7Gbps (875MB/s).
If you consider that PCIe 16x is 16GB/s (128Gbps), this is very underwhelming. Call me a sceptic but I don't see a real-world application of "wireless PCI-E" that is slower than a 1-lane PCI-E. Well, at least a real-world application regarding graphics...
Re: (Score:2)
You do realize that PCIe 1.0 16x is still 4GB/s, right? The point is, would the integrated graphics of a laptop be slower than a card limited to less than 1GB/s? I bet the answer is no.
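For reference, here are the nominal per-lane numbers by PCIe generation against the wPCIe rates quoted above (one direction, before protocol overhead; just a sanity-check sketch, real-world throughput is lower):

```python
# Nominal one-direction throughput per PCIe lane (8b/10b encoding) vs. the
# wPCIe figures from the article (5 Gbps today, 7 Gbps "quickly" after).
lane_mbs = {"PCIe 1.x": 250, "PCIe 2.0": 500}   # MB/s per lane
wpcie_mbs = {"wPCIe 5 Gbps": 625, "wPCIe 7 Gbps": 875}

for gen, per_lane in lane_mbs.items():
    print(f"{gen}: x1 = {per_lane} MB/s, x16 = {16 * per_lane} MB/s")
for name, rate in wpcie_mbs.items():
    print(f"{name}: {rate} MB/s (~{rate / lane_mbs['PCIe 1.x']:.1f} PCIe 1.x lanes)")
```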
Re: (Score:2)
PCIe Scaling analysis [tomshardware.com]. It's three years old, though.
Re: (Score:2)
wPCIe-enabled hard drives will completely erase the need for both 'internal' and 'external' HDDs.
You'll have small flat box with a motherboard + CPU and Ra
Re: (Score:2)
Do remember, practically speaking that's 500 MBps for all of your devices. If you have 3 hard drives, a monitor, a card reader, a keyboard and mouse, that's about 100 MBps left for each of them. And when this finally comes out, 500 MBps will seem even smaller.
I remember similar claims about Bluetooth. It was going to be the universal standard, trivial networking between everything, yadda yadda. Ultimately, it's kind of a pain to connect bluetooth devices. "Is this my mouse? No, it's not showing up. O
Re: (Score:2)
With those transmission speeds I presume this is going to be a point-blank-range affair, so snooping shouldn't be (much of?) a problem.
But even at point blank ranges, you still need to worry about interference.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
well if you can't see how it would be useful, then clearly it's no good~
Just great! (Score:2)
Neat idea but it'll suck where it needs to shine (Score:3, Insightful)
Let's say I've got even a little building with 50 people who want to use this. Will I be able to pack 50 of these point-to-point units into a building and have all of these systems perform at peak capacity without stepping all over each other? That would be amazing.
And, aside from the technical issues of getting it to work well in a dense environment, there's still one cord that needs to be connected to the laptop. Power. If I have to plug that in, I may as well snap the laptop into a docking station and skip the wireless connection entirely. One connection is one connection and I won't have to worry about interference, security, bandwidth, etc.
Re: (Score:2)
What are you doing where all the people need to be pushing 7G constantly across the bus? If that's the case, then it's probably not for that situation. Most people in most offices don't need to be using that kind of data all the time.
You could create a reliable system so you could take your laptop anywhere and have it display on a large screen or projector. So you walk into a meeting room and it links up. You want to display something on your TV? It links up.
Perhaps you have a hand held device and want to sh
Re: (Score:2)
Did you miss the part where they're talking about docking stations with video cards built in, USB3, network, etc.?
"The first applications, which will start sampling next year, will let you connect your laptop to a base station with all kinds of storage controllers, networking controllers, and yes, an external graphics processor."
I don't know how your company works but, around here, we expect people to show up at roughly the same time every day and...erm...work. Like simultaneously. And, yes, many of our l
RF bath anyone? (Score:2)
As a skeptical person who usually maintains a scientific 'prove your crazy theory if you expect my buy-in' ideology...
I have to say, if you had to bet money on which wireless technology actually WILL cause cancer, and your options are cell phone, wireless access point, or wireless PCIe, I think wireless PCIe would win.
What Killer App? (Score:2)
The diagram shown at TFA indicates a single PCIe lane (x1) is provided. What PCIe devices would benefit from being wireless?
Re: (Score:3, Informative)
And yet, it's all the bandwidth I need to attach a less-powerful video card (such as the Matrox G550 [matrox.com], which can run off a PCIe x1 slot) to my laptop, allowing me to dock onto another monitor or two on my desk quickly and easily.
"Nutshell" (Score:2)
I don't know about you, but I don't want to have something operating at 60GHz sitting in my lap, thanks... I'll stick to super-long HDMI or DVI cables if I need to route a monitor sig
Awesome, I will own you now. (Score:2)
Wireless PCI Express? Awesome. I'll just walk by with a specially designed device, master the bus, and DMA the entire contents of your RAM over to a laptop. Then I'll change some interesting bytes here and there, and DMA it back.
This sounds like the dumbest attack vector since FireWire came out with physical DMA support.
Re: (Score:2)
This sounds like the dumbest attack vector since FireWire came out with physical DMA support.
Yes. As I once pointed out on the Linux kernel mailing list, the FireWire driver enabled external access to physical memory by default, via an un-commented line of code. Only for the first 4GB, though; nobody had updated that backdoor for 64-bit. (There are hardware registers which control the address range for which that feature works. The Linux defaults were to allow 0..4GB-1)
Re: (Score:2)
The whitepaper is silent on the subject, but presumably there is some sort of secure authentication, if not an encryption scheme. If not, this protocol is more or less worthless.
Re: (Score:2)
If anyone's transmitting that kind of sensitive data over an unencrypted wireless link, then they're as ignorant as you.
um... (Score:2)
Yo dawg (Score:2)
I herd you like Windows drivers, so we put a low-level, low-latency bus protocol supported by a Windows driver on top of a low-reliability, high-latency protocol supported by a Windows driver, so you can use Windows drivers while you use Windows drivers.
Seriously, there is no excuse for this, other than providing this kind of illusory compatibility. Properly engineered systems have protocols optimized to efficiently use media throughput while taking latency into account. This thing can be best described as
Re: (Score:2)
Re: (Score:2)
The only imaginable goal of this is to provide driver-level compatibility without writing any new software, which would be utterly pointless on anything but Windows.
So... (Score:2)
Now all we will need is wireless power!
Though I think this already exists, based on some of these supposed Office freak-out videos where some bloke goes crazy and tosses a monitor, which mysteriously does not have any cables connected! One might assume that it was staged, but it's wireless power and wireless graphics!
Wireless Dicking Station! (Score:2)
Add a wireless power charger for the laptop batteries and you may have created the world's first foolproof and universal docking station for laptops and the dumb-ass fools who use them.
Wired? Only one cable more! (Score:2)
Even 60GHz has a very limited transmission range. I prefer the wired option over wireless, because anything sent over the air can be intercepted and influenced by nearby devices or transmitters.
I like the idea of separating elements from your PC and starting to wire(lessly) connect them together, so your PC can be used not only at your office but also in your living quarters, bedroom, or anywhere you've got a station with a monitor. Security has to come to mind, and a wired option sh
Re: (Score:3, Funny)
Re: (Score:2)
Just change a few physical constants to open up some more bandwidth... Emacs has a command for that, right?
Re: (Score:2)
But can't you smell the per-monitor pricing scheme coming up?
Re: (Score:2)
like cable vod? wait having it at the headend soun (Score:2)
Like cable VOD? Wait, having it at the headend sounds like a better idea than what they have now.
Re: (Score:2)
At 60GHz those homes better be very close together and very small. Is your house wider than 10m?
Re: (Score:2)
I would bet it would be for an overhead display or something like that, not Crysis.
Re: (Score:2)
Not to mention popping this sucker into a smartphone would allow another large degree of mobile computing.
Yeah, that was the thought that popped into my head. It would be cool to wirelessly and effortlessly connect my super-powered smartphone to a keyboard, mouse, and monitor. With all the computing power being crammed into smartphones, that would be a really awesome way to set up home and office workstations. I'm not talking about running Crysis, but for web surfing and document editing this would be a cool application.
Re: (Score:2)
Agree. My immediate thought upon reading the intro was "use with virtually tethered tablet for use near a base station, like in the house."
The speed of the connection allows for an ultralight tablet with "unlimited" supportive stuff off the tablet and located at the base.
Doesn't appear to do anything I can envision for the true classic desktop itself; perhaps improved server connection at home? Dunno.
And it will introduce another wireless standard with which to confuse the masses.
And to add to phones.
Re: (Score:2)
Re: (Score:3, Informative)
We've already had external GPUs for laptops. They failed horribly. That is why you don't see them.
Re: (Score:2)
Why encode? The link is fast enough to transmit HD video unencoded.
Re: (Score:2)
What kind of transmission range are you envisioning 60GHz to have? Because from my understanding, you'd be lucky to get a couple feet at best.