
Wireless PCIe To Enable Remote Graphics Cards 181

Posted by timothy
from the no-cat-involved dept.
J. Dzhugashvili writes "If you read Slashdot, odds are you already know about WiGig and the 7Gbps wireless networking it promises. The people at Atheros and Wilocity are now working on an interesting application for the spec: wireless PCI Express. In a nutshell, wPCIe enables a PCI Express switch with local and remote components linked by a 60GHz connection. The first applications, which will start sampling next year, will let you connect your laptop to a base station with all kinds of storage controllers, networking controllers, and yes, an external graphics processor. wPCIe works transparently to the operating system, which only sees additional devices connected over PCI Express. And as icing on the cake, wPCIe controllers will let you connect to standard Wi-Fi networks, too."
This discussion has been archived. No new comments can be posted.

  • That is coooooooooooool.
    • Although interesting, I imagine it will be impractical for many devices that would be adversely affected by the latency, even if the bandwidth was suitable. Additionally, wireless networks come with a certain amount of packet loss, which means even if they are encapsulating the PCI bus protocols over a TCP stream with large enough buffers to queue up lost packets, the huge burst in latency would most certainly cause strange behavior for many devices as modern OS's probably assume that if a PCI device does n
      • Re:I must admit... (Score:5, Informative)

        by iamhassi (659463) on Wednesday July 14, 2010 @06:36PM (#32908036) Journal
        "I imagine it will be impractical for many devices"

        You're right, and the summary is wrong and the article's a bit misleading.

        "... will let you connect your laptop to a base station with all kinds of storage controllers, networking controllers, and yes, an external graphics processor."

        Sorta... PCIe 16x is 16 GB/s [wikipedia.org], that's with a big B for bytes. They're hoping for 7Gbps, or 875 MB/s. [techreport.com] "the spec should move "quickly" to 7Gbps (875MB/s)." That's 1/20th the speed of 16x PCIe. They might be able to do PCIe x1 but that's it.

If they had read the whitepaper, this is all explained [wilocity.com]:

        "A reliable wPCIe connection can be maintained with a relatively low data rate channel. However, to achieve meaningful performance between local and remote devices, the data rate needs to be on the order of 2 Gbps, or that of a single lane of PCIe. The only practical wireless channel that can support this capacity is 60 GHz."

So basically this can transfer wirelessly at ~500+ MB/s, so you can have wireless BD-ROM, wireless hard drives, and yes, even wireless displays, since it's fast enough to transfer 1080i without any compression. [ttop.com] But I'm sorry to dash the hopes of anyone who thought they could someday upgrade their laptop's video card by simply buying a wireless external Radeon HD 5970 or GeForce GTX 480; you will still need a GPU connected by 16x PCIe to process the video and then stream it, similar to what the OnLive remote gaming service offers now. [slashdot.org]
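The parent's arithmetic is easy to sanity-check in a few lines (a quick sketch only; the 7Gbps and 16GB/s figures come from the comment above, not from any spec document):

```python
# Convert the quoted link rates from gigabits/s to megabytes/s (decimal
# units) and compare against a full 16x PCIe slot as the parent does.

def gbps_to_mb_per_s(gbps: float) -> float:
    """Gigabits per second -> megabytes per second (decimal units)."""
    return gbps * 1000 / 8

wpcie = gbps_to_mb_per_s(7)   # the hoped-for 7Gbps wPCIe rate
pcie_x16_mb = 16 * 1000       # 16 GB/s, the parent's figure for PCIe 16x

print(f"7 Gbps = {wpcie:.0f} MB/s")                        # 875 MB/s
print(f"roughly 1/{pcie_x16_mb / wpcie:.0f} of PCIe 16x")  # ~1/18
```

The exact ratio works out closer to 1/18 than the parent's rounder "1/20th", but the conclusion is the same: about one PCIe lane's worth of bandwidth.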
        • Re:I must admit... (Score:5, Interesting)

          by ThreeGigs (239452) on Wednesday July 14, 2010 @07:26PM (#32908390)

          What are the possibilities of channel bonding, though? WiFi has 11 channels, is it possible to build a sender/receiver pair that can move data over multiple channels at once? Perhaps soon there will be 7Gbit, then 14Gbit, then 21Gbit, etc implementations. Need more bandwidth? Add more radios.

          • by tyrione (134248)

            What are the possibilities of channel bonding, though? WiFi has 11 channels, is it possible to build a sender/receiver pair that can move data over multiple channels at once? Perhaps soon there will be 7Gbit, then 14Gbit, then 21Gbit, etc implementations. Need more bandwidth? Add more radios.

            We're already crowding the spectrum enough with wireless standards. Do we need more waste crowding in on an already crowded space?

            • Re:I must admit... (Score:4, Informative)

              by blankinthefill (665181) <blachanc.gmail@com> on Wednesday July 14, 2010 @08:50PM (#32908896) Journal
              They're talking about using the 60GHz bands, which are heavily limited by line of sight, and have extremely poor to no penetration of physical objects. Those facts make it perfect for this kind of high bandwidth, short range application, without further cluttering the spectrum for those around you.
            • by flatulus (260854)

              The 11 WiFi channels (in the 2.4 GHz band) overlap each other such that there are really only 3 non-overlapping channels possible (1, 6, 11). Remember that WiFi (actually pre-WiFi 802.11) started out with 1 Mbit/s transmission speeds. At that time, 5 MHz channel spacing allowed 11 non-overlapping channels. But with 802.11b (which was the first "WiFi" version) and ever since, channel widths have been at least 22 MHz - hence only 3 usable channels.

              At 60 GHz, I think they're going for speed in a single chan
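The 1/6/11 result the parent describes falls out of simple arithmetic on channel center frequencies (a quick sketch; channel 1 is centered at 2412 MHz with 5 MHz spacing per the 802.11 channel plan, and the ~22 MHz width is the figure the parent cites):

```python
# Greedily pick 2.4 GHz Wi-Fi channels whose centers are at least one
# channel-width apart; everything closer than that overlaps.

def nonoverlapping(channels=range(1, 12), spacing_mhz=5, width_mhz=22):
    chosen = []
    for ch in channels:
        center = 2412 + (ch - 1) * spacing_mhz  # channel 1 centered at 2412 MHz
        if all(abs(center - c) >= width_mhz for _, c in chosen):
            chosen.append((ch, center))
    return [ch for ch, _ in chosen]

print(nonoverlapping())  # [1, 6, 11]
```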

          • by MattskEE (925706)

            The FCC has set aside 57-64GHz for high speed unlicensed wireless use. If the IEEE 802.15.3c standard is followed, then this is divided up into 4 channels.

          • by wgoodman (1109297)

            WiFi has 11 channels, but of those, only 3 don't actually overlap.

        • by cgenman (325138)

          Good analysis, though one caveat: fast enough to do 1080i is really just fast enough to do 720p. And 720p is really only adequate for very small laptops. Anyone attempting to do 1080i on a PC (instead of 1080p) either has no idea what they're doing or they don't need my validation.

          Wireless disk readers and HDD's, though, are an interesting application. Inter-device networking at 500 MB/s might be fast enough for Avatar-style fast swapping of files between base stations and laptops. And 500 MB/s might al

        • Re:I must admit... (Score:5, Interesting)

          by boyko.at.netqos (1024767) on Wednesday July 14, 2010 @09:16PM (#32909016)

          True, but consider this possibility:

          Right now everyone's looking at the traditional model. That is, a portable CPU connected to a GPU connected to a display, and adding in a wireless form factor to it.

What if, instead, the base station contained the CPU AND the GPU connected directly together, much like a desktop system now, to do all the hard math and 3D rendering, and then output a wireless PCIe signal, which is picked up by the portable device, like a netbook with a basic GPU, a small processor, and little to no HD space? Its only job, much like a thin client's, would be to provide you access to the computing power in the "main" section of the house.

          It would be like having a docking station for your netbook that turns it into a desktop powerhouse - only you could walk around the house with it. And, when the time comes that you want to take it outside, you still have the basic capabilities of a netbook.

          That might be a product worth selling to, say, a family of four. "You can pay for four notebooks, or four netbooks and this powerful base station".

          • like a netbook, with a basic GPU, a small processor, and little to no HD space? It's only job would be, much like a thin client - would be to provide you access to the computing power in the "main" section of the house.

            You mean like you can do right now by using the netbook as an X terminal?

            That might be a product worth selling to, say, a family of four. "You can pay for four notebooks, or four netbooks and this powerful base station".

            Much as people use old PCs as X-terminals (I have done it in a small office), except with the advantage of portability. I was thinking of buying a netbook to use in just this way.

          • Re: (Score:2, Informative)

            by theun4gven (1024069)
            But you couldn't walk around the house with it. You could walk around the same room. Next room, maybe, but certainly not two rooms over nor even another floor.
          • This also provides work for multi-core CPUs. What better task for a central, 16+ CPU machine than to serve separate applications to each member of a family? Everybody runs their own portable client and the big box in the basement does all the heavy lifting.
        • by 56ker (566853)

          Wireless hard drives sound cool, but what about latency and lag?

I'm on the internet wirelessly, and I do get latency problems from time to time due to the extra "hop".

          As far as I'm concerned the internet is like a wireless hard drive to me. :) Ok, a wireless network of networks.

          Graphics at a decent FPS rate does require huge amounts of bandwidth at a decent screen resolution. Just look at how online video (eg Youtube) is buffered before it starts.

          A wireless network at speeds whereby wireless hard drives and wir

      • Retro Tech (Score:2, Interesting)

        Wait, so you're saying they'll be able to send a continuous color video stream THROUGH THE AIRWAVES??? Wow, that's so incredible! I bet they wish they'd had this technology back in the middle of the last century...
    • by Peach Rings (1782482) on Wednesday July 14, 2010 @04:49PM (#32907050) Homepage

      You'd better get used to your computer experience looking like thaaaaaaaaat if your display has to be sennnnnnnnt over a wireless linnnnnk.

      • Re:I must admit... (Score:5, Informative)

        by inKubus (199753) on Wednesday July 14, 2010 @05:04PM (#32907246) Homepage Journal

        Not to mention security. I mean, you thought Tempest [wikipedia.org] was bad before, now I can wirelessly sniff and alter PCI traffic, which is a direct conduit into the RAM.

        • Does the PCI bus really work that way? Are you sure that the device controls where the data goes into memory? I would have thought that the destination is safely set up in software to point somewhere harmless like a raw data buffer, and then the device dumps into that spot.

          • Re:I must admit... (Score:5, Interesting)

            by JesseMcDonald (536341) on Wednesday July 14, 2010 @06:00PM (#32907688) Homepage

Some recent systems have IOMMUs which provide privilege separation between hardware devices, much like normal MMUs govern software. However, unless this sort of IOMMU device is active, PCI and PCIe hardware is generally capable of transferring data to or from any other connected device, including any area of system RAM. Sometimes this can even extend to external interfaces; for example, people have been known to take advantage of the DMA capabilities of the Firewire protocol to read the contents of RAM on an active system.

            In general, non-hotpluggable hardware has been granted the same level of trust as the OS kernel, so no one worried very much about it. IOMMUs were more about protecting against faulty or corrupted software (device drivers) than malicious hardware. However, more and more hardware is hotpluggable these days. Also, some software interfaces are becoming too complex to really trust—consider, for example, the interface to a modern GPU, which must transfer data to and from RAM, and perhaps other GPUs, under the control of code provided by user-level application software (shaders, GPGPU). Without an IOMMU it is up to the driver software to prove that such code is perfectly safe, which is an inherently hard problem.
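As a side note on the IOMMU point above: on a modern Linux box you can check whether IOMMU isolation is actually in effect by listing the kernel's IOMMU groups (a small sketch; /sys/kernel/iommu_groups is the standard sysfs location, and an empty result simply means no IOMMU is enabled):

```python
# List IOMMU groups and the PCI devices assigned to each one. Devices in
# the same group are not isolated from each other by the IOMMU.

from pathlib import Path

def iommu_groups():
    root = Path("/sys/kernel/iommu_groups")
    if not root.is_dir():
        return {}  # no IOMMU active (or very old kernel)
    return {
        group.name: sorted(dev.name for dev in (group / "devices").iterdir())
        for group in root.iterdir()
    }

for group, devices in sorted(iommu_groups().items(), key=lambda kv: int(kv[0])):
    print(f"group {group}: {', '.join(devices)}")
```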

          • Yes. All low level devices are wired to the CPU by the memory bus. Writing data to a PCI card is simply a matter of writing to a certain memory address. The PCI card will see a specific address on the memory address bus and know the data on the data bus is intended for it.
            It's not like x86 CPUs have one bus for devices and one for memory.

        • Re:I must admit... (Score:5, Insightful)

          by blueg3 (192743) on Wednesday July 14, 2010 @06:33PM (#32908010)

          You're unlikely to be able to *alter* PCI traffic, though you could perhaps *insert* PCI traffic.

          Still, people figured out properly encrypting wireless links some time ago. Tempest is primarily interesting because the signals you're looking at are unintentional (and often unknown) side effects and they often deal with links that are impossible or unreasonable to encrypt.

        • by grcumb (781340)

          Not to mention security. I mean, you thought Tempest [wikipedia.org] was bad before, now I can wirelessly sniff and alter PCI traffic, which is a direct conduit into the RAM.

          Yep. Can't wait. Endless fun and games at the next Powerpoint presentation by Corporate.

        • by TeknoHog (164938)
          So if you use this for a certain OpenGL demo on an external GPU, is that a tempest in a teapot?
      • Re:I must admit... (Score:5, Informative)

        by Entropius (188861) on Wednesday July 14, 2010 @06:00PM (#32907692)

        It's not that bad -- I've done it before.

        X Windows over plain old wifi.

    • by morcego (260031)

Is it? From the specification, you can read:

      Support for beamforming, enabling robust communication at distances beyond 10 meters

So, the standard range is less than 10 meters? This is anything but awe-inspiring.

      Maybe the text is misleading, and it is not a standard 10m range. But that is the impression I get ...

      • by Rivalz (1431453)

        I just hope Apple isn't behind this technology.

You'll get a degraded signal if you hold or sit next to your computer. And if someone is using a microwave within 100 meters you lose signal completely. But that is a feature so you know when your food is done cooking.

      • by h4rr4r (612664)

What did you expect at 60GHz?
        This signal will not penetrate a sheet of paper.

    • That was my first thought as well.

      My second thought was "I wonder how they're going to handle security and authentication?" Which rather took the shine off my first thought, I'm afraid.

    • by arivanov (12034)

No thank you. I can see their argument: WiGig does a few cm and is a cable-replacement tech. You cannot even send across the room with it. It is, however, yet another "microwave your brain" in the house.

I'd rather have PCIe over optical and a standard dockable optical connector that can tolerate at least a few thousand dock/undock cycles on my laptop. Is it that difficult?

  • Question (Score:3, Insightful)

    by The Clockwork Troll (655321) on Wednesday July 14, 2010 @04:45PM (#32906994) Journal

    To those in the know, why will this succeed where UWB/wireless USB failed in the market?

Remote graphics seems like an even more esoteric need than the remote mass storage, printing, and cameras that UWB would have offered.

    • Re:Question (Score:4, Insightful)

      by fuzzyfuzzyfungus (1223518) on Wednesday July 14, 2010 @05:31PM (#32907490) Journal
I have no idea whether it will go anywhere; but I'd assume that the one major point in its favor is that, unlike wireless USB, wireless PCIe addresses use cases that basic boring ethernet/wifi do not.

      The performance of early wireless USB hardware was pretty shit, and it was uncommon and ill standardized, so you usually still had to plug a dongle in, just to get performance worse than plugging in a cable. Plus, basic NAS/print server boxes had become really cheap and fairly easy to use. Anybody who wasn't a technophobe or living in a box(and thus not the target market for pricey and sometimes flakey wireless USB) already had his mass storage and printers shared over a network, wired or wireless, and his human interface devices wireless via bluetooth or proprietary RF, if he cared about that. Wireless USB didn't really enable any novel use cases that anybody cared about.

On the other hand, there is basically no way of plugging in "internal" expansion cards over a network (in the home context; I'm sure that some quite clever things have been done with I/O virtualization over infiniband, or whatever). Particularly with the rise of various "switchable graphics" technologies, I assume that the use case is basically this: User has a nice thin, light, long-running laptop. They come home, sit within a dozen meters of a little box (containing a graphics card or two, with one head connected to their TV), and suddenly their laptop has a screaming gamer's graphics card supplementing the onboard card, either usable on the built-in screen, or via the second head connected to the TV, or both. (Analogs could be imagined for professional mobile workstation applications, where simply sitting near your desk connects you to a quartet of CUDA cards and an SAS controller with 4Us worth of drives hanging off it.)

Will the market care, enough to bring the volume up and the price down? I have no idea. However, it at least has the advantage of allowing things not otherwise possible, unlike wireless USB, which pretty much covered the same ground as a mixture of bluetooth peripherals and resource sharing protocols over TCP/IP, but years later and without the standardization.
      • by Luyseyal (3154)

        The best use case I can think of for this is when my laptop finally shrinks into my phone. I don't have to attach a zillion wires to the phone to get it to operate as my computer.

        -l

Unless Mr. Fusion has come on the market by then, you'll probably be attaching a power cable in any case, so having a standardized combination of power cable/high-density PCIe connector would presumably be even easier (except, of course, that the cellphone guys will standardize shortly after hell freezes over).
    • by Urza9814 (883915)

      Yes, but remote graphics is much more difficult to do over WiFi. When you already have a router, why buy wireless USB or UWB devices, which you need a special dongle or card for, when you can just buy one with WiFi and be done with it. Plus, if you're doing wireless, you're likely using it for multiple PCs, which is even more reason to go with something you already have. Who's going to buy a separate $20-$50 dongle for every computer they want to print from, for example, when they don't need to? 802.11g off

    • This is all about hooking up to your TV without requiring a bunch of setup from your laptop or a dedicated computer. Instead of fucking around with cables every time you want to hook your laptop up, you just have to plug a box in permanently and enable it on your laptop when you want to. I don't know if it'll catch on, but it's definitely aiming for a different niche.
    • by flatulus (260854)

      Not completely sure, but I think one problem with UWB is that the power levels had to be set SO low to appease those *licensed* operators of the spectrum it overlaps that performance just ain't all that great.

      Then again, it could be that UWB was torpedoed by the Osborne Effect of having 60 GHz coming "in the near future." Honestly, electronics evolve so fast it's hard to understand how anybody makes a buck in "latest/greatest technology" anymore. I was looking at hard drives this morning. Seems the new p

  • by Drakkenmensch (1255800) on Wednesday July 14, 2010 @04:45PM (#32906998)
    We'll soon have ONE MORE wireless signal to keep track of, when all those we already have work so well together!
    • Yeah, this'll be awesome for early adopters and then start to suck as soon as their neighbors get it, kind of like how 802.11g sucks but N is still OK (for now).

      Also, people getting pwned in online games will stop saying "fucking lag!" and start saying "goddamn microwave!"

    • by cgenman (325138)

      Thankfully, the 60 GHz spectrum is unlikely to get that polluted. It has severe penetration problems. If you have a cup of water (or a screaming child) between your laptop and a base station, the signal would likely be interrupted. Having a few walls between you and your neighbors should be fine.

      This is in stark contrast to 802.11*, which can pollute for 1/2 a block.
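The propagation difference the parent describes shows up even in the standard free-space path loss formula, FSPL(dB) = 20·log10(d) + 20·log10(f) − 147.55 with d in meters and f in Hz (a rough sketch; this ignores wall attenuation and oxygen absorption, both of which hit 60GHz far harder than 2.4GHz):

```python
# Free-space path loss at 2.4 GHz vs 60 GHz. The frequency term alone
# costs 60 GHz about 28 dB relative to 2.4 GHz at any given distance.

import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

d = 10  # meters
print(f"2.4 GHz @ {d} m: {fspl_db(d, 2.4e9):.1f} dB")
print(f"60 GHz  @ {d} m: {fspl_db(d, 60e9):.1f} dB")
print(f"difference: {fspl_db(d, 60e9) - fspl_db(d, 2.4e9):.1f} dB")  # ~28.0 dB
```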

  • "Band"-aid (Score:2, Interesting)

    by Ostracus (1354233)

Nice, but what's the range, and is the spectrum licensed, or will we end up dealing with a "tragedy of the commons" much like the 2.4 GHz band?

  • That sounds like a wonderful idea and the thought of having a wireless graphics card for a laptop is very tempting.

    But how much performance can we really squeeze out of it? I mean, for a power user who wants a higher resolution than his integrated card can offer, it's a godsend. But for gaming? No way.

    Also, I'll admit I'm not very wise on the technical details of PCIe, but if you're putting all of the above-mentioned devices in contention for 7Gbps of bandwidth, there's really not a lot you can milk from

    • by geekoid (135745)

I also forgot, you can get laptops, right now, that are comparable to desktops. I consider them a complete waste, but that's beside the point.

      • I only paid around $1k for my laptop recently and there are no games currently available that I can't run on at least medium settings. And that's even opting for a better screen than graphics card when I made my purchase. Can I game at super hacker leet graphics levels? No, but I can play all modern games with decent settings and a decent framerate.

        • I bought one of those a couple years ago, mostly because I planned to do video editing on it; gaming was a nice bonus.

          It's a piece of shit. Runs so hot that everyone who uses it comments on the heat, but if you throttle it any it feels crazy-slow. The heat's so bad that if I don't make some sort of special arrangement for it to sit up where it can get airflow, it'll overheat and shut down during games (or sometimes just while playing back a video!). The damn rear vent points down and to the back at a 45

      • I suspect that the main factor is not the price premium associated with powerful laptops(which is much more modest than it used to be, though still nonzero); but the heat/weight/battery life penalty.

        A screaming gamer laptop is actually pretty reasonably priced these days, and only a bit slower than the screaming gamer desktop. However, it is still hot, heavy, and loud, and doesn't get thrilling battery life.

        The convenience would be being able to buy a thin-and-light with excellent battery life, that c
        • by cgenman (325138)

          It helps a lot that recently PC game developers have been targeting the Xbox 360 and PS3 as their main platforms. These platforms are unchanging, effectively locking developers to the state-of-the-art circa 2005. This keeps PC builds to a lower visual standard, which gives mainstream laptops a chance of actually keeping up.

          Once we see new consoles launched (possibly 2012 or 2013), we'll see developers targeting Poly / RAM budgets of those machines. The PC build requirements will then go through the roof,

  • by Joe The Dragon (967727) on Wednesday July 14, 2010 @04:52PM (#32907092)

PCIe x1 is too slow for all of that; video will suck at that speed, and then you want to add more I/O to it?

  • by ultramk (470198) <ultramkNO@SPAMpacbell.net> on Wednesday July 14, 2010 @04:58PM (#32907178)

    The best feature of this proposed standard is that if you place a ceramic mug directly between your CPU and the external graphic processor, it will keep your (coffee, soup or whatever) steaming hot, all day long! Those days of long WoW raids with only cold beverages and snacks are over!

    • I “read the book” (as they used to say), and that will only work if you put it on a rotating platter. Or just use a spoon.

      Hmm, from what I know, this should actually work (using a spoon to make the fluid rotate in the mug in the field). But I doubt you can buy a 800W wireless transmitter in your normal electronics shop. ;)

  • by starslab (60014) <andrew&skyhawk,ca> on Wednesday July 14, 2010 @04:59PM (#32907180) Homepage
I will admit some incredulity when I read the title. "Wireless *what?!*"

    Very cool stuff if it materializes.

Imagine a small lightweight machine with say an ULV i3 or i5 CPU, small-ish screen and weak-ass integrated graphics. Place the machine on its docking pad (no connectors to get bent or boards to break) and suddenly it's got (wireless?) juice and access to kick-ass graphics, and a big monitor, as well as whatever else is in the base station.

    A desktop replacement that remains light and portable for road warriors, with none of the fragility associated with docking connectors. With those transmissions speeds I presume this is going to be a point-blank range affair, so snooping shouldn't be (much?) of a problem.
    • Wilocity told us that wPCIe can push bits at up to 5Gbps (625MB/s), and that the spec should move "quickly" to 7Gbps (875MB/s).

      If you consider that PCIe 16x is 16GB/s (128Gbps), this is very underwhelming. Call me a sceptic but I don't see a real-world application of "wireless PCI-E" that is slower than a 1-lane PCI-E. Well, at least a real-world application regarding graphics...

    • by w0mprat (1317953)
      Your imagination is not going far enough. Imagine just merely placing wPCIe enabled PC components on a desk, getting power from an inductive pad even. Your rig is a cluster of bits with no connecting wires. wPCIe really means that your system southbridge chip is going to be a kind of wireless access point to whatever devices are a metre or two away.

      wPCIe enabled hard drives will completely erase the need for both 'internal' and 'external' HDDs.

      You'll have small flat box with a motherboard + CPU and Ra
      • by cgenman (325138)

        Do remember, practically speaking that's 500 MBps for all of your devices. If you have 3 hard drives, a monitor, a card reader, a keyboard and mouse, that's about 100 MBps left for each of them. And when this finally comes out, 500 MBps will seem even smaller.

        I remember similar claims about Bluetooth. It was going to be the universal standard, trivial networking between everything, yadda yadda. Ultimately, it's kind of a pain to connect bluetooth devices. "Is this my mouse? No, it's not showing up. O
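The parent's rough division of the shared link can be sketched in two lines (numbers are illustrative, as in the comment above; real contention would not split perfectly evenly):

```python
# One ~500 MB/s wireless PCIe link shared by several devices leaves each
# device only a fraction of the total under simultaneous load.

link_mb_per_s = 500  # assumed usable throughput of the shared link
devices = 5          # e.g. 3 hard drives + display + card reader

print(f"~{link_mb_per_s // devices} MB/s per device under full load")  # ~100 MB/s
```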

    • by jhfry (829244)

      With those transmissions speeds I presume this is going to be a point-blank range affair, so snooping shouldn't be (much?) of a problem.

      But even at point blank ranges, you still need to worry about interference.

Seriously?! With the timing and precision required by the GPU to interface with the rest of the system, do we really want it bridging over a wireless link (60GHz)? Of all the devices, this is one peripheral I'd want to leave with physical bus access (electron flow). That, and the CPU and RAM.

    • by Tapewolf (1639955)
      I think the fun part is when the video card is disconnected or the signal strength drops. Bus dropouts should be especially fun when the GPU is running some kind of program code...
    • by geekoid (135745)

      well if you can't see how it would be useful, then clearly it's no good~

  • Now hidden cameras will be able to stream up-skirt videos in HD!
  • by jtownatpunk.net (245670) on Wednesday July 14, 2010 @06:03PM (#32907712)

    Let's say I've got even a little building with 50 people who want to use this. Will I be able to pack 50 of these point-to-point units into a building and have all of these systems perform at peak capacity without stepping all over each other? That would be amazing.

    And, aside from the technical issues of getting it to work well in a dense environment, there's still one cord that needs to be connected to the laptop. Power. If I have to plug that in, I may as well snap the laptop into a docking station and skip the wireless connection entirely. One connection is one connection and I won't have to worry about interference, security, bandwidth, etc.

    • by geekoid (135745)

What are you doing where all the people need to be pushing 7Gbps constantly across the bus? If that's the case, then it's probably not for that situation. Most people in most offices don't need to be using that kind of data all the time.

You could create a reliable system so you could take your laptop anywhere and have it display on a large screen or projector. So you walk into a meeting room and it links up. You want to display something on your TV; it links up.

      Perhaps you have a hand held device and want to sh

      • Did you miss the part where they're talking about docking stations with video cards built in, USB3, network, etc.?

        "The first applications, which will start sampling next year, will let you connect your laptop to a base station with all kinds of storage controllers, networking controllers, and yes, an external graphics processor."

        I don't know how your company works but, around here, we expect people to show up at roughly the same time every day and...erm...work. Like simultaneously. And, yes, many of our l

  • As a skeptical person who usually maintains a scientific 'prove your crazy theory if you expect my buy in' ideology...

I have to say, if you had to bet money on what wireless technology actually WILL cause cancer and your options are cell phone, wireless access point, or wireless PCIe, I think wireless PCIe would win.

  • The diagram shown at TFA indicates a single PCIe lane (x1) is provided. What PCIe devices would benefit from being wireless?

    • Graphics Card? 1x is hardly the cutting edge in graphics card bandwidth. Modern mainboards often have x16 slots for graphics cards. Decent if you must do *wireless* video, but inferior to say HDMI from a laptop.
    • Wired NIC? Maybe, but 802.11n has similar bandwidth with actual range.
    • USB host controller? Seems kinda silly given the ubiquitous ultracheap wireless mice and keyboards
    • Re: (Score:3, Informative)

      by Mr. DOS (1276020)

      Graphics Card? 1x is hardly the cutting edge in graphics card bandwidth.

      And yet, it's all the bandwidth I need to attach a less-powerful video card (such as the Matrox G550 [matrox.com], which can run off a PCIe x1 slot) to my laptop, allowing me to dock onto another monitor or two on my desk quickly and easily.

  • In a nutshell, wPCIe enables a PCI Express switch with local and remote components linked by a 60GHz connection. The first applications, which will start sampling next year, will let you connect your laptop to a base station with all kinds of storage controllers, networking controllers, and yes, an external graphics processor.

    I don't know about you, but I don't want to have something operating at 60GHz sitting in my lap, thanks... I'll stick to super-long HDMI or DVI cables if I need to route a monitor sig

  • Wireless PCI Express? Awesome. I'll just walk by with a specially designed device, master the bus, and DMA the entire contents of your RAM over to a laptop. Then I'll change some interesting bytes here and there, and DMA it back.

    This sounds like the dumbest attack vector since FireWire came out with physical DMA support.

    • by Animats (122034)

      This sounds like the dumbest attack vector since FireWire came out with physical DMA support.

      Yes. As I once pointed out on the Linux kernel mailing list, the FireWire driver enabled external access to physical memory by default, via an un-commented line of code. Only for the first 4GB, though; nobody had updated that backdoor for 64-bit. (There are hardware registers which control the address range for which that feature works. The Linux defaults were to allow 0..4GB-1)

    • by butlerm (3112)

The whitepaper is silent on the subject, but presumably there is some sort of secure authentication, if not an encryption scheme. If not, this protocol is more or less worthless.

    • by Ant P. (974313)

      If anyone's transmitting that kind of sensitive data over an unencrypted wireless link, then they're as ignorant as you.

  • Let me think about this... 10lb computer in my lap, $500 wireless video card across the room by the screen... OR... Computer across the room by the screen and 10oz, $30 wireless keyboard in my lap. What am I missing here? Oh wait, I want my neighbors to not only steal my internet connection, I'd also like them to be able to stream everything I do on my desktop live.
  • I herd, you like Windows drivers, so we have put a low-level low-latency bus protocol supported by Windows driver on top of a low-reliability high-latency protocol supported by Windows driver, so you can use Windows drivers while you use Windows drivers.

    Seriously, there is no excuse for this, other than providing this kind of illusory compatibility. Properly engineered systems have protocols optimized to efficiently use media throughput while taking latency into account. This thing can be best described as

    • by yuhong (1378501)
      It isn't just Windows, it is any OS that supports PCI.
      • by Alex Belits (437) *

The only imaginable goal of this is to provide driver-level compatibility without writing any new software, which would be utterly pointless on anything but Windows.

  • Now all we will need is wireless power!

    Though I think this already exists, based on some of these supposed Office freak-out videos where some bloke goes crazy and tosses a monitor, which mysteriously does not have any cables connected! One might assume that it was staged, but it's wireless power and wireless graphics!

Put a wireless power charger for the laptop batteries and you may have created the world's first foolproof and universal docking station for laptops and the dumb ass fools who use them.

Even 60GHz has a very limited transmission range. I prefer the wired option over wireless, because anything sent over the air can be intercepted and influenced by nearby devices or transmitters.

I like the idea of separating elements from your PC and wirelessly connecting them together, so your PC can be used not only at your office but also in your living quarters, bedroom, or anywhere you've got a station with a monitor. Security has to be kept in mind, and a wired option sh
