
Ubiquitous Multi-Gigabit Wireless Within Three Years

Anonymous Howard passed us a link to the Press Escape blog and a post about the future of ultra-fast wireless connectivity. Georgia Tech researchers have unveiled plans to use ultra-high frequency radio transmissions to achieve very high data transmission rates over short distances. In a few years, the article says, we'll have ubiquitous multi-gigabit wireless connectivity, and the researchers already have some significant advances under their belts. "The GEDC team has already achieved wireless data-transfer rates of 15 gigabits per second (Gbps) at a distance of 1 meter, 10 Gbps at 2 meters and 5 Gbps at 5 meters. 'The goal here is to maximize data throughput to make possible a host of new wireless applications for home and office connectivity,' said Prof. Joy Laskar, GEDC director and lead researcher on the project along with Stephane Pinel. Pinel is confident that very high speed, point-to-point data connections could be available in less than two years. The research could lead to devices such as external hard drives, laptop computers, MP3 players, cell phones, and commercial kiosks that could transfer huge amounts of data in seconds, while data centers could install racks of servers without the customary jumble of wires."
This discussion has been archived. No new comments can be posted.

  • Do not underestimate the bandwidth of a hard drive being passed to the friend next to you.
    • These speeds are basically marketing hype; they need to start declaring that 54 Mbps is half-duplex. We all know marketers love big numbers, but what they don't tell you is that it runs at half-duplex, so you're lucky to even get half of the rated speed. Although, at higher bandwidth rates (and hopefully increased throughput!) this will become less and less of a problem.

      Multi-Gigabit [half duplex] FTW!...(ish)
  • by Aqua_boy17 ( 962670 ) on Thursday July 19, 2007 @02:03PM (#19917537)

    we'll have ubiquitous multi-gigabit wireless connectivity, with some significant advances already under their belts.
    If they're running this from laptops for extended periods, that may be the only thing remaining under their belts.
    • by Moby Cock ( 771358 ) on Thursday July 19, 2007 @02:14PM (#19917719) Homepage
      Technically speaking, at 60GHz, you'd be millimetrewaving your privates.
    • by drakaan ( 688386 ) on Thursday July 19, 2007 @02:16PM (#19917747) Homepage Journal

      while data centers could install racks of servers without the customary jumble of wires

      Somehow I don't see "whole data centers" using a data transmission method where any device can potentially intercept the data going to and coming from any other device. Might make your hosting clients a bit nervous.

      • by wurp ( 51446 )
        uh, encrypt the traffic...
        • by drakaan ( 688386 )
          Right...just like they do with WEP [securityfocus.com].
          Yes, you'd have to be stupid not to encrypt wireless traffic. That doesn't make it safe, though, assuming the data is actually worth obtaining.
          • by wurp ( 51446 )
            Just because some encryption mechanisms are crappy doesn't mean you can't put together good encryption.

            SSL with a reasonable key length is essentially unbreakable unless there's an exploit in the encryption software.

            Some day someone may come up with a trick to defeat it, but there are lots of other encryption mechanisms that are just as secure.

            Honestly, I don't get why wireless uses WEP or WPA. We've had VPNs and SSL secure enough for very valuable financial data for a long time; they are proven in the fiel
            • by drakaan ( 688386 )
              True, just saying that given the choice between leaking data all over the place and taking for granted that it's secured by some unassailable algorithm vs. having a fiber-optic cable transmitting the same data, sufficiently security-conscious people aren't going to opt for wireless...especially an unproven wireless technology.
              Good encryption means different things to different people, and it also means different things depending on the type of data being transmitted and its value to someone who wants to ge
              • by wurp ( 51446 )
                Honestly, I agree - for a long time to come, reputable datacenters won't (and shouldn't) use wireless except in limited circumstances. The common acceptance of the sad state of wireless security gets to me sometimes, and I tend to be reactionary when people claim it as a real problem rather than something easily solved if the standards boards and/or manufacturers would just raise the bar.

                In a datacenter, though, there is no reason to go with an unproven technology, especially when the alternative is to run
      • I didn't read TFA, but I find it hard to believe that there's enough spectrum available to permit a dozen or so racks of 1U servers to communicate with UHF signals. Especially if (as is becoming common) they're hooked up to both a SAN and a router. Couple the bandwidth required for both signals with frequency separation requirements (so signals don't interfere with each other) and pretty soon you've got signals spread across more spectrum than one antenna can handle effectively. Then what? Do you instal
        • by Ant P. ( 974313 )
          Why not use wireless without radio waves? Have racks connected by a bunch of low-power lasers at ceiling height. It'd look nice too.
    • by BlueParrot ( 965239 ) on Thursday July 19, 2007 @04:54PM (#19919301)
      It doesn't make any sense to make a network card emit microwaves at intensities similar to a microwave oven: not only would you get huge power consumption, it's also massive overkill unless you plan to search the sky for stealth bombers. The FCC (or local equivalent) would probably have a few things to say about it as well. The scaremongering about radiation from communications equipment is simply unbelievable. You are more likely to get hurt from tripping over a Cat5e cable.
      • by Kirkoff ( 143587 )
        While people do go crazy about the effects of "radiation" from electronics, there can be problems. In the VHF range, for instance, about 200 W can give you a severe burn. Different frequency ranges have different absorption levels in the body. The effects of electromagnetic radiation are somewhat cumulative, so a lot of RF at high power is not a good idea; they say that RF burns are like sunburn under the skin.

    • Even if you had a 2.4 GHz wireless device powerful enough to actually harm you (and you don't -- your cell phone is orders of magnitude more powerful than your WAP), you'd have to be unconscious for it to do any real damage, simply because you'd feel your body being heated by it and get the fuck away from it long before it heated you to a point where it could actually hurt you.

      So, yeah, if standing in front of your Linksys feels like you're in a 400 degree oven, then you have a problem. Good news is, its p
  • by Travoltus ( 110240 ) on Thursday July 19, 2007 @02:06PM (#19917591) Journal
    Maybe some lower security data centers might enable wireless, but I doubt it. Being that we're a financial institution (a small one, mind you), there's no way in the h to the e to the double hockey sticks that I'd ever enable any kind of wireless anything in our data center.

    I'd rather deal with a network cable gone sentient and whipping around like a snake and attacking people, than go wireless at the data center.

    Only an idiot thinks there's a wireless transmission that's invulnerable to being intercepted. Heck, wired communications aren't 100% secure, either, but my boss's business is about minimizing risk, and wireless networks even inside a data center is not minimizing risk.
    • by tedrlord ( 95173 )
      I figure you could put it in a faraday cage of some sort. Still, I'd prefer a little planning and cable management to several hundred machines and peripherals transmitting wirelessly any day. Especially since I have to spend days on end in there every so often.
      • Faraday cage wouldn't work. Any openings need to be smaller than the depth of the skin current, or the signal induced on the inside surface will just flow out through the cracks and re-radiate.
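        For scale (taking the 60 GHz figure from the article), the free-space wavelength is only

        $$\lambda = \frac{c}{f} = \frac{3\times10^{8}\ \mathrm{m/s}}{60\times10^{9}\ \mathrm{Hz}} = 5\ \mathrm{mm}$$

        and a common rule of thumb is to keep shield openings well under a tenth of a wavelength, so the mesh would have to be finer than roughly half a millimeter to do much good.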
    • Sure you can get 15Gbps, but if you start sharing that bandwidth among dozens of servers it wouldn't be all that fast anyway.
    • by walt-sjc ( 145127 ) on Thursday July 19, 2007 @02:25PM (#19917881)
      My little cage at the colo doesn't have 5 servers. It has hundreds. I'm also sharing that datacenter with many many other companies that have cages with hundreds of servers. We deal with SAN / iSCSI, NAS, backups over networks, etc. With the noise and limited bandwidth available in a shared frequency space, I seriously doubt any type of wireless will be very useful in a datacenter - especially since everything is already connected via hard-wired connections.

      It also won't be very useful in my home, where wires are already easy to run for the short-distance devices, and noise / distance prohibits the use in cases where I could really use and WANT high-speed wireless.

      So it does sound like a neat trick, but what is a valid, viable use case for it?

      I could REALLY use something much different. I want to get rid of the 20 or so wall-wart power supplies under my desk. I want one larger power supply from which I can run small cables to all the devices. Why can't devices negotiate for how much voltage / current they need?
        • I want something different too. I don't want higher speed, I want more range. I want one or two megabits at 30 miles NLOS. Either simple point-to-point, with many different 'channels' for separation, or point-to-multipoint. Of course, the question of whether such a thing is technically possible is irrelevant, because the telcos would kill it in its crib anyway.
          • I want something different too. I don't want higher speed, I want more range. I want one or two megabits at 30 miles

          Thirty miles is an alright start, but I'd like at least 100 or 200 mile range. I love hiking and photography and would like to be able to upload my photos wirelessly to a server. While it may be possible to do so with a 30 mile range, that would require a lot more tower transceivers.

          Falcon
      • > So it does sound like a neat trick, but what is a valid, viable use case for it?

        Maybe for your AV stuff? No wires between your DVD player, receiver, and TV would be nice.
      • My little cage at the colo doesn't have 5 servers. It has hundreds. I'm also sharing that datacenter with many many other companies that have cages with hundreds of servers. We deal with SAN / iSCSI, NAS, backups over networks, etc. With the noise and limited bandwidth available in a shared frequency space, I seriously doubt any type of wireless will be very useful in a datacenter - especially since everything is already connected via hard-wired connections.

        I've seen security rooms inside datacenters that had copper cloth over the windows, etc etc. What if every cage in the colo were a faraday cage? In theory, wouldn't that permit this? Or, how about UWB? Isn't UWB supposed to allow an effectively infinite number of transmitter/receiver pairs to operate together? If the whole building were shielded so that it wouldn't penetrate, it would eliminate interference issues.

        I still think that fiber is more desirable. I wish it were cheaper (although it's getting c

      • So it does sound like a neat trick, but what is a valid, viable use case for it?

        When my PC converges with my PDA, I want to be able to walk up to any display and use it without hauling a cable around. Better yet: I want HD video in my glasses, connected wirelessly to my PDA PC.

        I have an external hard drive for my work laptop. It would be nice to be able to connect to it wirelessly. I'd also like to be able to sync my laptop to a docking station wirelessly.

        There are all kinds of nifty things you can do

      • Why can't devices negotiate for how much voltage / current they need?

        They do — via the wall wart.

        I wouldn't be surprised if someone proposes a standard for low-voltage DC distribution in the home. You'll wind up with dual-socket outlets, with your standard AC socket and two to four 12V sockets. Maybe use a multi-bladed plug to determine how much current the device can sink (each blade signifies 500 mA, so a four-bladed plug can sink 2 A). Somebody else has probably already thought this out in detail, so I'll just wait for someone to post a link to a complete spec...

        • No, they don't "negotiate".

          Wall warts are DUMB. They have a transformer and a rectifier and sometimes a couple of caps / resistors, and occasionally (but rarely) a voltage regulator. That's it.

          USB and modern PoE systems can negotiate for current draw. That's kind of what I am talking about.
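          A toy sketch of what that kind of negotiation could look like (purely hypothetical protocol, names, and power tiers, invented for illustration; this is not the actual USB or 802.3af handshake):

          # Hypothetical power-negotiation handshake between a device and a "smart" supply.
          # The class names, message shape, and wattage budget are made up for illustration.
          from dataclasses import dataclass

          @dataclass
          class PowerRequest:
              volts: float
              max_amps: float

          class SmartSupply:
              def __init__(self, budget_watts: float = 60.0):
                  self.remaining_watts = budget_watts

              def negotiate(self, req: PowerRequest) -> bool:
                  """Grant a request only if the remaining power budget covers it."""
                  needed = req.volts * req.max_amps
                  if needed <= self.remaining_watts:
                      self.remaining_watts -= needed
                      return True
                  return False

          supply = SmartSupply(budget_watts=60.0)
          print(supply.negotiate(PowerRequest(volts=5.0, max_amps=2.0)))    # True: 10 W granted, 50 W left
          print(supply.negotiate(PowerRequest(volts=19.0, max_amps=3.0)))   # False: 57 W exceeds the 50 W left

          Real negotiation schemes add plug detection and fallback tiers, but the budget-accounting idea is the same.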
    • That's exactly why, when I first read about this, I thought the appeal of high speed wireless would mostly be on the consumer end. Most businesses are bound to see the potential security risk of wireless and stay away from it, regardless of how fast it is.

      As I don't manage any data centers, I'd love it. Mostly because the wife has forbidden me from running CAT5 through the house and I'm stuck with 802.11g connections. It's annoying to try to transfer a large file from the office upstairs to, say,
    • there's no way in the h to the e to the double hockey sticks that I'd ever enable any kind of wireless anything in our data center. ... my boss's business is about minimizing risk, and wireless networks even inside a data center is not minimizing risk.

      Your network is on the internet. That and any non free software you have are bigger threats than sftp over wireless.

      • That and any non free software you have are bigger threats than sftp over wireless.
        Going to cite anything for this one? Or am I going to be left waiting like I have with this gem? [slashdot.org]
        • My biggest fan [slashdot.org] would have me point to a link proving the relative insecurity of non free software. I think he should simply look around.

          • No, he was asking for proof of your implied assertion that "non-free" software is somehow inherently less secure than everything else. Perhaps you missed that?

            That's a nice link you have there, BTW. Everyone should click on it.

            • Even better than that, if he'd linked to his own post [slashdot.org] like he normally does, we'd all have seen that I posted a link to the comment list where the employee in question admitted he was in the wrong:

              Employers can terminate employees without cause.

              Or with cause, which is what happened in my case.
          • Okay, I looked around and found nothing to prove what you said.

            Now you try!

    • by ffejie ( 779512 )
      Agreed, but it's less about security and more about speed and troubleshooting, I would think. Sure, my home datacenter (a NAS and an Xbox 360) might like to use wireless, but tell that to a guy trying to get 10-40 Gbps out of his servers. I don't think that 15 Gbps is going to do it across his datacenter.
    • Only an idiot thinks there's a wireless transmission that's invulnerable to being intercepted. Heck, wired communications aren't 100% secure, either, but my boss's business is about minimizing risk, and wireless networks even inside a data center is not minimizing risk.

      Only an idiot thinks his copper connections aren't radiating his data on RF frequencies that can be picked up outside his building with the right gear. This tech is decades old.

      The wise man encrypts all of his connections on any medium using
      • Only an idiot thinks his copper connections aren't radiating his data on RF frequencies that can be picked up outside his building with the right gear. This tech is decades old.

        I never said it was 100% secure.

        Heck, wired communications aren't 100% secure, either

        You did read this, right?
        • I see my comment wasn't clear - using secure communications over any medium is much more important for minimizing risk than whatever that medium happens to be. E.g., if you want to minimize risk and have to choose between TLS over wireless or in-the-clear over wired, go with the wireless.

          If you're using TLS on both I'm not sure that wired gets you any more security, though wired has plenty of other advantages, but I'm not sure the 'never wireless in the data center because of risk' rationale is necessarily t
  • FTFA (Score:3, Interesting)

    by SighKoPath ( 956085 ) on Thursday July 19, 2007 @02:07PM (#19917617)

    Pinel is quick to point out that a multi-gigabit wireless system would present no health concerns as the transmitted power is extremely low, in the vicinity of 10 milliwatts or less and the 60 GHz frequency is stopped by human skin and cannot penetrate the body. The team admits that the fact that multi-gigabit transmission is easily stopped means that line-of-sight is essential, and this could be a stumbling block in practical settings.
    Doesn't this make it being wireless kinda pointless? It's like a wired connection where you can't step over the cable or drill a hole through the wall!
    • You could wire a transmitter/access point into every room near the lights. You would still have to wire indoors, but you would have untethered movement.
      • Or you could just put relays in line of sight of one another. You wouldn't need too many if each device also could relay.
    • Re:FTFA (Score:4, Insightful)

      by Moby Cock ( 771358 ) on Thursday July 19, 2007 @02:19PM (#19917807) Homepage
      Useless? No. But very application specific. However, there is a great appeal in making Personal Area Networks.

      That and being able to connect a DVD player to a TV without a cable would be, in a purely geek way, quite elegant.
      • by gatzke ( 2977 )

        Until someone turns on a microwave.

        Or you live in an apartment and your n nearest neighbors compete for bandwidth.

        Or somebody nukes us and the EMP keeps you from watching American Idol.
  • by MontyApollo ( 849862 ) on Thursday July 19, 2007 @02:08PM (#19917631)
    Could this kind of bandwidth run a remote display?

    I always thought it would be cool to have a pad that was nothing more than a screen and input device that you could carry around the home instead of a full-fledged laptop. You would be actually "running" your powerful desktop off basically a second screen that you could carry around with you in the house.
    • Didn't Capt Picard have one of those?
    • Re: (Score:3, Informative)

      Comment removed based on user account deletion
      • by Ant P. ( 974313 )
        You could probably do that over 802.11n using a small amount of compression. Might be a good case for the MNG format.
        • Comment removed based on user account deletion
          • You could cut the bandwidth down a lot with interframe compression. Unless you're playing a FPS, you're unlikely to be updating every pixel 60 times a second. Even something relatively simple like VNC could probably run quite fast over a 200Mb/s network. Of course, it would make more sense to have an X server in your display, so you would just send the high-level drawing commands over the absence-of-wire, including OpenGL commands if the display supports GLX. You can buy ARM CPUs with on-board 3D for ve
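            Rough numbers for the uncompressed case (assuming a 1280x1024 desktop at 24-bit color and 60 Hz; the update fraction below is illustrative):

            $$1280 \times 1024 \times 24\ \mathrm{bit} \times 60\ \mathrm{Hz} \approx 1.9\ \mathrm{Gbit/s}$$

            but if a typical frame only touches a few percent of those pixels, interframe updates shrink the traffic to a few tens of megabits per second, which is why VNC-style protocols remain usable on far slower links.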
    • by phorest ( 877315 )

      You mean like this? [viewsonic.com]

      It's a great idea/product, but unfortunately it bombed in the marketplace. I am writing this reply with mine. I love it! All it really is is a remote desktop slave, but I can administrate my whole network from my living room while watching a movie, lounging on the couch.

      When it first came out ~2002, they cost a little less than a full-featured notebook computer. I got mine in early 2004 from a company that buys pallets of discontinued tech products for $400.00, (that price included AL

      • That looks interesting, but I was thinking more of just a 2nd display rather than a remote desktop on a lightweight computer. The original price on that thing was the same as a laptop - no wonder it failed. At $200 it might be okay for some things, but $400 still seems a little steep. I don't imagine the video card was too great, whereas a pure 2nd display would be utilizing whatever video card your desktop has.

        • by phorest ( 877315 )

          At $200 it might be okay for some things, but $400 still seems a little steep

          The latest iteration of these goes for about $1200.00, which is just crazy in my estimation; however, the $400.00 to me is reasonable, just for the freedom of not having a smoking-hot notebook in my lap. The thing runs cold as a stone. Also, I have been using it for 3 years and there's really nothing to wear out 'cept the battery and the stylus.

          Since I am not a video/gamer the video quality is OK for what I use it for. The active matrix is neat, (use your fingernail as a stylus) and at 600 X 800 is just about righ

  • You'll be able to watch pr0n through your neighbors open wireless network *and* fry up a steak by positioning the frying pan between the access point and your notebook. Don't worry, the sunburn should fade in a few weeks.
  • by nincehelser ( 935936 ) on Thursday July 19, 2007 @02:13PM (#19917707)
    I can't see any real application for this in a data center. They'll always use wires, switches, and routers. One simple reason is that one bad wireless transmitter could jam a whole bunch of nearby servers, which probably wouldn't be good. Wires have their uses. Sometimes it's good to keep your data flow contained and controlled.
    • by suggsjc ( 726146 )
      Don't get me wrong, I don't see this happening any time soon. But to go so far as to say words like "always" or "never" is just begging your foot to be inserted into your mouth at least sometime down the road.

      They'll always use wires, switches, and routers.

      Well, two out of the three you just mentioned (and I guess conceivably even the wires too) are subject to failing. So just because having a physical connection makes you feel all warm and fuzzy inside (and rightly so with the current state of wireless

      • It is closed-mindedness like this that can keep good tech from even having a chance. You really don't have to defend your stance based on what is currently available, but to say that nothing will ever be good enough to replace those good old fashioned, tried and tested wires is simply ludicrous.

        Wireless technologies cannot, ever, provide as much bandwidth as a wired (copper/fiber/whatever) connection can, simply because wires allow for a higher signal to noise ratio. Additionally, wireless is a shared medium, equivalent to using a hub instead of a switch/router.

  • by Anonymous Coward on Thursday July 19, 2007 @02:15PM (#19917735)
    Great. Now I'll never have a reason to meet girls.
  • by ookabooka ( 731013 ) on Thursday July 19, 2007 @02:18PM (#19917793)
    There are 2 ways to increase the amount of data that can be sent: increase the carrier frequency or increase the bandwidth. What these people have done is increase the carrier frequency. Wireless today runs at 2.4 GHz; these devices run up to 60 GHz. What does that mean? Well, it'll take more energy (higher frequency means higher energy), and it also attenuates more, meaning shorter range. Not only that, but it will be more readily absorbed by things like bricks, desks, your foot, etc.

    The alternative to this is to increase bandwidth, say using 2.1 GHz through 2.6 GHz for one signal. The obvious downside to this is that you can't run many concurrent streams.
    All in all, wireless data transfer has a very real ceiling on the amount of data that can be transferred: lower frequency means longer range and the ability to go through obstacles, at the cost of reduced data-carrying capacity. I guess the point of this post is that there is only so far we can go with wireless data transfer. I don't think it will be able to keep up (over the long run) with the increasing size of traffic to be a viable alternative to cables when it comes to things like computer networking. Anyone have any thoughts on this?
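    As a rough way to quantify the range penalty (assuming simple free-space propagation and equal antenna gains at both ends), free-space path loss grows with the square of the carrier frequency:

    $$\mathrm{FSPL} = \left(\frac{4\pi d f}{c}\right)^{2} \quad\Rightarrow\quad 20\log_{10}\!\frac{60\ \mathrm{GHz}}{2.4\ \mathrm{GHz}} \approx 28\ \mathrm{dB}$$

    so, all else being equal, moving from 2.4 GHz to 60 GHz costs roughly 28 dB of link budget before absorption by walls, desks, and feet is even considered.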
    • The use of multiplexing codes has not been fully exploited, yet. MIMO and others are used extensively in cellular networks (which are, let's face it, wireless networks too) but are less common in 802.11 and similar networks.

      Perhaps the next generation of wireless will include UWB/CDMA based transmission.
      • Even with multiplexing there is still a very real limit to the throughput of a certain frequency. I suppose my point is that there are clever ways to allocate bandwidth to users depending on how much they need, or to combine a bunch of frequencies to get the throughput you need, but it just isn't realistic to think that one day everything can be wireless and sending movies to and from each other no problem. Basically with wires you can do intelligent switching, but wireless requires you to broadcast and tak
    • by Anonymous Coward
      You can run at 2.45 GHz and, instead of keeping a constant power of a few milliwatts, say, modulate the power output from, you know, 1000 watts to 1.21 gigawatts; you can use the resulting modulation to carry more information per wave. This would be really hot new technology, and really start the economy cookin'.
    • by Detritus ( 11846 )
      The relevant parameters are bandwidth and signal-to-noise ratio, not the carrier frequency. See the Shannon-Hartley theorem [wikipedia.org] for details.
      • The relevant parameters are bandwidth and signal-to-noise ratio
        Right, but center frequency (roughly the same as carrier frequency) affects available bandwidth for a few reasons.

        1: The obvious limit: there is no such thing as 1 MHz of bandwidth centered on 100 kHz.
        2: Antenna capabilities: you couldn't easily (if at all) design an antenna to go from 10 kHz to 1.01 MHz, but you could easily design one to go from 10 MHz to 11 MHz.
        3: Spectrum crowding: it's pretty crowded around the low-GHz spectrum; the only way to g
    • by rcw-work ( 30090 ) on Thursday July 19, 2007 @03:39PM (#19918601)

      There are 2 ways to increase the amount of data that can be sent.

      There are actually four:

      • Increase the signal strength (using a directional antenna or amplifier)
      • Decrease noise (use higher-quality components, shut off interfering transmitters, use directional antennas)
      • Increase the signal bandwidth
      • Increase signal spectral efficiency (for example use OFDM instead of FSK)

      Changing the carrier frequency has no effect, except that there's more room for higher-bandwidth signals at higher frequencies. 2.400-2.422GHz seems like a smaller chunk than 400-422MHz, but it can carry the same data.

      The formula for how many bits you can send and receive error-free is the Shannon-Hartley theorem [wikipedia.org], and spectral efficiency is typically stated as a percentage of the theoretical.
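      To make the frequency-independence point concrete (a back-of-the-envelope sketch assuming a 22 MHz channel and a 20 dB signal-to-noise ratio, figures chosen only for illustration):

      $$C = B\log_{2}\!\left(1+\frac{S}{N}\right) = 22\times10^{6} \cdot \log_{2}(1+100) \approx 146\ \mathrm{Mbit/s}$$

      and the result is the same whether that 22 MHz sits around 400 MHz or around 2.4 GHz; only the bandwidth and the SNR appear in the formula.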

      • Yeah, I was going to include that in my post, but it didn't seem relevant, as I see improving signal to noise (directional antennas, shutting off other equipment nearby) as "cheating". For example, assume my cellphone has the best electronics available; how can I increase the signal to noise ratio? Standing at the focal point of a dish aimed at the nearest tower? I just sort of assumed the S/N ratio is going to be essentially constant, or probably worse in the future. You could increase the number of symbols (OFDM
        • by rcw-work ( 30090 )

          For example, assume my cellphone has the best electronics available, how can I increase the signal to noise ratio? Standing at the focal point of a dish aimed at the nearest tower?

          One way to do this is to use an array of smaller antennas, and change through software how signals are timed through each. Say I have two dipoles 30cm (1 nanosecond) apart. If I transmit something at exactly the same time from both, the receiving antenna broadside to the two transmitting antennas will see a 3db boost. If I delay
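          A minimal numeric sketch of that delay-steering idea (assuming two isotropic elements 30 cm apart and, purely for illustration, a 500 MHz carrier so the spacing works out to half a wavelength; none of these numbers come from the article):

          import numpy as np

          C = 3e8      # speed of light, m/s
          F = 500e6    # illustrative carrier frequency, Hz (wavelength 60 cm, spacing = lambda/2)
          D = 0.30     # element spacing, m (about 1 ns of light travel time)

          theta = np.radians(np.arange(-90, 91))   # angles measured from broadside

          def array_factor(delay_s):
              # Relative field strength vs. angle when the second element is fed delay_s late.
              # Extra path toward angle theta is D*sin(theta); the applied delay subtracts.
              phase = 2 * np.pi * F * (D * np.sin(theta) / C - delay_s)
              return np.abs(1 + np.exp(1j * phase)) / 2   # 1.0 means fully coherent addition

          broadside = array_factor(0.0)     # no delay: peak straight ahead
          steered = array_factor(0.5e-9)    # half a nanosecond of delay steers the peak

          print(f"broadside peak: {np.degrees(theta[np.argmax(broadside)]):.0f} degrees")
          print(f"steered peak:   {np.degrees(theta[np.argmax(steered)]):.0f} degrees")

          With no delay the two signals add coherently toward broadside; feeding one element half a nanosecond late moves the coherent peak to roughly 30 degrees off broadside, which is the software-steering trick described above.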

  • Tin foil hat act like antenna and capture all of multi-gigabit signal and route all of data direct to cerebral cortex, where corpus medula hippocampus cerebellum act like giant "Google" and put all "byte" into main storage. some time Often cause all of sound to "ears" like bad technical translation Chinese goto English, like bad video game of the cheap PC accessories. when All of signal "Scramble" brainwave, error message to help tech support gets to you responding quickly. Zipping all of signs to Brain
  • by retro128 ( 318602 ) on Thursday July 19, 2007 @02:26PM (#19917891)
    This technology could be used in applications besides just strict data transfer. 15 Gbps should be fast enough to drive a display as well. The proverbial rat's nest behind your computer could completely disappear with this technology. Keyboards, mice, displays, network - just about every cable plugged into the back of your computer could be replaced with wireless this fast.

    But if only it were so simple. Of course now the problem we have is with security. Never mind TEMPEST [wikipedia.org]. If you had a big enough antenna and you could decrypt (it IS encrypted...heavily...right?) the datastream emanating from this technology from a distance - you could see the display, keystrokes, data transfers, everything. Obviously, strong encryption is very important - But the overhead from strong encryption will reduce the theoretical bandwidth because of the extra baggage on the packets, and increase costs significantly because of the very specialized ASICs that will likely be required to encrypt a stream at that speed. And they'd have to be standard across all devices. AND an exploit had better not be discovered in the algorithm. Then there's the issue of the 60GHz band. A frequency that high is very unforgiving of obstructions, even at the short ranges we're talking about. If you have a metal desk, forget it. And what about jamming from computers in close proximity? What about from a "l33t hax0r" with some time on his hands and an inclination to make trouble?
    • If you had a big enough antenna and you could decrypt (it IS encrypted...heavily...right?) the datastream emanating from this technology from a distance - you could see the display, keystrokes, data transfers, everything. Obviously, strong encryption is very important - But the overhead from strong encryption will reduce the theoretical bandwidth because of the extra baggage on the packets, and increase costs significantly because of the very specialized ASICs that will likely be required to encrypt a stream at that speed.

      The real problem here therefore is one of cost. You can have as much bandwidth as you can pay for (because this is the kind of problem that responds well to parallelism), and the penalty for that parallelism need not be all that significant. You can have no encryption cheaply, but uh, yeah. Next.

      I don't suppose anyone out there knows of any properties of physics that would allow for linked "random" number generating systems that were consistent? :)

    • The proverbial rats' nest behind your computer could completely disappear with this technology.

      No, the problem you will then have is power. Everything still needs power. Keyboard, monitor, mouse etc.
  • It might be commercially 'possible' in a few years, and I'm sure that countries other than the US will even have it, but the US ISP monopolies will never make it available.
  • by Kjella ( 173770 ) on Thursday July 19, 2007 @02:57PM (#19918111) Homepage
    ...when it said wireless in the data center. Yes, I've heard the theoretical figures for wi-fi. Try dropping a bunch of access points and various clients in tight proximity and see what it's really like. In a datacenter you can run 10x 10 Gbps wires right next to each other without problems. Can you do that with wireless? Hell no. I imagine the speeds quoted are ideal, with free line-of-sight and no interference; good luck trying to achieve that in that bunch of wires. Personally, I was fed up with wireless when I realized one AP couldn't even cover the ground floor of my parents' house. It'd take probably three to cover the whole house. Great... not.
    • by blhack ( 921171 )
      Unless your parents live in a copper mesh manufacturing facility there is no reason that an access point wouldn't cover their floor. Did you buy the AP at a flea market? Did you place it inside of the microwave? MY GOD MAN WHAT IS THE PROBLEM!?

      i suppose that you could always combine one of these [hyperlinktech.com] with one of these [hyperlinktech.com] and use the combo to cook burritos...

      mmmm, burritos!
  • The inverse-square law at 60 GHz means that even if the spectrum were available (it's not), you'll need both line of sight (reflections won't help and will slow the data rate considerably) and the will to gulp content that fast. Of course, a shared fixture like an access point in WiFi suffers from duty-cycle problems, and raw bandwidth will help. But we could also use spread-spectrum and/or advanced coding techniques like n-Pole modulation to accomplish the same thing.

    Therefore, with all due respect
  • I don't mean to be a wet blanket, but all of the advantages of the latest whiz-bang technology don't amount to a bucket of warm spit unless and until the major carriers adopt it. If I live to be a hundred, I'll never see Gigabit data service where I live in the St. Louis MetroEast area of Illinois because no one will force our regulated monopoly (AT&T) to provide it. Until Universal Service is expanded to include broadband, and regulatory bodies set the definition of the term broadband to be 2 Mbits/sec
  • The research could lead to devices such as external hard drives, laptop computers, MP3 players, cell phones, and commercial kiosks that could transfer huge amounts of data in seconds
    How about enabling my external USB drive to use the 480 Mbit/s available first? Or what about a NAS that can fill up a 1 Gbps Ethernet link? Wired isn't slow, it's just not used right.
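    For scale (a quick back-of-the-envelope using nominal bus figures, not anything from the article):

    $$\frac{480\ \mathrm{Mbit/s}}{8} = 60\ \mathrm{MB/s} \qquad \frac{1000\ \mathrm{Mbit/s}}{8} = 125\ \mathrm{MB/s}$$

    and real-world USB 2.0 transfers usually land well below that 60 MB/s ceiling once protocol overhead is counted, so the existing wires already outrun many of the devices hanging off them.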
  • First, a successful lab demonstration of multi-gigabyte speeds with mass-market-capable technology is still missing. Call that at least 5 years to a real product. Then deployment. Who needs this stuff enough to deploy it immediately? Right, almost nobody. Also, the first product generation will not really be usable. Call it another 5 years to wide-scale deployment. That gives me an estimate of at least 10 years, but more likely 20 years. The 3 years are a direct lie, plain and simple.

    I hope these ethically ch
  • I expect the line-of-sight requirement is a dealbreaker for 'personal area network' type situations. I've got my computer underneath my desk, and all gadgets that could possibly benefit from high-speed wireless links are above the desk. Reconfiguring my desk to provide LOS for everything (including keeping the desk clean, no stacks of paper between the computer and the gadgets) would be a major PITA. I'll stick with wired connections, thank you.

    High-speed wireless could be useful for 'last mile' connections
  • OK, not that bad. I'm a little disappointed in ATT 2.5G.
