Wireless LANs Face Huge Scaling Challenges 89

BobB writes with this excerpt from NetworkWorld: "Early WLANs focused on growing the number of access points to cover a given area. But today, many wireless administrators are focusing more attention on scaling capacity to address a surge in end users and the multimedia content they consume (this is particularly being seen at universities). Supporting this involves everything from rethinking DNS infrastructure to developing a deeper understanding of what access points can handle. And 802.11n is no silver bullet, warn those building big wireless networks. 'These scaling issues are becoming more and more apparent where lots of folks show up and you need to make things happen,' says the former IT director for a big Ivy League campus."
This discussion has been archived. No new comments can be posted.

  • So basically (Score:5, Insightful)

    by Architect_sasyr ( 938685 ) on Saturday August 30, 2008 @05:24AM (#24808299)
    ...we're having the same issues we did when we stopped using dialup and moved to broadband?
  • Hmmm (Score:5, Insightful)

    by Colin Smith ( 2679 ) on Saturday August 30, 2008 @05:28AM (#24808323)

    Bits of wire are dedicated to individuals; wifi spectrum is shared between individuals. Who'd have thought that might create scalability issues...

    Perhaps dedicating a little bit of the spectrum to each individual might fix the scalability problems.
     

  • No (Score:5, Insightful)

    by Colin Smith ( 2679 ) on Saturday August 30, 2008 @05:31AM (#24808335)

    We're having the same scalability issues which existed with 10base2 technology and 10/100baseT on a hub. The solution is "the switch".

     

  • Re:Hmmm (Score:4, Insightful)

    by thompson.ash ( 1346829 ) on Saturday August 30, 2008 @05:38AM (#24808369) Journal

    Surely dedicating a segment of that spectrum would cause problems ensuring equality of access?

    At the moment it seems that the more people you have on, the lower your bandwidth - stands to reason.

    Surely allocating fixed bandwidth on a first come first served basis would mean eventually you would run out of bandwidth to allocate and people would be denied access?
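The first-come-first-served exhaustion worry above can be sketched as a toy admission-control loop. All the numbers here (a 54 Mbps channel, a 10 Mbps reservation per client) are invented for illustration, not taken from any real AP:

```python
# Toy sketch: fixed-bandwidth, first-come-first-served allocation.
# Once the pool is exhausted, later arrivals are simply denied.

def admit_clients(channel_mbps, per_client_mbps, requests):
    """Grant each client a fixed slice of the channel until it runs out."""
    admitted = []
    remaining = channel_mbps
    for client in requests:
        if remaining >= per_client_mbps:
            admitted.append(client)
            remaining -= per_client_mbps
        # clients arriving after exhaustion get nothing at all
    return admitted

clients = [f"client{i}" for i in range(10)]
granted = admit_clients(channel_mbps=54, per_client_mbps=10, requests=clients)
print(len(granted))  # only 5 of the 10 clients fit at 10 Mbps each
```

This is exactly the trade-off the comment describes: guaranteed bandwidth for early arrivals, outright denial for everyone else, versus the current behavior where everyone gets in but each share shrinks.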

  • Re:Hmmm (Score:5, Insightful)

    by walt-sjc ( 145127 ) on Saturday August 30, 2008 @07:07AM (#24808731)

    You will find a large number of those individuals right here on /.

    About a year or so ago there was a discussion about WiFi, and I mentioned that I wired my entire house with the standard 2 RG6U, 2 Cat5e, and 2 fiber to every room, sometimes two drops in a room. I have jacks EVERYWHERE. People said I was nuts. I said I was future-proofing - they claimed wireless would get faster too. And the response is: of course it will get faster, but so will physical cable, as we have seen.

    The bottom line is that wireless cannot and will not replace physical cable; it can only supplement it. Primary connectivity should always be planned to be wired. Yes, it's more expensive. A LOT more expensive. But you need it.

    Wireless is by nature flaky. I can have a laptop 10 feet from an AP and it can drop the connection (and I don't care what brand of laptop or AP you have - it happens). Why? Because the primary wireless band, 2.4 GHz, is a cesspool. I find it highly obnoxious that the FCC refused to allocate a band specifically and ONLY for WiFi - especially considering how extremely important connectivity is in this modern world. But alas, they are only concerned about how much money they can bring in by auctioning off a PUBLIC resource, selling it to a corporate entity which in return lets the public use that band for insane prices.
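The crowding complaint is easy to see from the 2.4 GHz channel plan itself: channel centers sit only 5 MHz apart while each transmission occupies roughly 20 MHz, so in the US only channels 1, 6, and 11 are mutually non-overlapping. A quick sketch (the 20 MHz width is a simplification of the actual spectral mask):

```python
# 2.4 GHz 802.11 channel plan (US): channel 1 is centered at 2412 MHz,
# with centers spaced 5 MHz apart, but each signal is ~20 MHz wide.

def center_mhz(channel):
    """Center frequency of a 2.4 GHz Wi-Fi channel in MHz."""
    return 2412 + 5 * (channel - 1)

def overlaps(ch_a, ch_b, width_mhz=20):
    """Two channels interfere if their centers are closer than one channel width."""
    return abs(center_mhz(ch_a) - center_mhz(ch_b)) < width_mhz

print(overlaps(1, 6))   # False: 2412 vs 2437 MHz, 25 MHz apart
print(overlaps(1, 3))   # True: only 10 MHz apart
```

With only three usable non-overlapping channels shared with microwave ovens, Bluetooth, and every neighbor's AP, "cesspool" is a fair description of the band.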

  • Re:Hmmm (Score:2, Insightful)

    by Anonymous Coward on Saturday August 30, 2008 @07:57AM (#24808955)

    You have heard about that other part of WiFi - "A" -
    the 5 GHz band, which is where the N feature is used effectively, and due to the multiple portions of spectrum available to "A" you should really have fewer issues.
    Toss in MIMO and you have a winner.


  • by atomico ( 162710 ) <miguel.cardo@gmai l . com> on Saturday August 30, 2008 @07:57AM (#24808959) Homepage

    Access network planning and optimization is a big expense for mobile network operators: selecting sites, antennas and channel allocations, base stations, base station controllers... lots of complexity which has to be handled carefully to obtain a decent quality of service without breaking the bank. It is a full-grown discipline with its own specialized training, books, professionals, etc.

    Don't expect that WLAN can work magically without a similar effort.

  • Well duh... (Score:3, Insightful)

    by BlueParrot ( 965239 ) on Saturday August 30, 2008 @09:09AM (#24809387)

    At the end of the day the electromagnetic spectrum can carry only so much information using a given number of frequencies. If you want to send data at this and that many bits per second, you are going to need a frequency with a similar number of periods per second. OK, it's not quite that simple, but at the end of the day higher data rates mean you need higher frequencies. If you fix the frequency, that instantly caps the theoretical maximum amount of data you can transmit. There are two ways to address this:

    a)Increase the frequency

    b)Deploy more access points so you are less likely to have many computers using the same one.

    The second alternative is essentially equivalent to using more wired networks and fewer wireless ones. Even if all the communication in the network is done in some sort of p2p mesh, increasing the number of access points increases hardware costs, which is the same problem as you have with wired networks.

    Thus to get large data throughput you need to increase the frequency. Eventually you reach frequencies where the light waves no longer bend around obstacles and you will need a waveguide, such as a telephone line, a coaxial cable, or optic fibre. This is why wired networks will always outperform wireless. By using a waveguide you are not limited in frequency by the requirement that the signal should have a wavelength long enough to dodge obstacles and diffract around corners, and thus you can increase the frequency far beyond what you will ever achieve with wireless communication, hence getting better bandwidth.

    These are physical limits, not merely technological ones. If you want high bandwidth you will need high frequencies, which in turn means you will eventually need either line of sight between the nodes or a waveguide ( wire ). Ok, theoretically something like a proton beam has a frequency so high you will be limited by other things ( such as energy consumption ) rather than frequency, but you need line of sight for those as well. I guess if you used neutrinos or some other very penetrating radiation you would always have line of sight, but barring any sudden breakthroughs in neutrino detection/generation I doubt that is going to be practical for simple data transfer any time soon.
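The physical limit this comment gestures at is the Shannon-Hartley theorem: capacity C = B * log2(1 + SNR), so for a fixed band and signal-to-noise ratio the bit rate is capped no matter how clever the modulation gets. A minimal sketch (the 20 MHz / 20 dB figures are illustrative round numbers, not tied to any particular 802.11 mode):

```python
import math

def shannon_capacity_mbps(bandwidth_hz, snr_linear):
    """Shannon-Hartley limit: C = B * log2(1 + SNR), returned in Mbit/s."""
    return bandwidth_hz * math.log2(1 + snr_linear) / 1e6

# Illustrative: a 20 MHz channel at 20 dB SNR (linear SNR = 100)
print(round(shannon_capacity_mbps(20e6, 100), 1))  # ~133.2 Mbit/s ceiling

# Doubling the bandwidth doubles the ceiling - hence the pressure
# toward wider channels and higher-frequency bands.
print(round(shannon_capacity_mbps(40e6, 100), 1))
```

Note that capacity grows linearly in bandwidth but only logarithmically in SNR, which is why the comment's two options boil down to "more spectrum" or "more (smaller) cells" rather than simply shouting louder.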

  • Re:Hmmm (Score:5, Insightful)

    by Migraineman ( 632203 ) on Saturday August 30, 2008 @10:36AM (#24810067)
    802.11whatever is an access point solution. Folks who expect it to be a backhaul or backbone solution are ... not well versed in network architectures. I find it amusing that folks think an ad-hoc mesh of 802.11 nodes will *ever* have performance comparable to wired/fibered connections. Just the "shared medium" aspect should be enough to indicate performance will degrade as more connections are added. Shoveling more nodes into the mesh won't magically improve performance.

    Eh, it doesn't surprise me. Evidence of this logical disassociation is everywhere - digital cameras, cars, appliances, computers, tools ... Listen carefully and you'll hear the cries of the oppressed - "I don't want to know how it works, I just want to [do blah]."
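The shared-medium degradation described in this thread can be caricatured with a toy model: every wireless node splits one channel and adds contention overhead, while a switched wired port stays dedicated. The 5%-per-node contention penalty below is an invented illustration, not a measurement of 802.11 behavior:

```python
# Toy model (not a protocol simulation): per-node goodput on a shared
# wireless channel vs. a dedicated switched wired port.

def shared_goodput_mbps(channel_mbps, nodes, contention_penalty=0.05):
    """One medium split N ways; each extra node also wastes some airtime."""
    if nodes == 0:
        return 0.0
    efficiency = max(0.0, 1.0 - contention_penalty * (nodes - 1))
    return channel_mbps * efficiency / nodes

def switched_goodput_mbps(port_mbps, nodes):
    """Each node keeps its own full-duplex port regardless of node count."""
    return port_mbps

for n in (1, 5, 20):
    print(n, round(shared_goodput_mbps(54, n), 2), switched_goodput_mbps(100, n))
```

Even this crude model shows the shape of the problem: the shared channel's per-node goodput collapses super-linearly as nodes join, while the switch is flat - which is precisely the 10base2-hub-versus-switch analogy made earlier in the thread.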
