Wireless Networking Hardware

Pushing Wi-Fi's Limits: Problems and Solutions

securitas writes "Forbes technology columnist Arik Hesseldahl discusses the problems with 802.11x Wi-Fi - speed and range - and how to push its limits in a pair of his Ten O'Clock Tech columns. He discusses the alphabet soup of Wi-Fi standards, so-called 'Super G' dual channel bonding that allows two of 11 channels to act as one (and the interference problems that ensue), and the multiple input/multiple output (MIMO) method 'using multiple antennas to break a single, high-rate signal into several lower-rate signals' that could be a solution. Pushing Wi-Fi's Limits, Part Two focuses on repeaters, Wi-Fi mesh networks, WiMax and a company called BelAir Networks that has deployed several Wi-Fi mesh networks."
  • A netgear (Score:1, Funny)

    by Zorilla ( 791636 )
    Hey look! A netgear Wireless router sitting right there. I can make a first post without getting my IP ba...[Wireless Signal: Weak]..........[Wireless Signal: Good].......

    crap! Failed it!
    • by Anonymous Coward
      Obviously this individual is paid by Linksys to bash Netgear wireless routers. Don't fall for the propaganda! You know, the "[Wireless Signal: Weak]" stuff is faked--he just typed it in!

      This is the last time I buy a Linksys router or any other Cisco product, for that ma
      • i've had to test a few wifi routers, and the buffalo routers [buffalotech.com] have significantly outperformed anything from linksys, netgear, or d-link. i recommend them to everyone i work with.
      • I'll bash BOTH Netgear AND Linksys: BOTH are too chickenshit to issue Linux drivers for their 802.11g cards, or even ALLOW anyone to create those drivers.
        • or even ALLOW anyone to create those drivers.

          Bullshit!

          There are Linux drivers for both Prism [prism54.org] and Atheros [thewebhost.de] chipsets, which are used in Linksys and Netgear cards.

          • I have a Linksys WMP54G. It has a Broadcom chip. Which Linksys 802.11g card has either a Prism or Atheros chip?
            • Are you going to apologise to Netgear or just pretend you didn't mention them in your previous post?
            • I have a Linksys WMP54G. It has a Broadcom chip.

              Ahhhh, yes. Broadcom has never been very friendly towards driver developers.

              The good news is, your card works under Linux with NdisWrapper [sourceforge.net].

              Which Linksys 802.11g card has either a Prism or Atheros chip?

              The WUSB54G has a Prism chipset.
              The WPC55AG and WMP55AG have Atheros chipsets.

  • I want a mesh net!
  • Real issue (Score:5, Informative)

    by kneecarrot ( 646291 ) on Monday July 05, 2004 @04:23PM (#9616007)
    While standards and spectrum sharing are definitely factors, the hardware must move forward quite a bit if Wi-Fi is going to become more useful than small home networks and looking cool at a Starbucks. The real problem right now is the quality of the radio chips coming out of Taiwan. They typically fall well short of their specified range and allow a lot of bleed between channels. The average home user won't notice it, but when you are rigging up multi-antenna setups or relying on precise timing for a repeater, it matters to a HUGE extent.
    • Re:Real issue (Score:5, Insightful)

      by Jeff DeMaagd ( 2015 ) on Monday July 05, 2004 @04:40PM (#9616101) Homepage Journal
      One thing that scares me is the multi-channel radios. There are effectively only three non-overlapping channels, and some APs are starting to take up two of them? I know the spectrum is unlicensed for low power, but I think it's just plain rude for one person to take up two channels. Right now I'm experimenting with three APs on, one on each of the three channels, but I'm in a rural area and have checked with all the neighbors; no one else is using the spectrum near me.
    • I've often wondered about the quality of 802.11 hardware. I'm not used to seeing "microwave" and "cheap" in the same sentence. All of the commercial S-band radios that I've worked with are very expensive gear. What corners are they cutting in the consumer-grade hardware?
      • What power levels? High-power 802.11 gets expensive, up to $150 for a 200mW AP. Of course, that's nothing compared to commercial hardware that puts out orders of magnitude more power, and the receivers are probably a lot more sensitive as well. The best for 802.11 that I've seen is -90dBm sensitivity; typical is, I think, -80dBm.

        I think one advantage is ridiculous mass production to drive the costs down.
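
For what it's worth, the dBm figures in the thread above convert to absolute power with P(mW) = 10^(dBm/10). A minimal Python sketch using the 200mW and -80/-90dBm numbers quoted above (nothing here beyond those figures):

```python
from math import log10

def dbm_to_mw(dbm: float) -> float:
    """Convert a power level in dBm to milliwatts."""
    return 10 ** (dbm / 10.0)

def mw_to_dbm(mw: float) -> float:
    """Convert a power level in milliwatts to dBm."""
    return 10.0 * log10(mw)

# Figures quoted above: a 200 mW access point, and receiver sensitivities
# of roughly -80 dBm (typical) and -90 dBm (the best the poster has seen).
print(f"200 mW transmit power = {mw_to_dbm(200):.0f} dBm")   # ~23 dBm
print(f"-80 dBm sensitivity   = {dbm_to_mw(-80):.1e} mW")    # 1.0e-08 mW
print(f"-90 dBm sensitivity   = {dbm_to_mw(-90):.1e} mW")    # 1.0e-09 mW
```
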
  • by 2057 ( 600541 ) on Monday July 05, 2004 @04:26PM (#9616023) Homepage Journal
    I wonder how healthy it is to be surrounded day in and day out by all these microwaves and such....
    • Bah... We already have AM, FM, TV, and all other sorts of signals going through us... we aren't seeing any adverse side-effects, are we?
      • by MikeXpop ( 614167 ) <{moc.rabworcder} {ta} {ekim}> on Monday July 05, 2004 @04:38PM (#9616096) Journal
        How would we know?
        • Well you could try counting sperm... http://www.theregister.co.uk/2004/06/28/mobile_ball_rot/

          "Carrying a mobile phone can reduce a man's sperm count by as much as 30 per cent, according to Hungarian scientists."

          Fortunately us geeks have WiFi access to lots of porn so it should not be too difficult....
      • Bah... We already have AM, FM, TV, and all other sorts of signals going through us... we aren't seeing any adverse side-effects, are we?

        Not unless you count the positively MASSIVE increase in cancer in the 20th Century. According to NIH, the rate of cancer increased 64% just between 1970 and 1997.

        Granted, it's pretty hard to tell whether that's from all the e-m waves flying through the air (or through our walls and under the ground we walk and live on) or from the pesticides and chemical additives (e.g., sa

        • by jdhutchins ( 559010 ) on Monday July 05, 2004 @04:50PM (#9616160)
          The rate of cancer that we detect went up 64% between 1970 and 1997. That doesn't necessarily mean there was 64% more cancer; it just means we've gotten better at finding it. I'm not saying that cancer hasn't gone up at all, but it's probably not that dramatic.
        • Cancer is a disease of (primarily) old age. Cancer rates increase when life expectancy increases, because more people are living long enough to get cancer. Even eating a healthy diet will result in increased rates of cancer, simply because you live longer.
        • If you think that correlation implies causation, then I have an Eternal Life Bracelet [alexchiu.com] to sell to you.

          The only way to determine the cause of an effect is via scientific experimentation - a double-blind study with experimental and control groups. We could, for example, take 100 mice, leave 50 in "normal" cages, and put the other 50 in cages near to a broadband EM source. Then, we examine the rate of cancer in each group - if it goes up, then and only then can we say that "electromagnetic radiation (at a cert

          • ... mice are nothing like humans. A mouse's whole body is about the size of half a wavelength at 2.4GHz. You can block a wifi signal with your hand. If we assume that RF penetrates wet squashy flesh to a depth of about 1", the RF will penetrate right through the mouse's body. It will, however, only reach a little way into the human body.
            • So, substitute chimps for mice. Or, if you don't want to piss off the animal rights wackjobs, use lawyers instead.

              Even if you do plan to test on larger animals later, it's a good idea to do your first round of experiments with mice, simply due to the cost factor. Once you've gotten your initial set of data, you can repeat the experiment with pigs or chimps, both of which are much closer to humans in size and physiology.

    • by mOoZik ( 698544 ) on Monday July 05, 2004 @04:42PM (#9616114) Homepage
      All the popular notions of microwaves being harmful are pretty unfounded. You must remember, they're waves like any other - radio, UV, IR, radar, and so forth. It seems to me most of the misconception arises from the fact that we use microwaves to cook food, so people assume that stray waves from cell phones and other such things could be harmful to us; but again, this is untrue for most purposes.

      You see, microwaves excite water molecules - they make them move back and forth really fast - thus heating them. This is how a microwave oven works. The fear with cell phones (which have a very weak transmitter) is that they may raise the temperature of brain cells or other critical cells above normal, thus causing an unfavorable outcome. However, studies have shown that the increase in temperature from a cell phone antenna - when held against one's ear - is less than 1/10th of a degree Centigrade. As you can imagine, this is insignificant; our bodies remain undamaged at temperatures MUCH higher than this.

      The point is that cell phones, while not the topic of this article, transmit much more powerful microwaves much closer to the head. If even they are essentially harmless, then WiFi signals - weaker still and farther away - are not at all powerful enough to have an impact.

      Of course, I do not suggest you stick your head near a multi-megawatt microwave transmitter.
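
A back-of-the-envelope way to see the parent's point about power and distance: free-space power density falls off with the square of distance. The handset and AP powers and distances below are illustrative assumptions (not figures from the articles), just to show the scale of the difference:

```python
from math import pi

def power_density_w_per_m2(tx_power_w: float, distance_m: float) -> float:
    """Free-space power density from an isotropic source (inverse-square law)."""
    return tx_power_w / (4.0 * pi * distance_m ** 2)

# Illustrative assumptions: a handset transmitting around 1 W held ~2 cm from
# the head, versus a 0.1 W (100 mW) Wi-Fi access point ~3 m away.
phone = power_density_w_per_m2(1.0, 0.02)
wifi = power_density_w_per_m2(0.1, 3.0)
print(f"phone at 2 cm: {phone:8.1f} W/m^2")
print(f"AP at 3 m    : {wifi:8.5f} W/m^2")
print(f"ratio        : {phone / wifi:,.0f}x")
```
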

      • Ignorance is bliss. Now try to learn something.

        A microwave oven works by transmitting at the resonant frequency of water: "Your microwave oven operates at a frequency of 2.45 GHz (gigahertz)" [hypertextbook.com]. This is also the center frequency of 802.11 transmitters.

        The human body is mostly made of the water that this frequency resonates with.

        • I don't think that frequency is a resonance frequency. There's a whole band of frequencies that gets absorbed by water; there's nothing special about 2.45GHz. (That's why radio communications are so difficult from a sub.)

          I bet I can cook you with a 1GHz signal.
          • Like any substance, water absorbs energy most effectively at particular frequencies. There is certainly more than one peak in the absorbance spectrum (corresponding to various vibrational modes of the water molecule), and these peaks are probably reasonably broad - so microwave energy not exactly at the peak should work fine.

            Think of water as if it were blue tinted glass. If you shine blue light on blue tinted glass it will get a little warmer, but not nearly as warm as it gets if you shine red light on
        • Ignorance is bliss. Now try to learn something.

          Wow, if you're going to be snarky at least get your facts straight.

          You're right about the frequency of the microwave oven, but it's not the resonance frequency of water, that's 545GHz.

          Microwaves work by electromagnetically vibrating any asymmetrical (polar) molecules found in the target foodstuffs. Water is usually a very large percentage of that, but you're just vibrating the molecule, not causing it to resonate. If you did, the water on the outside of th
    • "I wonder how healthy it is to be surrounded day in and day out by all these microwaves and such...."

      What's it been, 3 generations so far?
  • by ofdm ( 748594 ) * on Monday July 05, 2004 @04:49PM (#9616154)
    This solution seems to be quite a clever approach. A fundamental problem with 802.11b is the lack of spectrum. Although the channels are numbered 1 to 11 (in the USA), the numbers refer to center frequencies spaced 5MHz apart across roughly 50MHz of the 2.4GHz ISM band. The problem is that an 802.11b signal is wide enough to cover almost 4 of those channels when it transmits. As a result, in order to have systems on different frequencies which do not interfere with each other, you end up with three effective channels - 1, 6, and 11 (the arithmetic behind this is sketched after this comment). If you have a WiFi AP accessible, check what channel it's on - most likely it will be one of those. Due to the low number of channels, it's impossible to do much in the way of channel planning. The result is that adjacent APs have to share the spectrum. The net outcome is that the data rate users get between their client and the AP is reduced.

    802.11a at 5GHz was supposed to solve this. The 5GHz band is notable for the extra spectrum it offers. Compared to the 3 effective channels at 2.4GHz, the 5GHz UNII band has (again, it depends on your country) at least 8 usable channels of 20MHz. Additionally, the link rate is between 6 and 54 Mbps (as compared to 1 to 11Mbps for 11b, although this is somewhat moot given the growing preponderance of 11g solutions at 2.4GHz). However, the 802.11a market never really took off and killed the 11b market the way we (engineers) expected it to, mostly due to good (if slippery) marketing of 11g. As a result, there's a lot of unused 11a spectrum begging to be used, and there are a lot of people with 2.4GHz equipment who want more range without losing data throughput. Using the 11a spectrum to extend the 11b/g range is what these guys have done. Neat - they get to use a superior technology, with cheap chips available, to leverage a large market (albeit of dullards wed to an inferior solution).
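
As promised above, the 1/6/11 rule falls out of simple arithmetic: channel n is centred at 2407 + 5n MHz, and an 802.11b transmission occupies roughly 22 MHz, so only channels at least five numbers apart stay clear of each other. A minimal sketch (the 22 MHz figure is the usual round number for the DSSS signal width, not something taken from the articles):

```python
# 2.4 GHz ISM band: channel n (1..11 in the USA) is centred at 2407 + 5*n MHz,
# and an 802.11b transmission occupies roughly 22 MHz around that centre.
SIGNAL_WIDTH_MHZ = 22

def center_mhz(channel: int) -> int:
    return 2407 + 5 * channel

def overlap(a: int, b: int) -> bool:
    """Two channels interfere if their centres sit closer than the signal width."""
    return abs(center_mhz(a) - center_mhz(b)) < SIGNAL_WIDTH_MHZ

channels = range(1, 12)
for ch in channels:
    print(f"channel {ch:2d}: {center_mhz(ch)} MHz")

# Find every set of three mutually non-overlapping channels.
clear = [(a, b, c) for a in channels for b in channels for c in channels
         if a < b < c and not (overlap(a, b) or overlap(b, c) or overlap(a, c))]
print("non-overlapping triples:", clear)   # only (1, 6, 11) survives
```
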

    • The problem with 802.11a is that the range is typically even more limited than 802.11b/g. As for this mesh, it sounds like a pretty expensive proposition. They'll need two radios and two antennas per unit, and units probably every few hundred yards/meters. Ask Ricochet/Metricom how much it costs to put that much equipment in place. Also, you lose speed with each hop, so there is going to be a lot more than just one or two wired nodes in even a small city.
      • 802.11a ... range is typically even more limited than 802.11b/g

        In reality, 11a range, for the same transmit power, is not that bad compared to 11b. The limit to range is the sensitivity of the receiver - at what received signal strength the packet error rate rises to an unacceptable level, usually taken as 10% but in reality somewhere between 1% and 10%.

        For a well-designed 11b receiver, the receive sensitivity for the 1Mbps data rate is around -94dBm (i.e., the signal is about as strong as the noise at

        • How much different is the propagation of 5 GHz RF vs. that of 2.5 GHz in air? I thought it gets blocked a lot more readily by trees and such too.
          • It's a lame answer, but ... it depends. Going from 2 to 5 GHz does result in an increase in attenuation, particularly in the exponent of the path-loss expression. Another obvious change is the multipath. At 5GHz it's somewhat narrower band than at 2.4GHz (simply due to the relationship between delay and carrier frequency), which has benefits for the OFDM rates because the FEC across subcarriers works better with narrow-band fading. At 5GHz, the signal beam doesn't diffract quite as much as at 2GHz, so trees etc are slightly
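
The "sensitivity sets the range" argument a couple of comments up can be made concrete with the free-space (Friis) path-loss formula, FSPL(dB) = 20*log10(d_m) + 20*log10(f_MHz) - 27.55. Real walls and people cost far more than free space, so treat the ranges below as upper bounds; the 15 dBm transmit power and the -82 dBm 11a sensitivity are assumed typical values, while the -94 dBm figure is the one quoted above:

```python
from math import log10

def fspl_db(distance_m: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (distance in metres, frequency in MHz)."""
    return 20 * log10(distance_m) + 20 * log10(freq_mhz) - 27.55

def max_range_m(tx_dbm: float, sensitivity_dbm: float, freq_mhz: float) -> float:
    """Largest free-space distance at which the received power still meets sensitivity."""
    budget_db = tx_dbm - sensitivity_dbm
    return 10 ** ((budget_db - 20 * log10(freq_mhz) + 27.55) / 20)

# How much extra loss does 5 GHz cost over 2.4 GHz at the same distance?
print(f"extra loss at 5.2 vs 2.4 GHz: {fspl_db(10, 5200) - fspl_db(10, 2400):.1f} dB")

# Assumed link: 15 dBm TX, -94 dBm sensitivity at 1 Mbps (11b), -82 dBm at 6 Mbps (11a).
print(f"11b free-space range @ 1 Mbps: {max_range_m(15, -94, 2412):.0f} m")
print(f"11a free-space range @ 6 Mbps: {max_range_m(15, -82, 5200):.0f} m")
```
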
    • The result is that adjacent APs have to share the spectrum.

      I once thought up a solution for this. The APs could have active antennas with a grid of elements, much like modern military radars. This sort of antenna is directional and the beam is electronically steerable. As long as there was a different band for downstream and upstream, interference would be virtually eliminated. Finding each client's direction, and scheduling when to listen to each client, would still have to be solved somehow.

      This sort of AP wou
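
The electronically steerable antenna idea above is essentially a phased array. A minimal sketch of the array factor of a uniform linear array (the 8 elements and half-wavelength spacing are arbitrary choices for illustration); steering is done purely by applying a progressive phase shift across the elements:

```python
from math import sin, radians, log10, pi
from cmath import exp

def array_factor_db(n_elements: int, steer_deg: float, look_deg: float) -> float:
    """Relative gain (dB) of a uniform linear array in direction look_deg
    when electronically steered toward steer_deg. Elements are spaced
    half a wavelength apart; steering is a progressive per-element phase shift."""
    d = 0.5  # element spacing in wavelengths
    psi = 2 * pi * d * (sin(radians(look_deg)) - sin(radians(steer_deg)))
    af = sum(exp(1j * psi * k) for k in range(n_elements))
    return 20 * log10(abs(af) / n_elements + 1e-12)  # epsilon avoids log(0) at nulls

# Steer an 8-element array toward a client at +30 degrees and look around:
for look_deg in (-40, -10, 10, 30, 50):
    print(f"direction {look_deg:+4d} deg: {array_factor_db(8, 30, look_deg):6.1f} dB")
```
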

  • by BelugaParty ( 684507 ) on Monday July 05, 2004 @04:50PM (#9616162)
    I can sit in a college library and browse people's laptops as if they were on a trusted network. People don't realize how public WiFi is in these environments. I think the main cause of this is the connection wizard (Microsoft's specifically). When you first set up the computer for wireless access, it automagically, without a lot of warning, shares folders, printers, etc., because it assumes you are at your house with your Linksys router, not at the library, the coffee shop, or hijacking i-net from an apartment complex across the street.
  • by Cytlid ( 95255 ) on Monday July 05, 2004 @05:40PM (#9616421)
    Concerning the second article, 802.11a seemed pretty clever to use for the uplink. A mesh within a mesh. But isn't 802.11a unencrypted? What's to stop me from pulling over along the side of the road with my trusty 802.11a nic and sniffing cleartext (uplink) traffic? That's a lot of pop3 passwords, my friends.
    • ... isn't 802.11a unencrypted ...

      No. 11a, 11b and 11g are physical layer (PHY) components of the 802.11 standard. The Medium Access Control (MAC) layer, which sits above the PHY layer, is common (with very small differences) to all the PHY layers.

      Security is specified by 802.11i (ratified in the last week or so) and applies equally to 11a, 11b, and 11g. Until recently, they all shared the same poor excuse for security - WEP - or proprietary extensions such as Cisco's LEAP. In the future, they will all share (wh

  • use more power (Score:3, Informative)

    by max born ( 739948 ) on Monday July 05, 2004 @05:48PM (#9616460)
    Many of these problems can be easily solved with more power. The FCC has imposed severe power limitations on 802.11 of about 100-200mW per channel.

    If the FCC would allow us amateurs to use, say, half the power that cell phone companies do, we'd be able to Wi-Fi the whole country.

    Give us the tools and we'll finish the job.
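
One caveat worth quantifying: with a simple d^-n path-loss model, range only grows as the n-th root of the power increase, so even a big bump in transmit power buys less range than intuition suggests. A rough sketch (the 10x power step and the indoor exponents of 3-4 are illustrative assumptions):

```python
def range_gain(power_ratio: float, path_loss_exponent: float) -> float:
    """Factor by which range grows when transmit power is multiplied by
    power_ratio, under a simple d**-n path-loss model."""
    return power_ratio ** (1.0 / path_loss_exponent)

# Example: going from a 100 mW AP to a 1 W transmitter (10x the power).
for n in (2.0, 3.0, 4.0):
    print(f"path-loss exponent {n}: range x{range_gain(10, n):.2f}")
# free space (n=2): ~3.2x; cluttered indoor (n=3..4): only ~1.8-2.2x
```
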
    • No, I think the solution is manufacturing more electromagnetic spectrum!
    • Re:use more power (Score:3, Informative)

      by ofdm ( 748594 ) *
      This is, unfortunately, a common misconception. Increasing the range of 11b means that the available bandwidth has to be shared more widely - meaning that each user gets less bandwidth, which is to say data throughput.

      Imagine, if you will, a world where you could hear everyone talking within a block of you. Sounds great - you can hear your stereo from a mile away (well, this already happens). Unfortunately, you can also hear everyone else's stereo, and everyone else talking, and their refrigerators humming,

      • Re:use more power (Score:3, Informative)

        by max born ( 739948 )
        Yes, but with 802.11g and 802.16 the specs allow for 50+Mbps - the equivalent of dozens of T1s. For surfing the net and reading email, this is enough to support more users than may commonly be found in the limited area covered by the current power restrictions.

        The 802.11 specs cleverly implement CSMA/CA, a collision-avoidance system that seems to work pretty well (a rough sketch of it follows this comment). From my rooftop (downtown San Francisco), I can see 150+ networks yet never experience any symptoms of interference.

        Also, in setting the 802.11 limits on p
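
The CSMA/CA mentioned above boils down to listen-before-talk plus binary exponential backoff: when a transmission fails (no ACK comes back), the station waits a random number of slots drawn from a contention window that doubles on each retry. A much-simplified sketch - the CWmin=31, CWmax=1023 and 20 microsecond slot values are the 802.11b DCF defaults, everything else is schematic:

```python
import random

CW_MIN, CW_MAX = 31, 1023   # 802.11b DCF contention window bounds (in slots)
SLOT_US = 20                # 802.11b slot time in microseconds

def backoff_slots(retry: int) -> int:
    """Pick a random backoff for the given retry count (binary exponential backoff)."""
    cw = min(CW_MAX, (CW_MIN + 1) * (2 ** retry) - 1)
    return random.randint(0, cw)

# Schematic only: show how the average wait grows as collisions pile up.
for retry in range(6):
    waits = [backoff_slots(retry) for _ in range(10_000)]
    avg_us = sum(waits) / len(waits) * SLOT_US
    cw = min(CW_MAX, (CW_MIN + 1) * (2 ** retry) - 1)
    print(f"retry {retry}: window 0..{cw:4d} slots, average wait ~{avg_us:5.0f} us")
```
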
        • Re:use more power (Score:3, Interesting)

          by ofdm ( 748594 ) *

          I ... never experience any symptoms of interference

          That's most likely because, as you said, your traffic requirements are low, and possibly the traffic on the networks you can see isn't particularly heavy. If you have access to the PHY layer, you will see that collisions are in fact very common. The standard provides a couple of ways of dealing with this. (I'm sorry if I'm teaching you how to suck eggs here - I don't know what you know, so I'm aiming low.)

          At the base level, each data packet is acknowl

          • No, you're quite right. I agree. I guess my point of contention is with the FCC. IMHO, local communities should decide on power limitations. For a small town with a few users over a large area, it may be more beneficial to use higher-powered transceivers than for a densely populated city. But local municipalities should be able to make decisions about how much interference is acceptable through their own elected representatives.

            The FCC's one-size-fits-all model is hindering WiFi's expansion.

            BTW, in the O'R
        • I think that the regulations are just fine for omnidirectional installations. However, if I am using a pair of old dishes for a point-to-point link, I should be able to use as much power as I can without nuking animals that stray into the path of the beam.
    • There's a problem, though, with using more power. You increase interference with everybody else while making a small improvement with your intended recipient. A directional antenna helps you when you receive as well as when you transmit. If you need to serve an area, you can still benefit from an antenna that concentrates radiation in a pancake shape so you don't waste power transmitting straight up. High power conflicts with sharing.
  • by fegriffin ( 793733 ) on Monday July 05, 2004 @05:49PM (#9616465)
    Although there is never enough bandwidth, until we can solve the last-mile bottleneck, 11Mbps 802.11b networks will be sufficient. With ADSL and cable modem rates at less than 1Mbps, that is where the problem needs to be solved.
    • 802.11b at 11Mbps is fine for surfing the net, but 11b will run into problems as density increases.

      The next generation of wireless is media access: your TVs talking to the digital cable modem, the digital recording/storage devices, and the satellite receiver without cables, distributing HDTV signals from all your possible sources to all your possible end-user devices and displays.

      That is what the next gen wireless is all about. The range needs to be big enough to cover your house but small enough that neighbors do n
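
For scale on the HDTV point above: an ATSC HDTV transport stream runs at about 19.4 Mbps, while the usable throughput of the 802.11 flavours is well below their headline link rates. The effective-throughput figures below are rough assumptions, not measurements:

```python
# Rough budget check for whole-house HDTV distribution over Wi-Fi.
HDTV_STREAM_MBPS = 19.4  # ATSC HDTV transport-stream rate

# Assumed real-world throughputs (headline link rates are much higher).
effective_throughput_mbps = {
    "802.11b (11 Mbps link)": 5.5,
    "802.11g (54 Mbps link)": 22.0,
    "802.11a (54 Mbps link)": 25.0,
}

for name, mbps in effective_throughput_mbps.items():
    streams = mbps / HDTV_STREAM_MBPS
    print(f"{name}: ~{mbps} Mbps usable -> {streams:.1f} HDTV streams")
```
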
  • The good thing about mesh networks and leaving things unsecured is that you can fileshare without much worry from the RIAA, because you can always claim it wasn't you downloading from that IP - it was someone hooking into your wireless router. Justin Frankel, the godfather of distributed P2P file sharing, has also suggested doing this on his own blog.

    May 19, 2004 - I've been saying this for a long time, well over a year now: http://www.salon.com/tech/feature/2004/05/18/safe_and_insecure/index.html

    • Actually no, according to an article by the heise lawyer in the recent issue (sorry, no links, I only have the printed version) of German computer magazine c't - it may be different in other countries.

      If somebody shares illegal data using your access point without your knowledge and the police stop by, it's your responsibility to show them that it wasn't you sharing all those warez, kiddie porn and MP3s. So if you plan on making your AP a hotspot, you should really think about how to create detailed log f
  • That would solve all the problems.

    Think about all the radio stations on the radio dial and how much more efficiently we could use 'our' (public) spectrum if we used wifi, or even some variant like directional wifi (Pringles can).

    I think only the government is impeding this.

    The radio spectrum is not used wisely.

    Let's change it. Time to get rid of 'radio'.
  • Will Smith? (Score:1, Funny)

    by darkain ( 749283 )
    I just hope that BelAir isn't run by The Fresh Prince of Bel-Air [imdb.com]
  • I've recently been having trouble with some wireless hardware. After exhausting the hardware manufacturer's support options, I looked for general wireless support forums, and didn't seem to find very many. Can anyone recommend a good place to get support for wireless hardware?
  • i'm using my neighbors wireless right now and i dont mean nextdoor, he lives 4 blocks away! the signal usually doesnt cut out that much but when it does then everything i'm sending get curup!#KJH!KJ#H!BNMSDUIAYHKJ as i was saying everything gets curupted, but at least i get free broadband(w00t i'm gonna bittorrent all night)
  • Google for ICOM ID-1 for the manual & specs...

    In short: 128 kb/sec, has USB & 10BaseT cables, as well as a microphone! Does data at that speed, but also digital voice at 4.8 kb/sec & analog voice, all on 1.2 GHz (an amateur band).

    12 V @ 6 A on higher power, but reportedly more reliable than traditional WiFi gear, in cars, etc.

    Oh, it would require a Ham Radio License...
