Hardware

The Joys of Microwaves And Wireless 67

Simone Paddock pointed me to the article on O'Reillynet about some quick and dirty testing of WEP, wireless, microwaves, ovens and all sorts of fun stuff. The article is entertaining and informative - my favorite kind.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Wahahahahahah... Karma be damned... I've already watched my post go back and forth between "Interesting" and "Off Topic" 3 times in 10 minutes!
  • Funny, I tried to submit a REAL news story about wireless a couple days back "2001-03-29 00:17:32 Color Cell Phones (articles,tech) (rejected)." Thought it was still interesting, so I put the story on MU [current.nu] with a few references. It was announced by Sprint themselves Yesterday [sprintpcs.com], and Yahoo [yahoo.com], AND... cnet [cnet.com] have already run the story now too. PDA Buzz mentioned it yesterday [pdabuzz.com]. And it's not like it's shocking news if you look at some of the cell phone stuff going on, like the PalmOS Samsung [pdabuzz.com] and the countless other PDA-replacement cell phones coming out.

    But I guess it's not /. news until they can be sure to be the LAST to report it! Hmm.. Ya, but those wireless microwaves... Wooo doggy, uber geek, but... Ah, nevermind, it is sort of cool, just a bit more pointless. Just glad there are at least one [i4u.com] or two [kuro5hin.org] other sources of geek news besides Slashdot...

  • ssh is not a useful network testing tool. it -royally sucks- at transferring data, even with encryption turned off. It performs way too many data copies and does system calls on buffers that are way too small.

    Use ttcp or netperf.
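
    For anyone who wants the flavor of what ttcp and netperf do, here is a minimal raw-TCP throughput probe in Python. The host, port, and transfer size are made-up values, and this is just an illustration of the idea, not a substitute for either tool: it pushes one big reusable buffer straight down a socket and discards it on the other end, so there are none of ssh's extra copies or small-buffer system calls.

        # Raw-TCP throughput probe (sketch only; host/port/size are assumptions)
        import socket, time

        HOST, PORT = "192.168.1.10", 5001      # hypothetical receiver address
        TOTAL = 16 * 1024 * 1024               # bytes to push per run
        CHUNK = b"\x00" * 65536                # one large reusable buffer

        def receiver():
            srv = socket.socket()
            srv.bind(("", PORT))
            srv.listen(1)
            conn, _ = srv.accept()
            while conn.recv(65536):            # read and throw away, like /dev/null
                pass
            conn.close()

        def sender():
            s = socket.create_connection((HOST, PORT))
            sent, start = 0, time.time()
            while sent < TOTAL:
                s.sendall(CHUNK)
                sent += len(CHUNK)
            s.close()
            print("%.2f Mbit/s" % (sent * 8 / (time.time() - start) / 1e6))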
  • How about a hair dryer, an espresso machine, an electric razor, a TV (flat panel and CRT types), a nearby PC running a benchmark, a washing machine and/or dryer? Gotta give it that lived-in environmental background noise.

    All of those items operate at a different frequency than the wireless network. They should not have an effect; only devices that output noise in the 2.4 GHz band should cause a problem.

  • Microwaves are one thing, but let's look at something that operates in the same band. On my wireless LAN, if I get on my 2.4 GHz cordless, I have no more network, WEP or not.

    Microwaves do operate in the same band. They run at 2.4 GHz. This may have something to do with why the 2.4 GHz band is unlicensed.

  • He didn't use any cipher with ssh. So everything was unencrypted.
  • That's not completely true.

    The WEP encryption works like this (assuming the 128-bit version):

    - every station has a 104-bit secret key
    - every packet has a 24-bit IV (initialization vector) in the header.

    The 24-bit IV and the 104-bit key are combined into a 128-bit key, which is used to seed a pseudo-random generator based on RC4. The resulting stream of random numbers is then XOR-ed with the payload. To prevent tampering with the contents, there's also a 32-bit ICV (integrity check value) appended, which is a CRC over the plaintext.

    Now, the short IV is vulnerable to attacks because if you ever find two packets with the same IV (and the same secret key), you know both contents have been XOR-ed with the same stream.

    However, if all the packets you receive use different IVs, your only way to find the contents is to crack the 104-bit secret key.

    The proposed changes to WEP include a larger IV that is not appended to the secret key but XOR-ed with it.
    This will still result in a total 128-bit key length to seed the RC4 engine, so this should be a firmware-only upgrade for most (all?) vendors' hardware.
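
    For the curious, here is a rough Python sketch of the per-packet scheme described above: seed RC4 with the 24-bit IV concatenated with the 104-bit secret, append a CRC-32 ICV to the payload, and XOR the whole thing with the keystream. The key and IV bytes are made up and real 802.11 framing details are omitted, so treat it as an illustration of the design, not an interoperable implementation.

        import zlib

        def rc4_keystream(key, length):
            # Key-scheduling algorithm (KSA)
            S = list(range(256))
            j = 0
            for i in range(256):
                j = (j + S[i] + key[i % len(key)]) % 256
                S[i], S[j] = S[j], S[i]
            # Pseudo-random generation algorithm (PRGA)
            out, i, j = [], 0, 0
            for _ in range(length):
                i = (i + 1) % 256
                j = (j + S[i]) % 256
                S[i], S[j] = S[j], S[i]
                out.append(S[(S[i] + S[j]) % 256])
            return bytes(out)

        def wep_encrypt(secret_104, iv_24, payload):
            icv = zlib.crc32(payload).to_bytes(4, "little")   # 32-bit integrity check value
            plaintext = payload + icv
            keystream = rc4_keystream(iv_24 + secret_104, len(plaintext))
            body = bytes(p ^ k for p, k in zip(plaintext, keystream))
            return iv_24 + body                               # the IV travels in the clear

        # Hypothetical key material: 3-byte (24-bit) IV, 13-byte (104-bit) shared secret.
        frame = wep_encrypt(b"\x0a" * 13, b"\x00\x00\x01", b"hello, access point")

    Reusing the same IV with the same secret gives two packets XOR-ed with an identical keystream, which is exactly the collision problem mentioned above.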
  • The 802.11a 54 Mbps standard has been finalized, but we have been waiting for vendors to make the chipsets. Atheros [atheros.com] is the first vendor to ship a product.

    At 5 GHz, walls attenuate the signal even more than at 2.4 GHz (the 802.11b band), so it remains to be seen how close you will get to the 54 Mbps signalling rate. With the MAC protocol overhead, even in the best environment, you'll lose between 30-50%. This is not just a wireless issue; 100 Mbps Ethernet also pays a MAC penalty, just not as much as wireless.

    What will really impress the geek in all of us are the new wireless products utilizing Ultra Wideband [uwb.org] (UWB) techniques. A UWB radio transmits its signal using Gaussian monocycles instead of sine waves. With ultra-low power emissions and over 2 GHz of spectrum, conventional narrow-band systems are not disturbed, and UWB signals appear as white noise, making UWB very hard to detect. High-speed bandwidth, precise positioning and location, and radar capabilities are being demonstrated today. A leader in the field is Time Domain [timedomain.com]. The FCC is expected to rule on legalizing UWB this fall.

  • That may be true... but there is bleed-over. We had two 2.4 GHz phones at work (different lines) and they interfered with each other, even though both were supposed to frequency-hop. Dunno... just an observation. (I think one was a Motorola and one was a Panasonic, the one with the two antennas.)
  • We have a site where we're providing wireless to a school. Every day at lunch the site gets knocked out because of the microwaves. This is a timely article for us; I'm looking for ways to eliminate the interference.

    We know for a fact it's the microwaves, since turning them off cleans up the link and turning them on kills it. We've tried changing their positions and replacing some of the older microwaves. It still seems to be a problem. Strangely, the microwave we brought in causes no problems, but another one of the same brand does. A spectrum analyzer has also shown the microwaves to be the problem.

    Can anyone suggest a way to shield a microwave or otherwise eliminate the source of interference?
  • I just wanna know the speed in a typical environment like my home, which is where *I* will be using it.

    I'm using the Lucent stuff in my home. I've got the RSG-1000 gateway hooked to my cable modem, then I've got a tower PC with a PCI-to-PC Card adapter, my laptop with its PC Card, and my Jornada 720 gets a card too.

    I've been able to download files at 400 kps from the 'net, but that doesn't tell me whether the speed limit is the card or the broadband (my guess is the latter).

    I'm running Windows on all the machines (I know, I know, but I make a living writing stuff for Windows) - if someone can suggest a compatible benchmark for peer-to-peer speed testing I'd be happy to give it a whirl.

    On the topic of Windows... great glubb almighty, is SMB the slowest fscking thing in the world or what? I do better using Zip disks to transfer files!
  • I'd like to see this test repeated with a GHz (1.2? 1.5?) cordless phone in the room.

    Based on my experience, you do not want to use a 2.4 GHz phone if you're running 802.11b.

    I've got Lucent/Orinoco stuff. Bought a Siemens cordless, ran it in the same room as the wireless gateway (which is always on). The cordless would not see the base station if it was farther than a foot away.

    I didn't bother checking my throughput with the phone on; I just took the phone back. My old 900 MHz phone works just fine, no interference.

    If anyone knows of a two-handset-capable 900 MHz phone, give a shout.
  • While I appreciate the author's intent to attempt a fair comparison and evaluation of 802.11 wireless radio transmission speeds against manufacturer specifications, there are a number of serious errors in the method of testing and the subsequent conclusions.

    First let me express my bias: I do not believe that the 802.11b standard utilizing Direct Sequence Spread Spectrum (DSSS) technology suits mobile or last-mile usage. Ideally this technology suits point-to-point (PtP) links or controlled point-to-multipoint (PtMP) involving fewer than a half dozen receivers. Now, taken with a grain of salt, let us continue.

    1. There is no baseline.
    There are many variables involved in the transmission of data between two computers, and while the author made a valid attempt to remove some by eliminating disk I/O operations, interface delays (PCMCIA/CardBus), protocol overheads (802.11b, 802.2, TCP), and signal quality (retransmissions/bad packets) were ignored. Simply running the same test with 10baseT PCMCIA cards would have accounted for the TCP and 802.2 overheads, leaving only the 802.11b (wireless) protocol and retransmissions as variables, which is what he was testing for.

    2. The method of comparison is flawed. Assuming that a 1 Mbit radio link will provide 1,048,576 bits/second of throughput is incorrect. The 1 Mbit number refers to the maximum RADIO speed possible, not the throughput that will be seen. The transmission process tested was SSH --> O/S --> TCP/IP --> Ethernet --> Wireless, and the reverse on the receiving side. Since this computer is probably used day to day on 100baseT Ethernet and sees transmission rates well above 4 Mbit, the machine itself can be ignored for this comparison, given the lack of a valid baseline (see #1 above). SSH encapsulates the raw data provided with its own headers, TCP/IP adds packet headers, Ethernet adds MAC headers, and Wireless adds 802.11b headers. Depending on the packet size being utilized by the SSH implementations, this can result in a packet overhead of 20% to 60% (a rough worked example follows at the end of this comment). In this case, based on the times published, it is probably in the 20-30% range.

    After deploying a real network supporting 10,000 desktops in 16 communities, I can safely say that DSSS technology is being deployed and used incorrectly, as it is not well suited to office-mobile, last-mile, or roaming-mobile deployments. The large overheads involved in the 802.11b protocol reduce actual throughput, and the congestion that results from co-locating cells is unacceptable outside of a small office or home environment.

    The industry as a whole needs to take a hard look at the true capabilities of both DSSS and FHSS technologies, without the spin created by the large manufacturers pushing their own products. DSSS has excellent applications in providing network backbone and high-rate point-to-point communication; it is not suited for outdoor point-to-multipoint deployment or in-office roaming between access points, let alone mobile roaming like a cell phone. Frequency Hopping technologies should be deployed in the busy office, are perfectly suited for outdoor deployment, are extremely resistant to interference, and support high-speed roaming between cells at 60 mph for the in-vehicle applications that will start to appear shortly.
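
    The rough worked example promised under point 2: how much of each transmitted packet is headers rather than payload. The header sizes below are approximations I am assuming for the sketch (they are not from the article), but they show how the overhead fraction swings with packet size.

        # Assumed per-packet header sizes (rough figures, not measured values)
        WIRELESS = 30 + 8 + 4          # 802.11 MAC + LLC/SNAP + FCS
        TCP_IP = 20 + 20               # IPv4 + TCP, no options
        SSH = 32                       # per-packet SSH padding/MAC, very roughly

        for payload in (128, 256, 512, 1460):
            total = payload + WIRELESS + TCP_IP + SSH
            print("payload %4d bytes -> %4.1f%% of the frame is overhead"
                  % (payload, 100.0 * (total - payload) / total))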

  • Yes, and herbal tea can hardly be described as standard tea.
    Well, most digital cordless phones seem to be either 900 MHz or 2.4 GHz. 802.11 runs at 2.4-2.4835 GHz in the US, and Bluetooth runs at 2.4 GHz as well. Both the phones and, AFAIK, the wireless data technologies use frequency hopping, so they should be able to coexist peacefully.

    Actually, almost all of the 802.11b gear (definitely anything with a WiFi logo) uses Direct Sequence rather than Frequency Hopping. Another thing to note is that _anything_ operating in the 2.4 GHz band has the potential to interfere. I've seen a Frequency Hopping radio punch holes through a Direct Sequence radio's throughput, because its hopping happened to correspond to the same frequency at times (the frequency hopper also had a 100 mW transmitter compared to a 35 mW transmitter on the Direct Sequence radio).
  • Apart from some packet loss, our broadband from Sprint rocks. That and some 802.11, and it's a lot fewer cables around the house.
  • Latency... waiting... still waiting... watching all the other 'puters taking turns... more latency... gaming? Not just no, never, but don't ever even think about Quake!
  • Guess I'm not the only one sitting too close to the microwave... (re: moderation) hehe


  • Why would you want warm Mountain Dew?

    (Diet Coke or other soft drink may be substituted for 'Dew if applicable)
  • Actually, I have a Lucent 802.11b Wavelan, an Apple Airport, and a Siemens 2.4GHz phone and I've noticed no problems. I tried to cause problems but I couldn't.

    In fact, I've had more problems with the phone itself -- it dials too fast for my phone company and I have to put in a 1010xxx prefix in order to get it to work and not give a fast busy.
  • Now, a quick question: Do the cans of Mountain Dew need to have pop in them still?

    If not, I could use the excuse that all those cans laying around are to "optimize" my network connection :)


    It's a joke. Laugh!
  • Sorry, I should have run echo $((128-104)) :-(
  • with more than one magnetron

    I should forward this to those people I left my old Litton commercial grade nuker with. It's...uh... near the computer center. :)


  • Gads. I forgot fluorescent lights... noisy buggers.


  • Now it looks like I want someone to mod down my own post. Dammit.
  • I just wanna know the speed in a typical environment like my home, which is where *I* will be using it. Now, I would understand a test like what you proposed for benchmarking in an industrial situation, but I don't believe that was the intent of the author.
  • From the article: The first interesting thing that I noticed was that, no matter how hard I tried, I couldn't squeeze out more than about 4 Mbits.

    I have noticed this as well, and with various types of equipment. At the time I wasn't sure if this was a limitation of the cards with respect to TCP connections, or just due to a lot of collisions. Does anybody have explanations or similar experience?

    It would seem that this is relatively unimportant, considering a lot of people can't get 4 megabits to the net anyway. However, as this bandwidth is shared in a peer-to-peer network, I would worry that things would start to get ugly when the number of peers is > 2.

  • What's the impact when a plasmoid is being generated in the microwave?
  • He mentions early in the article that there is a small amount of overhead... because of CPU time and accessing the hard disk. Why then does he scale the overhead percentage-wise with each of his tests? That overhead... should stay exactly the same

    In addition to calculations and disk access, there's also overhead that is dependent on the line speed; specifically, the link access times (avoiding collisions with the other end accessing the medium) and the transmission time for checksums and the other minutiae that make up headers. It may have been this to which his calculations referred.

  • You're right about the encryption. They market it as 64 and 128-bit, but it's really 40 and 104 with 24-bit hard-coded Initialization Vectors. Thus the crack at UC Berkeley.
  • First of all, 11 Mbps is the bit rate on the air interface, NOT the actual throughput achieved by an end user.

    1st problem: overhead. If the test was run using FTP, there is FTP overhead, TCP overhead, IP overhead, and 802.11 overhead (in the neighborhood of 80 bytes per packet).

    2nd problem: 802.11 is NOT a full-duplex connection. This means that whenever an acknowledgement is being transmitted by TCP, no other data is actually being transmitted on the network. Thus, the sending client has to wait for the channel to clear before transmitting more data, limiting the throughput even more.

    3rd problem: 802.11 overhead traffic. The 802.11 protocol calls out a lot of overhead packets that are not found on wired LANs, for example the Beacons, RTS, and CTS signals that are used by the end stations to access the network.

    As you can see, with all these limitations, 4 Mbps of application throughput is about the best you will ever see in a single-client environment. As more clients are added to the network, throughput actually decreases, depending upon how active the others are.
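
    As a rough sanity check on those numbers, here is a back-of-envelope airtime model in Python. The 80-byte per-packet overhead figure comes from the post above; the 802.11b timing constants (DIFS, SIFS, slot time, long preamble, link-layer ACK) and the one-TCP-ACK-per-two-data-frames assumption are mine, so treat the result as a sketch, not a measurement.

        RATE = 11e6                          # air bit rate
        PAYLOAD, HDRS = 1460, 80             # application bytes and protocol headers per frame
        DIFS, SIFS, SLOT = 50e-6, 10e-6, 20e-6
        BACKOFF = 31 / 2 * SLOT              # average contention backoff (CWmin = 31)
        PREAMBLE = 192e-6                    # long PLCP preamble + header
        LINK_ACK = PREAMBLE + 14 * 8 / 2e6   # 802.11 ACK frame sent at 2 Mbit

        def airtime(frame_bytes):
            return DIFS + BACKOFF + PREAMBLE + frame_bytes * 8 / RATE + SIFS + LINK_ACK

        data = airtime(PAYLOAD + HDRS)
        tcp_ack = airtime(40 + HDRS)         # the returning TCP ACK, roughly
        per_payload = data + tcp_ack / 2     # assume delayed ACKs: one per two data frames
        print("application-level ceiling ~ %.1f Mbit/s" % (PAYLOAD * 8 / per_payload / 1e6))

    With those assumptions it comes out right around the 4-5 Mbps figure quoted above.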
  • Use ttcpw; it's a Windows port of the classic ttcp network testing program, and will prove very useful in networked file transfer tests, with or without interference.

    And yes, it's probably the broadband.

    SMB is often faster than some distributed filesystems, but it also doesn't do as much. However, compared to a local hard drive, it's painful.
  • Obviously, my point is that the "Microwave Interference" has nothing to do with home microwaves. These studies have already been done, and having some layman shout "woohoo, it works fine with *my* microwave, I don't know what *they're* talking about" does nothing but propagate myths and false information.

    Similarly, people will wonder if this will cause cancer, but they face a much greater risk from their cellphone (which is to say, almost nil), and a greater risk still from their microwave (which is to say, not much), both of which are far safer than The Sun! (which people don't even think about, blah, blah, sunscreen, blah blah blah...)

    My point is that a little research and correct information could have saved us all from this garbage, ad-hockery, and mockery of what used to be science. Apparently I was born in the wrong era, or came to the wrong place.
  • Modern microwaves are too well shielded and run on too tight a frequency band to really give interesting data in an experiment like this.

    I want to try this with my grandparents' microwave. It's really old, but that carries with it a few "advantages". It's massively overpowered (it browns out the kitchen lights when turned on), inadequately shielded (I think the housing is bakelite), and runs just fine with the door open (I don't think they had product safety regulations when this thing was built). I've offered to buy them a brand new microwave in exchange for this beast, but no luck.

    Now that I think about it, when I say "I" want to try this experiment with my grandparents' microwave, I actually mean I want a "research assistant" to try this experiment.

  • You are wrong.
  • by Anonymous Coward
    Last month I had a vasectomy. Now I've got wireless rocks.
  • I also have an 802.11 setup and a microwave :) and, thrown into the mix, an X10 DVD Anywhere 2.4 GHz audio/video sending unit which I use to play radio from a satellite receiver on my stereo. I use the 802.11 setup to play MP3s off of an NFS-mounted file system on a laptop, also on the same stereo.

    Here is what I see.

    1) If I am playing the radio through the X10 setup and turn on the microwave, it practically destroys my speakers. This is true on all four of the channels that the X10 supports. It is impossible to use the X10 and the microwave at the same time.

    2) If I am playing an MP3 on the laptop and switch the stereo over to the X10 receiver, I will hear the laptop accessing the file system as a series of fuzzy zip sounds.

    3) If I am playing MP3s through the laptop and run the microwave, I see and hear no noticeable difference. Also, the 2.4 GHz broadcast from the X10 equipment does not seem to affect the laptop's disk access in any way.

    Now I realize that this is not an intensive network application, nor is it very scientific. It is, however, a real-world application of the technology, and my take is that if you are using an 802.11 device to do downloads or serious network-intensive applications, sure, you will suffer interference from other 2.4 GHz devices (including microwaves, which by the way happen to fall exactly in that spectrum). Go do these things on your PC. If, on the other hand, you are using your wireless network for more low-key applications such as surfing the net, checking email, and simple file access, you will not notice any problem at all; it is simply not a problem.
  • A few strategically placed cans of Mountain Dew next to your wireless card help reflect the microwaves, allowing a fast connection with no interruptions.

    Pie tins work wonders too.
  • But does anyone know what the deal is with the 802.11 PRISMII chipsets? The linux-wlan (http://www.linux-wlan.com/) project supports it, but no ad-hoc mode so far.

    While browsing around source code, I see that the wvlan_cs module out of the latest pcmcia-cs package, originally designed to work with the Lucent cards, supports the PrismII chipsets as well, at least nominally. And in any case, it supports ad-hoc where the linux-wlan does not.

    So what's the deal? Two different drivers supporting the same card, I guess? Will PrismII support continue in wvlan_cs? I'm so confused. :)


  • I would claim that you, sir, have "no fucking clue" when it comes to 802.11b. Let's look at your claims: DSSS has a maximum theoretical throughput of 4-5 Mbps? I don't think so. I have personally run performance tests of Aironet and Lucent 802.11b cards, and have achieved throughput of 7.6 Mbps, 50% higher than your "maximum". Results are right here. [ucar.edu]

    You also claim that WEP adds zero overhead because it is a "hash" done in hardware. WEP is not a hash. It is RC4 encryption, a stream cipher, not a hash. Moreover, many products do not implement the cipher in hardware. Specifically, Lucent uses software and takes a 20% performance hit when WEP is enabled. The Aironet cards use hardware and take no performance hit. See the above URL for details.

    The reason DSSS is marketed above FHSS is because it is faster. The fastest FHSS 802.11 hardware runs at 2 Mbps, and is in fact cheaper than DSSS hardware. 802.11b is winning because it is flat out the fastest affordable wireless Ethernet technology on the market.

    So, I ask, if we are "fucking stupid idiots" for buying 802.11b, what would you recommend? 2 Mbps FHSS gear? Non-existent HomeRF that even Intel is abandoning? Please, o wise one, enlighten us "fucking stupid idiots".
  • Ah, got it. He meant all the little bits of protocol data that got sent along in addition to the 11,000,000 bits of payload data getting sent across the stream. Thanks.


    --Brogdon
  • I'd like to see this test repeated with a GHz (1.2? 1.5?) cordless phone in the room.

    Well, most digital cordless phones seem to be either 900 MHz or 2.4 GHz. 802.11 runs at 2.4-2.4835 GHz in the US, and Bluetooth runs at 2.4 GHz as well. Both the phones and, AFAIK, the wireless data technologies use frequency hopping, so they should be able to coexist peacefully.


  • So now I can heat up pizza in my car, right?
  • I am not surprised that a modern microwave oven in good condition (like the one in the photo) did not interfere with the LAN. However, an older oven (even a 300W one) with a faulty seal could easily radiate 100 times as much power and cause real problems.

    Thought: If your microwave messes up your wireless LAN then get the oven professionally tested. An oven with a faulty seal can be dangerous.

  • How about a hair dryer, an espresso machine, an electric razor, a TV (flat panel and CRT types), a nearby PC running a benchmark, a washing machine and/or dryer? Gotta give it that lived-in environmental background noise. :)

    I recall one office that was over a metal shop, where a lot of the RFI came from the arc welders.

    Very nasty stuff, messed with the desktops big time, even though they were on separate power systems.

    Being located directly over the shop didn't help.


  • If you take a fluorescent light bulb and put it up near a CB antenna and key the mic, the bulb will light up.

    If you take a regular light bulb and throw it up in front of a powerful enough microwave transmitter, it will go off like a flashcube.

    But interference is a matter of frequency; the interfering signal has to fall within the same frequency range as the receiver.

    The 60 Hz audio hum caused by improper grounding in audio equipment is within the range of the audio equipment's susceptibility. I'm sure interference with wireless transmission and reception is likewise constrained to the specified frequency range of the equipment.


  • Furthermore, the FCC is supposed to ensure that the two devices not cause interference with each other. I'm quite sure there is an FCC sticker on your microwave oven.
  • I was sending it to the server when a Giant Sunspot [yahoo.com] ate it! I swear!


  • How about a hair dryer, an espresso machine, an electric razor, a TV (flat panel and CRT types), a nearby PC running a benchmark, a washing machine and/or dryer? Gotta give it that lived-in environmental background noise. :)

    Incidentally, the tea was delicious (although a bit too hot). Mmmm ... Yerba Buena Maté

    I'd name that microwave Eddie.


  • I'd like to see this test repeated with a GHz (1.2? 1.5?) cordless phone in the room.
    I'm hesitant to buy one in case it interferes with my future Bluetooth devices :)
    (same goes for 802.11 I guess)
  • No, it is 24-bit encryption strength, due to the faulty design of WEP.

    The key lengths are 40 bits and 128 bits as advertised, but the effective strength is not 128-24 bits; it is 24 bits.

    There is a fixed version of the protocol being worked on by the IEEE.

  • If you read the literature, you'll notice that a conventional microwave has little or no effect on an 802.11b transfer.

    Bzzzzttt!! If the guy had read the literature, he would have mistakenly thought the encryption protocol works.

    There is no substitute for checking this type of stuff out for real. Just because the marketroid who wrote the manual claims there is no effect does not mean that the statement is true.

    Believe it or not, marketing people often use a sophisticated technique to sell faulty products; it is called lying. Other prominent exponents of the technique are to be found in politics (no new taxes, no more greenhouse gases).

    In my day, research meant going into a lab and doing an experiment like the article describes. Going into the library to read up on second-hand research claims is not the same thing, nor is it superior.

  • The "mbit" rating on physical layers is the raw cycle rate of the bit symbols in the stream. That is, 10BaseT, in the middle of a packet, is sending bits at 10 mbps. But you're not always in the middle of a packet. You're forming packets, detecting the carrier, backing off collisions, sending preamble bits, sending header bits, doing it in two directions at once, etc., etc...

    A moderately loaded TCP/IP/10BaseT network with well-behaved and responsive nodes on it will usually appear to have 3 mbps of user-data throughput, or roughly 300 kilobytes-per-second download rates in ftp.

    In order to advance technology, I will now make a rash generalization that will in the future be looked upon as shortsighted and embarassing:

    Nobody will ever really need more than 300 KB/s anyway.

    --Blair
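
    The quick arithmetic behind that rule of thumb, as a sketch (the framing sizes are assumed, and the 3 Mbit figure is the poster's observation, not something this calculation derives):

        RAW = 10e6                                    # 10BaseT line rate
        payload = 1460                                # typical TCP payload bytes
        framing = (14 + 4) + 20 + 20 + (8 + 12)       # Ethernet+FCS, IP, TCP, preamble+gap
        print("framing-only ceiling: %.1f Mbit/s" % (RAW * payload / (payload + framing) / 1e6))
        print("%.0f KB/s of ftp payload at 3 Mbit/s" % (3e6 / 8 / 1024))

    The gap between the framing ceiling and the observed 3 Mbit is what contention, ACKs, and less-than-full packets cost on a shared segment.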
  • They missed one test case I'd love to see: laptop in the running microwave.
  • just how many of these do I need to keep my beverages warm next to the PC?
  • Similar to the observations above, I have a home setup with:

    - A 2.4 GHz video/audio baby monitor (Safety 1st brand).
    - 2 laptops with Lucent Orinoco Gold wireless PC cards w/128-bit encryption (as in the article), talking to a Lucent AP 500 base station.
    - An X10 2.4 GHz wireless video camera array (3 cams).
    - A microwave oven in close proximity to all.

    It took me quite some time to figure out how to get any of this to work simultaneously. It turned out that all I had to do was set the X10 cameras to the fourth channel instead of the default, and suddenly everything worked simultaneously without any noticeable problems. Sometimes you can see individual Ethernet packets warp the screen of the baby monitor, but that only seems to happen occasionally. I don't know why. Otherwise, good to perfect throughput on the Ethernet, a nice clear screen on the baby monitor, and good pictures from the X10 cameras. And my food tastes cooked. :)

    I'm thoroughly amazed that it all works as well as it does. I am afraid of what will happen if I get any other microwave devices, though. There's clarity somehow in the cacophony, and I don't want to disturb that ordered chaos.
  • Microwaves are one thing, but let's look at something that operates in the same band. On my wireless LAN, if I get on my 2.4 GHz cordless, I have no more network, WEP or not.
  • If you read the literature, you'll notice that a conventional microwave has little or no effect on an 802.11b transfer.

    Only a commercial microwave, with more than one magnetron, will make a significant impact. Please do your research before linking to such unsubstantiated fluff, and please correct them as well.

    (does anyone even read technical literature anymore?)
  • Does anyone know if the fix the IEEE is working on will be flashable onto current cards, or is it a silicon change?

    Also, the original reports made it seem like an attack would be "hard" to do and would require a lot of traffic to analyze. Is a current installation with low traffic usage vulnerable to real-world haX0rs today?

  • by Anonymous Coward on Friday March 30, 2001 @02:44PM (#326211)
    Obviously, this man has no fucking clue about the technology he is reviewing. Does he realize how Direct Sequence Spread Spectrum works? Does he realize that ANY DSSS 11Mb link will result in an absolute MAXIMUM throughput of 4-5 Mbps because of the radio overhead (30%) and the asynchronous nature of radio communication? Does he know the frequency and spread of the channel he is on? Does he know that that particular DSSS channel probably doesn't overlap with the frequency of his microwave oven, since, if it did, he would have probably seen somewhere in the range of a 30-80% drop in bandwidth? Microwave ovens appear as almost a tall vertical line usually centered at 2450 MHz with a maximum of +/- 1 MHz spread on a frequency/power graph, while DSSS radios have a spread of almost 7 MHz. Does he know how WEP operates? Does he understand that WEP adds absolutely zero overhead to the actual radio data because it boils down to a SIMPLE KEYPAIR HASH THAT IS DONE IN HARDWARE? Does he realize that if he got a "2.4 GHz" cordless phone that operated with a frequency hopping radio, as most of them do, he'd get almost a 50% data rate reduction despite his phone working wonderfully? Does anyone seem to understand that DSSS sucks and is only marketed above FHSS because it is CHEAPER?

    Fucking stupid idiots buying 802.11b DSSS.
  • by brogdon ( 65526 ) on Friday March 30, 2001 @02:31PM (#326212) Homepage
    "Before we get to the numbers, I'd like to point out that even with the above fancy command, there was still a small amount of system overhead in actually getting the packets sent. As the exact amount is difficult to calculate but non-trivial, I decided to weigh the figures like this:

    At 1 Mbps, it should take 11.00 seconds to transmit my 11 Mbit file, in the best possible case. On average (each test was sampled five times and averaged), it took 14.91 seconds to complete.

    So, we have 3.91 seconds of unaccounted-for overhead (or 35 percent of the total transmission time.) For purposes of argument, we'll assume that the 1 Mbit speed is optimal, and deduct 35 percent from all transmission speeds (chalking it up to system overhead.) And so we are grading on a curve.


    I may be crazy here, but why does the amount of non-transmission overhead vary inversely with the speed of transmission? He mentions early in the article that there is a small amount of overhead in generating the bytes he's sending and timing the process, etc., because of CPU time and accessing the hard disk. Why then does he scale the overhead percentage-wise with each of his tests? That overhead time shouldn't disappear just because you're using a faster connection; it should stay exactly the same - the rest of the time should vary with the transmission speed. If you redo his calculations without scaling the overhead, the results look a lot more logical for the bandwidths he's using (bandwidth and transfer time vary more correctly with each other, that is).


    --Brogdon
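
    To make the objection concrete, here is the calculation redone both ways in Python, using only the two numbers quoted above (11 Mbits of payload, 14.91 s measured at the 1 Mbit setting). The predicted times at the higher rates illustrate the two models; they are not figures from the article.

        FILE_MBITS = 11.0
        IDEAL_AT_1, MEASURED_AT_1 = 11.00, 14.91
        overhead_s = MEASURED_AT_1 - IDEAL_AT_1      # fixed-cost view: 3.91 s, every run
        scale = MEASURED_AT_1 / IDEAL_AT_1           # the article's curve: ~35% extra, always

        for rate in (1, 2, 5.5, 11):
            ideal = FILE_MBITS / rate
            print("%4.1f Mbit nominal: fixed overhead -> %5.2f s, percentage model -> %5.2f s"
                  % (rate, ideal + overhead_s, ideal * scale))

    At the 11 Mbit setting the two models predict about 4.91 s versus about 1.36 s, which is why the "graded on a curve" numbers look so generous at the higher speeds.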
  • by Deluge ( 94014 ) on Friday March 30, 2001 @02:12PM (#326213)
    The standard seems to be 700 W (at least that's what most microwave-ready foods go by when specifying heating times). There are plenty of 1 kW+ microwave ovens available.

    If one is going to perform such a silly test (see the earlier post about FCC regs and devices that need to be able to deal with EM interference), then at least test it with something more than a cupwarmer.


  • by electricmonk ( 169355 ) on Friday March 30, 2001 @06:04PM (#326214) Homepage
    From the article:

    This essentially means "Take these 11 million random bits, copy them over the network using no encryption, just dump them into the bit bucket on the other end, and tell me how long it took you." I did this to try to minimize the impact of disk usage and CPU crunching, and just try to make the bits fly as fast as possible.

    Umm... well, if he used ssh in order to make a connection to his other computer, wouldn't he still be technically using encryption? Or is the algorithm used to encrypt WEP connections really that much more taxing on the CPU?

  • by BlowCat ( 216402 ) on Friday March 30, 2001 @02:07PM (#326215)
    Lucent cards don't support 128-bit encryption. The allowed key length can be 5 bytes (40 bits) or 13 bytes (104 bits).

    I admit that 104-bit encryption is as hard to break as 128-bit encryption now, but Lucent is exaggerating the card's cryptographic strength by a factor of 2**16=65536. Not good!

  • by Freeptop ( 123103 ) on Friday March 30, 2001 @03:07PM (#326216)
    As an engineer who works on 802.11 radios, I have a few issues with these tests.
    First, the author performed only one test for each configuration. You can never come to a valid conclusion about performance from a one-shot test.
    Second, the author doesn't really know much about RF, otherwise they'd realize that microwave ovens are shielded to prevent them from cooking the person operating the thing. While the shielding doesn't stop everything, it stops enough to let most 802.11 traffic get through without too much difficulty. As others have pointed out, the way to go is to operate other devices that communicate at 2.4 GHz, such as certain cordless phones, Bluetooth devices, etc.

    Third, the author doesn't appear to have read much on 802.11, or the author would realize that some amount of the overhead involved in 802.11 is used up in wireless headers that a simple application is never going to know exist.
    Fourth, the author used tests on one vendor's radio to come to a conclusion about all 802.11 radios, and I can tell you that not all radios are created equal.

    On the other hand, the author's results aren't actually terribly off from the performance seen in most 802.11b radios. Most 802.11b radios actually get a final throughput of about 3-5Mbps when running at "11 Mbps." Some of this is due to bus speed limitations (PCMCIA is slow), and some of it is due to the radios themselves (hey, you try getting a nice fast processor in there and maintain the price points), but whatever the limitation, no radio gets much more than 5Mbps in the best-case scenario (at least, of the radios currently on the market).

    In the meantime, I'm looking forward to 802.11a, which will operate in the 5 GHz band (hopefully, there will be less interference there) and should have a throughput of about 54 Mbps (if they can ever finalize a standard, that is).
  • by Bonker ( 243350 ) on Friday March 30, 2001 @02:00PM (#326217)
    But I keep getting these festering lesions on my arms and chest when I try to play EQ or Q3A. Must be 'line noise'.

    "I pray that I never suffer an internal burn" - Nicolai Tesla
