Nonlinear Neural Nets Smooth Wi-Fi Packets 204
mindless4210 writes "Smart Packets, Inc has developed the Smart WiFi Algorithm, a packet sizing technology which can predict the near future of network conditions based on the recent past. The development was originally started to enable smooth real-time data delivery for applications such as streaming video, but when tested on 802.11b networks it was shown to increase data throughput by 100%. The technology can be applied at the application level, the operating system level, or at the firmware level."
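The summary describes the core idea as predicting near-future network conditions from the recent past and sizing packets to match. As a rough illustration only (a hypothetical sketch, not Smart Packets' actual algorithm, which per the article uses a neural network), even a simple exponentially weighted moving average can play that role:

```python
# Hypothetical sketch of adaptive packet sizing: predict near-future goodput
# from recent samples, then size the next packet to fit the predicted channel.
# Constants and the target_latency parameter are illustrative choices.

MIN_SIZE, MAX_SIZE = 256, 1500  # bytes; assumed bounds for illustration


class AdaptiveSizer:
    def __init__(self, alpha=0.3):
        self.alpha = alpha      # smoothing factor: weight of the newest sample
        self.predicted = None   # predicted goodput, bytes/sec

    def observe(self, goodput):
        """Feed in a measured goodput sample (bytes/sec)."""
        if self.predicted is None:
            self.predicted = goodput
        else:
            # Exponentially weighted moving average: recent past -> near future.
            self.predicted = (self.alpha * goodput
                              + (1 - self.alpha) * self.predicted)

    def next_packet_size(self, target_latency=0.01):
        """Pick a packet size the predicted channel can carry in target_latency."""
        if self.predicted is None:
            return MAX_SIZE
        size = int(self.predicted * target_latency)
        return max(MIN_SIZE, min(MAX_SIZE, size))
```

A real implementation would react to much richer signals (loss, contention, retransmissions); the point is just that the predictor's output directly drives the packet size.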
Re:could be handy.. (Score:5, Insightful)
I'm not a network engineer, but latency matters more than bandwidth for things like game ping times.
For an example, play a Half-Life game, open the console, and type net_graph 3. That'll show you your fps, ping, and in/out bandwidth used.
Why Neural Networks? (Score:5, Insightful)
Re:Why Neural Networks? (Score:4, Insightful)
Well, it's quite obviously because a Support Vector Machine is inherently linear, and to make it nonlinear, you must insert a nonlinear kernel which you need to select by hand.
If you'd read the article, you'd see that they are using a recurrent-feedback neural network; good luck finding a recurrent-feedback nonlinear kernel for an SVM....! You can't just plug in a radial basis function and expect it to work. In this application, they are looking for fast, elastic response to rapidly changing conditions as well as a low tolerance for predictive errors--something an RFNN is ideal for, and that an SVM is absolutely terrible at.
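For reference, the radial basis function kernel mentioned above is just a fixed, hand-chosen similarity measure (the standard Gaussian RBF formula; the gamma value here is an arbitrary illustrative choice):

```python
import math


def rbf_kernel(x, y, gamma=0.5):
    """Gaussian RBF kernel: k(x, y) = exp(-gamma * ||x - y||^2).

    Both the kernel form and gamma must be selected by hand, which is
    exactly the objection raised above about kernelized SVMs.
    """
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)
```

Nothing in the kernel itself adapts over time, which is why it is a poor fit for tracking rapidly changing channel conditions.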
Skeptic (Score:5, Insightful)
It's probably just a selling point, because it sounds like something extremely advanced and "related to artificial intelligence". Usually the "neural network" is just a very simple, possibly linear, adaptive filter, which really contains no more than a few matrix multiplications. Yes, it has some success in approximating things locally, but terms like "learning" are really misused.

After RTFA (the second one), it actually seems that they did try two or three things before, but really I wouldn't "welcome our new intelligent packet sizer overlords" just yet.
Why wireless only (Score:5, Insightful)
Re:What? (Score:3, Insightful)
Not gonna happen. The poster was just using random terms that have nothing to do with this article, trying to sound smart, and is probably laughing as the post gets moderated up.
Everything the poster mentioned, such as Naive Bayes [google.com] and Support Vector Machines [google.com], is used for static tasks, like classification, not for realtime feedback situations. They learn once and predict forever. They don't learn iteratively and keep changing. Follow the Google links I just gave and peruse the first few sites that come up if you're not sure. They are used for things like text classification, handwriting recognition, voice recognition, etc., i.e., "train once use often."
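The distinction drawn above can be sketched in a few lines (a toy illustration, not any particular library's API):

```python
class BatchClassifier:
    """'Train once, use often': parameters are frozen after fit()."""

    def __init__(self):
        self.threshold = None

    def fit(self, samples, labels):
        # Toy learner: a single threshold between the two classes.
        pos = [s for s, l in zip(samples, labels) if l == 1]
        neg = [s for s, l in zip(samples, labels) if l == 0]
        self.threshold = (min(pos) + max(neg)) / 2

    def predict(self, sample):
        # Prediction never changes the model.
        return 1 if sample >= self.threshold else 0


class OnlineEstimator:
    """Keeps adapting with every observation -- what a realtime
    feedback loop over a changing channel actually needs."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.estimate = 0.0

    def observe(self, value):
        # Every new sample nudges the estimate; there is no frozen model.
        self.estimate += self.alpha * (value - self.estimate)
        return self.estimate
```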
EE Times Article (Score:4, Insightful)
This sounds like a great improvement to 802.11x technology...now let's open-source it so we can all benefit!
Re:Skeptic (Score:4, Insightful)
The simplicity of the calculation does not mean it is not a learning algorithm. Real neural networks are quite simple, as each "neuron" is simply a weighted average of the inputs passed through a sigmoid or step function. However, en masse they perform better than most other algorithms at handwriting recognition. They take a training set and operate on it repeatedly, updating their parameters, until some sort of convergence is reached. Their performance on a test set is a measure of how well they have learned. This is a learning algorithm.
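A single "neuron" of the kind described above really is that small. As a minimal illustration (toy gradient-descent update on squared error; the learning rate is an arbitrary choice):

```python
import math


def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))


def neuron(weights, bias, inputs):
    """One neuron: a weighted sum of inputs passed through a sigmoid."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(z)


def train_step(weights, bias, inputs, target, lr=0.5):
    """One gradient-descent update on squared error -- the repeated
    parameter updating described above as 'learning'."""
    out = neuron(weights, bias, inputs)
    # dE/dz for E = 0.5 * (out - target)^2 with a sigmoid activation.
    delta = (out - target) * out * (1 - out)
    new_w = [w - lr * delta * x for w, x in zip(weights, inputs)]
    new_b = bias - lr * delta
    return new_w, new_b
```

Repeating `train_step` over a training set until the error stops shrinking is exactly the convergence loop described above.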
Even linear regression is a learning algorithm. You give it a bunch of training data as input (i.e. x,y pairs), it iterates on that data until it converges, and it is then used to predict new data. There happens to be an analytic solution to the iteration, but that does not make it any less of a learning algorithm.
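The point about linear regression can be shown both ways, as a sketch (1-D least squares; learning rate and step count are arbitrary illustrative choices):

```python
def fit_closed_form(xs, ys):
    """Analytic least-squares fit of y = a*x + b for 1-D data."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx


def fit_gradient_descent(xs, ys, lr=0.05, steps=2000):
    """The same fit reached by iterating on the training data -- the
    'learning' formulation; it converges to the analytic solution."""
    a = b = 0.0
    n = len(xs)
    for _ in range(steps):
        grad_a = 2 / n * sum((a * x + b - y) * x for x, y in zip(xs, ys))
        grad_b = 2 / n * sum((a * x + b - y) for x, y in zip(xs, ys))
        a -= lr * grad_a
        b -= lr * grad_b
    return a, b
```

Both routes land on the same parameters; whether you call one of them "learning" and the other "algebra" is exactly the definitional quibble at issue.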
I think maybe your definition of "learning" is unnecessarily strict. The simplicity of the computation is not what defines this category of algorithms.
Re:Hahaha (Score:3, Insightful)
Wireless net congestion... (Score:3, Insightful)
I think part of the ease of predictability may have a little to do with the kind of protocols used for collision detection/TDMA in congested 802.11 nets. If they are sufficiently simple, a single node could outmaneuver the others.
Some questions...
What is the behavior of this algorithm as the number of enabled clients increases and the bandwidth demand of the clients exceeds the channel capacity? Does it degrade gracefully? Does it unfairly compete with non-enabled clients?
Re:Hahaha (Score:3, Insightful)
Re:Looks sketchy to me (Score:3, Insightful)
And people have been doing this before; the EE Times article mentions that. Apparently no one has either made much progress or made much of a fuss over it before. A quick search for "variable packet length and wireless" turns up quite a lot of results, though. I'm fairly confident that you can find previous research in this area if you look around.
I'm not entirely convinced that Kalman filters would do a good job, though. Or rather, they may need more additions to be efficient in this particular problem space, making the implementation bigger and less desirable to put into, e.g., firmware. It's not unreasonable that they haven't considered it, though. They did seem to try some other systems, but some of those seem a bit too optimistic to be reasonable in this problem space. According to EE Times, they made attempts with database lookup and expert systems, for example; both of those could be "trivially" rejected as not being adaptive and accurate enough.
It's not impossible that there exist methods which are a lot more efficient than an NN for this problem. The NN seems to do a good job, though, so I guess it warrants some more looking into.
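For what it's worth, a scalar Kalman filter by itself is tiny; the question is whether its model fits the channel. A minimal sketch (assuming a random-walk model of channel goodput, which is my choice for illustration, not anything from the article):

```python
class ScalarKalman:
    """1-D Kalman filter tracking a slowly drifting quantity (e.g. channel
    goodput) under a random-walk model: x_k = x_{k-1} + process noise."""

    def __init__(self, q=1.0, r=4.0):
        self.q = q       # process-noise variance (how fast the channel drifts)
        self.r = r       # measurement-noise variance
        self.x = 0.0     # state estimate
        self.p = 1e6     # estimate variance (start very uncertain)

    def update(self, z):
        # Predict: a random walk keeps the mean but grows the variance.
        self.p += self.q
        # Correct: blend the prediction with the new measurement z.
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)
        self.p *= (1 - k)
        return self.x
```

The "additions" that would bloat this for firmware are the parts this sketch omits: a realistic state model for bursty 802.11 channels, tuning of q and r, and handling of non-Gaussian loss events.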