Harnessing Interference For Faster Wireless Data
holy_calamity writes "Inventor of the Quicktime codec Steve Perlman has unveiled a new wireless technology he claims can deliver thousands of times more bandwidth to mobile devices than existing technology. Each user is served by multiple transmitters, which send out waves carefully designed to combine into a data signal only at a device's location. That technique enables every user to be targeted with a signal with the same total bandwidth that would usually be shared between users, says Perlman."
Can still charge (Score:2)
If companies like AT&T and Verizon come up with a good way to provide boundless bandwidth... what do you think are the chances that they'll stop charging for high usage? 0.005% maybe?
Re: (Score:2)
No, but the fixed costs of the infrastructure eventually end up being far exceeded by the revenue coming in, which is his point. Or do you care to point out how 1GB of "overage" data somehow costs AT&T 10 times more than a "regular" GB of data?
Re: (Score:2)
Given that no amount of bandwidth is truly "boundless" or "infinite", some joker will come up with some way to saturate the line 24/7.
Nope, not buying it. The demand for bandwidth is not infinite. Just look at ethernet in a typical business network with a normal file server, traffic to the internet, etc. There is just no scarcity there anymore: Even if you have a "fast" file server, the bottleneck is the disk array, not the 10GbE uplink to the switch. Nothing normal people do even remotely taxes a network with gigabit to the users and 10GbE to the servers.
The fact is that consumption does not increase linearly with capacity because the mo
Re: (Score:2)
AT&T is nowhere near saturating their lines, nor are any of the other major carriers. They cried wolf in Canada and Canada made them actually reveal the data and nowhere in their network were they anywhere near 50% saturation at peak.
Re: (Score:2)
So you're anti-capitalist now? You're a socialist?
Why is it that so many /. posts demand socialist freebies but a capitalist paycheck? It should only be free (and socialist) when it's out of someone else's pocket.
I guess the entitled generation has spoken.
Re: (Score:2)
Because we have to eat until the day comes when we get the socialist freebies? Because it's necessary to counterbalance all the corporations succeeding at socializing costs while privatizing profits?
Because the 'invisible hand of the market' is supposed to drive the retail price down to the marginal cost of production, but it never seems to do so?
Because we figure that with all the subsidies the taxpayers have given the telcos we should own them by now?
Take your pick.
Re: (Score:2)
Because capitalism assumes low barriers to entry in a market (so that if AT&T charges too much, new startups pop up to eat their market share), while the phone industry has extremely high barriers. Because the assumptions behind capitalism don't hold for the phone industry, it would be stupid to insist on running said industry by it, unless one has some kind of ideological commitment to capitalism - and we only need to look at th
Re: (Score:2)
The economic cost is different in the short term and the long term. The financial cost to the company will differ from that, because markets and technology never work properly, but it's probably a good start.
In the short term the economic cost of using extra bandwidth is zero until capacity is reached. So is the financial cost to the company. The infrastructure and organizations have to be maintained whether they are being used or not. When it reaches capacity the economic cost is the cost of failing to deliver th
Re: (Score:2)
Right, it's not as if they inflate their actual costs or anything. I mean, it makes perfect logical sense that a 200MB data plan from AT&T costs $15 while 2GB costs $25. Care to explain to me why the former plan costs nearly 7x more per MB than the latter?
Re: (Score:2)
Care to explain to me why the former plan costs nearly 7x more per MB than the latter?
Because you're making the common mistake of not understanding (or pretending not to understand) that both plans carry administrative costs and overhead that cost a lot more than the bandwidth. You're thinking that the only thing built into those prices is the actual bandwidth. Which is exactly wrong.
admin costs built into the voice plan (Score:2)
If admin overhead was the reason, then graphing cost vs data allocation should result in a straight line that crosses the axis at a value equivalent to the overhead. Instead, the higher data rate plans become progressively cheaper even factoring in some constant amount of overhead. For example, my local telco has the following plans for mobile internet:
$15 250MB
$25 1GB
$60 3GB
$75 Unlimited
Strangely, their 4G iPad plans are totally different:
$20 500MB
$35 5GB
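For what it's worth, the overhead claim is easy to check against those numbers: if price = fixed overhead + per-MB rate, the line through any two plans must predict the third. A quick sketch (plan figures taken from the list above; 1GB and 3GB treated as 1024MB and 3072MB, and the interpretation is mine, not the telco's):

```python
# Metered plans quoted above, as (MB, price in dollars).
plans = [(250, 15), (1024, 25), (3072, 60)]

# If price = overhead + rate * MB, then the line through the smallest
# and largest plans must also pass through the middle one.
(x0, y0), (x1, y1), (x2, y2) = plans
rate = (y2 - y0) / (x2 - x0)     # implied marginal price, $/MB
overhead = y0 - rate * x0        # implied fixed overhead, $
predicted_mid = overhead + rate * x1

print(f"implied overhead ${overhead:.2f}, rate ${rate:.4f}/MB")
print(f"1GB plan predicted at ${predicted_mid:.2f}, actually ${y1}")

for mb, price in plans:
    print(f"{mb:>5} MB plan: {100 * price / mb:.2f} cents/MB")
```

The three points are not collinear (the 1GB plan sits a couple of dollars off the line through the endpoints), so no single fixed-overhead-plus-constant-rate model reproduces the pricing, which is the point being argued above.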
Re: (Score:2)
No, he was assuming that the administrative costs and overhead of the two data plans are identical. Considering the only thing that changes is the bandwidth or data cap, it is a reasonable assumption. Actual operation of the service is automated and doesn't have proportionate cost increases.
For example, a sysadmin that can manage a 100 Mbps Fast Ethernet switch doesn't cost 10x less than one that can manage a GbE switch. Nor is there 10x the work involved.
Yes, there is a different capital cost, but ongoing
Re: (Score:2)
Of course - utilisation might well vary according to plan.
And cost is really only driven by utilisation during peak periods.
More importantly, this cost has to cover the cost of customer acquisition and the infrastructure build.
Here's the reality, though:
Companies charge the fee which they think will maximize their profit.
They don't really know what that is, so they wave their hands a little (actually a lot). They try to charge fees which will cover their investment and generate a profit on each segment whilst bu
Re: (Score:2)
You fail economics. The "chance" is zero, because it's not random. AT&T strives to make as much money as possible, unless competition forces them to do otherwise.
Reliablity? (Score:2)
Doesn't sound very reliable to me - and what if you move slightly?
Re: Move Slightly! (Score:2)
"I'm sorry, I thought I was downloading a song by a CC licensed garage band, but I moved slightly and wound up with Britney Spears. So Sorry!"
Re: (Score:3)
The article says their implementation should be sensitive enough to deal with movement, even driving. Not sure if that claim will hold up but at least they're thinking about it.
Re: (Score:1)
As an RF engineer, when I hear people mention the words "simple" and "radio" in the same sentence, I smile inwardly and anticipate a project that reaches the desperation phase more rapidly than usual, without any design input to allow for tuning the performance of each circuit block.
In short, radio is never as simple as you think it is.
Security? (Score:3)
Does this have implications for enhanced wireless security? A wireless signal that can only be received in a specific location seems like a valuable thing.
Re: (Score:2)
New ? Hardly. (Score:4, Informative)
802.11n already does this, they call it "beam-forming". Cisco features it in their high-end access points, using multiple antennae to send the same payload but with varying phase shift, which recombine at the receiver to produce a stronger coherent wave.
I love how the summary introduces him as the "inventor of the Quicktime codec". Yeah, he provided the RPZA ("road pizza") codec, which is so damn simple it made Bink Video look like fine art, back in the day.
Re: (Score:2)
That's just to help the signal. It still shares bandwidth amongst all users. With this, each user can theoretically get full-spectrum downstream. Also different in that it broadcasts from multiple access points which is hardly trivial.
Re: (Score:3)
Again, Ciscos can do this. I don't care much for the company, but I've a client with more money than brains and they have a HUGE deployment of these things. The WiFi is actually faster than the wired LAN, despite having 300+ clients.
Re: (Score:2)
Bingo. It sounds like he's trying to take a "cloud-sourced" approach to MIMO, with a little meshiness thrown in for good measure.
Plus I think the whole "support for non-stationary receivers is a huge issue" and "needs to avoid interference that's not of its own making" aspects will make this a non-starter. Good luck getting that spectrum, or finding a big enough group of fixed-wireless customers to make this either useful or profitable.
WiMAX and LTE are already doing MIMO and beamforming (perhaps to varying
Re: (Score:2)
May suck for cell phones, but it'll be great for me. I live in an area where I'm just barely out of range of DSL and cable. Nothing like Montana or Alaska... southern Alabama, right between Mobile and Pascagoula. I'm currently using 3G wireless from Verizon, and it pretty much sucks. 1.1MB down, when it works; the rest of the time, SOL.
My stationary USB card in a 3G router would love to sit nice and still for this to work.
Re: (Score:3)
802.11n already does this, they call it "beam-forming". Cisco features it in their high-end access points, using multiple antennae to send the same payload but with varying phase shift, which recombine at the receiver to produce a stronger coherent wave.
Which is a variant on "steerable null" - a multi-antenna hack that lets the antennas at a cell site send out beams configured such that, at each active remote device paired with the site, the signals intended for all the OTHER active receivers cancel out. (
Codecs vs. Containers (Score:1)
A common enough confusion, I suspect. To be pedantic: QuickTime is a media container, not a codec. It's similar to the way that AVI and Ogg aren't codecs. They're containers for stuff like MPEG-4 video (MP4 is, confusingly, also the name of a container format), Vorbis (the codec behind most Ogg audio files), or MP3.
We're Toast (Score:2)
I can see it now. Sprint hires hacker to hack the T-mobile phone network and in a single keystroke explode their customer base; other networks follow moments later. The first and final act in what is later to be known as the Carrier Wars.
Re: (Score:1)
So no cause for alarm here, people... Unless you happen to work for a telco that is...
Perlman says it's not beam-forming (Score:2)
In his June 4 presentation [youtube.com] he states that it's "not beam-forming". He doesn't say much about what it is, though.
His white paper [rearden.com] (PDF) gives a bit more detail, though still not much. It sounds akin to MIMO, but instead of phase-aligning multiple signals to increase the strength (i.e. beam-forming), the antennae are more widely distributed, and complex-formed signals are broadcast from each antenna in careful sync, so that they interfere at each receiver to produce the desired signal.
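The white paper doesn't give the math, but the description maps closely onto standard multi-user zero-forcing precoding, which can be sketched in a few lines. To be clear, this is a generic textbook illustration, not Perlman's actual DIDO algorithm; the channel matrix here is random for demonstration, where a real system would have to measure it continuously:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4  # four distributed antennas serving four receivers

# H[i, j] is the complex channel gain from antenna j to receiver i
# (measured via training signals in practice; random for illustration).
H = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))

# Independent data symbols, one per receiver.
s = rng.normal(size=n) + 1j * rng.normal(size=n)

# Zero-forcing: transmit x = H^-1 s. The waveforms from all antennas
# superpose at receiver i to exactly s[i]; the terms intended for
# everyone else cancel at that location by construction.
x = np.linalg.solve(H, s)

received = H @ x
print("recovered symbols match:", np.allclose(received, s))
```

Solving that linear system is what blows up computationally as the number of users and antennas grows, which is presumably where the "cloud server" comes in.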
Here is a more technical breakdown: (Score:2)
http://www.eedailynews.com/2011/08/wireless-inventors-hype-dido-no-not.html [eedailynews.com]
Re: (Score:1)
Interference from other sources is a killer (Score:5, Informative)
Re: (Score:2)
Nope. Sorry. You're wrong. Electromagnetic waves add very nicely, so that your signal remains there, even if many other signals are simultaneously being transmitted.
The overall idea is fine, in principle. As other people have said, it is 802.11n beam-forming on steroids. If you had 1000 transmitters, and if you could know the exact time delay and attenuation from each of those transmitters to your cell phone, then (indeed) you could make them all add together precisely where your cell phone is.
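The "add together precisely where your cell phone is" part is plain coherent combining: pre-advance each transmitter's phase by its own path delay and the copies arrive in phase at the target, so the amplitude there grows like N, while at any other point the phases are incoherent and the amplitude only grows like sqrt(N). A toy narrowband sketch with made-up geometry (not any real deployment):

```python
import numpy as np

rng = np.random.default_rng(1)
n_tx = 1000                     # transmitters, as in the post above
wavelength = 0.1                # ~3 GHz carrier, in metres
k = 2 * np.pi / wavelength      # wavenumber

tx = rng.uniform(-500, 500, size=(n_tx, 2))   # transmitter positions (m)
target = np.array([10.0, 20.0])               # the cell phone
elsewhere = np.array([-300.0, 150.0])         # some other point

def field_at(p, phases):
    """Sum of unit-amplitude narrowband signals with given TX phases at p."""
    d = np.linalg.norm(tx - p, axis=1)
    return np.sum(np.exp(1j * (phases - k * d)))

# Pre-advance each transmitter's phase by its own path delay to the target.
phases = k * np.linalg.norm(tx - target, axis=1)

at_target = abs(field_at(target, phases))        # coherent: ~n_tx
at_elsewhere = abs(field_at(elsewhere, phases))  # incoherent: ~sqrt(n_tx)
print(f"amplitude at target: {at_target:.0f}, elsewhere: {at_elsewhere:.0f}")
```

Knowing "the exact time delay and attenuation" is exactly the `phases` line; the hard part in practice is measuring it fast enough while the phone moves.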
Re: (Score:2)
'Perlman estimates that the first commercial use of DIDO technology could come as early as the end of next year. But even then, the first deployments are likely to be outside of the United States.
In a DIDO system, a data center on the Internet determines the wireless signals that each transmitter will send based in part on the location and number of other DIDO transmitters in the area. In order for the data center to know what the resulting interference patterns will be, there can't be any other sources of signals outside those generated by the DIDO transmitters.
What that means is that a DIDO system would have to be used on currently unused or completely reclaimed spectrum. So DIDO won't be improving your Wi-Fi or 3G experience, because those parts of the spectrum are already crowded with transmitters. It's more likely to be embraced in the near term in countries that have a lot more unused spectrum than does the United States, Perlman said.' (Source: San Jose Mercury News, 8/3/2011 http://www.mercurynews.com/business/ci_18603178 [mercurynews.com])
So Perlman himself says that, yes, in a closed system, his design works -- but only in that closed system. Given that some stray RF emissions are generated from various things (as we know from the amateur radio folks whenever BPL comes up, for example), there is an interference problem.
[* ...actually, if you're really gonzo, you can adjust all the transmitters to make their signals cancel out exactly at everyone else's cell phones, so long as you have more transmitters than cell phones. In principle. But I don't think anyone is seriously proposing that...]
From the above-mentioned article, Perlman says you do not need more transmitters than cell phones. He suggests a
Re: (Score:1)
What that means is that a DIDO system would have to be used on currently unused or completely reclaimed spectrum.
Sounds to me like this would make it trivial to design cell blockers for, say, theatres, libraries, museums, etc. Or even for someone to make an even lower-powered one to shut up that asshole sitting behind you in a restaurant blabbing at 80dB who just won't shut the fuck up. Obviously they'd be illegal for personal use, but damn they'd be satisfying to use at times...
Re: (Score:2)
I was skeptical at first (Score:3)
How do you come up with signals that not only constructively and destructively interfere in precisely the right spot in precisely the right way to deliver data to a device, but also for those same signals to simultaneously interfere at other points to deliver different data?
Designing radio signals that will interfere with one another in just the right way takes complex mathematics and careful coordination among the different DIDO transmitters. "The computational requirements are very large, but we solved that by using a cloud server," says Perlman.
Oh! The cloud. I thought he might dodge the question with some hand-waving. But he's got the cloud on it.
Where do I sign up? And how do I make sure the guy sitting next to me isn't stealing my signal?
Re: (Score:2)
Oh! The cloud. I thought he might dodge the question with some hand-waving. But he's got the cloud on it.
Yup. Since a Cloud is just a Virtualized Data Center, there's no reason why it will be a problem in a production deployment ... I mean, we can always just throw another Cloud at the problem.
Where do I sign up?
I'm sure they'll let you know.
And how do I make sure the guy sitting next to me isn't stealing my signal?
A large bladed object (through the interloper's device, of course) should remove all worries (and probably give you some nice breathing room at the crowded cafe ... at least till the authorities arrive).
Re: (Score:2)
That's a good move. Clouds are everywhere, the coverage area will be huge!
Re: (Score:2)
How do you come up with signals that not only constructively and destructively interfere in precisely the right spot in precisely the right way to deliver data to a device, but also for those same signals to simultaneously interfere at other points to deliver different data?
For each of the N partners you compute a signal that puts a null on all the N-1 OTHER partners but not on the partner of interest. That can be heard everywhere except near the other partners - and the one you're talking to is somewhere
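One concrete way to build that null: stack the other N-1 partners' channel vectors into a matrix and take any vector from its null space (via SVD below), which requires more antennas than partners being nulled. Again, this is a generic sketch with random channels, not anyone's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(2)
m_antennas, n_partners = 8, 4

# One complex channel vector (one entry per antenna) for each partner.
H = (rng.normal(size=(n_partners, m_antennas))
     + 1j * rng.normal(size=(n_partners, m_antennas)))

target = 0
others = np.delete(H, target, axis=0)  # the N-1 partners to null out

# The right-singular vectors beyond the rank of `others` span its null
# space; any of them is a transmit weight vector w with others @ w = 0,
# i.e. a signal the other partners cannot hear.
_, _, vh = np.linalg.svd(others)
w = vh[others.shape[0]:].conj().T[:, 0]

print("leakage to other partners:", np.round(np.abs(others @ w), 12))
print("signal at target partner: ", np.abs(H[target] @ w))
```

The target's channel is generically not orthogonal to the null space, so the target still hears a usable signal while everyone else sits in a null.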
Re: (Score:2)
No you don't. The cloud already did all the math for you.
Phased Arrays (Score:3)
In other words, phased arrays. It'd be really cool if he could deploy it for wireless communication though. There's a lot of wasted wireless bandwidth to recoup.
Hmmm... (Score:2)
I can't seem to find any reference to it, but I read about a similar system several years ago, where communications for a submarine would be split up into several waves, which only combine into a useful signal at the point the submarine is supposed to be.
Don't know whether it was an idea or something that was actually implemented, though.
What about secondary interference points? (Score:3)
Waves don't only interfere constructively at one point. They interfere constructively at many points, to varying degrees. What happens when two devices are using mirrored interference points?
Instead of targeting specific devices, what about dividing the landscape into many physical regions, using constructive interference to cover an area rather than a single device. It would be like space-division multiplexing.
My biggest concern with this tech is not transmission from towers to individual devices, but rather the return call. What are the computational requirements for a receiver using this technology?
Re: (Score:3, Interesting)
The device can send a signal back which will interfere with other devices, but incoming signals at the antennas can be weighted by the same coefficients (or at least derived from the same) to again cancel all th
Re: (Score:2)
It will use directional transmission, not radial waves.
Though if you have a mesh of transmitters you could use radial waves, I would think, but the complexity as a function of the number of users and nodes would be exponentials within exponentials.
Can't beat Shannon this way. (Score:2)
This is cute, but it won't let you beat information theory limits on signal bandwidth. After all, no matter how these signals are intended to be combined through spatial interference, each antenna is emitting a signal which varies only in time. So the more signals you try to pack into one antenna's output, the more those signals project onto one another ("overlap"), and so the more they get mixed together. From each antenna's perspective, this is just a baroque form of time-division multiplexing (TDMA) --