Wave interference is a relatively simple scientific concept that can be difficult to grasp at first. Many are introduced to the idea in high school or college physics, usually via something like the double slit experiment. Whilst this is a great demonstration of the wave properties of light, it’s not exactly obvious how the constructive and destructive interference actually works. Something like the following video, I feel, gives a far better visual impression of what wave interference and superposition do in the real world.
The really cool demonstration comes in at about 55 seconds, where they produce a concentric wave singularity, or what they call “The Spike”. Essentially they time the waves so that when they meet in the middle they all interfere constructively at just the right point. This results in the rapid formation of a cavity in the middle, which is then slammed shut as the waves return to their peak. The resulting geyser shoots upward for far longer than you’d expect it to, which is a great demonstration of the power of constructive interference with waves.
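The physics behind “The Spike” is just superposition: the displacement at any point is the sum of every wave passing through it. Here’s a minimal sketch of that addition (the amplitudes and phases are arbitrary toy values, nothing to do with FloWave’s actual wave maker settings):

```python
import math

def wave_height(amplitudes, phases, t):
    """Superposition: the total displacement at a point is just the sum of
    each component wave's displacement at the same moment."""
    return sum(a * math.cos(2 * math.pi * t + p)
               for a, p in zip(amplitudes, phases))

# Eight identical waves arriving exactly in phase (constructive interference):
in_phase = wave_height([1.0] * 8, [0.0] * 8, t=0.0)               # -> 8.0

# The same waves with alternating half-cycle offsets (destructive):
out_of_phase = wave_height([1.0] * 8, [0.0, math.pi] * 4, t=0.0)  # -> 0.0
```

Eight unit waves meeting in phase give eight times the height of any one of them, which is why the spike goes so much higher than the waves that created it.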
FloWave itself was constructed to replicate currents and waves seen in the ocean. This allows companies and researchers to test out their technologies in a controlled environment before they get deployed offshore, potentially saving costly repairs and re-engineering. That means it’s mostly used to test how things respond to various kinds of waves and currents, rather than generating awesome wave spikes that shoot water several stories into the air. Still, I’d love something like this on a smaller scale to do my own demonstrations of wave interference.
Fiber is the future of all communications; that’s a fact any technologist will be able to tell you. Whilst copper is still the mainstay for the majority, its lifetime is limited as optics are fast approaching the point where they’re feasible for everything. However even fiber has its limits, ones that some feel we’ll hit sooner rather than later, which could cause severe issues for the Internet’s future. New research coming out of the University of California, San Diego, though, paves the way for boosting our fiber networks’ bandwidth significantly.
Today’s fiber networks are made up of long runs of fiber optic cable interspersed with devices called repeaters or regenerators. Essentially these are responsible for boosting the optical signal, which degrades as it travels down the fiber. The problem with these devices is that they’re expensive, add latency and are power hungry, attributes that aren’t exactly desirable. These problems stem from a physical limitation of fiber networks which puts an upper limit on the amount of power you can send down an optical cable. Past a certain point, the more power you put down a fiber the more interference you generate, meaning there’s only so much you can pump into a cable before you’re doing more harm than good. The new research however proposes a novel way to deal with this: interfere with the signal before it’s sent.
The problem with the interference generated by increasing the power of the signal is that it’s unpredictable, meaning there’s really no good way to combat it. The researchers however figured out a way of conditioning the signal before it’s transmitted which allows the interference to become predictable. Then at the receiving end they use what they’re calling “frequency combs” to reverse the interference, pulling a useful signal out of the noise. In lab tests they were able to send a signal over 12,000km without the use of a repeater, an absolutely astonishing distance. Using such technology could drastically improve the efficiency of our current dark fiber networks, which would go a long way towards avoiding the bandwidth crunch.
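The general idea of making distortion predictable and then undoing it can be sketched with a toy model. Here the fiber’s nonlinearity is reduced to a single deterministic phase rotation proportional to symbol power (a crude stand-in for self-phase modulation; the gamma value and putting the inversion at the transmitter are my own simplifications, not the researchers’ frequency-comb method):

```python
import cmath

def nonlinear_channel(symbol, gamma=0.3):
    """Toy fiber model: the nonlinearity rotates each symbol's phase in
    proportion to its own power."""
    power = abs(symbol) ** 2
    return symbol * cmath.exp(1j * gamma * power)

def precondition(symbol, gamma=0.3):
    """Transmit-side conditioning: apply the opposite rotation up front.
    Phase rotation doesn't change the symbol's power, so the channel adds
    back exactly the rotation we removed."""
    power = abs(symbol) ** 2
    return symbol * cmath.exp(-1j * gamma * power)

sent = 1 + 1j
received_raw = nonlinear_channel(sent)                 # distorted
received_cond = nonlinear_channel(precondition(sent))  # distortion cancelled
```

The point of the sketch is that once the distortion is a known, deterministic function of the signal, it can be inverted; unpredictable interference can’t be.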
This technology is still a while off from widespread use as, whilst it shows a lot of promise, the application within the lab falls short of a practical implementation. Current optical fibers carry around 32 different signals whereas the system the researchers developed can currently only handle 5. Ramping up the number of channels it can support is a non-trivial task, but at least it’s an engineering challenge and not a theoretical one.
It will likely come as a shock to many to find out that Australia leads the world in terms of 4G speeds, edging out many other countries by a very healthy margin. As someone who’s a regular user of 4G for both business and pleasure I can attest to the fact that the speeds are phenomenal, with many of the CBD areas around Australia giving me 10 to 20Mbps on a regular basis. However the speeds have noticeably degraded over time; back in the early days it wasn’t unheard of to get double those speeds, even if you were on the fringes of reception. The primary factor in this is an increased user base: as the network becomes more loaded, the bandwidth available to everyone starts to head south.
There are two factors at work here, both of which influence the amount of bandwidth a device will be able to use. The primary one is the size of the backhaul pipe on the tower, as that is the hard limit on how much traffic can pass through a particular endpoint. The second, and arguably just as important, factor is the number of devices versus the number of antennas on the base station, as this determines how much of the backhaul speed can be delivered to a specific device. This is what I believe has been mostly responsible for the reduction in 4G speeds I’ve experienced, but according to the engineers at Artemis, a new communications start-up founded by Steve Perlman (the guy behind the now defunct OnLive), that might not be the case forever.
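Those two limits can be captured in a toy formula: a device’s speed is capped by its share of the radio side and by its share of the backhaul, whichever bites first. All the numbers below are invented for illustration:

```python
def per_device_mbps(backhaul_mbps, antenna_rate_mbps, antennas, devices):
    """Toy model: a device can never beat a single antenna's rate, its share
    of the combined radio capacity, or its share of the tower's backhaul."""
    radio_limit = min(antenna_rate_mbps,
                      antenna_rate_mbps * antennas / devices)
    backhaul_limit = backhaul_mbps / devices
    return min(radio_limit, backhaul_limit)

# Quiet cell: the radio link is the bottleneck, not the backhaul.
quiet = per_device_mbps(400, 150, 4, 2)    # -> 150.0
# Busy cell: the shared backhaul becomes the bottleneck.
busy = per_device_mbps(400, 150, 4, 50)    # -> 8.0
```

Either way, adding devices divides the pie, which matches the speed decline I’ve seen as the networks filled up.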
Artemis’ new system hopes to solve the latter part of the equation not by eliminating signal interference, which is by definition impossible, but by utilizing it to create pCells (personal cells) that are unique to each and every device on the network. According to Perlman this would allow an unlimited number of devices to coexist in the same area yet still receive the same amount of signal and bandwidth as if each were on the network all by itself. Whilst he hasn’t divulged exactly how this is done yet, he has revealed enough for us to get a good idea of how it functions and I have to say it’s quite impressive.
The base stations you see in the above picture are only a small part of the equation; indeed from what I’ve read they’re not much different to a traditional base station under the hood. The magic comes in the form of the calculations done prior to the signal being sent out: instead of blindly broadcasting (like current cell towers do) they use the location of your handset, and of everyone else connected to the local pCell network, to determine how the signals are sent out. This manifests as a signal that’s coherent only at the location of your handset, giving you the full amount of signal bandwidth regardless of how many other devices are nearby.
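You can see how a signal can be made coherent at only one spot with a small phased-transmission sketch. Each transmitter offsets its carrier so that, after travelling its particular distance, every wave arrives at the target in phase; away from the target the phases no longer line up. The tower and handset coordinates here are invented, and this ignores everything Artemis layers on top of the basic idea:

```python
import cmath
import math

def received_amplitude(tx_positions, phases, point, wavelength=1.0):
    """Sum the carrier contributions from every transmitter at one point,
    including the phase each wave picks up over its travel distance."""
    total = 0j
    for (x, y), phi in zip(tx_positions, phases):
        d = math.hypot(point[0] - x, point[1] - y)
        total += cmath.exp(1j * (2 * math.pi * d / wavelength + phi))
    return abs(total)

def steer(tx_positions, target, wavelength=1.0):
    """Pick each transmitter's phase so its wave arrives at the target with
    zero net phase: all of them add constructively there."""
    return [-2 * math.pi * math.hypot(target[0] - x, target[1] - y) / wavelength
            for x, y in tx_positions]

towers = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
handset = (3.7, 6.2)
phases = steer(towers, handset)

at_handset = received_amplitude(towers, phases, handset)    # -> 4.0, coherent
elsewhere = received_amplitude(towers, phases, (8.1, 2.4))  # much weaker
```

At the handset all four towers add up to the full amplitude; at an arbitrary other point the phases are effectively scrambled, which is the seed of the pCell idea.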
I did enough communications and signal processing at university to know something like this is possible (indeed it’s a similar kind of technology that powers “sound lasers”) and could well work in practice. The challenges facing this technology are many but from a technical standpoint there are two major ones I can see. Firstly it doesn’t solve the backhaul bandwidth issue, meaning there’s still an upper limit on how much data can be passed through a tower, regardless of how good the signal is. For a place like Australia this would be easily solved by implementing a full fibre network which, unfortunately, seems to be off the cards currently. The second problem is more nuanced and has to do with the calculations required and the potential impacts they might have on the network.
Creating these kinds of signals, ones that are only coherent at a specific location, requires a fair bit of back-end calculation before the signal can be sent out. The more devices you have in any particular area the more challenging this becomes and the longer the calculations take. This has the potential to introduce lag into the network, something that might be somewhat tolerable from a data perspective but is intolerable when it comes to voice transmission. To their credit Artemis acknowledges this challenge and has stated that their system can currently handle up to 100 devices, so it will be very interesting to see if it can scale out like they believe it can.
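To get a feel for why the calculations grow with device count, here’s a toy zero-forcing precoder for just two devices: treat the radio paths as a matrix H of channel gains and solve H·x = s so each handset receives only its own symbol. I’m not claiming this is Artemis’ actual algorithm; it’s the textbook flavour of precoding, and solving such a system naively gets roughly cubically more expensive as devices are added. All channel gains are invented:

```python
def zero_force_2x2(h, targets):
    """Zero-forcing precoding for two devices: invert the 2x2 channel
    matrix H so that H @ x equals the intended per-device symbols."""
    (a, b), (c, d) = h
    det = a * d - b * c
    s1, s2 = targets
    # x = H^{-1} @ targets
    return ((d * s1 - b * s2) / det, (-c * s1 + a * s2) / det)

# Invented channel gains from 2 towers to 2 handsets:
H = ((1.0, 0.4), (0.3, 0.9))
x = zero_force_2x2(H, (1.0, 1.0))   # what each tower should transmit

# Check: each handset receives exactly its own symbol, no cross-talk.
rx1 = H[0][0] * x[0] + H[0][1] * x[1]   # -> 1.0
rx2 = H[1][0] * x[0] + H[1][1] * x[1]   # -> 1.0
```

With two devices this is trivial; with hundreds of devices and antennas it’s a large linear system that has to be re-solved as people move, which is where the latency concern comes from.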
Of course this all hinges on the incumbent cellular providers getting on board with this technology, something a few have already said they’re aware of but haven’t gone much further than that. If it works as advertised then it’s definitely a disruptive technology, one that I believe should be adopted everywhere, but large companies tend to shy away from things like this, which could strongly hamper adoption. Still this tech could have wide-reaching applications outside the mobile arena, as things like municipal wireless could also use it to their advantage. Whether it will see application there, or anywhere for that matter, will be something to watch out for.
I’m not exactly a corporate jet setter (although the past couple of months would attest otherwise) but I’ve seen the inside of a plane enough times to know the law of the land. I spend the majority of my time buried in a book, right now it’s the Wheel of Time series, as I don’t really get a chance to read for pleasure at any other time. For long haul flights I’ll usually have my laptop in tow as well, although lately I’ve left that in the checked baggage, mostly because the in-flight entertainment systems have gotten a lot better. Still I’ve had the pleasure of being on some flights that offer in-flight wireless and, whilst its usability was on the low side, it was an apt demonstration of how far aviation technology has come, and where it’s heading.
Rewind a decade or so and the idea of allowing radio transmitting devices to operate on flights was akin to wanting to make the plane crash. The stance of the various aviation bodies was easy to understand however: they were simply unable to test all of the available transmitting devices with their aircraft to ensure that no interference was possible and thus had to ban them all outright. Their relenting on wireless networking was due in large part to the rigorous specifications of 802.11a/g/n, which include transmission power limits as well as frequencies well outside any that aircraft use for essential functions. Of course not every device strictly adheres to the spec but there’s little to be gained from juicing up the power levels on your wireless, especially if it’s running on a battery.
However the use of these systems is usually restricted to the period between take off and the plane’s final approach for landing. Whilst I’ve heard a lot of people say this was due to interference, I thought the reasoning was far simpler: it was to keep you aware during the most risky points of flight, take off and landing. Of course my theory falls apart in the face of reality as I’ve not once been told to put my book away during these times, even during the safety demonstration, but have been told on numerous occasions that my laptop should be put away until I’m told it’s allowed again.
Recent announcements from the Federal Aviation Administration in the USA however show that the rules against electronic devices are slowly being relaxed, with electronic devices now permitted during take off and landing. They’re still limiting the use of wireless to the in-flight system (although whether the 10,000ft restriction is still in effect isn’t something I could ascertain) and the outright ban on all other transmitting devices remains in effect. It might surprise you to find out that I actually agree with the latter restriction, but not for the sake of the airlines; it’s for those poor cell towers.
You see when you’re on the ground your mobile phone has a finite transmission range that’s limited primarily by the numerous things that get in the signal’s way as it travels between the cell tower and you. As a consequence you’re likely only ever hitting a handful of different towers, something they deal with easily through hand-offs between each other. However when you’re in a plane those obstructions are no longer in the way and suddenly you’re effectively able to hit dozens of towers all at the same time. This, in effect, is like a small denial of service attack, and the towers are simply not designed to handle it. The best way to combat this would be to use some form of picocell on the plane itself, something which I had heard was in development a long time ago but can’t find any links to support now. Still, in the short term this is unlikely to change unless the telecommunications companies think it’s worth their while to support and the FAA agrees to change the rules.
Personally though I’m far more interested in technology that makes those in-flight wireless systems more usable, like the new Ground to Orbit system that GoGo wireless has been testing. Whilst the current 10Mbps of bandwidth might be enough for the odd Tweet or Facebook post, it’s rarely usable for anything else, especially when there are a few people online at the same time. Of course some also take solace in the fact that they’re incommunicado for the duration of the flight, something I don’t mind much myself.