
[Image: Artemis pCell pWave]

The Artemis pCell: Making Interference Work For You.

It will likely come as a shock to many to find out that Australia leads the world in 4G speeds, edging out many other countries by a very healthy margin. As someone who's a regular user of 4G for both business and pleasure I can attest that the speeds are phenomenal, with many of the CBD areas around Australia giving me 10~20Mbps on a regular basis. However the speeds have noticeably degraded over time; back in the early days it wasn't unheard of to get double that, even on the fringes of reception. The primary factor is an increased user base: as the network becomes more loaded, the bandwidth available to everyone starts to dwindle.

There are two factors at work here, both of which influence the amount of bandwidth a device will be able to use. The primary one is the size of the backhaul pipe on the tower, as that is the hard limit on how much traffic can pass through a particular endpoint. The second, and arguably just as important, factor is the number of devices versus the number of antennas on the base station, as this determines how much of the backhaul capacity can be delivered to a specific device. This is what I believe has been mostly responsible for the reduction in 4G speeds I've experienced, but according to the engineers at Artemis, a new communications startup founded by Steve Perlman (the guy behind the now defunct OnLive), that might not be the case forever.
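
To make those two limits concrete, here's a toy model (all figures are illustrative assumptions, not real network numbers) of the speed a single device ends up with:

```python
# Toy model of the two limits on per-device speed: the tower's backhaul
# pipe and the radio capacity shared between attached devices.
# All figures are illustrative, not real network numbers.

def per_device_mbps(backhaul_mbps, antenna_capacity_mbps, antennas, devices):
    radio_share = (antenna_capacity_mbps * antennas) / devices  # air interface shared by everyone
    backhaul_share = backhaul_mbps / devices                    # hard limit at the tower's pipe
    return min(radio_share, backhaul_share)

# The same tower, lightly loaded vs. after a few years of subscriber growth.
print(per_device_mbps(1000, 150, 3, 10))    # 45.0 Mbps each
print(per_device_mbps(1000, 150, 3, 100))   # 4.5 Mbps each: the decline I've been seeing
```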

Artemis' new system hopes to solve the latter part of the equation not by eliminating signal interference, which is by definition impossible, but by utilizing it to create pCells (personal cells) that are unique to each and every device on the network. According to Perlman this would allow an unlimited number of devices to coexist in the same area and yet still receive the same signal strength and bandwidth as if each were on the network all by itself. Whilst he hasn't divulged exactly how this is done he has revealed enough for us to get a good idea of how it functions, and I have to say it's quite impressive.

So the base stations you see in the picture above are only a small part of the equation; indeed, from what I've read they're not much different to a traditional base station under the hood. The magic comes in the form of the calculations done before the signal is sent out: instead of blindly broadcasting (like current cell towers do) the system uses your location, and that of everyone else connected to the local pCell network, to determine how the signals should be sent out. This manifests as a signal that's coherent only at the location of your handset, giving you the full signal bandwidth regardless of how many other devices are nearby.
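
Here's a minimal sketch of the underlying principle, assuming pCell behaves something like classical phased-array precoding (an assumption; Artemis hasn't published its actual method): each transmitter offsets its carrier phase so the waves arrive in step at one chosen point, and nowhere else in particular.

```python
import numpy as np

# Sketch of location-based coherence via phase alignment. Amplitude
# falloff is ignored to keep the illustration simple; positions and
# frequency are made-up values.

c = 3e8   # speed of light, m/s
f = 1.8e9 # illustrative carrier frequency, Hz
tx = np.array([[0.0, 0.0], [40.0, 0.0], [80.0, 0.0], [120.0, 0.0]])  # transmitter positions, metres

def field_strength(point, target):
    """Combined carrier amplitude at `point` when every transmitter is
    phase-shifted so its signal peaks at `target`."""
    d_point = np.linalg.norm(tx - point, axis=1)
    d_target = np.linalg.norm(tx - target, axis=1)
    phase = 2 * np.pi * f * (d_point - d_target) / c  # residual path-delay phase
    return abs(np.exp(1j * phase).sum())

handset = np.array([60.0, 500.0])
print(field_strength(handset, handset))                  # 4.0: all four carriers add in phase
print(field_strength(np.array([75.0, 500.0]), handset))  # ~2: incoherent just metres away
```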

I did enough communications and signal processing at university to know something like this is possible (indeed it's a similar kind of technology that powers "sound lasers") and could well work in practice. The challenges facing this technology are many, but from a technical standpoint there are two major ones I can see. Firstly it doesn't solve the backhaul bandwidth issue, meaning there's still an upper limit on how much data can pass through a tower regardless of how good the signal is. For a place like Australia this would be easily solved by a full fibre network which, unfortunately, seems to be off the cards currently. The second problem is more nuanced and has to do with the calculations required and their potential impact on the network.

Creating these kinds of signals, ones that are only coherent at a specific location, requires a fair bit of back-end calculation before the signal can be sent out. The more devices in a particular area the more challenging this becomes and the longer those calculations take before the signal can be generated. This has the potential to introduce lag into the network, something that might be somewhat tolerable for data but is intolerable for voice. To their credit Artemis acknowledges this challenge and has stated that their system can currently handle up to 100 devices, so it will be very interesting to see if it can scale out like they believe it can.
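
For a feel of how that cost grows, here's a rough sketch assuming the back end does something like zero-forcing precoding (again my assumption, since Artemis hasn't published its algorithm): serving N handsets from N cooperating antennas means solving an N-by-N linear system every transmission slot, a cost that grows roughly with the cube of N.

```python
import time
import numpy as np

# Rough scaling illustration, assuming a zero-forcing style precoder
# (an assumption; Artemis hasn't published its algorithm).

for n in (10, 100, 1000):
    # Channel matrix: how every antenna's signal arrives at every handset.
    H = np.random.randn(n, n) + 1j * np.random.randn(n, n)
    s = np.random.randn(n) + 1j * np.random.randn(n)  # symbols meant for each handset
    t0 = time.perf_counter()
    x = np.linalg.solve(H, s)  # transmit vector so that H @ x delivers s cleanly
    print(f"{n:5d} devices: {(time.perf_counter() - t0) * 1e3:8.2f} ms per slot")
```

Even if the real system is smarter than a dense solve, the trend is the point: every extra device makes the precoding problem bigger for everyone sharing the area.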

Of course this all hinges on the incumbent cellular providers getting on board with the technology, something a few have already said they're aware of but haven't gone much further than that. If it works as advertised it's definitely a disruptive technology, one that I believe should be adopted everywhere, but large companies tend to shy away from things like this, which could strongly hamper adoption. Still, this tech could have wide-reaching applications outside the mobile arena, as things like municipal wireless could also use it to their advantage. Whether it will see application there, or anywhere for that matter, will be something to watch out for.

 


Oh The Things I Could Do If I Only Had the Bandwidth.

The Internet situation I have at home is what I'd call workable but far from ideal. I'm an ADSL2+ subscriber, a technology that will give you speeds of up to 25Mbps should you be really close to the exchange, on good copper and (this is key) make the appropriate sacrifices to your last mile providers. Whilst my line of sight distance to the exchange promises speeds in the 15Mbps range I'm lucky to see about 40% of that, with my sync speed usually hovering around the 4~5Mbps range. For a lot of things this is quite usable, indeed as someone who had dial-up for most of his life these speeds are still something I'm thankful for, but it's becoming increasingly obvious that my reach far exceeds my grasp, something which, as a technology centric person, is fast becoming an untenable position.
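
For context, here's a ballpark ADSL2+ sync-rate-versus-distance curve (rough figures of the kind you see in typical attenuation charts, not my ISP's data); since the actual cable route is invariably much longer than the line-of-sight distance, the shortfall isn't surprising:

```python
import numpy as np

# Illustrative ADSL2+ sync rate vs. copper loop length. These are
# ballpark figures, not measurements of any real line.
km = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
mbps = np.array([24.0, 17.0, 12.0, 8.0, 5.0, 3.0])

def adsl2_estimate(distance_km):
    return float(np.interp(distance_km, km, mbps))

print(adsl2_estimate(1.5))  # ~14.5 Mbps for a ~1.5 km straight-line estimate
print(adsl2_estimate(4.0))  # ~5 Mbps once the real cable route is accounted for
```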

[Image: Uploading]

Honestly I don't think about it too much as it's not a recent realisation and, since the difference between the best and worst speeds I've had wasn't that great in retrospect, I've developed a lot of habits to cope with it. Mostly that means running things overnight or at other times when I wouldn't be using the Internet anyway, but not all tasks fit nicely into that solution. Indeed last night, when I wanted to add a video I'd recorded to my post, one that was only ~180MB in size, I knew there was going to be a pretty long delay in getting the post online. The total upload time ended up being around 30 minutes, which is just enough time for me to get distracted by other things and completely forget what I was doing until later that night.
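
The back-of-the-envelope arithmetic bears that out:

```python
# 180 MB uploaded in roughly 30 minutes implies:
megabits = 180 * 8         # file size in megabits
seconds = 30 * 60
print(megabits / seconds)  # ~0.8 Mbps: about right for ADSL2+'s ~1 Mbps uplink
```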

Sure it's not an amazing example of why I need faster Internet but it does highlight the issue. The video wasn't particularly large nor super high resolution (720p, 60fps), it was produced on technology that's over 2 years old and uploaded to a service that's been around for 7 years. The bottleneck in that equation is the connection they all share from my home network, something that hasn't changed much in the decade I've been a broadband Internet user.

For me it's even worse when I run up against the limitations of my paltry connection for things like services I'd like to host myself. In its infancy this blog was hosted on my little server at home, but it quickly became apparent that little things like pictures were simply untenable because they'd take forever to load even if I shrunk them down to near unusable sizes. It became even worse when I started looking into using the point to point VPN feature in Azure to connect a small home environment to the virtual machines I'm running in the cloud, as my tiny connection was simply not enough to handle the kind of traffic it would produce. That might not sound like a big deal, but for any startup in Australia thinking about doing something similar it kills the idea of using the service in that fashion, which puts a lot of pressure on their remaining runway.

It's reasons like this which keep me highly skeptical of the Liberals' plan for the NBN, as the speeds they're aspiring to aren't that dissimilar to what I'm supposed to be getting now. Indeed they can't even really guarantee those speeds thanks to their reliance on the woefully inadequate copper network for the last run in their FTTN plan. Canberra residents will be able to tell you how much of a folly that idea is after the debacle that was TransACT (recently bought for $60 million and then its infrastructure sold for $9 million), which utterly failed to deliver on its promises even when it deployed its own copper infrastructure.

It also doesn't help that their leader thinks 25Mbps is more than enough for Australian residents which, if true, would mean that ADSL2+ would be enough for everyone, including businesses. We IT admins have known that this isn't the case for a while, especially considering how rare it is to actually get those speeds, and the Liberals' NBN plan's reliance on the primary limiting factor (Telstra's copper network) effectively ensures that this will continue for the foreseeable future.

All those points pale in comparison to the one key factor: we will need to go full fibre eventually.

The copper we have deployed in Australia has a hard upper limit on the amount of bandwidth it can carry, one that we're already running up against today. It can be improved through remediation, by installing thicker cables, but that's a pretty expensive endeavour, especially when you take into account the additional infrastructure required to support the faster speeds. Since there's no plan to do such remediation on the scale required (either by Telstra or as part of the Liberals' NBN plan) these limitations will remain in place. Fibre on the other hand doesn't suffer from the same issues, with new cables able to carry several orders of magnitude more bandwidth with today's technology alone. Deploying it isn't cheap, as we already know, but it will pay for itself well before it reaches the end of its useful life.
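
That hard limit is ultimately Shannon's: capacity is bandwidth times log2(1 + signal-to-noise ratio), and a long copper pair only has a couple of MHz of usable bandwidth while a single fibre wavelength band carries gigahertz. The figures below are illustrative assumptions, not measurements of any real line:

```python
import math

# Shannon's limit: C = B * log2(1 + S/N). With bandwidth in MHz the
# result is in Mbps. Both examples use made-up but plausible figures.

def shannon_mbps(bandwidth_mhz, snr_db):
    snr = 10 ** (snr_db / 10)
    return bandwidth_mhz * math.log2(1 + snr)

print(shannon_mbps(2.2, 30))   # long copper loop: ~22 Mbps ceiling
print(shannon_mbps(4000, 20))  # one fibre wavelength band: ~26,000 Mbps
```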

My whinging is slightly moot because I'll probably be one of the lucky ones to have fibre rolled out to my neighbourhood before the election, but I do feel the NBN's effectiveness will be drastically decreased if it's not ubiquitous. It's one of the few multi-term policies that will have real, tangible benefits for all Australians, and messing with it will turn it from a grand project into a pointless exercise. I hope the Liberals' policy is really just hot air to placate their base, because otherwise the Internet future of Australia looks incredibly dim, and that's not something that I, or any user of technology, want for this country.

Is Tethered Internet Usage So Different?

I remember getting my first ever phone with a data plan. It was 3 years ago and I remember looking through nearly every carrier's offerings to see where I could get the best deal. I wasn't going to get a contract since I change my phone at least once a year (thank you FBT exemption) and I was going to buy the handset outright, so many of the bundle deals going at the time weren't available to me. I eventually settled on 3 mobile as they had the best of both worlds in terms of plan cost and data, totalling a mere $40/month for $150 worth of calls and 1GB of data. Still, when I talked to them about how usage was calculated I seemed to hit a nerve over certain use cases.

Now I'm not a big user of mobile data despite my daily consumption of web services on my mobile devices, usually averaging about 200MB/month. Still, there have been times when I've really needed the extra capacity, like when I'm away and need an Internet connection for my laptop. Of course tethering the two devices together doesn't take much effort at all, my first phone only needed a driver for it to work, and as far as I could tell the requests would look like they were coming directly from my phone. However the sales representatives told me in no uncertain terms that I'd have to get a separate data plan if I wanted to tether my handset, or if I dared to plug my SIM card into a 3G modem.
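
There's a good technical reason the requests looked like they came directly from my phone: the handset NATs the tethered traffic, rewriting each outbound packet's source address to its own before it ever reaches the carrier. A minimal sketch of that rewrite, with made-up addresses:

```python
# Why tethered traffic looks like it comes from the phone: the handset
# performs NAT (masquerading) on the laptop's packets. All addresses
# here are made up for illustration.

PHONE_WAN_IP = "10.64.22.7"  # address the carrier assigned the phone

def masquerade(packet):
    """What the phone's NAT does to each outbound tethered packet."""
    natted = dict(packet)
    natted["src"] = PHONE_WAN_IP  # the carrier now sees the phone as the sender
    return natted

laptop_packet = {"src": "192.168.42.100", "dst": "203.0.113.9", "payload": b"GET /"}
print(masquerade(laptop_packet))
```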

Of course upon testing these restrictions I found them to be patently false.

Now it could've just been misinformed salespeople who got mixed up when I told them what I was planning to do with my new data-enabled phone, but the idea that tethered Internet usage is somehow different to normal Internet usage wasn't new to me. In the USA pretty much every carrier will charge you a premium on top of whatever plan you've got if you want to tether to another device, usually providing a special application that enables the functionality. Of course this has spurred people to develop applications that circumvent these restrictions on all the major smartphone platforms (iOS users will have to jailbreak, unfortunately) and the carriers aren't able to tell the difference. But that hasn't stopped them from taking action against those who would thwart their juicy revenue streams.

Most recently it seems that the carriers have been putting pressure on Google to remove tethering applications from the Android app store:

It seems a few American carriers have started working with Google to disable access to tethering apps in the Android Market in recent weeks, ostensibly because they make it easier for users to circumvent the official tethering capabilities offered on many recent smartphones — capabilities that carry a plan surcharge. Sure, it’s a shame that they’re doing it, but from Verizon’s perspective, it’s all about protecting revenue — business as usual. It’s Google’s role in this soap opera that’s a cause for greater concern.

Whilst this is another unfortunate sign that no matter how hard Google tries to be "open" it will still be at the mercy of the carriers, the banning of tethering apps sets a worrying precedent for carriers looking to control the Android platform. Sure they already had a pretty good level of control, since they all release their own custom versions of Android for handsets on their networks, but now they're also exerting pressure over the one part that was ostensibly never meant to be influenced by them. I can understand that they're just trying to protect their bottom line, but the question has to be asked: is tethering really that big of a deal for them?

It could be that my view is skewed by the Australian way of doing things, where data caps are the norm and the term "unlimited" is either a scam or comes at dial-up speeds. Still, from what I've seen of the USA market many wireless data plans come with caps anyway, so the bandwidth argument is out the window. Tethering to a device requires no intervention from the carrier, and there are free applications available on nearly every platform that provide the required functionality. In essence the carriers are charging you for a feature that should be free and are now strong-arming Google into protecting their bottom lines.

I'm thankful that this isn't the norm here in Australia yet, but we have an unhealthy habit of imitating our friends in the USA, so you can see why this kind of behaviour concerns me. I'm also a firm believer in the idea that once I've bought the hardware it's mine to do with as I please, and tethering falls under that realm. Tethering is one of those things that really shouldn't be an issue, and Google's capitulation to the carriers just shows how difficult it is to operate in the mobile space, especially if you're striving to make it as open as you possibly can.