

The Liberals’ NBN Plan is Just Plain Bad.

Last week I regaled you with a story of the inconsistent nature of Australia’s broadband and how the current NBN was going to solve that by replacing the ageing copper network with optical fibre. However whilst the fundamental works to deliver it are underway it is still in its nascent stages and could easily be undone by a government that didn’t agree with its end goals. With the election looking more and more like it’ll swing in the Coalition’s favour there is a real risk that the NBN we end up with won’t be the one we were promised at the start, although the lack of a concrete plan has left me biting my tongue whilst I awaited the proposal.

Today Malcolm Turnbull announced his NBN plan, and it’s not good at all.


Instead of rolling out fibre to 93% of Australians and covering the rest with satellite and wireless connections, the Liberals’ NBN will only roll fibre out to 22% of premises; the remaining 71% will be covered by FTTN. According to Turnbull’s estimates this will enable all Australians to have broadband speeds of up to 25Mbps by 2016, with a planned upgrade to up to 100Mbps by 2019. The total cost for this plan would be around $29 billion, about $15 billion less than the current planned total expenditure required for Labor’s FTTP NBN. If you’re of the mind that the NBN was going to be a waste of money that’d take too long to implement then these numbers would look great to you, but unfortunately they’re anything but.

For starters the promise of speeds of up to 25Mbps isn’t much of an upgrade over what’s available with the current ADSL2+ infrastructure. Indeed most of the places they’re looking to cover with this can already get such services, so rigging fibre up to their nodes will likely not net them much benefit. Predominantly this is because the last mile will still be on the copper network, which is the major limiting factor in delivering higher speeds to residential areas. They might be able to roll out FTTN within that time frame but it’s highly unlikely that you’ll see any dramatic speed increases, especially if you’re on an old line.
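To put some rough numbers on that attenuation problem, here’s a quick sketch of how attainable DSL speeds fall away with the length of the copper loop. The figures are illustrative ballpark values of the kind commonly quoted for ADSL2+, not engineering data from any real network:

# Rough illustration of why the copper last mile caps speeds. These
# (distance, speed) pairs are ballpark ADSL2+ figures for illustration
# only, not measurements from any real network.
TYPICAL_ADSL2_SPEEDS = [
    (0.5, 20.0),  # km of copper -> approx attainable Mbps
    (1.0, 15.0),
    (2.0, 10.0),
    (3.0, 6.0),
    (4.0, 3.0),
    (5.0, 1.5),
]

def estimated_speed_mbps(loop_km):
    """Linearly interpolate an approximate sync speed for a copper loop."""
    points = TYPICAL_ADSL2_SPEEDS
    if loop_km <= points[0][0]:
        return points[0][1]
    for (d1, s1), (d2, s2) in zip(points, points[1:]):
        if loop_km <= d2:
            # Linear interpolation between the two surrounding points.
            return s1 + (s2 - s1) * (loop_km - d1) / (d2 - d1)
    return points[-1][1]

# FTTN shortens the copper run but never eliminates it entirely, and an
# old or degraded line will do worse than any of these numbers.
for km in (4.0, 1.0, 0.5):
    print(f"{km:.1f} km of copper -> ~{estimated_speed_mbps(km):.0f} Mbps")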

Under the Liberals’ plan you could, however, pay for the last mile run to your house which, going by estimates from other countries that have done similar, could range anywhere from $2,500 to $5,000. Now I know a lot of people who would pay for that, indeed I would probably be among them, but I’d much rather it be rolled out to everyone indiscriminately, otherwise we end up in a worse situation than we have now. The idea behind the NBN was ubiquitous access to high speed Internet no matter where you are in Australia, so forcing users to pay for the privilege kind of defeats its whole purpose.

Probably the biggest issue for me though is how the Coalition plans to get to 100Mbps without running FTTP. The technologies that Turnbull has talked about in the past just won’t be able to deliver the speeds he’s talking about. Realistically the only way to reliably attain those speeds across Australia would be with an FTTP network, yet upgrading an FTTN solution will cost somewhere on the order of $21 billion. All added up that makes the Liberals’ NBN around $6 billion more than the current Labor one, so it’s little wonder that they’ve been trying to talk up the cost in the past week or so.
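For transparency, here’s the arithmetic on those figures in a few lines of Python. The dollar amounts are the round numbers quoted in this post, not official costings:

# Back-of-the-envelope comparison using the round figures quoted in
# this post (all in billions of dollars; none are official costings).
LABOR_FTTP_TOTAL = 44      # the $29bn plan was quoted as ~$15bn less than this
COALITION_FTTN = 29
FTTN_TO_FTTP_UPGRADE = 21  # rough upgrade estimate cited above

coalition_total = COALITION_FTTN + FTTN_TO_FTTP_UPGRADE
extra = coalition_total - LABOR_FTTP_TOTAL
print(f"FTTN now + FTTP later: ${coalition_total}bn "
      f"(${extra}bn more than building FTTP once)")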

You can have a look at their policy documents here, but be warned: they’re thin on facts and play fast and loose with data. I’d do a step-by-step takedown of all the crazy in there but there are people far more qualified than me to do that, and I’ll be sure to tweet links when they do.

Suffice it to say the Liberals’ policy announcement has done nothing but confirm our worst fears about the party’s utter lack of understanding of why the FTTP NBN was a good thing for Australia. Their plan might be cheaper but it will fail to deliver the speeds they say it will, and will thus provide far less value than the same dollars spent on an FTTP solution. I can only hope come election time we end up with a hung parliament again, because the independents will guarantee that nobody fucks with the FTTP NBN.


Why Australia Needs The FTTP NBN.

The state of broadband Internet in Australia is one of incredible inconsistency. I lived without it for the better part of my youth, stuck behind a dial up connection because my local exchange simply didn’t have the required number of people interested in getting broadband to warrant any telco installing the required infrastructure there. I was elated when we were provided a directional wireless connection that gave me speeds comparable to those of my city dwelling friends, but to call it reliable would be kind, as strong winds would often see it disconnect at the most inconvenient of times.


The situation didn’t improve much when I moved into the city though, as whilst I was pretty much guaranteed ADSL wherever I lived, the speed at which it was delivered varied drastically. My first home, in an affluent and established suburb, usually capped out at well below half of the connection’s maximum speed. The second home fared much better despite being about as far away from the closest exchange as the first. My current residence is on par with the first, even with the technological jump from ADSL to ADSL2+. As to the reason behind this I cannot be completely sure, but the ageing copper infrastructure is the likely culprit.

I say this because my parents, who still live in the house I grew up in, were able to acquire an ADSL2+ connection and have been on it for a couple of years. They’re not big Internet users and I’d never really had much need to use it when visiting, but downloading a file over their connection last week revealed that their speeds were almost triple mine, despite their long line of sight distance to their exchange. Their connection is likely newer than most in Canberra thanks to their rural neighbourhood being a somewhat recent development (around 30 years old). You can then imagine my frustration with the current copper infrastructure, as it simply cannot be relied upon to provide consistent speeds, even in places where you’d expect it to be better.

There’s a solution on the horizon however in the form of the National Broadband Network. The current plan of rolling out fibre to 93% of Australian households (commonly referred to as Fibre to the Premises/Home, or FTTP/H) eliminates the traditional instability that plagues the current copper infrastructure along with providing an order of magnitude higher speeds. Whilst this is all well and good from a consumer perspective it will also have incredible benefits for Australia economically. There’s no denying that the cost is quite high, on the order of $37 billion, but not only will it pay for itself in real terms long before its useful life has elapsed, it will also provide benefits far exceeding that cost shortly after its completion.

Should this year’s election go the way everyone is thinking it will, the glorious NBN future looks decidedly grim. The Coalition have been opponents of it from the get go, criticising it as a wasteful use of government resources. Whilst their plan might not sound that much different on the surface, choosing to only run Fibre to the Node (FTTN) rather than to the premises, it is a decidedly inferior solution that will not deliver the same level of benefits as the currently envisioned NBN. The reason behind this is simple: it still uses the same copper infrastructure that has caused so many issues for current broadband users in Australia.

You don’t have to look much further than Canberra’s own FTTN network, TransACT, to know just how horrific such a solution is. After a decade of providing lacklustre service, one that provided almost no benefit over ADSL2+, TransACT wrote down their capital investment and sold the network to iiNet. If FTTN can’t survive in a region that is arguably one of the most affluent and tech savvy in Australia then it has absolutely no chance of surviving elsewhere, especially where current ADSL services can still be seen as competitive. You could make the argument that the copper could be upgraded or remediated, but then you’re basically building an FTTP-style solution out of copper, so why not just go for optical fibre instead?

What really puts it in perspective is that the International Space Station, you know, that thing whizzing around some 400km above Earth at Mach 25, has faster Internet than the average Australian does. Considering your average satellite connection isn’t much faster than dial up, the fact that the ISS can beat the majority of Australians speed-wise shows just how bad staying on copper will be. FTTN won’t remedy those last mile runs where all the attenuation happens, and that means you can’t guarantee minimum speeds like you can with FTTP.

The NBN represents a great opportunity to turn Australia into a technological leader, transforming us from something of an Internet backwater into a highly interconnected nation with infrastructure that will last us centuries. It will mean far more for Australia than faster loading web pages, but failing to go all the way with FTTP will render it an irrelevant boondoggle. Whilst we only have party lines to go on at the moment, with the “fully detailed” plan still forthcoming, it’s still safe to say that the Coalition are bad news for it, no matter which angle you view their plan from.

The Sun Sets on a Martian Day.

I’ll just put this here, a sunset on Mars as seen by the Curiosity rover:

[Embedded video: a sunset on Mars, captured by the Curiosity rover]

I had one of those moments watching this video where I just considered the chain of events that led up to me being able to see it. There’s a robot on another planet, tens of millions of kilometres away, beaming pictures back to Earth. Those pictures were then made available to the public via a vast, interconnected network that spans the entire globe. One person on that network decided to collate them into a video and make that available via said network. I then, using commodity hardware that anyone can purchase, was able to view that video. The chain of events leading up to that point seems so improbable when you look at it as a completed system, but every link exists and every one is a product of human innovation.

Isn’t that just mind-blowingly awesome?


Intel Could Be Your Next Pay TV Provider.

One thing that not many people knew about me is that I was pretty keen on the whole Google TV idea when it was announced two years ago. I think that was partly because it was a collaboration between several companies I admire (Sony, Logitech and, one I didn’t know about at the time, Intel) and also because of what it promised to deliver to end users. I was a fairly staunch supporter of it, to the point where I remember arguing with my friends that consumers simply weren’t ready for something like it, rather than it being a failed product. In all honesty I can’t really support that position any more, and the idea of Google TV seems to be dead in the water for the foreseeable future.


What I didn’t know was that whilst Google, Sony and Logitech might have put the idea to one side, Intel has been working on their own product along similar lines, albeit from a different angle than you’d expect. Whilst I can’t imagine they invested that much in developing the hardware for the TVs (a quick Google search reveals they were Intel Atoms, something Intel had been developing for two years prior to Google TV’s release) it appears they’re still seeking some return on that initial investment. At the same time reports are coming in that Intel is dropping anywhere from $100 million to $1 billion on developing this new product, a serious amount of coin that industry analysts believe is an order of magnitude above anyone else currently playing in this space.

The difference between this and other Internet set top boxes appears to be the content deals Intel is looking to strike with current cable TV providers. Anyone who’s ever looked into getting any kind of pay TV package knows that whatever you sign up for you’re going to get a whole bunch of channels you don’t want bundled in alongside the ones you do, significantly diluting the value you derive from the service. Pay TV providers have long fought against the idea of allowing people to pick and choose (and indeed anyone who attempted to provide such a service didn’t appear to last long, a la SelecTV Australia) but with the success of on demand services like Netflix and Hulu it’s quite possible that they’re coming around to the idea and see Intel as the vector of choice.

The feature list that’s been thrown around the press prior to an anticipated announcement at CES next week (which may or may not happen, depending on who you believe) does sound rather impressive, essentially giving you the on demand access that everyone wants right alongside the traditional programming we’ve come to expect from pay TV services. The “Cloud DVR” idea, being able to replay/rewind/fast-forward shows without having to record them yourself, is evidence of this, and providing the traditional channels as well seems to be a clever ploy to get the content onto their network. Of course traditional programming is required for certain things like sports and other live events, something which the on demand services have yet to fully incorporate into their offerings.

Whilst I’m not entirely enthused by the idea of yet another set top box (I’m already running low on HDMI ports as it is) the information I’ve been able to dig up on Intel’s offering does sound pretty compelling. Of course many of the features aren’t exactly new, you can do most of them now with the right hardware and pay TV subscriptions, but the ability to pick and choose channels would be, and getting a Hulu-esque interface to watch previous episodes would be something that would interest me. If the price point is right, and it’s available globally rather than just in the USA, I could see myself trying it out for the select few channels I’d like to see (along with their giant back catalogues, of course).

In any case it will be very interesting to see if Intel says anything about their upcoming offering next week: if they do we’ll have information direct from the source, and if they don’t we’ll have a good indication of which analysts really are talking to people involved in the project.


Paying to Block Ads: It Should Go To The Content Creators, Not Some 3rd Party.

I don’t run ads here and there’s a really simple reason for that: I have the luxury of not needing to. This blog is one of my longest running hobbies and whilst the cost to me is non-zero in terms of time and actual cash I’m willing to eat both those costs simply for the love of it. There is a point where I’ve told myself that I’ll start running ads (that’s the point where I can make a living off doing this) but that’s somewhere in the order of 50 times the traffic I’m receiving today. Not an impossible goal really but certainly a long way off from where I currently am.

It’s for that particular reason that I don’t run ad blocking software in my browser. For the most part I don’t even really notice ads unless they start forming obvious patterns or have obnoxious auto-playing music, and I figure that as a fellow content creator I understand their reason for being there. Even though I don’t usually click on them I know the author is getting at least some kind of reward for providing that information to me for free, even if it’s not much. I completely support everyone else’s freedom to block ads as they see fit, however, as I know ad blockers are in the minority overall and they won’t be the death of free online content any time soon.

Then I read this article titled “How Much Would You Pay to Never See an Online Ad Again?”, thinking it might be some inventive new start-up idea like Flattr that would work with publishers to get rid of advertising on their sites. AdTrap is in fact quite the opposite: a hardware device that sits between your modem and router (it actually necessitates that configuration, which rules out anyone using an integrated modem/router) and strips ads out before they reach your browser. Taken at face value the marketing makes it sound like a pretty fantastic device given all the features it’s touting (many of which aren’t unique to it, simply a product of the way it connects into your existing infrastructure) and it can be yours for the low price of $120.
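AdTrap’s marketing doesn’t say exactly how it strips ads, but the simplest way to build a network-level blocker that sits in front of every device is DNS filtering: refuse to resolve known ad domains so the requests die before the browser ever fetches them. Here’s a minimal sketch of that idea in Python, with a made-up blocklist (AdTrap’s actual mechanism and list are unknown to me):

import socket

# Minimal sketch of hosts-style, network-level ad blocking: answer
# blocklisted DNS names with an unroutable address so ad requests die
# before the browser fetches them. The blocklist entries here are
# placeholders, not AdTrap's actual list.
BLOCKLIST = {"ads.example.com", "tracker.example.net"}
SINKHOLE = "0.0.0.0"

def resolve(hostname, real_lookup):
    """Return a sinkhole address for ad domains, else a real answer."""
    # Match the host itself or any parent domain on the blocklist.
    parts = hostname.lower().split(".")
    for i in range(len(parts) - 1):
        if ".".join(parts[i:]) in BLOCKLIST:
            return SINKHOLE
    return real_lookup(hostname)

print(resolve("ads.example.com", socket.gethostbyname))  # 0.0.0.0
print(resolve("www.example.com", socket.gethostbyname))  # a real IP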

Now granted I had some idea in my head of what AdTrap was (courtesy of the title of the article that led me to it) so it’s possible some of my angst towards this product is born of that, but I’m not totally on board with the idea of paying someone else in order to block ads. It’s one thing to provide that kind of technology for free, that’s kind of expected on the Internet, but building a business around denying revenue to content creators doesn’t sit right with me. I’d be much more on board with being able to pay creators directly to remove ads, a la Reddit Gold, rather than some third party whose product does nothing for the content creators.

In the end I guess it doesn’t really matter that much, as again the number of users who actually end up buying one of these things will be in the minority and won’t have any meaningful impact on revenue. I just take issue with people profiting from such an endeavour, as the motive then changes from simple altruism to maximising their revenue at the cost of others’. I’m not going to go on some crusade to try and take them down however, as the market will be the final judge, and if people want something like this then it was inevitable it would be created.

The Viability of Cloud Gaming.

The idea of cloud gaming is a seductive one, especially for those of us who lived through the times when upgrading your computer every 12 months was a requirement if you didn’t want to be watching a slide show. Abstracting the hardware away from the user and letting them play on any device above a certain, extremely low threshold would appear to be the solution to the upgrade and availability issues of dedicated gaming platforms. I’ve long made the case that the end product is something of a niche market, one that I was never quite sure would be viable on a large scale. With the demise of OnLive I could very easily rest my case on that alone, but you can never write off an industry on the failures of the first to market (see Iridium Communications for proof of this).

Providing even a small cloud gaming service requires some rather massive capital expenditure, especially with the hardware that’s available today. For OnLive this meant that each of their servers could only serve a single user at a time, which was terrible from a scalability standpoint as they could never service many customers without bleeding money on infrastructure. Cloud gaming services of the future might be in luck however, as both NVIDIA and AMD are working on cloud GPUs that will enable much higher densities than the current 1:1 ratio. There’ll still be an upper limit that’s much lower than most cloud services (which typically serve thousands of users per server) but at the very least the scalability problem becomes an engineering issue rather than a capital one.
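To see why the 1:1 ratio was so lethal and why density is the whole game, consider a toy cost model. Every figure in it is an invented placeholder for illustration, not OnLive’s or NVIDIA’s actual numbers:

# Toy capacity model for a cloud gaming fleet. Every number here is an
# assumption for illustration, not any real provider's data.
def monthly_server_cost_per_user(users_per_server, server_cost_per_month=500.0):
    """Hardware cost per concurrent user, given session density."""
    return server_cost_per_month / users_per_server

for density in (1, 4, 16):  # concurrent sessions one server can host
    cost = monthly_server_cost_per_user(density)
    print(f"{density:>2} sessions/server -> ${cost:,.2f}/user/month in hardware")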

The second major challenge cloud gaming companies face is how latency sensitive a good portion of the games market is. Whilst you can get down to very low latency numbers with strategically placed servers, you’re still going to be adding a good chunk of input lag on top of any server latency, which will be unacceptable for a lot of games. Sure there are titles where this won’t be an issue, but cutting off a large section of the market (FPS, RTS, RPGs and any mix of them in between) further reduces the viability of any potential cloud gaming service.
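A rough glass-to-glass budget shows where the problem lies. The component values below are plausible ballpark assumptions rather than measurements from any real service:

# Rough input-to-photon latency budget for a remote-rendered game, in
# milliseconds. All component values are ballpark assumptions.
budget_ms = {
    "client input capture":     5,
    "uplink to server":        15,  # one way, strategically placed server
    "server render + encode":  20,
    "downlink to client":      15,
    "decode + display":        15,
}

total = sum(budget_ms.values())
print(f"added lag: ~{total} ms on top of the game's own frame time")
# Competitive players notice well under 50 ms of extra lag, so a budget
# like this rules out a big slice of the market before you even start.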

In fact many of the titles that could benefit from a cloud gaming service can already be ported to the web thanks to things like Unity or the use of OpenGL extensions in HTML5. Indeed many of the games I could see being published on a cloud platform (casual MMORPGs, turn based strategy games, etc.) wouldn’t be much different if they were brought to the traditional web instead. Sure you lose some platform agnosticism by doing this, but you can arguably reach the same number of people that way as you could with a cloud platform.

User expectations are also set rather high for cloud services, with many of them being flat fee, unlimited usage propositions (think Pandora, Netflix, etc.). The current business models for cloud gaming didn’t gel well with this mindset, as you were paying for the games you wanted to play (often cheaper than retail, sometimes not) for a limited period of time, akin to a long term rental. Whilst this works for some people, most users will expect to pay a flat fee for access to a catalogue they can then use at their leisure, and this has significant ramifications for how publishers and developers will license their games to cloud providers. It’s not an insurmountable problem (the music industry came around eventually so the games industry can’t be far behind) but it does introduce a market dynamic that cloud gaming services have not yet investigated.

With all these things considered I find it hard to see how cloud gaming services can be viable in the near term: whilst all the issues are solvable, they all work against delivering something that can turn a profit. Cloud GPUs, the ever increasing quality of Internet connections and the desire of many to migrate completely to cloud based services do mean there’s a trend towards cloud gaming becoming viable in the future, however the other, fundamental limitations could see those pressures rendered null and void. This is something I’m willing to be proven wrong on though, as I’ve invested myself heavily in cloud principles and I know they’re capable of great things. Whether cloudifying our gaming experience is one of them is something I don’t believe is currently feasible, and I don’t see that changing for a while.


I’ve Tasted 4G Speeds, and They Were Good.

I’ve never had much luck with Internet speeds. That’s probably because, unlike most of my geek brethren, I always seemingly forget to check the distance to the nearest exchange from the place I’m looking to rent or purchase, something which is now top of my list. Heck, even my parents who live in a rural area outside Canberra manage to get better speeds than me thanks to their short distance to their exchange, even though my line of sight distance to mine is almost equal to theirs. It’s still a world away from the dial up I used to make do with, but I know there’s a whole other world of faster speeds out there, tantalizingly out of reach.

At the top of the list is the holy grail of Internet connections in Australia: the National Broadband Network. Whilst it might be in the realm of fairy tales and unicorns for most people in Australia, I know a couple of people who’ve managed to get themselves on the service thanks to being in the right place at the right time. From what they tell me it’s everything it’s marketed to be, with extremely fast speeds that aren’t dependent on distance from the exchange, the modem you’re using or how much your hardware likes you on a particular day. Unfortunately, short of moving into a location that already has it (there are quite a few now, but they’re still the minority) the wait for it is going to be quite long.

There is one technology available today that can deliver some pretty impressive speeds, so long as you’re within range of a city’s CBD. The tech I am referring to is, of course, 4G wireless.

Now if you’ve been here for a while I’d forgive you for thinking that I wasn’t a big fan of the whole 4G idea, especially when it’s mentioned in the same breath as the NBN. It is true that I believe they’re solutions to different problems, but just as the name of the underlying technology alludes to (Long Term Evolution, or LTE, if you were wondering) I do believe it is the future of wireless communications. Unfortunately I don’t believe the wireless network would be capable of supporting all the Internet requirements of Australians, even if the specification is theoretically capable of it. It certainly has its place, though.

As part of my new position with Dell I was given a laptop for accessing the corporate network, but the site I’m currently attending doesn’t have an unfettered connection for me to use to do so. Initially I was just tethering to my phone, as I have a pretty decent data plan (1.5GB/month) that barely ever gets close to being used, and for the most part it worked well. However should I pick up my phone to go somewhere, or if my S2 was having a particularly bad day, I’d lose the connection, dropping anything that needed to be always on (like the VPN). Frustrated, I decided to grab myself a wireless broadband dongle, and for a cool $130 I got a 4G one that came with 3GB for the first month.

It’s a rather tiny device resembling an overgrown USB stick (and it in fact has USB storage in it as well for driver installation, pretty neat) so you can imagine I was slightly sceptical about its ability to deliver true 4G speeds with such a small antenna. The signal in the area where I’ve used it the most isn’t particularly fantastic either and I was relegated to the NextG network, which is still not bad by mobile broadband standards. However over the weekend I was in the middle of Sydney, on the 11th floor of a hotel in Darling Harbour, and I was treated to full bars of signal strength on the 4G network. So like any self respecting geek I gave it what for.

And boy did it ever deliver.

For regular web browsing the difference wasn’t particularly noticeable, but I did see something when I opened up Steam on my laptop to get a game configured. The download speed I saw was about 2MB/second (around 16Mbps) and I figured it was just pulling from a local cache. It in fact wasn’t, and was downloading at those blazing speeds right over the wireless broadband. To put that in perspective, that kind of speed is about 4 times what I regularly get at home, and I wasn’t even trying to stress the connection fully. In hindsight I should’ve run a speed test just to show you what it was theoretically capable of, but a simple Steam download seemed sufficient to prove its value.

Unfortunately I feel that the ludicrous speeds I saw are a product of the lack of usage at the moment. Currently there are only a handful of 4G handsets capable of being used on Telstra’s network, and the $130 dongle looks quite expensive next to the $30 3G dongle that’d do the job for pretty much everyone. Whether the 4G network is capable of scaling up to the same level of demand the 3G networks currently carry is a question that won’t be answered until 4G reaches a similar level of penetration to 3G today. With the rapid pace of handset development that could come much sooner than you think, and 4G services might become commonplace sooner rather than later.

The Death of Truly Single Player Games.

In the days before ubiquitous high speed Internet, games that were only playable when you were online were few and far between, with the precious few usually being MMORPGs. As time went on and the world became more reliably connected, game developers sought to take advantage of this by creating much more involved online experiences. This also led to the development of some of the most insane forms of DRM that have ever existed, schemes where the game will constantly phone home to verify the player is allowed to continue playing. The advent of cheap and ubiquitous Internet access has thus been both a blessing and a curse to us gamers, and it may be much more of the latter for one particular type of game.

Way back when an Internet connection was considered something of a luxury, the idea of integrating any kind of online experience was something of a pipe dream. There was still usually some form of multiplayer, but that would be reserved for the hallowed times of LAN parties. Thus the focus of a game was squarely on the single player experience, as that would be the main attraction for potential players. This is not to say that before broadband arrived there was some kind of golden age of single player games (some of my favourite games of all time are less than 5 years old) but there definitely was more of a focus on the single player experience back then.

Today it’s much more common to see games with online components that are critical to the overall experience. For the most part this is some form of persistent multiplayer, which has been shown to be one of the most successful ways to keep players engaged with the game (and hence the brand) long after the single player experience has faded from memory. We can squarely lay the blame for this behaviour at the feet of big titles like Call of Duty and Battlefield, as most multiplayer systems seek to emulate the success those games enjoyed. However the biggest blow to single player games has come from something else: the requirement to go online just to be able to play at all.

Now I’m not specifically referring to always on DRM, although that is in the same category, more the requirement for many games to go online at least once before they let you play. For many of us this check comes in the form of a login to Steam before we’re able to play our games; for others it’s built directly into the game, usually via a phone home to ensure that the key is still valid. Whilst there is usually an offline mode available, I’ve had quite a few issues (and heard many similar stories) trying to get it to work, even when I still had an Internet connection with which to put the game into said mode. For modern games, then, the idea of something being truly single player, a game that can be installed and played without the need of any external resources, is dead in the water.
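A generic sketch of what such a check tends to look like (this illustrates the pattern, not Steam’s or any particular publisher’s actual scheme): validate online when possible, cache the result, and allow a limited offline grace period. Notice what happens once the server is gone for good:

import time

# Generic sketch of an online activation check with an offline grace
# period. All names and values are illustrative assumptions.
GRACE_SECONDS = 14 * 24 * 3600  # hypothetical: two weeks offline allowed

def can_play(phone_home, cache):
    """Allow play if the licence server approves, or if a recent
    successful check is cached; otherwise refuse to launch."""
    try:
        if phone_home():                      # key still valid server-side
            cache["last_ok"] = time.time()
            return True
        return False                          # licence revoked
    except ConnectionError:
        # Offline: fall back to the cached check, if it's recent enough.
        return (time.time() - cache.get("last_ok", 0)) < GRACE_SECONDS

def retired_server():
    # Once a publisher turns the servers off, every check looks like this.
    raise ConnectionError("licence server no longer exists")

cache = {"last_ok": time.time() - 30 * 24 * 3600}  # last check a month ago
print(can_play(retired_server, cache))  # False: the grace period has become an expiry date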

This became painfully obvious when Diablo III, a game considered by many (including myself) to be a primarily single player experience, came with all the problems that are evident in games like MMORPGs. The idea that a single player experience required maintenance downtime enraged many players, and whilst I can understand the reasons behind it I also share their frustration, because it calls into question just how long these games will continue to exist. Whilst Blizzard does an amazing job of keeping old titles running (I believe the old Battle.net for Diablo 1 is still up) many companies won’t care to keep the infrastructure running once all the profit has been squeezed out of a title. Some do extend the courtesy of patching their games to function in stand alone mode before that happens, but it’s unfortunately not common.

It’s strange to consider, then, that the true single player games of the Internet dark ages might live on forever whilst their progeny may not be usable a couple of years down the line. There’s a valid argument for companies not wanting to support infrastructure that’s simply costing them money and only used by a handful of people, but that raises the question of why the game was developed with such heavy reliance on those features in the first place. Unfortunately it doesn’t look like this trend will be reversed any time soon, and our salvation in many cases will come from the dirty pirates who crack these systems for us at no cost. That cannot be relied upon however, and it should really fall to game developers to have an exit strategy for games they no longer want to support, should they want to keep the loyalty of their long time customers.

TV On The Internet a Fad? Son, I’m Going to Take That Crack Pipe.

I learnt a long time ago that one of the biggest factors in pricing something, especially in the high tech industry, is convenience. For someone who was always a do-it-yourselfer the notion was pretty foreign to me; why would I spend the extra dollars to have something done for me when I was equally capable of doing it myself? Of course the second I switched from being a salaried employee to a contractor whose time is billed by the hour my equations for determining something’s value changed drastically, and I began to appreciate being able to pay to get something done rather than having to spend my precious time on it myself.

The convenience factor is what has driven me to try and find some kind of TV solution akin to those available in the USA. Unfortunately the only things that come close are the less than legal alternatives, which is a right shame as I would gladly pay the going rate to get the same service here in Australia. I’m not alone in this regard either, as many Australians turn to alternative methods to get their fix of their favourite shows. What this says to me is that the future of TV is definitely moving towards on demand services like those provided by Netflix and Hulu, and away from traditional TV channels.

Some industry executives would disagree with me on that point, to the point of saying that watching TV on the Internet is nothing short of a fad that will eventually pass. There have been a couple of clarifications to that post since it first went live, but the sentiment remains that they believe people who abandon their cable subscriptions, “cable cutters” as it were, are in the minority, and that once economic conditions improve they’ll be back again. I can understand the reasoning behind a cable exec taking this kind of position, but it’s woefully misguided.

For starters Netflix alone accounts for around a third of peak bandwidth usage in the USA. To put this in perspective, that’s double all BitTorrent traffic and triple YouTube, both considered hives of piracy by the cable cartels. Add to this the fact that people are using their Xboxes to watch movies and listen to music more than they’re using them to play games, usually through online services. Taking all of this into consideration you’d be mad to think the future is still in traditional pay TV, as there’s a very clear trend showing that on-demand media, provided through your local Internet connection, is what customers are looking for.

There are two reasons that explain why cable companies are thinking this way. The first, and least likely, is that they’re simply unaware of the current trends in the media market. This is not entirely impossible, as there have been a few companies in recent times (Blockbuster being the first that comes to mind) who simply failed to recognise where the market was moving and paid the ultimate price for it in the end. The far more likely reason is simple bravado, as the cable companies can’t really stand up and say they’re aware of the changing market demands but will do nothing about them. No, for them it’s best, at least in the short term, to write off the phenomenon completely. In the long term of course this tactic won’t work, but I get the feeling none of them are playing a particularly long game at this point.

As I’ve said many times before, media companies and rights holders have fought tooth and nail against every technological advancement for the past century, and the only constant across all of those fights is that in the end the technology won out. Eventually these companies will have to wake up to the reality that their outdated business models don’t fit the current market, and they’ll either have to adapt or die.

FYX, Global Mode and Geoblocking.

Coding a location based service introduced me to a lot of interesting concepts. The biggest of these was geocoding, the imprecise science of translating a user’s IP address into a real world location. I say imprecise because there’s really no good way of doing it: most of the geocoding and reverse-geocoding services out there rely on long lists that match an IP to its location. These lists aren’t entirely accurate, so the location you get back from them is usually only good as an initial estimate, and you’re better off using something like the HTML5 geolocation API or simply asking the user where the hell they are in the world. Unfortunately those inaccurate lists drive a whole lot of current services, most of them with the intent of limiting said service to a certain geographical location.
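Under the hood those lists are typically just sorted IP ranges with a location attached, searched with a binary search. A minimal sketch, using invented ranges rather than any real provider’s data:

import bisect
import ipaddress

# Minimal sketch of list-based IP geolocation: sorted (start, end,
# location) ranges searched by binary search. The ranges below are
# invented for illustration; real lists have millions of entries and
# are exactly as accurate as their last update.
RANGES = sorted([
    (int(ipaddress.ip_address("1.0.0.0")), int(ipaddress.ip_address("1.0.0.255")), "AU"),
    (int(ipaddress.ip_address("8.8.8.0")), int(ipaddress.ip_address("8.8.8.255")), "US"),
])
STARTS = [start for start, _end, _loc in RANGES]

def lookup(ip):
    """Return a country code for an IP, or None when no range matches."""
    n = int(ipaddress.ip_address(ip))
    i = bisect.bisect_right(STARTS, n) - 1
    if i >= 0 and RANGES[i][0] <= n <= RANGES[i][1]:
        return RANGES[i][2]
    return None  # the "no location" case FYX was trying to exploit

print(lookup("8.8.8.8"))      # US
print(lookup("203.0.113.7"))  # None: not on the list at all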

I’ve written about this practice before and how it’s something of a hangover from the days of DVDs and region locking. From a technology standpoint it makes little sense to block access from certain countries (whether they block you is another matter) as all you’re doing is limiting your market. From a business and legal standpoint the waters are a little murkier, as most of the geo-restricted services, the ones of note anyway, are restricted simply because it’s either not in the provider’s business interests to go global (although I believe that’s short sighted) or there’s a lot of legal wrangling to be done before the content can be made available globally.

A plucky New Zealand ISP, FYX, was attempting to solve this problem of geoblocking, and whilst they have withdrawn the service from the market (but are looking to bring it back) I still want to talk about their approach and why it’s inherently flawed.

FYX offered what they called “Global Mode” for their Internet services, which apparently made their users appear as if they weren’t from any particular country at all. Their thinking was that once you’re a global user, services that were once blocked because of your region will suddenly be available to you, undoing the damage to the free Internet that those inaccurate translation lists can cause. However the idea that no location means geoblocking becomes ineffective is severely flawed, as would be apparent to anyone who’s had even a passing encounter with these services.

For starters most sites with geoblocking enabled do so using a whitelist, meaning that only people from specific countries will be able to access those services. Things like Hulu and Netflix are hard coded to allow only IPs residing within the USA’s boundaries, and anything not on those lists automatically gets blocked. Of course there’s some in-browser trickery you can do to get around this (although that’s not at the ISP layer) but the only guaranteed solution is to access them through a connection that appears to originate from an IP they trust. Simply not having a location on those lists won’t do the trick, so you’d need to do something more. It’s entirely possible that FYX is doing something fancier than this, but the solutions I can think of wouldn’t be very scalable, nor particularly profitable.
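To make the whitelist point concrete, here’s the shape of the check (a sketch, not Hulu’s or Netflix’s actual code); note that an unknown location fails it just as surely as a wrong one:

# Why "Global Mode" can't beat a whitelist: geoblocked services allow
# known-good countries and reject everything else, so an unresolvable
# location is just another rejection.
ALLOWED_COUNTRIES = {"US"}  # hypothetical Hulu/Netflix-style whitelist

def may_stream(country_code):
    """Whitelist check: None (unknown location) is denied too."""
    return country_code in ALLOWED_COUNTRIES

print(may_stream("US"))   # True
print(may_stream("AU"))   # False: wrong country
print(may_stream(None))   # False: no country at all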

It also seems that they might’ve got the attention of some rights holder groups, who put pressure on their parent company to do away with the service. Legally there didn’t seem to be anything wrong with the idea (apart from the fact that it probably wouldn’t work as well as advertised) but that wouldn’t stop media companies from threatening to take them to court if such a service continued to be offered. It really shows how scared such organisations are of new technology when a small time ISP with a not-so-special service can be a big enough blip on the radar to warrant such action. I’ll be interested to see how FYX progresses with this, especially if they detail more about just how they go about enabling their Global Mode.

The reality of the situation is that we’re trending towards a much more connected world, one where the traditional barriers to the free flow of information are no longer present. Companies that made their fortunes in the past need to adapt to the present and not attempt to litigate their way to profitability. Eventually that won’t be an option for them (think Blockbuster vs Netflix) and I really can’t wait for the day that geoblocking is just a silly memory of a time when companies thought their decades old business models still worked in an ever changing world.