Posts Tagged ‘internet’

Google’s Project Loon: Internet For Everyone.

Just outside the Googleplex in Mountain View, California there's a small facility that has been the birthplace of many of the revolutionary technologies Google is known for today. It's called Google [x] and is akin to the giant research and development labs of corporations in ages past, where no idea is off limits. It's spawned some of the most amazing projects that Google has made public, including the Driverless Car and Project Glass. These are only a handful of the projects currently under development at the lab, however, with the vast majority of them remaining secret until they're ready for release into the world. One more of those projects has just reached that milestone, and it's called Project Loon.

The idea is incredibly simple: provide Internet access to everyone regardless of their location. How they're going about it, however, is the genius part: a system of high altitude balloons and base relay stations, with each balloon able to cover a 40KM area. For countries that don't have the resources to lay the cables required to provide Internet access this is a comparatively easy way to cover large areas, and it even makes service possible in regions that would otherwise be inaccessible.
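
Just to put that coverage figure in perspective, here's a quick back-of-the-envelope sketch (my own illustrative numbers, not Google's; I'm assuming the 40KM figure is the diameter of a single balloon's footprint):

```python
import math

# Rough coverage estimate for a balloon network. The 40KM figure is
# taken from the article and assumed to be the footprint diameter.
COVERAGE_DIAMETER_KM = 40
footprint_km2 = math.pi * (COVERAGE_DIAMETER_KM / 2) ** 2  # ~1257 km^2

def balloons_needed(region_km2, overlap_factor=1.5):
    """Naive estimate: region area over per-balloon footprint, padded
    by an overlap factor since circles don't tile perfectly and the
    balloons drift."""
    return math.ceil(region_km2 * overlap_factor / footprint_km2)

# Example: New Zealand's South Island is roughly 150,000 km^2.
print(balloons_needed(150_000))  # ~180 balloons under these assumptions
```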

What's really amazing, however, is how they're going about solving some of the issues you run into when balloons are your transportation system:

[Embedded video: Project Loon]

The height they fly at is around the bottom end of the range for your typical weather balloon (they can be found anywhere from 18KM all the way up to 38KM) and is about half the altitude from which Felix Baumgartner made his high altitude jump last year. I wasn't aware that different layers of the stratosphere have different wind directions, and making use of them to keep the balloons in position is just an awesome piece of engineering. Of course this would all be for naught if the Internet service they delivered wasn't anything better than what's available now with satellite broadband, but it seems they've got that covered too.
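
For the curious, that station-keeping idea reduces to something like the following sketch (a toy illustration of my own, not Loon's actual control system): sample the wind at each altitude layer and move to the layer whose wind pushes the balloon closest to where it needs to go.

```python
# Toy model of altitude-based steering: a balloon can't propel itself,
# so it "steers" by picking the stratospheric layer whose prevailing
# wind direction is closest to the heading it wants to travel.

def best_layer(layers, desired_heading_deg):
    """layers: list of (altitude_km, wind_heading_deg).
    Returns the altitude whose wind direction best matches the
    desired heading."""
    def angular_error(a, b):
        return abs((a - b + 180) % 360 - 180)
    return min(layers, key=lambda l: angular_error(l[1], desired_heading_deg))[0]

# Example: three layers with different prevailing winds (made-up data).
layers = [(18, 90), (20, 270), (22, 180)]  # (altitude, wind blows toward)
print(best_layer(layers, desired_heading_deg=260))  # -> 20
```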

The Loon stations use the 2.4GHz and 5.8GHz frequencies to communicate with ground receivers and base stations and are capable of delivering speeds comparable to 3G (~2Mbps or so). Now, if I'm honest, the choice to use these public signal spaces seems like a bit of a gamble: whilst the spectrum is free to use it's also already quite congested. I guess this is less of a problem in the places Loon is primarily aimed at, namely regional and remote areas, but even those places have microwaves and personal wifi networks. It's not an insurmountable problem, of course, and I'm sure the way-smarter-than-me people at Google[x] have already thought of it; it's just an issue for anything that tries to use that same frequency space.

I might never end up being a user of this particular project but, as someone who lived on the end of a 56K line for the majority of his life, I can tell you how exciting this is for people living outside broadband enabled areas. According to Google it's launching this month in New Zealand with a group of pilot users, so it won't be long before we see how the technology works in the real world. From there I'm keen to see where they take it next, as there are a lot of developing countries where this technology could make some really big waves.


Oh The Things I Could Do If I Only Had the Bandwidth.

The Internet situation I have at home is what I'd call workable but far from ideal. I'm an ADSL2+ subscriber, a technology that will give you speeds up to 25Mbps should you be really close to the exchange, on good copper and (this is key) make the appropriate sacrifices to your last mile providers. Whilst my line of sight distance to the exchange promises speeds in the 15Mbps range I'm lucky to see about 40% of that, with my sync speed usually hovering around the 4~5Mbps range. For a lot of things this is quite usable, and indeed as someone who had dial-up for most of his life these speeds are still something I'm thankful for, but it's becoming increasingly obvious that my reach far exceeds my grasp, which as a technology centric person is fast becoming an untenable position.


Honestly I don't think about it too much as it's not like it's a recent realisation and, since the difference between the best and worst speeds I've had wasn't that great in retrospect, I've developed a lot of habits to cope with it. Most of these involve running things over longer periods when I wouldn't be using the Internet anyway, but not all tasks fit nicely into that solution. Indeed last night, when I wanted to add a video I'd recorded to my post, one that was only ~180MB in size, I knew there was going to be a pretty long delay in getting the post online. The total upload time was around 30 minutes in the end, which is just enough time for me to get distracted by other things and completely forget what I was doing until later that night.
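
The numbers roughly check out, too, if you assume a typical ADSL2+ uplink of around 1Mbps or less (an assumption on my part; the uplink sync is rarely advertised but is always a fraction of the downlink):

```python
# Sanity check: how long does a ~180MB upload take on an ADSL2+ uplink?

def upload_minutes(size_mb, uplink_mbps):
    megabits = size_mb * 8            # convert megabytes to megabits
    return megabits / uplink_mbps / 60

print(f"{upload_minutes(180, 1.0):.0f} minutes")  # ~24 min at 1Mbps
print(f"{upload_minutes(180, 0.8):.0f} minutes")  # ~30 min at 0.8Mbps
```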

Sure, it's not an amazing example of why I need faster Internet but it does highlight the issue. The video wasn't particularly large nor super high resolution (720p, 60fps); it was produced on technology that's over 2 years old and uploaded to a service that's been around for 7 years. The bottleneck in that equation is the connection they all share from my home network, something that hasn't changed much in the decade I've been a broadband Internet user.

For me it's even worse when I run up against the limitations of my paltry connection for things like services I'd like to host myself. In its infancy this blog was hosted on my little server at home but it quickly became apparent that little things like pictures were simply untenable because they'd take forever to load, even if I shrunk them down to near unusable sizes. It became even worse when I started looking into using the point to point VPN feature in Azure to connect a small home environment to the virtual machines I'm running in the cloud, as my tiny connection simply wasn't enough to handle the kind of traffic it would produce. That might not sound like a big deal but for any startup in Australia thinking about doing something similar it kills the idea of using the service in that fashion, which puts a lot of pressure on their remaining runway.

It's reasons like this which keep me highly skeptical of the Liberals' plan for the NBN, as the speeds they're aspiring to aren't that dissimilar to what I'm supposed to be getting now. Indeed they can't even really guarantee those speeds thanks to their reliance on the woefully inadequate copper network for the last run in their FTTN plan. Canberra residents will be able to tell you how much of a folly the idea is after the debacle that was TransACT (recently bought for $60 million, with its infrastructure then sold for $9 million), which utterly failed to deliver on its promises even when it deployed its own copper infrastructure.

It also doesn't help that their leader thinks 25Mbps is more than enough for Australian residents which, if true, would mean that ADSL2+ would be enough for everyone, including businesses. We IT admins have known that this isn't the case for a while, especially considering how rare it is to actually get those speeds, and the reliance on the primary limiting factor (Telstra's copper network) in the Liberals' NBN plan effectively ensures this will continue for the foreseeable future.

All those points pale in comparison to the one key factor: we will need to go full fibre eventually.

The copper we have deployed in Australia has a hard upper limit on the amount of bandwidth it can carry, one we're already running up against today. It can be improved through remediation, installing thicker cables, but that's a pretty expensive endeavour, especially when you take into account the additional infrastructure required to support the faster speeds. Since there's no plan to do such remediation at the scale required (either by Telstra or as part of the Liberals' NBN plan) these limitations will remain in place. Fibre, on the other hand, doesn't suffer from the same issues, with the new cables able to carry several orders of magnitude more bandwidth just with today's technology. Deploying it isn't cheap, as we already know, but it will pay for itself well before it reaches the end of its useful life.

My whinging is slightly moot because I'll probably be one of the lucky ones to have fibre rolled out to my neighbourhood before the election, but I do feel the NBN's effectiveness will be drastically decreased if it's not ubiquitous. It's one of the few multi-term policies that will have real, tangible benefits for all Australians and messing with it will turn it from a grand project into a pointless exercise. I hope the Liberals' policy really is just hot air to placate their base, because otherwise the Internet future of Australia will be incredibly dim and that's not something that I, or any user of technology, want for this country.


Microsoft’s Internet Connection is the Least of Your Worries.

After spending a week deep in the bowels of Microsoft's premier tech conference and writing about it breathlessly for Lifehacker Australia you'd be forgiven for thinking I'm something of a Microsoft shill. It's true that I think the direction they're going in with their infrastructure products is pretty spectacular, and the excitement for those developments is genuine. However if you've been here for a while you'll know that I'm also among their harshest critics, especially when they do something that's drastically out of line with my expectations as one of their consumers. Still, I believe in giving credit where it's due, and a recent PA Report article has called Microsoft's credentials into question in one area where they honestly shouldn't be.

[Image: Windows Azure infrastructure slide]

The article I’m referring to is this one:

I’m worried that there are going to be a few million consoles trying to dial into the home servers on Christmas morning, about the time when a mass of people begin to download new games through Microsoft’s servers. Remember, every game will be available digitally day and date of the retail version, so you’re going to see a spike in the number of people who buy their Xbox One games online.

I’m worried about what happens when that new Halo or Call of Duty is released and the system is stressed well above normal operating conditions. If their system falls, no matter how good our Internet connections, we won’t be able to play games.

Taken at face value this appears to be a fair comment. We can all remember times when the Xbox Live service came down in a screaming heap, usually around Christmas time or when a large release happened. Indeed a quick Google search reveals there have been a couple of outages in recent memory, although digging deeper into them reveals they were usually part of routine maintenance and only affected small groups of people at a time. With all the other criticism being levelled at Microsoft of late (most of which I believe is completely valid) it's not unreasonable to question their ability to keep a service of this scale running.

However as the title of this post alludes to I don’t think that’s going to be an issue.

The picture shown above is from the Windows Azure Internals session by Mark Russinovich, which I attended last week at TechEd North America. It details the current infrastructure that underpins the Windows Azure platform, which powers all of Microsoft's sites including the Xbox Live service. If you have a look at the rest of the slides from the presentation you'll see how far that architecture has come since it was first introduced 5 years ago, when the over-subscription rates were much, much higher across the entire Azure stack. What this meant was that when something big happened the network simply couldn't handle it and caved under the pressure. The current generation of Azure infrastructure, however, is far less oversubscribed and has several orders of magnitude more servers behind it. With that in mind it's far less likely that Microsoft will struggle to service large spikes like they have in the past, as the capacity they have on tap is just phenomenal.
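
To illustrate why over-subscription hurts so much during spikes, here's a toy model (the figures are mine, invented for illustration; they are not Azure's actual ratios):

```python
# Why over-subscribed networks cave under spikes: capacity is
# provisioned assuming only a fraction of tenants are active at once,
# so a spike that activates everyone divides the same uplink by the
# full tenant count.

def per_tenant_mbps(uplink_gbps, tenants, oversub_ratio):
    uplink_mbps = uplink_gbps * 1000
    normal = uplink_mbps / (tenants / oversub_ratio)  # expected load
    spike = uplink_mbps / tenants                     # everyone at once
    return normal, spike

normal, spike = per_tenant_mbps(uplink_gbps=10, tenants=1000, oversub_ratio=20)
print(f"normal: {normal:.0f}Mbps, full spike: {spike:.0f}Mbps")
# normal: 200Mbps, full spike: 10Mbps
```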

Of course this doesn't alleviate the issues with the always/often on DRM or the myriad other things people are criticizing the Xbox One for, but it should show you that worrying about Microsoft's ability to run a reliable service shouldn't be one of them. I'm just approaching this from an infrastructure point of view, and it's entirely possible for the Xbox Live system to have some systemic issue that will cause it to fail no matter how much hardware they throw at it. I'm not too concerned about that however, as Microsoft isn't your run of the mill startup that's just learning how to scale.

I guess we’ll just have to wait and see how right or wrong I am.

What Hath The Internet Done To You, Dear User.

Unquestionably the Internet has drastically altered my behaviour in many ways. In the beginning it was merely a curiosity, something I was able to lord over the other kids in the playground because I was one of the precious few who had it, and I would become instant friends with anyone who wanted to see it. As I grew older and my interests broadened it became my primary resource for finding information, leading me to investigate many wild things I would not have paid any attention to otherwise. Most recently it became my platform for communicating with the wider world whilst also elevating my career to places I couldn't have dreamed of.

In short, I feel the Internet has been good to me.

Looking from the outside in, however, would probably paint a much different picture. My near fanatical obsession with my current object of desire has often led me down destructive paths, one of which was my World of Warcraft addiction, which could not have existed without an Internet connection. My desire for information often leads me down paths that aren't relevant to anything past satisfying my curiosity, filling my head with facts I will likely never find a use for. The Internet has also chronicled some of my worst moments and, whilst they're not exactly common knowledge, they serve as a reminder of the parts of myself I'm not particularly proud of.

However whilst it would be easy to lay the blame directly at the feet of the tool that enabled this behaviour, an easy thing to do given its ease of use and the pseudo-anonymity that enables everyone's Inner Fuckwad, I can't say these same things wouldn't have happened absent the Internet. There are many, many people who claim that cutting down on or doing away with the Internet (or anything, realistically) will lead you to untold benefits, but as a Verge reporter found those effects are usually only temporary:

It's been a year now since I "surfed the web" or "checked my email" or "liked" anything with a figurative rather than literal thumbs up. I've managed to stay disconnected, just like I planned. I'm internet free.

And now I’m supposed to tell you how it solved all my problems. I’m supposed to be enlightened. I’m supposed to be more “real,” now. More perfect.

But instead it’s 8PM and I just woke up. I slept all day, woke with eight voicemails on my phone from friends and coworkers. I went to my coffee shop to consume dinner, the Knicks game, my two newspapers, and a copy of The New Yorker. And now I’m watching Toy Story while I glance occasionally at the blinking cursor in this text document, willing it to write itself, willing it to generate the epiphanies my life has failed to produce.

I'm not going to say that the Internet isn't an enabler of some particularly bad behaviours, my use of it is a great testament to that, but the issues it causes can always be traced back to the person. For me my WoW addiction was an escape from the crazy world I had put myself into, working 3 different jobs whilst studying full time at university. In my escape I found some control and, unfortunately, also power over other people that was incredibly intoxicating. Only when that power dwindled and I was left with no one else to turn to did I start to realise how destructive it had become, and I ended up leaving that part of me behind for several years.

Maybe we need that time away in order to get clarity on the destructive behaviours we associate with specific tools. My honeymoon was decidedly devoid of technology, even though I smuggled my laptop along with me, and after the first couple of days of adjustment I felt oddly liberated. Whilst the revelations I came to at that time weren't about my use of the Internet (indeed this was several years after I dredged myself out of that rather dark place) I certainly felt I had a better understanding of how I interacted with the things I was absent from. Perhaps then, instead of advocating giving something up completely, we should take time constrained breaks, lest we establish the same bad habits through alternative means.

That is definitely something I can attest to, as many of my life changing decisions have been made when I was in situations decidedly different from the norm, not when I gave something up completely. Indeed I feel abandoning something entirely often means giving up part of yourself, your identity. Of course there are times when this is appropriate, but for something as benign as Internet use I don't believe giving it up will solve your problems. Sequestering yourself away from it for a time, however, might just give you the insight needed to rectify the worst parts of it and broaden your perspective on the issues at hand.


The Liberals' NBN Plan is Just Plain Bad.

Last week I regaled you with a story about the inconsistent nature of Australia's broadband and how the current NBN was going to solve it by replacing the aging copper network with optical fibre. However whilst the fundamental works to deliver it are underway it's still in its nascent stages and could easily be usurped by a government that didn't agree with its end goals. With the election looking more and more like it'll swing in the Coalition's favour there's a real risk that the NBN we end up with won't be the one we were promised at the start, although the lack of a concrete plan had left me biting my tongue whilst I awaited the proposal.

Today Malcolm Turnbull announced his NBN plan, and it’s not good at all.

[Image: Malcolm Turnbull: "Nice NBN you have there"]

Instead of rolling out fibre to 93% of Australians and covering the rest with satellite and wireless connections, the Liberals' NBN will only roll fibre out to 22%; a further 71% will be covered by FTTN. According to Turnbull's estimates this will enable all Australians to have broadband speeds of up to 25Mbps by 2016, with a planned upgrade to speeds of up to 100Mbps by 2019. The total cost of this plan would be around $29 billion, about $15 billion less than the planned total expenditure required for Labor's FTTP NBN. If you're of the mind that the NBN was going to be a waste of money that'd take too long to implement then these numbers would look great to you, but unfortunately they're anything but.

For starters the promised speeds of up to 25Mbps aren't much of an upgrade over what's available with the current ADSL2+ infrastructure. Indeed most of the places they're looking to cover can already get such services, so rigging fibre up to their nodes will likely not net them much benefit. Predominantly this is because the last mile will still run over the copper network, which is the major limiting factor in delivering higher speeds to residential areas. They might be able to roll out FTTN within that time frame but it's highly unlikely you'll see any dramatic speed increases, especially if you're on an old line.
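
To give a feel for the fall-off, here are some ballpark rate-reach figures (drawn from publicly available VDSL2/ADSL2+ rate-reach charts, not from the Coalition's policy documents, so treat them as indicative only):

```python
# Approximate achievable rates (Mbps) by copper loop length: the rate
# falls off sharply with distance from the node or exchange, which is
# why the copper last mile caps what FTTN can deliver.
VDSL2_RATE_BY_LOOP_M = {300: 80, 500: 50, 800: 30, 1200: 20}  # from the node
ADSL2_RATE_BY_LOOP_M = {300: 24, 1500: 15, 3000: 8, 4500: 4}  # from the exchange

def expected_rate(table, loop_m):
    """Pick the rate for the nearest-or-longer loop length in the table."""
    for length in sorted(table):
        if loop_m <= length:
            return table[length]
    return min(table.values())

print(expected_rate(VDSL2_RATE_BY_LOOP_M, 700))   # ~30Mbps at 700m from the node
print(expected_rate(ADSL2_RATE_BY_LOOP_M, 3500))  # ~4Mbps at 3.5km from the exchange
```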

Under the Liberals' plan you could, however, pay for the last mile run to your house which, going by estimates from other countries that have done something similar, could range anywhere from $2,500 to $5,000. Now I know a lot of people who would pay for that, indeed I would probably be among them, but I'd much rather it be rolled out to everyone indiscriminately, otherwise we end up in a worse situation than we have now. The idea behind the NBN was ubiquitous access to high speed Internet no matter where you are in Australia, so forcing users to pay for the privilege defeats its whole purpose.

Probably the biggest issue for me, though, is how the Coalition plans to get to 100Mbps without running FTTP. The technologies Turnbull has talked about in the past simply won't be able to deliver the speeds he's promising. Realistically the only way to reliably attain those speeds across Australia is with an FTTP network, yet upgrading an FTTN solution to FTTP later would cost somewhere on the order of an additional $21 billion. All added up that makes the Liberals' NBN around $6 billion more than the current Labor one, so it's little wonder they've been trying to talk up the cost over the past week or so.
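
The arithmetic, using the figures quoted above (all in billions of dollars):

```python
# Comparing the two paths to an FTTP end state, per the article's figures.
fttn_build = 29        # Coalition FTTN plan
labor_fttp = 29 + 15   # quoted as ~$15B more, i.e. ~$44B
later_upgrade = 21     # estimated cost to take FTTN to FTTP afterwards

total_fttn_path = fttn_build + later_upgrade  # 50
print(total_fttn_path - labor_fttp)           # ~6 more than going FTTP first
```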

You can have a look at their policy documents here, but be warned: they're thin on facts and play fast and loose with data. I'd do a step by step takedown of all the crazy in there but there are people far more qualified than me to do that, and I'll be sure to tweet links when they do.

Suffice to say the policy announcement has done nothing but confirm our worst fears about the Liberal party's utter lack of understanding of why the FTTP NBN is a good thing for Australia. Their plan might be cheaper up front but it will fail to deliver the speeds they say it will, and will thus provide a lot less value than the same dollars spent on an FTTP solution. I can only hope come election time we end up with a hung parliament again, because the independents will guarantee that nobody fucks with the FTTP NBN.


Why Australia Needs The FTTP NBN.

The state of broadband Internet in Australia is one of incredible inconsistency. I lived without it for the better part of my youth, stuck behind a dial up connection because my local exchange simply didn't have enough people interested in broadband to warrant any telco installing the required infrastructure. I was elated when we got a directional wireless connection that gave me speeds comparable to those of my city dwelling friends, but to call it reliable would be kind, as strong winds would often see it disconnect at the most inconvenient of times.

[Image: NBN fibre]

The situation didn't improve much when I moved into the city, though, as whilst I was pretty much guaranteed ADSL wherever I lived the speed at which it was delivered varied drastically. In my first home, which was in an affluent and established suburb, the connection usually capped out at well below half its maximum speed. The second home fared much better despite being about as far from the closest exchange as the first. My current residence is on par with the first, even with the technological jump from ADSL to ADSL2+. As to the reason behind this I can't be completely sure, but there's no doubt the aging copper infrastructure is likely to blame.

I say this because my parents, who still live in the house I grew up in, were able to acquire an ADSL2+ connection and have been on it for a couple of years. They're not big Internet users and I'd never really had the need to use it much when visiting, but downloading a file over their connection last week revealed speeds almost triple mine, despite their long line of sight distance to the exchange. Their connection is likely newer than most in Canberra thanks to their rural neighbourhood being a somewhat recent development (~30 years or so). You can then imagine my frustration with the current copper infrastructure, as it simply cannot be relied upon to provide consistent speeds, even in places where you'd expect it to be better.

There's a solution on the horizon, however, in the form of the National Broadband Network. The current plan of rolling out fibre to 93% of Australian households (commonly referred to as Fibre to the Premises/Home, or FTTP/H) eliminates the traditional instability that plagues the current copper infrastructure whilst providing speeds an order of magnitude higher. Whilst this is all well and good from a consumer perspective it will also have incredible benefits for Australia economically. There's no denying the cost is quite high, on the order of $37 billion, but not only will it pay itself back in real terms long before its useful life has elapsed, it will also provide benefits far exceeding that cost shortly after its completion.

Should this year's election go the way everyone is expecting, though, the glorious NBN future looks decidedly grim, because the Coalition will get to have their way with it. They've been opponents of it from the get go, criticising it as a wasteful use of government resources. Whilst their plan might not sound that different on the surface, choosing to run Fibre to the Node (FTTN) rather than to the premises, it is a decidedly inferior solution that will not deliver the same level of benefits as the currently envisioned NBN. The reason is simple: it still relies on the same copper infrastructure that has caused so many issues for current broadband users in Australia.

You don't have to look much further than Canberra's own FTTN network, TransACT, to know just how horrific such a solution is. After a decade of providing lackluster service, one that provided almost no benefit over ADSL2+, TransACT wrote down its capital investment and sold the network to iiNet. If FTTN can't survive in a region that's arguably one of the most affluent and tech savvy in Australia then it has absolutely no chance of surviving elsewhere, especially where current ADSL services can still be seen as competitive. You could make the argument that the copper could be upgraded or remediated, but then you're basically building an FTTP solution using copper, so why not just go for optic fibre instead?

What really puts it in perspective is that the International Space Station, you know, that thing whizzing around 400KM above Earth at Mach 26, has faster Internet than the average Australian does. Considering your average satellite connection isn't much faster than dial up, the fact that the ISS can beat the majority of Australians speed wise shows just how bad staying on copper will be. FTTN won't remedy those last mile runs where all the attenuation happens, which means you can't guarantee minimum speeds like you can with FTTP.

The NBN represents a great opportunity to turn Australia into a technological leader, transforming us from something of an Internet backwater into a highly interconnected nation with infrastructure that will last us centuries. It will mean far more for Australia than faster loading web pages, but failing to go the whole way with FTTP will turn it into an irrelevant boondoggle. Whilst we only have party lines to go on at the moment, with the "fully detailed" plan still forthcoming, it's safe to say that the Coalition is bad news for it, no matter which angle you view their plan from.

The Sun Sets on a Martian Day.

I’ll just put this here, a sunset on Mars as seen by the Curiosity rover:

[Embedded video: sunset on Mars, captured by Curiosity]

I had one of those moments watching this video where I just considered the chain of events that led to me being able to see it. There's a robot on another planet, tens of millions of kilometres away, beaming pictures back to Earth. Those pictures were then made available to the public via a vast, interconnected network that spans the entire globe. One person on that network decided to collate them into a video and make it available via said network. I then, using commodity hardware that anyone can purchase, was able to view that video. The chain of events leading up to that point seems so improbable when you look at it as a completed system, but it all exists and is all the product of human innovation.

Isn’t that just mind blowingly awesome?


Intel Could Be Your Next Pay TV Provider.

One thing not many people know is that I was pretty keen on the whole Google TV idea when it was announced 2 years ago. I think that was partly due to the fact that it was a collaboration between several companies I admire (Sony, Logitech and, one I didn't know about at the time, Intel) and partly because of what it promised to deliver to end users. I was a fairly staunch supporter of it, to the point where I remember arguing with my friends that consumers simply weren't ready for something like it, rather than it being a failed product. In all honesty I can't really support that position any more, and the idea of Google TV seems dead in the water for the foreseeable future.

[Image: Intel Smart TV]

What I didn't know was that whilst Google, Sony and Logitech might have put the idea to one side, Intel has been working on developing its own product along similar lines, albeit from a different angle than you'd expect. Whilst I can't imagine they invested that much in developing the hardware for the TVs (a quick Google search reveals they were Intel Atoms, something Intel had been developing for 2 years prior to Google TV's release) it appears they're still seeking some return on that initial investment. At the same time reports are coming in that Intel is dropping anywhere from $100 million to $1 billion on developing this new product, a serious amount of coin that industry analysts believe is an order of magnitude above anyone else currently playing in this space.

The difference between this and other Internet set top boxes appears to be the content deals Intel is looking to strike with current cable TV providers. Anyone who's ever looked into getting any kind of pay TV package knows that whatever you sign up for you're going to get a whole bunch of channels you don't want bundled in alongside the ones you do, significantly diluting the value you derive from the service. Pay TV providers have long fought against the idea of allowing people to pick and choose (and indeed anyone who attempted to provide such a service didn't seem to last long, a la SelecTV Australia) but with the success of on demand services like NetFlix and Hulu it's quite possible they're coming around to the idea and see Intel as the vector of choice.

The feature list that's been thrown around the press prior to an anticipated announcement at CES next week (which may or may not happen, depending on who you believe) does sound rather impressive, essentially giving you the on demand access everyone wants right alongside the traditional programming we've come to expect from pay TV services. The "Cloud DVR" idea, being able to replay/rewind/fast-forward shows without having to record them yourself, is evidence of this, and providing the traditional channels as well seems to be a clever ploy to get the content onto their network. Of course traditional programming is still required for things like sports and other live events, something the on demand services have yet to fully incorporate into their offerings.

Whilst I'm not entirely enthused by the idea of yet another set top box (I'm already running low on HDMI ports as it is) the information I've been able to dig up on Intel's offering does sound pretty compelling. Of course many of the features aren't exactly new, you can do many of these things now with the right piece of hardware and a pay TV subscription, but the ability to pick and choose channels would be, and a Hulu-esque interface for watching previous episodes would be something that interests me. If the price point is right, and it's available globally rather than just in the USA, I could see myself trying it out for the select few channels I'd like to see (along with their giant back catalogues, of course).

In any case it will be very interesting to see if Intel says anything about its upcoming offering next week; if they do we'll have information straight from the source, and if they don't we'll have a good indication of which analysts really are talking to people involved in the project.


Paying to Block Ads: It Should Go To The Content Creators, Not Some 3rd Party.

I don't run ads here and there's a really simple reason for that: I have the luxury of not needing to. This blog is one of my longest running hobbies and whilst the cost to me is non-zero, in terms of both time and actual cash, I'm willing to eat those costs simply for the love of it. There is a point at which I've told myself I'll start running ads (the point where I could make a living off doing this) but that's somewhere in the order of 50 times the traffic I'm receiving today. Not an impossible goal, really, but certainly a long way off from where I currently am.

It's for that particular reason that I don't run ad blocking extensions in my browser. For the most part I don't even notice the ads unless they form obvious patterns or have obnoxious auto-playing music, and as a fellow content creator I understand their reason for being there. Even though I don't usually click on them I know the author is getting at least some kind of reward for providing that information to me for free, even if it's not much. I completely support everyone else's freedom to block ads as they see fit, however, as I know ad blockers remain in the minority overall and won't be the death of free online content any time soon.

Then I read this article, titled "How Much Would You Pay to Never See an Online Ad Again?", thinking it might be some inventive new start-up idea like Flattr, one that works with publishers in order to get rid of advertising on their sites. AdTrap is in fact quite the opposite: a hardware device that sits between your modem and router (it actually requires that configuration, which rules out people using integrated devices) and strips ads before they reach your browser. Taken at face value the marketing makes it sound like a pretty fantastic device given all the features it's touting (many of which are not born of the product itself, simply of the way it connects into your existing infrastructure) and it can be yours for the low price of $120.
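
AdTrap's actual internals aren't documented in the article, but a device wedged between your modem and router most commonly filters at the DNS level: refuse to resolve known ad-serving domains and the ad slots simply never load. A minimal sketch of that blocklist idea (the domains are made-up examples, and this is my assumption about the mechanism, not AdTrap's confirmed design):

```python
# DNS sinkhole sketch: answer blocklisted domains with a dead address
# so the browser has nothing to render in the ad slot.
BLOCKLIST = {"ads.example.com", "tracker.example.net"}

def resolve(domain, upstream_resolve):
    """Return a sinkhole address for blocklisted domains, otherwise
    defer to the real upstream resolver."""
    # Match the domain and each parent domain against the blocklist.
    parts = domain.split(".")
    candidates = {".".join(parts[i:]) for i in range(len(parts))}
    if candidates & BLOCKLIST:
        return "0.0.0.0"
    return upstream_resolve(domain)

print(resolve("ads.example.com", lambda d: "93.184.216.34"))  # 0.0.0.0
print(resolve("example.com", lambda d: "93.184.216.34"))      # real answer
```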

Now granted I had some idea in my head of what AdTrap was (care of the title of the article that led me to it) so it's possible some of my angst towards the product is born of that, but I'm not on board with the idea of paying someone else to block ads. It's one thing to provide that kind of technology for free, that's kind of expected on the Internet, but building a business around denying revenue to content creators doesn't sit right with me. I'd be much more on board with being able to pay creators directly to remove ads, a la Reddit Gold, rather than paying some 3rd party whose product does nothing for the content creators.

In the end I guess it doesn't really matter much, as the number of users who actually end up buying one of these things will again be in the minority and won't have any meaningful impact on revenue. I just take issue with people profiting from such an endeavour, as the motive then changes from simple altruism to maximising their revenue at the cost of others'. I'm not going to go on some crusade to try and take them down, however, as the market will be the final judge and if people want something like this then it was inevitable that it would be created.

The Viability of Cloud Gaming.

The idea of cloud gaming is a seductive one, especially for those of us who lived through the times when upgrading your computer every 12 months was a requirement if you didn't want to be watching a slide show. Abstracting the hardware away from the user and letting them play on any device above a certain, extremely low threshold would appear to be the solution to the upgrade and availability issues of dedicated gaming platforms. I've long made the case that the end product is something of a niche market, one I was never quite sure would be viable on a large scale. With the demise of OnLive I could very easily rest my case there, but you can never write off an industry on the failures of the first to market (see Iridium Communications for proof of this).

Providing even a small cloud gaming service requires some rather massive capital expenditure, especially with the hardware that's currently available. For OnLive this meant each of their servers could only serve one user at a time, which was terrible from a scalability point of view as they could never service that many customers without bleeding money on infrastructure. Cloud gaming services of the future might be in luck, however, as both NVIDIA and AMD are working on cloud GPUs that will enable much higher densities than the current 1 to 1 ratio. There'll still be an upper limit, one much lower than that of most cloud services (which typically serve thousands of users per server), but at the very least the scalability problem then becomes an engineering issue rather than a capital one.
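
Some toy numbers make the density problem obvious (all figures invented for illustration, not OnLive's or anyone else's actual costs):

```python
# Hardware cost per concurrent player: at a 1:1 user-to-server ratio
# each player carries the whole server; shared "cloud GPU" boxes divide
# that cost by the number of simultaneous sessions.

def monthly_cost_per_player(server_cost, lifetime_months, sessions_per_server):
    return server_cost / lifetime_months / sessions_per_server

# OnLive-style 1:1 vs. a hypothetical 8-sessions-per-GPU-server setup.
print(monthly_cost_per_player(2000, 36, 1))  # ~$55/month per player
print(monthly_cost_per_player(3000, 36, 8))  # ~$10/month per player
```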

The second major challenge cloud gaming companies face is how latency sensitive a good portion of the games market is. Whilst you can get down to very low latency numbers with strategically placed servers you're still going to be adding a good chunk of input lag on top of any server latency, which will be unacceptable for a lot of games. Sure, there are titles where this won't be an issue, but cutting off a large section of the market (FPS, RTS, RPGs and any mix of them in between) further reduces the viability of any potential cloud gaming service.
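
The input lag argument is easiest to see as a simple sum of stages; the timings below are illustrative guesses on my part, not measurements:

```python
# Every stage a cloud gaming service adds sits on top of the game's own
# processing, and fast-paced genres tolerate very little extra delay.
STAGES_MS = {
    "input to server (network)": 15,
    "game simulation + render": 16,
    "video encode": 10,
    "server to client (network)": 15,
    "video decode + display": 12,
}

total = sum(STAGES_MS.values())
print(f"added round trip: {total}ms")  # 68ms before the display's own lag
```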

In fact many of the titles that could benefit from a cloud gaming service can already be ported to the web thanks to things like Unity or WebGL in HTML5. Indeed many of the games I could see being published on a cloud platform (casual MMORPGs, turn based strategy games, etc.) wouldn't be much different if they were brought to the traditional web instead. Sure, you lose some of the platform agnosticism by doing this, but you can arguably reach the same number of people that way as you could with a cloud platform.

User expectations for cloud services are also set rather high, with many of them being flat fee, unlimited usage affairs (think Pandora, NetFlix, etc.). The current business models for cloud gaming don't gel well with this mindset, as you're paying for the games you want to play (often cheaper than retail, sometimes not) for a limited period of time, akin to a long term rental. Whilst this works for some people, most users will expect to pay a flat fee for access to a catalogue they can then use at their leisure, and this has significant ramifications for how publishers and developers license their games to cloud providers. It's not an insurmountable problem (the music industry came around eventually, so the games industry can't be far behind) but it does introduce a market dynamic that cloud gaming services have not yet investigated.

With all these things considered I find it hard to see how cloud gaming services can be viable in the near term: whilst all the issues are solvable, they all work against delivering something that can turn a profit. Cloud GPUs, the ever increasing quality of Internet connections and the desire of many to migrate completely to cloud based services do mean there's a trend towards cloud gaming becoming viable in the future, but the other, more fundamental limitations could see those pressures rendered null and void. This is something I'm willing to be proven wrong on, as I've invested myself heavily in cloud principles and I know they're capable of great things. Whether cloudifying our gaming experience is one of them is something I don't believe is currently feasible, and I don't see that changing for a while.