If you want Netflix in Australia there’s really only one way to do it: get yourself a VPN with an endpoint in the States. That’s not an entirely difficult process; indeed many of my less tech savvy friends have managed to accomplish it without any panicked phone calls to me. The legality of doing so is something I’m not qualified to get into but, since there hasn’t been a massive arrest spree of nefarious VPN users, I can’t imagine it’s far outside the bounds of the law. Indeed you couldn’t really crack down on it without also cracking down on the more legitimate users of VPN services, like businesses and those with regulatory commitments around protecting customer data. However if you ask the BBC, users of VPNs are nothing but dirty pirates and it’s our ISPs’ job to snoop on them.
In a submission to the Australian Government, presumably under the larger anti-piracy campaign that Brandis is heading, the BBC makes a whole list of suggestions as to how they should go about combating Australia’s voracious appetite for purloined content. Among the numerous points is the notion that a lot of pirates now use a VPN to hide their nefarious activities. In the BBC’s world ISPs would take this as a kind of black flag, signalling that any heavy VPN user was likely also engaging in copyright infringement. They’d then be subject to the woeful idea of having their Internet slowed down or cut off, presumably unless they could somehow prove their traffic was legitimate. Even though they go on to talk about false positives, the ideas they discuss in their submission are fucking atrocious and I hope they never see the light of day.
I have the rather fortunate (or unfortunate, depending on how you look at it) ability to do my work from almost anywhere I choose, including my home. This does mean that I have to VPN back into the mothership in order to get access to my email, chat and all the other corporate resources which can’t be made available over the regular Internet. Since I do a lot of this at home, under the BBC’s suggestion I’d probably be flagged as a potential pirate and be subject to measures to curb my behaviour. Needless to say I don’t think I’m particularly unique in this either, so there’s vast potential for numerous false positives to spring up under this system.
Worse still, all of those proposed measures fall on the ISPs’ shoulders to design, implement and enforce. Not only would this put an undue burden on them, which they’d instantly pass on to us in the form of increased prices, it would also make them culpable when an infringing user figured out how to defeat their monitoring system. Everyone knows it doesn’t take long for people to circumvent these systems which, again, increases pressure on the ISPs to implement even more invasive and draconian ones. It’s a slippery slope we really shouldn’t be going down.
Instead of constantly looking to the stick as the solution to Australia’s piracy woes it’s time for companies, and the Australian government, to start looking at the carrot. Start creating incentives for rights holders to license content in Australia, or mandate that we get the same content at the same time for the same price as everywhere else. The number of Netflix users in Australia shows there’s demand for such a service; we just need it to meet the same criteria that customers overseas expect. Once we get that I’m sure you’ll see a massive reduction in the amount of piracy in Australia, coupled with the increase in sales that the rights holders seem so desperate to protect.
Last year I fucked up.
There’s really no other way to put it: I made the rookie mistake of not backing up everything before I started executing commands that could have some really bad consequences. I’d like to say it was hubris, thinking that my many years in the industry had made me immune to things like this, but in reality it was just my lack of knowledge of how certain commands worked. Thankfully it wasn’t a dreaded full wipe and I was able to restore the essence of this blog (i.e. the writing) without too much trouble, however over time it became apparent just how incomplete that restore was. Whilst I was able to recover quite a lot of the pictures I’ve used over the years I was still missing plenty of them, including some from my favourite posts.
Thankfully, after writing some rather complicated PowerShell scripts, I was able to bulk restore a lot of images. Mostly this was because of the way I do the screenshots for my reviews, meaning there was a copy of pretty much everything on my PC; I just had to find them. I’ve been reviewing games for quite some time though and that’s meant I’ve changed PCs a couple of times, meaning some of the images are lost in the sea of old hard drives I have lying around the place. Whilst I was able to scrounge up a good chunk of them by finding an old version of the server I used to host locally there were still some images that eluded me, forcing me to think of other places that might have a copy of them.
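The scripts themselves were nothing glamorous. The core of the approach looked something like the sketch below (the paths and file names here are hypothetical, not the ones I actually used): take the list of file names that 404 on the blog and sweep every candidate folder for a matching copy.

```powershell
# A minimal sketch of the bulk restore approach; all paths here are hypothetical.
# missing-images.txt holds the file names of images that 404 on the blog.
$missing     = Get-Content 'C:\restore\missing-images.txt'
$searchRoots = 'D:\Screenshots', 'E:\OldPC\Pictures'
$destination = 'C:\restore\recovered'

foreach ($name in $missing) {
    # Sweep every candidate folder for a file with a matching name
    $match = Get-ChildItem -Path $searchRoots -Recurse -Filter $name -ErrorAction SilentlyContinue |
             Select-Object -First 1
    if ($match) { Copy-Item $match.FullName -Destination $destination }
}
```

The real versions had to deal with duplicate file names and WordPress’ resized variants, but the bones of it were just this: a list, a sweep and a copy.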
My site has been on the Wayback Machine for some time now so I figured that there would (hopefully) be a copy of most of my images on there. For the most part there is, even the full sized ones, however there were still multiple images that weren’t there either. My last bastion of hope was Google’s cache of my website, however they only store (or at least, make available) the latest version they have indexed. Sometimes this meant I could find an image here or there, as images seem to be archived separately and aren’t deleted when you remove them, however it was still a hit or miss affair. In the end I managed to get the list of missing images down from about 2000 to 150 and, thanks to a fortuitous hard drive backup I found, most of those will hopefully be eliminated in short order.
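If you ever find yourself in the same boat, the Wayback Machine exposes a CDX API that makes checking for archived copies scriptable rather than a manual slog. A rough sketch of how you could query it (the domain and upload path below are placeholders, assuming a WordPress style layout):

```powershell
# Query the Wayback Machine's CDX API for archived images under a site;
# the domain and path below are placeholders.
$domain  = 'example.com'
$cdxUrl  = "http://web.archive.org/cdx/search/cdx?url=$domain/wp-content/uploads/*" +
           '&output=json&filter=statuscode:200&collapse=urlkey'
$results = Invoke-RestMethod -Uri $cdxUrl

# The first row is a header; each subsequent row is
# (urlkey, timestamp, original, mimetype, statuscode, digest, length)
foreach ($row in $results | Select-Object -Skip 1) {
    $timestamp = $row[1]
    $original  = $row[2]
    # Snapshots are served from web.archive.org/web/<timestamp>/<original URL>
    "https://web.archive.org/web/$timestamp/$original"
}
```

Pipe those URLs into a downloader and you’ve got a crude but effective image recovery tool.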
What kept me going throughout most of this was the mantra that many privacy advocates and parents alike have parroted many times: the Internet never forgets. For the most part I’d be inclined to agree with this as the vast majority of the information that I had put out there, even though I had erased the source, was still available for anyone to view. However the memory of the Internet, much like that of the humans that run it, isn’t a perfect one, routinely forgetting things, jumbling them up or just plain not remembering them at all. The traces of what you’re searching for are likely there somewhere, but there’s no guarantee that the Internet will remember everything for you.
Not-so-long-time readers will know that a month ago (exactly, strangely enough) I posted about the issues that a FTTN NBN wouldn’t fix, namely the horrendous state of the copper network that Telstra currently maintains. When I posted it I figured that my almost unusably slow Internet was the byproduct of the inclement weather and would soon rectify itself, something which had happened in the past. Unfortunately that wasn’t the case at all and after many days of sunshine and no improvement in sight I decided to do the thing I had been dreading: calling up Telstra to get the line investigated.
You can then imagine my elation when I saw that they now have a handy online form for you to fill out instead of calling them. Like a dutiful consumer I filled it out and sent it on its way, not caring about the multiple warnings about being charged $120 if no fault was found. The site guaranteed me a response within a week and so I waited. Almost like clockwork a response appeared from Telstra a week later claiming that the problem had been resolved and inviting me to take a survey about my experience. If my problem hadn’t been fixed, it said, I could say so on the survey and they’d continue investigating the issue.
Of course the fault hadn’t been fixed; no one had contacted me since I lodged it, so it was obvious they hadn’t done any troubleshooting at all and the system had just automatically closed out a ticket that had no action on it. I replied to the survey in kind, outlining the issues I was experiencing and the troubleshooting steps I had taken to fix them. I received a call back a day later from the agent who would handle my case, who was very understanding of the situation I was in. However the earliest he could send out a technician was a month away, although he promised he’d get that moved up.
Apart from a couple of call backs where he told me he couldn’t do anything for me, I never heard from him again (even though he promised to keep me updated). Luckily the technician did arrive on the scheduled date, although at 8am rather than the agreed time of after 5pm. Upon inspecting my outlet he asked if I was able to get a connection at all, as the line was essentially unusable according to his diagnostic tools. After a quick trip to the pit he came back with an assessment that shouldn’t shock anyone but should make you lose all faith in the state of Telstra’s copper network.
Essentially the pit had been uncovered for quite some time, much like the above picture, with the terminals exposed to the elements. Another technician had been by recently and put a temporary cover over the terminals to protect them, however this had to have been done after my terminal had already degraded. A simple rewiring job fixed the issue but the pit still remains uncovered although, hopefully, the terminals are now protected from the elements so it won’t happen again in the future.
The issue here is that I know this isn’t exactly uncommon, as I’ve passed multiple pits in my travels around Canberra that are in a similar state. Getting speeds higher than what I get right now would require a lot of remediation to the copper network and nowhere in the government’s NBN plan does it stipulate that happening. This makes their promise of getting higher speeds to everyone cheaper and faster ring hollow, as the infrastructure they’re relying on simply isn’t capable of delivering the required outcomes. I could go on but I feel like I’ve said my piece about this a dozen times over already. I just wanted to highlight the amount of rigmarole I had to go through to get a single connection fixed which, when multiplied by an entire nation, shows how infeasible a FTTN NBN really is.
Growing up in a rural area meant that my Internet experience was always going to be below that of my city living counterparts. This wasn’t much of an issue for a while, as dial-up was pretty much all you could hope for anywhere in Australia, however the advent of broadband changed this significantly. From then on the disparity in Internet accessibility was pretty clear and the gap only grew as time went on. This didn’t seem to change much after I moved into the city either, as I always seemed to end up in places that connected at speeds far below the advertised maximum that our current generation of ADSL lines was capable of. Worse still, they almost always seemed to be at the mercy of the weather, with adverse conditions dropping speeds or disconnecting us from the Internet completely.
My current place of residence never got great speeds, topping out at 6Mbps and only managing to sustain that connection for a couple of hours before falling over. I can expect a pretty stable 4Mbps connection most of the time, however the last few days have seen Canberra get a decent amount of rain and the speeds I was able to get barely tickled 1Mbps, no matter how many times I reconnected, reset my modem or shouted incoherently at the sky. It was obvious then that my situation was caused by the inclement weather filling my local Telstra pit with water, which sent the signal-to-noise ratio into the ground. Usually this is something I’d just take on the chin but the situation would have been improved by now were it not for the current government.
Prior to the election my area was scheduled to start construction in October last year, however it became one of the areas that disappeared off NBNCo’s deployment map shortly after the Abbott government came into power. This means I’ll instead come under their revised plan to deliver FTTN through VDSL, which has the unfortunate consequence of leaving me on the known-bad infrastructure in my street. So my speeds might improve, but it’d be unlikely I’d get the promised “at least” 20Mbps and I could guarantee that every time it rained I’d be in for another bout of tragic Internet speeds, if I could connect at all.
The big issue with the Liberals’ NBN plan is that my situation is by no means unique; indeed it’s quite typical thanks to the aging infrastructure that’s commonplace throughout much of Australia. The only place I know of that gets the advertised speeds for its cable run is my parents’ place, and they still live in a rural area. The reason for this is that the copper out there is new and quite capable of carrying the higher speeds. My infrastructure on the other hand, in a place where you’d expect it to be regularly maintained, doesn’t hold a candle to theirs and will continue to suffer from issues after we get “upgraded”.
A full FTTP NBN on the other hand would eliminate these issues, providing ubiquitous access that’s, above all, dependable and reliable. The copper last mile run that the majority of Australia will end up using as part of the Liberals’ NBN just can’t provide that, not without significant remediation which neither Telstra nor the government has any interest in doing. Hopefully the Liberal government wakes up and realises this before we get too far down the FTTN hole, as it’s been shown that the majority of Australians want the FTTP NBN and are more than willing to pay for it.
Convincing the wider tech community that the FTTN NBN is a bad idea isn’t exactly a hard task, as anyone who’s worked in technology understands the fundamental benefits of a primarily fibre network over one that’s copper. Indeed even non-technical users of Australia’s current broadband network are predominantly in favour of the fully fibre solution, knowing that it will lead to a better, more reliable service than anything the copper network can deliver. There was a glimmer of hope back in September when Turnbull commissioned NBNCo to do a full report on the current rollout and how it would compare to his FTTN solution, however his reaction to a recent NBNCo report seems to show otherwise.
The document in question is a report that NBNCo prepared during the caretaker period that all government departments enter prior to an election. The content of the document has been rather devastating to the Coalition’s stance that FTTN can be delivered faster and cheaper with NBNCo stating in no uncertain terms that they would not be able to meet the deadlines promised before the election. Additionally many of the fundamental problems with the FTTN solution were also highlighted which should be a very clear signal to Turnbull that his solution is simply not tenable, at least in its current form.
Turnbull has done as much as he can to discredit this report, taking the stance that it’s heavily outdated, having been written over 6 months ago. This is clearly not the case as there’s ample evidence it was written recently, even if it was during the caretaker period (where, you could potentially argue, NBNCo was still under the influence of Labor). In all honesty though the time at which it was written is largely irrelevant, as the criticisms it contains have been echoed by myself and other IT pundits for as long as the Coalition has spruiked their FTTN policy.
Worse still the official NBNCo report, which Turnbull has previously stated he’ll bind himself to, was provided to him almost 2 weeks ago and hasn’t seen the light of day since. It was even brought up during question time at a recent sitting of parliament and Turnbull was defiant in his stance not to release it. We’ll hopefully get some insight into what the report actually contains tomorrow, as a redacted version will be made available to some journalists. For someone who wanted a lot more transparency from NBNCo he’s being awfully hypocritical, since if he was right about FTTN being cheaper and faster to implement the report would have supported that view. The good money, then, is on the report being far more damning of the Coalition’s policy than Turnbull had hoped.
If Turnbull wants to keep any shred of credibility with technically inclined voters he’s going to have to fess up sooner or later that the Coalition’s policy was a non-starter and that pursuing the FTTP solution is the right way to go. Heck, he doesn’t even have to do the former if he doesn’t want to, but putting his stamp on the FTTP NBN would go a long way towards undoing the damage to his reputation as the head of technology for Australia. I guess we’ll know more about why he’s acting the way he is tomorrow.
Whilst I’m not the jet-bound workaholic I thought I’d be when I was this age (ah, the naivety of teenagers) I have done my fair share of travel for work. I’ve come to find that I’m not in either of the two extreme camps on it: I’m not particularly averse to it but neither do I look forward to it like many people I’ve met. Indeed many of the exotic places I can say I’ve been to were because of work related travel and they truly are experiences I treasure, but should they have become the norm for me I can see myself swiftly becoming sick of it. New places are always fun to visit but I’ve never been on a work trip that wasn’t primarily about work.
It occurred to me that I’d developed a kind of ritual when it comes to hotel rooms, something that upon reflection hasn’t changed in quite a while. As far as I can tell I developed it back when I was travelling the USA, which I can only assume was because of the multitude of different places we stayed in over the course of the month we spent over there. The reasons for it are simple: I need to know what facilities I have access to and, in the event of their absence, arrange for alternatives. I’m sure this isn’t unique to me either but it was quite interesting to see what habits I had ingrained in myself over the past couple of years.
For instance, and this might be a telltale sign of my generation, the first thing I’ll do is seek out what kind of Internet connection I have at my disposal. For the most part I’m bound for disappointment, as is the case with my current accommodation ($10 for 24 hours, 700MB limit), but the process of discovering what I’ve got to work with can be quite fun. If I’m in a particularly vindictive mood I’ll bust out my network scanner tools and see how well their Internet access scheme has been set up (which, if you’re wondering, hotels seem to be getting better at), but for travel in Australia I’ll usually just tether to my phone.
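That poking around is usually nothing more sophisticated than seeing what the hotel’s gateway exposes to guests; something like the sketch below, where the gateway address is an assumption (the real one comes from your DHCP lease):

```powershell
# A hypothetical version of the hotel wifi poke-around: see which common
# management ports the gateway leaves open to guests. The address is a
# placeholder; the real gateway comes from your DHCP lease.
$gateway = '192.168.1.1'
foreach ($port in 22, 23, 80, 443, 8080) {
    $test = Test-NetConnection -ComputerName $gateway -Port $port -WarningAction SilentlyContinue
    '{0,5}: {1}' -f $port, $(if ($test.TcpTestSucceeded) { 'open' } else { 'closed' })
}
```

If the management interface answers on the guest network, that tells you everything you need to know about how much care went into the setup.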
The next one, which is something of a guilty pleasure of mine, is to crawl through the various pay TV channels to see if they have any of my favourites on them. If Discovery is on there then I’m guaranteed to binge on it for at least an hour each night, usually at the cost of a decent night’s sleep. It gets even worse when you consider just how bad most of the programming on there is and how much of it is continuous repeats, but for some reason when I’m in a hotel room that’s one of my top things to do.
I also have to inspect the bed to see if I’ve ended up with a proper bed or the notorious faux-queen (as pictured above). My fellow giants will understand just how irritating those kinds of beds are, especially if they’re paired with an equally tragic mattress.
I think this whole thing just caught me off guard because I’d never really thought of it as something I had to do after every check in, but thinking back on all my stays the first hour or so in the room is almost always spent methodically going through each of those items. Is this something that you do? (Please say yes, I don’t need another thing that I might be potentially OCD about.)
Do you remember the last time the Clean Feed hit the Australian news? I most certainly don’t but luckily I blogged about it every time it happened and the last time it crossed my path was over 2 years ago when some Australian ISPs decided to voluntarily block 500 sites. Suffice to say the No Clean Feed movement, something which I was an active part of, was completely successful and we haven’t had to speak of it again. Indeed I thought that any modern society looking to implement something like Australia’s Internet Filter would see just how politically toxic it was and then think twice about it.
Turns out I was wrong.
David Cameron, Prime Minister of the United Kingdom, has announced a policy that looks eerily similar to the Clean Feed policy that Senator Conroy introduced all those years ago. Essentially it’s a pornography filter and, while at first glance it looks like it might be opt-in, it’s in fact going to be the dreaded opt-out, meaning that every Internet user in the UK will have their connection filtered unless they ask nicely for their ISP to stop. The rhetoric surrounding the policy is also eerily similar to the Clean Feed’s, with a heavy focus on the impacts on children and attempting to curb child pornography. If I didn’t know any better I’d say they’d straight up copied everything about the Clean Feed and simply changed a few words here and there to make it their own. Predictably the Internet is in an uproar about this and the policy is getting all the scrutiny it deserves.
Cameron thinks that his filter will be infallible (gosh, where have I heard that before) and that “it should not be the case that technically literate children can just flick the filters off at the click of a mouse without anyone knowing”. Forgetting for a second that most parents aren’t exactly technically inclined, it wouldn’t take a child genius to work out that a proxy site like HideMyAss is all that’s required to bypass a filter like this. Sure, you could then block those sites but, hang on a second, they’re legitimate services with completely legal use cases. So you either resign yourself to having an ineffectual filter or you go down the rather ugly path of making anything that can bypass it illegal, something which I’m sure a lot of businesses would have something to say about.
Had Cameron done a little bit of homework he would have found out that he could win the same number of votes without alienating the tech community by saying that the filter would be opt-in. I’ve said many times in the past that I support such a policy because it gives concerned parents an easy option whilst leaving the majority of Internet users untouched. It’s also better for the ISPs as they can plan a filtering solution based on a minority of their users, rather than having to scale up a solution that has to support their entire user base. For some reason though the default position for policies like this seems to be always-on and anything else is seen as a weak compromise. Funnily enough the thing that would supposedly make such a system more effective will end up killing it in the end, even if Cameron doesn’t see it now.
So, people of the UK, it’s now time for you to do what us Australians did and rally together to fight Cameron’s filter policy. I’m not saying it’s going to be easy, nor without significant effort, but after 3 years we managed to kill our Clean Feed policy for good and made talk of it so politically toxic that neither party dares mention it again. You’ll now have to do the same: contact members of parliament, stage demonstrations and, most important of all, don’t let up until they drop this policy in favour of the next vote winning scheme.
We’ve got your back, fellow members of the Commonwealth.
Just outside the Googleplex in Mountain View, California there’s a small facility that was the birthplace of many of the revolutionary technologies Google is known for today. It’s called Google[x] and is akin to the giant research and development labs of corporations in ages past, where no idea is off limits. It’s spawned some of the most amazing projects that Google has made public, including the Driverless Car and Project Glass. These are only a handful of the projects currently under development at the lab, with the vast majority of them remaining secret until they’re ready for release into the world. One more of those projects has just reached that milestone and it’s called Project Loon.
The idea is incredibly simple: provide Internet access to everyone regardless of their location. How they’re going about it however is the genius part: they’re using a system of high altitude balloons and base relay stations, with each balloon able to cover a 40km area. For countries that don’t have the resources to lay the cables required to provide Internet this is a really easy way of covering large areas, and it even makes providing Internet to otherwise inaccessible regions possible.
What’s really amazing however is how they’re going about solving some of the issues you run into when you’re using balloons as your transportation system.
The height they fly at is around the bottom end of the range for your typical weather balloon (they can be found from 18km all the way up to 38km) and is about half the height from which Felix Baumgartner made his high altitude jump last year. I wasn’t aware that different layers of the stratosphere have different wind directions, and making use of them to keep the balloons in position is just an awesome piece of engineering. Of course this would all be for naught if the Internet service they delivered wasn’t anything above what’s available now with satellite broadband, but it seems they’ve got that covered too.
The Loon stations use the 2.4GHz and 5.8GHz frequencies for communications with ground receivers and base stations and are capable of delivering speeds comparable to 3G (~2Mbps or so). Now if I’m honest the choice to use these unlicensed bands seems like a bit of a gamble: whilst they’re free to use they’re also already quite congested. I guess this is less of a problem in the places Loon is primarily aimed at, namely regional and remote areas, but even those places have microwaves and personal wifi networks. It’s not an insurmountable problem of course, and I’m sure the way-smarter-than-me people at Google[x] have already thought of it; it’s just an issue for anything that tries to use that same frequency space.
I might never end up being a user of this particular project but as someone who lived on the end of a 56K line for the majority of his life I can tell you how exciting this is for people living outside broadband enabled areas. According to Google it’s launching this month in New Zealand to a bunch of pilot users so it won’t be long before we see how this technology works in the real world. From there I’m keen to see where they take it next as there’s a lot of developing countries where this technology could make some really big waves.
The Internet situation I have at home is what I’d call workable but far from ideal. I’m an ADSL2+ subscriber, a technology that will give you speeds up to 25Mbps should you be really close to the exchange, on good copper and (this is key) make the appropriate sacrifices to your last mile providers. Whilst my line of sight distance to the exchange promises speeds in the 15Mbps range I’m lucky to see about 40% of that, with my sync speed usually hovering around the 4~5Mbps range. For a lot of things this is quite usable, indeed as someone who had dial-up for most of his life these speeds are still something I’m thankful for, but it’s becoming increasingly obvious that my reach far exceeds my grasp, which for a technology centric person is fast becoming an untenable position.
Honestly I don’t think about it too much as it’s not like it’s a recent realisation and, since the difference between the best and worst speeds I’ve had wasn’t that great in retrospect, I’ve developed a lot of habits to cope with it. Most of these involve running things over longer periods when I wouldn’t be using the Internet anyway, but not all tasks fit nicely into that solution. Indeed last night, when I wanted to add a video I’d recorded to my post, one that was only ~180MB in size, I knew there was going to be a pretty long delay in getting the post online. The total upload time was around 30 minutes in the end, which is just enough time for me to get distracted with other things and completely forget about what I was doing until later that night.
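The maths behind that wait is depressingly simple; assuming the sub-1Mbps of usable upstream that’s typical of a struggling ADSL2+ line (my actual sync rate at the time is a guess here), the numbers work out almost exactly:

```powershell
# Back-of-the-envelope upload time, assuming ~0.8Mbps of usable upstream
$fileMB     = 180
$uploadMbps = 0.8                              # assumed real-world upstream
$minutes    = ($fileMB * 8) / $uploadMbps / 60 # megabytes -> megabits -> minutes
"{0:N0} minutes" -f $minutes                   # => 30 minutes
```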
Sure it’s not an amazing example of why I need faster Internet but it does highlight the issue. The video wasn’t particularly large nor super high resolution (720p, 60fps), it was produced on technology that’s over 2 years old and uploaded to a service that’s been around for 7 years. The bottleneck in that equation is the connection that all of them share from my home network, something which hasn’t changed that much in the last decade that I’ve been a broadband Internet user.
For me it’s even worse when I run up against the limitations of my paltry connection for things like services I’d like to host myself. In its infancy this blog was hosted from my little server at home but it quickly became apparent that little things like pictures were simply untenable, because they’d take forever to load even if I shrunk them down to near unusable sizes. It became even worse when I started looking into using the point-to-site VPN feature in Azure to connect a small home environment to the virtual machines I’m running in the cloud, as my tiny connection simply wasn’t enough to handle the kind of traffic it would produce. That might not sound like a big deal but for any startup in Australia thinking about doing something similar it kills the idea of using the service in that fashion, which puts a lot of pressure on their remaining runway.
It’s reasons like this which keep me highly skeptical of the Liberals’ plan for the NBN, as the speeds they’re aspiring to aren’t that dissimilar to what I’m supposed to be getting now. Indeed they can’t even really guarantee those speeds, thanks to their reliance on the woefully inadequate copper network for the last run in their FTTN plan. Canberra residents will be able to tell you how much of a folly their idea is after the debacle that was TransACT (recently bought for $60 million, with its infrastructure then sold for $9 million), which utterly failed to deliver on its promises even when it deployed its own copper infrastructure.
It also doesn’t help that their leader thinks 25Mbps is more than enough for Australian residents which, if true, would mean that ADSL2+ would be enough for everyone, including businesses. We IT admins have known that this isn’t the case for a while, especially considering how rare it is to actually get those speeds, and the Liberals’ reliance on the primary limiting factor (Telstra’s copper network) for their NBN plan effectively ensures this will continue for the foreseeable future.
All those points pale in comparison to the one key factor: we will need to go full fibre eventually.
The copper we have deployed in Australia has a hard upper limit on the amount of bandwidth it can carry, one that we’re already running up against today. It can be improved through remediation, by installing thicker cables, but that’s a pretty expensive endeavour, especially when you take into account the additional infrastructure required to support the faster speeds. Since there’s no plan to do such remediation on the scale required (either by Telstra or as part of the Liberals’ NBN plan) these limitations will remain in place. Fibre on the other hand doesn’t suffer from the same issues, with the new cables able to carry several orders of magnitude more bandwidth just with today’s technology. Deploying it isn’t cheap, as we already know, but it will pay for itself well before it reaches the end of its useful life.
My whinging is slightly moot because I’ll probably be one of the lucky ones to have fibre rolled out to my neighbourhood before the election, but I do feel the NBN’s effectiveness will be drastically decreased if it’s not ubiquitous. It’s one of the few multi-term policies that will have real, tangible benefits for all Australians and messing with it will turn it from a grand project into a pointless exercise. I hope the Liberals’ policy really is just so much hot air to placate their base, because otherwise the Internet future of Australia will be incredibly dim and that’s not something that I, or any user of technology, want for this country.
After spending a week deep in the bowels of Microsoft’s premier tech conference and writing about it breathlessly for Lifehacker Australia you’d be forgiven for thinking I’m something of a Microsoft shill. It’s true that I think the direction they’re going in for their infrastructure products is pretty spectacular and my excitement for those developments is genuine. However if you’ve been here for a while you’ll know that I’m also among their harshest critics, especially when they do something that’s drastically out of line with my expectations as one of their consumers. Still, I believe in giving credit where it’s due and a recent PA Report article has brought Microsoft’s credentials in one area into question when they honestly shouldn’t be.
The article I’m referring to is this one:
I’m worried that there are going to be a few million consoles trying to dial into the home servers on Christmas morning, about the time when a mass of people begin to download new games through Microsoft’s servers. Remember, every game will be available digitally day and date of the retail version, so you’re going to see a spike in the number of people who buy their Xbox One games online.
I’m worried about what happens when that new Halo or Call of Duty is released and the system is stressed well above normal operating conditions. If their system falls, no matter how good our Internet connections, we won’t be able to play games.
Taken at face value this appears to be a fair comment. We can all remember times when the Xbox Live service came down in a screaming heap, usually around Christmas time or when a large release happened. Indeed even a quick Google search reveals there have been a couple of outages in recent memory, although digging deeper into them reveals they were usually part of routine maintenance and only affected small groups of people at a time. With all the other criticism being levelled at Microsoft of late (most of which I believe is completely valid) it’s not unreasonable to question their ability to keep a service of this scale running.
However as the title of this post alludes to I don’t think that’s going to be an issue.
The picture shown above is from the Windows Azure Internals session by Mark Russinovich which I attended last week at TechEd North America. It details the current infrastructure that underpins the Windows Azure platform which powers all of Microsoft’s sites including the Xbox Live service. If you have a look at the rest of the slides from the presentation you’ll see how far that architecture has come since they first introduced it 5 years ago when the over-subscription rates were much, much higher for the entire Azure stack. What this meant was that when something big happened the network simply couldn’t handle it and caved under the pressure. With this current generation of the Azure infrastructure however it’s far less oversubscribed and has several orders of magnitude more servers behind it. With that in mind it’s far less likely that Microsoft will struggle to service large spikes like they have done in the past as the capacity they have on tap is just phenomenal.
Of course this doesn’t alleviate the issues with the always/often-on DRM or the myriad other issues people are criticizing the Xbox One for, but it should show you that worrying about Microsoft’s ability to run a reliable service shouldn’t be one of them. Of course I’m just approaching this from an infrastructure point of view and it’s entirely possible for the Xbox Live system to have some systemic issue that will cause it to fail no matter how much hardware they throw at it. I’m not too concerned about that however, as Microsoft isn’t your run of the mill startup that’s just learning how to scale.
I guess we’ll just have to wait and see how right or wrong I am.