Whilst I’m not the jet-bound workaholic I thought I’d be when I was this age (ah, the naivety of teenagers) I have done my fair share of travel for work. I’ve come to find that I don’t fall into either of the two extreme camps on it: I’m not particularly averse to it, but neither do I look forward to it like many people I’ve met. Indeed many of the exotic places I can say I’ve been to were because of work related travel, and they truly are experiences that I treasure, but had they become the norm for me I can see myself swiftly becoming sick of it. New places are always fun to visit but I’ve never been on a work trip that wasn’t primarily about work.
It occurred to me that I’d developed a kind of ritual when it comes to hotel rooms, something that upon reflection hasn’t changed in quite a while. As far as I can tell I developed it back when I was travelling the USA, which I can only assume was because of the multitude of different places we stayed in over the course of the month we spent over there. The reasons for it are simple: I need to know what facilities I have access to and, in their absence, arrange for alternatives. I’m sure this isn’t unique to me either but it was quite interesting to see what habits I had ingrained in myself over the past couple of years.
For instance, and this might be a telltale sign of my generation, the first thing I’ll do is seek out what kind of Internet connection I have at my disposal. For the most part I’m bound for disappointment, as is the case with my current accommodation ($10 for 24 hours, 700MB limit), but the process of discovering what I’ve got to work with can be quite fun. If I’m in a particularly vindictive mood I’ll bust out my network scanner tools and see how well their Internet access scheme has been set up (which, if you’re wondering, hotels seem to be getting better at) but for travel in Australia I’ll usually just tether to my phone.
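For the curious, that kind of poking around doesn’t need anything fancy; a few lines of Python will tell you which common services a hotel’s gateway exposes. This is just an illustrative sketch, mind you: the gateway address and port list below are assumptions for the example, not anything from a real hotel network.

```python
import socket

# Hypothetical gateway address; hotel networks commonly hand out
# addresses in the 192.168.x.x or 10.x.x.x private ranges.
GATEWAY = "192.168.1.1"

def open_ports(host: str, ports: list[int], timeout: float = 0.5) -> list[int]:
    """Return the subset of `ports` accepting TCP connections on `host`."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            # connect_ex returns 0 on success instead of raising
            if sock.connect_ex((host, port)) == 0:
                found.append(port)
    return found

# Services a captive-portal gateway might plausibly expose.
print(open_ports(GATEWAY, [80, 443, 8080]))
```

It’s no substitute for a proper scanner, but it’s enough to get a feel for how locked down (or not) the network is.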
The next one, which is something of a guilty pleasure of mine, is to crawl through the various pay TV channels to see if they have any of my favorites on them. If Discovery is on there then I’m guaranteed to binge on it for at least an hour each night, usually at the cost of a decent night’s sleep. It gets even worse when you consider just how bad most of the programming on there is, and how much of it is continuous repeats, but for some reason when I’m in a hotel room that’s one of my top things to do.
I also have to inspect the bed to see if I’ve ended up with a proper bed or the notorious faux-queen (as pictured above). My fellow giants will understand just how irritating those kinds of beds are, especially if they’re paired with an equally tragic mattress.
I think this whole thing just caught me off guard because I’d never really thought of it as something I had to do after every check in, but thinking back to all my stays the first hour or so in the room is almost always spent methodically going through each of those items. Is this something that you do? (please say yes, I don’t need another thing that I might potentially be OCD about).
Do you remember the last time the Clean Feed hit the Australian news? I most certainly don’t but luckily I blogged about it every time it happened and the last time it crossed my path was over 2 years ago when some Australian ISPs decided to voluntarily block 500 sites. Suffice to say the No Clean Feed movement, something which I was an active part of, was completely successful and we haven’t had to speak of it again. Indeed I thought that any modern society looking to implement something like Australia’s Internet Filter would see just how politically toxic it was and then think twice about it.
Turns out I was wrong.
David Cameron, Prime Minister of the United Kingdom, has announced a policy that looks eerily similar to the Clean Feed policy that Senator Conroy introduced all those years ago. Essentially it’s a pornography filter and, while at first glance it looks like it might be opt-in, it’s in fact going to be the dreaded opt-out, meaning that every Internet user in the UK will have their connection filtered unless they ask their ISP nicely to stop. The rhetoric surrounding the policy is also eerily similar to the Clean Feed’s, with a heavy focus on the impact on children and on curbing child pornography. If I didn’t know any better I’d say they’d straight up copied everything about the Clean Feed and simply changed a few words here and there to make it their own. Predictably the Internet is in an uproar about this and the policy is getting all the scrutiny it deserves.
Cameron thinks that his filter will be infallible (gosh, where have I heard that before) and that “it should not be the case that technically literate children can just flick the filters off at the click of a mouse without anyone knowing”. Now, forgetting for a second that most parents aren’t exactly technically inclined, it wouldn’t take a child genius to work out that a proxy site like HideMyAss is all that’s required to bypass a filter like that. Sure you could then block proxy and VPN sites like it but, hang on a second, they’re legitimate sites with completely legal use cases. So you either resign yourself to having an ineffectual filter or you go down the rather ugly path of making anything that can bypass it illegal, something which I’m sure a lot of businesses would have something to say about.
Had Cameron done a little bit of homework he would have found he could win the same number of votes without alienating the tech community by making the filter opt-in. I’ve said many times in the past that I support such a policy because it gives concerned parents an easy option whilst leaving the majority of Internet users untouched. It’s also better for the ISPs as they can plan a filtering solution based on a minority of their users, rather than having to scale up a solution that supports their entire user base. For some reason though the default position for policies like this always seems to be always-on, and anything else is seen as a weak compromise. Funnily enough the very thing that supposedly makes such a system more effective will be what kills it in the end, even if Cameron doesn’t see it now.
So, people of the UK, it’s now time for you to do what us Australians did and rally together to fight Cameron’s filter policy. I’m not saying it’s going to be easy, nor free of significant effort, but after 3 years we managed to kill our Clean Feed policy for good and made talk of it so politically toxic that neither party dares mention it again. You’ll now have to do the same: contacting members of parliament, staging demonstrations and, most important of all, not letting up until they drop this policy in favor of the next vote-winning scheme.
We’ve got your back, fellow members of the Commonwealth.
Just outside the Googleplex in Mountain View, California there’s a small facility that was the birthplace of many of the revolutionary technologies that Google is known for today. It’s called Google [x] and it’s akin to the giant research and development labs of corporations in ages past, where no idea is off limits. It’s spawned some of the most amazing projects that Google has made public, including the Driverless Car and Project Glass. These are only a handful of the projects currently under development at the lab, however, with the vast majority of them remaining secret until they’re ready for release into the world. One more of their projects has just reached that milestone and it’s called Project Loon.
The idea is incredibly simple: provide Internet access to everyone regardless of their location. How they’re going about it however is the genius part: they’re going to use a system of high altitude balloons and base relay stations, with each balloon able to cover a ground area roughly 40KM in diameter. For countries that don’t have the resources to lay the cables required to provide Internet this is a really easy solution for covering large areas, and it even makes providing Internet possible in regions that would otherwise be inaccessible.
What’s really amazing however is how they’re going about solving some of the issues you run into when you’re using balloons as your transportation system:
The height they fly at is around the bottom end of the range for your typical weather balloon (they can be found from 18KM all the way up to 38KM) and is about half the height from which Felix Baumgartner made his high altitude jump last year. I wasn’t aware that different layers of the stratosphere have different wind directions, and making use of them to keep the balloons in position is just an awesome piece of engineering. Of course this would all be for naught if the Internet service they delivered wasn’t any better than what’s available now with satellite broadband, but it seems they’ve got that covered too.
The Loon stations use the 2.4GHz and 5.8GHz frequencies for communications with ground receivers and base stations and are capable of delivering speeds comparable to 3G (~2Mbps or so). Now if I’m honest the choice to use these public signal spaces seems like a bit of a gamble as, whilst they’re free to use, they’re also already quite congested. I guess this is less of a problem in the places Loon is primarily aimed at, namely regional and remote areas, but even those places have microwaves and personal wifi networks. It’s not an insurmountable problem of course, and I’m sure the way-smarter-than-me people at Google[x] have already thought of it; it’s just an issue for anything that tries to use that same frequency space.
I might never end up being a user of this particular project but as someone who lived on the end of a 56K line for the majority of his life I can tell you how exciting this is for people living outside broadband enabled areas. According to Google it’s launching this month in New Zealand to a bunch of pilot users so it won’t be long before we see how this technology works in the real world. From there I’m keen to see where they take it next as there’s a lot of developing countries where this technology could make some really big waves.
The Internet situation I have at home is what I’d call workable but far from ideal. I’m an ADSL2+ subscriber, a technology that will give you speeds up to 24Mbps should you be really close to the exchange, on good copper and (this is key) make the appropriate sacrifices to your last mile providers. Whilst my line of sight distance to the exchange promises speeds in the 15Mbps range I’m lucky to see about 40% of that, with my sync speed usually hovering around the 4~5Mbps range. For a lot of things this is quite usable, indeed as someone who had dial-up for most of his life these speeds are still something I’m thankful for, but it’s becoming increasingly obvious that my reach far exceeds my grasp, something which, as a technology centric person, is fast becoming an untenable position.
Honestly I don’t think about it too much as it’s not a recent realisation and, since the difference between the best and worst speeds I’ve had wasn’t that great in retrospect, I’ve developed a lot of habits to cope with it. Most of these involve running things over longer periods when I wouldn’t be using the Internet anyway, but not all tasks fit nicely into that solution. Indeed last night, when I wanted to add a video I’d recorded to my post, one that was only ~180MB in size, I knew there was going to be a pretty long delay in getting the post online. The total upload time was around 30 minutes in the end, which is just enough time for me to get distracted with other things and completely forget about what I was doing until later that night.
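To put some numbers on that, a quick back-of-the-envelope calculation shows why the upload dragged on, assuming an ADSL2+ upload sync of around 1Mbps (an assumption on my part; actual upload rates vary line by line):

```python
def transfer_minutes(size_megabytes: float, speed_mbps: float) -> float:
    """Rough transfer time: convert megabytes to megabits, divide by link rate."""
    megabits = size_megabytes * 8
    seconds = megabits / speed_mbps
    return seconds / 60

# A ~180MB video over a ~1Mbps uplink: about 24 minutes of raw transfer,
# before protocol overhead pushes it towards the half-hour mark.
print(f"{transfer_minutes(180, 1.0):.0f} minutes")
```

The same file on a 20Mbps fibre uplink would take a little over a minute, which is really the whole argument in miniature.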
Sure it’s not an amazing example of why I need faster Internet but it does highlight the issue. The video wasn’t particularly large nor super high resolution (720p, 60fps), it was produced on technology that’s over 2 years old and uploaded to a service that’s been around for 7 years. The bottleneck in that equation is the connection that all of them share from my home network, something which hasn’t changed that much in the last decade that I’ve been a broadband Internet user.
For me it’s even worse when I run up against the limitations of my paltry connection for things like services I’d like to host myself. In its infancy this blog was hosted on my little server at home, but it quickly became apparent that little things like pictures were simply untenable as they’d take forever to load even if I shrunk them down to near unusable sizes. It became even worse when I started looking into using the point to point VPN feature in Azure for connecting a small home environment to the virtual machines I’m running in the cloud, as my tiny connection was simply not enough to handle the kind of traffic it would produce. That might not sound like a big deal, but for any startup in Australia thinking about doing something similar it kills the idea of using the service in that fashion, which puts a lot of pressure on their remaining runway.
It’s reasons like this which keep me highly skeptical of the Liberals’ plan for the NBN, as the speeds they’re aspiring towards aren’t that dissimilar to what I’m supposed to be getting now. Indeed they can’t even really guarantee those speeds thanks to their reliance on the woefully inadequate copper network for the last run in their FTTN plan. Canberra residents will be able to tell you how much of a folly their idea is after the debacle that was TransACT (recently bought for $60 million, with its infrastructure then sold for $9 million), which utterly failed to deliver on its promises even when it deployed its own copper infrastructure.
It also doesn’t help that their leader thinks that 25Mbps is more than enough for Australian residents which, if true, would mean that ADSL2+ would be enough for everyone, including businesses. Us IT admins have known this hasn’t been the case for a while, especially considering how rare it is to actually get those speeds, and the Liberals’ reliance on the primary limiting factor (Telstra’s copper network) effectively ensures that this will continue for the foreseeable future.
All those points pale in comparison to the one key factor: we will need to go full fibre eventually.
The copper we have deployed in Australia has a hard upper limit on the amount of bandwidth it can carry, one that we’re already running up against today. It can be improved through remediation, by installing thicker cables, but that’s a pretty expensive endeavour, especially when you take into account the additional infrastructure required to support the faster speeds. Since there’s no plan to do such remediation on the scale required (either by Telstra or as part of the Liberals’ NBN plan) these current limitations will remain in place. Fibre on the other hand doesn’t suffer from the same issues, with the new cables able to carry several orders of magnitude more bandwidth just with today’s technology. The cost of deploying it isn’t cheap, as we already know, but it will pay for itself well before it reaches the end of its useful life.
My whinging is slightly moot because I’ll probably be one of the lucky ones to have fibre rolled out to my neighbourhood before the election, but I do feel the NBN’s effectiveness will be drastically decreased if it’s not ubiquitous. It’s one of the few multi-term policies that will have real, tangible benefits for all Australians, and messing with it will turn it from a grand project into a pointless exercise. I hope the Liberals’ policy really is just that much hot air to placate their base, because otherwise the Internet future of Australia will be incredibly dim and that’s not something that I, or any user of technology, want for this country.
After spending a week deep in the bowels of Microsoft’s premier tech conference and writing about it breathlessly for Lifehacker Australia you’d be forgiven for thinking I’m something of a Microsoft shill. It’s true that I think the direction they’re going in with their infrastructure products is pretty spectacular, and my excitement for those developments is genuine. However if you’ve been here for a while you’ll know that I’m also among their harshest critics, especially when they do something that’s drastically out of line with my expectations as one of their consumers. I believe in giving credit where it’s due though, and a recent PA Report article has brought Microsoft’s credentials in one area into question when they honestly shouldn’t be.
The article I’m referring to is this one:
I’m worried that there are going to be a few million consoles trying to dial into the home servers on Christmas morning, about the time when a mass of people begin to download new games through Microsoft’s servers. Remember, every game will be available digitally day and date of the retail version, so you’re going to see a spike in the number of people who buy their Xbox One games online.
I’m worried about what happens when that new Halo or Call of Duty is released and the system is stressed well above normal operating conditions. If their system falls, no matter how good our Internet connections, we won’t be able to play games.
Taken at face value this appears to be a fair comment. We can all remember times when the Xbox Live service came down in a screaming heap, usually around Christmas time or when a large release happened. Indeed a quick Google search reveals there have been a couple of outages in recent memory, although digging deeper into them reveals that they were usually part of routine maintenance and only affected small groups of people at a time. With all the other criticism being levelled at Microsoft of late (most of which I believe is completely valid) it’s not unreasonable to question their ability to keep a service of this scale running.
However as the title of this post alludes to I don’t think that’s going to be an issue.
The picture shown above is from the Windows Azure Internals session by Mark Russinovich, which I attended last week at TechEd North America. It details the current infrastructure that underpins the Windows Azure platform, which powers all of Microsoft’s sites including the Xbox Live service. If you have a look at the rest of the slides from the presentation you’ll see how far that architecture has come since they first introduced it 5 years ago, when the over-subscription rates were much, much higher across the entire Azure stack. What this meant was that when something big happened the network simply couldn’t handle it and caved under the pressure. The current generation of the Azure infrastructure however is far less oversubscribed and has several orders of magnitude more servers behind it. With that in mind it’s far less likely that Microsoft will struggle to service large spikes like they have in the past, as the capacity they have on tap is just phenomenal.
Of course this doesn’t alleviate the issues with the always/often on DRM or the myriad other issues that people are criticizing the Xbox One for, but it should show that worrying about Microsoft’s ability to run a reliable service shouldn’t be one of them. I’m just approaching this from an infrastructure point of view, mind you, and it’s entirely possible for the Xbox Live system to have some systemic issue that will cause it to fail no matter how much hardware they throw at it. I’m not too concerned about that however, as Microsoft isn’t your run of the mill startup that’s just learning how to scale.
I guess we’ll just have to wait and see how right or wrong I am.
Unquestionably the Internet has drastically altered my behaviour in many ways. In the beginning it was merely a curiosity, something I was able to lord over the other kids in the playground because I was one of the precious few that had it, and I’d become instant friends with anyone who wanted to see it. As I grew older and my interests broadened it became my primary resource for finding information, leading me to investigate many wild things that I would not have paid any attention to otherwise. Most recently it became my platform for communicating with the wider world whilst also elevating my career to places that I couldn’t have dreamed of.
In short, I feel the Internet has been good to me.
Looking from the outside in, however, would probably paint a much different picture. My near fanatical obsession with my current object of desire has often led me down destructive paths, one of which was my World of Warcraft addiction, which could not have existed without an Internet connection. My desire for information often leads me down paths that aren’t relevant for anything past satisfying my curiosity, filling my head with facts that I will likely never find a use for. The Internet has also chronicled some of my worst moments and, whilst they’re not exactly common knowledge, they serve as a reminder of the parts of myself that I’m not particularly proud of.
However whilst it would be easy to lay the blame directly on the tool which enabled this behaviour, an easy thing to do given its ease of use and the pseudo-anonymity that enables everyone’s Inner Fuckwad, I can’t say that these same things wouldn’t have happened absent the Internet. There are many, many people who advocate that cutting down on or doing away with the Internet (or anything, realistically) will lead you on to untold benefits, but as a Verge reporter found these effects are usually only temporary:
It’s been a year now since I “surfed the web” or “checked my email” or “liked” anything with a figurative rather than literal thumbs up. I’ve managed to stay disconnected, just like I planned. I’m internet free.
And now I’m supposed to tell you how it solved all my problems. I’m supposed to be enlightened. I’m supposed to be more “real,” now. More perfect.
But instead it’s 8PM and I just woke up. I slept all day, woke with eight voicemails on my phone from friends and coworkers. I went to my coffee shop to consume dinner, the Knicks game, my two newspapers, and a copy of The New Yorker. And now I’m watching Toy Story while I glance occasionally at the blinking cursor in this text document, willing it to write itself, willing it to generate the epiphanies my life has failed to produce.
I’m not going to say that the Internet isn’t an enabler for some particularly bad behaviours, my use of it is a great testament to that, but the issues it causes can always be traced back to the person. For me my WoW addiction was an escape from the crazy world I had put myself into, working 3 different jobs whilst studying full time at university. In my escape I found some control and, unfortunately, also power over other people that was incredibly intoxicating. Only when that power dwindled and I was left with no one else to turn to did I start to realise how destructive it had become, and I ended up leaving that part of me behind for several years.
Maybe we need that time away in order to get clarity on the destructive behaviours that we associate with specific tools. My honeymoon was decidedly devoid of technology, even though I smuggled my laptop along with me, and after the first couple of days of adjustment I felt oddly liberated. Whilst the revelations I came to at that time weren’t about my use of the Internet (indeed this was several years after I dredged myself out of that rather dark place) I certainly felt I had a better understanding of how I interacted with the things I was absent from. Perhaps then, instead of advocating giving something up completely, we should take time constrained breaks, lest we establish the same bad habits using alternative means.
That is definitely something I can attest to, as many of my life changing decisions have been made when I’ve been in situations decidedly different from the norm, not from giving something up completely. Indeed I feel abandoning something completely often means giving up part of yourself, your identity. Of course there are times when this is appropriate, but for something as benign as Internet use I don’t believe that giving it up will solve your problems. Sequestering yourself away from it for a time, however, might just give you the insight needed to rectify the worst parts of it and broaden your perspective on the issues at hand.
Last week I regaled you with a story of the inconsistent nature of Australia’s broadband and how the current NBN was going to solve that through replacing the aging copper network with optical fibre. However whilst the fundamental works to deliver it are underway it is still in its nascent stages and could be easily usurped by a government that didn’t agree with its end goals. With the election looking more and more like it’ll swing towards the coalition’s favour there has been a real risk that the NBN we end up with won’t be the one that we were promised at the start, although the lack of a concrete plan has left me biting my tongue whilst I await the proposal.
Today Malcolm Turnbull announced his NBN plan, and it’s not good at all.
Instead of rolling out fibre to 93% of Australians and covering the rest with satellite and wireless connections, the Liberals’ NBN will only roll fibre out to 22%; the remaining 71% will be covered by FTTN. According to Turnbull’s estimations this will enable all Australians to have broadband speeds of up to 25Mbps by 2016, with a planned upgrade to up to 100Mbps by 2019. The total cost for this plan would be around $29 billion, which is about $15 billion less than the planned total expenditure for Labor’s FTTP NBN. If you’re of the mind that the NBN was going to be a waste of money that’d take too long to implement then these numbers would look great to you, but unfortunately they’re anything but.
For starters the promise of speeds of up to 25Mbps isn’t much of an upgrade over what’s available with the current ADSL2+ infrastructure. Indeed most of the places they’re looking to cover can already get such services, so rigging fibre up to their nodes will likely not net them much benefit. Predominantly this is because the last mile will still be on the copper network, which is the major limiting factor in delivering higher speeds to residential areas. They might be able to roll out FTTN within that time frame but it’s highly unlikely that you’ll see any dramatic speed increases, especially if you’re on an old line.
Under the Liberals’ plan you could, however, pay for the last mile run to your house which, going by estimates from other countries that have done similar, could range anywhere from $2500 to $5000. Now I know a lot of people who would pay for that, indeed I would probably be among them, but I’d much rather it be rolled out to everyone indiscriminately, otherwise we end up in a worse situation than we have now. The idea behind the NBN was ubiquitous access to high speed Internet no matter where you are in Australia, so forcing users to pay for the privilege kind of defeats its whole purpose.
Probably the biggest issue for me though is how the coalition plans to get to 100Mbps without running FTTP. The technologies Turnbull has talked about in the past just won’t be able to deliver the speeds he’s promising. Realistically the only way to reliably attain those speeds across Australia is with an FTTP network, and upgrading an FTTN solution to FTTP will cost somewhere on the order of $21 billion. All added up that makes the Liberals’ NBN almost $5 billion more than the current Labor one, so it’s little wonder they’ve been trying to talk up the cost over the past week or so.
You can have a look at their policy documents here but be warned it’s thin on facts and plays fast and loose with data. I’d do a step by step takedown of all the crazy in there but there are people who are much more qualified than me to do that and I’ll be sure to tweet links when they do.
Suffice to say the policy announcement has done nothing but confirm our worst fears about the Liberal party’s utter lack of understanding of why the FTTP NBN was a good thing for Australia. Their plan might be cheaper but it will fail to deliver the speeds they say it will, and will thus provide a lot less value than the same dollars spent on an FTTP solution. I can only hope come election time we end up with a hung parliament again, because the independents will guarantee that nobody fucks with the FTTP NBN.
The state of broadband Internet in Australia is one of incredible inconsistency. I lived without it for the better part of my youth, being stuck behind a dial up connection because my local exchange simply didn’t have the required number of people interested in getting broadband to warrant any telco installing the required infrastructure there. I was elated when we were provided a directional wireless connection that gave me speeds that were comparable to that of my city dwelling friends but to call it reliable was being kind as strong winds would often see it disconnect at the most inconvenient of times.
The situation didn’t improve much when I moved into the city though as, whilst I was pretty much guaranteed ADSL wherever I lived, the speed at which it was delivered varied drastically. In my first home, which was in an affluent and established suburb, my connection usually capped out at well below half its maximum speed. The second home fared much better despite being about as far away from its closest exchange as the first house was. My current residence is on par with the first, even with the technological jump from ADSL to ADSL2+. As to the reason behind this I can’t be completely sure, but there’s no doubt the aging copper infrastructure is likely to blame.
I say this because my parents, who still live out in the house that I grew up in, were able to acquire an ADSL2+ connection and have been on it for a couple of years. They’re not big Internet users and I’d never really had the need to use it much when I’m out there visiting, but downloading a file over their connection last week revealed that their speeds were almost triple mine, despite their long line of sight distance to their exchange. Their connection is likely newer than most in Canberra thanks to their rural neighbourhood being a somewhat recent development (~30 years or so). You can then imagine my frustration with the current copper infrastructure, as it simply can not be relied upon to provide consistent speeds, even in places where you’d expect it to be better.
There’s a solution on the horizon however in the form of the National Broadband Network. The current plan of rolling out fibre to 93% of Australian households (commonly referred to as Fibre to the Premises/Home, or FTTP/H) eliminates the traditional instability that plagues the current copper infrastructure, along with providing an order of magnitude higher speeds. Whilst this is all well and good from a consumer perspective it will also have incredible benefits for Australia economically. There’s no denying that the cost is quite high, on the order of $37 billion, but not only will it pay itself back in real terms long before its useful life has elapsed, it will also provide benefits far exceeding that cost shortly after its completion.
Should this year’s election go the way everyone is thinking it will though the glorious NBN future will look decidedly grim if the Coalition has their way with it. They’ve been opponents of it from the get go, criticising it as a wasteful use of government resources. Whilst their plan might not sound that much different on the surface, choosing to only run Fibre to the Node (FTTN) rather than the premises, it is a decidedly inferior solution that will not deliver the same level of benefits as the currently envisioned NBN. The reason behind this is simple: it still uses the same copper infrastructure that has caused so many issues for current broadband users in Australia.
You don't have to look much further than Canberra's own FTTN network, TransACT, to know just how horrific such a solution is. After a decade of lacklustre service, one that provided almost no benefit over ADSL2+, TransACT wrote down their capital investment and sold the network to iiNet. If FTTN can't survive in a region that is arguably one of the most affluent and tech-savvy in Australia then it has absolutely no chance of surviving elsewhere, especially where current ADSL services can still be seen as competitive. You could make the argument that the copper could be upgraded/remediated, but then you're basically building an FTTP solution using copper, so why not just go for optic fibre instead?
What really puts it in perspective is that the International Space Station, you know, that thing whizzing roughly 400 km above Earth at Mach 26, has faster Internet than the average Australian does. Considering your average satellite connection isn't much faster than dial-up, the fact that the ISS can beat the majority of Australians speed-wise shows just how bad staying on copper will be. FTTN won't remedy those last-mile runs where all the attenuation happens, and that means you can't guarantee minimum speeds like you can with FTTP.
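To get a feel for why those last-mile runs matter, here's a minimal sketch of the general shape of the problem. The numbers below are purely illustrative assumptions, not figures from any DSL standard; the point is only that achievable rates over copper fall away quickly with loop length, whereas fibre to the premises sidesteps the attenuation entirely:

```python
# Toy model of why FTTN's leftover copper matters: an illustrative
# rate-vs-distance curve. The 100 Mbps ceiling and the 800 m "halving
# distance" are hypothetical ballpark values chosen for demonstration.

def copper_downstream_mbps(copper_metres: float,
                           max_mbps: float = 100.0,
                           half_distance_m: float = 800.0) -> float:
    """Assume the achievable rate halves for every half_distance_m of copper."""
    return max_mbps * 0.5 ** (copper_metres / half_distance_m)

# FTTP runs fibre all the way, so the copper length is effectively zero.
# FTTN can still leave hundreds of metres of copper past the node.
for metres in (0, 400, 800, 1600, 3200):
    print(f"{metres:>5} m of copper -> ~{copper_downstream_mbps(metres):5.1f} Mbps")
```

Under a model like this the FTTP case (zero metres of copper) always delivers the full rate, which is why minimum speeds can be guaranteed on fibre but not on a node-based rollout where each premises sits at a different distance from the cabinet.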
The NBN represents a great opportunity to turn Australia into a technological leader, transforming us from something of an Internet backwater into a highly interconnected nation with infrastructure that will last us centuries. It will mean far more for Australia than faster-loading web pages, but failing to go the whole way with FTTP will make it an irrelevant boondoggle. Whilst we only have party lines to go on at the moment, with the "fully detailed" plan still forthcoming, it's still safe to say that the Coalition are bad news for it, no matter which angle you view their plan from.
I’ll just put this here, a sunset on Mars as seen by the Curiosity rover:
I had one of those moments watching this video where I just considered the chain of events that led up to me being able to see this. There's a robot on another planet, tens of millions of kilometers away, that's beaming pictures back to Earth. Those pictures were then made available to the public via a vast, interconnected network that spans the entire globe. One person on that network decided to collate them into a video and make that available via said network. I then, using commodity hardware that anyone can purchase, was able to view that video. The chain of events leading up to that point seems so improbable when you look at it as a completed system, but all the pieces exist and are all products of human innovation.
Isn’t that just mind blowingly awesome?
One thing that not many people knew was that I was pretty keen on the whole Google TV idea when it was announced 2 years ago. I think that was partly due to the fact that it was a collaboration between several companies that I admire (Sony, Logitech and, one I didn't know about at the time, Intel) and also because of what it promised to deliver to end users. I was a fairly staunch supporter of it, to the point where I remember arguing with my friends that consumers simply weren't ready for something like it, rather than it being a failed product. In all honesty I can't really support that position any more, and the idea of Google TV seems to be dead in the water for the foreseeable future.
What I didn't know was that whilst Google, Sony and Logitech might have put the idea to one side, Intel has been working on developing their own product along similar lines, albeit from a different angle than you'd expect. Whilst I can't imagine that they invested that much in developing the hardware for the TVs (a quick Google search reveals they were Intel Atoms, something Intel had been developing for 2 years prior to Google TV's release) it appears they're still seeking some return on that initial investment. At the same time, however, reports are coming in that Intel is dropping anywhere from $100 million to $1 billion on developing this new product, a serious amount of coin that industry analysts believe is an order of magnitude above anyone else playing around in this space currently.
The difference between this and other Internet set top boxes appears to be the content deals that Intel is looking to strike with current cable TV providers. Now anyone who's ever looked into getting any kind of pay TV package knows that whatever you sign up for you're going to get a whole bunch of channels you don't want bundled in alongside the ones you do, effectively diluting the value you derive from the service significantly. Pay TV providers have long fought against the idea of allowing people to pick and choose (and indeed anyone who attempted to provide such a service didn't appear to last long, à la SelecTV Australia) but with the success of on-demand services like Netflix and Hulu it's quite possible that they might be coming around to the idea and see Intel as the vector of choice.
The feature list that's been thrown around the press prior to an anticipated announcement at CES next week (which may or may not happen, depending on who you believe) does sound rather impressive, essentially giving you the on-demand access that everyone wants right alongside the traditional programming that we've come to expect from pay TV services. The "Cloud DVR" idea, being able to replay/rewind/fast-forward shows without having to record them yourself, is evidence of this, and providing the traditional channels as well seems to be a clever ploy to get the content onto their network. Of course traditional programming is required for certain things like sports and other live events, something which the on-demand services have yet to fully incorporate into their offerings.
Whilst I'm not entirely enthused with the idea of yet another set top box (I'm already running low on HDMI ports as it is) the information I've been able to dig up on Intel's offering does sound pretty compelling. Of course many of the features aren't exactly new, you can do many of these things now with the right piece of hardware and a pay TV subscription, but the ability to pick and choose channels would be, and a Hulu-esque interface for watching previous episodes would be something that would interest me. If the price point is right, and it's available globally rather than just in the USA, I could see myself trying it out for the select few channels that I'd like to see (along with their giant back catalogues, of course).
In any case it will be very interesting to see if Intel does say anything about their upcoming offering next week: if they do we'll have information direct from the source, and if they don't we'll have a good indication of which analysts really are talking to people involved in the project.