One thing that not many people knew was that I was pretty keen on the whole Google TV idea when it was announced 2 years ago. I think that was partly because it was a collaboration between several companies that I admire (Sony, Logitech and, one I didn't know about at the time, Intel) and also because of what it promised to deliver to end users. I was a fairly staunch supporter of it, to the point where I remember getting into an argument with my friends, insisting that consumers simply weren't ready for something like it rather than it being a failed product. In all honesty I can't really support that position any more and the idea of Google TV seems to be dead in the water for the foreseeable future.
What I didn't know was that whilst Google, Sony and Logitech might have put the idea to one side, Intel has been working on developing their own product along similar lines, albeit from a different angle than you'd expect. Whilst I can't imagine that they invested that much in developing the hardware for the TVs (a quick Google search reveals that they were Intel Atoms, something they had been developing for 2 years prior to Google TV's release) it appears that they're still seeking some return on that initial investment. At the same time, however, reports are coming in that Intel is dropping anywhere from $100 million to $1 billion on developing this new product, a serious amount of coin that industry analysts believe is an order of magnitude above anyone else playing around in this space currently.
The difference between this and other Internet set top boxes appears to be the content deals that Intel is looking to strike with current cable TV providers. Now anyone who's ever looked into getting any kind of pay TV package knows that whatever you sign up for you're going to get a whole bunch of channels you don't want bundled in alongside the ones you do, effectively diluting the value you derive from the service significantly. Pay TV providers have long fought against the idea of allowing people to pick and choose (and indeed anyone who attempted to provide such a service didn't appear to last long, à la SelecTV Australia) but with the success of on demand services like Netflix and Hulu it's quite possible that they might be coming around to the idea and see Intel as the vector of choice.
The feature list that's been thrown around the press prior to an anticipated announcement at CES next week (which may or may not happen, depending on who you believe) does sound rather impressive, essentially giving you the on demand access that everyone wants right alongside the traditional programming that we've come to expect from pay TV services. The "Cloud DVR" idea, being able to replay/rewind/fast-forward shows without having to record them yourself, is evidence of this, and providing the traditional channels as well seems to be a clever ploy to get that content onto their network. Of course traditional programming is required for certain things like sports and other live events, something which the on demand services have yet to fully incorporate into their offerings.
Whilst I'm not entirely enthused with the idea of yet another set top box (I'm already running low on HDMI ports as it is) the information I've been able to dig up on Intel's offering does sound pretty compelling. Of course many of the features aren't exactly new, you can do many of those things now with the right piece of hardware and pay TV subscriptions, but the ability to pick and choose channels would be, and getting that Hulu-esque interface to watch previous episodes would be something that would interest me. If the price point is right, and it's available globally rather than just in the USA, I could see myself trying it out for the select few channels that I'd like to see (along with their giant back catalogues, of course).
In any case it will be very interesting to see if Intel does say anything about their upcoming offering next week. If they do we'll have information direct from the source; if they don't we'll have a good indication of which analysts really are talking to people involved in the project.
I don’t run ads here and there’s a really simple reason for that: I have the luxury of not needing to. This blog is one of my longest running hobbies and whilst the cost to me is non-zero in terms of time and actual cash I’m willing to eat both those costs simply for the love of it. There is a point where I’ve told myself that I’ll start running ads (that’s the point where I can make a living off doing this) but that’s somewhere in the order of 50 times the traffic I’m receiving today. Not an impossible goal really but certainly a long way off from where I currently am.
It’s for that particular reason that I don’t run ad blocking services on my browser. You see for the most part I don’t even really notice the ads unless they start forming obvious patterns or have obnoxious auto-playing music and I figure that as a fellow content creator I understand their reason for being there. Even though I don’t usually click on them I know that the author is getting at least some kind of reward for providing that information for free to me, even if it’s not much. I completely support everyone else’s freedom to block ads as they see fit however as I know that overall they’re in a minority and they won’t be the death of free online content any time soon.
Then I read this article titled "How Much Would You Pay to Never See an Online Ad Again?", thinking that it might be some inventive new start-up idea like Flattr, one that would work with publishers to get rid of advertising on their sites. AdTrap is in fact quite the opposite: a hardware device that sits between your modem and router (it actually necessitates that configuration, which rules out people using integrated devices) and works to remove ads before they reach your browser. Taken at face value the marketing makes it sound like a pretty fantastic device given all the features it's touting (many of which aren't unique to it, simply a product of the way it connects into your existing infrastructure) and it can be yours all for the low price of $120.
Now granted I had some idea in my head of what AdTrap was (courtesy of the title of the article that led me to it) so it's possible some of my angst directed towards this product is born of that, but I'm not totally on board with the idea of paying someone else in order to block ads. It's one thing to provide that kind of technology for free, that's kind of expected on the Internet, but building a business around denying revenue to content creators doesn't sit right with me. I'd be much more on board with being able to pay people directly in order to remove ads, à la Reddit Gold, rather than some 3rd party who isn't really doing anything for the content creators with their product.
In the end I guess it doesn't really matter that much as again the number of users who actually end up buying one of these things will be in the minority and won't have any meaningful impact on revenue. I guess I just take issue with people profiting from such an endeavour as the motives then change from being simply altruistic to maximising their revenue at the cost of others'. I'm not going to go on some crusade to try and take them down however as the market will be the final judge of it and if people want something like this then it was inevitable that it would be created.
The idea of cloud gaming is a seductive one, especially for those of us who lived through the times when upgrading your computer every 12 months was a requirement if you didn't want to be watching a slide show. Abstracting the hardware requirement away from the user and then letting them play on any device above a certain, extremely low threshold would appear to be the solution to the upgrade and availability issues of dedicated gaming platforms. I've long made the case that the end product is something of a niche market, one that I was never quite sure would be viable on a large scale. With the demise of OnLive I could very easily make my point based around that but you can never write off an industry on the failures of the first to market (see Iridium Communications for proof of this).
Providing even a small cloud gaming service requires some rather massive investments in capital expenditure, especially with the hardware that's currently available today. For OnLive this meant that each of their servers could only serve one user at a time, which was terrible from a scalability standpoint as they could never really service that many customers without bleeding money on infrastructure. Cloud gaming services of the future however might be in luck as both NVIDIA and AMD are working on cloud GPUs that will enable them to get much higher densities than the current 1 to 1 ratio. There'll still be an upper limit that's much lower than most cloud services (which typically serve thousands of users per server) but at the very least the scalability problem is now an engineering issue rather than a capital one.
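The economics here can be made concrete with some back-of-envelope arithmetic. The dollar figures and density ratios below are purely illustrative assumptions (not OnLive's or anyone's actual costs), but the sketch shows why moving from a 1:1 to a shared-GPU model changes the picture so dramatically:

```python
# Rough, illustrative comparison of hardware cost per concurrent user
# for a cloud gaming service at different server densities.
# All dollar figures are made-up assumptions for the sake of the sketch.

def monthly_cost_per_user(server_cost_per_month, users_per_server):
    """Hardware cost attributable to each concurrent user."""
    return server_cost_per_month / users_per_server

# A dedicated GPU server at a 1:1 user ratio (the OnLive-era model).
dedicated = monthly_cost_per_user(server_cost_per_month=300, users_per_server=1)

# A hypothetical shared "cloud GPU" box serving, say, 8 users at once.
shared = monthly_cost_per_user(server_cost_per_month=500, users_per_server=8)

print(f"1:1 density: ${dedicated:.2f} per concurrent user / month")
print(f"8:1 density: ${shared:.2f} per concurrent user / month")
```

Even with a pricier shared server, the per-user cost drops by almost an order of magnitude, which is why the cloud GPU work matters so much to the viability question.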
The second major challenge that cloud gaming companies face is how latency sensitive a good portion of the games market is. Whilst you can get down to very low latency numbers with strategically placed servers you're still going to be adding a good chunk of input lag on top of any server latency, which will be unacceptable for a lot of games. Sure there are titles where this won't be an issue but cutting off a large section of the market (FPS, RTS, RPGs and any mix of them in between) further reduces the viability of any potential cloud gaming service.
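To see why the input lag adds up so quickly, it helps to itemise where the milliseconds go. The component figures below are illustrative assumptions rather than measurements of any real service:

```python
# Back-of-envelope input-to-display latency budget for cloud gaming.
# Every figure here is an illustrative assumption, not a measurement.

latency_budget_ms = {
    "input capture + upload": 5,
    "network RTT to server": 30,     # heavily dependent on server placement
    "server render + video encode": 15,
    "client decode + display": 15,
}

total = sum(latency_budget_ms.values())
print(f"Estimated added input lag: {total} ms")
```

Even under these fairly generous assumptions the service adds tens of milliseconds on top of the lag a local machine already has, which is exactly the margin that twitch-sensitive genres can't afford to give away.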
In fact many of the titles that could benefit from a cloud gaming service can already be ported to the web thanks to things like Unity or the use of OpenGL extensions in HTML5. Indeed many of the games that I could see being published on a cloud platform (casual MMORPGs, turn based strategy games, etc.) wouldn't be much different if they were brought to the traditional web instead. Sure you lose some of the platform agnosticism because of this but you can arguably reach the same number of people that way as you could with a cloud platform.
User expectations are also set rather high for cloud services, with many of them being flat fee, unlimited usage scenarios (think Pandora, Netflix, etc). The current business models for cloud gaming didn't gel well with this mindset as you were paying for the games you wanted to play (often cheaper than retail, sometimes not) for a limited period of time, akin to a long term rental. Whilst this works for some people most users will expect to be able to pay a flat fee in order to access a catalogue they can then use at their leisure, and this has significant ramifications for how publishers and developers will license their games to cloud providers. It's not an insurmountable problem (the music industry came around eventually so the games industry can't be far behind) but it does introduce a market dynamic that cloud gaming services have not yet investigated.
With all these things being considered I find it hard to see how cloud gaming services can be viable in the near term as whilst all the issues are solvable they all work against delivering something that can turn a profit. Cloud GPUs, the ever increasing quality of Internet connections and the desire by many to migrate completely to cloud based services do mean that there's a trend towards cloud gaming services becoming viable in the future, however the other, fundamental limitations could see those pressures rendered null and void. This is something I'm willing to be proven wrong on though as I've invested myself heavily in cloud principles and I know that it's capable of great things. Whether cloudifying our gaming experience is one of them is something that I don't believe is currently feasible however, and I don't see that changing for a while.
I've never had much luck with Internet speeds. That's probably because unlike most of my geek brethren I always seemingly forget to check the distance to the nearest exchange from the place I was looking to rent or purchase, something which is now top of my list. Heck even my parents who live in a rural area outside Canberra manage to get better speeds than me thanks to their short distance to their exchange, even though my line of sight distance is almost equal to theirs. It's still worlds away from the dial up that I used to make do with but I know there's a whole other world of faster speeds out there that are just tantalizingly out of reach for me.
At the top of the list is the holy grail of Internet connections in Australia: the National Broadband Network. Whilst it might be in the realms of fairy tales and unicorns for most people in Australia I know a couple of people who've managed to get themselves on the service thanks to being in the right place at the right time. From what they tell me it's everything that it's marketed to be, with extremely fast speeds that aren't dependent on distance from the exchange, the modem you're using or how much your hardware likes you on a particular day. Unfortunately short of moving into a location that has it already (there are quite a few now, but they're still the minority) the wait for it is going to be quite long.
There is one technology that is available today that can deliver some pretty impressive speeds so long as you're within range of a city CBD. The tech I am referring to is, of course, 4G wireless.
Now if you've been here for a while I'd forgive you for thinking that I wasn't a big fan of the whole 4G idea, especially when it's mentioned in the same breath as the NBN. It is true that I believe they're solutions to different problems but just as the underlying technology alludes to (Long Term Evolution, or LTE, if you were wondering) I do believe that it is the future of wireless communications. Unfortunately I don't believe that the wireless network would be capable of supporting all the Internet requirements of Australians even if the specification is theoretically capable of it. It certainly has its place, though.
As part of my new position with Dell I was given a laptop for accessing the corporate network but the site I'm currently attending doesn't have an unfettered connection for me to use in order to do so. Initially I was just tethering to my phone as I have a pretty decent data plan (1.5GB/month) that barely ever gets close to being used and for the most part it worked well. However should I pick up my phone to go somewhere, or if my S2 was having a particularly bad day, I'd lose the connection, dropping anything that required an always-on link (like the VPN). Frustrated I decided to grab myself a wireless broadband dongle and for a cool $130 I got myself a 4G one that had 3GB for the first month.
It's a rather tiny device resembling an overgrown USB stick (and it in fact has a USB stick in it as well for driver installation, pretty neat) so you can imagine I was slightly sceptical about its capability to deliver true 4G speeds with such a small antenna. The signal in the area where I've used it the most isn't particularly fantastic either and I was relegated to the NextG network, which is still not bad by mobile broadband standards. However over the weekend I was up in the middle of Sydney on the 11th floor of a hotel in Darling Harbour and I was privy to full bars of signal strength on the 4G network. So like any self respecting geek I gave it what for.
And boy did it ever deliver.
For regular web browsing the difference wasn't particularly noticeable but I did see something when I opened up Steam on my laptop to try and get a game configured. The download speed I saw was about 2 MB/second and I figured it was just updating from the cache. It in fact wasn't; it was downloading at those blazing speeds right over the wireless broadband. To put that in perspective that kind of speed is about 4 times what I regularly get at home and I wasn't even trying to stress the connection fully. In hindsight I should've done a speed test just to show you what it was theoretically capable of but a simple Steam download test seemed sufficient to prove its value.
Unfortunately I feel that the ludicrous speeds I saw are a product of the lack of usage at the moment. Currently there are only a handful of 4G handsets capable of being used on Telstra's network and the $130 dongle looks quite expensive next to the $30 3G dongle that'd do the job for pretty much everyone. Whether the 4G network is capable of scaling up to the same level of demand that the 3G networks currently have is a question that won't be answered until 4G reaches a similar level of penetration to what 3G has today. With the rapid pace of handset development that could come much sooner than you think and 4G services might become much more commonplace sooner rather than later.
In the days before ubiquitous high speed Internet, games that were only available when you were online were few and far between, with the precious few usually being MMORPGs. As time went on however and the world became more reliably connected, game developers sought to take advantage of this by creating much more involved online experiences. This also led to the development of some of the most insane forms of DRM that have ever existed, schemes where the game will constantly phone home in order to verify that the player is allowed to continue playing it. The advent of cheap and ubiquitous Internet access then has been both a blessing and a curse to us gamers and it may be much more of the latter for one particular type of game.
Way back when an Internet connection was considered something of a luxury the idea of integrating any kind of online experience was something of a pipe dream. There was still usually some form of multiplayer but that would usually be reserved for the hallowed times of LAN parties. Thus the focus of the game was squarely on the single player experience as that would be the main attraction for potential gamers. This is not to say that before broadband arrived there was some kind of golden age of single player games (some of my favourite games of all time are less than 5 years old) but there definitely was more of a focus on the single player experience back then.
Today it's much more common to see games with online components that are critical to the overall experience. For the most part this is usually some form of persistent multiplayer, which has been shown to be one of the most successful ways to keep players engaged with the game (and hence the brand) long after the single player experience has faded from memory. We can squarely lay the blame for this behaviour at big titles like Call of Duty and Battlefield as most multiplayer systems are seeking to emulate the success those games enjoyed. However the biggest blow to single player games has come from something else: the requirement to go online just to be able to play.
Now I'm not specifically referring to always on DRM, although that is in the same category, more the requirement now for many games to go online at least once before they let you play. For many of us this check comes in the form of a login to Steam before we're able to play our games and for others it's built directly into the game, usually via a phone home to ensure that the key is still valid. Whilst there is usually an offline mode available I've had quite a few issues trying to get that to work (and heard many similar stories), even when I still had an Internet connection available to put them into said mode. For modern games then the idea that something is truly single player, a game that can be installed and played without the need of any external resources, is dead in the water.
This became painfully obvious when Diablo III, a game considered by many (including myself) to be a primarily single player experience, came with all the problems that are evident in games like MMORPGs. The idea that a single player experience required maintenance enraged many players and whilst I can understand the reasons behind it I also share their frustration, because it calls into question just how long these games will continue to exist in the future. Whilst Blizzard does an amazing job of keeping old titles running (I believe the old Battle.Net for Diablo 1 is still up and running) many companies won't care to keep the infrastructure up and running once all the profit has been squeezed out of a title. Some do give the courtesy of patching the games to function in stand alone mode before that happens, but it's unfortunately not common.
It's strange to consider then that the true single player games of the Internet dark ages might live on forever whilst their progeny may not be usable a couple of years down the line. There's a valid argument for companies not wanting to support things that are simply costing them money and are only used by a handful of people, but it then raises the question as to why the game was developed with such heavy reliance on those features in the first place. Unfortunately it doesn't look like this is a trend that will be reversed any time soon and our salvation in many cases will come from the dirty pirates who crack these systems for us at no cost. This cannot be relied on however and it should really fall to the game developers to have an exit strategy for games that they no longer want to support, should they want to keep the loyalty of their long time customers.
I learnt a long time ago that one of the biggest factors in pricing something, especially in the high tech industry, is convenience. For someone who was always a do-it-yourself-er the notion was pretty foreign to me, I mean why would I spend the extra dollars to have something done for me when I was equally capable of doing it myself? Of course the second I switched from being a salaried employee to a contractor whose time is billed in hours my equations for determining something's value changed drastically and I began to appreciate being able to pay to get something done rather than having to spend my precious time on it myself.
The convenience factor is what has driven me to try and find some kind of TV solution akin to those that are available in the USA. Unfortunately the only things that come close are the less than legal alternatives, which is a right shame as I would gladly pay the going rate to get the same service here in Australia. I'm not alone in this regard either as many Australians turn to alternative methods in order to get their fix of their favourite shows. What this says to me is that the future of TV is definitely moving towards being a more on demand service like those provided by Netflix and Hulu and less like traditional TV channels.
Some industry executives would disagree with me on that point, to the point of saying that watching TV on the Internet is nothing short of a fad that will eventually pass. There's been a couple of clarifications to that post since it first went live but the sentiment remains that they believe people who abandon their cable subscriptions, "cable cutters" as it were, are in the minority and once economic conditions improve they'll be back again. I can understand the reasoning behind a cable exec taking this kind of position, but it's woefully misguided.
For starters Netflix alone accounts for around a third of peak bandwidth usage in the USA. To put this in perspective that's double all BitTorrent traffic and triple YouTube, both considered to be hives of piracy among the cable cartels. This is in conjunction with the fact that people are using their Xboxes to watch movies and listen to music more than they're using them to play games, usually through online services. Taking all of this into consideration you'd be mad to think that the future is still in traditional pay TV services, as there's a very clear trend showing that on-demand media, delivered through your local Internet connection, is what customers are looking for.
There are two reasons that could explain why cable companies are thinking this way. The first, and least likely, is that they're simply unaware of the current trends in the media market space. This is not entirely impossible as there have been a few examples in recent times (BlockBuster being the first that comes to mind) of companies that simply failed to recognise where the market was moving and paid the ultimate price for it in the end. The far more likely reason is simple bravado, as the cable companies can't really take the stand of saying that they're aware of the changing market demands but will do nothing about it. No, for them it's best, at least in the short term, to write off the phenomenon completely. In the long term of course this tactic won't work, but I get the feeling none of them are playing a particularly long game at this point.
As I’ve said many times before media companies and rights holders have fought tooth and nail against every technological advancement for the past century and the only constant in every one of them is that in the end the technology won out. Eventually these companies will have to wake up to the reality that their outdated business models don’t fit into the current market and they’ll either have to adapt or die.
Coding a location based service introduced me to a lot of interesting concepts. The biggest of these was geocoding, an imprecise science of translating a user's IP address into a real world location. I say imprecise because there's really no good way of doing it and most of the geocoding and reverse-geocoding services out there rely on long lists that match an IP to its location. These lists aren't entirely accurate so the location you get back from them is usually only good as an initial estimate and you're better off using something like the HTML5 location framework or just simply asking the user where the hell they are in the world. Unfortunately those inaccurate lists drive a whole lot of current services, most of them with the intent of limiting said service to a certain geographical location.
I've written about this practice before and how it's something of a hangover from the times of DVDs and region locking. From a technology standpoint it makes little sense to block access to certain countries (whether they block you is another matter) as all you're doing is limiting your market. From a business and legal standpoint the waters are a little murkier as most of the geo-restricted services, the ones of note anyway, are restricted simply because it's either not in their business interests to go global (although I believe that's short sighted) or there's a lot of legal wrangling to be done in order for them to be made available globally.
A plucky New Zealand ISP, FYX, was attempting to solve this problem of geoblocking and whilst they have withdrawn the service from the market (though they are looking to bring it back) I still want to talk about their approach and why it's inherently flawed.
FYX is offering what they call "Global Mode" for their Internet services, which apparently makes their users appear as if they're not from any particular country at all. Their thinking is that once you're a global user, services that were once blocked because of your region will suddenly be available to you, undoing the damage to the free Internet that those inaccurate translation lists can cause. However the idea that having no location renders geoblocking ineffective is severely flawed, which would be apparent to anyone who's had even a passing encounter with these services.
For starters most sites with geoblocking enabled do so by using a whitelist, meaning that only people from specific countries will be able to access those services. Things like Hulu and Netflix are hard coded to allow only IPs residing within US boundaries and anything that's not on those lists will automatically get blocked. Of course there's some in-browser trickery that you can do to get around this (although that's not at the ISP layer) but the only guaranteed solution is to access them through a connection that appears to originate from an IP they trust. Simply not updating the location on those lists won't do the trick so you'd need to do something more. It's entirely possible that they're doing something more fancy than this but the solutions I can think of wouldn't be very scalable, nor particularly profitable.
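The whitelist model is worth spelling out because it's what defeats the "no location" trick. A deny-by-default check doesn't care whether your IP maps to the wrong country or to no country at all; anything outside the trusted ranges fails. A minimal sketch (the networks here are illustrative, not any service's real lists):

```python
# Sketch of deny-by-default geoblocking via an IP-range whitelist.
# The networks below are illustrative documentation ranges, not the
# actual allowlists used by Hulu, Netflix or anyone else.
from ipaddress import ip_address, ip_network

TRUSTED_US_RANGES = [
    ip_network("198.51.100.0/24"),
    ip_network("203.0.113.0/24"),
]

def is_allowed(client_ip):
    """Allow only IPs inside a trusted range; everything else fails."""
    addr = ip_address(client_ip)
    return any(addr in net for net in TRUSTED_US_RANGES)

print(is_allowed("198.51.100.7"))  # True - inside a whitelisted range
print(is_allowed("121.200.4.15"))  # False - "no known location" still fails
```

Under this model, making an IP look like it belongs to no country changes nothing; the only thing that works is presenting an address that's actually on the list.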
It also seems that they might've got the attention of some rights holder groups who put pressure on their parent company to do away with the service. Legally there didn't seem to be anything wrong with the idea (apart from the fact that it probably wouldn't work as well as advertised) but that wouldn't stop media companies from threatening to take them to court if such a service continued to be offered. It really shows how scared such organisations are of new technology if a small time ISP with a not-so-special service can be a big enough blip on the radar to warrant such action. I'll be interested to see how FYX progresses with this, especially if they detail some more info on just how they go about enabling their Global Mode.
The reality of the situation is that we’re trending to a much more connected world, one where the traditional barriers to the free flow of information are no longer present. Companies that made their fortunes in the past need to adapt to the present and not attempt to litigate their way to profitability. Eventually that won’t be an option for them (think BlockBuster vs Netflix) and I really can’t wait for the day that geoblocking is just a silly memory of when companies thought that their decades old business models still worked in an ever changing world.
There's little doubt that the past decade has brought upon us rapid change that our legislators are only just beginning to deal with. One of my long time bugbears, the R18+ rating for games, is a great example of this, showing how outdated some of our policies are when it comes to the modern world. Unfortunately such political antiquity isn't just isolated to the video games industry; it extends to all areas that have been heavily affected by the changes the Internet has brought, not least of which is the delivery of content such as TV programs, newspapers and radio. This rift has not gone unnoticed and it seems the government is finally looking to take action on it.
Enter the Convergence Review, a report that was commissioned in 2011 to review the policy framework surrounding Australia's media and communications. It's a hefty tome, weighing in at some 176 pages, detailing nearly every aspect of Australia's current regulatory framework for delivering content to us Australians. I haven't managed to get through the whole thing but you don't need to read far into it to understand that it's a well researched and carefully thought out document, one that should definitely be taken into consideration in reforming Australia's regulatory framework for media. There are a couple of points in there that really blew me away and I'd like to highlight them here.
For starters the review recommends that the licensing of broadcasting services be abolished in its entirety. In essence this puts traditional broadcasters on a level playing field with digital natives who don't have the same requirements placed upon them and their content. Not too long ago such an idea would have seemed a foolish notion as no licensing means that anyone could just start broadcasting whatever they wanted with no control on how it was presented. However with the advent of sites like YouTube such license free broadcasting is already a reality and attempting to regulate it in the same fashion as traditional methods would be troublesome and most likely ineffective. Abolishing licensing removes restrictions that don't make sense anymore given that the same content can be delivered without it.
Such a maneuver brings into question what kind of mechanisms you would have to govern the kind of content that gets broadcast. The review takes this into consideration and recognizes that there needs to be some regulation in order to keep in line with Australian standards (like protecting children from inappropriate content). However the regulations it proposes would not apply to every content organisation. Instead they would target content organisations based on the size of the organisation and the scope of their audience. This allows content organisations a lot of flexibility in how they deliver content and should encourage quite a bit of innovation in this area.
The review also recommends that media standards apply to all platforms, making the regulations technology agnostic. Doing this would ensure that we don’t end up in this same situation again when another technological breakthrough forces a rethink of our policy platform, which, as you can tell from the review, is going to be a rather arduous process. Keeping the standards consistent across mediums also means that we won’t end up with another R18+ situation where we have half-baked legislation for one medium and mature frameworks for another.
The whole review feels like a unification that’s been a long time coming as the media landscape becomes increasingly varied, to the point where treating each medium individually is complicated and inefficient. The points I’ve touched on are just the most striking of the review’s recommendations; there are many more solid ideas for reforming Australia’s communications and media policies for a future that’s increasingly technologically driven. Seeing reports like this gives me a lot of hope for Australia’s future and I urge the government to take the review to heart and use it to drive Australia forward.
IT is one of the few services that all companies require to compete in today’s markets. IT support then is one of those rare industries where jobs are always to be had, even in entry level positions. Of course this assumes you put in the effort required to stay current, as letting your skills lapse for 2 or more years will likely leave you a generation of technology behind, making employment difficult. This is due to the IT industry constantly evolving and, much like in other industries, certain jobs can be made completely redundant by technological advancements.
For the past couple of decades though the types of jobs you expect to see in IT support have remained roughly the same, save for the specializations brought on by new technology. As more and more enterprises came online and technology continued to develop, a multitude of specializations became available, enabling the then-generic “IT guys” to become highly skilled workers in their targeted niche. I should know: just on a decade ago I was one of those generic IT support guys and today I’m considered a specialist when it comes to hardware and virtualization. Back when I started my career the latter of those two skills wasn’t even in the vernacular of the IT community, let alone a viable career path.
Like any skilled position though specialists aren’t exactly cheap, especially for small to medium enterprises (SMEs). This leads to an entire second industry of work-for-hire specialists (usually under the term “consultants”) and companies looking to take the pain out of utilizing the technology without having to pay for the expertise to come in house. This isn’t really a surprise (any skilled industry will develop these secondary markets) but with IT there’s a lot more opportunity to automate and leverage economies of scale, more so than in any other industry.
This is where Cloud Computing comes in.
The central idea behind cloud computing is that an application can be developed to run on a platform which can dynamically deliver resources to it as required. The idea is quite simple but the execution is extraordinarily complicated, requiring vast levels of automation and streamlining of processes. It’s just an engineering problem however, one that’s been surmounted by several companies and used to great effect by many others who have little wish to maintain their own infrastructure. In essence this is just outsourcing taken to the next level, but following the trend to its logical conclusion leads to some interesting (and, if you’re an IT support worker, troubling) predictions.
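The “dynamically deliver resources” part boils down to a control loop that watches load and adjusts capacity. A minimal sketch of that decision in Python (the thresholds, bounds and function name here are purely illustrative assumptions, not any real provider’s API):

```python
# Minimal autoscaling sketch: grow or shrink the instance pool based on load.
# All thresholds and limits below are illustrative assumptions only.

def desired_instances(current, cpu_load, low=0.3, high=0.7, minimum=1, maximum=10):
    """Return how many instances the platform should run.

    cpu_load is the average CPU utilisation (0.0-1.0) across instances.
    """
    if cpu_load > high and current < maximum:
        return current + 1   # overloaded: scale out
    if cpu_load < low and current > minimum:
        return current - 1   # under-utilised: scale in
    return current           # within the comfortable band: leave as-is

print(desired_instances(2, 0.85))  # heavy load -> scale out to 3
print(desired_instances(4, 0.10))  # mostly idle -> scale in to 3
```

Real platforms layer cooldown periods, health checks and billing on top of this, but the core scale-out/scale-in decision is roughly this simple; the hard part is the automation that provisions and tears down instances reliably.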
For SMEs the cost of running their own local infrastructure, as well as the support staff that goes along with it, can be one of their largest cost centres. Cloud computing and SaaS offer SMEs the opportunity to eliminate much of that cost whilst keeping the same level of functionality, giving them more capital to either reinvest in the business or bolster their profit margins. You would think this would simply relocate jobs from one place to another but cloud services employ far fewer staff thanks to the economies of scale they operate at, leaving fewer jobs available for those with skills in those areas.
In essence cloud computing eliminates the need for the bulk of skilled jobs in the IT industry. There will still be a need for most of the entry level jobs that cater to regular desktop users but the back end infrastructure could easily be handled by another company. There’s nothing fundamentally wrong with this, pushing back against such innovation never succeeds, but it does call into question the future of the jobs these IT admins currently hold.
Outside of high tech and recently established businesses the adoption rate of cloud services hasn’t been that high. Whilst many of the fundamentals of the cloud paradigm (virtualization, on-demand resourcing, infrastructure agnostic frameworks) have found their way into the datacenter, the next logical step of migrating those same services into the cloud hasn’t occurred. Primarily I believe this is due to a lack of trust in, and control over, the services, as well as companies not wanting to write off the large investments they have in infrastructure. This will change over time of course, especially as that infrastructure begins to age.
For what it’s worth I still believe that the ultimate end goal will be some kind of hybrid solution, especially for governments and the like. Cloud providers, whilst being very good at what they do, simply can’t satisfy the needs of all customers. It’s highly likely then that many companies will outsource routine things to the cloud (such as email, word processing, etc.) but still rely on in house expertise for the custom applications that aren’t, and probably never will be, available in the cloud. Cloud computing will probably see a shift in some areas of specialization but for the most part I believe us IT support guys won’t have any trouble finding work.
We’re still in the very early days of cloud computing and its effects on the industry are still hard to judge. There’s no doubt that cloud computing has the potential to fundamentally change the way the world does IT services and whatever happens those of us in IT support will have to change to accommodate it. Whether that comes in the form of reskilling, training or looking for a job in a different industry is yet to be determined but suffice to say that the next decade will see some radical changes in the way businesses approach their IT infrastructure.
There’s little doubt in my mind that the National Broadband Network will be a major benefit to Australia, way past the investment we’re making in it. It’s one of those rare pieces of legislation that will almost certainly outlive the government that started it and the Labor government should be commended for that. Indeed something like the National Broadband Network is almost a necessity if Australia wants to keep pace with the rest of the world in a technological sense as otherwise we’d be stuck on aging copper infrastructure that really doesn’t have any legs left in it. Still whilst anyone in the IT or related sectors would agree that the NBN will be good for business it’s not entirely clear what those benefits will be.
News.com.au ran a story this morning that pointed to research showing only 30% of Australian businesses had a “medium to high” understanding of the benefits available to them through the NBN. Making a few assumptions here I’m guessing the survey didn’t ask actual questions to gauge respondents’ true understanding, so it’s likely that number is actually a lot lower than the survey lets on. I’ll admit that for a non-technical person, who was likely the one answering the survey, the benefits of ubiquitous high speed Internet for your business are not entirely clear, especially when the Internet they have now is probably serving them well enough.
The businesses geared to make the most of the NBN are ones with multiple offices spread throughout Australia. Right now getting a good inter-office connection, whether a full WAN or just some trickery using VPN tunnels over a regular ADSL connection, is either an expensive or complicated affair. The NBN will provide high speed interconnects at prices that many businesses will be able to afford. This means you’ll be able to get close to 100 Mbps between offices, giving you LAN-like speeds between disparate sites. It might not sound like much but even small government agencies currently struggle with this (I’ve worked for more than one) and the boost in productivity from better connections between regional offices is very noticeable. This would extend to remote workers too, since it’s highly likely that they’ll have NBN access as well.
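Some rough back-of-the-envelope numbers show why that jump matters. A sketch, assuming a typical ADSL2+ uplink of about 1 Mbps against a 100 Mbps fibre link (both figures are illustrative, not quotes from any plan):

```python
def transfer_seconds(size_mb, link_mbps):
    """Time in seconds to move size_mb megabytes over a link_mbps link (8 bits per byte)."""
    return size_mb * 8 / link_mbps

# Pushing a 500 MB nightly backup between two offices:
adsl = transfer_seconds(500, 1)     # ~1 Mbps ADSL2+ uplink (assumed)
fibre = transfer_seconds(500, 100)  # 100 Mbps NBN fibre (assumed)
print(f"ADSL: {adsl / 60:.0f} min, fibre: {fibre:.0f} s")
```

That works out to roughly an hour versus under a minute, which is the difference between overnight batch transfers and something close to real-time replication between offices.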
Having a large connection also enables businesses to move services out of expensive hosted data centres and onto their own premises. Right now it’s nigh on impossible to host client facing services internally unless you want to shell out a lot of money for business-grade Internet plans. The NBN will bring data centre level speeds to almost every home and place of business in Australia, giving businesses the opportunity to migrate those services in house, saving on rental and administration costs. Sure the facilities they have might not be as good as what they can get elsewhere but the cost savings of not using a co-located service (believe me, they’re not cheap) would be more than worth it.
There’s also a host of services that are currently infeasible to operate, due to their high bandwidth use, that would become feasible thanks to the NBN. Such services won’t be available immediately but as the NBN reaches a critical mass of active users we can expect either local innovators to create them or current Internet giants to localize their services for Australia. Predominantly I see this taking the form of cloud based services which are accessible from Australia but have yet to establish local nodes due to the lack of supporting infrastructure. This would also help cloud providers crack into that ever elusive Australian government sector, which has remained resistant due to the restrictions placed on where its data can be stored.
The NBN will also bring many other ancillary benefits thanks to the higher speed and ubiquitous access that businesses will be able to take advantage of. Indeed the flow on effects of a fully fibre communications network will be felt for decades by businesses and consumers alike. Realistically this list is just the tip of the iceberg as over time numerous services will become available to take advantage of our new capabilities. I personally can’t wait to get onto it, enough so that moving to one of the fibre enabled locations is tempting, albeit not tempting enough to make me move to Tasmania.