You’d think that with my time spent as a retail worker I’d have some sense of loyalty to real-world shop fronts, knowing that there’s value in a good salesperson’s opinion on what product best suits my needs. There’s something to that, and indeed should I find myself out of my depth or simply not wanting to do the research myself I’ll head on into the store, but my primary means of shopping is still via online merchants. Whilst it’s hard to argue with the convenience of the majority of the experience, the last-mile delivery system is somewhat lacklustre, usually requiring me to either truck out to a depot, abscond from work early or hope that my darling wife will be able to break away from her studies so the goods can be delivered.
Before anyone suggests getting it delivered to my work I’ll have to say that my experience in doing so has been rather mixed. In the past I’ve worked at places where the delivery guys came right up to our reception desk to deliver things, and this worked great. However as I graduated to bigger and better places that had delivery docks, my lowly deliveries often got lost in the works, sometimes for days on end, with no way for me to track them. Thus I’ve since refrained from doing so, as at least when I get things delivered to my home I’ll either still have tracking from the courier or a note from Australia Post telling me where to pick it up. However if the latest innovation from Australia Post has anything to do with it, I might not need to rely on either of those processes again thanks to the introduction of Parcel Lockers.
For the uninitiated, Parcel Lockers are a free service from Australia Post. You sign up for one on their website, select the location you’d like your parcels delivered to and you’ll receive a shipping address you can have your packages sent to. When your package arrives you’ll receive an SMS with a code in it, and you can then go to the locker in question and retrieve your package. Initially they were only available in a few select locations, the middle of Canberra being one of them, but they’ve since spread to other mid-to-large-sized post offices, although their availability at postal locations is still not ubiquitous.
After forgetting that I had signed up for one for the better part of 3 months I finally decided to give them a go to see how the process would pan out. I figured I’d keep it simple, so I ordered a book from Book Depository that I’d been eyeing off for ages (Critical Path if you’re wondering, and yes I’m trying to do exactly that) so that if I didn’t get it there’d be no great loss. About 2 weeks after placing the order I got the message saying a parcel was ready for me to pick up. Picking it up was painless: just punch in the code and the parcel locker opens for you; the screen even tells you where to look if it’s that hard for you to notice it opening. That’s it, nothing more to it.
Of course there are some limitations to this service, as you can see from the picture above. You can’t get anything you want delivered to them, as they don’t have sizes to accommodate everything, and I’d hazard a guess that they’d send you a message to come collect it from somewhere else should you attempt to do so. Additionally, since locker space is obviously at something of a premium, they’ll get aggressive should you fail to pick your parcel up swiftly (I forgot to collect mine on the day and was told to pick it up before the afternoon 2 business days later). The simple solution to this is to get more of them, something which Australia Post appears to be doing.
Ultimately what I’d love would be my very own parcel-locker-style device at my house that deliveries could be made to. I’d be happy to pay for the privilege too, as the convenience it would deliver would exceed even that of the current parcel lockers. However I’d likely be just as happy if my local post office had one, as whilst this service is somewhat convenient it’s only just above going to my local post office, since I don’t live anywhere near one of these lockers (and indeed only recently started working within walking distance of one). Unfortunately there doesn’t seem to be a roadmap for when these will become available in other locations, but I can’t imagine this is something they’ll want to limit just to the bigger distribution centres.
Canberra is a strange little microcosm. If you live here chances are you’re either working directly for the government as a member of the public service or you’re part of an organisation that’s servicing said government. This is especially true in the field of IT, as anyone with a respectable amount of IT experience can make a very good living working for any of the large departments’ headquarters. I’ve made my IT career in this place, spending my time lusting after the cutting edge of technology whilst dealing with the realities of what large government departments actually need to function. As long-time readers will be aware I’ve been something of a cloud junkie for a while now, but not once have I been able to use it at my places of work, and there’s a good reason for that.
Not that you’d know that if you heard the latest bit of rhetoric from the current government, which has criticised the current AGIMO APS ICT Strategy for providing only “notional” guidelines for using cloud-based services. Whilst I’ll agree that the financial implications are rather cumbersome (although this is true of any procurement activity within the government, as anyone who’s worked in one can tell you), what annoyed me was the idea that the security requirements were too onerous. The simple fact of the matter is that many government departments have regulatory and legal obligations not to use overseas cloud providers, due to legislation that restricts Australian government data from travelling outside our borders.
The technical term for this is data sovereignty, and the vast majority of Australia’s large government departments are legally bound to keep all their services, and the data they rely on, on Australian soil. The legislation is so strict in this regard that even data that’s not technically sensitive, like say specifications of machines or network topologies, in some cases can’t be given to external vendors and must instead only be inspected on site. The idea that these departments could take advantage of cloud providers, most of which don’t have availability zones here in Australia, is completely ludicrous, and no amount of IT strategy policy can change that.
Of course cloud providers aren’t unaware of these issues (indeed I’ve met with several people behind some of the larger public clouds on this) and many of them are bringing availability zones to Australia. Amazon Web Services has already made itself available here and Microsoft’s Azure platform is expected to land on our shores sometime next year. The latter is probably the more important of the two as, if the next AGIMO policy turns out the way it’s intended, the Microsoft cloud will be the de facto solution for light-user agencies thanks to the heavy amount of Microsoft products in use at those places.
Whilst I might be a little peeved at the rhetoric behind the review of the APS ICT Strategy, I do welcome it, as even though it was only written a couple of years ago it’s still in need of an update due to the heavy shift towards cloud services and user-centric IT that we’ve seen recently. The advent of Australian availability zones will mean that the government agencies most able to take advantage of cloud services will finally be able to, especially with AGIMO policy behind them. Still, it will be up to the cloud providers to ensure their systems can meet the requirements of these agencies, and there’s every possibility they will still not be enough for some departments to take advantage of.
We’ll have to see how that pans out, however.
Ask any computer science graduate about the first programmable computer and the answer you’ll likely receive is the Difference Engine, a conceptual design by Charles Babbage. Whilst the design wasn’t entirely new (that honour goes to J. H. Müller, who wrote about the idea some 36 years earlier) Babbage was the first to obtain funding to create such a device, although he never managed to get it to work, despite blowing the equivalent of $350,000 in government money trying to build it. Still, modern-day attempts at creating the engine with the tolerances of the time period have shown that such a device would have worked had he completed it.
But Babbage’s device wasn’t created in a vacuum; it built on the wealth of mechanical engineering knowledge from the decades that preceded him. Whilst there was nothing quite as elaborate as his Analytical Engine, there were some marvellous pieces of automata, ones that are almost worthy of the title of programmable computer:
The fact that this was built over 240 years ago says a lot about the ingenuity contained within it. Indeed the fact that you’re able to code your own message into The Writer, using the set of blocks at the back, is what elevates it above other machines of the time. Sure, there were many other automata that were programmable in some fashion, usually by changing a drum, but this one allows configuration on a scale that they simply could not achieve. Probably the most impressive thing about it is that it still works today, something which many machines of our time will not be able to claim in 240 years’ time.
Whilst a machine of this nature might not be able to lay claim to the title of first programmable computer, you can definitely see the similarities between it and its more complex cousins that came decades later. If anything it’s a testament to the additive nature of technological development, each advance building upon the foundations of those that came before it.
The resignation of the National Broadband Network board was an expected move due to the current government’s high level of criticism of the project. While I, and many other technically inclined observers, disagreed with the reasons cited for Turnbull’s request for their resignations, I understood that should we want the NBN built the way we (the general public) wanted it, this was a necessary move that would allow the Liberal party to put their stamp on the project. However what followed seemed to be the worst possible outcome, one that could potentially see the NBN sent down the dark FTTN path that would doom Australia to remaining an Internet backwater for the next few decades.
They hired ex-Telstra CEO Ziggy Switkowski.
For anyone who lived through his tenure as the head of Australia’s largest telecommunications company, his appointment to the head of the NBN board was a massive red flag. It would be enough to be outraged at his appointment over the implementation of data caps and a whole host of other misdeeds that have plagued Australia’s Internet industry since his time in office, but the real crux of the matter is that since his ousting at Telstra he hasn’t been involved in the telecommunications industry for a decade. Whatever experience he had is now long outdated and, whilst I’m thankful that his tenure as head of the board is only temporary (until a new CEO is found), the fact that he has approved other former Telstra executives to the NBN board shows that even a small amount of time there could have dire implications.
News came yesterday, however, that Turnbull has appointed Simon Hackett, of Internode fame, to the NBN board. In all honesty I never expected this to come through as, whilst there were a few grassroots campaigns calling for it, I didn’t think they’d have the required visibility to make it happen. However Hackett is a well-known name in the Australian telecommunications industry and it’s likely that his reputation was enough for Turnbull to consider him for the position. Best of all, he’s been a big supporter of the FTTH NBN since the get go and with this appointment will be able to heavily influence the board’s decisions about the future of Australia’s communication network.
Whilst I was always hopeful that a full review of the feasibility of the NBN would come back with resounding support for a FTTH solution, this appointment will almost certainly guarantee such an outcome. Of course Turnbull could still override that, but with his staunch stance of going with the review’s decision it’s highly unlikely he’d do so, lest he risk some (even more) severe political backlash. The most likely change I can see coming is that a good chunk of the rollout, mostly for sites where there are no current contracts, will fall to Telstra. Whilst I’m a little on the fence about this (they’d be double dipping, in that they’d get paid both to build the new network and to disconnect their current customers), it’s hard to argue that Telstra isn’t a good fit for the job. I guess the fact that they won’t end up owning it in the end does make it a fair bit more palatable.
So hopefully with Hackett’s appointment to the NBNCo board we’ll have a much more technically inclined view presented at the higher levels, one that will be able to influence decisions to go down the right path. There are still a few more board members to be appointed and hopefully more of them are in the same vein as Hackett, as I’d rather not see the board fully staffed with people from Telstra.
I’ve worked with a lot of different hardware in my life, from the old days of tinkering with my Intel 80286 through to esoteric Linux systems running on DEC tin, until I, like everyone else in the industry, settled on x86-64 as the de facto standard. Among the various platforms I was happy to avoid (including such lovely things as Sun SPARC) was Intel’s Itanium range, as its architecture was so foreign from anything else that whatever you were trying to do, outside of building software specifically for that platform, was doomed to failure. The only time I ever came close to seeing it deployed was on the whim of a purchasing manager who needed guaranteed 100% uptime, at least until they realised the size of the cheque they’d need to sign to get it.
If Intel’s original dream was to be believed then this post would be coming to you care of their Itanium processors. You see, back when it was first developed everything was still stuck in the world of 32bit and the path forward wasn’t looking particularly bright. Itanium was meant to be the answer to this: with Intel’s brand name and global presence behind it we would hopefully see all applications migrate to the latest and greatest 64bit platform. However the complete lack of backwards compatibility with currently developed software and applications meant adopting it was a troublesome exercise, which was a death knell for any kind of consumer adoption. Seeing this, AMD swooped in with their backwards-compatible x86-64 architecture, which proceeded to spread to all the places that Itanium couldn’t, forcing Intel to adopt the standard in their consumer line of hardware.
Itanium refused to die, however, finding a home in the niche high-end market due to its redundancy features and solid performance for optimized applications. Still, the number of vendors supporting the platform dwindled from its already low numbers, eventually leaving HP as the only real supplier of Itanium hardware in the form of their NonStop server line. It wasn’t a bad racket for them to keep up though: the total Itanium market was something on the order of $4 billion a year, and with only around 55,000 servers shipped per year you can see how much of a premium they attract. Still, IT workers the world over have long wondered when Itanium would finally bite the dust, and it seems that day is about to come.
HP has just announced that it will be transitioning its NonStop server range from Itanium to x86, effectively putting an end to the only real sales channel that Intel had for the platform. What will replace it is still up in the air but it’s safe to assume it will be another Intel chip, likely one from their Xeon line that shares many of the features Itanium had without the incompatible architecture. Current Itanium hardware is likely to stick around for an almost indefinite amount of time, however, due to the places it has managed to find itself in, much to the dismay of system administrators everywhere.
In terms of accomplishing its original vision Itanium was an unabashed failure, never finding the consumer adoption it so desired and never becoming the herald of 64bit architecture. Commercially though it was something of a success thanks to the features that made it attractive to the high-end market, but even then it was only a small fraction of total worldwide server sales, barely enough to make it a viable platform for anything but wholly custom solutions. The writing was on the wall when Microsoft said that Windows Server 2008 R2 would be the last version to support it, and now with HP bowing out the death clock for Itanium has begun ticking in earnest, even if the final death knell won’t come for the better part of a decade.
The SR-71, commonly referred to as the Blackbird, was a pinnacle of engineering. Entering service back in 1966, it was capable of cruising at Mach 3.2 at incredible heights, all the way up to 25km above the Earth’s surface. It was the only craft that had the capability to outrun any missiles thrown at it, and it’s for this reason alone that not one Blackbird was ever lost to enemy action (although a dozen did fail in a variety of other scenarios). However the advent of modern surveillance techniques, such as the introduction of high-resolution spy satellites and unmanned drones, made the capabilities that the Blackbird offered somewhat redundant and it was finally retired from service back in 1998. Still, plane enthusiasts like myself have always wondered if there would ever be a successor craft, as nothing has come close to matching the Blackbird’s raw speed.
The rumours of a successor started spreading over 3 decades ago, when it was speculated that the USA, specifically Lockheed Martin, had the capability to build a Mach 5 version of the Blackbird. It was dubbed Project Aurora by the public and there have been numerous sightings attributed to the project over the years, as well as a lot of sonic boom data gathered by various agencies pointing towards a hypersonic craft flying in certain areas. However nothing concrete was ever established and it appeared that, should the USA be working on a Blackbird successor, it was keeping it under tight wraps, not wanting a single detail of it to escape. A recent announcement however points to Aurora being just a rumour, with the Blackbird’s successor being a new hypersonic craft called the SR-72.
Whilst just a concept at this stage, with the first scaled prototype due in 2023, the SR-72’s capabilities are set to eclipse those of the venerable Blackbird significantly. The target cruise speed for the craft is a whopping Mach 6, double that of its predecessor. The technology to support this kind of speed is still highly experimental, to the point where most of the craft built to reach those kinds of speeds (in air) have ended rather catastrophically. Indeed switching between traditional jet engines and high-speed scramjets is still an unsolved problem (all those previous scramjet examples were rocket powered) and is likely the reason for the SR-72’s long production schedule.
What’s particularly interesting about the SR-72 though is the fact that Lockheed Martin is actually considering building it, as the aforementioned reasons for the Blackbird’s retirement haven’t gone away. Whilst the current concept design seems to lend itself to a high-speed reconnaissance drone (I can’t find any direct mention of it being manned and there are no visible windows on the craft), something which does fit into the USA’s current vision for their military capabilities, it’s still a rather expensive way of doing reconnaissance. However the SR-72 will apparently have a strike-capable variant, something which the Blackbird did not have. I can’t myself foresee a reason for having such a high-speed craft do bombing runs (isn’t that what we have missiles for?) but then again I’m not an expert on military strategy, so there’s probably something I’m missing there.
As a technology geek though the prospect of seeing a successor to the SR-71 makes me giddy with excitement, as the developments required to make it a reality would mean the validation of a whole bunch of tech that could provide huge benefits to the rest of the world. Whilst I’m sure the trickle down wouldn’t happen for another decade or so after the SR-72’s debut, you can rest assured that once scramjet technology has been made feasible it’ll find its way into other aircraft, meaning super fast air travel for plebs like us. Plus there will also be all the demonstrations and air shows for Lockheed Martin to show off its new toy, something which I’m definitely looking forward to.
The tech world was all abuzz about Phonebloks just over a month ago, with many hailing it as the next logical step in the smartphone revolution. Whilst I liked the idea, since it spoke to the PC builder in me, it was hard to overlook the larger issues that plagued it, namely the numerous technical problems as well as the lack of buy-in from component manufacturers. Since then I hadn’t heard anything further on it and figured the Thunderclap campaign they ran had ended without too much fuss, but it appears the idea might have caught the attention of people who could actually make it happen.
Those people are Motorola.
As it turns out Motorola has been working on their own version of the Phonebloks idea for quite some time now, over a year in fact. It’s called Project Ara and came about as a result of the work they did during Sticky, essentially trucking around the USA with unlocked handsets and 3D printers and holding a series of makeathons. The idea is apparently quite well developed, with a ton of technical work already done and some conceptual pieces shown above. Probably the most exciting thing for Phonebloks followers will be the fact that Motorola has since reached out to Dave Hakkens and is hoping to use his community to further the idea. By their powers combined it might just be possible for a modular handset to make its way into the real world.
Motorola’s handset division, if you recall, was acquired by Google some 2 years ago, mostly due to the wide portfolio of patents that Google wanted to get its hands on. At the same time it was thought that Google would then begin using Motorola for their first-party Nexus handsets, however that doesn’t seem to have eventuated, with Google leaving them to do their own thing. Still, such a close tie with Google might provide Project Ara the resources it needs to actually be successful, as there’s really no other operating system they could use (and no, the Ubuntu and Firefox alternatives aren’t ready for prime time yet).
Of course the technical issues that were present in the Phonebloks idea don’t go away just because some engineers from Motorola are working on them. Whilst Motorola’s design is quite a bit less modular than what Phonebloks was purporting, it does look like it has a bit more connectivity available per module. Whether that will be enough to support things like quad-core ARM CPUs or high-resolution cameras remains to be seen, however.
So whilst the Phonebloks idea in its original form might never see the light of day, it does appear that at least one manufacturer is willing to put some effort into developing a modular handset. There are still a lot of challenges to overcome before the idea can be made viable, but the fact that real engineers are working on it with the backing of their company gives a lot of credence to it. I wouldn’t expect to see any working prototypes for a while yet, even with Motorola’s full backing, but potentially in a year or so we might start to see some make their way to trade shows and I’ll be very interested to see their capabilities.
Much like my current aversion to smartwatches I’m equally disinterested in the idea of a fitness tracker. As a man of science I do like the idea in principle, as anyone looking to better themselves should track as much data as they can to ensure what they’re doing is actually having an effect. However the devices on the market don’t appear to be much more than smart pedometers with nice interfaces, something which doesn’t really track the kinds of things I’m looking for (since most of my exercise isn’t aerobic in nature). I don’t discount their value for others, but if I was going to invest in one it’d have to do a lot more than just be an accelerometer attached to my wrist.
I may have found one in AIRO, a rather Jony Ive-esque device coming from a new 3-person startup. For the most part it sports the same features as other health trackers, presumably through the same method of an incorporated accelerometer, but its real claim to fame comes from its apparent ability to detect metabolites in your blood without you having to cut yourself to do so. AIRO also claims to be able to detect the quality of the food you’re eating which, from what I can tell by looking at their website, seems to be related to the macro-nutrient breakdown. As someone who regularly struggles to get enough calories to support their goals (yeah I’m one of those people, believe me it’s not as great as you might think it is) and really can’t be bothered to use a calorie tracker, this is of particular interest to me, something I’d consider plonking down a chunk of change for.
Of course the sceptic in me was instantly roused by the idea that a device could non-invasively determine such things, because such technology would be a boon to diabetics, not to mention any research program looking at monitoring caloric intake. Indeed something like this is so far out of left field that most of the mainstream coverage of the device doesn’t go into just how it works, except for referring to the fact that it measures calories and macro-nutrient breakdown based on light. It sounds like a great theory, but there’s no source material provided to show how their method works, nor any validation using standard means like doubly labelled water or even short-term experiments with strictly controlled caloric intake.
I was going to leave it at that, and indeed not even write about it since I wanted to see some validation of the idea before I said anything, but then I stumbled across this article from ScienceDaily which links to a German study that has been able to measure blood glucose with infrared light. The function of their device sounds different to the one AIRO purports, instead using infrared light to penetrate the skin and cause a resonance in the glucose within the bloodstream which their device can then pick up. Their device sounds like it would be anything but wearable, however, with a “shoebox sized” unit planned for release within the next few years. This doesn’t validate the idea behind AIRO but it does lend some credence to the notion that you’d be able to extract some kind of information about blood metabolites using light pulses.
So I’m definitely intrigued now, possibly to the point of shelling out the requisite $159 to get one delivered when they come out, but I would love to see the inventors provide some validation to prove their device can do what they say it can. It’s not like this would be particularly difficult; hell, if they send me a prototype device I’ll happily engage in a tightly controlled caloric diet in order to prove it can measure everything, and it would go a long way to convincing the sceptics that what they’ve made really is as good as they say it is. Heck, I bet there are even a couple of other startups that’d love to do some testing to prove that their products also work as intended (I’m looking at you, Soylent) and having that kind of validation would be extremely valuable for everyone involved.
I hadn’t been in Visual Studio for a while now, mostly because I had given up on all of my side projects due to the amount of time they soaked up versus my desire to do better game reviews on here, which requires me to spend more time actually playing the games. I had come up with an idea for a game a while back and was really enjoying developing the concept in my head, so I figured it would be good to code up a small application to get my head back in the game before I tackled something a little more difficult. One particular idea I had was a Soundcloud downloader/library manager as, whilst there are other tools that do this job, they’re a little cumbersome and I figured it couldn’t be too difficult to whip one up in a day’s worth of coding.
How wrong I was.
The Soundcloud API has a good amount of documentation and, from what I could tell, I would be able to get my stream using it. However since my stream isn’t publicly available I’d have to authenticate to the API first through their OAuth2 interface, something I had done with other sites, so I wasn’t too concerned that it would be a barrier. Of course the big difference between those other projects and this one was that this application was going to be a desktop app, so I figured I was either going to have to do some trickery to get the token or manually step through the process in order to get authenticated.
After a quick Google around it looked like the OAuth library I had used previously, DotNetOpenAuth, would probably fit the bill and it didn’t take me long to find a couple of examples that looked like they’d do the job. Even better, I found an article that showed an example of the exact problem I was trying to solve, albeit using a different library to the one I was using. Great, I thought, I’ll just marry up the examples and get myself going in no time, and after a little bit of working around I was getting what appeared to be an auth token back. Strangely though I couldn’t access any resources using it, either through my application or directly through my browser (as I had been able to do in the past). Busting open Fiddler showed that I was getting 401 (unauthorized) errors back, indicating that the token I was providing wasn’t valid.
After digging around and looking at various resources it appears that, whilst the original OAuth API might still be online, it’s not the preferred way of accessing anything and, as far as I can tell, is mostly deprecated. No worries, I’ll just hit up the OAuth2 API instead, figuring that it should be relatively simple to authenticate to it since DotNetOpenAuth now natively supports it. Try as I might to find a working example I simply could not get it to work with Soundcloud’s API, not even using the sample application that DotNetOpenAuth provides. Trying to search for other, more simplistic examples left me empty-handed, especially when I searched for a desktop application workflow.
I’m willing to admit that I probably missed something here, but honestly the amount of code and complexity that appears to be required to handle the OAuth2 authentication process, even when you’re using a library, seems rather ludicrous. Apparently WinRT has it pretty easy, but those are web pages masquerading as applications which are able to take advantage of the web auth workflow, something which I was able to make work quite easily in the past. If someone knows of a better library or has an example of the OAuth2 process working with a desktop application in C# then I’d love to see it, because I simply couldn’t work out how to do it, at least not after half a day of frustration.
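For what it’s worth, the raw flow I was expecting, stripped of any library, looks something like the sketch below. It’s only an illustration of the authorization-code exchange using plain HttpClient against what I understand to be Soundcloud’s connect and token endpoints; CLIENT_ID, CLIENT_SECRET and REDIRECT_URI are placeholders for your own registered app’s details, and the token response is left as raw JSON to keep things dependency-free.

```csharp
// Sketch only: SoundCloud's OAuth2 authorization-code flow from a desktop app,
// done with plain HttpClient rather than DotNetOpenAuth. CLIENT_ID, CLIENT_SECRET
// and REDIRECT_URI are placeholders for the values from your registered app.
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Net.Http;
using System.Threading.Tasks;

class SoundCloudAuthSketch
{
    const string ClientId = "CLIENT_ID";
    const string ClientSecret = "CLIENT_SECRET";
    const string RedirectUri = "REDIRECT_URI"; // e.g. a localhost URL your app listens on

    static void Main()
    {
        // Step 1: send the user to SoundCloud's connect page in their browser.
        // Once they approve, SoundCloud redirects to RedirectUri with ?code=... appended.
        var authoriseUrl = "https://soundcloud.com/connect" +
                           "?client_id=" + ClientId +
                           "&redirect_uri=" + Uri.EscapeDataString(RedirectUri) +
                           "&response_type=code";
        Process.Start(authoriseUrl);

        // Step 2: capture the code (pasted in manually here; a real desktop app would
        // listen on RedirectUri) and swap it for an access token.
        Console.Write("Paste the code from the redirect URL: ");
        var code = Console.ReadLine();
        var tokenJson = ExchangeCodeForToken(code).Result;
        Console.WriteLine("Token response: " + tokenJson);
    }

    static async Task<string> ExchangeCodeForToken(string code)
    {
        using (var client = new HttpClient())
        {
            var form = new FormUrlEncodedContent(new Dictionary<string, string>
            {
                { "client_id", ClientId },
                { "client_secret", ClientSecret },
                { "redirect_uri", RedirectUri },
                { "grant_type", "authorization_code" },
                { "code", code }
            });
            var response = await client.PostAsync("https://api.soundcloud.com/oauth2/token", form);
            response.EnsureSuccessStatusCode();
            // The body is JSON containing access_token; returned raw to keep the sketch short.
            return await response.Content.ReadAsStringAsync();
        }
    }
}
```

My understanding is that once you pull access_token out of that JSON you pass it along as the oauth_token parameter on subsequent API calls, which is the step that kept handing me 401s before.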
In defending the FTTP NBN I’ve seen nearly every argument imaginable as to why we shouldn’t be doing it. Whilst I can understand the concerns around rollout times and the total cost, the NBN stands as one of the few multi-term, nation-wide infrastructure projects with tangible benefits that will last for decades to come. Some of the more esoteric arguments I’ve received have hinged on the idea that consumers are quite OK with the current state of Australian Internet and that anything above that is a wasteful exercise that will only serve consumers of pornography and illegal downloads. This could not be further from the truth, but it still seems to remain a valid talking point for anti-NBN stalwarts.
Shown above are the take-up rates from NBNCo released back in March this year, and it’s pretty easy to discern a common trend. Whilst there are a couple of places showing stagnant growth, most are showing very strong upward trends. Indeed most of them track the 2 highest take-up rate areas pretty closely, so it is reasonable to conclude that they will eventually all see similar adoption rates. Considering that the service is still in its infancy, overall take-up rates of 30% are pretty amazing and will only get better as the offerings from various ISPs improve. So there definitely seems to be demand for the NBN, although the question then becomes: do people want higher speeds, or are they just looking for a more reliable Internet connection?
The report that NBNCo presented to the Parliamentary Joint Committee on the National Broadband Network contains some pretty in-depth analysis of the current consumers of the NBN and the data is quite telling. Initially it was thought that the bulk of the NBN’s customers would be on the lowest plan possible (12Mbps down / 1Mbps up), on the order of 49%, with the next biggest sector being 25/10 at 27%. Actual deployed numbers differ from that significantly, with the lowest tier accounting for 39% and the next biggest sector being the top tier plan (100/40) at 31%. This means that there’s a large number of Australians who want the fastest Internet they can get their hands on, and the majority of them want speeds above what they’re currently getting.
This echoes the sentiment that’s been seen overseas with similar projects like Google Fiber in Kansas City. It also runs contrary to the Liberals’ position that 25Mbps would be enough for the average household, as it seems many would like to take advantage of higher speeds. Whilst it’s looking more and more like the NBN will remain untouched in its current form (although it might end up being rolled out by Telstra), those ideas still seem to permeate the rhetoric of NBN detractors. As the numbers show, Australians are craving faster, more stable Internet connections and, given the opportunity, they’ll take the best options available to them.
Honestly I know this shouldn’t bother me as much as it does, especially considering the rhetoric has died down considerably since the election, but the idea that the NBN isn’t needed, or even wanted, by the Australian public is just so wrong that it borders on offensive. The NBN is going to elevate Australia to being one of the most connected countries in the world, rivalling some of the most technologically advanced nations. I know that I, as well as many of my technically inclined friends, have big plans for when those high speed connections become available and I’m sure many businesses will have the same.