Technology

Windows 10: The Windows 8 For Those Who Can’t Get Over 7.

Microsoft really can’t seem to win sometimes. If they stop making noticeable changes to their products everyone starts whining that they’re no longer innovating and that people will start looking for alternatives. However, should they try something genuinely innovative, everyone rebels, pushing Microsoft to go back to the way things have always been done. It happened with Vista, the Ribbon interface and, most recently, Windows 8. Usually what happens, though, is that the essence of the update makes it into the new version, with compromises made to appease those who simply can’t handle change.

And with that, ladies and gentlemen, Microsoft has announced Windows 10.

Windows 10 Start Menu

Everyone seems to be collectively shitting their pants over the fact that Microsoft skipped a version number, somehow forgetting that most of the recent versions of Windows have come sans any number at all. If you want to get pedantic about it (and really, I do) the last 10 versions of Windows have been: Windows 3.1, Windows 95, Windows 98, Windows NT 4.0, Windows 2000, Windows ME (gag), Windows XP, Windows Vista, Windows 7 and Windows 8. If you were expecting them to release Windows 9 just because the last two versions of Windows happened to be in numerical order, I’m going to hazard a guess you ate a lot of paint as a child.

On a more serious note, the changes many people were expecting to make up the 8.2 release appear to have been bundled into Windows 10. The start menu makes its triumphant return after two years on the sidelines, although those modern/metro apps that everyone loved to hate will now make an appearance on it. For someone like me who hasn’t really relied on the start menu since well before Windows 8 arrived (pressing the Windows key and then typing in what I want is much faster than clicking my way through the menu) I’m none too bothered by its return. It will probably make Windows 10 more attractive to the enterprise though, as many of them are still in the midst of upgrading from XP (or purposefully delaying an upgrade to 8).

The return of the start menu goes hand in hand with the removal of the metro UI that hosted those kinds of apps, which have now been given the ability to run in a window on the desktop. This is probably one of the better improvements as it means you no longer get a full screen app taking over your desktop if you accidentally click on something that has associated itself with a metro app. For me this most often happens with mail: even though I’ve got Outlook installed, the Mail app still seems to launch itself every so often. Whether or not this will make that style of app more palatable to the wider world remains to be seen, however.

There have also been a few other minor updates announced, like the inclusion of multiple desktops and improved Aero Snap. The command line has received a usability update too, now allowing you to use CTRL + C and CTRL + V to copy and paste respectively. In all honesty, if you’re still doing your work in the command line on any version of Windows above Vista you’re doing it wrong, as PowerShell has been the shell of choice for the better part of 7 years. I’m sure some users will love that change but the vast majority of us moved on long ago.

The release date is scheduled for late next year, with a technical preview available right now for enterprising enthusiasts. It will be interesting to see what the uptake rate is as that date might be a little too late for enterprises still running XP, which will most likely favour 7 instead. That being said, the upgrade path from 7 to 10 is far easier, so there’s the possibility of Windows 10 seeing a surge in uptake a couple of years down the road. For early adopters of Windows 7 this next release might just hit the sweet spot for an upgrade, so there’s every chance that 10 will be as successful as 7.

I’ll reserve my judgement on the new OS until I’ve had a good chance to sit down and use it for an extended period of time. Microsoft rarely makes an OS that’s beyond saving (I’d really only count ME in there) and whilst I might disagree with the masses on 8’s usability I can’t fault Microsoft for capitulating to them. Hopefully the changes aren’t just skin deep as this is shaping up to be the last major revision of Windows we’ll ever see and there’d be nothing worse than for Microsoft to build their future empire on sand.

ASUS Transformer Pad TF103C Review.

I’ve only really owned one tablet, the original Microsoft Surface RT, and try as I might to integrate it into parts of my life I honestly can’t figure out where it fits in. Primarily I think this is a function of apps: whilst the Surface is capable in most respects there’s really no killer feature that makes me want to use it for a specific purpose. This is probably also due to my heavy investment in the Android ecosystem, with all the characteristics that make my phone mine persisted across Google’s cloud. With that in mind, when ASUS offered me their new Transformer Pad TF103C for a couple of weeks I was intrigued to see how the experience would compare.

The TF103C is a 10.1″ tablet sporting a quad core, 64 bit Intel Atom processor that runs at up to 1.86GHz. For a tablet those specs are pretty high end which, considering the included keyboard, signals that the TF103C is aimed more towards productivity than simply being a beefy Android tablet. The screen is an IPS display with a 1280 x 800 resolution, which is a little on the low side, especially now that retina level displays are fairly commonplace. You can get it with either 8GB or 16GB of internal storage, which you can easily expand by another 64GB via the microSD slot. It also includes the usual array of wireless interfaces, connectors and sensors, although one feature of note is the full sized USB port on the dock. With an RRP of $429 (and street prices coming in well under that) there’s definitely a lot packed into the TF103C for the price.

As a full unit the TF103C is actually pretty hefty, coming in at a total of 1.1kg, although the tablet itself only makes up about half that. The keyboard dock doesn’t contain an additional battery or anything else that you’d think would make it so heavy, especially considering other chiclet style keyboards come in at about half its weight. Considering my full ultrabook weighs in at about 1.5kg it does take away some of the appeal of having a device like this, at least from my perspective. That being said I’m not exactly the biggest tablet user, so the value of the two different form factors is lost on me somewhat.

When used in docked form the TF103C is actually quite capable, especially when you attach a mouse to the dock’s USB port. I had wondered how Android would fare when used in a more traditional desktop fashion and it actually works quite well, mostly because the web versions of your typical productivity applications have evolved a lot in the past couple of years. The keyboard is probably a little on the small side for people with larger hands but it was definitely usable for quick tasks or replying to email. It falls a little short if you’re going to use it on your lap, however, as the screen can’t be tilted back past a certain point. It’s still usable but it’s a much better experience on a desk.

The quad core Intel Atom powering the TF103C is extremely capable, as evidenced by the fact that everything runs without a stutter or hiccup. I threw a few of the more intensive games I could find at it and never noticed any slowdown, commendable for a tablet in this price range. Push it that hard for long, however, and the battery life takes quite a hit, knocking the rated 9.5 hours of run time down to less than 4. That being said it managed to stay charged for about a week when idle, making it quite usable as a casual computing device.

All in all I was impressed with the capabilities the TF103C displayed, even if I couldn’t really see it replacing any of the devices I currently own. There are a few missed opportunities, like integrating a battery into the keyboard and allowing the screen to tilt further back, however overall it’s a very capable device for the asking price. I could definitely see it having a place on the coffee table as something to be used when needed, with the keyboard dock coming in handy for more grunty work. It might not end up replacing the device you have now but if you’re looking for a decent tablet that can also be productive you wouldn’t go wrong with the TF103C.

A review unit was provided to The Refined Geek for 2 weeks for reviewing purposes.

Medieval vs Modern: The Making of a Gargoyle.

One thing that always fascinates me is how much (or indeed how little) technology can change some processes. Technology almost always makes things better, faster and cheaper, but you’d think there are a few areas where it simply couldn’t put a dent in good old fashioned human craftsmanship. I don’t know why, but when I saw the following video I thought there would be no way that modern processes could be better suited to the task than simply handing it over to a stonemason. By the end of the video, however, I was stunned at just how fast, and how accurately, a gargoyle could be milled out of a giant block of sandstone.

Honestly I probably should have expected it, as I’ve seen numerous demonstrations of similar technology producing wildly intricate show pieces in all sorts of materials. However I figured something like this, a craft many would have thought was now the domain of only a handful of dedicated practitioners, would be better suited to human hands. I have to say though that I doubt anyone today could carve out something like that in the space of 10 hours, even counting all the preparation done beforehand. Unfortunately it’s surprisingly hard to find out just how long it took to carve your average stone gargoyle, so I’m not sure how this compares to the times when stone carving as a profession was more common.

Realistically that’s all a flimsy premise for me to post yet another large engineering demonstration video. I can’t help it though, they tickle me in all the right ways :)

IBM’s Watson has an API, and It’s Answering Questions.

In a world where Siri can book you a restaurant and Google Now can tell you when to head for the gate at the airport, it can feel like the AI future that many sci-fi fantasies envisioned is already here. Indeed to some extent it is: many aspects of our lives are now farmed out to clouds of servers that make decisions for us, but those machines still lack a fundamental understanding of, well, anything. They’re what are called expert systems, algorithms trained on data to make decisions in a narrow problem space. The AI future we’re heading towards will be far more than that, one where those systems actually understand data and can make far better decisions because of it. One of the first steps along that path is IBM’s Watson, and its creators have done something amazing with it.

IBM Watson

Whilst it’s currently only open to partner developers, IBM has created an API for Watson, allowing you to pose it a question and receive an answer. There’s not a lot of information around what data sets it currently understands (the example is in the form of a Jeopardy! question) but their solution documents reference a Watson Content Store which, presumably, has several pre-canned training sets to get companies started with developing solutions. Indeed some of the applications that IBM’s partner agencies have already developed suggest that Watson is quite capable of digesting large swathes of information and providing valuable insights in a relatively short timeframe.
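
To make the shape of that interaction concrete, here’s a minimal sketch in Python of what posing a question to such an API might look like. The endpoint, payload fields and response schema below are all invented for illustration, since the partner API’s details aren’t public; only the question-in, evidence-backed-answer-out pattern comes from IBM’s description.

```python
import requests

# Hypothetical endpoint and credentials: the partner API's real URL,
# payload and response schema aren't public, so this is illustrative only.
WATSON_URL = "https://watson.example.com/v1/question"

payload = {
    # A Jeopardy!-style clue, matching the example IBM gives.
    "question": "This 'Father of Our Country' didn't really chop down a cherry tree."
}

resp = requests.post(WATSON_URL, json=payload, auth=("partner-id", "api-key"))
resp.raise_for_status()

# One would expect a ranked list of candidate answers, each carrying a
# confidence score and pointers to the evidence that supports it.
for candidate in resp.json().get("answers", []):
    print(candidate["text"], candidate["confidence"], candidate["evidence"])
```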

I’m sure many of my IT savvy readers are seeing the parallels between Watson and a lot of the marketing material that surrounds anything carrying the buzzword “Big Data”. Indeed many of the concepts of operation are similar: take big chunks of data, throw them into a system and then hope that something useful comes out the other end. However Watson’s API suggests something far more accessible, dealing in native human language and providing evidence to back up the answers it gives you. Compare this to Big Data tools, which often require you to learn a particular query language or create convoluted reports, and I think Watson has the ability to find widespread use whilst Big Data retains its buzzword status.

For me the big applications for something like this are in places where curating domain specific knowledge is a long, time consuming task. Medicine and law both spring to mind, as there are reams of information available to power a Watson based system and both fields could most certainly benefit from easier access to those vast treasure troves. It’s pretty easy to imagine a lawyer looking for all the precedents set against a certain law, or a doctor asking for all the diseases matching a list of symptoms, with both queries answered with the supporting evidence to boot.

Of course it remains to be seen if Watson is up to the task: whilst its prowess on Jeopardy! was nothing short of amazing, I’ve yet to see any of its other applications in use. The partner applications do look very interesting, and should hopefully be the proving grounds that Watson needs, but until it sees widespread use all we really have to go on is the result of a single API call. Still I think it has great potential and hopefully it won’t be too long before the wider public can get access to some of Watson’s computing genius.

When Will Buying Clothing Online be as Good as Offline?

I’m not exactly what you’d call a fashionista; the ebbs and flows of what’s current often pass me by, but I do have my own style which I usually refresh on a yearly basis. More recently this has tended towards my work attire, mostly because I spend a great deal more time in it than I used to. However the act of shopping for clothes is one I like to avoid as I find it tiresome, especially when trying to find the right sizes to fit my not-so-normal dimensions. Thus I’ve recently turned towards custom services and tailoring to get what I want in sizes that fit me but, if I’m honest, the online world still seems light years behind what I can get from the more traditional fashion outlets.

Tailoring Stuff

For instance, one of the most frustrating pieces of clothing for me to buy is business shirts. Usually they fall short in one of my three key categories (length, sleeve length and fit in the midsection) so I figured that getting some custom made would be a great way to go. So I decided to lash out on a couple of shirts from two online retailers, Original Stitch and Shirts My Way, to see if I could get something that would tick all three boxes. I was also going to review them against each other to see which retailer provided the better fit and would thus become my de facto supplier of shirts for the foreseeable future. Upon receiving both shirts, however, I was greeted with the unfortunate reality: they both sucked.

They seemed to get some things right, like the neck size and overall shirt length, however both appeared to be made for someone who weighs about 40kg more than I do, with the midsection fitting like a tent. Both also had ridiculously billowy sleeves, making my arms appear twice as wide as they should be. I half expected this from Original Stitch, since their measurements aren’t exactly comprehensive, but Shirts My Way suffered from the same issues even though I followed their measurement guidelines exactly. Compared to the things I’ve had fitted or tailored in the past I was extremely disappointed, as I was expecting service as good or better.

The problem could be partially solved by technology: 3D scanning could provide extremely accurate sizing that online stores could then incorporate to ensure you get the right fit the first time around. In fact I’d argue there should be some kind of open standard for this, allowing the various companies to develop their own solutions that would be interoperable across different clothing retailers, something like the sketch below. That is something of a pipe dream, I know, but I can’t be the only person who has had this kind of frustration trying to get the right fit from online retailers.
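
Such a standard needn’t be anything more exotic than a vendor-neutral measurement document that any scanner could emit and any retailer could consume. The schema name and every field below are entirely invented for illustration:

```python
import json

# A purely hypothetical body-measurement profile: "open-fit" and all of
# its fields are invented here to illustrate what an open standard
# might carry between a 3D scanner and an online retailer.
measurement_profile = {
    "schema": "open-fit/0.1",
    "units": "cm",
    "measurements": {
        "neck": 41.0,
        "chest": 102.0,
        "waist": 88.0,
        "sleeve_length": 89.0,
        "shirt_length": 79.0,
    },
    "source": {"method": "3d-scan", "captured": "2014-09-01"},
}

# Any retailer supporting the standard could ingest the same document,
# replacing the per-site measurement forms that cause all the grief.
print(json.dumps(measurement_profile, indent=2))
```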

I guess for now I should stick with the tried and true methods of getting the clothing I want, as the online experience, whilst infinitely more convenient, ultimately delivers a lacklustre product. I’m hopeful that change is coming, although it’s going to take time to become widespread and I’m sure there won’t be any standards across the industry for a long time after that. Maybe one day I’ll be able to order the right fit from the comfort of my own home but, unfortunately, that day is not today.

The BBC Thinks All VPN Users are Pirates.

If you want Netflix in Australia there’s really only one way to do it: get yourself a VPN with an endpoint in the States. That’s not an entirely difficult process; indeed many of my less tech savvy friends have managed to accomplish it without any panicked phone calls to me. The legality of doing so is something I’m not qualified to get into but, since there hasn’t been a massive arrest spree of nefarious VPN users, I can’t imagine it’s far outside the bounds of the law. Indeed you couldn’t really crack down on it without also cracking down on the more legitimate users of VPN services, like businesses and those with regulatory commitments around protecting customer data. However if you ask the BBC, users of VPNs are nothing but dirty pirates and it’s our ISPs’ job to snoop on them.

BBC Derp

In a submission to the Australian Government, presumably under the larger anti-piracy campaign that Brandis is heading, the BBC makes a whole list of suggestions as to how it should go about combating Australia’s voracious appetite for purloined content. Among the numerous points is the notion that a lot of pirates now use a VPN to hide their nefarious activities. In the BBC’s world, ISPs would take this as a kind of black flag, signalling that any heavy VPN user was likely also engaging in copyright infringement. They’d then be subject to the woeful idea of having their Internet slowed down or cut off, presumably unless they could somehow prove their traffic was legitimate. Even though the submission goes on to talk about false positives, the ideas it discusses are fucking atrocious and I hope they never see the light of day.

I have the rather fortunate (or unfortunate, depending on how you look at it) ability to do my work from almost anywhere I choose, including my home. This does mean I have to VPN back into the mothership to get access to my email, chat and all the other corporate resources that can’t be made available over the regular Internet. Since I do a lot of this at home, under the BBC’s suggestion I’d probably be flagged as a potential pirate and subjected to measures to curb my behaviour. Needless to say I don’t think I’m particularly unique in this, so there’s vast potential for numerous false positives to spring up under such a system.

Worse still, all of those proposed measures fall on the ISPs’ shoulders to design, implement and enforce. Not only would this put an undue burden on them, which they’d instantly pass on to us in the form of increased prices, it would also make them culpable when an infringing user figured out how to defeat their monitoring system. Everyone knows it doesn’t take long for people to circumvent these systems which, again, increases the pressure on ISPs to implement even more invasive and draconian measures. It’s a slippery slope we really shouldn’t be going down.

Instead of constantly reaching for the stick as the solution to Australia’s piracy woes it’s time for companies, and the Australian government, to start looking at the carrot. Start with incentives for rights holders to license content in Australia, or mandate that we get the same content at the same time for the same price as everywhere else. The numerous Netflix users in Australia show there’s demand for such a service; we just need it to match the criteria that customers overseas take for granted. Once we get that I’m sure you’ll see a massive reduction in the amount of piracy in Australia, coupled with the increase in sales that rights holders seem so desperate to protect.

Now We Can Stop Talking About the iWatch.

I honestly couldn’t tell you how long I’ve been hearing people talk about Apple getting into the smartwatch business. It seemed every time WWDC or any other Apple event rolled around there’d be another flurry of speculation as to what their wearable would be. As with most rumours the details were scant and so the Internet, as always, circlejerked itself into a frenzy over a product that might not even have been in development. In the absence of a real product, competitors stepped up to the plate and, to their credit, their devices have started to look more compelling. Well, today Apple finally announced their Watch and it’s decidedly mediocre.

Apple Watch Space Black

For starters it makes the same mistake that many smartwatches do: it follows the same design trends as nearly every other smartwatch. Partly this is due to LCD screens being rectangular, limiting what you can do with them, however from a company like Apple you’d expect something that bucks the trend a bit. Instead you’ve got what looks like an Apple-ized version of the Pebble Steel: not entirely unpleasant but at the same time incredibly bland. I guess if you’re a fan of having a shrunken iPhone on your wrist then the style will appeal to you, but honestly smartwatches that look like smartwatches are a definite turn off for me and I know I’m not alone in thinking this.

Details as to what’s actually under the hood are scarce, probably because, unlike most devices Apple announces, you won’t be able to get your hands on this one right away. Instead you’ll be waiting until after March next year, with a starting price somewhere on the order of $350. That’s towards the premium end of the smartwatch spectrum, which shouldn’t be entirely unexpected, and could be indicative of the overall quality of the device. Indeed what few details they have let slip suggest some decent materials science behind it (both in the sapphire screen and the case metals), which should hopefully make for a more durable device.

Feature wise it’s pretty much as you’d expect, sporting the usual array of notifications pushed from your phone alongside a typical set of sensors. Apple did finally make its way into the world of NFC today, with both the Apple Watch and the new iPhone, so you’ll be able to load your credit card details into it and use the watch to make payments. Honestly that’s pretty cool, and definitely something I’d like to see other smartwatch manufacturers emulate, although I’m not entirely hopeful it’ll work anywhere bar the USA. Apple also touts an interface designed around the smaller screen but, without an actual sample to look over, I really couldn’t tell you how good or bad it is.

So all the blather and bluster that preceded this announcement was, surprise, completely overblown and the resulting product does nothing to stand out in the sea of computerized wrist adornments. I’m sure there’s a built-in market among current Apple fans but outside of that I really can’t see the appeal of the Apple Watch over the numerous other devices. Apple does have a good six months or so to tweak the product before release, so there’s potential for it to become something more before they drop it on the public.

DDR4 Appears on The Market; I Realise I’ve Been Under a Rock.

Whilst I don’t spend as much time as I used to keeping current with all things PC hardware related I still maintain a pretty good working knowledge of where the field is going. That’s partly due to my career being in the field (although I’m technically a services guy) but mostly it’s because I love new tech. You’d think then that DDR4, the next generation in PC memory, making its commercial debut wouldn’t be much of a surprise to me but I had absolutely no idea it was in the pipeline. Indeed had I not been building out a new gaming rig for a friend of mine I wouldn’t have known it was coming, nor that I could buy it today if I was so inclined.

Professional Memory Holder

Double Data Rate Generation 4 (DDR4) memory is the direct successor to the current standard, DDR3, which has been in widespread use since 2007. Both standards (indeed pretty much all memory standards) were developed by the Joint Electron Device Engineering Council (JEDEC), which has been working on DDR4 since about 2005. The reasoning behind the long lead times on new standards like this is complicated, but it comes down to getting everyone to agree on the standard, manufacturers developing products around it and then, finally, those products making their way into the hands of consumers. Thus whilst new memory modules come and go with the regular tech cycle, the standards driving them tend to stick around for the better part of a decade, which is probably why this writer neglected to keep current on it.

In terms of actual improvements DDR4 is an evolutionary step forward rather than a revolutionary one. That being said the improvements introduced with the new specification are nothing to sneeze at, one of the biggest being a reduction in the voltage (and thus power) that the specification requires. Typical DDR4 modules will use 1.2V compared to DDR3’s 1.5V, and the low voltage variant, typically seen in low power systems like smartphones, goes all the way down to 1.05V. To end consumers this won’t mean too much but for large scale deployments the savings from running the new memory add up very quickly.
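
As a rough illustration of why that voltage drop matters at scale: dynamic power in CMOS logic scales roughly with the square of the supply voltage, so 1.5V down to 1.2V suggests around a 36% reduction before any other efficiency gains. A back-of-envelope sketch follows; the per-module wattage and fleet size are assumptions for illustration, not figures from any datasheet.

```python
# Back-of-envelope DDR3 -> DDR4 power saving. Dynamic CMOS power scales
# roughly with V^2; real savings also depend on frequency, workload and
# process, so treat this as a rough guide rather than a datasheet figure.
DDR3_V, DDR4_V = 1.5, 1.2

relative_power = (DDR4_V / DDR3_V) ** 2  # ~0.64
print(f"Relative dynamic power: {relative_power:.2f} "
      f"(~{1 - relative_power:.0%} saving)")

# For a hypothetical datacentre of 10,000 modules at an assumed 4W each,
# the saving adds up quickly, just as it would for any large deployment.
modules, watts_per_module = 10_000, 4.0
saved_kw = modules * watts_per_module * (1 - relative_power) / 1000
print(f"Estimated fleet-wide saving: {saved_kw:.1f} kW")
```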

As you’d expect there’s also been a bump in the operating speed of DDR4 modules, ranging from 2133MHz all the way up to 4266MHz. Essentially the lowest tier of DDR4 will match the top performers of DDR3, and the headroom for future development is quite significant. This will have a direct impact on the performance of systems powered by DDR4 memory and, whilst most consumers won’t notice the difference, it’s definitely going to be a defining feature of enthusiast PCs for the next couple of years. I know I updated my dream PC specs to include it even though the first generation of products is only just hitting the market.

DDR4 chips are also meant to be a lot denser than their DDR3 predecessors, especially since the specification accommodates 3D layering technologies like those behind Samsung’s V-NAND. Many are saying this will lead to DDR4 being cheaper than DDR3 for a comparable amount of memory, however right now you’ll be paying about a 40% premium on pretty much everything if you want to build a system around the new style of memory. This is to be expected though, and whilst I can see DDR4 eventually eclipsing DDR3 on a price per gigabyte basis, that won’t happen for several years yet. DDR3 has 7 years’ worth of economies of scale built up and it won’t become irrelevant for a very long time.

So whilst I might be a little shocked that I was so out of the loop I didn’t know a new memory standard had made its way into reality, I’m glad it has. The improvements might be incremental rather than a bold leap forward but progress in this sphere is so slow that anything is worth celebrating. The fact that you can build systems with it today is just another bonus, one that I’m sure is making dents in geeks’ budgets the world over.

You Won’t See Blu-ray Archiving Anytime Soon.

Ask your IT administrator what medium they back all your data up to and the answer is likely some form of magnetic tape. For many people that’d be somewhat surprising, as the last time they saw a tape was probably a couple of decades ago and it wasn’t used to store much more than a blurry movie or a couple of songs. However in the world of IT archiving and backup there’s really no other medium that can beat tapes for capacity, durability or cost. Many have tried to unseat tapes from their storage crown but they’re simply too good at what they do, and Facebook’s latest experiment, using Blu-ray disc caddies as an archiving solution, isn’t likely to take over from them anytime soon.

Facebook Blu-ray Archiving

The idea Facebook has come up with is, to their credit, pretty novel. Essentially they’ve created small Blu-ray caddies, each of which contains 12 discs. These are all housed in a robotic enclosure about the size of a standard server rack. Each of these racks is capable of storing up to 10,000 discs, which gives a total of 1PB of storage in a single rack. Primarily it seems to be a response to their current HDD based backup solutions which, whilst providing better turnaround times for access, are typically far more costly than other archiving solutions. What interests me though is why Facebook would be pursuing something like this when there are other archiving systems already available, ones with a much better ROI.

The storage figures quoted peg the individual disc sizes at 100GB, something covered by the BD-R XL specification. These discs aren’t exactly cheap and, whilst I’m sure you could get a decent discount when buying 10,000 of them, the street price is currently on the order of $60 each. Even if they were able to get a 50% discount on those discs you’d still be on the hook for about $300K just for the media. If you wanted a similar amount of storage on tape (say using the 1.5TB HP LTO-5, which can be had for $40) you’re only paying about $27K, roughly a tenth of the cost. You could halve that again if you used compression on the tapes, although at that price point you don’t really need to.
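
Running those numbers explicitly, using the street prices quoted above (drives, robots and power are ignored on both sides):

```python
import math

# Rough media-only cost comparison for 1PB of cold storage,
# using the street prices quoted in the text.
target_gb = 1_000_000  # 1PB

# Blu-ray: 100GB BD-R XL discs at ~$60 street, assuming a 50% bulk discount.
discs = target_gb // 100            # 10,000 discs
bluray_cost = discs * 60 * 0.5      # ~$300,000

# Tape: 1.5TB HP LTO-5 cartridges at ~$40 each, no compression.
tapes = math.ceil(target_gb / 1500)  # 667 tapes
tape_cost = tapes * 40               # ~$26,680

print(f"Blu-ray: {discs:,} discs -> ${bluray_cost:,.0f}")
print(f"Tape:    {tapes:,} tapes -> ${tape_cost:,.0f}")
print(f"Tape media is roughly {bluray_cost / tape_cost:.0f}x cheaper")
```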

Indeed pretty much every advantage Facebook purports this Blu-ray storage system to have is a benefit you already get with a tape drive. Tapes are low power, as storing them requires no active current draw, are readily portable (indeed there are entire companies dedicated to doing this for you) and have many of the same durability qualities that optical discs do. When you combine this with the fact that tape is an already proven technology with dozens of competitive offerings on the table, it really does make you wonder why Facebook is investigating this idea at all.

I’d hazard a guess it’s just another cool engineering project, something they’ll trial for a little while before mothballing it completely once they look at the costs of actually bringing it into production. I like the idea, and it’s always good to see companies challenging the status quo, however sometimes the best solutions are the ones that have stood the test of time. Tapes, whether you love them or hate them, outclass this system in almost every way possible, and that won’t change until you can get Blu-ray discs at the same dollars per gigabyte as tape. Even then Facebook will have to try hard to find some advantage Blu-ray has that tape doesn’t, and right now I don’t think anyone can come up with one.

Can you?

Why We Need the Full FTTP NBN.

The unfortunate truth about telecommunications within Australia is that everyone is under the iron rule of a single company: Telstra. Whilst the situation has improved somewhat in the last decade, mostly under threat of legal action from the Australian government, Australia still remains something of an Internet backwater. This can almost wholly be traced back to the lack of investment on Telstra’s part in new infrastructure, with their most advanced technology being the aging HFC networks that were only deployed in limited areas. This is why the NBN was such a great idea: it would radically modernize our telecommunications network whilst also ensuring we were no longer under the control of a company that had long since given up on innovating.

Australia's Shitty Internet

To us Australians my opening statements aren’t anything surprising; this is the reality we’ve been living with for some time now. However when outsiders look in, like the free CDN/DDoS protection service Cloudflare (who I’ve recently started using again), and find that bandwidth from Telstra is about 20 times more expensive than their cheapest providers, it does give you some perspective on the situation. Whilst you’d expect some variability between locations (given the number of dark fiber connections and other infrastructure) a 20x premium does appear wildly out of proportion. The original NBN would have solved this, upending Telstra’s grip on the backbone connections that drive these prices; the Liberals’ new MTM solution will do none of this.

Right now much of the debate around the NBN has been framed in terms of the speeds that will be delivered to customers, however that’s really only half the story. In order to support the massive speed increases customers would see with the FTTP NBN, the back end infrastructure would need to be upgraded as well, including the interconnects that drive the peering prices Cloudflare sees. Such infrastructure would also form the backbone of the wide area networks that businesses and organisations use to connect their offices together, not to mention all the other services that rely on backhaul bandwidth. The MTM NBN simply doesn’t have the same requirements, nor the future expandability, to necessitate investment in this kind of back end infrastructure and, worse still, the last mile connections will remain under the control of Telstra.

That last point is one I feel doesn’t get enough attention in the mainstream media. The Liberals have released several videos harping on about making the “right” amount of investment in the NBN, citing a cut off point past which extra bandwidth doesn’t enable people to do anything more. The problem with that thinking, though, is that with the MTM NBN you cannot guarantee that everyone will have access to those kinds of speeds. Indeed the MTM NBN can only guarantee 50Mbps to people who are 200m or less from their node which, unfortunately, the vast majority of Australians aren’t. Comparatively, FTTP can deliver the same speeds regardless of distance and has the ability to provide far higher speeds well into the future.

In all honesty the NBN has been transformed from a long term, highly valuable infrastructure project into a political football, one the Liberal party is intent on kicking around for as long as it suits their agenda. Australia had the potential to become a leader in Internet services, with an expansive fiber network that would have rivalled any other worldwide. Instead we have a hodgepodge solution that does nothing to address the issues at hand, and the high broadband costs, for consumers and businesses alike, will continue for as long as Telstra controls the vast majority of the critical infrastructure. Maybe one day we’ll get the NBN we need, but that day seems to slip further away with each passing year.