There are certain fundamental limitations when it comes to current wireless communications. Mostly it comes down to the bandwidth of the frequencies used: as more devices come online, the more congested those frequencies become. Simply changing frequencies isn’t enough to solve the problem, however, especially for a technology as ubiquitous as wifi. This is what has driven many to look for alternative technologies, some looking to make the interference work for us whilst others look to do away with radio frequencies entirely. Li-Fi is a proposed technology that uses light instead of RF to transmit data and, whilst it posits speeds up to 100 times faster than conventional wifi, I doubt it will ever become the wireless communication technology of choice.
Li-Fi utilizes standard light bulbs that are switched on and off in nanoseconds, too fast for the human eye to perceive any change in the light’s output. Whilst the lights need to remain in an on state in order to transmit data, they are apparently still able to transmit when the light level is below what the human eye can perceive. A direct line of sight isn’t required for the technology to work either, as light reflected off walls was still able to produce a usable, albeit significantly reduced, data signal. The first commercial products were demonstrated sometime last year, so the technology isn’t just a nice theory.
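To illustrate the principle, here’s a toy sketch (decidedly not a real Li-Fi stack) of the simplest modulation scheme such a system could use, on-off keying: each bit becomes a light pulse in a time slot and the receiver thresholds whatever intensity its photodiode measures. The 0.2 attenuation factor for the wall-reflected path is an invented figure purely for illustration:

```python
# Toy on-off keying (OOK): 1 = LED on for a time slot, 0 = LED off.
# Real Li-Fi modulation is far more sophisticated; this just shows
# why a weaker, wall-reflected signal can still carry data.

def encode_ook(bits):
    """Map each bit to a light intensity level for one time slot."""
    return [1.0 if b else 0.0 for b in bits]

def decode_ook(levels, threshold=0.5):
    """Receiver side: threshold the intensity the photodiode sees."""
    return [1 if level > threshold else 0 for level in levels]

message = [1, 0, 1, 1, 0, 0, 1]
signal = encode_ook(message)

# Light bounced off a wall arrives attenuated (made-up factor).
reflected = [level * 0.2 for level in signal]

assert decode_ook(signal) == message                    # direct line of sight
assert decode_ook(reflected, threshold=0.1) == message  # weaker, still usable
```

The same sketch hints at the sunlight problem too: a constant bright ambient level pushes every slot above the threshold, wiping out the signal unless the receiver can filter it out.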
However, such technology is severely limited by numerous factors. The biggest limitation is that it can’t work without near or direct line of sight between the sender and receiver, which means a transmitter is required in every discrete room you want to use your receiver in. This also means that whatever feeds data into those transmitters, like say a cabled connection, also needs to be present. Compared to a wifi endpoint, which usually just needs to be placed in a central location to work, this is a rather heavy requirement to satisfy.
Worse still, this technology cannot work outside due to sunlight overpowering the signal. This likely also means that any indoor implementation would suffer greatly if sunlight were entering the room. Thus the claim that Li-Fi would be 100 times faster than conventional wifi is likely just a laboratory number, not representative of real world performance.
The primary driver for technologies like these is convenience, something which Li-Fi simply can’t provide given its current limitations. Setting up a Li-Fi system won’t be as easy as screwing in a few new light bulbs; it will likely require heavy investment in either cabling infrastructure or ethernet-over-power systems to support the transmitters. Compare this to any wifi endpoint, which just needs one data connection to cover a large area (and can be set up in minutes), and I’m not sure customers will care how fast Li-Fi can be, especially if they also have to buy a new smartphone to use it.
I’m sure there will be some niche applications of this technology but past that I can’t really see it catching on. Faster speeds are always great but they’re all for naught if the limitations on their use are as severe as they are with Li-Fi. Realistically you can get pretty much the same effect with a wired connection and even then the most limiting factor is likely your Internet connection, not your interconnect. Of course I’m always open to being proved wrong on this but honestly I can’t see it happening.
The last time I wrote about Amazon Prime Air was almost 2 years ago to the day and back then it seemed to be little more than a flight of fancy. Drones, whilst somewhat commonplace, were still something of an emerging space, especially when it came to regulations and companies making use of them. Indeed the idea instantly ran afoul of the FAA, something which Amazon was surprisingly blasé about at the time. Still, there had been musings of them continuing development of the program and today they’ve shown off another prototype drone that they might use in the future.
The drone is an interesting beast, capable of both VTOL and regular flight. This was most likely done to increase the effective range of the craft, as traditional flight is a lot less energy intensive than pure VTOL flight. The new prototype drone has a stated range of 16 miles (about 26 km), which you’d probably have to cut in half for the return trip. Whilst that’s likely an order of magnitude above the previous prototype they showcased 2 years ago, it still means that a service based on them will either be very limited or Amazon is planning a massive shakeup of its distribution network.
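Running the numbers on that stated range makes the point. This quick sketch uses the 16-mile figure from above, with a hypothetical 10% reserve for wind and diversions (the reserve is my assumption, not anything Amazon has published):

```python
# Back-of-the-envelope delivery radius for a 16-mile-range drone
# that must fly out and back on a single charge.

MILES_TO_KM = 1.609344

def service_radius_km(range_miles, reserve_fraction=0.1):
    """Usable one-way radius: total range, minus a safety reserve,
    split between the outbound and return legs."""
    usable_km = range_miles * MILES_TO_KM * (1 - reserve_fraction)
    return usable_km / 2

print(round(service_radius_km(16, reserve_fraction=0.0), 1))  # 12.9 km, no reserve
print(round(service_radius_km(16), 1))                        # 11.6 km with reserve
```

Either way, each drone can only service customers within roughly a dozen kilometres of a warehouse, which is exactly why the distribution network would need such a shakeup.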
Of course the timing of this announcement (and the accompanying video below), mere hours before the yearly Cyber Monday sale starts in earnest, is no coincidence. Amazon Prime Air is undeniably a marketing tactic, one that’s worked well enough in the past to warrant them trying it again in order to boost sales on this day. On the flip side Amazon does seem pretty committed to the idea, with their various proposals for airspace usage and “dozens of prototypes” in the works; however until they start offering the service to real customers it’s going to be easy to remain skeptical.
Last time I wrote about Amazon Prime Air one of my local readers mentioned that a similar service was looking to take off here in Australia. The offering was going to be a joint effort between Flirtey, a delivery drone developer, and Zookal, a local textbook sale and rental service. They were targeting mid last year for their first delivery by drone, however that never came to pass. Indeed an article from earlier this year was all I could dredge up on the service, and even then they had yet to use the drones commercially. To their credit Flirtey did make the first drone delivery in the USA in July this year, so the technology is there; it just needs to be put to use.
Whether or not something like this will see widespread adoption however is something I’m still not sure on. Right now the centralized distribution models that most companies employ simply don’t work with the incredibly limited range that most drones have. Even if the range issue could be solved I’m still not sure if it would be economical to use them, unless the delivery fees were substantially higher (and then how many customers would pay for that?). Don’t get me wrong, I still think it’d be incredibly cool to get something delivered by drone, but at this point I’m still not 100% sold on the idea that it can be done economically.
There’s little doubt now that the Multi-Technology Mix was never a viable path forward for the NBN. The tenets of faster, cheaper and sooner have all fallen by the wayside in one way or another. The speed guarantees were dropped very quickly as NBNCo (now known as just nbn™) came face to face with the reality that the copper network simply couldn’t support them. The cost of their solution has come into question numerous times and has been shown to be completely incorrect. Worse still, the subsequent cost blowouts are almost wholly attributable to the changes made by the MTM switch, not the original FTTP solution. Lastly, with the delays that the FTTN trials have experienced, along with the disruption to provisioning activities that were already under way, there is no chance that we’ll have it sooner. On top of all that, it appears that the HFC network, the backbone upon which Turnbull built his MTM idea, isn’t up to the task of providing NBN services.
The leaked report shows that, in its current state, the Optus HFC network simply doesn’t have the capacity, nor does it meet the standards required, to service NBN customers. Chief among the numerous issues listed in the presentation is the fact that the Optus cable network is heavily oversubscribed and would require additional backhaul and nodes to support new customers. Among the other issues listed are pieces of equipment in need of replacement, ingress noise reducing user speeds and the complexity of the established HFC network’s multipathing infrastructure. All told, the cost of remediating this network (or “overbuilding” it, as they’re calling it) ranges from $150 million up to $800 million, in addition to the capital already spent to acquire the network.
Some of the options presented to fix this situation are frankly comical, like the idea that nbn should engage Telstra to extend their HFC network to cover the areas currently serviced by Optus. Further options peg FTTP as the most expensive, with FTTdp (fiber to the distribution point) and FTTN coming in as the cheaper alternatives. The last one is some horrendous mix of FTTdp and Telstra HFC which would just lead to confusion for consumers, with two NBN offerings in the same suburb providing wildly different services and speeds. Put simply, the state Optus’ HFC network is in has no good solution other than the one the original NBN plan had in mind.
The ubiquitous fiber approach that the original NBN sought to implement avoided all the issues that the MTM solution is now encountering for the simple fact that we can’t trust the current state of any of the networks deployed in Australia. It has been known for a long time that the copper network is aging and in dire need of replacement, unable to reliably provide the speeds that many consumers now demand. The HFC network has always been riddled with issues with nearly every metro deployment suffering from major congestion issues from the day it was implemented. Relying on both these things to deliver broadband services was doomed to fail and it’s not surprising that that’s exactly what we’ve seen ever since the MTM solution was announced.
Frankly this kind of news no longer surprises me. I had hoped that the Liberals would have just taken credit for the original idea that Labor put forward, but they went one step further and trashed the whole thing. A full FTTP solution would have catapulted Australia to the forefront of the global digital economy, providing benefits far in excess of its cost. Now, however, we’re likely decades away from achieving that, all thanks to the short-sightedness of a potentially one term government. There really is little to hope for when it comes to the future of the NBN and there’s no question in my mind of who is to blame.
Long time readers will know that I’ve long held the belief that OSX and iOS were bound to merge at some point in the future. For me the reasons for thinking this are wide and varied, but it is most easily seen in the ever vanishing delineation between the two hardware lines that support them. The iPad Pro was the last volley that iOS launched against its OSX brethren and, for me, was concrete proof that Apple was looking to merge the two product lines once and for all. Some recent off-hand remarks from CEO Tim Cook convinced many of my line of thinking, enough so that Cook has since come out saying that Apple won’t be developing a converged Mac/iPad device.
That statement probably shouldn’t come as much of a surprise given that Cook called the Surface Book “deluded” just under a week ago. Whilst I can understand that it’s every CEO’s right to have a dig at the competition, the commentary from Cook does seem a little naive in this regard. The Surface has shown that there’s a market for a tablet-first laptop hybrid and there’s every reason to expect a laptop-first tablet hybrid will meet similar success. Indeed the initial reactions to the Surface Book are overwhelmingly positive, so Cook might want to reconsider the rhetoric he’s using on this, especially if Apple ever starts eyeing off creating a competing device like they did with the iPad Pro.
The response about non-convergence though is an interesting one. Indeed, as Windows 8 showed, spanning a platform between all types of devices can lead to a whole raft of compromises that leaves nobody happy. However Microsoft has shown that it can be done right with Windows 10, and the Surface Book is their chief demonstrator of how a converged system can work. By insisting that the two platforms will never meet in the middle, apart from the handful of integration services that already work across both, Cook limits the potential synergy that could be gained from such integration.
At the same time I get the feeling that the response might have been born out of the concern he stirred up with his previous comment about not needing a PC any more. He later clarified that as not needing a PC that’s not a Mac, since Macs are apparently not Personal Computers. For fans of the Mac platform this felt like a clear signal that Apple sees PCs as an also-ran, something they keep going to endear brand loyalty more than anything else. When you look at the size of the entire Mac business compared to the rest of Apple it certainly looks that way, with it making up less than 10% of the company’s earnings. For those who use OSX as their platform for creation the consternation about it going away is a real concern.
As you can probably tell I don’t entirely believe Tim Cook’s comments on this matter. Whilst no company would want to take an axe to a solid revenue stream like the Mac platform, the constant blurring of the lines between the OSX and iOS based product lines makes a merged future seem inevitable. It might not come as a big bang, with the two wed in an unholy codebase marriage, but over time I feel the lines differentiating the two product lines will be so blurred as to be meaningless. Indeed if the success of Microsoft’s Surface line is anything to go by, Apple may have their hand forced in this regard, something few would have ever expected to see happen to a market leader like Apple.
I was always of the opinion that the health trackers on the market were little more than gimmicks. Most of them were glorified pedometers worn by people who wanted to look fitness conscious rather than actually using them to stay fit. The introduction of heart rate tracking, however, presented functionality that wasn’t available before and piqued my interest. However the lack of continuous passive heart rate monitoring meant that they weren’t particularly useful in that regard, so I held off until that was available. The Jawbone Up3 was the first to offer that functionality and, whilst it’s still limited to non-active periods, was enough for me to purchase my first fitness tracker. After using it for a month or so I thought I’d report my findings, as most of the reviews out there focus on the device at launch rather than how it is now.
The device itself is small and lightweight, and once it’s on it’s easy to forget it’s strapped to your wrist. The band adjustment system is a little awkward, requiring you to take it off to adjust it and then put it back on, but once you get it to the right size it’s not much of an issue. The charging mechanism could be done better as it requires you to line up all the contacts perfectly or the band simply won’t charge. It’d be far better to have an inductive charging system, however given the device’s size and weight I’d hazard a guess that wasn’t an option. For the fashion conscious the Up3 seems to go unnoticed by most, with only a few people I knew noticing it over the time I’ve had it. Overall as a piece of tech I like it, however looks aren’t everything when it comes to fitness trackers.
The spec sheet for the Up3 has a laundry list of sensors in it, however you really only get to see the data collected from two of them: the pedometer and the heart rate monitor. Whilst I understand that having all that data would be confusing for most users, for someone like me it’d definitely be of interest. This means that, whilst the Up3 might be the most feature packed fitness tracker out there, in terms of actual, usable functionality it’s quite similar to a lot of bands already on the market. For many that will make the rather high asking price a hard pill to swallow. There have been promises of access to more data through the API for some time now, but so far they have gone unfulfilled.
What the Up3 really has going for it though is the app, which is well designed and highly functional. Setting everything up took about 5 minutes and it instantly began tracking everything. The SmartCoach feature is interesting as it skirts around providing direct health advice but tries to encourage certain well established healthy behaviours. All the functions work as expected, with my favourite being the sleep alarm. Whilst it took a little tweaking to get right (initially it mostly just went off at the exact time I’d set), once it was dialled in I definitely felt more awake when it buzzed me. It’s not a panacea for your sleep woes, but it did give me insight into what behaviours might have been affecting my sleep patterns and what I could do to fix them.
The heart rate tracking seems relatively accurate from a trend point of view. I could definitely tell when I was exercising, sitting down or in a particularly heated meeting where my heart was racing. It’s definitely not 100% accurate, as there were numerous spikes, dips and gaps in the readings, which often meant that the daily average was not entirely reliable. Again it was more interesting to see the trend over time and link deviations to certain behaviours. If accuracy is the name of the game, however, the Up3 is probably not for you, as it simply can’t be used for more than averaging.
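To show why spiky data makes the daily average unreliable, here’s a small sketch of the kind of cleanup you’d want before averaging: a median filter knocks out single-sample spikes and dips, and gaps are simply skipped. The readings are invented numbers for illustration, not actual Up3 data:

```python
# Median-filter a noisy heart-rate series before averaging.
# None marks a gap where the sensor dropped out.

def clean_series(samples):
    """Drop gaps, then replace each reading with the median of
    itself and its immediate neighbours."""
    present = [s for s in samples if s is not None]
    out = []
    for i in range(len(present)):
        window = sorted(present[max(0, i - 1):i + 2])
        out.append(window[len(window) // 2])
    return out

readings = [62, 64, 180, 63, None, 61, 20, 65, 64]  # spike, gap, dip
cleaned = clean_series(readings)

raw_avg = sum(s for s in readings if s is not None) / 8
print(round(raw_avg), round(sum(cleaned) / len(cleaned)))  # 72 vs 63 bpm
```

A single 180 bpm spike drags the naive average up by almost 10 bpm, which is exactly the kind of distortion that made the Up3’s daily figures hard to trust.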
What’s really missing from the Up3 and its associated app is the integration and distillation of all the data it’s able to capture. Many have looked to heart rate monitoring as a way to get more accurate calorie burn rates, but the Up3 only uses the pedometer input for this. The various other sensor inputs could also prove valuable in determining passive calorie burn rate (I, for instance, tend to run “hotter” than most people, something the skin temperature sensor can pick up on) but again their data goes unused. On a pure specification level the Up3 is the most advanced tracker out there, but that means nothing if the technology isn’t put to good use.
Would I recommend buying one? I’m torn, honestly. On the one hand it does do the basic functions very well and the app looks a lot better than anything the competition has put out so far. However you’re paying a lot for technology that you’re simply not going to use, hoping that it will become available sometime in the future. Unless the optical heart rate tracking of other fitness trackers isn’t cutting it for you, it’s hard to recommend the Up3 over them; other, simpler trackers provide much the same benefit for a lower price. Overall the Up3 has the potential to be something great, but paying for potential rather than actual functionality is something only early adopters do. That was an easier sell 6 months ago, but with only one major update since then I don’t think many are willing to buy something on spec.
You’ve likely seen examples of 360º video on YouTube before, those curious little things that allow you to look around the scene as it plays out. Most of these come courtesy of custom rigs that people have created to capture video from all angles, using software to stitch it all together. Others are simply CGI that’s been rendered in the appropriate way to give you the full 360º view. Whilst these are amazing demonstrations of the technology they all share the same fundamental limitation: you’re tethered to the camera. True 3D video, where you’re able to move freely about the scene, is not yet a reality, but it will be soon thanks to Lytro’s new camera, the Immerge.
That odd UFO looking device is the Immerge, containing hundreds of lightfield sensors (the technology that powered the original Lytro and the Illum) within each of its rings. There’s no change in the underlying technology, the lightfield sensors have the same intensity plus direction sensing capabilities, however these will be the first sensors in Lytro’s range to boast video capture. This, combined with the enormous array of sensors, allows the Immerge to capture all the details of a scene, including geometry and lighting. The resulting video, which is captured and processed on a specially designed server that accompanies the camera, allows the viewer to move around the scene independently of the camera. Suffice to say that’s a big step up from the 360º video we’re used to seeing today and, I feel, is what 3D video should be.
The Immerge poses some rather interesting challenges however, both in terms of content production and its consumption. For starters it’s wildly different from any kind of professional camera currently available, one that doesn’t allow a crew to be anywhere near it whilst it’s filming (unless they want to be part of the scene). Lytro understands this and has made it remotely operable, however that doesn’t detract from the fact that traditional filming techniques simply won’t work with the Immerge. Indeed this kind of camera demands a whole new way of thinking, as you’re no longer in charge of where the viewer will be looking, nor where they’ll end up in a scene.
Similarly on the consumer end the Immerge relies on the burgeoning consumer VR industry to provide an effective platform for it to really shine. This isn’t going to be a cinema style experience any time soon, the technology simply isn’t there; instead Immerge videos will likely be viewed by people at home on their Oculus Rifts or similar. There’s definitely a growing interest in this space by consumers, as I’ve detailed in the past, however for a device like the Immerge I’m not sure that’s enough. There are other possibilities that I’m not thinking of, like shooting on the Immerge and then editing everything down to a regular movie, which might make it more viable, but I feel like that would be leaving so much of the Immerge’s potential at the door.
Despite all that though the Immerge does look like an impressive piece of kit and it will be able to do things that no other device is currently capable of doing. This pivot towards the professional video market could be the play that makes their struggle in the consumer market all worthwhile. We won’t have to wait long to see it either as Lytro has committed to the Immerge being publicly available in Q1 next year. Whether or not it resonates with the professional content creators and their consumers will be an interesting thing to see as the technology really does have a lot of promise.
The lukewarm reception that Windows 8 and 8.1 received meant that many customers held steadfast to their Windows 7 installations. Whilst it wasn’t a Vista level catastrophe it was still enough to cement the idea that every other version of Windows was worth skipping. At the same time however it also set the stage for making Windows 7 the new XP, opening up the potential for history to repeat itself many years down the line. This is something that Microsoft is keen to avoid, aggressively pursuing users and corporations alike to upgrade to Windows 10. That strategy appears to be working and Microsoft seems confident enough in the numbers to finally cut the cord with Windows 7, stopping sales of the operating system from October next year.
It might sound like a minor point, indeed you haven’t been able to buy most retail versions of Windows 7 for about a year now, however it’s telling of how confident Microsoft is feeling about Windows 10. The decision to cut all versions but Windows 7 Pro from OEM offerings was due to the poor sales of 8/8.1, something which likely wouldn’t have improved with Windows 10 so close to release. The stellar reception that Windows 10 received, passing both of its beleaguered predecessors in under a month, gave Microsoft the confidence it needed to put an end date on Windows 7 sales once and for all.
Of course this doesn’t mean that the current Windows 7 install base is going anywhere; it still has extended support until 2020. This is a little shorter than XP’s lifecycle was, 11 years vs 13 years, and Windows 10’s lifespan (in its current incarnation) is set to be shorter again at 10 years. Thankfully this will present fewer challenges to consumers and enterprises alike, given that Windows 7 and Windows 10 share much of the same codebase under the hood. Still, the majority of the growth in Windows 10’s marketshare has likely come from the consumer space rather than the enterprise.
This is most certainly the case among gamers, with Windows 10 now representing a massive 27.64% of users on the Steam platform. Whilst that might sound unsurprising, PC gamers being the most likely to be on the latest technology, Windows 7 was widely regarded as one of the best platforms for gaming. Windows 8 (and by extension Windows 10, since most of the criticisms apply to both versions) on the other hand was met with some rather harsh criticism about what it could mean for PC gaming. Of course, here we are several years later: PC gaming is stronger than ever and gamers are adopting the newer platform in droves.
For Microsoft, who’ve gone on record saying that Windows 10 is slated to be the last version of Windows ever, cutting off the flow of previous versions is critical to ensuring that their current flagship OS reaches critical mass quickly. The early success they’ve seen has given them some momentum, however they’ll need an aggressive push over the holiday season in order to overcome the current slump they find themselves in. It’s proven popular among early adopters; now comes the hard task of convincing everyone else that it’s worth the trouble of upgrading. The next couple of quarters will be telling in that regard and will be key to ensuring Windows 10’s position as the de facto OS for a long time to come.
It’s rare that we see a technology come full circle like virtual reality has. Back in the 90s there was a surge of interest in it with the large, clunky Virtuality machines being found in arcades and pizza joints the world over. Then it fell by the wayside, the expensive machines and the death of the arcades cementing them as a 90s fad. However the last few years have seen a resurgence in interest in VR with numerous startups and big brands hoping to bring the technology to the consumer. For the most part they’re all basically the same however there’s one that’s getting some attention and when you see the demo below you’ll see why.
Taken at face value the above demo doesn’t really look like anything different from what current VR systems are capable of, however there is one key difference: no reference cards or QR codes anywhere to be seen. Most VR works off some form of visual cue so that it can determine things like distance and position, however Magic Leap’s system appears to have no such limitation. What’s interesting is that they’ve repurposed another technology to gather the required information. In the past I would’ve guessed a scanning IR laser or something similar, but it’s actually a light-field sensor.
Light-field sensors differ from traditional camera sensors by capturing directional information about the light in addition to brightness and colour. For the consumer grade cameras we’ve seen based on this technology, it meant that pictures could be refocused after the image was taken and even given a subtle 3D effect. Magic Leap, however, appears to be using a light-field sensor to map out the environment, providing a 3D picture of what it’s looking at. Then, with that information, they can superimpose a 3D model and have it realistically interact with the world (like the robot disappearing behind the table leg and the solar system reflecting off the table).
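The classic trick that made refocusing possible is easy to sketch. In this toy one-dimensional model (invented data, and nothing to do with Magic Leap’s actual pipeline), each row of the light field records the scene from a slightly different direction; “refocusing” is just shifting those directional slices by an amount proportional to depth and summing them:

```python
# Toy 1D light field: lightfield[u][x] is the intensity at spatial
# position x arriving from direction u -- the directional info a
# light-field sensor adds over a conventional one.

def refocus(lightfield, slope):
    """Shift-and-sum refocus: shift each directional slice by
    u * slope (slope encodes the focal depth), then average."""
    n_u, n_x = len(lightfield), len(lightfield[0])
    out = [0.0] * n_x
    for u, row in enumerate(lightfield):
        shift = (u - n_u // 2) * slope
        for x in range(n_x):
            src = x + shift
            if 0 <= src < n_x:
                out[x] += row[src] / n_u
    return out

# A point emitter whose parallax is 1 pixel per direction.
lf = [[0.0] * 9 for _ in range(3)]
for u in range(3):
    lf[u][4 + (u - 1)] = 1.0

print(refocus(lf, slope=1)[4])   # ~1.0: slices align, the point is sharp
print(max(refocus(lf, slope=0))) # ~0.33: wrong depth, energy is smeared
```

That per-direction parallax is also depth information: a nearby object shifts more between directional slices than a distant one, which is presumably the signal Magic Leap inverts to build its 3D map of the room.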
Whilst Magic Leap’s plans might be a little more sky high than an entertainment device (it appears they want to be a successful version of Google Glass), entertainment is almost certainly where their primary market will be. Whilst we’ve welcomed smartphones into almost every aspect of our lives, it seems that an always-on, wearable device like this is still irksome enough that widespread adoption isn’t likely to happen. Still, even in that “niche” there’s a lot of potential for technology like this and I’m sure Magic Leap will have no trouble finding hordes of willing beta testers.
Ever since my own failed attempt to build a 3D printer I’ve been fascinated by the rapid progress that has been made in this field. In under a decade 3D printing has gone from a niche hobby, one that required numerous hours to get working, to a commodity service. The engineering work has then been translated to different fields and numerous materials beyond simple plastic. However every so often someone manages to do 3D printing in a way that I had honestly never thought of, like this project where they 3D print a sculpture using rocks and string:
Whilst it might not be the most automated or practical way to create sculptures it is by far one of the most novel. Like a traditional selective laser sintering printer, each new layer is formed by piling material over the previous one. This is then secured by placing string on top of it, forming the eventual shape of the sculpture. They call this material reversible concrete, which is partly true: the aggregate they appear to be using looks like the stuff you’d use in concrete, however I doubt the structural properties match those of its more permanent brethren. Still, it’s an interesting idea that could have some wider applications outside the arts space.
The current MTM NBN is by all accounts a total mess. Every single promise that the Liberal party has made with respect to it has been broken. First the guaranteed speed being delivered to the majority of Australians was scrapped. Then the timeline blew out as the FTTN trials took far longer to accomplish than stated. Finally the cost of the network, widely touted as being a third of the FTTP solution’s, has since ballooned to well above any cost estimate that preceded it. The slim sliver of hope that all us technologically inclined Australians hang on to is that this current government goes single term and that Labor would reintroduce the FTTP NBN in all its glory. Whilst it seems that Labor is committed to their original idea, the future of Australia’s Internet will bear the scars of the Liberals’ term in office.
Jason Clare, who’s picked up the Shadow Communications Minister position in the last Labor cabinet reshuffle before the next election, has stated that they’d ramp up the number of homes connected to fiber if they’re successful at the next election. Whilst there are no solid policy documents available yet to determine exactly what that means, Clare has clearly signalled that FTTN rollouts are on the way out. This is good news, however it does mean that Australia’s Internet infrastructure won’t be the fiber heaven it was once envisioned to be. Instead we will be left with a network that’s mostly fiber, dotted with pockets of Internet backwaters that have little hope of change in the near future.
Essentially it would seem that Labor would keep current contract commitments, which would mean a handful of FTTN sites would still be deployed and anyone on a HFC network would remain on it for the foreseeable future. Whilst these are currently serviceable, their upgrade paths are far less clear than those of their fully fiber based brethren. This means that the money spent on upgrading the HFC networks, as well as any money spent on remediating copper to make FTTN work, is wasted capital that could have been invested in the superior fiber only solution. Labor isn’t to blame for this, I understand that breaking contractual commitments is something they’d like to avoid, but it shows just how much damage the Liberals’ MTM NBN plan has done to Australia’s technological future.
Unfortunately there’s really no fix for this, especially if you want something politically palatable.
If we’re serious about transitioning Australia away from the resources backed economy that’s powered us over the last decade, investments like the FTTP NBN are what we are going to need. There are clear relationships between Internet speeds and economic growth, something which would quickly make the asking price look extremely reasonable. Doing it half-arsed with a cobbled together mix of technologies will only result in a poor experience, dampening any benefits that such a network could provide. The real solution, the one that will last us as long as our current copper network has, is to make it all fiber. Only then will we be able to accelerate our growth at the same rapid pace as the rest of the world and only then will we see the full benefits of what a FTTP NBN can provide.