Technology


Fiber’s Future Looks Bright with Frequency Combs.

Fiber is the future of all communications; that's a fact any technologist will tell you. Whilst copper is still the mainstay for the majority, its lifetime is limited as optics fast approach the point where they're feasible for everything. Even fiber has its limits though, ones that some feel we're going to hit sooner rather than later, which could cause severe issues for the Internet's future. New research coming out of the University of California, San Diego, however, paves the way for boosting our fiber networks' bandwidth significantly.

Today's fiber networks are made up of long runs of fiber optic cable interspersed with devices called repeaters or regenerators. Essentially these devices are responsible for boosting the optical signal, which degrades as it travels down the fiber. The problem is that they're expensive, add latency and are power hungry, attributes that aren't exactly desirable. They exist because of a physical limitation of fiber networks which puts an upper limit on the amount of power you can send down an optical cable. Past a certain point, the more power you put down a fiber the more interference you generate, meaning there's only so much you can pump into a cable before you're doing more harm than good. The new research, however, proposes a novel way to deal with this: interfere with the signal before it's sent.

The problem with the interference generated by increasing the power of the signal is that it's ordinarily unpredictable, meaning there's no good way to combat it. The researchers, however, figured out a way of conditioning the signal before it's transmitted, using what they call "frequency combs", so that the interference becomes predictable. At the receiving end that predictability means the interference can be reversed, pulling a useful signal back out of the noise. In lab tests they were able to send a signal over 12,000km without the use of a repeater, an absolutely astonishing distance. Using such technology could drastically improve the efficiency of our current dark fiber networks, which would go a long way towards avoiding the bandwidth crunch.
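
To get a feel for why predictability matters so much, here's a toy numpy sketch (my own illustration, not the researchers' actual system): the "channel" applies a power-dependent phase twist, a crude stand-in for the Kerr effect, and because that twist is deterministic the receiver can undo it exactly.

```python
# Toy illustration (not the UCSD system): if the channel's distortion is
# deterministic, the receiver can apply its exact inverse.
import numpy as np

rng = np.random.default_rng(0)
gamma = 0.8  # assumed nonlinearity strength, arbitrary units

# A random QPSK-like stream standing in for the transmitted data
symbols = (rng.choice([-1, 1], 256) + 1j * rng.choice([-1, 1], 256)) / np.sqrt(2)

# "Channel": a power-dependent phase rotation, a crude stand-in for the Kerr
# effect. Crucially it is deterministic, not random noise.
received = symbols * np.exp(1j * gamma * np.abs(symbols) ** 2)

# Receiver: because the distortion is predictable, undo it exactly.
recovered = received * np.exp(-1j * gamma * np.abs(received) ** 2)

print(np.allclose(recovered, symbols))  # True: the original signal comes back out
```

The hard part in a real fiber is that the crosstalk from neighbouring wavelengths looks effectively random unless all the carriers are generated coherently, which is what the comb-based conditioning appears to provide.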

It will be a while before this technology makes its way into widespread use as, whilst it shows a lot of promise, the lab application still falls short of a practical implementation. Current optical fibers carry around 32 different signals whereas the system the researchers developed can currently only handle 5. Ramping up the number of channels it can support is a non-trivial task, but at least it's an engineering challenge and not a theoretical one.

The Near-Term Future of PC Storage.

I had grand ideas that my current PC build would be all solid state. Sure the cost would've been high, on the order of $1500 to get about 2TB in RAID10, but the performance potential was hard to deny. In the end however I opted for good old-fashioned spinning rust, mostly because current RAID controllers don't pass TRIM through to SSDs, meaning I'd likely be in for a lovely performance downgrade in the not-too-distant future. Despite that I was keenly aware of just how feasible it was to go full SSD for all my PC storage and how the days of the traditional hard drive are likely numbered.

Ever since their commercial introduction all those years ago SSDs have been plummeting in price, with the most recent drop coming off the back of a few key technological innovations. Whilst they're still an order of magnitude away from traditional HDDs in terms of cost per gigabyte (roughly $0.50/GB for SSD versus $0.05/GB for HDD) the gap in performance between the two is more than enough to justify the current price differential. For laptops and other portable devices that don't require large amounts of onboard storage SSDs have already become the sole storage platform in many cases, however they still lose out for large scale data storage. That differential could close quickly, although I don't think SSDs' rise to dominance will be instantaneous past that point.
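
To put those per-gigabyte figures into concrete terms, here's the back-of-the-envelope maths for a build like the one above (street prices vary, so treat the numbers as rough):

```python
# Rough cost comparison using the per-gigabyte figures quoted above
ssd_per_gb, hdd_per_gb = 0.50, 0.05  # USD per GB, approximate 2015 street prices

for raw_gb, label in ((2000, "2TB of raw capacity"),
                      (4000, "2TB usable in RAID10 (mirrored)")):
    print(f"{label}: SSD ~${ssd_per_gb * raw_gb:,.0f} vs HDD ~${hdd_per_gb * raw_gb:,.0f}")
```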

One thing that has always plagued SSDs is the question of their durability and longevity, as the flash cells upon which they rely have a defined life in terms of write cycles. Whilst SSDs have, for the most part, proven reliable even when deployed at scale, the fact is that they've really only had about 5 or so years of production level use to back them up. Compare that to hard drives, which have track records stretching back decades, and you can see why many enterprises are still tentative about replacing their fleet en masse; we just don't know how the various components that make up an SSD will stand the test of time.

Concerns like that are likely to take a back seat, however, if things like a 30TB drive by 2018 come to fruition. Increasing capacity on traditional hard drives has always proven to be a difficult affair as there are only so many platters you can fit in the standard space. Whilst we're starting to see a trickle of 10TB drives into the enterprise market they're likely not going to be available at a cost effective price point for consumers anytime soon, and that gives SSDs a lot of leeway to play catch-up to their traditional brethren. That means cost parity could come much sooner than many anticipated, and that's the point at which the decision about which storage medium to buy is essentially made for the consumer.

We likely won't see spinning rust disappear for the better part of a decade, but the next couple of years are going to see something of a paradigm shift in terms of which platform gets considered first. SSDs already reign supreme as the drive to have your operating system on; all they require now is a comparable cost per gigabyte to graduate beyond that. Once we reach that point it's likely to be an inflection point in the way we store our data and, for consumers like us, a great time to upgrade our storage.

NASA Gets Crafty: The Glitter Mirror.

Your garden-variety telescope is usually what's called a refracting telescope, one that uses a series of lenses to enlarge faraway objects for your viewing pleasure. For backyard astronomy they work quite well, often providing a great view of our nearby celestial objects, however for scientific observations they're usually not as desirable. Instead most large scientific telescopes are reflecting telescopes, which utilize a large mirror that reflects the image onto a sensor for capture. The larger the mirror the bigger and more detailed the picture you can capture, however bigger mirrors come with their own challenges, especially when you want to launch them into space. Thus researchers are always looking for novel ways to create a mirror, and one potential avenue that NASA is pursuing is, put simply, a little fabulous.

One method that many large telescopes use to get around the problem of creating huge mirrors is to use numerous smaller ones. This does introduce some additional complexity, like needing to make sure all the mirrors align properly to produce a coherent image on the sensor, however it also comes with added benefits, like being able to correct for distortions created by the atmosphere. NASA's new idea takes this to an extreme, replacing the mirror with a cloud of glitter-like particles held in place with lasers. Each of those particles then acts like a tiny mirror, much like its larger counterparts. Then, on the sensor side, software is being developed to turn the resulting kaleidoscope of colours back into a coherent image.

Compared to the traditional mirrors on telescopes, especially space based ones like the Hubble, this has the potential to significantly reduce weight whilst at the same time dramatically increasing the size of the mirror we can use. The bigger the mirror the more light that can be captured and analysed, and a mirror built from such a cloud of particles could be many times larger than its current counterparts. The current test apparatus (shown above) uses a traditional lens covered in glitter, which was used to validate the concept with 2 simulated "stars" shining through it. Whilst the current incarnation needed multiple exposures and a lot of image processing to create the final image, it does show that the concept could work; however it requires much more investigation before it can be used for real observations.
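
As a quick illustration of why mirror size matters so much: light-gathering power scales with collecting area, and so with the square of the diameter. A rough comparison against Hubble's 2.4m primary mirror (my own example, not NASA's figures):

```python
# Light grasp scales with collecting area, i.e. with the square of the diameter
HUBBLE_DIAMETER_M = 2.4  # Hubble's primary mirror is 2.4m across

def light_grasp_vs_hubble(diameter_m):
    return (diameter_m / HUBBLE_DIAMETER_M) ** 2

for d in (2.4, 10.0, 30.0):
    print(f"A {d:4.1f}m aperture gathers {light_grasp_vs_hubble(d):6.1f}x as much light as Hubble")
```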

A potential mission to verify the technology in space would use a small satellite with a prototype cloud, no bigger than a bottle cap in size. This would be primarily aimed at verifying that the cloud could be deployed and manipulated in space as designed and, if that proved successful then they could move on to capturing images. Whilst there doesn’t appear to be a strict timeline for that yet this concept, called Orbiting Rainbows, is part of the NASA Innovative Advanced Concepts program and so research on the idea will likely continue for some time to come. Whether it will result in an actual telescope however is anyone’s guess but such technology does show incredible promise.

The Nikola Phone Case is Bunk.

I understand that circuit fundamentals aren't part of the core curriculum for everyone, but the lack of knowledge around some electrical phenomena really astounds me. Whilst most people understand the idea of radio waves, at least to the point of knowing that they power our wireless transmissions and that they can be blocked by stuff, many seem to overestimate the amount of power that these signals carry. This misunderstanding is what has led several questionable Kickstarter campaigns to gain large amounts of funding, all on the back of faulty thinking that simply doesn't line up with reality. The latest incarnation of this comes to us in the form of the Nikola Phone Case, which purports to do things that are, simply, vastly overblown.

The Nikola Phone Case states that it's able to harvest the energy that your phone "wastes" when it's transmitting data using its wireless radios. The creators state that your phone uses a lot of power to transmit these signals and that only a fraction of them end up making their way to their destination. Their case taps into this wasted wireless signal, captures it, stores it and then feeds it back into your phone to charge its battery. Whilst they've yet to provide any solid figures (those are forthcoming in the next couple of weeks, according to the comments section) they have a lovely little animated graph that shows one phone at 70% after 8 hours (with case) compared to the other at 30% (without case). Sounds pretty awesome, right? Well, like most things which harvest energy from the air, it's likely not going to be as effective as its creators are making it out to be.

For starters, the idea hinges on tapping into only the "wasted" energy, which implies that it doesn't mess with the useful signal at all. The problem is there's really no way to tell which part of the signal is useful and which isn't so, most likely, the case simply gets in the way of all of it. That would lead to a reduction in signal strength across all radios, which usually means the handset attempts to boost its transmit power to improve reception, using more power in the process. The overall net effect would likely be the same battery life or worse, not the claimed significant increase.
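
There's also a more fundamental power-budget problem. Putting some ballpark numbers on it (my own rough figures, not measurements of the Nikola case): a phone's cellular radio tops out at around 200mW of transmit power, while a typical battery stores on the order of 10Wh, so even a generous harvesting scenario recovers a tiny fraction of a charge.

```python
# Back-of-the-envelope RF harvesting budget (rough illustrative figures,
# not measurements of the Nikola case)
tx_power_w = 0.2         # ~23dBm, roughly the most a phone's cellular radio transmits
harvest_fraction = 0.10  # generously assume the case captures 10% of that output
hours_transmitting = 8   # assume the radio transmits flat out all day (it doesn't)
battery_wh = 10          # ~2,600mAh at 3.8V, a typical 2015 smartphone battery

harvested_wh = tx_power_w * harvest_fraction * hours_transmitting
print(f"Energy harvested: {harvested_wh:.2f}Wh, "
      f"about {100 * harvested_wh / battery_wh:.1f}% of the battery")  # ~1.6%
```

Even under those generous assumptions the case claws back a percent or two of charge, nowhere near the 40 point gap shown in their graph.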

There's also the fact that, for most smartphones, battery drain isn't primarily driven by the device's radios. Today's smartphones carry processors as powerful as some desktops were 10 years ago and thus draw an immense amount of power. Couple that with the large screens and the backlights that power them and you'll often find that these things total up to much more battery usage than all of the radios do. Indeed, if you're on an Android device you can check this for yourself: you'll likely find that the screen and the various apps running in the background are responsible for most of the battery usage, not your radio.

There's nothing wrong with the Nikola Phone Case at a fundamental technological level, it will be able to harvest RF energy and pump it back into your phone no problem, however the claims of massive increases in battery life will likely not pan out. Like the creators of many similar devices before them, they've likely got far too excited about an effect that won't be anywhere near as significant outside the lab. I'll be more than happy to eat my words if they can give us an actual, factual demonstration of the technology under real world circumstances but, until then, I'll sit on this side of the fence, waiting for evidence to change my mind.

Science Powered Zen Garden.

I've always appreciated the simple beauty of Zen gardens, mostly from afar as my natural instinct is to run directly onto the perfectly groomed sand and mess it all up. That being said, whilst I may have kindled an interest in gardening recently (thanks to my wife giving me some chilli plants for Christmas) I have very little interest in creating one of these myself, even of the desktop variety. The video below, however, demonstrates a kind of Zen garden that I could very well see myself spending numerous hours with, mostly because it's driven by some simple, but incredibly cool, science.

On the surface it seems like a relatively simple mechanism of action: two steel balls roll their way across the sand and produce all sorts of patterns along the way. The reality is quite a bit more interesting however as, if you watch closely, you can see that the two steel balls' motion is linked around a single pivot point. This is because, as Core77's post shows, there's only a single arm underneath the table, one which most likely houses 2 independent magnets that are able to slide up and down its length. In all honesty this is far more impressive to me than how I would've approached the problem, as it makes producing the complex patterns that much more challenging. If it was left to me I would've had a huge array of magnets underneath the surface, but that seems like cheating after seeing this.
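
For the curious, here's a toy sketch of the geometry (my guess at how it works based on Core77's description, not the builder's actual control code): both balls share the arm's angle, and only their distances along it differ, which is what links their motion together.

```python
# Toy sketch: two magnets share one rotating arm beneath the sand. Both share
# the arm's angle theta; each slides independently along the arm's length.
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0, 20 * np.pi, 4000)
theta = t                              # the arm sweeps around at a constant rate
r1 = 0.9 * np.abs(np.sin(0.30 * t))    # one magnet slides in and out...
r2 = 0.6 * np.abs(np.cos(0.17 * t))    # ...as does the other, on its own schedule

for r, style in ((r1, "b-"), (r2, "r-")):
    plt.plot(r * np.cos(theta), r * np.sin(theta), style, linewidth=0.5)
plt.gca().set_aspect("equal")
plt.axis("off")
plt.show()
```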

OSX El Capitan: Countdown to Mac iOS.

When you think of Apple what kind of company do you think they are? Many will answer that they’re a technology company, some a computing company, but there are precious few who recognise them as a hardware company. Whilst they may run large non-hardware enterprises like the App Store and iTunes these all began their lives as loss-leaders for their respective hardware platforms (the iPhone and the iPod). OSX didn’t start out its life in that way, indeed it was long seen as the only competitor to Windows with any significant market share, however it has been fast approaching the same status as its iCompanions for some time now and the recently announced El Capitan version solidifies its future.

I haven't covered an OSX version in any detail since I mentioned OSX Lion in passing some 4 years ago now, and for good reason: there's simply nothing to write about. The Wikipedia entry on OSX versions sums up the differences in just a few lines and for the most part the improvements with each version come down to new iOS apps being ported across and the vague "under-the-hood" improvements that come with every release. The rhetoric from Apple surrounding the El Capitan release even speaks to this lack of major changes directly, citing things like "Refinements to the Mac Experience" and "Improvements to System Performance" as their key focus. Whilst those kinds of improvements are welcome in any OS release, the fact that the last 6 years haven't seen much in the way of innovation in the OSX product line is telling of where it's heading.

The Mountain Lion release of OSX was the first indication that OSX was heading towards an iLine style of product, with many iOS features making their way into the operating system. Mavericks continued this with the addition of another 2 previously iOS-exclusive apps, and Yosemite brought Handoff to bridge across to other iOS devices. El Capitan doesn't make any specific moves forward in this regard, however it is telling that Apple's latest flagship computing product, the revamped and razor thin Macbook, is much more comparable to an upscale tablet than it is to an actual laptop. In true Apple fashion it doesn't really compare with either, attempting to define a new market segment in which they can be the dominant player.

If it wasn't obvious, what I'm getting at here is that OSX is fast approaching two things: becoming another product in the iOS line and, in terms of being a desktop OS, irrelevance. Apple has done well with their converged ecosystem, achieving a level of unification that every other ecosystem envies, however that strategy is most certainly focused on the iOS line above all else. This is most easily seen in the fact that the innovation happens on iOS and is then ported back to OSX, not something I feel Apple would want to continue doing long into the future. Thus it seems inevitable that OSX will eventually pass the torch to iOS running on a laptop form factor; it's just a matter of when.

This is not to say it would be a bad thing for the platform, far from it. In terms of general OS level tasks OSX performs more than adequately and has done so for the better part of a decade. What it does mean, however, is that the core adherents who powered Apple's return from the doldrums all those years ago are becoming a smaller part of Apple's overall strategy and will thus receive much less love in the future. For Apple this isn't much of a concern, the margins on PCs (even their premium models) have always been slim compared to their consumer tech line. However those who have a love for all things OSX might want to start looking at making the transition if an iOS based future isn't right for them.

Batteriser: It’ll Extend Battery Life, Just not as Much as Claimed.

Alkaline batteries are something that many of us have come to accept as a necessary evil. Either you pay up the premium price for the good batteries, and hope that their claims are as good as they say they are, or you risk it with a giant pack of cheap ones hoping that you don’t have to replace them every other week. They’re less of an issue now that decent rechargeables come as standard for many devices but I can count at least 4 devices in my living room still powered by these disposable power packs. I’ve heard all sorts of strange ways to extend the life of these things, from tapping them on something or warming them up in your hands, but none really provide a good solution. Batteriser purports to be able to squeeze up to 800% more life out of your alkaline cells, all by slipping on this thin metal sleeve.

All alkaline batteries start out their life providing a healthy 1.5V to the device they're powering. As a battery becomes depleted, however, its output voltage begins to drop, slowly tapering down before falling off a cliff at a point that depends on the battery's formulation. This is how all modern battery meters, including the one in your phone, work: by comparing the battery's current output voltage against its ideal voltage and giving you an estimated percentage of charge remaining. Batteriser's claim to fame is that, regardless of the current output of the battery, it will boost the voltage back up to 1.5V, making the battery appear "new" until all of its chemical energy runs out. On the surface that might sound like some form of wizardry but it's a well known circuit, commonly referred to as a Joule Thief.

Essentially the Batteriser is a DC to DC converter, one that takes a wide range of input voltages and boosts them up to 1.5V. They're coy on exactly how far down their converter can go, but considering that most alkaline cells become completely unusable below the 0.6V mark my guess is that it doesn't work past that. Their claim is that most devices will consider a battery dead once it drops below about 1.4V and that, if you take the rather gross assumption that battery capacity decreases linearly as a function of voltage (which the discharge graphs I linked to earlier clearly show is false), the sleeve can provide almost 800% more life out of a single AA battery.
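
To give a sense of what a boost converter like this has to do, here's the idealised textbook relationship (V_out = V_in / (1 - D), so duty cycle D = 1 - V_in/V_out) applied to a sagging cell; the load and efficiency figures are a generic illustration, not Batteriser's actual design.

```python
# Idealised boost-converter maths (textbook relationship, not Batteriser's design):
# V_out = V_in / (1 - D)  =>  duty cycle D = 1 - V_in / V_out
v_out = 1.5        # the "like new" voltage the sleeve presents to the device
p_out = 0.5        # watts the device draws; an arbitrary example load
efficiency = 0.85  # assumed converter efficiency

for v_in in (1.5, 1.2, 1.0, 0.8, 0.6):
    duty = 1 - v_in / v_out
    i_in = p_out / (efficiency * v_in)  # current drawn from the cell rises as it sags
    print(f"cell at {v_in:.1f}V: duty cycle {duty:.0%}, draws {i_in:.2f}A from the cell")
```

Note how the current drawn from the cell climbs as its voltage falls; pulling the better part of an amp from a nearly flat AA is exactly when its rising internal resistance bites hardest, which is one reason the real-world gains fall well short of 800%.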

That claim, as you can probably guess, is well out of touch with reality.

This isn’t exactly a new and unknown phenomenon, it’s an inherent property of any kind of chemical battery that you’ll find on the market today. Developers of products that use alkaline batteries know this and will often include circuitry of this nature within them in order to make their devices last longer. Indeed it doesn’t take long to find numerous home made versions of such a device, albeit with far less slick packaging than what Batteriser is showing. So, if the claims in the patent match the final production device, it might be able to squeeze some more juice out of your batteries but I wouldn’t be surprised if the additional life you got wasn’t as great as they claim it is, especially for any modern portable device.

The real innovation here is the miniaturization of the whole package which, admittedly, is quite neat in its own right. However the claims they’re making are wildly out of alignment with reality, something which isn’t going to help their case when the eventual Indiegogo campaign hits. Still I’d be interested to see just how well it functions in the real world as even a simple doubling of battery life would likely be worth the asking price (currently $10 for 4 of them, reusable). At the very least it’s on the plausible side of these kinds of crowdfunded projects rather than being outright snake oil like we’ve seen so many times before.

Thunderbolt 3 Brings USB-C, 40Gb/s Bandwidth.

There are few computer interconnects that have been as pervasive as USB. Its limitations are numerous, however the ease with which it could be integrated into electronic devices ensured that it became the de facto standard for nearly everything that needed to talk to a PC. Few other connectors have dared to battle it for the connectivity crown, Firewire being the only one that comes to mind, but the upstart Thunderbolt has the potential to usurp it. Right now it's mostly reserved for the few who've splashed out on a new Macbook, but the amount of connectivity, bandwidth and versatility that the Thunderbolt 3 specification from Intel brings is, quite frankly, astounding.

Thunderbolt, in its current incarnation, uses its own proprietary connector. There's nothing wrong with that specifically, especially when you consider that a single Thunderbolt connection can break out into all manner of signals, however its size and shape don't lend it well to portable or slimline devices. The latest revision of the specification, announced recently by Intel at Computex in Taiwan, ditches the current connector in favour of the USB Type-C connector which, along with the space savings, brings other benefits like reversibility and, hopefully, much cheaper production costs. Of course the connector is really just one tiny aspect of all the benefits that Thunderbolt 3 will bring.

The new Thunderbolt 3 interface doubles the available bandwidth from 20Gb/s to 40Gb/s, enough to drive two 4K displays at 60Hz off a single cable. To put that in perspective, the current standard for high resolution screen interconnects, DisplayPort, currently delivers around 17Gb/s, with the future 1.3 version slated to deliver 34Gb/s. On its own that might not be exactly groundbreaking news for consumers, who really cares what the raw numbers are as long as it displays the pictures, but combine it with the fact that Thunderbolt 3 can deliver 100W of power and suddenly things look a lot different. That means you could run your monitor off the one cable, even large monitors like my AOC G2460PGs, which only draw 65W under load.
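
A quick sanity check on the two-4K-displays claim (raw pixel data only, ignoring blanking intervals and protocol overhead):

```python
# Rough bandwidth check for two 4K displays at 60Hz (raw pixel data only,
# ignoring blanking intervals and protocol overhead)
width, height, refresh_hz, bits_per_pixel = 3840, 2160, 60, 24

per_display_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"One 4K60 display:  {per_display_gbps:.1f}Gb/s")      # ~11.9Gb/s
print(f"Two 4K60 displays: {2 * per_display_gbps:.1f}Gb/s")  # ~23.9Gb/s of the 40Gb/s available
```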

Like its predecessors Thunderbolt 3 will be able to carry all sorts of signals along its wires, including up to 4 lanes worth of PCIe. Whilst many seem to be getting excited about the possibility of external graphics cards, despite the obvious limitations they have, I’m more excited about more general purpose stuff that can be done with external PCIe lanes. The solutions available for doing that right now aren’t great but with 100W of power and 4 PCIe lanes over a single cable there’s potential for them to become a whole lot more palatable.

Of course we'll be waiting quite some time before Thunderbolt 3 becomes commonplace, as manufacturers of both PCs and peripherals ramp up to support it. The adoption of a more common connector, along with the numerous benefits of the Thunderbolt interface, has the potential to accelerate this, however it still has a mountain to climb before it can knock USB off its perch. Still I'm excited for the possibilities, even if it will mean a new PC to support them.

Who am I kidding, I’ll take any excuse to get a new PC.

Windows 10: Coming July 29th.

The date for the final release of Windows 10 has been set: July 29 of this year.

The announcement comes as a shock to no one, Microsoft had repeatedly committed to making Windows 10 generally available sometime this year, however the timing is far more aggressive than I would have expected. The Windows Insider program has been going along well, although the indications were that most of the builds still had a decidedly beta feel to them, with many features still missing. Indeed the latest build was released just three days ago, suggesting that a full release was still some time away. Microsoft isn't one to give soft dates, especially for their flagship OS, so we can take the July 29 date as gospel from here on out.

Since everyone in the Insider program has had their hands on Windows 10 for some time now the list of features likely won't surprise you, however there were a few things that caught my eye in Microsoft's announcement post. By the looks of it Office 2016 will be released alongside the new version of Windows, including a new universal app version that's geared towards touch devices. Considering how clumsy the desktop Office products felt on touch screens this is a welcome addition for tablet and transformer devices, although I'd hazard a guess that the desktop version will still be the preferred one for many. What's really interesting though is that OneNote and Outlook, long considered staples of the Office suite by many, will now be included in the base version of Windows for free. It's not as big of an upset as including, say, Word or Excel would be, but it's an unexpected move nonetheless.

Many of the decidedly lacklustre default Metro apps will get some new life breathed into them with an update to the universal app platform. On the surface this removes their irritating "takes over your entire desktop when launched" behaviour and makes them behave a lot more like traditional apps. Whether they'll be improved to the point of being usable beyond that is something I'll have to wait and see, although I do have to admit that some of the built in apps (like the PDF reader) were quite useful to have. How well the integration between those apps, the cloud and other devices that run universal apps works remains to be seen, although I've heard positive things about this experience in the past.

It seems that Microsoft has had this date in mind for some time now, as all my home Windows 8.1 installs chirped up with a "Reserve your free Windows 10!" pop up late last night. This is the realisation of the promise Microsoft made back at the start of the year to provide a free Windows 10 upgrade to all current consumer level customers, something I thought would likely be handled through a redemption portal or similar. However, based on the success Microsoft had in getting people to upgrade from 8 to 8.1 with a similar notification, I can see why they've taken this approach as it's far more likely to get people upgrading than a free Windows 10 serial would.

What will be truly interesting to see is if the pattern of adoption continues with major Windows versions. Windows 7, which is now approaching middle age, still remains unchallenged by the previous two upstarts. The barriers to transitioning are now much lower than they once were, however customers have shown that familiarity is something they value above nearly everything else. Windows 10 has all the makings of a Windows version that consumers want but we all know that what people say they want and what they actually want are two different things.

PowerPoint Doesn’t Need to Die, Bad Communicators Do.

Ah PowerPoint, the thing that everyone seems to loathe when they walk into a meeting yet still, when it comes time for them to present something, it's the first tool they look to for getting their idea across. Indeed in my professional career I've spent many hours standing in front of a projection screen, the wall behind me illuminated by slide after slide of information I was hoping to convey to my audience, jabbering on about what the words behind me meant. It seems that every year there's someone calling for the death of the de facto presentation tool, lamenting its use in many well publicised scandals and failures. However, like the poor workman who blames his tools, those calling for its death put the blame in the wrong place: PowerPoint is not responsible for most of the ills aimed at it. That, unfortunately, lies with the people who use it.

PowerPoint, like every Microsoft Office product, ends up being used in ways it never should have been when put in the hands of the masses. This does not necessarily mean the tool is bad, indeed I'd like to see a valid argument for the death of, say, Word given the grave misuses it has been put to; more that it was likely not the most appropriate medium for the message it was trying to convey or the audience it was presented to. When used in its most appropriate setting, which I contend is as a sort of public prompt card for both the speaker and the audience, PowerPoint works exceptionally well for conveying ideas and concepts. What it's not great at is presenting complex data in a readily digestible format.

But then again there are very few tools that can.

You see, many of the grave failings that have been attributed to PowerPoint are the result of its users attempting to cram an inordinate amount of information into a single slide, hoping that it somehow all makes its way across to the audience. PowerPoint, on its own, simply does not have the capability to distill information down in that manner and so relies on the user's ability to do it. If the user lacks the ability to do that both coherently and accurately then the result will, obviously, not be usable. There's no easy solution to this, as creating infographics that convey real information in a digestible format is a world unto itself, but blaming the tool for the ills of its users, and thus calling for a ban on its use, seems awfully shortsighted.

Indeed, if it was not for PowerPoint then another member of the Microsoft Office suite would be met with the same derision, as they all have the capability to display information in some capacity, just not in the format that most presentations follow. Every time people have lamented PowerPoint to me I've asked them to suggest an alternative tool that solves the issues they speak of, and every time I have not received a satisfactory answer. The fact of the matter is that, as a presentation tool, PowerPoint is one of the top in its class and that's why so many turn to it. The fact that it's found at the center of a lot of well publicised problems isn't because the tool itself is the problem, just that it's the most popular tool to use.

What really needs to improve is the way in which we take intricate and complex data and distill it down to its essence in order to impart it to others. This is an incredibly wide and diverse problem space, one that entire companies have founded their business models on. It is not something we can pin on a simple presentation tool; it requires a fundamental shift away from thinking that complex ideas can be summed up in a handful of words and a couple of pretty pictures. Should we want to impart knowledge upon someone else then it is up to us to take them on that journey, crafting an experience that leaves them with enough information to be able to impart that idea on someone else in turn. If you're not capable of doing that then neither PowerPoint nor any other piece of software will help you.