Monthly Archives: October 2015

Labor’s Return to FTTP Scarred by the NBN’s MTM Past.

The current MTM NBN is by all accounts a total mess. Every single promise that the Liberal party has made with respect to it has been broken. First the speed guarantee for the majority of Australians was scrapped. Then the timeline blew out as the FTTN trials took far longer to complete than they stated they would. Finally the cost of the network, widely described as being a third of the FTTP solution, has since ballooned to well above any cost estimate that preceded it. The slim sliver of hope that all of us technologically inclined Australians hang on to is that this current government goes single term and that Labor would reintroduce the FTTP NBN in all its glory. Whilst it seems that Labor is committed to their original idea, the future of Australia’s Internet will bear the scars of the Liberals’ term in office.


Jason Clare, who’s picked up the Shadow Communications Minister position in the last Labor cabinet reshuffle before the next election, has stated that they’d ramp up the number of homes connected to fiber if they were successful at the next election. Whilst there are no solid policy documents available yet to determine what that means, Clare has clearly signalled that FTTN rollouts are on the way out. This is good news, however it does mean that Australia’s Internet infrastructure won’t be the fiber heaven it was once envisioned to be. Instead we will be left with a network that’s mostly fiber with pockets of Internet backwaters that have little hope of change in the near future.

Essentially it would seem that Labor would keep current contract commitments, which would mean a handful of FTTN sites would still be deployed and anyone on a HFC network would remain on it for the foreseeable future. Whilst these are currently serviceable, their upgrade paths are far less clear than those of their fully fiber based brethren. This means that the money spent on upgrading the HFC networks, as well as any money spent on remediating copper to make FTTN work, is wasted capital that could have been invested in the superior fiber only solution. Labor isn’t to blame for this, I understand that breaking contractual commitments is something they’d like to avoid, but it shows just how much damage the Liberals’ MTM NBN plan has done to Australia’s technological future.

Unfortunately there’s really no fix for this, especially if you want something politically palatable.

If we’re serious about transitioning Australia away from the resources backed economy that’s powered us over the last decade, investments like the FTTP NBN are what we are going to need. There are clear relationships between Internet speeds and economic growth, something which would quickly make the asking price look extremely reasonable. Doing it half-arsed with a cobbled together mix of technologies will only result in a poor experience, dampening any benefits that such a network could provide. The real solution, the one that will last us as long as our current copper network has, is to make it all fiber. Only then will we be able to accelerate our growth at the same rapid pace as the rest of the world, and only then will we see the full benefits of what a FTTP NBN can provide.

The Light L16 Isn’t “DSLR Quality”.

It’s well known that the camera industry has been struggling for some time and the reason for that is simple: smartphones. There used to be a wide gap in quality between smartphones and dedicated cameras however that gap has closed significantly over the past couple of years. Now the market segment that used to be dominated by a myriad of pocket cameras has all but evaporated. This has left something of a gap that some smaller companies have tried to fill, like Lytro did with their quirky light field cameras. Light is the next company to attempt to revitalize the pocket camera market, albeit in a way (and at a price point) that’s likely to fall as flat as Lytro’s Illum did.


The Light L16 is going to be their debut device, a pocket camera that contains no less than 16 independent camera modules scattered about its face. For any one picture up to 10 of these cameras can fire at once and, using their “computational photography” algorithms, the L16 can produce images of up to 52MP. On the back there’s a large touchscreen that’s powered by a custom version of Android M, allowing you to view and manipulate your photos with the full power of a Snapdragon 820 chip. All of this can be had for $1299 if you preorder soon or $1699 when it finally goes into full production. It sounds impressive, and indeed some of the images look great, however it’s not going to be DSLR quality, no matter how many camera modules they cram into it.

You see, those modules they’re using are pulled from smartphones, which means they share the same limitations. The sensors themselves are going to be tiny, around 1/10th the size of those in most DSLR cameras and smaller still compared to full frame sensors. The pixels on these sensors are then much smaller, meaning they capture less detail and perform worse in low light than DSLRs do. You can overcome some of these limitations through multiple image captures, as the L16 is capable of, however that’s not going to give you the full 52MP that they claim due to computational losses. There are some neat tricks they can pull, like adjusting the focus point (à la Lytro) after the photo is taken, but as we’ve seen that’s not a killer feature for cameras to have.
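To put some rough numbers on that size gap, here’s a quick back-of-envelope comparison. The smartphone-class module dimensions are an assumption on my part (Light hasn’t published detailed sensor specs), so treat the ratios as illustrative rather than exact:

```python
# Rough comparison of sensor areas (dimensions in mm). The smartphone-class
# module size is an illustrative assumption, not a published L16 spec.
sensors = {
    'smartphone-class module (1/3.2")': (4.54, 3.42),
    "typical APS-C DSLR sensor": (23.5, 15.6),
    "full frame sensor": (36.0, 24.0),
}

module_w, module_h = sensors['smartphone-class module (1/3.2")']
module_area = module_w * module_h

for name, (w, h) in sensors.items():
    area = w * h
    print(f"{name}: {area:.0f} mm^2 (~{area / module_area:.0f}x the module area)")
```

Whichever exact sensors you compare, each module’s light gathering area comes out somewhere around a couple of dozen times smaller than an APS-C DSLR’s, and combining captures from multiple modules can only claw back part of that deficit.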

Those modules are also arranged in a rather peculiar way, and I’m not talking about the way they’re laid out on the device. There are five 35mm modules, five 70mm and six 150mm. This is fine in and of itself however they can’t claim true optical zoom over that range as there are no gradations between those focal lengths. Sure you can interpolate using the different lenses but that’s just a fancy way of saying digital zoom without the negative connotations that come with it. The hard fact of the matter is that you can’t have prime lenses and act like you have zooms at the same time, they’re just physically not the same thing.

Worst of all is the price, which is already way above entry level DSLRs even if you purchase one new with a couple of lenses. Sure, I can understand form factor is a selling point here, however this camera is over double the thickness of current smartphones. Add that to the fact that it’s a separate device and I don’t think people who are currently satisfied with their smartphones are going to pick one up just because. Just like the Lytro before it the L16 is going to struggle to find a market outside of a tiny niche of camera tech enthusiasts, especially at the full retail price.

This may just sound like the rantings of a DSLR purist who likes nothing else, and in part it is, however I’m fine with experimental technology like this as long as it doesn’t make claims that don’t line up with reality. DSLRs are a step above other cameras in numerous regards, mostly for the control they give you over how the image is crafted. Smartphones do what they do well and are by far the best platform for those who use them exclusively. The L16 however is a halfway point between them: it will provide much better pictures than any smartphone but it will fall short of DSLRs. Thinking any differently means ignoring the fundamental differences that separate DSLRs and smartphone cameras, something which I simply can’t do.

Beyond the LHC: AWAKE.

The Large Hadron Collider has proven to be the boon to particle physics that everyone had imagined it would be but it’s far from done yet. We’ll likely be getting great data out of the LHC for a couple of decades to come, especially with the current and future upgrades that are planned. However it has its limits and, considering the time it took to build the LHC, many are looking towards what will replace it when the time comes. Trouble is that current colliders like the LHC can only get more powerful by being longer, something which the LHC struggled with at its 27 km length. However there are alternatives to current particle acceleration technologies and one of them is set to be trialled at the LHC next year.


The experiment is called AWAKE and was approved by the CERN board back in 2013. Recently however it was granted additional funding in order to pursue its goal. At its core the AWAKE experiment is a fundamentally different approach to particle acceleration, one that could dramatically reduce the size of accelerators. It won’t be the first accelerator of this type to ever be built, indeed proof of concept machines already exist at over a dozen facilities around the world, however it will be the first time CERN has experimented with the technology. All going well the experiment is slated to see first light sometime towards the end of next year with their proof of concept device.

Traditional particle colliders rely on alternating electric fields to propel particles forward, much like a rail gun does with magnetic fields. Such fields place a lot of engineering constraints on the containment vessels, with more powerful fields requiring more energy, which can cause arcing if driven too high. To get around this particle accelerators typically favour length over field strength, allowing the particles a much longer time to accelerate before collision. AWAKE however works on a different principle, one called Plasma Wakefield Acceleration.

In a wakefield accelerator, rather than being directly accelerated by an electric field, particles are injected into a specially constructed plasma. First a set of charged particles, or laser light, is sent through the plasma. This sets off an oscillation within the plasma, creating alternating regions of positive and negative charge. Then when electrons are injected into this oscillating plasma they’re accelerated, chasing the positive regions which are quickly collapsing and reforming in front of them. In essence the electrons surf on the oscillating wave, allowing them to achieve much greater velocities in a much shorter time. The AWAKE project has a great animation of the experiment on its website.
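To give a rough sense of why this approach is so attractive, the accelerating field a plasma can support scales with its electron density. A back-of-envelope sketch using the standard wave-breaking estimate (these are textbook figures, not AWAKE’s specific parameters):

```latex
% Plasma frequency for electron density n_e
\omega_p = \sqrt{\frac{n_e e^2}{\varepsilon_0 m_e}}
% Cold, non-relativistic wave-breaking field: a rough upper bound on the gradient
E_{\max} \approx \frac{m_e c \,\omega_p}{e} \approx 96\,\sqrt{n_e\,[\mathrm{cm^{-3}}]}\ \mathrm{V/m}
% e.g. n_e \sim 10^{14}\,\mathrm{cm^{-3}} gives gradients of order 1 GV/m
```

Compare that to the tens of MV/m that conventional RF cavities manage and you can see why a wakefield machine could, in principle, be orders of magnitude shorter for the same collision energy.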

The results of this experiment will be key to the construction of future accelerators as there’s only so much further we can go with current technology. Wakefield based accelerators have the potential to push us beyond the current collision energy limits, opening up the possibility of understanding physics beyond our current standard model. Such information is crucial to understanding our universe as it stands today; there is so much beauty and knowledge still out there, just waiting for us to discover it.


Until Dawn: Interactive B Grade Horror.

Long time readers will know that horror and I don’t really get along. As a genre I don’t find it particularly engaging although there have been several examples which have managed to break through my disdain. Still, even those few examples haven’t been enough to change my base dislike of nearly everything that bears the horror tag. Indeed that’s the reason why I deliberately avoided Until Dawn for as long as I did and it was only after thumbing through numerous reviews of it that I changed my mind. Whilst Until Dawn might not be the title that finally gets me to see the merits of the horror genre it is an exquisitely built game in its own right, one deserving of all the attention it has received.

Until Dawn Review Screenshot Wallpaper Title Screen

It was just like any other winter getaway when eight close friends went to one of their parents’ mountain lodges for a weekend of partying. They were to spend the weekend revelling like all teenagers do, indulging in things that their parents would likely disapprove of. However their night quickly turns sinister as they are reminded of the tragic past that brought them here, one that seems to haunt them at every corner. Your decisions will guide them through this night and determine who makes it to the end and who meets their untimely demise at the horrors of the mountain.

Until Dawn makes good use of the grunt of the PlayStation4, bringing graphics that are far beyond anything that the previous generation of consoles was capable of. Whilst most of the time the graphics are hidden behind the dark horror movie aesthetic, there are still numerous moments that allow you to appreciate the level of work that’s gone into crafting the visual aspects of the game. There are a few rough edges though, with performance taking a dive regularly, especially in outdoor scenes or action heavy sequences. The game isn’t unplayable because of it however you can definitely tell that the priority was aesthetic over optimization, meaning a constant 30fps experience isn’t guaranteed. Considering this is Supermassive Games’ first PlayStation4 title I’m willing to give them a little leeway, however I’d expect future titles to not make the same mistake.

Until Dawn Review Screenshot Wallpaper Josh

In terms of game play Until Dawn is in the same league as other interactive fiction titles like Heavy Rain and Beyond: Two Souls. For the most part you’ll be wandering around the mostly linear environment, looking at various objects and interacting with other characters to further the story. Until Dawn’s flagship feature is the “Butterfly Effect” system, which chronicles most of your decisions that will have an impact on the story down the line. There are also collectibles called Totems which, when found, show you a glimpse of a possible future event, allowing you to get some insight into how it might unfold. Finally for all the action parts of Until Dawn you’ll be using a pretty standard quick time events system. All in all, at a mechanical level Until Dawn is pretty much what you’d expect from an interactive fiction game.

The walking around and looking at things part is mostly well done (with some issues I’ll discuss later), with interactive objects that are in your character’s field of view being highlighted by a small white glowing orb. This typically means that as long as you do a full 360 of a room you’re likely to find everything in it, which is good as I can’t tell you how frustrating it can be trying to figure out what’s interactive in games like this. Until Dawn does reward exploration, meaning that if you think you’re going down a non-obvious path chances are you’ll find something juicy at the end of it. Thankfully the walking speed can be sped up somewhat as well, meaning you’re not going to double your play time just because you wanted to explore a little bit.

Until Dawn Review Screenshot Wallpaper Quick Time Events

Whenever you perform an action or make a certain dialogue choice the screen will explode in a cascade of butterflies, indicating that your choice will impact future events. This is a more extravagant version of Telltale’s “X will remember that” feature, with the added benefit that you can go back and review your choices at any point. Unlike most other games though, where decisions that will have major impacts on the story are usually showcased as such, Until Dawn rarely makes such a distinction. If I’m honest I found that to be a little frustrating as it was hard to tell when what seemed like a minor decision actually had major consequences. In reality though that’s much closer to a real world experience so I can definitely appreciate it for that reason.

The quick time event system works as you’d expect it to, giving you a limited amount of time to respond by pressing the right button or moving the control stick in the right way. It’s broken up every so often by having you make decisions, like taking a safe route vs a quicker one or hiding vs running, which can have similar butterfly effect impacts as dialogue choices do. The one interesting differentiator that Until Dawn has is the “DON’T MOVE” sections, which can actually be something of a challenge when your heart is racing and your hands are shaking on the controller. So nothing revolutionary here, not that you’d really be expecting that from games in this genre.

Until Dawn Review Screenshot Wallpaper Who Are You

Whilst Until Dawn does show an incredible amount of polish in most regards there are still some rough edges in a few key areas. The collision detection is a bit iffy, being a little too wide at the character’s feet, which makes you get stuck on objects that you’d think you could just walk past. This also extends to things your character holds, like a burning torch, which seem to lack collision detection entirely, allowing you to put your character’s hand through walls. There’s also the performance issues which I noted previously which, whilst not degrading the game into a slideshow, are definitely noticeable. Still, for a first crack at a game of this calibre it’s commendable that Supermassive Games was able to put out something with this level of polish, especially on a new platform for them.

Now, being someone who’s avoided the horror genre in almost all its incarnations, I don’t feel entirely qualified to give an objective view on how good the story is. Certainly it seems to share many of the tropes that you’d associate with a teen horror movie but whether they’re well executed or not is something I’ll have to leave up to the reader. To me it was a predictable narrative, one that attempted to use jump scares and triggered music to try and build tension. Whilst it didn’t bore me to sleep like so many horror movies have done I still wouldn’t recommend it on story alone.

Until Dawn Review Screenshot Wallpaper Its Over

Until Dawn is a great debut title for Supermassive Games on the PlayStation4, showing that Quantic Dream isn’t the only developer who can create great interactive fiction. The graphics are what I’ve come to expect from current generation titles, making full use of the grunt available on the platform. Mechanically it plays as you’d expect, with only a few rough edges in need of additional polish. The writer’s aversion to horror though means that the story didn’t strike much of a chord, although it’s likely to delight horror fans the world over. In summation Until Dawn is a superbly executed game, one that both horror fans and interactive fiction junkies can enjoy.

Rating: 8.5/10

Until Dawn is available on PlayStation4 right now for $78. Total play time was approximately 7 hours.

The Chemistry of the Volkswagen Scandal.

The Volkswagen emissions scandal is by far one of the most egregious acts of deceptive engineering we’ve seen in a long time. Whilst the full story of how and why it came about won’t be known for some time the realities of it are already starting to become apparent. What really intrigued me however isn’t so much the drama that has arisen out of this scandal but the engineering and science I’ve had to familiarize myself with to understand just what was going on. As it turns out there’s some quite interesting chemistry at work here and, potentially, Volkswagen have shot themselves in the foot just because they didn’t want to use too much of a particular additive:

The additive in question is called AdBlue and is comparatively cheap ($1/litre seems pretty common) when compared to other fluids that most modern cars require. The problem that Volkswagen appears to have faced was that they didn’t have the time or resources required to retrofit certain models with the system when it became apparent that they couldn’t meet emissions standards. Why they chose to defeat the emissions testing devices instead of simply delaying a model release (a bad, but much better, situation than the one they currently find themselves in) is something we probably won’t know for a while.
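For those curious about the chemistry itself: AdBlue is essentially a urea solution injected into the exhaust stream, where it breaks down into ammonia that then reduces the nitrogen oxides over a catalyst, a process called selective catalytic reduction. A simplified sketch of the standard reactions:

```latex
% Urea (the active ingredient in AdBlue) hydrolyses to ammonia in the hot exhaust
(NH_2)_2CO + H_2O \rightarrow 2\,NH_3 + CO_2
% The ammonia then reduces the nitrogen oxides over the SCR catalyst to nitrogen and water
4\,NH_3 + 4\,NO + O_2 \rightarrow 4\,N_2 + 6\,H_2O
```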

Regardless it was an interesting aside to the larger scandal as I wasn’t familiar with this kind of diesel technology previously. Indeed now that I understand it the scandal seems all the more absurd as the additive is cheap, the technology is well known and it has successful implementations in many other vehicles. Still it introduced me to some interesting engineering and science that I wasn’t privy to before, so there is that at least.

Carbon Nanotubes Break Barriers for Moore’s Law.

In the last decade there’s been a move away from raw CPU speed as an indicator of performance. Back when single cores were the norm it was an easy way to judge which CPU would be faster than the other in a general sense, however the switch to multiple cores threw this into question. Partly this comes from architecture decisions and software’s ability to make use of multiple cores but it also came hand in hand with a stalling of CPU clock speeds. This is mostly a limitation of current technology, as faster switching means more heat, something most processors can’t handle much more of. This could be set to change however as research out of IBM’s Thomas J. Watson Research Center proposes a new way of constructing transistors that overcomes that limitation.

Carbon Nanotube Transistors

Current day processors, whether they be the monsters powering servers or the small ones ticking away in your smartwatch, are all constructed through a process called photolithography. In this process a silicon wafer is covered in a photosensitive chemical and then exposed to light through a mask. This is what imprints the CPU pattern onto the blank silicon substrate, creating all the circuitry of a CPU. This process is what allows us to pack billions upon billions of transistors into a space little bigger than your thumbnail. However it has its limitations related to things like the wavelength of light used (higher frequencies are needed for smaller features) and the purity of the substrate. IBM’s research takes a very different approach by instead using carbon nanotubes as the transistor material and creating features by aligning and placing them rather than etching them in.
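That wavelength limit is usually expressed through the Rayleigh resolution criterion, which puts a floor on how small a feature the optics can print:

```latex
% Minimum printable feature size (critical dimension) in optical lithography
CD \approx k_1 \frac{\lambda}{\mathrm{NA}}
% \lambda: exposure wavelength, \mathrm{NA}: numerical aperture of the projection optics,
% k_1: a process-dependent factor with a practical lower bound of around 0.25
```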

Essentially what IBM does is take a heap of carbon nanotubes, which in their native form are a large unordered mess, and then align them on top of a silicon wafer. When the nanotubes are placed correctly, like they are in the picture shown above, they form a transistor. Additionally the researchers have devised a method to attach electrical connectors onto these newly formed transistors in such a way that their electrical resistance is independent of their width. What this means is that the traditional limitation of increasing heat with increased frequency is now decoupled from contact size, allowing them to greatly reduce the size of the connectors and potentially boost CPU frequency.
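As a rough illustration of why that decoupling matters, here’s a toy model contrasting a conventional side-bonded contact, whose resistance climbs as the contact shrinks, with an end-bonded one that stays roughly flat. The numbers are made up purely for illustration and aren’t taken from IBM’s paper:

```python
# Toy model only: all resistance figures below are illustrative assumptions,
# not numbers from IBM's research.

def side_contact_resistance(contact_size_nm, r_unit=2.0e6):
    """Conventional side-bonded contact: resistance grows roughly as the
    contact area shrinks, so scaling the contact down costs you."""
    return r_unit / contact_size_nm  # ohms

def end_bonded_resistance(contact_size_nm, r_fixed=30e3):
    """End-bonded nanotube contact: resistance stays roughly constant
    regardless of how small the contact is made."""
    return r_fixed  # ohms

for size_nm in (100, 40, 10):
    print(f"{size_nm:>3} nm contact   side-bonded: {side_contact_resistance(size_nm):>9,.0f} ohm"
          f"   end-bonded: {end_bonded_resistance(size_nm):>7,.0f} ohm")
```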

The main issue such technology faces is that it is radically different from the way we manufacture CPUs today. There’s a lot of investment in current lithography based fabs and this method likely can’t make use of that investment. So the challenge these researchers face is creating a scalable method with which they can produce chips based on this technology, hopefully in a way that can be adapted for use in current fabs. This is why you’re not likely to see processors based on this technology for some time, probably not for another 5 years at least according to the researchers.

What it does show though is that there is potential for Moore’s Law to continue for a long time into the future. It seems whenever we brush up against a fundamental limitation, one that has plagued us for decades, new research rears its head to show that it can be tackled. There’s every chance that carbon nanotubes won’t become the new transistor material of choice but insights like these are what will keep Moore’s Law trucking along.

Quantum Computing Comes to Silicon.

Traditional computing is bound up in binary data, the world of zeroes and ones. This constraint was originally born out of an engineering limitation, designed to ensure that these different states could be easily represented by differing voltage levels. This hasn’t proved to be much of a limiting factor in the progress that computing has made, however there are different styles of computing which make use of more than just those zeroes and ones. The most notable one is quantum computing which is able to represent an exponential number of states depending on the number of qubits (analogous to transistors) that the quantum chip has. Whilst there have been some examples of quantum computers hitting the market, even if their quantum-ness is still in question, they are typically based on exotic materials, meaning mass production of them is tricky. This could change with the latest research to come out of the University of New South Wales as they’ve made an incredible breakthrough.

UNSW Qubit in Silicon

Back in 2012 the team at UNSW demonstrated that they could build a single qubit in silicon. This by itself was an amazing discovery as previously created qubits were usually reliant on materials like niobium cooled to superconducting temperatures to achieve their quantum state. However a single qubit isn’t exactly useful on its own and so the researchers were tasked with getting their qubits talking to each other. This is a lot harder than you’d think as qubits don’t communicate in the same way that regular transistors do, and so traditional techniques for connecting things in silicon won’t work. So after 3 years’ worth of research UNSW’s quantum computing team has finally cracked it and allowed two qubits made in silicon to communicate.

This has allowed them to build a quantum logic gate, the fundamental building block for a larger scale quantum computer. One thing that will be interesting to see is how their system scales out with additional qubits. It’s one thing to get two qubits talking together, indeed there have been several (non-silicon) examples of that in the past, however as you scale up the number of qubits things start to get a lot more difficult. This is because larger numbers of qubits are more prone to quantum decoherence and typically require additional circuitry to overcome it. Whilst they might be able to mass produce a chip with a large number of qubits it might not be of any use if the qubits can’t stay in coherence.
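To make the idea of a two-qubit logic gate a little more concrete, here’s a tiny numerical sketch of a CNOT acting on a superposition. This is just the textbook linear algebra of the gate, nothing specific to the silicon devices UNSW has built:

```python
import numpy as np

# Single qubit basis state |0>
zero = np.array([1, 0], dtype=complex)

# Hadamard gate: puts the control qubit into an equal superposition
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# Two-qubit CNOT: flips the target qubit only when the control is |1>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start in |00>, put the control in superposition, then entangle with the CNOT
state = np.kron(H @ zero, zero)
state = CNOT @ state

print(state.round(3))  # ~0.707|00> + 0.707|11>, an entangled Bell pair
```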

It will be interesting to see what applications their particular kind of quantum chip will have once they build a larger scale version of it. Currently the commercially available quantum computers from D-Wave are limited to a specific problem space called quantum annealing and, as of yet, have failed to conclusively prove that they’re achieving a quantum speedup. The problem is larger than just D-Wave however as there is still some debate about how we classify quantum speedup and how to properly compare it to more traditional methods. Still this is an issue that UNSW’s potential future chip will have to face should it come to market.

We’re still a long way off from seeing a generalized quantum computer hitting the market any time soon but achievements like those coming out of UNSW are crucial in making them a reality. We have a lot of investment in developing computers on silicon and if those investments can be directly translated to quantum computing then it’s highly likely that we’ll see a lot of success. I’m sure the researchers are going to have several big chip companies knocking down their doors to get a license for this tech as it really does have a lot of potential.

Destiny: The Taken King: Oryx Has Come For Your Light.

Destiny was the first game to break my staunch opposition to playing first person shooters on consoles. Being a long time member of the PC master race meant that it took me quite a while to get used to the way consoles do things and had it not been for Destiny’s MMORPG stylings I might not have stuck it through. However I’m glad I did as the numerous hours I’ve spent in Destiny since then were ones I very much enjoyed. The time between me capping out at level 32 and the release of the House of Wolves expansion though was long enough for me to fall back to DOTA 2 and I missed much of that release. The Taken King however promised to completely upend the way Destiny did things and proved to be the perfect time for me to reignite my addiction to Bungie’s flagship IP.

Destiny The Taken King Screenshot Wallpaper Title Screen

Six of you went down into that pit, looking to end the dark grip that Crota held on our Moon. His death rang out across the galaxy, sending ripples through the darkness. His father, Oryx, felt his son slip from this plane and immediately swore vengeance upon you and the light. Oryx has appeared in our system aboard a mighty dreadnought, capable of decimating entire armies with a single attack. Slayer of Crota, it is now up to you to face Oryx as he and his Taken swarm over the entire solar system and threaten to snuff out the light once and for all. Are you strong enough to face this challenge, guardian?

Graphically Destiny hasn’t changed much, retaining the same level of impressive graphics that aptly demonstrate the capabilities of current generation console hardware. There has been a significant overhaul of the UI elements however, with the vast majority of them looking sharper and feeling a lot more responsive. It might sound like a minor detail but it’s a big leap up in some regards, especially with the new quest/mission and bounty interface. It’s the kind of stuff you’d expect to see in expansions like this and I doubt we’ll see any major graphical changes until Destiny 2.0.

Destiny The Taken King Screenshot Wallpaper Into The Breach

The core game is mostly unchanged, consisting of the same cover based shooter game play with the RPG elements sprinkled on top. Every class has been granted a new subclass, meaning that each of them now has one that covers each of the elements (solar, void and arc). The levelling and progression system has been significantly overhauled, providing a much smoother experience when levelling your character up both in level and gear terms. Many of the ancillary activities, like the Nightfall and Exotic Bounties, have also been reworked to favour continuous progression rather than constant praying to RNGJesus. Additionally there have been many quality of life improvements which have made doing lots of things in Destiny a lot easier, much to the relief of long term players like myself. This is all in concert with a campaign that’s about half as long again as the original, making The Taken King well worth its current asking price.

The new subclasses might sound like a small addition but they’ve given new life to a lot of aspects of Destiny. Each new class essentially filled a hole that was lacking in the other two, allowing each class to be far more versatile than they were previously. The only issue I have with them so far is that the new subclasses feel a lot better at more things than their predecessors do which means pretty much everyone is solely using those classes now. Sure the Defender Titan’s bubble is still the ultimate defensive super however it pales in comparison to the toolset that the Sunbreaker has at their disposal. This might just be my impression given my current playstyle (mostly solo) and may change when I finally attempt the raid this week.

Destiny The Taken King Screenshot Wallpaper Mission Screen

The revamp of the levelling system is probably the most welcome change. Unlike before, where your level was determined by the light your gear had, the new maximum level is 40, achievable by grinding XP like any other RPG. Then once you hit the maximum level you have your Light Level, which is an average of your gear’s damage output and defensive capabilities. As it stands right now that’s pretty much the only thing that matters when you’re attempting an encounter and so the higher your light level the easier it will be. Whilst it’s not a huge change from the previous system it does mean that there’s quite a lot more variety in terms of what gear you’ll end up using. This also means that a lot more pieces of gear, even those lowly blues which we used to disassemble immediately, are now quite viable.

This comes hand in hand with a revamp of how you can obtain gear in the game. Whilst there are still guaranteed ways of obtaining gear through marks (now Legendary Marks, which are slightly harder to come by) you’ll mostly be looking for engrams. The engram drop rate has been significantly increased and, when you get them decrypted, they’ll roll a random light level in a +/- range of your current light level. This means that, in order to progress, it’s best to equip your highest light level gear and then decode engrams. Doing this I was able to get myself to light level 295 in a relatively short period of time, more than enough to attempt the raid. This, coupled with the new avenues to better gear, means that progression is far smoother than it was before.
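As a rough sketch of how that loop plays out (the simple averaging and the roll spread here are my own simplification of how the system behaves in practice, not Bungie’s actual formula):

```python
import random

def light_level(gear):
    """Character light level: roughly the average of your equipped gear's light values."""
    return sum(gear.values()) / len(gear)

def decrypt_engram(current_light, spread=6):
    """Engrams roll a random light value around your current level, which is
    why you equip your best gear before decoding them."""
    return round(current_light + random.uniform(-spread, spread))

gear = {"helmet": 280, "gauntlets": 284, "chest": 290, "legs": 286, "class item": 282,
        "primary": 288, "special": 285, "heavy": 287, "artifact": 283, "ghost": 281}

print(f"current light level: {light_level(gear):.0f}")
for _ in range(5):
    print(f"engram decrypted at light {decrypt_engram(light_level(gear))}")
```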

Destiny The Taken King Screenshot Wallpaper Hammer of Sol

The revamped quest and bounties system has everything in it that guardians have been asking for from day one. There’s now a separate tab that has everything in it, allowing you to track objectives so you can quickly see your status without having to pop out into the menu. This has allowed Bungie to include multiple questlines that all run simultaneously, something which was just not possible before. Many of these quests will help you get solid boosts in your light level and, if you follow some of the longer ones, unlock some of the best gear in the game. By far the stand out piece, in my mind, is the heavy sword which (when fully levelled) makes PVE encounters a breeze and the crucible a punchbro’s dream.

The Taken King expansion also gets massive kudos for fleshing out the lore of Destiny significantly. Part of this comes from the extended campaign missions that take you deep into the world of the Hive and the Taken but there’s also dozens of new bits of information scattered throughout the world (accessible through ghost scans). The Books of Sorrow provide a lot of background detail to all the races and their involvement with the Darkness as well as providing some insight into the events that led up to the Traveller’s current state. I’m still picking through it myself but it’s honestly great to see Bungie fleshing out this world as there’s so much potential here and I’d hate for it to go to waste. Additionally Nolan North replacing Peter Dinklage as the voice of your ghost is a welcome change, as is the addition of many more hours of voice acting from all the central characters.

Destiny The Taken King Screenshot Wallpaper Lookin Sharp

Destiny: The Taken King is the expansion that all guardians had hoped for, bringing with it all the improvements that were sorely needed. It’s a testament to Bungie’s dedication to this IP as the amount of extra content they’ve released for Destiny in just one year is, honestly, staggering. The changes they made with this latest expansion have improved the experience dramatically, both for casual players and the hardcore alike. If you’d been holding off on playing Destiny then now would definitely be a great time to give it a whirl as this is the game many are saying it should have been at launch. For long time guardians like myself it’s what was needed to bring me back to the fold and keep me playing for a long time to come.

Rating: 9.5/10

Destiny: The Taken King is available on PlayStation3, PlayStation4, XboxOne and Xbox360 for $79, $79, $79 and $79 respectively. Game was played on the PlayStation4 with approximately 20 hours of play time, reaching light level 295.

Freevolt: Yet Again “Free Energy” Rears Its Ugly Head.

Our world is dominated by devices that need to be plugged in on a regular basis, a necessary tedium for the ever connected lifestyle many of us now lead. Doing away with that is an appealing idea, leaving the cords for things that never move. That idea won’t become reality any time soon however, due to the challenges we face in miniaturization of power generation and storage. That, of course, hasn’t stopped numerous companies from saying that they have done so, with the most recent batch purporting to be able to harvest energy from wireless signals. The latest company to do this is called Freevolt and unfortunately their PR department has fallen prey to the same superlative claims that many of its predecessors have made.

Freevolt

Their idea is the same as pretty much all the other free energy ideas that have cropped up over the past couple of years. Essentially their device (which shares the company’s name) has a couple of different antennas on it which can harvest electromagnetic waves and transform them into energy. Unlike other devices, which typically were some kind of purpose built thing that just “never needed recharging”, Freevolt wants to be the platform on which developers build devices that use their technology. Their website showcases numerous applications that they believe their device will be able to power, including things like wearables and smoke alarms. The only current application of their technology though is the CleanSpace tag which, as of writing, is not available.

Had Freevolt constrained their marketing spiel to just ultra low power things like individual sensors I would’ve let it slide however they’re not just claiming that. The fact of the matter is that numerous devices which they claim could be powered by this tech simply couldn’t be, especially with their current form factors. Their website clearly shows something like a health tracker which is far too small to contain the required antennas and electronics, not to mention that their power requirements are far above the 100 microwatts they claim they can generate. Indeed even devices that could integrate the technology, like a smoke alarm, would still have current draws above what this device could provide.
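Some quick back-of-envelope numbers show the scale of the gap. The battery capacity and device draw below are my own rough assumptions rather than figures from Freevolt or any tracker vendor:

```python
# Back-of-envelope: how far does 100 microwatts of harvested power actually go?
# Battery capacity and average draw are illustrative assumptions only.
harvest_w = 100e-6            # Freevolt's claimed harvesting rate: 100 microwatts
battery_wh = 0.100 * 3.7      # a ~100 mAh wearable cell at 3.7 V is roughly 0.37 Wh
tracker_draw_w = 1e-3         # assume a tracker averaging ~1 mW

print(f"Hours to fill the cell from harvesting alone: {battery_wh / harvest_w:,.0f}")
print(f"Fraction of the tracker's average draw covered: {harvest_w / tracker_draw_w:.0%}")
```

Even taking the claimed harvesting rate at face value it covers only a sliver of a typical wearable’s draw, which is why battery life extension is about the most generous reading of their pitch.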

To be fair their whitepaper makes far more tempered claims about what their device is capable of, mostly aimed at extending battery life rather than outright replacing batteries. However, whilst such claims might be realistic, they fail to account for the fact that many of the same benefits they’re purporting could likely be achieved by simply adding another battery to the device. I don’t know how much their device will cost but I’d hazard a guess that it’d cost a lot more than adding in an additional battery pack. This is all based on the assumption that the device operates in an environment that’s heavy enough in RF to charge it at its optimal rate, something which I don’t think will hold true in enough cases to make it viable.

I seriously don’t understand why companies continue to pursue ideas like this as they either turn out to be completely farcical, infeasible or simply not economically viable. Sure there is energy to be harvested from EM waves but it is so little that the cost of acquiring it is far beyond any of the alternatives. Freevolt might think they’re onto something but the second they start shipping their dev kit I can guarantee the field results will be nothing like what they’re purporting. Not that that will discourage anyone from trying it again though, as it seems there’s always another fool willing to be parted with their money.