
New Horizons Enters Safe Mode, Causes Heart Attacks.

There are numerous risks that spacecraft face when traversing the deep black of space. Since we’ve sent many probes to many locations most of these risks are well known, and thus we’ve built systems to accommodate them. Most craft carry fully redundant main systems, ensuring that if the primary fails the backup can carry on the task the probe was designed to do. The systems themselves are also built to withstand the torturous conditions that space throws at them, ensuring that even a single piece of hardware has a pretty good chance of surviving its journey. However, sometimes even all that engineering can’t account for what happens out there, and yesterday that happened to New Horizons.

NewHorizons_Pluto

New Horizons is a mission led by NASA which will be the first robotic probe to make a close approach to Pluto. Its primary mission is to capture the most detailed view of Pluto yet, generating vast amounts of data about the distant dwarf planet. Unlike many similar missions though New Horizons won’t be entering Pluto’s orbit; instead it will capture as much data as it can as it whips past Pluto at a blistering 14 km/s or so. Then it will set its sights on one of the numerous Kuiper Belt objects, where it will do the same. This mission has been a long time in the making, launching in early 2006, and it is scheduled to “arrive” at Pluto in the next 10 days.

However, just yesterday, the craft entered safe mode.

What caused this to happen is not yet known; however, one good piece of news is that the craft is still contactable and operating within expected parameters for an event of this nature. Essentially the primary computer sensed a fault and, as it is programmed to do in this situation, switched over to the backup system and put the probe into safe mode. Whilst NASA engineers have received some information as to what the fault might be they have opted to do further diagnostics before switching the probe back onto its primary systems. This means that science activities that were scheduled for the next few days will likely be delayed whilst this troubleshooting process occurs. Thankfully there were only a few images scheduled to be taken and there should be ample time to get the probe running before its closest approach to Pluto.

The potential causes behind an event of this nature are numerous but, since the probe is acting as expected in such a situation, it is most likely recoverable. My gut feeling is that it might have been a cosmic ray flipping a bit, something which the radiation-hardened hardware on probes like New Horizons is designed to detect. As more data trickles back down (a round-trip signal to New Horizons takes around 9 hours) we’ll know for sure what caused the problem and what the time frame will be to recover.
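For the curious, here’s a toy sketch of how error-correcting codes catch exactly that kind of single-bit upset. It’s a plain Hamming(7,4) example in Python and purely illustrative; New Horizons’ actual fault-protection hardware is far more sophisticated and its internals aren’t something I have access to.

```python
# Toy illustration: a Hamming(7,4) code detects and corrects a single flipped bit,
# the kind of upset a cosmic ray can cause. This is NOT New Horizons' actual
# fault-protection logic, just the general idea behind it.

def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # parity over codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # parity over codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # parity over codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Return (corrected codeword, 1-based error position or 0 if none)."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3  # syndrome spells out the flipped bit's position
    if pos:
        c[pos - 1] ^= 1         # flip it back
    return c, pos

codeword = hamming74_encode([1, 0, 1, 1])
hit = list(codeword)
hit[4] ^= 1                     # simulate a cosmic ray flipping one bit
fixed, where = hamming74_correct(hit)
print(f"bit {where} was flipped and corrected: {fixed == codeword}")
```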

Events like this aren’t uncommon, nor are they unexpected, but having one this close to the mission’s ultimate goal, especially after the long wait to get there, is sure to be causing some heartache for the engineers at NASA. New Horizons will only have a very limited opportunity to do the high resolution mapping that it was built to do and events like these just up the pressure on everyone to make sure that the craft delivers as expected. I have every confidence that the team at NASA will get everything in order in no time at all; however, I’m sure there are going to be some late nights for them in the next few days.

Godspeed, New Horizons.


Homesick: Longing For a World Long Since Past.

With game creation now within the reach of anyone who has the time to dedicate to it, the differentiators usually stem from the strengths of their creators. Many come from a writing background, pouring themselves into the creation of a brilliant narrative that flows through the game. Others develop wild and intriguing mechanics, some of which allow players to develop their own story within a world they create. Few, however, find their strength in art and graphical fidelity as, out of all the things that make a game, it is by far the most costly and time consuming to create. Homesick is one of those rare few indie games that brings with it astonishing visual quality that rivals even recent AAA titles.

Homesick Review Screenshot Wallpaper A Spot of Colour

You wake to a world that’s cold and unfamiliar. The world is barren, bereft of nearly all life and seemingly cold despite the sun’s unrelenting rays punching down through every crack and crevice. As you explore you see remnants of the world that once was, little reminders that show people were here…once. However you struggle to make sense of it all: the books and letters are all written in code and, try as you might, there’s little sense to be made of them. You know one thing though, you must get out. You must find your way into the light.

I would forgive people for thinking that Homesick was simply a demo project for a new engine as it’s honestly by far the best looking indie game I’ve ever seen. The attention to detail is astounding, from the rooms with wallpaper peeling off the walls to the fully working (but out of tune) piano. Looking at Barrett Meeker’s (the creative director) history in animation and effects it’s not hard to see why, as he’s worked on titles like Scott Pilgrim vs. the World. If I hadn’t taken these screenshots myself I would’ve written them off as carefully crafted renders, but the game really does look this good when you’re playing it. Of course there were some sacrifices made for this beauty, namely the extremely simplistic animations and accompanying sound effects, but it’s hard to argue that the graphics are anything but amazing.

Homesick Review Screenshot Wallpaper A Most Unusual Place to Rest

Homesick is your (now) bog standard walking simulator where you’ll move forward at a relatively slow pace that encourages you to take in your surroundings, look at everything and essentially be a tourist in the game’s world. Each room has a set of puzzles that you’ll need to figure out in order to progress and, interestingly, they all share the same end goal. However that doesn’t detract from the challenge at all as figuring out how to accomplish said goal can sometimes involve a myriad of steps, not all of which will be obvious at first glance. Once you finish a section it’s off to the dream world which will allow you to progress to the next section.

The Kickstarter for Homesick described the puzzles as “hard, yet fair and sensible” and for the most part that rings true. The game provides absolutely no tutorial to speak of, so for the first 10 minutes or so you’re on your own to figure out how everything fits together in this world. Thankfully, whilst all the rooms are interconnected they’re not dependent on each other, meaning that each new puzzle is self contained and does not require any backtracking. There are a couple of times where you can miss an important clue and get stuck (hint: make sure you look at all the filing cabinets carefully) but other than that you should be able to work things out eventually. My favourite by far was the blocks puzzle but I won’t say much more lest I spoil the fun.

Homesick Review Screenshot Wallpaper Home Sweet Home

Whilst the game is extremely pretty it does suffer from a few areas that could’ve used a little bit more polish. For some reason there are certain places where I’d get a lot of slowdown, usually when turning past a corner in some of the first rooms. There are a couple of other places where this happens too, which leads me to believe there’s some unoptimized geometry hiding somewhere. There are also a couple of glitches that require a game restart to overcome, like an issue (which was said to be fixed but still happened for me) where holding a certain item would overwrite your entire inventory. Thankfully I didn’t lose too much progress but it was still a frustrating experience.

The story of Homesick is what you make of it as, for the vast majority of the game, you really have no clue about anything. Once you unlock the ability to decipher the riddles you can go back through the entire game and read everything, which does give you a good sense of the world before your time in it. With games like these, ones where much of the story is locked behind globs of text hidden everywhere, I find it hard to get emotionally invested and Homesick was no exception. I do admit that when I started to slowly unravel the code of the world I was a little excited, but that wasn’t enough to drive me to walk back through everything just so I could read some things.

Homesick Review Screenshot Wallpaper Am I Free

Homesick is a stunner of a game with graphics that will remain unchallenged by the indie scene for a long time. Once you dig beneath the surface though what remains is your typical walking simulator, with all the requisite puzzles and hidden pieces of text to flesh out the world. Whilst it’s worth playing for the graphics alone I really can’t say that there was much more that drew me in, mostly due to my resistance to reading large walls of text after I’ve slowly trotted my way through everything. Still, I’m sure fans of this genre will find a lot to love and I would not hesitate to recommend it to all the indie fans out there.

Rating: 7/10

Homesick is available on PC right now for $14.99. Total play time was approximately 2 hours.


HP’s “The Machine” Killed, Surprising No One.

Back in the day it didn’t take much for me to get excited about a new technology. The rapid progressions we saw from the late 90s through to the early 2010s had us all fervently awaiting the next big thing as it seemed nearly anything was within our grasp. The combination of getting older and being disappointed a certain number of times hardened me against this optimism and now I routinely attempt to avoid the hype for anything I don’t feel is a sure bet. Indeed I said much the same about HP’s The Machine last year and it seems my skepticism has paid dividends although I can’t say I feel that great about it.

hp-machine-memristor-2015-06-05-01

For the uninitiated, HP’s The Machine was going to be the next revolutionary step in computing. Whilst the mockups would be familiar to anyone who’s seen the inside of a standard server, those components were going to be anything but, incorporating such wild technologies as memristors and optical interconnects. What put this above many other pie in the sky concepts (among which I include things like D-Wave’s quantum computers, as the jury is still out on whether or not they provide a quantum speedup) is that it was based on real progress that HP had made in many of those spaces in recent years. Even that wasn’t enough to break through my cynicism, however.

And today I found out I was right, god damnit.

The reasons cited were ones I was pretty sure would come to fruition, namely the fact that no one has been able to commercialize memristors at scale in any meaningful way. Since The Machine was supposed to be based almost solely on that technology it should be no surprise that it’s been canned on the back of that. Now, instead of being the moonshot style project that HP announced last year, it’s going to be some form of technology demonstrator platform, ostensibly to draw software developers across to this new architecture in order to get them to build on it.

Unfortunately this will likely end up being not much more than a giant server with a silly amount of RAM stuffed into it, 320TB to be precise. Whilst this may attract some people to the platform out of curiosity I can’t imagine that anyone would be willing to shell out the requisite cash on the hope that they’d be able to use a production version of The Machine sometime down the line. It would be like the Sony Cell processor all over again, except instead of costing you maybe a couple of thousand dollars to experiment with you’d be in for tens of thousands, maybe hundreds of thousands, just to get your hands on some experimental architecture. HP might attempt to subsidise that but considering the already downgraded vision I can’t fathom them throwing even more money at it.

HP could very well turn around in 5 or 10 years with a working prototype to make me look stupid and, honestly, if they did I would very much welcome it. Whilst predictions about Moore’s Law ending have a habit of never coming true, that doesn’t mean there aren’t a few ceilings on the horizon that will need to be addressed if we want to continue this rapid pace of innovation. HP’s The Machine was one of the few ideas that could’ve pushed us significantly ahead of the curve and its demise is, whilst completely expected, still a heart wrenching outcome.


Curved Screens Are a Waste of Money.

Consumer electronics vendors are always looking for the next thing that will convince us to upgrade to the latest and greatest. For screens and TVs this used to be a race of resolution and frame rate; however, things began to stall once 1080p became ubiquitous. 3D and 4K were the last two features which screen manufacturers used to tempt us, although neither of them really proved to be a compelling reason for many to upgrade. Faced with flagging sales the race was on to find another must-have feature and the result is the bevy of curved screens that are now flooding the market. Like their predecessors though, curved screens don’t provide anything that’s worth having and, all things considered, might be a detrimental attribute.

samsung_curved_uhdu9000_front

You’d be forgiven for thinking that a curved screen is a premium product as they’re most certainly priced that way. Most curved screens tack on an extra thousand or two over an equivalent flat model and, should you want any other premium feature (like, say, it being thin), then you’re going to be paying some serious coin. The benefits of a curved screen, according to the manufacturers, are that it provides a more theatrical experience, making the screen appear bigger as more of it is in your field of view. Others will say that it reduces picture distortion as objects in the middle of a flat screen will appear larger than those at the edge. The hard fact of the matter is that, for almost all use cases, none of these claims hold true.

As Ars Technica demonstrated last year the idea that a curved screen can have a larger apparent size than its flat counterpart only works in scenarios that aren’t likely to occur with regular viewing. Should you find yourself 3 feet away from your 55″ screen (an absolutely ludicrous prospect for any living room) then yes, the curve may make the screen appear slightly larger than it actually is. If you’re in a much more typical setting, i.e. not directly in front of it and at a more reasonable distance, then the effect vanishes. Suffice it to say you’re much better off buying a bigger set than investing in a curved one to try and get the same effect.
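If you want to sanity check that yourself, here’s a rough back-of-envelope sketch in Python. It assumes a 16:9 55″ panel (roughly 48″ wide) and a curvature radius of about 4200mm, a figure in line with what some manufacturers quote; neither number comes from Ars Technica’s testing, they’re just there to illustrate the geometry.

```python
import math

WIDTH_IN = 55 * 16 / math.hypot(16, 9)   # ~48 in wide for a 55" 16:9 panel
RADIUS_IN = 4200 / 25.4                  # assumed 4200 mm curvature radius, in inches

def flat_fov(distance_in):
    """Horizontal angle (degrees) a flat screen subtends for an on-axis viewer."""
    return 2 * math.degrees(math.atan((WIDTH_IN / 2) / distance_in))

def curved_fov(distance_in):
    """Same, for a screen curved towards the viewer with radius RADIUS_IN."""
    half_arc = (WIDTH_IN / 2) / RADIUS_IN                        # half the panel's arc angle
    edge_x = RADIUS_IN * math.sin(half_arc)                      # lateral offset of the screen edge
    edge_z = distance_in - RADIUS_IN * (1 - math.cos(half_arc))  # the edges sit a touch closer
    return 2 * math.degrees(math.atan(edge_x / edge_z))

for feet in (3, 10):
    d = feet * 12
    print(f"{feet} ft: flat {flat_fov(d):.1f} deg, curved {curved_fov(d):.1f} deg")
```

At 3 feet the curve buys you a couple of degrees of apparent width; at 10 feet the difference is a fraction of a degree, which is to say invisible.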

The picture distortion argument is similarly flawed as most reviewers report seeing increased geometric distortions when viewing content on a curved screen. The fundamental problem here is that the content wasn’t created with a curved screen in mind. Cameras use rectilinear lenses to capture images onto a flat sensor plane, something which isn’t taken into account when the resulting image is displayed on a curved screen. Thus the image is by definition distorted and since none of the manufacturers I’ve seen talk about their image correction technology for curved screens it’s safe to assume they’re doing nothing to correct it.

So if you’ve been eyeing off a new TV upgrade (like I recently have) and are thinking about going curved, the simple answer is: don’t. The premium charged for that feature nets no benefits in typical usage scenarios and is far more likely to create problems than it is to solve them. Thankfully there are still many great flat screens available, typically with all the same features as their curved brethren for a much lower price. Hopefully we don’t have to wait too long for this fad to pass as it’s honestly worse than 3D and 4K, which at least had some partial benefits in certain situations.


Fiber’s Future Looks Bright with Frequency Combs.

Fiber is the future of all communications, a fact that any technologist will be able to tell you. Whilst copper is still the mainstay for the majority, its lifetime is limited as optics are fast approaching the point where they’re feasible for everything. However even fiber has its limits, ones that some feel we’re going to hit sooner rather than later and which could cause severe issues for the Internet’s future. New research coming out of the University of California, San Diego, however, paves the way for boosting our fiber networks’ bandwidth significantly.

Fiber Optics

Today’s fiber networks are made up of long runs of fiber optic cable interspersed with devices called repeaters or regenerators. Essentially these devices are responsible for boosting the optical signal, which becomes degraded as it travels down the fiber. The problem with these devices is that they’re expensive, add latency and are power hungry, attributes that aren’t exactly desirable. These problems are borne out of a physical limitation of fiber networks which puts an upper limit on the amount of power you can send down an optical cable. Past a certain point the more power you put down a fiber the more interference you generate, meaning there’s only so much you can pump into a cable before you’re doing more harm than good. The new research however proposes a novel way to deal with this: interfere with the signal before it’s sent.

The problem with the interference that’s generated by increasing the power of the signal is that it’s unpredictable, meaning there’s really no good way to combat it. The researchers however figured out a way of conditioning the signal before it’s transmitted which allows the interference to become predictable. Then at the receiving end they use what they’re calling “frequency combs” to reverse the interference, pulling a useful signal out of the noise. In lab tests they were able to send a signal over 12,000 km without the use of a repeater, an absolutely astonishing distance. Using such technology could drastically improve the efficiency of our current dark fiber networks, which would go a long way to avoiding the bandwidth crunch.
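The key insight is that a distortion you can predict is a distortion you can run backwards. The toy Python sketch below is not the UCSD technique (their frequency combs keep many channels mutually coherent so the cross-channel interference becomes deterministic); it just illustrates the underlying principle with a power-dependent phase rotation whose strength, gamma, is an arbitrary made-up number.

```python
import numpy as np

# Toy model: a power-dependent phase rotation stands in for the fibre's
# nonlinearity. Because the rotation is deterministic, the receiver can
# apply the exact inverse and recover the original symbols.
rng = np.random.default_rng(0)
symbols = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=1000)  # QPSK-like data
gamma = 0.3  # assumed nonlinear coefficient, arbitrary units

distorted = symbols * np.exp(1j * gamma * np.abs(symbols) ** 2)       # "the fibre"
recovered = distorted * np.exp(-1j * gamma * np.abs(distorted) ** 2)  # receiver undoes it

print("worst-case recovery error:", np.max(np.abs(recovered - symbols)))  # effectively zero
```

Real fibre interference isn’t anywhere near this clean, of course, which is exactly why making it predictable in the first place is the hard part the researchers solved.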

It will be a while before this technology makes its way into widespread use as, whilst it shows a lot of promise, the application within the lab falls short of a practical implementation. Current optical fibers carry around 32 different signals whereas the system the researchers developed can currently only handle 5. Ramping up the number of channels they can support is a non-trivial task but at least it’s an engineering challenge and not a theoretical one.


SpaceX CRS-7 Fails Shortly After Launch.

It seems somewhat trite to say it but rocket science is hard. Ask anyone who lived near a NASA testing site back in the heyday of the space program and they’ll regale you with stories of numerous rockets thundering skyward only to meet their fate shortly after. There is no universal reason behind rockets exploding as there are so many things in which a single failure can lead to a rapid, unplanned deconstruction event. The only universal truth behind sending things into orbit atop a giant continuous explosion is that one day one of your rockets will end up blowing itself to bits. Today that happened to SpaceX.

SpaceX CRS-7 Explosion

The CRS-7 mission was SpaceX’s 7th commercial resupply mission to the International Space Station, with its primary payload consisting of around 1,800 kg of supplies and equipment. The most important piece of cargo it was carrying was the International Docking Adapter (IDA-1), which would have been used to convert one of the current Pressurized Mating Adapters to the new NASA Docking System. This would have allowed visiting craft, such as the upcoming crewed version of the Dragon capsule, to dock directly with the ISS rather than being grappled and berthed, a method that’s far from ideal (especially when it comes to crew egress in an emergency). Other payloads included things like the Meteor Shower Camera, which was actually a backup camera as the primary was lost in the Antares rocket explosion last year.

Elon Musk tweeted shortly after the incident that the cause appears to be an overpressure event in the upper stage LOX tank. Watching the video you can see what he’s alluding to: shortly after take off there appears to be a rupture in the upper tank, which leads to the massive cloud of gas enveloping the rocket. The event happened shortly after the rocket passed max-q, the point at which the aerodynamic stresses on the craft reach their maximum. It’s possible that the combination of a high pressure event coinciding with max-q was enough to rupture the tank, which then led to the rocket’s demise. SpaceX is continuing its investigation, however, and we’ll have a full picture once they conduct a complete fault analysis.
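For anyone unfamiliar with the term, max-q is simply the point where dynamic pressure, q = ½ρv², peaks as the rocket gains speed faster than the atmosphere thins out. The quick sketch below puts rough numbers on it; the altitude and velocity are ballpark assumptions for a rocket about a minute into flight, not figures from SpaceX’s CRS-7 telemetry.

```python
import math

def air_density(altitude_m):
    """Crude exponential atmosphere: sea-level density, ~8.5 km scale height."""
    return 1.225 * math.exp(-altitude_m / 8500)

def dynamic_pressure(altitude_m, velocity_ms):
    """q = 0.5 * rho * v^2, in pascals."""
    return 0.5 * air_density(altitude_m) * velocity_ms ** 2

# Assumed trajectory point: ~12 km altitude at ~450 m/s, roughly a minute in.
q = dynamic_pressure(12_000, 450)
print(f"q is about {q / 1000:.0f} kPa")  # ~30 kPa, the right order of magnitude for max-q
```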

A few keen observers have noted that, unlike other rocket failures which usually end in a rather spectacular fireball, it appears that the payload capsule may have survived. The press conference held shortly after made mention of telemetry data being received for some time after the explosion had occurred, which would indicate that the capsule did manage to survive. However it’s unlikely that the payload would be retrievable as no one has mentioned seeing parachutes after the explosion happened. It would be a great boon for the few secondary payloads if they were able to be recovered but I’m certain no one is holding their breath.

This marks the first failure in SpaceX’s Falcon 9 program after 18 successful launches, a milestone I’m sure none were hoping they’d mark. Putting that in perspective though, this is a 13 year old space company that’s managed to do things that took its competitors decades. I’m sure the investigations that are currently underway will identify the cause in short order and future flights will not suffer the same fate. My heart goes out to all the engineers at SpaceX during this time as it cannot be easy picking through the debris of your flagship rocket.


The Silent Age: Time Traveler Joe

PC ports of mobile games have mostly been of low quality. Whilst many of these games make use of a base engine that’s portable between platforms, often those doing the porting are the ones who developed the original game, and the paradigms they learnt developing for a mobile platform don’t translate across. There are exceptions to this, of course, however it’s been the main reason why I’ve steered clear of many ported titles. The Silent Age however has received wide and varied praise, even after it recently made the transition to the PC, and so my interest was piqued. Whilst the game might not be winning any awards in the graphics or gameplay department it did manage to provide one of the better story experiences I’ve had with games of this nature.

The Silent Age Review Screenshot Wallpaper Title Screen

You’re Joe, the lowly janitor of the giant research and development corporation Archon. For the most part your life is pretty mundane except for the wild and wonderful things that your partner in crime, fellow janitor Frank, tells you about. One day however you’re called up to management and, lucky for you, it’s good news! You’re getting promoted, taking over all of Frank’s responsibilities because you’ve shown such dedication to your job (with no pay increase, of course, you understand). When you go down to inspect the place where you’ll be doing your new duties, however, you notice something strange: a trail of blood leading into one of the restricted areas. Following that trail starts you on a long journey that will eventually end with you saving the world.

The Silent Age comes to us courtesy of the Unity engine; however, you’d be forgiven for thinking it was an old school Flash game that had been revamped for the mobile and PC platforms. It shares a similar aesthetic to many of the games from the era when Flash reigned supreme, with flat colours, soft gradients and very simple animations. On a mobile screen I’m sure it looks plenty good, although on my 24″ monitors the simple style does lose a little bit of its lustre. Still it’s not a bad looking game by any stretch of the imagination but you can tell which platform it was primarily designed for.

The Silent Age Review Screenshot Wallpaper Good Enough

Mechanically The Silent Age plays just like any other indie adventure game, with your usual cavalcade of puzzles that consist of wildly clicking on everything and trying every item in your inventory to see if something works. The puzzles are really just short breaks between the longer dialogue sections which, interestingly enough, are all fully voiced. There’s a small extra dimension added by the time travel device, allowing you to travel to the past or future at will, but it’s nothing like the mind bending time manipulation made famous by some other indie titles. Other than that there’s really not much more to The Silent Age, something I ended up appreciating as it meant there wasn’t a bunch of other mechanics thrown in needlessly. It’s pretty much the most basic form of an adventure game I’ve played in a while and that simplicity was incredibly refreshing.

The puzzles are logical, with all of them having fairly obvious solutions. There’s no real difficulty curve to speak of as pretty much all of them felt about on par with each other, although there were a few puzzles that managed to stump me completely. Usually this was a result of me missing something or not recognizing a particular visual clue (a good example being the pile of wood in the tunnel under the hospital, which just looked like background to me) so that’s not something I’d fault the developer for. Some of the puzzles were a little ludicrous, requiring a little knowledge about how some things could potentially interact, but at least most of them wouldn’t take more than ten minutes or so of blind clicking to get past. Overall it wasn’t exactly a challenging experience, which I felt was by design.

The Silent Age Review Screenshot Wallpaper CHAINSAWWWW

The PC port was a smooth one as pretty much everything in the game worked as expected. The 2D nature helps a lot in this regard as there’s a pretty direct translation between tapping on the screen and using a mouse cursor, but I’ve seen lesser developers manage to ruin even that. There was one particular problem which caught me out several times, however: my mouse, if it strayed outside the bounds of the main window, would not be captured. So every so often I’d end up clicking on my web browser or whatever else I had open on my second monitor at the time, closing the game down. A minor complaint, to be sure, but one that’s easily fixed.

The story of The Silent Age is one of the better examples I’ve come across recently, especially for a mobile title. Whilst it’s not exactly the most gripping or emotionally charged story I’ve played of late it does a good job of setting everything up and staying true to itself internally. Of course whenever you introduce time travel into a story things start to get a little weird, depending on what model of causality and paradox resolution you subscribe to, and The Silent Age is no exception. However it manages to stay true to the rules it sets up, which is more than most big budget films are capable of. Overall I’d say it was satisfying even if it wasn’t the most engaging story.

The Silent Age Review Screenshot Wallpaper Deathly Silence

The Silent Age is a succinct story told through the medium of video games, one that manages to avoid many of the pitfalls that have befallen its fellow mobile to PC port brethren. The art style is simple and clean, reminiscent of Flash games of ages gone by. The puzzle mechanics are straightforward, ensuring that no one will be stuck for hours trying every single item in their inventory to progress to the next level. The story, whilst above average for its peers, lacks a few key elements that would elevate it to a gripping, must-play tale. Overall The Silent Age was a solid experience, even if it wasn’t ground breaking.

Rating: 7.5/10

The Silent Age is available on PC, Android and iOS right now for $9.99, $6.50 and $6.50 respectively. Game was played on the PC with approximately 2 hours of total play time and 71% of the achievements unlocked.


Australian Copyright Amendment Sails Into Law.

Despite all the evidence to the contrary, rights holders are able to convince governments around the world that piracy is a problem best faced with legislation rather than outright competition. It’s been shown time and time again that access to a reasonably priced legitimate service results in drastic reductions in the rates of piracy and, funnily enough, increased revenue for the businesses that adopt this new strategy. Australia had been somewhat immune to the rights lobby’s ploys for a while, with several High Court rulings not finding in their favour. However our current government (and, unfortunately, the opposition) seems more than happy to bend to the whims of this group, with its most recent bow coming in the form of a website blocking bill.

online infringement

The bill itself clocks in at a mere 9 pages, with the explanatory notes not going much further. Simply put, it provides a legislative avenue for rights holders to compel ISPs, through the use of a court injunction, to block access to sites that host infringing material. How that blocking should be done isn’t mentioned at all, nor is there any mention of the recourse available to a site to have itself unblocked should it find itself the target of an injunction. Probably the only diamond in this pile of horseshit of a bill is the protection that ISPs get from costs borne out of this process, but only if they choose not to fight any injunction that may be placed upon them. However all of that is moot when compared to the real issue at hand here.

It’s just not going to fucking work.

As I wrote last year when Brandis and co. were soliciting ideas for this exact legislation, no matter what kind of blocking the ISPs employ (which, let’s be honest here, will be the lowest and most painless form of blocking they can get away with) it will be circumvented instantly by anyone and everyone. The Australian government isn’t the first to engage in wholesale blocking of sites and so solutions to get around them are plentiful, many of them completely free. Hell, with VPN usage in Australia already very healthy, most people already have a method by which to cut the ISPs completely out of the picture, rendering any action they take moot.

The big problem that I, and many others, have with legislation like this is that it sets a bad precedent that could be used to justify further site blocking policies down the line. It doesn’t take much effort to take this bill, rework it to target other objectionable content and then have that pushed through parliament. Sure, we can hope that such policies won’t make it through due to the obvious chilling effects they might have; however, this legislation faced no opposition from either of the major parties, so it follows that future bills could see just as little resistance. Worse still, there’s almost no chance that it will ever be repealed as no government ever wants to give up power it’s granted itself.

In the end this is just another piece of evidence showing that our current government has a fundamental lack of understanding of technology and its implications. The bill is worthless, a bit of pandering to the rights lobbyists who will wield it with reckless abandon, and it will fail to achieve its goals from day one. Already there are numerous sites telling users how to circumvent it and there is no amount of legislation that can be passed to stop them. All we can hope for now is that this doesn’t prove to be the first step on a slippery slope towards larger scale censorship as the Great Firewall of Australia begins to smoulder.


The Near-Term Future of PC Storage.

I had grand ideas that my current PC build would be all solid state. Sure, the cost would’ve been high, on the order of $1500 to get about 2TB in RAID10, but the performance potential was hard to deny. In the end, however, I opted for good old fashioned spinning rust, mostly because current RAID controllers don’t pass TRIM through to SSDs, meaning I would likely be in for a lovely performance downgrade in the not too distant future. Despite that I was keenly aware of just how feasible it was to go full SSD for all my PC storage and how the days of the traditional hard drive are likely numbered.

ssdgraph

Ever since their first commercial introduction all those years ago SSDs have been rapidly plummeting in price, with the most recent drop coming off the back of a few key technological innovations. Whilst they’re still an order of magnitude away from traditional HDDs in terms of cost per gigabyte ($0.50/GB for SSD, $0.05/GB for HDD) the gap in performance between the two is more than enough to justify the current price differential. For laptops and other portable devices that don’t require large amounts of onboard storage SSDs have already become the sole storage platform in many cases; however, they still lose out for large scale data storage. That differential could close quickly, although I don’t think SSDs’ rise to dominance will be instantaneous past that point.
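To put some quick numbers on that gap, here’s the arithmetic for the kind of 2TB RAID10 array mentioned at the top of this post, using the per-gigabyte figures above (the prices are the only inputs; the rest is just multiplication):

```python
# 2 TB usable in RAID10 means 4 TB of raw capacity (striped mirrors).
SSD_PER_GB, HDD_PER_GB = 0.50, 0.05   # the rough 2015 prices quoted above
raw_gb = 2 * 1000 * 2                  # 2 TB usable, doubled for mirroring

print(f"SSD array: ${raw_gb * SSD_PER_GB:,.0f}")  # ~$2,000, in the ballpark of the ~$1,500 build above
print(f"HDD array: ${raw_gb * HDD_PER_GB:,.0f}")  # ~$200, the order-of-magnitude gap in practice
```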

One thing that has always plagued SSDs is the question of their durability and longevity, as the flash cells upon which they rely have a defined life in terms of program/erase cycles. Whilst SSDs have, for the most part, proven reliable even when deployed at scale, the fact is that they’ve really only had about 5 or so years of production level use to back them up. Compare that to hard drives, which have track records stretching back decades, and you can see why many enterprises are still tentative about replacing their fleets en masse; we just don’t know how the various components that make up an SSD will stand the test of time.
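The “defined life” part is easy enough to reason about on the back of an envelope: total endurance is roughly capacity × program/erase cycles ÷ write amplification. The sketch below uses made-up but plausible numbers for a consumer drive of the era; none of them come from a specific spec sheet.

```python
# Rough endurance arithmetic for a hypothetical consumer SSD (all figures assumed).
capacity_gb = 500
pe_cycles = 3000            # assumed program/erase cycles per flash cell (consumer MLC)
write_amplification = 2.0   # assumed controller overhead (physical writes per host write)
daily_writes_gb = 50        # assumed host writes per day

total_host_writes_tb = capacity_gb * pe_cycles / write_amplification / 1000
years = total_host_writes_tb * 1000 / daily_writes_gb / 365
print(f"~{total_host_writes_tb:.0f} TB of host writes, roughly {years:.0f} years at {daily_writes_gb} GB/day")
```

Wear, in other words, is rarely the practical worry; it’s the lack of long-term field data on everything else in the drive that keeps enterprises cautious.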

However concerns like that are likely to take a back seat if things like a 30TB drive by 2018 come to fruition. Increasing capacity on traditional hard drives has always proven to be a difficult affair as there are only so many platters you can fit into the standard form factor. Whilst we’re starting to see a trickle of 10TB drives into the enterprise market they’re likely not going to be available at a cost effective price point for consumers anytime soon, and that gives SSDs a lot of leeway to play catchup to their traditional brethren. That means cost parity could come much sooner than many anticipated, and that’s the point where the storage decision is effectively already made for the consumer.

We likely won’t see spinning rust disappear for the better part of a decade but the next couple of years are going to see something of a paradigm shift in terms of which platform gets considered first. SSDs already reign supreme as the drive to have your operating system residing on; all they need now is a comparable cost per gigabyte to graduate beyond that. Once we reach that point it’s likely to be an inflection point in the way we store our data and, for consumers like us, a great time to upgrade our storage.


Half a Decade Later E3 Becomes Relevant Again.

For as long as I’ve been writing this blog E3 hasn’t been much more than a distraction when it rolls around. Indeed in the 7 years I’ve been writing about games I’ve only ever covered it twice and usually only in passing, picking out a couple of things that piqued my interest at the time. The reasons behind this would be obvious to any gamer as E3 has been largely irrelevant to the gaming community since about 2007, with most of the big announcements coming out of other conventions like PAX. However this year something seemed to change, as both the gaming industry and the community rallied behind this year’s expo, making it one of the most talked about to date.

e3-2015

E3’s quick fall into obscurity was fuelled by the extremely questionable decision back in 2007 to close the event off to the general public and only allow games industry representatives and journalists in. The first year after this was done saw attendance drop to a mere 10,000 (down from 60,000 the year previous) and the following year saw it drop by half again. The other conventions that popped up in E3’s absence soaked up all these attendees and, by consequence, all of the attention of the games industry and press. Thus E3 spent the last 5 years attempting to rebuild its relevance but struggled to find a foothold against such stiff competition.

This year, however, has proven to be one of E3’s greatest on record, with attendance above 50,000 for the first time since that awful decision was made all those years ago. The rise in attendance has come hand in hand with a much greater industry presence, with major game developers and publishers out in force. There were also numerous major announcements from pretty much all of the large players in the console and PC markets, something we really hadn’t seen at a single event for some time. For someone who’s been extremely jaded about E3 for so long it honestly took me by surprise just how relevant the show had become and what that might mean for the conference’s future.

The challenge that E3 now faces is building on the momentum it has created this year in order to re-cement its position as top dog of the games conferences. In its absence many of the larger players in the games industry opted to either patronize other conferences or set up their own, many of which have since gone on to be quite profitable events (like BlizzCon, for example). E3 will likely never be able to replace them; however, given the resounding success of this year’s show, there is potential for it to start drawing business away from some of the other conferences.

In the end though more competition in this space will hopefully lead to better things for the wider gaming community. It will be interesting to see if E3 can repeat its success next year and what the other conventions will do in response.