Monthly Archives: January 2013

Automating The Configuration of Brocade Interconnects.

Once an IT environment gets past a certain size the requirement for automation grows exponentially. When you're working in an environment with a handful of servers and a dozen or so PCs it's easy to just spend the time doing everything manually. If you're in a situation like I am right now, where a single deployment covers some 400+ physical servers, that approach isn't particularly feasible, especially if you want any level of consistency across the fleet. It should come as no surprise then that I spend the vast majority of my time automating the commissioning of IT infrastructure, and since I don't want to do anything 400 times over it usually sees me trying to automate things that really don't want to be automated.

Dell 8/4 FC SAN Fibre Interconnect Module

Take for instance this little fellow, a Dell 8/4 Fibre Channel Interconnect module for a M1000e chassis (sounds sexy, right?). Don't let that Dell badge on the outside fool you, like a lot of Dell hardware it's actually a rebranded Brocade fibre switch under the hood, albeit with a significantly pared down feature set. For the most part it's just a dumb NPIV device that acts as a pass-through for high speed connections, but it does have a little bit of smarts in it, enough that it would typically come under the purview of your on-site storage team. However due to its pared down nature it doesn't work with any of Brocade's management software (at least none that we have here) and so the storage team wasn't particularly interested in managing it. Fair cop, but there were still a few things that needed to be configured on it, so my colleague and I set about figuring out how to do that.

Usually this is when I'll track down the CLI or automation guide for the particular product and then dig around for the commands I need in order to get it configured. Try as I might I couldn't find anything from Brocade themselves, as they usually recommend using something like DCFM for configuration. The devices do have an SSH interface with a rather comprehensive set of commands in it, but there didn't appear to be any way to get at these programmatically. We could, of course, use something like Tcl with Expect to essentially automate this process but that's traditionally quite messy, so we asked our on-site resident from Brocade if there was a better solution.

There isn’t, apparently.

So off we went building up a Tcl file that would do the configuration for us and initially it all worked as expected (pun completely unintentional, I assure you). The script worked every time in our test environment once we had the initial kinks worked out, and we were confident enough to start moving it up the chain. Of course this is when problems started to become apparent, and during further testing we began to find some really weird behaviours coming from the switches, things that aren't mentioned anywhere nor are obvious unless you're doing exactly what we were doing.

In order to build up the original Tcl script I'd PuTTY into one of the switches and execute the commands by hand. Then, once I had confirmed the changes I wanted had actually been made, I'd put them into the script. Pretty standard stuff, but after re-running the scripts I'd find they'd inexplicably fail at certain points, usually when attempting to reconfigure a switch that had already been deployed. Essentially I'd look for an "Access denied" message after trying the default password and then send along the correct one afterwards, as that's all that was required when using PuTTY.

However looking at the logs, not only is the message the switch sends back different, saying "Login incorrect" instead, it also doesn't just ask for the correct password; it requests the user name again as well. There are also significant differences in the way output is written between the two interfaces, which means for things like Expect you have to code around them, otherwise you'll end up trying to send input at the wrong times and reading lines you don't want. It's clear that there are two interfaces to these Brocade switches and they differ enough that coding against one is incompatible with the other, which is just unacceptable.
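To give you an idea of the kind of branching this forces on you, here's a rough Expect sketch of the login handling we ended up needing. To be clear this is a minimal illustration rather than our actual script: the host, account and passwords are all placeholders, the prompt strings ("password:", "login:", the trailing ">" and so on) are assumptions you'd want to verify against your own switches, and it assumes the SSH host key has already been accepted.

```tcl
#!/usr/bin/expect -f
# Minimal sketch of the login handling described above. The credentials
# are placeholders and the prompt patterns are assumptions that will
# likely need adjusting for your particular firmware revision.
set timeout 30
set host    [lindex $argv 0]
set user    "admin"
set default "password"
set correct "RealPassword1"

# Assumes the host key is already known; a real script would also need
# to handle the first-time "yes/no" host key prompt.
spawn ssh $user@$host
expect "password:"
send "$default\r"

expect {
    "Access denied" {
        # Interactive-style behaviour (what you see in PuTTY): the
        # switch simply re-prompts for a password, so send the real one.
        send "$correct\r"
        expect ">"
    }
    "Login incorrect" {
        # Scripted-interface behaviour: the switch asks for the user
        # name again before it will accept another password.
        expect "login:"
        send "$user\r"
        expect "password:"
        send "$correct\r"
        expect ">"
    }
    ">" {
        # Default password accepted: a factory-fresh switch.
    }
}

# At this point we should be sitting at the switch prompt, ready to
# start sending configuration commands.
send "switchshow\r"
expect ">"
send "exit\r"
expect eof
```

Even in this cut down form you can see how brittle it is; a firmware update that changes any of those prompt strings breaks the whole thing, which leads nicely into my next point.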

Realistically what's required is for Brocade to release some kind of configuration tool like Dell's RACADM which provides a direct hook into these devices so they can be automated properly. I've found old forum posts that reference something like that for Perl but, as far as I and the Brocade people I've talked to are aware, there's nothing like that available for these particular devices. It's not like it's impossible to code Expect up to do what we want it to, but it's ugly, unmaintainable and likely to break with firmware updates. If there is a better solution I'd love to hear it but after all the time I have invested in this I'm pretty sure there isn't one.

Unless Brocade has something in the works, nudge nudge 😉

Bigelow Aerospace Brings NASA’s Technology Full Circle.

If you wander over to the Space section of this blog you won't have to look far to figure out which company I have a huge man crush on. Whilst SpaceX might be the toast of the private space flight industry thanks to their incredibly impressive achievements and lofty goals they're far from the only player in the game, and they're currently focused only on getting cargo and people into orbit; keeping them there is still someone else's job. That's not to say no one is working on that particular problem, however, and Bigelow Aerospace, a company I've mentioned in passing a couple of times, is one such company.

Bigelow Aerospace BA-2100 Inflatable Space Module

Bigelow Aerospace is the brainchild of Robert Bigelow, funded primarily from the fortune he made from his ownership stake in the Budget Suites of America hotel chain. Unlike most private space companies, which are primarily focusing on the launch side of the equation, Bigelow is instead focusing solely on the staying up there part, developing technology for a new kind of space station that promises to deliver much larger usable volumes at a fraction of the cost of traditional space station modules. They're in fact so far along the development path that they already have two of their modules, Genesis I and Genesis II, in orbit right now, and they've been there for the better part of 6 and 5 years respectively.

Their modules are based on a pretty novel idea that NASA was developing back in the 1990s. Dubbed TransHab, the idea was to build modules that launched at a compact size but could then be inflated once in orbit to provide much more room. Additionally the inflatable design means that it's much more resistant to micrometeorite impacts as the outer surface will flex, reducing risks to the crew and lowering ongoing maintenance costs. Unfortunately, due to the budget overruns of the International Space Station project, TransHab was ultimately cancelled, but Bigelow licensed the technology from NASA and set about creating its own versions.

The goal for Bigelow was to put his own private space stations in orbit, essentially extending his hotel chain into outer space. Whilst they've had functional verification of their systems for a long time now, their biggest issue has been the lack of transportation to get people up there. Seats on Soyuz craft are now going for upwards of $50 million and Bigelow's plans just aren't feasible at that price point. Indeed the current lack of usable alternatives prompted Bigelow to slash its staffing by over half at the end of 2011, although they have begun rehiring in preparation for such services coming online in 2016.

What was pretty incredible though was the recent news that Bigelow has won a contract with NASA to provide an inflatable module for the International Space Station. Whilst details about what the module will actually be are scant (that's apparently scheduled for a press conference today), it's a safe bet that it'd be something like their planned BA-330, although it's entirely possible that they might go for gold and debut their giant BA-2100 (pictured above), which would almost triple the current liveable volume of the ISS. It may seem counter-intuitive for NASA to buy its own technology back off a private manufacturer, but Bigelow has invested some $180 million into getting the project this far, a sum that I'm sure no one at NASA wanted to spend when they already have so much invested in rigid modules.

The amount of innovation we're seeing in the private space industry is simply staggering as we're fast approaching the point where the only thing that stands between you and your own private space station is the capital required. Sure that's still no small barrier, but the fact that we're commoditizing space travel means that it'll soon be within reach for all of us, much like the commercialization of air travel last century. NASA's contract with Bigelow is proof that the nascent space company is at the point where its technology is ready for prime time and I can't wait to see one of their modules up in space.

Consoles Aren’t Going Away and Mobiles Won’t Take Over.

You wouldn't have to be a reader here for long to know that my preferred gaming platform is the PC, but I'm pretty sure it comes as no surprise that I have all of the current generation consoles (apart from the Wii U, but I do have a Wii). I grew up with both platforms and arguably I was more of a console gamer when I was younger, but as time went on I found that PC gaming just sat better with me. What I'm getting at here is that whilst I might be a PC gamer I'm certainly not one to call for the demise of the consoles, and indeed I believe the platform will be around for quite a long time to come.

Current Generation Consoles: PlayStation 3, Xbox 360, Wii

Others don’t share that view, in particular Ben Cousins who wrote this article on Kotaku outlining the reasons why consoles are going away:

Many people (me included) have been saying publicly that they think the ‘console’—dedicated hardware designed primarily for gaming—is on its way out.

I used to keep a list of famous developers and executives who shared my view, but it got too big to maintain!

Anyway, here’s just two whom you might care about: David Jaffe and Hideo Kojima.

He then goes on to list 5 data points and 2 assumptions that back up his claim and on the surface they appear plausible. Indeed many of the supporting points are based at least partially on ideas that everyone involved in the games industry already knew about, but I feel the conclusions drawn from them are a little over-reaching, enough so that his idea that consoles are going away is at best premature and at worst grossly misinformed.

Take for example the first data point about consoles being sold at a loss. This is no revelation as console makers have been doing it for decades and have still managed to turn a profitable business from it. Indeed, while Nintendo might be breaking its usual rule of not selling consoles at a loss, it doesn't take much for them to become profitable, with the sale of a single title being enough to push a console over the line. In fact if you look at the past 5 years things look pretty good for the major consoles, especially for Microsoft and Nintendo. I believe Cousins is being slightly unfair by going back further than that because those years were right at the beginning of the current generation's life and that's arguably the point at which the greatest losses will be incurred.


I'm also not sure how 40% of sales occurring after the price drops supports his idea that those buyers are somehow the mainstream gamers. Taken literally that means the majority, i.e. more than 50% of current gen console owners, bought their console before those price drops/product revisions occurred. I'd also argue that a portion of those new sales were current owners upgrading older consoles; in the case of the Xbox 360 the original was something of a jet engine when in use and the subsequent revisions vastly improved that experience. I've heard similar tales from PS3 Slim owners as well, so I don't feel the "mainstream gamer" argument holds up on console sales figures alone.

It's no secret that mobile devices are pervasive, but it's also well known what they're capable of and what their primary use is. Indeed console makers are aware of this and have been working to expand their console experience onto the mobile platform. Microsoft has long been working towards its Three Screens idea, which would see the experience across the Xbox 360, Windows Phone 8 and Windows 8 unified, enabling developers to provide the same experience regardless of the platform. We're still a long way from achieving that, and whilst smartphones do a good job of getting close to the console experience they're still not in the same league, something which console owners are acutely aware of.

The rest is speculation based on those points which I won't bother digging into, but suffice to say I don't get the feeling that consoles are going anywhere in a hurry and I'm willing to say that there'll definitely be several more generations to come. The mobile market might be growing but I believe it's an additive market, one that's bringing more gamers in, not one that's cannibalizing gamers away. There's also the fact that consoles are increasingly becoming the media centre of the house, something that smartphones are going to have a hard time replacing. Still, we're both deep in speculation territory here, so the only way to settle this will be to wait it out and hope that both our opinion pieces are still online in a decade's time.

I Have Seen The HFR Future, And It Is Good.

As long time readers will know I've got a bit of a thing for the technology of recording pictures, both of the still kind in the form of photography and, more recently, that of the cinema. I was truly fascinated by the amount of work that James Cameron put into creating Avatar, as in the past 3D had simply been a gimmick used for attractions at amusement parks and not something for serious cinema. I feel like Avatar changed that (although I do admit that it's been heavily misused since then) as it opened up 3D as another tool in the director's cinematography kit, one that can be used to great effect.

The Hobbit Rivendell

Indeed apart from Avatar and TRON: Legacy, both films designed to be visual masterpieces, there hadn't been any movies that I felt used 3D to proper effect. There were many where it was inappropriate (Hot Tub Time Machine, anyone?) and some where it didn't add or subtract anything (Dredd comes to mind), but there weren't any that took that careful, considered approach to ensure the 3D was used appropriately. That was until I saw this video from the production set of The Hobbit: An Unexpected Journey.

Whilst I'm sure not everyone will find that video as enthralling as I did, the short of it is that the 3D rigs used in the filming of The Hobbit are essentially unique in their design, as they set out to do things that no movie had ever done before. Primarily this was because they didn't go with more traditional 3D cameras, which have specially designed lenses and sensors in order to do 3D. Instead they used RED EPICs, cameras capable of delivering resolutions up to 5K and, more importantly, frame rates twice that of traditional cinema. These cameras then required special rigs, marvels of engineering in and of themselves, in order to get the 3D effect right.

I had been drooling over the camera setup for a long time and finally managed to see the final result on Saturday. Now I was a bit worried about what I was about to see as many film critics had said awful, awful things about how HFR had ruined the entire experience for them, and since I had broken my rule of avoiding the hype my expectations were much higher than they normally are. It got worse when I got to the cinema and was handed a pair of polarized 3D glasses rather than the active-shutter ones I thought they would use, but I didn't let that faze me and settled in for the next 3 hours.

What followed blew even my fan boy level expectations out of the water.

You'd be forgiven for thinking that the above still was a retouched shot (done solely for the promos) but the entire movie is like that, drenched in luscious colour and with incredible amounts of detail everywhere. I will admit that the 48 FPS takes a little getting used to, thanks to the prevalence of high frame rates in the cheaper forms of media elsewhere, but for things like wide open panoramas and dialogue scenes it's something that you've just got to see in person. It's not perfect yet, as some of the green-screened sections didn't feel quite right, but Peter Jackson really is onto something here and I'm sure the next 2 instalments will only improve on the original.

I won't go as far as to say that this is the future of cinema however, as there was an incredible amount of investment required to get this to work the way it did and, whilst duplicating it might be cheaper thanks to The Hobbit footing the R&D bill, I can't see many wanting to take it on. Indeed I hope it stays as esoteric as it is now, as that will mean only those who want to invest the time in doing HFR 3D right will attempt to do so, rather than the rampant bandwagoning we saw after Avatar premiered. With all that said it should come as no surprise that I recommend you see The Hobbit and do so in HFR 3D, as that is the way it was intended to be seen.

Red Hot Metal And Water – The Reaction Isn’t What You’d Expect.

There's a really interesting experiment you can do in the comfort of your own home that demonstrates the effect I'm about to show you. All you need is a frying pan and some water. Heat up the frying pan until it's good and hot and then flick droplets of water onto it. Curiously the droplets won't instantly burst into little puffs of steam; instead they'll skitter around on the surface of the pan in apparent defiance of the blazing surface underneath them. This effect happens when any kind of liquid comes into contact with a surface past a certain temperature, but I hadn't really considered what would happen if you put the surface in the liquid:

The phenomenon at work here is called the Leidenfrost effect. It's a pretty cool reaction whereby an initial layer of vapour, formed by a liquid hitting a sufficiently hot surface, acts as a protective barrier, which is what allows those water droplets I described earlier to skitter around rather than instantly turning into steam. It's clearly visible in the video at the start, where a pocket of water vapour forms around the outside of the red hot sphere. The layer eventually collapses as the vapour isn't a perfect insulator, but the sphere manages to stay quite hot for a lot longer than you'd expect.

One thing I can't figure out a good explanation for, though, is the incredible sounds that are produced. The rapid generation of steam could possibly explain part of it, as some of the sounds are similar to what you hear from, say, a steam wand on a coffee machine, but most of them have a definite metallic twang to them. It's quite possible that all of the noises are coming from the ball itself as it cools down, much like some cars which make a distinct "tink" noise when turned off (the noise comes from the exhaust pipe cooling down). I wasn't able to track down a name or reliable explanation for this effect however, so if you've got one I'm all ears 😉

The Walking Dead: I’m Sorry Clementine.

Much to the chagrin of many of my friends I haven't really got into the whole Walking Dead craze that seems to have swept the Internet over the past couple of years, mostly because my wife went ahead and started watching the show without me. Couple that with the fact that I'm a terrible reader (I only seem to find time for it on long haul flights) and I've also given the comics on which the whole craze is based a miss. I tell you this because The Walking Dead game seemed to attract just as much fandom as the IP's other incarnations, but that was most certainly not the reason I decided to play it. Instead I had heard that Telltale Games had done well with this particular franchise, and since their treatment of Sam & Max was pretty decent I figured the hype was probably well earned.

The Walking Dead Screenshot Wallpaper Title Screen

The Walking Dead takes place in modern day America with you playing as Lee Everett, a university professor who's recently been convicted of killing his wife's lover and is on his way to jail. On the way, however, the police car you're in hits an unidentified person, sending the car tumbling over an embankment and leaving you trapped inside. After looking around it's clear that something is amiss, with the officer who was driving you rising from the dead and attempting to attack you. Things only seem to get worse from there on out as you struggle to survive and protect the few people you manage to team up with.

Whilst I haven't played many Telltale games (although I've watched someone play through most of the Sam & Max series) I still got the feeling that their titles had a distinctive style, and The Walking Dead certainly fits in with that idea. Due to the extreme cross-platform nature of The Walking Dead the graphics aren't particularly great, but the heavy use of comic-book stylization (I've seen people say it's cel-shaded but I'm not entirely sure about that) means it still works well. The animations and sound effects are somewhat rudimentary but this is made up for in spades by the voice acting, which I'll touch on more later.

The Walking Dead Screenshot Wallpaper Little Dairy Farm of Horrors

Whilst The Walking Dead is more like an interactive movie with game elements, the core mechanics are those of an adventure game coupled with a few modern innovations like quick time events to drive some of the more action oriented sections. If you've played other titles in the same genre like Heavy Rain then this style will be very familiar to you: the game play elements are there to serve as a break from the usually quite intense story sections. Of course decisions you make during these sections can also have an impact on how the story unfolds, something which The Walking Dead informs you of at the start of every episode.

Even for a modern adventure game the puzzles that are thrown at you are rather simplistic, usually consisting of you tracking down a particular item or following the bouncing ball in order to progress to the next area. Some of the puzzles are also completely optional, as far as I could tell, as there were a couple of times when I'd do things that didn't seem to have any impact past the scene in question. For a game that is heavily focused on the story rather than the game play I can't really fault it for this, as hard puzzles usually only serve to break immersion and frustrate the player, but if you were expecting The Longest Journey levels of brain ticklers then you'll be disappointed.

The Walking Dead Screenshot Wallpaper Duck Thinks You're Incredibly Awesome

What I was thankful for was the simple inventory system that shies away from the combine-this-item-with-that-item mechanic that a lot of games like this have. Usually that just ends in frustration as you try to find the right item combination to solve the problem, something I'm not usually a fan of. Instead, if you have an item that can interact with something in the world it'll show up as an option, taking a lot of the guesswork out of the equation. Sure, figuring something out can be fun and The Walking Dead certainly has some satisfying challenges, but playing inventory item roulette isn't one of them.

The Walking Dead is, for the most part, bug and glitch free; however, I had several occasions when the game broke on me in one way or another. Typically this took the form of the keyboard or mouse simply not responding during an interactive section, rendering me unable to progress any further until I reloaded. This wasn't usually a problem but sometimes it did mean losing a bit of progress, forcing me to replay through a section. By far the worst bug was when a particular cut scene somehow managed to double itself up, with all the characters saying their lines twice over the top of each other and the animations attempting to do the same. Personally I'd put this down to the multi-platform release, which means the amount of time Telltale could spend on QAing each platform was reduced significantly. In all honesty though I thought most of these bugs would have been ironed out given the time since the initial release.

The Walking Dead Screenshot Wallpaper Lee and Clementine

Realistically though you won't be playing this game for the mechanics, you'll be playing it for the story. The Walking Dead tells you in no uncertain terms that the choices you make will affect the outcome of the game and that's 100% true. Depending on the choices you make certain characters may or may not be alive, people might react to you differently or you might end up in a situation that you didn't expect to find yourself in. At the end of each episode you'll also be greeted with a statistics screen which shows how your choices lined up with the greater community, and the results can be rather surprising at times.

What really got me initially were the small decisions that I’d make in the heat of the moment having drastic repercussions later on, sometimes right after doing so. Traditionally your choices in these kinds of games were almost irrelevant due to the complexity of creating multiple story arcs that have some level of coherency. The Walking Dead still has decisions like that at times during the game but it’s hard to know which one is which before you make it. I can’t tell you the number of times that I found myself wanting to go back and change something because the result wasn’t what I had expected but since there’s no quick save/load function (a deliberate omission) there’s really no way to do it unless you want to play the whole episode over again. Even then you might not be able to shape the story in the way you want.

The Walking Dead Screenshot Wallpaper Kenny Faces His Demons

I also want to give a lot of credit to the voice acting as it’s not easy to make something fully voice acted and have it come out as well as it has in The Walking Dead. Whilst there can be some strange fluctuations in tone should you choose different types of responses (Lee usually has passive, neutral and aggressive options) the sound bites themselves are well spoken and full of emotion which is probably one of the reasons I found it so easy to sympathize with the characters. There’s been quite a few games I’ve played recently that have been ruined by sub-par voice actors so The Walking Dead was a welcome change and one that I hope more game developers take note of.

The story is one of those great examples where I could hate everything that was happening but still feel a deep emotional connection to most of the characters. The relationship between Lee and Clem is a beautiful one and, whilst I won't spoil the ending, anyone who's been through it will tell you that it's utterly heartbreaking, to the point where I was just staring at the monitor, not wanting to accept what was happening. From what I can gather though this is what The Walking Dead franchise is all about and it does a damn good job of making you care for a lot of people before putting them through all sorts of hell, taking you along with them.

The Walking Dead is a great example of an episodic game done right, as each of the episodes stands well on its own but together they form something that is very much greater than the sum of its parts. The graphics are simple yet well executed, the voice acting superb and the story so engrossing that you're likely to be thinking "what if" for a long time after you finish it. If you're a fan of adventure games or The Walking Dead itself then there's going to be a lot to love in this cinematic adventure game and I can't recommend it enough.

Rating: 9.0/10

The Walking Dead is available on PC, PlayStation 3, Xbox 360 and iOS right now for $24.99, $29.99, $29.99 and $14.99 respectively. Game was played on the PC with around 10 hours played and 100% of the achievements unlocked.

Windows RT Running on ARM Has Full Win32 Compatibility.

As far back as I can remember the differences between the full version of Windows 8 and the tablet version, now dubbed Windows RT, were made pretty clear. Whilst the Modern UI side of them was going to be essentially identical, the full version of Windows wasn't going to run on anything that wasn't x86 compatible and RT would be the version that could run on low power systems like ARM. This, logically, came with some drawbacks, the largest of which was the omission of the traditional desktop environment on Windows RT devices. In all honesty this didn't bother me as Microsoft is making a version of their Surface tablet (and I'm sure others will as well) that runs the full desktop anyway.

The delineation also made a lot of sense due to the different markets that both versions were targeting. The full version is squarely aimed at the desktop/laptop space whilst the RT version is strictly for mobile computing. In terms of functionality there’s a lot of crossover between these two spaces but the separation essentially meant that you had your desktop with its oodles of backwards compatibility that Microsoft is known for whilst also getting that nice, highly focused tablet environment should you want it.

However, as it turns out, Windows RT is far more full featured than I first thought and is capable of running Win32 applications:

Windows RT Running Recompiled Win32 Programs

Thanks to one intrepid user, Clrokr, over at XDA Developers it has come to light that Microsoft actually included full Win32 compatibility in the Windows RT devices that run on the ARM architecture. Whilst this doesn't mean you can straight up run x86 executables on the platform, it does mean that any Windows application you have the source for can be recompiled to run, without issue, on Windows RT devices. The above screenshot is from another user, peterdn, who recompiled PuTTY to run on ARM and it appears to function just fine. Other applications have been tested as well and shown to work as you'd expect.

Thinking about it more clearly this shouldn't have come as a surprise, as the architecture diagram for Windows 8 clearly shows that C/C++/C# are fully supported on both platforms, and the inclusion of the desktop on Windows RT devices (again something I wasn't aware of) would have you thinking everything was there to support this. As it turns out the only thing stopping it from working in the first place was a runtime authentication level that was hard coded to only allow Microsoft-signed applications to run in such an environment. The jailbreak that Clrokr details in this post is simply an in-memory overwrite of this value, which allows any application to run. From there you just need to recompile your application and you're golden.

The reasons for the lockout make sense from a business point of view: Microsoft was trying to create a pristine tablet environment that was tightly controlled in order to deliver a better experience. At the same time, however, porting all of the underlying architecture to ARM would have required quite a bit of effort, and locking that functionality away from people seems like a strange idea. Whilst I'm not going to say they should unlock it for everyone, having it as a configurable option would have meant that most users wouldn't know about it but power users, like the ones who discovered this, could take advantage of it. I haven't seen whether Microsoft has made an official response to this yet but I'm sure they'd win more than a couple of fans if they did, and it doesn't look like it would be that hard to implement.

I was genuinely surprised by this as I hadn't caught on that pretty much all of Windows, including everything that makes it tick under the hood, had been ported across to the ARM architecture. I had believed it was just a port of the core functionality required to support the WinRT framework, but as the above screenshots prove, Windows RT devices are pretty much fully fledged copies of Windows; they just need their applications recompiled in order to work. Of course how those applications fare versus their modernized counterparts in a tablet environment remains to be seen, but it's interesting that the option is there and that Microsoft has gone to such lengths to keep people from fiddling with it.


Game of the Year 2012.

2012 was the year I decided to ramp up my game reviews significantly, aiming to get at least one done per week. I got pretty close to that goal, managing to get through a grand total of 48 games last year, well over double the previous year's tally. I have to say that I really enjoyed the whole experience as I often found myself going outside my comfort zone in order to find something to review, and the number of indie games I played this year is more than all the years prior put together. Now that 2012 is firmly in the rear view mirror it's time for me to reflect on all the games that I've played and crown one of them Game of the Year 2012.

As always here’s a list in chronological order of the games I reviewed during 2012:

Now people who've been here a while (and have read my Guide to Game Reviews on The Refined Geek) will know that my review scores tend towards the infamous 7 to 10 scale rather than 0 to 10, but from time to time I'll venture below that curve for games that really deserve it. Notable mentions that did this include Lone Survivor, I Am Alive and (drum roll please) this year's winner of the lowest score received: Dear Esther. My review of that game was probably one of the most controversial reviews I've ever written, with people telling me I simply "didn't get it" all the way up to saying that it wasn't fair for me to judge it as a game because it wasn't one. Sadly nothing anyone said to me could change the horrific experience I had with Dear Esther, and it gives me an undue amount of pleasure to give it the Wooden Spoon as the worst game of 2012.

Whilst 2011 saw me give a notable mention to Gemini Rue for being a stand out indie game of that year I don't feel like I can do the same this year: there are just so many deserving titles and, unlike Gemini Rue, nearly all of them got the praise they deserved. Indeed the reason I found out about (and subsequently played) so many indie titles this year was because of the attention they were receiving in the larger video games press, and if it wasn't for a few kind words from some of my trusted sources many of these indie games might not have seen a review here. Whilst I'll stop short of giving an award to the indie game scene as a whole (because that's incredibly lame) I will say that I'm looking forward to what the indie scene brings forth in 2013 and beyond.

One game that I'd like to give an honourable mention to, since it is by far my most played game of 2012, is Defense of the Ancients 2 (DOTA 2). Whilst I started playing it back towards the end of 2011 I really didn't get that into it until just after I wrote my initial review of it, but after that my play time snowballed considerably. This was helped a lot by the fact that a cadre of my competitive gaming friends joined along with me, which fuelled my addiction to it to perilous heights. Today I've played over 600 games and racked up well over the same number of hours playing, watching and talking about DOTA 2, and Valve deserves an extraordinary amount of credit for making this game what it is today. It's not my game of the year, since it's more like a meth addiction than anything else, but that doesn't detract from its accomplishments.

I'll be honest, choosing my game of the year (even with the beautiful hindsight granted by having a big list of games I've played right there to look over) was tough. Whilst there were a lot of good games there were no amazing stand outs like there were the year previous. Going by review scores the best game of last year for me was Journey and, whilst I was very tempted to give it that honour like IGN has done, I couldn't shake this feeling in the back of my head that there was another game that was more deserving; I just couldn't figure out which one to pick. The answer came to me, funnily enough, in the middle of a New Year's Eve party in the early hours of the morning and I still agree with that decision today.

My Game of the Year for 2012 is To The Moon.

If the fact that I'm fighting back tears right now isn't proof enough that this game had a massive impact on me then I don't know what is. To The Moon is one of those games that eschewed game play in favour of telling a beautiful, gripping story. Sure the game play was flawed and the disjointed pacing was one of the reasons it didn't score better than Journey, but if just thinking about it can cause that kind of reaction in me then I know it had an impact that few games have had. I could continue gushing about it for hours if I wanted to but you really need to experience it for yourself, as it's an incredibly personal experience, one that will stick with you for a long time.

I had debated whether or not to continue my one review per week deal this year as, whilst I thoroughly enjoyed the experience and the opportunities it has granted me (2 games this year were sent to me for review, a 100% increase on last year!), it does take a fair bit of time to get through them. However, considering the amount of DOTA 2 I've managed to fit in over the past year, I figure that cutting back on that in favour of more games will see deadlines hit more frequently, meaning more regular reviews for you, my readers. I won't make any grandiose promises about reviewing more games this year than last but I'll guarantee I'll try my hardest to get one out a week and continue to pillage the vast reaches of all game genres and developers.

P.S. What was your game of the year? I’m really keen to know.

2013 Might Be Linux’s Year For Gaming.

The de facto platform of choice for gamers used to be the Microsoft Windows based PC; however, the last decade has seen that change to some form of console. Today, whilst we're seeing something of a resurgence in the PC market thanks in part to some good releases this year and ageing console hardware, PCs take somewhere on the order of 5% of the video game market. If we then extrapolate from there using the fact that only about 1~2% of the PC market is Linux (although this number could be higher if restricted to gamers), putting Linux gaming at roughly 0.05~0.1% of the overall market, you can see why many companies have ignored it for so long; it just doesn't make financial sense to get into it. However there have been a few recent announcements that show an increasing amount of attention being paid to this ultra-niche, and that makes for some interesting speculation.

Linux Distros Tux

Gaming on Linux has always been an exercise in frustration, usually due to the Windows-centric nature of the gaming industry. Back in the day Linux suffered from a lack of good driver support for modern graphics cards and this made it nearly impossible to get games running at an acceptable level. Once that was sorted out (whether you count binary blobs as "sorted" is up to you) there was still the issue that most games were simply not coded for Linux, leaving its users with very few options. Many chose to run their games through WINE or Cedega, which actually works quite well, especially for popular titles, but many were still left wanting titles that would run natively. The Humble Indie Bundle has gone a long way towards getting developers working on Linux but it's still something of a poor cousin to the Windows platform.

Late last year saw Valve open up beta access to Steam on Linux, bringing with it some 50-odd titles to the platform. It came as little surprise that they did this considering they did the same thing with OS X just over 2 years ago, which was undoubtedly a success for them. I haven't really heard much on it since then, mostly because none of my gamer friends run Linux, but there's evidence to suggest that it's going pretty well as Valve is making further bets on Linux. As it turns out their upcoming Steam Box will be running some form of Linux under the hood:

Valve’s engineer talked about their labs and that they want to change the “frustrating lack of innovation in the area of computer hardware”. He also mentioned a console launch in 2013 and that it will specifically use Linux and not Windows. Furthermore he said that Valve’s labs will reveal yet another new hardware in 2013, most likely rumored controllers and VR equipment but we can expect some new exciting stuff.

I'll be honest and say that I really didn't expect this, even with all the bellyaching people have been doing about Windows 8. You see, whilst Valve can brag about 55 titles already being on the platform, that's only around 2% of their current catalogue. You could argue that emulation is good enough now that all the titles could be made available through the use of WINE, which is a possibility, but Valve doesn't offer that option with OS X currently so it's unlikely to happen. Realistically, unless developers already have intentions of doing a Linux release, the release of the Steam Box/Steam on Linux isn't going to be enough to tempt them to do it, especially if they've already recovered their costs from PC sales.

That being said, all it might take is one industry heavyweight to put their weight behind Linux to start a cascade of others doing the same. As it turns out Blizzard is doing just that, with one of their titles slated for a Linux release some time this year. Blizzard has a long history with cross platform releases as they were one of the few companies doing releases for Mac OS decades ago, and they've stated many times that they have a Linux World of Warcraft client that they've shied away from releasing due to support concerns. Releasing an official client for one of their games on Linux will be their way of verifying whether it's worth it for them to continue doing so, and should it prove successful it could be the shot in the arm that Linux needs to become a viable platform for games developers to target.

Does this mean that I'll be switching over? Probably not, as I'm a Microsoft guy at heart and I know my current platform too well to just drop it for something else (even though I do have a lot of experience with Linux). I'm very interested to see how the Steam Box is going to be positioned, as it running Linux changes the idea I had in my head for it and makes Valve's previous comments all the more intriguing. Whilst 2013 might not be a blockbuster year for Linux gaming it is shaping up to be the turning point where it starts to become viable.

Rapid Domestication (or OMG Cute Foxes!).

I have this obsession with esoterica; things that are hard to find or track down trigger this thing in the back of my head that just won't go away until I find them. Most of the time it's pretty harmless stuff, usually only sending me down a flurry of Google searches, but sometimes it can drive me to apparent madness, like when I scoured eBay for hours looking for a copy of Uncharted 3 Explorer Edition when I found out they were no longer available through stores. One of the weirder times this desire for the esoteric hit me was back when I was researching dog breeds for a potential new puppy and I stumbled across something quite intriguing.

Back in the 1950s a Russian scientist by the name of Dmitry Belyaev began a breeding program with wild foxes. His aims were simple: he wanted to study the origins of domestication and gain insight into the differences between our dogs and their wild counterparts. For this experiment he selected the silver fox and began selectively breeding them for more domestic tendencies. The results were quite remarkable and within a few generations Belyaev had foxes that were nothing like their wild counterparts, even to the point of them developing different coats, curling their tails and behaving much more like your garden variety canine than anything else.

It didn’t take me long to track down a breeder that had them available (SibFox, who appears to have shut down) and the low low price of US$6000 seemed reasonable, at least compared to some other types of dogs. Australia’s quarantine laws and a concerned wife thankfully put this idea firmly out of the realms of plausibility but I still think that a domesticated fox would make for a pretty good pet.

I just couldn't take it out to my parents' farm, however.

Interestingly enough there's a lot of evidence to suggest that cats and dogs actually domesticated themselves, forgoing their wild behaviours in favour of living side by side with humans in order to increase their chances of survival. This has certainly worked well for them, with domesticated animal numbers far exceeding those of their wild brethren, and suffice to say it's a much easier existence for many of them. It's quite a recent phenomenon too, in evolutionary terms, as the first evidence of domesticated animals only dates back 9,000 years or so.

Pretty wild (ha!), isn’t it?