Long time readers will know how much this blog has struggled with its various incarnations over the past 4 years. Initially I ran it from home on a server that I was using for development purposes, so it ran inside a virtual machine that contained not one, but two database engines (MS-SQL for development and MySQL for the blog) all behind a tenuous 1.5Mbit upstream connection. This held up ok until I wanted to do anything fancy like put pictures on there (which would kill the connection for anything over 50kb) and it was relatively unstable, going down for days at a time since I couldn’t get a reliable remote connection to it. Since then I’ve churned my way through different virtual private servers (and all the issues they have) before landing on my current Burst.NET Ubuntu box which has been the best of the bunch so far.
Well, on the surface at least.
Since my blog has attained a steady amount of traffic it usually doesn’t take long for someone to pipe up when it goes down, especially if it happens during the day time in Australia. Since I now have remote access to the server I’m one command away from rebooting it should anything happen, and I’ve done so multiple times when an outage has come to my attention. However there’s a good 12 or so hours during the day when I’m not really paying attention to the blog due to being at home and/or asleep, and downtime during this period usually goes unnoticed until I try to log in in the morning. Since a good chunk of my audience is in the USA this can mean an awful lot of missed traffic, which isn’t the greatest way to start the day.
Now when I first set up the blog on this host there were a couple of teething issues (mostly due to my rusty Linux skills) but for probably 2 months afterwards everything ran without the slightest indication of an issue. Then every so often the blog would simply stop responding; the server would be up and everything else on it was running fine but try as I might I couldn’t get it to serve out a PHP page. Wanting to get it back up as quickly as I could I recycled the Apache service and it came back up instantly, so I figured it was just some transient error and went back to my everyday blogging routine. However it kept happening, usually at the most inopportune times, and so last weekend I sat down to find the root cause of the issue.
Turns out it’s WordPress itself.
The above screenshot shows the error pretty quickly: essentially Apache has reached the maximum number of clients it can serve and will start to reject users after that point. Whilst the causes of this are wide and varied the culprit can usually be traced back to some WordPress plugin or script that’s opening up connections and then not closing them properly. The best way to take care of this is to fix the script in question but since I have little interest in diving into the mess that is PHP I’ve simply upped the MaxClients setting, reduced the timeout period and scheduled an Apache restart to clear out anything that gets stuck open. All of these combined seem to be an effective solution in the meantime and once I feel up to the task of delving through all the code to find the offending script I’ll nip it in the bud for good.
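For the curious, the relevant knobs live in Apache’s configuration; the values below are illustrative of the kind of changes I mean rather than the exact numbers I settled on:

```apache
# Prefork MPM settings (typically in apache2.conf or httpd.conf)
<IfModule mpm_prefork_module>
    MaxClients          150   # raised so leaked connections don't exhaust the pool as quickly
    MaxRequestsPerChild 1000  # recycle worker processes so anything stuck eventually gets released
</IfModule>

Timeout          30           # down from the default of 300 so dead clients free their slot sooner
KeepAliveTimeout 5
```

The scheduled restart is just a nightly cron entry along the lines of `0 4 * * * /usr/sbin/apache2ctl graceful`, which recycles the workers without dropping in-flight requests.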
Apart from that little quirk though this iteration of the blog’s underlying infrastructure has been pretty fantastic, with all the plugins functioning the way I expect them to without me having to fiddle with web.config settings for hours on end. It’s also significantly faster, with page load times halved for dynamic pages and near instant when a page is served from cache. You could attribute this to the fact that the new server is a lot beefier than its predecessor but neither of them showed significant load for an extended period of time. I guess where I’m going with this is that if you’re going to host your own WordPress blog it’s just plain better on Linux, especially if you’ve better things to be doing (like, you know, blogging).
Have you heard of the term “Ladder Anxiety”? If you’ve ever played competitive 1 on 1 games you’ll know the feeling intimately: that sense of dread you get before you hit the find match button that builds on itself until you see the final score screen. When I played StarCraft 2 a lot I would get this all the time, to the point where if I was just a little bit cold my whole body would shudder violently until the nervous energy turned me into a raging furnace. I eventually learned what I could do to tame that wild beast but by far the best thing for it was simply not playing 1 v 1, as I found my stress levels were far lower when I was playing in a team. For StarCraft 2 this kind of defeats the point since it’s balanced for 1 v 1 but for other games, like my current addiction in DOTA 2, it’s par for the course.
For all of these games the in-built ranking system is usually very coarse, serving as an indication of where you fit in with the larger gaming populace and only giving solid rankings to the highest level players. The reasoning behind this is pretty simple, as anything more granular than that leads to some rather undesirable behaviour within the greater community. The ELO ratings that were used back in the original WarCraft 3 DOTA map were a good example of this as players would often use them as an excuse to force people into certain roles (your ELO is too low, you’re playing support), criticize them for not playing the way they think they should be playing or just simply be jerks for the sake of it. You might be thinking that this is all par for the course for something that’s on the Internet but the simple fact is that you never want to give jerks tools that enable them to be better jerks, especially if you can avoid it.
You can then imagine my reaction when I heard about the upcoming release of the DotaBuff Rating system. I first came across it when they had a poll up to determine whether it should be a widely available stat or something you can only see for yourself, and I was hopeful that the community that had struggled against the perils of the previous ELO system would make the right decision. Whilst the Reddit DOTA 2 players appeared to be on the right track the wider player base voted, by 2 to 1, to make it open to everyone. The backlash against that idea was strong enough for them to rethink their position, with them saying that they’d move it into a “paid only” feature. Whilst it’s debatable whether or not that was their plan all along the furore generated by the potential implementation of DBR caught the eyes of Valve and they decided to go nuclear on the situation.
In the latest patch to hit DOTA 2 an option was introduced into the game settings that allowed you to choose whether or not sites like DotaBuff would be allowed to view your match data. This option was disabled by default meaning that the vast majority of the data that DotaBuff had been collecting since its inception would no longer be available to it. Additionally it’s no longer possible to reconstruct the download link for the replay file meaning that the more in depth statistics are simply unavailable. People like me who are interested in their ongoing statistics would of course enable it again but as some of my recent games have shown I’m not in the majority of users. Whilst I might abhor the introduction of a rating that arguably made an elitist community worse it doesn’t bode well for the ancillary developer community that was trying to add value to one of Valve’s burgeoning ecosystems.
Now it’s easy to argue that it’s foolish to base your business around someone else’s, especially in this web driven age where API changes like this can spell death for your nascent company. However it’s also hard to ignore the fact that if you don’t do it someone else will and there’s every chance that they’ll see some level of success from it. DotaBuff is, to me at least, a great resource for personal statistics tracking and being able to compare myself to the wider world (but not the other way around) was invaluable. Valve, I feel, went too far in its reaction to the DBR situation and could easily have resolved it without resorting to nuclear level responses. Hopefully this is just an overcorrection and they can reach a happy middle ground, as in its current form the API is a shadow of its former self.
To be truthful the DOTA community has grown a lot since I used to play back in WarCraft 3 and whilst I wouldn’t want to poke the bear by giving everyone unfettered access to DBR I don’t believe it was particularly threatened by having it available privately. Sure it might be a bit more granular than Valve’s preferred system (searching the replays with your name and selecting the skill rating) but I’m sure that’s nothing that couldn’t have been fixed by a few friendly emails rather than a whole-of-game API limitation. There’s probably more to this story than what I’m seeing however and time will tell whether this change spells the end for stats tracking sites like DotaBuff.
Ever since the first console was released they have always been at arm’s length from the greater world of computing. Initially this was just a difference in inputs, as consoles were primarily games machines and thus did not require a fully fledged keyboard, but over time they grew into purpose built systems. This is something of a double edged sword as whilst a tightly controlled hardware platform allows developers to code against a set of specifications it also usually meant that every platform was unique, and that there was a learning curve for developers every time a new system came out. Sony was particularly guilty of this as the PlayStation 2 and 3 were both notoriously difficult to code for; the latter especially given its unique combination of linear coprocessors and a giant non-linear unit.
There was no real indication that this trend was going to stop either, as all of the current generation of consoles use a non-standard variant of some comparatively esoteric processor. Indeed the only console in recent memory to attempt to use a more standard processor, the original Xbox, was succeeded by a PowerPC driven Xbox 360, which would make you think that the current industry standard of x86 processors just wasn’t suited to the console environment. Taking into account that the WiiU came out with a PowerPC CPU it seemed logical that the next generation would continue this trend, but it seems there’s a sea change on the horizon.
Early last year rumours started circulating that the next generation PlayStation, codenamed Orbis, was going to be sporting an x86 based processor but the next generation Xbox, Durango, was most likely going to be continuing with a PowerPC CPU. As it turns out this isn’t the case and Durango will in fact be sporting an x86 chip (well, if you want to be pedantic it’s x86-64, or x64). This means it’s highly likely that code built on the Windows platform will be portable to Durango and makes the Xbox the launchpad for the final screen in Microsoft’s Three Screens idea. It also means that nearly all major gaming platforms will share the same coding base, which should make cross platform releases far easier than they have been.
News just in also reveals the specifications of the PlayStation 4 confirming the x86 rumours. It also brings with it some rather interesting news: AMD is looking to be the CPU/GPU manufacturer of choice for the next generation of consoles.
There’s no denying that AMD has had a rough couple of years, with their most recent quarter posting a net loss of $473 million. It’s not unique to them either, as Intel has been dealing with sliding revenue figures as the mobile sector heats up and demand for ARM based processors, which neither of the 2 big chip manufacturers provide, skyrockets. Indeed Intel has stated several times that they’re shifting their strategy to try and capture that sector of the market, with their most recent announcement being that they won’t be building motherboards any more. AMD seems to have lucked out in securing the CPU for the Orbis (and whilst I can’t find a definitive source it looks like their processor will be in Durango too) and the GPU for both of them, which will guarantee them a steady stream of income for quite a while to come. Whether or not this will be enough to reinvigorate the chip giant remains to be seen but there’s no denying that it’s a big win for them.
The end result, I believe, will be an extremely fast maturation of the development frameworks available for the next generation of consoles thanks to their x86 base. What this means is that we’re likely to see titles making the most of the hardware much sooner than we have on other platforms thanks to the ubiquity of the underlying architecture. This will be both a blessing and a curse as whilst the first couple of years will see some really impressive titles, past that point there might not be a whole lot of room for optimizations. This is ignoring the GPU of course, where there always seem to be better ways of doing things, but even that will be quickly outpaced by its newer brethren. Combine this with the availability of the SteamBox and we could see PCs making a comeback as the gaming platform of choice once the consoles start showing their age.
I’m sure everyone has heard of the idea of an unstoppable force meeting an immovable object. For anyone who’s interested in scientific principles it can be a pretty irritating thought experiment as you wrangle with definitions, principles and the limitations of your own knowledge of science. Personally I never really thought about it much past the point of thinking that they’d both be converted to pure energy (this makes the assumption they’re both physical objects with mass) but as it turns out there’s a much, much better explanation. One that makes me feel a little dumb for not researching it a little further:
The idea itself is in fact a paradox since the existence of one of the two parts of the equation means that the other simply cannot exist. If you have something that is immovable then it’s impossible for an unstoppable force to exist, and vice versa. Indeed diving into the semantics of it like the video does makes their existence even more problematic, even if we ignore the energy requirements and just go by the laws of physics. I have to say that the end result of them simply passing through each other was not something I would have expected, but then again I only did 6 months worth of physics at university.
When I first wrote about Planetary Resources early last year I was erring on the side of cautious optimism because back then there wasn’t a whole lot of information available regarding how they were actually going to achieve their goal. Indeed even their first goal of building and launching multiple space telescopes sounded like it was beyond the capabilities of even veteran players in this industry. Still the investors backing them weren’t the type to be taken for a ride so I figured they were worth keeping an eye on to see how they progressed towards their goal.
And boy have they ever:
The above video shows off one of their prototypes of the Arkyd-100 space based telescope. Now back when Planetary Resources first started talking about what they were going to do I wasn’t expecting something of this size. Indeed I don’t believe anyone has attempted to make a space based telescope that small before, as you’re usually trying to amp up your light gathering potential with a large mirror. Still despite the relatively small mirror size it should be quite capable of doing the imagery required to lead them to potentially mineable asteroids.
Their communications set up is also highly intriguing as traditional space communications require large dishes and costly receiving equipment back here on Earth. Planetary Resources are instead looking to use lasers for their deep space communications, an idea that I didn’t think would be possible. A quick bit of research turns up this document from NASA’s Jet Propulsion Lab which goes into some detail about the feasibility and, shockingly, it appears to be only an engineering challenge at this point. How long it will take to turn it into something usable remains to be seen but considering Planetary Resources are looking to launch within the next couple of years I’d hazard a guess that they’re already pretty close to getting it working.
Looking at all this you’d think I’d be ashamed of my initial scepticism but I’m not, I love it when people prove me wrong like this. Indeed the work that Planetary Resources are doing closely resembles that of the early days of SpaceX, a company which has gone on to achieve things that no other private company has done before. Given enough time it’s looking like Planetary Resources will be able to do the same and that gets me all kinds of excited.
The roguelike genre has always been on the periphery of my gaming world, sitting in the corner with its randomly generated levels promising me all sorts of wonders should I take the time to play through it. Of course it’s a fool’s gambit since the roguelike genre dictates that your path through the game will be a slave to your computer’s random number generator, forcing you to make the best of the situation you’ve been dealt. I think it’s this exact reason that I avoided the genre for so long; I’m not the kind of player who likes being out of control of a situation, especially when a wrong move means I won’t be able to reload and try again. You’d then think that FTL: Faster Than Light wouldn’t get a look in but it overcame the barrier by being in space and having several recommendations from friends.
FTL puts you in control of a small Federation ship that has intercepted a data packet from the rebel fleet that’s hell bent on taking your empire down. This data could prove invaluable in stopping them so it’s up to you to get back to your fleet in order to deliver this information. You’re a long way away however and the rebel fleet is hot on your heels, forcing you to venture through some sectors of space that you probably wouldn’t have gone through otherwise. Indeed space seems to be a rather hostile place as you’ll face many obstacles along your way and even upon reaching your final destination there will still be many challenges to overcome.
In typical roguelike fashion FTL eschews modern graphics, instead favouring pixel art styling for everything. As far as I can tell they’re actually vector based images as rendering them on my 1680 x 1050 screen didn’t give me the huge pixel blocks I usually get with titles like this, which is a pretty great achievement. The simple, clean art style also helps immensely with the game play as it’s much easier to distinguish everything on screen, something which can be crucial when you’re in the middle of a battle and clicking wildly. The UI elements are also straightforward and their functions clear, further adding to FTL’s overall usability. It really pleases me when a game manages to get the graphics and UI right without being too over the top as I can’t tell you how many times a bad interface has soured me on a whole game experience.
FTL’s game play is your run of the mill roguelike dungeon affair, with you moving from beacon to beacon, each of which is randomly generated and contains something like an event, an enemy ship or simply nothing. Depending on the choices you’ve made in what to upgrade, what kind of crew members you have and even what weapons you have equipped, the events (and the way they play out) will change, which means that no two play throughs will ever be alike. This is both a blessing and a curse of the genre as whilst you’ll never be playing the same game twice it does mean that you’ll often find yourself in situations you’ve never been in before, and should you make the wrong choices you’ll be starting all over again in short order.
For the most part you’ll spend your time fighting other ships with varying levels of weaponry, configurations and additional abilities that are sure to make your life far more difficult than it should be. Whilst the combat occurs in real time it’s still in essence turn based thanks to the time limits placed on all the actions you can take. In the beginning the scales are most certainly stacked in your favour as you have several times the hull of any enemy ship and can usually take them out with a well placed missile, leaving you to clean them up at your leisure. This doesn’t last particularly long however and you’ll soon find yourself waging a battle on several different fronts.
The combat system is actually quite detailed with many viable strategies available. The initial ship you’re given, The Kestrel, is a pretty typical “blast them until they stop moving” type of craft which is optimized for taking out their shields with a missile and then pummelling them with your lasers. Other ship configurations, which you unlock by completing certain achievements, focus on different ways of taking out the enemy. The first (and currently only) ship that I unlocked uses an ion cannon to disable enemy systems whilst a single drone wears them down. Others focus on boarding parties, where your crew is teleported to the other ship to wreak havoc, which usually requires careful micromanagement to pull off correctly. These are just the main types of combat as there’s a lot more variation if you include the different types of weapons and drones, each of which can have devastating effects if used correctly.
I spent most of my time on the Kestrel, opting to upgrade my shields initially to prevent most of the early hull damage whilst looking for some kind of weapon to give me the edge. I’d usually end up keeping one missile around in order to disable their shields so I could then unleash my other weapons, but I also had a lot of success with 2 lasers and 1 beam weapon which would usually let me drop their shields before doing a lot of sweeping damage. The issue with this was that it was something of a one trick pony, and direct hits to my shields or weapons systems usually left me rather vulnerable, so it was always a race to disable their weapons before they could do the same to me.
Now I know this is probably going to sound like I’m missing the point of the roguelike genre but the fact is that a good chunk of this game (I’d say about 50% or so) is pure, unadulterated luck. There were several times when, after my first jump in a new game, I’d find myself in an asteroid field or next to a sun about to go nova, which would do enormous amounts of hull damage to my ship before I could escape. This then put me on the back foot as I’d have to use my scrap for repairs rather than upgrades, which usually meant even more hull damage, and so the cycle goes. Sometimes it swung the other way, with FTL coughing up a weapon that was seriously overpowered for that point in the game, effectively enabling me to take down all sorts of foes without having to pay too much attention to strategy. Many will argue that this is part of the fun but I’ve got one story that I feel proves my point somewhat.
So I had gotten to the final stage with an amazing ship, a nearly full complement of crew and all the missiles and hull I could want. Not wanting to lose this game I hunted around for the save game files and copied them off (yeah, yeah, I know) before heading off towards the final mission. Upon reaching it I did pretty well but didn’t make it past the first phase, so I reloaded and tried again. I did this no less than 20 times and whilst I got the first stage down pat the second stage still eludes me. Without doing that I would’ve had to invest a lot more time to get back to that point, with no guarantee that I could get there with a similarly decked out ship. Essentially, if I was playing the game normally, I wouldn’t have had the same opportunity to learn the boss fight, which irritates me. I know a lot of people enjoy this kind of challenge but after a while the hour long build up to an inevitable demise started to wear on me.
Despite my misgivings with the roguelike genre I really did enjoy FTL: Faster Than Light for what it is. When I started off just getting around without dying was a challenge but later on it was easy for me to get to the final stand without too much hassle. Of course how I did from there was completely dependent on how much the RNG liked me that day but that didn’t stop me from trying time and time again. I’ve still yet to get past the second phase of the final boss fight but you can rest assured I’ll keep trying. I might not go the whole hog every time (I still have that game saved) but there is a certain satisfaction in playing from start to finish and I’m sure that’s what keeps everyone coming back.
FTL: Faster Than Light is available on PC, OSX and Linux right now for $10. Game was mostly played on the easy difficulty setting with around 7 hours total play time.
As always I’m not-so-secretly working on a side project (although I’ve kept its true nature a secret from most) which utilizes Windows Azure as the underlying platform. I’ve been working on it for the past 3 months or so and whilst it isn’t my first Azure application it is the first one that I’ve actually put into production. That means I’ve had to deal with all the issues associated with doing that, from building an error reporting framework to making code changes that have no effect in development but fix critical issues when the application is deployed. I’ve also come to the realisation that some of the architectural decisions I made, ones made with an eye cast towards future scalability, aren’t as sound as I first thought they were.
I’ve touched on some of the issues and considerations that come with Azure Tables previously but what I haven’t dug into is the reasons you would choose to use it. On the surface it looks like a stripped down version of a relational database, missing some features but making up for it by being an extremely cheap way of storing a whole lot of data. Figuring that my application was going to be huge some day (as all us developers do) I made the decision to use Azure Tables for everything. Sure querying the data was a little cumbersome but there were ways to code around that, and code around I did. The end solution does work as intended when deployed into production but there are some quirks which don’t sit well with me.
For starters querying data from Azure Tables on anything but the partition key and row key will force a table scan. Those familiar with NoSQL style databases will tell me that that’s the point: storage services like these are optimized for key based access and outside of that you’re better off using an old fashioned SQL database. I realised this when I was developing it, however the situations I had in mind fit in well with the partition/row key paradigm as often I’d need to get a whole partition, a single record or (and this is the killer) the entire table itself. Whilst Azure Tables might be great at the first 2 things it’s absolutely rubbish at the latter and this causes me no end of issues.
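To make those access patterns concrete, here’s a toy model in Python. This is emphatically not the Azure SDK (my actual code is .NET); the data, names and functions are made up purely to illustrate why the first two operations are cheap and the third is a scan:

```python
# Toy model of key based table storage: a dict of partitions,
# each of which is a dict keyed by row key. Made-up data.
table = {
    "2013-02": {"post-1": {"title": "FTL review", "views": 1200},
                "post-2": {"title": "Azure woes", "views": 800}},
    "2013-01": {"post-3": {"title": "Nexus 4", "views": 3000}},
}

def get_partition(partition_key):
    """Whole partition: one hash lookup, independent of table size."""
    return table[partition_key]

def get_entity(partition_key, row_key):
    """Single record: two hash lookups, also cheap."""
    return table[partition_key][row_key]

def query_by_property(prop, value):
    """Anything not keyed walks every partition and every row: a table scan."""
    return [e for part in table.values()
            for e in part.values() if e.get(prop) == value]
```

The point is that the cost of `query_by_property` grows with the size of the whole table, no matter how selective the filter is.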
In the beginning I, like most developers, simply developed something that worked. This included a couple of calls along the lines of “get all the records in this table then do something with each of them”. This worked well up until I started getting hundreds of thousands of rows needing to be returned, which often ended with the query being killed long before it could complete. Frustrated, I implemented a solution that attempted to iterate over all the records in the table by requesting them and then following the continuation tokens as they were given to me. This kind of worked, although anyone who’s worked with Azure and LINQ will tell you that I reinvented the wheel by forgoing the .AsTableServiceQuery() method which does all that for you. Indeed the end result was essentially the same and the only way around it was to put in some manual retry logic (in addition to the regular RetryPolicy). This works, but retrieving/iterating over 800,000 records takes some 5 hours to complete, unacceptable when I can do the same thing on my home PC in a minute or two.
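The continuation token dance looks roughly like this. Again it’s a Python sketch against a fake paged service (my real code is C#), so `fake_query`, the page size and the retry counts are all illustrative:

```python
import time

PAGE_SIZE = 1000  # Azure Tables returns at most 1,000 entities per request

def fake_query(records, continuation=0):
    """Stand-in for one table service request: a page of results plus a token."""
    page = records[continuation:continuation + PAGE_SIZE]
    more = continuation + PAGE_SIZE
    return page, (more if more < len(records) else None)

def iterate_all(records, max_retries=3):
    """Follow continuation tokens until exhausted, retrying each page on failure."""
    token, out = 0, []
    while token is not None:
        for attempt in range(max_retries):
            try:
                page, next_token = fake_query(records, token)
                break
            except IOError:                # stands in for a timed-out query
                time.sleep(2 ** attempt)   # simple exponential back-off
        else:
            raise RuntimeError("page at token %s kept failing" % token)
        out.extend(page)
        token = next_token
    return out
```

Each trip through the `while` loop is a separate round trip to the service, which is a big part of why iterating a large table is so slow: 800,000 records at 1,000 per request is 800 sequential round trips before any retries.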
It’s not a limitation of the instances I’m using either, as the Azure SQL database I use for another part of the application, working on a subset of the data but the same number of records, is able to return in a fraction of the time. Indeed the issue seems to come from the fact that Azure Tables lacks the ability to iterate and re-runs the giant query every time I request the next 1,000 records. This often runs into the execution time limit, which terminates all connections from my instance to the storage, causing a flurry of errors. The solution seems clear though: I need to move off Azure Tables and onto Azure SQL.
Realistically I should’ve realised this a lot sooner as there are numerous queries I make on things other than the partition and row keys which are critical to the way my application functions. This comes with its own challenges, as scaling out the application becomes a lot harder, but honestly I’m kidding myself by thinking I’ll need that level of scalability any time soon, especially when I can simply move database tables around between Azure instances to get the required performance. Once that’s not enough I’ll finally try to understand SQL Federations properly and that will sort it for good.
Australia has one of the best education systems available, as evidenced by our top 10 rankings for literacy, science and mathematics as well as our overall education index of 0.993, tying us for first place with countries like Denmark and Finland. While our system isn’t exactly unique in its implementation I do believe schemes like HECS-HELP are one of the main reasons that the majority of Australians now pursue tertiary education and whilst this might bring about other issues (like a lack of people in trades) it’s clear that the benefits far outweigh the costs. Indeed as someone who couldn’t have afforded university without the help of the government and now has a great career to show for it I’m something of a testament to that idea.
Recently however there’s been some criticism of the HECS-HELP system, mostly focused on the amount of student debt owing to the government and the sizeable chunk of that which is never expected to be repaid:
The Grattan Institute’s annual Mapping Australian Higher Education report finds that students and former students have accumulated HECS-HELP debts of $26.3 billion.
This is about an extra $10 billion owing, in real terms, than in 2007.
The interest bill on the income-contingent loan scheme, formerly known as HECS, is nearly $600 million a year, the institute estimates.
And it says HELP debt not expected to be repaid rose to $6.2 billion in 2012.
The report makes for some intriguing reading and does indeed state that a good 25% or so of the current student debt is likely to never be repaid. The reasons behind it are interesting though, as whilst some would have you think that it’s due to students skipping out on their debts in one way or another (ala Liberal MP Steve Ciobo) it’s in fact primarily due to students either dying or moving overseas. Now there’s not a whole lot we can do about the former (except maybe investing more in the health care sector) but the latter is a problem that’s been around for decades and I’ve yet to see a solution proposed, either from the government or the private sector.
Australian graduates, especially in some sectors, suffer from a distinct lack of choice when it comes to finally finding a career once they’re done with their university studies. Whilst I might have managed to make a decent career without looking too far you have to appreciate that my degree isn’t in IT, it’s in engineering, and such is the case for many graduates who try to find something in their chosen path. Usually they can get close but the chances of landing an opportunity directly in their field of study are pretty slim and that leads them to look overseas. I myself did exactly that not too long after I graduated and was pretty staggered at the number of opportunities available abroad that I was more than qualified for.
Another point the report makes is that student debt is seemingly skyrocketing compared to decades prior. The graph above demonstrates that quite clearly but it doesn’t give you any indication as to why this is happening. For starters Australia’s population has increased by about 5.8 million since 1989, or about 35%. At the same time participation in tertiary education has well over doubled, with the vast majority of Australians now having some form of tertiary qualification and 27% carrying a bachelor’s degree or higher. Essentially there’s been a major cultural shift over the past 2 decades towards pursuing an education through universities rather than other avenues, and this is what is responsible for the increase we’ve seen. This isn’t exactly an issue considering our GDP has quadrupled in the same time frame and, whilst I won’t say there’s a causative link there, I’d say you’d be hard pressed to uncouple higher education rates from improved GDP figures.
Realistically unpaid student debts aren’t much of a problem for the Australian government considering the wide reaching benefits that our high quality and freely available education system gives us. We still need to do something about our best and brightest moving overseas to greener pastures, but it’s clear that the economic benefits of free education for anyone who wants it vastly outweigh the cost of providing it. Even if we were to erase all student debt in one year it would still be only a few percent of the total budget, something that could be easily done should there be any burning need for it to happen. There isn’t, of course, since the cost of servicing that debt is so low (comparatively) and there are much better things to spend that money on.
I’ve been using my Nokia Lumia 900 for some time now and whilst it’s a solid handset Windows Phone 7 is starting to feel pretty old hat at this point, especially with its Windows Phone 8 successor out in the Lumia 920. However I had made the decision to go back to Android due to the application ecosystem there. Don’t get me wrong: for most people Windows Phone has pretty much everything you need, but for someone like me who revels in doing all sorts of esoteric things with his phone (like replicating iCloud levels of functionality, but better) Android is just the platform for me. With that in mind I had been searching for a handset that would suit me and I, like many others, found it in the Nexus 4.
Spec-wise it’s a pretty comparable phone to everything else out there, with the only glaring technical fault being the lack of a proper 4G modem. Still, its big screen, highly capable processor and above all stock Android experience with updates that come direct from Google make up for that in spades. The price too is pretty amazing, as I paid well over 50% more for my Galaxy S2 back in the day. So it was many months ago that I had resigned myself to wait for the eventual release of the Nexus 4 so I could make the transition back to the Android platform and all the goodness that would come along with it.
Unfortunately for me the phone went on sale at some ludicrous time for us Australians, so I wasn’t awake for the initial run of them and missed my chance at getting in on the first bunch. I wasn’t particularly worried though as they had a mailing list I could join for when stock would be available again, and I figured that after the initial rush it wouldn’t be too hard to get my hands on one. However the stock sold out so quickly that by the time I checked my email and found they were available, they were gone again, leaving me without the opportunity to purchase one. Thinking that there was no way Google would be out of stock for long (they never were for previous Nexus phones) I resigned myself to wait until it became available again, or at least until a pre-order system came up.
Despite stories I hear of handsets being available at various times and tales of people managing to order one, I have not once seen a screen that differs from the one shown above. Nearly every day for the past 2 months I’ve been checking the Nexus site in the hopes that they’d become available, but not once have I had the chance to purchase one. Now Google and LG have been pointing fingers in both directions as to who is to blame for this, but in the end that doesn’t matter because both of them are losing more and more customers the longer these supply issues continue. It doesn’t help when they announce that AT&T will start stocking them this month, which has to mean a good portion of inventory was diverted from web sales to go to them instead. That doesn’t build any good will for Google in my mind, especially when I’ve been wanting to give them my money for well over 2 months now.
And with that in mind I think I’m done waiting for it.
For the price the Nexus 4 looked like a great device but time hasn’t made the specifications look any better, especially considering the bevy of super powerful smartphones that debuted at CES not too long ago. I, along with many other potential Nexus 4 buyers, would have gladly snapped up one of their handsets long ago if it was available to us and the next generation wouldn’t have got much of a look in. However due to the major delays I’m now no longer considering the Nexus 4 viable when I might only be a month or two away from owning something like the ZTE Grand S which boasts better specifications all round and is probably the thinnest handset you’ll find. Sure I’ll lose the completely stock experience and direct updates from Google but after waiting for so long the damage has been done and I need to find myself a better suitor.
Way back in the day, before I had a decent Internet connection and Steam, I was something of a handheld gaming fiend. I can remember my GameBoy fondly, it was a giant device in my little hands and I had but one game to play on it: Nigel Mansell’s World Championship Racing. I kind of skipped the next couple generations of handhelds though (unless you count playing the demo unit at work) but I did have a PSP for a long time, one that was modded to heck and back. I even bought my wife a Nintendo DS as a present which is still loved to this day but more for its utility value than a gaming platform. The end result of this is that while I was aware of Scribblenauts I never really bothered to check it out. That is until it became available on Steam late last year.
Scribblenauts Unlimited puts you in control of Maxwell, a rooster-helmeted kid who’s been bestowed with the incredible power of being able to bring any object to life through the use of his magical notebook. Like any kid with unlimited power Maxwell, accompanied by his sister Lily, goes a bit crazy and starts creating all sorts of mischief. That is until an old man curses both of them and tells them that it can only be lifted should they use their powers to help out others. Strangely enough Maxwell seems unaffected by this curse but his sister starts turning to stone. Therefore it’s up to Maxwell to do good deeds in order to collect Starites, the only things capable of healing Lily.
When I first started up Scribblenauts Unlimited its art style reminded me heavily of the flash games of yesteryear, as it has similar colour palettes and animation styles. This seems to be purely coincidental however as, whilst 5th Cell has a history of mobile/casual game development, they haven’t made a single flash game, although some of their recent titles were apparently inspired by them. The simplistic art style combines well with their sound design, which incorporates a lot of subtle background music, universal voice acting (à la The Sims) and hilarious sound effects.
Scribblenauts Unlimited is a 2D puzzler with a rather unique game play mechanic: you can create nearly any object you can think of by simply typing it out. So unlike most puzzlers, where you either have to get a combination of items/switches/things right or search around for hours looking for the appropriate key or whatever, you’re instead presented with some kind of problem by someone (or something) walking around and then it’s up to you to come up with the most appropriate item to solve it. Since you have literally tens of thousands of items to choose from it’s usually easy to find something to get the job done, but that’s not where the fun comes from.
One of the situations I can remember was a group of kids who had lost their ball in an old man’s yard and wanted me to get it back for them. Easy enough: I just walked over there to pick it up, however the old man intervened and took the ball away from me. It’s obvious then that they want you to distract the old man so you can get the ball, but I figured there was a better way. I gave myself a grappling hook, shot the ball and dragged it back to me, completing the mission. This had the hilarious consequence of also triggering the old man to retrieve it, which confused the physics engine and sent me flying across the map.
You can also add adjectives to almost any object in the world, which can significantly change their properties. Whilst I usually used this for hilarious effect, like creating a Bacon Narwhal or a Flying Supersonic Armoured Stegosaurus, it also made some of the puzzles incredibly easy. For instance when a kid was getting attacked by a bully all I did to solve the puzzle was add “friendly” as an adjective to said bully and he turned from giver of wedgies to nerd lover. It also ends up making some rather weird objects; if you write “psychic glasses” you get a pair of glasses with a crystal ball stuck on the front.
While the default Maxwell is more than capable of completing all the puzzles you’ll probably find yourself wanting to get around levels faster, especially as they start to increase in size. Most of the time I’d draw myself up a Fast Jetpack (adding fast to it doubles the speed) in order to get around better although I graduated to my supersonic flying stegosaurus after I realised they had dinosaurs coded in. It would be nice if you could use the inventory system to set up a default set of items for you as any time you reset the level you’ll lose everything, even the costumes Maxwell is wearing. In fact the whole inventory seems pointless as it’s just as fast to type things out as it is to use them from there. That’s probably a hangover from its DS origins, though.
From a technical perspective Scribblenauts Unlimited is solid, with the only real problems coming from the physics engine getting into weird edge cases like the aforementioned grappling hook incident. In terms of the game play though it does get repetitive quickly: whilst you can make all sorts of hilarious combinations, it starts to wear thin after you’ve made several things that are on fire or made Cthulhu fight Zeus for the 100th time. It does play well in small bursts though which, again, I think is due to its DS origins and its concurrent release on the WiiU, which is arguably more aimed towards casual players.
There’s also a rudimentary story but it’s really only that, serving to provide a kind of backdrop with a moral that can be summed up as “be good to one another”. I’m not saying it’s bad, just incredibly simplistic, and considering it really only comes into play at the start and the end there’s no character development at all. Of course this is a game that sacrifices story in favour of game play so I won’t judge it too harshly on that front, but it’s something that bears mentioning.
Scribblenauts Unlimited is one of those games where the emergent game play is what makes it so much fun, and the puzzles just seem to be catalysts to bring on all sorts of unintended behaviour. It really is a lot of fun to solve puzzles in completely unintuitive ways, especially when the solution simply makes no sense. It does start to wear thin quite quickly, however, and would probably be far better positioned as an iOS title since it lends itself so well to short bursts of game play. Still it’s technically sound, very enjoyable and both visually and aurally pleasing, so it’s probably worth a look in if you’re after a break from more traditional games.
Scribblenauts Unlimited is available on PC, WiiU and Nintendo 3DS right now for $49.99, $78 and $58 respectively. Game was played on the PC with 5.5 hours total playtime and 50% of the achievements unlocked.