Monthly Archives: February 2013

Why You Shouldn’t Invest In BitCoins.

Much like my stance on Instagram, I’ve seemingly been at odds with the BitCoin community ever since I penned my first post on it almost 2 years ago. The angst seems to stem primarily from the fact that I lumped it in with Ponzi schemes thanks to its early adopter favouritism and reliance on outside wealth injection. After the first crash, however, BitCoins started to show some stability and their intended function started to become their primary use. Indeed the amount of investment in the BitCoin ecosystem has sky-rocketed in the past year or so and this has led to a period of much milder growth that was far more sustainable than its previous spikes.

It was for that reason that I held my tongue on the latest round of price volatility, as I assumed it was just the market recovering from the shock of the Pirateat40 scheme unravelling. That particular incident had all the makings of another price crash but it was obvious that, whilst a great deal of value was lost, it wasn’t enough to make a lasting impression on the economy and it soon recovered to a healthy percentage of its previous value. The last month however has started to show some worrying trends that hark back to the speculative bubble.

BitCoin Price Chart

If you zoom in on either of those 2 ramps the gradients are frighteningly similar, although this time the price jump is from $15 to $25 rather than $3 to $10. Whilst the jump might not be as severe as it was before (a roughly 66% rise rather than the more-than-threefold jump of the last bubble) it’s still cause for some concern given the time frame in which it has happened. When the value jumps up this fast it encourages people to keep their BitCoins rather than spending them and attracts those who are looking to make a quick return. This puts even more upward pressure on the price which eventually leads to the kind of value crash that happened back in 2011.

Others would disagree with me however, saying that it’s actually a great time to invest in BitCoins. The reasons Anzaldi gives for wanting you to invest in BitCoins don’t make a whole lot of sense, however, as he doesn’t believe this round of growth is unsustainable (even whilst admitting that the only other things that give this kind of ROI are scams) and credits the reward halving, coupled with the deployment of ASIC chips, for this stratospheric, real growth. The fact of the matter is that neither of these really has any influence over the current market rate for BitCoins; it all comes down to what people are willing to pay for them.

In the lead-up to the previous crash BitCoins had already experienced some pretty crazy growth, going from prices measured in cents to dollars in the space of a couple of months. This immediately led to a flood of people entering the market who were seeking fast returns and had no intention of using BitCoins for their intended purpose. This current round of growth feels eerily familiar and, with people seeing rapid growth, it’s highly likely that those same speculators will come back. It’s those speculators that are driving the price of BitCoins up, not the factors that Anzaldi claims. If it were his factors, the price would have begun this current upward trend back in November (it did go up, but not like this, and stabilized shortly after) and the introduction of ASICs is far more likely to flood the market with more coins as hardware investors look to recoup some of their investment rather than holding onto their coins for the long haul.

This kind of wild volatility isn’t helping BitCoin’s intended use as a universal currency free of any central agency. If this growth spurt leads to a new stable equilibrium then no harm, no foul, but it really does look like history repeating itself. I’m hopeful that the market is smart enough to realise this and not get caught up in another buy-and-hold spree, although it has managed to do exactly that in the past. As long as we remember that BitCoin’s worth is derived from its liquidity and not its value then these kinds of knife-edge situations can be avoided.

My Stance on Instagram Explained.

Ho boy, rarely have I copped more flak for a post, both online and offline, than my piece early last year on how the general population of Instagram made me feel. In all honesty, whilst I knew there were a few people it would piss off (which was one of the reasons it sat in my drafts folder for ages), I still felt like I had some valid points to make based on my observations of the Instagram user base at large. Many people took offence to this, arguing points ranging from “Why should that matter to you anyway?” to “You’re using it wrong, there’s great communities on there”. I was hoping that the comments section would have been the end of all of it but late last week the topic came up again and I lost an hour in the ensuing debate, so I figured it was time I made my position on this whole matter more clear.

FR0001

I recognise that for every example I can dredge up of someone posting a horribly framed and filtered picture of their breakfast someone else can just as easily show me something like this. My criticism wasn’t levelled at people who use the service in this fashion but, reading back over the post and the ensuing comments, I never really made that entirely clear, so mea culpa on that one. However I don’t feel that the general thrust of my argument has been invalidated by that as many users agree that the vast majority of stuff on Instagram isn’t particularly great. This isn’t unique to Instagram, however, as any user generated content site suffers from Sturgeon’s Law and honestly the mentality of users on said sites really doesn’t vary that much; Instagram just hit closer to home thanks to my interest in this particular area.

I’ve also had people try to bring me back into the Instagram fold in order to convince me that there’s something in the platform for me. Now whilst I wasn’t an active user for quite some time I did have the application installed on my Galaxy S2 for the better part of the year, mostly so I could view pictures linked to me on Twitter without having to use Instagram’s then rather shitty web interface. From time to time I’d look at pictures on there and see some genuinely good ones, but not often enough to convince me that it was worth investing my time to better my feed by subscribing to said users. The fact of the matter is I already have many other avenues for discovering photographers that I like, ones that share a critical characteristic with me.

Our preferred platform of choice.

For me the undisputed platform of choice is my DSLR. I’ve tried many other camera systems, from high end point and shoots to film SLRs and yes, multitudes of cameras in phones, but in the end I always come back to my DSLR. The reason is the amount of control and influence I have over the final image, something which I struggle with on any other platform. It may sound weird if you prefer the simplicity that’s granted to you by camera phones (something which I do understand) but I find it a lot easier to take pictures on my DSLR, to the point where using anything else just frustrates me. I think that’s because I know that, whilst I can do a lot of things in post should I so desire, there are some things I simply can’t unless I’m using my preferred platform.

This is somewhat at odds with the Instagram community which, as far as I’m aware, doesn’t take particularly kindly to those who take photos outside of their phone and then upload them via the service. If I were going to use Instagram again that’s the way I would use it, but I’d rather not antagonize the community further by breaking the current social norm on there. For now I really only use Facebook to distribute pictures (mostly because my recent photographic endeavours have involved friends’ weddings) but I’ve been a fan of Flickr and 500px for a long time now as they seem to be more my kind of people.

I’ve come to realise that even my beloved DSLR community isn’t immune to this kind of malarkey either, as there are far, far too many people walking around with a $1000+ camera, the shocking kit lens on it, shooting in auto and thinking that they’re the next Don McCullin. The criticisms I’ve levelled at Instagram apply to them as well, although they’ve yet to congregate onto a platform that’s as ubiquitous as Instagram has become.

After the backlash I received I set myself a challenge to try and use my camera phone to produce pictures that I’d be proud to share and the above is probably one of the dozens I’ve taken that’s anywhere near what I wanted it to be. Six months of trying have shown me there’s definitely a lot of effort required to create good pictures, arguably the same amount as required when using a DSLR, but I still feel like I’m constrained by my phone. Maybe that’s a personal thing, something that I could overcome with more time and dedication, but in saying that I’d propose the same thing to all the Instagrammers out there. Borrow a friend’s DSLR and see the world from our side. Maybe you’ll come away with an appreciation for the technology that helped give birth to the platform you so love today.

3 Tips on Improving Azure Table Storage Performance and Reliability.

If you’re a developer like me you’ve likely got a set of expectations about the way you handle data. Most likely they all have their roots in the object-oriented/relational paradigm, meaning that you’d expect to be able to get some insight into your data by running a few queries against it or just looking at the table, possibly sorting it to find something out. The day you decide to try out something like Azure Table storage, however, you’ll find that these tools simply aren’t available to you any more due to the nature of the service. It’s at this point where, if you’re like me, you’ll get a little nervous as your data can end up feeling like something of a black box.

A while back I posted about how I was over-thinking the scalability of my Azure application and how I was about to make the move to Azure SQL. That’s been my task for the past 3 weeks or so and what started out as the relatively simple job of moving data from one storage mechanism to another has turned into a herculean task that has seen me dive deeper into both Azure Tables and SQL than I have ever done previously. Along the way I’ve found out a few things that, whilst not changing my mind about the migration away from Azure Tables, certainly would have made my life a whole bunch easier had I known about them.

1. If you need to query all the records in an Azure table, do it partition by partition.

The not-so-fun thing about Azure Tables is that, unless you’re keeping track of your data in your application, there are no real metrics you can dredge up to give you some idea of what you’ve actually got. For me this meant that I had one table that I knew the count of (due to some background processing I do using that table) but two others for which I had absolutely no idea how much data they actually contained. Estimates using my development database led me to believe there was an order of magnitude more data in there than I thought, which in turn led me to the conclusion that using .AsTableServiceQuery() to return the whole table was doomed from the start.

However Azure Tables isn’t too bad at returning an entire partition’s worth of data, even if the records number in the tens or hundreds of thousands. Sure, the query time goes up linearly with how many records you’ve got (as Azure Tables will only return a maximum of 1000 records at a time) but if they’re all within the same partition you avoid the troublesome table scan which dramatically affects the performance of the query, sometimes to the point of it getting cancelled, something that isn’t handled by the default RetryPolicy framework. If you need all the data in the entire table you can query each partition in turn, dump the results into a list inside your application and then run your query over that.
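As a rough illustration, here’s what that partition-by-partition approach looks like with the 2013-era Microsoft.WindowsAzure.StorageClient library; the entity type, table name and partition keys below are placeholders for whatever your application actually uses:

```csharp
using System.Collections.Generic;
using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// Illustrative entity; substitute your own table entity type.
public class MyEntity : TableServiceEntity
{
    public string Payload { get; set; }
}

public static class PartitionDumper
{
    public static List<MyEntity> GetWholeTable(CloudStorageAccount account, IEnumerable<string> partitionKeys)
    {
        var context = account.CreateCloudTableClient().GetDataServiceContext();
        var results = new List<MyEntity>();

        foreach (var partitionKey in partitionKeys)
        {
            // Pinning the query to a single PartitionKey keeps it inside one
            // partition and avoids the table scan; AsTableServiceQuery() then
            // follows the 1000-record continuation tokens for us.
            var query = context.CreateQuery<MyEntity>("MyTable")
                               .Where(e => e.PartitionKey == partitionKey)
                               .AsTableServiceQuery();

            results.AddRange(query.Execute());
        }

        return results;
    }
}
```

Once everything is in that list you can run whatever LINQ-to-objects query you like over it without touching the table service again.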

2. Optimize your context for querying or updating/inserting records.

Unbeknownst to me the TableServiceContext class has quite a few configuration options available that allow you to change the way the context behaves. The vast majority of errors I was experiencing came from my background processor, which primarily dealt with reading data without making any modifications to the records. If your application fits that pattern then it’s best to set the context’s MergeOption to MergeOption.NoTracking, as this means the context won’t attempt to track the entities.

If you have multiple threads running or queries that return large numbers of records this can lead to a rather large improvement in performance, as the context doesn’t have to track any changes to them and the garbage collector can free up these objects even if you use the context for another query. Of course this means that if you do need to make any changes you’ll have to use a tracking context and attach the entity in question to it, but you’re probably doing that already. Or at least you should be.
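A minimal sketch of both configurations, reusing the illustrative MyEntity type and table name from the snippet above:

```csharp
using System.Data.Services.Client;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public static class ContextHelper
{
    // Read-heavy work: don't track entities, so large result sets can be
    // garbage collected as soon as you've finished with them.
    public static TableServiceContext CreateReadOnlyContext(CloudStorageAccount account)
    {
        var context = account.CreateCloudTableClient().GetDataServiceContext();
        context.MergeOption = MergeOption.NoTracking;
        return context;
    }

    // When an untracked entity does need changing, attach it to a tracking
    // context, mark it as updated and save.
    public static void UpdateEntity(CloudStorageAccount account, MyEntity entity)
    {
        var context = account.CreateCloudTableClient().GetDataServiceContext();
        context.AttachTo("MyTable", entity, "*"); // "*" skips the ETag check
        context.UpdateObject(entity);
        context.SaveChangesWithRetries();
    }
}
```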

3. Modify your web.config or app.config file to dramatically improve performance and reliability.

For some unknown reason the default number of HTTP connections that a Windows Azure application can make to a single endpoint (although I get the feeling this affects all applications built on the .NET Framework) is set to 2. Yes, just 2. This then manifests itself as all sorts of crazy errors that don’t make a whole bunch of sense, like “the underlying connection was closed”, when you try to make more than 2 requests at any one time (which includes queries to Azure Tables). The maximum number of connections you should specify depends on the size of the instance you’re using, but Microsoft has a helpful guide on how to set this and other settings in order to make the most of it.
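For what it’s worth, here’s the kind of tweak I mean: a sketch that raises the limit in code at start-up (the value of 12 is purely illustrative, so size it to your instance as per Microsoft’s guidance), with the equivalent web.config element noted in a comment:

```csharp
using System.Net;

public static class HttpTuning
{
    public static void Apply()
    {
        // The .NET default of 2 concurrent connections per endpoint is what
        // produces the "underlying connection was closed" style errors under load.
        ServicePointManager.DefaultConnectionLimit = 12; // illustrative value

        // Equivalent web.config / app.config setting:
        // <system.net>
        //   <connectionManagement>
        //     <add address="*" maxconnection="12" />
        //   </connectionManagement>
        // </system.net>
    }
}
```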

Additionally some of the guys at Microsoft have collected a bunch of tips for improving the performance of Azure Tables in various circumstances. I’ve cherry picked out the best ones, which I’ve confirmed have worked wonders for me, however there are a fair few more in there that might be of use to you, especially if you’re looking to get every performance edge you can. Many of them are circumstantial and some require you to plan out your storage architecture in advance (so not something that can be easily retrofitted into an existing app) but since the others have worked I’d hazard a guess that they would too.

I might not be making use of some of these tips now that my application is moving to SQL and TOPAZ but if I can save anyone the trouble I went through trying to sort through all those esoteric errors I can at least say it was worth it. Some of these tips are just good to know regardless of the platform you’re on (like the default HTTP connection limit) and should be incorporated into your application as soon as it’s feasible. I’ve yet to get all my data into production as it’s still migrating, but I get the feeling I might go on another path of discovery with Azure SQL in the not too distant future and I’ll be sure to share my tips for it then.

Visual Representation of The Effects of Herd Immunity (Or Anti-Vaxxers: Listen Up)

Arguing with facts on your side can sometimes feel like a Sisyphean task, especially on the Internet. For the most part when I claim something on this blog I try to back it up with reputable information sources if I haven’t done the research myself, and if I’m talking completely out my ass I try to make that known so you can take that information with the required grain of salt. However when people comment on here I feel obliged to reply to them, even if what they’re saying has no basis in any kind of fact or reality. This can feel like a form of asymmetric warfare at times as the amount of time taken to disprove something is usually an order of magnitude more than what it took to write it in the first place.

Now I don’t usually like to pick on people who make comments here (if you’ve taken the time to post here I feel it’s better to respond to you directly on the post) but some of them simply demand more attention than I’ve already given them. The one I’m thinking of in particular is this comment, where they claim that herd immunity has been debunked, something that’s never been brought forth in any research paper that I’ve been able to track down. As far as I can tell it all comes down to the opinion of one Dr. Blaylock, whose opinions have always been radically different from the scientific norm. Nor is he some credible scientific dissenter, as many of his claims have been thoroughly debunked by other research, yet the herd immunity claim seems to persist.

Herd Immunity Demonstration


Whilst it would be all well and good for me to simply link to research papers which showcase this fact quite well I thought it’d be better to point to something that demonstrates the point visually. The picture above is from this simulation tool which shows what happens when a disease moves through a population. The first couple of scenarios are interesting for getting a feel for how an uncontrolled infection can spread even if only a single person is infected. The latter ones deal with some real life situations and demonstrate quite aptly why herd immunity works and why we’ve started to see small epidemics in isolated populations where people don’t vaccinate their children.

Probably the most shocking revelation I got from this simulation was the existence of Waldorf schools, whose official stance on vaccinations is “we have no official stance” but which then immediately go on to recommend parents don’t vaccinate their kids against a wide spectrum of diseases. Apart from the giant hypocrisy of saying one thing whilst encouraging the other, this kind of behaviour is inherently dangerous because it means there’s a cluster of unvaccinated people in constant contact with one another, a hotbed for a potential epidemic. This isn’t just a theoretical claim either: it’s already happened once and there was potential for another outbreak to occur due to the incredibly low vaccination rate. Considering that doesn’t happen anywhere else in the world where vaccination rates are above a certain threshold it’s a timely reminder that herd immunity is real and when it’s broken the consequences can be devastating.

I would go on but I think I’m preaching to the choir here; whilst the comments I get disagreeing with me outnumber those that agree, I know that if that ratio reflected reality we humans would be in a far worse state, health-wise, than we are today. The fact of the matter is that herd immunity is real and works beautifully for protecting those precious few who cannot be vaccinated for one reason or another. Failing to vaccinate is not only a bad personal decision, it also puts others at risk and that’s the only reason I need to support the current standard of mandated vaccinations.

The Cave: It Seems That We All Can’t Get Along.

I missed the boat on many of Tim Schafer’s games. Whilst I was aware of the titles that rocketed him to game developer stardom (Monkey Island, Maniac Mansion and Psychonauts) I never ended up seeking them out, even more recently when I’ve been told I have to play them. You can probably attribute that to the fact that many of my friends had Apple IIs or other Apple computers and as such weren’t able to share games with me, the primary one being the original Monkey Island series. Still, his games seem to have something of a following and if the Kickstarter for the Double Fine Adventure was anything to go by I figured their latest release, The Cave, would be worth playing.

The Cave Screenshot Wallpaper Title Screen

Upon starting up The Cave you’ll be greeted by a smooth talking narrator who introduces himself as the cave you’re about to dive into, something we’re told just to go along with. After a short scene-setting you’re then introduced to the 7 playable characters that you can choose to bring with you on the journey. They are (in no particular order): The Knight, The Adventurer, The Monk, The Twins, The Time Traveler, The Scientist and The Hillbilly. Each of them has their own little story which you’ll dive into as you venture deep into the cave, revealing their troublesome past and hopefully working towards making their present a little better.

The Cave has gone for a stylized 2.5D environment, locking your movement to the traditional 2D platformer style whilst using 3D models for everything on screen. Typically heavy stylization goes hand in hand with simplicity (as the choice to heavily stylize is usually a trade-off for better performance), however The Cave’s various environments are drenched in detail with modern lighting effects, particle systems and intricate set pieces. All put together it works very well, with each of the various sections of the cave having its own distinct feeling, especially the unique character rooms.

The Cave Screenshot Wallpaper Limited Liability Waiver

At the beginning you’re shown the group of 7 characters and you get to choose 3 of them to go along for the ride. The choice is arbitrary in the sense that no matter who you end up choosing you will be able to make it through to the end. Your choice of characters only affects the path you take to get there, although there are some sections which might go a bit quicker if you choose certain characters over others. In the end though, due to the unfortunate choice of 7 characters rather than say 6 or 9, you’ll have to play the game through a full 3 times in order to see all of the characters’ stories, if that’s of interest to you.

The Cave is your traditional puzzler/platformer, making you jump from platform to platform in order to find the right items to use in the right place or to pull various levers in order to progress to the next section. The twist comes from each of the characters that you choose to take on your journey as each of them has some kind of special ability that can be used to solve the puzzles. Now for the most part these abilities really only come into play during the character’s unique section of the cave but there are times during the intervening puzzles where they might come in handy. The Knight for instance can become completely invulnerable, which is kind of handy when you want to fall off ledges in order to descend quickly.

The Cave Screenshot Wallpaper Excalibur

Thankfully there’s no real inventory to speak of so you won’t spend your time hoarding dozens of items in the hope you’ll need to use them. Instead, in The Cave each of your 3 characters can only hold a single item at a time. Whilst there are some puzzles that require all of your characters to have an item and be doing something with it, most of the time it’s only the main character that needs to do so. However, much like other puzzle games, there’s no shortage of things which you can pick up and interact with, which can sometimes have you holding things that serve no purpose whatsoever. This is part of the challenge of course but it’s usually fairly obvious what goes where.

As for the puzzles themselves most of them are relatively obvious, with solutions that come about organically or by trial and error should you get stuck. Usually frustration sets in when you’ve picked up an item in one place then put it down to get another item that you need to use right then and there, forcing you to backtrack some distance to get it again. There were some puzzles which stumped me to the point of needing a walkthrough guide but most of them were cases of me thinking a puzzle was solved when it really wasn’t. There was one puzzle which I thought was a bit rough however (the final stage, very last puzzle if you’re wondering) which, whilst not being a rubber-duck-key sort of thing, was still in the realms of “LOL DEVELOPER LOGIC”.

The Cave Screenshot Wallpaper Swimming in the floor bug

The Cave is well coded considering its simultaneous release across several different platforms, however there was one quirk which proved to be endlessly frustrating and one hilarious bug (pictured above). The quirk seems to be due to the dual control scheme that The Cave uses, letting you control your characters with the keyboard or mouse (or both at the same time, if you’re so inclined). However if you click in a location and then try to use the keyboard, like I tended to do accidentally when resting my hand on the mouse, there’s a 3 second or so period where the keyboard simply doesn’t respond. This isn’t due to my keyboard or mouse as I don’t have this problem in any other game and it caused no end of frustration when my characters wouldn’t move the way I told them to. It’s not exactly game breaking but it is incredibly frustrating so I hope it gets fixed soon.

The bug shown above is also nothing really serious, just a clipping issue where my character was able to swim through the ground, but there’s probably a quick fix to it that could be implemented without too much trouble.

I thought the story of The Cave was interesting but it lacked any real depth. Sure, the characters’ backgrounds are explored decently through the cave paintings and their unique puzzle caves but none of them are particularly likeable or relatable. Now I get that this is somewhat the point, but their stories didn’t have any impact on me one way or the other. It’s made up for in spades by the fun and novel game mechanics, so I guess what I’m getting at is that the story is serviceable but it’s not the reason I’d be playing the game.

The Cave Screenshot Wallpaper Monk Mountain

The Cave is a solid platformer that brings in unique game mechanics and a pleasant art style to form a game that’s quite enjoyable to play. Many are seeing this as a teaser of things to come with the Double Fine Adventure and if this is true it should be shaping up to be something quite special, especially for fans of Schafer’s games. I had a good time with The Cave, although my second playthrough didn’t last particularly long (I stopped about halfway through the first unique puzzle), but then again I’m the kind of player who rapidly loses interest in games I’ve already completed. The Cave is certainly worth a playthrough just for the unique experience it provides.

Rating: 8.25/10

The Cave is available right now on PC, Xbox 360, PlayStation 3 and Wii U for $19.99 or an equivalent amount of points on the respective systems. Game was played on the PC with 4 hours played and 19% of the achievements unlocked.

Is The Second Hand Market Really That Detrimental?

I’m not a big user of the second hand market but there have been times when I’ve delved into it in order to get what I want. Usually it’s when I find out about a particular collector’s edition too late to buy a retail copy and will just wait it out until someone wants to hock their copy on eBay, where I’ll snap it up for a song. The last game I did this with was Uncharted 3 (although I failed to mention the saga in the review) and whilst I didn’t get all the collector’s edition downloadable goodies the seller went out of their way to make sure I got a similar amount of value as they did when they purchased it new. I certainly didn’t expect this but it was deeply appreciated all the same.

EB Games Trade In Banner

However that generosity is a symptom of the larger problem at play here. Almost 2 years ago a silent war began between developers (well, most likely the publishers) and the second hand market, where first sale doctrine was being usurped by crippling used games. The first title I purchased that was affected by this was Mass Effect 2 and whilst I have no intention of ever selling that game the fact that it was crippled after initial sale didn’t sit particularly well with me. The trend has been on the increase as of late with many games including some form of one time use DLC in order to make second hand titles less attractive.

It gets even worse when rumours start surfacing that the next generation consoles will support features that cripple second hand games natively, removing the requirement for game developers to implement their own systems. The justification would probably be something along the lines of “this is what we’ve done for ages on the PC”, which is kind of true if you count CD keys, but those were usually transferable. There’s also the sticky issue of digital downloads, which currently have no method on any platform for enabling resale, which is why many publishers are beginning to favour those platforms instead of physical retail releases.

The golden days of unsellable digital titles (and by extension crippled second hand titles) may not be long for this world however, as the German consumer protection group VZBV has started legal proceedings against Valve in regard to the Steam platform. This isn’t the first time they’ve gone up against them but recent rulings in the EU have set up some precedents which could lead to digital distribution platforms having to implement some kind of second hand market. Considering Steam has been dealing in digital trade for many years now it’s not like they’re incapable of delivering such functionality, they simply haven’t had the incentive to do so. Heavy fines from the EU could be the push they need to get them moving in the right direction but we’ll have to wait until the court case resolves before we see any real movement on this issue.

I have real trouble seeing how the second hand game market is such a detriment to publishers. Indeed many people use trade-ins in order to fund new game purchases and removing that will put downward pressure on new sales, to the tune of 10% or so. Now I don’t know how much revenue publishers are making off those second hand uncrippling schemes but I’m sure a 10% increase in new sales is above that, especially if you count the amount of goodwill generated from not being a dick about the used market. Valve would be heralded as the second coming if they enabled used game trading on Steam, even if they charged a nominal fee to facilitate the transaction.

Really I can’t see any downsides to supporting the second hand market and actively working against it doesn’t do the publishers any favours. I’m not saying they have to go out and actively help facilitate it, but they could simply stop working against it like they’re doing right now. Digital distributors do have to pick up their game in this regard however and I hope it doesn’t come down to strong-arming them with the law. Should the EU ruling hold up, however, that could very well be what happens, but it would at least be a positive result for us consumers.

L.A. Noire’s Blooper Reel Is Awesome (and Hilarious).

L.A. Noire was a great game in its own right. It provided a story that was captivating enough that my wife would pester me to play it, eager to find out what happened next. A good chunk of its success comes from the MotionScan technology used to recreate the voice actors’ facial expressions while delivering dialogue, which was central to one of L.A. Noire’s core mechanics: being able to tell if someone was lying to you or not. One thing I hadn’t considered however was the fact that there would inevitably be bloopers and, unlike other games that use motion capture, these would be captured in glorious detail. The results are incredibly hilarious:

I think my favourite part of all of it is the sneezing.

It also highlights one of the more glaring issues with the MotionScan technology and that’s its current limitation of only capturing facial movements and expressions. This was enough to facilitate L.A. Noire, where much of the dialogue between characters takes place with the cameras firmly focused on their faces, but when their bodies got involved it was incredibly obvious that they lacked the same level of fidelity that their faces had. This gives you a weird disconnect between the head and the rest of the body, making it look like they took human faces and slapped them on robot bodies.

This didn’t go unnoticed by Depth Analysis (the company behind the technology) however and they’re currently working on extending the technology to be able to capture full body motion. Whether it will appear in another L.A. Noire game any time soon remains to be seen as Team Bondi was shut down at the end of 2011. The IP, and I’d figure by extension all the tools they developed for L.A. Noire, is owned by Rockstar however so there’s the possibility that another development house will end up creating another title using the technology (and possibly the IP).

The big question will be data density however, as L.A. Noire was a huge game which spanned 3 discs on the Xbox 360 and only just fit on a single Blu-ray for the PS3. Considering that was with just the faces, a game that included full body motion capture would likely require several times more data in order for it to be possible. This isn’t to say it’s impossible, there are already data mediums that have this level of storage available to them, just that such a game would likely be in the range of 100GB or more.

Considering that the next generation of consoles are rumoured to come out (or at least be announced) this year there’s a high likelihood that they’ll bring in a new storage solution for new games and movies alike. This could be something as simple as a bigger hard drive, as they’ve always been on the small side, but it could extend to something as exotic as ultraviolet discs which have an order of magnitude greater storage than Blu-rays. Still, that’s all wild speculation at this point but if the rumours are true we’ll only have to wait 2 weeks to see what Sony is bringing to the table, which is really quite exciting.

1992-2013: So Long MiniDisc, You Will Be Missed.

It was the year 2000, a time when Napster was still nascent and the Internet was still that esoteric playground for nerds or those who dared to trudge through the horror that was GeoCities. By this time I was already fully set in my geek ways with my very own computer in my room that I’d while away countless hours on, usually on Dune 2 or Diablo. Of course the way of the geek isn’t exactly cheap, my new computer had set my parents back a rather pretty penny or two, and they had said in no uncertain terms that I was no longer allowed to spend their money. It was time for me to get a job.

I was apprehensive at first after the horror stories I had heard from friends working in various fast food restaurants and other entry level jobs, but the motivation to have my own capital, money that I couldn’t be told what to do with, was far too tantalizing to give up. As luck would have it I landed in the then geek heaven of Dick Smith Electronics and whilst it wasn’t all roses from day 1 it certainly was the perfect place for me, allowing me to fiddle with gadgets endlessly without having to shell out the requisite dollars.

Sony MZR55 MiniDisc Player

Then one day a particular gadget caught my eye, the Sony MZ-R55. For those who aren’t familiar with this magnificent little beast, it was one of the first MiniDisc players from Sony that you could truly consider portable, as most of the models prior to it were rather large and bulky, even if they were “portable” in the technical sense of the word. Its size didn’t come cheap however; whilst CD players had become a commodity item at that point, with even the most expensive and lavish units costing under $100, the MZ-R55 was retailing for $500+ even with my ludicrous cost price + 10% employee discount. The price didn’t faze young me however, that MiniDisc player would one day be mine and that day did eventually come.

It wasn’t just geek lust after the size that attracted me to MiniDiscs; it was the audio quality, coupled with the amazing ability to have tracks I could skip to, that pushed me over the edge. My MP3 collection had just started to take shape and I wasn’t impressed with the quality I got when they were translated to tape. Recording on MiniDisc however, done via a pure optical TOSLINK connection from a SoundBlaster Audigy card, proved to be far superior in every respect. Plus having a remote and a rechargeable battery proved to be the ultimate in convenience and my little MZ-R55 saw use every day.

The player also earned a special place in my heart when I journeyed to Japan in 2001. You see apart from myself and a close friend of mine there were no other MiniDisc users that I knew of and I certainly didn’t sell many of them at work. In Japan however they were far bigger than CDs and there were even terminals where you could choose a selection of tracks and then have them burnt to a MiniDisc while you were waiting. That wasn’t what won the MiniDisc a special place in my heart however, no it was something far more special than that.

The trip was part of a school excursion arranged by my Japanese teacher and part of that was a home stay with a family. I was billeted with a family of 3 girls and their mother. My host sister’s name was Akiko and I spent 5 days in their house speaking horrific Japanese, enjoying their company and even putting on a “traditional” Australian barbecue at their house. At the end of it all, during a tear-soaked farewell that had all of the home stay families gathered together to see us off, she handed me a single MiniDisc with all her favourite songs on it. I had been fairly stoic up until that point but it was then that I lost it, and I spent much of the rest of the trip listening to it. Maybe that’s why I love Utada Hikaru so much.

And then today news reached me that Sony was stopping production of all MiniDisc systems next month.

You’d think that I’d be upset about this but MiniDisc had been an also-ran for some time now; I had already mourned its death a long time ago. Instead, when I heard the news today all I remembered was that amazing piece of technology that found its niche in a couple of places, one of them my home. Sure it had its share of problems and no one in their right mind would spend as much as I did in order to use them, but it was like the vinyl of my geek generation, it just felt better all round. Whilst other manufacturers might continue to make MiniDiscs and their associated systems, Sony was the original and them shutting down production signals the end of the format’s era, even if it had technically ended years ago.

For those of us who had MiniDisc players, we loved them to bits, sometimes literally with later models that had a tendency to shake screws loose. They were a stop gap technology that was the first to bridge the gap between the digital and physical worlds without having to resort to analogue means, and the format itself was something of a technical marvel too, with the discs being of almost archival quality thanks to being based on Magneto-Optical technology. I really could go on for hours about how good they were and all the fond memories I have of my MZ-R55 but I’m already emotional enough as it is.

Here’s to MiniDisc. You might not have been the raving success that the WalkMan was but you were everything that it was and more to me. You won’t be forgotten, that I can assure you.

You Can’t Archive Digital Video? Surely You Jest.

On the recommendation of a friend I recently watched a documentary called Side by Side which details the history of the primary technology behind cinema: the cameras. It starts off by giving you an introduction to the traditional photographic methods that were used to create films in the past and then goes on to detail the rise of digital in the same space. Being something of a photographic buff as well as a geek who can’t get enough of technology, the topic wasn’t something I was unfamiliar with, but it was highly interesting to see what people in the industry were thinking about the biggest change to happen to their craft in almost a century.

RED Epic Side Shot

Like much of my generation I grew up digitally, with the vast majority of my life spent alongside computers and other non-analogue equipment. I was familiar with film as my father was something of a photographer (I believe his camera of choice was a Pentax K1000, which he still has, along with his Canon 60D) and my parents gave me my own little camera to experiment with. It wasn’t until a good decade and a half later that I’d find myself in possession of my first DSLR and another few years after that before I’d find some actual passion for it. What I’m getting at here is that I’m inherently biased towards digital since it’s where I found my feet and it’s my preferred tool for capturing images.

One of the arguments that I’ve often heard levelled at digital formats, both in the form of images and your general everyday data, is that there’s no good way to archive it in order for future generations to be able to view it. Film and paper, the traditional means with which we’ve stored information for centuries, would appear to archive quite well due to the amount of knowledge contained in those formats that has stood the test of time. Ignoring for the moment that digital representations of data are still something of a nascent technology by comparison the question of how we archive it has come up time and time again and everyone seems to be under the impression that there’s no way to archive it.

This just isn’t the case.

Just before I was set to graduate from university I had been snooping around for a better job after my jump to being a developer hadn’t worked out as I’d planned. As luck would have it I managed to land a job at the National Archives of Australia, a relatively small organisation tasked with the monumental effort of cataloguing all records of note that were produced in Australia. This encompassed everything from regular documents used in the course of government to things of cultural value like the airline tickets from when the Beatles visited Australia. Whilst they were primarily concerned with physical records (as shown by their tremendous halls filled with boxes) there was a small project within the organisation that was dedicated to the preservation of records that were born digital and were never to see the physical world.

I can’t take much credit for the work that they did there, I was merely a caretaker of the infrastructure that was installed long before I arrived, but I can tell you about the work they were doing. The project team, consisting mostly of developers with just 2 IT admins (including myself), was dedicated to preserving digital files in the same way you would a paper record. At the time a lot of people were still printing them off and archiving them that way, however it became clear that this process wasn’t going to be sustainable, especially considering that the NAA had only catalogued about 10% of their entire collection when I was there (that’s right, they didn’t know what 90% of the stuff they had contained). Thankfully many of the ideas used in the physical realm translated well to the digital one and thus XENA was born.

XENA is an open source project headed by the team at the NAA that can take everyday files and convert them into an archival format. This format contains not only the content but also the “essence” of the document, i.e. its presentation, layout and any quirks that make that document that document. The included viewer is then able to reconstruct the original document using the data contained within the file and, since the project is open source, should the NAA cease development the data will still be available to all of those who used the XENA program. The released version does not currently support video but I can tell you that they were working on it while I was there, although archiving digital documents was the more pressing requirement at the time.

Ah ha, I’ll hear some film advocates say, but what about the medium you store them on? Surely there’s no platform that can guarantee that the data will still be readable in 20 years, heck even 10 I’ll bet! You might think this, and should you have bought any of the first generation of CD-Rs I wouldn’t fault you for it, but we have many ways of storing data for long term archival purposes. Tapes are by far the most popular (and stand the test of time quite well) but for truly archival quality storage that exists today nothing beats magneto-optical discs, which can have lives measured in centuries. Of course we could always dive into the world of cutting edge science for the likes of a sapphire etched platinum disc that might be capable of storing data for up to 10 million years, but I think I’ve already hammered home the point enough.

There’s no denying that there are challenges to be overcome with the archival of digital data as the methods we developed for traditional means only serve as a pointer in the right direction. Indeed attempting to apply them to the digital world has often had disastrous results, like the first reel of magnetic tape brought to the NAA which was inadvertently baked in an oven (something done with paper to kill microbes before archival), destroying the data forever. This isn’t to say we have nothing, nor that we aren’t working on it, and as technology improves so will the methods available for archiving digital data. It’s simply a matter of time until digital becomes as durable as its analogue counterpart and, dare I say it, not long before it surpasses it.

No Time To Explain: It’s Like Super Meat Boy, But Fun.

I have a love/hate relationship with the new wave of hardcore platformers that has swept through the game scene recently thanks to the indie game developer revolution. Initially I find them quite fun, as I did with Super Meat Boy and They Bleed Pixels, but usually towards the end, when the difficulty starts to ramp up and my total play time sky-rockets despite progress slowing to a crawl, I tend to get frustrated with them. None of them have matched up to the Nintendo Hard hell that was Battletoads and ramping the difficulty up to insanity in the later levels might be part of the fun for some, but it certainly isn’t for me. No Time To Explain is another instalment in the indie platformer genre and despite my history with them the videos were intriguing enough to make me want to play it.

No Time To Explain Screenshot Wallpaper Intro Level

No Time To Explain drops you in a nondescript house with you casually minding your own business. Not long after a good chunk of your house is blown away by some unknown force and then suddenly someone who looks strikingly similar to you appears. “I’m you from the future. There’s no time to explain!” he exclaims at you before he’s snatched away by a giant alien crab who’s intent on taking him, you, away. You then find yourself in possession of a weapon capable of dealing untold amounts of damage whilst also functioning as a partial jetpack to get you over any obstacles in your way. It’s then up to you to rescue yourself from whatever dangers you find yourself in.

Whilst I’ve described some games in the past as being Flash-like due to their styling and choice of colour palettes, No Time To Explain is in fact a Flash game brought to you as a standalone executable thanks to Adobe’s AIR framework. This means the graphics are pretty much what you’d expect to see from any browser based Flash game. This isn’t necessarily a bad thing, indeed for No Time To Explain the cartoonish presentation is what makes it so hilariously awesome, but there’s a certain standard that Flash games seem to hit and never get past no matter how long is spent on them. It’s probably a limitation of the platform more than anything, although I can’t really comment since the last time I looked at ActionScript I got scared and decided to stick to C#.

No Time To Explain Screenshot Wallpaper You From The Future

Whilst No Time To Explain starts off as a kind of soft core version of Metal Slug, where you’re basically just wailing on random things with your giant beam weapon, the core game mechanic is actually that of a physics based platformer. Your gun, whilst unleashing torrents of destruction wherever you aim it, also has something of a kick to it. Pointing it in the right direction can send you soaring up into the clouds or launch you across wide gaps at incredible speed. The trouble then becomes figuring out the right angle, the right amount of force and how to correct your trajectory whilst you’re up in the air.

At the beginning this is relatively easy as your landing zones are huge and there’s nothing that will kill you brutally should you get your timing wrong. Soon after however there will be spikes coating surfaces, bottomless pits to fall into and jumps/obstacles that seem next to impossible to cross the first time you see them. Thanks to the decent auto-save system though you’ll be able to fine-tune your strategy rapidly without having to go through everything from the start again. I have to say that this was a welcome change from the Super Meat Boy way of doing things where one particular obstacle could block you for ages simply because it took so long to get there in the first place.

No Time To Explain Screenshot Wallpaper Shark Boss

Each section is capped off with a boss fight which usually involves aiming your laser at whatever is moving and then waiting for it to keel over. This is perhaps where the save system is a little too good as there’s not a whole lot of challenge in the majority of the boss fights when you can literally stand in one section the entire time and simply wail on them until they die. Of course you can make it interesting for yourself (and speed up the process) by dodging the incoming bullets and positioning yourself better but that’s not technically a challenge the game provides. There was one boss fight where the quick save system didn’t apply which was a refreshing change but there were bigger issues at play there.

The Drill Squirrel boss is the first one where you can actually “die” in the sense that should you get injured at a specific point you’ll be sent back to the start of the fight rather than respawned where you were last standing. This is fine in and of itself, however the fight is completely and utterly broken should certain things happen. Easy ways to replicate this are: be in the pit when he does his laser eyes at you, or be on the same platform during said event. Once you’re past that the next section, where the pits fill with lava and fiery columns spew up from the ground, simply won’t happen and the drill squirrel will get stuck in the ground. This isn’t the only bug either; should you get bounced into a wall by him during the second phase you’ll get stuck in there as well, but at least the game recognizes it and restarts you from the beginning.

No Time To Explain Screenshot Wallpaper Weird Polygon Thing

No Time To Explain is an awesome platformer title, combining some of the twitch aspects of its more insanely difficult brethren with mechanics that make the platforming enjoyable rather than a chore. For the most part it works well, with many of the times I got stuck being down to me not getting the puzzle rather than game-breaking bugs. However there are still some teething issues that need to be worked out, especially with that one particular boss, before I could say it was a trouble free experience. I also have a small gripe over the price since it’s rather short (and is available a lot cheaper direct from the developer) but it is on sale right now which kind of renders that complaint moot. Overall I quite enjoyed No Time To Explain and after reading through the developer’s blog I have to say I’m interested in their future titles and hope that their recent Greenlight success will give them the capital to see them through.

Rating: 8.25/10

No Time To Explain is available on PC right now for $9.99. Total game time was approximately 2 hours.