Monthly Archives: April 2011

R18+ Instead of MA15+? You’ve Got My Attention.

The last two months have seen the R18+ debate flare up to fever pitch once again, with gamers all around Australia riding both the joyous highs and perilous lows. It all kicked off at the start of March when the Australian Classification Board banned the upcoming release of the latest Mortal Kombat, leaving gamers reeling from the loss of yet another AAA title to the dreaded RC rating. Just over two weeks later Minister O’Connor gave an ultimatum to Australia’s states and territories, giving us hope that Australian gamers wouldn’t have to put up with being treated as children forever. This was then brought crashing down again when Attorney-General Clark decided to oppose the idea, effectively forcing O’Connor’s hand towards a full overhaul of the classification system and delaying the introduction of an R18+ rating for a good while.

The seeds of dissent have already taken hold however, with the vast majority of the Australian public being very supportive of the introduction of an R18+ rating. Whilst it’s not a big enough issue to swing an election one way or the other it still manages to garner a good chunk of media attention whenever it pops up, and its opposition faces an uphill battle in convincing Australia that it’s a bad idea. It seems that the issue is starting to reach boiling point, with the South Australian Attorney-General, John Rau, declaring that he’ll go it alone if the national scheme stalls (with the ACT following suit) and that he wants to abolish the MA15+ rating entirely:

Rau, and the South Australian Labor Government, has said that he will abolish the MA15+ rating in that state, as a way of “more clearly defining” what is (and is not) suitable for children.

His proposed plan would change the system to include G, PG, M and R18+ classifications (while still allowing for games to be Refused Classification or effectively banned), making a “clear difference” between what adults can play and what is available to children.

There has been quite the reaction to this news in the media, with many supporting the introduction of the R18+ rating but staying mum on the removal of the MA15+ rating. It’s true that the MA15+ rating has been used quite broadly in Australia, with many games that received R18+ equivalents in other countries being rated down for Australia, many without modification. Additionally MA15+ titles are supposed to be controlled via identity checks (since they’re restricted to people over 15) however there’s no real enforcement of this, and I can tell you that as an enterprising youth I was able to acquire many MA15+ titles and was only ever checked once, when I was 16. I would happily pay the price of losing MA15+ to get R18+ but I’m not so sure that doing away with the rating entirely is in Australia’s best interests.

You see the idea of an R18+ game brings with it a whole set of rules that will need to be followed for the rating to be effective. Since these games effectively become a controlled product like cigarettes and alcohol there will need to be ID checks for those who look under 25, possible regulation of marketing materials for the games and restrictions on access to the physical copies themselves. This does place a burden on retailers and could see some of them refuse to stock R18+ games just so they don’t have to bother with the controls. This already happens in the USA, with Walmart refusing to stock any game classified AO or movie classified NC-17. The MA15+ rating could still prove useful to publishers who are seeking to make their product more accessible, even if that means reworking it slightly.

That doesn’t mean that the MA15+ rating itself couldn’t be reworked a little to match up more closely with its international counterparts. The M rating already covers material that is considered unsuitable for people under the age of 15, and many countries put their mature delineations at 16 or 17 (PEGI and ESRB respectively) alongside their R18+ equivalents. In all honesty I believe PEGI gets it most right with their incremental ratings system, but there’s still merit in the ESRB model, which allows some material to be sold unhindered whilst still giving the R18+ option for when it’s required.

Realistically Australia’s rating system needs an overhaul. Whilst I’d love the R18+ rating to be introduced tomorrow, doing so in the “you can buy it in one place but not the other, but ordering it from there is fine” style we’ve got in the ACT for porn (and soon R18+ games) isn’t doing us any favors. We’ll probably have to deal with the virtual R18+ ghetto for a while whilst the wheels of government slowly turn, which is still a positive result for Australian gamers, even if they’ll have to route all their purchases through Canberra or Adelaide. It’s the first step on a long road to total reform of the classification system and it really can’t come soon enough.

My Windows Phone 7 Dilemma.

You know I was pretty hyped about getting a WP7 handset after having a short play with one in the store. The interface was slick and the overall usability so high that I thought it was worth a shot, and I had nothing to lose since my work would be paying for it. The NoDo update was on the horizon however, so I decided that I’d hold back until it made its way into production so that I wouldn’t have to deal with the same frustrations that day 0 customers had. Most notable among its changes was the inclusion of copy and paste, but there were also a few other fixes that I thought would be good to have and worth the wait.

The problem however is that unlike regular Windows patching there’s a gatekeeper between me and Microsoft’s patches for their new mobile platform. You see the patches have to pass muster with the carriers before they can be distributed to handsets, although Microsoft has said in the past that they were working with them to make the process as quick as possible. Unfortunately Australian customers looking for a WP7 handset really only have one carrier to go with: Telstra. Normally this wouldn’t be such a bad option since Telstra has had to start playing straight after their retail and wholesale arms were broken apart, but it seems that they’re not up to the job of testing WP7 updates:

Universal availability of the copy-and-paste update to Windows Phone 7, codenamed NoDo, is almost here, according to Microsoft’s latest schedule update. The final unpatched phone available in the US market, the HTC Surround sold by AT&T should start to receive its updates within the next ten business days. The network’s other two handsets, the Samsung Focus and LG Quantum, have been receiving updates since last week.

European carrier Deutsche Telekom (which includes T-Mobile UK) has at last finished its testing, as has Australian carrier Optus. Updates from phones on these networks should also appear within the next ten business days or so. This leaves only two carriers still delaying the updates due to “testing”: Telefonica, in Spain, and Telstra, in Australia.

This was the one area where I was expecting Microsoft to shine since their bread and butter products have depended so heavily on their patch services for well over a decade. Sure, the vast majority of the blame should be leveled at the carriers since they’re the ones causing the delays, but Microsoft isn’t innocent of incurring delays either. Of course the original iPhone and Android handsets weren’t immune to problems like this either, but I had expected Microsoft’s late arrival to the party to at least be coupled with a strong patch and feature release scheme so they could play catch up quickly.

It might seem like an extraordinarily small gripe considering the rest of the platform looks solid, but when minor feature releases like this take so long to get through the pipeline it makes me wonder just how long I’ll have to wait for the next update, codenamed Mango, to drop. Amongst other things Mango will bring full HTML5 support to WP7, something its browser currently lacks. Whilst the IE9 implementation of HTML5 does leave some things to be desired (my newest app idea uses HTML5 bits, and IE9 mangles it) it is a lot better than not having it at all, especially when so many mobile versions of sites rely on HTML5 functionality. With speculation now brewing that this update might slip to next year WP7 is starting to look like it’s at a serious disadvantage, unless some enterprising browser developer ports to WP7 à la Opera et al.

I’m still planning to nab myself one of these handsets, if only for the few times I’ll want to try my hand at developing an application for it, but with such delays piling up on each other I could very well see myself changing to Android or back to iOS until they’re finished playing the catch-up game. I’m sure as time goes on they’ll develop a much better relationship with the carriers and hopefully they’ll be able to leverage that to remove some of the roadblocks to getting patches and updates out to us consumers. Until then however WP7 users are going to be at the whim of the carriers, even more so than they are normally.

Goodbye, My Sweet Optical Drive.

I’ve been drooling over the specifications of my next computer for well over a month now, tweaking bits here and there to ensure that the PC I end up building will provide the best value for money I can get. Sure there are a few extravagances in it like the Corsair H70 water cooling kit and the Razer Megasoma mouse pad but otherwise it’s a very respectable rig that will serve me well over the course of the next few years. The initial design I had in my head however failed to account for a few of the real world issues that actually building this system would entail, forcing me to make some tough decisions.

Firstly the case I currently use, a Lian Li PC-B20B, has a drive cage that only fits 4 hard drives. Sure I’d probably be able to stuff one in the floppy bay but it’s far from an ideal solution, and it just so happens that the perfect place for the water cooling kit would be right smack bang where the hard drive cage currently is. I’m not sure how I stumbled across it but I saw this awesome product from Lian Li, the EX-34NB, which converts 3 of the front drive bays into 4 internal hard drive bays, complete with a fan. It was the perfect solution to my dilemma, allowing the 4 storage drives and the water cooling solution to live together in my case in perfect harmony.

Of course then I asked myself the question, where would the SSD go?

The obvious choice would be in the floppy slot since I have 2 of them and neither is getting used, but I may have to remove that cage to fit the water cooler in there (it looks to be a tight fit from the measurements). Additionally the motherboard I’m looking at going with, the ASRock P67 Extreme6, comes with a nifty front bay adapter for a couple of USB 3.0 ports that doubles as an SSD mounting kit. This means, though, that I’d have to give up one of the longest lived components I’ve kept for the better part of a decade: my dual layer DVD burner.

I couldn’t tell you exactly when I bought it but I do know I shelled out a good $200+ for my little IDE burner, top of the line for its time. I can tell you one of the primary reasons I bought it however: it came with a black bezel that matched my gigantic black case perfectly. It was the perfect little workhorse and whilst its dual layer abilities were only used a couple of times when I forayed into the dark world of Xbox 360 “backups” it still burnt many a DVD for me without complaint. It had also developed a curious little quirk over the years, opening with such force that it registered as someone pushing the tray back in, causing it to promptly close again. Still it functioned well for what I needed and it stayed with me through 2 full computer upgrades.

Thinking back over the past year or so I can only think of a few times that I ever really needed to burn a DVD, most of the time being able to cope quite well with my trusty little flash drive or network shares. Indeed many of the games that I bought either had a digital distribution option or were copied to my hard drive before I attempted to install them. Whilst I’d be sad to see the one component that’s been constant in my computing life for such a long time go, I really can’t see a need for it anymore, especially when it’s taking up a potential mounting spot for my future SSD.

That’s not to say I think that optical media and its respective hardware are dead, far from it. Whilst the cost of flash drives has come down significantly over the past decade they’re still an order of magnitude more expensive to produce than an optical disc. Indeed even in the lucrative server markets nearly all vendors still provide their updates and tools on CDs, simply because the cost of doing so on a flash drive is just too high. Sure, if you included the cost of the optical drive in that equation it might change matters slightly, but like the floppy before it optical media still has a good decade or so before it’s phased out of normal use, and it will hang on in niches for a long time after that.

It was an interesting realization for me to come to, since optical media is the first format I’ve witnessed being born, gaining mainstream adoption and then beginning to fade into obsolescence. Of course I’m still a long way from being rid of optical drives completely, my PC will be one of only 2 PCs in my house to not have an attached optical drive, but it is a signal that things are moving on and that its replacement, flash media, is ready to take the helm.

I’ll have to find a fitting home for my long time pal, probably in the media PC where he’ll get used every so often.

Why Epic is Wrong About Mobile Gaming.

The world of mobile gaming is a curious one. Its roots date back well over a decade but it’s only really come into its own in the past few years as smartphones became capable enough and there were platforms available to support it. The industry blossomed on the backs of the small and independent developers who took advantage of the low barriers to entry to release their games on the platform, and it is now a multi-billion dollar industry. As a traditional gamer I was a bit sceptical that it would amount to anything more than just another time waster platform, my opinion only changing after buying Infinity Blade, which I thoroughly enjoyed. Still I’m a firm believer that the mobile platform, whilst definitely a successful industry, is not killing other platforms as you just can’t recreate the same experience on a tablet or handheld as you can with a PC or a console.

Of course the large game developers and publishers are concerned about what the future of their business will look like. With mobile gaming carving out a good chunk of the games industry in such a small amount of time (about 6.4% of all games industry revenue) and social networking games grabbing about the same, it really shouldn’t come as a surprise that they might be worried about it. Recently Mike Capps, president of Epic Games, creators of the Unreal Engine, went on record saying that the flood of 99 cent games was killing them:

“We have not been this uncertain about what’s coming next in the games industry since Epic’s been around for 20 years. We’re at such an inflection point. Will there be physical distribution in 10 years or even five? Will anyone care about the next console generation? What’s going on in PC? Can you make money on PC if it’s not a connected game? What’s going on in mobile?

“If there’s anything that’s killing us [in the traditional games business] it’s dollar apps,” he lamented. “How do you sell someone a $60 game that’s really worth it … They’re used to 99 cents. As I said, it’s an uncertain time in the industry. But it’s an exciting time for whoever picks the right path and wins.”

If you take into consideration that the vast majority of people who play games on their phones don’t play games on other platforms¹ then it makes sense that you can’t sell them a $60 game, because the platform just isn’t suited to that kind of title. Sure, people may spend a good chunk of time playing games on their mobiles (rivalling the amount of time spent on more traditional titles) but it’s in no way comparable. Most of the time spent on mobile games comes in the short fits and bursts the platform is tuned for, rather than the long continuous sessions that PC and console gamers are more accustomed to. In essence if you’re a traditional game developer or publisher looking to push your wares onto the mobile market you’re not going to be building the same kind of products, nor are you going to be charging the same price.

Additionally the mobile gaming industry is in no way killing any of the other platforms. Consoles are by far the kings of the industry, bringing in over 40% of total revenue, with PCs still making up a good 20% (and even growing despite the heavy competition). Sure, mobile games have brought some disruption to the industry and have given pause to many developers and publishers who are trying to gauge where the industry is heading. Frankly though, if you think that there’s no future left in the classic $60 title then you deserve to go out of business, since the industry figures just don’t support that view.

I do agree with Capps’ claim that we’re at an inflection point where the games industry is facing quite a few fundamental changes. Just like the music and film industries before them though, they will need to adapt to these market changes or face dying a slow death as they attempt to shoehorn their business models into the new world. I do believe the games industry is far better poised to adapt and innovate through these market disruptions than the more traditional media industries were, and that was proven by just how fast mobile gaming caught on as a mainstream phenomenon. Still mobile gaming is a long, long way from killing off the traditional gaming sector and realistically it has a lot of growing up to do before it would ever have a chance of doing so.

¹I tried my darnedest to find some solid numbers on this and couldn’t find anything substantial. I stand by the sentiment though as from my personal viewpoint the vast majority of mobile gamers are solely dedicated to that platform and don’t play games on anything else.


SpaceX’s Vision for Mars.

There are only a few private space companies that I have any semblance of faith in these days, most notably Armadillo Aerospace (founded by programming genius John Carmack, creator of DOOM) and my current space idol SpaceX. The former’s achievements have been quite impressive with their technology progressing steadily over the past decade. SpaceX has shown everyone that the realm of space is not just for the super-governments of the world, successfully launching multiple rockets and landing numerous contracts for their services. If there’s anyone that can commoditize access to space it will be SpaceX.

Whilst their current plan of reducing the cost of access to space is clear, their direction past that has always been something of a mystery. Last year they announced plans for a number of rockets with some mightily impressive specifications, rivalling those of rockets of decades past. SpaceX’s CEO Elon Musk has gone on record saying that he wants to retire on Mars (and his wife is on board too) but those dreams had always been met with scepticism as we haven’t been past low Earth orbit for the better part of 4 decades. Reports are starting to come in though that show Musk is quite serious about his future retirement plans:

“We’ll probably put a first man in space in about three years,” Elon Musk told the Wall Street Journal Saturday. “We’re going all the way to Mars, I think… best case 10 years, worst case 15 to 20 years.”

“Our goal is to facilitate the transfer of people and cargo to other planets, and then it will be up to people if they want to go,” said Musk, who also runs the Tesla company which develops electric cars.

Putting that in perspective, that could mean we’d have people on Mars by 2021 or, at the latest, 2031. Comparing that to George Bush’s Vision for Space Exploration, which had us returning to the moon in 2020, you’d be forgiven for being sceptical: if a government couldn’t do it with a lead time of 15+ years and a comparatively large budget, what chance would SpaceX have? However SpaceX has shown that they are quite capable of creating aggressive schedules, meeting them and doing it all on a fraction of the budget that has traditionally been used to accomplish such feats. Indeed the recent announcement of the Falcon Heavy saw many people speculating about missions like a Mars sample return, which had not been feasible due to the launch weight required but would be well within the capabilities of SpaceX’s new rocket.

SpaceX has also been making strides with its Dragon capsule, putting the finishing touches on it to make it fully compatible with NASA’s human rating standards. The planned additions to the craft would see the launch abort system, traditionally a large spike on top of the craft that’s discarded once the launch is successful, moved onto the side of the craft. This would give the Dragon capsule an unprecedented amount of accuracy when it comes to landing, enabling it to soft land at a precise location rather than requiring a splash down in the ocean. Consequently a Dragon capsule could very well be used to land on the surface of other planets, including SpaceX’s goal of Mars.

You’d think by now nothing that SpaceX could do would surprise me, but it seems at every turn they manage to pull off another feat that puts their wild claims firmly in reality. Whilst we may still be a decade away from seeing any real progress on this front it still feels a million times closer than it ever did when the same goal was held by a government agency. Even if they don’t meet their aggressive 2021 target there will be a whole host of progress made between now and then, enough so we’ll have a clear picture of when we’ll be exploring our diminutive red cousin.


iPhone Tracking Fiasco (Or Big Brother Isn’t Watching).

Now I love me a good piece of Apple information just like the next guy; they’re just too hard to resist as they’re some damn fine blog fodder. Still for the most part I steer clear of product speculation and rumours, simply because I’m not really interested in writing on fluff and my posts would be lost in the deluge of other blogs parroting the same “facts”. Still every so often I come across a bit of Apple news that deserves reporting on, like how people were getting the whole Antennagate thing wrong, and yesterday brought another piece of news that had all the tech bloggers in a tizzy.

Yet again I feel they’ve all got it wrong.

What I’m talking about is the iOS location fiasco that’s currently winding itself up in the media. In essence it’s been shown that iOS devices keep a log of your location data from the day you first turned on your phone. The file in question resides on your phone but is copied across to your PC when you sync with iTunes. If you’ve selected to encrypt your backups the file will be encrypted, but it is stored in plain text on your phone. People are crying foul over how Big Brother Apple is tracking them and how this is a major breach of privacy, and you’d be forgiven for thinking that if you didn’t bother going beneath the surface of these stories.

Now I was considering writing about this yesterday (instead choosing to post some inane drivel, sorry) but I had really no desire to comment on something that seemed like a non-issue. Whilst I’m not keen for someone to follow my every move I was pretty sure that the location database people were mucking around with was simply a cache of previous location data, most likely used in future GPS calculations to improve accuracy. Additionally there was absolutely no evidence that this database had ever made its way to Apple, or anyone else for that matter, and the only program with the demonstrated ability to read those files can only do so from an unencrypted backup sitting on your PC, not from the phone itself.
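For the curious, the file everyone is poking at appears to be nothing more exotic than a SQLite database (consolidated.db is the name the various tracker tools look for) with a CellLocation table of timestamps and coordinates. A rough sketch of reading it, assuming that schema and the System.Data.SQLite library, looks something like this:

```csharp
// Rough sketch: dump the cell location cache from an unencrypted iTunes backup.
// Assumes the file is the SQLite database (consolidated.db) that the tracker
// tools read, with a CellLocation table holding Timestamp, Latitude and
// Longitude columns, and that the System.Data.SQLite library is installed.
using System;
using System.Data.SQLite;

class LocationCacheDump
{
    static void Main(string[] args)
    {
        // Path to a copy of consolidated.db pulled out of an unencrypted backup.
        string dbPath = args.Length > 0 ? args[0] : "consolidated.db";

        using (var conn = new SQLiteConnection("Data Source=" + dbPath))
        {
            conn.Open();
            using (var cmd = new SQLiteCommand(
                "SELECT Timestamp, Latitude, Longitude FROM CellLocation", conn))
            using (var reader = cmd.ExecuteReader())
            {
                // Timestamps are seconds since 2001-01-01 (Apple's epoch).
                var appleEpoch = new DateTime(2001, 1, 1, 0, 0, 0, DateTimeKind.Utc);

                while (reader.Read())
                {
                    var when = appleEpoch.AddSeconds(reader.GetDouble(0));
                    Console.WriteLine("{0:u}  {1:F5}, {2:F5}",
                        when, reader.GetDouble(1), reader.GetDouble(2));
                }
            }
        }
    }
}
```

Note that this only works against a local, unencrypted copy of the file; there’s no magical remote access involved, which is rather the point.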

The privacy issue is a different beast, but in reality no one would try to use something like that location cache to track you. Whilst sites like Please Rob Me might provide insight into how such data could be used for nefarious purposes, the thing is that most crime will still be committed the old fashioned way: by going up to your house and seeing if anyone is there. Hiding all your location information from online sources won’t help combat this problem, and the only way this database file could be used against you is if someone had direct access to your phone or PC, the former indicating your phone has been stolen (got a PIN code on that, buddy?) and the latter that they’re already in your house (got a password on your PC?).

Of course in doing my research for this post today I came across a few other posts detailing my exact predictions of what the problem might be. People familiar with the matter investigated the Android platform to see if something similar was being done on their end, and sure enough it was. The difference between the Apple and Android implementations was that Android had a hard limit set on the number of records whereas the iPhone had no such limit. More than likely Apple will set an upper limit on the number of location records kept in the log files in the next iOS update so that people don’t get all butthurt about it again, even though there was nothing wrong in the first place.

Apple products seem to have the uncanny ability to drum up a PR storm no matter what minor thing happens with them. Whilst this particular ability is primarily positive for Apple, it seems even the Apple opposition falls prey to going hyperbolic over any little flaw in the iOS platform, creating fiascos out of what are merely features of the platform. This is why I steer clear of many Apple related articles; there’s simply too much groupthink on either side of the fence for there to be any proper discussion. So hopefully my little blog here has helped clarify that Big Brother isn’t watching you and we can all go back to our normal lives without having to worry about Big Brother Apple watching us in our sleep.

You can still wear the tinfoil hat though, it’s sexy.

I’m Over Being Inspired.

I remember way back when I first started trying to create a business for myself that I’d eagerly seek out stories of others who had done the same, looking for trends so that I could replicate their success. There are of course hundreds of these and it didn’t take me long to find a couple examples that mirrored my own experiences. The common thread that I found amongst them all was that overnight success never happens overnight and that more often than not you’ve got a few solid years of working on something before it starts to get traction. That was a revelation in itself for me since I had always ascribed much of the success of these kinds of people to luck or something I had no control over.

Of course over the past 2 years I’ve seen nearly every inspirational story there is to see and read thousands of articles on how to build a successful business. Sure many of them have been helpful but recently I’ve found myself deliberately avoiding any success stories or “how I did it” articles, finding them to be rather tedious and uninformative. Indeed the vast majority of them are usually a bunch of high level waffle about how they do this or that, with those things usually being one of the industry hype terms of the time. There are of course notable exceptions who attempt to give you real actionable advice, but even they fall prey to making things so general as to be useless to anyone.

It feels like I’m suffering from some major inspiration fatigue. Back in the early days these stories of success pushed me to keep on coding on those days when I felt I was being less than useless, knowing that if I kept at it that eventually I’d have something of value to release upon the world. After failing to attract attention both from Y-Combinator and the general public with my Lobaco beta these stories of success started to seem more like the exception than the rule. I fast became disillusioned with all these inspiring tales of how success followed their hard work, instead wanting the real meat of what they did in order to achieve success.

Maybe it’s more that I’m at a point where I know there’s the possibility of success out there and hearing about it no longer helps to inspire me to achieve it. Perhaps it’s the realization that there are thousands upon thousands of much more talented developers out there working on their own ideas that are much better than my own. In any case I can no longer take comfort in just the mere idea that success awaits those who put in the effort, and will only be content upon its realization.

Or maybe it’s just a slow news day and I had nothing better to write about, you can be the judge on that one.

Retailers vs Publishers: The Battle For The Second Hand Market.

I’ll admit that I haven’t bought many games used since I’m usually in the store on release day, hungering to be one of the first to get my hands on them. Still I realize there’s quite a market for second hand games since not everyone has the disposable income that I do to splurge on the latest and greatest titles. They’re also a significant source of revenue for brick and mortar games retailers, as the margins on used titles are significantly higher than on their brand new counterparts and they provide an additional sales hook to attract customers (i.e. trade-ins towards newer games). There is one group who aren’t so pleased with the second hand games market however: the publishers.

Second hand titles, whilst generating significant revenue for the retailers, generate almost nothing for the publishers that first distributed the games. The advent of downloadable content mitigated this somewhat as it was usually tied to the console it was downloaded on and not the game itself, but it is a pittance compared to what they generate from a new sale. More recently however games publishers have taken a more sinister approach to the second hand market, seeking to make a resold product less attractive than a new copy unless the consumer ponies up the extra cash to make up the difference.

Sadly this kind of chicanery affected one of my favorite games, Mass Effect 2. New buyers of the game received a special code that gave them access to the Cerberus Network, a daily news service for the Mass Effect universe plus the gateway to all the DLC available for the game. The code was a one-time-use deal so anyone buying the game second hand would have to do without or pony up US$15 for access to it. Whilst you could argue that you still got the vast majority of the game despite the lack of the additional DLC, there was quite a bit of free stuff on there, some of it even on day 1. This meant that anyone buying it without the code was essentially getting an incomplete game, even if it was playable.

Whilst it’s still not the norm to cripple the second hand market like this it is becoming alarmingly common, with several recent titles making used purchases far less desirable through new-only-or-pay-up DLC. It’s still a step ahead of something like Steam, which doesn’t allow the sale of second hand titles at all, not even as a trade-in on other Steam titles. But it’s still a dick move by the publishers, who are just trying to squeeze money out of consumers in any way they can. Realistically though it’s detrimental to both the publisher and the consumer since many trade-ins drive new game sales, to the tune of 20%. Cutting that market out completely would harm the new games market significantly, but none of the publishers will admit to that.

It’s also arguably a violation of the First Sale Doctrine, although no one has yet tried to test this particular practice in court.

All this does is reduce the perceived value of the product that the publishers are putting forward and will only encourage people to seek out alternative methods in lieu of forking out the extra dollars. Whilst I am happy to give up my freedom to sell my games for the convenience that Steam provides (I am a hoarder, though) I know many people who aren’t so willing to make that trade and have avoided purchasing games that strip away their first sale rights. Instead of punishing people for buying second hand, publishers should be encouraging people to buy in early with things like betas and in-game items. Of course I find it hard to fault a company that tries to maximize its profits, but when it comes at the cost of significant goodwill I have to wonder if the costs outweigh the potential benefits, and the only ones who know the answer to that are the publishers.

And they’re not talking about it, unfortunately.

Microsoft’s Blackmagic: A Double Edged Sword.

I’m a really big fan of Microsoft’s development tools. No other IDE that I’ve used to date can hold a candle to the mighty Visual Studio, especially when you couple it with things like ReSharper and the massive online communities dedicated to overcoming any of the shortcomings you might encounter along the way. The same communities are also responsible for developing many additional frameworks that extend the Microsoft platforms even further, with many of them making their way into official SDKs. There have only been a few times when I’ve found myself treading ground with Microsoft tools that no one has before, but every time I have I’ve discovered so much more than I initially set out to.

I’ve come to call these encounters “black magic moments”.

You see, with the ease of developing with a large range of solutions already laid out for you it becomes quite tempting to slip into the habit of seeking out a completed solution rather than building one of your own. Indeed there were a few design decisions in my previous applications that were driven by this, mostly because I didn’t want to dive under the hood of those solutions to develop the fix for my particular problem. It’s quite surprising how far you can get into developing something by doing this, but eventually the decisions you make will corner you into a place where you have to choose between doing some real development or scrapping a ton of work. Microsoft’s development ideals seem to encourage the latter (in favor of using one of their tried and true solutions) but stubborn engineers like me hate having to do rework.

This of course means diving beneath the surface of Microsoft’s black boxes and poking around to get an idea of what the hell is going on. My first real attempt at this was back in the early days of the Lobaco code base when I had decided that everything should be done via JSON. Everything was working out quite well until I started trying to POST a JSON object to my webservice, whereupon it would throw out all sorts of errors about not being able to de-serialize the object. I spent the better part of 2 days trying to figure that problem out and got precisely nowhere, eventually posting my frustrations to the Silverlight forums. Whilst I didn’t get the actual answer from there they did eventually lead me down a path that got me there, but the solution is not documented anywhere nor does it seem that anyone else has attempted such a feat before (or after, for that matter).
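For those who haven’t played in this space, a bare JSON POST endpoint in WCF looks roughly like the sketch below. It’s illustrative only, not the Lobaco code and not the eventual fix; the Position type and method name are made up for the example:

```csharp
// Rough sketch of a WCF endpoint that accepts a JSON object via POST.
// Illustrative names only (Position, IPositionService are made up).
using System.Runtime.Serialization;
using System.ServiceModel;
using System.ServiceModel.Web;

[DataContract]
public class Position
{
    [DataMember] public double Latitude { get; set; }
    [DataMember] public double Longitude { get; set; }
}

[ServiceContract]
public interface IPositionService
{
    // The client POSTs the JSON body with Content-Type: application/json;
    // WCF is then supposed to deserialize it into the Position parameter.
    [OperationContract]
    [WebInvoke(Method = "POST",
               UriTemplate = "positions",
               RequestFormat = WebMessageFormat.Json,
               ResponseFormat = WebMessageFormat.Json,
               BodyStyle = WebMessageBodyStyle.Bare)]
    Position Post(Position position);
}
```

It’s when that automatic deserialization falls over that you end up in the weeds I did, with errors that give you precious little to go on.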

I hit another Microsoft black magic moment when I was working on my latest project, which I had decided would be entirely cloud based. After learning my way around the ins and outs of the Windows Azure platform I took it upon myself to migrate the default authentication system built into ASP.NET MVC 3 onto Microsoft’s cloud. Thanks to a couple of handy tutorials the process seemed fairly easy, so I set about my task, converting everything into the cloud. However upon attempting to use the thing I had just created I was greeted with all sorts of random errors and no amount of massaging the code would set it straight. After the longest time I found that it came down to a nuance of the Azure Tables storage part of Windows Azure, namely the way it structures data.

In essence Azure Tables is one of them new-fangled NoSQL type databases and as such it relies on a couple of properties in your object class to uniquely identify a row and provide scalability. These two properties are called PartitionKey and RowKey and whilst you can leave them alone and your app will still work, it won’t be able to leverage any of the cloud goodness. So in my implementation I had overridden these properties in order to get the scalability that I wanted but had neglected to include any setters for them. This didn’t seem to be a problem when storing objects in Azure Tables, but when querying them it seems that Azure requires the setters to be there, even if they do nothing at all. Adding one in fixed nearly every problem I was encountering and brought me back to another problem I had faced in the past (more on that when I finally fix it!).
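Roughly speaking the gotcha looks like the sketch below (the entity and its fields are made up for illustration, not my actual classes): an entity that derives its keys from other data works fine for inserts, but the query side won’t materialize results unless the key properties also expose setters, even no-op ones.

```csharp
// Illustrative sketch only: a made-up entity showing the PartitionKey/RowKey
// setter gotcha with the Azure Table storage client of the time.
using Microsoft.WindowsAzure.StorageClient;

public class UserEntity : TableServiceEntity
{
    public string UserName { get; set; }

    // Deriving the keys from your own data is how you control partitioning...
    public override string PartitionKey
    {
        get { return "users"; }
        set { /* no-op, but required for query results to materialize */ }
    }

    public override string RowKey
    {
        get { return UserName; }
        set { UserName = value; }
    }
}
```

Without those setters, inserts sail through happily and it’s only the first query against the table that blows up, with exactly the sort of opaque errors I was seeing.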

Like any mature framework that does a lot of the heavy lifting for you, Microsoft’s solutions suffer when you start to tread unknown territory. Realistically though this should be expected, and I’ve found I spend the vast majority of my time on less than 20% of the code that ends up making up the final solution. The upshot of course is that once these barriers are down progress accelerates at an extremely rapid pace, as I saw with both the Silverlight and iPhone clients for Lobaco. My cloud authentication services are nearly ready for prime time and, since I struggled so much with this, I’ll be open sourcing my solution so that others can benefit from the numerous hours I spent on the problem. It will be my first ever attempt at open sourcing something I created and the prospect both thrills and scares me, but I’m looking forward to giving back a little to the communities that have given me so much.

OCZ Vertex 3: Don’t Play With My Heart (Or The SSD Conundrum).

My main PC at home is starting to get a little long in the tooth, having been ordered back in the middle of 2008 and only receiving a graphics card and hard drive upgrade since then. Like all PCs I’ve had, it suffered a myriad of problems that I usually just put up with until I stumbled across a workaround, but I think the vast majority of them can be traced to a faulty motherboard (can’t put more than 4GB of RAM in it or it won’t POST) and a batch of faulty hard drives (that would randomly park their heads, causing the system to freeze). At the time I had the wonderful idea of buying the absolute latest so I could upgrade cheaply for the next few years, but thanks to the consolization of games I found that wasn’t really necessary.

To be honest it’s not even really necessary now either, with all the latest games still running at full resolution and most at high settings to boot. I am starting to lag on the technology front however with my graphics card not supporting DirectX 11 and everything but the RAM being 2 generations behind (yes, I have a Core 2 Duo). So I took it upon myself to build a rig that combined the best performance available of the day rather than trying to focus on future compatibility. Luckily for me it looks like those two are coinciding.

Just because, like any good geek, I love talking shop when it comes to building new PCs, here are the specs of the potential beast in the making:

  • Intel Core i7 2600K
  • ASRock P67 motherboard
  • Corsair Vengeance 1600MHz DDR3 16GB
  • Radeon HD6950
  • 4 x 1TB Seagate HDD in RAID 10
  • OCZ Vertex 3 120GB

The first couple of choices I made for this rig were easy. Hands down the best performance out there is with the new Sandy Bridge i7 chips, with the 2600K being top of the lot thanks to its unlocked multiplier and hyperthreading, which chips below the 2600 lack. The choice of graphics card was a little harder as whilst the Radeon comes out leagues ahead on a price to performance ratio the NVIDIA cards still have a slight performance lead overall, though hardly enough to justify the price. Knowing that I wanted to take advantage of the new SATA 6Gbps range of drives that were coming out, my motherboard choice was almost made for me as the ASRock P67 seems to be one of the few that has more than 4 of the ports available (it has 6, in fact).

The choice of SSD however, whilst extremely easy at the time, became more complicated recently.

You see, back in the initial pre-production review round the OCZ Vertex 3 came out shooting, blasting away all the competition in a seemingly unfair comparison to its predecessors. I was instantly sold, especially considering the price was looking to be quite reasonable, around the $300 mark for a 120GB drive. Sure I could opt for the bigger drive and dump my most frequently played games on it, but in reality a RAID 10 array of SATA 6Gbps drives should be close enough without having to overspend on the SSD. As with any pre-production review I made sure to keep my ear to the ground just in case something changed once they started churning them out.

Of course, something did.

The first production review that grabbed my attention was from AnandTech, renowned for their deep understanding of SSDs and for producing honest and accurate reviews. The results for my drive size of choice, the 120GB, were decidedly mixed on a few levels, with it falling down in several places where the 240GB version didn’t suffer any such problems. Another review confirmed the figures were in the right ballpark, although it unfortunately lacked a comparison to the 240GB version. The reason behind the performance discrepancy is simple: whilst the drives are functionally the same, the difference comes from the number of NAND chips used to build them. The 240GB version has double the number of chips of the 120GB version, which allows for higher throughput and additionally grants the drive a larger scratch space that it can use to optimize its performance¹.

So of course I started to rethink my position. The main reason for getting a regular SSD over something like the PCIe-bound RevoDrive was that I could use it down the line as a jumbo flash drive if I wanted to, and I wouldn’t have to sacrifice one of my PCIe slots to use it. The obvious competitor to the OCZ Vertex 3 would be something like the Intel 510 SSD, but the reviews haven’t been very kind to that device, putting it barely in competition with previous generation drives.

After considering all my options I think I’ll still end up going with the OCZ Vertex 3 at the 120GB size. Whilst it might not lead in every performance category it does provide tremendous value when compared to a lot of other SSDs, and it will be in another league compared to my current spinning rust hard drive. Once I get around to putting this new rig together you can rest assured I’ll put the whole thing through its paces, if at the very least to see how the OCZ Vertex 3 stacks up against the numbers that have already been presented.

¹Ever wondered why some SSDs are odd sizes? They are in fact good old fashioned binary sizes (128GB and 256GB respectively) however the drive reserves a portion of that (8GB and 16GB) to use as scratch space to write and optimize data before committing it. Some drives also use it as a buffer for when flash cells become unwritable (flash cells don’t usually die, you just can’t write to them anymore) so that the drive’s capacity doesn’t degrade.