I haven’t been an iPhone user for many years now, my iPhone 3GS sitting disused in the drawer beside me ever since it was replaced, mostly because the alternatives presented by other companies have, in my opinion, outclassed them for a long time. This is not to say that I think everyone should replace their phone with an Xperia Z (that particular phone is definitely not for everyone) as I realise that the iPhone fills a need for many people. Indeed it’s the phone I usually recommend to my less technically inclined friends and family members because I know they have a support system tailored towards them (meaning they’ll bug me less). So whilst today’s announcement of the new models won’t have me opening up my wallet anytime soon it is something I feel I need to be aware of, if only for the small thrill I get from being critical of an Apple product.
So, as many had speculated, Apple announced 2 new iPhones today: the iPhone 5C, which is essentially the entry level model, and the iPhone 5S, which is the top of the line one with all the latest and greatest features. The most interesting difference between the two is in design, with the 5C looking more like a kid’s toy with its pastel style colours and the 5S looking distinctly more adult with its muted tones of silver, grey and gold. As expected the 5C is the cheaper of the two, with the base model starting from AUD$739 and the 5S from AUD$869, the prices ramping up steadily depending on how much storage you want.
The 5C is interesting because everyone was expecting a budget iPhone and Apple’s response is clearly not what most people had in mind. Sure it’s the cheapest model of the lot (bar the iPhone 4S) but should you want to upgrade the storage you’re already paying the same amount as the entry level 5S. The differences in features are also pretty minimal, the exceptions being an A6 vs A7 processor, slightly bulkier dimensions, the newfangled fingerprint home button and a slightly better camera. Of course those slight differences are usually enough to push any potential iPhone buyer to the higher end model, so the question then becomes: who is the 5C marketed towards?
It’s certainly not at the low end of the market, as most people were expecting, even though it looks the part with its all plastic finish (which we haven’t seen since I last used an iPhone). It might appeal to those who like those particular colours although realistically I can’t see that being much of a drawcard considering you can buy a case in any colour for $10 these days. Indeed even when you factor in the typical on contract price for a new iPhone (~$200) the difference between an entry level 5C and 5S is so small that most would likely fork out the extra cash just to have the better version, especially considering how visually different they are.
Another thing running against the 5C is that the 5S shares the same dimensions as the original iPhone 5, allowing you to use all your old cases and accessories with it. I know this won’t be a dealbreaker for many but it seems obvious that the 5S is aimed at people coming from the iPhone 5 whereas the 5C doesn’t appear to have any particular market in mind that necessitates its differences. If this was Apple’s attempt to try and claw back some of the market that Android has been happily dominating then I can’t help but feel it’s completely misguided. Then again I lost my desire for Apple products years ago so I might be missing out on what the appeal of a gimped, not-really-budget Apple handset might be.
The iPhone 5S does look like a decent phone, sporting most of the features you’d expect from a current generation smart phone. NFC is still missing which, if I’m honest, isn’t as big of a deal as I used to make it out to be; I’ve now got an NFC phone and I can’t use it for jack so I don’t count it as a downer anymore. As always though the price compared to an equivalent Android handset is a big sore point, with the top of the line model topping out at an incredible AUD$1129. I know Apple is a premium brand but when the price difference between the high and low end is $260 and the only difference is storage you really have to ask if it’s worth it, especially when comparable Android phones have the same level of features and are cheaper (my 16GB Xperia Z was $768, for reference).
I will be really interested to see how the 5C pans out as many are billing it as the “budget” iPhone that everyone was after when in truth it’s anything but that. The 5S is your typical product refresh cycle from Apple, bringing in a few new cool things but nothing particularly revolutionary. Of course you should consider everything I’ve said through the eyes of a long time Android user and lover as whilst I’ve owned an iPhone before it’s been so long between drinks that I can barely remember the experience anymore. Still I’m sure at least the 5S will do well in the marketplace as all the flagship Apple phones do.
You know who gets a ton of my money these days? Game publishers. Whilst they might not get the same amount per sale that they used to the amount I pump into the industry per year has rocketed up in direct correlation with my ability to pay. Nearly every game you see reviewed on here is purchased gladly with my own money and I would happily do the same with all forms of entertainment if they provided the same level of service that the games industry does. However my fellow Australian citizens will know the pain that we routinely endure here with delayed releases and high prices, so much so that our Parliament subpoenaed several major tech companies to have them explain themselves.
If I’m honest though I had thought the situation was getting a bit better, that was until I caught wind of this:
I saw the trailer for Cloud Atlas sometime last year and the concept instantly intrigued me. As someone whose formative years were spent idolizing The Matrix I’ve always been a fan of the Wachowskis’ work and so of course their latest movie was of particular interest. Since I’m on the mailing list for my local preferred cinema (Dendy, in case you’re wondering) I simply waited for the email announcing it. For months and months I waited for something to come out until I started hearing friends talking about how they had seen it already. Curious, I checked my favourite Usenet site and lo and behold it was available, which meant only one thing.
It was available on DVD elsewhere.
That email I was waiting for arrived a couple days ago, 4 months after the original theatrical release in markets overseas. Now I know it’s not that hard to get a film approved in Australia nor is it that difficult to get it shipped over here (even if it was shot on film) so what could be the reason for such a long delay? As far as I can tell it’s the distributors holding onto their outdated business models in a digital era where they have to create artificial scarcity in order to try and bilk more money out of the end consumers. I’ve deliberately not seen movies in cinemas in the past due to shenanigans like this and Cloud Atlas is likely going to be the latest entry on my civil disobedience list.
I seriously can’t understand why movie studios continue with behaviour like this, which drives customers to seek out other, illegitimate means of getting at their content. I am more than happy to pay (and, in the case of things like Cloud Atlas, at a premium) for content like this but I do not want my money going to businesses that fail to adapt their practices to the modern world. Artificial scarcity is right up there with restrictive DRM schemes in my book as they provide absolutely no benefit for the end user and only serve to make the illegitimate product better. Really, when we’re hit from all sides with crap like this is it any surprise that we’re a big ole nation o’ pirates?
A decade ago many of my generation simply lacked the required disposable income in order to support their habits and piracy was the norm. We’ve all grown up now though with many of us having incomes that we could only dream of back then, enough for us to begin paying for the things we want. Indeed many of us are doing that where we’re able to but far too many industries are simply ignoring our spending habits in favour of sticking to their traditional business models. This isn’t sustainable for them and it frustrates me endlessly that we still have to deal with shit like this when it’s been proven that this Internet thing isn’t going away any time soon. So stop this artificial scarcity bullshit, embrace our ideals and I think you’ll find a torrent of new money heading in your direction. Enough so that you’ll wonder why you held such draconian views for so long.
The de facto platform of choice for any gamer used to be the Microsoft Windows based PC, however the last decade has seen that change to be some form of console. Today, whilst we’re seeing something of a resurgence in the PC market thanks in part to some good releases this year and ageing console hardware, PCs take somewhere on the order of 5% of the video game market. If we then extrapolate from there using the fact that only about 1~2% of the PC market is Linux (although this number could be higher if restricted to gamers) then you can see why many companies have ignored it for so long: it just doesn’t make financial sense to get into it. However there have been a few recent announcements that show there’s an increasing amount of attention being paid to this ultra-niche and that makes for some interesting speculation.
Gaming on Linux has always been an exercise in frustration, usually due to the Windows-centric nature of the gaming industry. Back in the day Linux suffered from a lack of good driver support for modern graphics cards and this made it nearly impossible to get games running on there at an acceptable level. Once that was sorted out (whether you count binary blobs as “sorted” is up to you) there was still the issue that most games were simply not coded for Linux, leaving their users with very few options. Many chose to run their games through WINE or Cedega, which actually works quite well, especially for popular titles, but many were still left wanting for titles that would run natively. The Humble Indie Bundle has gone a long way to getting developers working on Linux but it’s still something of a poor cousin to the Windows platform.
Late last year saw Valve open up beta access to Steam on Linux, bringing with it some 50-odd titles to the platform. It came as little surprise that they did this considering that they did the same thing with OSX just over 2 years ago, which was undoubtedly a success for them. I haven’t really heard much on it since then, mostly because none of my gamer friends run Linux, but there’s evidence to suggest that it’s going pretty well as Valve is making further bets on Linux. As it turns out their upcoming Steam Box will be running some form of Linux under the hood:
Valve’s engineer talked about their labs and that they want to change the “frustrating lack of innovation in the area of computer hardware”. He also mentioned a console launch in 2013 and that it will specifically use Linux and not Windows. Furthermore he said that Valve’s labs will reveal yet another new hardware in 2013, most likely rumored controllers and VR equipment but we can expect some new exciting stuff.
I’ll be honest and say that I really didn’t expect this even with all the bellyaching people have been doing about Windows 8. You see whilst Valve can brag about 55 titles already being on the platform, that’s only 2% of their current catalogue. You could argue that emulation is good enough now that all the titles could be made available through the use of WINE, which is a possibility, but Valve doesn’t offer that option with OSX currently so it’s unlikely to happen. Realistically unless the current developers have intentions to do a Linux release now the release of the Steam Box/Steam on Linux isn’t going to be enough to tempt them to do it, especially if they’ve already recovered their costs from PC sales.
That being said all it might take is one industry heavyweight to put their weight behind Linux to start a cascade of others doing the same. As it turns out Blizzard is doing just that with one of their titles slated for a Linux release some time this year. Blizzard has a long history with cross platform releases as they were one of the few companies to do releases for Mac OS decades ago and they’ve stated many times that they have a Linux World of Warcraft client that they’ve shied away from releasing due to support concerns. Releasing an official client for one of their games on Linux will be their way of verifying whether it’s worth it for them to continue doing so and should it prove successful it could be the shot in the arm that Linux needs to become a viable platform for games developers to target.
Does this mean that I’ll be switching over? Probably not as I’m a Microsoft guy at heart and I know my current platform too well to just drop it for something else (even though I do have a lot of experience with Linux). I’m very interested to see how the Steam Box is going to be positioned as it being Linux changes the idea I had in my head for it and makes Valve’s previous comments about them all the more intriguing. Whilst 2013 might not be a blockbuster year for Linux gaming it is shaping up to be the turning point where it starts to become viable.
I wasn’t going to write about Apple’s latest release in the iPad Mini and iPad 4 mostly because there wasn’t really anything to write about. The iPad 4 was a bit of a shock considering that the 3 is barely 6 months old and was a pretty significant upgrade over its predecessor so you wouldn’t really think it needed a refresh this early on. The iPad Mini was widely rumoured for a very long time, so much so that blogging about it would feel like I was coming incredibly late to a party that I didn’t really care about in the first place. Thinking about it more though the iPad Mini represents a lot more than just Apple releasing yet another iOS product, it’s a sign of how Apple is no longer in control of the market they created.
Steve Jobs famously said that a tablet smaller than the iPad wouldn’t make any sense as it’d be too small to compete with regular tablets and too big to compete with smart phones. With Apple’s relatively long development cycle it’s likely that he was aware of the iPad Mini’s development but I don’t think the idea for its creation came from him. It was easy for him to make judgements from atop the massive tower of iPad sales that he was sitting on at the time, however I don’t think he expected the smaller tablets to be as successful as they were. None of them can match the iPad for total numbers sold yet but that doesn’t mean there isn’t a niche that Apple was failing to exploit.
It all started with the Kindle Fire just over a year ago. The tablet was squarely aimed at a particular market, one that didn’t want to spend a lot on a tablet device and was happy to accept a lower end device in return. This proved to be wildly popular and as of this month Amazon has shipped over 7 million of the devices, putting it second only to the iPad itself in terms of sales. This in turn drew other companies to the small tablet form factor with the most notable recent addition being the Google Nexus 7 which as of writing has already sold an estimated 3 million units worldwide. Apple can’t have been ignorant of this and saw that there was a rather large niche that they weren’t exploiting, hence the release of the iPad Mini.
For a company that’s been making and dominating markets for a decade now the iPad Mini then represents the first product Apple’s created as a reaction to market forces. Whilst we can always point to technology companies that did what Apple did before it entered a market, they’re usually nowhere near as successful. With the small tablet form factor sector however there are multiple companies who have managed to make quite a killing in this particular space prior to Apple entering. You could argue that Apple still owns the tablet space as a whole (and that’s true, to a point) but when it comes to form factors other than those of the traditional iPad Apple has been absent up until this week, and that’s lost money they’ll never recover.
Comparatively it’s a small slice of the overall tablet pie which Apple is still getting the lion’s share of. Even though they might’ve lost 10 million potential sales to a niche market they weren’t filling they still managed to ship 14 million iPads last quarter. Their figures for this quarter might be down on what people were expecting however with the release of the new iPad and the iPad Mini right before the holiday season it’s very likely that they’ll make up that shortfall without too much trouble. Whether that will translate into dominance of the smaller form factor tablet market is up for debate and realistically we’ll only know once next quarter’s results come in.
Whilst I don’t believe this is the beginning of the end for Apple it is the first product to come from them in a long time that, as far as I can tell, is a reaction to the market rather than them attempting to create one. That’s a very different Apple than the one we’re used to seeing and whilst it isn’t necessarily a bad thing (dominating semi-established markets seems to be their bread and butter) it does make you wonder if their focus has shifted away from market creation. I don’t really know enough to answer that but if you were still wondering what Apple under Tim Cook would look like then you might be seeing the beginnings of an answer here. Whether that’s good or not is an exercise I’ll leave for the reader.
There’s only one thing that I don’t like about my little 60D and that’s the fact that it’s not a full frame camera. For the uninitiated this means that the sensor contained within the camera, the thing that actually records the image, is smaller than the standard 35mm size which was prevalent during the film days. This means that in comparison to its bigger brothers in more serious cameras there are some trade-offs made, most done in the name of reducing cost. Indeed for comparison a full frame camera would be over double the price I paid for my 60D and would actually lack some of the features that I considered useful (like the screen that swings out). The rumour mill has been churning for quite a while that Canon would eventually release an affordable full frame DSLR at this year’s Photokina and the prospect really excited me, even if my 60D is still only months old at this point.
News broke late yesterday that yes, the rumours were true and Canon was releasing a new camera called the EOS 6D, in essence a full frame camera for the masses. The nomenclature would have you believe that it was in fact a full frame upgrade for the 60D, something that was widely rumoured to be the case, but diving into the specifications reveals that it shares a lot more with the 5D lineage than it does with its prosumer cousin. This doesn’t mean the camera is more focused on the professional field, indeed the inclusion of things like WiFi and GPS are usually considered to be consumer features (I’ve had them in my Sony pocket cam for years, for example), but if I’m honest the picture I built up of the new camera in my head doesn’t exactly align with what Canon has revealed and that’s left me somewhat disappointed.
Before I get into that though let me list off the things that are really quite awesome about the 6D. The full frame sensor in a camera that will cost $2099 is pretty damn phenomenal even if that’s still well out of the range of the people buying in the 60D range. It’s actually the cheapest full frame DSLR available (even the Sony fixed lens full frame is $700 more) and that in itself is an achievement worth celebrating. All the benefits of the bigger sensor are a given (better low light performance, crazy ISOs and better resolution) and the addition of WiFi and GPS means that the 6D is definitely one of the most feature packed cameras Canon has ever released. Still it’s the omission of certain features and reduction in others that’s left me wondering if it’s worth me upgrading to it.
For starters there’s the lack of an articulated screen. It sounds like a small thing as there are external monitor solutions that would get me similar functionality but I’ve found that little flip out screen on my 60D so damn useful that it pains me to give it up. The reasons behind its absence are sound though as they want to make the 6D one of their sturdier cameras (it’s fully weather sealed as well) and an articulated screen is arguably working against them in that regard.
There’s also the auto-focus system which only comes with 11 focus points of which only 1 is cross type. This is a pretty significant step down from the 60D and coming from someone who struggled with their 400D’s lackluster autofocus system I can’t really see myself wanting to go back to that. It could very well be fine but on paper it doesn’t make me want to throw my money recklessly in Canon’s direction like I did with all the rumours leading up to this point.
One thing could sway me and that would be if Magic Lantern made its way onto the 6D platform. The amount of features you unlock by running this software is simply incredible and whilst it won’t fix the 2 things that have failed to impress me it would make the 6D much more palatable for me. Considering that the team behind it just managed to get their software working on the ever elusive 7D there’s a good chance of it happening and I’ll have to see how I feel about the 6D after that happens.
Realistically the disappointment I’m feeling is my fault. I broke my rule about avoiding the hype and built up an image of the product that had no basis in reality. When it didn’t match those expectations exactly I was, of course, let down and there’s really nothing Canon could have done to prevent that. Maybe as time goes on the idea of the 6D will grow on me a bit more and then after another red wine filled night you might see another vague tweet that indicates I’ve changed my mind.
Time to restock the wine rack, methinks.
One of my most hotly anticipated games for this year, and I know I’m not alone in this, will be Blizzard’s Diablo III. I can remember the days of the original Diablo, forging my way down into the bowels of the abandoned church and almost leaping out of my chair when the butcher growled “Aaaahhh, fresh meat!” when I grew close to him. I then went online, firing up my 33K modem (yes, that’s all I had back then) and hitting up the then fledgling Battle.Net only to be overwhelmed by other players who gifted me with unimaginable loot. I even went as far as to buy the only official expansion, Hellfire, and play that to its fullest revelling in the extended Diablo universe.
Diablo II was a completely different experience, one that was far more social for me than its predecessor. I can remember many LANs dedicated to simply creating new characters and seeing how far we could get with them before we got bored. The captivation was turned up to a whole new level however with many of us running dungeons continuously in order to get that last set item or hoping for that extremely rare drop. The expansion pack served to keep us playing for many years after the game’s release and I still have friends telling me of how they’ve spun it back up again just for the sheer thrill of it.
Amongst all this is one constant: the torturous strain that we put on our poor computer mice. The Diablo series can be played almost entirely using the mouse thanks to the way the game was designed, although you do still need the keyboard, especially at higher difficulties. In that regard it seemed like the Diablo series was destined for PC and PC only forevermore. Indeed even though Blizzard had experimented with the wild idea of putting StarCraft on the Nintendo 64 they did not attempt the same thing with the Diablo series. That is, up until now.
Today there are multiple sources reporting that Diablo III will indeed be coming to consoles. As Kotaku points out the writing has been on the wall for quite some time about this but today is the day when everyone has started to pay attention to the idea. Now I don’t think there’s anything about the Diablo gameplay that would prevent it from being good on a console, as opposed to StarCraft (which would be unplayable, as is any RTS on a console). Indeed the simple interface of Diablo’s past would easily lend itself well to the limited input space of the controller with few UI changes needed. What concerns most people though is the possibility that Diablo III could become consolized, ruining the experience for PC gamers.
Considering that we’ve already got a beta version of Diablo III on PC it’s a safe bet that the primary platform will be the PC. Blizzard also has a staunch commitment to not launching games until they’re done and you can bet that if there were any hints of consolization in one of their flagship titles it’d be picked up in beta testing long before it became a retail product. Diablo III coming to consoles is a sign of the times: PC gaming is still somewhat of a minority and even titles that have their roots firmly in the PC platform still need to consider a cross platform release.
Does this mean I’ll play Diablo III on one of my consoles? I must say that I’m definitely curious but I’ve already put in my pre-order for the collector’s edition of Diablo III on the PC. Due to the tie in with Battle.Net it’s entirely possible that buying it on one platform will gain you access to another via a digital download (something Blizzard has embraced wholeheartedly) and I can definitely see myself trying it out just for comparison. For me though the PC platform will always be my primary means by which I game and I can’t deny my mouse the torturous joy that comes from a good old fashioned Diablo session.
My post last week about the trials and tribulations of sorting one’s media collection struck a chord with a lot of my friends. Like me they’d been doing this sort of thing for decades and the fact that none of us had any kind of sense to our sorting systems (apart from the common thread of “just leave it where it lies”) came as something of a surprise. I mean just taking the desk I’m sitting at right now as an example, it’s clear of everything bar computer equipment and the stuff I bring in with me every day. The fact that this kind of organization doesn’t extend to our file systems means that we either simply don’t care enough or that it’s just too bothersome to get things sorted. Whilst I can’t change the former I decided I could do something about the latter.
So with my quest last week proving fruitless I set about developing a program that could sort media based on a couple of cues derived from the files themselves. Now for the most part media files have a few clues as to what they actually are. For the more organized of us the top level folder will contain the series name but since mine was all over the place I figured it couldn’t be trusted. Instead I figured that the file name would be semi-reliable based on a cursory glance at my media folder, as most of them were single strings delimited with only a few characters. Additionally the identifier for season and episode number is usually pretty standard (S01E01, 2×01, 1008, etc.) so pulling the season and episode out of them would be relatively easy. What I was missing was something to verify that I was looking in the right place and that’s where TheTVDB comes in.
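Those identifiers are regular enough that a couple of regular expressions cover the common formats. Here’s a quick sketch in Python (Sortilio itself is C#, and these exact patterns are my assumption rather than the tool’s actual code):

```python
import re

# Common season/episode formats seen in media file names: S01E01, 2x01, etc.
# (the bare "1008" style identifier is too ambiguous, so it's left out here)
PATTERNS = [
    re.compile(r"[Ss](\d{1,2})[Ee](\d{1,2})"),  # S01E01
    re.compile(r"\b(\d{1,2})x(\d{1,2})\b"),     # 2x01
]

def find_episode_id(file_name):
    """Return (season, episode) if the name contains a known identifier."""
    for pattern in PATTERNS:
        match = pattern.search(file_name)
        if match:
            return int(match.group(1)), int(match.group(2))
    return None

print(find_episode_id("Some.Show.S02E05.720p.mkv"))  # (2, 5)
```

The word boundaries on the `2x01` pattern stop it from firing on resolution strings like `1280x720`, which is exactly the sort of false positive that makes these file names annoying to parse.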
The TV Database is like IMDB for TV shows except that it’s all community driven. Also unlike IMDB they have a really nice API that someone has wrapped up in a nice C# library that I could just import straight into my project. What I use this for is a kind of fuzzy matching filter for TV show names so that I can generate a folder with the correct name. At this point I could also probably rename the files with the right name (if I was so inclined) but in the interests of keeping the tool simple I opted not to, for now. With that under my belt I started on the really hard stuff: figuring out how to sort the damn files.
Now I could have cracked open the source of some other renaming programs to see how they did it but I figured out a half decent process after pondering the idea for a short while. It’s a multi-stage process that makes a few assumptions but seems to work well for my test data. First I take the file name and split it up based on common delimiters used in media files. Then I build up a search string using those broken up names stopping when I hit a string that matches a season/episode identifier. I then add that into a list of search terms to query for later, checking first to see if it’s already added. If it’s already in there I then add the file path into another list for that specific search term, so that I know that all files under that search term belong to the same series. Finally I create the new file location string and then present this all to the user, which ends up looking like this:
The view you see here is just a straight up data table of the list of files that Sortilio has found and identified as media (basically anything with the extension .avi or .mkv currently) and the confidence level it has in its ability to sort said media. Green means that the search for the series name found only one match, so it’s a pretty good assumption that it’s got it right. Yellow means that when I was doing a search for that particular title I got multiple responses back from TheTVDB so the confidence in the result is a little lower. Right now all I do is take the first response and use that for verification, which has served me well with the test data, but I can easily see how that could go wrong. Red means I couldn’t find any match at all (you can see what terms I was searching for in the debug log) and everything marked like that will end up in one giant “Unsorted” folder for manual processing. Once you hit the sort button it will perform the move operations and, suffice it to say, it works pretty darn well:
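The multi-stage grouping and the confidence colours described above can be sketched roughly like this. This is a simplified Python illustration under my own assumptions (the delimiter set, identifier pattern and function names are mine, not Sortilio’s actual C#, and the real tool gets its match counts from TheTVDB):

```python
import os
import re
from collections import defaultdict

# Assumed common delimiters and season/episode identifiers (S01E01, 2x01).
DELIMITERS = re.compile(r"[.\-_ ]+")
EPISODE_ID = re.compile(r"^(?:[Ss]\d{1,2}[Ee]\d{1,2}|\d{1,2}x\d{1,2})$")

def build_search_terms(file_paths):
    """Group files by the series search term derived from each file name."""
    groups = defaultdict(list)  # search term -> files in the same series
    for path in file_paths:
        name = os.path.splitext(os.path.basename(path))[0]
        words = []
        for token in DELIMITERS.split(name):
            if EPISODE_ID.match(token):
                break  # stop building the term at the episode identifier
            words.append(token)
        groups[" ".join(words).lower()].append(path)
    return groups

def confidence(match_count):
    """Map the number of series matches from the lookup to a colour."""
    if match_count == 1:
        return "green"   # single match: safe to sort automatically
    if match_count > 1:
        return "yellow"  # ambiguous: first result used, may be wrong
    return "red"         # no match: destined for the Unsorted folder

groups = build_search_terms([
    "/media/Some.Show.S01E01.mkv",
    "/media/Some.Show.S01E02.mkv",
    "/media/Other_Show_2x05.avi",
])
# Two search terms; the first two files are grouped under the same series.
```

Each search term is then queried once against the series database rather than once per file, which keeps the lookup count down for large collections.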
Of course it’s your standard hacked-together-over-the-weekend type deal with a lot of not-quite-necessary but really nice to have features left out. For starters there’s no way to tell it that a file belongs to a certain series (like if something is misspelled) or, if it picks the wrong series, to tell it to pick another. Eventually I’m planning to make it so you can click on the items and change the series, along with a nice dialog box to search for new ones should it not get it right. This means you might want to do this on a small subset of your media each time (another thing I can code in) as otherwise you might get files ending up in strange folders.
Also lacking is any kind of options page where you can specify things like other extensions, regular expressions for season/episode matching and a whole host of other preferences that are currently hard coded in. These things are nice to have but take forever to get right so they’ll eventually make their way into another revision, but for now you’re stuck with the way I think things should be done. Granted I believe they’ll work for the majority of people out there, but I won’t blame you if you wait for the next release.
Finally the code will eventually be open sourced once I get it to a point where I’m not so embarrassed by it. If you really want to know what I did in the 400-odd lines that constitute this program then shoot me an email/twitter and I’ll send the source code to you. Realistically any half decent programmer could come up with this in half the amount of time I did so I can’t imagine anyone will need it yet, unless you really need to save 3 hours.
So without further ado, Sortilio can be had here. Download it, unleash it on your media files and let me know how it works for you. Comments, questions, bugs and feature requests can be left here as a comment, an @ message on Twitter or you can email me on email@example.com.
You’d think that since I invested so heavily in Silverlight when I was developing Lobaco that I would’ve been more outraged at the prospect of Microsoft killing off Silverlight as a product. Long time readers will know that I’m anything but worried about Silverlight going away, especially considering that the release of the WinRT framework takes all those skills I learnt during that time and transitions them into the next generation of Windows platforms. In fact I’d say investing in Silverlight was one of the best decisions at the time as not only did I learn XAML (which powers WPF and WinRT applications) but I also did extensive web programming, something I had barely touched before.
Rumours started circulating recently saying that Microsoft has no plans to develop another version of the Silverlight plugin past the soon-to-be-released version 5. This hasn’t been confirmed or denied by Microsoft yet, but there are several articles citing sources familiar with the matter saying that the rumour is true and Silverlight will receive no attention past this final iteration. This has of course spurred further outrage at Microsoft for killing off technologies that developers have heavily invested in, and whilst in the past I’ve been sympathetic to them, this time around I don’t believe they have a leg to stand on.
All of Microsoft’s platforms are so heavily intertwined with each other that it’s really hard to be just a Silverlight/WPF/ASP.NET/MFC developer without a lot of crossover into other technologies. Hell, apart from the rudimentary stuff I learnt whilst in university, I was able to teach myself all of those technologies in the space of a week or two without many hassles. Compare that with my month-long struggle to learn basic Objective-C (which took me a good couple of months afterwards to get proficient in) and you can see why I think that any developer whining about Silverlight going away is being incredibly short-sighted or just straight up lazy.
In the greater world of IT you’re doomed to fade into irrelevance if you don’t keep pace with the latest technologies, and developers are no exception. Whilst I can understand the frustration of losing the platform you may have patronized for the past 4 years, I can’t sympathize with an unwillingness to adapt to a changing market. The Windows platform is by far one of the most developer-friendly, and the skills you learn in any Microsoft technology will flow on to other Microsoft products, especially if you’re proficient in any C-based language. So whilst Microsoft might not see a future with Silverlight that doesn’t mean the developers are left high and dry; in fact they’re probably in the best position to innovate out of this situation.
Make no mistake, in the world of gaming PCs are far from being the top platform. The reasoning behind this is simple: consoles are easier and have a much longer life than your traditional PC, making them a far more attractive platform for gamers and developers alike. This has led to the consolization of the PC games market, ensuring that many games are developed primarily for the console first and the PC becomes something of a second-class citizen, which did have some benefits (however limited they might be). The platform is far from forgotten however, with it still managing to capture a very respectable share of the games market and still remaining the platform of choice for many eSports titles.
The PC games market has been no slouch though, with digital sales powering the market to all-time highs. Despite that, the PC still remains a relative niche compared to other platforms, routinely seeing market share in the single-digit percentages. There were signs that it was growing, but it still seemed like the PC was to be forever relegated to the back seat. There’s speculation however that the PC is looking to make a comeback and could possibly even dominate consoles by 2014:
As of 2008, boxed copies of games had paltry sales compared to digital sales, and nothing at all looks to change. During 2011, nearly $15 billion is going to be attributed to digital sales while $2.5 billion belong to boxed copies. This is a trend I have to admit I am not surprised by. I’ll never purchase another boxed copy if I can help it.
The death of PC gaming has long been a mocking-point of console gamers, but recent trends show that the PC has nothing to stress over. One such trend is free-to-play, where games are inherently free, but support paid services such as purchasing in-game items. This has proven wildly successful, and has even caused the odd MMORPG to get rid of its subscription fee. It’s also caused a lot of games to be developed with the F2P mechanic decided from the get-go.
The research comes out of DFC Intelligence, and NVIDIA is the one who’s been spruiking it as the renaissance of PC gaming. The past couple of years do show a trend for PC game sales to continue growing despite console dominance, but the prediction gets a little hairy when it forecasts a decline in console sales next year, something there doesn’t seem to be any evidence of. The growth in PC sales is also strikingly linear, leading me to believe that it’s heavily speculation based. Still, it’s an interesting notion to toy with, so let’s have a look at what could (and could not) be driving these predictions.
For starters the data does not include mobile platforms like smartphones and tablets, which is good for the sake of comparison as they’re not really on the same level as consoles or PCs. Sure, they’ve also seen explosive growth in the past couple of years, but they’re still a nascent platform for gaming, and drawing conclusions from the small amount of data available would give you wildly different results based purely on your interpretation.
A big driver behind these numbers would be the surge in the number of free-to-play, micro-transaction based games that have been entering the market. Players of these types of games will usually spend over and above what they would on a similar game with a one-off cost. As time goes on there will be more of these kinds of titles appealing to a wider gamer audience, thereby increasing the revenue of PC games considerably. Long-time gamers like me might not like having to fork out for parts of the game, but you’d be hard pressed to argue that it isn’t a successful business model.
Another factor could be that the current console generation is getting somewhat long in the tooth. The Xbox 360 and PlayStation 3 were both launched some 5 to 6 years ago, and whilst the hardware has performed admirably in the past, the disparity between what PCs and consoles are capable of is hard to ignore. With neither Microsoft nor Sony mentioning any details on the upcoming successors to the current generation (nor whether they’re actually working on them), this could see some gamers abandon their consoles for the more capable PC platform. Considering even your run-of-the-mill PC is now capable of playing games beyond the console level, it wouldn’t be surprising to see gamers make the change.
What sales figures don’t tell us however is what the platform of choice will be for developers to release on. Whilst the PC industry as a whole might be more profitable than consoles that doesn’t necessarily mean it will be more profitable for everyone. Indeed titles like Call of Duty and Battlefield have found their homes firmly on the console market with PCs being the niche. The opposite is true for many of the online free to play games that have yet to make a successful transition onto the console platform. It’s quite possible that these sales figures will just mean an increase in a particular section of the PC market while the rest remain the same.
Honestly though I don’t think it really matters either way, as game developers have now shown that it’s entirely possible to have a multi-platform release that doesn’t make any compromises. Consolization then will just be a blip in the long history of gaming, a relic of the past that we won’t see repeated. The dominant platform of the day will come and go as it has done throughout the history of gaming, but what really matters is the experience each of them can provide. As it’s looking right now all of them are equally capable when placed in the hands of good developers, and whilst these sales projections predict the return of the PC as the king platform, in the end it’ll be nothing more than bragging rights for us long-time gamers.
Google+ has only been around for a mere 2 months, yet I already feel like writing about it is old hat. In the short time that the social networking service has been around it’s had a positive debut to the early adopter market, seen wild user growth and even had to tackle some hard issues like its user name policy and user engagement. I said very early on that Google had a major battle on their hands when they decided to launch another volley at another Silicon Valley giant, but early indicators were pointing towards them at least being a highly successful niche product, if only for the fact that they were simply “Facebook that wasn’t Facebook”.
One of the things that was always lacking from the service was an API on the same level as its competitors’. Facebook and Twitter both have exceptional APIs that allow services to deeply integrate with them and, at least in the case of Twitter, are responsible in large part for their success. Google was adamant that an API was on the way, and just under a week ago they delivered on their promise, releasing an API for Google+:
Developers have been waiting since late June for Google to release their API to the public. Well, today is that day. Just a few minutes ago Chris Chabot, from Google+ Developer Relations, announced that the Google+ API is now available to the public. The potential for this is huge, and will likely set Google+ on a more direct path towards social networking greatness. We should see an explosion of new applications and websites emerge in the Google+ community as developers innovate and make useful tools from the available API. The Google+ API at present provides read-only access to public data posted on Google+, and most of it follows a RESTful design, which means that you must use standard HTTP techniques to get and manipulate resources.
Like all their APIs the Google+ one is very well documented, and the majority of their client libraries have already been updated to include it. Looking over the documentation it appears that there are really only two bits of information available to developers at this point in time: public profiles (People) and activities that are public. Supporting these APIs is the OAuth framework, so that users can authorize external applications to access their data on Google+. In essence this is a read-only API for things that were already publicly accessible, which really only serves to eliminate the need to screen scrape the same data.
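To give you an idea of how simple it is, pulling a user’s public activity stream is just an HTTP GET against a documented endpoint. Here’s a rough Python sketch (the endpoint path is my reading of the docs; the API key value and helper names are placeholders of my own):

```python
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder: you'd register for a real key first
BASE = "https://www.googleapis.com/plus/v1"

def build_url(resource):
    """Build a Google+ API URL for a public, read-only resource."""
    return "{0}/{1}?key={2}".format(BASE, resource, API_KEY)

def get_public_activities(user_id):
    """Fetch a user's public activity stream as parsed JSON."""
    url = build_url("people/{0}/activities/public".format(user_id))
    with urllib.request.urlopen(url) as response:
        return json.loads(response.read().decode("utf-8"))
```

That’s the whole trick: plain HTTPS and JSON, no SDK strictly required, which is also why the read-only limitation is so apparent.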
I’ll be honest, I’m disappointed in this API. Whilst there are some useful things you can do with this data (like syndicating Google+ posts to other services and reader clients), the things that I believe Google+ would be great at doing aren’t possible until applications can be given write access to my stream. Now this might just be my particular use case, since I usually use Twitter for my brief broadcasts (which are auto-syndicated to Facebook) and this blog for longer prose (which is auto-shared to Twitter), so my preferred method of integration would be to have Twitter post stuff to my Google+ feed. As it is right now my Google+ account is a ghost town compared to my other social networks, simply because of the lack of automated syndication.
Of course I understand that this isn’t the final API, but even as a first attempt it feels a little weak.
Whilst I won’t go as far as to say that Google+ is dying, there is data to suggest that the early adopter buzz is starting to wind down. Anecdotally my feed seems to mirror this trend, with the average time between posts being days rather than the minutes it is on my other social networks. The API could be the catalyst required to bring that activity back up to those initial levels, but I don’t think it’s capable of doing so in its current form. I’m sure that Google won’t be a slouch when it comes to releasing new APIs, but they’re going to have to be quick about it if they want to stem the tide of inactivity.
I really want to use Google+, I really do; it’s just the lack of interoperability that keeps all my data out of it. I’m sure in the next couple of months we’ll see the release of a more complete API that will enable me to use the service as I, and many others I feel, use our other social networking services.