When you think of Apple, what kind of company do you think they are? Many will answer that they’re a technology company, some a computing company, but there are precious few who recognise them as a hardware company. Whilst they may run large non-hardware enterprises like the App Store and iTunes, these all began their lives as loss-leaders for their respective hardware platforms (the iPhone and the iPod). OSX didn’t start out its life that way; indeed it was long seen as the only competitor to Windows with any significant market share. However it has been fast approaching the same status as its iCompanions for some time now, and the recently announced El Capitan version solidifies that future.
I haven’t covered an OSX version in any detail since I mentioned OSX Lion in passing some 4 years ago now, and for good reason: there’s simply nothing to write about. The Wikipedia entry on OSX versions sums up the differences in just a few lines, and for the most part the improvements with each version come down to new iOS apps being ported over and the vague “under-the-hood” improvements that come with every release. The rhetoric from Apple surrounding the El Capitan release even speaks to this lack of major changes directly, citing things like “Refinements to the Mac Experience” and “Improvements to System Performance” as its key focus. Whilst those kinds of improvements are welcome in any OS release, the fact that the last 6 years haven’t seen much in the way of innovation in the OSX product line is telling of where it’s heading.
The Mountain Lion release of OSX was the first indication that OSX was likely heading towards an iLine style of product, with many iOS features making their way into the operating system. Mavericks continued this with the addition of another two previously iOS-exclusive apps, and Yosemite brought Handoff to bridge across to other iOS devices. El Capitan doesn’t make any specific moves forward in this regard, however it is telling that Apple’s latest flagship computing product, the revamped and razor thin MacBook, is much more comparable to an upscale tablet than it is to an actual laptop. In true Apple fashion it doesn’t really compare with either, attempting to define a new market segment in which they can be the dominant player.
If it wasn’t obvious, what I’m getting at here is that OSX is fast approaching two things: becoming another product in the iOS line and, in terms of being a desktop OS, irrelevance. Apple has done well with their converged ecosystem, achieving a level of unification that every other ecosystem envies, however that strategy is most certainly focused on the iOS line above all else. This is most easily seen in the fact that the innovation happens on iOS and is then ported back to OSX, something I don’t feel Apple would want to continue doing long into the future. Thus it would seem inevitable that OSX will eventually pass the torch to iOS running on a laptop form factor; it’s just a matter of when.
This is not to say it would be a bad thing for the platform, far from it. In terms of general OS level tasks OSX performs more than adequately and has done so for the better part of a decade. What it does mean, however, is that the core adherents who powered Apple’s return from the doldrums all those years ago are becoming a smaller part of Apple’s overall strategy and will thus receive much less love in the future. For Apple this isn’t much of a concern: the margins on PCs (even their premium models) have always been slim when compared to their consumer tech line. However those who have a love for all things OSX might want to start looking at making the transition if an iOS based future isn’t right for them.
The date for the final version of Windows has been set: July 29 of this year.
The announcement comes as a shock to no one; Microsoft had repeatedly committed to making Windows 10 generally available sometime this year. However the timing is far more aggressive than I would have expected. The Windows Insider program was going along well, although the indications were that most of the builds still had a decidedly beta feel to them, with many features still missing. Indeed the latest build was released just three days ago, indicating that a full release was still some time away. Microsoft isn’t one to give soft dates, especially for their flagship OS, so we can take the July 29 date as gospel from here on out.
Since everyone in the Insider program has had their hands on Windows 10 for some time now the list of features likely won’t surprise you, however there were a few things that caught my eye in Microsoft’s announcement post. By the looks of it Office 2016 will be released alongside the new version of Windows, including a new universal app version that’s geared towards touch devices. Considering how clumsy the desktop Office products felt on touch screens this is a welcome addition for tablet and transformer devices, although I’d hazard a guess that the desktop version will still be the preferred one for many. What’s really interesting though is that OneNote and Outlook, long considered staples of the Office suite by many, will now be included in the base version of Windows for free. It’s not as big of an upset as including, say, Word or Excel would be, but it’s still an unexpected move nonetheless.
Many of the decidedly lacklustre default metro apps will get some new life breathed into them with an update to the universal app platform. On the surface this removes their irritating “takes over your entire desktop when launched” behaviour and makes them behave a lot more like traditional apps. Whether or not they’ll be improved to the point of being usable beyond that is something I’ll have to wait and see, although I do have to admit that some of the built in apps (like the PDF reader) were quite useful to have. How well the integration between those apps, the cloud and other devices that can run universal apps works remains to be seen, although I’ve heard positive things about this experience in the past.
It seems that Microsoft has had this date in mind for some time now, as all my home Windows 8.1 installs chirped up with a “Reserve your free Windows 10!” pop up late last night. This is the realisation of the promise Microsoft made back at the start of the year to provide a free Windows 10 upgrade to all current consumer level customers, something I thought would likely be handled through a redemption portal or similar. However, based on the success Microsoft had in getting people to upgrade from 8 to 8.1 with a similar notification, I can see why they’ve taken this approach: it’s far more likely to get people upgrading than a free Windows 10 serial would.
What will be truly interesting to see is if the pattern of adoption continues with major Windows versions. Windows 7, which is now approaching middle age, still remains unchallenged by the previous two upstarts. The barriers to transitioning are now much lower than they once were, however customers have shown that familiarity is something they value above nearly everything else. Windows 10 has all the makings of a Windows version that consumers want but we all know that what people say they want and what they actually want are two different things.
Windows 10 is fast shaping up to be one of the greatest Windows releases, with numerous consumer facing changes and behind the scenes improvements. Whilst Microsoft has been struggling somewhat to deliver on the rapid pace they promised with the Windows Insider program there has been some progress as of late, and a couple of new features have made their way into a leaked build. Technology-wise they might not be revolutionary ideas, indeed a couple of them are simply reapplications of tech they’ve had for years now, but the improvements they bring speak to Microsoft’s larger strategy of trying to reinvent itself. That might sound awfully familiar to those with intimate knowledge of Windows 8 (Windows Blue, anyone?) so it will be interesting to see how this plays out.
First cab off the rank in Windows 10’s new feature set is a greatly reduced footprint, something Windows has copped a lot of flak for in the past. Now this might not sound like a big deal on the surface, drives are always getting bigger these days, however the explosion of tablets and portable devices has brought renewed focus to Windows’ rather large install size on these space constrained devices. A typical Windows 8.1 install can easily consume 20GB which, on devices that have only 64GB worth of space, doesn’t leave a lot for a user’s files. Windows 10 brings a couple of improvements that free up a good chunk of that space and, with them, a couple of cool features.
Windows 10 can now compress system files, saving approximately 2GB on a typical install. The feature isn’t on by default; instead, during the Windows install, the system will be assessed to make sure that compression can happen without impacting the user experience. Whether current generation tablet devices will meet the minimum requirements for this is something I’m a little skeptical about, so it will be interesting to see how often this feature actually gets enabled.
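For the curious, this system-file compression has surfaced in the Windows 10 preview builds under the name “CompactOS”, driven by the long-standing compact.exe tool. Assuming the shipped release keeps the same switches as the previews, you can query or override the installer’s decision yourself from an elevated command prompt:

```shell
REM Ask Windows whether it chose to compress the OS binaries on this machine
compact.exe /CompactOS:query

REM Force compression on, reclaiming roughly 2GB on a typical install
compact.exe /CompactOS:always

REM Undo it if the device proves too slow to decompress files on the fly
compact.exe /CompactOS:never
```

These commands only apply to Windows 10; on 8.1 and earlier compact.exe works on individual files and folders but has no OS-wide mode.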
Additionally Windows 10 does away with the recovery partition on the system drive, which is where most of the size savings come from. Instead of reserving part of the disk to hold a full copy of the Windows 10 install image, which was used for the refresh and repair features, Windows 10 can rebuild itself in place. This comes with the added advantage of keeping all your installed updates, so refreshed PCs don’t need to go through the hassle of downloading them all again. However in the event that you do have to do that they’ve included another great piece of technology that should make updating a new PC in your home a little easier.
Windows 10 will include the option of downloading PC updates via a P2P system, which you can configure to download updates only from your local network or also from PCs on the Internet. It’s essentially an extension of the BranchCache technology that’s been a part of Windows for a while now, but it makes it far more accessible, allowing home users to take advantage of it. If you’re running a Windows home (like I am) this will make downloading updates far less painful and, for those of us who format regularly, help greatly when we need to pull down a bunch of Windows updates again. The Internet enabled feature is mostly for Microsoft’s benefit, as it’ll take some load off their servers, but it should also help out users in regions that don’t have great backhaul to the Windows Update servers.
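Microsoft hasn’t published the protocol details, but the core trick behind any P2P update scheme is the same: split the payload into chunks and hash each one, so a PC can pull chunk bytes from untrusted peers on the LAN while verifying them against a manifest fetched from a trusted source (Windows Update itself). A minimal sketch of that verification idea, with all names purely illustrative rather than anything from Microsoft’s implementation:

```python
import hashlib

CHUNK_SIZE = 1024 * 1024  # 1 MiB chunks; an arbitrary size chosen for illustration


def make_manifest(payload: bytes) -> list[str]:
    """Split an update payload into chunks and record each chunk's SHA-256.

    The manifest comes from a trusted source; the chunk bytes themselves
    can then safely come from any peer, since tampering is detectable."""
    chunks = [payload[i:i + CHUNK_SIZE] for i in range(0, len(payload), CHUNK_SIZE)]
    return [hashlib.sha256(c).hexdigest() for c in chunks]


def verify_chunk(index: int, data: bytes, manifest: list[str]) -> bool:
    """Accept a chunk received from a peer only if it matches the manifest."""
    return hashlib.sha256(data).hexdigest() == manifest[index]


if __name__ == "__main__":
    payload = b"x" * (3 * CHUNK_SIZE + 100)   # stand-in for a ~3MB update
    manifest = make_manifest(payload)

    good = payload[:CHUNK_SIZE]   # chunk 0 exactly as an honest peer sends it
    bad = b"y" * CHUNK_SIZE       # a corrupted or malicious chunk

    print(verify_chunk(0, good, manifest))  # True
    print(verify_chunk(0, bad, manifest))   # False
```

The practical upshot is that restricting the feature to the local network changes only where the bytes come from, not whether they can be trusted, which is presumably why Microsoft is comfortable offering the Internet-wide option at all.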
If Microsoft continues to release features like this for Windows 10 then it definitely has a bright future ahead of it. Features like these might not be the sexiest things to talk about but they address real concerns that have plagued Windows for years. In the end they all amount to one thing: a better experience for the consumer, something Microsoft has fervently increased its focus on as of late. Whether they’ll prove to be the panacea for the ills of Windows 8 remains to be seen, but suffice it to say I’m confident that it’ll line up well.
Microsoft isn’t a company you’d associate with open source. Indeed if you wound back the clock 10 years or so you’d find a company that was outright hostile to the idea, often going to great lengths to ensure open source projects that competed with their offerings would never see the light of day. The Microsoft of today is vastly different, contributing to dozens of open source projects and working hard with partner organisations to develop their presence in the ecosystem. For the most part, however, this has usually been done with a view towards integration with their proprietary products, which isn’t exactly in line with the open source ethos. That may be set to change, however, as Microsoft will be fully open sourcing its .NET framework, the building blocks of all Microsoft applications.
For the uninitiated, Microsoft .NET is a development framework that’s been around since the Windows XP days, exposing a consistent set of capabilities which applications can make use of. Essentially this meant that developing a .NET application let you guarantee it would work on any computer running that framework, something which wasn’t entirely a given before its inception. It has since grown substantially in capability, allowing developers to create some very capable programs using nothing more than the functionality built directly into Windows. Indeed it was so successful in accomplishing its aims that there was already a project, dubbed Mono, working to bring it to non-Windows platforms, and it is with them that Microsoft is seeking to release a full open source implementation of the .NET framework.
Whilst this still falls in line with Microsoft’s open source strategy of “things to get people onto the Microsoft platform” it does open up a lot of opportunities for software to be freed from the Microsoft platform. The .NET framework underpins a lot of applications that run on Windows, some that only run on Windows, and an implementation of that framework on another platform could quickly elevate them to cross platform status. Sure, the work to translate them would still likely be non-trivial, however it’ll be a damn sight easier with a full implementation available, possibly enough to tempt some companies to make the investment.
One particularly exciting application of an open sourced .NET framework is games which, traditionally, have an extremely high opportunity cost when porting between platforms. Whilst not everything about games development on Windows is strictly .NET, there are a lot of .NET based frameworks out there that will be readily portable to new platforms once the open sourcing is complete. I’m not expecting miracles, of course, but it does mean that the future of cross-platform releases is looking a whole bunch brighter than it was just a week ago.
This is probably one of Microsoft’s longest bets in a while, as it’s going to be years before the .NET framework sees any kind of solid adoption among the non-Windows crowd. However this does drastically increase the potential of C# and .NET to become the cross platform framework of choice for developers, especially considering the large .NET developer community that already exists today. It’s going to be an area that many of us will be watching with keen interest, as it’s yet another signal that Microsoft isn’t the company it used to be, and likely never will be again.
I honestly couldn’t tell you how long I’ve been hearing people talk about Apple getting into the smartwatch business. It seemed every time that WWDC or any other Apple event rolled around there’d be another flurry of speculation as to what their wearable would be. Like most rumours details on it were scant and so the Internet, as always, circlejerked itself into a frenzy about a product that might not have even been in development. In the absence of a real product competitors stepped up to the plate and, to their credit, the devices have started to look more compelling. Well today Apple finally announced their Watch and it’s decidedly mediocre.
For starters it makes the same mistake that many smartwatches do: it follows the current design trend for nearly all other smartwatches. Partly this is due to the nature of LCD screens being rectangular, limiting what you can do with them, however for a company like Apple you’d expect them to buck the trend a bit. Instead you’ve got what looks like an Apple-ized version of the Pebble Steel, not entirely unpleasing but at the same time feeling incredibly bland. I guess if you’re a fan of having a shrunken iPhone on your wrist then the style will appeal to you but honestly smartwatches which look like smartwatches are a definite turn off for me and I know I’m not alone in thinking this.
Details as to what’s actually under the hood of this thing are scarce, probably because, unlike most devices Apple announces, you won’t be able to get your hands on this one right away. Instead you’ll be waiting until after March next year, and the starting price is somewhere on the order of $350. That’s towards the premium end of the smartwatch spectrum, something which shouldn’t be entirely unexpected, and could be indicative of the overall quality of the device. Indeed what few details they’ve let slip do seem to indicate it’s got some decent materials science behind it (both in the sapphire screen and the case metals), which should hopefully make it a more durable device.
Feature wise it’s pretty much as you’d expect, sporting the usual array of notifications pushed from your phone alongside a typical array of sensors. Apple did finally make its way into the world of NFC today, both with the Apple Watch and the new iPhone, so you’ll be able to load up your credit card details and use the watch to make payments. Honestly that’s pretty cool, and definitely something I’d like to see other smartwatch manufacturers emulate, although I’m not entirely hopeful that it’ll work anywhere bar the USA. Apple also touts an interface that’s been designed around the smaller screen, but without an actual sample to look over I really couldn’t tell you how good or bad it is.
So all that blather and bluster that preceded this announcement was, surprise, completely overblown and the resulting product really does nothing to stand out in the sea of computerized hand adornments. I’m sure there’s going to be a built in market from current Apple fans but outside that I really can’t see the appeal of the Apple Watch over the numerous other devices. Apple does have a good 6 months or so to tweak the product before release so there’s potential for it to become something before they drop it on the public.
I haven’t been an iPhone user for many years now, my iPhone 3GS sitting disused in the drawer beside me ever since it was replaced, mostly because the alternatives presented by other companies have, in my opinion, outclassed it for a long time. This is not to say that I think everyone should replace their phone with an Xperia Z, that particular phone is definitely not for everyone, as I realise that the iPhone fills a need for many people. Indeed it’s the phone I usually recommend to my less technically inclined friends and family members because I know they’ll have a support system tailored towards them (meaning they’ll bug me less). So whilst today’s announcement of the new models won’t have me opening up my wallet anytime soon it is something I feel I need to be aware of, if only for the small thrill I get from being critical of an Apple product.
So, as many had speculated, Apple announced 2 new iPhones today: the iPhone 5C, which is essentially the entry level model, and the iPhone 5S, which is the top of the line one with all the latest and greatest features. The most interesting difference between the two is the radical difference in design, with the 5C looking more like a kids’ toy with its pastel style colours and the 5S looking distinctly more adult with its muted tones of silver, grey and gold. As expected the 5C is the cheaper of the two, with the base model starting from AUD$739 and the 5S from AUD$869, with prices ramping up steadily depending on how much storage you want.
The 5C is interesting because everyone was expecting a budget iPhone to come out and Apple’s response is clearly not what most people had in mind. Sure it’s the cheapest model of the lot (bar the iPhone 4S) but should you want to upgrade the storage you’re already paying the same amount as the entry level 5S. The differences in features are also pretty minimal, the exceptions being an A6 vs A7 processor, slightly bulkier dimensions, the newfangled fingerprint home button and a slightly better camera. Of course those slight differences are usually enough to push any potential iPhone buyer to the higher end model, so the question then becomes: who is the 5C marketed towards?
It’s certainly not at the low end of the market, as most people were expecting, even though it looks the part with its all plastic finish (which we haven’t seen since I last used an iPhone). It might appeal to those who like those particular colours although realistically I can’t see that being much of a draw card considering you can buy any colour case for $10 these days. Indeed even when you factor in the typical on contract price for a new iPhone (~$200) the difference between an entry level 5C and 5S is so small that most would likely dole out the extra cash just to have the better version, especially considering how visually different they are.
Another thing running against the 5C is that the 5S shares the same dimensions as the original iPhone 5, allowing you to use all your old cases and accessories with it. I know this won’t be a dealbreaker for many but it seems obvious that the 5S is aimed at people coming from the iPhone 5, whereas the 5C doesn’t appear to have any particular market in mind that necessitates its differences. If this was Apple’s attempt to try and claw back some of the market that Android has been happily dominating then I can’t help but feel it’s completely misguided. Then again I lost my desire for Apple products years ago so I might be missing out on what the appeal of a gimped, not-really-budget Apple handset might be.
The iPhone 5S does look like a decent phone, sporting most of the features you’d expect from a current generation smartphone. NFC is still missing which, if I’m honest, isn’t as big of a deal as I used to make it out to be: I’ve now got an NFC phone and I can’t use it for jack, so I don’t count it as a downer anymore. As always though the price compared to a comparable Android handset is a big sore point, with the top of the line model topping out at an incredible AUD$1129. I know Apple is a premium brand but when the price difference between the high and low end is $260 and the only difference is storage you really have to ask if it’s worth it, especially when comparable Android phones will have the same level of features and will be cheaper (my 16GB Xperia Z was $768, for reference).
I will be really interested to see how the 5C pans out as many are billing it as the “budget” iPhone that everyone was after when in truth it’s anything but that. The 5S is your typical product refresh cycle from Apple, bringing in a few new cool things but nothing particularly revolutionary. Of course you should consider everything I’ve said through the eyes of a long time Android user and lover as whilst I’ve owned an iPhone before it’s been so long between drinks that I can barely remember the experience anymore. Still I’m sure at least the 5S will do well in the marketplace as all the flagship Apple phones do.
You know who gets a ton of my money these days? Game publishers. Whilst they might not get the same amount per sale that they used to the amount I pump into the industry per year has rocketed up in direct correlation with my ability to pay. Nearly every game you see reviewed on here is purchased gladly with my own money and I would happily do the same with all forms of entertainment if they provided the same level of service that the games industry does. However my fellow Australian citizens will know the pain that we routinely endure here with delayed releases and high prices, so much so that our Parliament subpoenaed several major tech companies to have them explain themselves.
If I’m honest though I had thought the situation was getting a bit better, that was until I caught wind of this:
I saw the trailer for Cloud Atlas sometime last year and the concept instantly intrigued me. As someone whose formative years were spent idolizing The Matrix I’ve always been a fan of the Wachowskis’ work and so of course their latest movie was of particular interest. Since I’m on the mailing list for my preferred local cinema (Dendy, in case you’re wondering) I simply waited for the email announcing it. For months and months I waited for something to come out, until I started hearing friends talking about how they had seen it already. Curious, I checked my favourite Usenet site and lo and behold it was available, which meant only one thing.
It was available on DVD elsewhere.
That email I was waiting for arrived a couple of days ago, 4 months after the original theatrical release in markets overseas. Now I know it’s not that hard to get a film approved in Australia, nor is it that difficult to get it shipped over here (even if it was shot on film), so what could be the reason for such a long delay? As far as I can tell it’s the distributors holding onto their outdated business models in a digital era, creating artificial scarcity in order to try and bilk more money out of end consumers. I’ve deliberately not seen movies in cinemas in the past due to shenanigans like this and Cloud Atlas is likely going to be the latest entry on my civil disobedience list.
I seriously can’t understand why movie studios continue with behaviour like this which is what drives customers to seek out other, illegitimate means of getting at their content. I am more than happy to pay (and, in the case of things like Cloud Atlas, at a premium) for content like this but I do not want my money going to businesses that fail to adapt their practices to the modern world. Artificial scarcity is right up there with restrictive DRM schemes in my book as they provide absolutely no benefit for the end user and only serve to make the illegitimate product better. Really when we’re hit from all sides with crap like this is it any surprise that we’re a big ole nation o pirates?
A decade ago many of my generation simply lacked the required disposable income in order to support their habits and piracy was the norm. We’ve all grown up now though with many of us having incomes that we could only dream of back then, enough for us to begin paying for the things we want. Indeed many of us are doing that where we’re able to but far too many industries are simply ignoring our spending habits in favour of sticking to their traditional business models. This isn’t sustainable for them and it frustrates me endlessly that we still have to deal with shit like this when it’s been proven that this Internet thing isn’t going away any time soon. So stop this artificial scarcity bullshit, embrace our ideals and I think you’ll find a torrent of new money heading in your direction. Enough so that you’ll wonder why you held such draconian views for so long.
The de facto platform of choice for any gamer used to be the Microsoft Windows based PC, however the last decade has seen that change to some form of console. Today, whilst we’re seeing something of a resurgence in the PC market thanks in part to some good releases this year and ageing console hardware, PCs take somewhere on the order of 5% of the video game market. If we then extrapolate from there using the fact that only about 1~2% of the PC market is Linux (although this number could be higher if restricted to gamers) then you can see why many companies have ignored it for so long; it just doesn’t make financial sense to get into it. However there have been a few recent announcements that show there’s an increasing amount of attention being paid to this ultra-niche, and that makes for some interesting speculation.
Gaming on Linux has always been an exercise in frustration, usually due to the Windows-centric nature of the gaming industry. Back in the day Linux suffered from a lack of good driver support for modern graphics cards and this made it nearly impossible to get games running there at an acceptable level. Once that was sorted out (whether you count binary blobs as “sorted” is up to you) there was still the issue that most games were simply not coded for Linux, leaving their users with very few options. Many chose to run their games through WINE or Cedega, which actually works quite well, especially for popular titles, but many were still left wanting for titles that would run natively. The Humble Indie Bundle has gone a long way towards getting developers working on Linux but it’s still something of a poor cousin to the Windows platform.
Late last year saw Valve open up beta access to Steam on Linux, bringing with it some 50-odd titles to the platform. It came as little surprise that they did this considering they did the same thing with OSX just over 2 years ago, which was undoubtedly a success for them. I haven’t really heard much about it since then, mostly because none of my gamer friends run Linux, but there’s evidence to suggest it’s going pretty well as Valve is making further bets on Linux. As it turns out their upcoming Steam Box will be running some form of Linux under the hood:
Valve’s engineer talked about their labs and that they want to change the “frustrating lack of innovation in the area of computer hardware”. He also mentioned a console launch in 2013 and that it will specifically use Linux and not Windows. Furthermore he said that Valve’s labs will reveal yet another new hardware in 2013, most likely rumored controllers and VR equipment but we can expect some new exciting stuff.
I’ll be honest and say that I really didn’t expect this, even with all the bellyaching people have been doing about Windows 8. You see, whilst 55 titles already on the platform is something to brag about, that’s only 2% of their current catalogue. You could argue that emulation is good enough now that all the titles could be made available through the use of WINE, which is a possibility, but Valve doesn’t offer that option on OSX currently so it’s unlikely to happen. Realistically, unless the current developers already have intentions to do a Linux release, the release of the Steam Box/Steam on Linux isn’t going to be enough to tempt them into it, especially if they’ve already recovered their costs from PC sales.
That being said, all it might take is one industry heavyweight to put their weight behind Linux to start a cascade of others doing the same. As it turns out Blizzard is doing just that, with one of their titles slated for a Linux release some time this year. Blizzard has a long history with cross platform releases, being one of the few companies to do releases for Mac OS decades ago, and they’ve stated many times that they have a Linux World of Warcraft client that they’ve shied away from releasing due to support concerns. Releasing an official client for one of their games on Linux will be their way of verifying whether it’s worth it for them to continue doing so, and should it prove successful it could be the shot in the arm that Linux needs to become a viable platform for games developers to target.
Does this mean that I’ll be switching over? Probably not as I’m a Microsoft guy at heart and I know my current platform too well to just drop it for something else (even though I do have a lot of experience with Linux). I’m very interested to see how the Steam Box is going to be positioned as it being Linux changes the idea I had in my head for it and makes Valve’s previous comments about them all the more intriguing. Whilst 2013 might not be a blockbuster year for Linux gaming it is shaping up to be the turning point where it starts to become viable.
I wasn’t going to write about Apple’s latest release in the iPad Mini and iPad 4 mostly because there wasn’t really anything to write about. The iPad 4 was a bit of a shock considering that the 3 is barely 6 months old and was a pretty significant upgrade over its predecessor so you wouldn’t really think it needed a refresh this early on. The iPad Mini was widely rumoured for a very long time, so much so that blogging about it would feel like I was coming incredibly late to a party that I didn’t really care about in the first place. Thinking about it more though the iPad Mini represents a lot more than just Apple releasing yet another iOS product, it’s a sign of how Apple is no longer in control of the market they created.
Steve Jobs famously said that a tablet smaller than the iPad wouldn’t make any sense, as it’d be too small to compete with regular tablets and too big to compete with smart phones. With Apple’s relatively long development cycle it’s likely that he was aware of the iPad Mini’s development, but I don’t think the idea for its creation came from him. It was easy for him to make judgements from atop the massive tower of iPad sales that he was sitting on at the time, but I don’t think he expected the smaller tablets to be as successful as they were. None of them can match the iPad for total numbers sold yet, but that doesn’t mean there isn’t a niche that Apple was failing to exploit.
It all started with the Kindle Fire just over a year ago. The tablet was squarely aimed at a particular market, one that didn’t want to spend a lot on a tablet device and was happy to accept a lower end device in return. This proved to be wildly popular and as of this month Amazon has shipped over 7 million of the devices, putting it second only to the iPad itself in terms of sales. This in turn drew other companies to the small tablet form factor, with the most notable recent addition being the Google Nexus 7, which as of writing has already sold an estimated 3 million units worldwide. Apple can’t have been ignorant of this and saw that there was a rather large niche that they weren’t exploiting, hence the release of the iPad Mini.
For a company that’s been making and dominating markets for a decade now, the iPad Mini then represents the first product Apple’s created as a reaction to market forces. Whilst we can always point to technology companies that did what Apple did before they entered the market, they’re usually nowhere near as successful. With the small tablet form factor sector, however, there are multiple companies who managed to make quite a killing in this particular space prior to Apple entering. You could argue that Apple still owns the tablet space as a whole (and that’s true, to a point), but when it comes to form factors other than that of the traditional iPad Apple has been absent up until this week, and that’s lost money they’ll never recover.
Comparatively it’s a small slice of the overall tablet pie, of which Apple is still getting the lion’s share. Even though they might’ve lost 10 million potential sales to a niche market they weren’t filling, they still managed to ship 14 million iPads last quarter. Their figures for this quarter might be down on what people were expecting, however with the release of the new iPad and the iPad Mini right before the holiday season it’s very likely that they’ll make up that shortfall without too much trouble. Whether that will translate into dominance of the smaller form factor tablet market is up for debate, and realistically we’ll only know once next quarter’s results come in.
Whilst I don’t believe this is the beginning of the end for Apple it is the first product to come from them in a long time that, as far as I can tell, is a reaction to the market rather than them attempting to create one. That’s a very different Apple than the one we’re used to seeing and whilst it isn’t necessarily a bad thing (dominating semi-established markets seems to be their bread and butter) it does make you wonder if their focus has shifted away from market creation. I don’t really know enough to answer that but if you were still wondering what Apple under Tim Cook would look like then you might be seeing the beginnings of an answer here. Whether that’s good or not is an exercise I’ll leave for the reader.
There’s only one thing that I don’t like about my little 60D and that’s the fact that it’s not a full frame camera. For the uninitiated this means that the sensor contained within the camera, the thing that actually records the image, is smaller than the standard 35mm size which was prevalent during the film days. This means that in comparison to its bigger brothers in more serious cameras there are some trade-offs, most made in the name of reducing cost. Indeed for comparison a full frame camera would be over double the price I paid for my 60D and would actually lack some of the features that I considered useful (like the screen that swings out). The rumour mill has been churning for quite a while that Canon would eventually release an affordable full frame DSLR at this year’s Photokina and the prospect really excited me, even if my 60D is still only months old at this point.
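For those wondering what the practical difference is: Canon’s APS-C sensors (like the 60D’s) are about 1.6 times smaller on each side than full frame, which is why the same lens frames a tighter shot on a crop body. A quick sketch of that arithmetic (the 1.6× crop factor is Canon’s published figure; the function name is just for illustration):

```python
# Canon APS-C bodies such as the 60D have a ~1.6x "crop factor"
# relative to a full frame (35mm) sensor: any lens behaves as if
# it were 1.6x longer in terms of field of view.

CANON_APSC_CROP_FACTOR = 1.6

def equivalent_focal_length(focal_length_mm: float,
                            crop_factor: float = CANON_APSC_CROP_FACTOR) -> float:
    """Full-frame-equivalent focal length for a lens on a crop body."""
    return focal_length_mm * crop_factor

# A 50mm lens on a 60D frames roughly like an 80mm lens on full frame:
print(round(equivalent_focal_length(50), 1))  # 80.0
```

This is part of why full frame bodies are prized for wide-angle and low-light work: the larger sensor both restores the lens’s native field of view and gathers more light per exposure.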
News broke late yesterday that yes, the rumours were true, and Canon was releasing a new camera called the EOS 6D which was in essence a full frame camera for the masses. The nomenclature would have you believe that it was in fact a full frame upgrade for the 60D, something that was widely rumoured to be the case, but diving into the specifications reveals that it shares a lot more with the 5D lineage than it does with its prosumer cousin. This doesn’t mean the camera is more focused on the professional field, indeed the inclusion of things like wifi and GPS are usually considered to be consumer features (I’ve had them in my Sony pocket cam for years, for example), but if I’m honest the picture I built up of the new camera in my head doesn’t exactly align with what Canon has revealed and that’s left me somewhat disappointed.
Before I get into that though let me list off the things that are really quite awesome about the 6D. A full frame sensor in a camera that will cost $2099 is pretty damn phenomenal, even if that’s still well beyond what most 60D buyers would spend. It’s actually the cheapest full frame DSLR available (even the Sony fixed lens full frame is $700 more) and that in itself is an achievement worth celebrating. All the benefits of the bigger sensor are a given (better low light performance, crazy ISOs and better resolution) and the addition of WiFi and GPS means that the 6D is definitely one of the most feature packed cameras Canon has ever released. Still it’s the omission of certain features and reduction in others that’s left me wondering if it’s worth me upgrading to it.
For starters there’s the lack of an articulated screen. It sounds like a small thing, as there are external monitor solutions that would get me similar functionality, but I’ve found that little flip out screen on my 60D so damn useful that it pains me to give it up. The reasons behind its absence are sound though, as Canon wants to make the 6D one of their sturdier cameras (it’s fully weather sealed as well) and an articulated screen arguably works against them in that regard.
There’s also the auto-focus system, which only comes with 11 focus points of which only 1 is cross type. This is a pretty significant step down from the 60D and, as someone who struggled with their 400D’s lackluster autofocus system, I can’t really see myself wanting to go back to that. It could very well be fine, but on paper it doesn’t make me want to throw my money recklessly in Canon’s direction like I did with all the rumours leading up to this point.
One thing could sway me and that would be if Magic Lantern made its way onto the 6D platform. The amount of features you unlock by running this software is simply incredible and whilst it won’t fix the 2 things that have failed to impress me it would make the 6D much more palatable for me. Considering that the team behind it just managed to get their software working on the ever elusive 7D there’s a good chance of it happening, and I’ll have to see how I feel about the 6D after that happens.
Realistically the disappointment I’m feeling is my fault. I broke my rule about avoiding the hype and built up an image of the product that had no basis in reality. When it didn’t match those expectations exactly I was, of course, let down and there’s really nothing Canon could have done to prevent that. Maybe as time goes on the idea of the 6D will grow on me a bit more and then after another red wine filled night you might see another vague tweet that indicates I’ve changed my mind.
Time to restock the wine rack, methinks.