Posts Tagged ‘market share’


Windows XP Finally Meeting its End.

For the longest time, far too long in my opinion, XP had been the beast that couldn’t be slain. The numerous releases of Windows after it never seemed to make much more than a slight dent in its usage stats, and it reigned as the most used operating system worldwide for an astonishing 10 years after its initial release. It finally lost its crown to Windows 7 back in October of 2011 but it still managed to hold on to a market share that dwarfed many of its competitors. Its decline was slow though, much slower than that of an operating system fast approaching end of life should have been. However last quarter saw it shed an amazing 6% in total usage, finally pushing it below the combined usage of Windows 8 and 8.1.

[Chart: StatCounter worldwide operating system market share, August 2011 to October 2014]

The reasons behind this drop are wide and varied but it finally appears that people are starting to take Microsoft’s warnings that their product is no longer supported seriously and are looking to upgrade. Surprisingly though, the vast majority of people transitioning away from the aging operating system aren’t going for Windows 7, they’re going straight to Windows 8.1. This isn’t to say that 8.1 is eating away at 7’s market share however; 7 is up about half a percent in the same time frame. The upgrade path is likely due to the fact that Microsoft has ceased selling OEM copies of Windows 7. Most of those new licenses do come with downgrade rights, though I’m sure few people actually use them.

If XP’s current downward trend continues along this path then it’s likely to hit the low single digits sometime around the middle of next year. On the surface this would appear to be a good thing for Microsoft as it means that the majority of their user base will be on a far more modern platform. At the same time, however, the decline might just be a little too swift for people to consider upgrading to Windows 10, which isn’t expected to hit RTM until late next year. Considering the uptake of Windows 8 and 8.1 this could be something of a concern for Microsoft, although there is another potential avenue: Windows 7 users.
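To put some very rough numbers on that trajectory, here’s a back-of-the-envelope extrapolation in Python. The starting share (~17%) and the flat 6% per quarter decline are my own assumptions based on the figures discussed above rather than exact StatCounter data, and a real decline would almost certainly taper off, so treat it as a sketch rather than a forecast.

```python
# Back-of-the-envelope extrapolation of XP's decline.
# The starting share and per-quarter drop are assumptions drawn from
# the rough figures discussed above, not exact StatCounter data.

start_share = 17.0      # assumed XP share (%) around October 2014
drop_per_quarter = 6.0  # assumed decline per quarter (%), matching last quarter's drop

share = start_share
quarters = 0
while share >= 10.0:    # keep going until we're into single digit territory
    share -= drop_per_quarter
    quarters += 1

print(f"Single digits after roughly {quarters} quarter(s), at about {share:.0f}% share")
```

At a flat 6% per quarter XP lands in the single digits within a couple of quarters of October 2014; allow for the decline to taper off and the middle of next year looks like a reasonable estimate.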

The last time Microsoft had a disastrous release like Windows 8, the next version of Windows to take the majority of the market share was 7, a full decade after XP had originally been released. Whilst it’s easy to argue that this time will be different (like everyone does), a repeat performance of that nature would see Windows 7 remaining the dominant platform all the way up until 2019. Certainly this is something that Microsoft wants to avoid, so it will be interesting to see how fast Windows 10 gets picked up and which segments of Microsoft’s business it will cannibalize. Should it primarily cannibalize Windows 7 then I’d say everything would be rosy for them; however if it’s all Windows 8/8.1 then we could be seeing history repeat itself.

Microsoft is on the cusp of either reinventing itself with Windows 10 or being doomed to forever repeat the cycle which consumers have forced them into. To Microsoft’s credit they have been trying their best to break out of this mould, however it’s hard to argue with the demands of the consumer and there’s only so much they can do before they lose their customers’ faith completely. The next year will be very telling for how the Microsoft of the future will look and how much of history will repeat itself.


How Long Will it Take for Enterprise IT to Embrace Rapid Innovation?

The IT industry has always been one of rapid change and upheaval, with many technology companies only lasting as long as they could keep innovating. This is at odds with the traditional way businesses operated, preferring to stick to predictable cycles and seek gains through incremental improvements in process, procedure and marketing. The result was the traditional 3 to 5 year refresh cycle that many enterprises engaged in, upgrading to the latest available technology usually years after it had been released. However the pace of innovation has increased to the point where such a cycle could leave an organisation multiple generations behind, and it’s not showing any signs of slowing down soon.

[Image: Windows 8.2 start bar]

I mentioned last year how Microsoft’s move from a 3 year development cycle to a yearly one was a good move, allowing them to respond to customer demands much more quickly than they were previously able to. However the issue I’ve come across is that whilst I, as a technologist, love hearing about the new technology, customer readiness for this kind of innovation simply isn’t there. The blame for this lies almost wholly at the feet of XP’s 12 year dominance of the desktop market, a grip which even the threat of losing support did little to loosen. So whilst the majority may have made the transition now, they’re by no means ready for a technology upgrade cycle that happens on a yearly basis. There are several factors at play here (tools, processes and product knowledge being the key ones) but the main issue remains the same: there’s a major disjoint between Microsoft’s current release schedule and its adoption among its biggest customers.

Microsoft, to their credit, are doing their best to foster rapid adoption. Getting Windows 8.1 at home is as easy as downloading an app from the Windows Store and waiting for it to install, something you can easily do overnight if you can’t afford the down time. Similarly the tools available to do deployments on a large scale have improved immensely, something anyone who’s used System Center Configuration Manager 2012 (and its previous incarnations) will attest to. Still, even though the transition from Windows 7 to 8 or above is much lower risk than from XP to 7, most enterprises aren’t looking to make the move, and it’s not just because they don’t like Windows 8.

With Windows 8.2 slated for release sometime in August this year, Windows 8 will take on an almost identical look and feel to that of its predecessors, allowing users to bypass the Metro interface completely and giving them back the beloved start menu. With that in place there’s almost no reason for people not to adopt the latest Microsoft operating system, yet it’s unlikely to see a spike in adoption due to the inertia of large IT operations. Indeed even those that have managed to make the transition to Windows 8 probably won’t be able to make the move until 8.3 makes its debut, or possibly even Windows 9.

Once the Windows 8 family becomes the standard however I can see IT operations looking to move towards a more rapid pace of innovation. The changes between the yearly revisions are much less likely to break or change core functionality, eliminating much of the risk that came with adopting a new operating system (application remediation). Additionally, once IT sections have moved to better tooling, upgrading their desktops should also be a lot easier. I don’t think this will happen for another 3+ years however, as we’re still in the midst of an XP hangover, one that’s not likely to subside until its market share is in the single digits. Past that, we administrators then have the unenviable job of convincing our businesses that engaging in a faster product update cycle is good for them, even when the cost of doing so is low.

As someone who loves working with the latest and greatest from Microsoft it’s an irritating issue for me. I spend countless hours trying to skill myself up only to end up working on 5+ year old technology for the majority of my work. Sure it comes in handy eventually, but the return on investment feels extremely low. It’s my hope that the cloud movement, which has already driven a lot of businesses to look at more modern approaches to the way they do their IT, will be the catalyst by which enterprise IT begins to embrace a more rapid innovation cycle. Until then however I’ll just lament all the Windows Server 2012 R2 training I’m doing and wait until TechEd rolls around again to figure out what’s obsolete.


2013 Might Be Linux’s Year For Gaming.

The de facto platform of choice for any gamer used to be the Microsoft Windows based PC; however, the last decade has seen that change to some form of console. Today, whilst we’re seeing something of a resurgence in the PC market thanks in part to some good releases this year and ageing console hardware, PCs take somewhere on the order of 5% of the video game market. If we then extrapolate from there using the fact that only about 1~2% of the PC market is Linux (although this number could be higher if restricted to gamers) then you can see why many companies have ignored it for so long: it just doesn’t make financial sense to get into it. However there have been a few recent announcements that show there’s an increasing amount of attention being paid to this ultra-niche, and that makes for some interesting speculation.
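To make that financial argument concrete, the back-of-the-envelope maths looks something like the sketch below. The percentages are just the rough figures quoted above (and my own combination of them), so it’s an illustration of the sizing rather than hard market data.

```python
# Rough sizing of the Linux gaming niche using the approximate
# percentages quoted above; illustrative only, not real market data.

pc_share_of_game_market = 0.05            # PCs: roughly 5% of the video game market
linux_share_of_pc_market = (0.01, 0.02)   # Linux: roughly 1-2% of the PC market

low, high = (pc_share_of_game_market * s for s in linux_share_of_pc_market)
print(f"Linux gamers: roughly {low:.2%} to {high:.2%} of the total game market")
```

Somewhere between one in a thousand and one in two thousand potential customers is not a slice most publishers will chase.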

[Image: Tux and Linux distro logos]

Gaming on Linux has always been an exercise in frustration, usually due to the Windows-centric nature of the gaming industry. Back in the day Linux suffered from a lack of good driver support for modern graphics cards and this made it nearly impossible to get games running at an acceptable level. Once that was sorted out (whether you count binary blobs as “sorted” is up to you) there was still the issue that most games were simply not coded for Linux, leaving their users with very few options. Many chose to run their games through WINE or Cedega, which actually works quite well, especially for popular titles, but many were still left wanting for titles that would run natively. The Humble Indie Bundle has gone a long way to getting developers working on Linux but it’s still something of a poor cousin to the Windows platform.

Late last year saw Valve open up beta access to Steam on Linux, bringing with it some 50-odd titles to the platform. It came as little surprise that they did this considering they did the same thing with OSX just over 2 years ago, which was undoubtedly a success for them. I haven’t really heard much on it since then, mostly because none of my gamer friends run Linux, but there’s evidence to suggest that it’s going pretty well as Valve is making further bets on Linux. As it turns out their upcoming Steam Box will be running some form of Linux under the hood:

Valve’s engineer talked about their labs and that they want to change the “frustrating lack of innovation in the area of computer hardware”. He also mentioned a console launch in 2013 and that it will specifically use Linux and not Windows. Furthermore he said that Valve’s labs will reveal yet another new hardware in 2013, most likely rumored controllers and VR equipment but we can expect some new exciting stuff.

I’ll be honest and say that I really didn’t expect this, even with all the bellyaching people have been doing about Windows 8. You see, whilst Valve can brag about 55 titles already being on the platform, that’s only 2% of their current catalogue. You could argue that emulation is good enough now that all the titles could be made available through the use of WINE, which is a possibility, but Valve doesn’t offer that option with OSX currently so it’s unlikely to happen. Realistically, unless developers already have intentions to do a Linux release, the launch of the Steam Box/Steam on Linux isn’t going to be enough to tempt them to do it, especially if they’ve already recovered their costs from PC sales.

That being said, all it might take is one industry heavyweight to put their weight behind Linux to start a cascade of others doing the same. As it turns out Blizzard is doing just that, with one of their titles slated for a Linux release some time this year. Blizzard has a long history with cross platform releases, being one of the few companies to release for Mac OS decades ago, and they’ve stated many times that they have a Linux World of Warcraft client that they’ve shied away from releasing due to support concerns. Releasing an official client for one of their games on Linux will be their way of verifying whether it’s worth it for them to continue doing so, and should it prove successful it could be the shot in the arm that Linux needs to become a viable platform for games developers to target.

Does this mean that I’ll be switching over? Probably not, as I’m a Microsoft guy at heart and I know my current platform too well to just drop it for something else (even though I do have a lot of experience with Linux). I’m very interested to see how the Steam Box is going to be positioned, as it running Linux changes the idea I had in my head for it and makes Valve’s previous comments about it all the more intriguing. Whilst 2013 might not be a blockbuster year for Linux gaming it is shaping up to be the turning point where it starts to become viable.

Unity and Windows Phone 7: Microsoft’s Short Sightedness Damages Their Platform.

Cross platform development is one of those things that I’ve seen done dozens of times before but rarely do I see anyone get it right. I understand the allure of doing so, heck my most creative forms of procrastination came from experimenting with these ideas, but the fact remains that more often than not they’re going to be a total waste of your time. I do have one exception to this rule however and that comes in the form of the cross-platform game engine Unity. Where other libraries promise compatibility and a wide range of functions Unity actually delivers on these with little compromise. Couple that with their ridiculously good pricing model and awesome dev environment and it’s hard to fault the product. Indeed all the shortcomings I found were, as far as I could tell, limitations of my programming expertise.

For games developers Unity offers the chance to have one code base for all platforms, with only minor tweaking required when you want to deploy the project to your chosen platform. This is great because initially you can focus on one platform and then, once you’ve verified your product there, it doesn’t take much to port it to another. It’s no surprise then that you’ll find many Unity based games in both the Apple and Android app stores. Figuring that Unity was going for all round platform domination, I thought it was only a matter of time before the library would support Windows Phone 7, even though that platform is still in its nascent stages.

As it turns out however that won’t be happening:

The Unity engine does not support Windows Phone 7 because of restrictions placed on Microsoft’s mobile, the CEO of Unity Technologies has said.

“But we’re looking at Windows Phone 8 and hopefully it will be easier to work on that system,” he said.

In an interview with Develop, to be published soon, Helgason explained Windows Phone 7 “is a relatively closed system so you can’t run native content, which means we can’t really support it”.

The “closed” nature that David Helgason (CEO of Unity) is referring to is the fact that if you want to put a game on the Windows Phone 7 platform you need to have coded it in either Silverlight or Microsoft’s XNA framework. Unity approached Microsoft to see if they could get an exemption from this rule (as well as access to some private APIs which would be required for their libraries to work), however Microsoft turned them down. This means that Unity and all the games built on top of it are banned from being released on this platform, save for a full rewrite of the code. In response Unity has turned their sights towards Windows 8, which will be a lot more friendly for them thanks to the WinRT framework.

This feels like a pretty big misstep from Microsoft. Windows Phone 7 hasn’t been gaining any momentum and its overall smartphone market share (which includes Windows Mobile devices) has been taking quite the battering, dropping to a low of 1.6%. Whilst I won’t go as far as to say that Unity would be its saving grace, it definitely wouldn’t hurt to have it available as an option for games developers looking to develop for the Windows Phone 7 platform. Indeed since Unity supports coding in C# I’d hazard a guess that there would be quite a few who’d be willing to give it a shot just because it would be easy for them to learn. Heck, I know I did.

In reality Windows Phone 7 has a lot of other hurdles to overcome before it can be considered a serious competitor in the market, but Microsoft can’t afford to throw away any potential advantage it can get. Not working with Unity only serves to damage the Windows Phone 7 platform, as the engine has demonstrated success on every platform it’s available on. Unity developers may then have to wait for Windows 8 and the corresponding Windows Phone release before they can think about coming across onto Microsoft’s platform, but I feel that may be too far off, and the damage has already been done.

Virtualized Smartphones: No Longer a Solution in Search of a Problem.

It was just under 2 years ago that I wrote my first (and only) post on smartphone virtualization, approaching it with the enthusiasm that I do most cool new technologies. At the time I guessed that VMware would eventually look to integrate this idea with some of their other products, in essence turning users’ phones into dumb terminals so that IT administrators could have more control over them. However the exact usefulness was still not clear, as at the time most smartphones were only just capable of running a single instance, let alone another one with all the virtualization trimmings that’d inevitably slow it down. Android was also something of a small time player back then, having only 5% of the market (similar to Windows Phone 7 at the same stage in its life, funnily enough), making this a curiosity more than anything else.

Of course a lot has changed in the time between that post and now. The then market leader, RIM, is now struggling with single digit market share when it used to make up almost half the market. Android has succeeded in becoming the most popular platform, surpassing Apple, who held the crown for many years prior. Smartphones have also become wildly more powerful, with many of them touting dual cores, oodles of RAM and screen resolutions that would make my teenage self green with envy. With all this in mind, the idea of running some kind of virtualized environment on a smartphone doesn’t seem all that ludicrous any more.

Increasingly IT departments are dealing with users who want to integrate their mobile devices with their work space in lieu of using a separate, work specific device. Much of this pressure came initially from the iPhone, with higher ups wondering why they couldn’t use their devices to access work related data. For us admin types the reasons were obvious: it’s an unapproved, untested device which by rights has no business being on the network. However the pressure to capitulate to their demands was usually quite high and work arounds were sought. Over the years these have taken many and varied forms, but the best answer would appear to lie within the world of smartphone virtualization.

VMware have been hard at work creating full blown virtualization systems for Android that allow a user to have a single device containing both their personal handset and a secure, work approved environment. In essence there’s an application that lets them switch between the two, giving the user whatever handset they want whilst still allowing IT administrators to create a standard, secure work environment. Android is currently the only platform that seems to support this fully, thanks to its open source status, although there are rumours of it coming to the iOS line of devices as well.

It doesn’t stop there either. I predicted that VMware would eventually integrate their smartphone virtualization technology into their View product, mostly so that the phones would just end up being dumb terminals. This hasn’t happened exactly, but VMware did go ahead and imbue their View product with the ability to present full blown workstations to tablets and smartphones through a secure virtual machine running on said devices. This means that you could potentially have your entire workforce running off smartphones with docking stations, enabling users to take their work environment with them wherever they want to go. It’s shockingly close to Microsoft’s Three Screens idea, and with Google announcing that Android apps are now portable to Google TV devices you’d be forgiven for thinking that they outright copied it.

For most regular users these kinds of developments don’t mean a whole lot, but they do signal the beginning of the convergence of many disparate experiences into a single unified one. Whilst I’m not going to say that any one platform will eventually kill off the others (each of the three screens has a distinct purpose), we will see a convergence in the capabilities of each platform, enabling users to do all the same tasks no matter which one they are using. Microsoft and VMware are approaching this idea from two very different directions, with the former unifying the development platform and the latter abstracting it away, so it will be interesting to see which approach wins out or if they too eventually converge.

HP, WebOS and the Future of the Tablet Space.

So last Friday saw the announcement that HP was spinning off their WebOS/tablet division, a move that sent waves through the media and blogosphere. Despite being stuck for decent blog material on the day I didn’t feel the story had enough legs to warrant investigation; anyone but the most dedicated of WebOS fans knew that the platform wasn’t going anywhere fast. Heck, it took me all of 30 seconds on Google to find the latest figures that have it pegged at somewhere around 2%, right there with Symbian (those are smartphone figures, not overall mobile), trailing the apparently “failing” Windows Phone 7 platform by a whopping 7%. Thus the announcement that they were going to dump the whole division wasn’t so much of a surprise, and I set about trying to find something more interesting to write about.

Over the weekend though the analysts have got their hands on some juicy details that I can get stuck into.

Now alongside the announcement that WebOS was getting the boot, HP also announced that it was considering exiting the PC hardware business completely. At the moment that would seem like a ludicrous idea as that division is their largest, with almost $10 billion in revenue, but their enterprise services division (which is basically what used to be EDS) is creeping up on that quite quickly. Such a move wouldn’t see them exit the server hardware business either, which would be a rather suicidal move considering they’re the second largest player there with 30% of the market. Rather, it seems like HP wants out of the consumer end of the market and wants to focus on enterprise software, services and the hardware that supports them.

It’s a move that several similar companies have taken in the past when faced with downward trending revenues in the hardware sector. Back when I worked at Unisys I can remember them telling me how they now derive around 70% of their revenue from outsourcing initiatives and only 30% from their mainframe hardware sales. They used to be a mostly hardware oriented company but switched to professional services and outsourcing after several years of negative growth. HP on the other hand doesn’t seem to be suffering any of these problems, which raises the question: why would they bother exiting what seems to be a lucrative market for them?

It was a question I hadn’t really considered until I read this post from John Gruber. Now I’d known that HP had gotten a new CEO after Mark Hurd was ejected over that thing with former Playboy girl Jodie Fisher (and his expense account, but that’s nowhere near as fun to write) but I hadn’t caught up with who they’d hired as his replacement. Turns out it is former SAP CEO Leo Apotheker. Now their decisions to spin off WebOS (and potentially their PC division) make a lot of sense, as that’s the kind of company Apotheker has quite a lot of experience with. Couple that with their decision to buy Autonomy, another enterprise software company, and it seems almost certain that HP is heading towards the end goal of being a primarily services based company.

Of course with HP exiting the consumer market after only being in it for such a short time, people started to wonder if there was ever going to be a serious competitor to Apple’s offerings, especially in the tablet market. Indeed it doesn’t look good for anyone trying to crack into that market as it’s pretty much all Apple all the time, and if a multi-billion dollar company can’t do it then there’s not much hope for anyone else. However Android has made some impressive inroads into this Apple dominated niche, securing a solid 20% of the market. Just as it did with the iPhone before it, no single vendor will come to completely decimate Apple in this space; rather, Android’s dominance will come from the sheer variety on offer. We’ve still yet to see a Galaxy S2-esque release in the Android tablet space but I’m sure one’s not too far off.

It’ll be interesting to see how HP evolves over the next year or so under Apotheker’s leadership, as its current direction is vastly different to that of the HP of the past. This isn’t necessarily a good or bad thing for the company either, as whilst they might not have any cause for concern now, this transition could avoid the pain of attempting it further down the track. The WebOS split-off is just the first step in this long journey for HP and there will be many more steps for them to take if they’re to make the transition to a professional services company.