It’s no secret that I’m something of a fan of Windows 8, though my experience is somewhat biased by my extreme early adopter attitude. I haven’t yet had to support it in a production environment, although I have installed it on the varying levels of hardware I have access to and I’ve yet to struggle with the issues that plagued me with previous Windows releases. The thing is, whilst I’m a firm believer in Windows 8 and the features it brings, I’m of the opinion that it probably won’t see a high level of adoption in the enterprise space as the default desktop OS, but that’s not necessarily a bad thing.
Despite the fact that Windows 7 has been out for a good 4 years at this point, many enterprises are still in the midst of deploying it within their organisation. This is largely due to the initial disaster that Windows Vista was, which caused the vast majority of organisations not to consider it as a possible upgrade to their Windows XP infrastructure. Past SP1, though, Vista was a perfectly usable operating system, and by that point many of the OEMs had caught up with their drivers, which had been the main cause of headaches for Vista users. Still, it seemed the damage was done and Vista never managed to gain the market share it needed, leaving many organisations languishing on XP.
Not only was this bad for Microsoft in terms of sales, it was worse for the organisations that stayed put. Systems that were designed for XP became far more entrenched, and the rework required to make applications Vista compatible was delayed even further. Thus when it finally came time to move operating systems the cost of doing so (in both real terms and the effort required) was quite a lot higher, and the larger the organisation the longer the transition would take. Indeed the organisation I’m currently working for still runs XP (using Netware for directory services, no less) and is only just getting around to rolling out Windows 7 this year due to the sheer number of applications that require remediation.
Whilst Microsoft will likely make good on their promise of delivering updates more often, as they’re doing with the Windows Blue update this year, along with more frequent major releases, organisations are likely still reeling from their Windows 7 transition. Windows 9 is still a way off, with release estimates ranging anywhere from mid-2014 to somewhere in 2015, but that’s around the time when enterprises will be looking to upgrade in order to get the next set of killer features as Windows 7 starts to show its age. It’s entirely possible that with frequent Blue-style updates Windows 8 will become far more attractive to the enterprise before that date, but if history has taught us anything it’s that the disruptive versions of Windows are usually the ones that end up being skipped, and Windows 8 certainly fits that bill.
There’s definitely potential for Windows 8 to make inroads into the enterprise space, as the Surface would seem to be an ideal fit for the enterprise, even if most of its usability comes from the non-Metro side of it. Developing proper Metro applications for Microsoft’s enterprise products would go a long way towards improving its market penetration, and I know that IT admins at large would much prefer to maintain a fleet of Surfaces than a comparable fleet of iDevices. It’s clear that Metro was primarily consumer oriented, but as we know many IT decisions are top-down in nature, and if Microsoft want to get more people on board, providing a better tablet experience to organisational executives could be the in that Windows 8 needs.
Still, after 2 decades of watching Windows releases it won’t come as a surprise if Windows 8 gets passed over in favour of its next generation cousin. What we really need to avoid, though, is another decade of OS stagnation, as Windows 7 has the potential to keep the mentality that developed around XP alive, and that just makes change more painful than it needs to be. With Microsoft committed to more releases, more often, we’re in a good position to avoid this; all that’s needed is for us to continue pushing our organisations in the right direction.
There’s always risk in innovation. When you’re creating a new product or service there’s always the chance that nobody will want what you’re creating. Similarly, whatever you end up creating could very well grate against the current norms in such a way that your product is almost wholly rejected by those it’s aimed at. A great example of this, which I’ve covered in the past, was Windows Vista. In order for Microsoft to move into the future they had to break away from some of their old paradigms, and this drew the ire of many of their loyal customers. The damage done there is still being felt today in slower adoption rates for their latest product, but had it not been for that initial failure they might not be enjoying the level of success that Windows 7 has today.
In fact many pioneering products and services were faced with a dismal (albeit mostly profitable) reception initially. Steam was a great example of this, debuting back when broadband penetration in many countries wasn’t particularly great and seeking to deliver all games digitally, direct to the consumer. Couple that with the fact that they were cutting out the publishers and distributors in the process, and the guys at Valve faced an extremely long, uphill battle for their platform to gain dominance. Still, three years later big titles started releasing on their platform and the rest, as they say, is history.
Interestingly enough I’ve begun to notice similar things happening with the PlayStation Portable. Whilst the next version of the handheld, the NGP, is not going to be a digital-only download device, Sony has recently said that all games will be available digitally, with only the bigger titles coming to the physical world:
“One thing we learnt from PSP is that we want to have simultaneous delivery in digital and physical for NGP. Just to clarify that, all games that appear physically will be made available digitally,” said House. He added, “Not necessarily all games have to be made available physically. And having the option of a digital-only method affords more creative risk-taking, and that’s because you don’t have that in-built risk of physical inventory.”
For those who follow Sony, you’d be aware of the dismal failure that was the PSP Go. Debuting at an insanely high price (costing just a hair below a full PS3) whilst offering little in the way of improvements, the PSP Go was never going to be a phenomenal success. It was particularly hampered, however, by the lack of compatibility with its current gen brethren, doing away with the UMD drive in favour of a fully digital distribution model. This annoyed PSP customers to no end because their current collection of games could not be migrated onto the new platform (other than through nefarious means). Looking at the NGP there’s still no way to get UMD games onto it, but since most people are already aware that their current UMD titles won’t make the transition to the new platform, Sony has avoided doing the same amount of damage to their next generation handheld as they did to the PSP Go.
Failure teaches you where you went wrong and where you should be heading in order to avoid making such mistakes again. Many successful products have been built on the backs of dismal failures; just look at satellite phones and radio, for example. Sometimes it takes a risk taker to pave the way forward for those who will profit from the endeavour, and hopefully that risk taker gets some of the kudos down the line. Digital distribution is one such area where the path has already been beaten, and even some of the pioneers are continuing to profit from it.
I’ve been in the IT industry professionally for quite some time now, and even longer as an avid enthusiast. I’ve seen so many companies come and go as they evolve with the fast paced, ever changing technology world, and that’s led to a good understanding of some of the fundamental rules that don’t seem to change. One such rule is the constant upgrade cycle, i.e. the release of new versions of products on a fairly regular schedule in order to take advantage of the latest developments from other companies. The interesting thing about these cycles is that they usually can’t be too drastic, lest you alienate customers whose expectations of the product will cause a revolt should they not be met.
Take for example some of Microsoft’s products, most notably their desktop operating system, Windows. From Windows 3.1 to Windows 7 there’s a kind of baseline familiarity that users have developed with the products and, for the most part, it has remained unchanged for the better part of 2 decades. Granted it would be quite a shock for someone who’d been using Windows 3.1 to move straight to 7, but that’s usually never the case. Rather, most users are ushered along onto the latest product from at most 2 generations previous, usually when they can’t do something that everyone else in their social circle can.
Most major players in the IT world have mastered this idea of product cycles. From Apple to Dell to AMD, you can bet your bottom dollar that they’ll release a new product on a predictable timeline, usually timed perfectly to land right smack in the middle of their competitors’ cycles. If there’s any phenomenon to be held responsible for the IT sector’s almost unbelievably fast pace of movement it would have to be this culture of ensuring that your company is providing the latest and greatest features and products to its consumers, always keeping an eye on your competitors. It is basically a massive game of one-upmanship.
Sometimes however, the product cycle does not go quite as planned.
You see, I’m writing this blog post to you today on what should be considered an absolute dinosaur of the IT world: a Windows XP machine. Released late in 2001, I can fondly remember my first experiences with it, blue screens abounding and most of my hardware behaving in ways I never thought imaginable. A couple of months saw it come good, with the various manufacturers catching up with their drivers and Microsoft patching the more obvious flaws in the system. It was a rocky start for Microsoft’s attempt to bring some of the better parts of their server line to the desktop, but eventually most companies relented and XP found its home as the de facto operating system for the majority of computer users worldwide.
5 years later, Microsoft would attempt to do it all over again.
Now before we dig into Microsoft’s next product cycle, let’s take a moment to think about that last paragraph. Think about where you were 9 years ago and compare it to today; worlds apart, right? Just imagine if I told you that I’d bought a top of the line phone back in 2001 and was still using it today: you’d think I was pretty bonkers, since even a $50 phone today would be better in almost every way. Whilst I’m sure there are people doing such things (my Dad is using a phone from 2004), the simple fact that technology moves so fast means that most products have an effective life of around 2-3 years. Windows XP, for some reason, seems to be completely immune to that idea.
Partly that’s down to the long development cycle that plagued its successor, Windows Vista (codenamed Longhorn). Initially planned for release a mere 2 years after XP, its original intention was to function as a stop gap between XP and the next major release, codenamed Blackcomb. Due to feature creep that saw Longhorn encroach on Blackcomb’s territory, the two finally merged under the Vista title and the release date slipped by over 3 years. This led to one of the longest gaps between Windows releases to date, and the market’s reaction was nothing short of devastating.
Windows Vista, for what it’s worth, was not a bad operating system at heart. Like its predecessor it was saddled with the job of supporting legacy systems whilst at the same time trying to innovate in any way it could. Consequently neither part could be done very well, as legacy support inherently holds back innovation, leaving Vista to languish in a kind of no man’s land. Again like XP before it, Vista attempted to do things in a completely new way, which broke compatibility with numerous bits of hardware and software, further stifling its adoption rates. Overall the industry’s first reactions to Vista were ultimately its death knell, and I never found a workplace that found the idea of switching to it appealing.
Microsoft managed to make the system quite usable in the years following Vista’s initial release. I ran it on my personal computer for quite some time, and so did many of my technical friends. Still, the damage was done and many corporate IT departments decided that XP suited their needs aptly and left it at that. It wasn’t until late last year that Windows 7 made its triumphant debut, hoping to be the knight in shining armour to pull the damsels of corporate IT away from the darkness that was Windows XP.
However, due to the botched cycle of Windows Vista, it was met with almost spiteful disdain. 8 years is a long time to go between refreshing your products, and nearly all IT departments had grown accustomed to things working the XP way. Whilst many recognised that Windows 7 was not Vista (thanks to new and improved eye candy), they still couldn’t fathom the idea that anything but XP was required and were even more concerned about all those legacy applications they’d developed for their aging XP systems. Thus Microsoft, who really did so many things right with Windows 7, was left trying to market a product to people so entrenched in their habits that Windows 7 was almost set to be Vista all over again. Windows 7, however, is good enough that its adoption rates are almost double Vista’s for the same time period, matching those of Windows XP.
There are a couple of lessons to be learnt from the Windows Vista story. The first is the old developer mantra: release early, release often. Microsoft’s long development cycle for Vista meant there was already quite a bit of inertia working against it. Whilst it’s quite understandable that something as complicated as an operating system takes time to develop, they knew from the get go that a long development cycle would harm adoption rates. They fell prey to some of the most common project management mistakes (read: scope creep) and their final product, whilst impressive technologically, was too far away from users’ current expectations. The original idea of Longhorn being a stepping stone to Blackcomb was sound, as proven succinctly by the success of Windows 7, which inadvertently used Vista as its stepping stone.
It’s always interesting to look back at the history of product releases and to see how customer behaviour influences company decisions. Vista was one of those oddities where the latest and greatest was wholly rejected by the community it set out to serve, and only its rebirth under a new label and shiny facade was enough to win them back. It was also a demonstration of the market power that Microsoft has: a failed product cycle was the in for many competitors to swoop, yet as we can see, despite their disdain for the latest offering, Microsoft’s customers remained loyal, even if it was to the wrong product (in Microsoft’s eyes).
I should really update my machine to the new Windows 7 environment they’re offering here… 😉
Ask people on the street about Windows Vista and you’ll usually get the response “Isn’t Vista crap?”, even though many of them have never used it nor have a clue about what you actually get from it. For the past year I’ve been using Vista as my main desktop and really I found it to be quite usable. Sure, there were some things that were obviously changes for change’s sake (where did my up folder button go?!?!) but overall the new UI was pretty appealing, and once you were past about 2GB of system memory there wasn’t much of a performance difference between it and XP. However, the initial fiasco of it requiring such an exotic system just to run, along with the incompatibility with many legacy devices, led to bad perceptions all over the place; something Microsoft hasn’t been able to shake to this day.
Enter the ring Windows 7, Microsoft’s evolutionary step into the next world of operating systems. With a much shorter development time than Vista, this wasn’t going to be the new revolution of the computing world that Vista was supposed to be. No, this plucky little system was supposed to build on Vista’s foundations whilst making the whole user experience much more pleasant and secure. As it turns out, Windows 7 might actually achieve what Vista set out to do.
I’ve been using Windows 7 exclusively on my main computer for the past couple of weeks, and there are a couple of things I’d love to share with you.
First off, the installation of Windows 7 takes a much shorter time than any other OS I’ve installed. From booting from the disk to a usable system the install time was just shy of 20 minutes, with most of that spent with me away from the computer. They’ve even gone to the trouble of making the loading screen look pretty which, while completely useless, is a nice touch.
Boot times have been significantly reduced, including time to usable¹. Vista had a nasty habit of showing the initial loading screen and then a black screen for a while before letting you login. They’ve bypassed this part and once the initial logo disappears the login screen comes up seconds later.
They’ve redesigned the UI for Windows 7, which I initially groaned at. Most of the time I encounter UI changes, things are moved around for no good reason (hello Facebook) and it takes me more time to figure out how to do something than to actually do it. However, the Windows 7 UI is a refreshing change, with many of the changes being evolutionary steps forward rather than a whole paradigm shift (Ha! Correctly used buzzwords). The augmentations to the start menu are very useful, especially for things like the Remote Desktop client. If you could navigate your way around Vista you won’t find Windows 7 hard at all; in fact, I think you’ll find it easier.
Windows 7 really does seem to be everything that Vista should have been. It’s fast, very usable but different enough from the previous version to really set it apart. However, under its sleek and shiny exterior it really is a revamped Vista at heart, which leads me to my main point for this post.
Vista got blackballed from the day it was released and unfortunately could never shake the negative press associated with it. Microsoft, in its wisdom, tried to remedy this with the Mojave Experiment, a sneaky little project that gussied up Vista as some new and exciting OS from Microsoft. Whilst this proved Microsoft’s point that Vista was actually a very capable and usable OS, it did not improve the market’s perception of the product. Windows 7, on the other hand, is Microsoft’s next genuine attempt at a new OS and, as much as I’d like to say it’s changed, it’s still Vista underneath.
Sure, there are many changes between the two and not just in terms of UI. The revised UAC model is a tad more usable although still fundamentally useless from a security perspective, and there are several other administration tweaks. Device Stage sounds like a great idea, and hopefully the driver writers step up to the plate to take advantage of this.
Overall I’m very happy with the way Windows 7 is going. I believe more frequent releases of operating systems lead to them being far more in tune with the market, and it will help ease the transition pain we saw from XP to Vista. The great news is that Microsoft is offering Windows 7 as a free upgrade to recent Vista purchasers and downgraders, something that will definitely work wonders for its initial adoption. Additionally, most applications that have been reworked for Vista will work with 7, so there should be a lot fewer compatibility issues moving from XP to 7.
It does raise the question, however: was Vista really the failure it was made out to be, or was it the failure they had to have in order to get everyone in the mindset for a change?
That is an exercise left up to the reader 🙂
¹This is the time taken from pressing the power button to actually being able to use the computer. Vista would start up quick but wouldn’t be usable for quite some time.