The last two years have seen a major shake-up in the personal computing industry. Whilst I’m loath to admit it, Apple was the one leading the charge here, redefining the smartphone space and changing the way many people did the majority of their computing by creating the wildly successful niche of curated computing (read: tablets). It is then inevitable that many subsequent innovations from rival companies are seen as reactions to Apple’s advances, even if the steps those companies are taking are towards a much larger and broader goal than competing in the same market.
I am, of course, referring to Microsoft’s Windows 8, which was recently demoed.
There’s been quite a bit of news about the upcoming release of Windows 8, with many leaked screenshots and even leaked builds giving us a lot of insight into what we can expect of the next version of Windows. For the most part the updates didn’t seem like anything revolutionary, although things like portable desktops and a more integrated web experience were looking pretty slick. Still, Windows 7 was far from revolutionary either, but the evolution from Vista was more than enough to convince people that Microsoft was back on the right track, and the adoption rates reflect that.
However, the biggest shift coming with Windows 8 was known long before it was demoed: Windows 8 will run on ARM and other System on a Chip (SoC) devices. It’s a massive deviation from Microsoft’s current platform, which is wholly x86/x86-64 based, and it confirms Microsoft’s intention to bring the full Windows experience to tablets and other low-power/portable devices. The recent demo of the new operating system confirmed this, with Windows 8 having both the traditional desktop interface that we’re all familiar with and a more finger-friendly version that takes all of its design cues from the Metro interface seen on all Windows Phone 7 devices.
Looking at all these changes you can’t help but think they were done in reaction to Apple’s dominance of the tablet space with the iPad. It’s true that a lot of the innovations in Windows 8 mirror what Apple has achieved in the past year or so, but since Windows 8 has been in development for much longer than that, not all of them can be attributed to Microsoft playing the me-too game. Realistically it’s far more likely that many of these innovations are Microsoft’s first serious attempt at realizing their three screens vision, and many of the changes in Windows 8 support this idea.
A lot of critics think the idea of bringing a desktop OS to a tablet form factor is doomed to failure. The evidence to support that view is strong too, since Windows 7 tablets (and those running any other desktop OS, for that matter) haven’t enjoyed even a fraction of the success that the dedicated tablet OSes have. However I don’t believe that Microsoft is simply making a play for the tablet market with Windows 8; what they’re really doing is providing a framework for building user experiences that remain consistent across platforms. The idea of being able to complete any task whether you’re on your phone, TV or dedicated computing device (which can be a tablet) is what is driving Microsoft to develop Windows 8 the way they are. Windows Phone 7 was their first step into this arena, its UI has been widely praised for its usability and design, and Microsoft’s commitment to using it in Windows 8 shows that they are trying to blur the lines that currently exist between the three screens. The potential for .NET applications to run on x86, ARM and other SoC platforms seals the deal; there is little doubt that Microsoft is working towards a ubiquitous computing platform.
Microsoft’s execution of this plan is going to be vital for their continued success. Whilst they still dominate the desktop market, it’s being ever so slowly eroded away by the bevy of curated computing platforms that do everything users need them to do and nothing more. We’re still a long way from everyone outright replacing their PCs with tablets and smartphones, but the writing is on the wall for a sea change in the way we all do our computing. Windows 8 is shaping up to be Microsoft’s way of re-establishing themselves as the tech giant to beat, and I’m sure the next year is going to be extremely interesting for fans and foes alike.
There’s always risk in innovation. When you’re creating a new product or service there’s always the chance that nobody will want what you’re creating. Similarly, whatever you end up creating could very well grate against the current norms in such a way that your product is almost wholly rejected by those it’s aimed at. A great example of this, which I covered in the past, was Windows Vista. In order for Microsoft to move ahead into the future they had to break away from some of their old paradigms, and this drew the ire of many of their loyal customers. The damage that was done there is still being felt today in slower adoption rates of their latest product, but had it not been for this initial failure they may not be enjoying the level of success that Windows 7 has today.
In fact many pioneering products and services initially faced a dismal (albeit mostly profitable) reception. Steam was a great example of this: debuting back when broadband penetration in many countries wasn’t particularly great, it sought to deliver all games digitally, direct to the consumer. Couple this with the fact that they were cutting out the publishers and distributors in the process, and the guys at Valve faced an extremely long, uphill battle for their platform to gain dominance. Still, three years later they started to get big titles releasing on their platform and the rest, as they say, is history.
Interestingly enough I began to notice similar things happening with the PlayStation Portable. Whilst the next version of the handheld, the NGP, is not going to be a digital-download-only device, Sony has recently said that all games will be available digitally, with only the bigger titles coming to the physical world:
“One thing we learnt from PSP is that we want to have simultaneous delivery in digital and physical for NGP. Just to clarify that, all games that appear physically will be made available digitally,” said House. He added, “Not necessarily all games have to be made available physically. And having the option of a digital-only method affords more creative risk-taking, and that’s because you don’t have that in-built risk of physical inventory.”
Those who follow Sony will be aware of the dismal failure that was the PSP Go. Debuting at an insanely high price (just a hair below a full PS3) whilst offering little in the way of improvements, the PSP Go was never going to be a phenomenal success. However it was particularly hampered by its lack of compatibility with its current-gen brethren, doing away with the UMD drive in favor of a fully digital distribution model. This annoyed PSP customers to no end because their current collection of games could not be migrated onto the new platform (other than through nefarious means). Looking at the NGP there’s still no way to get UMD games onto it, but since most people are already aware that their current UMD titles will not transition to the new platform, Sony has avoided doing the same amount of damage to their next-generation handheld as they did to the PSP Go.
Failure teaches you where you went wrong and where you should be heading in order to avoid making such mistakes again. Many successful products have been built on the backs of dismal failures; just look at satellite phones and radio, for example. Sometimes it requires a risk taker to pave the way forward for those who will profit from the endeavor, and hopefully that risk taker gets some of the kudos down the line. Digital distribution is one such area where the path has already been beaten, and even some of the pioneers are continuing to profit from it.
My mum isn’t the most technical person around. Sure, she’s lived with my computer-savvy father for the better part of 3 decades, but that still doesn’t stop her from griping about new versions of software being confusing or stupid, much like any regular user would. Last night I found out that her work had just switched over to Windows 7 (something I’ve yet to see at any office of mine, sigh) and Office 2010. Having come from XP and Office 2003 she lamented the new layout of everything and how it was impossible to get tasks done. I put forth that it was a fantastic change and that whilst she might fight it now she’ll eventually come around.
I didn’t do too well at convincing her of that, though.
You see, when I first saw Vista I was appreciative of the eye candy and various other tweaks, but I was a bit miffed that things had been jumbled around for seemingly no reason. Over time though I came to appreciate the new layout and the built-in augmentations (start menu search is just plain awesome) that helped me do things that used to be quite laborious. Office 2007 was good too, as many of the functions that used to be buried in an endless stream of menu trees were now easily available, and I could create my own ribbon with my most-used functions on it. Most users didn’t see it that way, however, and the ribbon interface received heavy criticism, on par with that leveled at Vista. You’d then think that Microsoft would’ve listened to their users and made Windows 7 and Office 2010 closer to the XP experience, but they didn’t, continuing along the same lines instead.
Why was that?
For all the bellyaching about Vista it was actually a fantastic product underneath. Many of the issues were caused by manufacturers not providing Vista-compatible drivers, magnified by the fact that Vista was the first consumer-level operating system to support 64-bit operation on a general level (XP 64-Bit Edition was meant for Itaniums). Over the years, of course, drivers matured and Vista became quite a capable operating system, although by then the damage had already been done. Still, it laid the groundwork for the success that Windows 7 has enjoyed thus far and that will continue long after the next iteration of Windows is released (more on that another day ;)).
Office 2010, on the other hand, was a different beast. Microsoft routinely consults with customers to find out what kind of features they might be looking for in future products. For the past decade or so, 80% of the most requested features had already been in the product for a while; users just weren’t able to find them. In order to make them more visible Microsoft created the ribbon system, putting nearly all the features less than one click away. Quite a lot of users found this to be quite annoying since they were used to the old way of doing things (and many old shortcuts no longer worked), but in the end it won over many of its critics, as showcased by its return in Office 2010.
What can this experience tell us about users? Whilst they’re a great source of ideas and feedback that you can use to improve your application, sometimes you have to make them sit down and take their medicine so that their problems can go away. Had Microsoft bent to the demands of some of their more vocal users we wouldn’t have products like Windows 7 and Office 2010 that rose from the ashes of their predecessors. Of course many of the changes were initially driven by user feedback, so I’m not saying that their input was completely worthless; more that sometimes in improving a product you’ll end up annoying some of your loyal users, even if the changes are for their benefit.
There’s a phenomenon that many of us IT folks deal with every day but not many outside our industry know about. It goes by many different names but the most apt one is what I refer to as the Qantas Club factor. You see, whilst it’s all well and good to develop solid technology that provides tangible benefits to business, it really doesn’t help if it doesn’t get any visibility with higher-ups (or decision makers, as the vendors call them). The one surefire place to find an executive or someone who can sway the decision-making process is the various flight clubs and lounges that they frequent whilst jet-setting around the world. Any technology that is either present there or in the literature available to them is almost guaranteed to find its way into that decision maker’s organisation.
My own personal experience with this was BlackBerrys. One of the top executives had been on a recent jaunt overseas with a couple of his peers from other organisations. Before boarding the flight they were all discussing their various exploits when the other two pulled out their BlackBerrys. Feeling quite inadequate that he didn’t have one of his own, the executive put a request down the line to trial BlackBerrys within our organisation, and no less than 2 weeks later we had 10 of them running rampant in our environment.
Now I wouldn’t have cared that much, since BlackBerrys do enable some people to be more productive than they could be otherwise and I’ll never turn down new kit. However we’d already been trialling our own solution (Exchange ActiveSync) that was not only free but would also run on a wide range of handsets, something that was deemed critical as part of the email-on-phones solution they wanted us to develop. Still, the Qantas Club factor was enough for them to overrule all their previous decisions and push forward with a solution that, whilst completely functional, showed a complete disregard for any kind of practicality or reasoned thought.
The same can be said for the iPad. When it was released I lamented its limited abilities and took a torch to the speculation that it would be a revolution in the online media space. I still stand by those comments, as whilst it has been an unabashed success the revolutions it was meant to spur haven’t even begun to show their heads yet. It has, however, managed to change the landscape of consumer PC devices, effectively creating a new market segment, much like netbooks did before it. Consequently many manufacturers are playing catch-up to try and corner one part of this market, and one of them has the Qantas Club factor executives squarely in its sights.
The product is the Windows 7 Slate from HP.
Now, back when it was announced the Slate had your typical Microsoft vapourware flavour to it. They’re often guilty of announcing a product, usually with specs that border on the edge of reality, that will never see the light of day. It’s done to scare would-be competitors out of the market and unfortunately has a track record of working. When the iPad was a runaway success that couldn’t be killed by this kind of grandstanding, many people thought that HP had killed the Slate completely, opting instead to acquire Palm and create an iPad competitor based on their webOS software. This was all but confirmed when HP registered the trademark PalmPad, as there didn’t seem to be any reason to release 2 competing platforms.
As it turns out, though, instead of pulling one in favour of the other they were in fact working on rebranding the device as an enterprise appliance:
We’ve sensed that something’s been up with the HP Slate for a while now, and it looks like we’ve finally gotten the first solid confirmation that the Windows 7 tablet as unveiled by Steve Ballmer at CES in January won’t hit the consumer market as planned — speaking at the Fortune Brainstorm Tech conference, HP Personal Systems Group VP Todd Bradley just said that the Slate will be “more customer-specific than broadly deployed,” and that it would launch the Microsoft-based tablet “for the enterprise” in the fall. That fits right in with HP telling us the other day that it was in “customer evaluations” as it prepared for the “next steps,” and based on followup comments from Bradley and Palm head Jon Rubinstein, it certainly sounded like the company will focus Windows tablets at the enterprise and develop a variety of webOS devices for the consumer market.
Enterprise in this sense means it will more than likely be running either a fully fledged Windows 7 OS or a Windows Embedded Compact 7 install in order to support all enterprise functions (compliance, software deployment, etc.). Additionally I wouldn’t expect it to be a consumption-focused device like the iPad, purely because the enterprise has little need for that kind of casual computing. I can see them becoming the next execu-toy to have, filling a requirement that didn’t exist before the product became available.
That puts it firmly out of the league of the iPad, for better or worse. Realistically there’s little to no justification for having an iPad in the enterprise, as it’s solely focused on the consumer with no integration with traditional enterprise applications. This is by design. Apple has never been that big in the enterprise and never will be with Steve Jobs at the helm, as he prefers to focus on consumers at the cost of other applications. That’s not a bad thing either, as he’s shown that Apple can be quite a successful consumer electronics company, and it looks like other companies are starting to take notice.
Does this mean I’ll be buying one? Probably not, as it fits into the same requirements model that is aptly filled by a laptop, which I’m currently in the process of buying. It appears though that the demand for a Microsoft alternative to the iPad is strong, but unless you’re willing to shell out enterprise-level dollars for it (read: probably double the iPad) it will be firmly out of your reach. There is a wide range of alternatives of course, including the all-but-confirmed PalmPad, but none of them have drawn the attention that the HP Slate did when it was first announced.
I’m sure I’ll get to have a good play with one of them when one of the executives catches his friend using one before their next trip.
I’ve been in the IT industry professionally for quite some time now, and even longer as an avid enthusiast. I’ve seen so many companies come and go as they evolve with the fast-paced, ever-changing technology world, and that’s led to a great understanding of some of the fundamental rules that don’t seem to change. One such rule is the constant upgrade cycle, i.e. the release of new versions of products on a fairly regular schedule in order to take advantage of the latest developments from other companies. The interesting thing about these cycles is that usually they can’t be too drastic, lest you alienate customers whose expectations of the product will cause a revolt should they not be met.
Take for example some of Microsoft’s products, most notably their desktop operating system, Windows. From Windows 3.1 to Windows 7 there’s a kind of baseline familiarity that users have developed with the products and, for the most part, they’ve remained unchanged for the better part of almost 2 decades. Granted, it would be quite a shock for someone who’d been using Windows 3.1 to move straight to 7, but that’s usually never the case. Instead, most users are ushered along onto the latest product from at most 2 generations previous, usually when they can’t do something that everyone else in their social circle can.
Most major players in the IT world have mastered this idea of product cycles. From Apple to Dell to AMD, you can bet your bottom dollar that they’ll release a new product on a predictable timeline, usually timed perfectly to land right smack in the middle of their competitors’ cycles. If there’s any phenomenon to be held responsible for the IT sector’s almost unbelievably fast pace, it would have to be this culture of ensuring that your company is providing the latest and greatest features and products to its consumers, always making sure that you keep an eye on your competitors. It is basically a massive game of one-upmanship.
Sometimes, however, the product cycle does not go quite as planned.
You see, I’m writing this blog post to you today on what should be considered an absolute dinosaur of the IT world: a Windows XP machine. Released late in 2001, I can remember fondly my first experiences with it, blue screens abounding and most of my hardware behaving in ways I never thought imaginable. A couple of months saw it come good, with the various manufacturers catching up with their drivers and Microsoft patching the more obvious flaws in the system. It was a rocky start for Microsoft’s attempt to bring some of the better parts of their server line to the desktop, but eventually most companies relented and XP found its home as the de facto operating system for the majority of computer users worldwide.
5 years later, Microsoft would attempt to do it all over again.
Now before we dig into Microsoft’s next product cycle, let’s take a moment to think about that last paragraph. Think about where you were 9 years ago and compare it to today; worlds apart, right? Just imagine if I told you that I’d bought a top-of-the-line phone back in 2001 and was still using it today; you’d think I was pretty bonkers, since even a $50 phone today would be better in almost every way. Whilst I’m sure there are people doing such things (my Dad is using a phone from 2004), the simple fact that technology moves so fast means that most products have an effective life of around 2 to 3 years. Windows XP, for some reason, seems to be completely immune to that idea.
Partly that’s down to the long development cycle that plagued its successor, Windows Vista (codenamed Longhorn). Initially planned for release a mere 2 years after XP, it was originally intended to function as a stopgap between XP and the next major release, codenamed Blackcomb. Due to feature creep that saw Longhorn encroach on Blackcomb’s territory, the two finally merged together under the Vista title and the release date slipped by over 3 years. This led to the longest gap between major Windows releases to date, and the market’s reaction was nothing short of devastating.
Windows Vista, for what it’s worth, was not a bad operating system at heart. Like its predecessor it was tasked with supporting legacy systems whilst at the same time trying to innovate in any way it could. Consequently neither part could be done very well, as legacy support inherently holds back innovation, leaving Vista to languish in a kind of no man’s land. Again, like XP before it, Vista attempted to do things in a completely new way, which broke compatibility with numerous bits of hardware and software, further stifling its adoption rates. Overall the industry’s first reactions to Vista were ultimately its death knell, and I never found a workplace that found the idea of switching to it appealing.
Microsoft managed to make the system quite usable in the years following Vista’s initial release. I myself ran it on my personal computer for quite some time, and so did many of my technical friends. Still, the damage was done, and many corporate IT departments decided that XP suited their needs aptly and left it at that. It wasn’t until late last year that Windows 7 made its triumphant debut, hoping to be the knight in shining armor to pull the damsels of corporate IT away from the darkness that was Windows XP.
However, due to the botched cycle of Windows Vista, it was met with almost spiteful disdain. 8 years is a long time to go between refreshing your products, and nearly all IT departments had grown accustomed to things working the XP way. Whilst many recognised that Windows 7 was not Vista (thanks to new and improved eye candy), they still couldn’t fathom the idea that anything but XP was required, and were even more concerned about all those legacy applications they’d developed for their aging XP systems. Thus Microsoft, who really did so many things right with Windows 7, was left trying to market a product to people so entrenched in their habits that Windows 7 was almost set to be Vista all over again. However, Windows 7 is good enough that its adoption rates are almost double Vista’s for the same time period, matching those of Windows XP.
There are a couple of lessons to be learnt from the Windows Vista story. The first is the old developer mantra: release early, release often. Microsoft’s long development cycle for Vista meant that there was already quite a bit of inertia working against it. Whilst it’s quite understandable that something as complicated as an operating system takes time to develop, they knew from the get-go that a long development cycle would harm adoption rates. They fell prey to some of the most common project management mistakes (read: scope creep) and their final product, whilst impressive technologically, was too far away from users’ expectations. The original idea of Longhorn being a stepping stone to Blackcomb was sound, and it was proven succinctly by the success of Windows 7, which inadvertently used Vista as its stepping stone.
It’s always interesting to look back at the history of product releases and to see how customer behaviour influences company decisions. Vista was one of those oddities where the latest and greatest was wholly rejected by the community it set out to serve, and only its rebirth under a new label and shiny facade was enough to win them back. It was also a demonstration of the market power that Microsoft has: a failed product cycle was the opening for many competitors to swoop in, yet as we can see, despite their disdain for the latest offering, Microsoft’s customers remained loyal, even if it was to the wrong product (in Microsoft’s eyes).
I should really update my machine to the new Windows 7 environment they’re offering here…
Ask people on the street about Windows Vista and you’ll usually get the response “Isn’t Vista crap?”, even though many of them have never used it nor have a clue about what you actually get from it. For the past year I’ve been using Vista as my main desktop and really I found it to be quite usable. Sure, there were some things that were obviously changes for change’s sake (where did my up-folder button go?!?!), but overall the new UI was pretty appealing, and once you were past about 2GB of system memory there wasn’t much of a performance difference between it and XP. However, the initial fiasco of it requiring such an exotic system just to run, and its incompatibility with many legacy devices, led to bad perceptions all over the place. It’s something Microsoft hasn’t been able to shake to this day.
Enter the ring Windows 7, Microsoft’s evolutionary step into the next world of operating systems. With a much shorter development time than Vista, this wasn’t going to be the new revolution of the computing world that Vista was supposed to be. No, this plucky little system was supposed to build on Vista’s foundations whilst making the whole user experience much more pleasant and secure. As it turns out, Windows 7 might actually achieve what Vista set out to do.
I’ve been using Windows 7 exclusively as my main computer for the past couple of weeks, and there are a couple of things I’d love to share with you.
First off, the installation of Windows 7 takes a much shorter time than any other OS I’ve installed. From booting up from the disk to a usable system, the install time was just shy of 20 minutes, with most of that spent with me away from the computer. They’ve even gone to the trouble of making the loading screen look pretty, which, while completely useless, is a nice touch.
Boot times have been significantly reduced, including time to usable¹. Vista had a nasty habit of showing the initial loading screen and then a black screen for a while before letting you log in. They’ve bypassed this, and once the initial logo disappears the login screen comes up seconds later.
They’ve redesigned the UI for Windows 7, which I initially groaned at. Most of the time I encounter UI changes, things have been moved around for no good reason (hello Facebook) and it takes me more time to figure out how to do something than to actually do it. However, the Windows 7 UI is a refreshing change from this, with many of the changes being evolutionary steps forward rather than a whole paradigm shift (Ha! Correctly used buzzwords). The augmentations to the start menu are very useful, especially for things like the Remote Desktop Client. If you could navigate your way around Vista you won’t find Windows 7 hard at all. In fact, I think you’ll find it easier.
Windows 7 really does seem to be everything that Vista should have been. It’s fast and very usable, yet different enough from the previous version to really set it apart. However, under its sleek and shiny exterior it really is a revamped Vista at heart, which leads me to my main point for this post.
Vista got blackballed from the first day it was released and unfortunately could never shake the negative press associated with it. Microsoft, in its wisdom, tried to remedy this with the Mojave Experiment, a sneaky little project that gussied up Vista as some new and exciting OS from Microsoft. Whilst this proved Microsoft’s point that Vista was actually a very capable and usable OS, it did not improve the market’s perception of the product. Windows 7, on the other hand, is Microsoft’s next genuine attempt at a new OS, and as much as I’d like to say it’s changed, it’s still Vista underneath.
Sure, there are many changes between the two, and not just in terms of UI. The revised UAC model is a tad more usable, although still fundamentally useless from a security perspective, and there are several other administration tweaks. Device Stage sounds like a great idea, and hopefully the driver writers will step up to the plate to take advantage of it.
Overall I’m very happy with the way Windows 7 is going. I believe more frequent releases of operating systems lead to them being far more in tune with the market, and it will help ease the transition pain we saw with XP to Vista. The great news is that Microsoft is offering Windows 7 as a free upgrade to all Vista users and downgraders, something that will definitely work wonders for its initial adoption. Additionally, most applications that have been reworked for Vista will work with 7, so there should be far fewer compatibility issues moving from XP to 7.
It does beg the question, however: was Vista really the failure it was made out to be, or was it the failure they had to have in order to get everyone in the mindset for a change?
That is an exercise left up to the reader
¹This is the time taken from pressing the power button to actually being able to use the computer. Vista would start up quick but wouldn’t be usable for quite some time.