Do you know anyone who actually uses a laptop on their lap? The name implies a machine you could use anywhere you can find yourself a seat, yet the primary use for most laptops seems to be as a desktop replacement rather than a dedicated mobile computing experience. That was certainly the case for me with many of the work issued laptops I was given (they were used almost exclusively at work) and their form factors usually didn’t lend themselves to being used as their namesake implies. After reviewing the ASUS Zenbook UX32V this week I realised how similar my experiences sounded to those of people who’d bought a tablet, and that got me wondering about the larger portable computing market.
You see, whilst I was pretty sour on the tablet idea for a long time, I have to admit there were situations where I’d find myself thinking it’d be great to have one. Granted these situations were pretty niche for me, usually when I was lounging in front of the TV and wanted to look something up (with my PC not 3 meters away), but nevertheless they kept me looking into tablets. Then after getting the Zenbook I found myself doing those exact, presumably tablet optimized tasks on said device. It was a strange thing to happen as it’s not like I didn’t have a laptop lying around the house before, but something about the form factor and extremely fast boot time of the Zenbook transformed it from just another laptop into a truly portable device I could use anywhere.
In fact, thinking back, I can remember many of my friends who purchased a MacBook Air saying very similar things, and their reaction to the iPad was to question the value they’d get out of it. Taking this idea further, it would seem that whilst there’s a case to be made for having both a laptop and some kind of tablet, should you get the right kind of laptop, namely an ultrabook, you’ll end up usurping all the use cases you’d have for a tablet. Granted there are still some situations where a tablet might be better (I can think of one and only one for myself: previewing photographs before I finish shooting) but for people like me it’s hard to warrant a purchase based around that. For many other people, however, those who were craving a good portable computing experience, the tablet filled the niche long before the ultrabook ever had a chance to make its case.
Keen observers will note that whilst I might be making my argument around the ultrabook designation, there was actually a market sector dedicated to small, highly portable machines long before ultrabooks became viable: netbooks. Indeed this is correct, and whilst many will say that the iPad is what killed the netbook sector (I strongly disagree on that), it is far more likely that after the initial hype cycle companies started focusing on higher margin products, like ultrabooks, and thus the market shifted towards them. Indeed when the computing power in a tablet can match or exceed that of a netbook, most people will probably go with the former, especially if the user experience is better.
My point among all this is that for me, and I believe many others, the use cases for a tablet are more than aptly covered off by an ultrabook. Sure, it’s hard to compare them as the form factors, operating systems and available applications are worlds apart from each other, but I really have found myself wondering why I’d need a tablet now that I can use my Zenbook in practically every other situation. It could very well be that I’m too in love with physical keyboards and the possibility of playing my PC games anywhere to realise the extra value I might derive from a tablet, but apart from serving as a digital photography portfolio I can’t see much use for one any more.
If there’s one thing that us system administrators loathe more than dealing with users, it’s dealing with users who have a bit of IT smarts about them. On the surface they’re the perfect user, able to articulate their problems and requirements aptly so we spend considerably less time fulfilling their requests. However, more often than not they’re also the ones attempting to circumvent safeguards and policies in order to get a system to work the way they want it to. They’re also the ones who will push for much more radical changes to systems, since they will have already experimented with such things at home and will want to replicate them in their work environment.
Collectively such people are known as shadow IT departments.
Such departments are a recent phenomenon, with a lot of the credit (or blame) being levelled at those of my generation, the first to grow up as digital natives. Since the vast majority of us have used computers and the Internet from an early age, we’ve come to expect certain things to be available to us when using them and don’t appreciate it when they are taken away. This doesn’t gel too well with the corporate world of IT, where lockdowns and restrictions are the norm, even if they’re for the user’s benefit, and thus these users seek to circumvent such restrictions, causing endless headaches for their system administrators. Still, they’re a powerful force for driving change in the workplace, enough so that I believe these shadow IT departments are shaping the future of corporate environments and the technologies that support them.
Most recently I’ve seen this occurring with mobility solutions, a fancy way of saying tablets and phones that users want to use on the corporate network. Now it’s hard to argue with a user that such a thing isn’t technically feasible, but in the corporate IT world bringing uncontrolled devices onto your network is akin to throwing a cat into a chicken coop (i.e. no one but the cat benefits and you’re left with an awful mess to clean up). Still, all it takes is one of the higher-ups to request such a thing for it to become a mandate for the IT department to implement. Unfortunately for us IT guys the technology du jour doesn’t lend itself well to being tightly controlled by a central authority, so most resort to hacks and workarounds in order to make it work as required.
As the old saying goes, the unreasonable person is the one who changes the world to suit themselves, and therefore much of the change in the corporate IT world is being made by these shadow IT departments. At the head of this movement are my fellow Gen Y and Zers, who are struggling with the idea that what they do at home can’t be replicated at work:
“The big challenge for the enterprise space is that people will expect to bring their own devices and connect in to the office networks and systems,” Henderson said. “That change is probably coming a lot quicker than just five years’ time. I think it will be a lot sooner than that.”
Dr Keiichi Nakata, reader in social informatics at Henley Business School at the University of Reading, who was also at the roundtable, said the university has heard feedback from students who have met companies for interviews and been “very surprised” that technologies they use every day are not being utilised inside those businesses.
It’s true that the corporate IT world is a slow-moving beast compared to the fast-paced consumer market, and companies aren’t usually willing to wear the risk of adopting new technologies until they’ve proven themselves. Right now any administrator asked to support something like “bring your own computer” will likely tell you it’s impossible, lest you open yourself up to being breached. However, technologies like virtualization are making it possible to create a standard work environment that runs practically everywhere, and I think this is where a bring-your-own-device world becomes possible.
Of course this shifts the problem from the IT department to the virtualization product developer, but companies like VMware and Citrix have already demonstrated the ability to run full virtual desktop environments on smartphone-level hardware. Using such technologies, users would be able to bring in almost any device, which would then be loaded with a secure working environment, enabling them to complete the work they’re required to do on the device they choose. This would also allow IT departments to become a lot more flexible with their offerings, since they wouldn’t have to spend so much time supporting the underlying infrastructure. Of course there are many other issues to consider (like asset life cycles, platform vetting, etc.) but a future where your work environment is independent of the hardware is not so far-fetched after all.
The disconnect between what’s possible with IT and what’s the norm in corporate environments has been one of those frustrating curiosities that has plagued my IT career. Of course I understand that the latest isn’t always the greatest, especially if you’re looking for stability, but the lack of innovation in the corporate space has always been one of my pet peeves. With more and more digital natives joining the ranks, however, the future looks bright for a corporate IT world that’s not too unlike the consumer one we’re all used to, possibly one that even innovates ahead of it.
We often forget that the idea of a personal computer is an extremely modern one, considering how ingrained in our lives they have become. The first personal computers appeared around 40 years ago, and it took decades for them to become a fixture as common as the television in modern households. The last two decades have seen an explosion in the adoption of personal computers, growing at double-digit rates nearly every year. Still, even though today’s personal computers are leaps and bounds above their predecessors in terms of functionality, they still share the common keyboard, monitor and mouse configuration that’s been present for decades, despite many attempts to reinvent it.
There does however seem to be a market for curated computing devices that, whilst lacking the power of their bigger brethren, are capable of performing a subset of their tasks. I first began to notice this trend way back when I was still working in retail, as many customers’ requirements for a PC rarely amounted to more than “email, web surfing and writing a few documents”. Even back then (2000~2006) the most rudimentary of the PC line I had to sell would cover this off quite aptly, and more often than not I’d send them home with the cheapest PC available, leaving the computing beasts to gather dust in the corner. To me it seemed that unless you were doing photo/video editing or gaming you could buy a PC that would last the better part of 5 years before you had to think about upgrading, and even then only because it would be so cheap to do so.
The trend towards such devices began about 4 years ago with the creation of the netbook class of personal computing devices. Whilst still retaining much of the functionality of their ancestors, netbooks opted for a small form factor and low specifications in order to keep costs down. I, like many geeks of the time, saw them as nothing more than a distraction that filled a need that didn’t exist, failing to remember the lessons I had learned many years before. The netbook form factor proved to be a wild success, with many people replacing their PCs in favor of the smaller platform. They were, however, still fully fledged PCs.
Then along came Apple with their vision of creating yet another niche and filling it with their product. I am of course talking about the iPad, which has enjoyed wild success and created the very niche that Apple dreamed of. As with netbooks, I struggled with the idea that there could be a place in my home for yet another computing device since I could already do whatever I wanted with what I had. However, just like the netbooks before it, I finally came around to the idea of having a tablet in my house, and that got me thinking: maybe the curated experience is all most people need.
Perhaps the PC is better off as an appliance, at least for most people.
For the everyman, requirements for a computing device outside the workplace don’t usually extend past the typical “email, web and document editing” holy trinity. Whilst tablets are far from an ideal platform for doing all those tasks aptly (well, in my opinion anyway), they’re good enough to replace a PC for most people outright. Indeed the other Steve behind Apple, Mr Wozniak, has said that tablets are PCs for everyone else:
“The tablet is not necessarily for the people in this room,” Wozniak told the audience of enterprise storage engineers. “It’s for the normal people in the world,” Wozniak said.
“I think Steve Jobs had that intention from the day we started Apple, but it was just hard to get there, because we had to go through a lot of steps where you connected to things, and (eventually) computers grew up to where they could do … normal consumer appliance things,” Wozniak said.
If you consider the PC as a household appliance, then the tablet form factor starts to make a lot of sense. Sure it can’t do everything, but it can do a good chunk of those tasks very well, and the barrier to using one is a whole lot lower than that of a fully fledged PC. Plus, unlike a desktop or laptop, they don’t seem out of place when used in a social situation or simply lying around on the coffee table. Tablets really do seem to be a good device for the large majority of people whose computing needs barely stress today’s incredibly powerful PCs.
Does that mean tablets should replace PCs outright? Hell no; there are still many tasks far more aptly done on a PC, and the features that make a tablet convenient (small size, curated experience) are also its most limiting factors. Indeed the power of tablets is built on the foundations the PC laid before them, with many tablets still relying on their PC brethren to provide certain capabilities. I think regular users will gravitate more towards the tablet platform, but it will still be a long time before the good old keyboard, monitor and mouse are gone.