
The Strange Dichotomy of IT Certifications.

The story of the majority of IT workers is eerily similar. Most get their beginnings in a call centre, slaving away behind a headset troubleshooting various issues either for their end users or as part of a bigger help desk that services dozens of clients. Some are a little luckier, landing a job as the sole IT guy at a small company, which grants them all the creative freedom they could wish for but also shoulders them with the weight of being the be all and end all of their company’s IT infrastructure. No matter how we IT employees got our start, all of us eventually look towards getting certified in the technologies we deal with every day and, almost instantly after getting our first, become incredibly cynical about what they actually represent.

Microsoft Training and Certification

For many the first certification they will pursue will be something from Microsoft, since it’s almost guaranteed that every IT job you’ll come across will utilize their products in some fashion. Whilst the value of the online/eLearning packages is debatable, there’s little question that you’ll likely learn something that you didn’t already know, even if it’s completely esoteric and has no application in the real world. For anyone who’s spent a moderate amount of time with the product in question these exams aren’t particularly challenging, as most of them focus on regurgitating the Microsoft way of doing things. This, in turn, feeds into their greatest weakness: they favour rote memorization over higher order concepts and critical thinking (at least at the introductory/intermediate levels).

This has led to a gray market solely focused on getting people through these exams. Whilst there are some great resources which fall into this area (like CBT Nuggets) there are many, many more which skirt the boundaries of what’s appropriate. For anyone with a modicum of Google skills it’s not hard to track down copies of the exams themselves, many with the correct answers highlighted for your convenience. In the past this meant that you could go in knowing all the answers in advance and, whilst there’s been a lot of work done to combat this, there are still many, many people carrying certifications thanks to these resources.

The industry term for such people is “paper certs”.

People with qualifications gained in this way are usually quite easy to spot, as rote memorization of the answers does not readily translate into real world knowledge of the product. However for those looking to hire someone this often comes too late, as interview questions can only go so far in rooting these kinds of people out. Ultimately this makes those entry level certifications relatively worthless, as having one of them is no guarantee that you’ll be an effective employee. Strangely, however, employers still look to them as a positive sign and, stranger still, companies looking to hire on talent from outsourcers again look for these qualifications in the hopes that they will get someone with the skills they require.

I say this as someone who’s managed to skate through the majority of his career without the backing of certs. Initially I thought this was due to my degree, which, whilst being tangentially related to IT, is strictly speaking an engineering one, but the surprise I’m met with when I mention that I’m an engineer by training has led me to believe that most of my former employers had no idea. Indeed what usually ended up sealing the position for me was my past experience, even in positions where certain certs were stated as a requirement. Asking my new employers about it afterwards had them telling me that those position descriptions are usually a wish list of things they’d like, and it’s rare that anyone will actually have them all.

So we have this really weird situation where the majority of certifications are worthless, which is known by all parties involved, yet they’re still used as a barrier to entry for some positions/opportunities, a barrier that can be wholly overridden if you have enough experience in that area. If that’s sounding like the whole process is, for want of a better word, worthless then you’d be of the same opinion as most of the IT workers that I know.

There are some exceptions to this rule, Cisco’s CCIE exams being chief among them, but the fact that the training and certification programs are run by the companies who develop the products is the main reason why the majority of them are like this. Whilst I’m not entirely sure that having an independent certification body would solve all the issues (indeed some of those non-vendor specific certs are just as bad), it would at least remove the financial driver to churn as many people through the courses/exams as they currently do. Whilst I abhor artificial scarcity, one of the places it actually helps is in qualifications, but that’d only be the first few tentative steps to solving this issue.

Will The Cloud Kill The IT Admin?

IT is one of the few services that all companies require to compete in today’s markets. IT support then is one of those rare industries where jobs are always to be had, even for those working in entry level positions. Of course this assumes that you put in the required effort to stay current, as letting your skills lapse for 2 or more years will likely leave you a generation of technology behind, making employment difficult. This is of course due to the IT industry constantly evolving, and much like in other industries certain jobs can be made completely redundant by technological advancements.

For the past couple of decades though the types of jobs you expect to see in IT support have remained roughly the same, save for the specializations brought on by new technology. As more and more enterprises came online and technology continued to develop, a multitude of specializations became available, enabling the then generic “IT guys” to become highly skilled workers in their targeted niche. I should know; just on a decade ago I was one of those generic IT support guys and today I’m considered a specialist when it comes to hardware and virtualization. Back when I started my career the latter of those two skills wasn’t even in the vernacular of the IT community, let alone a viable career path.

Like any skilled position though, specialists aren’t exactly cheap, especially for small to medium enterprises (SMEs). This leads to an entire secondary industry of work-for-hire specialists (usually operating under the term “consultants”) serving companies looking to take the pain out of utilizing the technology without having to bring the expertise in house. This isn’t really a surprise (any skilled industry will develop these secondary markets) but with IT there’s a lot more opportunity to automate and leverage economies of scale, more so than in any other industry.

This is where Cloud Computing comes in.

The central idea behind cloud computing is that an application can be developed to run on a platform which can dynamically deliver resources to it as required. The idea is quite simple but the execution of it is extraordinarily complicated, requiring vast levels of automation and streamlining of processes. It’s just an engineering problem however, one that’s been surmounted by several providers and used to great effect by many other companies who have little wish to maintain their own infrastructure. In essence this is just outsourcing taken to the next level, but following this trend to its logical conclusion leads to some interesting (and, if you’re an IT support worker, troubling) predictions.
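To make that “dynamically deliver resources as required” idea a little more concrete, here’s a minimal sketch of the kind of control loop a cloud platform runs behind the scenes. It’s purely illustrative: the cluster object and its methods (average_utilization, provision_instance and so on) are hypothetical placeholders rather than any real provider’s API, and real platforms layer scheduling, billing and fault tolerance on top of this basic idea.

```python
import time

# Illustrative thresholds for a toy autoscaling loop; real platforms
# expose similar knobs (target utilization, minimum/maximum capacity).
SCALE_UP_THRESHOLD = 0.75    # add capacity above 75% average utilization
SCALE_DOWN_THRESHOLD = 0.25  # shed capacity below 25%
MIN_INSTANCES, MAX_INSTANCES = 2, 20


def autoscale(cluster):
    """Continuously resize a hypothetical cluster to match demand."""
    while True:
        load = cluster.average_utilization()  # e.g. mean CPU across instances
        count = cluster.instance_count()

        if load > SCALE_UP_THRESHOLD and count < MAX_INSTANCES:
            cluster.provision_instance()      # spin up another VM/container
        elif load < SCALE_DOWN_THRESHOLD and count > MIN_INSTANCES:
            cluster.retire_instance()         # hand resources back to the pool

        time.sleep(60)  # re-evaluate once a minute
```

The specifics aren’t the point; it’s that once a loop like this is automated and shared across thousands of customers, the economies of scale I talk about below follow almost for free.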

For SMEs the cost of running their own local infrastructure, as well as the support staff that goes along with it, can be one of their largest cost centres. Cloud computing and SaaS offer the opportunity for SMEs to eliminate much of that cost whilst keeping the same level of functionality, giving them more capital to either reinvest in the business or bolster their profit margins. You would think then that this would just be a relocation of jobs from one place to another, but cloud services employ far fewer staff thanks to the economies of scale they operate at, leaving fewer jobs available for those who had skills in those areas.

In essence cloud computing eliminates the need for the bulk of skilled jobs in the IT industry. There will still be a need for most of the entry level jobs that cater to regular desktop users, but the back end infrastructure could easily be handled by another company. There’s nothing fundamentally wrong with this, pushing back against such innovation never succeeds, but it does call into question the jobs that these IT admins currently hold and where their future lies.

Outside of high tech and recently established businesses the adoption rate of cloud services hasn’t been that high. Whilst many of the fundamentals of the cloud paradigm (virtualization, on-demand resourcing, infrastructure agnostic frameworks) have found their way into the datacenter, the next logical step, migrating those same services into the cloud, hasn’t occurred. Primarily I believe this is due to a lack of trust in, and control over, the services, as well as companies not wanting to write off the large investments they have in infrastructure. This will change over time of course, especially as that infrastructure begins to age.

For what it’s worth I still believe that the ultimate end goal will be some kind of hybrid solution, especially for governments and the like. Cloud providers, whilst being very good at what they do, simply can’t satisfy the needs of all customers. It is then highly likely that many companies will outsource routine things to the cloud (such as email, word processing, etc.) but still rely on in house expertise for the custom applications that aren’t, and probably never will be, available in the cloud. Cloud computing then will probably see a shift in some areas of specialization, but for the most part I believe we IT support guys won’t have any trouble finding work.

We’re still in the very early days of cloud computing and its effects on the industry are still hard to judge. There’s no doubt that cloud computing has the potential to fundamentally change the way the world does IT services and, whatever happens, those of us in IT support will have to change to accommodate it. Whether that comes in the form of reskilling, retraining or looking for a job in a different industry is yet to be determined, but suffice it to say that the next decade will see some radical changes in the way businesses approach their IT infrastructure.

Virtualized Smartphones: No Longer a Solution in Search of a Problem.

It was just under 2 years ago that I wrote my first (and only) post on smartphone virtualization, approaching it with the enthusiasm I bring to most cool new technologies. At the time I guessed that VMware would eventually look to integrate the idea with some of their other products, in essence turning users’ phones into dumb terminals so that IT administrators could have more control over them. However the exact usefulness was still not clear, as at the time most smartphones were only just capable of running a single instance, let alone another one with all the virtualization trimmings that’d inevitably slow it down. Android was also somewhat of a small time player back then, having only 5% of the market (similar to Windows Phone 7 at the same stage in its life, funnily enough), making this a curiosity more than anything else.

Of course a lot has changed in the time between that post and now. The then market leader, RIM, is now struggling with single digit market share when it used to make up almost half the market. Android has succeeded in becoming the most popular platform, surpassing Apple, who had held the crown for many years prior. Smartphones have also become wildly more powerful, with many of them touting dual cores, oodles of RAM and screen resolutions that would make my teenage self green with envy. With all this in mind, then, the idea of running some kind of virtualized environment on a smartphone doesn’t seem all that ludicrous any more.

Increasingly IT departments are dealing with users who want to integrate their mobile devices into their work space in lieu of using a separate, work specific device. Much of this pressure came initially from the iPhone, with higher ups wondering why they couldn’t use their devices to access work related data. For us admin types the reasons were obvious: it’s an unapproved, untested device which by rights has no business being on the network. However the pressure to capitulate to their demands was usually quite high and workarounds were sought. Over the years these have taken many different forms, but the best answer would appear to lie within the world of smartphone virtualization.

VMware have been hard at work creating a full blown virtualization system for Android that allows a user to have a single device containing both their personal environment and a secure, work approved one. In essence there’s an application that lets the user switch between the two, giving them whatever handset they want whilst still allowing IT administrators to provide a standard, secure work environment. Android is currently the only platform that supports this, largely thanks to its open source status, although there are rumours of it coming to the iOS line of devices as well.

It doesn’t stop there either. I predicted that VMware would eventually integrate their smartphone virtualization technology into their View product, mostly so that the phones would just end up being dumb terminals. This hasn’t happened exactly, but VMware did go ahead and imbue View with the ability to present full blown workstations to tablets and smartphones through a secure virtual machine running on said devices. This means that you could potentially have your entire workforce running off smartphones with docking stations, enabling users to take their work environment with them wherever they want to go. It’s shockingly close to Microsoft’s Three Screens idea and, with Google announcing that Android apps are now portable to Google TV devices, you’d be forgiven for thinking that they outright copied it.

For most regular users these kinds of developments don’t mean a whole lot, but they signal the beginning of the convergence of many disparate experiences into a single unified one. Whilst I’m not going to say that any one platform will eventually kill off the others (each one of the three screens has a distinct purpose), we will see a convergence in the capabilities of each platform, enabling users to do all the same tasks no matter which one they are using. Microsoft and VMware are approaching this idea from two very different directions, the former unifying the development platform and the latter abstracting it away, so it will be interesting to see which approach wins out or if they too eventually converge.