There was an awful lot of noise last month around the whole XboxOne DRM/features/whatever debacle that ended with Microsoft doing a 180 on their often-on DRM stance. Ostensibly the reversal was a reaction to the amount of praise that Sony was getting at Microsoft’s expense, even though they’d managed to hold fast during the initial PR stampede. There were a few though, certainly not the majority but a non-zero amount, who lamented this change by Microsoft, saying that they had capitulated to the crowd and were essentially keeping gaming services in the dark ages. There’s a little meat to this story as the removal of the daily check-in requirement meant that some of the features that came along with it had to go away. Most of the features people initially mourned didn’t actually require a daily check-in to achieve (like worlds that “live on” between game sessions; I think Animal Crossing had that covered pretty well) but there was one that was so revolutionary that I thought people were just making it up.
That was the ability to sell your digital only games.
Now as someone who’s got a massive library of these kinds of games on Steam (at last count somewhere in the realm of 300+) the ability to sell, or even just transfer, these games would be a pretty great feature. It’s possible that residents of EU countries might end up getting this by default thanks to a 2012 CURIA ruling but the idea that this could come to the XboxOne, regardless of territory, would be very appealing to a lot of gamers. The often-on check-in is then required to make sure you haven’t sold the game through one channel and then continued to play it offline, which makes some sense in context, although I’d argue that the number of people who’d do such a thing would be in the minority (and you could just check whenever they did eventually get online anyway). However all that still has the one enormous caveat that I think was the crux of the issue for everyone: you have to rely on a service that may or may not be there in the future.
“Ah ha”, I hear you say, “but that’s the same for Steam and everyone just accepts it there!” and you’re right, to a point. That was probably the biggest thing that Steam had going against it at the time as PC gamers were most certainly not welcoming of it; I know I certainly wasn’t. However once the value proposition became very attractive, mostly through the sales, ease of use and increasing broadband penetration, we started to warm to the service. There was also the assurance from Gabe Newell (although trying to source a direct quote relating to this is proving elusive) that should Steam have to shut down there’d be a patch issued that would free your game library from its decaying hands. With Microsoft’s announcement there wasn’t, or at least it wasn’t communicated well, an equivalent assurance that would allow gamers to continue to play such games past the time when the Xbox Live service disappeared.
Indeed this problem faces all gamers as many titles move towards a more connected model, which could mean that core features become unusable the second the developer can no longer support running the back end infrastructure. For some titles, ones that are traditionally multiplayer only, this is kind of expected, but the difference between Diablo and Diablo III for instance is that in 20 years I can almost guarantee the former will still be able to be run by anyone with the disc; the latter I’m not sure will see the end of this decade. Sure the number of people doing this might not be in the majority but they’re a vocal one and the sole reason why services like GOG exist. Had Microsoft given some assurances to the contrary they might not be in the position they are today and those features might still be available to Xbox customers.
It may seem like we’re just being backwards Luddites bent on keeping the status quo but it’s far more than that: we just want to be able to play our games long into the future like we can with so many titles we grew up on. I see no technical reason why systems can’t be built to enable both sides of the equation, ones that allow us to sell/trade digital games whilst also giving us the opportunity to play offline whenever we want, but the reasons are far more likely to be business in nature. It’s a real shame as Microsoft could have really outdone Sony on this particular front but it seems like they’re instead gearing up for second place, capitulating just enough so they don’t end up competing with the Wii U for scraps of market share.
It’s no secret that I’m a Microsoft guy, owing much of my current career to their products which have been the staple of my computing experience since I was 5 years old. In that time I’ve gone from a simple user, to a power user who tweaked his system for the ultimate gaming experience to the administrator I am today, one who has seen almost everything Microsoft has to offer. I won’t lie, much of that foundational experience was built on the backs of pirated software but once I had a proper job that gave me access to all the software I needed I found myself not often needing much more than they provided. That was until I became a contractor which necessitated some external learning on my part.
Enter TechNet subscriptions.
They’re essentially a golden ticket to Microsoft’s entire software library. Back when I first bought into them there was only one level which got you everything but Visual Studio (that privilege is reserved for MSDN subscribers) and came with a handful of licenses for every Windows version out there, and I do mean every version as you could get MS-DOS 1.0 should you be so inclined. I, like most TechNet subscribers at the time, got it because the cost was roughly equivalent to the Windows desktop licensing needed to cover all my home machines, and the added server OSes and business software were a bonus that’d help me professionally. I didn’t end up renewing it, mostly because I then got an MSDN account through work, but I know several people who are still subscribers today, usually for the same reasons I was.
It was with mixed feelings then that I read today’s announcement that Microsoft was going to stop selling the program effective August 31st, 2013. If you’re so inclined you can buy yourself a subscription (or renew your current one) all the way up to this date so you can continue to use the service for another year after that, putting the end date of the service at late 2014. After that your only option to get a similar level of access to Microsoft’s catalogue will be to go through MSDN which at current pricing levels is out of reach for infrastructure professionals like myself. Whilst the price difference is justified by a lot of the extra features you get (like the super cheap Azure pricing) those benefits aren’t exactly aligned with the current TechNet crowd.
The suggested replacement for TechNet is now the Evaluation Center which provides access to time limited versions of the same software (although how comprehensive the library is in comparison isn’t something I can comment on). Ironically there’s still a text blurb pointing you to buy a TechNet subscription should you want to “enjoy software for longer”, something which I’m sure won’t remain there for long. In all honesty the reason why TechNet was so useful was the lack of time and feature limitations, allowing you to work freely with the product without having to consider some arbitrary limitation. For people like me who like to evaluate different bits of software at different times this was great as I could have an environment set up with all the basics and just install that application on top of it. Time limited software doesn’t provide this, making evaluation at the individual professional level essentially pointless.
The rationale is that people are looking more towards free services for evaluation and deployment. Now no one but Microsoft has the stats to back that argument up so we’ll just have to take their word for it but I get the feeling this is more about them trying to realign their professional network more than anything else. Sure I’m in the camp that admins will need to skill themselves up on dev related things (PowerShell and C# would not go astray) but semi-forcing them onto MSDN to do so isn’t the right way to go about it. Sure they’ve committed to expanding the services offered through the evaluation center but I doubt the best feature of TechNet, the no time and feature limitations, will ever come to it. Perhaps if they were to do a TechNet cloud edition, one where all the software had to be run on Azure, I might sing a different tune but I doubt that’ll ever happen.
As much as I praise Microsoft here I can’t help but feel this is a bad move on their part as it will only help to alienate a dedicated part of their user base that serves as the front line advocates for their products. I may not be a subscriber anymore, nor will I likely be one in the near future thanks to the benefits granted by my job, but I know many people who find a lot of value in the service, people who are de facto product evangelists because of it. I can only hope that they revamp the MSDN subscriptions to provide a similar level of service as otherwise there’s really only one place people will turn to and I know Microsoft doesn’t approve of it.
Whilst it’s easy to argue to the contrary, Microsoft really is a company that listens to its customers. Many of the improvements I wrote about during my time at TechEd North America were the direct result of them consulting with their users and integrating their requests into their updated product lines. Of course this doesn’t make them immune to blundering down the wrong path as they have done with the XboxOne (and a lot would argue Windows 8 as well, something which I’m finding hard to ignore these days), something which Sony gleefully capitalized on. Their initial attempts at damage control did little to help their image and it was looking like they were just going to wear it until launch day.
And then they did this:
Essentially it’s a backtrack to the way things are done today with the removal of the need for the console to check in every day in order for you to be able to play installed/disc based games. This comes hand in hand with Microsoft now allowing you to trade/sell/gift your disc based games to anyone, just like you can do now. They’re keeping the ability to download games directly from Xbox Live although it seems the somewhat convoluted sharing program has also been nixed, meaning you can no longer share games with your family members nor can you share downloaded titles with friends. Considering that not many people found that particular feature attractive I’m not sure it will be missed but it does look like Microsoft wanted to put the boot in a little to show us what we could have had.
I’ll be honest and say I didn’t expect this as Microsoft had been pretty adamant that the policy was going to stick around regardless of what consumers thought. Indeed actions taken by other companies like EA seemed to indicate that this move was going to be permanent, hence them abandoning things that would now be part of the platform. There’s been a bit of speculation that this was somehow planned all along; that Microsoft was gauging the market’s reaction and would react based on that, but if that was the case this policy would have been reversed a lot sooner, long before the backlash reached its crescendo during E3. The fact that they’ve made these changes shows that they’re listening now but there’s nothing to suggest that this was their plan all along.
Of course this doesn’t address some of the other issues that gamers have taken with the XboxOne, most notably the higher cost (even if it’s semi-justified by the included Kinect) and the rather US centric nature of many of the media features. Personally the higher price doesn’t factor into my decision too much, although I do know that’s a big deal for some, but since the XboxOne’s big selling points were its media features it feels like a lot of the value I could derive from it is simply unavailable to me. Even those in the USA get a bit of a rough ride with Netflix being behind the Xbox Live Gold wall (when it’s always available on the PS4) but since both consoles require a subscription for online play it’s not really something I can fault/praise either of them for.
For what it’s worth this move might be enough to bring those who were on the fence back into the fold but as the polls and preorders showed there’s a lot of consumers who have already voted with their wallets. If this console generation has the same longevity as the current one then there’s every chance for Microsoft to make up the gap over the course of the next 8 years and considering that the majority of the console sales happen after the launch year it’s quite possible that all this outrage could turn out to be nothing more than a bump in the road. Still the first battle in this generation of console wars has been unequivocally won by Sony and it’s Microsoft’s job to make up that lost ground.
If the deafening outcry from nearly every one of my favourite games news sites and social media brethren is anything to go by the console war has already been won and the new king is Sony. Whilst the fanboy in me would love to take this opportunity to stick it to all the Xboxers out there I honestly believe that Sony really didn’t do much to deserve the praise that’s currently being heaped on it. Rather, I feel the news coming out of E3 just shows how many missteps Microsoft took with the XboxOne, with Sony simply sitting on the sidelines, not really changing anything from what they’re doing today.
The one, and really only, point that this all hinged on was the as-yet-unknown stance that Sony would take on DRM for the PlayStation4. It was rumoured that they were watching social media closely and that spurred many grassroots campaigns aimed at influencing them. The announcement came at E3 that they’d pretty much be continuing along the same lines as they are now, allowing you to trade/sell/keep disc based games without any restrictions built into the platform. This also means that developers are free to include online passes in their games, something which has thankfully not become too common but could be on the rise (especially with cross platform titles).
There wasn’t much else announced at E3 that got gamers excited about the PlayStation4 apart from seeing the actual hardware for the first time. One curious bit of information that didn’t receive a whole lot of attention though was the change to Sony’s stance on free multiplayer through the PlayStation Network. You’ll still be able to get a whole bunch of services for free (like Netflix/Hulu) but if you want multiplayer you’re going to have to shell out $5/month for the privilege. However this is PlayStation Plus, which means it comes with a whole bunch of other benefits like free full version games, so it’s not as bad as it sounds. Still it looks like Sony might have been capitalizing on the notion that there will be quite a few platform switchers this generation and thus took the opportunity to make the service mandatory for multiplayer.
It could also be partly to offset the extremely low (relatively speaking) price of the PlayStation4, with it clocking in at $399. Considering its specs it’s hard to believe that they’re not using the console as a loss leader yet again, something which I thought they were going to avoid this generation. If the life of these consoles remains roughly the same that means they’ll at least make the console’s price back in subscription fees, plus any additional revenue they get from game sales. Some of that money will have to go towards the massive amount of online services they’re planning to release, but overall it seems that at least part of the subscription cash will be offsetting the cheaper hardware.
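A quick back-of-the-envelope check makes the loss-leader maths plausible. The figures below are my own illustrative assumptions (the $399 price and $5/month fee from the announcements, an 8-year generation guessed from the PS3's lifespan), not Sony's actual economics:

```python
# Back-of-the-envelope sketch: does a mandatory $5/month subscription
# recoup the PlayStation4's sticker price over a console generation?
# All figures are illustrative assumptions, not Sony's real numbers.

CONSOLE_PRICE = 399      # USD launch price
PLUS_MONTHLY = 5         # USD per month for PlayStation Plus
GENERATION_YEARS = 8     # assumed lifespan, roughly that of the PS3

def subscription_revenue(years: int, monthly_fee: int) -> int:
    """Total subscription income per subscriber over the console's life."""
    return years * 12 * monthly_fee

revenue = subscription_revenue(GENERATION_YEARS, PLUS_MONTHLY)
print(revenue)                    # 480
print(revenue > CONSOLE_PRICE)    # True: the fee alone out-earns the hardware
```

On those numbers a single continuous subscriber hands over $480, slightly more than the hardware price, before any game sales are counted, which is consistent with the console being sold at or below cost.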
The thing to note here is that the differences between Sony’s current and next generation consoles are far smaller than those for Microsoft. This is the same Sony that was ridiculed for releasing PSN long after Xbox Live, for pricing their console way above the competition and, even if it wasn’t for games specifically, for having some of the most insane DRM known to man. The fact that not much has changed (they have, in fact, got objectively worse) and they’re being welcomed with open arms shows just how much Microsoft has dropped the ball.
Whether or not this will translate into lost sales remains to be seen. The consumer market has an incredibly short memory and we’ve got a good 5 months between now and when the XboxOne goes on sale. It’s entirely possible that the current conversation is being dominated by a vocal minority and the number of platform loyalists will be enough to overcome that initial adoption hump (something which the Wii U hasn’t been able to do). I’m sure that anyone who was on the fence about which one to get has probably made their mind up now based on these announcements but in all honesty those people are few and far between. I feel the majority of console gamers will get one, and only one, console and will likely not change platforms easily.
The proof will come this holiday season, however.
[UPDATE]: It has come to my attention that Sony has stated that they will not be allowing online passes from anyone. Chalk that up to yet another win for them.
After spending a week deep in the bowels of Microsoft’s premier tech conference and writing about them breathlessly for Lifehacker Australia you’d be forgiven for thinking I’m something of a Microsoft shill. It’s true that I think the direction they’re going in for their infrastructure products is pretty spectacular and the excitement for those developments is genuine. However if you’ve been here for a while you’ll know that I’m also among their harshest critics, especially when they do something that’s drastically out of line with my expectations as one of their consumers. I believe in giving credit where it’s due, though, and a recent PA Report article has brought Microsoft’s credentials in one area into question when they honestly shouldn’t be.
The article I’m referring to is this one:
I’m worried that there are going to be a few million consoles trying to dial into the home servers on Christmas morning, about the time when a mass of people begin to download new games through Microsoft’s servers. Remember, every game will be available digitally day and date of the retail version, so you’re going to see a spike in the number of people who buy their Xbox One games online.
I’m worried about what happens when that new Halo or Call of Duty is released and the system is stressed well above normal operating conditions. If their system falls, no matter how good our Internet connections, we won’t be able to play games.
Taken at face value this appears to be a fair comment. We can all remember times when the Xbox Live service came down in a screaming heap, usually around Christmas time or when a large release happened. Indeed even a quick Google search reveals there have been a couple of outages in recent memory, although digging deeper into them reveals that they were usually part of routine maintenance and only affected small groups of people at a time. With all the other criticism that’s being levelled at Microsoft of late (most of which I believe is completely valid) it’s not unreasonable to question their ability to keep a service of this scale running.
However as the title of this post alludes to I don’t think that’s going to be an issue.
The picture shown above is from the Windows Azure Internals session by Mark Russinovich which I attended last week at TechEd North America. It details the current infrastructure that underpins the Windows Azure platform, which powers all of Microsoft’s sites including the Xbox Live service. If you have a look at the rest of the slides from the presentation you’ll see how far that architecture has come since they first introduced it 5 years ago, when the over-subscription rates were much, much higher for the entire Azure stack. What this meant was that when something big happened the network simply couldn’t handle it and caved under the pressure. The current generation of the Azure infrastructure however is far less oversubscribed and has several orders of magnitude more servers behind it. With that in mind it’s far less likely that Microsoft will struggle to service large spikes like they have done in the past, as the capacity they have on tap is just phenomenal.
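To see why the oversubscription ratio is the thing that matters for spikes, here's a toy model. Every number in it is made up for illustration (the presentation didn't give precise ratios): a provider that promises capacity to many tenants but provisions only a fraction of the total falls over once enough tenants peak simultaneously.

```python
# Toy model of capacity oversubscription. A provider promises each tenant
# some capacity but actually provisions total_promised / ratio.
# All numbers here are illustrative, not Azure's real figures.

def can_absorb_spike(tenants: int, promised_per_tenant: int,
                     ratio: float, spike_fraction: float) -> bool:
    """True if provisioned capacity covers spike_fraction of tenants
    simultaneously demanding their full promised capacity."""
    provisioned = tenants * promised_per_tenant / ratio
    peak_demand = tenants * spike_fraction * promised_per_tenant
    return provisioned >= peak_demand

# Heavily oversubscribed (20:1): a mere 10% of tenants spiking at once
# already doubles the provisioned capacity, so the service caves.
print(can_absorb_spike(1000, 100, 20.0, 0.10))  # False

# Lightly oversubscribed (2:1): the same spike fits with room to spare.
print(can_absorb_spike(1000, 100, 2.0, 0.10))   # True
```

The point is simply that halving or quartering the oversubscription ratio, on top of adding orders of magnitude more servers, changes which spikes are survivable, not just how fast things run day to day.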
Of course this doesn’t alleviate the issues with the always/often on DRM or the myriad of other issues that people are criticizing the XboxOne for but it should show you that worrying about Microsoft’s ability to run a reliable service shouldn’t be one of them. Of course I’m just approaching this from an infrastructure point of view and it’s entirely possible for the Xbox Live system to have some systemic issue that will cause it to fail no matter how much hardware they throw at it. I’m not too concerned about that however as Microsoft isn’t your run of the mill startup who’s just learning how to scale.
I guess we’ll just have to wait and see how right or wrong I am.
All of my previous posts concerning Server 2012 (including those on Lifehacker) have been rather…high level, focusing more on what you can achieve with it than on concrete examples. I’ll admit this can be almost wholly attributed to laziness as I’ve had Server 2012 running on my home machine for quite some time now and just haven’t bothered installing any additional features on it. However one of my close friends is in the throes of setting up his own aerial photography business (using UAVs, super cool stuff) and offered up his home server as a guinea pig for a Server 2012 install, provided I give him a working VPN in return.
Initially I thought that I’d install DirectAccess for him as it’s a pretty awesome piece of technology and implementing it appears to be a hell of a lot easier than it was on 2008¹. However the requirements were quite high for a VPN setup that would have at most a couple of users, needing a whole bunch of infrastructure that would serve no other purpose. In a rather strange coincidence one of my favourite Microsoft blogs, 4SysOps, wrote a post detailing the installation method for an SSTP VPN (one that tunnels over HTTPS) mere days before I was slated to go out and do the install for him.
Installing Server 2012 went incredibly smoothly and apart from a strange graphics card issue (the NVIDIA card he had in there didn’t seem to be able to regulate its fan without drivers, leading it to lock up when it overheated) there were no problems. Following the guide was for the most part successful, with everything going the way you’d expect it to. However there were a couple of gotchas that we ran into along the way that I thought I’d detail here in case anyone else gets snagged on them.
We had several routing issues thanks to DNS entries taking far too long to expire, something we could have avoided with a little bit of forward planning. You can test the VPN internally by just using the local IP address; you probably won’t be able to get in as the SSL cert won’t match, but it is handy for checking that all the plumbing is set up. The most frustrating issue, however, was that everything would seem to connect but would then immediately drop us out. Thankfully there were some events generated that allowed us to research this problem further, but I’m not a big fan of the solution.
The error we were getting was something like “Error 720: The user &lt;username&gt; connected to port &lt;server&gt; has been disconnected because no network protocols were successfully negotiated”. There are numerous posts detailing this exact error and after trying many of the solutions the only one that worked was this one. Essentially it looks like, at least with SSTP VPNs, relaying DHCP requests doesn’t work at all, which is what causes this error. Setting up a static pool of IP addresses, and excluding it on the DHCP server, allowed us to connect without a hitch.
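If you go the static-pool route it's worth sanity-checking that the pool you hand to Routing and Remote Access doesn't collide with the DHCP scope before you exclude it. A quick sketch using Python's standard ipaddress module; the address ranges below are hypothetical placeholders, not the network from this install:

```python
# Sanity-check that a static VPN address pool stays clear of the DHCP
# scope. The ranges below are hypothetical; substitute your own network.
from ipaddress import ip_address

def pool_overlaps_scope(pool_start: str, pool_end: str,
                        scope_start: str, scope_end: str) -> bool:
    """True if the static pool and the DHCP scope share any addresses."""
    p_lo, p_hi = ip_address(pool_start), ip_address(pool_end)
    s_lo, s_hi = ip_address(scope_start), ip_address(scope_end)
    # Two inclusive ranges overlap iff each one starts before the other ends.
    return p_lo <= s_hi and s_lo <= p_hi

# A pool carved out above the DHCP scope: no overlap, safe to use as-is.
print(pool_overlaps_scope("192.168.1.200", "192.168.1.220",
                          "192.168.1.50", "192.168.1.150"))   # False

# A pool inside the scope: exclude it on the DHCP server first.
print(pool_overlaps_scope("192.168.1.100", "192.168.1.120",
                          "192.168.1.50", "192.168.1.150"))   # True
```

Nothing sophisticated, but it catches the exact misconfiguration that produces silent address conflicts once clients start connecting.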
It appears that this issue is a hangover from previous versions of Windows Server as the Routing and Remote Access console looks like it’s straight out of 2003 without much modification to it (apart from the Network Policies section). Now I’m not going to say that it needs a revamp, indeed once we got around that particular issue it worked perfectly, but it could use a little love.
Overall I’m pretty happy with my first real world Server 2012 install as I was able to get a technology that I had no previous experience with (VPNs) up and running in a matter of hours with little more than patience and a whole bunch of Googling. I’m now tempted to give DirectAccess a go at home as I’ve been meaning to set up a lab for a while now and being able to demonstrate some of Server 2012′s capabilities anywhere I have an Internet connection would just be plain awesome. That might be a little while off though as next week I’ll be in New Orleans, knee deep in TechEd goodness.
¹I can remember reading about it when it was first released and thinking I’d give it a go but nearly every install guide had DO NOT USE IN PRODUCTION plastered all over it. This doesn’t seem to be the case anymore as there are many production ready guides available and they’re all pretty easy to follow.
I don’t think I’m alone in saying that the timing of Sony’s announcement at the beginning of the year was a little surprising. Sure, when they started inviting press to an event for an unnamed product we weren’t exactly surprised to find out it was the PlayStation 4, but by the same token we were also under the impression that it was nowhere near as far along as the XboxOne was. Indeed I had gone on record several times saying that we’d likely see the new Xbox this year (which we will) and that the PS4 would follow sometime next year. It followed then that Microsoft would be the first to announce, but Sony beat them to the punch and, based on the reaction of the gamer community, the move appears to have benefitted them greatly.
Announcing a product early is always a risky endeavour as everyone will pick up on any half-baked ideas and descend upon them in a torrent of Internet rage. Indeed Sony copped quite a bit of criticism for doing just that as they failed to show the console (which feeds into the idea that it’s not done yet) and hand waved over a couple of the more important questions like backwards compatibility. Still the announcement set the tone for the next console generation with Sony putting games at the forefront and putting heavy emphasis on the features they’d built to enhance the gaming experience.
What did Microsoft bring to the table? Well if the following video is anything to go by:
The announcement had left me somewhat indifferent to Microsoft’s console, mostly thanks to the things that the above video highlights, and it appears that this sentiment has been echoed by many other gaming websites. Indeed whilst Microsoft may have made the right decision by broadening the console’s appeal through expanding its media offerings it’s certainly done nothing to endear them to the core crowd of Xbox gamers. You could leave the argument at that and still have a decent explanation for the backlash but honestly I think Sony’s press event set gamers’ expectations for the XboxOne long before Microsoft swaggered out with media-centric guns blazing.
We gamers might have our various affiliations to different consoles, born from the days when we were allowed one and only one console, which laid the groundwork for the fanboyism we see today, but we’re also acutely aware of what the competition is bringing to the table. Thus many gamers would have been aware of what the PlayStation 4 was offering and would expect Microsoft to have an answer to each and every feature that Sony had lauded at the PS4’s launch. Microsoft didn’t do this however, instead focusing on what they perceive as the major use case for the XboxOne: media consumption. Now this might not be too far out of left field, indeed more hours are spent watching Netflix than playing games on the Xbox today, but I doubt that many Xboxes were purchased solely for that. Indeed I believe most of them were bought by or for gamers primarily, with the media integration a nice add-on for anyone else who wanted to use it.
Whether this translates into lost sales for Microsoft remains to be seen as, whilst we gamers are a vocal bunch, it’s entirely likely that consumers at large will view the XboxOne as a solid DVR that also plays games. Like the Nintendo Wii before it this could have the potential to open up the Xbox to a much larger market, bypassing the vocal gamer community. It will be interesting to see how the sentiment develops over the next 6 months as that will determine whether the XboxOne retains its currently loyal gamer community or whether Microsoft eschews it in favour of cementing a foothold as the center of your home entertainment system.
This year was already shaping up to be a great run for gamers, what with all the new IP heading our way and multiple high quality sequels, and the next console generation will likely be upon us before the year is out. Had you asked me last year what my predictions were I would’ve told you that we’d be lucky to see the next generation Xbox this year and it was far more likely that we’d see both of them sometime in 2014. I’m quite glad to be wrong in this instance however as whilst I might still be primarily a PC gamer I grew up on consoles and will always have a soft spot for them.
Today Microsoft officially announced their successor to the Xbox360: the XboxOne. If you’ve been following the rumours and leaks like I have there’s nothing too much surprising about the console itself as it sports the exact specs that have been floating around for a while. However there are still a few surprises from Microsoft’s next generation console and the launch event clarified some of the more controversial rumours that had been flying around. Suffice to say that Sony and Microsoft have very different audiences in mind for their next gen offerings, meaning that the choice between the two might no longer be based on platform exclusives alone.
Whilst I won’t go over the hardware specifications as they’re near identical to those of the PS4 (although I can’t find confirmation of DDR3 vs GDDR5) there were a couple of surprises under the hood of the XboxOne. For starters it’s sporting a BluRay drive, which was kind of expected but still up in the air thanks to Microsoft initially throwing its support behind HD DVD, giving a little credence to the rumour that they wouldn’t incorporate it into their next gen offering. It also brings with it a HDMI in port, allowing those with set top boxes to run their TV through it. Whilst that doesn’t sound like much it’s telling of the larger strategy Microsoft has at play here: they’re marketing the XboxOne as much more than a games console.
Indeed all the other features they’ve included, like Snap Mode and the upgrades to their SmartGlass app, are heavily focused on media consumption and making the XboxOne the central point of your home entertainment setup. Considering that current generation Xboxes are used to watch media more than they are to play games this change in direction is not surprising, however it could alienate some of the more hardcore gaming fans. It seems Sony was well aware of this as their launch focused far more heavily on the gaming experience their console could deliver rather than its additional media capabilities. The delineation then seems clear: if you want a gaming machine go for the PS4, but for everyone else there’s the XboxOne.
The Xbox had always been Microsoft’s last piece in the Three Screens puzzle and it appears that the XboxOne will in fact be running a version of Windows under the hood. In fact it’s running 3 different operating systems: Windows 8/RT, a second Xbox OS that’ll remain largely static (for developers) and a third layer that sounds more like a hypervisor, managing access to resources for the 2 main operating systems. I speculated last year that Microsoft would be looking to bring WinRT to the next gen Xbox and that appears to be the case, although how much of the functionality is directly compatible is still up for question as Microsoft has stated that you’ll “need to do some work” to port apps across.
Unfortunately it does look like Microsoft wants to take an axe to the second hand games market. Whilst the rumours of the console needing to be always online have turned out to be false (although games can make use of Azure cloud gaming services, which would require an online connection), installing a game to the hard drive locks it to that particular Xbox account, and playing it on another requires a fee. Whether or not you can play games without installing them is still up for debate, and the answer to that will make or break the second hand games market.
Additionally there’s going to be no backwards compatibility to speak of, save for the transfer of licenses for media and your gamer score. Whilst this was not unexpected, combined with the lack of a second hand games market it might be a dealbreaker for some. Whether this will push more people to Sony remains to be seen though, as whilst they’ve alluded to backwards compatibility possibly coming via some kind of cloud gaming service, that won’t be something former Xboxers will care about. It’s far more likely that the decision will be made on what the console will primarily be used for: gaming or media.
I’ve been something of a stalwart “buy all the things” consumer ever since I had a job that would allow me to do this, but with the announcement of the XboxOne I’m not sure that will be the case anymore. I say this because I believe the vast majority of titles will be cross platform, thanks to the x86 architecture, and as yet there haven’t been any compelling exclusives announced for either platform that would draw me to it. The Xbox 360 landed a purchase from me solely for Mass Effect but I get the feeling we won’t see another title bound to a single platform like that again. With that in mind it’s highly likely that my current console collection will be slimmed down to one, and the last man standing will be the PS4.
I would love to be convinced otherwise though, Microsoft.
The stories of the majority of IT workers are eerily similar. Most get their beginnings in a call centre, slaving away behind a headset troubleshooting various issues either for their end users or as part of a bigger help desk that services dozens of clients. Some are a little luckier, landing a job as the sole IT guy at a small company, which grants them all the creative freedom they could wish for but also shoulders them with the weight of being the be all and end all of their company’s IT infrastructure. No matter how we IT employees got our start, all of us eventually look towards getting certified in the technologies we deal with every day and, almost instantly after getting our first, become incredibly cynical about what they actually represent.
For many the first certification they pursue will be something from Microsoft, since it’s almost guaranteed that every IT job you come across will utilize their products in some fashion. Whilst the value of the online/eLearning packages is debatable, there’s little question that you’ll likely learn something you didn’t already know, even if it’s completely esoteric and has no application in the real world. For anyone who’s spent a moderate amount of time with the product in question these exams aren’t particularly challenging, as most of them focus on regurgitating the Microsoft way of doing things. This, in turn, feeds into their greatest weakness: they favour rote memorization over higher order concepts and critical thinking (at least at the introductory/intermediate levels).
This has led to a gray market solely focused on passing these exams. Whilst there are some great resources that fall into this area (like CBT Nuggets) there are many, many more that skirt the boundaries of what’s appropriate. For anyone with a modicum of Google skills it’s not hard to track down copies of the exams themselves, many with the correct answers highlighted for your convenience. In the past this meant you could go in knowing all the answers in advance, and whilst there’s been a lot of work done to combat this there are still many people carrying certifications thanks to these resources.
The industry term for such people is “paper certs”.
People with qualifications gained in this way are usually quite easy to spot, as rote memorization of the answers does not readily translate into real world knowledge of the product. However for those looking to hire someone this realization often comes too late, as interview questions can only go so far in rooting these kinds of people out. Ultimately this makes those entry level certifications relatively worthless, as having one is no guarantee that you’ll be an effective employee. Strangely, however, employers still look to them as a positive sign and, stranger still, companies looking to hire on talent from outsourcers again look for these qualifications in the hope that they’ll get someone with the skills they require.
I say this as someone who’s managed to skate through the majority of his career without the backing of certs. Initially I thought this was due to my degree, which whilst tangentially related to IT is strictly speaking an engineering one, but the surprise I’m met with when I mention that I’m an engineer by training has led me to believe that most of my former employers had no idea. Indeed what usually sealed the position for me was my past experience, even in positions where certain certs were stated as a requirement. Asking my new employers about it afterwards, I was told that those position descriptions are usually a wish list of things they’d like, and it’s rare that anyone will actually have them all.
So we have this really weird situation where the majority of certifications are worthless, which is known by all parties involved, yet they’re still used as a barrier to entry for some positions and opportunities, one that can be wholly overridden if you have enough experience in the area. If that makes the whole process sound, for want of a better word, worthless, then you’d be of the same opinion as most of the IT workers I know.
There are some exceptions to this rule, Cisco’s CCIE exams being chief among them, but the fact that the training and certification programs are run by the companies who develop the products is the main reason why the majority of them are like this. Whilst I’m not entirely sure that having an independent certification body would solve all the issues (indeed some of the non-vendor specific certs are just as bad) it would at least remove the financial driver to churn as many people through the courses and exams as they currently do. Whilst I abhor artificial scarcity, one of the places it actually helps is in qualifications, though that’d only be the first few tentative steps towards solving this issue.
If you’ve been here a little while you’ll know that last year I won a competition to go up to Brisbane to cover TechEd Australia 2012 for LifeHacker Australia. During my time up there I wrote three posts covering everything from PowerShell to the evolution of the term “private cloud” and why Windows Server 2012 would succeed. Evidently the LifeHacker writers and readers loved what I wrote and I ended up winning the mini-competition against the two other guest bloggers. At the time I was told this would lead on to another series of posts for Microsoft themselves. That never eventuated, but I did end up with a shiny new HP MicroServer that’s become the mainstay of my home network.
I thought that would be the end of it but a couple months ago Angus Kidman, the man behind much of LifeHacker Australia’s tech coverage, contacted me with an offer: come with him to the USA and participate in covering TechEd North America as part of their World of Servers initiative.
Of course I said yes.
It will be much the same as last year: I’ll be attending TechEd in New Orleans and writing up a post each day summing up the lessons I’ve taken away. The primary focus will still be on Server 2012, although with Microsoft’s increasing focus on cloud integration you can rest assured that I’ll be weaseling my way into as many Azure sessions as I possibly can. It’s going to be interesting to compare and contrast the two events, as I’m sure TechEd North America is going to be huge by comparison, and hopefully that means we’ll get some juicy insights into some of Microsoft’s upcoming products.
But this post isn’t just for me to humblebrag to you guys. I’m here to tell you that LifeHacker Australia is offering this very same opportunity to two lucky IT professionals! To enter all you have to do is fill out this entry form and answer a few questions about your IT chops. Once you’ve done that you’re in the running to win a fully paid trip to New Orleans to cover TechEd North America, and you’ll get to hang out with me for the duration of the trip (most people would consider that a perk…most people ;)).
If you’re a budding blogger hoping to get a foot in the door, or just a tech head who loves everything Microsoft, then there really isn’t a better opportunity than the one LifeHacker is offering here. You’ve only got until May 1st to get your entries in (that’s two weeks, people!) so I’d encourage you to enter sooner rather than later. I’m incredibly excited to be going along on this one; if my previous experience is anything to go by it’ll be a blast, and it’d be amazing if I could bring one of my readers along for the ride.
Hope to see you there!