It’s no secret that I’m a Microsoft guy, owing much of my current career to their products, which have been the staple of my computing experience since I was 5 years old. In that time I’ve gone from a simple user, to a power user who tweaked his system for the ultimate gaming experience, to the administrator I am today, one who has seen almost everything Microsoft has to offer. I won’t lie, much of that foundational experience was built on the back of pirated software, but once I had a proper job that gave me access to all the software I needed I found myself rarely needing much more than it provided. That was until I became a contractor, which necessitated some external learning on my part.
Enter TechNet subscriptions.
They’re essentially a golden ticket to Microsoft’s entire software library. Back when I first bought in there was only one level, which got you everything but Visual Studio (that privilege is reserved for MSDN subscribers) and came with a handful of licenses for every Windows version out there, and I do mean every version: you could get MS-DOS 1.0 should you be so inclined. I, like most TechNet subscribers back then, got it because the cost was roughly equivalent to the Windows desktop licensing required to cover all my home machines, and the server OSes and business software were an added bonus that’d help me professionally. I didn’t end up renewing it, mostly because I then got an MSDN account through work, but I know several people who are still subscribers today, usually for the same reasons I was.
It was with mixed feelings, then, that I read today’s announcement that Microsoft will stop selling the program effective August 31st, 2013. If you’re so inclined you can buy yourself a subscription (or renew your current one) all the way up to that date and continue to use the service for another year after it, putting the end of the service at late 2014. After that your only option for a similar level of access to Microsoft’s catalogue will be to go through MSDN, which at current pricing is out of reach for infrastructure professionals like myself. Whilst the price difference is justified by a lot of the extra features you get (like the super cheap Azure pricing), those benefits aren’t exactly aligned with the current TechNet crowd.
The suggested replacement for TechNet is the Evaluation Center, which provides access to time-limited versions of the same software (although how comprehensive the library is in comparison isn’t something I can comment on). Ironically there’s still a text blurb pointing you to buy a TechNet subscription should you want to “enjoy software for longer”, something I’m sure won’t remain there for long. In all honesty the reason TechNet was so useful was the lack of time and feature limitations, allowing you to work freely with a product without having to consider some arbitrary restriction. For people like me who like to evaluate different bits of software at different times this was great, as I could have an environment set up with all the basics and just install the application in question on top of it. Time-limited software doesn’t allow for this, making evaluation at the individual professional level essentially pointless.
The rationale is that people are looking more towards free services for evaluation and deployment. No one but Microsoft has the stats to back that argument up, so we’ll just have to take their word for it, but I get the feeling this is more about them trying to realign their professional network than anything else. Sure, I’m in the camp that says admins will need to skill up on dev-related things (PowerShell and C# would not go astray), but semi-forcing them onto MSDN to do so isn’t the right way to go about it. They’ve committed to expanding the services offered through the Evaluation Center, but I doubt the best feature of TechNet, the absence of time and feature limits, will ever come to it. Perhaps if they were to do a TechNet cloud edition, one where all the software had to be run on Azure, I might sing a different tune, but I doubt that’ll ever happen.
As much as I praise Microsoft here, I can’t help but feel this is a bad move on their part, as it will only alienate a dedicated part of their user base that serves as front-line advocates for their products. I may not be a subscriber anymore, nor will I likely be one in the near future thanks to the benefits granted by my job, but I know many people who find a lot of value in the service, people who are de facto product evangelists because of it. I can only hope that they revamp the MSDN subscriptions to provide a similar level of service, as otherwise there’s really only one place people will turn to, and I know Microsoft doesn’t approve of it.
As longtime readers will know I’m quite keen on Microsoft’s Azure platform, and whilst I haven’t released anything on it I have got a couple of projects running on it right now. For the most part it’s been great, as previously I’d have to spend a lot of time getting my development environment right and then translating that onto another server to make sure everything worked as expected. Whilst this wasn’t beyond my capability, it was time burnt on activities that weren’t pushing the project forward, and it was often the reason I stopped bothering with projects altogether.
Of course as I continue down the Azure path I’ve run into the many limitations, gotchas and ideology clashes that have caused me several headaches over the past couple of years. Most of them can be traced back to my decision to use Azure Table Storage: my first post on Azure development was about running up against limitations I wasn’t completely aware of, and several more posts were dedicated to overcoming the shortcomings of Microsoft’s NoSQL storage backend. Since then I’ve delved into other aspects of the Azure platform, but today I’m not going to talk about the technology per se; today I’m going to tell you what happens when you hit your subscription/spending limit, something that can happen with only a couple of mouse clicks.
I’m currently on a program called Microsoft BizSpark, a kind of partner program whereby Microsoft and several other companies provide resources to people looking to build their own start-ups. Among the many awesome benefits I get from this (including an MSDN subscription that gives me access to most of the Microsoft software catalogue, all for free) Microsoft also provides an Azure subscription with access to a certain amount of resources. Probably the best part of the offer is the 1500 hours of free compute time, which allows me to run 2 small instances 24/7. Additionally I’ve got access to the upcoming Azure Websites functionality, which I used for a website I developed for a friend’s wedding. However, just before the wedding was about to go ahead, the website suddenly became unavailable and I went to investigate why.
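It’s worth spelling out how tight that allowance actually is. A quick back-of-the-envelope sketch (the 31-day month is my assumption for the worst case; the 1500 hours and 2 small instances are the figures from the offer):

```python
# Azure bills compute per instance-hour, so two always-on instances
# burn through the allowance twice as fast as one.
HOURS_IN_MONTH = 31 * 24      # worst case: a 31-day month = 744 hours
FREE_COMPUTE_HOURS = 1500     # monthly allowance under the BizSpark offer

small_instances = 2
hours_used = small_instances * HOURS_IN_MONTH   # 1488 instance-hours
headroom = FREE_COMPUTE_HOURS - hours_used      # a mere 12 hours spare

print(hours_used, headroom)  # 1488 12
```

With only a dozen spare hours in a long month, anything else that starts consuming compute hours can tip the subscription over the limit.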
As it turned out I had somehow hit my compute hours limit for the month, which results in all of your services being suspended until the rollover period. This was due to me switching the website from the free tier to the shared tier, which counts as consuming compute hours whenever someone hits the site. Removing the no-spend block did not immediately resolve the issue; however, a support query to Microsoft saw the website back online within an hour. My other project, the one chewing up the lion’s share of those compute hours, seemed to have up and disappeared even though the environment was still largely intact.
This is in fact expected behaviour when you hit either your subscription or spending limit for a particular month. Suspended VMs on Windows Azure don’t count as inactive and will thus continue to cost you money even whilst they’re not in use. To get around this, should you hit your spending limit those VMs are deleted, saving you money but also causing potential data loss. Now this might not be an issue for most people (for me all it entailed was republishing them from Visual Studio), but should you be storing anything critical on the local storage of an Azure role it will be gone forever. Whilst the nature of the cloud should make you wary of keeping anything critical outside of durable storage (like Azure Tables, SQL or blob storage), it’s still a gotcha that you probably wouldn’t be aware of until you ran into a situation similar to mine.
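The defensive pattern here is simple enough to sketch: treat the role’s local disk purely as scratch space and write anything you can’t afford to lose through to a durable store. This is a minimal illustration, not Azure SDK code; the `DurableStore` class is a stand-in for what would be blob storage or Azure Tables in a real deployment, kept in memory so the example runs anywhere:

```python
import os
import tempfile

class DurableStore:
    """Stand-in for a durable service like blob storage or Azure Tables.

    In production this would call the storage service's API; here it
    just keeps data in a dict so the pattern is runnable anywhere.
    """
    def __init__(self):
        self._blobs = {}

    def put(self, key, data):
        self._blobs[key] = data

    def get(self, key):
        return self._blobs[key]

def save_result(store, key, data, scratch_dir):
    # Local disk is fine as a working area...
    path = os.path.join(scratch_dir, key)
    with open(path, "wb") as f:
        f.write(data)
    # ...but the copy that matters goes to durable storage, so a
    # deleted role instance costs you nothing but a redeploy.
    store.put(key, data)
    return path

store = DurableStore()
with tempfile.TemporaryDirectory() as scratch:
    save_result(store, "report.bin", b"critical bytes", scratch)
# The scratch directory is gone now, much like a deleted VM's disk,
# but the durable copy survives.
print(store.get("report.bin"))  # b'critical bytes'
```

If the only state a role instance holds locally is reproducible (binaries, caches, temp files), then a spending-limit deletion really is just a republish away from recovery.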
Like any platform there are certain aspects of Windows Azure that you have to plan for, and chief among them is your spending limit. It’s pretty easy to simply put in your credit card details and go crazy provisioning as many VMs as you want, but sooner or later you’ll be looking to put limits on your spending, and it’s then that you have the potential to run into these kinds of issues.
So I might be coming around to the whole cloud computing idea, despite my history of being sceptical of it. Whilst I’ve yet to go any further than researching the possibilities, the cloud’s potential to eliminate a lot of the problems encountered when scaling would mean a lot less time spent on those issues and more on developing a better product. However, whilst the benefits of the cloud are potentially quite large, there’s also the glaring issue of vendor dependency and lock-in, as no two cloud providers are the same, nor are there any real standards around cloud computing. This presents a very large problem both for cloud-native services and for those looking to migrate to a cloud platform, as once you’ve made your choice you’re pretty much locked in unless you’re willing to pay for significant rework.
Right now my platform of choice is looking to be Windows Azure. Primarily this is because of platform familiarity: whilst the cloud might be a whole new computing paradigm, services built on the traditional Microsoft platform won’t have a hard time being migrated across. Additionally they’ve got a fantastic offer for MSDN subscribers, giving them a small instance and a whole swath of other goodies to get them off the ground. This is good news for aspiring entrepreneurs like myself, as Microsoft offers a free MSDN Premium subscription through its start-up program (called BizSpark) to companies that have been together for less than 3 years and have less than $1 million in revenue. However, as I compared this to the other cloud providers out there I noticed that no two were alike; in fact they were all at odds with each other.
Take the biggest cloud provider out there, Amazon’s EC2. Whilst the compute instances are pretty comparable, since they’re just the operating system, the other services (file storage, databases) are worlds away from each other, and it’s not just the API calls that differ. Amazon’s cloud offering is architecturally different from Microsoft’s, so much so that any code written for Azure would have to be wholly rewritten to function on their cloud. This means that when it comes time to move your service into the cloud you’ll have to make sure you trust the provider you’re going with. You could also double the entire budget and keep half of it in reserve should it ever come to that, but I doubt many people have that luxury.
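One partial mitigation, short of budgeting for a full rewrite, is to hide the provider’s storage behind your own interface so only a thin adapter ever touches provider-specific APIs. A minimal sketch of that pattern (the names here are my own; the in-memory adapter keeps it self-contained, whereas a real project would add one adapter per provider):

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """The only storage interface the application code is allowed to see."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    """Dev/test adapter. A real deployment would add one adapter per
    provider (one wrapping Azure's storage calls, one wrapping Amazon's);
    a migration then means rewriting only that thin adapter, not the app."""
    def __init__(self):
        self._data = {}

    def put(self, key: str, data: bytes) -> None:
        self._data[key] = data

    def get(self, key: str) -> bytes:
        return self._data[key]

def publish(store: ObjectStore, key: str, payload: bytes) -> None:
    # Application logic depends on the interface, never on the provider.
    store.put(key, payload)

store = InMemoryStore()
publish(store, "greeting", b"hello cloud")
print(store.get("greeting"))  # b'hello cloud'
```

It doesn’t make the architectural differences disappear, but it does confine the rewrite to one well-defined layer rather than the whole codebase.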
Like the format wars that have raged for the past century, such incompatibility between cloud providers only serves to harm the consumers of those services. Whilst I’m not suggesting there be one and only one cloud provider (corporations really can’t be trusted with monopolies, and I’d hate to think what a government-sanctioned cloud would look like), what’s missing from the current cloud environment is the level of interoperability we’re so used to seeing in this Internet-enabled age. The good news is that I’m not the only one to notice issues like this, and there are several movements working towards an open set of standards for cloud operators to adopt. This would not only provide the interoperability that’s currently lacking but would also give customers more confidence when working with smaller cloud operators, knowing they wouldn’t be left high and dry should one fail.
Like any technology in its infancy, cloud computing still has a ways to go before it can be counted amongst its more mature predecessors. Still, the idea has proven itself to be viable, usable and, above all, capable of delivering on some of the wild promises made back when it was still called SaaS. With d-day fast approaching for Lobaco I’ll soon be wandering into the cloud myself, and I’m sure I’ll have much more to write about it when the time comes. For now though I’m happy to say that my previous outlook on the cloud was wrong, despite the problems I’ve cited here today.