If the deafening outcry from nearly every one of my favourite games news sites and social media brethren is anything to go by, the console war has already been won and the new king is Sony. Whilst the fanboy in me would love to take this opportunity to stick it to all the Xboxers out there I honestly believe that Sony really didn’t do much to deserve the praise that’s currently being heaped on it. Rather, I feel the news coming out of E3 just shows how many missteps Microsoft took with the XboxOne, with Sony simply sitting on the sidelines, not really changing anything from what they’re currently doing today.
The one, and really the only, point this all hinged on was the as-yet-unknown stance Sony would take on DRM for the PlayStation4. It was rumoured that they were watching social media closely and that spurred many grassroots campaigns aimed at influencing them. The announcement came at E3 that they’d pretty much be continuing along the same lines as they are now, allowing you to trade/sell/keep disc based games without any restrictions built into the platform. This also means that developers are free to include online passes in their games, something which has thankfully not become too common but could be on the rise (especially with cross platform titles).
There wasn’t much else announced at E3 that got gamers excited about the PlayStation4 apart from seeing the actual hardware for the first time. One curious bit of information that didn’t receive a whole lot of attention though was the change in Sony’s stance on free multiplayer through the PlayStation Network. You’ll still be able to get a whole bunch of services for free (like Netflix/Hulu) but if you want multiplayer you’re going to have to shell out $5/month for the privilege. However this is PlayStation Plus, which means it comes with a whole bunch of other benefits like free full version games, so it’s not as bad as it sounds. Still it looks like Sony might have been capitalizing on the notion that there will be quite a few platform switchers this generation and thus took the opportunity to make the service mandatory for multiplayer.
It could also be partly to offset the (relatively) low price of the PlayStation4, with it clocking in at $399. Considering its specs it’s hard to believe that they’re not using the console as a loss leader yet again, something which I thought they were going to avoid this generation. If the lifespan of these consoles remains roughly the same that means they’ll at least make the console’s price back in subscription fees, plus any additional revenue from game sales. Some of that will have to fund the massive number of online services they’re planning to release, but overall it seems that at least part of that subscription cash will go towards offsetting the cheaper hardware.
The thing to note here is that the differences between Sony’s current and next generation console are far smaller than those for Microsoft. This is the same Sony who were ridiculed for releasing the PSN long after Xbox Live, pricing their console way above the competition and, even if it wasn’t for games specifically, had some of the most insane DRM known to man. The fact that not much has changed (they have, in fact, got objectively worse) and they’re being welcomed with open arms shows just how much Microsoft has dropped the ball.
Whether or not this will translate into lost sales though remains to be seen. The consumer market has an incredibly short memory and we’ve got a good 5 months between now and when the XboxOne goes on sale. It’s entirely possible that the current conversation is being dominated by the vocal minority and the number of platform loyalists will be enough to overcome that initial adoption hump (something the Wii U hasn’t been able to do). I’m sure that anyone who was on the fence about which one to get has probably made their mind up now based on these announcements but in all honesty those people are few and far between. I feel the majority of console gamers will get one, and only one, console and will likely not change platforms easily.
The proof will come this holiday season, however.
[UPDATE]: It has come to my attention that Sony has stated that they will not be allowing online passes from anyone. Chalk that up to yet another win for them.
After spending a week deep in the bowels of Microsoft’s premier tech conference and writing about it breathlessly for Lifehacker Australia you’d be forgiven for thinking I’m something of a Microsoft shill. It’s true that I think the direction they’re going in for their infrastructure products is pretty spectacular and the excitement for those developments is genuine. However if you’ve been here for a while you’ll know that I’m also among their harshest critics, especially when they do something that’s drastically out of line with my expectations as one of their consumers. However I believe in giving credit where it’s due, and a recent PA Report article has called Microsoft’s credentials in one area into question when honestly they shouldn’t be.
The article I’m referring to is this one:
I’m worried that there are going to be a few million consoles trying to dial into the home servers on Christmas morning, about the time when a mass of people begin to download new games through Microsoft’s servers. Remember, every game will be available digitally day and date of the retail version, so you’re going to see a spike in the number of people who buy their Xbox One games online.
I’m worried about what happens when that new Halo or Call of Duty is released and the system is stressed well above normal operating conditions. If their system falls, no matter how good our Internet connections, we won’t be able to play games.
Taken at face value this appears to be a fair comment. We can all remember times when the Xbox Live service came down in a screaming heap, usually around Christmas time or when a large release happened. Indeed even a quick Google search reveals there have been a couple of outages in recent memory, although digging deeper into them reveals that they were usually part of routine maintenance and only affected small groups of people at a time. With all the other criticism that’s being levelled at Microsoft of late (most of which I believe is completely valid) it’s not unreasonable to question their ability to keep a service of this scale running.
However as the title of this post alludes to I don’t think that’s going to be an issue.
The picture shown above is from the Windows Azure Internals session by Mark Russinovich which I attended last week at TechEd North America. It details the current infrastructure that underpins the Windows Azure platform which powers all of Microsoft’s sites including the Xbox Live service. If you have a look at the rest of the slides from the presentation you’ll see how far that architecture has come since they first introduced it 5 years ago when the over-subscription rates were much, much higher for the entire Azure stack. What this meant was that when something big happened the network simply couldn’t handle it and caved under the pressure. With this current generation of the Azure infrastructure however it’s far less oversubscribed and has several orders of magnitude more servers behind it. With that in mind it’s far less likely that Microsoft will struggle to service large spikes like they have done in the past as the capacity they have on tap is just phenomenal.
Of course this doesn’t alleviate the issues with the always/often on DRM or the myriad other issues that people are criticizing the XboxOne for, but it should show you that worrying about Microsoft’s ability to run a reliable service shouldn’t be one of them. Granted I’m just approaching this from an infrastructure point of view and it’s entirely possible for the Xbox Live system to have some systemic issue that will cause it to fail no matter how much hardware they throw at it. I’m not too concerned about that however as Microsoft isn’t your run of the mill startup that’s just learning how to scale.
I guess we’ll just have to wait and see how right or wrong I am.
All of my previous posts concerning Server 2012 (including those on LifeHacker) have been rather…high level, focusing more on what you can achieve with it rather than on concrete examples. I’ll admit this can be almost wholly attributed to laziness as I’ve had Server 2012 running on my home machine for quite some time now and just haven’t bothered installing any additional features on it. However one of my close friends is in the throes of setting up his own aerial photography business (using UAVs, super cool stuff) and offered up his home server as a guinea pig for a Server 2012 install, provided I give him a working VPN in return.
Initially I thought I’d install DirectAccess for him as it’s a pretty awesome piece of technology and implementing it appears to be a hell of a lot easier than it was on 2008¹. However the requirements were quite high for a VPN setup that would have at most a couple of users, needing a whole bunch of infrastructure that would serve no other purpose. In a rather strange coincidence one of my favourite Microsoft blogs, 4SysOps, wrote a post detailing the installation method for an SSTP VPN (one that tunnels over HTTPS) mere days before I was slated to go out and do the install for him.
Installing Server 2012 went incredibly smoothly and apart from a strange graphics card issue (the NVIDIA card he had in there didn’t seem to be able to regulate its fan without drivers, leading it to lock up when it overheated) there were no problems. Following the guide was for the most part successful, with everything going the way you’d expect it to. However there were a couple of gotchas that we ran into along the way that I thought I’d detail here in case anyone gets snagged on them.
We had several routing issues thanks to DNS entries taking far too long to expire, something we could have avoided with a little bit of forward planning. You can test the VPN internally by just using the local IP address, although you probably won’t be able to get all the way in as the SSL cert won’t match; it is handy for checking that all the plumbing is set up though. The most frustrating issue was that everything would seem to connect but would then immediately drop us out. Thankfully there were some events generated that allowed us to research this problem further, but I’m not a big fan of the solution.
The error we were getting was something like “Error 720: The user <username> connected to port <server> has been disconnected because no network protocols were successfully negotiated”. There are numerous posts detailing this exact error and after trying many of the solutions the only one that worked was this one. Essentially it looks like, at least with SSTP VPNs, relaying DHCP requests doesn’t work at all, which is what causes this error. Setting up a static pool of IP addresses, and excluding it on the DHCP server, allowed us to connect in without a hitch.
It appears that this issue is a hangover from previous versions of Windows Server as the Routing and Remote Access console looks like it’s straight out of 2003 without much modification to it (apart from the Network Policies section). Now I’m not going to say that it needs a revamp, indeed once we got around that particular issue it worked perfectly, but it could use a little love.
Overall I’m pretty happy with my first real world Server 2012 install as I was able to get a technology that I had no previous experience with (VPNs) up and running in a matter of hours with little more than patience and a whole bunch of Googling. I’m now tempted to give DirectAccess a go at home as I’ve been meaning to set up a lab for a while now and being able to demonstrate some of Server 2012’s capabilities anywhere I have an Internet connection would just be plain awesome. That might be a little while off though as next week I’ll be in New Orleans, knee deep in TechEd goodness.
¹I can remember reading about it when it was first released and thinking I’d give it a go but nearly every install guide had DO NOT USE IN PRODUCTION plastered all over it. This doesn’t seem to be the case anymore as there are many production ready guides available and they’re all pretty easy to follow.
I don’t think I’m alone in saying that the timing of Sony’s announcement at the beginning of the year was a little surprising. Sure, when they started inviting press to an event for an unnamed product we weren’t exactly surprised to find out it was the PlayStation 4, but by all accounts we were also under the impression that it was nowhere near as far along as the XboxOne was. Indeed I had gone on record several times saying that we’d likely see the new Xbox this year (which we will) and that the PS4 would follow sometime next year. It followed that Microsoft would be the first to announce, but Sony beat them to the punch and, based on the reaction of the gamer community, this move appears to have benefitted them greatly.
Announcing a product early is always a risky endeavour as everyone will pick up on any half-baked ideas and descend upon them in a torrent of Internet rage. Indeed Sony copped quite a bit of criticism for doing just that as they failed to show the console (which feeds into the idea that it’s not done yet) and hand waved over a couple of the more important questions like backwards compatibility. Still the announcement set the tone for the next console generation with Sony putting games at the forefront and putting heavy emphasis on the features they’d built to enhance the gaming experience.
What did Microsoft bring to the table? Well if the following video is anything to go by:
The announcement left me somewhat indifferent to Microsoft’s console, mostly thanks to the things that the above video highlights, and it appears that this sentiment has been echoed by many other gaming websites. Indeed whilst Microsoft may have made the right decision by broadening the console’s appeal through expanding its media offerings it’s certainly done nothing to endear them to the core crowd of Xbox gamers. You could leave the argument at that and still have a decent explanation for the backlash but honestly I think Sony’s press event set gamers’ expectations for the XboxOne long before Microsoft swaggered out with media-centric guns blazing.
Us gamers might have our various affiliations for different consoles, born from the days when we were allowed one and only one console which laid the groundwork for the fanboyism we see today, but we’re also acutely aware of what the competition is bringing to the table. Thus many of us would have been aware of what the PlayStation 4 was offering and would expect Microsoft to have an answer to each and every feature that Sony had lauded at the PS4’s launch. Microsoft didn’t do this however, instead focusing on what they perceive as the major use case for the XboxOne: media consumption. Now this might not be too far out of left field, indeed more hours are spent watching Netflix than playing games on Xbox today, but I doubt that many of those consoles were purchased solely for that. Indeed I believe most of them were bought by or for gamers primarily, with the media integration being a nice add on for anyone else who wanted to use it.
Whether this translates into lost sales for Microsoft remains to be seen as whilst us gamers are a vocal bunch it’s entirely likely that consumers at large will view it as a solid DVR that also plays games. Like the Nintendo Wii before it this could have the potential to open up the Xbox to a much larger market, bypassing the vocal gamer community. It will be interesting to see how the sentiment develops over the next 6 months as that will determine whether the XboxOne retains its currently loyal gamer community or whether Microsoft eschews them in favour of cementing the console’s foothold as the centre of your home entertainment system.
This year was already shaping up to be a great run for gamers, what with all the new IP heading our way and multiple high quality sequels, and the next console generation will likely be upon us before the year is out. Had you asked me last year what my predictions were I would’ve told you that we’d be lucky to see the next generation Xbox this year and it was far more likely that we’d see both of them sometime in 2014. I’m quite glad to be wrong in this instance however as whilst I might still be primarily a PC gamer I grew up on consoles and will always have a soft spot for them.
Today Microsoft officially announced the successor to the Xbox360: the XboxOne. If you’ve been following the rumours and leaks like I have there’s nothing too surprising about the console itself as it sports the exact specs that have been floating around for a while. However there are still a few surprises in Microsoft’s next generation console and the launch event clarified some of the more controversial rumours that had been flying around. Suffice to say that Sony and Microsoft have very different audiences in mind for their next gen offerings, meaning that the choice between the two might no longer be based on platform exclusives alone.
Whilst I won’t go over the hardware specifications as they’re near identical to those of the PS4 (although I can’t find a confirmation of DDR3 vs GDDR5) there were a couple of surprises under the hood of the XboxOne. For starters it’s sporting a BluRay drive, which was kind of expected but still up in the air thanks to Microsoft initially throwing its support behind HD DVD, giving a little credence to the rumour that they wouldn’t incorporate it into their next gen offering. It also brings with it an HDMI-in port, allowing those with set top boxes to run their TV through it. Whilst that doesn’t sound like much it’s telling of the larger strategy Microsoft has at play here: they’re marketing the XboxOne as much more than a games console.
Indeed all the other features they’ve included, like Snap Mode and the upgrades to their SmartGlass app, are heavily focused on media consumption and making the XboxOne the central point of your home entertainment setup. Considering that current generation Xboxes are used to watch media more than they are to play games this change in direction is not surprising, however it could alienate some of the more hardcore fans. It seems Sony was well aware of this as their launch focused far more heavily on the gaming experience their console could deliver rather than its additional media capabilities. The delineation then seems clear: if you want a gaming machine go for the PS4, but for everyone else there’s the XboxOne.
The Xbox has always been Microsoft’s last piece in the Three Screens puzzle and it appears that the XboxOne will in fact be running a version of Windows under the hood. In fact it’s running 3 different operating systems: Windows 8/RT, a second Xbox OS that’ll remain largely static (for developers), and a third layer that sounds more like a hypervisor, managing access to resources for the 2 main operating systems. I speculated last year that Microsoft would be looking to bring WinRT to the next gen Xbox and that appears to be the case, although how much of the functionality is directly compatible is still up for question as Microsoft has stated that you’ll “need to do some work” to port apps across.
Unfortunately it does look like Microsoft wants to take an axe to the second hand games market. Whilst the rumours of the console needing to be always online have turned out to be false (although games can make use of Azure Cloud Gaming services, which would require an online connection), installing a game to the hard drive locks it to that particular Xbox account, requiring a fee to install it on another. Whether or not you can play games without installing them is still up for debate and the answer to that will make or break the second hand games market.
Additionally there’s going to be no backwards compatibility to speak of, save for transferring of licenses for media and your gamer score. Whilst this was not unexpected this combined with the lack of a second hand games market might be a dealbreaker for some. Whether this will push more people to Sony remains to be seen though as whilst they’ve alluded to backwards compatibility possibly coming via some kind of cloud gaming service that won’t be something former Xboxers will care about. It’s far more likely that the decision will be made on what the console will primarily be used for: gaming or media.
I’ve been something of a stalwart “buy all the things” consumer ever since I had a job that would allow me to do so but with the announcement of the XboxOne I’m not sure that will be the case anymore. I say this because I believe that the vast majority of titles will be cross platform, thanks to the x86 architecture, and as yet there haven’t been any compelling exclusives announced for either platform that would draw me to it. The Xbox360 landed a purchase solely for Mass Effect but I get the feeling that we won’t see another title bound to a single platform like that again. With that in mind it’s highly likely that my current console collection will be slimmed down to one, and the last man standing will be the PS4.
I would love to be convinced otherwise though, Microsoft.
The story of the majority of IT workers is eerily similar. Most get their beginnings in a call centre, slaving away behind a headset troubleshooting various issues either for their end users or as part of a bigger help desk that services dozens of clients. Some are a little luckier, landing a job as the sole IT guy at a small company, which grants them all the creative freedom they could wish for but also shoulders them with the weight of being the be all and end all of their company’s IT infrastructure. No matter how us IT employees got our start all of us eventually look towards getting certified in the technologies we deal with every day and, almost instantly after getting our first, become incredibly cynical about what they actually represent.
For many the first certification they will pursue will be something from Microsoft since it’s almost guaranteed that every IT job you’ll come across will utilize it in some fashion. Whilst the value of the online/eLearning packages is debatable there’s little question that you’ll likely learn something that you didn’t already know, even if it’s completely esoteric and has no application in the real world. For anyone who’s spent a moderate amount of time with the product in question these exams aren’t particularly challenging as most of them focus on regurgitating the Microsoft way of doing things. This, in turn, feeds into their greatest weakness as they favour rote memorization over higher order concepts and critical thinking (at least at the introductory/intermediate levels).
This has led to a gray market which is solely focused on passing the exams for these tests. Whilst there are some great resources which fall into this area (Like CBT Nuggets) there are many, many more which skirt the boundaries of what’s appropriate. For anyone with a modicum of Google skills it’s not hard to track down copies of the exams themselves, many with the correct answers highlighted for your convenience. In the past this meant that you could go in knowing all the answers in advance and whilst there’s been a lot of work done to combat this there are still many, many people carrying certifications thanks to these resources.
The industry term for such people is “paper certs”.
People with qualifications gained in this way are usually quite easy to spot as rote memorization of the answers does not readily translate into real world knowledge of the product. However for those looking to hire someone this often comes too late as interview questions can only go so far to root these kinds of people out. Ultimately this makes those entry level certifications relatively worthless as having one of them is no guarantee that you’ll be an effective employee. Strangely however employers still look to them as a positive sign and, stranger still, companies looking to hire on talent from outsourcers again look for these qualifications in the hopes that they will get someone with the skills they require.
I say this as someone who’s managed to skate through the majority of his career without the backing of certs. Initially I thought this was due to my degree, which whilst being tangentially related to IT is strictly speaking an engineering one, but the surprise I’m met with when I mention that I’m an engineer by training has led me to believe that most of my former employers had no idea. Indeed what usually ended up sealing the position for me was my past experience, even in positions where certain certs were stated as a requirement. Asking my new employers about it afterwards had them telling me that those position descriptions are usually a wish list of things they’d like but it’s rare that anyone will actually have them all.
So we have this really weird situation where the majority of certifications are worthless, which is known by all parties involved, yet they’re still used as a barrier to entry for some positions/opportunities, a barrier that can be wholly overridden if you have enough experience in that area. If that sounds like the whole process is, for want of a better word, worthless then you’d be of the same opinion as most of the IT workers that I know.
There are some exceptions to this rule, Cisco’s CCIE exams being chief among them, but the fact that the training and certification programs are run by the companies who develop the products is the main reason why the majority of them are like this. Whilst I’m not entirely sure that having an independent certification body would solve all the issues (indeed some of those non-vendor specific certs are just as bad) it would at least remove the financial driver to churn as many people through the courses/exams as they currently do. Whilst I abhor artificial scarcity one of the places it actually helps is in qualifications, but that’d only be the first few tentative steps towards solving this issue.
If you’ve been here a little while you’ll know that last year I won a competition to go up to Brisbane to cover TechEd Australia 2012 for LifeHacker Australia. During my time up there I wrote three posts covering everything from PowerShell to the evolution of the term “private cloud” and why Windows Server 2012 would succeed. Evidently the LifeHacker writers and readers loved what I wrote and I ended up winning the mini-competition against the 2 other guest bloggers. At the time I was told that this would lead on to another series of posts for Microsoft themselves, however that never eventuated, but I did end up with a shiny new HP MicroServer that’s become the mainstay of my home network.
I thought that would be the end of it but a couple months ago Angus Kidman, the man behind much of LifeHacker Australia’s tech coverage, contacted me with an offer: come with him to the USA and participate in covering TechEd North America as part of their World of Servers initiative.
Of course I said yes.
It will be much the same as it was last year: I’ll be attending TechEd in New Orleans every day and writing up a post that sums up the lessons I take away from each day. The primary focus will still be on Server 2012, although with Microsoft’s increasing focus on cloud integration you can rest assured that I’ll be weaseling my way into as many Azure sessions as I possibly can. It’s going to be interesting to compare and contrast the two as I’m sure TechEd North America is going to be huge by comparison and hopefully that means we’ll get some juicy insights into some of Microsoft’s upcoming products.
But this post isn’t just for me to humble brag to you guys. I’m here to tell you that LifeHacker Australia is offering this very same opportunity to 2 lucky IT professionals! To enter all you have to do is fill out this entry form and answer a few questions about your IT chops. Once you’ve done that you’re in the running to win a fully paid trip to New Orleans to cover TechEd North America and you’ll get to hang out with me for the duration of the trip (most people would consider that a perk…most people).
If you’re a budding blogger hoping to get a foot in the door or just a tech head who loves everything Microsoft then there really isn’t a better opportunity than the one LifeHacker is offering here. You’ve only got until May 1st to get your entries in (that’s 2 weeks people!) so I’d encourage you to get it in sooner rather than later. I’m incredibly excited to be going along for the ride on this one and if my previous experience was anything to go by it’ll be a blast, and it’d be amazing if I could bring one of my readers along with me.
Hope to see you there!
As longtime readers will know I’m quite keen on Microsoft’s Azure platform and whilst I haven’t released anything on it I have got a couple of projects running on it right now. For the most part it’s been great as previously I’d have to spend a lot of time getting my development environment right and then translate that onto another server in order to make sure everything worked as expected. Whilst this wasn’t beyond my capability it was more time burnt on activities that weren’t pushing the project forward and was often the reason behind me not wanting to bother with them anymore.
Of course as I continue down the Azure path I’ve run into the many different limitations, gotchas and ideology clashes that have caused me several headaches over the past couple of years. I think most of them can be traced back to my decision to use Azure Table Storage, as my first post on Azure development was about running up against some of the limitations I wasn’t completely aware of, and this continued with several more posts dedicated to overcoming the shortcomings of Microsoft’s NoSQL storage backend. Since then I’ve delved into other aspects of the Azure platform but today I’m not going to talk about any of the technology per se; no, today I’m going to tell you about what happens when you hit your subscription/spending limit, something which can happen with only a couple of mouse clicks.
I’m currently on a program called Microsoft BizSpark, a kind of partner program whereby Microsoft and several other companies provide resources to people looking to build their own start ups. Among the many awesome benefits I get from this (including an MSDN subscription that gives me access to most of the Microsoft catalogue of software, all for free) Microsoft also provides me with an Azure subscription that gives me access to a certain amount of resources. Probably the best part of this offer is the 1500 hours of free compute time, which allows me to run 2 small instances 24/7. Additionally I’ve also got access to the upcoming Azure Websites functionality which I used for a website I developed for a friend’s wedding. However just before the wedding was about to go ahead the website suddenly became unavailable and I went to investigate why.
As it turned out I had somehow hit my compute hours limit for that month, which results in all your services being suspended until the rollover period. It appears this was due to me switching the website from the free tier to the shared tier, which then counts as consuming compute hours whenever someone hits the site. Removing the no-spend block on it did not immediately resolve the issue; however a support query to Microsoft saw the website back online within an hour. My other project though, the one that would be chewing up the lion’s share of those compute hours, seemed to have up and disappeared even though the environment was still largely intact.
This is in fact expected behaviour when you hit either your subscription or spending limit for a particular month. Suspended VMs on Windows Azure don’t count as being inactive and will thus continue to cost you money even whilst they’re not in use. To get around this, should you hit your spending limit those VMs will be deleted, saving you money but also causing some potential data loss. Now this might not be an issue for most people, for me all it entailed was republishing them from Visual Studio, but should you be storing anything critical on the local storage of an Azure role it will be gone forever. Whilst the nature of the cloud should make you wary of keeping anything important anywhere other than durable storage (like Azure Tables, SQL or blob storage) it’s still a gotcha that you probably wouldn’t be aware of until you ran into a situation similar to mine.
Like any platform there are certain aspects of Windows Azure that you have to plan for and chief among them is your spending limits. It’s pretty easy to simply put in your credit card details and then go crazy by provisioning as many VMs as you want but sooner or later you’ll be looking to put limits on it and it’s then that you have the potential to run into these kinds of issues.
If you’ve ever worked in a multi-tenant environment with shared resources you’ll know of the many pains that can come along with it. Resource sharing always ends up leading to contention and some of the time this will mean that you won’t be able to get access to the resources you want. For cloud services this is par for the course since you’re always accessing shared services, so any application you build on these kinds of platforms has to take this into consideration lest it spend an eternity crashing from random connection drop outs. Thankfully Microsoft has provided a few frameworks which will handle these situations for you, especially in the case of Azure SQL.
The Transient Fault Handling Application Block (or Topaz, a far better name in my view) gives you access to a number of classes which take out a lot of the pain of dealing with the transient errors you get when using Azure services. Of those the most useful one I’ve found is the RetryPolicy which, when instantiated with the SqlAzureTransientErrorDetectionStrategy, allows you to simply wrap your database transactions with a little bit of code in order to make them resistant to the pitfalls of Microsoft’s cloud SQL service. For the most part it works well: prior to using it I’d get literally hundreds of unhandled exception messages per day. It doesn’t catch everything however, so you will still need to handle some connection errors, but it does a good job of eliminating the majority of them.
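To make that a little more concrete, here’s a minimal sketch of the pattern (the exact namespaces and package names vary between versions of the block, so treat the usings, the retry values and the query as illustrative rather than gospel):

```csharp
using System;
using System.Data.SqlClient;
// Namespaces differ between releases of the block; these are from the
// Windows Azure integration pack I was using at the time.
using Microsoft.Practices.TransientFaultHandling;
using Microsoft.Practices.EnterpriseLibrary.WindowsAzure.TransientFaultHandling;

class RetryDemo
{
    static void Main()
    {
        // Retry up to 5 times with 2 seconds between attempts.
        var strategy = new FixedInterval(5, TimeSpan.FromSeconds(2));

        // The detection strategy decides which exceptions count as transient
        // (throttling, dropped connections, etc.) and are therefore worth retrying.
        var retryPolicy = new RetryPolicy<SqlAzureTransientErrorDetectionStrategy>(strategy);

        retryPolicy.ExecuteAction(() =>
        {
            using (var conn = new SqlConnection("<your Azure SQL connection string>"))
            using (var cmd = new SqlCommand("SELECT COUNT(*) FROM Orders", conn))
            {
                conn.Open();
                Console.WriteLine(cmd.ExecuteScalar());
            }
        });
    }
}
```

The nice part is that the retry logic lives in one place: anything you push through ExecuteAction gets the same treatment, rather than you sprinkling try/catch blocks around every query.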
Currently however there’s no native support for it in Entity Framework (Microsoft’s data persistence framework), which means you have to do a little wrangling in order to get it to work. This StackOverflow question outlines the problem and there are a couple of solutions on there which all work, however I went for the simple route of instantiating a RetryPolicy and then just wrapping all my queries with ExecuteAction. As far as I could tell this all works fine and is the supported way of using EF with Topaz, at least until EF 6 comes out, which will have built-in support for connection resiliency.
However when using Topaz in this way it seems to muck with entity tracking, causing returned objects to not be tracked in the normal way. I discovered this after I noticed many records not getting updated even though manually working through the data showed that they should be showing different values. As far as I can tell if you wrap an EF query with a RetryPolicy the entity ends up not being tracked and you will need to .Attach() it to the context prior to making any changes. If you’ve used EF before then you’ll see why this is strange as you usually don’t have to do that unless you’ve deliberately detached the entity or recreated the context. So as far as I can see there must be something in Topaz that causes the entity to become detached, requiring you to reattach it if you want to persist your changes using Context.SaveChanges().
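In case that’s hard to picture, here’s a rough sketch of the workaround I ended up with. The Post/BlogContext model is purely hypothetical and the detached behaviour is what I observed rather than anything documented, so take it as illustrative only:

```csharp
using System.Data.Entity;
using System.Linq;
using Microsoft.Practices.TransientFaultHandling;

// Hypothetical model and context, just to make the example self-contained.
public class Post
{
    public int Id { get; set; }
    public string Title { get; set; }
}

public class BlogContext : DbContext
{
    public DbSet<Post> Posts { get; set; }
}

public static class TopazEfExample
{
    public static void UpdateTitle(RetryPolicy retryPolicy, int postId, string newTitle)
    {
        using (var context = new BlogContext())
        {
            // Wrap the query so transient SQL Azure errors get retried.
            var post = retryPolicy.ExecuteAction(() =>
                context.Posts.First(p => p.Id == postId));

            // In my experience the returned entity behaves as though it's detached,
            // so re-attach it before modifying anything.
            context.Posts.Attach(post);

            post.Title = newTitle;

            // Without the Attach above this call completes but persists nothing.
            retryPolicy.ExecuteAction(() => context.SaveChanges());
        }
    }
}
```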
I haven’t tested any of the other methods of using Topaz with EF so it’s entirely possible there’s a way to get the entity tracked properly without having to attach it after performing the query. Whether they work or not will be an exercise left for the reader as I’m not particularly interested in testing it, at least not just after I got it all working again. By the looks of it though an RC version of EF 6 might not be too far away, so this issue probably won’t remain one for long.
I heap a lot of praise on Windows Azure here, enough for me to start thinking about how that’s making me sound like a Microsoft shill, but honestly I think it’s well deserved. As someone who’s spent the better part of a decade setting up infrastructure for applications to run on, and who then began developing said applications in his spare time, I really do appreciate not having to maintain another set of infrastructure. Couple that with the fact that I’m a full Microsoft stack kind of guy and it’s really hard to beat the tight integration between all of the products in the cloud stack, from the development tools to the back end infrastructure. So like many of my recent weekends I spent the previous one coding away on the Azure platform, and it was filled with some interesting highs and rather devastating lows.
For the uninitiated, Azure Web Sites are essentially a cut down version of the Azure Web Role, allowing you to run pretty much full scale web apps for a fraction of the cost. Of course this comes with limitations and unless you’re running at the Reserved tier you’re essentially sharing a server with a bunch of people (i.e. a common multi-tenant scenario). For this site, which isn’t going to receive a lot of traffic, it’s perfect and I wanted to deploy the first cut of the app onto this platform. Like any good admin I simply dove in head first without reading any documentation on the process and to my surprise I was up and running in a matter of minutes. It was pretty much create web site, download publish profile, click Publish in Visual Studio, import profile and wait for the upload to finish.
Deploying a web site on my own infrastructure would be a lot more complicated as I can’t tell you how many times I’ve had to chase down dependency issues or missing libraries that I have installed on my PC but not on the end server. The publishing profile coupled with the smarts in Visual Studio was able to resolve everything (the deployment console shows the whole process, it was actually quite cool to watch) and have it up and running at my chosen URL in about 10 minutes total. It’s very impressive considering this is still considered preview level technology, although I’m more inclined to classify it as a release candidate.
Other Azure users can probably guess what I’m going to write about next. Yep, the horrific storage problems that Azure had for about 24 hours.
I noticed some issues on Friday afternoon when my current migration (yes that one, it’s still going as I write this) started behaving…weirdly. The migration is in its last throes and I expected the CPU usage to start ramping down as the multitude of threads finished their work, and this lined up with what I was seeing. However I noticed the number of records migrated wasn’t climbing at the rate it was previously (usually indicative of some error happening that I suppressed in order for the migration to run faster) but the logs showed that it was still going, just at a snail’s pace. Figuring it was just the instance dying I reimaged it and then the errors started flooding in.
Essentially I was disconnected from my NoSQL storage, so whilst I could browse my migrated database I couldn’t keep pulling records out. This also had the horrible side effect of not allowing me to deploy anything as it would come back with SSL/TLS connection issues. Googling this led to all sorts of random posts as the error is also shared by the libraries that power the WebClient in .NET, so it wasn’t until I stumbled across the ZDNet article that I knew I wasn’t in the wrong. Unfortunately you were really up the proverbial creek without a paddle if your Azure application depended on storage, as the temporary fixes for this issue, either disabling SSL for storage connections or usurping the certificate handler, left your application rather vulnerable to all sorts of nasty attacks. I’m one of the lucky few who could simply do without until it was fixed but it certainly highlighted the issues that can occur with PaaS architectures.
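For what it’s worth, the “disable SSL” workaround amounted to little more than flipping the storage connection string over to HTTP, something along the lines of the sketch below (account name and key are placeholders), which is exactly why it left applications exposed:

```csharp
using System;
using Microsoft.WindowsAzure.Storage;

class StorageWorkaround
{
    static void Main()
    {
        // WARNING: plain HTTP means storage traffic is no longer protected
        // in transit - only ever a temporary stop-gap.
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=http;" +   // normally https
            "AccountName=myaccount;" +
            "AccountKey=<base64 key>");

        Console.WriteLine(account.TableEndpoint); // now an http:// endpoint
    }
}
```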
Honestly though that’s the only issue (that hasn’t been directly my fault) I’ve had with Azure since I started using it at the end of last year and, comparing it to other cloud services, it doesn’t fare too badly. It has made me think about what contingency strategy I’ll need to implement should any part of the Azure infrastructure go away for an extended period of time though. For the moment I don’t think I’ll worry too much as I’m not earning any income from the things I build on it, but it will definitely be a consideration as I begin to unleash my products onto the world.