Posts Tagged 'windows'

Capturing a Before and After Performance Report for Windows Servers.

The current project I'm on has a requirement to determine a server's overall performance before and after a migration, mostly to make sure that it still functions as well as, or better than, it did once it's on the new platform. Whilst it's easy enough to get raw statistics from perfmon, getting an at-a-glance view of how a server is performing before and after a migration is a far more nuanced concept, one that's not easily accomplished even with some Excel wizardry. With that in mind I thought I'd share my idea for creating such a view, as well as outline the challenges I've hit when attempting to collate the data.

Perfmon Data

At a high level I've focused on the 4 core resources that all operating systems consume: CPU, RAM, disk and network. For the most part these metrics are easily captured by the counters perfmon already has, however I wanted to go a bit further to make sure that the final comparisons represented a more "true" picture of before and after performance. To do this I included some additional qualifying metrics which would show whether increased resource usage was negatively impacting performance, or whether the server was simply consuming more resources because it could, since the new platform had much more capacity. With that in mind these are the metrics I settled on using:

  • Average of CPU usage (24 hours), Percentage, Quantitative
  • CPU idle time on virtual host of VM (24 hours), Percentage, Qualifying
  • Top 5 services by CPU usage, List, Qualitative
  • Average of Memory usage (24 hours), Percentage, Quantitative
  • Average balloon driver memory usage (24 hours), MB consumed, Qualifying
  • Top 5 services by Memory usage, List, Qualitative
  • Average of Network usage (24 hours), Percentage, Quantitative
  • Average TCP retransmissions (24 hours), Total, Qualifying
  • Top 5 services by Network bandwidth utilized, List, Qualitative
  • Average of Disk usage (24 hours), Percentage, Quantitative
  • Average queue depth (24 hours), Total, Qualifying
  • Top 5 services by Storage IOPS/Bandwidth utilized, List, Qualitative

Essentially these metrics can be broken down into 3 categories: quantitative, qualitative and qualifying. Quantitative metrics are the base metrics which will form the main part of the before and after analysis. Qualitative metrics are mostly just informational (being the top 5 consumers of a given resource), however they'll provide some useful insight into what might be causing an issue. For example, if an SQL box isn't showing the SQL process as a top consumer then it's likely something is causing that process to take a dive before it can actually use any resources. Finally, the qualifying metrics are used to indicate whether or not increased usage of a certain metric signals an impact to the server's performance; if memory usage is high and the memory balloon size is also high, for instance, it's quite likely the system isn't performing very well.
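To give an idea of how the quantitative metrics get gathered, here's a minimal PowerShell sketch that samples a handful of standard perfmon counters and averages them over the collection window. The counter paths and sample interval are illustrative assumptions rather than the exact profile I used:

```powershell
# Illustrative counter paths - adjust these to match whatever profile you're collecting
$counters = @(
    '\Processor(_Total)\% Processor Time',
    '\Memory\% Committed Bytes In Use',
    '\PhysicalDisk(_Total)\% Disk Time',
    '\Network Interface(*)\Bytes Total/sec'
)

# Sample every 60 seconds; 1440 samples covers a 24 hour window
$samples = Get-Counter -Counter $counters -SampleInterval 60 -MaxSamples 1440

# Average each counter across the window to get the "Average of X (24 hours)" figures
$samples.CounterSamples |
    Group-Object -Property Path |
    ForEach-Object {
        [pscustomobject]@{
            Counter = $_.Name
            Average = [math]::Round(($_.Group.CookedValue | Measure-Object -Average).Average, 2)
        }
    }
```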

The vast majority of these metrics are provided by perfmon, however there were a couple that I couldn't seem to get through the counters even though I could see them in Resource Monitor. As it turns out, Resource Monitor makes use of the Event Tracing for Windows (ETW) framework, which gives you an incredibly granular view of all events that are happening on your machine. What I was looking for was a breakdown of disk and network usage per process (in order to generate the top 5 users list), which is unfortunately bundled up in the aggregate IO counters available in perfmon. In order to split these out you have to run a kernel trace through ETW and then parse the resulting file to get the metrics you want. It's a little messy, but unfortunately there's no good way to get those metrics separated. The resulting perfmon profile I created can be downloaded here.
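For reference, a kernel trace of roughly this shape can be started and converted to something parseable with the built-in logman and tracerpt tools. The keyword list and file paths below are illustrative rather than the exact trace configuration I used, and it needs to be run from an elevated prompt:

```powershell
# Start the kernel logger with process, disk and network keywords (must be run elevated)
logman start "NT Kernel Logger" -p "Windows Kernel Trace" "(process,disk,net)" -o C:\Temp\kernel.etl -ets

# ...let it run for the collection window, then stop the session
logman stop "NT Kernel Logger" -ets

# Convert the binary trace to CSV so the per-process disk/network events can be parsed
tracerpt C:\Temp\kernel.etl -o C:\Temp\kernel.csv -of CSV
```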

The next issue I've run into is getting the data into a readily digestible format. You see, not all servers are built the same and not all of them run the same software. This means that when you open up the resulting CSV files from different servers the column headers won't line up, so you've got to either do some tricky Excel work (which is often prone to failure) or get freaky with some PowerShell (which is messy and complicated). I decided to go for the latter as at least I can maintain and extend a script somewhat easily, whereas an Excel spreadsheet has a tendency to get out of control faster than anyone expects. That part is still a work in progress, however I'll endeavour to update this post with the completed script once I've got it working.
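As a rough sketch of the PowerShell approach (the counter name patterns and file layout here are assumptions, not the finished script), the idea is to read each server's perfmon CSV, match the columns of interest by pattern rather than by position, and emit one uniform row per server:

```powershell
# Normalise perfmon CSV exports from different servers into a common shape.
# The counter name patterns below are illustrative - adjust them to the profile in use.
$patterns = @{
    Cpu     = '% Processor Time'
    Memory  = '% Committed Bytes In Use'
    Disk    = '% Disk Time'
    Network = 'Bytes Total/sec'
}

Get-ChildItem -Path 'C:\PerfLogs' -Filter '*.csv' | ForEach-Object {
    $rows    = Import-Csv -Path $_.FullName
    $columns = $rows[0].PSObject.Properties.Name

    $result = [ordered]@{ Server = $_.BaseName }
    foreach ($metric in $patterns.Keys) {
        # Find whichever column on this particular server matches the counter pattern
        $column = $columns | Where-Object { $_ -like "*$($patterns[$metric])*" } | Select-Object -First 1
        if ($column) {
            $values = $rows.$column | Where-Object { $_ -match '^\d' } | ForEach-Object { [double]$_ }
            $result[$metric] = [math]::Round(($values | Measure-Object -Average).Average, 2)
        }
    }
    [pscustomobject]$result
} | Export-Csv -Path 'C:\PerfLogs\summary.csv' -NoTypeInformation
```

Each server then contributes one row with the same columns, which makes the before and after comparison in Excel fairly painless.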

After that point it's a relatively simple task of displaying everything in a nicely formatted Excel spreadsheet and doing comparisons based on the metrics you've generated. If I had more time on my hands I probably would've tried to integrate it into something like a SharePoint BI site so we could do some groovy tracking and intelligence on it, but due to tight time constraints I probably won't get that far. Still, a well laid out spreadsheet isn't a bad format for presenting such information, especially when you can colour everything green when things are going right.

I'd be keen to hear other people's thoughts on how you'd approach a problem like this, as trying to quantify the nebulous idea of "server performance" has proven to be far more challenging than I first thought it would be. Part of this is due to the data manipulation required, but it was also ensuring that all aspects of a server's performance were covered and converted down to readily digestible metrics. I think I've gotten close to a workable solution with this, but I'm always looking for ways to improve it, or for a magical tool out there that will do this all for me 😉

Windows Threshold: Burying Windows 8 for the Sake of 9.

It's hard to argue that Windows 8 has been a great product for Microsoft. In the 2 years that it's been on the market it's managed to secure some 12% of total market share, which sounds great on the surface, however its predecessor managed to nab some 40% in a similar time frame. The reasons behind this are wide and varied but there's no mistaking that a large part of it was the Metro interface, which just didn't sit well with primarily desktop users. Microsoft, to their credit, has responded to this criticism by giving consumers what they want, but like Vista the product that Windows 8 is today is overshadowed by its rocky start. It seems clear now that Microsoft is done with Windows 8 as a platform and is looking towards its successor, codenamed Windows Threshold.

Not a whole lot is known about what Threshold will entail, but what is known points to a future where Microsoft is distancing itself from Windows 8 in the hopes of getting a fresh start. It's still not known whether Threshold will become known as Windows 9 (or whatever name they might give it), however the current release date is slated for sometime next year, in line with Microsoft's new dynamic release schedule. This would also put it at 3 years after the initial release of Windows 8, which ties into the larger Microsoft product cycle. Indeed most speculators are pegging Threshold to be much like the Blue release of last year, with all Microsoft products receiving an update upon release. What interests me about this release isn't so much what it contains as what it's going to take away from Windows 8.

Whilst Microsoft has made inroads to making Windows 8 feel more like its predecessors, the experience is still deeply tied to the Metro interface. Pressing the Windows key doesn't bring up the start menu and Metro apps still have that rather obnoxious behaviour of taking over your entire screen. Threshold however is rumoured to do away with this, bringing back the start menu with a Metro twist that will allow you to access those kinds of applications without having to open up the full interface. Indeed for desktop systems, those that are bound to a mouse and keyboard, Metro will be completely disabled by default. Tablets and other hybrid devices will still retain the UI, with the latter switching between modes depending on how they're being used (desktop when docked, Metro when in tablet form).

From memory such features were actually going to make up parts of the next Windows 8 update, not the next version of Windows itself. Microsoft did add some similar features to Windows 8 in the last update (desktop users now default to the desktop on login, not Metro) but the return of the start menu and the other improvements are seemingly not for Windows 8 anymore. Considering just how poor Windows 8's adoption rate has been this isn't entirely surprising, and Microsoft might be looking for a clean break away from Windows 8 in order to drive better adoption of Threshold.

It’s a strategy that has worked well for them in the past so it shouldn’t be surprising to see Microsoft doing this. For those of us who actually used Vista (after it was patched to remedy all the issues) we knew that Windows 7 was Vista under the hood, it was just visually different enough to break past people’s preconceptions about it. Windows Threshold will likely be the same, different enough from its direct ancestor that people won’t recognise it but sharing the same core that powered it. Hopefully this will be enough to ensure that Windows 7 doesn’t end up being the next XP as I don’t feel that’s a mistake Microsoft can afford to keep repeating.


How Long Will it Take for Enterprise IT to Embrace Rapid Innovation?

The IT industry has always been one of rapid change and upheaval, with many technology companies only lasting as long as they could keep innovating. This is at odds with the way traditional businesses operate, preferring to stick to predictable cycles and seek gains through incremental improvements in process, procedure and marketing. The eventuality of this came in the form of the traditional 3~5 year cycle that many enterprises engaged in, upgrading to the latest available technology usually years after it had been released. However the pace of innovation has increased to the point where such a cycle could leave an organisation multiple generations behind, and it's not showing any signs of slowing down soon.

Windows 8.2 Start Bar

I mentioned last year how Microsoft's move from a 3 year development cycle to a yearly one was a good move, allowing them to respond to customer demands much more quickly than they were previously able to. However the issue I've come across is that whilst I, as a technologist, love hearing about the new technology, customer readiness for this kind of innovation simply isn't there. The blame for this almost wholly lies at the feet of XP's 12 year dominance of the desktop market, something which even the threat of losing support did little to change. So whilst the majority may have made the transition now, they're by no means ready for a technology upgrade cycle that happens on a yearly basis. There are several factors at play here (tools, processes and product knowledge being the key ones) but the main issue remains the same: there's a major disjoint between Microsoft's current release schedule and its adoption among its biggest customers.

Microsoft, to their credit, are doing their best to foster rapid adoption. Getting Windows 8.1 at home is as easy as downloading an app from the Windows Store and waiting for it to install, something you can easily do overnight if you can't afford the downtime. Similarly the tools available to do deployments on a large scale have improved immensely, something anyone who's used System Center Configuration Manager 2012 (and its previous incarnations) will attest to. Still, even though the transition from Windows 7 to 8 or above is much lower risk than from XP to 7, most enterprises aren't looking to make the move, and it's not just because they don't like Windows 8.

With Windows 8.2 slated for release sometime in August this year, Windows 8 will retain an almost identical look and feel to that of its predecessors, allowing users to bypass the Metro interface completely and giving them back the beloved start menu. With that in place there's almost no reason for people not to adopt the latest Microsoft operating system, yet it's unlikely to see a spike in adoption due to the inertia of large IT operations. Indeed even those that have managed to make the transition to Windows 8 probably won't be able to make the move until 8.3 makes its debut, or possibly even Windows 9.

Once the Windows 8 family becomes the standard, however, I can see IT operations looking to move towards a more rapid pace of innovation. The changes between the yearly revisions are much less likely to break or change core functionality, removing much of the risk that came with adopting a new operating system (application remediation). Additionally, once IT sections have moved to better tooling, upgrading their desktops should also be a lot easier. I don't think this will happen for another 3+ years however, as we're still in the midst of an XP hangover, one that's not likely to subside until its market share is in the single digits. Past that we administrators then have the unenviable job of convincing our businesses that engaging in a faster product update cycle is good for them, even if the cost is low.

As someone who loves working with the latest and greatest from Microsoft it's an irritating issue for me. I spend countless hours trying to skill myself up only to end up working on 5+ year old technology for the majority of my work. Sure it comes in handy eventually, but the return on investment feels extremely low. It's my hope that the cloud movement, which has already driven a lot of businesses to look at more modern approaches to the way they do their IT, will be the catalyst by which enterprise IT begins to embrace a more rapid innovation cycle. Until then however I'll just lament all the Windows Server 2012 R2 training I'm doing and wait until TechEd rolls around again to figure out what's obsolete.

What Kind of Microsoft Can We Expect From Satya Nadella?

In the time that Microsoft has been a company it has only known two Chief Executive Officers. The first was, unforgettably, Bill Gates, the point man of the company from its founding days who saw it grow from a small software shop to the industry giant of the late 90s. Then, right at the beginning of the new millennium, Bill Gates stood down and passed the crown to long time business partner Steve Ballmer, who spent the next 14 years attempting to transform Microsoft from a software company into a devices and services one. Rumours had been spreading for some time about who was slated to take over from Ballmer and, last week, after much searching, Microsoft veteran Satya Nadella took over as the third CEO of the venerable company. Now everyone is wondering where he will take it.

For those who don't know him, Nadella's heritage in Microsoft comes from the Server and Tools department, where he held several high ranking positions over a number of years. Most notably he's been in charge of Microsoft's cloud computing endeavours, including building out Azure, which hit $1 billion in sales last year, something I'm sure helped to seal the deal on his new position. Thus many would assume that Nadella's vision for Microsoft would trend along these lines, something which runs a little contrary to the more consumer focused business that Ballmer sought to deliver, however his request to have Bill Gates step down as Chairman of the Board so that he could have him as an advisor in this space says otherwise.

As with any changing of the guard, many seek to impress upon the new bearer their wants for the future of the company. Nadella has already come under pressure to drop some of Microsoft's less profitable endeavours, including things like Bing, Surface and even the Xbox division (despite it being quite a revenue maker, especially as of late). Considering these products are the culmination of the efforts of the 2 previous CEOs, both of whom will still be involved in the company to some degree, taking an axe to them would be an extraordinarily hard thing to do. These are the products they've spent years and billions of dollars building, so dropping them seems like a short sighted endeavour, even if it would make the books look a little better.

Indeed many of the business units which certain parties would look to cut are the ones that are seeing good growth figures. The Surface has gone from a $900 million write-down disaster to losing a paltry $39 million over 2 quarters, an amazing recovery that signals profitability isn't too far off. Similarly Bing, the search engine that we all love to hate on, saw a 34% increase in revenue in a single quarter. It's not even worth mentioning the Xbox division as it's been doing well for years now, and the release of the Xbox One with its solid initial sales ensures it remains one of Microsoft's better performers.

The question then becomes whether Nadella, and the board he now serves, see the ongoing value in these projects. Indeed much of the work Microsoft has done in the past decade has been focused on unifying the many disparate parts of their ecosystem, heading towards that unified nirvana where everything works together seamlessly. Removing these products from the picture feels like Microsoft backing itself into a corner, one where it can be shoehorned into a narrative that sees it easily lose ground to the competitors it has been fighting for years. In all honesty I feel Microsoft is so dominant in those sectors already that there's little to be gained from receding from perceived failures, and Nadella should take this chance to innovate on his predecessor's ideas, not toss them out wholesale.


Increasing Microsoft’s Agility With Windows Blue.

Microsoft's flagship product, Windows, isn't exactly known for its rapid release cycle. Sure, for things like patches, security updates and so on they're probably one of the most responsive companies out there. The underlying operating system however is updated much less frequently, with the base feature set remaining largely the same for the current 3 year product life cycle. In the past that was pretty much sufficient, as the massive third party application market for Windows made up for anything that might have been lacking. Customers are increasingly looking for more fully featured platforms however, and whilst Windows 8 is a step in the right direction it had the potential to start lagging behind its more frequently updated brethren.

Had Windows 8 stayed a pure desktop OS this wouldn't be a problem, as the 3 year product cycle fits in perfectly with their largest customer base: the enterprise. Since Windows 8 will now form the basis of every Microsoft platform (or at least the core WinRT framework will) they're now playing in the same realm as iOS and Android. Platform updates for these two operating systems happen far more frequently, and should Microsoft want to continue playing in this field they will have to adapt more rapidly. Up until recently I didn't really know how Microsoft was planning to accomplish this, but it seems they've had something in development for a while now.

Windows Blue

Windows Blue is shaping up to be the first feature pack for Windows 8, scheduled for release sometime toward the end of this year. It's also the umbrella term for similar updates happening across the entire Microsoft platform around the same time, including their online services like Outlook.com and SkyDrive. This will be the first release of what will become a yearly platform update that brings new features to Windows and its surrounding ecosystem. It won't replace the traditional platform releases however, as there are still plans to deliver Windows 9 on the same 3 year cycle that we've seen for the past 2 Windows releases.

Whilst much of the press has been around the leaked Blue build and what that means for the Windows platform, it seems that this dedication to faster product cycles goes far deeper. Microsoft has shifted its development mentality away from its traditional iterative process to a continuous development process, no small feat for a company of this magnitude. Thus we should expect the entire Microsoft ecosystem, not just Windows, to see a similarly rapid pace of development. They had already done this with their cloud offerings (which seem to gain new features every year) and the success they saw there has been the catalyst for applying it to the rest of their product suites.

Microsoft has remained largely unchallenged in the desktop PC space for the better part of 2 decades, but the increasing power of mobile devices has begun to erode their core business. They have made the smart move to start competing in that space with a unified architecture that will enable a seamless experience across all platforms. The missing piece of the puzzle was the ability to rapidly iterate on said platform like the majority of their rivals do, something which the Blue wave of products will begin to rectify. Whether it will be enough to pull up some of their worse performing platforms (Windows Phone) remains to be seen, but I'm sure we can agree that it will be beneficial, both for Microsoft and for us as consumers.


Housekeeping And Heads Up For Next Week.

Just going to make a quick housekeeping post today as there's a couple of things I want to update you guys on. If you're a dedicated LifeHacker reader you may have noticed that my ugly mug graced the front page for a while yesterday, and yes it's true: I'll be covering TechEd 2012 Australia for them. It's an incredible opportunity and I'm very excited to be doing it, so for most of next week I'll probably be recapping my day on here with all the real writing appearing on LifeHacker's site. The posts on here probably won't be at their usual time however, so if you're looking for your regular lunch time-ish article I'm going to have to disappoint you for a while.

I'm in the middle of migrating this blog from my old Windows VPS, which has served me well over the past couple of years, to a Linux VPS with a ton more capacity. I tried to make the move last night, but after getting everything up and running things seemed to go pear shaped and nothing but index.php was being served by Apache, so I trashed it all and started again this morning. I'm hopeful that this migration will go smoothly, but if things disappear it's mostly because the two databases weren't completely in sync at the time. This post was written on the old server and will likely disappear when the real migration occurs. Once that happens though I'll know everything has worked and I'll be working to get everything back up again.

Also, if you'll allow me to get a little sappy for a second, I want to give you my heartfelt thanks for reading my tripe for the past 4 years, as that was what motivated me to enter the LifeHacker competition in the first place. I didn't start off as a great writer (as I've been told several times in no uncertain terms) but the feedback, comments and pageviews you guys gave me were enough incentive to keep writing and improving my craft to the point where I felt confident enough to attempt something like this. That being said, the true test is going to be how well the wider public receives my writing, which is making me both excited and extremely nervous at the same time. Still, I have no doubt it's going to be great and I really do feel that all of you helped me get there in some way.

Now back to configuring Apache… 😉

Valve’s End Game For Steam (or The Birth of SteamOS).

My first interaction with Steam wasn't a pleasant one. I remember the day clearly: I was still living out in Wamboin when Valve released Half Life 2 and had made sure to grab myself a copy before heading home. After going through the lengthy install process requiring multiple CD swaps I was greeted by a login box asking me to create an account. Frustratingly, all my usual gamer tags (PYROMANT|C, SuperDave, Nalafang, etc.) were already taken, leaving me to choose a random name. That wasn't the real annoyance though; what got me was the required update that needed to be applied before I could play which, on the end of a 56K connection, was going to take the better part of an hour.

This soured me on the idea of Steam for quite a few years, at least until I got myself a stable form of broadband that let me update without having to wait hours at a time. Still, it wasn't until probably 3 years or so ago that I started buying most of my games through Steam, as up until then buying the physical media and integrating it with Steam later was still a much better experience. Today though it's my platform of choice when purchasing games and it seems I'm not alone in this regard, with up to 70% of all digital sales passing through the platform. We've also seen Steam add many more features like Steam Cloud and Steamworks, which have provided a platform for developers to add features that would have otherwise been too costly to develop themselves.

With all the success that Steam has enjoyed (in the process making Valve one of the most profitable companies per employee) it makes you wonder what the end game for Steam will be. Whilst they'd undoubtedly be able to coast along quite easily on recurring sales and the giant community they've built around the platform, history has shown that Valve isn't that kind of company. Indeed the recent press release from Valve saying that traditional applications will soon be available through the Steam platform seems to indicate that they have ambitions that extend past their roots in gaming and digital distribution.

And it's at this point that I start speculating wildly.

Valve has shown that it is dedicated to gamers regardless of the platform, with Steam already on OSX and soon finding its way onto Linux alongside a native port of Left 4 Dead 2. With such a deep knowledge of games and an engine that runs on nearly any platform, it would make sense for Valve to take a stab at cutting out the middle man entirely, choosing to create their own custom operating system that's solely dedicated to gaming. If such an idea were to come to fruition it would most likely be some kind of Linux derivative with a whole bunch of optimizations to make Source titles run better. I'll be honest with you, when this idea was suggested to me I thought it was pretty far out, but there are some threads within it that have merit.

Whilst the idea of SteamOS as a standalone operating system might be a bit far fetched, I could see something akin to media centre software that transforms a traditional Windows/Linux/OSX PC into a dedicated gaming machine. Steam's strength arguably comes from the giant catalogue of third party titles it has on there, and keeping the underlying OS (with its APIs intact) means that all these games would still be available. This also seems to line up with the rumoured SteamBox idea that was floating around at the start of the year, and would mean that the console was in fact just a re-badged Windows PC with some custom hardware underneath. The console itself might not catch on (although the success of the OUYA seems to indicate otherwise), but I could very well see people installing SteamOS beside their XBMC installation, turning their media PC into a dual use machine.

With all this in mind you have to then ask yourself what Valve would get out of something like this. They are already making headway into getting Steam in one form or another onto existing consoles (see Steam for the PS3) and they've arguably already captured the lion's share of PC gamers, the ones who'd be most likely to use something like SteamOS. The SteamBox would arguably be targeted at people who are not traditionally PC gamers, and SteamOS then would simply be an also-ran, something that would provide extra value to their already dedicated PC community. Essentially it would further cement Steam as the preferred digital distribution network for games whilst also attempting to capture a market they've had little to do with up until this point.

All of this though is based on the current direction Valve seems to be heading in, and realistically I could just be reading way too much into it. Their recent moves with the Steam platform are arguably just Valve trying to grow their platform organically and could very easily not be part of some grander scheme for platform dominance. The idea though is intriguing, and whilst I have nothing more than speculation to go on I don't think it would be a bad move by Valve at all.

Changing the User Paradigm with Windows 8.

As any IT admin will tell you, users aren't really the best at coping with change. It's understandable though; for many people the PC they use in their everyday work is simply a tool with which to accomplish their required tasks, nothing more. Fundamentally changing the way that tool works means that they also have to change the way they work, and often this is met with staunch resistance. As such it's rather difficult for new paradigms to find their feet, often requiring at least one failed or mediocre product to be released in order for the initial groundwork to be done so that the next generation can enjoy the success its predecessor was doomed never to achieve.

We don't have to look that far into the past to see an example of this happening. Windows Vista was something of a commercial failure, which can be traced to 2 very distinct issues. The first, and arguably the most important, was the lack of driver support from vendors, leaving many users with hardware that simply couldn't run Vista even if it was technically capable of doing so. The second was the major shift in the user experience, with the start menu being completely redesigned and many other parts of the operating system being revamped. These 2 items were the one-two knock-out punch that put Vista in the graveyard and gave Windows 7 one hell of an uphill battle.

Windows 8, whilst not suffering from the driver disaster that plagued Vista, revamps the user experience yet again. This time however it's more than just a splash of eye candy with a rearranging of menu items; it's a full on shift in how Windows PCs will be used. Chief amongst these changes is the Metro UI which, after being field tested on Windows Phone 7 handsets, has found its way onto the desktop and every Windows powered device. Microsoft has made it clear that this will be the way they'll be doing everything in the future and that the desktop as we know it will soon fade away in favour of a Metro interface.

This has drawn the ire of IT professionals and it's easy to see why. Metro is at its heart designed for users, taking cues from the success that Apple has achieved with its iOS range of products. However whilst Apple is happy to slowly transform OS X into another branch of their iOS line, Microsoft has taken the opposite approach, unifying all their ecosystems under the one banner of Metro (or more aptly WinRT). This is a bold move from Microsoft, essentially a bet that the near future of PC usage won't be on the desktop, the place where the company has established itself as the dominant player in the market.

And for what it's worth they're making the right decision. Apple's success proves that users are quite willing and able to adapt to new systems if the interfaces to them are intuitive, minimalistic and user focused. Microsoft has noticed this and is looking to take advantage of it by providing a unified platform across all devices. Apple is already close to providing such an experience, but Microsoft has the desktop dominance, something that will help them drive adoption of their other platforms. However whilst the users might be ready, willing and able to make the switch I don't think Windows 8 will be the one to do it. It's far more likely to be Windows 9.

The reasoning behind this is simple: the world is only just coming to grips with Windows 7 after being dragged kicking and screaming away from Windows XP. Most enterprises are only just starting to roll out the new operating system now, and those who have already rolled it out don't have deployments that are over a year old. Switching over to Windows 8 is therefore going to happen a long way down the line, long enough that many users will simply skip it in favour of the next iteration. If Microsoft sticks to their current 3 year release schedule then organisations looking to upgrade after Windows 7 won't be looking at Windows 8; it's far more likely to be Windows 9.

I’m sure Microsoft has anticipated this and has decided to play the long game instead of delaying fundamental change that could put them seriously behind their competition. It’s a radical new strategy, one that could pay them some serious dividends should everything turn out the way they hope it will. The next couple years are going to be an interesting time as the market comes to grips with the new Metro face of the iconic Windows desktop, something which resisted change for decades prior.

Why Macs and Enterprise Computing Don’t Mix.

I'm a big fan of technology that makes users happy. As an administrator, anything that keeps users satisfied and working productively means more time for me to make the environment even better for them. It's a great positive feedback loop that builds on itself continually, leading to an environment that's stable, cutting edge and just plain fun to use and administer. Of course the picture I've just painted is something of an IT administrator's nirvana, a great dream that is rarely achieved even by those who have unlimited freedom and the budgets to match. That doesn't mean we shouldn't try to achieve it, however, and I'll be damned if I haven't tried at every place I've ever worked.

The one thing that always comes up is "Why don't we use Macs in the office? They're so easy to use!". Indeed my two month long foray into the world of OSX and all things Mac showed that it was an easy operating system to pick up, and I could easily see why so many people use it as their home operating system. Hell, at my current workplace I can count several long time IT geeks who've switched their entire household over to solely Apple gear because it just works, and as anyone who works in IT will tell you, the last thing you want to be doing at home is fixing up PCs.

You'd then think that Macs would be quite prevalent in the modern workplace, what with their ease of use and popularity amongst the unwashed masses of users. Whilst their usage in the enterprise is growing considerably, they're still hovering just under 3% market share, or about the same amount of market share that Windows Phone 7 has in the smartphone space. That seems pretty low but it's in line with worldwide PC figures, with Apple being somewhere in the realm of 5% or so. Still, there's a discrepancy there, so the question remains as to why Macs aren't seen more often in the workplace.

The answer is simple: Apple simply doesn't care about the enterprise space.

I had my first experience with Apple's enterprise offerings very early on in my career, way back when I used to work for the National Archives of Australia. As part of the Digital Preservation Project we had a small data centre that housed 2 similar yet completely different systems. They were designed in such a way that, should a catastrophic virus wipe out the entire data store on one, the replica on the other would be unaffected since it was built from completely different software and hardware. One of these systems utilized a few shelves of Apple's Xserve RAID storage. In essence they were just a big lump of direct attached storage and for that purpose they worked quite well. That was until we tried to do anything with them.

Initially I just wanted to provision some of the storage that wasn't being used. Whilst I was able to do some of the required actions through the web UI, the unfortunate problem was that the advanced features required installing the Xserve tools on a Mac. Said computer also had to have a fibre channel card installed, something of a rarity in a desktop PC. It didn't stop there either; we also tried to get Xsan installed (so it would be, you know, an actual SAN) only to find out that we'd need to buy yet more Apple hardware in order to use it. I left long before I got too far down that rabbit hole and haven't really touched Apple enterprise gear since.

You could write that off as a bad experience, but Apple has continued to show that the enterprise market is simply not their concern. No less than 2 years after I last touched an Xserve RAID, Apple up and cancelled production of them, instead offering up a rebadged solution from Promise. 2 years after that Apple discontinued production of its Xserve servers and lined up their Mac Pros as a replacement. As any administrator will tell you the replacements are anything but, and since most of their enterprise software hasn't received a proper update in years (Xsan's last major release was over 3 years ago) no one can say that Apple has the enterprise in mind.

It's not just their enterprise level gear that's failing in corporate environments. Whilst OSX is easy to use, it's an absolute nightmare to administer on anything larger than a dozen or so PCs as the available management tools largely don't support it. Whilst Macs do integrate with Active Directory, there are a couple of limitations that don't exist for Windows PCs on the same infrastructure. There's also the fact that OSX can't be virtualized unless it runs on Apple hardware, which kills it off as a virtualization candidate. You might think that's a small nuisance but it means you can't do a virtual desktop solution using OSX (since you can't buy the hardware at scale to make it worthwhile) and you can't utilize any of your current investment in virtual infrastructure to run additional OSX servers.

If you still have any doubts that Apple is primarily a hardware company then I’m not sure what planet you’re on.

For what it's worth Apple hasn't been harmed by ignoring the enterprise, as its consumer electronics business has more than made up for the losses they've incurred. Still, I often find users complaining about how their work computers can't be more like their Macs at home, ignorant of the fact that Apple in the enterprise would be an absolutely atrocious experience. Indeed it's looking to get worse as Apple moves towards iPhone-izing their entire product range including, unfortunately, OSX. I doubt Apple will ever change direction on this, which is a real shame as OSX is the only serious competitor to Microsoft's Windows.

Website Performance (or People are Impatient).

Way back when I used to host this server myself on the end of my tenuous ADSL connection, loading up the website always felt like something of a gamble. There were any number of things that could stop me (and the wider world) from getting to it: the connection going down, my server box overheating or even the power going out at my house (which happened more often than I realised). About a year ago I made the move onto my virtual private server and instantly all those worries evaporated, and the blog has been mostly stable ever since. I no longer have to hold my breath every time I type my URL into the address bar, nor do I worry about posting media rich articles anymore, something I avoided when my upstream was a mere 100KB/s.

What really impressed me though was the almost instant traffic boost that I got from the move. At the time I just put it down to more people reading my writing, as I had been at it for well over a year and a half at that point. At the same time I had also made a slight blunder with my DNS settings which redirected all traffic from my subdomains to the main site, so I figured that the burst in traffic was temporary and would drop off as people's DNS caches expired. The strangest thing though was that the traffic never went away and continued to grow steadily. Not wanting to question my new found popularity I just kept doing what I was always doing, until I stumbled across something that showed me what was happening.

April last year saw Google mix a new metric into their ranking algorithm, page load speed, right around the same time that I experienced the traffic boost from moving off my crappy self hosting and onto the VPS. The move had made a significant improvement in the usability of the site, mostly due to the giant pipe that it has, and it appeared that Google was now picking up on that and sending more people my way. However the percentage of traffic coming here from search engines remained the same, and since overall traffic was growing I didn't care to investigate much further.

I started to notice some curious trends though when aggregating data from a couple of different sources. I use 2 different kinds of analytics here on The Refined Geek: WordPress.com Stats (just because it's real-time) and Google Analytics for long term tracking and pretty graphs. Now both of them agree with each other pretty well, however the one thing they can't track is how many people come to my site but leave before the page is fully loaded. In fact I don't think there's any particular service that can do this (I would love to be corrected on this), but if you're using Google's Webmaster Tools you can get a rough idea of the number of people who come from their search engine but get fed up waiting for your site to load. You can do this by checking the number of clicks you get from search queries and comparing that to the number of people visiting your site from Google in Analytics, as sketched below. This will give you a good impression of how many people abandon your site because it's running too slow.
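As a rough illustration of that comparison (the figures below are made up, not my actual numbers), the calculation is just the gap between clicks reported by Webmaster Tools and organic visits recorded by Analytics over the same period:

```powershell
# Hypothetical figures - substitute the numbers from your own reports for the same date range
$webmasterClicks = 5000   # clicks reported by Google Webmaster Tools search queries
$analyticsVisits = 4000   # visits from Google organic search reported by Google Analytics

# Visitors who clicked through from search but never registered a page view
$abandoned        = $webmasterClicks - $analyticsVisits
$abandonedPercent = [math]::Round(($abandoned / $webmasterClicks) * 100, 1)

"Estimated abandonment: $abandoned visitors ($abandonedPercent%)"
```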

For this site the results are quite surprising. On average I lose about 20% of my visitors between them clicking on the link in Google and actually loading a page¹. I shudder to think how many I was losing back in the days when a page could take 10+ seconds to load, but I'd hazard a guess it was roughly double that if I take into account the traffic boost I got after moving to a dedicated provider. Getting your site running fast is therefore probably one of the most important things you can do if you're looking to get anywhere on the Internets, at least that's what my data is telling me.

After realising this I've been on a bit of a performance binge, trying anything and everything to get the site running better. I'm still in the process of doing so, however, and many of the tricks that people talk about for WordPress don't translate well into the Windows world, so I'm basically hacking my way through it. I've dedicated part of my weekend to this and I'll hopefully write up the results next week so that you other crazy Windows based WordPressers can benefit from my tinkering.

¹If people are interested in finding out this kind of data from their Google Analytics/Webmasters Tools account let me know and I might run up a script to do the comparison for you.