The public cloud is a great solution to a wide selection of problems, but there are times when its use simply isn't appropriate. This is typical of organisations that have specific requirements around how their data is handled, usually due to data sovereignty or regulatory compliance. And whilst the public cloud is a great way to bolster your infrastructure on the cheap (although that's debatable once you start ramping up your VM sizes), it doesn't take advantage of the investments in infrastructure you've already made. For large, established organisations those investments are not insignificant, which is why many of them have been reluctant to transition fully to public cloud based services. This is why I believe the future of the cloud will be paved with hybrid solutions, something I've been saying for years now.
Microsoft has finally shown that they understand this with the release of the Windows Azure Pack for Server 2012 R2. Sure, there were the beginnings of it with SCVMM 2012 allowing you to add in your Azure account and move VMs up there, but that kind of thing has been available for ages through hosting partners. The Azure Pack, on the other hand, brings features that were hidden behind the public cloud wall down to the private level, allowing you to make full use of them without having to rely on Azure. If I'm honest I thought Microsoft would probably be the only ones to try this, given their presence in both the cloud and enterprise spaces, but it seems other companies have begun to notice the hybrid trend.
Google has been working with the engineers at Red Hat to produce the Technology Compatibility Kit (TCK) for Google App Engine. Essentially this kit provides the framework for verifying the API-level functionality of a private Google App Engine implementation, something which is achievable through an application called CapeDwarf. The vast majority of the App Engine functionality is contained within that application, enough that current developers on the platform could conceivably run their code on on-premises infrastructure if they so wished. There doesn't appear to be a bridge between the two currently, like there is with Azure, as CapeDwarf utilizes its own administrative console.
They've done the right thing by partnering with Red Hat, as otherwise they'd lack the penetration in the enterprise market to make this a worthwhile endeavour. I don't know how much presence JBoss/OpenShift has, though, so it might be less about using current infrastructure and more about getting Google's platform into more places than it currently is. I can't seem to find any solid¹ market share figures to see how Google currently rates compared to the other primary providers, but I'd hazard a guess they're similar to Azure, i.e. far behind Rackspace and Amazon. The argument could be made that such software would hurt their public cloud product, but I feel these kinds of solutions are the foot in the door needed to get organisations thinking about using these services.
Whilst my preferred cloud is still Azure I'm also a firm believer that the more options we have to realise the hybrid dream the better. We're still a long way from having truly portable applications that can move freely between private and public platforms, but the roots are starting to take hold. Given the rapid pace of IT innovation I'm confident that the next couple of years will see the hybrid dream fully realised, and then I'll finally be able to stop pining for it.
¹This article suggests that Microsoft has 20% of the market which, since Microsoft has raked in $1 billion, would peg the total market at some $5 billion, a figure way out of line with what Gartner says. If you know of some cloud platform figures I'd like to see them, as apart from AWS being number 1 I can't find much else.
It was late Friday night. My companions and I had just finished up work and stumbled out into the hot, humid air that surrounds Brunei. After a nearly 12-hour day we had our sights fixed on grabbing some dinner and then an early night, as we would have to come in the next day to finish the job. As we chatted over our meals a curious image appeared on the television, one that I recognized very clearly: SpaceX's Dragon capsule, launched no more than a couple of days earlier. At the time it appeared that they were performing some last manoeuvres before the docking would occur. I couldn't take my eyes away from it, staring intently at the capsule as it drifted serenely across the beautiful backdrop of our earth.
The time came for us to make our departure and we headed back to the hotel. I hit up Facebook to see what was going on when I saw a message from a long time friend: “I hope you're not missing this http://on.msnbc.com/JxfRMS”.
I assured him I wasn’t.
I was fixated on the craft, watching it intently from 2 different streams so that I'd never be out of the loop. I monitored Twitter like a hawk, soaking in the excitement that my fellow space nuts shared. I almost shed a tear when Houston gave SpaceX the go to make the final docking approach as, for some unknown reason, that was when it all became real: the very first private spacecraft was about to dock with the International Space Station. At 13:56 UTC on May 25th, 2012 the SpaceX Dragon became the first private spacecraft to be captured by the International Space Station, and not long after it was berthed to the earth-facing docking port of the American Harmony module.
It's an incredible achievement for SpaceX and proves just how capable they are. This is only the second launch of both the Falcon 9 rocket and the Dragon capsule, which demonstrates just how well engineered they are. Much of the credit here goes to the modularity of the Falcon series systems, meaning that most of the launch stack had already seen a fair bit of flight testing thanks to the previous Falcon 1 launches. The design is paying off in spades for them now, as with this kind of track record it won't be long before we see them shipping humans up atop their Falcon rockets, and that's extremely exciting.
The payload of the COTS Demo Flight 2 Dragon capsule is nothing remarkable, being mostly food, water, spare computing parts and small experiments designed by students. What's really special about the Dragon though is its ability to bring cargo back to earth (commonly referred to as downmass capability), something that no other current craft offers. The ATV, HTV and Progress craft all burn up upon re-entry, meaning that the only way to get experiments back from the ISS now is aboard the Dragon capsule. Considering that we now lack the enormous payload bay of the Space Shuttle this might be cause for some concern, but I think SpaceX already has that problem solved.
Looking over the scheduled flights it would appear that SpaceX is looking to make good on their promise of frequent launches in order to take advantage of the economies of scale that come along with them. If the current schedule is anything to go by there will be another 2 Dragon missions before the year is out, and the pace appears to increase rapidly from there. So much so that 2015 could see 5 launches of the Dragon system, rivalling the frequency at which the Soyuz/Progress capsules currently arrive at the ISS. It's clear that SpaceX has a lot of faith in their launch system, and that confidence is what lets them attempt such aggressive scheduling.
I have to congratulate SpaceX once again on their phenomenal achievement. For a company that's only just a decade old to have achieved something that no one else has done before is simply incredible, and I'm sure SpaceX will continue to push the envelope of what is possible for decades to come. I'm more excited than ever to see the next Dragon launch, as each step brings us a little closer to the ultimate goal: restoring the capability that was lost with the Space Shuttle. I've made a promise to myself to be there to see it launch and I simply can't wait to find out when that will be.
Over the weekend the wife and I watched a documentary on the American education system called Waiting for Superman, here’s the trailer:
The documentary dives deep into the American public education system and the crux of it is that whilst there are some fantastic public schools there, places at those schools are limited. In order to resolve this situation the government has legislated the only thing that can be equally fair to all involved: public schools with more applicants than places must hold a lottery to determine who gets in and who doesn't. It's eye opening, informative and heart wrenching all at the same time, and definitely something I'd recommend you watch.
The reason it hit home for me was the parallels I could draw to my own education experience. My parents had had me on the waiting list for one of Canberra's most respected private schools since the day I was born. I went to a public school for my initial education but I was always destined for a life of private education. However upon attending that school I was miserable: the few friends who did make the transition to the same school abandoned me, and the heavily Anglican environment (with mandatory bible studies classes) only made things worse.
The straw that broke my parents' backs was when I made my case for transferring to a public school where most of my friends had ended up. They couldn't get through to me that the private school I was attending was the best place for me to be educated, but one thing I said changed their minds: “You make your own education”. I still wonder if I actually uttered those exact words or just something along those lines (I don't have a vivid memory of the incident, but my parents say it was so), but it was enough for them to let me transfer. If I'm honest the transfer didn't make things any better, although I told myself differently at the time, but suffice it to say I can count myself amongst the few who made it to university after going to that school. Heck, you might even say I've been successful.
Anecdotally, then, the public education system in Australia seems to work just fine. The schools I went to had a rather rough reputation for not producing results (and indeed my university entrance score was dragged down a good 5 points due to my attendance there) but there were students who excelled in spite of it. However when watching Waiting for Superman I got this sinking feeling that students in the USA might not even have the chance to make their own education, simply because the schools are set up for failure. Indeed my own success might have blinded me to the possibility that the schools I went to were set up in such a way, leading me to believe there was no problem when there was one.
Cursory research however shows that, at least for Australia, this isn't the case. Indeed the biggest indicators of a child's success at school and their pursuit of higher education are largely non-school factors. Following on from that idea it's not just you who makes your education, but the entire social structure that supports it. Bringing that back to my experience, it was my strong family support that led me to do well and my late-found group of friends who led me to excel at university. In that respect I should feel incredibly lucky, but in reality it's got little to do with luck and more to do with a whole lot of dedicated effort on the part of everyone who was involved in my life during my education.
Still, we should be thankful for the education system that Australia has, especially when you compare it to what it could be. I'm still a strong believer in those words I uttered well over a decade ago, and whilst they might not be applicable everywhere in the world they are definitely applicable here.
10 days after Atlantis blasted off on its final trip into space for STS-135, the last ever space shuttle mission has finally returned to earth, signalling an end to the 30 year program and marking the end of an era for space flight. For many of us young star gazers the space shuttle is an icon, something that embodied the human spirit ever searching for new frontiers to explore. For me personally it symbolized something I felt truly passionate about, a feeling I had not been familiar with for a very long time. Many will lament its loss, but it has come time for NASA to reinvent itself, leaving the routine of low earth orbit for the new frontiers that eagerly await.
Atlantis' final fiery return to earth, as seen from the International Space Station.
Image credit: NASA/Johnson Space Center (via @NASA_Johnson)
The shuttle was, from a technical point of view, too much of a compromise between government agencies to achieve the goals set out for it. There's no denying it was an extremely versatile craft, but many of the design decisions made were at odds with the end goal of a reusable craft that could cater to all of the USA's launch needs for the next 30 years. Constellation might then look like a step in the right direction, and whilst it was a far more appropriate craft for NASA's current needs, their money is better spent on pushing their capability envelope rather than designing yet another launch system.
NASA, to their credit, appears to be in favour of offloading their launch capabilities to private industry. They already have contracts with SpaceX and Orbital Sciences to provide both launch capabilities and crew/cargo capsules, however attempts to fully privatize their more rudimentary activities have been routinely blocked by Congress. It's no secret that much of the shuttle's manufacturing process was split up across states for purely political purposes (it made no sense to build the external tank so far away that it needed a barge to ship it back), and the resistance from Congress to private launch systems is indicative of that. Still, they have their foot in the door now, and this opens up the opportunity for NASA to get back to its roots and begin exploring the final frontier.
There's no denying that we've made great progress with robotic space exploration, reaching out to almost every corner of our solar system and exploring its vast wonders. However not since 1972 has a human left low earth orbit, something people of that era would scarcely have believed if you'd told them. Whilst it might not be the most efficient way of exploring the universe, human space flight is by far the best for inspiring the next generation:
It’s a historic day and it will mark a turning point for NASA and space flight in the USA one way or another. It’s my fervent hope that NASA uses this as an opportunity to refocus on its core goals of pushing the envelope of what’s possible for humanity through exploring that vast black frontier of space. It won’t be an easy journey for NASA, especially considering the greater economic environment they’re working in right now, but I know the people there are more than capable of doing it and the USA needs them in order to inspire the next generation.
I make no secret of the fact that I've pretty much built my career around a single line of products, specifically those from VMware. Initially I simply used their Workstation line of products to help me through university projects that required Linux to complete, but after one of my bosses caught wind of my “experience” with VMware's products I was put on the fast track to becoming an expert in their technology. The timing couldn't have been more perfect as virtualization soon became a staple of every IT department I've had the pleasure of working with, and my experience with VMware ensured that my resume always floated around near the top when it came time to find a new position.
In that time I've had a fair bit of experience with their flagship product, now called vSphere. In essence it's an operating system you install on a server that lets you run multiple, distinct operating system instances on top of it. Since IT departments always bought servers with more capacity than they needed, systems like vSphere meant they could use that excess capacity to run other, not so power hungry systems alongside them. It really was a game changer, and from then on servers were usually bought with virtualization as the key purpose in mind rather than for a specific system. VMware is still the leader in this sector, holding an estimated 80% of the market, and has arguably the most feature rich product suite available.
Yesterday saw the announcement of their latest product offering, vSphere 5. From a technological standpoint it's very interesting, with many innovations that will put VMware even further ahead of their competition, at least technologically. Amongst the usual fanfare of bigger and better virtual machines and improvements to their current technologies, vSphere 5 brings with it a whole bunch of new features aimed squarely at making vSphere the cloud platform for the future. Primarily these innovations are centred around automating certain tasks within the data centre, such as provisioning new servers and managing server loads, right down to the disk level, which wasn't available previously. Considering that I believe the future of cloud computing (at least for government organisations and large scale in house IT departments) is a hybrid public/private model, these improvements are a welcome change, even if I won't be using them immediately.
The one place that VMware falls down, and is (rightly) heavily criticized for, is the price. With the most basic licenses costing around $1000 per processor it's not a cheap solution by any stretch of the imagination, especially if you want to take advantage of any of the advanced features. Still, since the licensing was per processor it meant that you could buy a dual processor server (each with, say, 6 cores) with oodles of RAM and still come out ahead of other virtualization solutions. With vSphere 5, however, they've changed the way they do pricing significantly, to the point of destroying such a strategy (and those potential savings) along with it.
Licensing is still charged on a per-processor basis, but instead of an upper limit on the amount of physical memory (256GB for most licenses; Enterprise Plus gives you unlimited) you are now given a vRAM allocation per license purchased. Depending on your licensing level you'll get 24GB, 32GB or 48GB worth of vRAM which you're allowed to allocate to virtual machines. For typical smaller servers this won't pose much of a problem, as a dual processor, 48GB RAM server (which is very typical) would be covered easily by the cheapest licensing. However should you exceed even 96GB of RAM, which is very easy to do, that same server will then require additional licenses in order to fully utilize the hardware. For smaller environments this has the potential to make VMware's virtualization solution untenable, especially when you put it beside the almost free competitor of Hyper-V from Microsoft.
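To make the arithmetic concrete, here's a minimal sketch of how the new model plays out, assuming (as described above) one license per processor as a floor and a fixed vRAM entitlement per license; the tier names and the helper function are my own illustration, not VMware's actual SKUs:

```python
# Hypothetical sketch of vSphere 5 vRAM licensing arithmetic.
# Tier names are illustrative; entitlements (GB) are the 24/32/48 figures above.
VRAM_PER_LICENSE = {"standard": 24, "enterprise": 32, "enterprise_plus": 48}

def licenses_needed(processors: int, vram_gb: int, tier: str) -> int:
    """You need at least one license per processor, and enough
    licenses in total to cover all vRAM allocated to VMs."""
    per_license = VRAM_PER_LICENSE[tier]
    to_cover_vram = -(-vram_gb // per_license)  # ceiling division
    return max(processors, to_cover_vram)

# A dual processor, 48GB host fits inside its two mandatory licenses:
print(licenses_needed(2, 48, "standard"))   # 2
# The same host with 96GB of allocated vRAM now needs double the licenses:
print(licenses_needed(2, 96, "standard"))   # 4
```

The sting is visible in the second case: the hardware hasn't changed, only the RAM allocation, yet the license count doubles.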
The VMware user community has, of course, not reacted positively to this announcement. Whilst for many larger environments the problem won't be so bad, as the vRAM allocation is done at the data centre level and not the server level (allowing over-allocated smaller servers to help out their beefier brethren), it does have the potential to hurt smaller environments, especially those who invested heavily in RAM heavy, processor poor servers. It's also compounded by the fact that you'll only have a short window in which to upgrade for free, thus risking having to buy more licenses, or abstain and later have to pay an upgrade fee. It's enough for some to start looking into moving to the competition, which could cut into VMware's market share drastically.
The reasoning behind these changes is simple: such pricing is much more favourable to a ubiquitous cloud environment than it is to the current industry norm for VMware deployments. VMware might be slightly ahead of the curve on this one however, as most customers are not ready to deploy their own internal clouds, with the vast majority of current cloud users being on hosted solutions. Additionally many common enterprise applications aren't compatible with VMware's cloud and thus lock end users out of realising the benefits of a private cloud. VMware might be choosing to bite the bullet now rather than later in the hope that it will spur movement onto their cloud platform. Whether this strategy works or not remains to be seen, but current industry trends are pushing very hard towards a cloud based future.
I'm definitely looking forward to working with vSphere 5 and there are several features that will provide an immense amount of value to my current environment. The licensing change, whilst I feel it won't affect me much, is cause for concern, and whilst I don't believe VMware will budge on it any time soon I do know that the VMware community is an innovative lot and it won't be long before they work out how to make the best of the situation. Still, it's definitely an in for the competition, and whilst they might not have the technological edge they're more than suitable for many environments.
6 months ago I wrote about SpaceX's historic flight of their Falcon 9 rocket and how much it meant to us space romantics. Their tentative schedule had me all aflutter with the possibility of seeing not one, but two more flights of their flagship rocket within this year. It was looking entirely possible too, as just on a month later they were already building the next rocket and there was even a hint that I might get to see it take off on my trip through America. Whilst I may not have gotten to see the launch for myself, SpaceX is not one to disappoint, launching their second Falcon 9 rocket earlier this morning carrying a fully fledged version of their crew and cargo capsule, the Dragon.
The launch itself didn't go off without a hitch, with some bad telemetry data causing the initial attempt to be scrubbed and rescheduled for about an hour later. However once they were past that minor hurdle they were able to continue with launch preparations and lift off without incident. This is testament to their ability to rapidly troubleshoot and resolve problems that would likely cost anyone else at least a day to recover from. Elon Musk was definitely onto something when he decided to run a launcher company like a startup, rather than a traditional organisation.
The mission profile was a relatively simple one, although it represents a giant leap forward in capability for SpaceX. The previous launch of the Falcon 9 carried with it a Dragon Spacecraft Qualification Unit, basically just a shell of a full Dragon capsule designed to be little more than a weight on top of the Falcon 9 rocket. That capsule lacked the ability to separate from the second stage of the Falcon 9 it was attached to and was also designed to burn up on re-entry. The payload for this mission however was a fully functional Dragon capsule with the full suite of avionics, support systems and the ability to return to earth from orbit. It was also carrying a small fleet of government owned CubeSats that were deployed shortly after it achieved orbit. Approximately 3 hours after the Falcon 9's launch the Dragon capsule returned safely to earth, splashing down in the Pacific Ocean.
I, along with every other space nut out there, am incredibly excited about what this means for the future of space. Not only has SpaceX managed to successfully launch a brand new rocket twice in 6 months, they've done so with an almost flawless record. The pace at which they're progressing is really quite astonishing considering how small they are compared to those who've achieved the same goals previously. The team that Elon Musk has assembled really deserves all the credit they get, and now I wait with bated breath for their next launch, as that will be the first private spacecraft ever to visit the International Space Station.
It's really quite exciting to see progress like this in an area that was once considered accessible only to the world's superpower governments. Whilst we're still a long, long way from such technology becoming an everyday part of our lives like commercial air travel has, the progress SpaceX has made shows that the current cost to orbit can and will come down over time. This also gives NASA the opportunity to stop focusing on the more rudimentary aspects of space flight that SpaceX is now capable of handling, leaving them to return to what they were once best known for: pushing the envelope of what the human race is capable of in space. So whilst we won't be seeing another Falcon 9 launch this year as I had hoped all those months ago, this perfect flight of the first fully functional Dragon capsule signals that the future of space travel for us humans is not just bright, it's positively blinding.