Posts Tagged ‘future’

3 Decades of Existence.

Well it happened: I turned 30.

As someone who keeps company with people about a year older than himself it was interesting to see my friends’ reactions to hitting their third decade, most of them dreading it with all their being. I think the reason for this is that from that moment on you’re no longer in your twenties and, as such, are shouldered with the expectations of being a mature adult. That’s not something my generation will typically accept gracefully although, honestly, I’d hazard a guess that no generation before has either. Regardless of how you feel about the date though, it is a time to pause and reflect on the time you’ve had and the path laid out before you.

30 Before and After

I’ve enjoyed quite a lot of success in my life thanks to a combination of determination, support and a healthy dose of luck. If you had told me I’d be in this position 10 years ago I would’ve told you that was obvious (ah, the naivete of youth) but honestly I had tracked out a completely different path for myself then, confident that I had control over every single nuance of my life. That, of course, turned out to be impossible and instead I have made the best of the situation I’ve found myself in. Now instead of attempting to meticulously plan my life out 10 years ahead I instead find myself focusing on the here and now, taking every step I can towards my goals without constraining myself to a set timeframe.

Which brings me to the pertinent point of this post: the goal.

I had an informal goal, let’s call it that, with a mate of mine back in college: we’d both make our first million by the time we were 30. It seemed like a pretty reasonable goal; we had about 12 years at the time to make it, which averaged out to something like $83,000/year. He was already on the way to achieving it, having been peddling computer gear and other things through the then-nascent eBay, but I was intent on making it by rocketing up the corporate ladder to find myself in a position to earn it directly. Interestingly enough we’ve both progressed our careers along those paths as he’s had several business ventures over the years whilst I’ve spent the majority of mine climbing the corporate structures of Canberra IT.

Now the juicy part: did either of us make it? I can’t speak for Vince (although he did say it changed month by month when I mentioned it in a blog post almost 5 years ago, gosh the time went fast) but for me the answer is a little complicated: yes if you’ll allow me a technicality and no if you won’t. The technicality lies in the capital that I have control over, which is well over the threshold; however I cannot strictly lay claim to it, as the bank still holds an interest in that capital in the form of mortgages. If I were to realise everything I’d probably be closer to halfway to my first million rather than over it, which many would argue is the truer figure. Suffice to say I err on the side of the technicality because whilst I might not have a bank balance that says $1 million I do make returns as if I have it and have used that capital as real money in the past.

In all honesty though that’s taken a back seat to my primary goal of just doing the things that I love the most. I made the move to Dell almost 2 years ago now and honestly couldn’t be happier as they’ve allowed me to do things that I never had the chance to before. I’ve literally travelled the world for my work and my blog, something that I had always dreamed of doing but never thought I’d realise. In the last 3 years I’ve played over 150 different games, greatly expanding my gaming world and revelling in the review process. I’ve married my college sweetheart and seen her pursue a career of her own passion which has made both her and myself incredibly happy. Money is all well and good but it’d be nothing without the amazing things that have happened despite it.

It’s strange to think that, even though I’ve been living for 30 years now, the vast majority of my life is still likely ahead of me. I may not be the crazy youth that I once was, the one who thought being the CEO of a multi-national company was possible in a sub-10 year timeframe, but I still believe that great things lie ahead. Experience has taught me that those things won’t come my way on their own however, and should I aspire to more I’ll have to bust my butt to make it there. It is satisfying to know that the only thing standing between me and my ultimate goals is effort and time rather than whatever nebulous concept I had in my head all those years ago.

Suffice to say 30 doesn’t feel like the doom and gloom it was made out to be 😉

Silverlight May Die, But the Developers Won’t.

You’d think that, since I invested so heavily in Silverlight when I was developing Lobaco, I would’ve been more outraged at the prospect of Microsoft killing off Silverlight as a product. Long-time readers will know that I’m anything but worried about Silverlight going away, especially considering that the release of the WinRT framework takes all those skills I learnt during that time and transitions them into the next generation of Windows platforms. In fact I’d say investing in Silverlight was one of the best decisions at the time as not only did I learn XAML (which powers WPF and WinRT applications) but I also did extensive web programming, something I had barely touched before.

Rumours started circulating recently saying that Microsoft had no plans to develop another version of the Silverlight plugin past the soon-to-be-released version 5. This hasn’t been confirmed or denied by Microsoft yet but there are several articles citing sources familiar with the matter saying that the rumour is true and Silverlight will receive no attention past this final iteration. This has of course spurred further outrage at Microsoft for killing off technologies that developers have heavily invested in and whilst in the past I’ve been sympathetic to them, this time around I don’t believe they have a leg to stand on.

Microsoft initially released Silverlight back in 2007 and has released updates to the platform every year or so since then. Taking that into consideration you’d figure that the latest release of Silverlight has 1 or 2 years in it before other technologies (most likely HTML5 and JavaScript) overtake it in terms of functionality. In that time Windows 8 will be released along with WinRT, the framework that will be instantly familiar to any Silverlight developer. Sure the code might not be directly translatable to the new platform but considering the design work is done in XAML and C# is a supported language I’d struggle to find any Silverlight developer who wouldn’t be able to blunder their way through with a couple of Google searches and a StackOverflow account.
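To make that concrete, here’s a minimal sketch (with class and property names invented purely for illustration) of the kind of code that carries across almost untouched: a bare-bones view model using INotifyPropertyChanged, which the XAML binds to in exactly the same way whether the project is Silverlight or WinRT.

using System.ComponentModel;

// A bare-bones view model of the sort shared between Silverlight and WinRT apps.
// The XAML binds to DisplayName; only the project type changes, not this class.
public class PersonViewModel : INotifyPropertyChanged
{
    private string _displayName;

    public string DisplayName
    {
        get { return _displayName; }
        set
        {
            if (_displayName == value) return;
            _displayName = value;
            OnPropertyChanged("DisplayName");
        }
    }

    public event PropertyChangedEventHandler PropertyChanged;

    private void OnPropertyChanged(string propertyName)
    {
        var handler = PropertyChanged;
        if (handler != null)
            handler(this, new PropertyChangedEventArgs(propertyName));
    }
}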

All of Microsoft’s platforms are so heavily intertwined with each other that it’s really hard to be just a Silverlight/WPF/ASP.NET/MFC developer without a lot of crossover into other technologies. Hell, apart from the rudimentary stuff I learnt whilst in university I was able to teach myself all of those technologies in the space of a week or two without many hassles. Compare that with my month-long struggle to learn basic Objective-C (which took me a good couple of months afterwards to get proficient in) and you can see why I think that any developer whining about Silverlight going away is being incredibly short-sighted or just straight up lazy.

In the greater world of IT you’re doomed to fade into irrelevance if you don’t keep pace with the latest technologies, and developers are no exception to this. Whilst I can understand the frustration in losing the platform you may have patronized for the past 4 years I can’t sympathize with an unwillingness to adapt to a changing market. The Windows platform is by far one of the most developer-friendly and the skills you learn in any Microsoft technology will flow onto other Microsoft products, especially if you’re proficient in any C-based language. So whilst Microsoft might not see a future with Silverlight that doesn’t mean the developers are left high and dry; in fact they’re probably in the best position to innovate out of this situation.

Innovations That Should Be Everywhere.

William Gibson, the author of the seminal cyberpunk book Neuromancer, is quoted as saying that “the future is already here; it’s just not evenly distributed”. I think that’s quite apt as many technological innovations that should be everywhere never seem to get to the places that they need to be. Indeed even when the technology is available some people will simply refuse to use it, instead preferring to do things the way that they’ve always done them because, simply, that’s the way it’s always been done. As a child of the Y generation I have no qualms about upsetting the status quo should it net me some tangible benefit.

I got thinking about this late yesterday afternoon during my usual weekly clean up of the house before the working week. There’s always been this one little innovation that I’ve admired, always wondering why it wasn’t more widespread. It’s simply the bottom of a cheap coffee mug from Ikea:

Now looking at it you could just write that off as simple artistic flair on an otherwise standard coffee cup, but you’d be dead wrong. You see those grooves on the bottom are actually designed to drain water away from the bottom of the cup when it’s upside down, i.e. when it’s in the dishwasher. Of all the cups in my household this is the only one that doesn’t have a little pool of water on top of it once the dishwasher has finished. It might seem like a relatively small thing but those little grooves mean that it can dry completely on its own without having a bunch of leftover dishwashing scum on top of it. It’s ingenious in its simplicity.

Everyone’s had these sorts of ah-ha moments where you find something that makes you question how you’d lived without it beforehand. It’s interesting because we, as a species, are highly adaptable and yet conversely we’re also quite resistant to change, at least from my anecdotal point of view. That does seem to be changing, with the younger generations picking up new innovations quicker than their predecessors (like social networking and new media), so it’s quite possible that the resistance to change is one that could be overcome in time. History does have an awful habit of repeating itself however, and the rapid adoption we witness now might just be an artefact of them growing up in a technological world. Only time will tell for that, however.


The Post-PC Era, or More Accurately The Post Desktop Era.

There’s no doubt that we’re at a crossroads when it comes to personal computing. For decades we have lived with the norm that computers conformed to a strict set of requirements, such as having a mouse, keyboard and monitor as their primary interface devices. The paradigm seemed unbreakable: whilst touchscreens and motion controllers were a reality for the longest time they just failed to catch on, with the tried and true peripherals dominating our user experience. In this time however the amount of computing power that we’ve been able to make mobile changed the way many people did computing, and speculation began to run wild about the future, a place that had evolved past the personal computer.

Taking a step back for a second to look at the term “Post PC era”, I couldn’t find where the term originated. Many point to Steve Jobs as being the source for the term but I’ve found people referencing it for well over a decade, long before Jobs started mentioning it in reference to the iPad and how it was changing the PC game. The definition of the term also seems somewhat lax, with some defining it as a future where each niche has its own device whereas others see it more as a straight-up abolition of desktop computers in favour of general purpose portable devices. The lack of a formal definition means that everyone has their own idea of what a Post PC era will entail, but all of them seem to be missing the crux of the matter.

What actually constitutes a Personal Computer?

In the most general terms a PC is a general purpose computing device that’s usable by an end user. The term stems from a time when most computers were massive machines, well out of the reach of any individual (both practically and financially). Personal computers then were the first computing devices designed for mass consumption rather than scientific or business purposes. The term “Post PC era” then suggests that we’ve moved past the PC onto something else for our computing needs, meaning our current definition of PC is no longer suitable for the technology that we’re using.

However, whilst the Post PC era might be somewhat loosely defined, many envision a future where something like a tablet PC is the basis of everyone’s computing. For all intents and purposes that is a personal computer as it’s a general purpose computing device that’s designed for mass consumption by an end user. Post-PC era extremists might take the definition further and say that the Post PC era will see a multitude of devices with specific purposes in mind but I can’t imagine someone wanting to buy a new device for each of the applications they want to access. Indeed the trend is very much the opposite with smartphones becoming quite capable of outright replacing a PC for many people, especially if it’s something like the Motorola Atrix that’s specifically designed with that purpose in mind.

Realistically people are seeing the Post-PC era as a Post Desktop Computer Era.

Now this is a term I’m much more comfortable with as it more aptly explains the upcoming trends in personal computing. Many people are finding that tablet PCs do all the things that their desktop PCs do with the added benefit of being portable and easy to use. Of course there are some tasks that tablets and other Post PC era devices aren’t quite capable of doing, and these use cases could easily be covered off with docking stations that provide additional functionality. These could even go as far as providing additional features like more processing power, additional storage and better input peripherals. Up until recently such improvements were in the realm of fantasy, but with interconnects like Thunderbolt it’s entirely possible to provide capabilities that used to be reserved for internal components like PCIe devices.

The world of personal computing is changing and we’ve undergone several paradigm shifts in the last couple of years that have changed the computing landscape dramatically. The notion that we’ll never touch a desktop again in the near future is an easy extrapolation to make (especially if you’re selling tablet computers) but it does ignore current trends in favour of an idealized future. Rather, I feel we’ll be moving to a ubiquitous computing environment, one where our experience isn’t so dependent on the platform and those platforms will be far more flexible than they currently are. Whether the Post PC era vision or my ubiquitous computing idea comes to fruition remains to be seen, but I’d bet good money that we’re heading towards the latter rather than the former.

Adapt or Die: Why I’m Keen on the Cloud.

Anyone who works in IT or a slightly related field will tell you that you’ve got to be constantly up to date with the latest technology lest you find yourself quickly obsoleted. Depending on what your technology platform of choice is, the time frame you have to work in can vary pretty wildly, but you’d be doing yourself (and your career) a favour by skilling up in either a new or different technology every 2 years or so. Due to the nature of my contracts though I’ve found myself learning completely new technologies at least every year, and it’s only in this past contract that I’ve come back full circle to the technology I initially made my career on, but that doesn’t mean the others I learnt in the interim haven’t helped immensely.

If I was honest though I couldn’t say that in the past I actively sought out new technologies to become familiar with. Usually I would start a new job based on the skills that I had from a previous engagement only to find that they really required something different. Being the adaptable sort I’d go ahead and skill myself up in that area, quickly becoming proficient enough to do the work they required. Since most of the places I worked in were smaller shops this worked quite well, as you’re always required to be a generalist in these situations. It’s only been recently that I’ve turned my eyes towards the future to figure out where I should place my next career bet.

It was a conversation that came up between me and a colleague of mine whilst I was on a business trip with them overseas. He asked me what I thought were some of the IT trends that were going to take off in the coming years and I told him that I thought cloud based technologies were the way to go. At first he didn’t believe me, which was understandable since we work for a government agency and they don’t typically put any of their data in infrastructure they don’t own. I did manage to bring him around to the idea eventually though, thanks in part to my half-decade of constant reskilling.

Way back when I was just starting out as a system administrator I was fortunate enough to begin working with VMware’s technology stack, albeit in a strange incarnation of running their workstation product on a server. At the time I didn’t think it was anything revolutionary but as time went on I saw how much money was going to waste as many servers sat idle for the majority of their lives, burning power and providing little in return. Virtualization then was a fundamental change to the way that back end infrastructure would be designed, built and maintained, and I haven’t encountered any mid-to-large sized organisation that isn’t using it in some form.

Cloud technologies then represent the evolution of this idea. I reference cloud technologies and not “the cloud” deliberately as whilst the idea of relying on external providers to do all the heavy lifting for you is extremely attractive it unfortunately doesn’t work for everyone, especially for those who simply cannot outsource. Cloud technologies and principles however, like the idea of having massive pools of compute and storage resources that can be carved up dynamically, have the potential to change the way back end services are designed and provisioned. Most importantly it would decouple the solution design from the underlying infrastructure, meaning that neither would dictate the other. That in itself is enough to make most IT shops want to jump on the cloud bandwagon, and some are even doing so already.

It’s for that exact reason why I started developing on the Windows Azure platform and researching VMware’s vCloud solution. Whilst the consumer space is very much in love with the cloud and the benefits it provides, large scale IT is a much slower moving beast and it’s only just now coming around to the cloud idea. With the next version of Windows shaping up to be far more cloud focused than any of its predecessors it seems quite prudent for us IT administrators to start becoming familiar with the benefits cloud technology provides, lest we be left behind by those up and comers who are betting on this burgeoning platform.
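As a rough taste of what that looks like in practice, here’s a minimal sketch of pushing a file into Azure blob storage using the StorageClient library that shipped with the SDK at the time; the container and blob names are placeholders, and the exact API names may vary between SDK versions.

using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

class BlobUploadSketch
{
    static void Main()
    {
        // Points at the local development storage emulator; swap in a real
        // connection string for an actual Azure storage account.
        var account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
        var client = account.CreateCloudBlobClient();

        // Containers are carved out of the storage pool on demand,
        // no disks or servers to provision.
        var container = client.GetContainerReference("backups");
        container.CreateIfNotExist();

        var blob = container.GetBlobReference("hello.txt");
        blob.UploadText("Hello from the cloud");
    }
}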

The Shadow IT Department and Its Influence on Corporate IT’s Future.

If there’s one thing that us system administrators loathe more than dealing with users it’s dealing with users who have a bit of IT smarts about them. On the surface they’re the perfect user, being able to articulate their problems and requirements aptly so we have to spend considerably less time fulfilling their requests. However more often than not they’re also the ones attempting to circumvent safeguards and policies in order to get a system to work the way they want it to. They’re also the ones who will push for much more radical changes to systems since they will have already experimented with such things at home and will want to replicate that in their work environment.

Collectively such people are known as shadow IT departments.

Such departments are a recent phenomenon with a lot of credit (or blame) being levelled at those of my generation, the first to grow up as digital natives. Since the vast majority of us have used computers and the Internet from an early age we’ve come to expect certain things to be available to us when using them and don’t appreciate it when they are taken away. This doesn’t gel too well with the corporate world of IT where lockdowns and restrictions are the norm, even if they’re for the user’s benefit, and thus they seek to circumvent such restrictions, causing endless headaches for their system administrators. Still they’re a powerful force for driving change in the workplace, enough so that I believe these shadow IT departments are shaping the future of corporate environments and the technologies that support them.

Most recently I’ve seen this occurring with mobility solutions, a fancy way of saying tablets and phones that users want to use on the corporate network. Now it’s hard to argue with a user that doing such a thing isn’t technically feasible, but in the corporate IT world bringing uncontrolled devices onto your network is akin to throwing a cat into a chicken coop (i.e. no one but the cat benefits and you’re left with an awful mess to clean up). Still, all it takes is one of the higher ups to request such a thing for it to become a mandate for the IT department to implement. Unfortunately for us IT guys the technology du jour doesn’t lend itself well to being tightly controlled by a central authority, so most resort to hacks and workarounds in order to make them work as required.

As the old saying goes, the unreasonable person is the one who changes the world to suit themselves, and therefore much of the change in the corporate IT world is being made by these shadow IT departments. At the head of these movements are my fellow Gen Y and Zers who are struggling with the idea that what they do at home can’t be replicated at work:

“The big challenge for the enterprise space is that people will expect to bring their own devices and connect in to the office networks and systems,” Henderson said. “That change is probably coming a lot quicker than just five years’ time. I think it will be a lot sooner than that.”

Dr Keiichi Nakata, reader in social informatics at Henley Business School at the University of Reading, who was also at the roundtable, said the university has heard feedback from students who have met companies for interviews and been “very surprised” that technologies they use every day are not being utilised inside those businesses.

It’s true that the corporate IT world is a slow moving beast when compared to the fast paced consumer market and companies aren’t usually willing to wear the risk of adopting new technologies until they’ve proven themselves. Right now any administrator being asked to do something like “bring your own computer” will likely tell you it’s impossible, lest you open yourselves up to being breached. However technologies like virtualization are making it possible to create a standard work environment that runs practically everywhere and I think this is where a bring-your-own-device world could become possible.

Of course this shifts the problem from the IT department to the virtualization product developer, but companies like VMware and Citrix have both already demonstrated the ability to run full virtual desktop environments on smartphone level hardware. Using such technologies users would be able to bring in almost any device, which would then be loaded with a secure working environment, enabling them to complete the work they are required to do with the device they choose. This would also allow IT departments to become a lot more flexible with their offerings since they wouldn’t have to spend so much time supporting the underlying infrastructure. Of course there are many other issues to consider (like asset life cycles, platform vetting, etc) but a future where your work environment is independent of the hardware is not so far-fetched after all.

The disconnect between what’s possible with IT and what is the norm in computer environments has been one of those frustrating curiosities that has plagued my IT career. Of course I understand that the latest isn’t always the greatest, especially if you’re looking for stability, but the lack of innovation in the corporate space has always been one of my pet peeves. With more and more digital natives joining the ranks however, the future looks bright for a corporate IT world that’s not too unlike the consumer one that we’re all used to, possibly one that even innovates ahead of it.


The Startup Life: My Unplanned Future.

If I were to rewind back a couple of years and ask my past self where I would be at today the answer would probably be something like “living overseas and applying for various MBA programs”. It seemed that ever since partway through university I had my eye on being in some form of upper management role in a large company, revelling in the idea of a high rise office building and being able to make a positive impact. It seemed every year I was doomed to delay those plans for another year because of other things that would crop up, with me finally admitting that anyone with a 10-year plan is deluding themselves.

Despite that my aspirations have not changed. I still lust after that high flying lifestyle that I attributed to the ranks of C-level executives and still yearn to travel overseas as so many have done before me. However I’ve grown disillusioned with the idea of attaining such goals within the confines of an established company. My illustrious career, spanning a mere 6 years, has shown me that there’s little joy in climbing the ranks in such environments, with games of politics and tit-for-tat deals the accepted norm. The engineer in me was languishing under the idea of being suppressed for years whilst I played these games on my way to the top, where I could finally unleash it with the power to make a difference. At the end of last year it finally broke through and gave me the dreadful clarity I needed to finally change my way of thinking.

I needed to make it on my own.

It was around this time that I’d started to get an interest in the curious world of technology startups. You see here in Canberra, where everyone is employed by the government or doing work for the government, there’s no place for technological innovators; the captive market here just isn’t interested. Thus the idea of striking out on my own in the only field I knew was always put aside as an untenable notion; the environment to support it just isn’t here. Still the idea gnawed away at the edge of my mind for quite a while and my feed reader started gathering information on all aspects of starting out on your own and how others had done it before me.

At the same time I had begun working on Geon, primarily as an eating-my-own-dog-food exercise but also as something to give back to my readers who’d been loyal over the fledgling months of this blog. The idea had legs though and I continued to work on it off and on for many months afterwards, with many iterations making their way onto this site. After a while the notion of building my own business and my hobby of building something to satisfy a niche that was going unserviced began to merge, and the dream I had once become disillusioned with came back with a thundering vengeance.

There’s always going to be that part of me that nags at the corner of my mind telling me that any plan I make is doomed to failure, and I’ve learnt to come to terms with that. When I can talk about my idea with someone for hours on end and walk away with countless ideas about where I can take my project in the future I know the work I’m doing is good. That voice at the back of my head keeps me honest with myself, ensuring that I apply critical thinking to all the problems that I encounter. In that respect my fledgling inner skeptic makes sure I don’t bullshit myself into a corner and for that I’m eternally thankful.

I guess it all comes down to not knowing where you’ll end up in life. 6 years ago I had my whole life planned out until I was 30 (and a bet with an old friend of mine I haven’t forgotten) and today I’ll happily tell you that I’ve got no idea where I’ll be at 30. That idea would be frightening for many people but for someone like me who thrives on making the most out of his time it’s extremely liberating. No longer am I locked into any preconceived notion of what I need to do to get where I want to be. All I need to do is work in the moment to achieve the best I can, and that’s exactly why I believe I’ll succeed.

/egostroke 😉

Will Twitter Endure?

It seems that every couple of years we find a new technology, or re-purpose an old one, with which to communicate with each other. They often start out as niche applications that only a few technically inclined people use (the Internet, email) or are reserved for those who can afford the luxury (cellular phones), but then as these early adopters foster the growth of the technology the barrier to entry drops dramatically. Soon after, the technology hits critical mass and widespread adoption is inevitable (I dive more in depth into this concept here). Twitter is the latest method of communication to hit this critical mass point and as far as anyone can tell it’s here to stay for the long run:

Some time soon, the company won’t say when, the 100-millionth person will have signed on to Twitter to follow and be followed by friends and strangers. That may sound like a MySpace waiting to happen — remember MySpace? — but I’m convinced Twitter is here to stay.

And I’m not alone.

“The history of the Internet suggests that there have been cool Web sites that go in and out of fashion and then there have been open standards that become plumbing,” said Steven Johnson, the author and technology observer who wrote a seminal piece about Twitter for Time last June. “Twitter is looking more and more like plumbing, and plumbing is eternal.”

Carr hits on a point that I’ve made informally (and unfortunately not on this blog as far as I can tell) many times before. Twitter as it stands provides a useful service to many millions of people and they did the right thing from the start by blowing open their whole operation in the form of very friendly APIs. This drove adoption up like crazy since anyone who has spent even a short amount of time building web applications would be able to whip up a simple Twitter app for their platform of choice. Twitter even has reach in places where Internet connectivity is sparse but mobile phone coverage is not, further widening their market and cementing it in our minds as a ubiquitous technology.
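To give a sense of how low that barrier was, here’s a rough sketch of pulling a user’s recent tweets with nothing more than an HTTP GET against the v1 REST endpoints as they existed at the time (the URL and screen name are illustrative, and the unauthenticated access this relies on has since been retired).

using System;
using System.Net;

class TwitterTimelineSketch
{
    static void Main()
    {
        // The old v1 API served public timelines as plain JSON over HTTP,
        // with no OAuth dance required for read-only access back then.
        var url = "http://api.twitter.com/1/statuses/user_timeline.json" +
                  "?screen_name=example&count=5";

        using (var client = new WebClient())
        {
            string json = client.DownloadString(url);

            // A real client would parse the JSON into objects; dumping the
            // raw response is enough to show how little plumbing is involved.
            Console.WriteLine(json);
        }
    }
}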

So you might ask then, why does the title of this post suggest I believe otherwise?

As they always say, if you want to find the truth follow the money. Twitter, according to Alexa, is currently ranked the 14th most visited site in the world with a global Internet traffic capture of almost 5%. When you have that many eyes locked on your site every day it doesn’t take a lot to monetize that traffic and make an incomprehensible amount of cash. However, as some leaked documents showed, Twitter’s revenue scratched $400,000 in Q3 of 2009 with projected earnings of $4 million in Q4. Whilst they’re still being coy about what their actual monetization strategy is going to be there have been a few tidbits leaking out around some agreements with search engines and the possibility of in-depth analytics for corporate customers. These are all good in theory, but is it really going to be enough for Twitter to keep on keeping on?

The problem as I see it isn’t that the service lacks value or popularity, far from it. It’s more the fact that despite all their success Twitter is struggling financially, with its current windfalls only providing a temporary cash injection (of which they’ve already had 3). The user-base is used to having their Twitter clean and ad free so taking the Google route of slapping advertising on it is out of the question, and really unless they’re capturing some otherworldly metric that no one has thought of yet I can’t see the value businesses could derive from ponying up the cash for a premium account. With Twitter being tight lipped on the matter (apart from saying they have multiple things they want to try) I’m wondering if they’ll ever produce a stable revenue stream. It all sounds dangerously similar to YouTube before their acquisition by Google, and even they had a solid monetization strategy.

This is not to say that the service that Twitter embodies will disappear, far from it. The social netizens have developed a set of social norms that revolve around the use of a Twitter-esque service and should it disappear they’ll soon hop to the next web framework that provides a similar means of communication. Indeed there are already several alternatives that provide much the same services as Twitter and could easily pick up the slack should it disappear. How much of the horde would convert if such a fate befell their service of choice? Hard to tell, but the past has shown us that if people want something badly enough they’ll go to almost any lengths to get it.

Still, Twitter has been going this long without a hitch so there’s no reason they won’t continue on for the foreseeable future. They pulled their monetization strategies ahead aggressively in order to allay their shareholders’ fears and their recent deals should cash them up enough to try several revenue generation streams before they’re out of time. There’s also the possibility of acquisition by a large corporate entity who can absorb their losses in exchange for their user-base (a la the Google/YouTube deal) but that’s a few years off at least, since they’ve only just recently started playing nice with the giants of the IT world.

It’s going to be an interesting year for all the folks over at Twitter.

NEO, The Moon and Mars.

NASA’s in a real pickle at the moment. After having their budget repeatedly slashed year after year by various governments looking to save a few dollars, and with the scope of their work ever increasing, they’re now faced with the challenge of choosing their future direction. A White House panel recently convened on the subject and had several proposals put forth, half of them requiring a cash injection to the beleaguered agency to the tune of $3 billion a year. It would seem the idea of visiting a Near Earth Object (NEO) has gained some traction recently:

BOULDER, Colo. – Call it Operation: Plymouth Rock. A plan to send a crew of astronauts to an asteroid is gaining momentum, both within NASA and industry circles.

Not only would the deep space sojourn shake out hardware, it would also build confidence in long-duration stints at the moon and Mars. At the same time, the trek would sharpen skills to deal with a future space rock found on a collision course with Earth.

In Lockheed Martin briefing charts, the mission has been dubbed “Plymouth Rock – An Early Human Asteroid Mission Using Orion.” Lockheed is the builder of NASA’s Orion spacecraft, the capsule-based replacement for the space shuttle.

If they are to follow such a plan (assuming it came from the White House panel’s proposals) it does have some interesting consequences for NASA. First of all it’s one of the more expensive options, meaning that their budget would need to be increased to cope with it. Secondly it would see the shuttle program extended for another year, delaying its retirement until 2011. It would also see NASA divert their focus from the shuttle replacement Ares-I in favour of using commercial options like SpaceX’s Falcon 9, only relying on their own launch capabilities as a backup. The last, and probably most important, aspect would be that America would not cease its involvement in the International Space Station in 2015, instead continuing until 2020. All of these points show a shift from traditional NASA thinking and it has me wondering where the push is coming from.

In all honesty visiting a NEO would make for an interesting mission. It would be a long duration flight of around 6 months with a maximum of a couple weeks spent actually in and around the object in question. The real benefit of such a mission isn’t so much in the science we can do at the asteroid (we’ve already done that) but in the verification that the new hardware is capable of such long duration flights. It’s definitely a step forward in terms of capability, but will it really serve as the stepping stone for manned missions to Mars and beyond?

Buzz Aldrin thinks not, and he’s been an advocate for NASA focusing on going directly to Mars for quite some time. His plan does seem incredibly sensible to me as collaborating with other space-faring nations whilst pushing the envelope in terms of deep space exploration means that NASA can get the best of both worlds. I’m sure that Roscosmos and the ESA would jump at the opportunity to establish a presence on the moon as they did with the ISS. The main issue that Buzz hits on quite succinctly is that NASA should be actively seeking collaboration from international partners for projects such as a moon base, as these have significant scientific benefits. It would be hard to justify it as a stepping stone to Mars and beyond, but as an international effort it almost looks like a no-brainer.

It’s a troubling time for NASA as they’ve been presented with a whole swath of options and are faced with the hard choice of cutting back on their core programs or attempting the next-to-impossible by squeezing more cash out of Congress. The next year will see many changes happen with the impending retirement of the shuttle and the realization of fully private launch capabilities, so we can rest assured that NASA’s future won’t be in question for much longer.

I just wish they’d make up their minds about the shuttle so I can plan my trip over there to see the last shuttle launch 😉

Augmented Reality.

Most people know about the ideas of Virtual Reality, such as the concepts expressed in the Matrix trilogy of movies and other Science Fiction productions. However many people are unaware of the bridge between these worlds that already exists using today’s technology. This is known as Augmented Reality and it attempts to enhance our current perception of the world using technology. The simplest form of this I can think of is Heads Up Displays (HUDs) that you can even get in your car these days (if you happen to own one of those spiffy European cars ;)). However I don’t want to get bogged down in the idea of visual augmented reality, as that’s really just a small part of it.

With today’s technology putting more and more information at our fingertips our reality is becoming more augmented than we might think. For instance, my phone has a web browser built into it and an Internet connection that would’ve cost most companies thousands of dollars a decade ago. Right now if someone asks me a question that I have no idea about, a quick trip to Wikipedia has the general information about the topic at hand almost instantly. Additionally, back when I had a Windows Mobile phone (which I managed to lose, but that’s another story!) I used to subscribe to RSS feeds that would be updated every hour. This meant that I had up to date information on various topics that interested me in my pocket at all times. If I was out at lunch I’d merely scroll through the newest items and I’d always be up to date on the latest.
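That “pull” side is trivially easy to script yourself too; as a rough sketch, .NET’s built-in SyndicationFeed class can pull down and list a feed’s items in a dozen lines (the feed address below is just a placeholder):

using System;
using System.ServiceModel.Syndication;
using System.Xml;

class FeedReaderSketch
{
    static void Main()
    {
        // Placeholder address; any RSS 2.0 or Atom 1.0 feed will do.
        var feedUrl = "http://example.com/feed";

        using (var reader = XmlReader.Create(feedUrl))
        {
            var feed = SyndicationFeed.Load(reader);

            // Print each item's title and publication date.
            foreach (var item in feed.Items)
            {
                Console.WriteLine("{0} ({1:g})", item.Title.Text, item.PublishDate);
            }
        }
    }
}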

But even this “pull” side of augmented reality is only one part of it. When I was down in Melbourne visiting one of my friends he happened to tell me about these new shoes that he got. It seems that Nike had gotten together with Apple to produce what basically amounted to a pedometer that was embedded in the shoes and was capable of recording statistics whilst you were jogging. He was partly doing this because his work had a sponsored health campaign, and they were all uploading their stats to a website to see how they were all going. As much as I hate the term “Web 2.0” it’s very much that, putting the users in charge of generating content that is of interest to everyone.

So where is all this technology going? Back in 2004 a university project in Singapore spawned a real-world Pac-Man, using GPS and a complex overlay of the real world. Whilst this is more of a gimmick it did show the potential of using many disparate forms of technology to augment and enhance our view of the world. One of the coolest apps, which also demonstrates the power of open development platforms, is the Wikitude AR Travel Guide for the HTC G1 Android mobile phone:

What I like about this app is that it is a consumer level application. It’s designed for your everyday user to be able to download and use without having to think about it. As the Android platform matures I’m sure we’ll start to see many more implementations of applications like this and I, for one, can’t wait.

It’s almost enough for me to break out Visual Studio and start coding again…

Almost 😉