Monthly Archives: April 2011

Yuri Gagarin: A 50th Anniversary, An Eternal Legacy.

50 years is an almost incomprehensible amount of time for a young person like myself. That’s nearly double my entire time on this planet, and even in my short 26 years I’ve seen wild changes to this world, so I can only imagine the changes someone who has lived 50 years or more has seen. One of the most incredible changes that the last 5 decades have brought us has been the invention of space flight, which has dramatically influenced humanity as we know it today, even if its presence is mostly invisible. Two days ago saw the anniversary of our very first tenuous steps into the final frontier, with the Russian cosmonaut Yuri Gagarin becoming the first ever human to enter space and orbit our beautiful blue marble.

Winding the clock back 50 years puts us right in the middle of the Cold War, a political battle fought over decades on a global scale. The first artificial satellite had been launched just 4 years prior and the space race between the then USSR and the USA had reached fever pitch. Both sides were working fervently to stake their claim on being the first to accomplish anything in space, and at this point the Russians were winning after their success with Sputnik. They weren’t resting on their laurels however, and they were aggressively pursuing the goal of getting the first man into space. The mission was to be called Vostok 1.

The craft Gagarin was to ride into space wasn’t a large one by any stretch of the imagination, being a mere 2.3 meters in diameter and looking a lot more like a submersible craft than one destined for the vacuum of space. In true Russian fashion it was also incredibly robust and, when compared to its American counterparts, remarkably simple. The craft lacked any control surfaces and didn’t have any backup thrusters, which is why it was mostly spherical: unlike the American craft it couldn’t orient a heat shield to protect itself on re-entry. This also meant that in the event the retrorockets didn’t fire Gagarin would have been stuck in orbit for up to 10 days, and as such the craft was equipped with enough supplies to ensure that he’d survive that long.

The mission began at 5:30AM on the 12th of April 1961. Both Gagarin and his backup pilot, Gherman Titov, were awoken at this time with the launch scheduled for 2 hours later. Things went pretty smoothly, although doctors reported that Gagarin wasn’t himself at this time, being somewhat pale and unusually reserved. Still, in comparison to Titov, who had to take medication to calm himself down, Gagarin was as calm as ever, with the resting heart rate of a long distance runner. About an hour after being awoken he was secured in the Vostok capsule (which had to be resealed after failing to seal properly the first time) and was left in there for another 40 minutes before blasting off into space.

In total Gagarin spent just over an hour orbiting the earth, completing one full orbit and touching down in a field outside Engels in the Saratov region. His descent startled a farmer and his daughter, who witnessed this alien-like creature in an orange suit with a white helmet drifting down from the heavens. He later recalled the situation:

When they saw me in my space suit and the parachute dragging alongside as I walked, they started to back away in fear. I told them, don’t be afraid, I am a Soviet like you, who has descended from space and I must find a telephone to call Moscow!

Gagarin and his capsule were both successfully recovered. He returned to Moscow a hero and a figure that will be remembered as one of the great pioneers of the final frontier. Although he never orbited the earth again he was heavily involved in the USSR’s space program afterwards, helping to design new craft and serving as backup pilot for the very first Soyuz mission, a craft that is still in use today. Tragically his life was cut short in 1968 during a routine test flight over a Russian air base, but the legacy he laid down will live on for as long as humanity exists.

I’ve often said that I don’t give the Russians enough attention on this blog and that they should be recognized for their amazing accomplishments in space. 50 years on, the influence of early pioneers like Gagarin and his team is clearly visible in all facets of the Russian space program. It’s a testament to their strong ideals of simplicity and robustness that a craft designed decades ago can still be in service today and still meet the needs of both NASA and Roscosmos. Whilst I may be a bit late to the party in remembering the great feats of the Russian space program, I hope you’ll join me today in recognizing their accomplishments and wishing them all the best for the next 50 years.

Oh Optus, Femtocells Aren’t The Answer.

Look, I can understand how frustrating it can be to live in a place with crap cell phone reception. I spent the majority of my life living only 30 minutes outside Canberra and even that short distance was enough for the reception to basically drop off to nothing unless you were with Telstra. Even then you were lucky to be able to place a call indoors (especially if you had the typical Colorbond roof), with most mobile calls being made from the nearest hill you could scurry up. I still suffer from spotty coverage even in town thanks to my current network provider, but not once have I thought that a femtocell would be the answer to my problem.

Like I’ve said previously, femtocells seem like a cash grab from cellular providers who should instead be spending their own money on fixing their coverage problems. Their use case is almost too narrow to be of any use since you need to have a broadband connection (which usually puts you in mobile phone range) and, since nearly every broadband router comes with a wireless access point, there’s no need to use 3G when you’re at home. In essence you’re just giving yourself full coverage so you can pay the exorbitant cellular data rates whilst at the same time eating into your own data cap, effectively double charging yourself for the privilege. Just as there doesn’t seem to be a case for a cellular tablet, I struggle to find a use for a femtocell other than for a cellular provider to bilk their customers.

It seems that these useless devices have finally made their way onto Australian shores, with Optus, the carrier with the worst record for coverage (in my experience at least), beginning trials of them:

Dubbed the ‘3G Home Zone’, the new Optus femtocell device is a small base station that plugs into a wireless router and uses a fixed-line broadband Internet connection to boost mobile coverage. Once operational, the Optus femtocell device should typically provide full mobile coverage within a 30 metre range.

Optus recommends that the 3G Home Zone be connected to a broadband service with a minimum download speed of 1Mbps and a minimum upload speed of 256kbps — if the speed is capped at 128kbps or lower, the device will no longer work.

The most insulting part about Optus’ introduction of these devices is that they’re charging for them, and it’s not a trivial amount either. You either pony up $60 initially and another $60 over 12 months (with a $70/month plan) or you pay $240 outright. Now far be it from me to get in the way of a company trying to make a profit, but it would seem that the money they invested in getting these devices functional could have been far better spent upgrading the spots where reception is a problem. Getting 3G indoors is all well and good, but the vast majority of use cases for that are already covered aptly by WiFi, and you don’t need to pay an additional monthly fee to use that.

What I would support, however, would be something along the lines of what AT&T is doing in the USA: giving all users who request it a free femtocell. It might seem like a silly move to begin with, but having been an actual AT&T customer and seen the coverage problems they had, a free femtocell would go a long way towards keeping people on their network. Of course they didn’t start out free (they definitely weren’t when I was there), but obviously the cost can’t be too high or they wouldn’t be offering it. Hopefully it won’t be too long before Optus follows suit.

Femtocells feel like a solution in search of a problem. Sure it might be great to have full coverage in your house (I currently get 1 bar), but the reasoning seems almost nonsensical when you look at the requirements needed to do it. I can’t see a future where I’ll ever need a device like this unless they somehow make it affordable with a satellite connection, but even then, if I’m that far away from humanity I’m guessing I wouldn’t want to bring the Internet with me. So hopefully these silly devices will disappear into the dark niche they belong in: the technically ignorant and woefully misinformed.

R18+ For Games: Death By Vocal Minority.

It was glorious: we started to see the beginnings of a rational discourse over the lack of an R18+ rating for games, and there was hope for an overhaul of our decidedly archaic and convoluted classification system. I was happy, thinking I would soon be living in a country that had cast off the shackles of its past in favor of adopting a more progressive view of the games industry. A country that recognizes that games are predominantly not for children anymore, with the vast majority of gamers now grown up and wanting the medium to grow up with them. Realistically I knew it was a small issue, but the fact that it could get dragged out over such a long period of time was the driving factor behind my outrage. I just couldn’t (and still can’t) understand why it has been so difficult.

It was over a year ago that what appeared to be the final wall standing between us and a more rational future, Attorney-General Atkinson, came tumbling down with his retirement. We still lost one title to the dreaded Refused Classification black hole in this time, but I consoled myself with the fact that soon all of this would be a distant memory, a blip in Australia’s history where it stubbornly refused to modernize for no reason in particular. The news shortly afterwards that reform was on the horizon was confirmation of this fact and made my spirit soar once again, only to be dashed by this recent news:

LONG-AWAITED reforms of Australia’s censorship of computer games look set to fail after Victoria declared its strong concern that the move will legalise games with “high levels of graphic, frequent and gratuitous violence”.

Backed by a groundswell of support from the gaming community, the Gillard government is determined to fix the classification system for computer games, which allows unsuitable games to be rated for 15-year-olds, yet bans popular games for adults.

But the Baillieu government’s Attorney-General, Robert Clark, has echoed the concerns of the Australian Christian Lobby, putting him on a collision course with Canberra, which requires the backing of all states and territories to change classification laws.

The article goes on to say that the Coalition wants to put the matter to “careful scrutiny and public debate”, happily ignoring the fact that it’s been hotly debated for the last 2 years and had a public consultation that was overwhelmingly positive, with 98.2% of respondents supporting the cause. Opponents also ignore the fact that Australia is one of the few modern countries that lacks an R18+ rating for games yet has such a rating for books, films and TV. I probably shouldn’t be surprised, as the facts haven’t been the opposition’s strong suit in trying to cut down the R18+ rating in its infancy.

I’ve said it time and time again: the R18+ issue provides nothing but benefits to Australia and its gaming populace. The R18+ rating would make parents aware of material that isn’t appropriate for their children, allowing them to regulate the consumption of such material. It would ensure proper classification of games as well, rather than shoehorning games that in reality belong in the R18+ category into the MA15+ rating. An R18+ rating would also make Australia far more attractive to developers who are creating games targeted towards adults (i.e. the majority of consumers in the games industry), instead of them shying away from us for fear of the dreaded RC rating.

The reason that the R18+ rating has languished in this political shitstorm for so long can be almost entirely blamed on a single lobby group: the Australian Christian Lobby. Wherever opposition to the rating is found you can bet your bottom dollar that they’re involved somehow, and I’m not just saying this for dramatic effect. Whilst I won’t link to any of their tripe directly, since I don’t think they deserve the attention, a simple search for “R18+ acl” brings back dozens of articles of them supporting the demise of the R18+ rating. Indeed they’ve also been major proponents of other, more aggressive censorship efforts such as the Internet filter, going so far as to label my views as “extreme” back when I was heavily involved in the No Clean Feed movement.

The ACL is of course in the minority here, since the Australian public is overwhelmingly in support of an R18+ rating for games. Yet they keep managing to sway people in key positions, leaving the battle for the R18+ rating effectively hamstrung. Thankfully the recent ultimatum of either an R18+ rating or a classification system overhaul (which would be far more painful for those in opposition to endure) shows that there are people willing to stand up to this vocal minority, who have shown they cannot act rationally when it comes to people doing things they don’t agree with.

It seems my dream of an Australia that has finally brought itself into the 21st century is still a long way from being realized, and the thorn in my side that was Attorney-General Atkinson has since been replaced by Attorney-General Clark, but there’s still hope on the horizon. One day I’ll be able to buy games built by adults that have been designed to be consumed by adults, and the ACL won’t be able to say anything about it. Until then, however, I’ll continue to angrily blog about every development in the R18+ space, and I’ll put in every effort to make sure it becomes a reality.

I won’t let the irrational vocal minority win.


Experience, Not The Platform, Is What Makes The Developer.

I spend a lot of time, probably way too much of it, watching the start-up scene and getting a feel for the current trends of what’s hot and what’s not¹. Increasingly I find myself on the other side of the fence, since I’m wholeheartedly a Microsoft supporter and everyone else seems to be into Linux, Rails and varying forms of JavaScript like Node.js. Sure, there are a great many websites built on these frameworks, and the stuff people are able to churn out with them in seemingly no time at all certainly makes me feel like a total idiot when I’m floundering around in ASP.NET. But in reality those proclaiming that they created these things in just a weekend, or could deploy a new app within minutes, are often hiding one crucial fact from you.

The multiple years of experience that came prior to it.

It’s no secret that whilst I’ve been developing for a long time I’m no rockstar when it comes to the world of web programming. Indeed my first foray into this world was a bastard of a page that was lucky not to fall on its face constantly, and the experience had me running to find better solutions, eventually settling on Silverlight. The reason for this was obvious: it allowed me to leverage my desktop development experience on a new platform. Sure I struggled with the ideas that just couldn’t be boiled down into the desktop paradigm (like that whole REST thing), but it was a quick way to get myself into web development and expand from there.

So of course when I saw people saying they built this incredible website in only a weekend, when it took me several months’ worth of weekends just to get mine working, I was intrigued. I even made the foolish mistake of reading up on some of their “how I did it” posts on Hacker News and saw all these wonderful frameworks that they had been using, assuming these would make me a master overnight. Stepping through some of the tutorials and looking at the tools available raised some eyebrows since they were unlike anything I had seen before, and this is where I got suspicious.

You see, I could whip up a simple desktop app or PowerShell script in minutes that would do some useful function using the tools I have in front of me, but that doesn’t mean you should be using those tools to create your site. Neither does it mean you would be able to whip up the same thing using the same tools in the same amount of time, no matter how skilled you were in other languages. The simple reason for this is that whilst you might be a rockstar in Ruby or an expert in PHP, your experience is confined to the environment to which you’re most accustomed, and should you need to retool and reskill for a new language it’s going to be several months before you’re at your maximum competency again.

Sure, good developers are able to adapt much faster than so-so developers, but there’s a significant opportunity cost in switching away from your current knowledge comfort zone in order to try and emulate those whom you idolize. I came to this realization a couple of months back after staring at so many Ruby/Python/SomeDynamicLanguage web sites, wondering how the heck they got them looking and functioning so well. In truth the platform they were using had little to do with it; these guys had just been in the game for so much longer than me that they knew how to get these things done. With me still in the grok stage of my first real web framework I really shouldn’t be comparing myself to them just yet, at least not until I can get my new application functioning the way it should.

It’s so easy to get disillusioned with what you’re doing when you see others progressing so much faster than you ever thought you could. My new application was supposed to be a testament to my coming of age as a web developer, having given myself only a short time to get it off the ground before actually launching it. Since my deadline for that has come and gone I’ve been forced to change the way I view myself as a developer, and have come to realize that unless I’m working in something I’ve developed with before I shouldn’t expect to be a rockstar from day one, instead recognizing that I’m still learning and pushing through the pain barrier until I become the rockstar I thought I was.

¹If you’re interested, what’s hot right now is photo sharing apps. What’s not? Location apps, go figure.

Un/Conscious Influences.

If I’m honest I can’t really tell you where the inspiration for Lobaco came from. Sure the idea itself is pretty simple (what’s going on there?) but I can’t really tell you what place or event first inspired me. The pursuit of the idea is much easier to explain, as it basically comes down to the inner dialog that constantly shouts “put up or shut up” at the back of my head; I felt hypocritical telling people to aggressively pursue their goals if I didn’t do the same thing myself. The 3 redesigns and one renaming of Lobaco have much more solid roots, having all stemmed from taking a break from developing and then taking a fresh look at the work I was doing.

Most of the inspiration came from a conscious desire to improve the product. In an effort to duplicate what I perceived as success at the time, many of the changes came from me taking ideas from places like Twitter and Foursquare and wrangling them into my product. Some of these ideas worked quite well, like the UI redesign that took some serious cues from Twitter (large post box in the middle of the screen, 3 column layout); others, like the achievement service which mirrored Foursquare’s badge system (it only has one unlockable, First Post!), proved to be a whole lot of effort for not a whole lot of gain. If you’re one of the brave souls testing the iPhone client (you can sign up here) you’ll notice that the latter feature is completely absent, for that exact reason.

Unconsciously, however, I believe I was thinking that Lobaco would end up being the platform upon which location based communication would be done. Sure, many of the design decisions I made, like making the API RESTful and JSON based, were there to increase cross-platform compatibility, but ultimately I knew that the real power was in being a platform, and I even blogged to that effect. Whilst I don’t believe Lobaco suffered unduly because of this, I hadn’t really considered the influence that outside forces were having on me subconsciously until 4chan creator Christopher “moot” Poole said this:

One of the biggest startup cliches is that every other startup wants to become a platform for other startups to build on. But to Christopher Poole, the founder of Canvas and 4Chan, that is the wrong approach. “People get caught up in trends—game mechanics, building a platform,” he tells Chris Dixon in the Founder Stories video above. Instead of trying to copy what works for others, founders should “focus on building what you love, focus on the product and building the community.”

He doesn’t understand “this obsession with building platforms. Focus on building something worth scaling. You don’t even have something worthy of an API yet. Focus on users and have them fall in love with your thing.” Amen.

Indeed many of the ideas I had emulated in Lobaco were there because I saw other successful companies doing them and figured that they would work for me as well. In reality I would have been much better served by focusing on the core product, refining the idea to the point where its utility was obvious to anyone. Since the whole concept hinged on localized information I probably should have done things backwards, getting the core handset product right before attempting to bring it onto the web. That would have forced me to cut all of the fat out of the application, lest I create a cluttered and useless handset experience.

No matter how hard you try to fight it you will always be influenced by your experiences, and for an information junkie like myself this meant that the service I was building emulated those which I considered most successful. My latest endeavor (which shall remain a secret, for now) is already showing signs of this kind of influence, but I’m at least taking the lessons learned from Lobaco and applying them aggressively. I’m hoping this current project will be the fast track to self-sustainability that I’ve been hungering after for almost 2 years now, and that the time spent in the trenches for Lobaco will pay dividends in bringing it to fruition.


The PC as an Appliance.

We often forget that the idea of a personal computer is an extremely modern one, considering how ingrained in our lives they have become. Indeed the first personal computers appeared around 40 years ago and it took decades for them to become a fixture as common as the television in modern households. The last 2 decades have seen an explosion in the adoption of personal computers, growing at double digit rates nearly every year. Still, even though today’s personal computers are leaps and bounds above their predecessors in terms of functionality, they still share the common keyboard, monitor and mouse configuration that’s been present for decades, despite many attempts to reinvent it.

There does however seem to be a market for curated computing devices that, whilst lacking the power of their bigger brethren, are capable of performing a subset of their tasks. I first began to notice this trend way back when I was still working in retail, as many customers’ requirements for a PC rarely amounted to more than “email, web surfing and writing a few documents”. Even back then (2000~2006) the most rudimentary of the PC line I had to sell would cover this off quite aptly, and more often than not I’d send them home with the cheapest PC available, leaving the computing beasts to gather dust in the corner. To me it seemed that unless you were doing photo/video editing or gaming you could buy a PC that would last the better part of 5 years before having to think about upgrading, and even then only because it would be so cheap to do so.

The trend towards such devices began about 4 years ago with the creation of the netbook class of personal computing devices. Whilst still retaining much of the functionality of their ancestors, netbooks opted for a small form factor and low specifications in order to keep costs down. I, like many geeks of the time, saw them as nothing more than a distraction filling a need that didn’t exist, failing to remember the lessons I had learned many years before. The netbook form factor proved to be a wild success, with many people replacing their PCs in favor of the smaller platform. They were, however, still fully fledged PCs.

Then along came Apple with their vision of creating yet another niche and filling it with their product. I am of course talking about the iPad, which has enjoyed wild success and created the very niche that Apple dreamed of. Like with netbooks I struggled with the idea that there could be a place in my home for yet another computing device, since I could already do whatever I wanted with the ones I had. However, just like the netbooks before it, I finally came around to the idea of having a tablet in my house, and that got me thinking: maybe the curated experience is all most people need.

Perhaps the PC is better off as an appliance, at least for most people.

For the everyman, requirements for a computing device outside the workplace don’t usually extend past the typical “email, web and document editing” holy trinity. Tablets, whilst far from an ideal platform for doing all those tasks aptly (well, in my opinion anyway), are good enough to replace a PC for most people outright. Indeed the other Steve behind Apple, Mr Wozniak, has said that tablets are PCs for everyone else:

“The tablet is not necessarily for the people in this room,” Wozniak told the audience of enterprise storage engineers. “It’s for the normal people in the world,” Wozniak said.

“I think Steve Jobs had that intention from the day we started Apple, but it was just hard to get there, because we had to go through a lot of steps where you connected to things, and (eventually) computers grew up to where they could do … normal consumer appliance things,” Wozniak said.

If you consider the PC as a household appliance then the tablet form factor starts to make a lot of sense. Sure it can’t do everything, but it can do a good chunk of those tasks very well and the barrier to using it is a whole lot lower than that of a fully fledged PC. Plus unlike a desktop or laptop they don’t seem out of place when used in a social situation or simply lying around on the coffee table. Tablets really do seem to be a good device for the large majority of people whose computing needs barely stress today’s incredibly powerful PCs.

Does that mean tablets should replace PCs outright? Hell no, there are still many tasks that are far more aptly done on a PC, and the features that make a tablet convenient (small size, curated experience) are also its most limiting factors. Indeed the power of tablets is built on the foundations that the PC has laid before them, with many tablets still relying on their PC brethren to provide certain capabilities. I think regular users will gravitate more towards the tablet platform, but it will still be a long time before the good old keyboard, monitor and mouse are gone.

Falcon Heavy: SpaceX’s Rocket Can Beat Up Your Rocket.

I’ve been unfortunately slack with space based posts on this blog recently, and whilst that’s mostly due to my attention being diverted to other exploits, I’ve found it hard to find news or topics that I hadn’t already covered and that I thought everyone would enjoy hearing about. Sure, when it comes to space even the most hum-drum activities are still amazing feats deserving of our attention, but that doesn’t necessarily spark the creative muse inside me that’s responsible for churning out a blog post every weekday. Thankfully my favorite private aerospace company SpaceX was determined to make waves today, and boy did they ever.

It all started with a single tweet last week where SpaceX teased that “Something big is coming” and released an accompanying 32 second video showing some of their previous accomplishments. Since their bread and butter is full launch systems, many people speculated that this would be the announcement of a new rocket class, something bigger than the Falcon 9. Today saw the full announcement from SpaceX that the “something big” was indeed their new rocket, the Falcon Heavy, and it’s set to disrupt the private space industry:

Falcon Heavy, the world’s most powerful rocket, represents SpaceX’s entry into the heavy lift launch vehicle category. With the ability to carry satellites or interplanetary spacecraft weighing over 53 metric tons (117,000 lb) to Low Earth Orbit (LEO), Falcon Heavy can lift nearly twice the payload of the next closest vehicle, the US Space Shuttle, and more than twice the payload of the Delta IV Heavy.

Falcon Heavy’s first stage will be made up of three nine-engine cores, which are used as the first stage of the SpaceX Falcon 9 launch vehicle. It will be powered by SpaceX’s upgraded Merlin engines currently being tested at the SpaceX rocket development facility in McGregor, Texas. SpaceX has already designed the Falcon 9 first stage to support the additional loads of this configuration, and with common structures and engines for both Falcon 9 and Falcon Heavy, development and operation of the Falcon Heavy will be highly cost-effective.

The numbers that SpaceX are throwing around are quite amazing, with the Falcon Heavy being able to lift twice the payload weight of the Space Shuttle whilst costing an order of magnitude less per launch. Their specifications make multiple references to its most direct competitor, the Delta IV Heavy, citing that they can deliver twice the payload at a third of the cost. Whilst on paper their claim of double the payload rings true, I’m still a bit skeptical about the “third of the price” bit since the Falcon Heavy’s price range isn’t too far off the Delta IV Heavy’s ($80~125 million vs $140~170 million respectively), but it’s still a significant cost saving nonetheless.

As with all SpaceX rocket designs the Falcon Heavy is truly something to marvel at. Whilst I always get a bit worried when I see large clusters of engines (the Falcon Heavy has 27 engines in total), SpaceX has shown in the past that they can get 9 of them to work in perfect synchronization, so I’m sure they’ll have no trouble scaling it up. What really intrigued me was the cross-feeding fuel system that the Falcon Heavy will employ. In essence it means that during the first stage all of the engines draw their fuel from the boosters on the side, so that when it comes time for stage separation the core stage booster will still have an almost full tank. Couple this with the extraordinary mass ratio of 30, which is almost double that of the space shuttle, and it’s little wonder that the Falcon Heavy can achieve such extreme payload numbers whilst still boasting a ridiculously cheap price.
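To get a feel for why that mass ratio figure matters so much, here’s a rough back-of-envelope illustration using the Tsiolkovsky rocket equation. The specific impulse value below is purely an assumed round number for illustration, not an actual SpaceX engine spec:

```csharp
using System;

// Illustrative only: the Tsiolkovsky rocket equation, delta-v = Isp * g0 * ln(mass ratio).
// The specific impulse here is an assumed round number, not a quoted engine figure.
class MassRatioSketch
{
    static void Main()
    {
        double g0 = 9.81;   // standard gravity, m/s^2
        double isp = 300;   // assumed effective specific impulse, seconds

        double dvMassRatio30 = isp * g0 * Math.Log(30); // ~10,000 m/s of ideal delta-v
        double dvMassRatio15 = isp * g0 * Math.Log(15); // ~8,000 m/s for half the mass ratio

        Console.WriteLine($"Mass ratio 30: ~{dvMassRatio30:F0} m/s");
        Console.WriteLine($"Mass ratio 15: ~{dvMassRatio15:F0} m/s");
    }
}
```

The logarithm is the punchline: doubling the mass ratio doesn’t double your delta-v, but every extra bit of propellant fraction squeezed out of the structure still translates directly into payload that can be carried to the same orbit.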

What’s truly exciting though is their planned production rate for these new rockets. Once in service SpaceX is planning to launch up to 10 each of the Falcon 9 and Falcon Heavy per year, for a total of 20 flights per year. To put this in perspective, the whole Delta IV family has only had 16 launches during its entire lifetime, so for SpaceX to pursue such an aggressive launch schedule means that they think there’s a real demand for getting a whole lot of kit up into space, just not at the current price level. Indeed SpaceX will be the first company ever to offer payload delivery into space at the coveted $1000/lb mark, long held as the practical limit of conventional rocket technology. With SpaceX pursuing such aggressive economies of scale it won’t be long before that price begins to come down, and that’s when things start to get interesting.

Whilst the cost of a ticket to space will still be well outside the reach of the everyman for many decades to come, breakthroughs like the ones SpaceX are making a habit of delivering signal the beginning of the real space age for all mankind. The $1000/lb mark puts the cost of putting your average human into orbit at around $200,000 just on weight (probably triple that for a realistic cost), which is scarily close to Virgin Galactic’s initial ticket price for a 5 minute sub-orbital junket. As many aspects of getting people to orbit become routine and the research costs are a long forgotten memory, there’s really nothing stopping the price from coming down to within the reach of those who would desire it. Sure we’re a long way off from seeing the kind of competition we see with the airlines today, but the similarities between the early days of flight and the fledgling space industry are just too strong to ignore. The next decade will bring us some truly exceptional revolutions in technology and all of them will help to make the dream of a true space age for humanity come to fruition.
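For the curious, here’s the back-of-envelope math behind those figures, using only the numbers quoted above; the 200 lb passenger weight is just a round assumption on my part:

```csharp
using System;

// Back-of-envelope check using the quoted Falcon Heavy figures: launch price divided
// by payload to LEO, then the $1000/lb mark applied to an assumed 200 lb passenger.
class LaunchCostSketch
{
    static void Main()
    {
        double payloadLb = 117000;     // quoted payload to LEO, in pounds
        double priceLowUsd = 80e6;     // low end of the quoted launch price range
        double priceHighUsd = 125e6;   // high end of the quoted launch price range

        Console.WriteLine($"Cost per lb: ${priceLowUsd / payloadLb:F0} to ${priceHighUsd / payloadLb:F0}");
        Console.WriteLine($"200 lb passenger at $1000/lb: ${200 * 1000:N0} on weight alone");
    }
}
```

That works out to roughly $680 to $1070 per pound, which is where the $1000/lb figure and the ~$200,000 per-person number come from.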

I really can’t express just how excited this makes me.

What’s The Use Case For a Cellular Tablet?

So I’m sold on the tablet idea. After resisting it since Apple started popularizing it with the iPad, I’ve finally started to find myself thinking about numerous use cases where a tablet would be far more appropriate than my current solutions. Most recently it was after turning off my main PC and sitting down to watch some TV shows, realizing that I had forgotten to set up some required downloads before doing so. Sure I could do them using the diNovo Mini keyboard, but it’s not really designed for more than logging in and typing in the occasional web address. Thinking that I’d now have to power on either my PC or laptop, I lamented that I didn’t have a tablet that I could RDP into the box with and set up the downloads whilst lazing on the couch. Thankfully it looks like my tablet of choice, a WiFi-only Xoom, can be shipped to Australia via Amazon so I’ll be ordering one very soon.

Initially I thought I’d go for one of the top of the line models with all the bells and whistles, most notably a 3G/4G connection. That was mostly just for geek cred, since whenever I’m buying gadgets I like to get the best that’s on offer at the time (as long as the price isn’t completely ludicrous). After a while though I started to think about my particular usage patterns and I struggled to find a time when I’d want to use a tablet and be bereft of a WiFi connection, either through an access point or tethered to my phone. There’s also the consideration of price, with non-cellular tablets usually being quite a bit cheaper, on the order of $200 in the Xoom’s case. It then got me thinking: what exactly is the use case for a tablet with a cellular connection?

The scenarios I picture go something along these lines. You’re out and about, somewhere that has mobile phone reception, but you don’t have your phone on you (or it’s not capable of tethering) and you’re nowhere near a WiFi access point. Now having mobile phone reception but no WiFi is a pretty common event, especially here in Australia, but the other side of that situation is that you either can’t tether to your mobile phone because it’s not capable or you don’t have it on you. Couple that with the fact that you’re going to have to pay for yet another data plan just for your new tablet, and you’ve really lost me as to why you’d bother with a tablet that has cellular connectivity.

If your reason for getting cellular connectivity is that you want to use it when you don’t have access to a WiFi access point, then I could only recommend it if you have a phone that can’t tether to other devices (although I’d struggle to find one today, heck even my RAZR was able to do it). However, if I may make a sweeping statement, I’d assume that since you’ve bought a tablet you already have a smartphone which is quite capable of tethering, even if the carrier charges you a little more for it (which is uncommon, and usually cheaper than a separate data plan anyway). The only real reason to have it is for when you have your tablet but not your phone, a situation I’d be hard pressed to find myself in without also being within range of an access point.

In fact most of the uses I can come up with for a tablet actually require it to be on some kind of wireless network, as it makes a fitting interface to my larger PCs, with all the functions that could be done over a cellular network aptly covered by a smartphone. Sure a tablet might be more usable for quite a lot of activities, but it’s also a lot more cumbersome than something that can fit in my pocket, and rarely do I find myself needing functionality above that of the phone but below that of a fully fledged PC. This is why I was initially skeptical of the tablet movement, as the use cases were already aptly covered by current generation devices. It seems there’s quite a market for transitional devices, however.

Still, since nearly every manufacturer is making both cellular and WiFi-only tablets there’s got to be something to it, even if I can’t figure it out. There’s a lot to be said for the convenience factor, and I’m sure a lot of people are willing to pay extra just to make sure they can always use their device wherever they are, but I, for one, can’t seem to get a grip on it. So I’ll put it out to the wisdom of the crowd: what are your use cases for a cellular enabled tablet?

Still In The Grok Stage.

After reaching 1.0 of Lobaco I’ve taken a breather from developing it, mostly so I could catch up on my backlog of games and give my brain a well deserved break from working on that problem space. It’s not that I’m tired of the idea, I still think it has merit, but the last 6 months of sinking what little free time I had on nights and weekends into it were starting to catch up with me, and a break is always a good way to kick start my motivation. It didn’t take long for numerous new ideas to start popping into my head afterwards, and instead of jumping back into Lobaco development I thought I’d cut my teeth on another, simpler project that would give me the experience I needed to migrate Lobaco into the cloud.

The weekend before last I started experimenting with ASP.NET MVC, Microsoft’s web framework based on the model-view-controller pattern that I had become familiar with after deep diving into Objective-C. I could have easily done this project in Silverlight, but I thought I’d have to man up sooner or later and learn a proper web language, otherwise I’d be stuck in my desktop developer paradigm for good. The results weren’t spectacular and I could only bring myself to spend about half the time I usually do coding on the new site, but there was progress made nonetheless.

Last weekend was more productive, with me managing to make the site look something like the vision I had in my head. Satisfied that I could design a decent looking website, I decided to start hacking away at the core fundamentals of the application. This is where I rubbed up against the limitations of the framework I had chosen for this particular project, not knowing that whilst ASP.NET MVC might share most of its name with its ASP.NET cousins, it is in fact a world away from them. Sure it’s still extremely capable, but it’s nothing like the drag and drop framework I had been used to with other Microsoft products, leaving me to research pure HTML and JavaScript solutions, something which I had avoided like the plague in the past. This meant that progress was pretty slow, and the temptation to play Starcraft 2 with a bunch of my good mates was too strong, so I left it there for the weekend.
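To give a sense of the shift, here’s a minimal sketch of the ASP.NET MVC style (the controller and data are made up for illustration, not taken from my actual project): rather than dragging server controls onto a designer surface, you write a controller action that hands a model to a view, and the view itself is HTML you author by hand.

```csharp
using System.Web.Mvc;

// Hypothetical controller: with the default route, a request for /Posts lands on
// this Index action, which returns a view rather than a WebForms-style control tree.
public class PostsController : Controller
{
    public ActionResult Index()
    {
        // Stand-in data; a real application would pull this from a repository or database.
        var posts = new[] { "First post", "Second post" };
        return View(posts); // renders the Views/Posts/Index view with this model
    }
}
```

It’s a small amount of code, but everything the designer used to hide, from the markup to the client-side behaviour, is suddenly your problem, which is exactly where my weekend went.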

The slow progress really frustrated me. After finally gaining competence with Objective-C I felt like learning yet another new framework would be easy, even if it meant learning another language. Somehow I managed to forget that frustrating first month where progress was almost nil and I convinced myself I wasn’t procrastinating when looking for other solutions to my problems. Eventually I came to the realization that I was still grokking the new framework I had chosen for my application and that I shouldn’t be expecting myself to be blazing trails when I was still establishing my base of fundamental knowledge.

I see lots of people go through the same struggle when trying out new things, and I can see how easy it is to give up when you’re not making the kind of progress other people are. Believe me, it’s even worse in the tech/start-up area, where every other day I’m reading about someone who hacked together a fully usable service in a weekend whilst I struggle to get my page to look like it wasn’t written in Notepad. The realization that you’re still in the grok stage of learning something new is, I find, quite a powerful motivator, as past experience has shown that it’s only a matter of time and persistence before floundering around turns into being quite capable.

I’m usually the first one to tell people to stick with what they know, as re-skilling is extremely expensive time-wise (and can be $$$-wise too, Objective-C set me back a few large), but the pay-offs of diversifying your skills can be quite large. Whilst I’ve yet to make any semblance of a dollar from all my adventures in iPhone development I still count it as a valuable experience, if only for the fact that it’s given me a lot of perspective and oodles of delicious blog fodder. Time will tell if this current foray into yet another web framework will be worth my time, but I wouldn’t be doing it if I thought there was no chance of it ever paying off.


Barbie’s Cali Girl Horse Adventure: My Life is Complete.

Long time readers of this blog will know that I’m a pretty big fan of ponies, even casting off the oppression of my daily job to pursue my career as a pony tamer. Sadly it wasn’t to be, as all the ponies in my care suffered an unfortunate trip to the glue factory, something which still haunts me to this day. Still, my love and adoration for ponies hasn’t subsided; instead I now find solace in other pony related exploits. You can then imagine how excited I was when I heard that Barbie had released their latest pony related game, Barbie Cali Girl Horse Adventure.

Whilst not strictly a pony centric game, Cali Girl Horse Adventure provides many aspects of the pony experience. You play as yourself, a stranger in the land of California looking for that perfect horse to take on an adventure with you. Throughout the adventure you’re presented with many different options providing an enthralling non-linear experience, ensuring that each playthrough will be quite unlike the ones before. I chose the Gallop on the Beach path initially, and boy was I in for a wild ride.

Let me just take a step back to say something about the graphics of this game: they’re simply stunning. Whilst I’ve been known to gush over any latest gen game, Cali Girl Horse Adventure was another step ahead of anything else I had played recently, even Crysis 2. Indeed I spent most of the game just staring slack jawed at the screen, completely and utterly dumbfounded by what I was seeing. Truly game developers of today have a lot to learn from the people behind Cali Girl Horse Adventure, and I’m hoping the BarbieHorEngine that drives this makes its way into some more AAA titles.

Some of the real meat of the game is in the customization options that allow you to modify your horse as you see fit. As with any game that provides an in depth character creator, I spent the better part of 4 hours narcissistically trying to recreate myself within the game. Whilst, as always, I wasn’t able to get a perfect representation of myself, I still did quite well, managing to get most of the important details correct. Most interestingly the character creator also has a huge impact on how you will progress through Cali Girl Horse Adventure, so you’d be wise to make your choice of character look carefully, lest you end up in a situation you just hadn’t planned for.

Combat in the game has been refined to a point where it’s almost indistinguishable from just randomly clicking on the screen. Sure many people will criticize this game for dumbing it down for the console port (which I’ve yet to find a release date for) but to do so misses the incredible amount of innovation that went into creating Cali Girl Horse Adventure. I could go on describing the experience but realistically it’s something so far away from everything else that you just have to experience it for yourself.

Of course what really captivated me was the story behind Cali Girl Horse Adventure. Whilst I won’t dive into any spoilers here, it suffices to say that from beginning to end I was hooked, eagerly clicking away at every opportunity to find out what happened next. The ending is uplifting and extremely satisfying, wrapping up the plot succinctly whilst providing a subtle hint that a sequel may or may not be on the horizon. I usually trounce games that do this, but Cali Girl Horse Adventure pulled it off so well that I can’t do anything but commend them for it.

When I was writing out the title for this post I really meant it when I said my life was complete. Barbie’s Cali Girl Horse Adventure has shown me what gaming perfection is, effortlessly capturing all those things which I’ve been longing for in other AAA titles. I really can’t tell you exactly how many hours I lost on this game but suffice to say I loved every single one of them and will probably be going back for another fix very soon. If you’re looking for a game that delivers on every promise that any game has ever made then Cali Girl Horse Adventure is definitely for you.

Rating: 11.0/10

Barbie’s Cali Girl Horse Adventure is available on PC right now for $3399.89. The game was played on the hardest difficulty setting with approximately 238 hours logged, my character reaching the “Super Horse Lover” level.