Posts Tagged ‘web’

The Viability of Cloud Gaming.

The idea of cloud gaming is a seductive one, especially for those of us who lived through the times when upgrading your computer every 12 months was a requirement if you didn’t want to be watching a slide show. Abstracting the hardware requirement away from the user and then letting them play on any device above a certain, extremely low threshold would appear to be the solution to the upgrade and availability issues of dedicated gaming platforms. I’ve long made the case that the end product is something of a niche market, one that I was never quite sure would be viable on a large scale. With the demise of OnLive I could very easily rest my case on that alone, but you can never write off an industry on the failures of the first to market (see Iridium Communications for proof of this).

Providing even a small cloud gaming service requires some rather massive investments in capital expenditure, especially with the hardware that’s currently available. For OnLive this meant that each of their servers could only serve one user at a time, which was terrible from a scalability standpoint as they could never service that many customers without bleeding money on infrastructure. Cloud gaming services of the future might be in luck however, as both NVIDIA and AMD are working on cloud GPUs that will enable them to achieve much higher densities than the current 1 to 1 ratio. There’ll still be an upper limit that’s much lower than most cloud services (which typically serve thousands of users per server) but at the very least the scalability problem becomes an engineering issue rather than a capital one.
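
To put some rough numbers around that, here’s a little back of the envelope calculation done as a C# sketch. Every figure in it is one I’ve made up purely for illustration (I have no visibility into OnLive’s or anyone else’s actual costs); it’s just there to show why the users-per-server ratio is the whole ballgame.

```csharp
using System;

class GpuDensityMath
{
    static void Main()
    {
        // Purely illustrative numbers, not anyone's actual costs.
        double serverCostPerMonth = 300.0;    // amortised hardware + power + hosting
        double revenuePerUserPerMonth = 10.0; // hypothetical subscription fee

        foreach (int usersPerServer in new[] { 1, 4, 16 })
        {
            double costPerUser = serverCostPerMonth / usersPerServer;
            double marginPerUser = revenuePerUserPerMonth - costPerUser;
            Console.WriteLine(
                "{0,2} users/server: cost per user ${1,6:F2}, margin per user ${2,7:F2}",
                usersPerServer, costPerUser, marginPerUser);
        }
        // At a 1:1 ratio the infrastructure alone swallows the subscription;
        // only once cloud GPUs push the density up does the margin turn positive.
    }
}
```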

The second major challenge that cloud gaming companies face is how latency sensitive a good portion of the games market is. Whilst you can get down to very low latency numbers with strategically placed servers you’re still going to be adding a good chunk of input lag on top of any server latency, which will be unacceptable for a lot of games. Sure there are titles where this won’t be an issue but cutting off a large section of the market (FPS, RTS, RPGs and any mix of them in between) further reduces the viability of any potential cloud gaming service.
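
Just to illustrate where that input lag comes from, here’s a hedged sketch that adds up a hypothetical cloud gaming latency budget. All of the stage timings are assumptions of mine rather than measurements of any real service, but the shape of the problem is the same regardless of the exact numbers.

```csharp
using System;

class CloudGamingLatencyBudget
{
    static void Main()
    {
        // All values in milliseconds and purely illustrative.
        var stages = new (string Name, double Ms)[]
        {
            ("Input capture + upload to server", 15),
            ("Game simulation + render on server", 16),
            ("Video encode", 10),
            ("Network transit back to client", 15),
            ("Video decode + display", 15),
        };

        double total = 0;
        foreach (var stage in stages)
        {
            total += stage.Ms;
            Console.WriteLine("{0,-40} {1,5:F0} ms", stage.Name, stage.Ms);
        }
        Console.WriteLine("{0,-40} {1,5:F0} ms", "Total added input lag", total);
        // ~70 ms on top of whatever lag a local machine already has is fine for a
        // turn based game, but it's very noticeable in an FPS or RTS.
    }
}
```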

In fact many of the titles that could benefit from a cloud gaming service can already be ported to the web thanks to things like Unity or the use of OpenGL (via WebGL) in HTML5 capable browsers. Indeed many of the games that I could see being published on a cloud platform (casual MMORPGs, turn based strategy games, etc.) wouldn’t be much different if they were brought to the traditional web instead. Sure you lose some of the platform agnosticism because of this but you can arguably reach the same number of people that way as you could with a cloud platform.

User expectations are also set rather high for cloud services, with many of them being flat fee, unlimited usage scenarios (think Pandora, NetFlix, etc). The current business models for cloud gaming don’t gel well with this mindset as you’re paying for the games you want to play (often cheaper than retail, sometimes not) for a limited period of time, akin to a long term rental. Whilst this works for some people most users will expect to pay a flat fee for access to a catalogue they can then use at their leisure, and this has significant ramifications for how publishers and developers will license their games to cloud providers. It’s not an insurmountable problem (the music industry came around eventually so the games industry can’t be far behind) but it does introduce a market dynamic that cloud gaming services have yet to navigate.

With all these things considered I find it hard to see how cloud gaming services can be viable in the near term: whilst all the issues are solvable, they all work against delivering something that can turn a profit. Cloud GPUs, the ever increasing quality of Internet connections and the desire by many to migrate completely to cloud based services do mean that there’s a trend towards cloud gaming becoming viable in the future, however the other, more fundamental limitations could see those pressures rendered null and void. This is something I’m willing to be proven wrong on though, as I’ve invested myself heavily in cloud principles and I know that they’re capable of great things. Whether cloudifying our gaming experience is one of them is something I don’t believe is currently feasible, and I don’t see that changing for a while.

The Weird and Wonderful Effects of Microgravity.

Life on Earth evolved in a never ending battle to be the best adapted species to its environment. Consequently it can be said that the life forms that evolved here on Earth are specialist biological machines with certain requirements that must be met in order for them to thrive. It then comes as no surprise that entire species can be wiped out by small changes to their environment as their specific adaptations no longer provide the advantage that they require. However there’s one particular pressure that all life has evolved with that, at least for most life, will never change: gravity.

Many biological processes rely on gravity in order to function correctly and for the longest time it was thought that no life that evolved here on Earth could survive a zero/microgravity environment for long. Indeed the medical doctors back on the Mercury program were quite sure that the second their astronauts went into orbit their vision would blur, rendering them incapable of performing any tasks. The truth of the matter is that whilst we’re designed to work well in our standard 1G environment our bodies can cope quite well with microgravity for extended periods of time, provided certain precautions are taken.

What’s truly fascinating to watch though is how other creatures function without the aid of a constant gravitational pull. Indeed quite a lot of the science done aboard the International Space Station has been centred around studying these effects on a variety of creatures, and some of it has produced very interesting results. For example spiders sent up to the ISS don’t spin webs like their Earth bound relatives do; instead they weave what looks like a tangled mess all over their environment. It would seem that their sense of direction relies heavily on figuring out which way is down and, absent that, their webs lose their usual symmetry.

Other animal species seem to adapt rapidly to the loss of gravity’s unrelenting pull. Mummichogs, a type of small fish, turn out to be quite hardy little creatures in microgravity: they suffer some initial confusion but after a short while they’re quite capable of swimming perfectly well. Ants too seem to adapt rapidly to the loss of gravity, with their nests taking on an almost surreal structure that is not like anything you’ll see on Earth. The habitat that NASA designed to take ants into space is also quite incredible, being a clear blue gel that contains everything the ants need to survive both the trip up and life aboard the space station.

Incredibly some species appear to be better suited to microgravity than the regular 1G environment on Earth. C. elegans, a type of unsegmented worm, not only adapted to life in space but showed a marked increase in life span over their Earth bound cousins. The cause appears to be a down-regulation of certain genes associated with muscle ageing which in turn leads to a longer life. Whether the same genes could be down-regulated in humans is definitely an area for investigation but, as everyone knows, us humans are far more complicated beasts than the simple C. elegans.

Indeed whilst muscle atrophy is one of the biggest problems facing astronauts who spend a long time in space there are several more concerns that also need to be addressed. Unlike C. elegans we humans have an internal skeleton and, absent the effects of gravity, it tends to deteriorate in much the same way as it does in bed ridden patients and people with osteoporosis. Additionally whilst the ISS is still within the protective magnetic field of Earth it’s still subject to much higher levels of radiation than we get down here, which poses significant health risks over the long term. There’s also a whole swath of things that don’t quite work as intended (burping in microgravity is fraught with danger) which we’re still working on solutions for, but suffice to say if we’re ever going to colonize space then reproducing the effects of gravity is going to be one of the most critical technologies required.

It’s not often that we get the opportunity to effectively remove an unyielding constant and then study how much it influenced the development of life here on Earth. This is one of the reasons why space based research is so important: it gives us clues and insights into how dependent our biological processes are on certain key variables. Otherwise we’d simply figure that gravity was a requirement for life when now we know that life can survive, and even thrive, in its absence.

Still In The Grok Stage.

After reaching 1.0 of Lobaco I’ve taken a breather from developing it, mostly so I could catch up on my backlog of games and give my brain a well deserved break from working on that problem space. It’s not that I’m tired of the idea, I still think it has merit, but the last 6 months of little free time on the nights and weekends were starting to catch up with me and a break is always a good way to kick start my motivation. It didn’t take long for numerous new ideas to start popping into my head afterwards and instead of jumping back into Lobaco development I thought I’d cut my teeth on another, simpler project that would give me the experience I needed to migrate Lobaco into the cloud.

The weekend before last I started experimenting with ASP.NET MVC, Microsoft’s web framework based on the model-view-controller pattern that I had become familiar with after deep diving into Objective-C. I could have easily done this project in Silverlight but I figured I’d have to man up sooner or later and learn proper web development, otherwise I’d be stuck in my desktop developer paradigm for good. The results weren’t spectacular, and I could only bring myself to spend about half the time I usually do coding on the new site, but there was progress made nonetheless.

Last weekend was more productive, with me managing to make the site look something like the vision I had in my head. Satisfied that I could design a decent looking website I decided to start hacking away at the core fundamentals of the application. This is where I rubbed up against the limitations of the framework I had chosen for this particular project, not knowing that whilst ASP.NET MVC might share most of its name with its ASP.NET cousins it is in fact a world away from them. Sure it’s still extremely capable but it’s nothing like the drag and drop frameworks I had been used to with other Microsoft products, leaving me to research pure HTML and Javascript solutions, something I had avoided like the plague in the past. This meant that progress was pretty slow and the temptation to play Starcraft 2 with a bunch of my good mates was too strong, so I left it there for the weekend.
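
For those who haven’t made the jump the difference is easy to see in code: there’s no designer surface and no server controls, just a controller action and a view full of HTML and Javascript you write yourself. A minimal sketch (the class and model names here are placeholders of mine, not anything from my actual project):

```csharp
using System.Web.Mvc;

// A bare-bones ASP.NET MVC controller: one action handing a model to a view.
public class HomeController : Controller
{
    public ActionResult Index()
    {
        var model = new GreetingModel { Message = "Hello from MVC" };

        // Renders the Views/Home/Index view with the model; that view is plain
        // markup plus whatever Javascript you care to write yourself.
        return View(model);
    }
}

public class GreetingModel
{
    public string Message { get; set; }
}
```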

The slow progress really frustrated me. After finally gaining competence with Objective-C I felt like learning yet another new framework would be easy, even if it meant learning another language. Somehow I managed to forget that frustrating first month where progress was almost nil and I convinced myself I wasn’t procrastinating when looking for other solutions to my problems. Eventually I came to the realization that I was still grokking the new framework I had chosen for my application and that I shouldn’t be expecting myself to be blazing trails when I was still establishing my base of fundamental knowledge.

I see lots of people go through the same struggle when trying out new things and I can see how easy it is to give up when you’re not making the kind of progress other people are. Believe me it’s even worse in the tech/start-up area where every other day I’m reading about someone who hacked together a fully usable service in a weekend whilst I struggle to get my page to look like it wasn’t written in Notepad. The realization that you’re still in the grok stage of learning something new is one I find to be quite a powerful motivator, as past experience has shown that it’s only a matter of time and persistence before you go from floundering around to being quite capable.

I’m usually the first one to tell people to stick with what they know as re-skilling is extremely expensive time wise (and can be $$$ wise too, Objective-C set me back a few large) but the pay-offs of diversifying your skills can be quite large. Whilst I’ve yet to make any semblance of a dollar from all my adventures in iPhone development I still count it as a valuable experience, if only for the fact that it’s given me a lot of perspective and oodles of delicious blog fodder. Time will tell if this current foray into yet another web framework will be worth my time but I wouldn’t be doing it if I thought there was no chance of it ever paying off.

Procrastination Takes Many Forms.

I really can be my own worst enemy sometimes. It’s been almost a month since I got back from the USA and despite the best of intentions I haven’t really done that much work on Lobaco apart from a little work on the API and web UI. Whilst I was pretty sure I wasn’t going to hit the code hard immediately after touching back down in Australia I still thought that after maybe a week or two of lazing about the coding bug which had firmly bitten me before I left would take hold once again, pushing me to build on the momentum I had set up. Sadly it wasn’t to be and instead I just resigned myself to feeling guilty about what I should’ve been doing and pulling the meter tall weeds that had grown in our front yard.

Partly to blame is that sense of perspective I get whenever I take time away from a project to work on something else or to just have a break. Usually the first thing that pops into my head is “why the hell should I bother” and I’ll spend a good chunk of time focusing on the negative aspects of whatever I’m doing. After a while though I’ll just try to do a few small things, a few quick wins to get me back into the mindset of getting things done. After that it’s usually pretty easy going (this part usually takes about 2 weeks) until I hit a wall again or I feel like getting my weekends back for a while so I can relax and get my head back together. The last few iterations of this cycle are what led to the 3 major revisions of what is now Lobaco.

This time however was different. After being back for 2 weeks and being firmly thrust back into a world that had barely changed since I left (even though I expected it to be wildly different, for some reason) I still couldn’t get into coding without feeling like I should be doing something else. My usual routine of getting a couple of quick wins with the API and web UI didn’t translate into jumping back onto my MacBook and smashing out some iPhone code. Instead I started wondering whether or not a native client was the way to go and considering the possibility of doing a web based client for the phone itself. I had been down this road before and ultimately found that whilst iPhone programming was a world away from anything I’d done before, the progress I had made with only a couple of weeks of effort was far more encouraging than the same amount of time spent trying to wrangle HTML5 and Javascript into something workable.

Then along came Sencha.

I was going through my 700+ post backlog of Techcrunch articles when I came across this one about Sencha, a web startup that had just released their Touch framework which provides the basis for building native looking applications in HTML5 and Javascript. Thinking this might be my salvation from writing native clients for every handset I quickly downloaded the framework and started hacking around to get something workable. I was able to get the example running in one weekend and made a few modifications but I didn’t get into the real meat of it until last Friday night. After managing to replicate the UI I had built in Objective-C within the Sencha framework I uploaded it to my web server to see what it would look like on the iPhone and instantly realised what was wrong.

This client was just an elaborate way of procrastinating.

Now whilst the client looked decent and didn’t take too much to set up it didn’t look anywhere near as good as my native app, nor could it hold a candle to its performance. Sure my hack job probably ensured that the performance wasn’t as good as it could be, but even compared to the equally hacked together native client the difference was pretty stunning. After coming to that realisation I booted up my MacBook to start getting acquainted with Xcode again and spent last weekend coding up some performance improvements¹ which I had put off before I left for the USA. I’m sure this won’t stop me from looking at going down that path again in the future but I can at least rest easier now that I’m feeling the urge to program once again.

It’s been a weird learning experience for me since I’m usually pretty good at knowing when I’m procrastinating. Usually I have a good reason for it (like having 1 bit of work to do and not doing it since it’s not due for months) but this is the first time I’ve caught myself doing something I thought was useful when really I was just making excuses to avoid the work I knew needed to be done. With a good week of holidays coming up over the Christmas/New Year period this realisation couldn’t have come at a better time and I’m looking forward to making the most of my time off, with the hope that the momentum will carry me on into the new year.

¹I’m going to do a big post about this, hopefully tomorrow. I hit some really, really esoteric problems getting everything to work but I got there in the end and the results are, to put it bluntly, bloody amazing.

2 Years and 2 Days On: A Look Back at The Refined Geek.

I’ve never been one for making a big fuss about milestones on this blog, apart from that one time when I hit 100 posts (now well over 450) and unleashed Geon into the world. Indeed as the title of this post suggests I even managed to let the 2 year milestone slip by for 2 days before realising that I had been at this blogging thing for quite a while, nearly double the time of any job I’ve held in the past 6 years. So since I don’t have anything else interesting to post about today (more on that later) I thought I’d take some time to reflect on what this blog was, where it is and where I think this thing is going in the near future.

As anyone who’s made the journey into the archives section of this blog will tell you I initially started blogging as a knee jerk reaction to being roped into the No Clean Feed movement here in Australia. In all honesty I’ve never really been that much of a writer, nor anyone you would consider a public face for something. Still my ego is large enough to support that idea so when my long time friend Hannah asked me to be the media representative for the Canberran branch I didn’t hesitate to say yes. What followed was a brief stint in the public eye with me doing a couple of radio interviews and giving a speech in the middle of Canberra. Thinking that this would lead onto bigger and better things I figured it was time to get my online personality into shape and started this blog to chronicle my thoughts and misadventures whilst fighting against the Australian Internet Filter.

The name was something I came up with after a night of googling through dozens of possibilities before I found one that didn’t have any meaningful search results for the title. I always had the theme of something debonair in mind but also wanted to keep true to my geeky/nerdy roots and “The Refined Geek” seemed to fit the bill. Funnily enough, not too long after starting this blog and buying the domain name I came across Refined Geek, another Australian based blogger who shares some of my passions but whose writings are worlds away from what I write here. I still drop by there from time to time as he’s quite a good writer, preferring to post less often with much more well formed posts than my usual one post a day scatter gun approach.

I can’t remember exactly when it happened but I do remember making the commitment to writing at least one post a day sometime back in the beginning of 2009. Mostly it was because I felt this blog was languishing in the dark recesses of the Internet, garnering only one view a day for the first 3 months or so. After integrating my blog with Twitter and Facebook that traffic increased tenfold but my presence outside my social circle was still quite minimal. Still, as I developed a large backlog of posts on varying subjects the traffic started to climb, peaking at about 20 visits a day by the end of 2009. 2010 however has really been this blog’s year, with 80~100 people visiting per day looking for all sorts of weird and wonderful things. I’m still surprised to see some of my old articles popping up in the stats; it always brings a smile to my face.

Initially I started out with the idea that this would be my professional presence on the web, demonstrating my professionalism and expertise on certain subjects. However, as most amateur bloggers find, the stories that do well are often those that come with a personal aspect to them and I always found those the easiest to write. Over time I let go of the idea that people would come here like they do for the other big blogs, instead preferring to just write about what I’m passionate about and see where the chips fall. This has most recently taken the form of not trying to force out a post every day (although my OCD keeps bugging me to), instead hoping that I can just let the topics come to me and write when the moment strikes. More recently still I took to blogging my exploits through the USA, which was an interesting diversion away from the usual game/tech/space focus that I take. I think that was the final nail in the coffin for the “this isn’t my personal blog” idea (all the other nails were put in a long time ago, however) and I’ve wholeheartedly resigned myself to not thinking about The Refined Geek in that way again.

As for the future of this blog? I’m not really sure where I want to go with it. Spending an hour or two here every day writing a post still feels like part of my morning routine so there’s no doubt that I’ll be continuing to post here for the foreseeable future. However there have been many times when I’ve considered moving it to a better domain (I happen to own www.davidklemke.com, which would be very suitable), revamping the site with a new theme or even starting anew with a better focus, but with all my other exploits at the moment I can’t see many of them happening soon. So those long term readers of mine can rest easy in the fact that I’m not going to start changing things now that I’ve hit the terrible twos, but with change coming my way in the real world soon I can see this blog shifting in unison as it has done over the past 2 years. Whether that change is anything I’ve just predicted is anyone’s guess, but I’m not one to be comfortable with the status quo.

I mean really, when was the last time you saw me write about finance? 😉

WTF Was This Guy Doing: My Refactoring Experience.

It doesn’t take much to send me on a coding spree. Sometimes it’s something as simple as an idea that I need to implement right now since it fundamentally changes the way the application will evolve, other times it’s something right out of the blue. Last night was the latter: after finishing up some preliminary packing for my trip to the US on Sunday (stay tuned for pictures, posts and vlogs!) and playing a couple of games of Starcraft 2, I found myself watching the latest episode of the Random Show. In essence it’s just Tim Ferriss (4 Hour Work Week) and Kevin Rose (Digg founder) talking about all sorts of things, but a common theme is always that of entrepreneurship. As someone who’s aspiring to that lifestyle I’m usually fixed firmly to the screen, hoping for some gems that will help me along my merry way. Last night however provided something completely different.

After listening to them for quite a while I looked down at my notepad with its list of features that I’d slated for integration into Lobaco. I’ve deliberately let them go by the wayside as feature creep is the easiest way to kill a product before it even gets off the ground. Couple that with the fact that the penny only recently dropped for me on iPhone development and the less ambitious I make the first iteration of the product, the more likely I am to make it solid and usable rather than a total mess of half done features. Still there are a couple on there that are wholly web client based, so feeling the entrepreneurial surge from two web start-up powerhouses I thought I should go ahead and knock a couple of them over.

Boy was I in for a surprise.

One feature which looked easy and would make the UI slightly more complete was making the right hand side information section scale dynamically with the browser’s height. In essence this is so you can see more if you’ve got a larger screen and so the UI looks a bit better on smaller screens. Since Silverlight supports dynamic height scaling by simply not specifying a height I thought that all I’d need to do was remove the static height and I’d be done, leaving me to knock over another feature before bed. Changing the property led to the list box scaling out to its full height and refusing to show a scroll bar, leaving me scratching my head as to what was going wrong.

Diving into the code I noticed that whatever I set the height to in the class file would determine the height of the list box. Thinking that setting it to the available height would give me the behaviour I wanted, I coded up a loop that set the height whenever the size of the browser window changed. This kind of worked but never scaled properly, despite my beautifully crafted logic statements. Something was definitely amiss, but it took me another 2 hours to track down what it was.

Essentially it was a clusterfuck of 3 different coding screw ups. The first was placing the custom class I had designed inside a list box, which in essence meant it was wrapping itself in itself. The second was actually using that class in the first place as it was not required and also duplicated a ton of styling logic, thanks to the way Expression Blend messes with your code. Lastly, instead of adding items directly into the list box itself I was creating yet another list box, adding items into that and then adding that entire list box into the main list box (which was wrapped in yet another list box). To get dynamically scaling height in that mess I would’ve had to set the height in about 3 different locations consecutively, an expensive process for something that’s supported natively.
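
For the curious, the shape of the fix looks roughly like the following. This is Silverlight flavoured C# written from memory for illustration rather than the actual Lobaco code: one list box, no explicit height, items added straight into it so the layout system can do the stretching for me.

```csharp
using System.Collections;
using System.Windows;
using System.Windows.Controls;

// A rough, from-memory sketch of the simplified structure, not the real thing.
public class FeedPanel : UserControl
{
    private readonly ListBox feedListBox;

    public FeedPanel()
    {
        // One list box with no explicit Height: left unset, the layout system
        // stretches it to whatever space its parent gives it and the scroll bar
        // shows up on its own.
        feedListBox = new ListBox
        {
            HorizontalAlignment = HorizontalAlignment.Stretch,
            VerticalAlignment = VerticalAlignment.Stretch
        };

        // The old code wrapped a custom control (which itself contained a list box)
        // inside a list box, then built a third list box of items and dropped that
        // in as well, which is why the height had to be set in three places before
        // anything scaled. Items now go straight into the one control instead.
        Content = feedListBox;
    }

    public void ShowItems(IEnumerable items)
    {
        feedListBox.ItemsSource = items;
    }
}
```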

The thing is this component was one of the first things to be coded into Lobaco about 3 months ago, so this issue has been there almost from day dot. I’ve looked at that code dozens of times over the course of developing this application and not once did it twig that what I was doing was completely ass backwards. It’s been almost a month and a half since I did any serious work on the web client and it seems that time away has given me enough perspective to see those obvious mistakes. I think that all developers need time away from their projects in order to get their head out of the problem space and get a clearer perspective on what they’re doing. Hell I’d say that those breaks I took from developing Lobaco were wholly responsible for the 3 code dumps I did and the current polished version that’s on the web today.

In the end the whole development process has been one of the most gratifying learning experiences I’ve ever had. It seems every time I think I’ve got things down pat I learn something new that makes me rethink my past decisions, tweaking things so they’re just that little bit better. Whilst I’m sure that this code base is here to stay it’s definitely evolving as time goes on as each change builds upon the last to provide a better experience for my future users. I won’t be making any progress on it for the next month whilst I travel the US but I’ve got the feeling in that time I’ll get enough perspective to make some incredible changes to Lobaco and hopefully I’ll come back recharged enough to hit development with renewed vigour.

Presenting…..Geon! (and 100 posts!)

Well it may be late on a Sunday afternoon but here it is, my 100th blog post! It’s been quite a fun exercise for me and I’m hoping to bring you many more posts in the future. Hopefully they will all be interesting, but I can’t guarantee that ;). The past 7 months have seen many changes in both my personal and professional life and I feel that this blog has reflected that. I’ve been able to craft my thoughts much more succinctly after writing so much, and my spelling has definitely improved. It’s also introduced me to the wonderful world of web applications, something that I’ve stayed away from in the past. All of this would be for nothing if it wasn’t for you, my readers. I just want to say thanks for coming back day after day and reading and commenting on my site, it really does mean a lot when people care about what you have to say 🙂

As promised I have been working on something secretly in the background, and today marks its 1.0 release to the public. It’s a hacky, cobbled together web application that will form the basis of a future application that I want to develop. For now I’ll be working on it under the code name Geon, which stands for Geological Information, although the final product will be a lot more than that.

For a taste hop on over to here. It’s also available from the Geon link in The Lab. Click around, see what you think it’s supposed to do, then come on back here. If you can, write down your impressions of it before you read on; I want to see what everyone thinks about it before I mess with your perceptions with my ideas 🙂

In essence the application is part of a framework for a real time information feed based upon location. Right now it gets content from Twitter and Flickr, and additionally everyone in (roughly) the same city can talk to each other. The Flickr and Twitter buttons will bring up markers at your location, whilst clicking directly on the map will bring up Flickr pictures and Twitter posts located within that area. When you begin chatting it will start to do live updates from your area with other people who are chatting; you can disable this by unchecking the box (you’ll see why you might want to do this in a sec). You can change your user name too, the random string of numbers is mostly me being lazy and not implementing a full user database, that’s on the cards for the future.
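
For anyone wondering how the Twitter side of it works under the hood, there’s nothing fancy going on: the Twitter search API takes a geocode parameter so grabbing nearby posts is just a web request away. The sketch below is a simplified illustration rather than Geon’s actual code, and the endpoint and parameter names are as I recall them, so treat them as assumptions rather than gospel.

```csharp
using System;
using System.Globalization;
using System.Net;

class NearbyTweets
{
    // Simplified illustration of pulling geotagged tweets for a point on the map.
    // Endpoint and parameters reflect the Twitter search API as I remember it.
    static string FetchNearbyTweetsJson(double latitude, double longitude, string radius)
    {
        string url = string.Format(
            CultureInfo.InvariantCulture,
            "http://search.twitter.com/search.json?geocode={0},{1},{2}&rpp=10",
            latitude, longitude, radius);

        using (var client = new WebClient())
        {
            // Raw JSON comes back; Geon parses it and drops a marker per tweet on the map.
            return client.DownloadString(url);
        }
    }

    static void Main()
    {
        // Roughly Canberra; in Geon the coordinates come from the user's location
        // or wherever they clicked on the map.
        Console.WriteLine(FetchNearbyTweetsJson(-35.28, 149.13, "10km"));
    }
}
```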

Currently it will only return the first 10 Twitter posts but it will return all the Flickr pictures in the area. I wanted to get the chats popping up there as well for this release however I haven’t found a way to get the info windows to update dynamically; I believe this is a limitation of the API wrapper I’m using. Also if you’re chatting, any information from outside your area will probably be cleared when it next refreshes. This seems to be a fun bit of AJAX that isn’t supposed to happen, where any partial post back triggers the map to update itself.

Here’s what I think is wrong with it so far (in terms of bugs):

  • Internet Explorer doesn’t work properly. The click event handler seems to report a wildly different location in IE than it does in Firefox/Chrome. For now IE is unsupported and I’d recommend Firefox for anyone who’s having trouble using it.
  • The chat inserts new lines at the top rather than at the bottom. This is because ASP doesn’t have a clean way to put the chat messages at the bottom and keep the scroll bar there. To save everyone scrolling down whenever they post a message or when it updates I thought it best to put them at the top.
  • Live updates kill any information on the map that wasn’t added in a certain way. For some reason any partial render of the screen causes the map to think it has to do a postback too. I haven’t been able to disable this, but when you use the buttons at the bottom this information won’t be wiped. The functions are basically identical, yet I can’t get information from clicking on the map to be persistent. I’ve written to the author of the wrapper about this, we’ll see what he says.

So what’s the big idea for all this? Well what I wanted to make was an application where you could zoom in on an area and see what’s going on there. This application does most of that now but what I’m looking to do is to build in a request information section, so that anyone who’s on Geon (it will be available on mobiles….one day!) can submit pictures/text/whatever back up. I thought this would be amazing for breaking news events, as long as there were enough users of course 🙂

I’d love to hear what everyone thinks about it and what you believe would be great to add in. I’ve already got a Google Wave integration idea in the works which I’m sure everyone will like. Experience has shown me that your users are the ones who matter, so I’m opening up the floodgates for you guys to craft the direction Geon takes over the coming months.