Posts Tagged 'onlive'

The Artemis pCell: Making Interference Work For You.

It will likely come as a shock to many to find out that Australia leads the world in terms of 4G speeds, edging out many other countries by a very healthy margin. As someone who's a regular user of 4G for both business and pleasure I can attest to the fact that the speeds are phenomenal, with many of the CBD areas around Australia giving me 10~20 Mbps on a regular basis. However the speeds have noticeably degraded over time; back in the early days it wasn't unheard of to get double those speeds, even if you were on the fringes of reception. The primary factor in this is an increased user base: as the network becomes more loaded, the bandwidth available to everyone starts to head south.

There are two factors at work here, both of which influence the amount of bandwidth that a device will be able to use. The primary one is the size of the backhaul pipe on the tower, as that is the hard limit on how much traffic can pass through a particular end point. The second, and arguably just as important, factor is the number of devices versus the number of antennas on the base station, as this determines how much of the backhaul speed can be delivered to a specific device. This is what I believe has been mostly responsible for the reduction in 4G speeds I've experienced but, according to the engineers at Artemis, a new communications start-up founded by Steve Perlman (the guy behind the now defunct OnLive), that might not be the case forever.

Artemis' new system hopes to solve the latter part of the equation not by eliminating signal interference, which is by definition impossible, but by utilizing it to create pCells (personal cells) that are unique to each and every device present on their network. According to Perlman this would allow an unlimited number of devices to coexist in the same area and yet still receive the same amount of signal and bandwidth as if each were on the network all by itself. Whilst he hasn't divulged exactly how this is done yet he has revealed enough for us to get a good idea about how it functions and I have to say it's quite impressive.

The base stations you see in the picture above are only a small part of the equation; indeed from what I've read they're not much different to a traditional base station under the hood. The magic comes in the form of the calculations that are done prior to the signal being sent out: instead of blindly broadcasting (like current cell towers do), the system uses the location of your handset, and of every other device connected to the local pCell network, to determine how the signals should be sent out. This manifests as a signal that's coherent only at the location of your handset, giving you the full amount of signal bandwidth regardless of how many other devices are nearby.
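Artemis haven't published their algorithm, but the general idea of making signals add up coherently only at a chosen spot is textbook multi-user precoding. Below is a minimal NumPy sketch of that idea using an idealised free-space channel and a zero-forcing precoder; the antenna counts, positions and carrier frequency are all assumptions for illustration, not anything Artemis have disclosed.

```python
import numpy as np

# Toy sketch of location-aware precoding (NOT Artemis' actual algorithm):
# pick antenna weights so each handset's data stream adds up coherently
# only at that handset's location and cancels at the other handsets.

c = 3e8                     # speed of light (m/s)
f = 2.6e9                   # assumed 4G-ish carrier frequency (Hz)
wavelength = c / f

rng = np.random.default_rng(42)
antennas = rng.uniform(0, 500, size=(8, 2))   # 8 base-station antennas (x, y) in metres
handsets = rng.uniform(0, 500, size=(3, 2))   # 3 handsets sharing the same spectrum

# Idealised free-space channel: phase rotation and attenuation with distance.
dists = np.linalg.norm(handsets[:, None, :] - antennas[None, :, :], axis=2)
H = np.exp(-2j * np.pi * dists / wavelength) / dists   # shape: (handsets, antennas)

# Zero-forcing precoder: H @ W is (approximately) the identity matrix,
# so each handset hears only its own symbol.
W = H.conj().T @ np.linalg.inv(H @ H.conj().T)

symbols = np.array([1 + 0j, -1 + 0j, 1j])     # one symbol per handset
emitted = W @ symbols                         # what the antennas actually transmit
received = H @ emitted                        # what each handset hears

print(np.round(received, 3))                  # ~[ 1.-0.j, -1.+0.j, 0.+1.j ]
```

The same machinery is why interference stops being the enemy here: the other handsets' streams are deliberately steered so they sum to nothing at your location, rather than being avoided altogether.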

I did enough communications and signal processing at university to know something like this is possible (indeed a similar kind of technology powers "sound lasers") and it could well work in practice. The challenges facing this technology are many but from a technical standpoint there are two major ones I can see. Firstly it doesn't solve the backhaul bandwidth issue, meaning that there's still an upper limit on how much data can be passed through a tower, regardless of how good the signal is. For a place like Australia this would be easily solved by implementing a full fibre network which, unfortunately, seems to be off the cards currently. The second problem is more nuanced and has to do with the calculations required and the potential impacts they might have on the network.

Creating these kinds of signals, ones that are only coherent at a specific location, requires a fair bit of back-end calculation before the signal can be sent out. The more devices you have in any particular area the more challenging this becomes and the longer it will take to calculate before the signal can be generated. This has the potential to introduce lag into the network, something that might be somewhat tolerable from a data perspective but is intolerable when it comes to voice transmission. To their credit Artemis acknowledges this challenge and has stated that their system can handle up to 100 devices currently, so it will be very interesting to see if it can scale out like they believe it can.
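To put some rough shape on that concern: if the precoder is anything like the linear one sketched above, every channel update means building and inverting a matrix whose size grows with the number of devices, and that work has to be redone every few milliseconds as people move around. This is purely an extrapolation from generic precoding maths, not a claim about Artemis' actual implementation:

```python
import time
import numpy as np

# How the per-update cost of a generic linear precoder grows with device count.
# Assumes twice as many antennas as devices; real deployments will differ.
for k in (10, 50, 100, 200):
    H = np.random.randn(k, 2 * k) + 1j * np.random.randn(k, 2 * k)
    start = time.perf_counter()
    W = H.conj().T @ np.linalg.inv(H @ H.conj().T)   # ~O(k^3) in the inversion
    print(f"{k:4d} devices: {1000 * (time.perf_counter() - start):7.2f} ms per precoder update")
```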

Of course this all hinges on the incumbent cellular providers getting on board with this technology, something which a few have already said they're aware of, though they haven't gone much further than that. If it works as advertised then it's definitely a disruptive technology, one that I believe should be adopted everywhere, but large companies tend to shy away from things like this, which could strongly hamper adoption. Still this tech could have wide-reaching applications outside the mobile arena as things like municipal wireless could also use it to their advantage. Whether it will see application there, or anywhere for that matter, will be something to watch out for.

 

Cloud Enhanced Gaming is a Stupendously Bad Idea.

The advent of cloud computing, or more generally the commoditization of computer infrastructure, has provided us with capabilities that few could have accurately predicted. Indeed the explosive growth in the high tech sector can be substantially attributed to the fact that businesses no longer require heavy capital injections in order to validate their ideas, allowing many ideas which wouldn't have been viable 5 years ago to flourish today. Of course this has also led to everyone seeking to apply the ideals of cloud computing wherever they can, hoping it can be the panacea to their ills. One such place is the world of gaming and in all honesty the ways in which it's being used are at best misguided, with most solutions opening us up to a world of hurt not too far down the track.

I've gone on record saying that I don't believe the general idea of Cloud Gaming, whereby a service runs hardware in a central location and users connect to it with a streaming device, is viable. The problem comes from the requirements placed on that infrastructure, specifically the requirement for low latency, which means a user can't be too far away from the equipment. That would mean that for it to have global reach it would likely need some kind of hardware in all capital cities, which would be a rather capital intensive exercise. At the same time the consolidation ratios for gaming level hardware aren't particularly great at the moment, although that may change in the future with both NVIDIA and AMD working on cloud GPU solutions. Still the fact that OnLive, a once $1 billion company, failed to make the idea feasible says a lot about it.

That hasn't stopped companies from attempting to integrate the cloud through other avenues, something which I've come to call Cloud Enhanced gaming. This is where a game offloads its less latency-sensitive aspects to servers elsewhere so they can do the calculations, sending the results back down the wire. In theory this allows you to make your game better as you don't have to worry about the limitations of the platform you're running on, using the local grunt for pretty graphics while all the heavy lifting is done offsite. The latest entrant into this arena is Square Enix's Project Flare, which they're marketing as a technological breakthrough in cloud gaming.

On the surface it sounds like a great idea; consoles would no longer suffer from their hardware limitations and thus would remain viable for much longer than they have in the past. Indeed for a developer that's looking to do something that's outside a console's capabilities, offloading processing into the cloud would seem to be the only way to accomplish it should they want to use a specific platform over the alternatives. However doing so binds that game to that backend infrastructure, which means the game's life is only as long as the servers that power it. Considering the numerous examples we've had recently of game servers and services disappearing (including the infamous Games for Windows Live), the effect of turning off an integral part of the game would be far worse and likely without an easy path for remediation.

The reason why this would be such a big issue is that, when compared to traditional game server infrastructure, the requirements for a cloud enhanced game are much, much greater. You can happily run dozens of virtual servers that service thousands of clients from a single dedicated box; try to run physics calculations (like in one of the Project Flare demos), however, and the number of people you can service per server drops dramatically. This means the time in which those servers remain fiscally viable is dramatically reduced and it's far more likely that the service will cease to exist much sooner than other game servers would. Moore's Law goes a little way towards remedying this but you can't really get past the fact that the consolidation ratios achievable with this are a couple of orders of magnitude lower than what developers have traditionally come to expect.
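To make that concrete, here's a back-of-envelope comparison using entirely made-up but plausible figures (none of these numbers come from Square Enix or anyone else); the point is the shape of the gap, not the exact dollars:

```python
# Hypothetical fully-loaded cost of running one server, in USD per month.
server_cost_per_month = 500.0

# Assumed consolidation ratios: players served per physical box.
players_per_box = {
    "traditional game servers": 2000,   # lightweight state/matchmaking workloads
    "cloud-enhanced (offloaded physics)": 20,
}

for label, players in players_per_box.items():
    print(f"{label}: ${server_cost_per_month / players:.2f} per player per month")

# traditional game servers: $0.25 per player per month
# cloud-enhanced (offloaded physics): $25.00 per player per month
# Two orders of magnitude apart, which is why the servers stop paying
# for themselves so much sooner.
```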

This is not to mention how the system will handle poor Internet connections or overloaded servers, something which is guaranteed to happen with more popular titles. Whilst it's not an unsolvable problem it's definitely something that will lead to sub-par gaming experiences, as the two most likely approaches (stopping the game to wait for the calculations to arrive or simply not simulating them at all) will be anything but seamless. I'm sure it could be improved over time, however the way this is marketed makes it sound like they want to do a lot of computation elsewhere so the console graphics can be a lot prettier, leaving not a whole lot of wiggle room when the inevitable happens.

Whilst this idea is far more feasible than running the entire game environment on a server it's still a long way from being a viable service. It's commendable that Square Enix are looking for ways to make their games better, removing the restrictions of the platforms that the majority have chosen, however I can't help but feel it's going to come around to bite them, and by extension us, in the ass in the not too distant future. As always I'd love to be proven wrong on this but the fact is that farming out core game calculations means that the game's life is tied to that service and once it's gone there's nothing you can do to restore it.

The Viability of Cloud Gaming.

The idea of cloud gaming is a seductive one, especially for those of us who lived through the times when upgrading your computer every 12 months was a requirement if you didn't want to be watching a slide show. Abstracting the hardware requirement away from the user and then letting them play on any device above a certain, extremely low threshold would appear to be the solution to the upgrade and availability issues of dedicated gaming platforms. I've long made the case that the end product is something of a niche market, one that I was never quite sure would be viable on a large scale. With the demise of OnLive I could very easily make my point based around that but you can never write off an industry on the failures of the first to market (see Iridium Communications for proof of this).

Providing even a small cloud gaming service requires some rather massive investments in capital expenditure, especially with the hardware that's available today. For OnLive this meant that each of their servers could only serve one user at a time, which was terrible from a scalability standpoint as they could never service that many customers without bleeding money on infrastructure. Cloud gaming services of the future might be in luck however, as both NVIDIA and AMD are working on cloud GPUs that will enable much higher densities than the current 1 to 1 ratio. There'll still be an upper limit that's much lower than most cloud services (which typically serve thousands per server) but at the very least the scalability problem is now an engineering issue rather than a capital one.

The second major challenge that cloud gaming companies face is how latency-sensitive a good portion of the games market is. Whilst you can get down to very low latency numbers with strategically placed servers you're still going to be adding a good chunk of input lag on top of any server latency, which will be unacceptable for a lot of games. Sure there are titles where this won't be an issue but cutting off a large section of the market (FPS, RTS, RPGs and any mix of them in between) further reduces the viability of any potential cloud gaming service.
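As a rough illustration of where that extra lag comes from, here's a hypothetical input-to-photon budget for a streamed frame; every figure is an assumption for the sake of the example rather than a measurement of OnLive or anyone else:

```python
# Hypothetical added latency for one streamed frame, in milliseconds.
budget_ms = {
    "input capture + uplink to data centre": 20,
    "game simulation + render (one 60 fps frame)": 16,
    "video encode": 10,
    "downlink + jitter buffer": 20,
    "decode + display": 15,
}

total = sum(budget_ms.values())
print(f"added input-to-photon lag: ~{total} ms")
# ~81 ms on top of whatever lag a local setup already has: fine for a
# turn-based game, very noticeable in a twitch shooter.
```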

In fact many of the titles that could benefit from a cloud gaming service can already be ported to the web thanks to things like Unity or the use of OpenGL extensions in HTML5. Indeed many of the games that I could see being published on a cloud platform (casual MMORPGs, turn based strategy games, etc.) wouldn't be much different if they were brought to the traditional web instead. Sure you lose some of the platform agnosticism because of this but you can arguably reach the same number of people that way as you could with a cloud platform.

User expectations are also set rather high for cloud services, with many of them being flat fee, unlimited usage scenarios (think Pandora, Netflix, etc). The current business models for cloud gaming don't gel well with this mindset as you're paying for the games you want to play (often cheaper than retail, sometimes not) for a limited period of time, akin to a long term rental. Whilst this works for some people, most users will expect to be able to pay a flat fee in order to access a catalogue they can then use at their leisure, and this has significant ramifications for how publishers and developers will license their games to cloud providers. It's not an insurmountable problem (the music industry came around eventually so the games industry can't be far behind) but it does introduce a market dynamic that cloud gaming services have not yet investigated.

With all these things considered I find it hard to see how cloud gaming services can be viable in the near term: whilst all the issues are solvable, they all work against delivering something that can turn a profit. Cloud GPUs, the ever increasing quality of Internet connections and the desire by many to migrate completely to cloud based services do mean there's a trend towards cloud gaming services becoming viable in the future, however the other, more fundamental limitations could see those pressures rendered null and void. This is something I'm willing to be proven wrong on though as I've invested myself heavily in cloud principles and I know they're capable of great things. Whether cloudifying our gaming experience is one of them remains to be seen; I don't believe it's currently feasible and I don't see that changing for a while.

Seems OnLive Couldn’t Handle Being a Niche Product.

It's no secret that I've never been much of a fan of the OnLive service. Whilst my initial scepticism came from my roots as someone who didn't have decent Internet for the vast majority of his life (while everyone else in the world seemed to), since then I've seen fundamental problems with the service that I felt would severely hamper adoption. Primarily it was the capital heavy nature of the beast, requiring a large number of high end gaming PCs to be always on and available even when there was little demand for them. That and the input lag issue would have made many games (FPS being the most prominent genre) nearly unplayable, at least in my mind. Still I never truly believed that OnLive would struggle that much as there definitely seemed to be a lot of people eager to use the service.

For once though I may have been right.

OnLive might have been a rather capital intensive idea but it didn't take long for them to build out a company that was getting valued in the $1 billion range, no small feat by any stretch of the imagination. It was at that point that I started doubting my earlier suspicions, as that level of value doesn't come without some solid financials behind it, but it seems that since that dizzying high (and most likely in reaction to Sony's acquisition of their competitor Gaikai for much less than that) they only had one place to go and that was down:

We’re hearing from a reliable source that OnLive’s founder and CEO Steve Perlman finally decided to make an exit — and in the process, is screwing the employees who helped build the company and brand. The cloud gaming company reportedly had several suitors over the last few years (perhaps including Microsoft) but Perlman reportedly held tight control over the company, apparently not wanting to sell or share any of OnLive’s secret sauce.

Our source tells us that the buyer wants all of OnLive’s assets — the intellectual property, branding, and likely patents — but the plan is to keep the gaming company up and running. However, OnLive management cleaned house today, reportedly firing nearly the entire staff, and we hear it was done just to reduce the company’s liability, thus reducing employee equity to practically zero. Yeah, it’s a massive dick move.

We've seen this kind of behaviour before in companies like the ill-fated MySpace and whilst the company will say many things about why they're doing it, essentially it makes the acquisition a lot more attractive for the buyer due to the lower ongoing costs. Whoever this well-funded venture capitalist is, they don't seem to be particularly interested in the company of OnLive itself, more the IP and the massive amount of infrastructure that they've built up over the course of the last 3 years. No matter how the service is doing financially those things have some intrinsic value behind them and although the new mysterious backer has committed to keeping the service running I'm not sure how much faith can be put in those words.

Granted there are services that were so costly to build that the initial companies who built them folded, but the subsequent owner who acquired everything at a fire sale price went on to make a very profitable service (see Iridium Communications for a real world example of this). However the figures that we've been seeing on OnLive's numbers since this story broke don't paint a particularly rosy picture for the health of the service. When you have a fleet of 8,000 servers servicing at most 1,600 users that doesn't seem sustainable by any way that I can think of, unless the users are paying through the nose for the service (which they're not, unfortunately). It's possible that the massive amount of lay-offs coupled with a reduction in their current infrastructure base might see OnLive become a profitable enterprise once again but I'll have to say that I'm still sceptical.
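Those reported figures alone tell most of the story; here's the quick arithmetic, with the per-server running cost being my own guess purely for illustration:

```python
servers = 8000
peak_users = 1600                      # "at most 1600 users" from the reports

assumed_cost_per_server_month = 300.0  # my guess: power, space, amortised hardware (USD)

print(f"servers per peak user: {servers / peak_users:.0f}")
print(f"infrastructure cost per peak user: "
      f"${servers * assumed_cost_per_server_month / peak_users:,.0f} per month")
# 5 servers sitting behind every peak user and roughly $1,500/month of
# infrastructure each: numbers that a handful of $5-$10 PlayPasses can't cover.
```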

Apart from the monthly access fee requirement being dropped, none of the issues that I and countless other gamers have highlighted have been addressed, and their niche of people who want to play high end games without the cost (and don't own a console) just isn't big enough to support their idea. I could see something like this service being an also-ran for a large company, much like Sony is planning to do with Gaikai, but as a stand alone enterprise the costs of establishing the required infrastructure to get the required user base are just too high. This is not even touching on the input lag or the ownership/DRM issues either, both of which have been shown to be deal breakers for many gamers contemplating the service.

It's a bit of a shame really as whilst I love being right about these things I'd much rather be proven wrong, especially when it comes to non-traditional ideas like OnLive. It's entirely possible that their new benefactor could turn things around for them but they haven't done a lot to endear themselves to the public and their current employees, so their battle is going to be very much uphill from now on. I'm still willing to be proven wrong on this idea but as time goes on it seems less and less likely that it'll happen, and that's a terrible thing for my already inflated ego.

Steam Box? Hmmm, Not Sure If Want…

Today the platform of choice for the vast majority of gamers is the console, there's really no question about it. Whilst video games may have found their feet on PCs, consoles took them to the next level, offering a consistent user experience that expanded the potential market greatly. PC gaming however is far from dead and has even been growing despite the heavy competition that it faces from consoles. However the idea of providing a consistent user experience whilst maintaining the PC's flexibility is an enticing one and there are several companies that are attempting to fuse the best elements of both platforms in the hopes of capturing both markets.

OnLive is one such company. Their product is, in essence, PC gaming as a service (PCGAAS?) and seeks to alleviate the troubles some gamers used to face with the constant upgrade cycle. I was sceptical of the idea initially as their target demographic seemed quite small but here we are 2 years later and they're still around, even expanding their operations beyond the USA. Still the limitations on the service (the high bandwidth requirement being chief amongst them) mean that whilst OnLive might provide a consistent experience on par with that of consoles, the service will likely never see the mainstream success that the 3 major consoles do.

Rumours have been circulating recently that Valve may take a stab at this problem, taking the best parts of the PC experience and distilling them down into a console, creating a new platform called the Steam Box:

According to sources, the company has been working on a hardware spec and associated software which would make up the backbone of a “Steam Box.” The actual devices may be made by a variety of partners, and the software would be readily available to any company that wants to get in the game.

Adding fuel to that fire is a rumor that the Alienware X51 may have been designed with an early spec of the system in mind, and will be retroactively upgradable to the software.

Indeed there's enough circumstantial evidence to give some credence to these rumours. Valve applied for a patent on a controller back in 2009, one that had a pretty interesting twist to it: the controller would be modular, allowing the user to modify it, and those modifications would be detected by the controller. Such an idea fits pretty well with the PC/console hybrid that the Steam Box is likely to be. It would also enable a wider selection of titles to be available on the Steam Box as not all games lend themselves well to the traditional two-joystick console controller standard.

At the same time one of Valve's employees, Greg Coomer, has been tweeting about a project that he's working on that looks suspiciously like some kind of set top box. Now Valve doesn't sell hardware, they're a games company at heart, so why someone at Valve would be working on such a project does raise some questions. Further, the screenshot of the potential Steam Box shows what looks to be an Xbox 360 controller in the background. It's entirely possible that such a rig was being used as a lightweight demo box for Valve to use at trade shows, but it does seem awfully coincidental.

For what it's worth the idea of a Steam Box could have some legs to it. Gone are the days when a constant upgrade cycle was required to play the latest games, mostly thanks to the consolisation of the games market. What this means is that a modern day gaming PC has longevity rivalling that of most consoles. Hell, even my last full upgrade lasted almost 3 years before I replaced it and even then I didn't actually need to replace it; I just wanted to. A small, well designed PC could then function much like a console in that regard and you could even make optimised compilers for it to further increase its longevity.

The Steam Box could also leverage the fact that many PC titles, apart from things like RTS, lend themselves quite well to the controller format. In fact much of Steam's current catalogue is only a short modification away from being controller ready and some titles are set up for controller use already. The Steam Box would then come out of the box with thousands of titles ready for it, something that few platforms can lay claim to. It may not draw the current Steam crowd away from their PCs but it would be an awfully attractive option for someone who was looking to upgrade but didn't want to go through the hassle of building/researching their own box.

Of course this is all hearsay at the moment but I think there could be something to this idea. It might not reach the same market penetration as any of the major consoles but there’s a definite niche in there that would be well served by something like this. What remains to be seen now is a) whether or not this thing is actually real and b) how the market reacts should Valve actually announce said device. If the rumours are anything to go by we may not have to wait too long to find both of those things out.

Touché OnLive.

OnLive and I have a very strange relationship. In the beginning I thought it was a potential money winner that would be hamstrung by the company’s desire to monetise their service from the get go. 9 months later I changed my tune somewhat when they announced that they’d be offering free trials to a decent handful of people and believed that the service could survive as a niche service for city dwelling casual gamers. I started to come around to the idea in its entirety when one of its competitors demoed World of Warcraft running on the iPad, something which I thought could easily be a common use case for their target market. With almost one and a half years separating my first post on them and today’s entry I have to say back then I didn’t expect them to come as far as they have today, nor for them to go in the direction they have.

My initial complaints about the service having a monthly fee were probably the biggest sticking point for many potential users. Having to pay US$15 per month to access the games (which you also have to buy) is something people just aren't comfortable doing when digital distribution platforms like Steam do it for free. They won my approval when they offered quite a few people free trials that extended past a year, which I believed would help get them the critical mass of users they needed in order to be attractive to their investors. In reality the opposite was true since free users won't necessarily migrate to a paid product, but paying customers are paying customers, ensuring that you not only have a viable product but also a viable market.

I really hadn’t heard anything more about the service until yesterday when I stumbled across one of their blog posts that detailed something quite extraordinary:

It’s official: There will be no base monthly fee for the OnLive Game Service going forward. WOOT!

Free Instant-play Demos, Free Massive Spectating, Free Brag Clip™ videos, messaging, friending.

No credit card needed, unless you decide to buy a 3-day, 5-day or Full PlayPass. And ongoing access with no monthly fee. Of course, we’ve had a promotion waiving the monthly fee for the first year, so this announcement is confirming what we had hoped—that we can continue without a monthly fee beyond the first year. Although we wish we could have confirmed no monthly fee from the get-go, pioneering a major new video game paradigm is hard: we had to first grow to a large base of regular users before we could understand usage patterns and operating costs. Now that we’ve reached that stage, we can confidently say a monthly fee is not needed, which deserves a double WOOT! WOOT!

I must say it really took me by surprise when I read that. Knowing that video streaming services are extremely bandwidth intensive and highly unprofitable (YouTube still isn't profitable) I struggled to see how they could make a decent amount of money without charging monthly access fees. OnLive of course knows their finances better than anyone and it appears that the monthly fees were just a temporary measure to get them over that initial hump of users required to get them a steady stream of funding from their primary market: game sales. I hadn't really looked into how they were doing this but having a quick look around their website I can see where the potential revenue is coming from.

For most games there are 3 different purchase options. The 3 and 5 day play passes let you play the game in question for that amount of time from the day you purchase it. This isn't game time mind you, so it's more like you're renting that game for 3 or 5 days. The last option is the full play pass which allows you to play the game for as long as it is available on OnLive's servers. They state in their support section that all games will be supported for a minimum of 3 years from the point they're first available, so in essence even the full pass is still a rental, just one with an uncertain end date.

The 3 and 5 day passes seem to be reasonably priced, with the most expensive of them being $6 and $9 respectively. For many throwaway games that you'll only ever play once this is a pretty reasonable price and would open up quite a few games to those who'd traditionally shy away from them because of the cost. It's akin to Netflix's idea of taking the pain out of renting by letting you conduct the entire process from your home. In my opinion this is where OnLive will draw most of its sales as that's the area where the service shines. The full play pass however is riddled with problems.

For starters the full play passes aren't universally cheaper than their digital download counterparts, with many of them being the same price or higher, for example:

  • Prince of Persia: The Forgotten Sands ($49.99 OnLive, $39.99 Steam)
  • Kane and Lynch 2: Dog Days ($49.99 on both)
  • Defense Grid Gold ($13.99 on both, assuming “Gold” means all the additional packs)

This also doesn’t take into account any multi-pack sales that Steam is famous for.

Sure I can understand the point that you're paying for the ability to play a game anywhere and thus the costs aren't really comparable, but anyone with a machine less than 3 years old (my current one is 2) could easily play any of these games without needing OnLive anyway. This is due solely to the consolisation of PC games and won't be changing anytime in the foreseeable future. Thus whilst you do gain flexibility from buying these games on OnLive you can only guarantee them to be there for 3 years and once they decide to stop supporting a title you're plain out of luck. There's no way to download your purchase once they've decided to flip the kill switch, effectively ending your ability to use your purchase forever.

This is the one aspect of OnLive that I absolutely detest: it's the ultimate DRM that game publishing companies have been salivating over for years. Users of OnLive can't trade their games with friends nor sell them to a used game shop in order to buy additional games. Effectively this turns all game “purchases” on OnLive into rentals under the control of the game publishers and the OnLive service, stripping away any freedom the end user might once have had. It is purely based on this fact that I will never, ever buy a full play pass from OnLive and will be extremely hesitant to use it for anything save reviewing the service itself, as I cannot condone this kind of behaviour from any corporation.

OnLive at its heart is a brilliant idea to bring gaming to those who can't afford the time or monetary investment to stay on the cutting edge but still have a desire to. However every time I find something to love about the service I find yet another thing to hate about it and, as it stands today, I cannot recommend it for anything past renting a throwaway game. The core ideas are solid and should OnLive make an effort to improve their service, say by letting you download full purchases through their client, then I'd have no trouble recommending them. For now though I'll have to abstain from what is the worst form of DRM I've ever encountered and hope that everyone else will do the same.

OnLive Might Just Catch On.

Ah the cloud, it seems to be the catch all for any problem that you might have had with your computer since the day it was invented. Need your files wherever you go? Put it in the cloud! Want to sync your personal data across all your devices? Put it in the cloud! Does your hair not have enough body and lift? Get some better shampoo, since the cloud probably isn’t the answer to that one. Still there are some interesting ideas that just so happen to be cloud based and one of those, that I’ve covered a couple times previously, is OnLive. A curious service that aims to bring high end gaming to those on a budget, all for the low low cost of $14.95 per month (plus game costs).

Now whilst I haven't been a huge fan of the idea I did muse that it had its place, albeit in a somewhat niche capacity which limited its appeal. Still this hasn't stopped them from inking deals with big names like British Telecommunications to bring their product to a much wider audience. From what I've seen there's still a significant amount of work required before they hit all the platforms they were talking about (computing appliances, like the iPad) and there are still some issues they won't be able to innovate away (input lag for instance). Given time and their obvious sway with investors I'm sure any problem that can be solved will be solved eventually, hopefully driving up the market adoption they'll need to keep their heads above water.

There really hasn't been that much said about OnLive in recent months, mostly because the initial trials have been done and now the only thing people are interested in is when they can give it a go. Turns out that might be sooner than we thought, thanks to this little tidbit of news:

Smart move by OnLive today. The controversial streaming game service is offering to waive the $14.95 monthly access fee for a full year (originally it was 3 months) for anyone who enthusiastically pre-registered early — many of you we suspect. It’s even tossing in a coupon for a free game when you register for the offer. The only catch seems to be the credit card required to complete registration as proof that you’re over 18. If you didn’t pre-register then tough luck, no offer for you. But at least you can take comfort in knowing that a small army of gamers will be taking the service to task unencumbered by membership fees. In other words, we’ll know right quickly if OnLive can live up to its “ultra high-performance” streaming gameplay on entry-level PCs and Macs.

I’d previously criticized OnLive for attempting to charge for their service from the get go, saying it would stifle adoption rates. Whilst this offer is really only valid for a very small subset of people (read: those who can actually get the darn service) it does mean there will be 25,000 people on the service in its early days functioning as free beta testers. The offer of a free game confirms this since that means everyone will have at least something to play on the service for their free 12 months. It will be interesting to see what the retention rates will be like after the initial 12 months, since I’m pretty sure that if OnLive isn’t up to par it will be dropped completely when they start asking for your credit card.

My assessment of OnLive being suited to “casual, city dwelling gamers” still seems to ring true 4 months on and, when coupled with some recent developments, I'm even more sure of it. Whilst I'm loath to point to the iPad as a potential source of innovation (ugh, I feel dirty already) the casual gamer, to whom the OnLive service would be highly appropriate, is in my opinion much more likely to have a device like the iPad. The reasoning behind this is simple: most casual games don't need a high end machine and most casuals would rather use a device like an iPad or netbook since they're cheaper and far more portable. The iPad is the more likely of the two, mostly thanks to the brand power that Apple commands and the fact that it has been marketed directly as a casual computing device. If you then also consider that those who are buying a product like that are more likely to have the disposable income required to pay for such a service, the iPad becomes a pretty powerful gaming device for those who like to game but don't want to bother messing around with a full sized machine.

I really hadn't considered this viewpoint until I came across a recent article about one of OnLive's competitors, Gaikai, which was mentioned in the same breath as World of Warcraft running on the iPad. Now whilst that might just seem like a pointless waste of time (and in fact I can't confirm that it actually works) it's actually quite a smart move by Gaikai. You see, of the 12 million-ish subscribers to World of Warcraft, the vast majority would identify themselves as casual players¹. For them playing on an iPad would probably be quite preferable to sitting at the computer and the bonus would be that they could play all the other games they have on there as well. So whilst OnLive might still be a niche, they might just have had a huge gust of wind put in their sails by Apple.

For me personally I’ll probably never have a use for a service like this. I get far too much enjoyment out of building up a really good gaming rig and then putting it through its paces, savouring the moments when I can crank all the slider bars up to “EXTREME”. Still I’m beginning to realise that even though a market might not yet exist for something there’s the potential for someone to create it, and OnLive seems to be doing a good job of developing theirs. Time will tell if they have enough staying power to be the best and fend off their imitators, but that’s what capitalism is all about right? 😉

Now I wonder how long it will take them to release it in Australia…. I’m not going to hold my breath over that one.

¹ I tried to find a good source on this as I remember a survey being done some time ago showing the breakdown of play times and amount of content completed. From memory it was something on the order of 6% of players identifying as hardcore players and the rest identifying with something along the lines of casual, semi-casual or casual hardcore. Doing some quick numbers there are approximately 6,100 guilds that have “finished” the current content patch (i.e. defeated the last boss in the current endgame encounter) which, at roughly a 25-man raid roster per guild, gives you about 153,000 players I'd consider “hardcore”, or about 1.3% of the total population. That's a wild guess though and should be taken as such.
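For what it's worth, the arithmetic behind that 1.3% is just this (the 25 raiders per guild is my reading of the assumption, not a published figure):

```python
guilds_cleared = 6100          # guilds that have "finished" the current content
raiders_per_guild = 25         # assumed: one full 25-man raid roster each
subscribers = 12_000_000       # the oft-quoted WoW subscriber count

hardcore = guilds_cleared * raiders_per_guild
print(hardcore, f"({hardcore / subscribers:.1%} of subscribers)")   # 152500 (1.3%)
```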

OnLive: Can They Handle Being a Niche?

You might recall a while back me ranting about Cloud Computing and how it was just an idea that died a long time ago but managed to resurrect itself under a flashy web 2.0 name. In that post I made a passing mention of a gaming service called OnLive which promised to deliver high definition gaming experiences to any platform capable of streaming video over the Internet. Although I really didn't mention it in that post I was pretty skeptical that it could deliver on any of its promises and had many conversations with my gamer pals along those lines. Still they had opened their service up for a closed beta to carefully selected people (most notably only in the US) but details had been scant. That was until one of the guys at PC Perspective managed to wrangle himself a login:

Of course things aren’t always as easy as they seem.  Immediately after the 2009 announcement technology and game journalists began to wonder how the game service could work as easily and as effortlessly as OnLive claimed.  By far the most troubling question was regarding latency – how would a service like OnLive deal with the input latency (time between data leaving your PC and arriving at the data center) of a mouse, keyboard or controller?  With as much as 100 ms of delay between servers on the Internet, that is a potentially long time between your mouse movement and your mouse movement appearing on screen.

Well, obviously looking for answers, I found a login for the closed OnLive beta and decided to sit down for a couple of weeks and give the service a thorough evaluation.  In this article we’ll look at both the ease of use of the service as well as the real-world experience of playing a few of the games.  I think you will find the results to be interesting!

Indeed the results were interesting and I encourage you to follow the link above and read through the article in its entirety. He raises some good points and also highlights what the big road blocks are for the service. There was one thing that he didn't end up mentioning though, and that was the business model that OnLive is going to be relying on.

For game publishers OnLive is a dream come true. No longer are gamers buying physical copies of their games, which have that nasty effect of generating the second hand market they can't profit from (not for lack of trying, however) and are also rife with piracy. Instead you're only renting a copy of the game and the second you stop paying, you stop playing. It has the effect of turning a one-off sale into a continuing revenue stream, much like an MMO but without the continual investment in providing new content. You can see why nearly every major publisher has jumped on the OnLive bandwagon; it's a huge potential cash cow.

However the problems that Shrout notes in his review of the OnLive service are real threats to their bottom line. For instance, let us assume that their service works flawlessly provided you're within a certain range of the data center. That range limit shrinks the potential customer base substantially since, although Internet access is pervasive amongst the gamer community, not all of them are within a short distance of a data center. There's still a large potential market of people who are (namely any city with a population over 100,000) but this still requires that OnLive servers be installed at these locations, and here's where the problems start to arise.

With any new installation there's going to be an overhead of minimum equipment required to provide the OnLive service. This rules out most of the smaller cities since they won't be able to guarantee enough subscribers to justify the install costs. As such it would appear that OnLive will be limited to medium and large cities with a population big enough to guarantee the minimum number of subscribers that make the installation viable.

There's also the fact that the service really only appeals to the casual gaming crowd. Sure I'd love to be free of the upgrade cycle but if I have to deal with input lag, blocky compression and a continuing fee to access the games I want, suddenly buying my own PC capable of playing those games doesn't seem like so much of a hassle. Casual gamers on the other hand would rather just be able to play the game and would be less concerned about the issues I mentioned above.

So in the end the target audience for OnLive is the casual, city dwelling gamer and to be honest most of them are pretty satisfied with their consoles or PopCap game collections. Don't get me wrong, there are definitely people out there who would use and love the service, however I keep getting the feeling that the idea of OnLive somehow revolutionizing the way we play games is just plain marketing hyperbole. But then again I guess that's what all good marketing companies do when they're pushing a product that's completely different from anything else that's been offered before.

The real question then becomes: can OnLive survive and profit from this niche? Only time will tell. With our gaming rigs lasting a lot longer due to the console revolution most gamers aren't too fussed when their rig needs an upgrade. Couple that with the average age of a gamer being somewhere in their early 30s, with a much larger disposable income, and the advent of digital distribution and you're looking at a market that doesn't really need the services OnLive provides. They may attract enough of a crowd to continue on for as long as they need to but I doubt they'll ever become the pervasive service that they were initially marketed to be.