The advent of cloud computing, or more generally the commoditization of computer infrastructure, has provided us with capabilities that few could have accurately predicted. Indeed, the explosive growth in the high tech sector can be substantially attributed to the fact that businesses no longer require heavy capital injections to validate their ideas, allowing many ideas that wouldn't have been viable 5 years ago to flourish today. Of course this has also led to everyone seeking to apply the ideals of cloud computing wherever they can, hoping it will be the panacea to their ills. One such place is the world of gaming and, in all honesty, the ways in which it's being used there are at best misguided, with most solutions opening us up to a world of hurt not too far down the track.
I’ve gone on record saying that I don’t believe the general idea of Cloud Gaming, whereby a service runs hardware in a central location and users connect to it with a streaming device, is viable. The problem comes from the requirements placed on that infrastructure, specifically the requirement for low latency, which means a user can’t be too far away from the equipment. For the idea to have global reach it would likely need hardware in every capital city, a rather capital intensive exercise. At the same time the consolidation ratios for gaming level hardware aren’t particularly great at the moment, although that may change in the future with both NVIDIA and AMD working on cloud GPU solutions. Still, the fact that OnLive, a company once valued at $1 billion, failed to make the idea feasible says a lot about it.
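To see why distance matters so much, it helps to do the back-of-envelope maths on propagation delay alone. The sketch below uses rough, commonly cited figures (light in fibre travels at about two thirds of c, so roughly 200 km per millisecond); the numbers are illustrative assumptions, not measurements from any particular service.

```python
# Rough latency-budget sketch for cloud gaming. Figures are approximations:
# signals in optical fibre cover roughly 200 km per millisecond.

def min_round_trip_ms(distance_km: float) -> float:
    """Best-case round trip time from fibre propagation delay alone."""
    fibre_km_per_ms = 200  # ~2/3 the speed of light in a vacuum
    return 2 * distance_km / fibre_km_per_ms

# A 60 fps game has a ~16.7 ms frame budget; once you add routing, encoding
# and decoding on top, the datacentre can't be more than a few hundred
# kilometres away before input lag becomes obvious.
for distance in (100, 500, 2000):
    print(f"{distance} km away: >= {min_round_trip_ms(distance):.1f} ms RTT")
```

Real round trips are of course worse than this floor (routers, encoding and the last mile all add delay), which is exactly why a streaming service ends up needing hardware near every population centre it serves.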
That hasn’t stopped companies from attempting to integrate the cloud through other avenues, something I’ve come to call Cloud Enhanced gaming. This is where a game offloads its less latency sensitive aspects to servers elsewhere, which do the calculations and send the results back down the wire. In theory this lets you make your game better as you don’t have to worry about the limitations of the platform you’re running on, keeping the local grunt for pretty graphics while the heavy lifting is done offsite. The latest entrant into this arena is Square-Enix’s Project Flare, which they’re marketing as a technological breakthrough in cloud gaming.
On the surface it sounds like a great idea: consoles would no longer suffer from their hardware limitations and would thus remain viable for much longer than they have in the past. Indeed, for a developer looking to do something outside a console’s capabilities, offloading processing into the cloud would seem to be the only way to accomplish it should they want to use that platform over the alternatives. However doing so binds the game to that backend infrastructure, which means the game’s life is only as long as the servers that power it. Considering the numerous recent examples of game servers and services disappearing (including the infamous Games for Windows Live), the effect of turning off an integral part of a game would be far worse and likely without an easy path to remediation.
The reason this would be such a big issue is that, compared to traditional game server infrastructure, the requirements for a cloud enhanced game are much, much greater. You can happily run dozens of virtual servers that service thousands of clients from a single dedicated box, but try to run physics calculations (like in one of the Project Flare demos) and the number of people you can service per server drops dramatically. This means the time for which those servers remain fiscally viable is dramatically reduced, making it far more likely that the service will cease to exist much sooner than other game servers would. Moore’s Law goes a little way towards remedying this but you can’t really get past the fact that the consolidation ratios achievable here are a couple of orders of magnitude lower than what developers have traditionally come to expect.
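To make the economics concrete, here's a quick back-of-envelope comparison. Every figure below (server cost, clients per box) is a hypothetical assumption chosen only to illustrate the shape of the problem, not a number from any real service.

```python
# Back-of-envelope consolidation comparison. All figures are illustrative
# assumptions: a hypothetical $500/month dedicated box, 2000 lightweight
# clients for a traditional game server vs 20 for cloud-offloaded physics.

def monthly_cost_per_player(server_cost_per_month: float,
                            players_per_server: int) -> float:
    """Cost to keep one player serviced for a month."""
    return server_cost_per_month / players_per_server

server_cost = 500.0  # hypothetical monthly cost of one dedicated box

# Traditional multiplayer servers: thousands of cheap sessions per box.
traditional = monthly_cost_per_player(server_cost, 2000)
# Cloud-enhanced physics: heavy per-client simulation, far fewer per box.
cloud_enhanced = monthly_cost_per_player(server_cost, 20)

print(f"traditional: ${traditional:.2f}/player/month")
print(f"cloud-enhanced: ${cloud_enhanced:.2f}/player/month")
print(f"ratio: {cloud_enhanced / traditional:.0f}x")
```

Under these made-up numbers the per-player cost is 100 times higher, which is what a "couple of orders of magnitude" in consolidation ratio translates to on the bill, and why such servers would be switched off sooner.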
This is not to mention how the system will handle poor Internet connections or overloaded servers, something that’s guaranteed to happen with more popular titles. Whilst it’s not an unsolvable problem, it will definitely lead to sub-par gaming experiences, as the two most likely approaches (stopping the game to wait for the calculations to arrive, or simply not simulating them at all) will be anything but seamless. I’m sure it could be improved over time, however the way this is marketed makes it sound like they want to do a lot of computation elsewhere so the console graphics can be a lot prettier, leaving not a whole lot of wiggle room when the inevitable happens.
Whilst this idea is far more feasible than running the entire game environment on a server, it’s still a long way from being a viable service. It’s commendable that Square-Enix are looking for ways to make their games better, removing the restrictions of the platforms the majority have chosen, however I can’t help but feel it’s going to come around to bite them, and by extension us, in the ass in the not too distant future. As always I’d love to be proven wrong on this, but the fact is that farming out core game calculations ties a game’s life to that service, and once it’s gone there’s nothing you can do to restore it.
The article is pretty spot on, but I’d like to focus on the side everyone seems to forget about: the customers (gamers). As far as I can tell, based on all the research I’ve pulled on these topics, cloud gaming isn’t only a pipe dream, it’s completely forgetting the customer. Which, to be honest, isn’t so surprising from the gaming industry, which has been constantly identified as being “out of touch with its audiences”. This is just another move made for the shareholders. They don’t care that this would cost individual gamers more money in the long term and carry even greater risk (i.e. paying indefinitely for a service that may have only 10% of the games they want). And on top of it all, we already have working examples showing what some companies WANT this future to be like. Sony was releasing their service wanting to charge per hour of gaming time. Yeah, you heard it. Doesn’t matter the cost (although it was ridiculous, something like $3.99 per hour), it’s not a good future for gaming. Hopefully this stuff gets nipped in the bud and more honourable sellers like GOG take hold in the future, giving us full releases we can download with no DRM. Because let’s be honest, cloud gaming is just more DRM: they can cut off a game when they want and turn your access off when they want. It’s the same old crap, just to a greater extreme.
I remember bringing up that point a while back when OnLive started debuting its first games and many of them were either the same price as full retail or more expensive. There was a title here or there that was cheaper, but if you’re only playing, say, a dozen or so games a year you wouldn’t be saving money on gaming purchases, especially back when they were still charging a monthly access fee for the service. The incredibly narrow use case for their service (people near CBDs who like to play games but don’t have a console or PC), along with its capital intensive infrastructure requirements, is ultimately why the service failed.
Indeed I think the vast majority of the gaming community is well aware of the issues you’ve raised, and that’s the reason why cloud gaming is (thankfully) struggling to take off. Services like Steam and GOG are going a long way towards making it irrelevant, as cloud gaming services simply can’t compete on price and, soon, won’t be able to compete on features either.
This is one thing I hope I’m not proven wrong on but, for now, all the evidence points to ideas like this being dead in the water.