My stance on game streaming services has been well known for some time now, but for the uninitiated let me sum it up for you: I think they’re rubbish. The capital investment required to get them to work well at scale seems incompatible with the number of potential users who’d want such a service, and nearly all offerings in this space priced their games similarly to their full-blooded, non-streamed cousins. Sony doesn’t share my view on this, however, having invested several hundred million dollars in buying game streaming service Gaikai and committing to providing a sort-of backwards compatibility service on that platform. Since I wasn’t entirely interested in the idea I hadn’t looked into it much further, but at a tech level it’s quite interesting, even if I think the service won’t be the cash cow I’m sure Sony thinks it’ll be.
I’ve mentioned in the past that there weren’t too many ways for backwards compatibility to make its way onto current generation consoles, even if some form of streaming service was going to be offered. I speculated about the potential ways of doing it: either running a whole bunch of old consoles in a data center or developing an emulation framework, neither of which I felt would be particularly scalable given my perceived lack of demand for the service. As it turns out Sony has gone with the former option for their streaming service, opting to run a bunch of PlayStation 3s in the cloud and providing access to them through their new PlayStation Now service. However they’re not consoles as you’d recognise them; they’re in fact all new hardware.
Sony has developed a custom motherboard that contains 8 PlayStation 3 chips, allowing them to achieve a pretty incredible amount of density compared to simply racking consumer units. Some back-of-the-napkin calculations put this at about 384 PlayStation 3s per rack, quite a decent number, although I’m sure the cost of that hardware is going to be non-trivial. This custom solution does have its benefits though, like being able to throw in a new network interface and a hardware video encoder, reducing the latency between the customer and their PlayStation 3 in the cloud. This might not be enough to make the service feasible but it’ll do a lot to make the majority of games on it far more playable than they would be otherwise.
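The napkin maths behind that density figure is easy to sanity-check. The 8-chips-per-board number comes from the reporting; the boards-per-rack figure below is my own assumption (roughly 1U per board in a tall rack), picked to show how the ~384 estimate falls out:

```python
# Back-of-the-napkin density estimate for the custom PS Now hardware.
# CHIPS_PER_BOARD is from the reporting; BOARDS_PER_RACK is an assumption.
CHIPS_PER_BOARD = 8
BOARDS_PER_RACK = 48  # assumes ~1U per board in a 48U rack

consoles_per_rack = CHIPS_PER_BOARD * BOARDS_PER_RACK
print(consoles_per_rack)  # 384 PlayStation 3s per rack
```

Compare that to racking consumer units: at maybe one console per rack unit you’d struggle to fit 50 in the same space, so the custom boards are close to an order of magnitude denser.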
Right now the service offers up about 200 titles for individual rent, or an all-you-can-eat subscription that gives access to a selection of 100 titles for $15 per month at the cheapest option. That’s a damn sight better than pretty much every other game streaming service I’ve seen before, but it still suffers from the same restricted availability issues (only select US and Canada areas currently) which hamstrung other services. The one thing the service does have going for it is the veritable cornucopia of devices that PlayStation Now can run on, including Sony’s recent range of TVs and even DVD players. That’s definitely an advantage that other competitors didn’t have, since they all required another hardware purchase, but I’m still not sure there’ll be enough demand even if the barrier to entry is low for Sony’s more loyal customers.
With the average cost of producing a PS3 apparently down around the $280 mark (which I’ll assume is relatively similar for the custom solution) it will take Sony around 18 months to recoup the hardware investment based on the current subscription fees, and that doesn’t take into account the licensing arrangements for streaming. There’s potential for them to make up a bit more margin on the single rentals, which appear to be quite a bit more pricey, but it still seems like a long time for the investment to pay off. That being said, with the life of consoles now getting dangerously close to 10 years there’s potential for it to work, but I still think it’s a bit of a gamble on Sony’s part.
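That 18 month figure comes straight out of dividing the hardware cost by the monthly fee, under the (generous) assumption that each subscriber monopolises one PS3 chip and that power, bandwidth and licensing cost nothing:

```python
# Rough payback period per subscriber, assuming one dedicated PS3 chip
# per subscriber and ignoring licensing, power and bandwidth costs.
HARDWARE_COST = 280.0  # estimated production cost per PS3 (USD)
MONTHLY_FEE = 15.0     # cheapest PlayStation Now subscription (USD/month)

months_to_recoup = HARDWARE_COST / MONTHLY_FEE
print(round(months_to_recoup, 1))  # ~18.7 months
```

Every real-world cost you add to that model only pushes the break-even point further out, which is why the rentals likely matter so much to the margins.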
We gamers tend to be hoarders when it comes to our game collections, with many of us amassing huge stashes of titles on our platforms of choice. My Steam library alone blew past 300 titles some time ago, and anyone visiting my house will see the dozens of game boxes littering every corner. There’s something of a sunk cost in all this, and it’s why the idea of being able to play those titles on a current generation system is always attractive to people like me: we like to go back sometimes and play through the games of our past. Whilst my platform of choice rarely suffers from this (PCs are the kings of backwards compatibility) my large console collection is in varying states of being able to play my library of titles and, if I’m honest, I don’t think it’s ever going to get better.
For the current kings of the console market the decision to do away with backwards compatibility has been something of a sore spot for many gamers. Whilst the numbers show that most people buy new consoles to play the new games on them¹, there’s a non-zero number who get a lot of enjoyment out of their previous generation titles. Indeed I probably would’ve actually used my PlayStation 4 for gaming if it had some modicum of backwards compatibility, as right now there aren’t any compelling titles for it. This doesn’t seem to have been much of a hindrance to adoption of the now current gen platforms, however.
There does seem to be a lot of faith being poured into the idea that backwards compatibility will come eventually through cloud services, though only Sony has committed to developing one. The idea is attractive, mainly because it enables you to play any time you want from a multitude of devices; however, as I’ve stated in the past, the feasibility of such an idea isn’t great, especially as it relies on server hardware being in many disparate locations around the world to make the service viable. Whilst both Sony and Microsoft have the capital to make this happen (and indeed Sony has a head start thanks to the Gaikai acquisition) the issues I previously mentioned are only compounded when it comes to providing a cloud based service with console games.
The easiest way of achieving this is to just run a bunch of the old consoles in a server environment and allow users to connect directly to them. This has the advantage of being cheaper from a capital point of view, as I’m sure both Sony and Microsoft have untold hoards of old consoles to take advantage of; however the service would be inherently unscalable and, past a certain point, unmaintainable. The better solution is to emulate the console in software, which would allow you to run it on whatever hardware you wanted, but this brings with it challenges I’m not sure even Microsoft or Sony are capable of solving.
You see, whilst the hardware of the past generation consoles is rather long in the tooth, emulating it in software is nigh on impossible. Whilst there are some experimental efforts by the emulation community to do this, none of them have produced anything capable of running even the most basic titles. Indeed even with access to the full schematics of the hardware, recreating it in software would be a herculean effort, especially for Sony, whose Cell processor is a nightmare architecturally speaking.
There’s also the possibility that Sony has had the Gaikai team working on a Cell to x86 translation layer, which could make the entire PlayStation 3 library available without too much hassle, although there would likely be a heavy trade-off in performance. In all honesty that’s probably the most feasible solution, as it’d allow them to run the titles on commodity hardware, but you’d still have the problems of scaling out the service that I’ve touched on in previous posts.
Whatever ends up happening, we’re not going to hear much more about it until sometime next year, and it’ll be a while after that before we can get our hands on it (my money is on 2016 for Australia). If you’re sitting on a trove of old titles and hoping that the next gen will allow you to play them, I wouldn’t hold your breath: it’s much more likely that the service will be extremely limited, likely requiring an additional cost on top of your PlayStation Plus membership. And that’s if it even works the way everyone is speculating it will, as I can see it easily turning out to be something else entirely.
¹ I can’t seem to find a source for this, but back when the PlayStation 3 Slim was announced (having that capability removed) I can remember a Sony executive saying something to this effect. It was probably a combination of factors that led to him saying that, though, as around that time the PlayStation 2 Slim was still being manufactured and retailing for AUD$100, so it was highly likely that anyone who had the cash to splurge on a PlayStation 3 already owned a PlayStation 2.
The advent of cloud computing, or more generally the commoditization of computer infrastructure, has provided us with capabilities that few could have accurately predicted. Indeed the explosive growth in the high tech sector can be substantially attributed to the fact that businesses no longer require heavy capital injections in order to validate their ideas, allowing many ideas which wouldn’t have been viable 5 years ago to flourish today. Of course this has also led to everyone seeking to apply the ideals of cloud computing wherever they can, hoping it can be the panacea to their ills. One such place is the world of gaming and, in all honesty, the ways in which it’s being used there are at best misguided, with most solutions opening us up to a world of hurt not too far down the track.
I’ve gone on record saying that I don’t believe the general idea of Cloud Gaming, whereby a service runs hardware in a central location and users connect to it with a streaming device, is viable. The problem comes from the requirements placed on that infrastructure, specifically the requirement for low latency, which means a user can’t be too far away from the equipment. That would mean that for it to have global reach it would likely need some kind of hardware presence in every capital city, which would be a rather capital-intensive exercise. At the same time the consolidation ratios for gaming level hardware aren’t particularly great at the moment, although that may change in the future with both NVIDIA and AMD working on cloud GPU solutions. Still, the fact that OnLive, a company once valued at $1 billion, failed to make the idea feasible says a lot about it.
That hasn’t stopped companies from attempting to integrate the cloud through other avenues, something which I’ve come to call Cloud Enhanced gaming. This is where a game offloads its less latency-sensitive aspects to servers elsewhere, which do the calculations and send the results back down the wire. In theory this allows you to make your game better, as you don’t have to worry about the limitations of the platform you’re running on: the local hardware is kept free for pretty graphics while the heavy lifting is done offsite. The latest entrant into this arena is Square-Enix’s Project Flare, which they’re marketing as a technological breakthrough in cloud gaming.
On the surface it sounds like a great idea; consoles would no longer suffer from their hardware limitations and thus would remain viable for much longer than they have in the past. Indeed for a developer that’s looking to do something outside a console’s capabilities, offloading processing into the cloud would seem to be the only way to accomplish it should they want to use a specific platform over the alternatives. However doing so binds the game to that backend infrastructure, which means the game’s life is only as long as the servers that power it. Considering the numerous recent examples of game servers and services disappearing (including the infamous Games for Windows Live), the effect of turning off an integral part of the game would be far worse and likely without an easy path for remediation.
The reason why this would be such a big issue is that, compared to traditional game server infrastructure, the requirements for a cloud enhanced game are much, much greater. You can happily run dozens of virtual servers that service thousands of clients from a single dedicated box; however try to run physics calculations (like in one of the Project Flare demos) and the number of people you can service per server drops dramatically. This means the time in which those servers remain fiscally viable is dramatically reduced and it’s far more likely that the service will cease to exist much sooner than other game servers would. Moore’s Law goes a little way towards remedying this, but you can’t really get past the fact that the consolidation ratios achievable here are a couple of orders of magnitude lower than what developers have traditionally come to expect.
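To make the consolidation ratio argument concrete, here’s a toy cost model. Every number in it is invented for illustration (they are not measurements of Project Flare or any real service); the point is only how the per-client cost scales with clients per server:

```python
# Toy model: per-client monthly cost for a traditional game server
# vs a cloud-enhanced one doing heavy per-client simulation.
# All figures are hypothetical, chosen purely to illustrate the scaling.
SERVER_COST_PER_MONTH = 500.0           # amortised cost of one box (USD)

traditional_clients_per_server = 2000   # many lightweight sessions
cloud_enhanced_clients_per_server = 20  # per-client physics is expensive

cost_traditional = SERVER_COST_PER_MONTH / traditional_clients_per_server
cost_enhanced = SERVER_COST_PER_MONTH / cloud_enhanced_clients_per_server

print(cost_traditional)  # 0.25 USD per client per month
print(cost_enhanced)     # 25.0 USD per client per month, 100x worse
```

A hundredfold jump in per-client infrastructure cost is exactly the kind of thing that gets a service switched off the moment player numbers dip.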
This is not to mention how the system will handle poor Internet connections or overloaded servers, something which is guaranteed to happen with more popular titles. Whilst it’s not an unsolvable problem, it’s definitely something that will lead to sub-par gaming experiences, as the two most likely approaches (stopping the game to wait for the calculations to arrive, or simply not simulating them at all) will be anything but seamless. I’m sure it could be improved over time, however the way this is marketed makes it sound like they want to do a lot of computation elsewhere so the console graphics can be a lot prettier, leaving not a whole lot of wiggle room when the inevitable happens.
Whilst this idea is far more feasible than running the entire game environment on a server, it’s still a long way from being a viable service. It’s commendable that Square-Enix are looking for ways to make their games better, removing the restrictions of the platforms that the majority have chosen, however I can’t help but feel it’s going to come around to bite them, and by extension us, in the ass in the not too distant future. As always I’d love to be proven wrong on this, but the fact is that farming out core game calculations ties the game’s life to that service, and once it’s gone there’s nothing you can do to restore it.
The idea of cloud gaming is a seductive one, especially for those of us who lived through the times when upgrading your computer every 12 months was a requirement if you didn’t want to be watching a slide show. Abstracting the hardware away from the user and letting them play on any device above a certain, extremely low threshold would appear to be the solution to the upgrade and availability issues of dedicated gaming platforms. I’ve long made the case that the end product is something of a niche market, one that I was never quite sure would be viable on a large scale. With the demise of OnLive I could very easily make my point based on that alone, but you can never write off an industry on the failures of the first to market (see Iridium Communications for proof of this).
Providing even a small cloud gaming service requires some rather massive investments in capital expenditure, especially with the hardware that’s available today. For OnLive this meant that each of their servers could only serve one user at a time, which was terrible from a scalability point of view, as they could never service that many customers without bleeding money on infrastructure. Cloud gaming services of the future might be in luck, however, as both NVIDIA and AMD are working on cloud GPUs that will enable much higher densities than the current 1 to 1 ratio. There’ll still be an upper limit that’s much lower than most cloud services (which typically serve thousands of users per server), but at the very least the scalability problem becomes an engineering issue rather than a capital one.
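A quick sketch shows why that consolidation ratio is the whole ballgame. The sharing factor for a cloud GPU below is a guess at what the NVIDIA/AMD parts might allow, and the server cost is hypothetical; only the 1-to-1 OnLive ratio comes from the text above:

```python
# Capital cost per concurrent user at different consolidation ratios.
# users-per-server for OnLive is from the text; the shared-GPU figure
# and server cost are assumptions for illustration only.
SERVER_COST = 4000.0            # hypothetical cost of one gaming server (USD)

users_per_server_onlive = 1     # one full rig per concurrent user
users_per_server_cloud_gpu = 8  # guessed density with a shared cloud GPU

capex_per_user_onlive = SERVER_COST / users_per_server_onlive
capex_per_user_shared = SERVER_COST / users_per_server_cloud_gpu
print(capex_per_user_onlive, capex_per_user_shared)  # 4000.0 vs 500.0
```

Even an 8-way sharing factor turns the per-user capital outlay from hopeless into something a subscription fee could plausibly amortise, which is why the cloud GPU work matters so much here.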
The second major challenge that cloud gaming companies face is how latency sensitive a good portion of the games market is. Whilst you can get down to very low latency numbers with strategically placed servers, you’re still going to be adding a good chunk of input lag on top of any server latency, which will be unacceptable for a lot of games. Sure there are titles where this won’t be an issue, but cutting off a large section of the market (FPS, RTS, RPGs and any mix of them in between) further reduces the viability of any potential cloud gaming service.
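To see where that extra input lag comes from, consider a rough input-to-screen budget for a streamed game. All of the figures below are illustrative estimates, not measurements of any particular service:

```python
# Rough latency budget a streaming service adds on top of the game's
# own input lag. Every figure here is an illustrative assumption.
capture_encode_ms = 15   # grab the frame + hardware video encode
network_rtt_ms = 30      # round trip to a well-placed data centre
decode_display_ms = 15   # decode + present on the client device
server_frame_ms = 16     # one frame of simulation at 60 fps

total_ms = capture_encode_ms + network_rtt_ms + decode_display_ms + server_frame_ms
print(total_ms)  # 76 ms of added lag before the game's own latency
```

For a turn based game that’s invisible; for a competitive FPS, where players notice differences of a few tens of milliseconds, it’s disqualifying, and that’s with a generously close server.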
In fact many of the titles that could benefit from a cloud gaming service can already be ported to the web thanks to things like Unity or the use of OpenGL extensions in HTML5. Indeed many of the games that I could see being published on a cloud platform (casual MMORPGs, turn based strategy games, etc.) wouldn’t be much different if they were brought to the traditional web instead. Sure you lose some of the platform agnosticism by doing this, but you can arguably reach the same number of people that way as you could with a cloud platform.
User expectations of cloud services are also set rather high, with many of them being flat fee, unlimited usage scenarios (think Pandora, Netflix, etc.). The previous business models for cloud gaming didn’t gel well with this mindset, as you were paying for the games you wanted to play (often cheaper than retail, sometimes not) for a limited period of time, akin to a long term rental. Whilst this works for some people, most users will expect to pay a flat fee for access to a catalogue they can then use at their leisure, and this has significant ramifications for how publishers and developers will license their games to cloud gaming services. It’s not an insurmountable problem (the music industry came around eventually, so the games industry can’t be far behind) but it does introduce a market dynamic that cloud gaming services have not yet navigated.
With all these things considered I find it hard to see how cloud gaming services can be viable in the near term: whilst all the issues are solvable, they all work against delivering something that can turn a profit. Cloud GPUs, the ever increasing quality of Internet connections and the desire of many to migrate completely to cloud based services do mean there’s a trend towards cloud gaming becoming viable in the future, however the other, fundamental limitations could see those pressures rendered null and void. This is something I’m willing to be proven wrong on though, as I’ve invested myself heavily in cloud principles and I know that they’re capable of great things. Whether cloudifying our gaming experience is one of them is something I don’t believe is currently feasible, and I don’t see that changing for a while.