Many older games, like those built before the Internet was as ubiquitous as it is today, are playable so long as you can figure out how to install them. This can be no small feat in some instances, although emulators like DOSBox do a lot of the heavy lifting for you. However, for slightly more modern games, especially those that relied on DRM or activation servers in order to work, getting them installed is only half the battle. Quite often those activation servers have long since shut down, leaving you with few options if you want to enjoy an older title. Typically this meant turning to less-than-legitimate sources for a cracked version of the main executable, free from the checks that would otherwise prevent it from working. This practice, however, is now legitimized thanks to a ruling by the Library of Congress, spurred on by the Electronic Frontier Foundation.
The ruling allows gamers to circumvent any measures in abandoned games that would prevent “local play” of a copy they legally purchased. Essentially this means that if a central server is shut down (or made inactive without explanation for six months) then you’re free to do whatever you need to in order to resurrect the game. Considering so many of us now rely on Steam or other digital distribution platforms, this ruling is critical to ensuring that we’ll be able to access our games should the unthinkable happen. It also means that more recent abandonware titles that had central DRM servers can now be legally resurrected. For many of us who still enjoy old games this certainly is a boon, although it does come with a couple of caveats.
Probably the biggest restriction the Library of Congress placed on this ruling is that multiplayer services are not covered by the exemption. What that means is that, should a game have a multiplayer component, creating the backend to support it is still not a legal activity. Additionally, should the protection mechanisms be contained within a console, the exemption does not cover modification of said console in order to resurrect the game. Whilst I can understand why circumventing console protections wasn’t included (that’s essentially an open-season notice to pirates), the multiplayer exemption feels like it should have been. Indeed a lot of games thrived on their multiplayer scene, and not being able to bring back that component could very well mean a game never gets brought back at all.
The exemptions come as part of the triennial review that the Library of Congress conducts of the Digital Millennium Copyright Act (DMCA). In the past, exemptions have also been granted for things such as jailbreaking phones and the fair use of sampled content from protected media. There’s potential for a future review to extend the exemptions, which could open up further modification capabilities in order to preserve our access to legally purchased games. However, the Entertainment Software Association has been fervent in its defence of both the multiplayer and console modification restrictions, so it will be a tough fight to win any further exemptions.
These exemptions are good news for all gamers, as they mean that many more titles will be playable long into the future. We might not have the full freedom we need yet, but it’s an important first step towards ensuring that the games of our time, and of future generations, remain playable to all.
I’m no conspiracy theorist; my feet are way too firmly planted in the world of testable observations to fall for that level of crazy. But I do love it when we the public get to see the inner workings of secretive programs, government or otherwise. Part of it is sheer voyeurism, but if I’m truthful the things that really get me are the big technical projects: things that, done without the veil of secrecy, would be wondrous in their own right. The fact that they’re hidden from public view just adds to the intrigue, making you wonder why such things needed to be kept secret in the first place.
One of the first things that comes to mind is the HEXAGON series of spy satellites, high-resolution observation platforms launched during the Cold War that still rival the resolution of satellites launched today. It’s no secret that all spacefaring nations have fleets of satellites up there for such purposes, but the fact that the USA was able to keep the exact nature of the entire program secret for so long is quite astounding. The technology behind it, though, was what really intrigued me, as it was years ahead of the curve in terms of capabilities, even if it didn’t have the longevity of its fully digital progeny.
Yesterday, however, a friend sent me this document from the Electronic Frontier Foundation which provides details on something called the President’s Surveillance Program (PSP). I was instantly intrigued.
According to William Binney, a former senior official at the National Security Agency, the PSP is in essence a massive data-gathering program, with possible intercepts at all major fibre terminations within the USA. The system simply siphons off all incoming and outgoing data, which is then stored in massive, disparate data repositories. This in itself is a mind-boggling endeavour, as the amount of data that transits the Internet in a single day dwarfs the capacity of most large data centres. The NSA then ramps it up a notch by being able to recover files, emails and all sorts of other data based on keywords and pattern matching, which implies heuristics on a level that’s simply mind-blowing. Of course this is all I’ve got to go on at the moment, but the idea itself is quite intriguing.
For starters, creating a network that’s able to handle a direct tap on a fibre connection is no small feat in itself. When the fibres terminating at the USA border are capable of speeds in the GB/s range, the infrastructure required to handle that is non-trivial, especially so if you want to store the data afterwards. Storing that amount of data is another matter entirely, as most commercial arrays begin to tap out in the petabyte range. Binney’s claims start to seem a little far-fetched here, as he states there are plans up into the yottabyte range, but concedes that current incarnations of the program couldn’t hold more than tens of exabytes. Barring some major shake-up in the way we store data, I can’t fathom how they’d manage to create an array that big. Then again, I don’t work for the NSA.
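To get a feel for the scale involved, here’s a back-of-envelope calculation. The link speed is an assumption on my part (the actual capacities of tapped fibres aren’t public), so take it as an illustration of the magnitudes, not a claim about the real program:

```python
# Rough sketch: how quickly a single tapped fibre link fills storage.
# LINK_GBPS is an assumed figure for illustration only.
LINK_GBPS = 100              # assumed link speed, gigabits per second
SECONDS_PER_DAY = 86_400

bytes_per_day = LINK_GBPS * 1e9 / 8 * SECONDS_PER_DAY  # bits -> bytes, then per day
petabytes_per_day = bytes_per_day / 1e15

print(f"{petabytes_per_day:.2f} PB/day")  # prints: 1.08 PB/day
```

Even one fully saturated 100 Gb/s link would fill roughly a petabyte of storage every day, so a commercial-grade array would tap out within weeks; dozens of such taps running for years is how you get into exabyte territory.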
As intriguing as such a system might be, there’s no question that its existence is a major violation of privacy for US citizens and the wider world. Such a system is akin to tapping every single phone and recording every conversation on it, which is most definitely not supported by their current legal system. Just because they don’t use the data until they have a reason to doesn’t make it just, either: anything gathered without suspicion of guilt or intent to commit a crime is illegitimately obtained. I could think of many legitimate uses for the data (anonymous analytical work could prove very useful), but the means by which it was gathered eliminates any legitimate purpose.