Posts Tagged ‘chilling effect’

L'Aquila Earthquake Devastation

Jailing Scientists Doesn’t Help Anybody, Italy.

One of my bugbears is people who believe they know better than those who’ve made a career out of being experts in a field. For me in particular it’s doctors: I know I’m rubbish at figuring out why things are happening in my body, so I defer to their expertise. Many people I know seem to harbour a deep mistrust of them, however, believing that everything they’re told is wrong and that only they have the right answers. Whilst everyone has a story of a doctor not getting things quite right, they always seem to forget the times when they got them spot on, which I’ll argue is more often than not.

The reason they don’t get it right 100% of the time comes down to the very nature of medicine and, more generally, the principles that it and all other science-based professions employ. For highly complicated systems like the human body it’s nigh on impossible to control for every input, so instead we rely on statistical models built from large data sets that give us a good idea of the effect a given input will have. These models are far from perfect, which means edge cases won’t always respond as predicted, but that does not invalidate the model; it merely identifies another factor that needs to be incorporated into it.

It was these very principles that led a group of scientists back in 2009 to conclude there was a low risk of a major earthquake in the Italian city of L’Aquila. In the months prior, L’Aquila had been rocked by many small tremors, which prompted the authorities to convene a panel of experts to determine whether action was warranted. L’Aquila lies on a fault line, and using the seismic models available at the time the scientists concluded that the risk of a larger quake was not increased by the recent tremors, though some risk always remained. Forced into giving a yes or no answer, they opted for no, as earthquake warnings of that nature are incredibly disruptive for everyone involved. Unfortunately for them, just six days later a magnitude 6.3 quake hit L’Aquila and over 300 people lost their lives.

When something devastating like this happens it’s human nature to look for someone to blame. The people of L’Aquila turned their sights on the scientists and officials who were involved in making the prediction, and yesterday saw 7 of them convicted of manslaughter and sentenced to 6 years in jail, 2 years more than the prosecution had asked for. The conviction was assuredly handed down to placate the people of L’Aquila, who are still struggling to rebuild after the quake laid waste to their city, and many of them see it as some small form of justice for those who perished.

This could not be further from the truth.

The prediction they made (which was then announced by a government official with no seismological experience) was based on the models and data available at the time, all of which pointed to there being no increased risk of a large quake. Whilst there’s an argument that the models in use then would have shown a massive increase in risk in the hours leading up to the quake (on the order of 1 in 200,000 rising to 1 in 1,000), that doesn’t change the fact that the prediction they made was sound. Prosecuting them means that in future any scientist approached to make a prediction of this nature will err on the side of caution, turning every mild risk into a certainty, or, more chillingly, will simply refuse to make any prediction at all lest they face litigation.

The fact of the matter is that many factors led to this disaster being as bad as it was, and laying all the blame on scientists who made a prediction based on good data and good science shows they were only looking for a scapegoat. Numerous others could be held equally responsible, such as the builders who constructed and maintained those houses (buildings that withstand magnitude 6.0 quakes can readily be built, just ask Japan), the regulators who didn’t mandate stricter construction standards, and anyone else tangentially involved. We won’t do that though, because it sounds like madness, yet throwing scientists in jail seems reasonable, something I will never understand.

I am so sorry for the losses the people of L’Aquila have had to endure, but blaming the scientists is not the right course of action. Instead they should focus on ensuring the risk is properly mitigated rather than relying on predictions that can and will be wrong from time to time. From now on no scientist in their right mind will make predictions of this kind unless they’re granted immunity from prosecution, and when that isn’t forthcoming they’ll simply refuse. It is one of the most chilling effects modern science has experienced in recent memory and I can only hope the verdict is overturned.

Not just for the scientists’ sake, but for the sake of science at large.

Resistance is Futile, Integration is Inevitable.

Enabling your users to interact with your application through open APIs has been a staple of the open web since its inception over a decade ago. Even before that, the notion of letting people modify your product helped create vast communities dedicated to either improving the user experience or building features the original creators overlooked. I can remember my first experience with this vividly: creating vast levels in the Duke Nukem 3D level editor and showing them off to my friends. Some of these community-developed products can even become the killer feature of the original application, and whilst this is a boon for the application it can pose some issues for its developer.

Probably the earliest example of this I can think of would have to be World of Warcraft. The client has a pretty comprehensive API that enabled people to create modifications to do all sorts of wonderful things, from mundane inventory managers to boss timer mods that helped keep a raid coordinated. After a while many mods became must-haves for any regular player, and for anyone who wanted to join in the 40-person raids they became critical to success. Over the years many of these staple mods were replaced by Blizzard’s own implementations, ensuring that anyone able to play the game was guaranteed to have them. Whilst most of the creators weren’t enthused that their hard work was being usurped by their corporate overlords, many took it as a challenge to create even more interesting and useful mods, ensuring their user base stayed loyal.

More recently this issue has come to light with Twitter, who are arguably popular largely due to the countless hours of work done by third parties. Their incredibly open API has meant that anything Twitter could do, others could do too, often better than Twitter themselves. In fact only a quarter of their traffic actually goes through their main site; the other three quarters comes via their API. This shows that whilst they’ve built an incredibly useful and desirable service they’re far from the best providers of it, with a large ecosystem of applications filling in the areas where it falls down. Lately, however, Twitter has begun incorporating features into its product that used to be provided by third parties, and the developer community hasn’t been too happy about it.

The two most recent bits of technology Twitter has integrated are the new Tweet button (previously provided by TweetMeme) and their link shortening service t.co, a function previously handled by dozens of others. The latter was never unique to Twitter, and whilst many of the newcomers to the link shortening space made their name on Twitter’s platform, many report that it’s no longer their primary source of traffic. The t.co shortener, then, is really about Twitter taking control of the platform they developed and possibly using the extra data they can gather from it as leverage in brokering advertising and partnership deals. The Tweet button, however, is a little more interesting.
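To make that data angle concrete, here’s a minimal sketch of what a link shortener does at its core: hand out a short code, then count the click before redirecting. The class and names below are purely illustrative assumptions on my part, not how t.co actually works, but they show why whoever performs the redirect is the party that sees every click, no matter which client posted the link.

```python
# Minimal sketch of a link shortener (illustrative only, not t.co's design).
import secrets


class Shortener:
    def __init__(self, base_url: str) -> None:
        self.base_url = base_url
        self.links: dict[str, str] = {}   # short code -> original URL
        self.clicks: dict[str, int] = {}  # short code -> click count

    def shorten(self, long_url: str) -> str:
        """Register a long URL and return its shortened form."""
        code = secrets.token_urlsafe(4)   # e.g. 'xK3vQw'; collisions ignored for brevity
        self.links[code] = long_url
        self.clicks[code] = 0
        return f"{self.base_url}/{code}"

    def resolve(self, code: str) -> str:
        """Look up the original URL, counting the click on the way through.

        This redirect step is where the analytics value lives: the shortener
        sees every click regardless of which third-party app posted the link.
        """
        self.clicks[code] += 1
        return self.links[code]


if __name__ == "__main__":
    s = Shortener("https://sho.rt")  # hypothetical domain
    short = s.shorten("https://example.com/some/long/article")
    code = short.rsplit("/", 1)[1]
    print(short, "->", s.resolve(code))
    print("clicks so far:", s.clicks[code])
```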

Way back when, news aggregator sites were all the rage. From Digg to Del.icio.us to Reddit, there were all manner of sites designed around the central idea of sharing online content with others. Whilst the methods of story aggregation differed from service to service, most of them ended up implementing some kind of “Add this story to X” button that could be put on your website. This served two purposes: it let readers show a little love to an article by giving it attention on another site, and it gave the other site content to link to with little involvement from the user. The TweetMeme button represented a way to drive Twitter adoption further and at the same time gather even more data on its users. Twitter, for what it’s worth, said they licensed some of TweetMeme’s technology for their button, but they have still in essence killed off one of their ecosystem’s popular services, and that has begun to draw the ire of some developers.
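For what it’s worth, those share buttons were (and are) little more than pre-filled links to a submission page. The sketch below uses a made-up aggregator URL and parameter names, so treat it as an assumption-laden illustration rather than any particular service’s API, but it shows why a single click could hand an aggregator both the article and a ready-made title with no further effort from the reader.

```python
# Building an "Add this story to X" link (hypothetical endpoint and parameters).
from urllib.parse import urlencode


def share_link(submit_url: str, article_url: str, title: str) -> str:
    """Return a link that pre-fills a (fictional) aggregator's submission form."""
    return f"{submit_url}?{urlencode({'url': article_url, 'title': title})}"


if __name__ == "__main__":
    print(share_link(
        "https://aggregator.example/submit",   # made-up service
        "https://myblog.example/posts/42",     # the article being shared
        "Resistance is Futile, Integration is Inevitable",
    ))
```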

The issue many developers take with Twitter building these services into its main product is that it puts a chilling effect on products built on Twitter’s ecosystem. Previously, if you had built something that augmented their service, chances were you could build yourself quite the web property. Unlike other companies, which would acquire these innovators in order to integrate their technology, Twitter has instead taken to developing the same products themselves, in direct competition with those innovators. The reason behind this is simple: Twitter doesn’t have the cash available to do acquisitions like the big guys do. They’re stuck between a rock and a hard place, as whilst they need to encourage innovation on their platform they can’t let it go on forever, lest they become irrelevant beyond delivering the underlying service. Realistically their best option is to start generating some cash so they can acquire innovators’ technology rather than out-competing them, but they’re still too cash poor for that to be viable.

In the end, if you build your product around someone else’s service you’re really putting yourself at their mercy. The chill Twitter is putting on its developers probably won’t hurt it in the long run, provided it doesn’t keep copying others’ solutions to its problems, though its fledgling advertising-based business model is at odds with all the value-add developers. Twitter is quite capable of impressive innovation on its own (see #newtwitter), but its in-house development is nothing compared to the hordes of third parties who’ve been doing their part to improve its ecosystem. I’m interested to see what direction they go with this, especially since I’m working on what could be classed as a competing service.

Although I’m hoping people don’t see it that way :P