Posts Tagged ‘open source’

.NET to be Fully Open Source.

Microsoft isn’t a company you’d associate with open source. Indeed, if you wound back the clock 10 years or so you’d find a company that was outright hostile to the idea, often going to great lengths to ensure open source projects that competed with their offerings would never see the light of day. The Microsoft of today is vastly different, contributing to dozens of open source projects and working hard with partner organisations to develop their presence in the ecosystem. For the most part, however, this has usually been done with a view to integration with their proprietary products, which isn’t exactly in line with the open source ethos. That may be set to change, however, as Microsoft will be fully open sourcing its .NET framework, the building blocks of a huge number of Windows applications.

Microsoft .NET logo

For the uninitiated, Microsoft .NET is a development framework that’s been around since the Windows XP days, exposing a consistent set of capabilities that applications can make use of. Essentially, developing a .NET application meant you could be confident it would work on any computer running that framework, something which wasn’t entirely a given before its inception. It has since grown substantially in capability, allowing developers to create some very capable programs using nothing more than the functionality built directly into Windows. Indeed it was so successful in accomplishing its aims that there was already a project, dubbed Mono, working to bring it to non-Windows platforms, and it is with them that Microsoft is seeking to release a full open source implementation of the .NET framework.

Whilst this still falls in line with Microsoft’s open source strategy of “things to get people onto the Microsoft platform” it does open up a lot of opportunities for software to be freed from Windows. The .NET framework underpins a lot of applications that run on Windows, some that only run on Windows, and an implementation of that framework on another platform could quickly elevate them to cross-platform status. Sure, the work to translate them would still likely be non-trivial, but it’ll be a damn sight easier with a full implementation available, possibly enough to tempt some companies to make the investment.

One particularly exciting application of an open source .NET framework is games which, traditionally, have an extremely high opportunity cost when porting between platforms. Whilst not everything about games development on Windows is strictly .NET, there are a lot of .NET based frameworks out there that will be readily portable to new platforms once the open sourcing is complete. I’m not expecting miracles, of course, but it does mean that the future of cross-platform releases is looking a whole bunch brighter than it was just a week ago.

This is probably one of Microsoft’s longest bets in a while as it’s going to be years before the .NET framework sees any kind of solid adoption among the non-Windows crowd. However this does drastically increase the potential of C# and .NET to become the cross-platform framework of choice for developers, especially considering the large .NET developer community that already exists today. It’s going to be an area that many of us will be watching with keen interest as it’s yet another signal that Microsoft isn’t the company it used to be, and likely never will be again.

You Can’t Archive Digital Video? Surely You Jest.

On the recommendation of a friend I recently watched a documentary called Side by Side which details the history of the primary technology behind cinema: the cameras. It starts off by giving you an introduction to the traditional photographic methods that were used to create films in the past and then goes on to detail the rise of digital in the same space. Being something of a photography buff myself, as well as a geek who can’t get enough of technology, the topic wasn’t something I was unfamiliar with, but it was highly interesting to see what people in the industry were thinking about the biggest change to happen to it in almost a century.

RED Epic Side Shot

Like much of my generation I grew up digitally, with the vast majority of my life spent alongside computers and other non-analogue equipment. I was familiar with film as my father was something of a photographer (I believe his camera of choice was a Pentax K1000 which he still has, along with his Canon 60D) and my parents gave me my own little camera to experiment with. It wasn’t until a good decade and a half later that I’d find myself in possession of my first DSLR, and not until a few years after that that I’d find some actual passion for it. What I’m getting at here is that I’m inherently biased towards digital since it’s where I found my feet and it’s my preferred tool for capturing images.

One of the arguments that I’ve often heard levelled at digital formats, both in the form of images and your general everyday data, is that there’s no good way to archive them so that future generations will be able to view them. Film and paper, the traditional means by which we’ve stored information for centuries, would appear to archive quite well given the amount of knowledge in those formats that has stood the test of time. Ignoring for the moment that digital representations of data are still something of a nascent technology by comparison, the question of how we archive them has come up time and time again, and everyone seems to be under the impression that there’s no way to do it.

This just isn’t the case.

Just before I was set to graduate from university I had been snooping around for a better job after my jump to being a developer hadn’t worked out as I’d planned. As luck would have it I managed to land a job at the National Archives of Australia, a relatively small organisation tasked with the monumental effort of cataloguing all records of note produced in Australia. This encompassed everything from regular documents used in the course of government to things of cultural value like the airline tickets from when the Beatles visited Australia. Whilst they were primarily concerned with physical records (as shown by their tremendous halls filled with boxes) there was a small project within the organisation dedicated to the preservation of records that were born digital and were never to see the physical world.

I can’t take much credit for the work that they did there, as I was merely a caretaker of the infrastructure that was installed long before I arrived, but I can tell you about the work they were doing. The project team, consisting mostly of developers with just two IT admins (including myself), was dedicated to preserving digital files in the same way you would a paper record. At the time a lot of people were still printing files off and archiving them that way, however it became clear that this process wasn’t going to be sustainable, especially considering that the NAA had only catalogued about 10% of their entire collection when I was there (that’s right, they didn’t know what 90% of the stuff they had contained). Thankfully many of the ideas used in the physical realm translated well to the digital one, and thus XENA was born.

XENA is an open source project headed by the team at the NAA that can take everyday files and convert them into an archival format. This format contains not only the content but also the “essence” of the document, i.e. its presentation, layout and any quirks that make that document, that document. The included viewer is then able to reconstruct the original document using the data contained within the file and, since the project is open source, should the NAA ever cease development the data will still be accessible to everyone who used the XENA program. The released version does not currently support video, but I can tell you that they were working on it while I was there; the need to archive digital documents was simply the more pressing requirement at the time.
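To give a rough feel for the general idea, here’s a minimal sketch in Java. This is purely illustrative and is not the actual XENA format (the real system also normalises proprietary formats into open ones, which this glosses over entirely); it just shows the principle of wrapping a file’s raw bytes and some descriptive metadata in a plain, self-describing envelope that a simple viewer could unpack decades from now:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Base64;

// Illustrative sketch only: wrap a source file's raw bytes plus some basic
// metadata in a plain, self-describing XML envelope so that a simple viewer
// could reconstruct the original file long after the source system is gone.
public class ArchiveWrapper {

    public static void wrap(Path source, Path target) throws Exception {
        byte[] content = Files.readAllBytes(source);
        String encoded = Base64.getEncoder().encodeToString(content);

        String xml = String.join("\n",
            "<archived-record>",
            "  <metadata>",
            "    <original-name>" + source.getFileName() + "</original-name>",
            "    <size-bytes>" + content.length + "</size-bytes>",
            "  </metadata>",
            "  <content encoding=\"base64\">" + encoded + "</content>",
            "</archived-record>");

        Files.write(target, xml.getBytes("UTF-8"));
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical file names, purely for illustration.
        wrap(Paths.get("cabinet-minute.doc"), Paths.get("cabinet-minute.archive.xml"));
    }
}
```

The point of keeping the envelope in plain text is that nothing beyond a text editor and a base64 decoder is ever strictly required to get the original bytes back out.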

Ah ha, I’ll hear some film advocates say, but what about the medium you store them on? Surely there’s no platform that can guarantee the data will still be readable in 20 years, heck even 10 I’ll bet! You might think this, and should you have bought any of the first generation of CD-Rs I wouldn’t fault you for it, but we have many ways of storing data for long term archival purposes. Tapes are by far the most popular (and stand the test of time quite well) but for truly archival quality storage nothing available today beats magneto-optical discs, which can have lives measured in centuries. Of course we could always dive into the world of cutting edge science for the likes of a sapphire-etched platinum disc that might be capable of storing data for up to 10 million years, but I think I’ve already hammered home the point enough.

There’s no denying that there are challenges to be overcome with the archival of digital data, as the methods we developed for traditional media only serve as a pointer in the right direction. Indeed attempting to apply them directly to the digital world has often had disastrous results, like the first reel of magnetic tape brought to the NAA which was inadvertently baked in an oven (as is done with paper to kill microbes before archival), destroying the data forever. That isn’t to say we have nothing, nor that we aren’t working on it, and as technology improves so will the methods available for archiving digital data. It’s simply a matter of time until digital becomes as durable as its analogue counterpart and, dare I say it, not long before it surpasses it.

OUYA and The Console Reformation.

I’ve seen so many consoles come and go during my years as a gamer. I remember the old rivalries back in the day between the stalwart Nintendo fans and the just as dedicated Sega followers. As time went on Nintendo’s dominance became hard to push back against and Sega struggled to face up to the competition. Sony however made quite a splash with their original PlayStation and was arguably the reason behind the transition away from game cartridges to the disc based systems we have today. For the last 5 years or so though there really hasn’t been much of a shake up in the console market, save for the rise of motion controllers (which didn’t really shake anything up other than causing a giant fit of me-tooism from all the major players).

I think the reasons for this are quite simple: consoles became powerful enough to be somewhat comparable to PCs, the old school king of gaming. The old business model of having to release a new console every 3 years or so didn’t make sense when the current generation was more than capable of running modern games at a generally acceptable level. There was also the fact that Microsoft got burned slightly by releasing the Xbox 360 so soon after the original Xbox, and I’m sure Sony and Nintendo weren’t keen on making the same mistake. All we’ve got now are rumours about the next generation of consoles, and by and large they’re not shaping up to be anything as revolutionary as their current gen brethren were when they were released.

What’s really been shaking up the gaming market recently though is the mobile/tablet gaming sector. Whilst I’ll hesitate to put these in the same category as consoles (they are, by and large, not platforms designed primarily with gaming in mind) they have definitely had an impact in the portable sector. At the same time the quality of games available on mobile has increased significantly and developers now look to build titles on the platform that wouldn’t have been reasonable or feasible only a few short years ago. This is arguably due to the marked increase in computing power that has been made available to even the most rudimentary of smart phones, which has spurred developers on to be far more ambitious with the kinds of titles they develop for the platform.

What I never considered though was a crossover between the traditional console market and the now flourishing mobile sector. That’s where OUYA, an Android based game console, comes into play.

OUYA is, at its heart, a smartphone without a screen or a cellular chipset. It boasts an NVIDIA Tegra 3 coupled with 1GB of RAM, 8GB of flash storage, Bluetooth and a USB 2 port for connectivity. For a console the specifications aren’t particularly amazing, in fact they’re downright pitiful, but it’s clear their idea for a system isn’t something that can play the latest Call of Duty. Instead the OUYA’s aim is to lure that same core of developers, the ones who have been developing games for mobile platforms, over to their platform by making the console cheap, licence free and entirely open. They’ve also got the potential to get a lot of momentum from current Android developers, who will just need a few code modifications to support the controller, giving them access to potentially thousands of launch titles.
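To illustrate why the barrier for existing Android developers looks so low, here’s a rough sketch of what adding controller support to an existing touch-driven game might look like. It’s an assumption on my part that the OUYA will surface its controller through the standard Android gamepad key codes (there’s no public OUYA SDK to reference yet), and the game hook methods are hypothetical stand-ins for whatever a title already uses:

```java
import android.app.Activity;
import android.view.KeyEvent;

// Sketch only: map standard Android gamepad key codes onto an existing
// game's input handling, assuming the console reports its controller
// through the normal KeyEvent mechanism.
public class GameActivity extends Activity {

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        switch (keyCode) {
            case KeyEvent.KEYCODE_BUTTON_A:
                // Treat the primary face button like a screen tap.
                handlePrimaryAction();
                return true;
            case KeyEvent.KEYCODE_DPAD_LEFT:
            case KeyEvent.KEYCODE_DPAD_RIGHT:
                handleMovement(keyCode == KeyEvent.KEYCODE_DPAD_LEFT ? -1 : 1);
                return true;
            default:
                return super.onKeyDown(keyCode, event);
        }
    }

    // Hypothetical game hooks standing in for the title's existing logic.
    private void handlePrimaryAction() { /* e.g. fire, jump or confirm */ }

    private void handleMovement(int direction) { /* e.g. steer the player */ }
}
```

If it really does come down to wiring up a handful of key codes like this, porting an existing Android title becomes a weekend job rather than a project.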

I’ll be honest, at the start I was somewhat sceptical about what the OUYA’s rapid funding success meant. When I first looked at the console’s specifications and intended market I got the feeling that the majority of people ordering it weren’t doing it for the OUYA as a console; no, they were more looking at it as a cracking piece of hardware for a bargain basement price. Much like the Raspberry Pi, the OUYA gives you some bits of tech that are incredibly expensive to acquire otherwise, like a Tegra 3 coupled with 1GB of RAM and a Bluetooth controller. However that was back when there were only 8,000 backers, and as of this morning there are almost 30,000 orders in for this unreleased console. Additionally the hype surrounding the console doesn’t appear to be centred on the juicy bits of hardware underneath it; people seem to be genuinely excited by the possibilities that could be unlocked by such a console.

I have to admit that I am too. Whilst I don’t expect the OUYA to become the dominant platform, or to see big name developers rushing to release torrents of titles on it, the OUYA represents something that the console market has been lacking: a low cost player that’s open to anyone. It’s much like the presence of an extremely cut-rate airline (think Tiger Airlines in Australia): sure, you might not fly with them all the time because of the ridiculous conditions attached to the ticket, but their mere presence keeps the other players on their best behaviour. The OUYA represents a free, no-holds-barred arena where big and small companies alike can duke it out, and whilst there might not be many multi-million dollar titles made for the platform you can bet that the big developers won’t be able to ignore it for long.

I’m genuinely excited about what the OUYA represents for the console games industry. With innovation seemingly at a standstill for the next year or two it will be very interesting to see how the OUYA fares, especially considering the release date for the first production run is slated for early next year. I’m also very keen to see what kinds of titles will be available for it at launch and, hacker community willing, what kinds of crazy, non-standard uses for the device come out. I gladly plonked down $149 for the privilege of getting one with two controllers and even if you have only a casual interest in game consoles I’d urge you to do much the same.

It Would Be So Easy To Give Up.

I’ve been an on again, off again developer ever since my first year of university. I wasn’t particularly good at it either and it took me a good year of slogging through various programming languages before the penny finally dropped when I started using C#. After that initial hump however I found it much easier to pick up on new languages and technologies which has ultimately culminated in me attempting to create my own web application from the ground up, something I would’ve seen as impossible just a few years ago. It’s just over a year and a half since I began work on my pet project and in that time it’s gone through 3 complete rewrites, 4 redesigns and several months of me staring at a computer screen wondering if this is the best thing to do with my time.

It was that little hater getting into my head again.

I hadn’t really been thinking about much until a friend of mine commented on how he’d noticed that my writings indicated I was getting tired of developing Lobaco. After thinking about it for a while I knew he was right, the long weekends spent coding and testing had been taking their toll on me mentally. I had begun to fantasise about other applications I could be developing or other hobbies I could pick up, losing hours in research. After a while they started to meld together and my new found hobbies were turning into other potential start up ideas and I began lusting after them as they began to look so much more tangible than Lobaco. It was the dreaded unknowing procrastination beginning to slip in again and I had been welcoming it willingly.

As Jay Smooth put it so aptly, it was being in the thick of creation for so long that was making me lose sight of the end game. I’ve been writing on this blog for over 2 years now and there have been many times I’ve thought I should just give it up and shut the whole thing down (I would gain a considerable amount of time per day back again) but every time I get a comment, either here or in real life, I know that the work I do here is appreciated and it keeps me going that much longer. I’ve finally come to terms with the fact that some days I just won’t be able to find anything to write about, and that doesn’t mean this blog is worthless. I do enjoy blogging, and when I’ve got a topic I’m passionate about I feel it shows; it’s posts like that which keep me coming back every day in the hopes I’ll hit on one of those topics.

Ever since that realisation I’ve been making great strides with the Lobaco iPhone application. Last weekend was probably my most productive ever, with 4 core features implemented and many improvements made thanks to some open source libraries I hadn’t come across before. Now it feels like I’ve hit one of those points where my progress as an iPhone developer is accelerating; my formerly hacker-style approach is becoming more standardised and new features are just rolling off my fingers. I’ve still got a couple of months of development effort ahead of me before I’ll be releasing the iPhone application to beta testers, but now it’s only a matter of time rather than the impossible mountain it used to be.

I guess this is why the majority of start ups are founded by more than a single person. It’s so easy to get lost in your own world when you’re trying to bring an idea into reality, and having someone there beside you really helps to keep you in the game and focused on the goal. Whilst I haven’t found anyone (yet, but I’m still looking!) who’s willing to go on this startup journey with me, my group of close friends have acted as the sounding board and grounding rod that’s gotten me this far into the project. The next few months are going to be the make or break time for Lobaco but with the progress I’ve made in just the past couple of weeks I have a much renewed level of confidence, and a desire to succeed that is yet to be satiated.

Google Wave: Rethinking Communication.

Taking a look over the past decade or so of technology and communications you’ll notice that there haven’t been any revolutionary ideas come forward. Sure there have been a lot of improvements or augmentations to current technology, but no one has really gone back and thought about the underlying principles of communication and how to make them transcend new mediums. When I first heard about Google Wave it was something that was supposed to be “pretty cool” but I didn’t hear much more of it than that. Cue the following video, which I thoroughly recommend watching if you have the time (I’ll give a general overview of it anyway if you can’t spare the 1.5 hours):

In essence Wave is a wrapper around many different modes of communication such as email, Twitter and instant messaging, with augmentations that allow for some creative ways of interacting with the flow of the conversation. This by itself isn’t a revolutionary means of transforming communication, but Google’s decision to open source the majority of the code and develop a wealth of APIs will allow the market to drive the innovation, and that is where the true revolution can begin.

Looking over it I couldn’t help but notice a trend that has been developing over the past few years when it comes to technologies like this. On the Internet we have access to such wide and disparate sources of information that it is easy to become overwhelmed when you’re trying to filter out everything you don’t want to see. Many technologies have tried to solve this issue by allowing you to aggregate your personal choices into one interface (things like RSS feeds) in the hope of focusing your experience. Wave is Google’s attempt to transcend all those mediums and bring them onto a more robust and open platform, which is not only a boon for them but also for standards based development, something the Internet has been lacking for a long time (thank you Internet Explorer!).

I like the idea a lot. I use many different forms of communication these days and it would be great to have a unified web based interface to the lot of them. Of course the augmentations that Google has added (that spell checker and auto-translator are awesome) would make using this platform worthwhile by themselves, but as we’ve seen with other Google products, once developers get their hands on it the applications will widen considerably. Couple that with the fact that they’ll let you run your own Wave server and I’m sold; I love having new toys to play with on my web server 🙂

Hopefully the haiku and ASCII frog I sent them will butter them up enough to send me an invite….. 😉