

Are We Really Surprised by Australians Being Pirates?

Australia is an incredibly strong country economically, ranked the 12th largest in the world by GDP. When you consider that our population is a fraction of that of many countries above us (Canada is the closest in size and sits in 11th spot with a population about 50% bigger than ours), it means that, on average, Australians are wealthier than their global counterparts. This is somewhat reflected in the prices we pay for certain things; however, it doesn't take a lot of effort to show that we pay more than you'd expect for many goods and services. The most notable is media, as we lack any of the revolutionary services that drive prices down (Netflix, Hulu, etc.) or any viable alternatives. It gets even worse, though, as it seems we also pay more just to go to the cinema.

Cost of a movie by country


The graphic above shows that Australia, along with a few other developed nations, pays an extraordinary amount more than others once the costs are normalized. The difference between the lowest and the highest isn't exactly huge (you're looking at a spread of about $15 from the cheapest to the most expensive), but it's yet another indication of just how much more Australia pays for its media than anyone else does. In essence we're paying something on the order of 25% to 50% more for the same product, yet the excuse the industry once relied on, that Australia is "really far away", doesn't hold water anymore.

It should come as little surprise, then, that Australians are far more likely to pirate than any other developed country, sometimes representing almost 20% of new release piracy. There have been some inroads made into reducing this number, with a few stations "fast-tracking" episodes (although they still usually carry a delay) or giving users access to an online option; however, the former doesn't solve the problem entirely and the latter was unfortunately repealed. The hunger for the media is there, it's just that a reasonably priced option has failed to materialize for Australian users (and if you mention Quickflix I'll gut you), which has led to these dramatic figures.

Now I'd be entirely happy doing the slightly dodgy and getting myself a Netflix or Hulu account via a VPN or geo-unblocking service; however, my bandwidth isn't up to the task of streaming media at 720p. Sure, it could probably handle a lower resolution, but I didn't invest as much as I did in my entire home theatre system to have it operate at a sub-par level. This issue was supposed to go away with the NBN being just around the corner, but I have no idea when that might be coming nor which incarnation of it I'll end up getting. So it seems that, at least for now, I'm stuck in digital limbo where I either turn to piracy or get gouged repeatedly.

Neither of these issues is beyond fixing and indeed it's been shown that once a reasonably priced alternative becomes available people ditch piracy in a heartbeat. Heck, I know that for me once Steam became widely available my game spend increased dramatically, especially after I found sites like DLcompare. I can assure you that the same will happen once a media-based alternative comes to Australia, and I'm not the only one with the disposable income to support it.

OculusVR Headset Developer Preview 2

What’s the Deal with Facebook Acquiring Oculus VR?

Companies buying other companies is usually nothing to get excited about. Typically it's a big incumbent player buying up a small company that's managed to out-innovate it in a particular market segment, so instead of losing market share the incumbent chooses to acquire it. Other times it's done to funnel the customer base onto the core product the incumbent is known for, much as Google did with many of its acquisitions, like Android. Still, every so often a company will seemingly go out of its way to acquire another that honestly doesn't seem to fit and we're all left wondering what the hell they're thinking. Facebook did exactly that today, acquiring the virtual reality pioneer OculusVR.

Facebook and OculusVR could not be more different: one is the biggest social network in the world, with 1.23 billion active users per month, the other a small company of only 50 employees focused on developing virtual reality technology. Whilst the long-winded PR speech from Zuckerberg seems to indicate that they're somehow invested in making the Oculus Rift the new way of experiencing the world, it's clear that Facebook is going to be running it as its own little company, much like Instagram and WhatsApp before it. With the recent rumours of Facebook looking to purchase drone manufacturer Titan Aerospace, another company that doesn't seem like a good fit for the Facebook brand, it raises the question: what's Facebook's plan here?

Most of the previous high profile acquisitions aligned directly with Facebook's weaknesses, namely how badly they were doing in the mobile space. Instagram fit the bill perfectly in this regard as they managed to grow a massive mobile-only social network that rivalled Facebook's own mobile client for usage. Whilst many questioned whether paying $1 billion for a company that hadn't generated a single dollar was worth it, it seems Facebook got some value out of the deal as their mobile experience has improved drastically since then. WhatsApp seemed to be in a similar vein, although the high cost of acquisition (even though this one had some revenue to back it up) makes it much more questionable than the Instagram purchase. Still, both of them filled a gap that Facebook had; OculusVR doesn't do that.

From my perspective it seems like Facebook is looking to diversify its portfolio, and the only reason I can think of to justify that is that their core business, the Facebook social network, is starting to suffer. I can't find any hard evidence to back this up, but the business community does seem to feel that Facebook is starting to lose its younger audience (teens specifically) to messenger apps. Acquiring WhatsApp goes some way to alleviating this, but acquiring the most popular app every couple of years isn't a sustainable business model. Instead it looks like they might be trying to recreate the early Google environment, one that spawned multiple other lines of business that weren't directly related to their core business.

This was definitely a successful model for Google; however, most of the products and acquisitions they made at a similar stage to Facebook were centred around directing people back to their core products (search and advertising). Most of the moonshot ideas, whilst showing great initial results, have yet to become actual lines of business for them, with the two most notable, Glass and the self-driving car, still in the developmental or early adopter phase. Facebook's acquisition of OculusVR doesn't really fit into this paradigm either; however, with OculusVR likely to be first to market with a proper virtual reality headset, it might just be a large bet that this market segment will take off.

Honestly it's hard to see what Facebook's endgame is here, both for OculusVR and for themselves as a company. I think Facebook will stay true to their word about keeping OculusVR independent, but I have no clue how they'll draw on the IP and talent there to better themselves. Suffice to say not everyone is of the same opinion and this is something that Facebook and OculusVR are going to have to manage carefully lest the years of goodwill they've built up be dashed in a single moment. I won't go so far as to say that I'm excited to see what these two will do together, but I'll definitely be watching with a keen interest.


New Microsoft Chief Executive Officer, Satya Nadella

What Kind of Microsoft Can We Expect From Satya Nadella?

In the time that Microsoft has been a company it has only known two Chief Executive Officers. The first was, unforgettably, Bill Gates, the point man of the company from its founding days who saw it grow from a small software shop to the industry giant of the late 90s. Then, right at the beginning of the new millennium, Bill Gates stood down and passed the crown to long time business partner Steve Ballmer, who then spent well over a decade attempting to transform Microsoft from a software company into a devices and services one. Rumours had been spreading for some time about who was slated to take over from Ballmer and, last week, after much searching, Microsoft veteran Satya Nadella took over as the third CEO of the venerable company. Now everyone is wondering where he will take it.

For those who don't know him, Nadella's heritage in Microsoft comes from the Server and Tools department, where he's held several high ranking positions over a number of years. Most notably he's been in charge of Microsoft's cloud computing endeavours, including building out Azure, which hit $1 billion in sales last year, something I'm sure helped to seal the deal on his new position. Many would thus assume that Nadella's vision for Microsoft would trend along these lines, which runs a little contrary to the more consumer focused business that Ballmer sought to deliver; however, his request to have Bill Gates step down as Chairman of the Board so that Nadella could have him as an advisor in this space says otherwise.

As with any changing of the guard, many seek to impress upon the new bearer their wants for the future of the company. Nadella has already come under pressure to drop some of Microsoft's less profitable endeavours, including things like Bing, Surface and even the Xbox division (despite it being quite a revenue maker, especially as of late). Considering these products are the culmination of the efforts of the two previous CEOs, both of whom will still be involved in the company to some degree, taking an axe to them would be an extraordinarily hard thing to do. These are the products they've spent years and billions of dollars building, so dropping them seems like a short sighted endeavour, even if it would make the books look a little better.

Indeed many of the business units which certain parties would look to cut are the ones seeing good growth figures. The Surface has gone from a $900 million write down disaster to losing a paltry $39 million over 2 quarters, an amazing recovery that signals profitability isn't too far off. Similarly Bing, the search engine we all love to hate on, saw a 34% increase in revenue in a single quarter. It's not even worth mentioning the Xbox division as it's been doing well for years now, and the release of the XboxOne with its solid initial sales ensures it remains one of Microsoft's better performers.

The question then becomes whether Nadella, and the board he now serves, see the ongoing value in these projects. Much of the work Microsoft has done in the past decade has been focused on unifying many disparate parts of their ecosystem, heading towards that unified nirvana where everything works together seamlessly. Removing these products from the picture feels like Microsoft backing itself into a corner, one where it can easily be shoehorned into a narrative that sees it lose ground to the competitors it has been fighting for years. In all honesty I feel Microsoft is so dominant in those sectors already that there's little to be gained from receding from perceived failures, and Nadella should take this chance to innovate on his predecessor's ideas, not toss them out wholesale.



HP Charging For Updates? Not Sure if Don’t Want.

It's every system administrator's dream to work only on the latest hardware running the most recent software available. This is partially due to our desire to be on the cutting edge of all things, where new features abound and functionality is at its peak. The reality, however, is always far from that nirvana, with the majority of our work being on systems that are years old running software that hasn't seen meaningful updates in just as long. That's why few tears have been shed by administrators worldwide over XP's impending demise, as it signals the end of the need to support something that's now over a decade old. Of course this is much to the chagrin of end users and big enterprises who have yet to make the transition.

Indeed big enterprises are rarely on the cutting edge and thus rely on extended support programs to keep their fleets maintained. This is partially due to the amount of inertia big corporations have, as making a change to potentially thousands of endpoints takes careful planning and execution. Additionally the impacts to the core business cannot be overstated and must be taken into careful consideration before the move to a new platform is made. With this in mind it's really no surprise that corporations often buy support contracts that run for 3 or 5 years on the underlying hardware, as that ensures they won't have to make disruptive changes during that time frame.

HP Care Packs

So when HP announced recently that it would require customers to have a valid warranty or support agreement in order to get updates I found myself in two minds about it. For most enterprises this will be a non-issue, as running hardware that's out of warranty is begging for trouble and not many have the appetite for that kind of risk. Indeed I actually thought this would be a good thing for enterprise level IT as it would mean I wouldn't be cornered into supporting out of warranty hardware, something which has caused me numerous headaches in the past. On the flip side, though, this change does affect something that is near and dear to my heart: my little HP MicroServer.

This new decision means that this little server only gets updates for a year after purchase, after which you're up for at least $100 for an HP Care Pack, which extends the warranty out to 5 years and provides access to all the updates. Whilst I missed the boat on the install issues that plagued its initial release (I got mine after the update came out) I can see the same thing happening again with similar hardware models. Indeed the people hit hardest by this change are likely the ones least able to afford a support plan of this nature (i.e. smaller businesses), who are the typical candidates for running hardware that's outside a support arrangement. I can empathise with their situation, but should I find myself needing an update for them and unable to get it due to their lack of support arrangements I'd be the first one to tell them so.

Indeed the practice isn't too uncommon, with the majority of other large vendors requiring something on the order of a subscription in order to get product updates, the only notable exception being Dell (full disclosure: I work for them). I'll agree that it appears to be a bit of a cash grab, as HP's server business hasn't been doing too well in recent quarters (although no one has done particularly well, to be honest), although I doubt the revenue will do much to counteract the recent downturn. This might also spur some customers to purchase newer hardware whilst freeing up resources within HP that no longer need to support previous generations of hardware.

So I guess what I'm getting at is that whilst I can empathise with the people who will be hard done by with this change, I, as someone who has to deal with warranty/support calls, don't feel too aggrieved. Indeed any admin worth their salt could likely get their hands on the updates without having to resort to the official source anyway. If the upkeep on said server is too much for you to afford then it's likely time to rethink your IT strategy, potentially looking at cloud based solutions that have a very low entry cost when compared to upgrading a server.

Amazon Prime Air Drone

Amazon Prime Air: It's Part Marketing Campaign, Part Flight of Fancy, Part Hand Forcing.

If you haven't been deliberately avoiding mainstream media for the past couple of days then chances are you're already aware of Amazon's latest announcement, Amazon Prime Air. It sounds like the world of science fiction: being able to place an order for something and then have it delivered by an automated drone right to your door in under 30 minutes. Indeed it pretty much is for the time being as, whilst there seems to be a large amount of foundational work done on it, the service is still many years away from seeing actual use. So this has had many asking the question: why would Amazon bother announcing something like this when it's so far away from being a reality?

As many have already rightly pointed out, the timing of the announcement seems to point towards it being an elaborate marketing campaign, with Amazon managing to snag a good 15 minutes of advertising, ostensibly for free, from the 60 Minutes program. This happening right before Cyber Monday is too much of a coincidence for it to be anything other than planned so, at least for the short term, Amazon Prime Air is a marketing tactic to get people to shop at Amazon. I'd hesitate to give credence to the theory that it was done to remediate Jeff Bezos' image, though, as those rumours about his personality have been swirling for ages and I'd be astonished if he wasn't aware of them already.

However the one thing this idea has going for it is that Bezos is behind it, and this isn't the first wild gamble he's undertaken in recent memory. Back in 2000 he founded Blue Origin, a private space company focused on human spaceflight. Whilst it hasn't made the headlines like its competition has, it's still managed a handful of test flights and has worked in conjunction with NASA on its Commercial Crew Development Program. Whilst this is a world away from flying drones to deliver products, it does show that Bezos is willing to commit funding over a long period of time to see something achieved. Though this still leaves the question of why he made the announcement so soon unanswered.

For what it's worth I think the reasoning behind it is to get the public talking about it now so that there'll be some momentum behind the idea when it comes time for Amazon to start talking with legislators about how this system is going to work. If the FAA's track record is anything to go by, such a system wouldn't see the light of day for another 13 years or so. Whilst it's definitely not ready for prime time yet, due to the numerous technical challenges it has yet to overcome, it's unlikely those will take that long to solve. Putting the screws to politicians in this way means that Amazon doesn't have to spend as much money on direct lobbying or convincing the public that it's a good idea.

As for me personally, I think it's a nifty idea; however its application is likely going to be horribly limited, especially in locations outside of the USA. A quick glance over this map reveals just how many locations Amazon has in various countries (don't be fooled by those 2 locations in Australia, they're just corporate offices) and, since the drones need to launch from one of the fulfilment sites, you can see how small a range this kind of service will have. Of course they could always widen this by increasing the number of distribution centres they have, but that's antithetical to their current way of doing business. It's a challenge that can be overcome, to be sure, however I just don't see it getting much air time (ha!) outside of major capital cities, especially in non-USA countries.

I'd love to be proven wrong on this, however, as the lazy introvert inside me loves not having to do anything to get the stuff I want, and the instant gratification such a service would provide is just the icing on the cake. However it's unlikely to see the light of day for several years, and will likely take the better part of a decade to come to Australia, so I'm not exactly hanging out for it. I think the idea has some merit, though whether that will be enough to carry it on as a viable business process is something only time will reveal.


AGIMO ICT Strategy Summary

A New AGIMO Policy is Great, But…

Canberra is a strange little microcosm. If you live here chances are you either work directly for the government as a member of the public service or you're part of an organisation servicing said government. This is especially true in the field of IT, as anyone with a respectable amount of IT experience can make a very good living working for any of the large departments' headquarters. I have made my IT career in this place and I've spent all of my time lusting after the cutting edge of technology whilst dealing with the realities of what large government departments actually need to function. As long time readers will be aware I've been something of a cloud junkie for a while now, but not once have I been able to use it at my places of work, and there's a good reason for that.

Not that you'd know that if you heard the latest bit of rhetoric from the current government, which has criticised the current AGIMO APS ICT Strategy for providing only "notional" guidelines for using cloud based services. Whilst I'll agree that the financial implications are rather cumbersome (although this is true of any procurement activity within the government, as anyone who's worked in one can tell you) what annoyed me was the idea that the security requirements were too onerous. The simple fact of the matter is that many government departments have regulatory and legal obligations not to use overseas cloud providers, due to legislation that restricts Australian government data from travelling outside our borders.

The technical term for this is data sovereignty, and the vast majority of the large government departments of Australia are legally bound to keep all their services, and the data they rely on, on Australian soil. The legislation is so strict in this regard that even data that's not technically sensitive, like the specifications of machines or network topologies, in some cases can't be given to external vendors and must instead be inspected on site. The idea that governments could take advantage of cloud providers, most of which don't have availability zones here in Australia, is then completely ludicrous and no amount of IT strategy policy can change that.

Of course cloud providers aren't unaware of these issues, indeed I've met with several people behind some of the larger public clouds on this, and many of them are bringing availability zones to Australia. Indeed Amazon Web Services has already made itself available here and Microsoft's Azure platform is expected to land on our shores sometime next year. The latter is probably the more important of the two as, if the next AGIMO policy turns out the way it's intended, the Microsoft cloud will be the de facto solution for light user agencies thanks to the heavy use of Microsoft products at those places.

Whilst I might be a little peeved at the rhetoric behind the review of the APS ICT Strategy I do welcome it as, even though it was only written a couple of years ago, it's still in need of an update due to the heavy shift towards cloud services and user centric IT that we've seen recently. The advent of Australian availability zones will mean that the government agencies most able to take advantage of cloud services will finally be able to, especially with AGIMO policy behind them. Still, it will be up to the cloud providers to ensure their systems can meet the requirements of these agencies, and there's every possibility that they will not be enough for some departments to take advantage of.

We’ll have to see how that pans out, however.


Simon Hackett Internode

A Ray of Hope for the NBN.

The resignation of the National Broadband Network board was an expected move given the current government's high level of criticism of the project. While I, and many other technically inclined observers, disagreed with the reasons cited for Turnbull's request for their resignations, I understood that should we want the NBN in the way we (the general public) wanted it then it was a necessary move, one that would allow the Liberal party to put their stamp on the project. However what followed seemed to be the worst possible outcome, one that could potentially see the NBN sent down the dark FTTN path that would doom Australia to remaining an Internet backwater for the next few decades.

They hired ex-Telstra CEO Ziggy Switkowski.

For anyone who lived through his tenure as head of Australia's largest telecommunications company, his appointment to the head of the NBN board was a massive red flag. The implementation of data caps, and a whole host of other misdeeds that have plagued Australia's Internet industry since his time in office, would be reason enough for outrage, but the real crux of the matter is that since his ousting from Telstra he's not been involved in the telecommunications industry for a decade. Whatever experience he had is now long dated and, whilst I'm thankful that his tenure as head of the board is only temporary (until a new CEO is found), the fact that he has approved other former Telstra executives to the NBN board shows that even a small amount of time there could have dire implications.

News came yesterday, however, that Turnbull has appointed Simon Hackett, of Internode fame, to the NBN board. In all honesty I never expected this to come through as, whilst there were a few grass roots campaigns to make it happen, I didn't think they'd have the required visibility. However Hackett is a well known name in the Australian telecommunications industry and it's likely that his reputation was enough for Turnbull to consider him for the position. Best of all he's been a big supporter of the FTTH NBN since the get-go and with this appointment will be able to heavily influence the board's decisions about the future of Australia's communication network.

Whilst I was always hopeful that a full review of the feasibility of the NBN would come back with resounding support for a FTTH solution, this appointment all but guarantees such an outcome. Of course Turnbull could still override that, but with his staunch stance of going with the review's decision it's highly unlikely he'd do so, lest he risk some (even more) severe political backlash. The most likely change I can see coming, though, is that a good chunk of the rollout, mostly for sites where there are no current contracts, will fall to Telstra. Whilst I'm a little on the fence about this (they'd be double dipping, in that they'd get paid to build the new network and for disconnecting their current customers) it's hard to argue that Telstra isn't a good fit for the job. I guess the fact that they won't end up owning it in the end does make it a fair bit more palatable.

So hopefully with Hackett's appointment to the NBNCo board we'll have a much more technically inclined view presented at the higher levels, one that will be able to influence decisions down the right path. There are still a few more board members to be appointed and hopefully more of them are in the same vein as Hackett, as I'd rather not see the board fully staffed with people from Telstra.


NOKIA BRICKS

Microsoft Buys Nokia: That’s Great, For One of Them.

If you're old enough to remember a time when mobile phones weren't commonplace you likely also remember the time when Nokia was the brand to have, much like Apple is today. I myself owned quite a few of them, my very first phone ever being the (then) ridiculously small Nokia 8210. I soon gravitated towards other, more shiny devices as my disposable income allowed, but I did find myself in possession of an N95 because, at the time, it was probably one of the best handsets around for techno-enthusiasts like myself. However it's hard to deny that they've struggled to compete in today's smartphone market and, unfortunately, their previous domination of the feature phone market has also slipped away from them.

Their saving grace was meant to come from partnering with Microsoft, and indeed I attested to as much at the time. Casting my mind back to when I wrote that post I was actually of the mind that Nokia was going to be the driving force for Microsoft; in retrospect, however, it seems the partnership was done in the hope that both of their flagging attempts in the smartphone market could be combined into one, potentially viable, product. Whilst I've praised the design and quality of Windows Phone based Nokias in the past, it's clear that the amalgamation of two small players hasn't resulted in a viable strategy to accumulate a decent amount of market share.

You can then imagine my surprise when Microsoft up and bought Nokia’s Devices and Services business as it doesn’t appear to be a great move for them.

So Nokia as a company isn't going anywhere, as they still retain control of a couple of key businesses (Solutions and Networks, HERE/Navteq and Advanced Technologies, which I'll talk about in a bit), however they're not going to be making phones anymore as that entire capability has been transferred to Microsoft. That's got a decent amount of value in itself, mostly in the manufacturing and supply chains, and Microsoft's headcount will swell by 32,000 when the deal is finished. However whether that's going to result in any large benefits for Microsoft is debatable, as they arguably got most of this in their 2011 strategic partnership; the difference is that they can now do all the same things without the Nokia branding on the final product.

If this type of deal sounds familiar then you're probably remembering the nearly identical acquisition Google made of Motorola back in 2011. Google's reasons and subsequent use of the company were quite different, however, and, strangely enough, they have yet to use them to make one of their Nexus phones. Probably the biggest difference, and this is key to why this deal is great for Nokia and terrible for Microsoft, is the fact that Google got all of Motorola's patents; Microsoft hasn't got squat.

As part of the merger a new section is being created in Nokia called Advanced Technologies which, as far as I can tell, is going to be the repository for all of Nokia's technology patents. Microsoft has been granted a 10 year license to all of these, which becomes perpetual when it expires, however Nokia keeps ownership of all of them and the license they gave Microsoft is non-exclusive. So, since Nokia is really no longer a phone company, they're now free to start litigating against anyone they choose without much fear of counter-suits harming any of their products. Indeed they've stated that the patent suits will likely continue post-acquisition, signalling that Nokia is likely going to look a lot more like a patent troll than a technology company in the near future.

Meanwhile Microsoft has been left with a flagging handset business, one that's failed to reach the kind of growth required to make it sustainable long term. Now there's something to be said for Microsoft being able to release Lumia branded handsets (they get the branding in this deal), but honestly their other forays into the consumer electronics space haven't gone so well, so I'm not sure what they're going to accomplish here. They've already got the capability and distribution channels to get products out there (go into any PC store and you'll find Microsoft branded peripherals there, guaranteed) so, whilst it might be nice to get Nokia's version of that all built and ready, I'm sure they could have built one themselves for a similar amount of cash. Of course the Lumia tablet might be able to change consumers' minds on that one, but most of the user complaints around Windows RT weren't about the hardware (as evidenced in my review).

In all honesty I have no idea why Microsoft would think this is a good move, let alone one that would let them do anything more than they're currently doing. If they had acquired Nokia's vast portfolio of patents in the process I'd be singing a different tune, as Microsoft has shown how good they are at wringing license fees out of people (so much so that the revenue they get from Android licensing exceeds that of their Windows Phone division). However that hasn't happened and instead we've got Nokia lining up to become a patent troll of epic proportions and Microsoft left with a $7 billion patent licensing deal that comes with its own failing handset business. I'm not alone in this sentiment either: Microsoft's shares dropped 5% on the announcement, which isn't great news for this deal.

I really want to know where they’re going with this because I can’t for the life of me figure it out.

 

Model S

Tesla Plays With The Big Boys and Wins.

Elon Musk is quite the business magnate. Long time readers will know that he's the CEO of SpaceX, the current darling of the private space industry, which has done as much innovation in a decade as others have done in half a century. That's not Musk's only endeavor though: he started out in the payments industry, famously being PayPal's largest shareholder when it was eventually acquired by eBay for $1.5 billion. That allowed him to create two companies of his own, SpaceX and Tesla Motors, whilst being heavily involved in a third, SolarCity. The success of all these companies can't be denied, but it wasn't always roses, especially for Tesla, and indeed for Musk himself.

Model S

Building a car manufacturer, especially one that eschews the traditional internal combustion engine for full electric drive, is fraught with risk and requires massive amounts of capital to pull off. Whilst Tesla's end goal has always been affordable electric cars for everyone it didn't start off servicing that market, instead focusing on a high performance electric roadster with a very limited production run. Of course this drew skepticism from potential investors, who couldn't be sure that Tesla would be anything more than a niche sports car producer, and so many steered clear. Musk was undeterred however, and in 2008 announced the Model S, hinting at further models that would use the same powertrain, effectively creating a platform for the rest of Tesla's fleet.

To say that the rest of the world was skeptical that they could pull this off would be putting it lightly. Even though they managed to secure a $451.8 million loan from the Department of Energy to help set them up, investors continued to short their stock heavily, to the point where it was one of the most shorted stocks on the NASDAQ. Some went as far as to say that Tesla was only profitable thanks to American taxpayers, words which were served right back to them with a side of humble pie when Tesla paid the loan back in full at the start of this year, 9 years before it was due. Since then Tesla's stock has continued to climb, and it's not just because people are looking for a pump 'n' dump.

The Tesla Model S won car of the year from both Motor Trend and Automobile Magazine last year, rocketing it from a toy for the technical/green crowd to a well known brand. Whilst it's still not in the realm of the everyman, with the base model starting at some $65,000, it has proved to be quite a popular car, snagging 8% of the luxury car market in the USA. To put that into perspective, the Model S has beaten the sales of both the BMW 7 Series and the Audi A8, cars which have a pretty loyal following and have been around for decades. They're only just beginning to ramp up production as well, with the current 400 or so cars produced per week expected to double by year's end, making them one of the largest producers of purely electric vehicles.

Tesla has not only shown that fully electric vehicles are possible today, they've shown they can be great business too. Whilst investors might be skeptical, other car companies aren't: the number of EVs available is exploding as each manufacturer tries to carve out their own section of this market. Most of them are focusing on the low end for now, however, and it's highly likely that Tesla will eat their lunch when the eventual $30,000 model debuts sometime in the future. Still, the more competition in this space the better, as it means the products we get as consumers become that much better and, of course, cheaper.

Now all we have to do is hope that the Australia Tax doesn’t hit the Model S as that’d put the kibosh on my enthusiasm a little bit.

Daft Punk Random Access Memories

Legitimate Piracy.

If you were to plot my rate of piracy related activities over time it'd show a strong negative correlation with my salary. My appetite for software, games and music hasn't really changed over the years, but as my income has grown I've found myself preferring to pay for something if I can, especially now that many services outcompete the pirated product in terms of features and convenience. I'd be lying if I said guilt wasn't part of it too; whilst I didn't have the money to give back at the height of my piracy days, I feel like I'm beginning to make up for it now. Still, I constantly find situations where I need to turn to less than legal avenues to get a product I want, usually one I've purchased anyway.

Daft Punk Random Access Memories

Indeed this happened quite recently with my purchase of the new Daft Punk album. My long time Twitter followers will tell you that I went rather…hyperbolic when I heard their new album was due out this year, and I make no secret of the fact that they're my favourite band, bar none. That translates to wanting to give them as much of my money as I can, so I plonked down the requisite $50 to preorder the vinyl version of their album (mostly as a talking piece), which also included a digital download. Considering the album was going to be available everywhere digitally on day 1, I figured I'd get an email with the download code and the vinyl would take its merry time getting here.

I received no such email.

My copy of Random Access Memories showed up yesterday, almost a week after the official launch date and nearly two weeks after Daft Punk made it available for streaming through iTunes. I had a couple of options available to me at that point: I could simply wait until my copy arrived, listen to the stream (requiring an iTunes install, something I didn't want to do) or find another way. My other way was to find an upload on Grooveshark, which was obviously not authorized and was taken down a day later. I got to hear the album at roughly the same time as everyone else, which was all I really wanted, but I couldn't help feeling like I'd been cheated somewhat just because I'd tried to support the artists as much as I could.

I felt no guilt going to slightly nefarious sources to get my Daft Punk fix, but honestly I shouldn't have had to. There's nothing special about the code they sent me that requires it to be physical, and it's not like emailing a code for people who preordered to plug into a website is an unsolved problem either. The pirates in this instance were making up for the failings of others, providing a service to everyone regardless of whether they'd made the purchase or not. Now that I've got my real copy I have no need for it, but it still gets to me that they're providing a valuable service, one I didn't have to pay them for.

Sure, in the larger scheme of things it's a small gripe, but it's things like this that highlight why piracy exists and will continue to exist for a long time to come. The effort required to fix these failings is quite trivial; the pirates don't do this as their full-time job, so the companies providing the service just need to hurry up and outcompete them. If Valve can get digital distribution right then I see no reason why others can't, but until then I'll still have to rely on my slightly nefarious friends to make up for their failings.