Posts Tagged ‘amazon’

Amazon Teases Prime Air, Again.

The last time I wrote about Amazon Prime Air was almost 2 years ago to the day and back then it seemed to be little more than a flight of fancy. Drones, whilst becoming somewhat commonplace, were still an emerging space, especially when it came to regulations and companies making use of them. Indeed the idea instantly ran afoul of the FAA, something which Amazon was surprisingly blasé about at the time. Still, there had been musings about continued development of the program, and today they’ve shown off another prototype drone that they might use in the future.

The drone is an interesting beast, capable of both VTOL and regular flight. This was most likely done to increase the effective range of the craft, as traditional flight is a lot less energy intensive than pure VTOL flight. The new prototype has a stated range of 16 miles (about 25 km), which you’d probably have to cut in half for the return trip. Whilst that’s likely an order of magnitude above the previous prototype they showcased 2 years ago, it still means that a service based on these drones will either be very limited or Amazon is planning a massive shakeup of its distribution network.
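
To put that in perspective, the arithmetic is a quick back-of-envelope job (the 16 mile figure is Amazon’s stated number; everything else follows from it):

```python
# Back-of-envelope delivery radius from the stated range.
stated_range_miles = 16
km_per_mile = 1.609344
round_trip_km = stated_range_miles * km_per_mile   # ~25.7 km in total
delivery_radius_km = round_trip_km / 2             # the drone has to fly home
print(f"Effective delivery radius: {delivery_radius_km:.1f} km")  # ~12.9 km
```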

Of course the timing of this announcement (and the accompanying video below), mere hours before the yearly Cyber Monday sale starts in earnest, can’t be ignored. Amazon Prime Air is undeniably a marketing tactic, one that’s worked well enough in the past to warrant trying it again to boost sales on the day. On the flip side Amazon does seem pretty committed to the idea, with their various proposals for airspace usage and “dozens of prototypes” in the works; however, until they start offering the service to real customers it’s going to be easy to remain skeptical.

Last time I wrote about Amazon Prime Air one of my local readers mentioned that a similar service was looking to take off here in Australia. The offering was going to be a joint effort between Flirtey, a delivery drone developer, and Zookal, a local textbook sales and rental service. They were targeting the middle of last year for their first delivery by drone, however that never came to pass. Indeed an article from earlier this year was all I could dredge up on the service, and it shows they have still yet to use it commercially. To their credit Flirtey did make the first drone delivery in the USA in July this year, so the technology is there; it just needs to be put to use.

Whether or not something like this will see widespread adoption, however, is something I’m still not sure about. Right now the centralized distribution models that most companies employ simply don’t work with the incredibly limited range that most drones have. Even if the range issue could be solved I’m still not sure it would be economical to use them unless the delivery fees were substantially higher (and then how many customers would pay for that?). Don’t get me wrong, I still think it’d be incredibly cool to get something delivered by drone, but at this point I’m still not 100% sold on the idea that it can be done economically.

Amazon Prime Air: It’s Part Marketing Campaign, Part Flight of Fancy, Part Hand Forcing.

If you haven’t been deliberately avoiding mainstream media for the past couple of days then chances are you’re already aware of Amazon’s latest announcement: Amazon Prime Air. It sounds like the stuff of science fiction, being able to place an order for something and then have it delivered by an automated drone right to your door in under 30 minutes. Indeed it pretty much is for the time being: whilst there seems to be a large amount of foundational work done on it, the service is still many years away from seeing actual use. So this has had many asking the question: why would Amazon bother announcing something like this when it’s so far away from being a reality?

Amazon Prime Air Drone

As many have already rightly pointed out, the timing of the announcement seems to point towards it being an elaborate marketing campaign, with Amazon managing to snag a good 15 minutes of advertising, ostensibly for free, from the 60 Minutes program. This happening right before Cyber Monday is too much of a coincidence for it to be anything other than planned so, at least for the short term, Amazon Prime Air is a marketing tactic to get people to shop at Amazon. I’d hesitate to give credence to the theory that it’s been done to rehabilitate Jeff Bezos’ image though, as those rumours about his personality have been swirling for ages and I’d be astonished if he wasn’t aware of them already.

However the one thing this idea has going for it is that Bezos is behind it, and this isn’t the first wild gamble he’s undertaken in recent memory. Back in 2000 he founded Blue Origin, a private space company focused on human spaceflight. Whilst it hasn’t made the headlines like its competition has, it’s still managed a handful of test flights and has worked with NASA under the Commercial Crew Development program. Whilst this is worlds away from flying drones to deliver products, it does show that Bezos is willing to commit funding over a long period of time to see something achieved. Though this still leaves the question of why he made the announcement so soon unanswered.

For what it’s worth I think the reasoning behind it is to get the public talking about it now so that there’ll be some momentum behind the idea when it comes time for Amazon to start talking with legislators about how this system is going to work. If the FAA’s timelines are anything to go by, such a system wouldn’t see the light of day for another 13 years or so. Whilst it’s definitely not ready for prime time yet, due to the numerous technical challenges it has yet to overcome, it’s unlikely those challenges will take that long to solve. Putting the screws to politicians in this way means that Amazon doesn’t have to spend as much money on direct lobbying or on convincing the public that it’s a good idea.

As for me personally, I think it’s a nifty idea, however its application is likely going to be horribly limited, especially in locations outside of the USA. A quick glance over this map reveals just how few locations Amazon has in various countries (don’t be fooled by those 2 locations in Australia, they’re just corporate offices) and, since the drones need to launch from one of the fulfilment sites, you can see how small a range this kind of service will have. Of course they could always widen this by increasing the number of distribution centres they have, but that’s antithetical to their current way of doing business. It’s a challenge that can be overcome, to be sure, however I just don’t see the service getting much air time (ha!) outside of major capital cities, especially in non-USA countries.

I’d love to be proven wrong on this, however, as the lazy introvert inside me loves not having to do anything to get the stuff I want, and the instant gratification such a service would provide is just the icing on the cake. However it’s unlikely to see the light of day for several years yet, and likely the better part of a decade before it comes to Australia, so I’m not exactly hanging out for it. I think the idea has some merit, though whether that will be enough to carry it as a viable business is something only time will reveal.

A New AGIMO Policy is Great, But…

Canberra is a strange little microcosm. If you live here chances are you’re either working directly for the government as a member of the public service or you’re part of an organisation that’s servicing said government. This is especially true in the field of IT, as anyone with a respectable amount of IT experience can make a very good living working for any of the large departments’ headquarters. I’ve made my IT career in this place, spending my time lusting after the cutting edge of technology whilst dealing with the realities of what large government departments actually need to function. As long time readers will be aware I’ve been something of a cloud junkie for a while now, but not once have I been able to use it at my places of work, and there’s a good reason for that.

AGIMO ICT Strategy Summary

Not that you’d know that if you heard the latest bit of rhetoric from the current government, which has criticised the current AGIMO APS ICT Strategy for providing only “notional” guidelines for using cloud based services. Whilst I’ll agree that the financial implications are rather cumbersome (although this is true of any procurement activity within the government, as anyone who’s worked in one can tell you), what annoyed me was the idea that the security requirements were too onerous. The simple fact of the matter is that many government departments have regulatory and legal obligations not to use overseas cloud providers, due to legislation that restricts Australian government data from travelling outside our borders.

The technical term for this is data sovereignty, and the vast majority of Australia’s large government departments are legally bound to keep all their services, and the data they rely on, on Australian soil. The legislation is so strict in this regard that even data that’s not technically sensitive, like say the specifications of machines or network topologies, in some cases can’t be given to external vendors and must instead be inspected on site. The idea that these departments could take advantage of cloud providers, most of which don’t have availability zones here in Australia, is completely ludicrous and no amount of IT strategy policy can change that.

Of course cloud providers aren’t unaware of these issues (indeed I’ve met with several of the people behind some of the larger public clouds on this) and many of them are bringing availability zones to Australia. Amazon Web Services has already made itself available here and Microsoft’s Azure platform is expected to land on our shores sometime next year. The latter is probably the more important of the two as, if the next AGIMO policy turns out the way it’s intended, the Microsoft cloud will be the de facto solution for lighter-use agencies thanks to the heavy amount of Microsoft products in use at those places.
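
For illustration, region pinning is the main lever the SDKs give you here. A minimal sketch with AWS’s Python SDK (boto3), where the bucket name is hypothetical and ap-southeast-2 is the Sydney region:

```python
# Minimal sketch: keep data on Australian soil by pinning everything to
# the Sydney region (ap-southeast-2). The bucket name is hypothetical.
import boto3

s3 = boto3.client("s3", region_name="ap-southeast-2")
s3.create_bucket(
    Bucket="example-sovereign-data",
    CreateBucketConfiguration={"LocationConstraint": "ap-southeast-2"},
)
# The objects now reside in-country; whether that satisfies a particular
# agency's legislative obligations is a question for its auditors.
```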

Whilst I might be a little peeved at the rhetoric behind the review of the APS ICT Strategy, I do welcome it: even though the strategy was only written a couple of years ago it’s still in need of an update, due to the heavy shift towards cloud services and user centric IT that we’ve seen recently. The advent of Australian availability zones will mean that the government agencies most able to take advantage of cloud services will finally be able to, especially with AGIMO policy behind them. Still it will be up to the cloud providers to ensure their systems can meet the requirements of these agencies, and there’s every possibility that they still won’t be enough for some departments to take advantage of.

We’ll have to see how that pans out, however.

Google’s App Engine Available For On-Premises Deployment.

The public cloud is a great solution to a wide selection of problems, however there are times when its use is simply not appropriate. This is typical of organisations that have specific requirements around how their data is handled, usually due to data sovereignty or regulatory compliance. Whilst the public cloud is a great way to bolster your infrastructure on the cheap (although that’s debatable when you start ramping up your VM sizes), it doesn’t take advantage of the investments in infrastructure that you’ve already made. For large, established organisations those investments are not insignificant, which is why many of them have been reluctant to transition fully to public cloud based services. This is why I believe the future of the cloud will be paved with hybrid solutions, something I’ve been saying for years now.

Microsoft has finally shown that they’ve understood this with the release of the Windows Azure Pack for Server 2012 R2. Sure, there were the beginnings of it with SCVMM 2012 allowing you to add in your Azure account and move VMs up there, but that kind of thing has been available for ages through hosting partners. The Azure Pack on the other hand brings features that were hidden behind the public cloud wall down to the private level, allowing you to make full use of them without having to rely on Azure. If I’m honest I thought that Microsoft would probably be the only ones to try this, given their presence in both the cloud and enterprise spaces, but it seems other companies have begun to notice the hybrid trend.

Google App Engine

Google has been working with the engineers at Red Hat to produce the Test Compatibility Kit for Google App Engine. Essentially this kit provides the framework for verifying the API-level functionality of a private Google App Engine implementation, something which is achievable through an application called CapeDwarf. The vast majority of the App Engine functionality is contained within that application, enough so that current developers on the platform could conceivably run their code on on-premises infrastructure if they so wished. There doesn’t appear to be a bridge between the two currently, like there is with Azure, as CapeDwarf utilizes its own administrative console.
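
To make the portability claim concrete, here’s a sketch of what API-level compatibility means in practice. I’ve used the Python NDB datastore API for familiarity; CapeDwarf itself implements the Java App Engine APIs, so treat this as an analogy rather than a CapeDwarf demo:

```python
# Code written against the App Engine datastore API shouldn't care which
# backend implements it: public GAE or a private, API-compatible clone.
from google.appengine.ext import ndb

class Order(ndb.Model):
    customer = ndb.StringProperty()
    total = ndb.FloatProperty()
    placed = ndb.DateTimeProperty(auto_now_add=True)

key = Order(customer="jane@example.com", total=42.50).put()
print(key.get().customer)  # the same call works against either backend
```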

They’ve done the right thing by partnering with Red Hat, as otherwise they’d lack the penetration in the enterprise market to make this a worthwhile endeavour. I don’t know how much presence JBoss/OpenShift has though, so it might be less about leveraging current infrastructure and more about getting Google’s platform into more places than it currently is. I can’t seem to find any solid¹ market share figures to see how Google currently rates compared to the other primary providers, but I’d hazard a guess they’re similar to Azure, i.e. far behind Rackspace and Amazon. The argument could be made that such software would hurt their public cloud product, but I feel these kinds of solutions are the foot in the door needed to get organisations thinking about using these services.

Whilst my preferred cloud is still Azure, I’m a firm believer that the more options we have to realise the hybrid dream the better. We’re still a long way from having truly portable applications that can move freely between private and public platforms, but the roots are starting to take hold. Given the rapid pace of IT innovation I’m confident that the next couple of years will see the hybrid dream fully realised, and then I’ll finally be able to stop pining for it.

¹This article suggests that Microsoft has 20% of the market which, since Microsoft has raked in $1 billion, would peg the total market at some $5 billion, way out of line with what Gartner says. If you know of some cloud platform figures I’d like to see them, as apart from AWS being number 1 I can’t find much else.

What We Can Tell From These Custom Top Level Domain Applications.

There are some 250+ top level domains available for use on the Internet today and most of them can be had through your friendly local domain registrar. The list has grown steadily over the past couple of decades as more and more countries look to cement their presence on the Internet with their very own TLD. The organisation responsible for all this is the Internet Corporation for Assigned Names and Numbers (ICANN), which looks after the domain names as well as handing out IP blocks to the ISPs and corporations that request them. Whilst it seemed the TLD space was forever going to be the preserve of countries and specific industries, ICANN recently decided to allow anyone who could pony up the requisite $200,000 to have their own TLD, effectively opening the market up to custom domain suffixes.

For an individual such a price seems ludicrous, so it’s unlikely you’ll see .johndoe type domain names popping up all over the place. For most companies though, securing this new form of brand identity is worth far more than the asking price, and so many have signed up to do so. ICANN has since released a list of all the requested gTLDs, and having a look through it has led me, and everyone else it seems, to some interesting conclusions about the big players in this custom TLD space (I made an Excel spreadsheet of it for easy sleuthing).

The biggest player, although it’s not terribly obvious unless you sort by applicant name, is the newly founded donuts.co registry, which has applied for some 300+ new gTLDs in order to start up its business. Donuts has $100 million in seed capital to play with, of which about 60% will be tied up solely in these domain suffix acquisitions. They all seem like your run of the mill SEO-y type words: a large grab bag of terms the general public is likely to be interested in but that are of no value to specific companies. Every domain also has its own associated LLC, which isn’t a requirement of the application process, so I’m wondering why they’ve done it. Likely it’s for isolating losses in the less successful domains, but it seems like an awful lot of work when that could be done in other ways.
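
If you’d rather not sleuth by hand, the same digging is easy to script. A sketch that assumes the list has been exported to CSV; the column names are my guesses, not ICANN’s:

```python
# Count applied-for strings per applicant from a hypothetical CSV export
# of ICANN's application list (column names are assumptions).
import pandas as pd

apps = pd.read_csv("gtld_applications.csv")
counts = apps.groupby("applicant_name")["applied_string"].count()
print(counts.sort_values(ascending=False).head(10))
# Sorting by applicant makes serial applicants like Donuts easy to spot.
```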

They’re not the only ones doing that either: a quick search turns up other companies that have applied for multiple domains, although none of them come close to the number Donuts has. There also seem to be a few companies handling the gTLD applications for other big name companies, ostensibly because those companies have no interest in actually running a gTLD and are just doing it for their brand identity. The biggest player in this space seems to be CSC Global who, strangely enough, did all their applications from another domain under their control, CSCInfo. It’s probably nothing significant, but for a company that apparently specializes in brand identity you’d wonder why they’d apply with a different domain than their own.

What’s really got everyone going though are the domains that Amazon and Google have gone after. Whilst their war chests of gTLDs aren’t anything compared to Donuts’, they’re still quite sizeable, with Amazon grabbing about 80 and Google just over 100. Some are taking this as being indicative of their future plans, as Amazon has put in for gTLDs like .mobile, but realistically I can see most of them being augments to their current services (got an app on AWS? Get your .mobile domain today!). There’s also a bit of overlap in the popular domains that both these companies have gone after, and I’m not sure what the resolution process for that is going to be.

While the 2,000-odd applications seem to show that there’s some interest in these top level domains, the real question of their value, at least for us web oriented folks, is whether the search engines will like them as much as the established TLDs. There’s been a lot of heavy investment in current sites that reside on the regular TLDs and, apart from marketing campaigns and new websites looking for a good name (http://this.movie.sucks seems like it’ll be created in no time), I question how much value these TLDs will bring. Sure there will be an initial gold rush of people looking to secure all the domains they can on these new TLDs, but after that will there really be anything in them? Will businesses actually migrate to these gTLDs as their primary domains, or will they simply redirect them to their current sites? I don’t have answers to these questions, but I’m very interested to see how these gTLDs get used.

Kindle Fire: Amazon’s Not Playing Apple’s Game.

Whilst Android has been making solid inroads into the tablet market, snapping up a respectable 26.8%, it’s still really Apple’s market, with them holding a commanding lead that no one’s been able to come close to touching. It’s not for a lack of trying though, with many big name companies attempting to break into the market only to pull out shortly afterwards, sometimes in a blaze of fire sale glory. It doesn’t help matters that every new tablet is compared to the iPad, ensuring each one attempts to one-up it in some way, usually keeping price parity with the iPad but without the massive catalogue of apps that people have come to expect from Apple products.

Apple’s got a great game going here. All of their iDevice range essentially made the markets that they’re in, grabbing enough fans and early adopters to ensure their market dominance for years to come. Competitors then attempt to mimic Apple’s success by copying the essential ideas and then attempting to innovate, fighting an uphill battle. Whilst Apple might eventually lose ground to the massive onslaught of competitors (like they have to Android), they’ll still be one of the top individual companies, if not number 1. It’s this kind of market leadership that makes Apple products so desirable to John Q. Public, and the reason why so many companies are failing to steal their market share away.

Rumours have been circulating for a while now over Amazon releasing a low cost tablet of some description and of course everyone was wondering whether it would shape up to be the next “iPad killer”. Today we saw the announcement of the Kindle Fire: a 7-inch multi-touch tablet that’s heavily integrated with Amazon’s services and comes at the low low price of only $199.

As a tablet it’s something of an outsider, forgoing the traditional 9 to 10 inch screen size for a smaller 7 inch display. The processor in it isn’t anything fantastic, being just a step up from the one that powers the Nook Color, but history has shown that to be quite a capable system, so the Kindle Fire shouldn’t be a slouch when it comes to performance. There’s also a distinct lack of cameras, 3G and Bluetooth connectivity, meaning the sole connection this tablet has to the outside world will be via your local WiFi network. It comes with 8GB of internal storage that’s not upgradeable, opting instead to store everything in the cloud and download it as required. You can see why this thing wouldn’t work with WhisperNet.

Also absent is any indication that the Kindle Fire is actually an Android device, the operating system having been given a total overhaul. Google’s app store has been outright replaced by Amazon’s own Appstore for Android, and the familiar interface has been replaced by a custom UI designed by Amazon. All of Amazon’s services (music, books and movies to name a few) are heavily integrated with the device. Indeed they are so heavily integrated that the tablet also comes with a free month of Amazon Prime, Amazon’s premium service that offers unlimited free 2 day shipping plus access to their entire catalogue of media. At this point calling this thing a tablet seems like a misnomer; it’s much more of a media consumption device.

What’s really intriguing about the Kindle Fire though is the browser Amazon has developed for it, called Silk. Like Opera Mini and Skyfire before it, Silk offloads some of the heavy lifting to external servers, namely Amazon’s massive AWS infrastructure. There are some smarts in the delineation between what should be processed on the device and what should be done on the servers, so hopefully dynamic pages, which have suffered heavily in this kind of configuration, will run a lot better under Silk. Overall it sounds like a massive step up in browser usability for devices like these, which is sure to be a great selling point for the Kindle Fire.
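
To illustrate the split-browsing idea in miniature, here’s a toy proxy that does the fetch-and-compress work server-side so the device pulls down a smaller payload (an assumption-laden sketch, not Amazon’s actual Silk pipeline):

```python
# Toy split-browsing proxy: the server fetches and compresses the page so
# the device downloads less. Real systems (Silk, Opera Mini) go much
# further, pre-rendering and caching on the server side.
import gzip
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class SplitProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        target = self.path.lstrip("/")        # request as /http://example.com/
        body = urllib.request.urlopen(target).read()
        payload = gzip.compress(body)         # the heavy lifting happens here
        self.send_response(200)
        self.send_header("Content-Encoding", "gzip")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), SplitProxy).serve_forever()
```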

The more I read about the Kindle Fire the more I get the feeling that Amazon has seen the game Apple has been playing and decided not to get caught up in it like their competitors have. Instead of competing directly with the iPad et al. they’ve created a device that’s heavily integrated with their own services and have put themselves at arm’s length from Android. John Q. Public then won’t see the Kindle Fire as an Android tablet nor an iPad competitor; rather it’s a cheap media consumption device, capable of other tasks, from a large and reputable company. The price alone is enough to draw people in and, whilst the margins on the device are probably razor thin, they’ll more than likely make it up in media sales. All of that together makes the Kindle Fire a force to be reckoned with, but I don’t think current tablet manufacturers have much to worry about.

The Kindle Fire, much like the iPad before it, carves out its own little niche, one that has so far gone unfilled. It’s not a feature laden object of every geek’s affection; rather it’s a tablet designed for the masses with a price that competitors will find hard to beat. The deep integration with Amazon’s services will be the feature that ensures the Kindle Fire’s success, as that’s what every other iPad competitor has lacked. However there’ll still be a market for the larger, more capable tablets, as they’re more appropriate for people seeking a replacement for their laptop rather than a beefed up media player. I probably won’t be buying one for myself, but I could easily see my parents using one of these.

And I’m sure that’s what Amazon is banking on too.

My Preferred Cloud Strategy.

Working with Microsoft’s cloud over the past couple of months has been a real eye opener. Whilst I used to scoff at all these people eschewing the norms that have served (and continue to serve) us well in favour of the latest technology du jour, I’m starting to see the benefits of their ways, especially with the wealth of resources that Microsoft has on the subject. Indeed the cloud aspects of my latest side project, whilst consuming a good chunk of time at the start, have required almost no tweaking whatsoever, even after I change my data model or a fundamental part of how the service works. There is one architectural issue that continues to bug me however, and recent events have highlighted why it troubles me so.

The event I’m referring to is the recent outage of Amazon’s Elastic Block Store service, which affected a great number of web services. In essence part of the cloud services that Amazon provides, in this case a cloud disk service that for all intents and purposes works like the hard drive in your computer, suffered a major outage in one of their availability zones. This meant that for most users in that particular zone who relied on the service to store data, their services began to fail, just as your computer would if I ripped its hard drive out whilst you were using it. The cause of the outage can be traced back to human error, but it was significantly compounded by the high level of automation in the system, which is needed for any kind of cloud service at this scale.

For a lot of the bigger users of Amazon’s cloud this wasn’t so much of an issue, since they usually have replicas of their service in the geographically independent zones that Amazon runs. For a lot of smaller users however their entire service is hosted in a single location, usually because they can’t afford the additional investment to geographically disperse their services. Additionally you have absolutely no control over any of the infrastructure, leaving yourself at the mercy of the cloud’s technicians. Granted they’ve done a pretty good job so far, but you’re still outsourcing risk that you can’t mitigate, or at least not affordably.
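
As a rough illustration of what that geographic redundancy looks like from the application side, here’s a hedged sketch of zone failover (the endpoints are hypothetical):

```python
# Probe a primary zone and fall back to a geographically independent
# replica when it stops answering. Endpoints are hypothetical.
import urllib.request

ENDPOINTS = [
    "https://app.zone-a.example.com/health",  # primary availability zone
    "https://app.zone-b.example.com/health",  # independent replica
]

def live_endpoint() -> str:
    for url in ENDPOINTS:
        try:
            if urllib.request.urlopen(url, timeout=2).getcode() == 200:
                return url
        except OSError:
            continue
    raise RuntimeError("all zones are down")
```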

My preferred way of doing the cloud, which I’ve liked ever since I started talking to VMware about their cloud offerings back in 2008, is to combine self hosted services with the extensibility of the cloud. Many services don’t need all the power (nor the cost) of running multiple cloud instances and could function quite happily on a few co-hosted servers. Of course there are times when they’d require extra power to service peak requests, and that’s where the on-demand nature of cloud services would really shine. However, apart from vCloud Express (which is barely getting trialled by the looks of things), none of the cloud operators give you the ability to host a small private cloud yourself and then offload to their big cloud as you see fit, which is a shame since I think it could be one way cloud providers could wriggle their way into some of the big enterprise markets that have shunned them thus far.
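
At the routing layer the offloading could be as simple as the sketch below (names and thresholds are illustrative only, not any vendor’s API):

```python
# Serve from the on-premises pool until utilisation passes a threshold,
# then route the overflow to a public cloud endpoint. All names are
# hypothetical; a real implementation would sit behind a load balancer.
LOCAL_CAPACITY = 100  # concurrent requests the private cloud can absorb

def pick_backend(active_requests: int) -> str:
    if active_requests < LOCAL_CAPACITY:
        return "http://private-cloud.internal/app"  # cheap, already paid for
    return "https://public-cloud.example.com/app"   # pay-per-use overflow

for load in (20, 80, 150):
    print(load, "->", pick_backend(load))
```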

Of course there are other ways of getting around the problems that cloud providers might suffer, most of which involve not using them for certain parts of your application. You could also build your app on multiple cloud platforms (there are some that even have compatible APIs now!), but that would add an inordinate amount of complexity to your solution, not to mention doubling the cost of developing it. The hybrid cloud feels like the best of both worlds, however it’s highly unlikely to become a mainstream solution anytime soon. I’ve heard rumours of Microsoft providing something along those lines, and their new VM Role offering certainly shows the beginnings of that becoming a reality, but I’m not holding my breath. Instead I’ll code a working solution first and worry about scale problems when I get to scale; otherwise I’m just engaging in fancy procrastination.