Flash has been dying a long, slow and painful death. Ever since it developed a reputation for being a battery killer, something abhorrent in today’s mobile-centric world, consumers have avoided it at every opportunity. This has been aided tremendously by the adoption of web standards by numerous large companies, ensuring that no one is beholden to Adobe’s proprietary software. However the incredibly large investment in Flash was never going to disappear overnight, especially since Adobe’s tooling around it is still one of its biggest money makers. It seems Adobe is now ready to start digging Flash’s grave, however, with the announcement that Flash Professional will become Adobe Animate CC.
The change honestly shouldn’t come as much of a surprise as the writing has been on the wall for some time. Adobe first flirted with the idea of Flash Professional being an HTML5 tool way back in 2011 with their Wallaby framework and continued to develop that capability as time went on. Of course it was still primarily a Flash development tool, and the majority of people using it are still developing Flash applications, but it was clear that the market wanted to move away from Flash and onto standards-based alternatives. The rebranding of the product away from being a Flash tool signals that Adobe is ready to let Flash fade into the background and let the standards-based web take over.
Interestingly the change is likely not revenue driven, as Flash accounts for only around 6% of Adobe’s total income. Rather it seems to be about establishing their authoring tools as the standard for all rich web content, broadening the potential user base for the Animate CC application. From that perspective there’s some potential for the rebranding to work, especially since standards-based development is now one of their key marketing plays. Whether that will be enough to pull people away from the alternatives that cropped up in the interim is less clear, but Adobe does have a good reputation when it comes to making creative tools.
Flash will likely still hang around in the background for quite some time though, as much of the infrastructure built up around that ecosystem is slow to change. YouTube is a good example: it dumped Flash as the default for Chrome some time ago, but that still left around 80% of its visitors defaulting to Flash. Similarly other sites still rely on Flash for ads and other rich content, with standards-based solutions really only being the norm for newly developed websites and products. How long Flash will hang around is an open question but I don’t see it disappearing within the next few years.
We’re rapidly approaching a post-Flash world and we will all be much better off for it. Flash is a relic of a different time on the Internet, one where proprietary standards were the norm and everyone was battling for platform dominance. Adobe is now shifting to the larger market of being the tool of choice for content creators on a standards-based web, a battle they’re much more likely to win than fighting to keep Flash alive. I, like many others, won’t be sad to see Flash go as the time has come for it to make way for the new blood of the Internet.
A game based around either of the world wars is usually an instant turn off for me. The number of games based on those events is so large that it feels like there can’t be any new angles left to tackle, as pretty much every story from them has been done to death. The alternate reality and fantasy versions, like those in Wolfenstein, get away with it since they’re not wholly dependent on war stories for inspiration, but they’ll still need a little something extra to pique my interest. Valiant Hearts, which comes to us care of Ubisoft Montpellier, has been receiving wide praise for its touching story. As someone who’s just come off two rather lacklustre story-based titles I wasn’t hoping for miracles, but Valiant Hearts managed to surprise me, bringing this writer to tears at its conclusion.
The year is 1914 and the assassination of Archduke Franz Ferdinand has caused Germany to declare war on Russia. France, anticipating that the war will escalate far beyond those two countries, deports all of its German citizens back to their home country. Karl is one of those citizens: torn away from his wife and young son, he is sent to the frontlines to fight for his home country. Not long after, his wife’s father, Emile, is called up as well and sent to fight for the French army. What follows is a tale of how the war drives families apart and their never-ending quest to be reunited once again.
Valiant Hearts reminds me of the Flash games of yesteryear, albeit with production values far exceeding any of its predecessors. It was developed on the same framework that powered Ubisoft’s recent release Child of Light and it’s easy to see how heavily that choice of platform influenced the artwork. In contrast to Child of Light, however, Valiant Hearts’ art style is far darker and more muted, with infinite shades of brown and grey forming the primary colour palette. This means that when colour is used it’s quite striking, and the art team does a fantastic job of using it to great effect. The same care extends to the beautiful soundtrack that accompanies the game, ebbing and flowing at all the right moments.
In terms of actual gameplay Valiant Hearts is much like other story-first games in the sense that the mechanics usually take a back seat to progressing the story. For the most part you’ll be doing elaborate fetch quests that require you to find one item in order to progress to the next section. Sometimes you’ll have to work your way through several different people to get to the final objective, and try as you might there’s no clever way to bypass certain things. There’s also a bevy of quicktime-esque events that require you to either guess correctly or simply memorize the sequence in which things happen in order to move on to the next part of the story.
Thankfully Valiant Hearts didn’t fall into the trap of putting far too much gameplay between sections of the story like both of the recent titles I played through did. In all honesty I didn’t think it was a major hurdle for games of this nature, as many of them are done by indie developers and ancillary mechanics are usually at the bottom of their to-do list. However with two games falling prey to the same problem I have to commend Valiant Hearts for getting the pacing right, which helps immensely in keeping the player interested in the story. There were some sections that could use some tuning but compared to my recent experiences it was heaven.
Most of the puzzles are fairly intuitive as your inventory is limited to a single item, which significantly limits the complexity the game can throw at you. There’s a pretty good variety of puzzle mechanics so you won’t be redoing the same thing over and over, but most of them shouldn’t take you more than 10 minutes or so to figure out. A couple will require you to think laterally, as some lack obvious cues as to what might interact with what. This did lead to a couple of confusing moments when I wasn’t quite sure if I was doing the right thing, but most of the time you’ll get there through trial and error.
One issue I did find with Valiant Hearts is that, since there’s not a lot of visual differentiation between different parts of the environment, it can sometimes be hard to find the path you’re meant to go down or tell which elements are interactive. In some of the more visually busy sections I was left wondering just where exactly I was meant to go, as I couldn’t find the particular path forward. I also had some deaths that felt like they were due to visual confusion more than anything else. That might just be this writer’s fault, but it’s still an issue worth pointing out.
Of course what really makes Valiant Hearts worth playing is the story. Overall it’s a pretty typical tale of a family torn apart by war, almost Romeo and Juliet-like in its star-crossed lovers from different houses, and of their attempts to reunite. The main characters all receive the background and development they deserve, which helps immensely in the scenes that rely on engaging your empathy for them. Some elements are a little on the fantasy side, which can be a tad distracting from the overall message the game tries to put forth, however they’re only there as aids to the plot so they’re easily pushed aside.
I’ll have to admit that for probably the first half or so of Valiant Hearts I wasn’t too emotionally invested in the characters or story. Whilst the opening was gripping enough to draw me in, there’s a bit of a lull in the early game as the characters seemingly just go through the motions. However as each of their back stories is developed in detail you find yourself becoming attached to them, and each tragedy that befalls them starts to cut into you. The final climactic scene is by far one of the most bittersweet endings I have endured in recent memory, and whilst it might lean on the cheesy/predictable side that didn’t stop me from bursting into tears, overcome with a sense of grief.
Valiant Hearts is a beautiful story masterfully told through the medium of video games. The art style and music direction are some of the best I’ve experienced in their category, taking the traditional Flash-styled game and ramping it up to the next level. The game mechanics are simple, enjoyable and thankfully stay out of the way of the story, leaving the player to enjoy Valiant Hearts for what it truly is. Finally the story is by far one of the best examples I’ve come across this year, with all the characters receiving the right amount of screen time and development required for its ultimate emotional climax. If you, like me, have been feeling let down by the offerings of story-based games of late then I can wholeheartedly recommend Valiant Hearts as the cure for what ails you.
Valiant Hearts is available on PC, PlayStation 3, PlayStation 4, Xbox 360 and Xbox One right now for $14.99, $22.95, $22.95, $19.95 and $19.95 respectively. Game was played on the PC with 5 hours of total play time.
I’ve been in the market for a new PC for a little while now, so occasionally I’ll indulge myself in a little hypothetical system building so I can figure out how much I want to spend (lots) and what kind of computer I’ll get out of it (a super fast one). One of the points that got me unstuck was the fact that whilst I can get semi-decent performance out of the RAID 10 set that stores most of my stuff, it’s nowhere near the performance of the SSD that holds my OS and regularly used applications. Easy, I thought, I’ll just RAID together some SSDs and get the performance I want with enough space to hold all my games and other miscellany. The thing is, though, SSDs don’t like being in RAID sets (thanks to TRIM not working with them) unless it’s RAID 0, and I’m not terribly keen on halving the MTBF just to get some additional space. No, what I need is a bigger drive, and it looks like Samsung is ready to deliver on that.
That little chip is the key to realizing bigger SSDs (among other things). It’s a new type of flash memory called V-NAND based on a new gate technology called CTF and Samsung has just started mass production of them.
What’s really quite groovy about this new kind of NAND chip is that unlike other computer chips, which are planar in nature (i.e. all the transistors lie on a single plane), V-NAND is, as you can likely guess, a vertical stack of those planar layers. This allows for incredible densities inside a single chip, with this first generation clocking in at a whopping 128GB. To put that in perspective, the drive I’m currently using has the same capacity as that single chip, which means that if I replaced its memory with this new V-NAND I’d be looking at a 1TB drive. For tech heads like me even hearing that something like that was theoretically possible would make us weak at the knees, but these are chips that you can start buying today.
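The arithmetic behind that claim is worth spelling out. A minimal sketch, where the per-chip capacity comes from the announcement but the eight-package drive layout is my assumption:

```python
# Back-of-the-envelope V-NAND density math. The 128GB chip capacity is
# from Samsung's announcement; the 8-package drive layout is assumed.
chip_capacity_gb = 128      # one first-generation V-NAND package
chips_per_drive = 8         # hypothetical drive built from 8 packages

drive_capacity_gb = chip_capacity_gb * chips_per_drive
print(drive_capacity_gb)    # 1024GB, i.e. the ~1TB drive mentioned above
```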
Apparently this isn’t their most dense chip either, as their new 3D NAND tech allows them to go up to 24 layers high. I can’t find a reference stating just how many layers are in this current chip so I’m not sure how dense we’re talking, but it seems this will be the first chip among many and I doubt they’ll stop at 24.
As if all that wasn’t enough Samsung is also touting higher reliability, anywhere from 2x to 10x, as well as at least double the write performance of traditional NAND packages. SSDs are already at the point where differences in read/write speeds are almost invisible to the end user, so that may be moot for many, but for system builders it’s an amazing leap forward. Considering we can already get some pretty amazing IOPS from the SSDs available today, doubling that just means we can do a whole lot more with a whole lot less hardware, and that’s always a good thing. Whether those claims hold up in the real world remains to be seen, but there’s a pretty close relationship between data density and increased throughput.
Unfortunately, whilst these chips are hitting mass production today, I couldn’t find any hint of which partners are creating drives based around them or whether Samsung is working on one themselves. They’ve been releasing some pretty decent SSDs recently, indeed they were the ones I was eyeing off for my next potential system, so I can’t imagine they’d be too far off given they have all the expertise to create one. Indeed they just recently released a gigantic 1.6TB SSD that uses the new PCIe-based NVMe interface to deliver some pretty impressive speeds, so I wouldn’t be surprised if their next drive comes out on that platform using this new V-NAND.
It’s developments like this that are a testament to the fact that Moore’s Law will keep on keeping on despite the numerous doubters ringing its death knell. With this kind of technology in mind it’s easy to imagine it being applied elsewhere, increasing density in other areas like CPU dies and volatile memory. Of course porting such technology is non-trivial, but I’d hazard a guess that chip manufacturers worldwide are champing at the bit to get in on this, and I’m sure Samsung will be more than happy to license the patents to them.
For a princely sum, of course 😉
I have a love/hate relationship with the new wave of hardcore platformers that has swept through the game scene recently thanks to the indie developer revolution. Initially I find them quite fun, as I did with Super Meat Boy and They Bleed Pixels, but towards the end, when the difficulty starts to ramp up and my total play time skyrockets despite progress slowing to a crawl, I tend to get frustrated with them. None of them have matched the Nintendo Hard hell that was Battletoads, and whilst ramping the difficulty up to insanity in the later levels might be part of the fun for some, it certainly isn’t for me. No Time To Explain is another instalment in the indie platformer genre and, despite my history with them, the videos were intriguing enough to make me want to play it.
No Time To Explain drops you in a nondescript house, casually minding your own business. Not long after, a good chunk of your house is blown away by some unknown force and someone who looks strikingly similar to you suddenly appears. “I’m you from the future. There’s no time to explain!” he exclaims before he’s snatched away by a giant alien crab intent on taking him, you, away. You then find yourself in possession of a weapon capable of dealing untold amounts of damage whilst also functioning as a partial jetpack to get you over any obstacles in your way. It’s then up to you to rescue yourself from whatever dangers you find yourself in.
Whilst I’ve described some games in the past as being Flash-like due to their styling and choice of colour palettes, No Time To Explain is in fact a Flash game brought to you as a standalone executable thanks to Adobe’s AIR framework. This means the graphics are pretty much what you’d expect from any browser-based Flash game. This isn’t necessarily a bad thing, indeed for No Time To Explain the cartoonish presentation is what makes it so hilariously awesome, but there’s a certain standard that Flash games seem to hit and never get past no matter how long is spent on them. It’s probably a limitation of the platform more than anything, although I can’t really comment since the last time I looked at ActionScript I got scared and decided to stick to C#.
Whilst No Time To Explain starts off as a kind of soft-core version of Metal Slug, where you’re basically just wailing on random things with your giant beam weapon, the core game mechanic is actually that of a physics-based platformer. Your gun, whilst unleashing torrents of destruction wherever you aim it, also has something of a kick to it. Pointing it in the right direction can send you soaring up into the clouds or launch you across wide gaps at incredible speed. The trouble then becomes figuring out the right angle, the right amount of force, and how to correct your trajectory whilst you’re up in the air.
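At its heart the recoil-launch mechanic is just projectile motion: pick an angle and the gun’s kick does the rest. A toy model (my own illustrative numbers, not the game’s actual physics) shows why 45 degrees clears the widest gaps:

```python
# Toy projectile model of a recoil-launch mechanic. The gravity and
# kick values are arbitrary illustrative figures, not from the game.
import math

g = 9.8        # gravity (arbitrary units)
kick = 20.0    # launch speed imparted by the gun's recoil

def launch_range(angle_deg):
    """Horizontal distance covered for a given launch angle."""
    theta = math.radians(angle_deg)
    return kick ** 2 * math.sin(2 * theta) / g

# 45 degrees maximizes distance; steeper angles trade distance for height.
print(round(launch_range(45), 1))   # 40.8
print(round(launch_range(70), 1))   # 26.2
```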
At the beginning this is relatively easy as your landing zones are huge and there’s nothing that will kill you brutally should you get your timing wrong. Soon after however there will be spikes coating surfaces, bottomless pits to fall in and jumps/obstacles that seem to be next to impossible to cross the first time you see them. Thanks to the decent auto-save system though you’ll be able to fine tune your strategy rapidly without having to go through everything from the start again. I have to say that this was a welcome change from the Super Meat Boy way of doing things where one particular obstacle could block you for ages simply because it took so long to get there in the first place.
Each section is capped off with a boss fight which usually involves aiming your laser at whatever is moving and then waiting for it to keel over. This is perhaps where the save system is a little too good as there’s not a whole lot of challenge in the majority of the boss fights when you can literally stand in one section the entire time and simply wail on them until they die. Of course you can make it interesting for yourself (and speed up the process) by dodging the incoming bullets and positioning yourself better but that’s not technically a challenge the game provides. There was one boss fight where the quick save system didn’t apply which was a refreshing change but there were bigger issues at play there.
The Drill Squirrel boss is the first one where you can actually “die” in the sense that should you get injured at a specific point you’ll be sent back to the start of the fight rather than respawned where you were last standing. This is fine in and of itself however the fight is completely and utterly broken should certain things happen. Easy ways to replicate this are: be in the pit when he does his laser eyes at you or be on the same platform during said event. Once you’re past that the next section, where the pits fill with lava and fiery columns spew up from the ground, simply won’t happen and the drill squirrel will get stuck in the ground. This isn’t the only bug either, should you get bounced into a wall by him during the second phase you’ll get stuck in there as well but at least the game recognizes it and restarts you from the start.
No Time To Explain is an awesome platformer, combining some of the twitch aspects of its more insanely difficult brethren with mechanics that make the platforming enjoyable rather than a chore. For the most part it works well, with many of the times I got stuck being down to me not getting the puzzle rather than game-breaking bugs. However there are still some teething issues that need to be worked out, especially with that one particular boss, before I could call it a trouble-free experience. I also have a small gripe over the price since it’s rather short (and is available a lot cheaper direct from the developer), but it’s on sale right now which kind of renders that complaint moot. Overall I quite enjoyed No Time To Explain and, after reading through the developer’s blog, I have to say I’m interested in their future titles and hope their recent Greenlight success gives them the capital to see them through.
No Time To Explain is available on PC right now for $9.99. Total game time was approximately 2 hours.
Google is one of the biggest proponents of an Internet that’s unencumbered by proprietary standards, patents and non-neutral traffic routes. That’s been a great boon to us Internet users as their advocacy on our behalf means that as long as they stay in business we’re likely to continue to have an Internet that stays true to those ideals. Of course like any company they’re not entirely perfect, at times attempting to forward their own agenda under the guise of openness, but overall their contribution to keeping the Internet free and open has been positive. It seems rather odd then that Google has an obsession with Adobe’s Flash product, to the point where I wonder if there’s something going on that I don’t know about.
Back in March last year Google announced that they were integrating the Flash plugin directly into their Chrome browser. This was at the height of the web standards war that was raging between Apple and Adobe so it was easy to construe Google’s support of Flash as them taking Adobe’s side in the matter. That notion was further reinforced by the fact that Google’s Android platform fully supported Flash as well. This level of support for a proprietary plug-in for a company that prides itself on being a big supporter of open standards seems rather hypocritical, but there are some reasons as to why they’re doing it.
Recently though it appears that Google’s support of Flash was actually leading up to a much more ambitious goal, transitioning the web from Flash to an HTML5 future:
Google is enabling developers who use the Adobe Flash Professional developer tool to convert their animations to HTML5 via an extension based on Google’s Swiffy conversion technology.
“One of our main aims for Swiffy is to let you continue to use Flash as a development environment, even when you’re developing animations for environments that don’t support Flash,” said Esteban de la Canal, Google software engineer, in a blog post. “To speed up the development process, we’ve built the Swiffy Extension for Flash Professional. The extension enables you to convert your animation to HTML5 with one click (or keyboard shortcut).”
Now it’s interesting that Google would go ahead and do something like this when Adobe had already made their play in this field with their Wallaby product. The big difference is that Wallaby was specifically targeted at Flash ads and didn’t support many of the features that made Flash so versatile, like ActionScript. Swiffy, on the other hand, does support ActionScript along with several other features that weren’t present in Wallaby. It would seem then that Google thinks they can do better than Adobe at their own game, which they very well could, especially since Adobe just recently announced that they’re no longer working on mobile Flash.
Of course the transition from native Flash to Flash rendered through HTML5 doesn’t necessarily mean we’re looking at a future web that performs better. The main problem with Flash wasn’t so much the platform as the developers on that platform. Flash ads were the biggest culprit, often laden with gobs of unnecessary and bloated code that were the source of the performance problems people encountered. Transitioning such ads to HTML5 won’t make that code go away (there is a chance to optimize, but automated tools can only go so far) and the result will more than likely be just as bad as the original Flash it came from. It’s a step in the right direction, yes, but it’s not going to be the all-roses future some would have you believe.
It’s quite interesting to see the kind of games that Google plays in order to make the web better for everyone. At times they may seem to be on the wrong side, but it’s becoming clear that they’re playing the long game for a better web. It will be interesting to see how common Swiffy-converted Flash files become and whether they’re still the performance hogs their predecessors were, but knowing Google they won’t let it lie until they’ve optimized it to the nth degree. Adobe’s reaction to Swiffy will be telling as well, considering Google is now competing with them on their home turf. The end result will be a better, more open Internet for us all, something I think we can all agree is a good thing.
Of all the PC upgrades I’ve done in the past, the one that most notably improved the performance of my rig is, by a wide margin, installing an SSD. Whilst good old-fashioned spinning rust has come a long way in recent years in terms of performance, it’s still far and away the slowest component in any modern system. This is what chokes most PCs’ performance, as the disk is a huge bottleneck that slows everything down to its pace. The problem can be mitigated somewhat by using several disks in a RAID 0 or RAID 10 set, but all of those pale in comparison to even a single SSD.
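The gap comes down to mechanics: a spinning disk has to seek and wait for the platter on every random request. A rough sketch with typical ballpark figures (my own illustrative numbers, not measurements) shows why even a RAID set can’t close it:

```python
# Rough illustration of why spinning disks bottleneck random IO.
# All figures here are typical ballpark numbers, not measurements.
seek_ms = 8.5          # average seek time for a 7200rpm drive
rotational_ms = 4.17   # half a rotation at 7200rpm

# Each random request pays seek + rotational latency.
hdd_iops = 1000 / (seek_ms + rotational_ms)
print(round(hdd_iops))                     # 79 IOPS per spindle

raid10_spindles = 4
print(round(hdd_iops * raid10_spindles))   # 316 IOPS for a 4-disk set

consumer_ssd_iops = 40_000                 # ballpark for a SATA SSD
print(round(consumer_ssd_iops / (hdd_iops * raid10_spindles)))  # ~127x faster
```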
The problem doesn’t go away for the server environment either, in fact most of the server performance problems I’ve diagnosed have had their roots in poor disk performance. Over the years I’ve discovered quite a few tricks to get around the problems presented by traditional disk drives but there are just some limitations you can’t overcome. Recently at work the issue of disk performance came to a head again as we investigated the possibility of using blade servers in our environment. I casually made mention of a company that I had heard of a while back, Fusion-IO, who specialised in making enterprise class SSDs. The possibility of using one of the Fusion-IO cards as a massive cache for the slower SAN disk was a tantalizing prospect and to my surprise I was able to snag an evaluation unit in order to put it through its paces.
The card we were sent was one of the 640GB ioDrives. It’s surprisingly heavy for its size, sporting gobs of NAND flash and a massive heat sink that hides the proprietary controller. What intrigued me initially was that the NAND didn’t sport any branding I recognised (usually it’s something recognisable like Samsung), but as it turns out each chip is a 128GB Micron NAND flash chip. If all that storage was presented raw it would total some 3.1TB, and that’s telling of the underlying architecture of the Fusion-IO devices.
The total storage available to the operating system once the card is installed is around 640GB (600GB usable). To get that kind of storage out of the Micron NAND chips you’d only need 5 of them, but the ioDrive comes with a grand total of 25 dotting the board, and no traditional RAID scheme accounts for the amount of storage presented. So, given there are 25 chips and only 5 chips’ worth of capacity available, it follows that the Fusion-IO card uses quintuplet sets of chips to provide the high level of performance they claim. That’s an incredible amount of parallelism; if I’m honest I expected them to all be 256MB chips in RAID 1 to make one big drive.
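Working backwards from the board layout makes that inference easy to check. The figures come from the article; the five-chips-per-set interpretation is the deduction being made above:

```python
# Inferring the ioDrive's internal layout from the numbers above.
chip_gb = 128              # per-chip capacity of the Micron NAND
chips_on_board = 25        # chips counted on the board

raw_gb = chip_gb * chips_on_board
print(raw_gb)              # 3200GB raw (~3.1 binary TB, as noted above)

presented_gb = 640         # capacity presented to the OS
sets = presented_gb / chip_gb
print(sets)                # 5.0 -> five chips' worth of capacity
print(chips_on_board / sets)  # 5.0 -> five physical chips per set
```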
Funnily enough I did actually find some Samsung chips on this card, two 1GB DDR2 chips. These are most likely used for the CPU on the ioDrive which has a front side bus of either 333 or 400MHz based on the RAM speed.
But enough of the techno-geekery; what’s really important is how well this thing performs in comparison to traditional disks and whether it’s worth the $16,000 price tag that comes along with it. I had done some extensive testing of various systems in the past to ascertain whether the new Dell servers we were looking at were going to perform as well as their HP counterparts. All of this testing was purely disk based using IOMeter, a disk load simulator that tests and reports on nearly every statistic you’d want to know about your disk subsystem. If you’re interested in replicating my results I’ve uploaded a copy of my configuration file here. The servers included in the test are the Dell M610x, Dell M710HD, Dell M910, Dell R710 and a HP DL380G7. For all the tests (bar the two labelled local install) each is a base install of ESXi 5 with a Windows 2008 R2 virtual machine installed on top of it. The specs of the virtual machine are 4 vCPUs, 4GB RAM and a 40GB disk.
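For readers without IOMeter handy, the shape of such a test can be sketched in a few lines of Python: issue a batch of fixed-size random reads and divide by elapsed time. This is a toy stand-in, not a replacement for IOMeter’s access specifications, and without O_DIRECT it mostly exercises the OS page cache rather than the disk itself:

```python
# Minimal random-read benchmark sketch in the spirit of the IOMeter runs.
import os
import random
import tempfile
import time

BLOCK = 4096                    # 4KB requests, a common IOMeter test size
FILE_SIZE = 64 * 1024 * 1024    # 64MB scratch file
OPS = 2000                      # number of random reads to issue

# Build a scratch file to read from.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(os.urandom(FILE_SIZE))

fd = os.open(path, os.O_RDONLY)
offsets = [random.randrange(0, FILE_SIZE - BLOCK) for _ in range(OPS)]

start = time.perf_counter()
for off in offsets:
    os.pread(fd, BLOCK, off)    # read BLOCK bytes at a random offset
elapsed = time.perf_counter() - start
os.close(fd)
os.remove(path)

# Note: without O_DIRECT most reads are served from the OS page cache,
# so treat this as a harness sketch rather than a true disk measurement.
iops = OPS / elapsed
print(f"{iops:.0f} IOPS at {BLOCK}B random reads")
```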
As you can see the ioDrive really is in a class of its own. The only server that comes close in terms of IOPS is the M910, and that’s because it’s sporting 2 Samsung SSDs in RAID 0. What impresses me most about the ioDrive is its random performance, which manages to stay quite high even as the block size gets bigger. Although it’s not shown in these tests, the one area where the traditional disks actually equal the Fusion-IO is throughput at really large write sizes, on the order of 1MB or so. I put this down to the fact that the servers in question, the R710s and DL380G7s, have 8 disks each that can pump out some serious bandwidth when they need to. With 2 Fusion-IO cards though I’m sure I could easily double that performance figure.
What interested me next was to see how close I could get to the spec sheet performance. The numbers I just showed you are already incredible, but Fusion-IO claims this particular drive is capable of something on the order of 140,000 IOPS if I played my cards right. Using the local install of Windows 2008 I had on there, I fired up IOMeter again and set up some 512B tests to see if I could get close to those numbers. The results, as shown in the Dell IO controller software, are below:
Ignoring the small blip in the centre, where I had to restart the test, you can see that whilst the ioDrive is capable of some pretty incredible IO, the advertised maximums are more theoretical than practical. I tried several different tests and while a few averaged higher than this (approximately 80K IOPS was my best) it was still a far cry from the figures quoted. Had they gotten within 10~20% I would’ve given it to them, but whilst the ioDrive’s performance is incredible it’s not quite as incredible as the marketing department would have you believe.
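It’s also worth putting those 512B figures in bandwidth terms; this is simple arithmetic on the numbers above, not an additional measurement:

```python
# Converting the 512B IOPS figures into raw bandwidth for context.
block_bytes = 512
claimed_iops = 140_000   # Fusion-IO's spec-sheet figure
measured_iops = 80_000   # the best average observed in these tests

claimed_mb_s = claimed_iops * block_bytes / 1_000_000
measured_mb_s = measured_iops * block_bytes / 1_000_000
print(claimed_mb_s, measured_mb_s)        # 71.68 40.96 (MB/s)
print(f"{measured_iops / claimed_iops:.0%} of spec")   # 57% of spec
```

Even at the advertised maximum, 512B operations amount to modest raw bandwidth; the headline number is about request rate, not throughput.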
As a piece of hardware the Fusion-IO ioDrive really is the next step up in terms of performance. The virtual machines I had running directly on the card were considerably faster than their spinning rust counterparts and, if you’re in need of some really crazy performance, you really couldn’t go past one of these cards. For the purpose we had in mind (putting it inside a M610x blade) I can’t really recommend it, as that’s a full-height blade with only the power of a half-height. The M910 represents much better value with its crazy CPU and RAM count, and its SSDs, whilst far from Fusion-IO level, do a pretty good job of bridging the disk performance gap. I didn’t have enough time to see how it would improve real-world applications (it takes me longer than 10 days to get something like this into our production environment), but based on these figures I have no doubt it would considerably improve the performance of whatever I put it in.
Adobe had been quite stalwart in their support for Flash too, refusing to back down from their stance that they were “the way” to do rich content on the Internet. Word came recently, however, that they were stopping development of the mobile version of Flash:
Graphics software giant Adobe announced plans for layoffs yesterday ahead of a major restructuring. The company intends to cut approximately 750 members of its workforce and said that it would refocus its digital media business. It wasn’t immediately obvious how this streamlining effort would impact Adobe’s product line, but a report that was published late last night indicates that the company will gut its mobile Flash player strategy.
Adobe is reportedly going to stop developing new mobile ports of its Flash player browser plugin. Instead, the company’s mobile Flash development efforts will focus on AIR and tools for deploying Flash content as native applications. The move marks a significant change in direction for Adobe, which previously sought to deliver uniform support for Flash across desktop and mobile browsers.
Now the mobile version of Flash had always been something of a bastard child, originally featuring a much more cut down feature set than its fully fledged cousin. More recent versions brought them closer together but the experience was never quite as good especially with the lack of PC level grunt on mobile devices. Adobe’s mobile strategy now is focused on making Adobe AIR applications run natively on all major smart phone platforms, giving Flash developers a future when it comes to building mobile applications. It’s an interesting gamble, one that signals a fundamental shift in the way Adobe views the web.
Arguably the writing has been on the wall for this decision for quite some time. Back at the start of this year Adobe released Wallaby, a framework that allows advertisement developers to convert Flash ads into HTML5. Even back then I said that Wallaby was the first signal that Adobe saw HTML5 as the way of the future and was going to start transitioning towards it as their platform of choice. I made the point then that whilst Flash might eventually disappear Adobe wouldn’t, as they have a history of developing some of the best tools for non-technical users to create content for the web. There are already prototypes of such tools available, so it’s clear that Adobe is looking towards a HTML5 future.
The one place that Flash still dominates, without any clear competitors, is in online video. Their share of the market is somewhere around 75% (that’s from back in February so I’d hazard a guess that it’s lower now) with the decline being driven by mobile devices that lack support for Flash video. HTML5’s alternative is unfortunately still up in the air as the standards body struggles to find an implementation that can be open, unencumbered by patents and yet still able to support things like Digital Rights Management. It’s this lack of standardization that will see Flash around for a good while yet, as until there’s an agreed upon standard that meets all those criteria Flash will remain the default choice for online video.
So it looks like the war that I initially believed Adobe would win has instead seen Adobe pursuing a HTML5 future. It’s probably for the best, as they will then be providing some of the best tools in the market whilst still supporting open standards, something that’s to the benefit of all users of the Internet. Hopefully that will also mean better performing web sites, as Flash had a nasty reputation for bringing even some of the most powerful PCs to their knees with poorly coded Flash ads. The next few years will be crucial to Adobe’s long term prospects but I’m sure they have the ability to make it through to the other end.
The computer (or whatever Internet capable device you happen to be viewing this on) is made up of various electronic components. For the most part these are semiconductors, devices which allow the flow of electricity but don’t do it readily, but there’s also a lot of supporting electronics made up of what we call the fundamental components of electronics. As almost any electrical enthusiast will tell you there are 3 such components: the resistor, the capacitor and the inductor, each with its own set of properties that makes it useful in electronic circuits. There’s been speculation about a 4th fundamental component for about 40 years, but before I talk about that I’ll need to give you a quick run down of the current fundamentals’ properties.
The resistor is the simplest of the lot: all it does is impede the flow of electricity. They’re quite simple devices, usually a small brown package banded by 4 or more colours which denote just how resistive it actually is. Resistors are often used as current limiters, as the amount of current that can pass through them is directly proportional to the voltage across them and inversely proportional to their resistance. In essence you can think of them as narrow pathways which electric current has to squeeze through.
Capacitors are intriguing little devices and can be best thought of as batteries. You’ve seen them if you’ve taken apart any modern device as they’re those little canister looking things attached to the main board of said device. They work by storing charge in an electrostatic field between two metal plates that are separated by an insulating material called a dielectric. Modern day capacitors are essentially two metal plates and the dielectric rolled up into a cylinder, something which you could see if you cut one open. I’d only recommend doing this with a “solid” capacitor as the dielectrics used in other capacitors are liquids and tend to be rather toxic and/or corrosive.
Inductors are very similar to capacitors in the respect that they also store energy, but instead of an electrostatic field they store it in a magnetic field. Again you’ve probably seen them if you’ve cracked open any modern device (or say looked inside your computer) as they look like little circles of metal with wire coiled around them. They’re often referred to as “chokes” as they tend to oppose the current that induces the magnetic field within them, and at high frequencies they’ll appear as a break in the circuit, useful if you’re trying to keep alternating current out of your circuit.
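The behaviour of all three components can be captured in a line of arithmetic each. The component values below are arbitrary examples of my own, not figures from any particular device:

```python
import math

# Resistor: Ohm's law, I = V / R. Current rises with voltage and
# falls with resistance.
V, R = 5.0, 1000.0          # 5 V across a 1 kOhm resistor
current = V / R             # 0.005 A, i.e. 5 mA

# Capacitor: charge stored between the plates, Q = C * V.
C = 100e-6                  # a 100 uF capacitor
charge = C * V              # 0.0005 C stored at 5 V

# Inductor: reactance grows with frequency, X_L = 2 * pi * f * L,
# which is why an inductor "chokes" high-frequency AC while passing DC.
L = 10e-3                             # a 10 mH inductor
x_low = 2 * math.pi * 50 * L          # ~3 ohms at 50 Hz mains
x_high = 2 * math.pi * 1e6 * L        # ~63 kOhms at 1 MHz: nearly an open circuit
```

The last two lines show the “choke” effect numerically: the same inductor that barely impedes mains-frequency current looks like a near break in the circuit at radio frequencies.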
For quite a long time these 3 components formed the basis of all electrical theory and nearly any component could be expressed in terms of them. However back in 1971 Leon Chua explored the symmetry between these fundamental components and inferred that there should be a 4th: the memristor. The name is a combination of memory and resistor, and Chua stated that this component would not only remember its resistance but also have it changed by passing current through it: passing current in one direction would increase the resistance and reversing it would decrease it. The implications of such a component would be huge, but it wasn’t until 37 years later that the first memristor was created by researchers at HP Labs.
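Chua’s idea is easier to see with a toy model. The sketch below treats resistance as a function of the net charge that has flowed through the device, so current in one direction walks the resistance up and reversing the current walks it back down. The bounding resistances and charge scale here are made-up illustrative values, not the parameters of HP’s actual device:

```python
# A toy charge-controlled memristor: its resistance depends on the
# total charge q that has passed through it, and that state persists
# when the current stops - the "memory" in memristor.
R_ON, R_OFF = 100.0, 16_000.0   # bounding resistances (ohms), illustrative
Q_MAX = 1e-4                    # charge needed to swing fully (coulombs)

def memristance(q):
    """Resistance as a function of accumulated charge q."""
    frac = min(max(q / Q_MAX, 0.0), 1.0)   # clamp the state to [0, 1]
    return R_ON + (R_OFF - R_ON) * frac

# Drive current one way: resistance climbs, and stays put if we stop.
q = 0.0
for _ in range(50):
    q += 1e-6                   # 1 uC per step in the forward direction
high = memristance(q)           # well above R_ON by now

# Reverse the current: the same device walks back toward R_ON.
for _ in range(25):
    q -= 1e-6
recovered = memristance(q)      # lower than before, still remembered
```

The key property the model captures is that the state variable is charge, not voltage: cut the power at any point and the device simply holds whatever resistance it last had, which is what makes it attractive as non-volatile memory.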
What’s really exciting about the memristor is its potential to replace other solid state storage technologies like Flash and DRAM. Due to their simplicity memristors are innately fast and, best of all, they can be integrated directly onto the chip of processors. If you look at the breakdown of a current generation processor you’ll notice that a good portion of the silicon used is dedicated to cache, or onboard memory. Memristors have the potential to boost the amount of onboard memory to extraordinary levels, and HP believes they’ll be doing that in just 18 months:
Williams compared HP’s resistive RAM technology against flash and claimed to meet or exceed the performance of flash memory in all categories. Read times are less than 10 nanoseconds and write/erase times are about 0.1-ns. HP is still accumulating endurance cycle data at 10^12 cycles and the retention times are measured in years, he said.
This creates the prospect of adding dense non-volatile memory as an extra layer on top of logic circuitry. “We could offer 2-Gbytes of memory per core on the processor chip. Putting non-volatile memory on top of the logic chip will buy us twenty years of Moore’s Law,” said Williams.
To put this in perspective Intel’s current flagship CPU ships with a total of 8MB of cache on the CPU and that’s shared between 4 cores. A similar memristor based CPU would have a whopping 8GB of on board cache, effectively negating the need for external DRAM. Couple this with a memristor based external drive for storage and you’d have a computer that’s literally decades ahead of the curve in terms of what we thought was possible, and Moore’s Law can rest easy for a while.
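The arithmetic behind that comparison is simple enough to sanity check, using the figures quoted above:

```python
# Back-of-the-envelope comparison: Williams' claimed 2 GB of on-chip
# memory per core versus the 8 MB of cache shared across a current
# quad-core flagship CPU (figures as quoted in the post).
CORES = 4
current_cache_mb = 8                     # 8 MB shared between 4 cores
memristor_cache_mb = 2 * 1024 * CORES    # 2 GB per core -> 8192 MB total

improvement = memristor_cache_mb / current_cache_mb
print(f"{memristor_cache_mb} MB vs {current_cache_mb} MB: {improvement:.0f}x")
```

A three-orders-of-magnitude jump in on-die memory is why the claim about buying Moore’s Law another twenty years doesn’t sound entirely outlandish.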
This kind of technology isn’t your usual pie in the sky “it’ll be available in the next 10 years” malarkey, this is the real deal. HP isn’t the only one looking into this either: Samsung (one of the world’s largest flash manufacturers) has also been aggressively pursuing this technology and will likely debut products around the same time. For someone like me it’s immensely exciting as it shows that there are still many great technological advances ahead of us, just waiting to be uncovered and put into practice. I can’t wait to see how the first memristor devices perform as it will truly be a generational leap in technology.
It’s only been just over a week since Microsoft demoed their latest iteration of the Windows platform but in that short amount of time it’s already managed to stir up quite a bit of discussion from friends and foes alike. The foes were quick to call out the new OS’s tablet envy, conveniently forgetting Microsoft’s rhetoric that the next version of Windows after 7 was going to have a much more web centric focus, with the possibility of it being entirely cloud based. More interesting however is the discussion arising from long term developers on the Microsoft platform, and it’s not the kind of adulation and praise you’d normally expect.
Rampant speculation soon followed and wasn’t helped by the fact that Microsoft has asked everyone to remain calm until their BUILD developer conference in September. It’s not the first time this sort of thing has happened either: a similar level of hubbub was roused when Microsoft was coy about Silverlight’s future when talking about Internet Explorer 9 and its dedication to web standards. They soon came out saying that they still saw a future in Silverlight, especially for the Windows Phone 7 platform, but many developers were left unconvinced. It’s quite likely then that this second round of doubt that Microsoft has cast over their third party developers’ futures was the straw that broke the camel’s back, and all the blame is being leveled squarely at Microsoft.
For what it’s worth I feel their concerns are valid if the reaction to them is somewhat overblown. Microsoft has a long history of eating its own dog food and many of their client facing applications are built upon the technologies that so many are worried are going to disappear in the near future. The best example of this is their Windows Azure management console which is built entirely on Silverlight. Couple that with the fact that Microsoft has many partners with a very heavy investment in the platform and I find it hard to jump on the “Silverlight is dead” bandwagon, but that doesn’t necessarily mean Microsoft is committed to bringing Silverlight into the Windows 8 tablet world.
Sure it would be great to be able to create Silverlight applications on the new Windows 8 tile system and Microsoft would be leveraging off a lot of preexisting talent to help drive adoption of the platform. However it would also hinder Microsoft’s adoption of web standards, as many developers would favour using proprietary Microsoft technologies instead of attempting to reskill. They’d then be the slave of two masters: on the one hand the Silverlight crowd demanding ever more features and tools that are constrained to that platform, and on the other the web standards crowd that has been Microsoft’s bugbear ever since alternative browsers started to gain real market traction. It’s not like Microsoft doesn’t have the resources to deal with this, but I can understand their motivations should they want to eschew Silverlight in favour of a more standard environment.
Adobe and Apple haven’t been the best of friends for a while now. Whilst many of Adobe’s products are still considered some of the best applications available on the OS X platform, Apple couldn’t be more hostile to their most popular product: Flash. Now this isn’t without good reason, as Flash has a terrible tendency to be abused by sloppy developers (most of the time ad networks) whose work can bring even a full blown desktop PC to its knees. Keeping Flash out of their handhelds meant fewer headaches for them and forced the hand of many companies to rethink their use of Flash, lest they draw the ire of the iOS browsing crowd.
Whilst there was a good few months of to and fro between these two companies last year, it all subsided once Apple capitulated to the developer community that raised concerns over Apple’s wide reaching policy on cross platform libraries. This seemingly opened up the door that Apple had shut in Adobe’s face, enabling them to create a product that could convert Flash files into a more iOS friendly format. A couple of days ago they announced the first iteration of the product, called Wallaby:
Welcome to the Wallaby Technology Preview. Wallaby is an application to convert Adobe Flash Professional CS5 files (.FLA) to HTML5. Wallaby has a very simple UI which accepts as input a FLA file and exports HTML and support files to a user-selected folder. There is also an option to launch the default application assigned for the .html extension.
The announcement has, of course, caused quite a stir in the tech community. Most of the commentary focuses on the fact that Wallaby was designed with only one purpose in mind: to get Flash banner ads working on iOS devices. As such Wallaby is pretty limited in the functionality it provides, being unable to convert ActionScript, the scripting language that powers things like Flash based games. Of course this also raises the issue that Flash is most often abused by advertising agencies, with poorly coded banner ads being one of the main culprits. Whether or not badly coded Flash ads translate into bad (or worse) ads in HTML5 remains to be seen, but I can’t see how they could get any better.
Realistically the issues that many people associate with Flash aren’t really caused by it; rather, it’s those who use the platform who are to blame for the troubles that many people encounter with it. This is why I didn’t understand Apple’s position on Flash in the first place. Sure there are many banner ads out there that can make your web experience a browsing hell, but banning one technology simply drives those same people to look for other platforms, it won’t magically make them better developers overnight. Wallaby is a great example of this, as those same people that created poor performing Flash ads can now do the same in HTML5. In the end Apple is merely delaying the time it takes for the same problems that plagued Flash to come to their iOS platform. Google, I feel, is on the right track to solving this problem, tightly integrating Flash into their products so they can tune it properly.
It does show that Adobe doesn’t believe the future is still with their Flash platform and that the gears are in motion to transition to the new world of HTML5. There’s a reason why Flash has been such an integral part of the web for so long, and it’s simply because Flash gave non-technical users the best tools to create rich content for the web. Whilst they’ve come rather late to the mobile boat they are one of the few companies that has the momentum and devoted user base to make the switch successfully. I’m sure many people will see this as them “capitulating” to Apple’s demands but in reality it’s anything but, and I’m sure they’ll eventually dominate the HTML5 space just as they’ve done in the past with Flash.