Posts Tagged ‘flash’


Samsung Starts Producing V-NAND, Massive SSDs Not Far Off.

I’ve been in the market for a new PC for a little while now, so occasionally I’ll indulge myself in a little hypothetical system building to figure out how much I want to spend (lots) and what kind of computer I’ll get out of it (a super fast one). One of the sticking points was the fact that whilst I can get semi-decent performance out of the RAID10 set which stores most of my stuff, it’s nowhere near the performance of the SSD that holds the OS and my regularly used applications. Easy, I thought, I’ll just RAID together some SSDs and get the performance I want with enough space to hold all my games and other miscellany. Thing is, though, SSDs don’t like to be in RAID sets (thanks to TRIM not working with them) unless it’s RAID0, and I’m not terribly keen on halving the MTBF just so I can get some additional space. No, what I need is a bigger drive, and it looks like Samsung is ready to deliver on that.
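
To put a rough number on that reliability trade-off, here’s a minimal sketch of the usual series-reliability model, assuming independent drive failures; the million-hour MTBF figure is purely illustrative, not a real drive spec.

```typescript
// Simple series-reliability model for a striped (RAID0) array: the array
// fails as soon as any member fails, so with independent, identically
// behaving drives the array MTBF is roughly MTBF / N.
// The 1,000,000 hour figure is illustrative, not taken from a real drive.
const driveMtbfHours = 1_000_000;

function stripedArrayMtbfHours(driveMtbf: number, driveCount: number): number {
  return driveMtbf / driveCount;
}

console.log(stripedArrayMtbfHours(driveMtbfHours, 2)); // 500000 -- a two-drive stripe halves the MTBF
console.log(stripedArrayMtbfHours(driveMtbfHours, 4)); // 250000 -- and it only gets worse from there
```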

Samsung’s V-NAND flash chip

That little chip is the key to realizing bigger SSDs (among other things). It’s a new type of flash memory called V-NAND, based on a new gate technology called CTF (Charge Trap Flash), and Samsung has just started mass production of it.

What’s really quite groovy about this new kind of NAND chip is that unlike other computer chips, which are planar in nature (i.e. all the transistors lie on a single plane), V-NAND is, as you can likely guess, built as a vertical stack of those planar layers. This allows for incredible densities inside a single chip, with this first generation clocking in at a whopping 128GB. Putting that in perspective, the drive I’m currently using has the same capacity as that single chip, which means that if I replaced its memory with this new V-NAND I’d be looking at a 1TB drive. For tech heads like me even hearing that it was theoretically possible to do something like that would make us weak at the knees, but these are chips that you can start buying today.
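
The back-of-the-envelope maths behind that 1TB figure, assuming the drive simply swaps each of its existing NAND packages for a 128GB V-NAND part; the eight-package layout is an assumption for illustration, not the actual board layout of my drive.

```typescript
// Hypothetical drive layout: eight NAND packages per drive. Swapping each
// 16GB package for a 128GB V-NAND chip turns a 128GB drive into a ~1TB one
// in the same footprint. The package count is assumed for illustration.
const packagesPerDrive = 8;
const oldChipGB = 16;
const vNandChipGB = 128;

console.log(`Old drive:    ${packagesPerDrive * oldChipGB}GB`);   // Old drive:    128GB
console.log(`V-NAND drive: ${packagesPerDrive * vNandChipGB}GB`); // V-NAND drive: 1024GB (~1TB)
```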

Apparently this isn’t their most dense chip either, as their new 3D NAND tech allows them to go up to 24 layers high. I can’t seem to find a reference that states just how many layers are in this current chip, so I’m not sure how dense we’re talking here, but it seems like this will be the first chip among many and I doubt they’ll stop at 24.

As if all that wasn’t enough Samsung is also touting higher reliability, anywhere from 2x to 10x, as well as at least double the write performance of traditional NAND packages. All SSDs are at the point where the differences in read/write speeds are almost invisible to the end user, so that may be moot for many, but for system builders it’s an amazing leap forward. Considering we can already get some pretty amazing IOPS from the SSDs available today, doubling that just means we can do a whole lot more with a whole lot less hardware, and that’s always a good thing. Whether those claims hold up in the real world remains to be seen, but there’s a pretty close relationship between data density and increased throughput.

Unfortunately, whilst these chips are hitting mass production today, I couldn’t find any hint of which partners are creating drives based around them or whether Samsung is working on one themselves. They’ve been releasing some pretty decent SSDs recently, indeed they were the ones I was eyeing off for my next potential system, so I can’t imagine they’d be too far off given that they have all the expertise to create one. They also just recently released the gigantic 1.6TB SSD that uses the new NVMe PCIe interface to deliver some pretty impressive speeds, so I wouldn’t be surprised if their next drive comes out on that platform using this new V-NAND.

It’s developments like this that are a testament to the fact that Moore’s Law will keep on keeping on despite the numerous doubters ringing its death knell. With this kind of technology in mind it’s easy to imagine it being applied elsewhere, increasing density in other areas like CPU dies and volatile memory. Of course porting such technology is non-trivial, but I’d hazard a guess that chip manufacturers worldwide are chomping at the bit to get in on this and I’m sure Samsung will be more than happy to license the patents to them.

For a princely sum, of course ;)

 


No Time To Explain: It’s Like Super Meat Boy, But Fun.

I have a love/hate relationship with the new wave of hardcore platformers that have swept through the games scene recently thanks to the indie developer revolution. Initially I find them quite fun, as I did with Super Meat Boy and They Bleed Pixels, but usually towards the end, when the difficulty starts to ramp up and my total play time skyrockets despite progress slowing to a crawl, I tend to get frustrated with them. None of them have matched the Nintendo Hard hell that was Battletoads, and ramping the difficulty up to insanity in the later levels might be part of the fun for some, but it certainly isn’t for me. No Time To Explain is another instalment in the indie platformer genre and, despite my history with them, the videos were intriguing enough to make me want to play it.

Screenshot: Intro Level

No Time To Explain drops you in a nondescript house, casually minding your own business. Not long after, a good chunk of your house is blown away by some unknown force and suddenly someone who looks strikingly similar to you appears. “I’m you from the future. There’s no time to explain!” he exclaims before he’s snatched away by a giant alien crab intent on taking him, you, away. You then find yourself in possession of a weapon capable of dealing untold amounts of damage whilst also functioning as a partial jetpack to get you over any obstacles in your way. It’s then up to you to rescue yourself from whatever dangers you find yourself in.

Whilst I’ve described some games in the past as being Flash-like due to their styling and choice of colour palettes, No Time To Explain is in fact a Flash game brought to you as a standalone executable thanks to Adobe’s AIR framework. This means the graphics are pretty much what you’d expect to see from any browser-based Flash game. This isn’t necessarily a bad thing, indeed for No Time To Explain the cartoonish presentation is what makes it so hilariously awesome, but there’s a certain standard that Flash games seem to hit and never get past no matter how long is spent on them. It’s probably a limitation of the platform more than anything, although I can’t really comment since the last time I looked at ActionScript I got scared and decided to stick to C#.

Screenshot: You From The Future

Whilst No Time To Explain starts off as a kind of soft-core version of Metal Slug, where you’re basically just wailing on random things with your giant beam weapon, the core game mechanic is actually that of a physics based platformer. Your gun, whilst unleashing torrents of destruction wherever you aim it, also has something of a kick to it. Pointing it in the right direction can send you soaring up into the clouds or launch you across wide gaps at incredible speed. The trouble then becomes figuring out the right angle, the right amount of force and how to correct your trajectory whilst you’re up in the air.

At the beginning this is relatively easy as your landing zones are huge and there’s nothing that will kill you brutally should you get your timing wrong. Soon after, however, there will be spikes coating surfaces, bottomless pits to fall into and jumps/obstacles that seem next to impossible to cross the first time you see them. Thanks to the decent auto-save system though you’ll be able to fine tune your strategy rapidly without having to go through everything from the start again. I have to say that this was a welcome change from the Super Meat Boy way of doing things, where one particular obstacle could block you for ages simply because it took so long to get there in the first place.

Screenshot: Shark Boss

Each section is capped off with a boss fight which usually involves aiming your laser at whatever is moving and then waiting for it to keel over. This is perhaps where the save system is a little too good as there’s not a whole lot of challenge in the majority of the boss fights when you can literally stand in one section the entire time and simply wail on them until they die. Of course you can make it interesting for yourself (and speed up the process) by dodging the incoming bullets and positioning yourself better but that’s not technically a challenge the game provides. There was one boss fight where the quick save system didn’t apply which was a refreshing change but there were bigger issues at play there.

The Drill Squirrel boss is the first one where you can actually “die”, in the sense that should you get injured at a specific point you’ll be sent back to the start of the fight rather than respawned where you were last standing. This is fine in and of itself, however the fight is completely and utterly broken should certain things happen. Easy ways to replicate this are: be in the pit when he does his laser eyes at you, or be on the same platform during said event. Once you’re past that, the next section, where the pits fill with lava and fiery columns spew up from the ground, simply won’t happen and the drill squirrel will get stuck in the ground. This isn’t the only bug either: should you get bounced into a wall by him during the second phase you’ll get stuck in there as well, but at least the game recognizes it and restarts you from the beginning.

Screenshot: Weird Polygon Thing

No Time To Explain is an awesome platformer title, combining some of the twitch aspects of its more insanely difficult brethren with mechanics that make the platforming enjoyable rather than a chore. For the most part it works well, with many of the times I got stuck being down to me not getting the puzzle rather than game breaking bugs. However there are still some teething issues that need to be worked out, especially with that one particular boss, before I could say it was a trouble free experience. I also have a small gripe over the price since it’s rather short (and is available a lot cheaper direct from the developer) but it is on sale right now, which kind of renders that complaint moot. Overall I quite enjoyed No Time To Explain and, after reading through the developer’s blog, I have to say I’m interested in their future titles and hope that their recent Greenlight success will give them the capital to see them through.

Rating: 8.25/10

No Time To Explain is available on PC right now for $9.99. Total game time was approximately 2 hours.

Google’s Peculiar Flash Obsession.

Google is one of the biggest proponents of an Internet that’s unencumbered by proprietary standards, patents and non-neutral traffic routes. That’s been a great boon to us Internet users as their advocacy on our behalf means that as long as they stay in business we’re likely to continue to have an Internet that stays true to those ideals. Of course like any company they’re not entirely perfect, at times attempting to forward their own agenda under the guise of openness, but overall their contribution to keeping the Internet free and open has been positive. It seems rather odd then that Google has an obsession with Adobe’s Flash product, to the point where I wonder if there’s something going on that I don’t know about.

Back in March last year Google announced that they were integrating the Flash plugin directly into their Chrome browser. This was at the height of the web standards war that was raging between Apple and Adobe, so it was easy to construe Google’s support of Flash as them taking Adobe’s side in the matter. That notion was further reinforced by the fact that Google’s Android platform fully supported Flash as well. This level of support for a proprietary plug-in from a company that prides itself on being a big supporter of open standards seems rather hypocritical, but there are some reasons as to why they’re doing it.

One of Google’s main objectives in developing the Chrome browser (and subsequently releasing the vast majority of its source code) was to improve the Internet experience for the end user. A lot of effort was focused on developing a much faster JavaScript engine, dubbed V8, that would significantly speed up many of the JavaScript heavy websites that now dominate the web. The integration of Flash into Chrome was the next step in making the Internet as a whole more usable, as the liberal use of Flash on many websites can bring even the most powerful PC systems to their knees. The same sites wreak absolute havoc on mobile handsets, so it was definitely in Google’s interest to get more closely acquainted with Flash, if only to make it more usable.

Recently though it appears that Google’s support of Flash was actually leading up to a much more ambitious goal, transitioning the web from Flash to a HTML5 future:

Google is enabling developers who use the Adobe Flash Professional developer tool to convert their animations to HTML5 via an extension based on Google’s Swiffy conversion technology.

“One of our main aims for Swiffy is to let you continue to use Flash as a development environment, even when you’re developing animations for environments that don’t support Flash,” said Esteban de la Canal, Google software engineer, in a blog post. “To speed up the development process, we’ve built the Swiffy Extension for Flash Professional. The extension enables you to convert your animation to HTML5 with one click (or keyboard shortcut).”

Now it’s interesting that Google would go ahead and do something like this when Adobe had already made their play in this field with their Wallaby product. The big difference here is that Wallaby was specifically targeted at Flash ads only and didn’t support many of the features that made Flash so versatile, like ActionScript. Swiffy, on the other hand, does support ActionScript and several other features that weren’t present in Wallaby. It would seem then that Google thinks they can do better than Adobe at their own game, which they very well could, especially since Adobe just recently announced that they weren’t working on mobile Flash any more.

Of course the transition from native Flash to Flash rendered through HTML5 doesn’t necessarily mean we’re looking at a future web that performs better. The main problem with Flash wasn’t so much the platform as the developers on that platform. Flash ads were the biggest culprit, often laden with gobs of unnecessary and bloated code that was the source of the performance problems people encountered. Transitioning such ads to HTML5 won’t make that code go away (there is a chance to optimize, but automated tools can only go so far) and the result will more than likely be just as bad as the original Flash it came from. It’s a step in the right direction, yes, but it’s not going to be the all-roses future some would have you believe.

It’s quite interesting to see the kind of games that Google plays in order to make the web better for everyone. At times they may seem to be on the wrong side, but it’s becoming clear that they’re playing the long game for a better web. It will be interesting to see how common Swiffy-converted Flash files become and whether they’re still the performance hogs their predecessors were, but knowing Google they won’t let it lie until they’ve optimized it to the nth degree. Adobe’s reaction to Swiffy will be telling as well, considering they’re now competing directly with Google on their home turf. The end result will be a better, more open Internet for us all, something I think we can all agree is a good thing.


Fusion-IO’s ioDrive Comparison: Sizing up Enterprise Level SSDs.

Of all the PC upgrades I’ve ever done, the one that most notably improved the performance of my rig is, by a wide margin, installing an SSD. Whilst good old fashioned spinning rust disks have come a long way in recent years in terms of performance, they’re still far and away the slowest component in any modern system. This is what chokes most PCs’ performance, as the disk is a huge bottleneck that slows everything down to its pace. The problem can be mitigated somewhat by using several disks in a RAID 0 or RAID 10 set, but all of those pale in comparison to even a single SSD.

The problem doesn’t go away in the server environment either; in fact most of the server performance problems I’ve diagnosed have had their roots in poor disk performance. Over the years I’ve discovered quite a few tricks to get around the problems presented by traditional disk drives, but there are just some limitations you can’t overcome. Recently at work the issue of disk performance came to a head again as we investigated the possibility of using blade servers in our environment. I casually made mention of a company I had heard of a while back, Fusion-IO, who specialise in making enterprise class SSDs. The possibility of using one of the Fusion-IO cards as a massive cache for the slower SAN disk was a tantalizing prospect and, to my surprise, I was able to snag an evaluation unit in order to put it through its paces.

The card we were sent was one of the 640GB ioDrives. It’s surprisingly heavy for its size, sporting gobs of NAND flash and a massive heat sink that hides the proprietary controller. What intrigued me about the card initially was that the NAND didn’t sport any branding I recognised (usually it’s something recognisable like Samsung), but as it turns out each chip is a 128GB Micron NAND flash package. If all that storage were presented raw it would total some 3.1TB, and this is telling of the underlying architecture of the Fusion-IO devices.

The total storage available to the operating system once this card is installed is around 640GB (600GB usable). To get that kind of storage out of the Micron NAND chips you’d only need 5 of them, but the ioDrive comes with a grand total of 25 dotting the board, and no traditional RAID scheme can account for the amount of storage presented. So, based on the fact that there are 25 chips and only 5 chips’ worth of capacity available, it follows that the Fusion-IO card uses quintuplet sets of chips to provide the high level of performance they claim. That’s an incredible amount of parallelism and, if I’m honest, I expected these to all be 256MB chips RAID 1’d together to make one big drive.
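
The arithmetic behind that five-to-one figure, for anyone who wants to check it; the chip count and capacities are the ones observed above.

```typescript
// 25 NAND packages at 128GB each is far more raw flash than the 640GB the
// card presents, which is what points to heavy parallelism/redundancy
// rather than a straightforward RAID layout.
const nandPackages = 25;
const packageGB = 128;
const presentedGB = 640;

const rawGB = nandPackages * packageGB;          // 3200GB of raw flash on the board
const packagesNeeded = presentedGB / packageGB;  // 5 packages would cover the presented capacity
const flashPerPresentedGB = rawGB / presentedGB; // 5x more flash than is actually exposed

console.log({ rawGB, packagesNeeded, flashPerPresentedGB });
```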

Funnily enough I did actually find some Samsung chips on this card, two 1GB DDR2 chips. These are most likely used for the CPU on the ioDrive which has a front side bus of either 333 or 400MHz based on the RAM speed.

But enough of the techno geekery, what’s really important is how well this thing performs in comparison to traditional disks and whether or not it’s worth the $16,000 price tag that comes along with it. I had done some extensive testing of various systems in the past in order to ascertain whether the new Dell servers we were looking at were going to perform as well as their HP counterparts. All of this testing was purely disk based using IOMeter, a disk load simulator that tests and reports on nearly every statistic you’d want to know about your disk subsystem. If you’re interested in replicating the results I’ve uploaded a copy of my configuration file here. The servers included in the test are the Dell M610x, Dell M710HD, Dell M910, Dell R710 and a HP DL380G7. For all the tests (bar the two labelled local install) the servers ran a base install of ESXi 5 with a Windows 2008 R2 virtual machine installed on top of it. The specs of the virtual machine are 4 vCPUs, 4GB RAM and a 40GB disk.

In those tests the ioDrive really was in a class all of its own. The only server that comes close in terms of IOPS is the M910, and that’s because it’s sporting 2 Samsung SSDs in RAID 0. What impresses me most about the ioDrive though is its random performance, which manages to stay quite high even as the block size starts to get bigger. Although it’s not shown in these tests, the one area where the traditional disks actually equal the Fusion-IO is throughput once you get up to really large write sizes, on the order of 1MB or so. I put this down to the fact that the servers in question, the R710s and DL380G7s, have 8 disks in them that can pump out some serious bandwidth when they need to. If I had 2 Fusion-IO cards though I’m sure I could easily double that performance figure.

What interested me next was to see how close I could get to the spec sheet performance. The numbers above are already quite incredible, but Fusion-IO claims that this particular drive is capable of something on the order of 140,000 IOPS if I played my cards correctly. Using the local install of Windows 2008 I had on there I fired up IOMeter again and set up some 512B tests to see if I could get close to those numbers, watching the results in the Dell IO controller software.

Ignoring a small blip in the middle of the run where I had to restart the test, the upshot was that whilst the ioDrive is capable of some pretty incredible IO, the advertised maximums are more theoretical than practical. I tried several different tests and the best of them averaged approximately 80K IOPS, still a far cry from the figures they quote. Had they gotten within 10~20% I would’ve given it to them, but whilst the ioDrive’s performance is incredible it’s not quite as incredible as the marketing department would have you believe.
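
Converting those 512 byte IOPS figures into raw throughput helps put them in perspective; a quick sketch using the quoted spec and my best observed average.

```typescript
// Small-block IOPS translated into bandwidth: even 140,000 IOPS at 512 bytes
// per IO is only ~70MB/s, which is why small-block IOPS and large-block
// throughput tell such different stories about the same device.
const blockBytes = 512;

function throughputMBps(iops: number, bytesPerIo: number): number {
  return (iops * bytesPerIo) / 1_000_000;
}

console.log(throughputMBps(140_000, blockBytes).toFixed(1)); // "71.7" MB/s at the quoted spec
console.log(throughputMBps(80_000, blockBytes).toFixed(1));  // "41.0" MB/s at my observed average
```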

As a piece of hardware the Fusion-IO ioDrive really is the next step up in terms of performance. The virtual machines I had running directly on the card were considerably faster than their spinning rust counterparts and, if you were in need of some really crazy performance, you really couldn’t go past one of these cards. For the purpose we had in mind for it however (putting it inside a M610x blade) I can’t really recommend it, as that’s a full height blade that only has the power of a half height one. The M910 represents much better value with its crazy CPU and RAM count, and its SSDs, whilst being far from Fusion-IO level, do a pretty good job of bridging the disk performance gap. I didn’t have enough time to see how it would improve some real world applications (it takes me longer than 10 days to get something like this into our production environment) but based on these figures I have no doubt it would considerably improve the performance of whatever I put it into.

So Long Flash and Thanks for all the Vids.

You’d be forgiven for thinking that I was some kind of shill for Adobe what with all the pro-Flash articles I’ve posted in the past. Sure I’ve taken their side consistently, but that’s not because of some kind of fanboy lust for Adobe or some deep rooted hatred for Apple. More it was because the alternatives, HTML5 with CSS3 and JavaScript, are still quite immature in terms of tooling, end user experience and cross platform consistency. Flash on the other hand is quite mature in all respects and, whilst I do believe that the HTML5 path is the eventual future for the web, Flash will remain a dominant part of the web for a while to come, even if it’s just for online video.

Adobe had also been quite stalwart in their support for Flash, refusing to back down from their stance that they were “the way” to do rich content on the Internet. Word came recently, however, that they were stopping development on the mobile version of Flash:

Graphics software giant Adobe announced plans for layoffs yesterday ahead of a major restructuring. The company intends to cut approximately 750 members of its workforce and said that it would refocus its digital media business. It wasn’t immediately obvious how this streamlining effort would impact Adobe’s product line, but a report that was published late last night indicates that the company will gut its mobile Flash player strategy.

Adobe is reportedly going to stop developing new mobile ports of its Flash player browser plugin. Instead, the company’s mobile Flash development efforts will focus on AIR and tools for deploying Flash content as native applications. The move marks a significant change in direction for Adobe, which previously sought to deliver uniform support for Flash across desktop and mobile browsers.

Now the mobile version of Flash had always been something of a bastard child, originally featuring a much more cut down feature set than its fully fledged cousin. More recent versions brought them closer together but the experience was never quite as good especially with the lack of PC level grunt on mobile devices. Adobe’s mobile strategy now is focused on making Adobe AIR applications run natively on all major smart phone platforms, giving Flash developers a future when it comes to building mobile applications. It’s an interesting gamble, one that signals a fundamental shift in the way Adobe views the web.

Arguably the writing has been on the wall for this decision for quite some time. Back at the start of this year Adobe released Wallaby, a framework that gives advertisement developers the ability to convert Flash ads into HTML5. Indeed even back then I said that Wallaby was the first signal that Adobe thought HTML5 was the way of the future and was going to start transitioning towards it as their platform of choice. I made the point then that whilst Flash might eventually disappear Adobe wouldn’t, as they have a history of developing some of the best tools for non-technical users to create content for the web. There are already prototypes of such tools available, so it’s clear that Adobe is looking towards a HTML5 future.

The one place that Flash still dominates, without any clear competitors, is online video. Their share of the market is somewhere around 75% (that’s from back in February so I’d hazard a guess that it’s lower now) with the decline being driven by mobile devices that lack support for Flash video. HTML5’s alternative is unfortunately still up in the air as the standards body struggles to find an implementation that can be open, unencumbered by patents and yet still able to support things like Digital Rights Management. It’s this lack of standardization that will see Flash around for a good while yet; until there’s an agreed upon standard that meets all those criteria Flash will remain the default choice for online video.

So it looks like the war I initially believed Adobe would win has instead seen Adobe pursuing a HTML5 future. It’s probably for the best, as they will then be providing some of the best tools on the market whilst still supporting open standards, something that’s to the benefit of all users of the Internet. Hopefully that will also mean better performing web sites, as Flash had a nasty reputation for bringing even some of the most powerful PCs to their knees with poorly coded ads. The next few years will be crucial to Adobe’s long term prospects but I’m sure they have the ability to make it through to the other end.

The Memristor: Moore’s Law Gets a Jolt.

The computer (or whatever Internet capable device you happen to be viewing this on) is made up of various electronic components. For the most part these are semiconductors, devices which allow the flow of electricity but don’t do it readily, but there’s also a lot of supporting circuitry built from what we call the fundamental components of electronics. As almost any electrical enthusiast will tell you there are 3 such components: the resistor, the capacitor and the inductor, each with its own set of properties that makes it useful in electronic circuits. There’s been speculation about a 4th fundamental component for about 40 years, but before I talk about that I’ll need to give you a quick run down on the properties of the current fundamentals.

The resistor is the simplest of the lot: all it does is impede the flow of electricity. They’re quite simple devices, usually a small brown package banded by 4 or more colours which denote just how resistive it actually is. Resistors are often used as current limiters, as the amount of current that can pass through them is directly related to the voltage across them and their resistance (Ohm’s law). In essence you can think of them as narrow pathways which electric current has to squeeze through.

Capacitors are intriguing little devices and can best be thought of as batteries. You’ve seen them if you’ve taken apart any modern device, as they’re those little canister looking things attached to the main board. They work by storing charge in an electrostatic field between two metal plates that are separated by an insulating material called a dielectric. Modern day capacitors are essentially two metal plates and the dielectric rolled up into a cylinder, something you could see if you cut one open. I’d only recommend doing this with a “solid” capacitor as the dielectrics used in other capacitors are liquids and tend to be rather toxic and/or corrosive.

Inductors are very similar to capacitors in that they also store energy, but instead of an electrostatic field they store it in a magnetic field. Again you’ve probably seen them if you’ve cracked open any modern device (or, say, looked inside your computer) as they look like little rings of metal with wire coiled around them. They’re often referred to as “chokes” as they tend to oppose changes in the current that induces the magnetic field within them, and at high frequencies they’ll appear as a break in the circuit, useful if you’re trying to keep alternating current out of your circuit.
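
To put some rough numbers on those three descriptions, here’s a quick sketch using the textbook relations (Ohm’s law for the resistor, stored energy for the capacitor and reactance for the inductor); the component values are arbitrary examples, not anything from a real circuit.

```typescript
// Textbook behaviour of the three classic passives with arbitrary example
// values. Resistor: I = V / R. Capacitor: stored energy E = 0.5 * C * V^2.
// Inductor: reactance X_L = 2 * pi * f * L, which grows with frequency --
// the "choke" behaviour described above.
const volts = 5;

const resistorOhms = 250;
console.log(`Current limited to ${volts / resistorOhms}A`); // 0.02A (20mA)

const capacitanceFarads = 100e-6; // a 100uF capacitor
console.log(`Stored energy: ${0.5 * capacitanceFarads * volts ** 2}J`); // 0.00125J

const inductanceHenries = 10e-3; // a 10mH choke
for (const freqHz of [50, 50_000]) {
  const reactanceOhms = 2 * Math.PI * freqHz * inductanceHenries;
  console.log(`Reactance at ${freqHz}Hz: ${reactanceOhms.toFixed(1)} ohms`);
} // ~3.1 ohms at 50Hz vs ~3141.6 ohms at 50kHz
```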

For quite a long time these 3 components formed the basis of all electrical theory and nearly any component could be expressed in terms of them. However back in 1971 Leon Chua explored the symmetry between these fundamental components and inferred that there should be a 4th fundamental component, the Memristor. The name is a combination of memory and resistor and Chua stated that this component would not only have the ability to remember its resistance, but also have it changed by passing current through it. Passing current in one direction would increase the resistance and reversing it would decrease it. The implications of such a component would be huge but it wasn’t until 37 years later that the first memristor was created by researchers in HP’s lab division.
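
Chua’s symmetry argument is easiest to see written out: the three known components each relate a pair of the four basic circuit variables (charge q, current i, voltage v and flux φ), and the memristor supplies the one pairing that was missing.

```latex
\begin{aligned}
\text{Resistor:}  \quad & dv       = R\,di \\
\text{Capacitor:} \quad & dq       = C\,dv \\
\text{Inductor:}  \quad & d\varphi = L\,di \\
\text{Memristor:} \quad & d\varphi = M\,dq
\end{aligned}
```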

What’s really exciting about the memristor is its potential to replace other solid state storage technologies like Flash and DRAM. Due to the memristor’s simplicity they are innately fast and, best of all, they can be integrated directly onto processor dies. If you look at the breakdown of a current generation processor you’ll notice that a good portion of the silicon is dedicated to cache, or onboard memory. Memristors have the potential to boost the amount of onboard memory to extraordinary levels, and HP believes they’ll be doing that in just 18 months:

Williams compared HP’s resistive RAM technology against flash and claimed to meet or exceed the performance of flash memory in all categories. Read times are less than 10 nanoseconds and write/erase times are about 0.1-ns. HP is still accumulating endurance cycle data at 10^12 cycles and the retention times are measured in years, he said.

This creates the prospect of adding dense non-volatile memory as an extra layer on top of logic circuitry. “We could offer 2-Gbytes of memory per core on the processor chip. Putting non-volatile memory on top of the logic chip will buy us twenty years of Moore’s Law,” said Williams.

To put this in perspective, Intel’s current flagship CPU ships with a total of 8MB of cache, shared between 4 cores. A similar memristor based CPU would have a whopping 8GB of on board cache, effectively negating the need for external DRAM. Couple this with a memristor based external drive for storage and you’d have a computer that’s decades ahead of the curve in terms of what we thought was possible, and Moore’s Law can rest easy for a while.
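
The scale of that jump is worth spelling out; a trivial calculation using HP’s claimed 2GB per core against the 8MB of shared cache mentioned above.

```typescript
// HP's claim of 2GB of on-die memristor memory per core, compared against
// the 8MB of cache shared across a current four-core flagship CPU.
const coreCount = 4;
const memristorGBPerCore = 2;
const currentSharedCacheMB = 8;

const memristorCacheMB = coreCount * memristorGBPerCore * 1024; // 8192MB, i.e. 8GB
console.log(`${memristorCacheMB / currentSharedCacheMB}x more on-chip memory`); // 1024x more on-chip memory
```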

This kind of technology isn’t your usual pie-in-the-sky “it’ll be available in the next 10 years” malarkey, this is the real deal. HP isn’t the only one looking into it either: Samsung (one of the world’s largest flash manufacturers) has also been aggressively pursuing the technology and will likely début products around the same time. For someone like me it’s immensely exciting as it shows that there are still many great technological advances ahead of us, just waiting to be uncovered and put into practice. I can’t wait to see how the first memristor devices perform as it will truly be a generational leap in technology.

 

Windows 8: The Death of the Silverlight Ecosystem?

It’s only been just over a week since Microsoft demoed their latest iteration of the Windows platform but in that short amount of time it’s already managed to stir up quite a bit of discussion from friends and foes alike. The foes were quick to call out the new OS’s tablet envy, conveniently forgetting Microsoft’s rhetoric that the next version of Windows after 7 was going to have a much more web centric focus, with the possibility of it being entirely cloud based. More interesting however is the discussion arising from long term developers on the Microsoft platform, and it’s not the kind of adulation and praise you’d normally expect.

During the D9 conference Microsoft said that the new tile mode in Windows 8 was based around HTML5 and JavaScript applications. Whilst they did mention that all current apps built on the .NET platform should run as intended in the familiar desktop mode, they made no mention of whether the .NET and Silverlight platforms could be used to create applications in the new style of interface. With Microsoft traditionally being quite favorable to developers, the notion of having to re-skill to HTML5 and JavaScript (not to mention reworking existing codebases) came as quite a shock to a lot of developers and their reaction was akin to an open revolt on the forums.

Rampant speculation soon followed, not helped by the fact that Microsoft has asked everyone to remain calm until their BUILD developer conference in September. It’s not the first time this sort of thing has happened either: a similar level of hubbub was roused when Microsoft was coy about Silverlight’s future when talking about Internet Explorer 9 and its dedication to web standards. They soon came out saying that they still saw a future in Silverlight, especially for the Windows Phone 7 platform, but many developers were left unconvinced. It’s quite likely then that this second round of doubt Microsoft has cast over their third party developers’ futures was the straw that broke the camel’s back, and all the blame is being leveled squarely at Microsoft.

For what it’s worth I feel their concerns are valid if the reaction to them is somewhat overblown. Microsoft has a long history of eating its own dog food and many of their client facing applications are built upon the technologies that so many are worried are going to disappear in the near future. The best example of this is their Windows Azure management console which is built entirely on Silverlight. Couple that with the fact that Microsoft has many partners with a very heavy investment in the platform and I find it hard to jump on the “Silverlight is dead” bandwagon, but that doesn’t necessarily mean Microsoft is committed to bringing Silverlight into the Windows 8 tablet world.

Sure it would be great to be able to create Silverlight applications on the new Windows 8 tile system, and Microsoft would be leveraging a lot of preexisting talent to help drive adoption of the platform. However it would also hinder Microsoft’s adoption of web standards, as many developers would favor using proprietary Microsoft technologies instead of attempting to reskill. They’d then be the slave of two masters: on the one hand the Silverlight crowd demanding ever more features and tools that are constrained to that platform, and on the other the web standards crowd that has been Microsoft’s bugbear ever since alternative browsers started to gain real market traction. It’s not like Microsoft doesn’t have the resources to deal with this, but I can understand their motivations should they want to eschew Silverlight in favour of a more standards based environment.

So is this the end of the line for the Silverlight ecosystem and the developers who built their skills around it? Hard to say, with Microsoft being mum on it for the next few months we’ll just have to play it by ear until we get more information from them. In all honesty even if they do end up dropping Silverlight for HTML5 and Javascript I’d expect that the next release of Visual Studio would bring enough tools and resources with it to make the transition much easier than everyone is making it out to be. Hell if Adobe can build a Flash to HTML5 converter then it’s quite possible for Microsoft to do the same for Silverlight, even if that’s just a band-aid solution to satisfy developers who refuse to reskill.

 

Adobe’s Wallaby: And You Thought HTML5 Would Save You.

Adobe and Apple haven’t been the best of friends for a while now. Whilst many of Adobe’s products are still considered some of the best applications available on the OS X platform, Apple couldn’t be more hostile to their most popular product: Flash. Now this isn’t without good reason, as Flash has a terrible tendency to be abused by sloppy developers (most of the time ad networks) who can bring even a full blown desktop PC to its knees. Keeping Flash out of their handhelds meant fewer headaches for Apple and forced many companies to rethink their use of Flash, lest they draw the ire of the iOS browsing crowd.

Whilst there was a good few months of to and fro between these two companies last year, it all subsided once Apple capitulated to the developer community that raised concerns over Apple’s wide reaching policy on cross platform libraries. This seemingly opened up the door that Apple had shut in Adobe’s face, enabling them to create a product that could convert Flash files into a more iOS friendly format. A couple of days ago Adobe announced the first iteration of that product, called Wallaby:

Welcome to the Wallaby Technology Preview. Wallaby is an application to convert Adobe Flash Professional CS5 files (.FLA) to HTML5. Wallaby has a very simple UI which accepts as input a FLA file and exports HTML and support files to a user-selected folder. There is also an option to launch the default application assigned for the .html extension.

The announcement has, of course, caused quite a stir in the tech community. Most of the commentary focuses on the fact that Wallaby was designed with only one purpose in mind: to get Flash banner ads working on iOS devices. As such Wallaby is pretty limited in the functionality it provides, being unable to convert things like ActionScript, which enables things like Flash based games. Of course this also raises the issue that Flash is most often abused by advertising agencies, with poorly coded banner ads being one of the main culprits. Whether or not badly coded ads in Flash translate into bad (or worse) ads in HTML5 remains to be seen, but I can’t see how they could get any better.

Realistically the issues that many people associate with Flash aren’t really caused by it; rather it’s those who use the platform that are to blame for the troubles people encounter with it. This is why I didn’t understand Apple’s position on Flash in the first place. Sure there are many banner ads out there that can make your web experience a browsing hell, but banning one technology simply drives those same people to look for other platforms; it won’t magically make them better developers overnight. Wallaby is a great example of this, as those same people that created poor performing Flash ads can now do the same in HTML5. In the end Apple is merely delaying the time it takes for the same problems that plagued Flash to come to their iOS platform. Google, I feel, is on the right track to solving this problem, tightly integrating Flash into their products so they can tune it properly.

It does show that Adobe doesn’t believe the future is still with their Flash platform and that the gears are in motion to transition to the new world of HTML5. There’s a reason Flash has been such an integral part of the web for so long, and it’s simply because Flash provided the best tools for non-technical users to create rich content for the web. Whilst they’ve come rather late to the mobile boat they are one of the few companies with the momentum and devoted user base to make the switch successfully. I’m sure many people will see this as them “capitulating” to Apple’s demands but in reality it’s anything but, and I’m sure they’ll eventually dominate the HTML5 space just as they’ve done in the past with Flash.

 

Apple Caves Under Developer Pressure.

Apple’s policies for the App Store have always been a bit vague and uneven, leading to quite a few good headlines over which apps got rejected and which ones got in. I put it down to the human element in the review process, as one reviewer’s biases need not line up with another’s. Still the developers worked out the inputs and outputs of the application review process, and if your app was useful, family friendly and didn’t go rampaging through private APIs you were golden. Apple, not content with the amount of control it was already exerting over its developers, then decided to up the ante by banning all cross platform frameworks, putting a big question mark over some of their most successful applications and developers.

The whole thing can be traced back to Apple’s public flamewar with Adobe. I’m not really sure what triggered the decision in the first place (although it smacks of Jobs’ idealism) but they did it with precise timing, just a few days before Adobe was to announce their Flash to iPhone app packager for CS5. Perhaps the idea of a torrent of applications hastily converted from Flash onto the iPhone was a bit too much for them to bear, but in casting their net so wide they caused many people to become hesitant about developing on the platform, especially those who had found great success using such 3rd party frameworks.

Apple began doing some damage control in order to ensure that they wouldn’t lose some of their biggest money earners. They gave unofficial word that frameworks such as Unity3D were safe since they generated an actual iPhone application and didn’t require use of an intermediate interpreter. Still since coding in Unity3D is done in C# this ran up against yet another draconian rule that all iPhone applications must be written in one of the sanctioned C based languages. With Android starting to pick up at a phenomenal pace there’s no doubt that Apple began to rethink their stance on many of these matters with hopes of winning back the developers they had once scorned.

Last week saw Apple release what amounts to the set of principles and guidelines that are applied when reviewing apps that will make it onto their app store. You can get the full PDF of all the guidelines here and it makes for some interesting reading. Most of them are just formalisations of rules that most developers knew about but couldn’t get solid verification from Apple were hard and fast. Probably the biggest coup in this whole document is that they abandoned their previous stance of not allowing any cross platform libraries, allowing such applications through as long as they didn’t download any code:

The black box that is the Apple review process is creaking open. In a very brief release, Apple has essentially relaxed the requirement that developers use Apple’s own development tools “as long as the resulting apps do not download any code.” They’ve also published some review guidelines, allowing programmers to understand just what will go on behind the curtains in Cupertino. What does this mean? Well, in the updated SDK license, circa April of this year, a number of paragraphs essentially banned outside development tools including systems that ported Flash, Silverlight, Java, and other platforms to the iPhone. Now, presumably, any app that runs on the iPhone, regardless of source, will be considered. The language is so mushy that it’s still unclear what this means.

On the surface it would appear that Apple has backpedalled on their previous stance. Indeed the news was enough for Adobe to state that they were going to restart development of their Flash to iPhone packager, which had been shelved after Apple hamstrung it earlier this year. The “not downloading any code” exclusion is quite understandable, as this could easily be exploited as an attack vector by a malicious third party. Still, most attackers wouldn’t bother with an app (that leaves a paper trail) since the browser on the phone will happily download code and run it. But I’m sure Apple knows that already.

For what it’s worth it seems like Apple is finally caving in to the developers who helped make their products so successful, and rightly so. Developing something for an Apple product has always been about the end user, much to the detriment of those creating for those users. This is in stark contrast to Google, who have always been about the developers, favouring their freedom to develop however they want with almost no thought to the user experience. Both approaches have their pluses and pitfalls, but in the end if you don’t have developers you’re going to have a hard time attracting users to your platform.

Will this lead to a flood of low quality applications on the app store and the fiery death of the user experience on the iPhone? Most likely not as there’s already enough crap on the app store to make sure that any poorly ported Flash app will be lost amongst the noise. Realistically anyone looking to publish on the iOS platform knows what they’re getting into and will redesign the app as such, lest they get bad reviews that ultimately bury their app completely. In the end I think it’s just Apple realising that the road they were going down wasn’t going to do them any favours and the rising star of Android is beginning to look attractive enough for some to make the switch.

The question now is though, will they keep their hard line on Flash? Time will tell.

Web Standards: They All Have Their Agenda.

It really should come as no surprise that anything a large corporation does is usually done in its own best interests. By definition their existence is centered around increasing profit for their respective shareholders within the bounds of the law, and a company operating outside that definition is not long for this world. Still we manage to suspend disbelief for certain companies which have qualities we aspire to, but make no mistake: they are, in the end, driven primarily by profit. Nearly all their secondary activities are conducted to further that primary directive, even if on the surface they don’t appear that way.

Take for instance the current web standards war that’s brewing between Apple and Adobe. Whilst both companies would have you believe that their stance is the only answer to the problem, the fundamental issue is not one of ubiquitous web standards; it’s about control over the future of the Internet and who will be the dominant player. I’m on record as stating that Adobe will win out thanks to its current market penetration and support from many big players. It’s no secret that Google is more on Adobe’s side in this war than Apple’s, as a recent post from one of their (well, their subsidiary’s) employees states:

There’s been a lot of discussion lately about whether or not the HTML5 <video> tag is going to replace Flash Player for video distribution on the web. We’ve been excited about the HTML5 effort and <video> tag for quite a while now, and most YouTube videos can now be played via our HTML5 player. This work has shown us that, while the <video> tag is a big step forward for open standards, the Adobe Flash Platform will continue to play a critical role in video distribution.

It’s important to understand what a site like YouTube needs from the browser in order to provide a good experience for viewers as well as content creators. We need to do more than just point the browser at a video file like the image tag does – there’s a lot more to it than just retrieving and displaying a video. The <video> tag certainly addresses the basic requirements and is making good progress on meeting others, but the <video> tag does not currently meet all the needs of a site like YouTube:

All of the points Harding makes add quite a lot of fuel to the fire in the whole web standards debate. He’s quite right that the current version of HTML5 does not (and most likely cannot) provide the features required by sites like YouTube. As such there will always be a need for plugins that fill the functionality gap between the web standards and what is technically possible. The richer the standards are the less need there is for plugins, but as it stands right now the features provided by third party plugins are almost a necessity for a lot of sites on the Internet and it will be a long time before the standards catch up.

However if you read on you’ll see that YouTube’s apprehension about switching over to a full HTML5 based site is fueled not only by a lack of features but also because their bread and butter, video, still lacks agreement on some core components. One of those is the codec that will be used as the standard for all content delivered with the <video> tag. Usually you would go with the most popular codec of the lot, which is currently H.264. The problem with that codec is that, while it is currently royalty free, it is encumbered by a number of patents held by a consortium of companies. This poses a problem for browser developers as it means eventually they will have to pay fees to implement the video part of the web standard, which doesn’t really fit with the overall vision of the HTML5 standard. Google of course has their own open codec, VP8, which they’ve garnered support for, and that brings us full circle back to my original point: they’re only developing it to further their bottom line.
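
To make the codec problem concrete, here’s a rough sketch of the kind of feature detection a video site has to do while the <video> codec question is unsettled, using the standard canPlayType API; the Flash branch is just a placeholder for a real player embed.

```typescript
// Browser-side codec negotiation for the <video> tag: probe for H.264 and
// WebM (VP8) support via the standard canPlayType API and fall back to a
// Flash player when neither is available. Runs in a browser; the "flash"
// branch stands in for a real Flash embed.
function pickVideoPlayback(): "h264" | "webm" | "flash" {
  const probe = document.createElement("video");
  if (probe.canPlayType('video/mp4; codecs="avc1.42E01E, mp4a.40.2"')) {
    return "h264";
  }
  if (probe.canPlayType('video/webm; codecs="vp8, vorbis"')) {
    return "webm";
  }
  return "flash";
}

console.log(`Using ${pickVideoPlayback()} playback`);
```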

Ultimately it will be the market that decides the winner out of all this. Web standards will always lag behind what Internet enabled devices are capable of and that will mean there will have to be third party plugins to bridge the gap. Whether that gap is bridged by Adobe, Apple or some other company remains to be seen but so far the market still seems to side with Adobe as the vast majority of sites (including this one) make use of Flash in one way or another. Many sites will still go to the effort to make their content more accessible to mobile devices (like this one!) but in the end we’d still have to do that even if Apple ends up losing the war on Flash.

I guess what I’m trying to say is: if a company tells you they’re doing something that seems to be for your benefit ask yourself what they have to gain from doing it. In the end you’ll notice that they will be benefiting from it far more than you ever could.