For much of my childhood people told me I was smart. Things that frustrated other kids, like maths, seemed to come easily to me and this led to many people praising my ability. I never felt particularly smart; there were dozens of other kids who were far more talented than I was, but at that age it’s hard to deny the opinions of adults, especially the ones who raised you. This led to an unfortunate misconception that stayed with me until after I left university: the idea that my abilities were fixed and that anything I found hard was simply beyond me. It’s only since then, some 8 years or so, that I’ve learnt that any skill or problem is within my capability, should I be willing to put the effort in.
It’s a theme that will likely echo among many of my generation, as we grew up with parents who were told that positive reinforcement was the way to make your child succeed in the world. It’s only now, after decades of positive reinforcement failing to produce the outcomes it promised, that we’re beginning to realise the folly of our ways. Much of the criticism of our generation focuses on this aspect: that we’re too spoilt, too demanding compared to previous generations. If there’s one good thing to come out of this, however, it’s that research has shown that praising a child’s ability isn’t the way to go; you should praise them for the process they go through.
Indeed, once I realised that things like skills, abilities and intelligence were primarily a function of the effort and process you go through to develop them, I was suddenly greeted with a world of achievable goals rather than roadblocks. At the same time I grew to appreciate those at the peak of their abilities, as I knew the amount of effort they had put in to develop the skills that allowed them to excel. Previously I would have simply dismissed them as lucky, winners of a genetic lottery that gave them all the tools they needed to excel in their field whilst I languished in the background.
It’s not a silver bullet, however, as the research shows the same issues with positive reinforcement arise if process praise is given too often. The nuances are also unknown at this point, like how often you should give praise and in what fashion, but the research does show that giving process praise in moderation has long-lasting benefits. I’d be interested to see how well this translates to adults as well, since my experience has been vastly positive once I made the link between effort and results. I can’t see it holding true for everyone, as most things don’t in this regard, but if it generally holds then I can definitely see a ton of benefits from it being implemented.
In the eyes of corporate IT shops the word virtualization is synonymous with the VMware brand. The reason for this is simple: VMware was first to market with solutions that could actually deliver tangible results to the business. VMware then made the most of this first mover advantage, quickly diversifying their product portfolio away from straight-up virtualization into a massive service catalogue that no competitor has yet matched. There’s no denying that theirs is the priciest of the solutions, but many IT shops have been willing to wear the costs due to the benefits they receive. However, in the past couple of years the competitors, namely Hyper-V and Xen, have started to catch up in features, and this has seen many IT shops questioning their heavy investment in VMware.
Undoubtedly this dissatisfaction with VMware’s products has been catalysed by the licensing change in vSphere 5, which definitely gave the small to medium section of the market some pause when it came to keeping VMware as a platform. For larger enterprises it wasn’t such a big deal, since realistically they’d already licensed most of their capacity anyway. Still, it’s been enough for most of them to cast a careful eye over their current spend on VMware’s products and see if there’s perhaps a better way to spend all that cash. Indeed, a recent survey commissioned by Veeam showed that 38% of virtualized businesses were looking to switch platforms in the near future.
The report doesn’t break down exactly which platforms they’re switching from and to, but since the 3 biggest reasons cited are cost, alternative hypervisor features and licensing model (all long-time complaints about the VMware platform) it’s a safe bet that most of those people are considering changing from VMware to another platform (typically Hyper-V). Indeed, I can add that anecdotally the costs of VMware are enough now that businesses are seriously considering the platform swap because of the potential savings from a licensing perspective. Hyper-V is the main contender because most virtualization is done with Windows servers, and under the typical licensing agreements the hypervisor is usually completely free. Indeed, even the most basic Windows Server license gives you 1 free virtual machine to play with, and it just gets better from there.
But why are so many considering switching from the market leader now, when the problems cited have been around for nearly half a decade? For the most part it has to do with the alternatives finally reaching feature parity with VMware when it comes to base-level functionality. For the longest time VMware was the only one capable of doing live migrations between hosts, with technology they called vMotion. Xen caught up quickly, but its lack of Windows support meant it saw limited use in corporate environments, even after that support was added shortly afterwards. Hyper-V, on the other hand, struggled to get live migration working, only releasing it with Server 2008 R2. With Windows 2003 and XP now on the way out many IT shops are looking to upgrade to 2008 R2, and that’s when they notice the capabilities of Hyper-V.
Strictly speaking though, I’d say that whilst there’s a good few people considering making the jump from VMware to another hypervisor, the majority are only doing so in order to get a better deal out of VMware. As with any business arrangement the difference between the retail price and the actual price anyone pays is quite large, and VMware is no exception to this rule. I’ve seen quite a few decision makers wave the Hyper-V card without even the most rudimentary understanding of its capabilities, nor any concrete plans to put it in motion. There’s also the fact that if you’re on VMware now and you switch to another platform you’re going to have to make sure all your staff are retrained on the new product, a costly and time-consuming exercise. So whilst the switch from VMware may look like the cheaper option if you just look at the licensing, there’s a whole swathe of hidden and intangible costs that need to be taken into consideration.
So with all that said, is VMware staring down the barrel of an inevitable demise? I don’t believe so; their market capture and product lead mean that they’ve got a solid advantage over everyone in the market. Should the other hypervisors begin eating away at their market share, they have enough of a lead to be able to react in time, either by significantly reducing their prices or simply innovating their way ahead again. I will be interested to see how these figures shape up in say 3, 9 or 12 months from now, to see if those 38%ers made good on their pledge to change platforms, but I’m pretty sure I know the outcome already.
Just over a year ago today I started this blog as part of a larger body of work to combat the lunacy that is the Internet filter. I thought we were doing a good job of it too, since the trial was delayed several times and, as far as anyone could tell, the policy was dying a slow, quiet death. Indeed, with companies like the Internet giant Google damning the policy you’d think that the government would want it to disappear quietly into the dark night. As it turns out nothing could be further from the truth, with several news articles coming out yesterday stating that not only had the trial been successful, it had actually achieved filtering nirvana:
THE Federal Government is pushing ahead with its controversial plan to filter the internet, saying illegal material can be blocked “with 100 per cent accuracy and negligible impact on internet speed”. It has just released results of its latest live filtering trials, used as proof that a national internet filter will work.
Labor will introduce legislation next year requiring all service providers to ban “refused classification” (RC) material hosted on overseas servers.
Communications Minister Stephen Conroy says RC material includes “child sex abuse content, bestiality, sexual violence and the detailed instruction of crime and drug use”.
“Most Australians acknowledge there is some internet content which is not acceptable in any civilised society,” Senator Conroy said.
A little digging around got me a link to the full report, available here. Looking into the report there are a few issues I can identify outright, and some more insights I’ve gleaned after reading the whole thing. Overall it doesn’t bode well for those of us Australians who enjoy our Internet unfiltered.
The first issue I take with the report is this line on page 7 (you’ll have to forgive their spelling mistakes too):
Participants were tested for accuracy in blocking the ACMA blacklist only and all nine participants achieved 100 percent accuracy ‐ a base requirement of the pilot.
OK, this is not what the initial proposal for the filter was, nor what Conroy’s rhetoric had alluded to. Filtering a list of 10,000 URLs is a trivial exercise and I’m not surprised that such a filter worked at an ISP level. In fact the government has already provided software to parents that works to the same effect, and it can run on a home-grade computer. This is not the heart of the problem though, as the technical challenge was only a small part of it. Nowhere in the report or the rhetoric do we see a policy for how URLs get onto the blacklist, nor how to get one off should yours somehow end up on it. In essence the premise of the testing was a complete and utter farce.
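To put in perspective just how trivial blocking a fixed list of 10,000 URLs is, here’s a minimal sketch of what that kind of static blacklist matching amounts to (the URLs are made up for illustration, not from any real blacklist):

```python
# A static blacklist loaded into a set: membership checks are O(1) on
# average, so even 10,000 entries add negligible latency per request.
# These URLs are purely illustrative.
BLACKLIST = {
    "http://blocked.example.com/page-one",
    "http://banned.example.org/some/path",
}

def is_blocked(url: str) -> bool:
    """Return True if the exact URL appears on the blacklist."""
    return url in BLACKLIST
```

A lookup like this will trivially achieve “100 per cent accuracy” against the list it was given, which is exactly why that figure proves nothing about filtering content that isn’t on the list.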
The report indicates that any measures taken to prevent circumvention will have a negative impact on performance (pp 3, 25-27). When the results of this report were released there was no mention of this, and it leaves the government with 2 options when they try to push the filter through. They either have to mandate that circumvention prevention be enabled (we can’t have the kids getting around this filter now, can we), which degrades performance significantly, or they simply leave it out, meaning that anyone with 5 minutes and Google can circumvent it. In essence, saying that the filter trial was 100% successful is again misleading, since any filter implemented on the back of these results will either fail to provide the service it seeks to achieve or send Australia’s Internet to the digital backwater. Again, it’s a load of bull.
However it seems that Telstra showed a small bit of sense for once (pg 7), which also provided some insight into the larger issues at hand:
Telstra did not test circumvention, because it considers that filtering can be circumvented by a technically competent user.
Telstra found its filtering solution was not effective in the case of non‐web based protocols such as instant messaging, peer‐to‐peer or chat rooms. Enex confirms that this is also the case for all filters presented in the pilot. Telstra reported that heavy traffic sites could overload its trial filtering solution if included in the filtering blacklist. This is also the case for all filters presented in the pilot.
So let me get this straight: you can’t filter P2P (which Conroy said he was going to do as well), and if a high-traffic site somehow manages to get on the blacklist your filtering solution will get overloaded, which would then, logically, lead to either slowdowns or loss of Internet for those behind it? Heaven help them if RedTube ever ends up on that list, oh wait, it already is. Trying to implement this kind of thing with an Alexa Top 100 site on the list, and one that ranks in the top 50 in Australia, will almost certainly overload the filters of any truly large-scale ISP that tries to implement these technologies.
There’s another small issue here too: none of the participants are named, and neither are their solutions, making a real analysis of these results impossible. If we go off the list they released a long time ago, 5 of them were small-time ISPs and only one of them was semi-large (iPrimus), but still a small player with respect to the larger Internet community in Australia. Their report states that there were 9 ISPs in total (2 large, 1 medium and 6 small), however with Optus being the only large provider that has openly supported the filter (all the others have been outright hostile, and Telstra didn’t test on their live network) that only leaves the medium (iPrimus) and the 6 small ISPs for them to base their tests on. You can see why I question how relevant the results really are.
The report shows just how ridiculous the filter really is, and how you can distort any test results to support your rhetorical point of view. Any real implementation of the filter will not mimic these results, and trumpeting them as proof that such a thing is viable is an insult to the public’s intelligence. I hope you will all join me in sending Conroy a message that this kind of malarkey will not be tolerated by the Australian community at large.