The representation of climate change science in the media has, up until recently, been rather poor. Far too many debates and articles gave the impression that there were still two sides to the argument when, in fact, the overwhelming majority of evidence favours only one. The last few years have seen numerous campaigns to rectify this situation and, whilst we still haven't convinced everyone of the facts, it's been great to see a reduction in the number of supposedly "fair" debates on the topic. However, if a recent study of the general population's knowledge of this topic is anything to go by, lack of knowledge might not be the problem at all; it might just be the culture surrounding it.
A recent study by Professor Dan Kahan of Yale University set out to understand just how literate people were in general science as well as climate change science. The results are rather surprising (and ultimately disturbing): whilst you'd tend to think that a better general understanding of science would lead to a better understanding of the risks associated with climate change, the study shows that it isn't a predictor at all. Indeed the strongest predictor of perceived risk was actually left-right political affiliation, with greater scientific knowledge actually increasing the divide between the two camps. This leads us to a rather ugly conclusion: educating people about the facts behind climate change is most likely not going to change their opinion of it.
Whilst the divide along party lines isn't going to shock anyone, the fact that both sides of the political landscape are about as educated as each other on the topic was a big surprise to me. I had always thought it was more ignorance than anything else, as a lot of the arguments I'd had around climate change usually centred on the supposed lack of scientific consensus. Had I dug further into their actual knowledge, though, it seems they may have been more knowledgeable than I first thought, even if the conclusions they drew from the evidence were out of touch with reality. This signals that we, as those interested in spreading the facts and evidence as accepted by the wider scientific community, need to reframe the debate from one of education to something else that transcends party lines.
What that solution would be, though, is something I just don't have a good answer to. At an individual level I know I can usually convince most people of the facts if I'm given enough time (heck, up until 5 years ago I was on the other side of the debate myself) but the strategies I use there simply don't scale to the broader population. Taking the politics out of an issue is no simple task, and one I'd wager has never been done successfully before, but until we find a way to break down the party lines on climate change I feel that meaningful progress will remain a goal that's never met.
It’s easy to forget that the whole world doesn’t share your view of it. With the vast trove of information that is the Internet it’s become incredibly easy to surround yourself with an echo chamber of like-minded people, lulling you into the idea that you’re part of the majority. Once you’ve been in there long enough it’s easy to dismiss dissenting viewpoints as outliers, ones that shouldn’t have any effect on your perfect worldview. Of course, if you have a rational mindset you’ll eventually come to learn that the world is quite unlike what any echo chamber claims it to be, and the reality can sometimes be quite sickening. Indeed I’ve fallen prey to this many times, but none was as shocking to me as the current state of science education among the USA’s population.
The table above is taken from the AP/GFK poll conducted between March 20-24 of this year, with just over 1,000 respondents. The margin of error for the study is +/- 3.4% at the 95% confidence level, which suggests the figures portrayed above are very close to the actual state of opinion in the USA on these matters. As you can see, whilst some of the bigger issues have widespread support, like smoking causing cancer and mental illness being a medical condition, the bottom half starts to get really murky. The thing about all those statements, however, is that they’re solid scientific facts, so there’s really no debate about whether or not they’re correct. Indeed many of these things should be part of a typical American education, which brings me onto the next point.
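As a sanity check, the quoted margin of error can be roughly reproduced from the sample size alone using the standard worst-case formula for a simple random sample (a sketch only; the respondent count comes from the poll above, and the slightly higher published 3.4% figure likely reflects a design effect in the poll's weighting):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case margin of error at the 95% confidence level."""
    return z * math.sqrt(p * (1 - p) / n)

# Just over 1,000 respondents gives roughly a 3 point margin
print(f"{margin_of_error(1000):.1%}")  # 3.1%
```

Larger samples shrink the margin only with the square root of the sample size, which is why polls rarely bother going far beyond a thousand respondents.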
In the same report there’s a table labelled PPEDCUAT (4 category) which shows the breakdown of the education levels of the respondents. Now I don’t know what I was expecting, but 89% say they have a high school education or higher which, I would hope, covers the majority of the topics in the lower support brackets. What this tells me is that either high school education doesn’t cover these topics, which I don’t believe is the case, or people are rejecting these facts for one reason or another. Whilst I can hazard a guess as to why, I’m sure you can figure out at least one or two factors that would be having a negative impact on those scores.
I’d be interested to see how this compares to other countries as, whilst I’d love to believe that Australia is a bastion of rational thinking, I know that’s not strictly the case. In any case this survey shows just how important science outreach programs are and just how much work there is left to be done. Hopefully over time those numbers will all trend upwards, showing that the general public’s interest in science is increasing. It definitely feels like that’s the case when you compare it to, say, a decade ago, but as these numbers show there’s always room to improve.
Australia has one of the best education systems available, as evidenced by our top 10 rankings for literacy, science and mathematics as well as our overall education index of 0.993, tying us for first place with countries like Denmark and Finland. Whilst our system isn’t exactly unique in its implementation, I do believe schemes like HECS/HELP are one of the main reasons the majority of Australians now pursue tertiary education, and whilst this might bring about other issues (like a lack of people in trades) it’s clear that the benefits far outweigh the costs. Indeed, as someone who couldn’t have afforded university without the help of the government and now has a great career to show for it, I’m something of a testament to that idea.
Recently however there’s been some criticism of the HECS-HELP system, mostly focused on the amount of student debt owing to the government and the sizeable chunk of that which is never expected to be repaid:
The Grattan Institute’s annual Mapping Australian Higher Education report finds that students and former students have accumulated HECS-HELP debts of $26.3 billion.
This is about an extra $10 billion owing, in real terms, than in 2007.
The interest bill on the income-contingent loan scheme, formerly known as HECS, is nearly $600 million a year, the institute estimates.
And it says HELP debt not expected to be repaid rose to $6.2 billion in 2012.
The report makes for some intriguing reading and does indeed state that a good 25% or so of the current student debt is likely to never be repaid. The reasons behind it, though, are interesting: whilst some would have you think it’s due to students skipping out on their debts in one way or another (à la Liberal MP Steve Ciobo), it’s in fact primarily due to students either dying or moving overseas. Now there’s not a whole lot we can do about the former (except maybe investing more in the health care sector), but the latter is a problem that’s been around for decades and I’ve yet to see a solution proposed, either from the government or the private sector.
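Those headline figures hang together arithmetically; a quick back-of-the-envelope check, using only the numbers quoted from the report above, shows the doubtful debt is indeed close to the 25% mark and the implied interest rate on the debt pool is quite modest:

```python
total_debt = 26.3e9   # accumulated HECS-HELP debt
doubtful = 6.2e9      # debt not expected to be repaid (2012)
interest = 0.6e9      # estimated annual interest bill

# Share of the debt pool that's unlikely to ever come back
print(f"doubtful share: {doubtful / total_debt:.1%}")  # 23.6%
# Effective rate the government wears to service the scheme
print(f"implied rate: {interest / total_debt:.1%}")    # 2.3%
```

That implied 2.3% servicing rate is roughly what you'd expect for a loan scheme indexed to inflation rather than charged at commercial rates.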
Australian graduates, especially in some sectors, suffer from a distinct lack of choice when it comes to finally finding a career once they’re done with their university studies. Whilst I might have managed to make a decent career without looking too far, you have to appreciate that my career is in IT while my degree is in engineering, and such is the case for many graduates who try to find something in their chosen path. Usually they can get close, but the chances of landing an opportunity directly in their field of study are usually pretty slim and that leads them to look overseas. I myself did exactly that not too long after I graduated and was pretty staggered at the number of opportunities available abroad that I was more than qualified for.
Another point the report makes is that student debt seems to be skyrocketing compared to decades prior. The graph above demonstrates that quite clearly, but it doesn’t give you any indication as to why this is happening. For starters, Australia’s population has increased by about 5.8 million since 1989, or about 35%. At the same time participation in tertiary education has more than doubled, with the majority of Australians having some form of tertiary qualification and 27% now carrying a bachelor’s degree or higher. Essentially there’s been a major cultural shift over the past 2 decades towards pursuing an education through universities rather than other avenues, and this is what’s responsible for the increase we’ve seen. This isn’t exactly an issue considering our GDP has quadrupled in the same time frame and, whilst I won’t claim a causative link, I’d say you’d be hard pressed to uncouple higher education rates from improved GDP figures.
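The population figures quoted above are internally consistent; working backwards from the stated increase and growth rate (both taken from the paragraph above, so this is just a cross-check, not new data) gives a 1989 baseline that lines up with ABS estimates of roughly 16.8 million for that era:

```python
increase = 5.8e6   # population increase since 1989
growth = 0.35      # quoted growth over the same period

base_1989 = increase / growth
print(f"implied 1989 population: {base_1989 / 1e6:.1f} million")  # 16.6 million
current = base_1989 + increase
print(f"implied current population: {current / 1e6:.1f} million")  # 22.4 million
```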
Realistically, unpaid student debt isn’t much of an issue for the Australian government considering the wide reaching benefits that our high quality, freely available education system gives us. We still need to do something about our best and brightest moving overseas to greener pastures, but it’s clear that the economic benefits of free education for anyone who wants it vastly outweigh the cost of providing it. Even if we were to erase all student debt in one year it would still be only a few percent of the total budget, something that could easily be done should there be any burning need for it. There isn’t, of course, since the cost of servicing that debt is comparatively low and there are much better things to spend that money on.
Like all great debates, there seem to be two irreconcilable sides to the great education question of “Should I go to university?”. On the one side there’s the drive from parents, many of whom grew up in times when tertiary education was a precious resource, who want to give their children the very best chance of getting somewhere in life. On the other side is the self-taught movement, a growing swell of people who’ve eschewed the traditional progression of education and done quite well for themselves. This in turn raises the question of whether further education is a necessity in today’s society or whether it’s all a giant waste of time that could be better spent pursuing the career of your dreams in the field of your choosing.
From a statistical point of view the numbers seem to favour pursuing some form of education beyond the secondary level. Employment rates for people with a university education are far higher than for those without, and it’s quite typical for a university-educated graduate to earn more than the average wage. Facts like these are what have driven tertiary education levels in Australia from their lows in the post-World War 2 era to the dizzying highs we see today. This trend is what inspired the Howard government to create things like the New Apprenticeship System in order to boost the industries that relied on people eschewing university education in favour of learning a trade. Indeed not going to university, at least in Australia, would appear to be outside the norm, just as going to university used to be.
It should come as no surprise then that I am a product of the Australian university system. Being one of the lucky (or not so lucky, depending) people born before the cut-off date, I was always a year younger than most of my classmates, which meant that, since I skipped the traditional gap year that nearly all Australians seem to take, I managed to graduate at the same time as many of my peers despite my degree being 4 years long. Like many of my fellow students I was fully employed long before graduation day and had a career path mapped out that would see me use my degree to its fullest potential. Whilst I have been extremely fortunate in my career, I can’t say my degree was 100% responsible for the success I’ve enjoyed, nor for that of others who’ve walked similar paths to mine.
Now there are some professions (law, medicine and I’d like to say engineering, but everyone’s a bloody engineer these days) where a university degree is a legal requirement and there’s no getting around that. However, for many other industries a degree, whilst seen as a useful “foot in the door” for initial job applications, is ancillary to experience and time in the industry. Indeed my rise through the ranks of IT support was mostly on the back of my skills in a chosen specialisation, with the degree just being a useful footnote; many didn’t even realise I was one of the few people in the IT industry legally allowed to call myself an engineer. The question then, for me at least, shifts from “should I go to university?” to “what value can I derive from university and how does that compare to similar time in industry?”.
It’s not exactly an easy question to answer, especially for an 18-year-old who’s fresh out of college and looking to make a hard decision about their future career. Indeed, at the time I made the decision I didn’t think along those lines either; I just felt it was probably the way to go. About 2 years into my degree, though, I found myself jealous of the money and progress my friends were making without going to university and began to question why I was there. Upon reflection I don’t believe my time at university was wasted, but the most valuable skills I learnt there weren’t part of the syllabus.
This, I believe, is where you need to make a personal judgement call on whether university is right for you. The most valuable things I learnt at university (critical thinking, modularity, encapsulation, etc.) aren’t things reserved for the halls of an educational institution. If you’re autodidactic by nature then the value proposition of higher education might very well be lost on you. When I started out at university I was definitely not an autodidact, as I’d rarely seek to improve myself mentally beyond what was required of me. Afterwards, however, I found myself craving knowledge on a wide and varied range of subjects, revelling in the challenge of conquering a new topic. This is not to say that university is a clear path to becoming like this, and indeed it seems to have the opposite effect for many, but it sure did wonders for my fledgling mind.
My main point here is that there’s no definitive answer on whether university is right for you, and anyone who tells you there is one is at best misguided. To truly understand whether higher education is the right path you must reflect on whether you can attain the same knowledge in other ways and in similar time frames. It’s a deeply personal thing to consider, one that requires an objective view of your own abilities and desires, and sometimes you won’t be able to make a purely logical decision. In that case it’ll come down to what you feel is right for you and, as many of my friends found out, you’ll eventually figure out whether it was.
It’s never too late to start learning again.
I’ve always been fascinated by people who are both incredibly smart and religious. To me the two seem diametrically opposed: as education goes up, the evidence for God’s existence starts to come under question, usually to the point of pushing people towards being either agnostic or atheist. For me it was mostly my distaste for the study of religion (I found it boring) and the ham-fisted approach my science teacher took to reconciling the Anglican school’s teachings with actual science.
For those both gifted and religious, the most common explanation I get is that the things we can’t yet explain come under the purview of a god, or the God. I watched a video of an interview with Neil deGrasse Tyson recently that sums up why that approach is fundamentally flawed:
Taken to its logical extreme, as our knowledge approaches the limit of all we can ever know, God can only exist in infinitesimally smaller gaps. Logically, then, belief in such an entity seems irrational, as God becomes just an ever-shrinking pocket of ignorance. You can of course neatly sidestep this argument by saying you fully believe in your faith regardless of what science says, and I’ll neatly sidestep any argument with you on the matter because I’m sure neither of us would walk away happy from it 😉
Over the weekend the wife and I watched a documentary on the American education system called Waiting for Superman; here’s the trailer:
The documentary dives deep into the American public education system and the crux of it is that, whilst there are some fantastic public schools there, places at those schools are limited. In order to resolve this situation the government has legislated the only thing that can be equally fair to all involved: public schools with more applicants than places must hold a lottery to determine who gets in and who doesn’t. It’s eye-opening, informative and heart-wrenching all at the same time and definitely something I’d recommend you watch.
The reason it hit home for me was the parallels I could draw to my own education experience. My parents had had me on the waiting list for one of Canberra’s most respected private schools since the day I was born. I went to a public school for my initial education, but I was always destined for a life of private education. However, upon attending that school I was miserable, with the few friends who did make the transition to the same school abandoning me and the heavily Anglican environment (with mandatory bible studies classes) only making things worse.
The straw that broke my parents’ backs was when I made my case for transferring to a public school where most of my friends had ended up. They couldn’t get through to me that the private school was the best place for me to be educated, but one thing I said changed their minds: “You make your own education”. I still wonder if I actually uttered those exact words or just something along those lines (I don’t have a vivid memory of that incident, but my parents say it was so), but it was enough for them to let me transfer. If I’m honest the transfer didn’t make things any better, although I told myself differently at the time, but suffice to say I can count myself amongst the few who made it to university after going to that school. Heck, you might even say I’ve been successful.
Anecdotally, then, the public education system in Australia seems to work just fine. The schools I went to had a rather rough reputation for not producing results (indeed my university entrance score was dragged down a good 5 points due to my attendance there) but there were students who excelled in spite of it. However, watching Waiting for Superman gave me a sinking feeling that students in the USA might not even have the chance to make their own education, simply because the schools are set up for failure. Indeed my own success might have blinded me to the possibility that the schools I went to were set up the same way, leading me to believe there was no problem when there was one.
Cursory research shows, however, that at least for Australia this isn’t the case. Indeed the biggest indicators of a child’s success at school and their pursuit of higher education are largely non-school factors. Following on from that idea, it’s not just you who makes your education but the entire social structure that supports it. Bringing that back to my experience, it was my strong family support that led me to do well and my late-found group of friends who led me to excel at university. In that respect I should feel incredibly lucky, but in reality it had little to do with luck and a lot to do with the dedicated effort of everyone who was involved in my life during my education.
Still, we should be thankful for the education system Australia has, especially when you compare it to what it could be. I’m still a strong believer in those words I uttered well over a decade ago and, whilst they might not be applicable everywhere in the world, they are definitely applicable here.
Like almost any industry, IT can sometimes feel like a pretty thankless job. If you’re halfway competent at what you do, people won’t notice the vast amount of effort you put into making sure everything runs smoothly and will begin to question whether they really need to keep you around. Conversely, if everything isn’t running smoothly it’s more likely everyone will recognise your hard work, but you’ll be spending all your time fighting fires and solving urgent problems, which isn’t the greatest thing if you like your work to be stress free. Plus the work doesn’t stop once you clock off in the afternoon since, if you’re one of the only computer guys in your family or circle of friends, people will bug you with their computer problems, begging you to provide a fix.
That latter point is applicable to almost any industry. Way too often, when people are socialising and the topic of work comes up, professions seem to be an open door for soliciting free advice from the first person to mention what they do for a crust. Doctors get regaled with tales of various ailments, the mechanic with car problems, and the IT guy will of course be barraged with all sorts of strange questions that realistically can’t be answered on the spot. Whilst I don’t shy away from telling people what I do for work anymore (I just tell them my going rate should they want me to fix their problems), it does make IT a bittersweet industry to work in sometimes, and I’m not the only one to think that.
What spurred this idea was this blog post on why it doesn’t pay to be the computer guy. Boyd makes some great points there, hitting on common frustrations that nearly every IT person has encountered throughout their career. Indeed I had struggled with such problems for some time, like the lack of appreciation for the work I did and how saying “it should work this way, let me check that” turns into “he said it would do exactly this, and it didn’t, so it’s his fault” quicker than I could ever imagine. Still, whilst I won’t say that much has changed in the 4 years since he wrote that post, there is one thing I learnt from my time in project management that I feel could solve at least half of the problems he faced.
That thing is expectation management.
You see, when people expect the world of you it’s in our nature not to turn them down. It’s really quite flattering to have people come and ask you for help, and the more you’re able to do for them the more they will expect of you. For us IT folk this has a habit of spiralling wildly out of control since 90% of the problems users encounter are 5 minutes on Google away from being fixed, so expectations eventually reach levels that no one could live up to. Thus you end up being placed on a pedestal and users will look to you first instead of attempting to solve the problem themselves. They then expect you to be the answer to all their problems, which seems to be the root of many of Boyd’s complaints.
The best way to fight this problem is to educate users on what they can do to help themselves, empowering them. Back when I worked as an IT technician servicing people’s computers in their homes, I’d usually spend a good hour of my time there explaining what was wrong and how they could go about fixing it themselves in the future. You’d think this would be bad for business, but it wasn’t: many customers would recommend me based on my services, with a good 20% of new customers coming from referrals. Additionally, when they did hit a problem they couldn’t fix themselves they were far more appreciative of my skills when I returned, knowing the effort that went into it.
We IT people could also do with eating some humble pie once in a while. I can’t tell you how many times I’ve been asked something I know nothing about and have straight up said “I don’t know” to a user’s face. Their reaction is always one of surprise, since it’s unusual for anyone (let alone an IT know-it-all) to admit they have no idea about something. It’s not easy, I’ll admit, and your pride will take some hits from being so brutally honest about your limitations, but it will knock you off that pedestal the users have put you on and they’ll be far more likely to treat you like a human rather than some IT deity. If a workplace doesn’t value this kind of honesty then I’d recommend moving on, unless you like the position you’re currently in.
There are a few points Boyd makes, however, that can’t simply be managed away, like the constant skill devaluation and getting asked the same questions again and again, but your life as an IT worker can be a whole lot more tolerable when you start moulding people’s expectations of you to more closely align with reality. It’s not easy sometimes, especially when it feels like you’re giving your boss reasons to fire you, but in the end you’ll be better for it and far more appreciated for the work you do.
When I look back at those 4 long years I spent at university I always feel a wide range of conflicting emotions. Initially it was bewilderment, as I was amongst some of the smartest people I’d ever met and they were all passionate about what they were studying. During my second year it turned to pride as I began to find my university legs and excelled at my chosen specialities. However, the last 2 years saw me turn on the career I had once been so enamoured with, questioning why I should languish in lecture halls when most of what I learnt would be irrelevant upon completion. Still, 4 years on from that glorious day when I walked out of Parliament House with my degree in hand, I still value my time there and I’m not sure that, given the chance again, I’d do anything differently.
Unfortunately for me, my prediction that most of the knowledge would be irrelevant outside of university did ring true. Whilst many of the skills and concepts I learnt still stick with me today, the hours spent deep in things like electronic circuits and various mathematical concepts haven’t found their way into my everyday work life. I lay the blame for this wholly on myself, however, as straight out of university the most lucrative career I could land was in IT support, not computer engineering. This is probably due to the engineering industry in Canberra being none too hot thanks to the low population and high public service employment rate, but even those who managed to find jobs in the industry quickly learned that their theoretical university experience was nothing compared to the real world.
What university did grant me was the ability to work from a fundamental base of knowledge in order to branch out into other areas. Every year, without fail, I’d find myself trying to build some kind of system or program that would see me dive back into my engineering roots for a solution. Most recently it has been with Lobaco, as I’d barely touched any kind of web programming and had only limited experience working with real 3-tiered systems. Still, my base training at university allowed me to ask the right questions and find the right sources of information to become proficient in a very short space of time.
Flush with the success of coding and deploying a working system on the wider Internet, my sights turned to something I had only cursory experience with before: mobile handsets. A long time ago I had tried to code up a simple application on Windows Mobile, only to have the program crash the simulator repeatedly and fail to work in any meaningful way. Still, being an iPhone user and having downloaded some applications of questionable quality, I thought it couldn’t be too hard to pick up the basics and give it the old college try. Those of you following me on Twitter would have noticed that there was only one tweet on iPhone applications before I mentioned HTML5 as the potential direction for the mobile client, signalling that I might have bitten off more than I could chew.
Indeed that was what happened. Attempting to stumble my way through the other world that is Objective-C and Xcode was met with frustration on a scale I hadn’t felt in quite a while. Whilst the language shares a base with ones I know and understand, many things are different in ways I just hadn’t fathomed, and the resources online just weren’t the same as what I was used to. I managed to get a few things working, but simple things like incorporating the pull-to-refresh code into my own application proved next to impossible and still elude me. After a while I began to think I was missing the fundamentals I’d had when developing for other platforms, and dreaded the idea of having to drudge through one of the millions of iPhone programming books.
Right in the depths of my plight I came across this Slashdot article from someone asking which mobile platform they should develop for. Amongst the various responses was a little gem that pointed me to something I had heard of but never looked at: iTunesU. I had known for a while that various universities had been offering up their lecture material online for free, but I hadn’t known that Apple had integrated it into their iTunes catalogue. Searching for the lecture series in question, I was presented with 20 lectures and accompanying slides totalling several hours of online content. With the price being right (free) I thought nothing of downloading the first lecture to see if there was anything to gain from it, and boy was there ever.
Whilst the first 30 minutes or so were general housekeeping for the course itself, the last 20 minutes proved quite insightful. Instantly I knew that the way I was approaching the problem wouldn’t work in Apple’s world and that I needed to develop a fundamental base of knowledge before I could make any meaningful progress. These lectures have been an invaluable source of knowledge, immediately helping me develop a base application that resembles what I hope to one day release to the world.
It’s this kind of knowledge dissemination that will disrupt the traditional education frameworks. The amount of information available to anyone with an Internet connection is unfathomable, and those with a desire to learn about a particular subject are able to do so without limitation. Back when I started at university, anyone wanting to attend the lectures had no choice but to be physically present at each one. Sure, you could probably grab the lecture notes, but they’re a poor substitute for actually being there, especially when the classes are as useful as the ones provided by Stanford. They won’t make you an iPhone programming genius on their own, but if you’ve done any sort of programming before you’ll quickly find yourself becoming vastly more proficient than you would bumbling around blindly in the Xcode IDE as I did.
In the end I realised there’s really no substitute for starting with the fundamentals and working your way up from there. I had assumed that, based on my extensive past programming experience, learning a new language and IDE would be a walk in the park. It took me several days of frustration to realise I was using my Microsoft hammer to bash in Apple nails, and that wasn’t getting me anywhere fast. Just an hour spent watching a very basic lecture proved more insightful than the hundreds of Google searches I had done previously. It’s still early days for me as an iPhone programmer, but I’ve got a feeling the next few weeks of coding will be much easier than the week that led up to them.
I, and nearly all of my generation, grew up with the notion that having a university degree was the key to unlocking a successful future. With around 63% of all Australians having enrolled for tertiary education at some stage in their lives we can safely assume this is a commonly held belief. It even got to the point where the trade industries were suffering due to the lack of people enrolling in apprenticeships, which led the Howard government to attempt to sway people towards a trade in the 2007 budget. So for the most part you’re more likely to find a young Australian with a tertiary qualification of some sort than without one, and it appears that this qualification-required mentality has spread to at least one other industry.
The IT industry overall is almost completely unregulated. There’s no formal body for qualifying someone as an IT professional, nor are there any large established organisations we can apply to, like the IEEE for engineers. For the most part, then, when an employer is looking for someone they don’t have any standard guidelines for determining whether someone who claims to have experience is the real deal, nor do they have a third party with which to verify a candidate’s story. This poses a significant problem for employers: resumes are easily faked, interviews can be coached to near perfection and you have to trust that their references aren’t just their mates doing them a favour. How then, apart from hiring candidates and throwing them in the deep end to see them sink or swim, do you determine if one is worth your time?
The answer, for many, lies in vendor certifications.
The world of IT is full of competing technologies and implementations. For every piece of equipment that makes up your computer there are multiple companies who produce an almost identical part in form and function. As consumers this is a fantastic thing, as it gives us a variety of choice and low prices whilst the companies compete to ensure that their product is the one we buy. However, diving into the dark world of corporate IT infrastructure shows that a company’s desire to distinguish itself from a competitor usually leads to products that are, for the most part, worlds apart from each other even if they strive to serve the same purpose. Therefore experience with one product does not readily translate to another, save for a few fundamental skills.
Thanks to these 2 problems, the lack of formal accreditation processes and disparate technologies, most companies create their own certification programs to verify that someone is competent with their brand of technology. For example Microsoft has its MCITP program (for demonstrated competence with its Windows line of products), VMware the VCP program and so on. Any IT professional seeking to demonstrate their expertise with a product will probably undergo a program like these to formally certify their experience with it. For those just beginning in the world of IT, certifications can provide that foot in the door many are seeking, much like those of us who got a degree for similar reasons. Still, ask anyone who has a degree how much it has helped them in their professional career (putting academia aside for the time being) and you’d be surprised how many retort that it was their experience that mattered, not the piece of paper they once held so highly.
Logically that makes sense, even outside the IT industry. It’s all well and good to have every accreditation under the sun, but as many will tell you theory is usually only good in a perfect world with ideal conditions, which are quite rare in reality. Previous experience in the field means that you at least understand the nuances of real world implementations of that theory, and you should have developed your own set of heuristics for dealing with the common problems that arise in your chosen field. Still, if you cast your eye over the current job market you’ll see many positions requiring varying levels of qualifications in addition to industry experience, and this has led to a kind of grey market for qualifications.
I am, of course, referring to brain dumps.
Their name gives away almost all you need to know about them. Brain dumps are either straight copies of real world tests, complete with questions and answers, or study guides so comprehensive they border on being the answers themselves. You’d think these kinds of things would be relegated to the dark recesses of some private BitTorrent tracker or a secret FTP server hiding on a darknet somewhere, but that’s far from the case; it’s actually quite a booming industry. Take any IT certification¹ and you can guarantee that at least part of the test or lab documents will be available online. What value can we then draw from people who have acquired these paper (i.e. nothing but paper backing them up) certifications?
The answer is rather complicated. For the most part we don’t really have anything else to fall back on, save for actually throwing someone into the job and seeing if their skills line up with their apparent qualifications. Many say that the qualifications help weed out those who would flood their inbox with useless applications, yet in my whole career I’ve only ever had 1 employer ask me for my academic record and exactly 0 have asked to verify any of my vendor certifications (I even had one who had to Google what one of them was, yet he still didn’t ask for proof it was real). Others cast their nets wide in order to scare off paper-certified candidates, who couldn’t hope to cover all their bases should an interview bring up every technology in question. Thus we end up in a world where certs can be readily attained by anyone willing to shell out the dollars for them and employers use them only in a feeble attempt to weed those same people out.
For most employers the solution usually lies in good interviewing technique. There are certain things you can’t fake (like sound critical thinking), and using questions that have no definitive right answer is one way I’ve seen the paper certs separated from the real deal. Rote memorisation or coaching won’t help you in these areas, and for the most part those with experience will shine when presented with such questions, having been in similar situations before.
It all seems to boil down to the fact that as a whole we’re becoming far more educated. With such a large number of people seeking higher education, the value once granted by those pieces of paper from the hallowed halls has diminished. In the world of IT the ease and availability of shortcuts to qualification heaven (and, some would say, our generation’s entitlement mentality) has, ironically, led the industry’s attempts at formal certification down the exact same path, at a pace that matches the industry’s speed of innovation. Certifications still hold some value of course, but they are far from the bastions of truth they are too often made out to be.
¹Apart from the Cisco certifications. They appear to be the only vendor that has remained unblemished by the brain dump market. Their tests are also considered to be amongst the most difficult in the world, with the lab component having an 80% first time failure rate.
Once something is ingrained in the public’s mind it becomes increasingly difficult to convince them of the opposite idea. Initial thoughts turn into innate biases and anecdotal evidence becomes undeniable fact. I can’t really put the whole blame on the public themselves, since we don’t all spend the hours required to fact check everything, so some of the blame rests with the media and their reporting of such things. One such thing is the link between mobile phones and cancer which, despite a fair body of evidence to the contrary, still manages to rear its ugly head at the dinner table. Even with evidence like this people will still choose to believe the anecdotes over fact:
A very large, 30-year study of just about everyone in Scandinavia shows no link between mobile phone use and brain tumours, researchers reported on Thursday.
Even though mobile telephone use soared in the 1990s and afterward, brain tumours did not become any more common during this time, the researchers reported in the Journal of the National Cancer Institute.
Some activist groups and a few researchers have raised concerns about a link between mobile phones and several kinds of cancer, including brain tumours, although years of research have failed to establish a connection.
What interests me the most about this is that although people will spout things like “cell phones cause cancer” they still go ahead and use them day after day. I think the main reason behind this is that even if there is a chance it increases your risk of cancer (most of the studies still conclude that the 20~30 year usage range needs further research), that risk is so low that it doesn’t really affect them. The same can be said for smoking and unhealthy eating, since for the most part the damage is so low and slow that you don’t notice it building up on you. This was very true of cigarettes 50 years ago, when doctors would recommend them to their patients, not knowing the long term health problems the addiction would incur. The mental gymnastics people employ for their self destructive habits is quite amazing sometimes.
The real issue here is one of education, since the method of communication with the public at large (mass media et al) is not particularly suited to this kind of critical thinking. This has become quite apparent recently with the whole Emissions Trading Scheme legislation which, thanks to an almost soap opera-esque leadership spill in the Liberal party, has pushed Tony Abbott and his bizarre ideas on climate change to the fore. Right now it appears he’s attempting to make it look like the Rudd government is trying to tax us all for no appreciable benefit, when he can achieve the same thing for basically free. Trying to find some solid information on his policy leads me to mostly dead ends, but the few articles I could find on it would see Abbott attempt massive carbon sequestering, something which does not solve the underlying problem. Let’s also not forget that Abbott has promoted a climate change denier in the form of Nick Minchin (to call him a skeptic is completely misleading), a man who 14 years ago was a second hand smoke “skeptic”. He’s right up there with the other loonies who believe that this whole carbon thing is an attempt to deindustrialise the western world (and bring in communism, that’s right, climate change is a COMMUNIST CONSPIRACY!!). You can see why I’m worried about these people pushing their views on the wider public of Australia; they’re disregarding all evidence in favour of pushing party lines.
I’m just glad that they’ll go down in flames come the next election.
Whilst there are many great educational and skeptical resources available out there, most of them aren’t really targeted at the everyman. Skeptics et al have a terrible habit of preaching to the choir, and their rhetoric leaves much to be desired. When your target audience thinks that Ask Bossy is good lunchtime reading you’ve got to change your game plan to match, and that’s a process that many of us (myself included) find quite hard to do. The day that skepticism becomes sexy and cool is the day that I stop writing on the subject, since everyone will be doing my work for me.
Or maybe the ABC just needs to move Media Watch to primetime.