If it wasn’t for the HECS/HELP system I definitely wouldn’t be in the position I am today. Whilst I didn’t come from an exactly poor family we were definitely at the lower end of the middle class, and the prospect of going to uni meant that I’d have to start paying my own way. Thankfully I was able to defer my HECS debt and repay it later through the tax system, allowing me to attend university without having to fork out the $25,000 or so that I simply did not have. After 4 years of an accelerated career directly attributable to my university experience the debt was fully repaid to the Australian government, with a little bit of inflation added on top for good measure.
In other countries this same situation probably wouldn’t have been possible. In the USA for instance I would have had to secure a student loan from a bank, something that probably would have seen me paying exorbitant interest rates on top of the much higher cost of education. Even if the loan amount remained the same I would’ve been repaying the debt for at least another year just because of interest, and I would have been much less inclined to take the risks that I did knowing that I’d have to make those monthly repayments regardless of my employment situation at the time. The couple of percent indexation I paid on my HELP debt, there only to stop inflation eroding its value, seems like nothing in comparison.
The difference between the two systems is the motive behind the loans. HECS/HELP is offered by the government to encourage people into higher education in the hope that, thanks to said education, they will get higher paying jobs and will then be able to contribute more to the economy as well as repaying their debt. Loans made by banks on the other hand, regardless of their intended purpose, exist purely to generate a profit, and the banks will do anything to maximise the return on them. This is why the Liberals’ proposal to securitise (read: sell off) student debt is an inherently bad move.
Should such a deal go down the government would likely have to sell the debt for a fraction of its current value, usually on the order of 40% to 60%. This would mean an instant cash windfall of approximately $11 billion, with the annuity streams being collected by the new owners of the debt. If your government is strapped for cash (which we really aren’t at the moment) then this might seem like a good move, however it would only account for 3% of our total budget and only for the year in which it happened. For comparison HECS/HELP revenue was around $1.4 billion back in the 2009/2010 financial year, meaning that the $11 billion windfall would become a shortfall within 8 years (probably less, considering that repayment rates would likely have increased in the interim). It’s a short term cash grab that will make the budget it’s in look a lot better, but at the cost of making every budget that follows look a lot worse.
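The breakeven maths is simple enough to sanity check yourself. Here’s a rough sketch in Python, using only the approximate figures quoted above:

```python
# Back-of-envelope check of the sell-off figures quoted above
# (all dollar values in billions, both rough approximations).
windfall = 11.0        # one-off cash from securitising the debt
annual_revenue = 1.4   # yearly HECS/HELP repayments forgone (2009/10 level)

years_to_breakeven = windfall / annual_revenue
print(f"Windfall exhausted after ~{years_to_breakeven:.1f} years")
```

That works out to just under 8 years, and that’s assuming repayments stay flat at the 2009/10 level; since repayment rates would likely grow, the real crossover would come even sooner.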
The real problem though is the transfer of government owned debt to a private company, one that will inevitably look to make the most out of its investment. Whilst HECS/HELP is one of the few debts you can’t discharge through bankruptcy you’re under no obligation to repay it should you not have the means to, a key factor in encouraging people to at least attempt higher education to further their careers. Should the debt be owned by a bank however there’s no guarantee the same structures will hold, and it’s almost inevitable that the banks would look to squeeze delinquent loans for all they’re worth. Don’t believe me? Just look at the student loan situation in the USA.
Whilst the Liberals may have said that such a plan is not current policy the fact that it’s under consideration should ring alarm bells. It’s an incredibly short sighted move, one that trades a short term gain for a long term loss, which is something a “fiscally responsible” government should be doing everything to avoid. Selling off national assets, especially one that provides as much value as HECS/HELP does, will only hurt us in the long term no matter how warm and fuzzy running a surplus makes you feel now.
Like all great debates there seem to be two irreconcilable sides to the great education question of “Should I go to university?”. On one side there’s the drive from parents, many of whom grew up in times where tertiary education was a precious resource, who want to give their children the very best chance at getting somewhere in life. On the other side is the self-taught movement, a growing swell of people who’ve eschewed the traditional progression of education and have done quite well. This in turn raises the question of whether further education is a necessity in today’s society or whether it’s all a giant waste of time that could be better spent pursuing the career of your dreams in the field of your choosing.
From a statistical point of view the numbers seem to favour pursuing some form of education beyond secondary level. Employment rates for people with a university level education are far higher than for those without, and it’s quite typical for a university educated graduate to be earning more than the average wage. Facts like these are what have driven tertiary education levels in Australia from their lows in the post-World War 2 era to the dizzying highs we see today. This trend is what inspired the Howard government to create things like the New Apprenticeship System in order to boost the industries that relied on people eschewing university education in favour of learning a trade. Indeed not going to university, at least in Australia, would appear to be outside the norm, just as going to university used to be.
It should come as no surprise then that I am a product of the Australian university system. Being one of the lucky (or not so lucky, depending) people born before the cut-off date I was always a year younger than most of my classmates, which meant that, since I skipped the traditional gap year that nearly all Australians seem to take, I managed to graduate at the same time as many of my peers despite my degree being 4 years long. Like many of my fellow students I was fully employed long before graduation day and had a career path mapped out that would see me use my degree to its fullest potential. Whilst I have been extremely fortunate in my career I can’t say that my degree was 100% responsible for the success I’ve enjoyed, nor for that of others who’ve walked similar paths to mine.
Now there are some professions (law, medicine and I’d like to say engineering but everyone’s a bloody engineer these days) where university is a legal requirement and there’s no getting around that. However for many other industries a degree, whilst seen as a useful “foot in the door” for initial job applications, is ancillary to experience and length of time in the industry. Indeed my rise through the ranks of IT support was mostly on the back of my skills in a chosen specialisation, with the degree just being a useful footnote; many didn’t even realise that I was one of the few people in the IT industry legally allowed to call myself an engineer. The question then, for me at least, shifts from “should I go to university” to “what value can I derive from university and how does that compare to similar time in industry?”.
It’s not exactly an easy question to answer, especially for an 18-year-old who’s fresh out of college and facing a hard decision about their future career. Indeed at the time I made the decision I didn’t think along those lines either, I just felt that it was probably the way to go. About 2 years into my degree though I grew jealous of the money and progress my friends were making without going to university and began to question why I was there. Upon reflection I don’t believe my time at university was wasted, but the most valuable skills I learnt whilst there weren’t part of the syllabus.
This, I believe, is where you need to make a personal judgement call on whether university is right for you. The most valuable things I learnt at university (critical thinking, modularity, encapsulation, etc.) aren’t things reserved for the halls of an educational institution. If you’re autodidactic by nature then the value proposition of higher education might very well be lost on you. When I started out at university I was definitely not an autodidact, as I’d rarely seek to improve myself mentally beyond what was required of me. Afterwards however I found myself craving knowledge on a vast range of subjects, revelling in the challenge of conquering a new topic. This is not to say that university is a clear path to becoming like this, and indeed it seems to have the opposite effect on many, but it sure did wonders for my fledgling mind.
My main point here is that there’s no definitive stance on whether university is right for you or not, and anyone who tells you there is, is at best misguided. To truly understand if higher education is the right path you must reflect on whether you can attain the same knowledge in other ways and in similar time frames. It’s a deeply personal thing to think about, one that requires an objective view of your own abilities and desires, and sometimes you won’t be able to make a purely logical decision. In that case it’ll come down to what you feel is right for you and, like many of my friends found out, you’ll eventually figure out whether it was or not.
It’s never too late to start learning again.
I can remember sitting in one of my university lectures a long time ago being taught about development philosophies. It was all pretty standard stuff: we were walked through the “traditional” methods of development (basically the once-through, waterfall technique) and then brought up to speed on the more modern iterative approaches. However one little soundbite always stuck in my head, from when the lecturer asked us who pays for rework when a product doesn’t meet a customer’s expectations. The simple answer was you, the one who developed it, and it’s something that always plays over in my head when I’m working on a project, especially the ones I do at home.
I’ve been paying extensively for rework with my latest forays into the world of game development. My regular readers and Twitter followers would’ve noticed that I cheerfully announced my success in cracking the stable orbit problem. Indeed in a roundabout way I had: basically my Unity scripts would push the planet around until it hit a stable orbit, calculate the required velocity and then turn off completely, letting the heavenly body orbit in a near perfect circle around its star. This worked for the 2 planets I had in there, but unfortunately the equations I had developed didn’t generalize very well, and adding in planets at random locations with random masses led to all sorts of wobbly orbits, with planets meeting both fiery deaths and cold extinctions at the cruel hand of my orbit stabilizer. I was back to square one and I spent most of the weekend trying to figure out a fix.
Eventually I came back around to the idea that my smart-ass subconscious came up with a while ago. I had tried to implement it before but gave up in frustration when the results I got were no different from those of my far more complicated “find the angle between the sun and body, increment it a bit, find the new position, create a vector to it then apply force in that direction” approach, when in reality the fault lay in the orbit stabilization code. All that pushing and pulling that made the orbit look stable was in fact imparting all sorts of wild forces on the poor little planet, when the best thing to do is simply let gravity do the work for you. With this in mind I re-implemented my perpendicular force calculations and then devised a rudimentary equation combining the mass, radius and a fudge factor that let me hand stabilize the orbit. In the past attempting to do this stuff manually took me an hour or so per planet; with the revised code I was able to do one in minutes, and I’ve since developed a new equation that accurately sends a planet into a stable orbit no matter where I place it in the game.
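My actual Unity scripts aren’t shown here, but the “let gravity do the work” idea is simple enough to sketch. For a circular orbit gravity alone supplies the centripetal force, which fixes the tangential speed at v = sqrt(GM/r): give the planet that speed perpendicular to the sun-planet vector and no further stabilising forces are needed. Here’s a minimal Python sketch of the calculation, using real-world constants rather than my in-game masses and fudge factors:

```python
import math

G = 6.674e-11  # gravitational constant, SI units

def circular_orbit_speed(star_mass, radius):
    """Tangential speed for a stable circular orbit at the given radius.

    Gravity supplies exactly the centripetal force required:
        G * M * m / r**2 == m * v**2 / r  =>  v = sqrt(G * M / r)
    """
    return math.sqrt(G * star_mass / radius)

# Sanity check against a known orbit: Earth around the Sun.
sun_mass = 1.989e30   # kg
one_au = 1.496e11     # m
v = circular_orbit_speed(sun_mass, one_au)
print(f"{v / 1000:.1f} km/s")  # close to Earth's actual ~29.8 km/s
```

The same formula works in any unit system, which is why it generalises to arbitrary planet positions where my hand-tuned push-and-pull equations didn’t.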
This solution was far simpler and more elegant than what I had been trying to do previously, but the cost in terms of rework was tremendously high. I’m lucky in this respect in that the client is just myself and my friend at the moment, but had this been for someone else with whom I had a contractual relationship that kind of rework would’ve been extremely costly. Of course I could try to make the client pay for it, but ask anyone who’s gone back to a client asking for more money after saying they could do it for a certain price and you’ll usually be laughed out of the office, if not walked out of there by security.
Working around this isn’t easy as clients will usually want to have a strict set of deliverables and time frames which seems to rule out any iterative or agile development methodology. It also pushes a team dangerously towards suffering from analysis paralysis as you agonize over every requirement to make sure it’s covered off in the final product. A healthy amount of analysis is good for any project, especially if it makes the product easy to maintain or modify, but it’s also incredibly easy to fall into a never ending spiral of pointlessness. Thankfully however I’ve noticed that clients are far more receptive to the idea of milestones these days which lines up well with any iterative process, skirting around these problems easily.
Going after the simplest, most elegant solution might seem like the best idea at the time, but in my experience it’s those kinds of solutions that take the longest to achieve. It’s almost always worth it, especially if all you’re spending is your own time, but when you’re working for someone else they might not be so keen for you to spend inordinate amounts of time chasing your white whale solution. This probably explains why a lot of software contains incomprehensible code riddled with bugs, but that’s a whole ‘nother ball game and a blog post for another day.
It’s the beginning of 2006 and the end is in sight for my university career. It’s been a crazy 3 years up until this point, having experienced both the dizzying highs of excelling in a subject and the punishing lows of failing to understand even the basic concepts of some units. Still I haven’t failed a single subject (despite some near misses) and really the only thing standing between me and that piece of paper I’ve been chasing is my final year, most of which will be dedicated to working on an engineering project. I had been looking forward to this for a while as I felt it would be a chance to test my mettle as a project manager and hopefully create something valuable in the process.
The year started off well as I found myself in a project team of 4, including 2 long time friends and a new acquaintance who was exceptionally skilled. After brainstorming ideas we eventually settled on creating a media PC with a custom interface based on the open source MythTV project, which would handle most of the back end work for us. After getting a space to work in we covered the whiteboard in dozens of innovative ideas, ranging from TiVo-like recording features to remoteless operation based on tracking a user’s movement. Looking at the list we were convinced that even those features wouldn’t be enough to fill a year’s worth of development effort, but thought it best to settle on them first before trying to make more work for ourselves. With the features in mind I created a schedule and we set about our work.
Initially everything was going great, we were making quite a lot of progress and the project was shaping up to be one of the best of the year. The hardware design and implementation was looking phenomenal, so much so that I made the brash move of saying there was a potential market for a mass produced version of the device. Our lecturers showed a keen interest in it and we even managed to come in second place for a presentation competition amongst all the project students, narrowly losing out to an autonomous robot that could map out and navigate its surroundings. We were definitely onto a winner with this idea.
However my desire to project manage 3 other people started to take its toll on the project. Realistically in a team of 4 everyone needs to pitch in to make sure stuff gets done; there’s really no room for designated roles. I however kept myself at arm’s length from any solid development work, instead trying to control the process and demanding vast reams of documentation from those doing the work. Additionally I failed to realize that the majority of the coding work was being done by a single team member, which meant that only they understood it, making collaboration on it next to impossible. Seeing the beginnings of a sinking ship I called everyone together to try and figure things out, and that’s when things really started to turn sour.
The primary coder expressed their concerns that no one else was doing any work and I, still not realizing that I didn’t need to be a project manager, instructed them to take a week off so the others could get up to speed. This didn’t work as well as I had planned as they continued to do all the work themselves, effectively locking anyone else out from being able to contribute to the effort. I did manage to get the star developer to collaborate with the others, but by this point it was already too late as they’d usually have to rewrite any code that wasn’t their own.
In order to save some face in this whole project I elected to do the project report entirely on my own, realistically a task that needed to be done by all of us (just like the project). I spent countless hours cobbling it together, piecing random bits of documentation and notes into something resembling a professional report. It wasn’t amazing but it was enough to get the approval of everyone else in the team and our project co-ordinator, so a week before the final demonstration I handed it in, wanting to be done with this project once and for all.
The final demonstration was no picnic either, with everyone in the team (bar me) staying at university until midnight before the presentation. We managed to demonstrate a much cut down version of our initial vision to the class with only a few minor hiccups, and the 2 honours side projects went along quite well. Afterwards we hurriedly bundled the project away into one of the members’ cars (he provided all the hardware on the proviso he got to keep it), happy to be done with it once and for all.
For 2 years afterwards I struggled to figure out why a project that started off so well tanked so badly. It wasn’t until I was officially employed as a project manager that I figured out that the most toxic element in the whole ordeal was me, the power hungry idiot who contributed the least whilst ensuring that anyone trying to get things done was hampered by my interference. I failed to get everyone to collaborate effectively and hamstrung them with ridiculous requirements for documentation. In essence I was acting like a project manager on a big project when really I was anything but. The end result was a far cry from what it could have been and one member of that project team still refuses to speak to me, and I don’t blame them for doing so.
I suppose the best thing that came out of this is that I finally realized my weaknesses and actively worked to overcome them. Sure it might have been too late for that university project, but I’m glad to say I didn’t inflict any such torment on a project whilst I was being paid to do it, instead taking on board those lessons learned to make sure those projects were delivered as required. I still hold out hope that one day I’ll look back on those days with my former project members and laugh, but those project management war wounds will stick with me forever, reminding me that I’m not as infallible as I once thought I was.
When I look back at those 4 long years I spent at university I always feel a wide range of conflicting emotions. Initially it was bewilderment, as I was amongst some of the smartest people I’d ever met and they were all passionate about what they were studying. During my second year it turned to pride as I began to find my university legs and excelled at my chosen specialities. However the last 2 years saw me turn on the career I had once been so enamoured with, questioning why I should bother to languish in lecture halls when everything I learnt would be irrelevant upon completion. Still, 4 years on from that glorious day when I walked out of Parliament House with my degree in hand, I still value my time there and I can’t be sure, given the chance again, that I’d do anything differently.
Unfortunately for me my prediction that most of the knowledge would be irrelevant outside of university did ring true. Whilst many of the skills and concepts I learnt still stick with me today, many of the hours spent deep in things like electronic circuits and various mathematical concepts haven’t found their way into my everyday work life. I wholly lay the blame for this on myself however, as straight out of university the most lucrative career I could land was in IT support, not computer engineering. This is probably due to the engineering industry in Canberra being none too hot thanks to the low population and high public service employment rate, but even those who’ve managed to find jobs in the industry quickly learned that their theoretical university experiences were nothing compared to the real world.
What university did grant me was the ability to work from a fundamental base of knowledge in order to branch out into other areas. Every year without fail I found myself trying to build some kind of system or program that would see me dive back into my engineering roots to look for a solution. Most recently it has been with Lobaco, as I’d barely touched any kind of web programming and had only limited experience in working with real 3-tier systems. Still my base training at university allowed me to ask the right questions and find the right sources of information to become proficient in a very short space of time.
Flush with success from coding and deploying a working system on the wider Internet my sights turned to something I had only cursory experience with before: mobile handsets. A long time ago I had tried to code up a simple application on Windows Mobile, only to have the program crash the simulator repeatedly and fail to work in any meaningful way. Still, being an iPhone user and having downloaded some applications of questionable quality I thought it couldn’t be too hard to pick up the basics and give it the old college try. Those of you following me on Twitter would have noticed that there was only one tweet on iPhone applications before I mentioned HTML5 as the potential direction for the mobile client, signalling that I might have bitten off more than I could chew.
Indeed this was what happened. Attempting to stumble my way through the other world that is Objective-C and Xcode was met with frustration on a scale I hadn’t felt in quite a while. Whilst the language shares a base with ones I know and understand, many things are different in ways I just hadn’t fathomed and the resources online just weren’t the same as what I was used to. I managed to get a few things working, but doing simple things like incorporating the pull to refresh code into my own application proved to be next to impossible and still eludes me. After a while though I began to think that I was missing the fundamentals I had had when developing for other platforms, and dreaded the idea of having to drudge through one of the millions of iPhone programming books.
Right in the depths of my plight I came across this Slashdot article on someone asking which mobile platform they should develop for. Amongst the various responses was this little gem that pointed me to something I had heard of but never looked at: iTunes U. I had known for a while that various universities had been offering up their lecture material online for free, but I hadn’t known that Apple had integrated it into their iTunes catalogue. Searching for the lecture series in question I was presented with 20 lectures and accompanying slides totalling many hours of online content. With the price being right (free) I thought nothing of downloading the first lecture to see if there was anything to gain from this, and boy was there ever.
Whilst the first 30 minutes or so were general housekeeping for the course itself, the last 20 minutes proved to be quite insightful. Instantly I knew that the way I was approaching the problem wouldn’t work in Apple’s world and that I needed to develop a fundamental base of knowledge before I could make any meaningful progress. These lectures have been an invaluable and instantly useful source of knowledge, helping me develop a base application that resembles what I hope to one day release to the world.
It’s this kind of knowledge dissemination that will disrupt the traditional education frameworks. The amount of information available to anyone with an Internet connection is unfathomable, and those with a desire to learn about a particular subject are able to do so without any limitations. Back when I started at university anyone wanting to take a course had no choice but to be physically present at each lecture. Sure you could probably grab the lecture notes, but they’re a poor substitute for actually being there, especially when the classes are as useful as the ones provided by Stanford. They won’t make you an iPhone programming genius on their own, but if you’ve done any sort of programming before you’ll quickly find yourself becoming vastly more proficient than you would bumbling around blindly in the Xcode IDE as I did.
In the end I realised there’s really no substitute for starting with the fundamentals and working your way up from there. I had assumed, based on my extensive past programming experience, that learning a new language and IDE would be a walk in the park. It took me several days of frustration to realise that I was using my Microsoft hammer to bash in Apple nails and that wasn’t getting me anywhere fast. Just an hour spent watching a very basic lecture proved more insightful than the hundreds of Google searches I had done previously. It’s still early days for me as an iPhone programmer but I’ve got a feeling that the next few weeks spent coding will be much easier than the week that led up to them.
One of my university lecturers had a reputation for talking for hours on end about his previous projects (Dr John Rayner if you’re interested). This wasn’t atypical, since the majority of our lecturers had spent many decades in industry or research before becoming lecturers, but Dr Rayner was a curious exception to those who were just being a little nostalgic. He was a physicist turned engineer, which is strange because even though the two fields share some common ground most of us would never think of “crossing the border” as it were. As such we routinely had him sub in when either our physics or engineering teachers were absent, and it was guaranteed that his class would somehow revolve around one of his previous projects. The twist was that, even though we’d always think we were just wasting our time listening to him, by the end we all understood the material we needed to be taught, even though he rarely delved into the required theory. One of the most interesting lessons we got from him was on the expectations of customers and how they will influence your designs.
He was working on a community housing project in one of the northern states and one of the concerns was water usage. They’d optimised basically everything apart from the toilets, so it was left to him and his team to reduce the amount of water those used. They designed a system that used around a tenth of the water of a conventional toilet, a considerable saving. After passing initial testing (using an IEEE approved analogue for human waste, basically sausage skin filled with sawdust) the toilets were sent along for their real world exposure. Curiously, whilst no one reported any problems actually using them, they weren’t well received. As it turns out the perception of so little water being used made most people feel uneasy, thinking the toilets hadn’t properly flushed or weren’t clean. Thus the design was reworked, although he was coy on the actual results.
This whole lesson came steaming back when I saw this article yesterday:
Researchers have demonstrated a prototype device that can rid hands, feet, or even underarms of bacteria, including the hospital superbug MRSA.
The device works by creating something called a plasma, which produces a cocktail of chemicals in air that kill bacteria but are harmless to skin.
The team says that an exposure to the plasma of only about 12 seconds reduces the incidence of bacteria, viruses, and fungi on hands by a factor of a million – a number that stands in sharp contrast to the several minutes hospital staff can take to wash using traditional soap and water.
The first thing that sprang to many people’s minds is how this could be used to eliminate the need for washing your hands. It’s an interesting idea, since the use of this technology could be quite a bit more hygienic whilst saving water and towel waste. However, whilst novel and indeed an elegant alternative, it will take many years for such things to replace the norm, simply because people won’t feel comfortable walking out of the toilet without washing their hands.
It’s a challenge that every engineer will face when they’re designing and building a new system. There are a lot of social and technical norms out there and going against them won’t do anything to help the adoption of your product. I think this is the problem Google Wave has faced recently: it has melded so many different technologies (and therefore expectations of how it will function) that we’re not quite sure how to go about using it. The fact that it has no real physical analogue doesn’t help the matter either, and that’s why my Wave account has sat unused for the better part of a month.
So it becomes the engineer’s challenge to understand the everyman and work with them, since they will be the ones using our creations. I used to look upon this as unnecessary rework, but over time I grew to appreciate the familiarity that came with certain lines of products (thank you Microsoft ;)), making learning and utilizing them to their fullest so much easier. A good understanding of your users can be as valuable as a good understanding of the solution, and I’m forever thankful to the eccentric Dr Rayner for teaching me that.
One memory that has always remained clear in my mind is from the first few weeks of my engineering degree. Sitting in an introductory electronics class the lecturer began by throwing some rudimentary math our way and lamented how quick we were to grab our pens and paper or calculators to work out the exact answer. It was a clever way of showing us how much we relied on technology to do our thinking for us, and so the next week was spent teaching us methods of estimating values, working out rough solutions to equations and educating us in the ways of solid guesswork. After a while it became second nature to us engineers (all 10 of us!) and I don’t think any good engineer would be without it.
I hadn’t really thought about it until I was discussing LED backlit TVs with some of my friends over lunch. The basic premise of the technology (I’m not going to talk about edge LED backlighting, that’s cheating) is that instead of using what amounts to giant fluro tubes to light up the whole screen, each individual pixel is lit by either a single white LED or an RGB LED array. I imagined what the sales pitch to the higher ups must’ve been like when they were trying to make a full 1080p display, given the number of LEDs required would have been huge. One of my friends then whipped out his phone to calculate it (instantly bringing back the university memories), the engineering estimator kicked in and I came out with about 6 million LEDs to make up the panel. The actual figure is 6,220,800 (1920 x 1080 x 3), which made me feel like I’d earned a fair whack of geek cred for being able to guess that close without a calculator.
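The mental shortcut is worth spelling out: round 1920 x 1080 to 2000 x 1000, multiply by 3 sub-pixels and you land on 6 million, within 4% of the exact figure. In Python:

```python
width, height = 1920, 1080  # 1080p panel resolution
subpixels = 3               # one red, one green, one blue LED per pixel

exact = width * height * subpixels
print(exact)  # 6220800

# The engineer's estimate: 1920 * 1080 is roughly 2000 * 1000 = 2 million
# pixels, times 3 gives roughly 6 million LEDs.
estimate = 2000 * 1000 * subpixels
print(estimate)  # 6000000
```

Rounding both factors to one significant figure is the whole trick: the errors are small and partly cancel, which is why the guess lands so close.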
It’s not just party tricks like doing insanely large multiplications in your head that estimation is good for. Most of my time in the labs at university was spent trying to get some electronic widget to give the correct output. All the equations you’re taught are based on perfect models, so you’re never going to get exactly where you want. That’s where estimation comes in handy: if you know your inputs and can hazard a rough guess at the outputs, it can save you hours of sitting down and working out the actual output. If everything comes out in line with your expectations you can go back and verify your results using the equations; otherwise you know you have to rework your experiment.
Although the real scientists would argue that’s what research assistants are for 😉
In the real world, where project managers and higher ups demand estimates to ensure resources are allocated appropriately, being able to come up with figures quickly is one skill that’s saved me countless times over. It’s one of those things that, once you learn it, you never really think about again as the answers just start popping into your head, kind of like muscle memory but in your brain. Plus being able to do large multiplications in your head is a surefire way to get all the ladies.
Well, that’s what my lecturer told me anyway 🙂
Many moons ago I graduated from the University of Canberra with a Bachelor of Engineering in Computer Engineering. If you’re brave enough to click that link you’ll notice that it’s dated 2003 and that you should check the university’s site for more information. Attempting that will lead you down a long and convoluted path which eventually ends at this page, saying that the course is no longer open to enrolments.
Like many young people who are destined to leave college I looked towards university to further my education in the hopes of improving my career prospects whilst doing something that I enjoyed. At the time I was fascinated with consumer IT hardware and after attending the open day I was convinced that the computer engineering degree was the way to go. It felt like there was quite a bit of freedom to specialise after the first year and they even offered programs with languages, which really intrigued me.
The first year of my degree went like any other. I spent the first month trying to figure out the university way of life and settling in with the people who would become my university friends for the next 4 years. After that it was a bit of a roller-coaster, with my first semester seeing me barely pass all my subjects, which seemed to be the norm for all of us. The second semester went quite a bit more smoothly, with me finally figuring out how to fit into the university mould. I was an energetic little go-getter, ready for second year.
I count the second year of my stay at university as the best of the 4. With a full year of experience under my belt I didn’t feel bewildered walking into a classroom, and I’d worked out all the basics (note taking, tutorials, etc) so I didn’t have to spend time on those as well as the subject material. Everything was looking up; I even managed to dux a test and get myself inducted into the Golden Key Society, which recognises the top 15% of students (a membership I’ve made little use of). The problem with being up so high, though, is that there’s only one way to go afterwards.
Towards the end of second year one of my lecturers walked into the class with a sad and dejected look on his face. We’d seen this before, when he had announced earlier in the year that the Computer Engineering, Software Engineering and Electronics and Communication Engineering courses would all be merged into one degree, with the third year onwards determining a “specialisation” into the respective merged areas. To be honest, the writing had been on the wall since first year. The total intake of engineering students in my year was only 15: 2 computer, 7 electronics and 6 software. Although these degrees share a common basis there are specialty subjects that only apply to the specific areas, and you can’t run a subject with only 2 students willing to take it.
The news he brought on this occasion was far more grim. The university was closing the entire engineering branch, and whilst our degrees would be taught out to their fullest extent most electives would not be available. As it turned out, none outside the general IT and programming electives were, and we were relegated to the ranks of glorified software engineers with separate titles. Whilst our initial education had given us skills in other areas, the last 2 years were filled with software courses and useless filler subjects (third-year engineers doing Introduction to Statistics and Introduction to Accounting, both first year subjects? Surely you jest!). Although I did enjoy some of the management and economics education I received, some of these courses were clearly a complete joke and felt like a personal insult to someone like me, who had to take a beginners class alongside something like multi-variate calculus.
This was then coupled with what I call the “Third Year Blues”, which was introduced to us by the engineers who preceded us. At the start of third year most university students end up questioning why the hell they’re in their degree. This is amplified in IT related degrees since, from beginning to end, technology will have rapidly changed and you can find yourself working from a basis that is no longer relevant. It was strange to see the once highly energetic engineers questioning their very foundations; we even lost a couple who couldn’t bring themselves to finish the degree off.
Feeling thoroughly dejected I started looking for answers. After questioning many of my lecturers the story became very clear, but it only worked to deepen my bitterness towards the situation.
Approximately 10 years before the closure the Australian National University opened its doors to engineering students for the first time. Whilst they have a superb reputation in all the fields they foray into, they do have a distinct taste for the more academic side of subjects. Engineering was no exception to this rule, and many discussions with ANU branded engineers showed that whilst they had a great theoretical understanding, they lacked a lot of real world implementation experience. Many of the subjects they were learning used outdated tools and languages, and the practicals were very lacklustre. Even so, the opening of a competing engineering school in Canberra more than halved the numbers the University of Canberra saw, especially those with post-graduate aspirations.
Looking into the ANU’s books brought up some astonishing figures. They were in as much trouble as UC, with undergraduate numbers dwindling at a similar rate. However, their post-graduate programs showed no decline, whereas UC’s were shrinking. My lecturers confirmed that ANU pushed people towards the post-graduate level, where they focused closely on employment. That is why ANU continued to run whilst UC died ever so slowly.
Maybe it was a misplaced sense of patriotism towards UC but I’ve never really let go of that fact. The true essence of engineering is solving a problem and then iterating to improve it. ANU’s lack of practical focus went against what I feel is the true sense of engineering, and their continued existence just adds salt to the wounds.
All of these factors made the day I was given my degree bittersweet. I was elated that I could now call myself a true Engineer, something my father had scolded me for calling myself before I had finished. However, looking over the sea of graduates that day I knew only a handful were engineers, and all were the last of their breed to exit those halls with a UC degree under their arm.
Whilst I may be bitter about the experience that university gave me I’m still thankful for it. Although the content of the degree might not be what I wanted the meta-skills (problem solving, time management, critical thinking, etc) have proven themselves to be far more valuable.
And thus, the bitter engineer was created. Sometimes I wonder if all university students turn out this way, but even that’s too cynical for someone like me.