Back in my school days I thought that skill was an innate thing, a quality you were born with that was basically immutable. Things like study and practice always confused me, as I felt that I'd either get something or I wouldn't, which is probably why my academic performance back then was so varied. Today, however, I don't believe anyone is incapable of mastering a skill; put in the required amount of time and (properly focused) practice and you'll eventually make your way there. Innate ability still counts for something, though, as there are things you're likely to find much easier than others, and some people are simply better at learning new skills in general. Funnily enough, that latter group of people likely has an attribute you wouldn't first associate with that skill: lower overall brain activity.
Research out of the University of California, Santa Barbara has shown that people who are most adept at learning new tasks actually show lower overall brain activity than their slower-learning counterparts. The study used an fMRI machine to scan subjects' brains whilst they were learning a new task over the course of several weeks, and instead of looking at a specific region of the brain the researchers focused on "community structures". These are essentially groups of nodes within the brain that are densely interconnected with each other and are likely in heavy communication. Over the course of the study the researchers could identify which of these community structures remained in communication and which didn't, whilst measuring the subjects' mastery of the new skill they were learning.
What the researchers found is that people who were more adept at mastering the skill showed a rapid decrease in the overall brain activity used whilst completing the task. For the slower learners many of the regions, namely things like the visual and motor cortices, remained far more active for longer, showing that they were more actively engaged in the learning process. As we learn skills, much of the process of actually doing that skill gets offloaded, becoming an automatic part of what we do rather than a conscious effort. So for the slow learners these parts of the brain remained active for far longer, which could, in theory, mean that they were getting in the way of making the process automatic.
For me personally I can definitely attest to this being the case, especially with something like learning a second language. Anyone who's learnt another language will tell you that you go through a stage of translating things into your native language in your head first before translating your response back into the target language, something you simply can't do if you want to be fluent. Eventually you develop your "brain" in that language, which doesn't require that interim translation, and everything becomes far more automatic. How long it takes to get to that stage varies wildly, although the distance from your native language (in terms of grammatical structure, syntax and script) is usually the primary factor.
It will be interesting to see if this research leads to some developmental techniques that allow us to essentially quieten down parts of our brain in order to aid the learning process. Right now all we know is that some people's brains begin the switch-off period quicker than others, and whatever causes that is the key to accelerating learning. Whether it can be triggered by mental exercises or drugs is something we probably won't know for a while, but it's definitely an area of exciting research possibilities.
I used to think I was in almost total control of nearly every aspect of my being. From learning to emotions to anything mental, I felt like I was acutely aware of all the processes, variables and influences that affected me and could control them at will. That was, of course, my wild teenage brain running amok with its abnormal chemistry, and time has shown me that there's an awful lot going on inside my head that I have absolutely zero control over. Indeed the more research we do into the brain and our genetics, the more we find things that we aren't consciously in control of, and that raises some really perplexing questions.
The more we chip away at the apparent control we have over our own being, the more the idea of free will starts to look like some form of cruel joke played upon us by our own biological systems. I've wrestled with this idea before when I tried to overcome some subconscious beliefs that I didn't consciously agree with, and I'm still struggling to rationalize it today. Indeed the evidence keeps mounting for some form of hard determinism being the absolute truth here, yet one of those nigh-on unshakable beliefs is that we have some kind of will that is not controlled by our chemical/biological processes.
Things start to get really weird when you start looking at some real world examples of subconscious processes at work. Studies have shown that judges in Israel are far more likely to grant parole right after they've eaten, with the approval rate tapering off steadily until their next meal. Whilst it may sound obvious when explained to you (it's an Egg of Columbus kind of thing), these kinds of influences pervade nearly every aspect of our lives, and it's shocking just how little control we have over some of them. Indeed, even being aware that these biases exist isn't enough to overcome them; doing so requires substantive effort.
I find this particularly interesting because it feeds into some of my other casual interests, namely the process of learning. There's the oft-repeated saying that it takes 10,000 hours to master something, and understanding that our subconscious is doing most of the heavy lifting gives you insight into why that is. Rather than the 10,000 hours being training for our conscious selves, it is in fact more about training our subconscious to take on all the tasks required for mastery that, at the beginning, reside only in the conscious part of our brain. It's exactly why you can seemingly zone out when driving somewhere and not end up wrapped around a tree; the process of driving is largely a subconscious act. It's the same reason everyone has trouble with this seemingly ubiquitous skill at first: your subconscious simply isn't up to the task yet.
There's also that rather sticky wicket of whether or not this means we actually have any agency at all, i.e. whether we truly are responsible for our actions. For what it's worth I don't have a good answer to this, as society is very heavily predicated on the idea that we do have agency and I can't fathom how that idea could come about without it being true at some level. Of course this could just be a form of common delusion which happens to work because it increases our survival rate and therefore allows our soma to continue on. Like I said, I don't have a good answer to this, and even my conjecture on the matter feels half baked.
Honestly I'm not altogether sure what this means for us as a species or for society at large, but I feel like it's an important thing to understand. Awareness that we're largely subconscious beings has helped me better understand the learning process and why people might say one thing then act in completely different ways. It's a perplexing issue, one I'm sure philosophers and scientists will struggle with for centuries to come, and even then I doubt we'll ever get a conclusive answer, scientifically or philosophically.
I've been on a bit of a rediscovery of photography of late, driven by my desire to fulfil the promises that my red-wine-laden self proclaimed loudly over the Internet just a couple of months ago. I've always had something of an interest in it, dating back to when I wanted to capture the first trip overseas my wife and I took together all those years ago. However, that interest was put aside for other things that seemed more important at the time: attempting to build my own start-up, trying to build 100% passive income streams and all manner of things that, more often than not, left me burnt out and wondering why I had bothered in the first place.
I’ll have to admit that my knowledge of photography was average when I first started out on this journey, although I didn’t know that at the time. Ever since then I’ve been feeding myself on a steady diet of Wikipedia articles, photography blogs and lurking continuously on the photography subreddit. In that time I’ve come to realise that many of the assumptions I made about certain things, like the reasons why people spend so much on Leicas or why the TSE lenses are actually useful, were totally wrong and that’s had me doing a hell of a lot of self reflection.
The biggest thing to come of this seems to be an incredible distaste for nearly every picture I've taken since I first laid my hands on my new bits of camera equipment. I should have expected this, I even blogged about this very phenomenon twice in the past, but it seems that every time I set out with the best of intentions I end up looking back at all the pictures I took and feeling like I've wasted my time. It's a really painful feeling, especially when you've hyped everything up in your head beforehand.
The reality of the situation is actually something that everyone who sets out to improve themselves goes through: the stage where you realise what it takes to be the thing you want to become, and the desperation of knowing that you're nowhere near there yet. This isn't a bad thing at all; it's in fact a critical step to progressing forward, as up until this point you were operating on the rush of starting out in new territory, picking up a few quick wins but still being blissfully unaware of all the challenges that lay ahead of you. This self-realization is usually what kills most people's motivation to continue in a particular pursuit, but realistically this should be the point where you push through the pain barrier in order to make it to the other side.
Unfortunately there's no quick fix other than pressing on in spite of your feelings to the contrary. You'd think that having been through this process twice in recent memory I would've predicted this feeling of ennui and planned accordingly, but for some reason I just…didn't. Thankfully other parts of my personality, namely the fiscal one, scream loudly enough to force me to continue on. I absolutely detest the feeling that I'm simply doing photography for the sake of getting my money's worth out of the equipment I bought, but it's enough to keep me going and hopefully enough to drive me through to the other side.
This post will also form part of the strategy for me to keep developing as a photographer. I've already put myself in many situations that I wouldn't have otherwise for the sake of photography and, whilst I might not feel like I'm doing anything of worth at the time, I have produced some pictures that, on reflection, do meet my criteria for being "good". I keep making a promise to myself that I'll do one post here a week based on my latest photographic excursions, and maybe it's time that I made good on that instead of getting caught up in a cycle of self-loathing.
Yeah, I think it's time.
Like all great debates there seem to be two irreconcilable sides to the great education question of "Should I go to university?". On the one side there's the drive from parents, many of whom grew up in times when tertiary education was a precious resource, who want to give their children the very best chance at getting somewhere in life. On the other side is the self-taught movement, a growing swell of people who've eschewed the traditional progression of education and have done quite well. This in turn raises the question of whether further education is a necessity in today's society or whether it's all a giant waste of time that could be better spent pursuing the career of your dreams in the field of your choosing.
From a statistical point of view the numbers seem to favour pursuing some form of education beyond the secondary level. Employment rates for people with university-level education are far higher than for those without, and it's quite typical for a university-educated graduate to be earning more than the average wage. Facts like these are what have driven tertiary education levels in Australia from their lows in the post-World War 2 era to the dizzying highs we see today. This trend is what inspired the Howard government to create things like the New Apprenticeship System in order to boost the industries that relied on people eschewing university education in favour of learning a trade. Indeed not going to university, at least in Australia, would appear to be outside the norm, just as going to university used to be.
It should come as no surprise then that I am a product of the Australian university system. Being one of the lucky (or not so lucky, depending) people born before the cut off date I was always a year younger than most of my class mates which meant that, since I skipped the traditional gap year that nearly all Australians seem to take, I managed to graduate at the same time as many of my peers despite my degree being 4 years long. Like many of my fellow students I was fully employed long before graduation day and had a career path mapped out that would see me use my degree to its fullest potential. Whilst I have been extremely fortunate in my career I can’t say that my degree was 100% responsible for the success I’ve enjoyed, nor for others who’ve walked similar paths to mine.
Now there are some professions (law, medicine and I'd like to say engineering, but everyone's a bloody engineer these days) where university is a legal requirement and there's no getting around that. However, for many other industries a degree, whilst seen as a useful "foot in the door" for initial job applications, is ancillary to experience and length of time in the industry. Indeed my rise through the ranks of IT support was mostly on the back of my skills in a chosen specialization, with the degree just being a useful footnote; many didn't even realise that I was one of the few people in the IT industry legally allowed to call myself an engineer. The question then, for me at least, shifts from "should I go to university" to "what value can I derive from university, and how does that compare to similar time in industry?".
It's not exactly an easy question to answer, especially for an 18-year-old who's fresh out of college and looking to make a hard decision about their future career. Indeed, at the time I made the decision I didn't think along those lines either; I just felt that it was probably the way to go. About two years into my degree, though, I grew jealous of the money and progress that my friends were making without going to university and began to question why I was there. Upon reflection I don't believe my time at university was wasted, but the most valuable skills I learnt whilst there weren't part of the syllabus.
This, I believe, is where you need to make a personal judgement call on whether university is right for you. The most valuable things I learnt at university (critical thinking, modularity, encapsulation, etc.) aren't things reserved for the halls of an educational institution. If you're autodidactic by nature then the value proposition of higher education might very well be lost on you. When I started out at university I was definitely not an autodidact, as I'd rarely seek to improve myself mentally beyond what was required of me. Afterwards, however, I found myself craving knowledge on many wide and vast subjects, revelling in the challenge of conquering a new topic. This is not to say that university is a clear path to becoming like this, and indeed it seems to have the opposite effect for many, but it sure did wonders for my fledgling mind.
My main point here is that there's no definitive stance on whether university is right for you, and anyone who tells you there is, is at best misguided. To truly understand if higher education is the right path you must reflect on whether you can attain the same knowledge in other ways and in similar time frames. It's a deeply personal thing to think about, one that requires an objective view of your own abilities and desires, and sometimes you won't be able to make a purely logical decision. In that case it'll come down to what you feel is right for you and, like many of my friends found out, you'll eventually figure out whether it was.
It’s never too late to start learning again.
After reaching 1.0 of Lobaco I’ve taken a breather from developing it, mostly so I could catch up on my backlog of games and give my brain a well deserved break from working on that problem space. It’s not that I’m tired of the idea, I still think it has merit, but the last 6 months of little free time on the nights and weekends were starting to catch up with me and a break is always a good way to kick start my motivation. It didn’t take long for numerous new ideas to start popping into my head afterwards and instead of jumping back into Lobaco development I thought I’d cut my teeth on another, simple project that would give me the experience I needed to migrate Lobaco into the cloud.
The weekend before last I started experimenting with ASP.NET MVC, Microsoft's web framework based on the model-view-controller pattern, which I had become familiar with after deep diving into Objective-C. I could have easily done this project in Silverlight, but I thought I'd have to man up sooner or later and learn a proper web framework, otherwise I'd be stuck in my desktop developer paradigm for good. The results weren't spectacular and I could only bring myself to spend about half the time I usually do coding on the new site, but progress was made there nonetheless.
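For anyone unfamiliar with the pattern itself, here's a minimal, framework-agnostic sketch of the model-view-controller separation (written in Python purely for illustration; it's not the ASP.NET MVC API, and the class names are made up for the example):

```python
# A toy MVC arrangement: the model owns the data, the view renders it,
# and the controller mediates between the two. Real frameworks add
# routing, templating and persistence on top of this same split.

class PostModel:
    """Model: holds application data, knows nothing about presentation."""
    def __init__(self):
        self.posts = []

    def add(self, title):
        self.posts.append(title)


class PostView:
    """View: turns model data into output, knows nothing about storage."""
    @staticmethod
    def render(posts):
        return "\n".join(f"- {title}" for title in posts)


class PostController:
    """Controller: receives 'requests' and wires the model to the view."""
    def __init__(self, model, view):
        self.model = model
        self.view = view

    def create(self, title):          # analogous to a POST action
        self.model.add(title)

    def index(self):                  # analogous to a GET action
        return self.view.render(self.model.posts)


controller = PostController(PostModel(), PostView())
controller.create("Hello, MVC")
print(controller.index())  # -> - Hello, MVC
```

The appeal of the split is that each piece can change independently: swapping the plain-text view for an HTML one, or the in-memory list for a database, leaves the controller untouched.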
The slow progress really frustrated me. After finally gaining competence with Objective-C I felt like learning yet another new framework would be easy, even if it meant learning another language. Somehow I managed to forget that frustrating first month where progress was almost nil and I convinced myself I wasn’t procrastinating when looking for other solutions to my problems. Eventually I came to the realization that I was still grokking the new framework I had chosen for my application and that I shouldn’t be expecting myself to be blazing trails when I was still establishing my base of fundamental knowledge.
I see lots of people go through the same struggle when trying out new things, and I can see how easy it is to give up when you're not making the kind of progress other people are. Believe me, it's even worse in the tech/start-up arena, where every other day I'm reading about someone who hacked together a fully usable service in a weekend whilst I struggle to get my page to look like it wasn't written in Notepad. The realization that you're still in the grok stage of learning something new I find to be quite a powerful motivator, as past experience has shown that it's only a matter of time and persistence before floundering around gives way to being quite capable.
I'm usually the first one to tell people to stick with what they know, as re-skilling is extremely expensive time-wise (and can be money-wise too; Objective-C set me back a few large), but the pay-offs of diversifying your skills can be quite large. Whilst I've yet to make any semblance of a dollar from all my adventures in iPhone development, I still count it as a valuable experience, if only for the fact that it's given me a lot of perspective and oodles of delicious blog fodder. Time will tell if this current foray into yet another web framework will be worth my time, but I wouldn't be doing it if I thought there was no chance of it ever paying off.
It's been a long 7 months since I first laid eyes on Xcode and the iOS SDK, and I've had quite the love/hate relationship with them. There were times when I could spend only a couple of hours coding and blast through features barely breaking a sweat, and others when I'd spend multiple torturous hours figuring out why something just wasn't working the way I thought it should. The last couple of months have been quite successful, as my code base has grown large enough to cover most of the rudimentary functions I use constantly and my muscle memory with certain functions is approaching a usable level. Last weekend it all came to a head after I polished off the last of my TODO list and sank back into my chair.
Then it hit me, this was a feature complete 1.0 release.
Apart from the achievements (which are barely implemented in the web client), you can do everything on the iPhone client that you could do with the full web client. I've taken design cues from many iPhone applications that I've been using and I feel it's quite usable, especially if you're familiar with the myriad of Twitter clients out there. I've been fiddling with it over the past few days and it seems to be stable enough for me to unleash on others to see how it goes, and that's where you, my faithful readers, come into play.
I'm looking for people to beta test this application pending a full release to the App Store. If you're interested in testing it out and have an iPhone 3G or later (the 2G might work, but it would be dreadfully slow), hit me up on my gmail [email protected] and we'll take it from there. I haven't really experimented with Apple's beta testing process yet, so the first lot of you are more than likely in for a fun ride as I stumble my way through deploying the application to you, but this is all part of the fun of being a very, very early adopter 🙂
Despite all the trials and tribulations that developing this client has brought me, the experience is proving invaluable, as it's helped me refine the idea down to the core ideal I started with almost 2 years ago: getting people communicating around a location. It's also the first new language I've learned in almost 5 years, and it has reminded me just how much fun it is learning and creating in a completely new environment, so much so that I'm almost completely sold on the idea of recoding the web client in Ruby on Rails. Still, that's all pie in the sky stuff for now, as the next big improvement to Lobaco is moving the entire service off my poor VPS and into the wonderful world of the cloud, most likely Windows Azure. I hope you'll jump on board with me for testing Lobaco, and hopefully in the future this will grow into something much more than my pet project.