The question of where life on Earth came from is one that has perplexed scientists and philosophers alike for centuries. Whilst we have robust models for how life evolved to the point it’s at today, how it first arose is still something of a mystery. Even if you adhere to the idea of panspermia, that the original building blocks of life were seeded on our planet from some faraway place, that still raises the question of how that seed of life first came to be. The idea of life arising from the chemical soup that bathed the surface of the young Earth is commonly referred to as abiogenesis, but before that process could take place something else had to occur, and that’s where chemical evolution steps in.
We’ve known for quite a while that, given the right conditions, some of life’s most essential building blocks can arise out of chemical reactions. The young Earth was something of a massive chemical reactor and such reactions were commonplace, flooding the surface with the building blocks that life would use to assemble itself. However the jump from pure chemical reactions to the development of other attributes critical to life, like cell walls, is not yet clear, although the ever closing gap between chemical evolution and regular evolution suggests there must be a bridge between the two. It’s likely that there’s no one thing responsible for triggering the explosion of life, which is what makes the search for the answer all the more complicated.
However, like all scientific endeavours, it’s not something that I believe is beyond our capability to understand. There have been many mysteries of the universe that were once thought impossible to understand that we have ended up mastering. Understanding the origins of life here on Earth will bolster our searches for it elsewhere in the universe and, maybe one day, lead us to find a civilization that’s not of our own making. To me that’s an incredibly exciting prospect and is one of the reasons why theories like this are so fascinating.
Human spaceflight is, to be blunt, an unnecessarily complicated affair. We humans require a whole host of things to make sure we can survive the trip through the harsh conditions of space, far more than our robotic companions do. Of course, whilst robotic missions may be far more efficient at performing the missions we send them on, that doesn’t further our desire to become a multi-planetary species, and thus the quest to find better ways to preserve our fragile bodies in the harsh realms of space continues. One of the biggest issues we face when travelling to other worlds is how we’ll build our homes there, as traditional means simply will not work anywhere else that we currently know of. This is where novel techniques, such as 3D printing, come into play.
Much of the construction we engage in today relies on numerous supporting industries in order to function. Transplanting these to other worlds is simply not feasible, and taking prefabricated buildings along requires a bigger (or numerous smaller) launch vehicle in order to get the required payload into orbit. If we were able to build habitats in situ, however, then we could cut out the need to re-establish the supporting infrastructure or to bring prefabricated buildings along with us, something which would go a long way to making an off-world colony sustainable. To that end NASA has started the 3D Printed Habitat Challenge with $2.25 million in prizes to jump start innovation in this area.
The first stage of the competition is for architects and design students to design habitats that maximise the benefits that 3D printing can provide. These will then likely be used to fuel further designs of habitats that could be constructed off-world. The second part of the competition, broken into 2 stages, is centred on the technology that will be used to create those kinds of structures. The first focuses on the technology required to use materials available on site as feedstock for 3D printing, something which is currently only achievable with very specific feed materials. The second, and ultimately the most exciting, challenge is to actually build a device capable of using on-site materials (as well as recyclables) to create a habitable structure, with a cool $1.1 million going to those who satisfy the challenge. Doing that would be no easy feat of course, but the technology created along the way will prove invaluable to future manned missions in our solar system.
We’re still likely many years away from having robots on the moon that can print us endless 3D habitats, but the fact that NASA wants to spur innovation in this area means that they’re serious about pursuing a sustainable human presence offworld. There are likely numerous engineering challenges that we’ll need to overcome, especially between different planets, but it’s far easier to adapt a current technology than it is to build one from scratch. I’m very keen to see the entries to this competition as they could very well end up visiting other planets to build us homes there.
I am always amazed when something that I think I understand completely turns out to be far more complicated than I first thought. The anodizing process was one of these things: back in the day I had investigated anodizing some of my PC components as a way of avoiding the laborious process of painting them. Of course I stopped short after finding out the investment I’d need to make in order to do it properly (something my student budget could not afford), but the amount of time I poured into researching it left me with a good working knowledge of how it worked. What I didn’t know was what anodizing could achieve with titanium, as it’s able to produce an entire rainbow’s worth of colours.
The wave of colours you see the metal rapidly transition through isn’t some kind of trick; it’s a result of how the thickness of the grown titanium oxide layer interferes with light passing through it. As the thickness of the layer increases the interference changes, starting off with a kind of blue colour and then shifting through many different wavelengths before finally settling on the regular metallic colour that we’re all familiar with. This process can be accurately controlled by varying the voltage applied during the anodizing process, as that determines the resulting thickness of the oxide layer that forms on the host material. In the above example they’re going for a full coating, hence why the bar rapidly flashes through different colours before settling down.
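You can get a feel for the colour-vs-thickness relationship with a toy thin-film interference model. The oxide refractive index and the simple 2nt = (m + ½)λ reflection condition below are textbook approximations I’ve assumed for illustration; real anodize colours also depend on phase shifts at the metal surface and the viewing angle.

```python
# Toy thin-film interference model for anodized titanium colours.
N_OXIDE = 2.4  # assumed refractive index of the grown TiO2 layer

def reflected_wavelengths_nm(thickness_nm, orders=range(3)):
    """Wavelengths (nm) most strongly reflected by an oxide film of the
    given thickness, using the simple 2*n*t = (m + 1/2)*lambda condition."""
    return [2 * N_OXIDE * thickness_nm / (m + 0.5) for m in orders]

# Film thickness scales roughly with anodizing voltage, so stepping the
# voltage up walks the strongest reflected colour across the spectrum.
for t in (40, 50, 60):  # film thickness in nm
    visible = [round(w) for w in reflected_wavelengths_nm(t) if 380 <= w <= 750]
    print(f"{t} nm oxide -> visible reflection peaks at {visible} nm")
```

Even this crude sketch shows why a thin coating starts out blue-ish (around 480 nm at ~50 nm of oxide) and drifts towards longer wavelengths as the film thickens.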
These kinds of reactions always fascinate me as they show how things can behave in extraordinarily different ways if we just vary a few small parameters one way or the other. It’s one of those principles that drove us to discover things like graphene which, at its heart, is just another arrangement of carbon, yet the properties it has are wildly different to the carbon that most of us are familiar with. It just goes to show that when you think you know something, science is always ready to throw you another curveball, and that’s why I find things like this so exciting.
Science reporting and science have something of a strained relationship. Whilst most scientists are modest and humble about the results that they produce, the journalists who report on them often take the opposite approach, something which I feel drives the disillusionment of the public when it comes to announcements of scientific progress. This rift is most visible when it comes to research that challenges current scientific thinking, something which, whilst it needs to be done on a regular basis to strengthen the validity of our current thinking, also needs to be approached with the same trepidation as any other research. However from time to time things still slip through the cracks, like the latest news that the EmDrive may, potentially, be creating warp bubbles.
Initially the EmDrive, something which I blogged about late last year when the first results became public, was a curiosity that had an unknown mechanism of action necessitating further study. The recent results, the ones which are responsible for all the hubbub, were conducted within a vacuum chamber which nullified the criticism that the previous results were due to something like convection currents rather than another mechanism. That by itself is noteworthy, signalling that the EmDrive is something worth investigating further to see what’s causing the force, however things got a little crazy when they started shining lasers through it. They found that the time of flight of the light going through the EmDrive’s chamber was getting slowed down somehow which, potentially, could be caused by distortions in space time.
The thing to note here though is that the laser test was conducted in atmosphere, not in a vacuum like the thrust test. This introduces another variable which, honestly, should have been controlled for, as it’s entirely possible that the effect was caused by something as innocuous as atmospheric distortion. There’s even real potential for this to go the same way as the faster-than-light neutrinos, with astoundingly repeatable results being created completely out of nothing thanks to equipment that wasn’t calibrated properly. Whilst I’m all for challenging the fundamental principles of science routinely and vigorously, we must remember that extraordinary claims require extraordinary evidence, and right now there’s not enough of that to support many of the conclusions that the wider press has been reaching.
What we mustn’t lose sight of here though is that the EmDrive, in its current form, points at a new mechanism of generating thrust that could potentially revolutionize our access to the deeper reaches of space. All the other spurious stuff around it is largely irrelevant as the core kernel of science that we discovered last year, that a resonant cavity pumped with microwaves can produce thrust in the absence of any reaction mass, seems to be solid. What’s required now is that we dive further into this and figure out just how the heck it’s generating that force, because once we understand that we can further exploit it, potentially opening up the path to even better propulsion technology. If it turns out that it does create warp bubbles then all the better, but until we get definitive proof of that, speculating in that direction really doesn’t help us or the researchers behind it.
There’s nothing like a healthy dose of snake oil to remind you that some ideas, whilst sounding amazing in theory, are just not worth pursuing. In this age of 3D renders and Photoshop it doesn’t take long for an idea to make its way into what looks like a plausible reality, and the unfortunate truth of the Internet holding novelty above all else means such ideas can spread quickly before they’re given an initial sanity check. Worse still is when well established companies engage in this behaviour, ostensibly to bolster their market presence in one way or another with an idea that may only have a passing relationship with reality. In that vein I present to you the Goodyear BH03, a concept that will simply never work:
Sounds cool, right? Your tyres can help charge the battery of your shiny new electric car by using the heat they generate from the road, and even from the sun when it’s parked outside! Indeed it sounds like such a great idea that it makes you wonder why it’s taken so long for someone to think of it, as even regular cars could do with a little extra juice in the battery, potentially avoiding those embarrassing calls to the NRMA for a jumpstart.
Of course the real reason it hasn’t been done before is that it simply won’t do what they say it will.
You see, translating heat into electricity is a notoriously inefficient exercise. Even RTGs, the devices we use to power deep space craft like Voyager, can only achieve a conversion rate of some 10% of the total heat emitted. That means the kilowatts of heat generated by a red hot lump of decaying plutonium end up as maybe a hundred or so watts of usable electricity. Compare that to the surface area of a tyre, at most a square metre, receiving approximately 1 kW of solar energy under ideal conditions, and you can maybe get 400 W with ideal conversion rates across all 4 tyres.
Say the tyres spend about 8 hours a day under those conditions (again incredibly ideal) and you’ll get a grand total of 3.2 kWh into the batteries which, if we use a Tesla as an example, would give you about 15 km worth of range. If you want a more realistic figure, with say only half the tyre exposed and the ideal duration much shorter, then you’re looking at cutting that figure to less than half. It’s the same problem as putting solar panels on the roofs of electric cars: they’re simply not going to be worth the investment because the power they generate will, unfortunately, be minimal.
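Those figures are easy to sanity check yourself. Here’s the back-of-the-envelope version using the deliberately generous numbers above; the per-kilometre consumption figure is my own assumption for a Model S-class EV:

```python
# Back-of-the-envelope check of the tyre-harvesting claim, all ideal-case.
SOLAR_FLUX_W_PER_M2 = 1000    # ~1 kW/m^2 of sunlight under ideal conditions
TYRE_AREA_M2 = 1.0            # assumed exposed area per tyre (generous)
NUM_TYRES = 4
CONVERSION_EFFICIENCY = 0.10  # roughly RTG-class heat-to-electricity conversion
HOURS_OF_IDEAL_SUN = 8        # wildly optimistic exposure window

power_w = SOLAR_FLUX_W_PER_M2 * TYRE_AREA_M2 * NUM_TYRES * CONVERSION_EFFICIENCY
energy_kwh = power_w * HOURS_OF_IDEAL_SUN / 1000

TESLA_KWH_PER_KM = 0.21       # assumed consumption for a Model S-class EV

range_km = energy_kwh / TESLA_KWH_PER_KM
print(f"{power_w:.0f} W peak, {energy_kwh:.1f} kWh/day, ~{range_km:.0f} km of range")
```

And that’s the absolute best case; halve the exposed area and shorten the sun window and the result becomes little more than a rounding error on the battery.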
Still they look cool, I guess.
Back in my school days I thought that skill was an innate thing, a quality that you were born with and that was basically immutable. Thus things like study and practice always confused me as I felt that I’d either get something or I wouldn’t, which is probably why my academic performance back then was so varied. Today, however, I don’t believe any skill is beyond someone’s ability to master; all that’s needed is that you put in the time and (properly focused) practice and you’ll eventually make your way there. Innate ability still counts for something though, as there are things you’re likely to find much easier than others and some people are just better in general at learning new skills. Funnily enough, that latter group of people likely has an attribute that you wouldn’t immediately associate with that skill: lower overall brain activity.
Research out of the University of California, Santa Barbara has shown that people who are most adept at learning new tasks actually show lower overall brain activity than their slower learning counterparts. The study used an fMRI machine to study the subjects’ brains whilst they were learning a new task over the course of several weeks, and instead of looking at a specific region of the brain the researchers focused on “community structures”. These are essentially groups of nodes within the brain that are densely interconnected with each other and are likely in heavy communication. Over the course of the study the researchers could identify which of these community structures remained in communication and which didn’t, whilst measuring the subjects’ mastery of the new skill they were learning.
What the researchers found is that people who were more adept at mastering the skill showed a rapid decrease in the overall brain activity used whilst completing the task. For the slower learners many of the regions, namely things like the visual and motor cortices, remained far more active for a longer period, showing that they were more actively engaged in the learning process. As we learn a skill much of the process of actually performing it gets offloaded, becoming an automatic part of what we do rather than a conscious effort. So for the slow learners these parts of the brain remained active for far longer, which could, in theory, mean that they were getting in the way of making the process automatic.
I can personally attest to this being the case, especially with something like learning a second language. Anyone who’s learnt another language will tell you that you go through a stage of translating everything into your native language in your head before translating your response back into the target language, something you simply can’t keep doing if you want to be fluent. Eventually you develop your “brain” in that language, which no longer requires that interim translation, and everything becomes far more automatic. How long it takes you to get to that stage varies wildly, although the distance from your native language (in terms of grammatical structure, syntax and script) is usually the primary factor.
It will be interesting to see if this research leads to some developmental techniques that allow us to essentially quieten down parts of our brain in order to aid the learning process. Right now all we know is that some people’s brains begin the switch off period quicker than others and whatever is causing that is the key to accelerating learning. Whether or not that can be triggered by mental exercises or drugs is something we probably won’t know for a while but it’s definitely an area of exciting research possibilities.
There’s an interesting area of research that’s dubbed biomimicry which is dedicated to looking at nature and figuring out how we can use the solutions it has developed in other areas. Evolution, which has been chugging away in the background for millions of years, has come up with some pretty solid solutions and so investigating them for potential uses seems like a great catalyst for innovation. However there are times when we see things in nature that you can’t help but feel like nature was looking at us and replicated something that we had developed. That’s what I felt when I saw this video of an erodium seed drilling itself into the ground:
As you can probably guess the secret to this seed’s ability to work its way into the ground comes from the long tendril at the top (referred to as an awn). This awn coils itself up when conditions are dry, waiting for a change. Then when the humidity begins to increase the awn begins to unfurl, slowly spinning the seed in a drilling motion. The video you see above is a sped up process with water being added at regular intervals to demonstrate how the process works.
The evolutionary advantage that this seed has developed allows it to germinate in soils that would otherwise be inhospitable to it. The drilling motion allows the seed head to penetrate the ground with much more ease, letting it break through coarse soils that would have otherwise proved impenetrable. How this adaptation developed is beyond me, but suffice to say it is what led erodium species to dominate otherwise hostile areas like rocky and alpine regions.
Up until I saw that video I thought things like drilling were a distinctly human invention, something we had discovered through our experimentation with inclined planes. However like many things it turns out there are fundamental principles which aren’t beyond nature’s ability to replicate, it just needs the right situation and a lot of time for it to occur. I’m sure the more I dig (pun intended) the more examples I could find of this but I’m sure that each example I found would amaze me just as much as this did.
Nearly all of us are born with what we’d consider less than ideal memories. We’ll struggle to remember where our keys are, draw a blank on that new coworker’s name and sometimes pause much longer than we’d like to recall a detail that should be front of mind. The idealised pinnacle, the photographic (or, more accurately, eidetic) memory, always seems like an elusive goal, something you have to be born with rather than achieve. However it seems that our ability to forget might actually come from an evolutionary adaptation, enabling us to remember the pertinent details that helped us survive whilst suppressing those that might otherwise hinder us.
The idea isn’t a new one, having existed in some form since at least 1997, but it’s only recently that researchers have had the tools to study the mechanism in action. You see, it’s rather difficult to figure out which memories are being forgotten for adaptive reasons, i.e. to improve the survival of the organism, and which ones are simply forgotten due to other factors. The advent of functional Magnetic Resonance Imaging (fMRI) has allowed researchers to get a better idea of what the brain is doing at any one point, allowing them to set up situations to see what the brain is doing when it’s forgetting something. The results are quite intriguing, demonstrating that at some level forgetting might be an adaptive mechanism.
Back in 2007 researchers at Stanford University investigated the prospect that adaptive forgetting was a mechanism for reducing the amount of brain power required to select the right memories for a particular situation. The hypothesis goes that remembering is an act of selecting a specific memory for a goal related activity. Forgetting then functions as an optimization mechanism, allowing the brain to more easily select the right memories by suppressing competing memories that might not be optimal. The research supported this notion, showing decreased activity in the anterior cingulate cortex, which is activated when people are weighing choices (like figuring out which memory is relevant).
More recent research into this phenomenon, conducted by researchers at the University of Birmingham and various institutes in Cambridge, focused on finding out whether the active recollection of a specific memory hinders the remembering of others. Essentially this means that the act of remembering a specific memory would come at the cost of other, competing memories, which in turn would lead to them being forgotten. To test this, subjects viewed 144 picture and word associations and were then trained to remember 72 of them (whilst inside an fMRI machine). They were then given another set of associations for each word which would serve as the “competitive” memory for the first.
The results showed some interesting findings, some of which may sound obvious at first glance. Attempting to recall the second word association led to a detriment in the subjects’ ability to recall the first. That might not sound groundbreaking to start off with, but subsequent testing showed a progressive detriment to the recollection of competing memories, demonstrating they were being actively repressed. Further to this, the researchers found that their subjects’ brain activity was lower for trained images than for ones that weren’t part of the initial training set, an indication that these memories were being actively suppressed. There was also evidence to suggest that the trained memories showed the greatest forgetting on average, as well as increased activity in a region of the brain known to be associated with adaptive forgetting.
Whilst this research might not give you any insight into how to improve your memory, it does give us a rare look into how our brain functions and why it behaves in ways we believe to be sub-optimal. Potentially in the future there could be treatments available to suppress that mechanism, however what ramifications that might have on actual cognition is anyone’s guess. Needless to say though, it’s incredibly interesting to find out why our brains do the things they do, even if we sometimes wish they did the exact opposite.
Medicine has long known about the potential causes of Alzheimer’s, however finding a safe and reliable treatment has proven far more elusive. Current treatments centre on alleviating the symptoms of the disease, combating things like memory loss and cognitive decline. However, whilst these may provide some relief and improvement in quality of life, they do nothing to treat the underlying cause, which is a combination of amyloid plaques and neurofibrillary tangles. Current research has heavily focused on the former, which block communication between neurons in the brain and, so the theory goes, removing them will restore cognitive function. Recently two treatments have shown some incredibly positive results, with one of them not too far off seeing widespread trials.
The biotech company Biogen has developed a drug called Aducanumab which has shown a significant effect in reducing the cognitive decline of Alzheimer’s patients. It’s an antibody that helps trigger an immune system response and was created by investigating the antibodies present in healthy aged donors, the reasoning being that they had successfully resisted Alzheimer’s related symptoms. The recent large clinical study showed an effect far beyond what the researchers were expecting, including a dose-dependent effect. The drug is not yet available for widespread distribution, there’s still one more late stage trial to go, however it could see a wide market release as soon as 2018. It’s still far from a cure but the drug is capable of significantly slowing the progress of the disease, opening up the opportunity for other treatments to be far more effective.
New research from the Queensland Brain Institute at the University of Queensland investigated using focused ultrasound to help break up amyloid plaques. Essentially this treatment disrupts the blood-brain barrier temporarily, allowing microglial cells (which are essentially clean-up cells) to enter the particular region of the brain and remove the plaques. After a short period of time, a couple of hours or so according to the research, the blood-brain barrier is fully restored, ensuring that there are no ongoing complications. This allows the body to remove the plaques naturally, hopefully facilitating the restoration of cognitive function.
In the mouse model used the researchers found that they could fully restore the memories of 75% of the subjects affected, an incredibly promising result. Of course the limitations of a mouse model mean that further research is required to find out if it would work as well in humans but there’s already precedent for using this kind of technology for treatment of other brain related conditions. Considering that the mechanism of action is similar to that of Aducanumab (removal of amyloid plaques) the side effects and limitations are likely to be similar, so it will be interesting to see how this develops.
It’s great to see conditions and diseases like this, ones that used to be a long and undignified death sentence, slowly meeting their end at the hands of science. Treatments like this have the potential to vastly improve the quality of life of our later years, meaning we can still be active members of society for much longer. I’m confident that one day we’ll have these conditions pinned down to the point where they’re no more of a worry than any other chronic, but controlled condition.
Much to the surprise of many, I used to be a childcare worker back in the day. It was a pretty cruisy job for a uni student like myself, being able to show up after classes, take care of kids for a few hours and then head off home to finish my studies (or World of Warcraft, as it mostly was). I consider it a valuable experience for numerous reasons, not least of which is the insight it gave me into some of the public health issues that arise from having a bunch of children packed into tight spaces. The school I worked at had its very first case of peanut allergy when I started there, and I watched as the number of children who suffered from it increased rapidly.
Whilst the cause of this increase in allergic reactions is still somewhat unclear, it’s well understood that the incidence rate of food allergies has dramatically increased in developed countries over the last 20 years or so. There are quite a few theories swirling around as to what the cause might be, but suffice to say that hard evidence to support any of them hasn’t been readily forthcoming. The problem is the nature of the beast, as studies to investigate one cause or another are plagued with variables that researchers are simply unable to control. However researchers at King’s College London have been able to conduct a controlled study with children who were at risk of developing peanut allergies, and have found some really surprising results.
The study involved 640 children, aged between 4 and 11 months, who were all considered to be at high risk of developing a peanut allergy due to other conditions they suffered from (eczema and egg allergies). They were then randomly split into 2 groups, one whose parents were advised to feed them peanut products at least 3 times per week and the other told to avoid them. The results are quite staggering, showing that when compared to the avoidance group the children who were exposed to peanut products at an early age had an 80% reduced risk of developing the condition. This almost completely rules out early exposure as a risk factor for developing a peanut allergy, a notion that seems to be prevalent among many modern parents.
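To make the headline figure concrete, an ~80% relative risk reduction falls out of a simple ratio of allergy rates between the two groups. The rates below are illustrative stand-ins I’ve chosen to reproduce the quoted figure, not the trial’s exact published numbers:

```python
# Illustrative relative-risk-reduction calculation for a two-arm trial.
def relative_risk_reduction(control_rate, treatment_rate):
    """Fraction by which the treatment group's risk is lower than the control's."""
    return 1 - treatment_rate / control_rate

avoidance_rate = 0.172    # assumed allergy rate in the avoidance (control) group
consumption_rate = 0.032  # assumed allergy rate in the early-exposure group

rrr = relative_risk_reduction(avoidance_rate, consumption_rate)
print(f"Relative risk reduction: {rrr:.0%}")
```

Note that the reduction is relative, not absolute: it says the early-exposure group developed the allergy at roughly a fifth of the rate of the avoidance group, not that 80% of all children were protected.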
Indeed this gives credence to the Hygiene Hypothesis which theorizes that the lack of early exposure to pathogens and infections is a likely cause for the increase in allergic responses that children develop. Whilst this doesn’t mean you should let your kids frolic in the sewers it does indicate that keeping them in a bubble likely isn’t protecting them as much as you might think. Indeed the old adage of letting kids be kids in this regard rings true as early exposure to these kinds of things will likely help more than harm. Of course the best course of action is to consult with your doctor and devise a good plan that mitigates overall risk, something which budding parents should be doing anyway.
It’s interesting to see how many of the conditions that plague us today are the result of our affluent status. The trade-offs we’ve made have obviously been for the better overall, as our increased lifespans can attest, however there seem to be aspects we need to temper if we want to overcome these once rare conditions. It’s great to see this kind of research bearing fruit, as it means that further study in this area will likely become more focused and, hopefully, just as valuable as this study has proven to be.