Researchers Create Electric Circuits in Roses.

The blending of organic life and electronics is still very much in its nascent stages. Most of the progress made in this area is thanks to the adaptability of the biological systems we’re integrating with, not so much the technology. However even small progress in this field can have wide-reaching ramifications, sometimes enough to dramatically reframe the problem spaces we work in. One such small step was made recently by a team from Linköping University in Sweden, who have managed to create working electronic circuits within roses.


The research, born out of the university’s Laboratory of Organic Electronics, experimented with ways of integrating electronics into rose plants so they could monitor, and potentially influence, the growth and development of the plant. To do this the team looked at infusing the rose with a polymer that, once taken up by the plant, would form a conductive wire. Attempts with many polymers simply resulted in the death of the plant, as they either poisoned it or blocked the channels the plant uses to carry nutrients. However one polymer, called PEDOT-S:H, was readily taken up by the roses and didn’t cause any damage. Instead it formed a thin layer within the xylem (one of the nutrient transport structures within plants) that produced a conductive hydrogel wire up to 10cm long.

The researchers then used this wire to create some rudimentary circuits within the plant’s xylem structure. The wire itself, whilst not an ideal conductor, was surprisingly conductive, with a contact resistance of around 10 kΩ. To put that in perspective, the resistance of human skin can be up to 10 times higher. Using this wire as a basis the researchers then went on to create a transistor by connecting source, drain and gate probes. This transistor worked as expected and they went one step further to create logic gates, demonstrating that a NOR gate could be built using the hydrogel wire as the semiconducting medium.
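For anyone unfamiliar with NOR gates, here’s a minimal sketch of the behaviour the researchers demonstrated, modelled as two switches pulling a shared output node low. The voltages and threshold are arbitrary illustrative values, not figures from the rose circuits:

```python
# Minimal sketch of a NOR gate: two transistors in parallel pull the output
# node low whenever either gate input is driven high. Values are purely
# illustrative and are not taken from the rose experiments.
VDD = 1.0          # supply voltage (arbitrary units)
V_THRESHOLD = 0.5  # gate voltage above which a transistor conducts

def nor_gate(v_a: float, v_b: float) -> float:
    conducting = v_a > V_THRESHOLD or v_b > V_THRESHOLD
    return 0.0 if conducting else VDD

for v_a in (0.0, 1.0):
    for v_b in (0.0, 1.0):
        print(f"A={v_a:.0f}V B={v_b:.0f}V -> out={nor_gate(v_a, v_b):.0f}V")
```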

This kind of technology has the potential to revolutionize the way we monitor and influence plant growth and development. Essentially it allows us to create circuitry within living plants, using their own cellular structures as a basis, that can act as sensors or regulators for the various chemical processes that happen within them. Of course there’s still a lot of work to be done in this area, namely modelling the behaviour of this organic circuitry in more depth to ascertain what kind of data we can get and which processes we can influence. Suffice it to say it should become a very healthy area of research as there are numerous potential applications.


33% of USA’s 2 Year Olds Not Properly Vaccinated.

Widespread vaccination programs have been the key to driving many crippling diseases to the brink of extinction. This boils down to one simple, irrefutable fact: they work and are incredibly safe. However the anti-vaccination movement, which asserts all sorts of non-scientific drivel, has caused vaccination rates to drop to levels where herd immunity starts to become compromised. This presents a number of challenges as unvaccinated children and adults are a threat not only to themselves but to others who have contact with them. Indeed the problem may be worse than first thought, as it appears that even among those who do vaccinate the completion rate is low, with 1 in 3 two year olds in the USA not having completed the recommended vaccination course.


The study, published by RTI International (a non-profit research institute based in North Carolina), showed that up until a child was 2 years old the state of their vaccinations was quite fluid. Indeed the vast majority of children weren’t compliant with the required vaccination schedule, with most of them receiving a dose outside the recommended window. Upon reaching approximately 24 months of age most had caught up with the required schedule, although a staggering 33% of them were still non-compliant at that age. This might not seem like much of an issue, since the majority do eventually get their vaccinations, however there are sound scientific reasons for the scheduling of vaccines. Ignoring them has the potential to limit, or completely negate, their efficacy.

The standard vaccine schedule has been developed to maximise the efficacy of vaccines and to ensure that, should a child contract a disease, the risk of potentially life threatening complications is reduced or eliminated. The pertussis (whooping cough) vaccine is estimated to have an extremely high efficacy rate in young children, up to 95%, but that begins to drop off rapidly if the vaccine is administered later in life. Similar efficacy curves are seen for other childhood disease vaccines such as the combined MMR vaccine. At the same time these vaccines are scheduled around the age when the potential impacts of the disease are at their greatest, so missing a vaccine then runs the risk of severe complications should the disease be contracted.

It’s unsurprising that the study found the western states had the lowest rates of vaccination, as that’s where the anti-vaccination movement has been most active. Just this year there was an outbreak of measles there and the year before that there was a whooping cough epidemic. Interestingly, the southern states had the highest rates of vaccination. Whilst the anti-vaccination movement is undeniably an influence on the hodge-podge vaccination approach that seems prevalent, the blame for not sticking to the schedule lies squarely with the parents.

It’s understandable that some of these things can slip, as the challenges of being a parent are unending, but when it comes to their children’s health there’s really no competing priority. For parents this means paying better attention to their doctor’s advice and ensuring the vaccine schedule is adhered to more closely. Additionally the government could readily help alleviate the issue by developing better reminder systems, ones that are more in tune with modern parents’ lives. Hopefully these statistics alone will be enough to jar most into action.

Linear Friction Welding of…Wood?

Friction welding is a fascinating process, able to join dissimilar metals and plastics together with bonds that are far stronger than their conventionally welded counterparts. As far as I was aware it was limited to inorganic materials, mostly because other materials would simply catch fire rather than fuse together. As it turns out that’s not the case, and recent research has shown that it’s possible to friction weld pieces of wood together in much the same way as you would metal.

What’s particularly interesting about the process is how similar it is to friction welding of metals or plastics. Essentially the rubbing of the two surfaces together causes the interfaces to form a viscous film that mixes and, when the friction is stopped, fuses together. In footage of the process you can see some of this film escaping out the sides due to the large amount of pressure applied to ensure the weld takes hold. Like all other kinds of friction welding, the strength of the joint is dependent on a number of factors such as pressure, period of the friction motion and duration of the weld. As it turns out friction welding of wood is actually an active area of research, with plenty of investigation into how to create the strongest joints.

Even cooler is the fact that some researchers have developed a technique that allows the welds to be done with no fibres being expelled out the sides. This means there’s no charring of the interface material, making the resulting weld even stronger and much more resistant to intrusion by moisture. Whilst you’re not going to see a sub built of friction-welded wood any time soon, it does mean that, potentially, your house could be built without the use of fasteners or joiners and the first rain that came through wouldn’t make it all come unstuck.

Don’t think you can just go off and rub two pieces of wood together yourself though: the process requires an oscillation frequency on the order of 150 Hz at a pressure of around 1.9 MPa, far beyond what any human can produce. Still it’s not unthinkable that a power tool could be made to do it, although I lack the mechanical engineering experience to figure out how that would be done. I’m sure someone will figure it out though and I can’t wait to see what kind of structures can be made using friction welding techniques.
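To put those numbers in perspective, here’s a quick back-of-the-envelope calculation of the clamping force that 1.9 MPa implies. The weld area is an assumption for illustration, not a figure from the research:

```python
# Rough feel for the clamping force implied by a 1.9 MPa welding pressure.
# The weld area is an assumed, illustrative value, not one from the research.
pressure_pa = 1.9e6          # 1.9 MPa
weld_area_m2 = 0.10 * 0.05   # assume a 10 cm x 5 cm joint face

force_n = pressure_pa * weld_area_m2
print(f"Required force: {force_n / 1000:.1f} kN "
      f"(~{force_n / 9.81:.0f} kg-force), oscillating at ~150 Hz")
```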


The Quest for Dark Matter: The XENON1T.

When we observe distant galaxies we notice something peculiar about the way they move. Instead of moving like we’d expect them to, with things further away from the centre moving slower than those closer in, everything past a certain point moves at about the same speed. This is contrary to how other, non-galaxy sized systems like our solar system move, and so we’ve long looked for an explanation as to how this could occur. The commonly accepted theory is that there’s extra matter present throughout the universe that we can’t see but which interacts through gravity, called dark matter. Direct detection of dark matter has so far eluded us, however, as it interacts with ordinary matter only extremely rarely. There’s one experiment though, called XENON1T, that could potentially shine some light on this elusive substance.


XENON1T is an evolution of the previous experiments XENON10 and XENON100, which all share the same goal: direct detection of a dark matter particle. Even though dark matter is theorized to be abundant everywhere in the universe (millions of dark matter particles would be passing through you every second), its particles so rarely interact that detection is incredibly difficult. However, just like the neutrinos before them, there are ways of making detection possible and that’s what the XENON series of experiments aims to do. At the heart of the experiment is a cylinder of liquid xenon, a chilly -95°C, bounded at each end by arrays of photomultiplier tubes, which are essentially extremely sensitive cameras. The thinking goes that, should a dark matter particle interact with the liquid xenon, a flash of light will occur which can then be analysed to see if it was a dark matter interaction.

The process by which this is determined is pretty interesting, relying on two different signals to work out what kind of particle interacted with the liquid xenon. The first signal, dubbed S1, is the prompt flash of light that occurs at a very specific wavelength of 178 nm (ultraviolet). The photomultipliers are sensitive enough to detect single photon events, so even the smallest interaction will be captured. The interaction also knocks electrons free from the xenon atoms, and these are drawn upwards through the liquid by an electric field. When they cross into the gaseous xenon at the top of the device they’re accelerated sharply, producing a second flash of light, dubbed the S2 signal. The pattern of the S2 light across the photomultipliers, combined with the delay between S1 and S2, allows the exact position of the interaction to be determined. This is critical as the researchers only want events from the centre of the device, where the rate of background noise is greatly reduced.
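A nice side effect of this two-signal scheme is that the depth of the interaction falls straight out of the timing: the electrons drift upwards at a roughly constant speed, so the S1-to-S2 delay maps directly to depth. Here’s a minimal sketch of that calculation, assuming a typical liquid xenon drift velocity rather than XENON1T’s actual operating value:

```python
# Sketch of how the depth of an interaction falls out of the S1/S2 timing:
# ionisation electrons drift upwards at a roughly constant speed, so the delay
# between the prompt flash (S1) and the gas-phase flash (S2) gives the depth.
# The drift velocity is a typical liquid-xenon figure, assumed for illustration.
DRIFT_VELOCITY_MM_PER_US = 1.7   # depends on the applied drift field

def interaction_depth_mm(t_s1_us: float, t_s2_us: float) -> float:
    """Depth below the liquid surface, from the S2 - S1 time difference."""
    return (t_s2_us - t_s1_us) * DRIFT_VELOCITY_MM_PER_US

# Example: an S2 arriving 300 microseconds after S1
print(f"Depth: {interaction_depth_mm(0.0, 300.0):.0f} mm below the liquid surface")
```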

The methodology was thoroughly validated by the two previous experiments, even though both of them failed to directly detect any dark matter. They did put bounds on the properties of dark matter, however, notably on the mass of candidate particles and on how strongly they could interact with ordinary matter. XENON1T is going to be around 100 times more sensitive than its predecessors, so even if it fails to see anything it will put even more stringent constraints on how a dark matter particle could be constructed. This is critical for validating or eliminating certain particle physics models and could give credence to other non-standard physics models like Modified Newtonian Dynamics (MOND).

Research such as this is incredibly important in developing an accurate understanding of how our universe operates. No matter the outcome of the experiment we’ll learn something valuable and it could potentially drive research towards a whole new world of physics. It won’t change our daily lives, but ensuring our understanding of the world is as close to the knowable truth as possible is the heart of science, and that’s why this research is so important.


Lithium-Air Batteries are the Future, Still Many Years Away.

There’s no question about it: batteries just haven’t kept pace with technological innovation. This isn’t for lack of trying, it’s just that there’s no direct means of increasing energy density like there is for increasing transistor count. So what we have are batteries that are mostly capable but which haven’t seen rapid improvement as the rest of technology has rocketed away to new heights. There are however visions for the future of battery technology that, if they come to fruition, could see a revolution in battery capacity. The latest and greatest darling of the battery world is a technology called lithium-air, although it becoming a reality is likely decades away.

Pretty much every battery in a smartphone is some variant of lithium-ion, which provides a much higher energy density than most other rechargeable battery types. For the most part it works well, however there are some downsides, like a tendency to explode and catch fire when damaged, which have prevented it from seeing widespread use in some industries. Compared to other energy-dense media, like gasoline for example, lithium-ion is still some 20 times less dense. This is part of the reason why it has taken automakers so long to start bringing out electric cars: they simply couldn’t store the amount of energy required to make them comparable to gasoline powered versions. Lithium-air, on the other hand, could theoretically match gasoline’s energy density, the holy grail for battery technology.

Lithium-air relies on the oxidation (essentially rusting) of lithium in order to store and retrieve energy. This comes with a massive jump in density because, unlike other batteries, lithium-air doesn’t have to contain its oxidizing agent within the battery itself. Instead it simply draws it from the surrounding air, much like a traditional gasoline powered engine does. However such a design comes with numerous challenges which need to be addressed before a useable battery can be created. Most of the research is currently focused on developing a workable cathode (the positive, air-breathing side) as that’s where the current limitations are.
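To get a sense of where the “holy grail” claim comes from, here’s a rough calculation of the theoretical specific energy of the commonly cited lithium-air discharge reaction (2Li + O2 → Li2O2), using textbook values; real cells fall far short of this figure:

```python
# Back-of-the-envelope theoretical specific energy for a lithium-air cell,
# assuming the commonly cited discharge reaction 2Li + O2 -> Li2O2.
# Practical cells come in far below this; the numbers are textbook values.
FARADAY = 96485.0            # C/mol
cell_voltage = 2.96          # V, standard potential of the Li2O2 reaction
electrons_per_reaction = 2
molar_mass_li2o2 = 45.88e-3  # kg/mol of the discharge product

energy_per_mol = electrons_per_reaction * FARADAY * cell_voltage  # joules
specific_energy_wh_per_kg = energy_per_mol / molar_mass_li2o2 / 3600.0

print(f"Theoretical specific energy: ~{specific_energy_wh_per_kg:.0f} Wh/kg")
# Roughly 3,500 Wh/kg, versus a few hundred Wh/kg for today's lithium-ion cells.
```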

The cathode is also where the latest round of lithium-air hype has come from.

The research out of Cambridge details a particularly novel chemical reaction which, theoretically, could be used in the creation of a lithium-air battery. The reaction was reversed and redone over 2,000 times, showing that it has the potential to store and retrieve energy as you’d expect a battery to. However what they have not created, and this is something much of the coverage is getting wrong, is an actual lithium-air battery. What the scientists have found is a potential chemical reaction which could make up one of the cells of a lithium-air battery. There are numerous other issues, like the fact that their reaction only works in pure oxygen and not air, which limit the applicability of this reaction to real world use cases. I’m not saying they can’t be overcome, but all these things need to be addressed before you can say you’ve created a useable battery.

Realistically that’s not any fault of the scientists, just the reporting that’s surrounded their work. To be sure their research furthers the field of lithium-air batteries, and there’s a need for more of this kind of research if we ever want to actually start making these kinds of batteries. Breathless reporting of research progress as actual, consumer-ready technology doesn’t help though, and only serves to foster the sense that the next big thing is always “10 years away”. In this case we’re one step closer, but the light is at the end of a very long tunnel when it comes to a useable lithium-air battery.


Not Your Typical Fusion Reactor: The Stellarator.

When you read news about fusion it’s likely to be about a tokamak type reactor. These large doughnut-shaped devices have dominated fusion research for the past three decades, mostly because of their relative ease of construction compared to other designs. That’s not to say they’re without their drawbacks, as the much delayed ITER project can attest, however we owe much of the recent progress in this field to the tokamak design. There are other contenders though that, if they manage to perform at similar levels to tokamaks, could take over as the default design for future fusion reactors. One such design is called the stellarator and its latest incarnation could be the first reactor to achieve the dream: steady state fusion.


Compared to a tokamak, which has a uniform shape, the stellarator’s containment vessel appears buckled and twisted. This is because of the fundamental design difference between the two reactor types. You see, in order to contain the hot plasma, which reaches temperatures of 100 million degrees Celsius, fusion reactors need to hold it in place with a magnetic field. Typically there are two types of field: one that provides the pinch or compressing effect (the poloidal field) and another to keep the plasma from wobbling about and hitting the containment vessel (the toroidal field). In a tokamak the poloidal field comes from within the plasma itself, by running a large current through it, while the toroidal field comes from the large magnets that run around the length of the vessel. A stellarator however provides both the toroidal and poloidal fields externally, requiring no plasma current but necessitating a wild magnet and vessel design. Those requirements are what hindered stellarator designs for some time, however with the advent of computer aided design and construction they’re starting to become feasible.
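To get a feel for the scale of the field that plasma current provides in a tokamak (and that a stellarator’s coils must therefore supply externally), Ampère’s law gives a rough estimate at the plasma edge. The current and minor radius below are illustrative, ITER-like values rather than figures for either machine:

```python
# Rough scale of the poloidal field a tokamak gets from its plasma current,
# using Ampere's law at the plasma edge: B = mu0 * I / (2 * pi * a).
# The current and minor radius are illustrative, ITER-like values.
import math

MU0 = 4 * math.pi * 1e-7   # vacuum permeability, T*m/A
plasma_current_a = 15e6    # 15 MA
minor_radius_m = 2.0       # distance from the plasma centre to its edge

b_poloidal = MU0 * plasma_current_a / (2 * math.pi * minor_radius_m)
print(f"Poloidal field from the plasma current alone: ~{b_poloidal:.1f} T")
# A stellarator has to supply this twisting field entirely from external coils.
```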

The Wendelstein 7-X, the successor to the 7-AS, is a stellarator that’s been a long time in the making, originally scheduled to have been fully constructed by 2006. However due to the complexity and precision required of the stellarator design, which was only completed with the aid of supercomputer simulations, construction only completed last year. The device itself is a marvel of modern engineering with the vast majority of the construction being completed by robots, totalling some 1.1 million hours. The last year has seen it pass several critical validation tests, including containment vessel pressure tests and magnetic field verification. Where it really gets interesting though is where their future plans lead; to steady state power generation.

The initial experiment will be focused on short duration plasmas with the current microwave generators able to produce 10MW in 10 second bursts or 1MW for 50 seconds. This is dubbed Operational Phase 1 and will serve to validate the stellarator’s design and operating parameters. Then, after the completion of some additional construction work to include a water cooling transfer system, Operational Phase 2 will begin which will allow the microwave system to operate in a true steady state configuration, up to 30 minutes. Should Wendelstein 7-X be able to accomplish this it will be a tremendous leap forward for fusion research and could very well pave the way for the first generation of commercial reactors based on this design.

Of course we’re still a long way from reaching that goal, but this, coupled with the work being done at ITER, means we’re far closer than we’ve ever been to achieving the fusion dream. It might still be another 20 years away, as it always is, but never before have we had so many reactor designs in play at the scales we have today. We’ll soon have two (hopefully) validated designs, built at scale, that can achieve steady state plasma operations. Then it simply becomes a matter of economics and engineering, problems that are far easier to overcome. No matter how you look at it, the clean, near limitless energy future we’ve long dreamed of is fast approaching, and that should give us all great hope for the future.


The Kilogram Will Soon be Scientific.

Of all the scientific standards, the one that has yet to be defined in purely scientific terms is the kilogram. Whilst all SI units, like the metre, have their basis in real world objects, the others have since been redefined in terms of physical constants. The metre, once tied to the length of a pendulum with a half-second period, is now defined as the distance light travels in a specific fraction of a second. The reasoning for redefining these measurements in absolute terms has to do with the reproducibility of standard objects, as it’s almost impossible to create two objects that are exactly identical. Such is the issue the kilogram has faced for much of its life, but that is soon to change.


The current-day kilogram is defined by the International Prototype Kilogram, a platinum-iridium cylinder machined to exacting specifications and affectionately known as Le Grand K. There are numerous official copies of this cylinder all around the world and they’re periodically compared to it to ensure that they’re roughly in alignment. However over time fluctuations have been noted between the prime cylinder and its siblings, which causes scientists all sorts of grief. Essentially, since the kilogram weights differ, even if only by micrograms, these variations need to be accounted for whenever the kilogram is used as a standard. It would be far better if it were rigidly defined, as then scientists could verify their instruments themselves rather than having to rely on a physical object.

It seems we may have finally reached that point.

The trouble, you see, with defining something as nebulous as the kilogram in purely scientific terms is that the definition needs to be reproducible and verifiable. The International Committee for Weights and Measures (CIPM) agreed to express the kilogram in terms of Planck’s constant (the link between a photon’s energy and its frequency). Essentially, experiments needed to be designed to calculate the Planck value using the standard kilogram weight as a reference, which would then allow scientists to describe the kilogram as a function of a physical constant. Numerous experiments were designed to do this, however the two that have come out on top are: creating a single-crystal silicon sphere and counting the atoms in it, and using a device called a watt balance to weigh the standard kilogram against an electromagnetic force. Both are scientifically sound approaches, however the latter method long struggled to get the required results.

Essentially, whilst the experiment was capable of producing usable results, they couldn’t get to the level of tolerance that would be required for verification of Planck’s constant. It took several rounds of experiments, and several different research teams, to close in on the issues, however in August this year they managed to home in on Planck’s constant with an uncertainty of 12 parts per billion, enough for the CIPM to accept the results for use in verifying a standard kilogram. This means that these results will likely now form the basis for scientists the world over to validate and calibrate devices that reference the kilogram, without having to get their hands on one of the platinum-iridium cylinders.
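The principle behind the watt balance is surprisingly compact: run the balance once in “weighing” mode and once in “velocity” mode, equate the electrical and mechanical power, and the mass drops out as m = UI/(gv). The measured values below are made up but representative, purely to show the shape of the calculation; the real precision comes from tying U and I back to Planck’s constant via the Josephson and quantum Hall effects:

```python
# Shape of the watt (Kibble) balance calculation: equating electrical and
# mechanical power gives m = (U * I) / (g * v). The measured values below are
# made up but representative; real experiments tie U and I back to Planck's
# constant via the Josephson and quantum Hall effects.
voltage_v = 0.981         # induced voltage in velocity mode
current_a = 0.020         # coil current needed to balance the weight
gravity_ms2 = 9.81        # local gravitational acceleration
coil_velocity_ms = 0.002  # coil speed in velocity mode (2 mm/s)

mass_kg = (voltage_v * current_a) / (gravity_ms2 * coil_velocity_ms)
print(f"Inferred mass: {mass_kg:.6f} kg")
```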

The change of definition isn’t slated to come into effect until July 2017 and there’s further experimentation to be done between now and then. There’s potential for one of the experiments to cause an upset with the other, as any deviation from the currently accepted results will require confirmation from both. Currently the silicon sphere experimenters are in the process of procuring some additional test items for investigation, which could potentially cause this whole thing to start over again. However with the watt balance experiment now having most of the major kinks worked out it’s unlikely this will occur, and the further experimentation should ensure that the uncertainty is reduced even further.

It won’t mean much of a change to our everyday lives, we’ll continue weighing things with the same scales as before, but it will mean a monumental change in the way we conduct scientific research. Finally ridding ourselves of the last physical object that defines one of our measurements will free us from its variability, making our measurements accurate in the truest sense. It’s been a long time coming, but there’s light at the end of the tunnel and we’ll soon have no need for those platinum-iridium cylinders. Well, not unless you fancy yourself a really expensive paperweight.


Beyond the LHC: AWAKE.

The Large Hadron Collider has proven to be the boon to particle physics that everyone imagined it would be, but it’s far from done yet. We’ll likely be getting great data out of the LHC for a couple of decades to come, especially with the current and future upgrades that are planned. However it has its limits and, considering how long the LHC took to build, many are already looking towards what will replace it when the time comes. The trouble is that current colliders like the LHC can only get more powerful by getting longer, something the LHC already struggles with at its 27 km length. However there are alternatives to current particle acceleration technologies and one of them is set to be trialled at CERN next year.


The experiment is called AWAKE and was approved by the CERN board back in 2013. Recently however it was granted additional funding in order to pursue its goal. At its core the AWAKE experiment is a fundamentally different approach to particle acceleration, one that could dramatically reduce the size of accelerators. It won’t be the first accelerator of this type to ever be built, indeed proof of concept machines already exist at over a dozen facilities around the world, however it will be the first time CERN has experimented with the technology. All going well the experiment is slated to see first light sometime towards the end of next year with their proof of concept device.

Traditional particle colliders work on alternating electric fields to propel particles forward, much like a rail gun does with magnetic fields. Such fields place a lot of engineering constraints on the containment vessels with more powerful fields requiring more energy which can cause arcing if driven too high. To get around this particle accelerators typically favour length over field strength, allowing the particles a much longer time to accelerate before collision. AWAKE however works on a different principle, one called Plasma Wakefield Acceleration.

In a wakefield accelerator, instead of particles being directly accelerated by an externally applied electric field, they’re injected into a specially prepared plasma. First a bunch of charged particles, or a pulse of laser light, is sent through the plasma. This sets off an oscillation within the plasma, creating alternating regions of positive and negative charge. When electrons are then injected into this oscillating plasma they’re accelerated, chasing the positive regions which are collapsing and reforming just in front of them. In essence the electrons surf on the oscillating wave, allowing them to reach much greater energies over a much shorter distance. The AWAKE project has a great animation of the process on its website.
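A rough way to see why this matters is to estimate the maximum field an oscillating plasma can sustain, the so-called wave-breaking field. The sketch below uses the standard cold plasma formula with a plasma density of the order AWAKE has discussed; treat the numbers as illustrative:

```python
# Rough estimate of the accelerating gradient an oscillating plasma can sustain,
# using the standard cold wave-breaking field E0 = m_e * c * w_p / e.
# The plasma density is an assumed value of the order AWAKE has discussed.
import math

E_CHARGE = 1.602176634e-19     # C
M_ELECTRON = 9.1093837015e-31  # kg
C_LIGHT = 2.99792458e8         # m/s
EPS0 = 8.8541878128e-12        # F/m

density_m3 = 7e14 * 1e6        # assumed plasma density, 7e14 per cm^3 in SI units

omega_p = math.sqrt(density_m3 * E_CHARGE**2 / (EPS0 * M_ELECTRON))
e_wavebreak = M_ELECTRON * C_LIGHT * omega_p / E_CHARGE

print(f"Wave-breaking field: ~{e_wavebreak / 1e9:.1f} GV/m")
# Conventional RF cavities manage tens of MV/m, roughly a hundred times less.
```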

The results of this experiment will be key to the construction of future accelerators as there’s only so much further we can go with current technology. Wakefield based accelerators have the potential to push us beyond the current energy collision limits, opening up the possibility of understanding physics beyond our current standard model. Such information is key to understanding our universe as it stands today as there is so much beauty and knowledge still out there, just waiting for us to discover it.


The Chemistry of the Volkswagen Scandal.

The Volkswagen emissions scandal is one of the most egregious acts of deceptive engineering we’ve seen in a long time. Whilst the full story of how and why it came about won’t be known for some time, the realities of it are already starting to become apparent. What really intrigued me however isn’t so much the drama that has arisen out of the scandal but the engineering and science I’ve had to familiarize myself with to understand just what was going on. As it turns out there’s some quite interesting chemistry at work here and, potentially, Volkswagen have shot themselves in the foot simply because they didn’t want to use too much of a particular additive.

The additive in question is called AdBlue, a urea solution that’s injected into the exhaust stream to neutralise nitrogen oxides, and it’s cheap ($1/litre seems pretty common) compared to the other fluids most modern cars require. The problem Volkswagen appears to have faced was that they didn’t have the time or resources required to retrofit certain models with the injection system once it became apparent that they couldn’t meet emissions standards. Why they chose to defeat the emissions testing regime instead of simply delaying a model release (a bad, but much better, situation than the one they currently find themselves in) is something we probably won’t know for a while.
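For the curious, the underlying chemistry is well established. AdBlue’s urea breaks down in the hot exhaust to release ammonia, which then reduces the nitrogen oxides over a catalyst; the commonly cited reactions for this selective catalytic reduction (SCR) process are summarised below (a simplified picture, not the full reaction network):

```latex
% Simplified reaction chain for urea-based selective catalytic reduction (SCR).
\begin{align*}
  \mathrm{(NH_2)_2CO} &\longrightarrow \mathrm{NH_3 + HNCO}
    && \text{(urea thermolysis)} \\
  \mathrm{HNCO + H_2O} &\longrightarrow \mathrm{NH_3 + CO_2}
    && \text{(hydrolysis)} \\
  \mathrm{4\,NH_3 + 4\,NO + O_2} &\longrightarrow \mathrm{4\,N_2 + 6\,H_2O}
    && \text{(standard SCR reaction)}
\end{align*}
```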

Regardless, it was an interesting aside to the larger scandal as I wasn’t familiar with this kind of diesel technology previously. Indeed, now that I understand it, the scandal seems all the more absurd: the additive is cheap, the technology is well known and it has been successfully implemented in many other vehicles. Still it introduced me to some interesting engineering and science that I wasn’t familiar with before, so there is that at least.


Researchers Create Long Term Memory Encoding Prosthesis.

The brain is still largely a mystery to modern science. Whilst we’ve mapped out the majority of what parts do what, we’re still in the dark about how they manage to accomplish their various feats. Primarily this is a function of the brain’s inherent complexity, containing some 100 trillion connections between the billions of neurons that make up its meagre mass. However, like many seemingly insurmountable problems, the challenge of decoding the brain’s functions is made easier by looking at smaller parts of it. Researchers at the USC Viterbi School of Engineering and the Wake Forest Baptist Medical Center have been doing just that and have been able to recreate a critical part of the brain’s functionality in hardware.


The researchers have recreated the function of part of the hippocampus, the region of the brain responsible for encoding sensory input into long term memories. In patients who suffer from diseases like Alzheimer’s this is usually one of the first parts to be damaged, preventing them from forming new memories (but leaving old ones unaffected). The device they have created can essentially stand in for part of the hippocampus, performing the same encoding functions that a non-damaged section would provide. Such a device has the potential to drastically improve the quality of life of many people, enabling them to once again form new memories.

The device comes out of decades of research into how the brain processes sensory input into long term memories. The researchers initially tested their device on laboratory animals, implanting it into healthy subjects. They then recorded the input and output of the hippocampus, showing how the signals were translated for long term storage. This data was then used to create a model of that section of the hippocampus, allowing the prosthesis to take over the job of encoding those signals. Previous research showed that, even when an animal’s long term memory function was impaired through drugs, the prosthesis was able to generate new memories.
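The team’s actual model is far more sophisticated than anything that fits in a few lines, but the core idea (record what goes into a region and what comes out, fit a mapping between the two, then use that mapping to generate the output yourself) can be illustrated with a toy fit. Everything below is random, made-up data purely for illustration:

```python
# Toy illustration of the input-output modelling idea: record activity going
# into a brain region and the activity coming out, fit a mapping between the
# two, then use the fitted mapping to predict (and eventually supply) the output.
# The data is random and purely illustrative; the real prosthesis uses a far
# more sophisticated multi-channel, nonlinear model.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_inputs, n_outputs = 500, 16, 8   # e.g. binned spike counts per electrode

true_mapping = rng.normal(size=(n_inputs, n_outputs))
inputs = rng.poisson(lam=3.0, size=(n_samples, n_inputs)).astype(float)
outputs = inputs @ true_mapping + rng.normal(scale=0.5, size=(n_samples, n_outputs))

# Fit a linear mapping by least squares and check how well it predicts held-out data.
fit, *_ = np.linalg.lstsq(inputs[:400], outputs[:400], rcond=None)
predicted = inputs[400:] @ fit
correlation = np.corrcoef(predicted.ravel(), outputs[400:].ravel())[0, 1]
print(f"Correlation between predicted and recorded output: {correlation:.2f}")
```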

Those animal results are impressive in and of themselves, however the researchers have since been replicating their work with human patients. Using nine test subjects, all of whom already had electrodes implanted in the relevant regions to treat chronic seizures, the researchers used the same process to develop a human based model. Whilst they haven’t yet used that model to help create new memories in humans, they have shown that it produces the same signals as the hippocampus does in 90% of cases. For patients who currently have no ability to form new long term memories this could very well be enough to drastically improve their quality of life.

This research has vast potential as there are many parts of the brain that could be mapped in the same way. The hippocampus is critical in the formation of non-procedural long term memories, however there are other sections, like the motor and visual cortices, which could benefit from similar mapping. There’s every chance those sections can’t be mapped directly like this, but it’s definitely an area of potentially fruitful research. Indeed, whilst we still don’t know how the brain stores information, we might be able to repair the mechanisms that feed it, and that could help a lot of people.