It seems I can’t go a month without seeing at least one article decrying the end of Moore’s Law and another showing that it’s still on track. Ultimately this dichotomy comes from the fact that we’re on the bleeding edge of materials science, with new research being published often. At the same time, however, I’m always sceptical of those saying that Moore’s Law is coming to an end as we’ve heard it several times before and, every single time, those limitations have been overcome. Indeed it seems that one technology even I had written off, Extreme Ultraviolet Lithography, may soon be viable.
Our current process for creating computing chips relies on photolithography, which essentially uses light to project the transistor pattern onto the silicon. In order to create smaller and smaller transistors we’ve had to use increasingly shorter wavelengths of light. Right now we use deep ultraviolet light at the 193nm wavelength, which has been sufficient for etching features all the way down to the 10nm level. As I wrote last year, with current technology this is about the limit, as even workarounds like double-patterning only get us so far due to their expensive nature. EUV on the other hand works with light at 13.5nm, allowing for much finer details to be etched, although there have been some significant drawbacks which have prevented its use in at-scale manufacturing.
For starters, producing the required wattage of light at that wavelength is incredibly difficult. The power required at the wafer to etch features with EUV is around 250W, a low figure to be sure, however because nearly everything (including air) absorbs EUV the initial power level must be far beyond that. Indeed even in the most advanced machines only around 2% of the total power generated actually ends up on the chip. This is what has led ASML to develop the exotic machine you see above, in which both the silicon substrate and the EUV light source operate in total vacuum. This setup is capable of delivering 200W, which is getting really close to the required threshold, but it still requires some additional engineering before it can be utilized for manufacturing.
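To put those figures in perspective, here’s a quick back-of-envelope sketch (using the roughly 2% end-to-end efficiency and 250W wafer target quoted above; treat it as an estimate, not ASML’s actual power budget) of how much raw EUV power the source has to generate:

```python
# Back-of-envelope EUV power budget using the figures quoted above.
# Assumption: only ~2% of the generated EUV light survives the optical path.
target_power_w = 250   # power needed at the wafer (W)
efficiency = 0.02      # fraction of source light that reaches the wafer

source_power_w = target_power_w / efficiency
print(f"Required source power: {source_power_w / 1000:.1f} kW")  # 12.5 kW
```

In other words, the source has to produce on the order of twelve kilowatts of EUV light just to land 250 watts on the chip, which goes some way to explaining why these machines are so exotic.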
However progress like this significantly changes the view many had on EUV and its potential for extending silicon’s life. Even last year when I was doing my research into it there weren’t many who were confident EUV would be able to deliver, given its limitations. However with ASML projecting that they’ll be able to deliver manufacturing capability in 2018 it’s suddenly looking a lot more feasible. Of course this doesn’t negate the other pressing issues, like interconnect widths bumping up against physical limitations, but that’s not a problem specific to EUV.
The race is on to determine what the next generation of computing chips will look like and there are many viable contenders. In all honesty it surprised me to learn that EUV was becoming such a viable candidate as, given its numerous issues, I felt that no one would bother investing in the idea. It seems I was dead wrong as ASML has shown that it’s not only viable but could be used in anger in a very short time. The next few node steps are going to be very interesting as they’ll set the tempo for technological progress for decades to come.
There are few computer interconnects that have been as pervasive as USB. Its limitations are numerous, however the ease with which it could be integrated into electronic devices ensured that it became the de facto standard for nearly everything that needed to talk to a PC. Few other connectors have dared to battle it for the connectivity crown, Firewire being the only one that comes to mind, but the new upstart Thunderbolt has the potential to usurp it. Right now it’s mostly reserved for the few who’ve splashed out on a new Macbook, but the amount of connectivity, bandwidth and versatility that the Thunderbolt 3 specification from Intel brings is, quite frankly, astounding.
Thunderbolt, in its current incarnation, uses its own proprietary connector. There’s nothing wrong with that specifically, especially when you consider the fact that a single Thunderbolt connection can break out into all manner of signals, however its size and shape don’t lend themselves well to applications in portable or slimline devices. The latest revision of the Thunderbolt specification however, announced recently by Intel at Computex in Taiwan, ditches the current connector in favour of the USB Type-C connector which, along with the space savings, brings other benefits like reversibility and hopefully much cheaper production costs. Of course the connector is really just one tiny aspect of all the benefits that Thunderbolt 3 will bring.
The new Thunderbolt 3 interface will double the current bandwidth available from 20Gb/s to 40Gb/s, enough to drive two 4K displays at 60Hz off a single cable. To put that in perspective the current standard for high resolution screen interconnects, DisplayPort, currently only delivers 17Gb/s, with the future 1.3 version slated to deliver 34Gb/s. On its own that might not be exactly groundbreaking news for consumers (who really cares what the raw numbers are as long as it displays the pictures) but combine that with the fact that Thunderbolt 3 can deliver 100W worth of power and suddenly things are a lot different. That means you could run your monitor off the one cable, even large monitors like my AOC G2460PGs, which only draw 65W under load.
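As a sanity check on that two-4K-displays claim, here’s a rough calculation of the raw pixel bandwidth involved. This ignores blanking intervals and link encoding overhead, so the real figure on the wire is somewhat higher, but it shows the headline number is plausible:

```python
# Raw (uncompressed) bandwidth for a 4K display at 60Hz, 24 bits per pixel.
# Blanking intervals and link overhead are ignored in this rough estimate.
width, height, refresh_hz, bits_per_pixel = 3840, 2160, 60, 24

per_display_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
total_gbps = 2 * per_display_gbps
print(f"Two displays: {total_gbps:.1f} Gb/s of a 40 Gb/s link")  # ~23.9 Gb/s
```

So two uncompressed 4K/60Hz streams take up roughly 24Gb/s of the 40Gb/s on offer, leaving plenty of room for the overheads and other traffic.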
Like its predecessors Thunderbolt 3 will be able to carry all sorts of signals along its wires, including up to 4 lanes worth of PCIe. Whilst many seem to be getting excited about the possibility of external graphics cards, despite the obvious limitations they have, I’m more excited about more general purpose stuff that can be done with external PCIe lanes. The solutions available for doing that right now aren’t great but with 100W of power and 4 PCIe lanes over a single cable there’s potential for them to become a whole lot more palatable.
Of course we’ll be waiting quite a bit of time before Thunderbolt 3 becomes commonplace as manufacturers of both PCs and devices that have that connector ramp up to support it. The adoption of a more common connector, along with the numerous benefits of the Thunderbolt interface, has the potential to accelerate this however they still have a mountain to climb before they can knock USB down. Still I’m excited for the possibilities, even if it will mean a new PC to support them.
Who am I kidding, I’ll take any excuse to get a new PC.
The main substrate of our roads hasn’t changed much in the past 50 years. Most of our roads these days are asphalt concrete, with some being plain old concrete with a coarse aggregate in them. For what we use them for this isn’t really an issue, as even the most modern cars perform just as well on all kinds of roads, so the impetus to improve them is low. There have been numerous ideas put forth to take advantage of the huge swaths of road we’ve laid down over the years, many seeking to use the heat they absorb to do something useful. One idea though would be a radical departure from the way we currently construct roads, and it could prove to be a great source of renewable energy.
Solar (Freakin’) Roadways are solar tiles that can be laid down in place of regular road. Their surface is tempered glass that’s durable enough for a tractor to trundle over and provides the same amount of grip that a traditional asphalt surface does. Underneath that surface is an array of solar panels that generate electricity during the day. The hexagonal panels also include LEDs which can be used to display lane markers, traffic signs or even alert drivers to hazards that have been detected up the road. Both the concept art and the current prototypes they have developed look extremely cool and, with their Indiegogo campaign already fully funded, it’s almost a sure bet that we’ll see roads paved with these in the future.
The first question that comes to everyone’s mind though is just how much will roads paved in this way cost, and how does that compare to traditional roads?
As it turns out finding solid numbers on the cost of road construction per kilometer is a little difficult as the numbers seem to differ wildly depending on who you ask. A study that took data from several countries states that the median cost is somewhere around $960,000/km (I assume that’s USD) whereas councils from Australia have prices ranging from $600,000/km to $1,159,000/km. Indeed depending on how complicated the road is the costs can escalate quickly with Melbourne’s Eastlink road costing somewhere on the order of $34,000,000 per kilometer laid down. In terms of feasibility for Solar Roadways I’d say that they could be competitive with traditional roads if they could get their costs to around $1,000,000/km at scale production something which, in my mind, seems achievable.
Unfortunately Solar Roadways isn’t forthcoming with costs as of yet mostly due to them being in the prototype stage. Taking a look over the various components they list though I believe the majority of the construction cost will come from the channels beneath the panels as bulk prices for things like solar panels, tempered glass and PCBs are quite low. Digging and concreting the channels required to carry the power infrastructure could easily end up costing as much as a traditional road does so potentially we’re looking at a slightly higher cost per km than our current roads. Of course I could be horribly wrong about this since I’m no civil engineer.
The cost would be somewhat offset by the power that the solar roads would generate, although the payback period is likely to be quite long. Their current prototypes are 36 watt panels which they claim will go up to 52 watts for the final production module. I can’t find any measurements for their panels so I’ve eyeballed that they’re roughly 30cm per side, giving them a size of about 0.2 square meters. This means that a square meter of these things could generate roughly 250 watts at peak efficiency. The output will vary considerably throughout the year, but say you get 7 hours per day at 50% of max output, you’re looking at about 875 watt-hours generated per square meter per day. Your average road is about 3 meters wide, giving us 3,000 square meters of generation area per kilometer and about 2,600kWh per day. The current feed in tariffs in Australia would have 1km of Solar Roadways road making about $1,000 per day, giving a payoff time of around 3 years. My numbers are likely skewed to be larger than they’d be realistically (there are many more factors that come into play), but even slashing the output down to 10% still gives you a payback time of 15 years, longer than the current expected life of the panels.
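For the curious, that back-of-envelope arithmetic can be laid out explicitly. The feed-in tariff here is my assumption, chosen so the revenue matches the roughly $1,000/day figure; everything else uses the numbers from the paragraph above:

```python
# Back-of-envelope Solar Roadways payback estimate, using the figures above.
# The tariff is an assumption chosen to match the ~$1,000/day revenue figure.
peak_w_per_m2 = 250            # eyeballed peak output per square meter
hours_per_day = 7              # assumed hours of useful sun
avg_output_fraction = 0.5      # assumed average fraction of peak output
road_width_m, road_length_m = 3, 1000
tariff_per_kwh = 0.38          # assumed feed-in tariff ($/kWh)
cost_per_km = 1_000_000        # the "competitive" cost target ($)

wh_per_m2_day = peak_w_per_m2 * hours_per_day * avg_output_fraction  # 875 Wh
kwh_per_day = wh_per_m2_day * road_width_m * road_length_m / 1000    # 2,625 kWh
revenue_per_day = kwh_per_day * tariff_per_kwh                       # ~$1,000
payback_years = cost_per_km / (revenue_per_day * 365)
print(f"{kwh_per_day:.0f} kWh/day, payback in {payback_years:.1f} years")
```

Run with these assumptions it comes out at just under three years, which is where the payoff figure above comes from; tweak the output fraction down and the payback stretches out accordingly.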
As an armchair observer then it does seem like Solar Roadways’ idea is feasible and could end up being a net revenue generator for those who choose to adopt it. All of my numbers are based on my speculation though, so there are numerous things that could put the kibosh on it, but it’s at least getting to the real world implementation stage to see how things pan out. Indeed should this work as advertised then the future of transportation could be radically different, maybe enough to curb our impact on the global ecosystem. I’m looking forward to seeing more from Solar Roadways as a future with them looks to be incredibly exciting.
Do you remember the Microwave Power Plant in Sim City 2000? The idea behind them was an intriguing one: you launched a satellite into orbit with a massive solar array attached and then beamed the power back down to Earth using microwaves that were collected at a giant receiver. Whilst it worked great most of the time there was always the risk that the beam would stray from its target and begin setting fire to your town indiscriminately, something which the then 11 year old me thought was particularly hilarious. Whilst we’ve yet to see that idea realised (or the disasters that came along with it, but more on that in a moment) the idea of putting massive solar arrays in orbit, or on a nearby heavenly body, is attractive enough to have warranted significant study.
The one limiting factor of most satellite based designs though is that they can’t produce power constantly due to them getting occluded for almost half their orbital period by Earth. Shimizu Corporation’s idea solves this issue in the most fantastical way possible: by wrapping our moon in a wide band of solar panels, enabling it to generate power constantly and beam it back down to Earth. Such an endeavour would seem like so much vapourware coming from anyone else but Shimizu is one of Japan’s leading architectural and engineering firms with annual sales of $14 billion. If there’s anyone who could make this happen it’s them and it aligns with some of the more aggressive goals for space that the Japanese government has heavily invested in of late.
The idea is actually quite similar to its incarnation in Sim City. Since the Moon is tidally locked with Earth (i.e. one side of the Moon always points towards us) there only needs to be a single base station on the Moon. A ring of solar panels would then be constructed all the way around the Moon, ensuring that no matter what the positions of the Moon, Earth and the Sun, there will always be an illuminated section. There would have to be multiple base stations on Earth to receive the constantly transmitted power, but since the power beams would be pointable they needn’t be placed in any particular location.
Of course such an idea raises the question of what would happen should the beam be misaligned or temporarily swing out of alignment, potentially roasting anything in the nearby vicinity. For microwaves this isn’t much of a threat since the amount of power delivered per square meter is relatively low, with a concentrated burst of 2 seconds barely enough to raise your body temperature by a couple of degrees. A deliberately mistargeted beam could do some damage if left unchecked, but you could also combat it very easily by just putting up reflectors or rectennas (rectifying antennas) to absorb it. The laser beams on the other hand are designed to be “high density” so you’d want some rigorous safety systems in place to make sure they didn’t stray from their course.
Undertaking such a feat would require several leaps in technology, not least of which would be in the automation of its construction, but it’s all based on sound scientific principles. It’s unlikely that we’ll even see the beginnings of something like this within the next couple decades but as our demand for power grows options like this start to look a lot more viable. I hope Shimizu pursues the idea further as they definitely have the resources and know how to make it happen, it’s all a question of desire and commitment to the idea.
There’s a long running joke that fusion reactors are always 20 years away, something which people began saying about 60 years ago. It’s not that we get it wrong per se, more that we have a tendency to underestimate the complexity of achieving the next step, something which is usually written off as a simple piece of engineering. We’re now acutely aware of the fact that the practical aspects of running a fusion based power plant will likely require significant advancements in materials science, and that’s if the theoretical models we have turn out to be correct. Whilst we’ve been able to fuse atoms for a long time now, the end goal of fusion power generation, a self-sustaining plasma, has yet to be achieved, but one theoretical model recently got a jolt of hard science behind it, lending a lot of credence to the whole field.
The National Ignition Facility has been dedicated to studying Inertial Confinement Fusion, ostensibly because it aligns with most of their overarching goals (one of which is weapons research). Of the two main branches of fusion research, the other being Magnetic Confinement Fusion, ICF is something of a poor sibling in terms of research dollars and large scale experiments. This is not to say its claim is any less valid, just that, at least in this armchair physicist’s understanding, its brand of fusion doesn’t lend itself particularly well to being scaled up to power generation levels, at least not with its current modelling. However NIF has announced today that, for the first time ever for any fusion experiment, their reaction released more energy than was pumped into it; a sure sign that nuclear fusion was occurring.
It’s a pretty amazing feat and is definitely something that NIF should be proud of, however that figure does not take into account the total energy of the system, which was several orders of magnitude higher than the energy produced at the other end. Thus for such a system to get past full unity it would need an input to output multiplier somewhere in the thousands, much more than what they’re currently achieving. Still, as far as I was aware, we weren’t even entirely sure if this kind of fusion was feasible, given the strict requirements on many of the parameters. Of course such challenges aren’t entirely unique to this brand of fusion, but you have to wonder why, after the initial burst of research into ICF, things started to slow down considerably, with MCF being the reigning champion for many decades now.
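To make that gap concrete, here’s a rough sketch with illustrative numbers in the ballpark of the NIF result (the exact energies here are my assumptions, not NIF’s published figures): the fuel released more energy than it absorbed, but the facility as a whole consumed vastly more than the reaction gave back.

```python
# Illustrative fusion energy accounting (approximate figures; these are
# assumptions in the ballpark of the NIF result, not official numbers).
fuel_yield_j = 14e3        # fusion energy released by the fuel
energy_into_fuel_j = 10e3  # energy actually absorbed by the fuel
laser_energy_j = 1.8e6     # laser energy fired at the target
wall_plug_j = 400e6        # electricity drawn from the grid per shot

fuel_gain = fuel_yield_j / energy_into_fuel_j     # > 1: the milestone reported
laser_gain = fuel_yield_j / laser_energy_j        # still well under 1
shortfall = wall_plug_j / fuel_yield_j            # multiplier to true breakeven
print(f"Fuel gain {fuel_gain:.1f}, but ~{shortfall:,.0f}x short of wall-plug breakeven")
```

The point being that "more energy out than in" is true only for the innermost accounting boundary; measured against the whole facility the multiplier needed is enormous.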
From what I can tell though, with my admittedly limited knowledge on the subject matter, MCF has the greatest potential to translate into powerplant scale devices much sooner than those using ICF as a base. Indeed the challenges presented by MCF do lend themselves well to scale (although large magnetic fields always present some trifles) whereas with ICF the challenges increase dramatically with scale, as it becomes significantly harder to ensure the right reactions happen to sustain fusion. Of course I’m willing to be told otherwise on this as I could just be suffering from some geek lust for ITER’s sultry designs.
In any case it’s extremely exciting to see the progress that’s being made as it bodes well for a future that could be free of fossil fuels. Whilst I’d love to believe that we’re 20 years away now (and indeed ITER’s schedule puts the first DT reaction within that time frame) I’m going to need to see a few more milestones like this one to start believing it. We’re tantalizingly close however with the evidence constantly building that we’re on the right track to producing all the energy we need without having to dump untold tons of carbon back into our atmosphere.
And that’s why it’s worth spending billions of dollars on researching every possibility for developing a sustainable fusion reactor.
Long time readers will know I’m something of a petrol head. It’s an obsession that grew out of my introduction to all things automotive at a young age when my parents let me ride around our property on our little Honda Z50 which continued on through multiple bikes and cars as I grew older. Since these cars were usually bound for the scrap heap keeping them running wasn’t something I (well my parents, really) had much interest in spending money on so I became intimately acquainted with the inner workings of late model Datsuns. Whilst I don’t bother diving under the hood of my current car very often the interest in the technology that drives them hasn’t faded as demonstrated by my fixation on this video:
The one area I never really wanted to touch was transmissions, mostly because they’re one of the hardest parts of the car to work on and taking them apart is fraught with danger for unqualified hacks like myself. Whilst I knew the basics of how transmissions worked I didn’t know the complex dynamics of power transmission through the varying gears. Whilst this video might not be reflective of how current transmissions work (indeed it’s worlds away from how an automatic works, to my understanding) the fundamentals are still there in beautiful 1930’s video glory.
What I’d really be interested to see is the gear work behind some of the advanced transmission schemes that power some of the more modern cars, like Volkswagen’s Direct-Shift Gearbox (which is essentially 2 gearboxes working in tandem). There’s also the Continuously Variable Transmission, which has the peculiar behaviour of letting the engine rev itself out whilst it gradually gears up. This can allow a driver to peg the engine at its optimal RPM and keep it there until the desired speed is reached, although this is typically undesirable as it feels like clutch slippage in a traditional transmission, so many CVT based cars have “gears” that are essentially different response profiles. There’s even more exotic things on Formula 1 cars but that’s a whole other world to me.
I’ve mentioned before that I’m a big supporter of nuclear (and renewable) sources of energy and how frustrated I am that the social stigma attached to it has seen what would otherwise be a clean and safe source of power slip by the wayside. Many people seem to think that there’s more danger inherent in this technology than there is in other forms of power generation when this is simply not the case, but it seems that incidents of reactors past are still fresh in everyone’s mind. Still with countries like France pioneering the way for nuclear energy I’ve always held out hope that one day we can transition away from our current energy dependency on oil and coal.
It would seem that Obama isn’t as short sighted as many of his constituents are:
In his speech, Mr. Obama portrayed the decision as part of a broad strategy to increase employment and the generation of clean power. But he also made clear that the move was a bid to gain Republican support for a broader energy bill.
“Those who have long advocated for nuclear power — including many Republicans — have to recognize that we will not achieve a big boost in nuclear capacity unless we also create a system of incentives to make clean energy profitable,” Mr. Obama said.
He also strikes on one of the biggest problems (other than the social stigma) that nuclear power faces: the cost. Current estimates for new reactors peg the total construction cost at between $6 and $10 billion, with costs of construction going up faster than other means of power generation. Obama hits the nail on the head when he says that incentives are needed, as the majority of western countries are quite hostile to new nuclear plants. The amount of regulation and bureaucracy involved in setting up these plants typically makes them unprofitable for those who would want to build them. Guaranteeing funding for the majority of the work means that a lot of the risk is absorbed by the government, making the endeavour much more attractive.
Obama also gets kudos for using the proper spelling of nuclear (although that could be the reporter, I haven’t heard the speech myself. If you’ve got a link to it let me know!).
There is however hope for future reactors like the Westinghouse AP1000 (Yes, that Westinghouse) which has been commissioned by China for the princely sum of just $2 billion, a drastic reduction in cost. Additionally with China’s economy still growing strong they’ve planned a grand total of 100 of these reactors to be built over the course of the next decade which will have the added side effect of driving massive economies of scale when it comes to building AP1000 plants. With time I can see this reactor tech becoming a lot cheaper than their coal and oil counterparts, a critical step in driving mass adoption of nuclear technology.
However, whilst I believe that nuclear is the solution to many of our current problems, I do not believe that it is the final solution to our insatiable craving for energy. Research shows that as GDP increases so does energy consumption, so you can imagine that a country like China, which is just beginning to create a giant middle class, will create a demand for energy on a scale that we haven’t seen before. Whilst nuclear will be capable of sustaining them (and others) in the short term, the fact remains that nuclear still relies on a finite, mined fuel, used far more efficiently than coal or oil but exhaustible nonetheless, and alternatives must be sought.
Currently my hopes remain in fusion technology. Whilst fusion still consumes fuel, it produces far and away more energy from orders of magnitude less of it. However the technology is still in its infancy and requires significant amounts of research before commercial reactors become available. The good news is that many see the potential in this future technology, with projects like ITER attracting funding and involvement on an international scale. People might say that fusion is always 20 years away, but I have my hopes for this technology.
One thing that’s always a big issue for any project in space is how you’re going to power whatever you’re sending up there. As it turns out the methods that we use to generate power up in space are extremely varied, and in fact many of them paved the way for technologies we now use back here on Earth. However there are still some advances to be made, and if we are to return to the Moon and beyond some old barriers will have to be broken down in order to enable us to go further into space.
Many of the initial spacecraft that were sent up just had traditional chemical batteries in them. For the most part these worked well, and since they had been around for such a long time they were a proven technology (something that is critical in any space endeavour). As time went on and missions became much more ambitious NASA moved from batteries to fuel cells, and were the first to fly these in a spacecraft on their Gemini missions. Fuel cells are advantageous because not only do they produce power, but typically a decent amount of heat and water as well. In fact they are still used to power the space shuttle and will typically produce around 500 litres of water whilst in space. This is invaluable as that’s 500kg less water they have to bring with them and 500kg more payload they can take into orbit.
Satellites are another matter entirely. Since they don’t need any of those bothersome human things like water and heat, fuel cells aren’t the right choice for them, and the majority of artificial satellites in orbit around Earth and our nearby neighbours use good old fashioned solar power. At the distance we are from the Sun the available power is somewhere in the order of 1,400W/m², but that drops off dramatically as we reach further out into the solar system. In fact the amount of power available past Mars is so little compared to what we get here that only one mission to Jupiter, Juno, is currently slated to use solar panels.
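That drop-off follows the inverse square law, which makes the problem easy to quantify. Using the roughly 1,400W/m² figure above and approximate orbital distances, here’s how little sunlight a panel has to work with further out:

```python
# Solar flux falls off with the square of the distance from the Sun.
flux_at_earth = 1400  # W/m² at 1 AU, the approximate figure quoted above

def flux(distance_au):
    """Approximate solar flux (W/m²) at a given distance from the Sun."""
    return flux_at_earth / distance_au ** 2

# Approximate mean orbital distances in astronomical units.
for name, au in [("Mars", 1.52), ("Jupiter", 5.2)]:
    print(f"{name}: ~{flux(au):.0f} W/m²")
```

At Jupiter a solar panel sees only around 4% of the sunlight it would at Earth, which is why Juno’s enormous solar arrays are such an outlier and why everything else headed that far out carries a nuclear power source instead.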
So what do we use when we want to explore the deep reaches of space? The current technology used in most missions is called a Radioisotope Thermoelectric Generator (RTG), which in essence uses heat from decaying radioactive material to provide heat and electricity. In the past they’ve copped a lot of flak for using these, as environmental groups lament the potential for damaging the environment and spreading nuclear material across the earth. NASA has done extensive research on the matter but still runs up against endless red tape whenever they try to use one. The usefulness of these devices really can’t be overstated as they’ve given us missions such as Voyager 1, which has been going strong for over 30 years and is slated to last for at least another 15. This kind of technology is going to form the basis of any mission that attempts to leave our solar system.
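Voyager’s longevity falls directly out of the physics: plutonium-238, the fuel used in most RTGs, has a half-life of about 87.7 years, so the decay heat tapers off only slowly. This simple sketch ignores thermocouple degradation, which eats into the usable electrical output faster than the decay itself:

```python
# Simple radioactive decay model for an RTG's thermal output.
# Assumes Pu-238 fuel (half-life ~87.7 years); thermocouple degradation,
# which reduces electrical output further, is deliberately ignored here.
HALF_LIFE_YEARS = 87.7

def power_fraction(years):
    """Fraction of the original thermal output remaining after `years`."""
    return 0.5 ** (years / HALF_LIFE_YEARS)

for t in (30, 45):
    print(f"After {t} years: {power_fraction(t) * 100:.0f}% of launch output")
```

Even after 45 years in flight an RTG is still putting out around 70% of its launch-day heat, which is what makes multi-decade missions like Voyager 1 possible at all.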
NASA has begun to make inroads into producing small nuclear reactors that would be used to power a moon base. For any kind of long duration stay in space we humans need quite a lot more power than our robotic counterparts, and we won’t be able to use RTGs to satisfy this requirement. Whilst I do understand some of the environmental concerns, if I was going to trust anyone sending nuclear material into space it would be NASA, as they have a long track record of getting hazardous materials out of our atmosphere without incident. Unfortunately the environmentalists haven’t seen it that way, and continually put up roadblocks which inhibit progress.
Eventually though I’m sure we will be able to power our space based devices using nuclear power without the worry and red tape that we have now. As time goes by NASA and other space agencies will prove that the technology is sound after repeated launches and the controversy will be nothing but a memory. It is then we can start to look further out into our solar system, and hopefully, beyond.
After my last foray into the controversial world of the environment and power generation (which generated some stimulating discussion and research for me) I thought it best to take a look at the renewable means of power generation and which of them have a future. I’ve had a bit of experience with most of the technology in the past with a few of my off site engineering lectures, a requirement for any engineering degree, being held on renewable energy technologies. My father also teaches renewable energy classes at the local TAFE here in Canberra, and I’ve seen quite a few interesting projects he’s been involved with over the years.
When we talk about renewable energy sources we’re looking for something that doesn’t rely on fossil fuels. The main candidates for renewable energy are:

- Solar photovoltaic
- Solar thermal
- Wind
- Oceanic (wave and tidal)
- Geothermal
- Biomass
- Hydroelectric
Now not one of these solutions can meet all of the energy needs of the entire world on its own, and there are many different factors to consider. The ideal solution will probably end up being a combination of many of these technologies (and some of the ones that are currently under development), just like the power generation mix we use today.
First, the main consideration is base load power generation. Whilst this is usually trotted out as the argument to destroy the idea of using any form of renewable energy, it does raise a key point that needs to be addressed. Many of the renewable energies I’ve mentioned (in fact just over half of them) can’t produce stable amounts of power. Solar, wind and oceanic technologies vary their power output significantly depending on their environment. To solve this issue base load generating stations like geothermal and biomass have to be used to supply that base level of power. The other alternative is to invest in storage technologies, like molten salt for solar thermal. For Australia I believe that geothermal and solar thermal are probably the way to go. This is because we have so much uninhabitable land that is very dry and sunny, something that these technologies thrive on. Photovoltaics are nice for smaller installations, however they currently do not scale as well as the others, although that might all change when sliver cells take off¹.
Secondly, load following plants are also required in order to accommodate variations in power requirements. Biomass and hydroelectric are both options for this, however I’m not entirely sure how well they can scale up. It may be more efficient to have more base load plants and just disconnect them from the grid when demand drops. Whilst that may sound counter-intuitive it would be perfectly acceptable, since the energy is usually not being harnessed anyway.
The last problem I’ve seen with the implementation of renewables is the lack of ideal locations for certain technologies. Geothermal requires geysers to be present or implementation of a hot rocks plant. Wind requires either high altitude or favourable wind environments such as offshore. Solar and solar thermal require a decent amount of sun and a nice flat area. You can see where I’m going with this: there’s a fair amount of work to be done to get these things in and working.
Having said all this, I’m still all for these technologies. All of the problems I’ve put forward are entirely solvable and eventually we’ll be forced into implementing these solutions. The great news is that a lot of the supposedly big bad oil companies are in fact on board and supporting this kind of technology. The ones who aren’t will eventually fall by the wayside, and we can only hope they come around before they pull an Enron and dissolve the company.
I still believe nuclear would be a great transition technology, but only time will tell.
¹I actually had the pleasure of meeting the developer of sliver technology, Andrew Blakers, back when I was a fledgling engineer. His technology does have the potential to change photovoltaics in a way that would make them highly viable. Origin Energy has some great pictures of the cells in development, and hopefully they’ll be commercially available soon.
I can’t help but feel that there are some technologies out there that get hit with a bad name once and are then driven underground because of it. Cold fusion was a great example of this: the scientists who first experimented with it didn’t follow proper scientific method, and now any serious research into the area is immediately met with disdain, even though there are some results that warrant further investigation. This becomes all the more painful when something that is proven to work gets the same sort of reaction. I am of course referring to nuclear power, or fission reactors.
Now what’s the first thing that comes to mind when someone mentions nuclear power to you? Is it a clean source of energy, or do you get images of Chernobyl, Three Mile Island and nuclear weaponry? It seems the majority of the world is stuck in the latter mindset, only remembering the horrors that nuclear power has brought to the world. The truth of the matter is that not only is nuclear power safe, it’s also a lot friendlier to the environment than any fossil fuel based means of generating power.
The first question I usually get concerning nuclear power is “Doesn’t it produce highly radioactive and toxic waste?”, and the answer is yes, it does. However, per kilowatt-hour of energy produced, a coal plant will release around 100 times more radiation into the surrounding environment. Additionally, most of the radioactive waste that comes out of a nuclear plant is still usable as fuel for a reactor; it just requires some more handling. This is done using breeder reactors, which I do admit carry with them a small risk of proliferation. That risk can be offset by modifying the breeder to render the weapons grade material unusable, keeping it well within acceptable levels.
One country that has been listening to people like me is France, producing around three quarters of their electricity from nuclear sources. They’ve also only had 2 incidents arising from their use of nuclear power and breeder reactors, giving them an amazing track record for safety. You would think that if there was such a high risk in using nuclear power the French would have had a multitude of accidents, but they haven’t. Clearly nuclear power is a lot safer than the general public believes.
To give you an idea of just how bad public opinion is here’s a graph showing the number of nuclear reactors over time:
Image used under the Creative Commons Attribution-NonCommercial-ShareAlike License Version 2.5 from Global Warming Art.
The Three Mile Island incident was a pretty minor affair technically and nuclear power continued to grow afterwards. However Chernobyl tarnished the world’s view of nuclear power and it hasn’t really recovered since. The fact of the matter is the reactor responsible for that disaster was known at the time to be an unsafe design and modern reactors are quite capable of shutting themselves down before such a disaster can occur.
It’s the old saying of once bitten, twice shy. The world suffered through a major accident with nuclear power, and from then on anyone peddling it as the solution to the world’s energy problems has had to work past lobbyists, politicians and society at large. It’s hard to convince everyone that the risks are far lower than they used to be, and for some reason the mythical idea of a clean coal power plant seems like a better idea than proven nuclear technologies. Australia, whose uranium reserves are the largest in the world, is well positioned to take advantage of this technology. With so much non-arable land available there’s no reason for us not to set up large reactors away from major population centres, keeping the “risks” to the population smaller still.
So hopefully the next time you talk to someone about nuclear power you won’t see the green glowing boogey man that seems so ingrained in everyone’s heads. One day nuclear will be one of our few options left, and it is my hope that we begin working on implementing a nuclear based power infrastructure before it’s our last option.