Our sun is an incredibly violent place, fusing atoms together at a staggering rate and pouring vast torrents of energy into our solar system. Yet from certain perspectives it takes on a serene appearance, its surface ebbing and flowing as particles trace out parts of its vast magnetic field. Indeed that’s exactly what the following video shows: a gorgeous composition of imagery taken by NASA’s Solar Dynamics Observatory. Whilst not all of us have the luxury of a 4K screen it’s still quite breathtaking to behold and definitely worth at least a few minutes of your time.
SDO has been in orbit for 5 years now, keeping an almost unbroken eye on our parent star. Its primary mission is to better understand the relationship between the earth and the sun, especially those interactions which have a direct impact on daily life. To achieve this SDO observes the sun in multiple wavelengths all at once (shown as different colours in this video) and on a much smaller timescale than previous craft have attempted. This has led to insights into how the sun generates its magnetic field, what that field looks like and how the complex fusion processes influence the sun’s varying outputs like the solar wind and energetic particles. Those images aren’t just rich with scientific data however; they also showcase the sun’s incredible beauty.
So, how’s the serenity? 😉
As far as we know right now we’re alone in the universe. However the staggering size of the universe suggests that life should be prevalent elsewhere and we (or they) have the unenviable task of tracking it (or us) down. We’re also not quite sure what to look for as, whilst we have solid ideas about our kind of life, there’s no guarantee they hold true across the galaxy. So when it comes to observing phenomena the last explanation researchers should resort to is “aliens did it” as we simply have no way of verifying that’s the case. It does make for some interesting speculation however, like the current wave of media hysteria surrounding KIC 8462852, or Tabby’s Star as it’s more informally called.
KIC 8462852 was one of some 145,000 stars that were being constantly monitored by the Kepler spacecraft, a space telescope designed to detect exoplanets. Kepler’s planet detection method relies on a planet transiting (i.e. passing in front of) its parent star during the observation period. When the planet does this it ever so slightly drops the brightness of the star, and this can give us insights into the planet’s size, orbit and composition. This method has proven to be wildly successful, with the number of identified exoplanets increasing significantly thanks to Kepler’s data. KIC 8462852 has proved particularly interesting however as its variation in brightness is way beyond anything we’ve witnessed elsewhere.
Indeed instead of the tiny dips we’re accustomed to seeing (an Earth-like planet transiting a main sequence star like ours produces a dip of about 84 parts per million), KIC 8462852’s brightness has dropped by a whopping 15% and 22% on separate occasions. Typically this wouldn’t be particularly interesting, as there are many stars whose output varies for numerous reasons, however KIC 8462852 is an F-type main sequence star, very similar to ours (which is a G-type if you’re wondering). These don’t vary wildly in output and the scientists have ruled out equipment issues and other potential phenomena, so what we’re left with is a star with varying output and no great explanation. Whatever is blocking that light has to be huge, at least half the width of the star itself.
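Those two figures are easy to sanity check yourself: a transit depth is just the square of the radius ratio between the occulting body and the star. A quick back-of-the-envelope sketch using rough published radii (treat it as an order-of-magnitude check, nothing more):

```python
import math

# Transit depth = (R_occulter / R_star) ** 2
R_SUN_KM = 696_000    # approximate solar radius
R_EARTH_KM = 6_371    # approximate Earth radius

earth_dip_ppm = (R_EARTH_KM / R_SUN_KM) ** 2 * 1e6
print(f"Earth-like transit: ~{earth_dip_ppm:.0f} ppm")  # ~84 ppm

# Working backwards from KIC 8462852's deepest 22% dip:
radius_ratio = math.sqrt(0.22)
print(f"Occulter width: ~{radius_ratio:.0%} of the star")  # ~47%
```

That second number is where the "at least half the width of the star" claim comes from.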
There are a few potential candidates to explain this, most notably a cloud of comets on an elliptical orbit that happens to cross our line of sight. How exactly that came to be is anyone’s guess, and indeed it would be a rare phenomenon, but it’s looking to be the best explanation we currently have. A massive debris field has been ruled out due to a lack of infrared radiation, something which would be present due to the star heating the debris. This has led to some speculation as to what else could cause something like this and some have looked towards intelligent life as the cause.
How could an alien race make a star’s output dip that significantly you ask? Well the theory goes that any sufficiently advanced civilization will eventually require the entire energy output of their star in order to fuel their activities. The only way to do that is to encase the star in a sphere (called a Dyson Sphere) in order to capture all of the energy that it releases. Such a megastructure couldn’t be built instantly however and so to an outside observer the star’s output would likely look weird as the structure was built around it. Thus KIC 8462852, with its wild fluctuations of output, could be in the process of being encased in one such structure for use by another civilization.
Of course such a hypothesis makes numerous leaps that are not supported by any evidence we currently have at our disposal. The research is thankfully focused on finding a more plausible explanation, something which we are capable of finding through further observations of this particular star. Should all these attempts fail to explain the phenomenon, something which I highly doubt will happen, only then should we start toying with the idea that this is the work of some hyper-advanced alien civilization. Whilst the sci-fi nerd in me wants to leap at the possibility of a Dyson sphere being built in our backyard I honestly can’t entertain the idea when I know there are so many other plausible options out there.
It is fun to dream, though.
We’ve known for some time that water exists in some forms on Mars. The Viking program, which consisted of both orbiter and lander craft, showed that Mars’ surface had many characteristics that must have been shaped by water. Further probes such as Mars Odyssey and the Phoenix Lander showed that much of the present day water that Mars holds is present at the poles, trapped in the vast frozen tundra. There’s been a lot of speculation about how liquid water could exist on Mars today however no conclusive proof had been found. That was until today when NASA announced it had proof that liquid water flows on Mars, albeit in a very salty form.
The report comes out of the Georgia Institute of Technology with collaborators from NASA’s Ames Research Center, Johns Hopkins University, the University of Arizona and the Laboratoire de Planétologie et Géodynamique. Using data gathered from the Mars Reconnaissance Orbiter the researchers identified seasonal geologic features on Mars’ surface. These dark lines (pictured above), dubbed recurring slope lineae (RSL), change over time, darkening and appearing to flow during the warmer months and then fading during the colder months. It has been thought for some time that these slopes were indicative of liquid water flows however there wasn’t any evidence to support that theory.
This is where the MRO’s Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) comes into play. This instrument was specifically designed to detect water on Mars by analysing the wavelengths of light coming from the planet’s surface. Once the target sites were identified CRISM was pointed at them and their surface composition analysed. What was found at the RSL sites were minerals called hydrated salts which, when mixed with water, lower its freezing point significantly. Interestingly these hydrated salts were only detected in places where the RSL features were particularly wide; other places, where the RSLs were slimmer, did not show any signs of them.
These salts, called perchlorates, have been seen by several other Mars missions although never before in hydrated form. These perchlorates can potentially keep water from freezing at temperatures down to -70°C. Additionally some of these perchlorates can be used in the manufacture of rocket fuel, something which could prove quite valuable for future missions to Mars. Of course they’re likely not in a readily usable form, requiring some processing on site before they can be utilized.
Data like this presents many new opportunities for further research on Mars. It’s currently postulated that these RSLs are the result of a shallow subsurface flow which wicks up to the surface when conditions are warmer. If this is the case then these sites would be the perfect place for a rover to investigate as there’s every chance it could directly sample martian water. Considering that wherever we find liquid water on Earth we find life, there’s great potential for the same to be true on Mars. If there isn’t then that will also tell us a lot, which means it’s very much worth investigating.
Scale is something that’s hard to comprehend when it comes to celestial-sized objects. The sheer vastness of space is so far beyond anything that we see in our everyday lives that it becomes incomprehensible. Yet in such scale I find perspective and understanding, knowing that the universe is far greater than anything going on in just one of its countless planets. To really grasp that scale though you have to experience it; to understand that even in our cosmic backyard the breadth of space is astounding. That’s just what the following video does:
You’d be forgiven for not knowing that Amazon founder Jeff Bezos had founded a private space company. Blue Origin, as it’s known, isn’t one for the spotlight: whilst it was founded in 2000 (2 years before SpaceX) it wasn’t revealed publicly until some years later. The company has had a handful of successful test launches, focusing primarily on suborbital flight with Vertical Takeoff/Vertical Landing (VTVL) capable rockets. Indeed their latest test vehicle, the New Shepard, was successfully launched at the beginning of this year. Outside of that though you’d be hard pressed to find out much more about Blue Origin. Today however they announced that they will be launching from Cape Canaveral, using the SLC-36 complex previously used by the Atlas launch system.
It might not sound like the biggest deal however the press conference held for the announcement provided some insight into the typically secretive company. For starters Blue Origin’s efforts have thus far been focused on space tourism, much like Virgin Galactic’s. Indeed all their previous craft, including the latest New Shepard design, were suborbital craft designed to take people to the edge of space and back. This new launch site however is designed with much larger rockets in mind, ones that will be able to carry both humans and robotic craft into Earth orbit, putting them in direct competition with SpaceX and other private launch companies.
The new rocket, called Very Big Brother (pictured above), is slated to be Blue Origin’s first entry into the market. Whilst raw specifications aren’t yet forthcoming we do know that it will be based on Blue Origin’s BE-4 engine which is being co-developed with United Launch Alliance. This engine is slated to be the replacement for the RD-180 which currently powers the Atlas-V launch vehicle. Comparatively speaking the engine is about half as powerful as the RD-180, meaning that if the craft is similar in design to the Atlas-V its payload will be somewhere in the 4.5 to 9 tonne range to LEO. Of course this could be wildly different to what they’re planning and we likely won’t know much more until the first craft launches.
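That 4.5 to 9 tonne guess is nothing more than linear scaling of the Atlas-V’s roughly 9 to 18 tonne LEO payload range. A sketch of the arithmetic, under the crude assumption that payload scales linearly with first-stage thrust (real performance depends on staging, engine efficiency and mass fractions, so treat this as order-of-magnitude only):

```python
# Crude payload estimate: assume payload to LEO scales linearly with
# first-stage thrust. The Atlas-V range below is approximate, spanning
# its lighter and heavier variants.
ATLAS_V_LEO_TONNES = (9.0, 18.0)  # approximate payload range to LEO
THRUST_RATIO = 0.5                # BE-4 is roughly half an RD-180

low, high = (p * THRUST_RATIO for p in ATLAS_V_LEO_TONNES)
print(f"Estimated payload to LEO: {low:.1f} to {high:.1f} tonnes")  # 4.5 to 9.0
```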
Interestingly the craft is going to retain the VTVL capability of its predecessors. This is notable because no craft of this size has that capability: SpaceX has been trying very hard to get it to work with the first stage of their Falcon-9 however they have yet to achieve a successful landing. Blue Origin likely won’t beat SpaceX to the punch on this but it’s still interesting to see other companies adopting similar strategies in order to make their rockets reusable.
Also of note is the propellant that the BE-4 engine will use. Unlike most rockets, which run on either liquid hydrogen/liquid oxygen or RP-1 (kerosene)/liquid oxygen, the BE-4 will use natural gas and liquid oxygen. Indeed it’s only recently that methane has been considered a viable propellant; I could not find an example of a mission that has flown using the fuel. However there must be something to it as SpaceX is going to use it for their forthcoming Raptor engines.
I’m starting to get the feeling that Blue Origin and SpaceX are sharing a coffee shop.
It’s good to finally get some more information out of Blue Origin, especially since we now know their ambitions are far beyond that of suborbital pleasure junkets. They’re entering a market that’s now swarming with competition however they’ve got both the capital and strategic relationships to at least have a good go at it. I’m very interested to see what they do at SLC-36 as more competition in this space is a good thing for all concerned.
The way we get most of the scientific data back from the rovers we currently have on Mars is indirect. Currently there are four probes orbiting Mars (Mars Odyssey, Mars Express, Mars Reconnaissance Orbiter and MAVEN), all of which carry communications relays able to receive data from the rovers and then retransmit it back to Earth. This has significant advantages, chief among them being that the orbiters have longer periods in which to communicate with Earth. Whilst all the rovers have their own direct connections back to Earth they’re quite limited, usually several orders of magnitude slower. Current rovers won’t have their communication links upgraded, but for future missions a better direct-to-Earth link could prove valuable, something which researchers at the University of California, Los Angeles (UCLA) have started to develop.
The design is an interesting one, essentially a flat panel of phased antenna array elements using a novel construction. The reasoning behind the design was that future Mars rover missions, specifically the Mars 2020 mission, would have constraints around how big an antenna they could carry. Taking this into account, along with the constraint that NASA typically uses X-band for deep space communications like this, the researchers came up with a design that maximises the gain of the antenna. The result is this flat, phased array design which, when tested as a prototype 4 x 4 array, closely matched its simulated performance metrics.
With so many orbiters around Mars it might seem like a better direct-to-Earth communications link wouldn’t be useful, however there’s no guarantee that those relays will always be available. Mission support for most of those orbiters is slated to end in the near future, with the furthest out (MAVEN) slated for decommissioning in 2024. Since a new rover is slated to land sometime in 2020, and since we know how long these things can last once they’ve landed, having better on board communications might become crucial to the ongoing success of the mission. Indeed should any of the other rovers still be functioning at that time the new rover may have to take on the relay responsibilities itself, and that would demand a much better antenna design.
There’s still more research to be done with this particular prototype, namely scaling it up from its current 4 x 4 design to the ultimate 16 x 16 panel. Should the design prove to scale as expected then there’s every chance that you might see an antenna based on this design flying with an orbiter in the near future. I’m definitely keen to see how this progresses as, whilst it might have the singular goal of improving direct to Earth communications currently, the insights gleaned from this design could lead to better designs for all future deep space craft.
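The payoff from that scaling is easy to quantify: an ideal phased array’s gain is roughly the single element’s gain plus 10·log10(N), so going from 16 elements to 256 buys about 12 dB. A sketch of that relationship (the 5 dBi element gain is purely illustrative, not a figure from the paper, and real arrays lose some gain to feed networks and mutual coupling):

```python
import math

def array_gain_db(element_gain_db: float, n_elements: int) -> float:
    """Ideal phased-array gain: element gain plus 10*log10(N).

    Ignores real-world losses (feed network, mutual coupling, taper)."""
    return element_gain_db + 10 * math.log10(n_elements)

ELEMENT_GAIN_DB = 5.0  # hypothetical single-element gain
g_4x4 = array_gain_db(ELEMENT_GAIN_DB, 4 * 4)
g_16x16 = array_gain_db(ELEMENT_GAIN_DB, 16 * 16)
print(f"4x4: {g_4x4:.1f} dBi, 16x16: {g_16x16:.1f} dBi "
      f"(+{g_16x16 - g_4x4:.0f} dB from 16x the elements)")
```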
Black holes are a never ending source of scientific intrigue. They form when a star of appropriate mass, approximately 5 to 10 times that of our own, reaches the end of its life after fusing heavier and heavier elements. At this stage the outward pressure exerted by those fusion reactions can no longer overcome the gravity of its mass and it begins to collapse inwards. Eventually, in a calamitous event known as a supernova, the core shrinks down to a point mass of infinite density and nothing, not even light, can escape its gravitational bounds. Properties like that mean black holes do very strange things, most of which aren’t explained adequately by current models of our universe. One such thing is the Information Paradox, which has perplexed scientists for as long as the idea of black holes has been around.
The paradox stems from the interaction between general relativity (Einstein’s description of gravity as a property of spacetime) and quantum mechanics (the processes that govern atoms, photons and other particles). Their interaction suggests that physical information about anything that crosses the black hole’s event horizon could be destroyed. The problem is that this violates the generally held idea that if we know the state of a system at one point in time we should be able to determine its state at any other point in time. Put simply, once something falls into a black hole the information describing it appears to be gone for good, with no way even in principle to recover it.
However renowned physicist Stephen Hawking, whose work on black holes is so notable that one feature of them (Hawking Radiation) is named after him, has theorized that the information might not be lost at all. Instead of the information being lost or stored within the black hole itself Hawking proposes that it is stored as a super-translation (essentially a hologram, a 2D representation of 3D data) in the event horizon. Whilst for all practical purposes this means the information is still lost, i.e. you likely wouldn’t be able to reconstruct the system state prior to the particles crossing the event horizon, it would solve the paradox.
The idea might not be as radical as you first think as other researchers in the area, like Gerard ’t Hooft (who was present at the conference where Hawking presented this idea), have been exploring similar concepts in the same vein. There’s definitely a lot of research to be done in this area, mostly to see whether or not the idea can be supported by current models or whether it warrants fundamental changes. If the idea holds up to further scrutiny then it’ll solve one of the most perplexing properties of black holes but there are still many more that await.
Space history of the past few decades is dominated by the Space Shuttle. Envisioned as a revolution in access to space it was designed to be launched numerous times per year, dramatically reducing the costs of access to space. The reality was unfortunately not in line with the vision as the numerous design concessions made, coupled with the incredibly long average turnaround time for missions, meant that the costs far exceeded that of many other alternative systems. Still it was an iconic craft, one that several generations will point to as the one thing they remember about our trips beyond our atmosphere. What few people realise though is that there was potential for the shuttle to have a Russian sister and her name was Buran.
The Buran project started in 1974, only 5 or so years after the Space Shuttle program was kicked off by NASA. The goals of both projects were quite similar in nature, both aiming to develop a reusable craft that could deliver satellites, cosmonauts and other cargo into orbit. Indeed when you look at the resulting craft, one of which is shown above in its abandoned complex at the Baikonur Cosmodrome, the similarities are striking. It gets even more interesting when you compare their resulting specifications as they’re almost identical with only a meter or two difference between them. Of course under the hood there’s a lot of differences, especially when it comes to the primary purpose of the Buran launch system.
The propulsion system of the Buran differed significantly from the Shuttle’s, with its strap-on boosters burning liquid propellant (kerosene and liquid oxygen) rather than solid rocket fuel. There are advantages to this, chief among them being able to shut down the engines once you start them (something solid rocket boosters can’t do), however these boosters were not designed to be reusable, unlike their Shuttle counterparts. This meant that the only reusable part of the Buran launch system was the orbiter itself, which would increase the per-launch cost. Additionally the Buran included a fully autonomous flight control system from the get go, something the Shuttle only received during an upgrade later in its life.
That last part is somewhat telling of Buran’s true purpose as, whilst it could service non-military goals, it was primarily developed to serve the military interests of Russia (then the Soviet Union). Indeed the winged profile of the craft enables many mission profiles that are simply of no interest to non-military agencies, and having it fully autonomous from the get go shows it was meant more for conflict than research. Indeed when commenting on the programme’s cancellation a Russian cosmonaut noted that the Buran had no civilian tasks planned for it and, without military requirements to fuel the programme, it was cancelled.
That was not before it saw numerous test flights, including a successful orbital test flight. The achievements that the Buran made during its single flight are not to be underestimated as it was the first craft to perform such a flight fully unmanned and to make a fully automated landing. That latter feat is even more impressive when you consider that there was a very strong crosswind, some 60 kilometers per hour, and it managed to land mere meters off its originally intended mark. Indeed had Russia continued development of the Buran shuttle there’s every chance that it would have been a much more advanced version of its American sister for a very long time.
Today however the Buran shuttles and their various test components lie scattered around the globe in varying states of disrepair and decay. Every so often rumours about a resurrection of the program surface, however it’s been so long since the program was in operation that such a program would only share the name and little more. Russia’s space program has continued on to great success however, their Soyuz craft becoming the backbone of many of humanity’s endeavours in space. Whilst the Buran may never have become the icon for space that its sister Shuttle did it remains the highly advanced concept that could have been, a testament to the ingenuity and capability of the Russian space program.
When it comes to exoplanets the question that I often hear asked is: why are they all largely the same? The answer lies in the methods that we use for detecting exoplanets, one of the most successful of which is observing the gravitational pull that planets exert on their host stars. This method requires that a planet make a full orbit around its parent star during our observation period, which means that many go unnoticed, requiring observation times far beyond what we’re currently capable of. However there are new methods which are beginning to bear fruit, with one of the most recent discoveries being a planet called 51-Eridani-b.
Unlike most other exoplanets, whose presence is inferred from the data we gather on their parent star, 51-Eridani-b is the smallest exoplanet that we’ve ever imaged directly. Whilst we didn’t get anything like the artist’s impression above it’s still quite an achievement as planets are usually many orders of magnitude dimmer than their parent stars. This makes directly imaging them incredibly difficult however this new method, which has been built into a device called the Gemini Planet Imager, allows us to directly image a certain type of exoplanet. The main advantage of this method is that it does not require a lengthy observation time to produce results although like other methods it also has some limitations.
The Gemini Planet Imager was built for the Gemini South Telescope in Chile, the sister telescope of the more famous Gemini North Telescope in Hawaii. Essentially it’s an extremely high contrast imager, one that’s able to detect a planet that’s one ten-millionth as bright as its parent star. Whilst this kind of sensitivity is impressive even it can’t detect Earth-like planets around a star similar to our sun. Instead the planets it’s likely to detect are young Jupiter-like planets which, still hot from their formation, are far more luminous than a mature planet. This is exactly what 51-Eridani-b is, a fiery young planet orbiting a star that’s about 5 times as bright as our own.
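To get a feel for just how extreme a one-ten-millionth contrast ratio is, it helps to convert it into the astronomical magnitude scale, where every 5 magnitudes is a factor of 100 in brightness:

```python
import math

# Convert a brightness ratio into a magnitude difference:
# delta_m = -2.5 * log10(flux_planet / flux_star)
contrast = 1e-7  # planet is one ten-millionth as bright as its star
delta_mag = -2.5 * math.log10(contrast)
print(f"The star outshines the planet by {delta_mag:.1f} magnitudes")  # 17.5
```

For comparison, 17.5 magnitudes is roughly the gap between the full moon and a faint star at the limit of a decent amateur telescope.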
Equally as impressive is the technology behind the Gemini Planet Imager which enables it to directly image planets like this. The first part is a coronagraph, a specially designed interference device which allows us to block out the majority of a parent star’s light. Behind that is a set of adaptive optics, essentially a set of tiny mirrors that can make micro-adjustments in order to counteract atmospheric distortions. It has to do this since, unlike space based telescopes, there’s a lot of turbulent air between us and the things we want to look at. These mirrors, which are deformable at the micro level using MEMS, are able to do this with incredible precision.
With the successful discovery of 51-Eridani-b I’m sure further discoveries won’t be far off. Whilst the Gemini Planet Imager might only be able to discover a certain type of planet it does prove that the technology platform works. This then means that improvements can be made, expanding its capabilities further. I have no doubt that future versions of this technology will be able to directly image smaller and smaller planets, one day culminating in a direct image of an Earth-like planet around a sun-like star. That, dear reader, will be a day for the history books and it all began here with 51-Eridani-b.
You’d think that long duration space travel was something of a solved problem, given the numerous astronauts who’ve spent multiple months aboard the International Space Station. For some aspects of space travel this is correct but there are still many challenges facing astronauts who’d venture deeper into space. One of the biggest is radiation shielding: whilst we’ve been able to keep people alive in orbit they’re still under the protective shield of the Earth’s magnetic field. For those who go outside that realm the dangers of radiation are very real and currently we don’t have a good solution for dealing with them. The solution to this problem could come out of research being done at CERN using a new type of superconducting material.
The material is called magnesium diboride (MgB₂) and is currently being used as part of the LHC High Luminosity Cold Powering project. MgB₂ has the desirable property of having the highest critical temperature (the point below which it becomes superconducting) of any conventional superconducting material, some −234°C, about 40 degrees above absolute zero. Compared to other conventional superconductors this is a much easier temperature to work with as others usually only become superconducting at around 11 degrees above absolute zero. At the same time the material is relatively easy and inexpensive to create, making it an ideal substance to investigate for other applications. In terms of applications in space the Superconductors team at CERN is working with the European Space Radiation Superconducting Shield (SR2S) project, which is looking at MgB₂ as a potential basis for a superconducting magnetic shield.
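Those temperatures are easier to compare on the Kelvin scale, where absolute zero sits at 0 K. A trivial conversion check:

```python
def celsius_to_kelvin(t_celsius: float) -> float:
    """Convert Celsius to Kelvin (absolute zero is -273.15 degC = 0 K)."""
    return t_celsius + 273.15

mgb2_tc = celsius_to_kelvin(-234)  # MgB2 critical temperature
print(f"MgB2 is superconducting below ~{mgb2_tc:.0f} K, "
      "versus roughly 11 K for typical conventional superconductors.")
```

A shield operating at ~39 K rather than ~11 K can get by with far simpler cryogenics, which is the whole attraction for spacecraft use.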
Of the numerous solutions that have been proposed to protect astronauts from cosmic radiation during long duration space flight a magnetic shield is one of the few that has shown promise. Essentially it would recreate the kind of magnetic field that’s present on Earth, deflecting harmful cosmic rays away from the spacecraft. In order to generate a field large and strong enough to do this however we’d have to rely on superconductors, which introduces a lot of complexity. An MgB₂ based shield, with its higher critical temperature, could achieve the required field with far fewer demands on cooling and power, both of which are at a premium on spacecraft.
There’s still a lot of research to go between now and a working prototype however the SR2S research team has a good roadmap for taking the technology from the lab to the real world. The coming months will focus on quantifying what kind of field they can produce with a prototype coil, demonstrating the kinds of results they can expect. From there it will be a matter of scaling it up and working out all the parameters required for operation in space, like power draw and cooling requirements.
It’s looking good for a first generation shield of this nature to be ready in time for the first scheduled long duration flights, something which will be a necessity for those kinds of missions. Indeed I believe this research will pave the way for the numerous private space companies and spacefaring nations who have set their sights beyond earth orbit.