The past couple of decades have seen the rise of a burgeoning private space industry, one dominated by companies founded by entrepreneurs who made their original fortunes in industries that couldn’t have been more different. What they’ve accomplished in that time has been staggering, making the long-standing giants of this industry look archaic by comparison. Those incumbents have been kept afloat by their track record of delivering in fields the newcomers can’t yet service, but the time is fast approaching when even those golden tickets will be up for auction. At least one company doesn’t appear to be resting on its laurels, however, with United Launch Alliance, a partnership between Lockheed Martin and Boeing, announcing its cut-price launch system called Vulcan.
As the banner’s imagery alludes to, ULA’s Vulcan is an all-American vehicle, ditching the Russian-built engines that have been the mainstay of their rockets for quite a while now. Those engines have caused some consternation of late, as the USA tries to wean itself off its reliance on Russia for access to space, not to mention the well-publicized failures of a few choice engines. It’s hardly a surprising move given that many other US-based companies are looking to bring their manufacturing back on-shore, both for quality control and for publicity purposes. Regardless of where it’s made, though, what will really define this rocket is how it performs and how much it will cost.
ULA has said that the Vulcan will follow in the footsteps of the Delta-IV, offering multiple configurations from medium-lift all the way up to heavy-lift. This will be achieved through different sized payload fairings as well as additional strap-on solid rocket boosters, allowing the rocket to be configured to match the payload it’s delivering into orbit. ULA is being rather coy about the range of payloads that Vulcan will be able to service; however, if it’s anything like the system it will ultimately be replacing, it will be a direct competitor to the future Falcon Heavy. At this point I’d usually make a quip about the SpaceX equivalent being vastly cheaper, but ULA is aiming for a street price of $100 million per launch, which isn’t too far off SpaceX’s projected price for their craft.
This rather extraordinary drop in price (down from some $350 million for a comparable launch on the Delta-IV) comes on the back of making the Vulcan reusable, eliminating much of the cost of rebuilding a rocket from scratch for every launch. However, unlike the fully reusable systems that SpaceX and others are pursuing (which, unfortunately, suffered another failure today), ULA is taking a piecemeal approach to reusability, with the first step being a mid-air recovery of the engine section using a helicopter. Considering that the engines are among the most expensive components on a rocket, recovering them makes sense and, potentially, has a higher chance of succeeding than other approaches currently do.
It’s good to see that the private space industry has been able to put some pressure on the long-standing giants, forcing them to innovate or be pushed out of space completely. Whilst Vulcan might still be quite a few years away from its first launch, it shows that ULA recognise the position they’re in and are willing to compete their way out of it. Hopefully we’ll see more details on the actual specifications of this craft soon; depending on the different configurations (and their potential costs), it could even prompt SpaceX to rethink their approach. An innovation war between those two giants can only mean great things for the space industry as a whole and, by extension, us as potential space-faring beings.
There’s little question in my mind that the future of energy production on Earth lies with fusion. No other energy source can produce power on the same scale, nor sustain it over such extended periods of time. Of course, the problem is that fusion, especially the net-energy-positive kind, is incredibly hard to achieve. So much so that in the many decades of research so far no one has built a device capable of sustained power output, and the one project that might, ITER, is decades behind schedule. Thus you can imagine my scepticism when I hear that Lockheed Martin expects to have a device operable in 10 years, with it being widely available in 20 (snicker).
The Compact Fusion project comes out of Lockheed Martin’s Skunk Works, the labs that delivered the likes of the venerable SR-71. They’re a highly secretive bunch, which is why this announcement, along with a rather well designed website, has attracted quite a bit of interest; typically any project of theirs that might not deliver won’t see the light of day. Thus you’d assume that Lockheed Martin has some level of confidence in the project, especially when they’re committing to delivering the first of these devices to the military in the not too distant future. Indeed, if their timelines are to be believed, they could even beat ITER to the punch, which would be a massive coup if they pulled it off.
Their design has the entire reactor fitting on the back of a truck (although the size of said truck is debatable, it looks to be the size of a tanker), which is an order of magnitude smaller than other commercial fusion reactor designs. This is somewhat perplexing, as magnetic confinement designs like ITER’s tokamak gain fusion power rapidly with increased plasma volume. There are limits to this, of course, but it also means that the 100 MW figure they’re quoting, which is 20% of what ITER will produce, comes with its own set of problems which I don’t believe have good solutions yet.
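To put some rough numbers on why the size is perplexing: ITER is targeting around 500 MW of fusion power from roughly 840 m³ of plasma, while a truck-sized device implies squeezing 100 MW into a tiny fraction of that volume. The plasma volume I’ve assumed for the compact reactor below is purely my own guess, since no official figure has been published, so treat this as an order-of-magnitude sketch only:

```python
# Back-of-the-envelope power density comparison. ITER's figures are
# public design targets; the compact reactor's plasma volume is an
# assumption on my part (nothing official exists yet).

ITER_POWER_MW = 500        # ITER's target fusion power
ITER_PLASMA_VOL_M3 = 840   # ITER's design plasma volume

COMPACT_POWER_MW = 100     # Lockheed Martin's quoted figure
COMPACT_PLASMA_VOL_M3 = 10 # assumed: a truck-sized device holds far less

iter_density = ITER_POWER_MW / ITER_PLASMA_VOL_M3
compact_density = COMPACT_POWER_MW / COMPACT_PLASMA_VOL_M3

print(f"ITER:    {iter_density:.2f} MW/m^3")
print(f"Compact: {compact_density:.2f} MW/m^3")
print(f"Ratio:   {compact_density / iter_density:.0f}x")
```

Even with a fairly generous 10 m³ assumption, the implied power density comes out more than an order of magnitude beyond ITER’s, and dealing with that heat flux in a small vessel is exactly the kind of problem I don’t think has a good solution yet.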
Indeed, whilst the project will be standing on the shoulders of the numerous giants that have come before it, there are still a lot of fundamental challenges standing between them and a working reactor 5 years down the line. Should they achieve that goal, however, it would be the start of a new age for humanity, one where even our wildest energy demands could be met by these clean-running fusion reactors. The possibilities that something like this would open up are immense, but the long-running joke that fusion is always 20 years away still rings true with Lockheed Martin’s compact reactor project. I would love for my scepticism to be proven wrong on this, as a fusion-powered future is something humanity desperately needs, but it’s always been just out of our reach and I’m afraid it will continue to be for at least a while longer.
When a technology company doesn’t get a whole lot of press it usually means one of two things: either it isn’t that interesting and no one really cares about it or, and this doesn’t happen often, the company simply doesn’t want or need the attention. Conversely, if a product is a dismal failure it’s usually guaranteed to get a whole bunch of the wrong type of attention, especially given the Internet’s bent towards schadenfreude. With that in mind, it made me wonder why I hadn’t heard more about D-Wave since I last wrote about them around this time last year, especially considering that Lockheed Martin had bought one of their D-Wave One systems a year prior to that.
Turns out they probably don’t really need the press as they’re doing just fine:
VANCOUVER — When the world’s largest defence contractor reportedly paid $10 million for a superfast quantum computer, the Burnaby, B.C., company that built it earned a huge vote of confidence.
Two years after Lockheed Martin acquired the first commercially viable quantum computer from D-Wave Systems, the American aerospace and technology giant is once again throwing its weight behind a technology many thought was still the stuff of science fiction.
You’d be forgiven for thinking that this was just old news resurfacing two years later, but it isn’t: Lockheed Martin has just purchased a D-Wave Two, their latest and greatest quantum computing offering. Details are a little scant as to what is actually in the latest system, but going off D-Wave’s product road map it’s likely to be some variant of their Vesuvius chip, which contains 512 qubits. That’s four times the number of qubits in their previous system, which should make it substantially more powerful, and all for the same cost as the first unit they sold.
In my quest to find a little more information about their new system I stumbled across this page, which digs into the underlying architecture of the D-Wave One/Two systems. Back when I first wrote about D-Wave they weren’t exactly forthcoming with this kind of information, which drew them a considerable amount of criticism, but since then many of their loudest critics have walked back their positions. Interestingly though, and feel free to correct me if I’m interpreting this wrong, whilst they claim to have produced a functioning qubit they don’t appear to have entangled several of them together. This doesn’t make the system useless, as single qubits daisy-chained together are still useful for some specific functions, but it does mean that the exponential scaling doesn’t really apply to D-Wave’s style of quantum computer. I could be wrong about this, but their explanation only mentions entanglement-like properties in the qubit section, with the interconnecting grids only being used to “exchange information”, not to provide multi-qubit entanglement.
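For context on what those coupled qubits actually compute: D-Wave’s machines are quantum annealers, which search for low-energy configurations of an Ising model, where the per-spin biases and pairwise couplings encode the problem. The toy instance below (the biases and couplings are made-up numbers of my own) shows the classical brute-force version of the same task:

```python
from itertools import product

# Toy Ising problem: a quantum annealer settles toward low-energy spin
# configurations of exactly this kind of model, just with hundreds of
# spins instead of four. These h/J values are an arbitrary example.

h = {0: 1.0, 1: -0.5, 2: 0.3, 3: -1.0}         # per-spin biases
J = {(0, 1): -1.0, (1, 2): 0.5, (2, 3): -0.7}  # pairwise couplings

def energy(spins):
    """Ising energy: E = sum_i h_i*s_i + sum_(i,j) J_ij*s_i*s_j."""
    e = sum(h[i] * s for i, s in enumerate(spins))
    e += sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
    return e

# Brute force over all 2^4 spin assignments; an annealer aims to find
# this minimum without enumerating the exponential search space.
best = min(product([-1, 1], repeat=4), key=energy)
print(best, energy(best))
```

This also illustrates why a lack of true multi-qubit entanglement matters less for this architecture than it would for a gate-model machine: the hardware only needs the spins to influence each other enough to relax into a good low-energy state.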
That doesn’t make it any less cool, however, as I’m sure that as they continue to scale up their processors they’ll eventually start entangling more qubits together, which will increase their computational power exponentially. We won’t see consumer-level processors using technology like this for a long time though, as they’re akin to CUDA units on graphics cards: highly specialized computational units that excel at their task and not so much at general computing. Still, D-Wave’s systems signal the beginning of the quantum computing era, and that means it’s only a matter of time before we see them everywhere.
All too often it seems we’re presented with technologies that are always at least a decade away from any practical application. It’s a symptom of a bigger problem, one of research struggling to continue on ideas that don’t have the potential to produce something useful (read: profitable), and so we’re often introduced to an idea long before it becomes reality precisely so that it can become a reality. However, every so often one of these technologies makes it past that crucial line and finds its way into the real world, the latest being quantum computing.
I can’t remember when I was first introduced to the idea behind quantum computers, but I’ve been fascinated with them for quite a while. Quantum computers differ from regular computers in that they’re made up of qubits, which can hold a 0, a 1, or a quantum superposition of both. This means that where a traditional computer can only be in one state at any one time, a quantum computer can be in any number of its possible states simultaneously. Thus, as the number of qubits increases, the amount of computing power available increases exponentially, theoretically out-pacing any conventional processor. Such computers would have big impacts on tasks that require large amounts of raw computing power, like chemical reaction simulations and breaking encryption schemes.
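The exponential scaling is easy to see if you write down what a classical computer would need just to track a quantum state: n qubits require 2^n complex amplitudes. A minimal sketch in plain Python (no quantum libraries, just the bookkeeping):

```python
import math

def uniform_superposition(n_qubits):
    """Amplitudes for n qubits in an equal superposition of all 2**n
    basis states (what applying a Hadamard to each qubit produces).
    Each basis state is measured with probability amp**2 = 1/2**n."""
    dim = 2 ** n_qubits
    amp = 1 / math.sqrt(dim)
    return [amp] * dim

# The classical cost of merely *storing* the state grows exponentially:
for n in (1, 10, 30):
    print(f"{n:>2} qubits -> {2**n:,} amplitudes to track")
```

At 128 qubits the state vector would have 2^128 entries, far beyond any classical memory, which is exactly why people get excited about machines that hold such a state natively.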
I can remember reading about the first public demonstration of a quantum computer back in 2007, from a company called D-Wave. At the time I wasn’t 100% sure they’d actually created what they claimed, as there was a bit of controversy on the matter, but re-reading the articles today showed that the criticism was levelled more at the claims D-Wave was making about their system’s capabilities than at it actually being a quantum computer. I really hadn’t followed the subject much until D-Wave popped up in my feed reader last week with the bold claim that you could now buy yourself a 128 qubit quantum computer:
Whether or not D-Wave has actually built a quantum computer is still a matter of debate (though, a study authored by the company and published in Nature claims to prove its success) but, whatever it is these crafty Canadians have created, you can order one now and start crunching qubits with abandon. The D-Wave One is the first commercially available quantum computer and, while its 128-qubit processor can only handle very specific tasks and is easily outperformed by traditional CPUs, it could represent a revolution in the field of supercomputing.
Now, it’s one thing to put something up for sale and another to actually deliver that product. Not many people (or companies, for that matter) have the $10 million required to purchase one of these systems handy, so initially it looked like it would be a while before we actually saw one sold. Yet not a week later D-Wave announced that they had sold their very first quantum computing system to Lockheed Martin, with details around the purpose of the system remaining secret for now. To say that I’m surprised they managed to sell one that quickly would be putting it lightly, but it’s also telling of what they’ve accomplished.
Lockheed Martin, whilst no slouch when it comes to taking bets on new technology, wouldn’t sink $10 million into a product that wouldn’t deliver as advertised. Nor would they want D-Wave to publicly announce the sale if there were a risk of them looking the fool should the quantum computer turn out to be not so quantum. On the surface, then, it would seem that D-Wave is the real deal, their 128 qubit system is the first commercially available quantum computer, and we could be on the cusp of yet another computing power explosion.
So does this mean that in the next few years we’ll see quantum computers at the consumer level? Probably not: in their current state they require liquid helium cooling, and we’re still many advances away from making quantum computers in form factors we’re all familiar with. It does mean, however, that we’re close to being able to solve harder problems more rapidly, and subsequent improvements to D-Wave’s systems will make such tasks exponentially easier again. With Lockheed Martin essentially vouching for D-Wave’s credibility, I’m very interested to see who lines up next for a quantum computer, especially if they’re going to be a bit more open about what they plan to do with it.