Traditional computing is bound to binary data, the world of zeroes and ones. This constraint was originally born out of an engineering limitation: it ensured the two states could be easily represented by differing voltage levels. That hasn’t proved to be much of a limiting factor in computing’s progress, but there are styles of computing that make use of more than just those zeroes and ones. The most notable is quantum computing, which can represent an exponential number of states depending on the number of qubits (analogous to transistors) the quantum chip has. Whilst there have been some examples of quantum computers hitting the market, even if their quantum-ness is still in question, they are typically based on exotic materials, meaning mass production of them is tricky. This could change with the latest research to come out of the University of New South Wales, as they’ve made an incredible breakthrough.
Back in 2012 the team at UNSW demonstrated that they could build a single qubit in silicon. This by itself was an amazing discovery, as previously created qubits were usually reliant on materials like niobium cooled to superconducting temperatures to achieve their quantum state. However a single qubit isn’t exactly useful on its own, and so the researchers set about getting their qubits talking to each other. This is a lot harder than you’d think, as qubits don’t communicate in the same way that regular transistors do, so traditional techniques for connecting things in silicon won’t work. After three years’ worth of research UNSW’s quantum computing team has finally cracked it, allowing two qubits made in silicon to communicate.
This has allowed them to build a quantum logic gate, the fundamental building block for a larger scale quantum computer. One thing that will be interesting to see is how their system scales out with additional qubits. It’s one thing to get two qubits talking to each other, and indeed there have been several (non-silicon) examples of that in the past, but as you scale up the number of qubits things get a lot more difficult. This is because larger numbers of qubits are more prone to quantum decoherence and typically require additional circuitry to overcome it. Whilst they might be able to mass produce a chip with a large number of qubits, it might not be of any use if the qubits can’t stay in coherence.
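To make the “logic gate” idea a bit more concrete, here’s a minimal classical sketch (plain Python, nothing specific to UNSW’s hardware) of what the canonical two-qubit gate, the CNOT, does to a register’s amplitudes. A two-qubit state is just four complex amplitudes over |00⟩, |01⟩, |10⟩ and |11⟩:

```python
import math

def apply_cnot(state):
    """CNOT flips the target (second) qubit whenever the control (first)
    qubit is 1 -- i.e. it swaps the |10> and |11> amplitudes."""
    a00, a01, a10, a11 = state
    return [a00, a01, a11, a10]

# Start with the control qubit in superposition: (|00> + |10>) / sqrt(2)
h = 1 / math.sqrt(2)
state = [h, 0.0, h, 0.0]

entangled = apply_cnot(state)
print(entangled)  # [h, 0, 0, h] -- the Bell state (|00> + |11>) / sqrt(2)
```

Note that the output can no longer be described as two independent qubits: that’s exactly why getting two qubits “talking” via a gate, rather than via a wire, is the building block that matters.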
It will be interesting to see what applications their particular kind of quantum chip will have once they build a larger scale version of it. Currently the commercially available quantum computers from D-Wave are limited to a specific problem space called quantum annealing and, as yet, have failed to conclusively prove that they’re achieving a quantum speedup. The problem is larger than just D-Wave, however, as there is still some debate about how we classify quantum speedup and how to properly compare it to more traditional methods. Still, this is an issue that UNSW’s potential future chip will have to face should it come to market.
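For a rough sense of the problem space quantum annealers target: they search for the lowest-energy configuration of a set of coupled spins (an Ising model). The fields and couplings below are made-up values purely for illustration; classically we can brute-force a tiny instance, which is the baseline any claimed quantum speedup has to beat:

```python
from itertools import product

def ising_energy(spins, h, J):
    """Ising energy: E = sum_i h_i*s_i + sum_{i<j} J_ij*s_i*s_j,
    where each spin s_i is +1 or -1."""
    e = sum(h[i] * s for i, s in enumerate(spins))
    e += sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
    return e

# Hypothetical 3-spin problem (illustrative numbers only)
h = [0.5, -1.0, 0.2]
J = {(0, 1): -1.0, (1, 2): 0.8}

# An annealer seeks the ground state; classically we enumerate all 2^n spins.
best = min(product([-1, 1], repeat=3), key=lambda s: ising_energy(s, h, J))
print(best, ising_energy(best, h, J))  # (1, 1, -1) -2.5
```

The catch, of course, is that the brute-force search grows as 2^n, which is why a genuine quantum speedup on large instances would matter so much.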
We’re still a long way off from seeing a generalized quantum computer hit the market, but achievements like those coming out of UNSW are crucial in making them a reality. We have a lot of investment in developing computers on silicon, and if those investments can be directly translated to quantum computing then it’s highly likely we’ll see a lot of success. I’m sure the researchers will have several big chip companies knocking down their doors to license this tech, as it really does have a lot of potential.
When a technology company doesn’t get a whole lot of press it usually means one of two things: either it isn’t that interesting and no one really cares about it or, and this doesn’t happen often, it simply doesn’t want or need the attention. Conversely if a product is a dismal failure it’s usually guaranteed to get a whole bunch of the wrong type of attention, especially with the Internet’s bent towards schadenfreude. With that in mind it made me wonder why I hadn’t heard more about D-Wave since I last wrote about them around this time last year, especially considering that Lockheed Martin had bought one of their D-Wave One systems a year prior to that.
Turns out they probably don’t really need the press as they’re doing just fine:
VANCOUVER — When the world’s largest defence contractor reportedly paid $10 million for a superfast quantum computer, the Burnaby, B.C., company that built it earned a huge vote of confidence.
Two years after Lockheed Martin acquired the first commercially viable quantum computer from D-Wave Systems, the American aerospace and technology giant is once again throwing its weight behind a technology many thought was still the stuff of science fiction.
You’d be forgiven for thinking that this was just old news resurfacing 2 years later, but it isn’t: Lockheed Martin just purchased a D-Wave Two, their latest and greatest quantum computing offering. Details are a little scant as to what is actually in the latest system, but going off their product road map it’s likely to be some variant of their Vesuvius chip, which contains 512 qubits. That’s four times the number of qubits in their previous system, which would make it substantially more powerful, and all for the same cost as the first unit they sold.
In my quest to find a little more information about their new system I stumbled across this page, which digs into the underlying architecture of the D-Wave One/Two systems. Back when I first wrote about D-Wave they weren’t exactly forthcoming with this kind of information, which drew them a considerable amount of criticism, but since then many of their loudest critics have renounced their positions. Interestingly though, and feel free to correct me if I’m interpreting this wrong, whilst they do claim to have produced a functioning qubit they haven’t managed to entangle several of them together. This doesn’t make their system useless (single qubits daisy-chained together will still be useful for some specific functions), but it does mean the exponential scaling doesn’t really apply to D-Wave’s style of quantum computer. I could be wrong about this, but their explanation only mentions entanglement-like properties in the qubit section, with the interconnecting grids only being used to “exchange information”, not to provide multi-qubit entanglement.
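For what entanglement actually means in the simplest case: a two-qubit pure state factors into two independent qubits exactly when the determinant of its amplitudes vanishes; a nonzero determinant means the qubits are entangled. That’s a standard textbook test, nothing D-Wave specific, but it makes the distinction I’m drawing above concrete:

```python
import math

def is_entangled(state, tol=1e-12):
    """For a two-qubit pure state [a00, a01, a10, a11], the state is a
    product of two independent qubits iff a00*a11 - a01*a10 == 0.
    Anything else is entangled."""
    a00, a01, a10, a11 = state
    return abs(a00 * a11 - a01 * a10) > tol

h = 1 / math.sqrt(2)
print(is_entangled([h, 0, h, 0]))  # False: (|0> + |1>) x |0> is a product state
print(is_entangled([h, 0, 0, h]))  # True: the Bell state can't be factored
```

The first state above is exactly the “daisy-chained single qubits” situation: each qubit does its own quantum thing, but there’s no joint state you couldn’t describe one qubit at a time.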
That doesn’t make it any less cool, however, as I’m sure that as they continue to scale up their processors they’ll eventually start entangling more qubits together, which will increase their computational power exponentially. We won’t see consumer level processors using technology like this for a long time though, as they’re akin to CUDA cores on graphics cards: highly specialized computational units that excel at their task and not so much at general computing. Still, D-Wave’s systems signal the beginning of the quantum computing era, and that means it’s only a matter of time before we see them everywhere.
Sometimes I feel like quantum computing is like cold fusion: there’s been a lot of theoretical work around how it could possibly be done, and there have even been a few people peddling devices that claim to do exactly what the research says, but the claims have never been quite substantiated. After posting about D-Wave selling one of their quantum computers in the middle of last year I did some further research and found that whilst they might’ve created a qubit (like many have done before them), their 128 qubit computer had not yet been verified as being a 128 entangled-qubit computer, a critical difference between quantum and classical computing.
However news came to me today of another possible advancement in the world of quantum computing. Physicists at the National Institute of Standards and Technology (NIST) have created a simulator capable of replicating a quantum computer with hundreds of qubits. This is an order of magnitude higher than other experiments have managed (classical simulation of quantum computers has been limited to around 30 qubits), and the amount of computing power available with that many qubits is some 10^80 times greater than current processor technology. Critically, the simulator is also able to replicate quantum entanglement between qubits, the bugbear that D-Wave has yet to put to rest. What’s really special about this simulator is that it lets researchers alter properties in ways they couldn’t with regular solids, something which will allow them to gain further insight into how to craft qubits for use in general computing.
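That ~30 qubit limit for classical simulation falls straight out of memory requirements: simulating n qubits exactly means storing 2^n complex amplitudes. A quick back-of-the-envelope sketch (assuming 16 bytes per double-precision complex amplitude):

```python
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory for a full state vector: 2**n complex amplitudes,
    16 bytes each for double-precision complex numbers."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 40, 50):
    gib = statevector_bytes(n) / 2**30
    print(f"{n} qubits -> {gib:,.0f} GiB")
```

At 30 qubits you need 16 GiB, which a well-equipped machine can hold; every additional qubit doubles that, so 40 qubits is already 16 TiB and 50 qubits 16 PiB. That’s why a physical simulator with hundreds of qubits is such a leap over anything a classical computer can emulate.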
Whilst this is unequivocally a major advancement in the field of quantum computing we’re still a long way off from seeing a working device based on those principles. Like many of its quantum computing brethren the NIST device still requires exotic cooling solutions to work (much as D-Wave’s computer requires liquid helium), relegating it solely to the lab for the time being. Additionally the NIST device isn’t much of a quantum computer at the moment, just a device which allows us to simulate what a quantum computer would be like. What this means is that for now we can’t really run any kind of computation on it; we can only explore how varying the properties affects things like the entanglement or coherence of the qubits. Such a device is still crucial to advancing the field of quantum computing, but it’s still a ways off from practical usage.
Thinking about it more, there is one key difference between cold fusion and quantum computing: the latter has seen great progress (both theoretical and practical) whilst the former has not. Quantum computing is more akin to regular fusion in that way, being an incredibly complicated problem that we’ve demonstrated is possible but that will take decades to perfect. The further we get into these fields the quicker the revelations come, and I have no doubt that the next couple of decades will see amazing advances in both.
All too often it seems we’re presented with technologies that are always at least a decade away from any practical application. It’s a symptom of a bigger problem: research struggles to continue on ideas that don’t have the potential to produce something useful (read: profitable), and so we’re often introduced to an idea long before it becomes reality precisely so that it can become a reality. Every so often, however, one of these technologies makes it past that crucial line and finds its way into the real world, and the latest of them is quantum computing.
I can’t remember when I was first introduced to the idea behind quantum computers, but I’ve been fascinated with them for quite a while. Quantum computers differ from regular computers in that their system is made up of qubits, which can hold a 0, a 1, or a quantum superposition of both. What this means is that, compared to traditional computers which can only be in one state at any one time, a quantum computer can be in any number of the possible states simultaneously. Thus as the number of qubits increases the amount of computing power available increases exponentially, theoretically out-pacing any conventional processor. Such computers would have a big impact on tasks that require large amounts of raw computing power, like chemical reaction simulations and breaking encryption schemes.
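The exponential scaling can be seen directly in a toy state-vector picture (a sketch of the maths, not of how real hardware works): putting every qubit of a register into an equal superposition yields a state that spans all 2^n basis states at once.

```python
import math

def uniform_superposition(n_qubits):
    """An equal superposition over every basis state of an n-qubit
    register: 2**n amplitudes, each 1/sqrt(2**n)."""
    dim = 2 ** n_qubits
    amp = 1 / math.sqrt(dim)
    return [amp] * dim

state = uniform_superposition(3)
print(len(state))                  # 8 basis states from just 3 qubits
print(sum(a * a for a in state))   # probabilities sum to (approximately) 1
```

Three qubits give 8 simultaneous states, ten give 1,024, and fifty give over a quadrillion; the catch, which the toy picture hides, is that a measurement only ever hands back one of them, which is why quantum algorithms are the hard part.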
I can remember reading about the first public demonstration of a quantum computer back in 2007 from a company called D-Wave. At the time I wasn’t 100% sure if they’d actually created what they claimed they had, as there was a bit of controversy on the matter, but re-reading the articles today showed that the criticism was more leveled at the claims D-Wave was making about their system’s capabilities rather than it actually being a quantum computer. I really hadn’t followed the subject much until D-Wave popped up in my feed reader last week with the bold claim that you could now buy yourself a 128 qubit quantum computer:
Whether or not D-Wave has actually built a quantum computer is still a matter of debate (though, a study authored by the company and published in Nature claims to prove its success) but, whatever it is these crafty Canadians have created, you can order one now and start crunching qubits with abandon. The D-Wave One is the first commercially available quantum computer and, while its 128-qubit processor can only handle very specific tasks and is easily outperformed by traditional CPUs, it could represent a revolution in the field of supercomputing.
Now it’s one thing to put something up for sale and another to actually deliver that product. Not many people (or companies for that matter) would have the $10 million required to purchase one of these systems handy, so initially it looked like it would be a while before we actually saw one sold. Yet not a week later D-Wave announced that they had sold their very first quantum computing system to Lockheed Martin, with details around the purpose of the system remaining secret for now. To say that I’m surprised they managed to sell one that quickly would be putting it lightly, but it’s also telling of what they’ve accomplished.
Lockheed Martin, whilst no slouch when it comes to taking bets on new technology, wouldn’t be sinking $10 million into a product that wouldn’t deliver as advertised. Nor would they want D-Wave to publicly announce the sale if they risked looking the fool should the quantum computer turn out to be not so quantum. On the surface, then, it would seem that D-Wave is the real deal, their 128 qubit system is the first commercially available quantum computer, and we could be on the cusp of yet another computing power explosion.
So does this mean that in the next few years we’ll see quantum computers at the consumer level? Probably not: in their current state they require liquid helium cooling, and we’re still many advances away from making quantum computers in the form factors we’re all familiar with. It does mean, however, that we’re close to being able to solve harder problems more rapidly, and subsequent improvements to D-Wave’s systems will make such tasks exponentially easier again. With Lockheed Martin basically vouching for D-Wave’s credibility I’m very interested to see who lines up next for a quantum computer, especially if they’re going to be a bit more open about what they plan to do with it.