

D-Wave 2X Finally Demonstrates Quantum Speedup.

The possibilities that emerge from a true quantum computer are to computing what fusion is to energy generation. It's a field of active research, one in which many scientists have spent their lives, yet the promised land still seems to elude us. Just like fusion, though, quantum computing has seen several advancements in recent years, enough to show that it is achievable without giving us a concrete idea of when it will become commonplace. The current darling of the quantum computing world is D-Wave, the company that announced it had created functioning qubits many years ago and set about commercializing them. However, they were unable to show substantial gains over simulations on classical computers for numerous problems, calling into question whether or not they'd actually created what they claimed to. Today, however, brings us results that demonstrate quantum speedup, on the order of 10^8 times faster than regular computers.

D-Wave 2X

For a bit of background, the D-Wave 2X (the device pictured above, and the one which showed quantum speedup) can't really be called a quantum computer, even though D-Wave calls it that. Instead it's what you'd call a quantum annealer, a specific kind of computing device that's designed to solve very specific kinds of problems. This means it's not a Turing-complete device: it's unable to tackle the wide range of computing tasks which we'd typically expect a computer to be capable of. The kinds of problems it can solve, however, are optimizations, like finding local maxima/minima of an equation with lots of variables. This is still quite useful, which is why many large companies, including Google, have purchased one of these devices.

In order to judge whether or not the D-Wave 2X was actually doing computations using qubits (and not just some fancy tricks with regular processors) it was pitted against a classical computer running the same style of algorithm, called simulated annealing. Essentially this means the D-Wave was running against a simulated version of itself, a relatively easy challenge for a true quantum annealer to beat. However, identifying the problem space in which the D-Wave 2X showed quantum speedup proved tricky, with it sometimes running at about the same speed or showing only a mild (relative to expectations) speedup. This brought into question whether or not the qubits that D-Wave had created were actually functioning like they said they were. The research continued, however, and has just recently borne fruit.
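
To get a feel for what the classical side of that contest looks like, here's a minimal simulated annealing sketch in Python. It's a toy, not the benchmark code the researchers used: the energy function is a simple 1D chain of ±1 spins that want to agree with their neighbours (the ground state has all spins aligned, giving energy -(20-1) = -19), and the linear cooling schedule is an arbitrary choice for illustration.

```python
import math
import random

def simulated_annealing(energy, start, steps=10000, temp=5.0):
    """Minimise `energy` over a list of +/-1 spins by randomly
    flipping one spin at a time while the temperature cools."""
    state = start[:]
    e = energy(state)
    best, best_e = state[:], e
    for i in range(steps):
        t = temp * (1 - i / steps) + 1e-9   # linear cooling schedule
        j = random.randrange(len(state))
        state[j] *= -1                       # propose a single spin flip
        new_e = energy(state)
        # Accept downhill moves always; uphill moves with Boltzmann probability.
        if new_e <= e or random.random() < math.exp((e - new_e) / t):
            e = new_e
            if e < best_e:
                best, best_e = state[:], e
        else:
            state[j] *= -1                   # reject: undo the flip
    return best, best_e

# Toy Ising-style energy: neighbouring spins lower the energy by agreeing.
def energy(s):
    return -sum(s[i] * s[i + 1] for i in range(len(s) - 1))

random.seed(0)
start = [random.choice([-1, 1]) for _ in range(20)]
spins, e = simulated_annealing(energy, start)
print(e)  # should land at or near the ground state energy of -19
```

The quantum annealer attacks the same kind of energy landscape, but uses quantum tunnelling rather than thermal jumps to escape local minima, which is where the claimed speedup comes from.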

The research, published on arXiv (not yet peer reviewed), shows that the D-Wave 2X is about 100 million times faster than its simulated counterpart. Additionally, a similar amount of speedup was observed against another algorithm, quantum Monte Carlo. This is the kind of speedup the researchers have been looking for, and it demonstrates that D-Wave is indeed a quantum device. This research also points towards simulated annealing being the best benchmark against which to judge quantum systems like the D-Wave 2X, something which will help immensely with future research.

There's still a long way to go until we have a general purpose quantum computer, but research like this is incredibly promising. The team at Google which has been testing this device has come up with numerous improvements they want to make to it and developed systems to make it easier for others to exploit such quantum systems. It's this kind of fundamental research which will be key to the generalization of this technology and, hopefully, its inevitable commercialization. I'm very much looking forward to seeing what the next generation of these systems brings and hope the results are just as encouraging.


Quantum Computing Company D-Wave Still Entangling On.

When a technology company doesn't get a whole lot of press it usually means one of two things: either it isn't that interesting and no one really cares about it or, and this doesn't happen often, it simply doesn't want or need the attention. Conversely, if a product is a dismal failure it's usually guaranteed to get a whole bunch of the wrong type of attention, especially with the Internet's bent towards schadenfreude. With that in mind it made me wonder why I hadn't heard more about D-Wave since I last wrote about them around this time last year, especially considering that Lockheed Martin had bought one of their D-Wave One systems a year prior to that.

Turns out they probably don’t really need the press as they’re doing just fine:

VANCOUVER — When the world’s largest defence contractor reportedly paid $10 million for a superfast quantum computer, the Burnaby, B.C., company that built it earned a huge vote of confidence.

Two years after Lockheed Martin acquired the first commercially viable quantum computer from D-Wave Systems, the American aerospace and technology giant is once again throwing its weight behind a technology many thought was still the stuff of science fiction.

You'd be forgiven for thinking that this was just old news resurfacing two years later, but it isn't: Lockheed Martin has just purchased a D-Wave 2, the company's latest and greatest quantum computing offering. Details are a little scant as to what's actually in the latest system but, going off D-Wave's product road map, it's likely to be some variant of their Vesuvius chip, which contains 512 qubits. That's four times the number of qubits in their previous system, which would make it substantially more powerful, all for the same cost as the first unit they sold.

D-Wave Quantum Computing Chip

In my quest to find a little more information about their new system I stumbled across this page, which digs into the underlying architecture of the D-Wave One/Two systems. Back when I first wrote about D-Wave they weren't exactly forthcoming with this kind of information, which drew them a considerable amount of criticism, but since then a lot of their loudest critics have renounced their positions. Interestingly though, and feel free to correct me if I'm interpreting this wrong, whilst they do claim to have produced a functioning qubit they don't appear to have entangled several of them together. This doesn't make their system useless, as single qubits daisy-chained together will still be useful for some specific functions, but it does mean the exponential scaling doesn't really apply to D-Wave's style of quantum computer. I could be wrong about this, but their explanation only mentions entanglement-like properties in the qubit section, with the interconnecting grids only being used to "exchange information", not to provide multi-qubit entanglement.

That doesn't make it any less cool, however, as I'm sure that as they continue to scale up their processors they'll eventually start entangling more qubits together, which will increase their computational power exponentially. We won't see consumer level processors using technology like this for a long time though, as they're akin to CUDA cores on graphics cards: highly specialized computational units that excel in their task and not so much in general computing. Still, D-Wave's systems signal the beginning of the quantum computing era, and that means it's only a matter of time before we see them everywhere.


Another Quantum Leap Forward For Computing.

Sometimes I feel like quantum computing is like cold fusion. There's been a lot of theoretical work around how it could possibly be done, and there have even been a few people peddling devices that claim to do exactly what the research says, but the claims have never been fully substantiated. After posting about D-Wave selling one of their quantum computers in the middle of last year I did some further research and found that, whilst they might've created a qubit (like many have done before them), their 128 qubit computer had not yet been verified as containing 128 entangled qubits, a critical difference between quantum and classical computing.

However, news came to me today of another possible advancement in the world of quantum computing. Physicists at the National Institute of Standards and Technology (NIST) have created a simulator capable of replicating a quantum computer with hundreds of qubits. This is an order of magnitude higher than other experiments have managed (classical simulation of quantum computers has been limited to around 30 qubits) and the amount of computing power available with that many qubits is some 10^80 times greater than current processor technology. Critically, the simulator is also able to replicate quantum entanglement between qubits, the bugbear that D-Wave has yet to put to rest. What's really special about this simulator is that it allows researchers to alter properties they couldn't normally alter in regular solids, something which will allow them to gain further insights into how to craft qubits for use in general computing.
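
The reason classical simulation caps out around 30 qubits is easy to see with a little arithmetic: the full state of an n-qubit system is a vector of 2^n complex amplitudes, so memory demand doubles with every qubit added. A quick back-of-the-envelope calculation (assuming 16 bytes per amplitude, i.e. two 64-bit floats):

```python
# Memory needed to hold the full state vector of an n-qubit system:
# 2**n complex amplitudes at 16 bytes each (two 64-bit floats).
def statevector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

for n in (10, 20, 30, 40):
    gib = statevector_bytes(n) / 2**30
    print(f"{n} qubits -> {gib:,.0f} GiB")
```

At 30 qubits you already need 16 GiB just to store the state, 40 qubits needs 16 TiB, and hundreds of qubits are hopeless for any classical machine, which is exactly why a physical simulator like NIST's is such a big deal.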

Whilst this is unequivocally a major advancement in the field of quantum computing, we're still a long way off from seeing a working device based on those principles. Like many of its quantum computing brethren the NIST device still requires exotic cooling solutions to work (much like D-Wave's computer requiring liquid helium), relegating it solely to the lab for the time being. Additionally, the NIST device isn't much of a quantum computer at the moment, just a device which allows us to simulate what a quantum computer would be like. What this means is that for now we can't really run any kind of computation on it; we can only explore how varying its properties affects things like the entanglement or coherency of the qubits. Such a device is still crucial to advancing the field of quantum computing, but it's still a ways off from practical usage.

Thinking about it more, there is one key difference between cold fusion and quantum computing: the latter has seen great progress (both theoretical and practical) whilst the former has not. It's more akin to regular fusion in that way, being an incredibly complicated problem that we've demonstrated is possible but will take decades to perfect. The further we get into these fields the quicker the revelations come, and I have no doubt that the next couple of decades will see amazing advances in both.

Quantum Computing: Now For Reals?

All too often it seems we're presented with technologies that are always at least a decade away from seeing some sort of practical application. It's a symptom of a bigger problem: research struggles to continue on ideas that don't have the potential to produce something useful (read: profitable), so we're often introduced to an idea long before it becomes reality precisely so that it can, in fact, become a reality. However, every so often one of these technologies makes it past that crucial line and finds its way into the real world, and the latest to do so is quantum computing.

I can't remember when I was first introduced to the idea behind quantum computers, but I've been fascinated with them for quite a while. Quantum computers differ from regular computers in that they're made up of qubits, which can hold a 0, a 1 or a quantum superposition of both. What this means is that, compared to a traditional computer which can only be in one state at any one time, a quantum computer can be in many of its possible states simultaneously. Thus as the number of qubits increases the amount of computing power available increases exponentially, theoretically out-pacing any conventional processor. Such computers would have big impacts on tasks that require large amounts of raw computing power, like chemical reaction simulations and breaking encryption schemes.
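
A toy illustration of the superposition idea: a single qubit can be written as two amplitudes, one for each classical outcome, and measuring it collapses the state to 0 or 1 with probability equal to the squared amplitude (the Born rule). This is just a classical sampling sketch of that behaviour, not a quantum simulation in any real sense.

```python
import random

# A single qubit as two amplitudes (for the |0> and |1> outcomes).
# An equal superposition puts amplitude 1/sqrt(2) on each,
# so each outcome has probability (1/sqrt(2))**2 = 1/2.
amp0 = amp1 = 2 ** -0.5

def measure(a0, a1):
    """Collapse to 0 or 1 with probability |amplitude|^2 (Born rule)."""
    return 0 if random.random() < abs(a0) ** 2 else 1

random.seed(1)
samples = [measure(amp0, amp1) for _ in range(10000)]
print(sum(samples) / len(samples))  # roughly 0.5
```

The exponential part comes from combining qubits: an n-qubit register needs 2^n amplitudes to describe, so each additional qubit doubles the size of the state the machine is effectively working with, which is what no fixed-size classical register can match.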

I can remember reading about the first public demonstration of a quantum computer back in 2007 from a company called D-Wave. At the time I wasn't 100% sure they'd actually created what they claimed, as there was a bit of controversy on the matter, but re-reading the articles today showed that the criticism was more leveled at the claims D-Wave was making about their system's capabilities rather than at whether it was actually a quantum computer. I really hadn't followed the subject much until D-Wave popped up in my feed reader last week with the bold claim that you could now buy yourself a 128 qubit quantum computer:

Whether or not D-Wave has actually built a quantum computer is still a matter of debate (though, a study authored by the company and published in Nature claims to prove its success) but, whatever it is these crafty Canadians have created, you can order one now and start crunching qubits with abandon. The D-Wave One is the first commercially available quantum computer and, while its 128-qubit processor can only handle very specific tasks and is easily outperformed by traditional CPUs, it could represent a revolution in the field of supercomputing.

Now it's one thing to put something up for sale and another to actually deliver that product. Not many people (or companies for that matter) have the $10 million required to purchase one of these systems handy, so initially it looked like it would be a while before we actually saw one of them sold. Yet not a week later D-Wave announced that they had sold their very first quantum computing system to Lockheed Martin, with details around the purpose of the system remaining secret for now. To say that I'm surprised they managed to sell one that quickly would be putting it lightly, but it's also telling of what they've accomplished.

Lockheed Martin, whilst being no slouch when it comes to taking bets on new technology, wouldn't sink $10 million into a product that didn't deliver as advertised. Nor would they want D-Wave to publicly announce the sale if they risked ending up looking the fool should the quantum computer turn out to be not so quantum. On the surface then it would seem that D-Wave is the real deal, their 128 qubit system is the first commercially available quantum computer, and we could be on the cusp of yet another computing power explosion.

So does this mean that in the next few years we'll see quantum computers appear at the consumer level? Probably not; in their current state they require liquid helium cooling, and we're still many advances away from packing quantum computers into form factors we're all familiar with. It does mean, however, that we're close to being able to solve harder problems more rapidly, and subsequent improvements to D-Wave's systems will make such tasks exponentially easier again. With Lockheed Martin basically vouching for D-Wave's credibility I'm very interested to see who lines up next for a quantum computer, especially if they're going to be a bit more open about what they plan to do with it.