How about the more practical question: does it solve any class of problems faster than an equivalently priced classical computer? What are those problems?
...if they don't come up with answers to these questions that are easy for business people to understand, then we can only assume they are playing "fake it till we make it" on the US government's money. And if, god forbid, they don't make it, we'll be in for a "quantum computing winter(s)" just like we had the "AI winter(s)", because of smart assholes who played a similar game... and boy, do we have a lot to "thank" them for!
If the D-Wave is actually using quantum annealing for its algorithms, how is it playing "fake it till we make it" simply by not yet having optimized its output to be faster than classical systems? I think it's pretty easy to see how the government and private institutions that are purchasing D-Wave Twos (universities and research divisions) see them as potentially great tools, separate from their processing output. Business people are a separate audience, and I would be surprised, and doubtful of their acumen, if any were actually buying one.
If they turn out to be quantum in origin, I think you are way off base in this prediction.
It's not able to do full quantum annealing; it's only able to use stoquastic Hamiltonians on a fixed topology. Even if truly quantum effects were present (which is unclear at the moment; see the Smolin and Smith arXiv posts), that's not known to be a useful quantum computing model. So it's not just an issue of "optimizing output".
> One researcher did find D-Wave performed 3600 times faster than a classical device.
Well, that was done with the 128-qubit version, right? And now they have one that is 2^384 times faster than that one, so we'll see what happens next.
If it was the 512-qubit one, then we'll need to wait for the 2048-qubit D-Wave Three, which should come out in 2015 (they seem to double the qubit count every year, but only release a new model every 2 years or so). That one should be 2^1536 times faster than the current model.
Quantum computers don't get twice as fast for every qubit you add. You're confusing the state space required for a naive classical simulation with speed.
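To make that distinction concrete: the 2^n figure is the cost of a *naive classical simulation* of n qubits, i.e. storing the full state vector; it says nothing about how fast the device itself runs. A quick back-of-the-envelope in Python:

    # An n-qubit state vector has 2^n complex amplitudes; storing it
    # classically at 16 bytes per complex128 amplitude is what blows up.
    # This measures simulation cost, not the quantum device's speed.
    for n in (128, 512):
        amplitudes = 2 ** n
        print(f"{n} qubits: 2^{n} amplitudes, ~{amplitudes * 16:.3e} bytes to store")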
Yes, they should. Maybe not 2x with every qubit, but definitely exponentially, thanks to the larger "state space", as you call it. That's what makes them different from the classical model.
Not generally. Only for certain problems. There are precious few problems that quantum computers are known to be asymptotically better at than classical computers, and one of them (Grover's search) is only a sqrt(N) speedup.
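To put a number on that sqrt(N) point: the standard Grover figure is about (pi/4)*sqrt(N) oracle queries, versus ~N/2 on average classically. A trivial sketch (pure arithmetic, nothing assumed beyond those textbook figures):

    import math

    # Unstructured search over N items: a classical algorithm needs
    # ~N/2 oracle queries on average; Grover's algorithm needs
    # ~(pi/4) * sqrt(N) queries. Big, but not exponential.
    for n_bits in (10, 20, 40):
        N = 2 ** n_bits
        classical = N / 2
        grover = (math.pi / 4) * math.sqrt(N)
        print(f"N=2^{n_bits}: classical ~{classical:.3g} queries, "
              f"Grover ~{grover:.3g} queries "
              f"(speedup ~{classical / grover:.3g}x)")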
Just speaking in terms of "quantum size" (and not computing power, which is not understood yet), it depends on the graph of possible entanglements among the qubits. If it were a complete graph (allowing arbitrary entanglement), then the size of the relevant Hilbert space would exactly double with each additional qubit (i.e. one would need twice as many complex numbers to describe any particular wave function in the space). But the D-Wave chips operate with a fixed topology that (I think) is far from fully connected (e.g. the 128-qubit chip used a "Chimera graph", which they describe in blog posts and publications). The growth would only be truly exponential with sufficient connectivity (e.g. a planar graph would mean sub-exponential growth). A quick sketch of how sparse that topology is follows below.
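Here's my construction of the Chimera layout as I understand it from their write-ups: an m x n grid of K_{4,4} unit cells, with one bipartite "shore" of each cell coupled vertically and the other horizontally. The 4x4 grid giving 128 qubits is my guess at the old chip's layout; the point is just the coupler count versus a complete graph:

    # Rough sketch of a Chimera graph: an m x n grid of K_{4,4} unit
    # cells; shore 0 of each cell couples to the cell above/below,
    # shore 1 to the cell left/right. (Parameters guessed from
    # D-Wave's published descriptions.)
    def chimera_edges(m, n, t=4):
        def q(row, col, shore, k):        # unique qubit index
            return ((row * n + col) * 2 + shore) * t + k
        edges = set()
        for r in range(m):
            for c in range(n):
                for i in range(t):
                    for j in range(t):    # K_{t,t} inside the cell
                        edges.add((q(r, c, 0, i), q(r, c, 1, j)))
                    if r + 1 < m:         # shore 0 couples vertically
                        edges.add((q(r, c, 0, i), q(r + 1, c, 0, i)))
                    if c + 1 < n:         # shore 1 couples horizontally
                        edges.add((q(r, c, 1, i), q(r, c + 1, 1, i)))
        return edges

    V = 4 * 4 * 8                          # 128 qubits
    E = chimera_edges(4, 4)
    print(f"{V} qubits, {len(E)} couplers "
          f"vs {V * (V - 1) // 2} in a complete graph")

That prints 352 couplers against 8128 for full connectivity, which is the sense in which the topology is nowhere near complete.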
I have the same feeling (I think): if you sacrifice "global" entanglement, then you're probably not doubling the "reachable" state/solution space with every new qubit.
The fact that D-Wave seems able to double its qubit count every year, while general-purpose quantum computer qubit counts seem to grow only linearly with time, is in my mind one reason to be skeptical of the whole approach.
It wasn't really shown to be faster; see http://www.scottaaronson.com/blog/?p=1400. To focus on the "3600 times" issue, I suggest searching for the strings " Ising" and "CPLEX". Don't miss the extremely thorough discussion in the comments, with contributions from Cathy McGeoch (the author of the "3600 times" work), Peter Shor (as in Shor's factoring algorithm), Scott Aaronson, Greg Kuperberg, and many others.
That's a property of quantum computers (though, as people have already explained, it's not that simple). Why would you want classical computers to behave the same way?
Or, as a short answer: no, no relation to that at all.
Do people not understand that this is just "fake it till you make it"? It doesn't matter if it's an emulator if they're paying the right people and working on it. Do you think Reddit just hoped it would get better at posting as bots? That's ridiculous.
It is actually a very crucial question: if the D-Wave machine is just a quantum computer emulator supported by classical hardware, we cannot expect much improvement in the future, due to the inherent limitations of classical physics.
On the other hand, if it does behave like a quantum computer, it may end up being a lot faster than any classical computer.
Let's keep in mind that the D-Wave machine is not a general-purpose quantum computer. It only runs quantum annealing, an optimization algorithm, on its quantum register. That is the only thing the machine is able to do.
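To give a feel for the problem class: the machine minimizes an Ising objective, H(s) = sum_{i<j} J_ij s_i s_j + sum_i h_i s_i, over spins s_i in {-1, +1}. Here's a toy *classical* simulated-annealing sketch on a random instance, just to show the kind of problem being optimized (my own illustrative code; the instance size and cooling schedule are made up, and this is not D-Wave's API):

    import math, random

    # Toy Ising instance: minimize
    #   H(s) = sum_{i<j} J[i,j] * s_i * s_j + sum_i h[i] * s_i
    # over spins s_i in {-1, +1} -- the problem class quantum annealing
    # targets -- attacked here with plain classical simulated annealing.
    random.seed(0)
    n = 16
    J = {(i, j): random.uniform(-1, 1) for i in range(n) for j in range(i + 1, n)}
    h = [random.uniform(-1, 1) for _ in range(n)]

    def energy(s):
        return (sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())
                + sum(hi * si for hi, si in zip(h, s)))

    s = [random.choice((-1, 1)) for _ in range(n)]
    E = energy(s)
    steps = 20000
    for step in range(steps):
        T = 2.0 * (1 - step / steps) + 1e-3   # linear cooling schedule
        i = random.randrange(n)
        s[i] = -s[i]                          # propose a single spin flip
        E_new = energy(s)
        if E_new <= E or random.random() < math.exp((E - E_new) / T):
            E = E_new                         # accept the flip
        else:
            s[i] = -s[i]                      # reject: flip the spin back
    print("final energy:", E)

The quantum-annealing claim is that the hardware does this kind of minimization via quantum dynamics rather than thermal hill-climbing; the open question in this thread is whether it actually does.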
Good question, I am not sure :) There can be many reasons. But the problem modeling is certainly very different in its D-Wave form, so the fact that there are improvements doesn't mean they are due to quantum effects.
Other than that, overly limited experiments, comparisons of specialized vs. general algorithms, and suboptimal classical algorithms are also possible explanations.