
Why is Google's quantum supremacy experiment impressive?

Physics Asked on June 23, 2021

In the Nature paper published by Google, they say,

To demonstrate quantum supremacy, we compare our quantum processor against state-of-the-art classical computers in the task of sampling the output of a pseudo-random quantum circuit. Random circuits are a suitable choice for benchmarking because they do not possess structure and therefore allow for limited guarantees of computational hardness. We design the circuits to entangle a set of quantum bits (qubits) by repeated application of single-qubit and two-qubit logical operations. Sampling the quantum circuit’s output produces a set of bitstrings, for example {0000101, 1011100, …}. Owing to quantum interference, the probability distribution of the bitstrings resembles a speckled intensity pattern produced by light interference in laser scatter, such that some bitstrings are much more likely to occur than others. Classically computing this probability distribution becomes exponentially more difficult as the number of qubits (width) and number of gate cycles (depth) grow.

So, from what I can tell, they configure their qubits into a pseudo-randomly generated circuit, which, when run, puts the qubits into a state vector that represents a probability distribution over $2^{53}$ possible states of the qubits, but that distribution is intractable to calculate, or even estimate via sampling using a classical computer simulation. But they sample it by "looking" at the state of the qubits after running the circuit many times.
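Concretely, my understanding of the task is something like the following toy-scale sketch in NumPy (the helper names `apply_1q`, `apply_cz`, and `random_su2` are mine, and the gate set is only loosely inspired by the paper's; the real experiment used 53 qubits, where the state vector below would be intractable):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5  # toy size; at n = 53 this state vector no longer fits in memory

# Start in |00...0>: a length-2**n complex state vector.
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0

def random_su2(rng):
    """A (roughly Haar-)random single-qubit unitary via QR decomposition."""
    m = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    q_mat, r = np.linalg.qr(m)
    d = np.diag(r)
    return q_mat * (d / np.abs(d))  # fix column phases

def apply_1q(state, gate, q, n):
    """Apply a 2x2 gate to qubit q of an n-qubit state vector."""
    psi = np.moveaxis(state.reshape([2] * n), q, 0)
    psi = np.tensordot(gate, psi, axes=([1], [0]))
    return np.moveaxis(psi, 0, q).reshape(-1)

def apply_cz(state, a, b, n):
    """Apply a controlled-Z (entangling gate) between qubits a and b."""
    psi = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[a], idx[b] = 1, 1
    psi[tuple(idx)] *= -1  # phase flip on the |...1...1...> subspace
    return psi.reshape(-1)

# A few "cycles": random single-qubit gates, then entangling gates.
for _ in range(8):
    for q in range(n):
        state = apply_1q(state, random_su2(rng), q, n)
    for q in range(n - 1):
        state = apply_cz(state, q, q + 1, n)

# Born rule: the "speckled" output distribution, then bitstring samples.
probs = np.abs(state) ** 2
probs /= probs.sum()  # guard against float rounding
samples = rng.choice(2**n, size=10, p=probs)
print([format(int(s), f"0{n}b") for s in samples])
```

Sampling from the real device corresponds only to the last two lines; the part that becomes intractable at 53 qubits is holding and evolving `state` at all.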

Isn’t this just an example of creating a system whose output is intractable to calculate, and then "calculating" it by simply observing the output of the system?

It sounds similar to saying:

If I spill this pudding cup on the floor, the exact pattern it will form is very chaotic, and intractable for any supercomputer to calculate. But I just invented a new special type of computer: this pudding cup. And I’m going to do the calculation by spilling it on the floor and observing the result. I have achieved pudding supremacy.

which clearly is not impressive at all. In my example, I’m doing a "calculation" that’s intractable for any classical computer, but there’s no obvious way to extrapolate this method towards anything actually useful. Why is Google’s experiment different?

EDIT: To elaborate on my intuition here, the thing I consider impressive about classical computers is their ability to simulate other systems, not just themselves. When setting up a classical circuit, the question we want to answer is not "which transistors will be lit up once we run a current through this?" We want to answer questions like "what’s 4+1?" or "what happens when Andromeda collides with the Milky Way?" If I were shown a classical computer "predicting" which transistors will light up when a current is run through it, it wouldn’t be obvious to me that we’re any closer to answering the interesting questions.

5 Answers

The big difference between the quantum supremacy experiment and your pudding experiment is that the quantum supremacy experiment solved an unambiguous, well-posed mathematical problem. While people sometimes describe the computational task as "simulating the physical Sycamore computer", that's not right. The actual task was calculating the output of an abstract quantum logical circuit, of which the Sycamore computer was an approximate physical instantiation. The difference is subtle but crucial. From a computational perspective, the math came first and the physics came second.

Crucially, the quantum supremacy problem was mathematically well-specified, and so it could be checked on a classical computer. The parallel classical computation wasn't there just to provide a time benchmark; it also served to verify the accuracy of the quantum computation.

There's no such "slower but equivalent" computation for the pudding experiment. In the pudding experiment, you need to specify exactly which of these two problems you're trying to solve:

  1. Simulate the pattern that will result if you knock a generic pudding cup off the table.
  2. Simulate the pattern that will result if you knock a particular pudding cup off the table [where you specify the initial conditions in enough detail to model its fall].

The first variant is obviously massively underspecified and doesn't have a unique answer. The second variant does in principle have a unique answer, but crucially, you can't actually capture all the necessary details about the initial condition in practice. So neither variant can actually be framed as a mathematically well-posed question.

In the quantum supremacy experiment, the abstract problem to be solved (which was solving the abstract logical circuit, not simulating the physical hardware) was simple enough to pose that you could (very slowly) solve it exactly on a classical computer as well as on a quantum one.
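For concreteness, the verification metric in the Nature paper was linear cross-entropy benchmarking (XEB): take the bitstrings the device outputs, look up their probabilities under the classically computed ideal distribution, and form $F = 2^n \langle p_{\text{ideal}}(x) \rangle - 1$, which is near 1 for a faithful device and near 0 for uniform noise. A minimal sketch (the exponential toy distribution below only mimics the "speckle" statistics of random-circuit outputs; it is not data from the experiment):

```python
import numpy as np

def linear_xeb_fidelity(ideal_probs, samples):
    """Linear XEB score: F = 2**n * <p_ideal(x)> - 1 over the observed bitstrings.
    ~1 if the samples follow the ideal distribution, ~0 for uniform noise."""
    n = int(np.log2(len(ideal_probs)))
    return 2**n * np.mean(ideal_probs[samples]) - 1.0

rng = np.random.default_rng(1)
n = 10
# Exponentially distributed weights mimic the speckle-like (Porter-Thomas)
# statistics of random-circuit output probabilities.
weights = rng.exponential(size=2**n)
ideal = weights / weights.sum()

faithful = rng.choice(2**n, size=5000, p=ideal)  # a "working" device
noisy = rng.integers(0, 2**n, size=5000)         # a fully depolarized device

print(linear_xeb_fidelity(ideal, faithful))  # close to 1
print(linear_xeb_fidelity(ideal, noisy))     # close to 0
```

The point is that this check only makes sense because `ideal` can be computed classically from the abstract circuit, however slowly; there is no analogous ideal distribution for the pudding.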

Correct answer by tparker on June 23, 2021

To elaborate on my intuition here, the thing I consider "impressive" about classical computers is their ability to simulate other systems, not just themselves. When setting up a classical circuit, the question we want to answer is not "which transistors will be lit up once we run a current through this?" We want to answer questions like "what's 4+1?" or "what happens when Andromeda collides with the Milky Way?"

There isn't a real distinction here. Both quantum and classical computers only do one thing: compute the result of some circuit. A classical computer does not fundamentally know what $4+1$ means. Instead current is made to flow through various transistors, as governed by the laws of physics. We then read off the final state of the output bits and interpret it as $5$.

The real distinction, which holds in both cases, is whether you can program it or not. For example, a simple four-function calculator is a classical system involving lots of transistors, but the specific things it can compute are completely fixed, which is why we don't regard it as a classical computer. And a pudding is a quantum system involving lots of qubits, but we can't make it do anything but be a pudding, so it's not a quantum computer.

Google can control the gates they apply in their quantum circuit, just like loading a different program can control the gates applied in a classical CPU. That's the difference.
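A toy illustration of that difference (the `run` function and gate names below are my own sketch, not Google's API): the "hardware" is fixed, and only the gate list, i.e. the program, changes.

```python
import numpy as np

# Fixed "hardware": a 2-qubit state-vector machine that executes any gate list.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
X = np.array([[0, 1], [1, 0]])                 # NOT
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])  # control = qubit 0 (most significant bit)

def run(program):
    """The same machine executes whatever gate sequence ("program") it is given."""
    state = np.zeros(4, dtype=complex)
    state[0] = 1.0  # start in |00>
    for gate in program:
        state = gate @ state
    return np.abs(state) ** 2  # probabilities over |00>, |01>, |10>, |11>

program_a = [np.kron(X, I2)]          # flips qubit 0: all weight on |10>
program_b = [np.kron(H, I2), CNOT]    # Bell pair: half on |00>, half on |11>

print(run(program_a))
print(run(program_b))
```

Same machine, different programs, different computations: that programmability, not the mere difficulty of predicting the output, is what the pudding lacks.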

Answered by knzhou on June 23, 2021

It is (in isolation) no different from your pudding computer.


However, the context is very important. There are certain useful problems that we know how to solve much faster on a quantum computer than on a classical one. Those problems are miles out of reach, and probably will not be solved for another 10 or 20 years (I am more pessimistic than most, maybe too much so).

In the meantime, thousands of people around the world are working on many different quantum computer ideas: superconducting ones like Google's and IBM's, but also optical ones of various types using quantum dots, down-conversion sources, or nitrogen-vacancy centres.

These different communities have computers built on fundamentally different hardware, so they are hard to compare against one another. This Google result indicates several things: (1) It shows that they are making real progress. (2) It "shows off" how they are doing compared to the other approaches. For example, optics people are still at around 8 qubits, while superconducting machines are at 53. This is not a useful comparison (the two machines are too different), but that won't stop people from showing off.

The real story is not the "quantum supremacy"; it's that they have a 53-qubit circuit that seems to kind of half-work some of the time. (This is where the optics people get to gloat: their machines have fewer qubits but actually work most of the time.) In order to "sell" this machine as an interesting step along the way to something useful, they had to actually do something with it.

Answered by Dast on June 23, 2021

I co-run an experimental research group in which, among other things, we develop the ability to control quantum bits so as to do (one day) quantum computing. In our lab we have the most precise quantum bits and operations, but we have only ever performed (or tried to perform) operations on two or three bits at a time. This is partly because we have taken an interest in other aspects of the problem, and partly because if we were to put ten or more qubits into our experiment (which would be easy to do), we would merely reproduce small-scale circuits rather than learn how to build a large computer.

I would say that the question asked by Bridgeburners is well asked, and it correctly characterises the limited nature of Google's calculation. However, one can look at Google's result from an experimental perspective, and then it does become, I think, very impressive.

From an experimental point of view, Google's achievement is that they have achieved sufficient experimental control over a circuit containing 53 qubits that it can generate entangled states involving all or most of the qubits, in such a way that the fidelity of the state is not immediately lost before the state can even be measured in some way. We would certainly not be able to achieve this in our lab today. If we devoted all our effort to doing the same thing, I think it would take us a year or more to implement the required extensions to our experimental equipment with the required precision. So it is indeed very impressive. (Meanwhile with our trapped ion methods we can also do some things which Google's machine could not do.)

Looking now to the near future, there are two main technologies showing promise for quantum computing. These are atomic ions confined in high vacuum, and superconducting circuits of the type employed by Google. A few years ago, a rough 'competition' was held, involving computations requiring only 10 or so qubits, and the trapped ion methods won because they could take advantage of greater inter-connectability of their qubits, and good general precision in operations and measurements. If such a competition were to be held now, it is not so clear which technology would win. Nor is it clear which is the better bet for complete control of 50 qubits in such a way that general-purpose computing could be done, answering questions people really want to know (as opposed to computer-science abstractions). But what is clear, I think, is that this stage will be reached by either or both paths, and this will happen on a timescale of years not decades. What John Martinis and his colleagues have done is shown that the superconducting circuit approach is a very strong contender, and they have shown great expertise and mastery in overcoming many and severe technical challenges to get this far.

Answered by Andrew Steane on June 23, 2021

If I spill this pudding cup on the floor, the exact pattern it will form is very chaotic, and intractable for any supercomputer to calculate. But I just invented a new special type of computer: this pudding cup. And I'm going to do the calculation by spilling it on the floor and observing the result. I have achieved pudding supremacy.

You are right. But that's only half of the story.

The other half is that this spilled-pudding pattern could also be used to solve a mathematical problem that is genuinely useful and impossible to solve with any conventional, non-pudding system. It could also be controlled so that it deterministically solves the same problem any time you want. If your spilled pudding can do that, then it earns the label of supremacy over conventional computers.

The point is that we need to solve problems by whatever means and methods are most efficient; if spilled pudding were an efficient way to do it, it would reign supreme, and we would build pudding machines and pudding factories for solving problems.

Now, replace pudding with quantum.

Answered by Thaina on June 23, 2021
