If the dawn of quantum computing had come 3 years earlier, its rising sun might have ducked behind a cloud. In 2019, Google researchers announced they had reached a milestone known as quantum supremacy when their Sycamore quantum computer completed, in 200 seconds, an abstruse calculation that they said would tie up a supercomputer for 10,000 years. Now scientists in China have performed the computation in a few hours using ordinary processors, and they say a supercomputer could beat Sycamore outright.
“I think they’re right that if they had access to a big enough supercomputer, they could simulate the … task in a matter of seconds,” says Scott Aaronson, a computer scientist at the University of Texas at Austin. According to Greg Kuperberg, a mathematician at the University of California, Davis, the advance takes a bit of the shine off Google’s claim. “Getting to within 300 feet of the summit is less exciting than getting to the summit.”
Still, the prospects for quantum computing remain unchanged, Kuperberg and others say. And Sergio Boixo, Google’s chief scientist for Quantum AI, said in an email that the Google team knew its lead might not last long. “In our 2019 paper, we said that classical algorithms will improve,” he said. But “we do not believe this classical approach can keep up with quantum circuits in 2022 and beyond.”
The “problem” that Sycamore solved was designed to be hard for a conventional computer but as easy as possible for a quantum computer, which manipulates qubits that can be set to 0, 1, or—thanks to quantum mechanics—any combination of 0 and 1 at the same time. Sycamore’s 53 qubits, tiny resonant electrical circuits made of superconducting metal, can together encode any number from 0 to 2^53 (about 9 quadrillion)—or even all of them at once.
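That exponential scaling is the crux: each added qubit doubles the number of basis states a brute-force classical simulation must track. A minimal sketch of the arithmetic:

```python
# An n-qubit register is described by 2**n complex amplitudes, so a
# brute-force classical simulation must track 2**n numbers at once.
n = 53
num_basis_states = 2 ** n
print(num_basis_states)  # 9007199254740992, about 9 quadrillion
```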
Starting with all qubits set to 0, the Google researchers applied a random but fixed set of logical operations, or gates, to individual qubits and pairs over 20 cycles, then read out the qubits. Roughly speaking, quantum waves representing all possible outputs sloshed among the qubits, and the gates created interference that amplified some outputs and canceled others, making some far likelier to appear than others. Repeated over millions of trials, a spiky pattern of outputs emerged.
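The procedure can be sketched in miniature. In this toy version—not Google’s actual gate set—a single random unitary on three qubits stands in for Sycamore’s cycles of gates, and interference leaves some bit strings far likelier than others:

```python
import numpy as np

# Toy random-circuit sampling on a 3-qubit state vector.
rng = np.random.default_rng(0)
n = 3
dim = 2 ** n

# Start with all qubits set to |000>.
state = np.zeros(dim, dtype=complex)
state[0] = 1.0

# A random unitary (via QR decomposition of a complex Gaussian matrix)
# stands in for the fixed sequence of gates.
z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
q, r = np.linalg.qr(z)
q = q * (np.diag(r) / np.abs(np.diag(r)))  # fix column phases

state = q @ state
probs = np.abs(state) ** 2  # measurement probabilities for the 8 outputs

# Interference makes the distribution "spiky": far from uniform.
print(probs.round(3), probs.sum().round(6))
```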
The Google researchers argued that simulating those interference effects would overwhelm even Summit, the supercomputer at Oak Ridge National Laboratory, which boasts 9,216 CPUs and 27,648 faster graphics processing units (GPUs). Researchers at IBM, which developed Summit, quickly countered that if they used every bit of hard drive available to the machine, it could handle the computation in a few days. Now, Pan Zhang, a statistical physicist at the Institute of Theoretical Physics of the Chinese Academy of Sciences, and colleagues have shown how to beat Sycamore in a paper in press at Physical Review Letters.
Following others, Zhang and colleagues recast the problem as a 3D mathematical array called a tensor network. It consisted of 20 layers, one for each cycle of gates, with each layer comprising 53 dots, one for each qubit. Lines connected the dots to represent the gates, with each gate encoded in a tensor—a 2D or 4D grid of complex numbers. Running the simulation then reduced, essentially, to multiplying all the tensors together. “The advantage of the tensor network method is that we can use many GPUs for parallel computation,” Zhang says.
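As a rough illustration of that idea, a tiny two-qubit “network” can be contracted with `numpy.einsum`; the gate tensors here are random placeholders, not Sycamore’s actual gates:

```python
import numpy as np

# Toy tensor-network contraction: two single-qubit gates (2D tensors) and
# one two-qubit gate (4D tensor) acting on qubits a and b.
rng = np.random.default_rng(1)
g1 = rng.normal(size=(2, 2))         # gate on qubit a
g2 = rng.normal(size=(2, 2))         # gate on qubit b
g12 = rng.normal(size=(2, 2, 2, 2))  # two-qubit gate

state = np.zeros((2, 2))
state[0, 0] = 1.0  # both qubits start in |0>

# Multiply the tensors along their connecting lines (shared indices):
# out[c,d] = sum over a,b,e,f of g12[c,d,a,b] * g1[a,e] * g2[b,f] * state[e,f]
out = np.einsum('cdab,ae,bf,ef->cd', g12, g1, g2, state)
print(out.shape)  # (2, 2)
```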
Zhang and colleagues also relied on a key insight: Sycamore’s computation was far from exact, so theirs didn’t need to be either. Sycamore calculated the distribution of outputs with an estimated fidelity of 0.2%—just enough to distinguish the fingerprint-like spikiness from circuit noise. So Zhang’s team traded accuracy for speed by cutting some lines in its network and eliminating the corresponding gates. Losing just eight lines made the computation 256 times faster while maintaining a fidelity of 0.37%.
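The factor of 256 is no accident: each line in the network carries a two-valued qubit index, so cutting one roughly halves the contraction cost, and eight cuts compound multiplicatively:

```python
# Each cut line removes one two-valued index from the contraction,
# roughly halving its cost; eight cuts compound to 2**8.
cuts = 8
speedup = 2 ** cuts
print(speedup)  # 256
```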
The researchers computed the output pattern for 1 million of the 9 quadrillion possible bit strings, relying on an innovation of their own to obtain a truly random, representative set. The computation took 15 hours on 512 GPUs and yielded the telltale spiky output. “It’s fair to say that Google’s experiment has been simulated on a conventional computer,” says Dominik Hangleiter, a quantum computer scientist at the University of Maryland, College Park. On a supercomputer, Zhang says, the computation would take a few dozen seconds—10 billion times faster than the Google team estimated.
The researchers say the advance underscores the pitfalls of racing a quantum computer against a conventional one. “There is an urgent need for better quantum supremacy experiments,” Aaronson says. Zhang suggests a more practical approach: “We have to find some real-world applications to demonstrate quantum supremacy.”
Still, Google’s demonstration was not just hype, researchers say. Sycamore required far fewer operations and far less power than a supercomputer would have, Zhang notes. And had Sycamore’s fidelity been even slightly higher, he says, his team’s simulation could not have kept up. As Hangleiter puts it, “The Google experiment did what it set out to do: start that race.”