Fresh off the Nobel Prize, it has made the cover of Nature: Google's "Quantum Echoes" algorithm speeds up a calculation 13,000-fold, and its results can be repeatedly verified.
The Google Quantum AI team, fresh from winning the Nobel Prize in Physics, has once again made the cover of Nature:
They proposed a new algorithm called "Quantum Echoes". Its results can be repeatedly verified, addressing the long-standing difficulty of confirming the output of a quantum computation.
A calculation that would take the classical supercomputer Frontier 3.2 years to complete takes the quantum computer only 2.1 hours, roughly 13,000 times faster.
The paper has just been published in Nature. Michel Devoret, the newly minted Nobel laureate and current chief hardware scientist at Google's Quantum AI lab, took part in the research, alongside researchers from top institutions such as Princeton University, the University of California, Berkeley, and MIT; in total, more than 200 authors were involved.
In another study (to be uploaded to arXiv later), the new algorithm was validated for detecting interactions between atoms and particles and for probing molecular structure.
The results from the quantum computer agree with those of conventional nuclear magnetic resonance (NMR), while also revealing information that NMR alone usually cannot provide.
Just as telescopes and microscopes opened the doors to new worlds, this experiment takes a crucial step towards a “quantum mirror”, which can measure natural phenomena that were previously unobservable.
NMR enhanced by quantum computing is expected to become a powerful tool in drug R&D, helping to determine how candidate drugs bind to their targets. In materials science, it could likewise be used to characterize the molecular structure of new materials such as polymers, battery components, and even the materials that qubits are made of.
Quantum Echoes Algorithm: A Verifiable Quantum Advantage
The core of a quantum computer is a "quantum many-body system" (such as a set of entangled qubits), but studying it poses a major problem:
As the system evolves in time, quantum information rapidly spreads throughout it, a phenomenon called "information scrambling".
At that point, if one tries to observe its details through conventional means such as the time-ordered correlator (TOC), the signal decays exponentially, severely limiting our ability to probe the quantum information.
To solve this problem, the Google team proposed the "Quantum Echoes" algorithm:
Let the system evolve forward, apply a perturbation, then evolve it backward, and repeat. This effectively simulates a reversal of time, refocusing the quantum information that has already spread.
The protagonist of this study, the out-of-time-order correlator (OTOC), builds on this idea: it stacks the signals from different "evolution paths" in the quantum system, amplifying the useful information while the noise cancels out.
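The echo-then-measure idea can be illustrated numerically. Below is a minimal numpy sketch for a 4-qubit toy system (far smaller than the 65 qubits used in the experiment): it builds a random Hamiltonian, forms the echoed operator W(t) = U†WU, and evaluates an infinite-temperature OTOC. The operator names W ("butterfly" perturbation) and V (probe) follow standard OTOC notation and are assumptions for illustration, not the paper's code.

```python
import numpy as np

# Single-qubit Pauli operators
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def kron_list(ops):
    """Tensor product of a list of single-qubit operators."""
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

n = 4            # toy size; the experiment used up to 65 qubits
d = 2 ** n

# Random Hermitian H gives a scrambling evolution U = exp(-iHt)
rng = np.random.default_rng(0)
A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
H = (A + A.conj().T) / 2
t = 1.0
evals, evecs = np.linalg.eigh(H)
U = evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T

W = kron_list([X] + [I2] * (n - 1))   # "butterfly" perturbation on qubit 0
V = kron_list([I2] * (n - 1) + [Z])   # probe on the last qubit

# Echo: evolve forward, apply W, evolve backward  ->  W(t) = U† W U
Wt = U.conj().T @ W @ U

# Infinite-temperature OTOC  C(t) = Re Tr[W(t)† V† W(t) V] / d
otoc = np.real(np.trace(Wt.conj().T @ V.conj().T @ Wt @ V)) / d
print(f"OTOC at t={t}: {otoc:.4f}")
```

At t = 0 the two operators act on different qubits and commute, so the OTOC equals 1; as scrambling proceeds it decays toward 0, and tracking that decay is what the experiment does at scale.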
The research team conducted two types of key experiments using a superconducting quantum processor (using up to 65 qubits) and reached two core conclusions:
First, the OTOC can track the details of the quantum system for far longer than traditional methods.
The traditional TOC signal becomes almost undetectable after 9 evolution cycles (falling below 0.01, within a standard deviation of the noise), whereas the measured OTOC, in particular the second-order OTOC, denoted OTOC⁽²⁾, remains clear even after 20 evolution cycles (staying above 0.01).
Second, the second-order OTOC harbors a phenomenon of "large-loop interference" that classical computers cannot efficiently calculate.
As the quantum system evolves, it generates many "Pauli strings" (which can be thought of as elementary units of quantum states). These strings form "large loops", whose signals reinforce one another, producing "constructive interference".
This "large-loop interference" is what makes classical simulation so hard. Simulating the 65-qubit OTOC⁽²⁾ signal on the most powerful supercomputer, Frontier, would take about 3.2 years, whereas the quantum processor needs only 2.1 hours for one measurement, a roughly 13,000-fold difference in speed.
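As a quick sanity check, the 13,000× figure follows directly from the two reported runtimes (assuming a 365.25-day year for the conversion):

```python
# Reported runtimes: ~3.2 years on Frontier vs 2.1 hours on the quantum processor
frontier_hours = 3.2 * 365.25 * 24   # convert years to hours
quantum_hours = 2.1
speedup = frontier_hours / quantum_hours
print(f"speedup ≈ {speedup:,.0f}x")
```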
Even with faster classical simulation methods such as Monte Carlo, the signal-to-noise ratio of the computed signal (1.1) falls far short of the quantum experiment's (3.9).
Proving the Possibility of a Practical Quantum Advantage
So-called "quantum advantage" is not just about a quantum computer being faster than a classical one; it also has to be useful.
This time, the team also demonstrated an application of OTOC⁽²⁾ to a practical problem: Hamiltonian learning for quantum systems.
In many physical systems, unknown parameters of the system's Hamiltonian must be determined. Traditional methods are often limited by the rapid decoherence of quantum states; thanks to its slow decay and high sensitivity to the details of the dynamics, OTOC⁽²⁾ makes an ideal probe.
The researchers designed a single-parameter learning experiment: first simulate a "quantum system with unknown rules", then measure that system's OTOC⁽²⁾ signal, and finally adjust the parameter until the simulated signal matches the measured one. In this way they pinned down the unknown phase with a small error.
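The simulate-measure-adjust loop described above can be sketched in miniature. This toy Python example substitutes a simple damped cosine for the OTOC⁽²⁾ response (a placeholder, not the paper's model), then scans candidate phases and keeps the one whose simulated signal best matches a noisy "measured" signal:

```python
import numpy as np

rng = np.random.default_rng(1)
ts = np.linspace(0, 2 * np.pi, 50)

def signal(phi):
    """Placeholder for the OTOC(2) response as a function of the unknown phase."""
    return np.cos(ts + phi) * np.exp(-0.1 * ts)

phi_true = 0.7                                    # the "unknown rule" to recover
measured = signal(phi_true) + rng.normal(scale=0.02, size=ts.size)

# Scan candidate phases and keep the least-squares best match
candidates = np.linspace(0, np.pi, 1000)
errors = [np.sum((signal(p) - measured) ** 2) for p in candidates]
phi_est = candidates[int(np.argmin(errors))]
print(f"estimated phase: {phi_est:.3f}")
```

The same matching logic applies in the real experiment, except that the "simulated signal" there comes from running a parametrized model of the quantum system rather than a closed-form function.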
This shows that OTOC⁽²⁾ is not only able to detect special phenomena but can also be used to solve practical problems, such as analyzing real quantum materials (for example, solid-state NMR systems) and inferring their internal interaction rules.
This breakthrough also relies on the hardware advantages of the Willow chip. Last year, the chip demonstrated its ability to handle complex quantum states in the "random circuit sampling" test. Its support for the "Quantum Echoes" algorithm now rests mainly on two key characteristics, an extremely low error rate and high-speed operation, which together meet the algorithm's demands on computational complexity while ensuring accurate results.
After continuous improvement since its release, the current generation of the Willow chip delivers first-class performance at scale. Across the full 105-qubit array, single-qubit gate fidelity reaches 99.97%, entangling-gate fidelity 99.88%, and readout fidelity 99.5%, with all operations running in tens to hundreds of nanoseconds.
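To see why per-gate fidelity matters so much at this scale, here is a back-of-envelope estimate (an illustration, not a figure from the paper): ignoring correlated errors and error correction, the overall fidelity of a circuit decays roughly as the product of its per-operation fidelities.

```python
# Rough model: circuit fidelity ≈ (per-gate fidelity) ** (number of gates)
f_2q = 0.9988          # reported Willow entangling-gate fidelity
for n_gates in (100, 1000, 5000):
    print(f"{n_gates} gates -> circuit fidelity ≈ {f_2q ** n_gates:.3f}")
```

Even at 99.88% per gate, a few thousand entangling gates drive the raw circuit fidelity toward zero, which is why deep algorithms like Quantum Echoes demand error rates this low and, eventually, error correction.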
Looking ahead, the Google Quantum team says it will focus on R&D for "long-lived logical qubits", laying the groundwork for a larger-scale, error-corrected, practical quantum computer.
Paper URL:
https://www.nature.com/articles/s41586-025-09526-6
Reference Links:
[1] https://blog.google/technology/research/quantum-hardware-verifiable-advantage/
This article is from the WeChat official account “Quantum Bit”. Author: Meng Chen. Republished by 36Kr with permission.