
Research inches toward quantum supremacy with results unattainable by classical computing

The experiment attained precise measurements using a processor of only 127 qubits and an error mitigation strategy

An IBM scientist checks a quantum computer at the company's research center in New York. CONNIE ZHOU
Raúl Limón

Quantum computing is now a scientific reality, despite evident limitations. A new study in Nature demonstrates the ability of a 127-qubit processor (the current capacity of commercially available quantum computers) to measure physical quantities in a complex simulation while keeping computational errors under control. Göran Wendin and Jonas Bylander from Sweden’s Chalmers University of Technology said the experiment demonstrates “that quantum processors are potentially useful for certain calculations, despite errors.” Although it doesn’t definitively prove quantum supremacy (the ability to solve problems classical computers cannot address), the experiment shows that quantum computing, aided by error-mitigation procedures, can surpass current classical computational techniques. This breakthrough could eventually lead to significant advancements in the field.

IBM’s Summit supercomputer can perform around 200 quadrillion calculations per second (200 petaflops), but a quantum computer can, in principle, go far beyond that thanks to superposition, a property that enables particles to be in two states simultaneously. While classical computing uses bits as the minimum unit of data, quantum computing uses qubits. Two bits can hold only one of their four possible values at a time, but since a qubit can represent both 0 and 1 at once (superposition), two qubits can encode all four values simultaneously. Ten qubits can therefore encode 1,024 numbers at the same time, because each additional qubit doubles the capacity, and that doubling compounds exponentially.
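As a rough illustration of that doubling (purely back-of-the-envelope Python, not taken from the study), the following sketch prints how many basis states a register of n qubits can hold in superposition:

# Illustrative only: each extra qubit doubles the number of basis states
# an n-qubit register can hold in superposition (2**n in total).
for n_qubits in (1, 2, 10, 127):
    print(f"{n_qubits:3d} qubits -> {2 ** n_qubits:,} simultaneous basis states")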

Quantum superpositions are fragile: interactions with the environment degrade them into classical states, a process known as decoherence. Interference from heat, electromagnetic fields and vibration generates noise and limits how long a superposition can be maintained, often to mere microseconds, constraining computational capacity and leading to errors. Scientists try to mitigate these issues through programming solutions or by searching for elusive particles, such as Majorana particles, that can maintain coherence. They also use complex isolation systems to avoid interference and operate at extremely low temperatures approaching absolute zero (-459°F, or -273°C).

The authors of this study acknowledge that fault-tolerant quantum computing remains beyond the capability of existing technology and today’s processors. Nonetheless, tech giants like Google have taken significant steps toward bringing this elusive goal a little closer.

The new study, published by IBM researchers Youngseok Kim, Andrew Eddins and Abhinav Kandala with other co-authors, demonstrates that a quantum processor, combined with classical post-processing of the measurements, can accurately generate, manipulate and measure complex quantum states. These states cannot be precisely estimated using classical approximations.

The question is not merely one of computational speed, but rather a matter of capacity. “The vast calculation potential offered by 127 qubits cannot be encoded by any classical computer due to memory limitations,” said the study. Wendin and Bylander agree. “The crux of quantum superiority lies in scalability, not speed. The problem encoded in the 127 qubits is beyond the capacity of classical computational memory.”
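A back-of-the-envelope calculation makes that memory claim concrete. A brute-force classical simulation must store one complex amplitude per basis state; assuming the usual 16 bytes per double-precision complex number (an illustrative convention, not a figure from the study), a full 127-qubit state vector would require roughly 2^127 × 16 bytes ≈ 2.7 × 10^39 bytes, vastly more memory than every classical computer on Earth combined.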

One of IBM's first commercial quantum computer models. IBM

The authors credit the experiment results to advances in superconducting processor coherence and calibration, alongside the ability to control noise. These findings provide insights into the power of quantum computing at a time when fault-tolerant computing is still beyond our reach, and offer a foundational tool for near-term quantum applications.

Quantum advantage

A European team led by University of Seville (Spain) professor Adán Cabello was able to observe a strontium ion’s complete quantum state throughout the process, not just at its start and end. This groundbreaking feat, captured in a never-before-seen recording lasting a millionth of a second, was hailed as one of the most remarkable breakthroughs of 2020 by Physics World.

The Nature study used an Ising model, a paradigm originally proposed for studying the ferromagnetic phase transition in materials. However, the physical process wasn’t the objective. Instead, the goal was to show that a commercially available quantum computer, even though it lacks fault tolerance, can achieve reliable measurements on a complex system. In an interview with the Science Media Center (SMC), Carlos Sabín, a theoretical physics researcher at the Autonomous University of Madrid (UAM), said, “The Nature study asks if we can do anything useful with today’s quantum computers that only have a small number of qubits and relatively high error probabilities. The experiment proves that we can, but only if we use the artifice of error mitigation.”
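For orientation, the transverse-field Ising model referred to above is described, in a standard textbook form (the paper's exact couplings and sign conventions are not reproduced here), by a Hamiltonian of the type

H = -J \sum_{\langle i,j \rangle} Z_i Z_j + h \sum_i X_i

where the first sum runs over pairs of neighboring qubits on the chip, Z_i and X_i are Pauli operators, J sets the strength of the spin-spin coupling and h that of the transverse magnetic field.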

“The authors show that their [IBM’s] machine, after error mitigation, does provide reliable results when calculating physical quantities of the system,” Sabín said. “If these results are confirmed (by Google’s competing team, for instance), it would mean a first step in proving the usefulness of today’s relatively small and noisy quantum computers when aided by error mitigation. Although this calculation has no practical application, since the parameter values chosen to demonstrate quantum superiority likely don’t correspond to real physical systems, it is nonetheless based on a physical model, Ising’s. Therefore, comparable machines could potentially address equally complex models with more practical applications, using an approach based on error mitigation rather than error correction.”
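To make that distinction concrete: error mitigation leaves the noise in the hardware and instead post-processes measurements taken under controlled conditions. One widely used technique of this family is zero-noise extrapolation, a variant of which underpins the IBM experiment. The sketch below is a minimal, illustrative version with made-up numbers, not the authors' code:

# Minimal, illustrative sketch of zero-noise extrapolation (not IBM's code).
# Idea: run the same circuit while deliberately amplifying the noise by known
# factors, then extrapolate the measured expectation value back to zero noise.
import numpy as np

noise_factors = np.array([1.0, 1.5, 2.0, 3.0])    # hypothetical amplification levels
noisy_values = np.array([0.81, 0.74, 0.68, 0.57])  # hypothetical measured values

# Fit a low-degree polynomial to the noisy data and evaluate it at zero noise.
coefficients = np.polyfit(noise_factors, noisy_values, deg=2)
mitigated_estimate = np.polyval(coefficients, 0.0)

print(f"Value at native hardware noise:   {noisy_values[0]:.2f}")
print(f"Zero-noise extrapolated estimate: {mitigated_estimate:.2f}")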

Juan José García-Ripoll, a research scientist at Spain’s Institute of Fundamental Physics (IFF-CSIC), praised the exceptional work showcasing the computing capabilities of IBM’s 127-qubit quantum computer and summarized the study’s conclusions in an interview with the SMC. “Quantum computers, despite their imprecision, have the unique capability to effectively simulate complex physics problems. While errors inevitably occur during computation, the protocol still enables us to derive highly accurate, quantitative predictions. For this type of endeavor, the simulation techniques employed by classical computers are less precise than their quantum counterparts.”

According to the Spanish physicist, the outcome “may not necessarily be definitive,” even though quantum computing has already produced larger processors such as IBM’s Osprey, boasting 433 qubits. “Perhaps in the future, advances in tensor networks [classical methods for tackling problems like those addressed in the Nature study] may enable other scientists to surpass the capabilities of this 127-qubit processor.”

Göran Wendin and Jonas Bylander share a similar viewpoint. “This breakthrough likely doesn’t help us apply quantum computing to relevant industrial problems. Those computations require far more qubits and significantly more operations to be competitive with high-performance supercomputers. Quantum computations would inevitably drown in all the noise.”
