Scientists working on quantum computing have been making friendly wagers for years. Adán Cabello, from the University of Seville (Spain), is heading to Rome soon to collect on a decade-old bet (a fancy dinner) with a friend about this year’s Nobel laureate in physics. Four years ago, however, Spanish researcher Miguel Navascués lost a wager because he didn’t believe a 50-qubit quantum computer could be built before 2050. It cost him €50 worth of hamburgers. Time has favored the optimists, but quantum computing continues to face the fundamental challenge of increasing computing capacity while reducing error rates. Alejandro González Tudela, a research scientist at the Spanish National Research Council’s (CSIC) Institute for Theoretical Physics in Murcia (Spain), is working on a new approach to the problem: combining the novel capabilities of metamaterials (structures with unusual attributes) with the quantum properties of light. His research program has been awarded $20 million in Leonardo grant funding from the BBVA Foundation since 2014.
In conventional computing, the basic unit of information is the bit, which is binary: it can only take one of two values, 0 or 1. Combinations of bits give computers extraordinary capabilities, but in quantum computing the basic unit is the quantum bit, or qubit: a quantum system that can be in one of two states (0 and 1), or in any superposition of them. Superposition is the ability of a quantum system to occupy multiple states at the same time until it is measured. Because a register of n qubits can hold a superposition of 2^n basis states, its capacity grows exponentially with every qubit added. According to CSIC researcher Alberto Casas, “A quantum computer of 273 qubits will have more memory than there are atoms in the observable universe.”
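Casas’s comparison can be checked with a few lines of arithmetic: an n-qubit register is described by 2^n complex amplitudes, so 273 qubits already index more basis states than the roughly 10^80 atoms commonly estimated for the observable universe. A minimal Python sketch (the equal superposition and the atom-count estimate are illustrative assumptions, not from the article):

```python
import math

# A single qubit is a pair of complex amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1. An equal superposition of 0 and 1:
alpha = beta = 1 / math.sqrt(2)
assert abs(alpha ** 2 + beta ** 2 - 1) < 1e-12  # normalization holds

def basis_states(n_qubits: int) -> int:
    """Number of classical basis states an n-qubit register spans (2^n)."""
    return 2 ** n_qubits

ATOMS_IN_UNIVERSE = 10 ** 80  # common order-of-magnitude estimate
print(basis_states(273) > ATOMS_IN_UNIVERSE)  # prints True
```

The point of the sketch is the doubling: each added qubit doubles the number of amplitudes a simulator must track, which is why classical machines run out of memory somewhere around 50 qubits.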
The problem is that this quantum property of superposition is elusive, and can only remain stable for a short time. The slightest environmental change (temperature, electromagnetic noise or vibrations) degrades this property and makes it impossible for quantum computers to effectively perform practical, large-scale calculations. This effect is known as quantum decoherence.
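Decoherence can be pictured with a toy model: under pure dephasing, the off-diagonal element of a qubit’s density matrix (the part that encodes the superposition) decays exponentially with a characteristic coherence time T2, leaving only classical probabilities on the diagonal. The function and the T2 value below are illustrative assumptions, not taken from any study:

```python
import math

def off_diagonal(t: float, t2: float) -> float:
    """Coherence of an equal superposition under pure dephasing:
    rho_01(t) = 0.5 * exp(-t / T2)."""
    return 0.5 * math.exp(-t / t2)

T2 = 1.0  # coherence time, arbitrary units (illustrative assumption)
for t in (0.0, 1.0, 5.0):
    print(f"t = {t}: |rho_01| = {off_diagonal(t, T2):.4f}")
# The off-diagonal term shrinks toward 0: the superposition degrades
# into a classical mixture, which is what decoherence means here.
```

A computation only gets useful answers while that off-diagonal term is still appreciably nonzero, which is why every quantum platform fights to extend T2.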
A recent study published in Nature Physics by British, American and Chinese scientists used a 30-qubit programmable superconducting processor to demonstrate that “quantum information processing applications can be tuned to interact with each other while maintaining coherence for an unprecedented duration.” Error correction is another route, but that technique requires significantly increasing the number of qubits – itself one of the central challenges of quantum computing.
But González is taking an innovative approach to the problem. He is using metamaterials, structures with unusual attributes, to create quantum devices that can attain more qubits without increasing error rates. “The properties of these metamaterials,” said González, “are modulated at scales below the wavelength of light to achieve rare responses, like making a material invisible or focusing light beyond its limits.”
“The hypothesis,” said González, “is based on the fact that light has very good coherence [it easily preserves its quantum properties]. So the goal is to exploit the metamaterials’ very strong responses to light in order to improve fidelity.”
Advantages and disadvantages
The idea is to take advantage of light’s capacity for maintaining its quantum properties, since it interacts very little with the environment. However, the disadvantage of using light is that it’s difficult to manipulate, says González.
González decided to use metamaterials in his research after the recent development of networks of atoms separated by very short distances made it possible to exploit the quantum behavior of light. “By placing the atoms at very short distances, they behave collectively and can have very strong interactions with light,” said González. These strong collective interactions should allow metamaterials to manipulate light while preserving its coherence, overcoming the difficulty of handling individual photons. The ultimate goal is to develop computer hardware that solves the problem of scalability – a quantum computer with more qubits and fewer errors.
“It’s interesting,” said González, “to explore alternative paradigms. I’m not saying that my approach will result in the breakthrough that solves the problem and becomes the definitive platform. Right now, the best quantum computing implementations use trapped ions or superconducting circuits, but there is also quantum technology based on photons. Perhaps the big leap forward will come from something that is completely off the radar, or from a combination of solutions.” Nevertheless, González strongly feels the need to blaze new trails with projects like the one that was awarded the Leonardo grant. Alberto Casas agrees. “The future of quantum computing is unknown, but it is undoubtedly worth exploring,” he writes in his recently published book, The Quantum Revolution.
The value of quantum computing does not lie in solving the factoring problems used to benchmark these systems, nor in figuring out logistical puzzles like the best transportation routes between cities. Beyond cryptography, González says the biggest aspirations for this technology are to enable secure communications and to solve “certain physics and chemistry problems. These are multi-faceted issues with many interacting elements that are hard to solve using traditional computers.”
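Factoring large numbers is the classic benchmark for quantum machines because the classical-quantum gap is so stark: a simple classical method like trial division takes on the order of sqrt(N) steps, which is exponential in the number of digits of N, while Shor’s quantum algorithm runs in polynomial time. A classical sketch for comparison (the function is ours, not from the article):

```python
import math

def trial_division(n: int) -> tuple[int, int]:
    """Factor n by trial division: ~sqrt(n) steps in the worst case,
    i.e. exponential in the digit length of n."""
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d, n // d
    return n, 1  # no divisor found: n is prime

print(trial_division(91))  # prints (7, 13)
```

For a toy number this is instant, but doubling the digit count squares the work, which is why factoring a cryptographic-size number classically is considered infeasible.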
The pharmaceutical industry is one area where quantum computing can provide an “exponential advantage” in the development of personalized therapies, says González. “Maybe new problems will be identified that could benefit from quantum computing, or new applications that we haven’t yet imagined will be developed.”
Scientists from Trinity College in Dublin (Ireland) published a paper in the Journal of Physics Communications that describes the quantum behaviors of brains, consciousness and short-term memory processes. “Quantum brain processes could explain why we can still outperform supercomputers when it comes to unforeseen circumstances, decision-making and learning new things,” said co-author Christian Kerskens, a physicist with Trinity College’s Institute of Neurosciences. According to the study, “If advanced multidisciplinary approaches validate the results of this study, it will improve the general understanding of how the brain works and lead to innovative technologies for building even more advanced quantum computers.”
Spain is an active competitor in the quantum race, not only in basic research but also in technological innovation. The Barcelona Supercomputing Center was selected by the European High Performance Computing Joint Undertaking (EuroHPC JU) to host and operate its first quantum computers. The new infrastructure will be installed and integrated with the MareNostrum 5 supercomputer, the most powerful computer in Spain and one of the most advanced in Europe. The QuantumSpain program will invest €12.5 million in this project, which is being equally co-financed by the European Union and Spain’s Secretariat for Digitization and Artificial Intelligence (SEDIA). “This new infrastructure, which will integrate quantum computing with MareNostrum 5, will enable us to advance multiple academic applications,” said Mateo Valero, director of the Barcelona Supercomputing Center, in a statement. The Barcelona facility will connect to a network of supercomputers in Germany, Czechia, France, Italy and Poland to serve the growing demand for quantum computing resources and services from European industry, and to support research in areas such as health, climate change, logistics and energy use.