Bo Peng eagerly awaited the results of his algorithm running on a quantum computer. If successful, the run would churn out the correct energy values for a specific quantum system. Just as we know that two plus two equals four, Peng’s team knew the numbers the computer should generate.
But would it? At that point, Peng and his team had executed the algorithm on a machine that only simulated a quantum computer. Now they were executing it on a physical quantum computer built by IBM.
“Doing quantum information research will surprise you, force you to think about a subject from a totally different perspective.” — Bo Peng, Q-NEXT, Pacific Northwest National Laboratory
“Initially I thought it would be just an attempt. I didn’t expect too much from it because we’d heard people say that the computer noise was overwhelming,” said Peng, a scientist at the U.S. Department of Energy’s (DOE) Pacific Northwest National Laboratory (PNNL) and collaborator with Q-NEXT, a DOE National Quantum Information Science Research Center led by DOE’s Argonne National Laboratory. “Well, it turned out that the result was really beautiful.”
Beautiful — in other words, accurate. The run yielded the quantum system’s true, known energy values. The computer’s anticipated noise didn’t get in the way of good answers.
Noise — sources of uncertainty that can lead to fuzzy measurements or bad calculations — is a fact of life. Noise in an image recognition algorithm might cause a program to misidentify a wheelbarrow as a bicycle. Noise in your home’s electrical wiring could cause lights to flicker. Researchers correct for error-causing noise by accounting for it in their measurements or building fail-safes in their hardware.
Combating noise is one of the biggest goals of the quantum computing community. It’s a formidable challenge, but solving the problem will be worth the effort: Quantum computing holds great promise for solving problems that today’s classical computers can’t. By harnessing the behavior of nature at its smallest scales, quantum computers can help accelerate the timeline for drug development or save energy by instantly finding the best cross-country routes for service vehicles, for example.
Part of the fulfillment of that promise depends on correcting errors inherent in quantum computing.
One technique is to design qubits to be less noisy.
The great compression
The qubit is the quantum version of the traditional computing bit. Qubits can take a variety of forms, such as particles of light or superconducting circuits.
In his research for Q-NEXT, Peng focuses on superconducting circuits, which can be programmed to calculate how a quantum system changes over time.
Say you want to see how a quantum system, such as an array of atoms, responds after getting a magnetic kick. The more steps you need to describe the atoms’ quantum evolution, the longer the circuit you need to represent it, and the more circuit layers you need to accommodate the length.
Unfortunately, more layers mean a noisier qubit. Peng and his team set themselves the challenge of squeezing the circuit’s functionality into fewer layers without sacrificing accuracy. Peng’s team includes Argonne scientists Yuri Alexeev and Sahil Gulania and PNNL scientist Niri Govind.
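The tradeoff above can be sketched with plain numpy. This is an illustrative toy, not the team's actual code: it evolves a hypothetical two-qubit Hamiltonian with first-order Trotterization, where each time step contributes one layer of single-qubit gates and one layer of two-qubit gates, and shows that accuracy improves only as the step count (and hence the circuit depth) grows.

```python
# Illustrative sketch (not Peng's code): first-order Trotterized time
# evolution of a toy 2-qubit Hamiltonian. More Trotter steps mean more
# circuit layers but a more accurate evolution -- the depth/accuracy
# tradeoff that circuit compression addresses.
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

A = np.kron(X, I2) + np.kron(I2, X)   # single-qubit "kick" terms
B = np.kron(Z, Z)                     # two-qubit coupling
H = A + B

def evolve(M, t):
    """Exact e^{-iMt} via eigendecomposition (M Hermitian)."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.exp(-1j * w * t)) @ V.conj().T

t = 1.0
U_exact = evolve(H, t)

errors = []
for n_steps in (1, 4, 16):
    # one Trotter step = one layer for A, one layer for B
    step = evolve(A, t / n_steps) @ evolve(B, t / n_steps)
    U_trotter = np.linalg.matrix_power(step, n_steps)
    errors.append(np.linalg.norm(U_exact - U_trotter, 2))

print(errors)  # spectral-norm error shrinks as steps (layers) grow
```

On noisy hardware, those extra layers are not free: each one adds gate error, which is why shrinking the depth without losing accuracy matters.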
And they overcame the challenge. Their solution lay in a fundamental mathematical relationship known as the Yang-Baxter equation. By applying this equation, they were able to construct circuits with fewer layers. This compressed circuit enabled them to efficiently perform the time evolution of the system on noisy quantum devices. And importantly, the compressed circuit produced much higher fidelity than its many-layered counterpart.
“You increase the fidelity of the final quantum states and increase the accuracy of your simulation,” Peng said. “We show mathematically that the number of layers you end up with to get exactly the same result grows linearly with the number of qubits represented, which means that the number of layers is significantly reduced compared to what we start with.”
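The scaling claim can be made concrete with some back-of-the-envelope arithmetic. The constants below are hypothetical placeholders, not the team's measured figures; the point is only the difference in what the depth depends on.

```python
# Hypothetical layer-count comparison (illustrative constants only).
# Naive Trotter depth grows with the number of time steps; the
# Yang-Baxter-compressed depth grows only with the number of qubits.

def naive_layers(n_steps, layers_per_step=2):
    # assume one single-qubit layer + one two-qubit layer per step
    return n_steps * layers_per_step

def compressed_layers(n_qubits, layers_per_qubit=3):
    # assume O(n) depth after compression, independent of step count
    return n_qubits * layers_per_qubit

n_qubits, n_steps = 5, 1000
print(naive_layers(n_steps))        # depth before compression
print(compressed_layers(n_qubits))  # depth after compression
```

For a long simulation on a small register, the compressed circuit can be orders of magnitude shallower, which is what makes it viable on noisy devices.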
Using the Yang-Baxter equation may be one of several mathematical approaches for compressing qubit circuits.
“Yang-Baxter may be just the starting point of this great compression story. There may be other relationships we can use to do other types of compression,” Peng said. “There might also be other areas where we can apply this, such as optimization problems, in which you find the optimal path for completing some task.”
Refining the moments
Another effort by Peng focuses on a method for error correction called the Hamiltonian moments method — the same technique that yielded his beautiful result on the IBM quantum computer.
Consider the problem of calculating the energies in a molecule. You write its mathematical description with all the necessary specifications. You then encode this description in a way that its energies can be calculated on a quantum device. The device issues its response, and it is noisy: Either it is wrong or too imprecise to be useful.
How do you obtain the true values? Peng found that by applying some advanced algebra to the device’s response, he could refine it, like adding decimal places to a rough measurement. The Hamiltonian moments method enables scientists to approach the true energies of the molecule, rather than having to settle for the quantum device’s initial noisy, error-riddled readout.
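One classic moments-based estimator illustrates the idea: the second-order connected moments expansion (CMX), which refines a raw energy expectation value ⟨H⟩ using the higher moments ⟨H²⟩ and ⟨H³⟩. This is a standard textbook technique shown here purely as an illustration; it is not necessarily the exact variant Peng's team ran, and the single-qubit Hamiltonian and trial state below are toy choices.

```python
# Illustrative sketch (standard CMX(2), not necessarily the team's exact
# method): classical post-processing of Hamiltonian moments <H^k> refines
# a raw quantum-device energy estimate toward the true ground energy.
import numpy as np

H = np.array([[0.0, 1.0],
              [1.0, 0.0]])          # toy Hamiltonian (Pauli X)
psi = np.array([0.6, -0.8])         # normalized trial state

# raw moments <H^k>, k = 1..3 (on hardware these come from measurements)
m = [psi @ np.linalg.matrix_power(H, k) @ psi for k in (1, 2, 3)]

# connected moments (cumulants) built from the raw moments
i1 = m[0]
i2 = m[1] - i1**2
i3 = m[2] - 3 * i1 * m[1] + 2 * i1**3

# CMX(2) refined ground-state energy estimate
e_cmx = i1 - i2**2 / i3

e_exact = np.linalg.eigvalsh(H)[0]  # true ground energy, for comparison
print(i1, e_cmx, e_exact)
```

Here the raw readout ⟨H⟩ misses the ground energy by a few percent, while the moments-corrected estimate lands far closer, the "adding decimal places" effect described above.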
The method is an example of a quantum-classical hybrid approach to quantum computing. Peng combines the computational power of quantum devices (the molecular model is encoded on a quantum device) with tried and tested traditional computing methods (mathematically correcting the device’s response).
It’s also an alternative to the usual methods of error correction, which are carried out in the quantum device itself.
“There are a bunch of classical-computing methods that we can use to mitigate quantum errors,” Peng said. “We’re brainstorming different methods. We should not limit ourselves to the traditional ways, the ways people are used to thinking about.”
In leveraging classical computation for quantum information science, Peng and others are exploring relatively new territory.
Using quantum devices to see how quantum systems change over time — like using the Yang-Baxter relation to compress qubit circuits — is also a young research area.
“These are emerging areas in quantum science,” Peng said. “Problems in quantum time dynamics can be solved through classical computing. Now we’re bringing these problems into the realm of quantum computing.”
For Peng, switching from classical to quantum computing research meant completely reframing his ways of thinking about computing.
Peng received his doctorate in theoretical chemistry from the University of Washington in 2016 and joined nearby PNNL that year as a prestigious Linus Pauling postdoctoral researcher. He began researching the dynamics of multiple bodies interacting with each other, or many-body theories. Naturally, calculating many-body interactions requires more intensive computation than the single-body modeling many students are trained on.
“We really had to refresh our ideas with many-body theory, think about the same phenomena from a different perspective,” Peng said. “It was a very natural transition for me, moving from traditional physical chemistry to computational science, where we put those many-body theories into software.”
It was also a natural steppingstone to quantum simulations research, one that required him to shift his thinking from classical to quantum computing mode. Peng is motivated by the challenge, as well as by the opportunity to carry out quantum computations on real, physical quantum devices.
“I see that as a drive for all my research activities,” he said.
For those interested in entering quantum science and engineering, Peng emphasizes that educational resources on quantum science are abundant.
“Quantum information science is an interest-driven process. There are so many materials online and free resources that you can use to learn quantum science and engineering: free workshops, videos, lecture notes. If you’re really interested in the field, you can learn what you need to know,” Peng said. “You can become a self-taught expert.”
Whether a self-taught expert or a self-directed researcher breaking new ground in quantum information science, one is compelled to continually rethink how nature works.
“I received most of my training in classical computing, so I tend to think about certain phenomena in traditional ways,” Peng said. “With quantum, a new thing pops up, and then there’s a conflict in your brain that forces you to learn something new. It’s like when a baby learns new things; it might be easier for them. But for adults to learn new things, we have to get rid of some older ways of thinking. Doing quantum information research will surprise you, force you to think about a subject from a totally different perspective.”
Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science.
The U.S. Department of Energy’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.