Searching for a quantum advantage in strong-field quantum electrodynamics

4/8/2026 Daniel Inafuku for Illinois Physics


For the last 80 years, the theory of quantum electrodynamics (QED), which describes all electromagnetic interactions, has been a cornerstone of the standard model, withstanding the scrutiny of countless experiments and agreeing with observations to extraordinary precision. Yet some high-intensity regimes of QED remain unexplored, prompting some to wonder whether quantum computers could handle these regimes’ inherent complexity.

Physicists at the University of Illinois Urbana-Champaign are now testing quantum simulations of these so-called strong-field QED (SFQED) processes, having recently translated several of them into the language of quantum computing. Their latest work introduces an innovative method for simulating an SFQED process known as polarization flip on a quantum computer, setting a new benchmark for quantum simulations of high-energy phenomena. This research was published in the journal Physical Review D on March 9, 2026.

Physics in the strong-field limit

For the most part, QED is a well-understood theory. Its predictions agree with experiment to within one part per trillion, comparable to predicting the Earth’s diameter to within a fraction of the width of a human hair. This has earned QED its reputation as one of the most accurate theories in science.

There are some scales, however, at which QED has seldom been put to the test.

“We understand QED pretty well in vacuum and for small numbers of particles, as well as various classical regimes,” said Illinois Physics Professor Patrick Draper, who led the research team. “But there are some high-intensity regimes where we just don’t know what happens.”

In these SFQED regimes, electromagnetic fields can soar to strengths quadrillions of times greater than those found on Earth, giving rise to strange physics: photons can scatter off each other, spontaneously decay into matter, and even pop out of thin air.

Thus far, observations of SFQED phenomena have been limited to high-intensity laser and particle-collision experiments. Although these phenomena have never been directly observed in nature, they’re thought to occur in some of the universe’s most extreme environments, including those near black holes and highly magnetized neutron stars. Physicists also suspect SFQED could harbor never-before-seen effects, and perhaps even physics beyond the standard model.

Artist’s impression of a magnetar, a highly magnetized neutron star. Magnetic fields around magnetars are so strong that some believe they can make electron-positron pairs pop out of empty space. Image credit: ESO/José Francisco (josefrancisco.org)

Although terrestrial experiments may never reach astrophysical intensities, experiments probing intermediate-intensity regimes are in the works. One such experiment is the ongoing E320 project at SLAC National Accelerator Laboratory, which collides high-energy electrons and photons. Another is the upcoming LUXE (Laser Und XFEL Experiment) collaboration at the Deutsches Elektronen-Synchrotron (DESY) in Germany, which, in addition to producing electron-photon collisions like E320, will also investigate photon-photon collisions. Unsurprisingly, physicists are now hard-pressed to find theoretical and computational tools that can keep up with these experimental efforts.

Computational simulation is the most accessible strategy for tackling SFQED, but the theory’s inherent complexity and many-body, non-equilibrium nature present enormous challenges. Some speculate that quantum computers, which are predicted to run exponentially faster than their classical counterparts on certain problems, could offer solutions.

Draper is part of a growing contingent of physicists searching for a quantum advantage over classical computers. He said, “Quantum computing is at a very young stage where we don’t know the complete class of problems for which it will be useful. It may be useful, for example, in some domains of science such as high-energy physics.”

Draper said that with anticipated technological advances in quantum computers, useful SFQED-scale simulations could be possible within the next several decades. He wants to be ready when that happens.

“Quantum computers aren’t at a useful stage yet, but I think we’re on the cusp of reaching that point,” he continued. “There’s a lot of work to be done in benchmarking them on current hardware, understanding what algorithms we can run, and developing new ones.”

Continuous vs. discrete

But QED and computation are at odds: field theories such as QED treat spacetime as smooth and continuous, whereas computers, both classical and quantum, are discrete, operating in finite “chunks.” Reconciling this continuous-versus-discrete mismatch requires discretizing relevant quantum variables, such as position or momentum, as well as encoding particles and their interactions into the language of qubits (quantum bits, whose states can be superpositions, or sums, of special quantum states) and quantum gates (mechanisms for manipulating qubits).
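For readers who want to see the idea concretely, here is a minimal sketch of discretization in Python. The cutoff and lattice size are made-up illustrative values, not the parameters used in the research:

```python
import numpy as np

# Minimal sketch of discretization (made-up numbers, not the paper's
# parameters): a continuous momentum variable is replaced by a finite
# lattice, truncated at +/- p_max so it fits in a finite computer.
p_max = 2.0                                # momentum cutoff
n_points = 9                               # number of lattice sites
lattice = np.linspace(-p_max, p_max, n_points)
dp = lattice[1] - lattice[0]               # lattice spacing

print(lattice)   # the only momentum values the simulation can represent
```

Everything between lattice points, and everything beyond the cutoff, is simply invisible to such a simulation, which is the root of the continuous-versus-discrete tension.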

In 2024, Draper’s team studied a simple SFQED process, nonlinear Breit-Wheeler pair production (BWPP), in which a photon decays into an electron and a positron in the presence of a strong electromagnetic field. The researchers chopped up the momenta of the particles, restricting them to only discrete values.

Feynman diagram of nonlinear Breit-Wheeler pair production. In the presence of a strong background field (not shown), a photon 𝛾 spontaneously disintegrates into an electron e− and a positron e+. Credit: L. Hidalgo, P. Draper, Phys. Rev. D 109, 076004, Apr. 8th, 2024

In addition, they encoded electrons, positrons, and photons as quantum states called Fock states, which track the appearance or disappearance of the particles as BWPP unfolds. They also derived and discretized its time-evolution operator, a mathematical object that enables physicists to see how quantum states change over time, decomposing the operator into a series of matrix gates.
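As a rough illustration of what a time-evolution operator is, the toy example below builds U(t) = exp(−iHt) for a made-up two-level Hamiltonian (not the model from the paper) and checks the defining property that U is unitary:

```python
import numpy as np

# Toy two-level example (not the paper's model) of a time-evolution
# operator U(t) = exp(-i H t), built by diagonalizing a Hermitian H.
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])                # made-up Hamiltonian
t = 0.3
evals, V = np.linalg.eigh(H)               # H = V diag(evals) V^dagger
U = V @ np.diag(np.exp(-1j * evals * t)) @ V.conj().T

# Any legitimate time-evolution operator is unitary: U U^dagger = I.
print(np.allclose(U @ U.conj().T, np.eye(2)))   # True
```

Decomposing such an operator into quantum gates amounts to rewriting a big unitary matrix like U as a product of the small unitaries a quantum computer can actually execute.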

Combining these two ingredients—Fock-state qubits and matrix gates—the team rewrote BWPP as a quantum circuit that evolves the qubits forward in time, simulating the circuit on an actual IBM quantum computer through a cloud service. They also performed classical simulations of BWPP as a benchmark reference.

Using clever error-mitigation strategies to curb noise, the team found that their quantum simulations’ results closely matched their classical reference, deviating at most by 15 percent, suggesting that with improved resistance to noise, quantum computers may soon perform just as well as their classical counterparts.

Tackling one-loop SFQED

But what about more complicated SFQED processes? The simplest ones such as BWPP are known as tree level, so called because their Feynman diagrams depict interactions as junctions resembling tree branches. More complicated processes include loops, which look like closed rings and denote quantum corrections—“adjustments” made to a model that account for fine quantum interactions.

In the current work, Draper’s team considered a prototypical one-loop SFQED process known as polarization flip, in which a photon traveling through an intense electromagnetic field splits into an electron-positron pair. The electron and positron then recombine, producing an outgoing photon with a new polarization. The upshot is that the field imparts angular momentum to the incoming photon, flipping its polarization. (For example, an initially vertically polarized photon can become horizontally polarized.)

Feynman diagram of polarization flip, with time advancing from left to right. An incoming photon 𝛾1 (left) interacts with a strong background electromagnetic field (not shown). The field imparts angular momentum to 𝛾1, splitting it into an electron e− and a positron e+, which recombine to produce an outgoing photon 𝛾2 with a new polarization (polarizations not shown). Credit: P. Draper, L. Hidalgo, A. Ilderton, Phys. Rev. D 113, 056010, Mar. 9th, 2026

“Polarization flip is the next step up in complexity after Breit-Wheeler pair production and introduces a whole host of new complications,” Draper explained. “For example, when you discretize momenta as we did for Breit-Wheeler, you have to truncate them somewhere—you can’t have infinite momentum. Truncation works fine at tree level, but at loop level, quantum effects raise new hurdles.”

In particular, discretizing and imposing momentum cutoffs at loop level introduce unwanted, physically meaningless effects. When these arise, they must be compensated for by including quantities known as counterterms, whose parameters must be suitably tweaked to cancel out the unphysical effects—part of a procedure beloved by theorists called renormalization.

Draper summarized, “Essentially, we need to put in ‘bad stuff’ by hand to fix ‘bad stuff’ generated by our violent truncation and discretization.”

Another complication is that, unlike BWPP, for which only a few Fock states were needed, simulating polarization flip requires many more Fock states.

Illinois Physics graduate student and co-author Luis Hidalgo detailed, “A handful of Fock states were sufficient for Breit-Wheeler pair production because we didn’t consider that many particles—just three. However, loops represent an infinite number of possible particles. If we encoded Fock states using the same encoding we used for pair production, we’d need potentially thousands of qubits.”

More qubits mean a bigger quantum computer, an ongoing technological hurdle. On the other hand, fewer qubits often require more gates: more entry points, so to speak, for noise to creep in. To strike a balance between qubit and gate counts, the researchers invented a new encoding called an n-choose-k encoding: given n qubits, it represents each Fock state using only those n-qubit basis states in which exactly k qubits are “on,” a scheme that enabled the researchers to encode many Fock states in a tractable way.
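The counting behind this kind of encoding can be sketched in a few lines of Python (illustrative only; the paper’s construction may differ in detail). Keeping only the n-qubit basis states with exactly k qubits “on” yields “n choose k” usable codewords:

```python
from itertools import combinations

# Counting sketch for an n-choose-k style encoding (illustrative only;
# the paper's construction may differ in detail): of the 2**n basis
# states of n qubits, keep only those with exactly k qubits "on."
n, k = 6, 3
codewords = [
    sum(1 << q for q in ones)      # bitmask with the chosen qubits set
    for ones in combinations(range(n), k)
]

print(len(codewords))   # 20, i.e. "6 choose 3" Fock states in 6 qubits
```

The trade-off the researchers describe is visible here: such an encoding spends more qubits per state than a dense binary encoding would, but the extra structure can make the gates that act on the states cheaper.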

Hidalgo described, “With the n-choose-k encoding, you may need more qubits compared to other encodings, but often your gate costs go down, cutting down the number you need by about half. This might not sound like much, but often these small factors can make or break the feasibility of a quantum simulation.”

After calculating the required counterterms, the researchers encoded their particles as n-choose-k qubits and derived the counterterm-modified time-evolution operator, decomposing it into matrix gates. Next, they combined their encoded qubits and gates into a quantum circuit, mirroring their strategy for BWPP.

Finally, the researchers broke down polarization flip into small time steps, a technique called Trotterization, to perform two types of classical simulations (see diagram): a reference (black plot) that exactly reproduces the correct physics, and three simulations (blue, red, and yellow plots) of their quantum circuit, to compare against the reference and determine the feasibility of future simulations on an actual quantum computer.
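Trotterization can be illustrated with a toy example (made-up 2×2 matrices, not the SFQED Hamiltonian): evolution under H = A + B is approximated by many short alternating steps under A and under B, and the approximation improves as the steps get smaller:

```python
import numpy as np

def expm_herm(M, t):
    """exp(-i M t) for a Hermitian matrix M, via diagonalization."""
    evals, V = np.linalg.eigh(M)
    return V @ np.diag(np.exp(-1j * evals * t)) @ V.conj().T

# Toy Trotterization (made-up matrices, not the SFQED Hamiltonian):
# approximate evolution under H = A + B by alternating short steps
# under A and under B. A and B deliberately do not commute.
A = np.array([[0.0, 1.0], [1.0, 0.0]])
B = np.array([[1.0, 0.0], [0.0, -1.0]])
t, steps = 1.0, 100
dt = t / steps

exact = expm_herm(A + B, t)
step = expm_herm(A, dt) @ expm_herm(B, dt)
trotter = np.linalg.matrix_power(step, steps)

err = np.max(np.abs(trotter - exact))
print(err)   # small; the first-order error shrinks like 1/steps
```

Each Trotter step translates into a layer of quantum gates, which is why finer time steps, while more accurate, drive up the gate counts discussed below.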

Plots of the probability of polarization flip versus time x+, computed several ways: a reference simulation (black plot) and three classical simulations of the quantum circuit of varying performance quality (blue, red, and yellow plots). Observe that the yellow plot matches well with the black plot. However, this agreement requires many more gates than current quantum computers possess. Credit: P. Draper, L. Hidalgo, A. Ilderton, Phys. Rev. D 113, 056010, Mar. 9th, 2026

The researchers found that their best-performing quantum-circuit simulation—the yellow plot—closely matched their reference simulation. However, this agreement comes at a significant cost.

“Ultimately, we found that we’d need many quantum gates, far more than a current quantum computer could handle without returning pure noise,” Hidalgo stated.

In other words, the issue isn’t that current quantum computers don’t possess the required number of qubits. Rather, the issue is that quantum gates still introduce too much noise for the proposed quantum simulations to be feasible.

Draper summed up, “The cost was a little too high, about a factor of 5 to 10 too many operations on current quantum computers. It wouldn’t even be worthwhile attempting to run this because you’d just get noise.”

It’s important to keep in mind, however, that this result holds for this specific n-choose-k encoding as well as the Trotterization simulation method. Who’s to say that another encoding or simulation method wouldn’t work?

Carving out a future for quantum simulation

In spite of this null result, the team remains optimistic, emphasizing that this new benchmark will inform their next steps.

“Every new complication introduces new problems, and we had to invent new tools to solve them,” Draper remarked. “Studying this simple process taught us something that we’ll need in order to tackle more complex processes in the future.

“I would also like to start thinking about other encodings and what we’ll need to study processes at tree level in which lots of particles are produced. That will need different techniques, different encodings, and probably raise problems people haven't even thought about yet.”

Hidalgo added, “One future direction is to study the counterterms necessary to simulate multi-loop processes as well as the costs associated with such simulations.

“Another is to simulate SFQED on a spatial lattice instead of a momentum lattice. It’s less straightforward to simulate particle collisions and scattering this way, but it may turn out to be less expensive in terms of qubits or gates.”

Looking forward, Draper draws comparisons between the current era of quantum computing and the early days of lattice QCD, a widely used computational technique for studying the strong nuclear force.

“Quantum computing is at a stage similar to where lattice QCD was in the 1970s and ’80s, when computers weren’t powerful enough to deliver precise results,” he shared. “But by building algorithms and discovering how to construct good simulations, eventually the hardware got to a point where suddenly, by the early 2000s, lattice QCD became a powerful tool.

“The hope is that, on some shorter time scale, by incrementally pushing the forefront of what's possible, quantum computing will follow a similar trajectory.”

 

This research was funded by the U.S. Department of Energy’s Office of Science and Office of High Energy Physics Quantum Information Science Enabled Discovery (QuantISED) program and by the STFC consolidated grant “Particle theory at the Higgs Centre” under Grant No. ST/X000494/1. This research also used the Delta supercomputer at the National Center for Supercomputing Applications under Allocation No. PHY230137 from the Advanced Cyberinfrastructure Coordination Ecosystem: Services & Support (ACCESS) program, which is funded by the National Science Foundation under Grant Nos. 2138259, 2138286, 2138307, 2137603, and 2138296. Any opinions, findings, conclusions or recommendations expressed in this material are those of the researchers and do not necessarily reflect the views of the funding agencies.

 



Daniel Inafuku graduated from Illinois Physics with a PhD and now works as a science writer. At Illinois, he conducted scientific research in mathematical biology and mathematical physics. In addition to his research interests, Daniel is a science video media creator.



 





This story was published April 8, 2026.