Supercomputers provide new window into the life and death of a neutron

Glenn Roberts Jr. for Berkeley Lab
5/30/2018

Illinois Physics alumnus worked on Berkeley Lab research team that simulated a sliver of the universe to tackle a subatomic-scale physics problem

Experiments that measure the lifetime of neutrons reveal a perplexing and unresolved discrepancy. While this lifetime has been measured to a precision within 1 percent using different techniques, apparent conflicts in the measurements offer the exciting possibility of learning about as-yet undiscovered physics.

Chia Cheng 'Jason' Chang, Physics Illinois alumnus, is the lead author of a study describing the supercomputer-intensive calculation of a property known as the nucleon axial coupling. He completed the work as a Berkeley Lab postdoctoral researcher. He currently holds an appointment as a research scientist at RIKEN. Photo by Marilyn Chung, courtesy of Berkeley Lab
Now, a team led by scientists in the Nuclear Science Division at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) has enlisted powerful supercomputers to calculate, with unprecedented precision, a quantity known as the “nucleon axial coupling,” or gA, which is central to our understanding of a neutron’s lifetime. Their method offers a clear path to further improvements that may help resolve the experimental discrepancy.

Illinois Physics alumnus Chia Cheng “Jason” Chang is lead author on the paper. Chang received his bachelor’s degree in 2008 and his doctoral degree in 2015, both from the Department of Physics at the University of Illinois at Urbana-Champaign. Chang’s doctoral adviser at Illinois was Professor Aida El-Khadra. These results were achieved while Chang was a postdoctoral researcher in Berkeley Lab’s Nuclear Science Division. Chang currently holds an appointment as a research scientist at the Interdisciplinary Theoretical and Mathematical Sciences Program (iTHEMS) of the Institute of Physical and Chemical Research (RIKEN), Japan.

To achieve their results, the researchers created a microscopic slice of a simulated universe to provide a window into the subatomic world. Their study was published online May 30, 2018, in the journal Nature.

The nucleon axial coupling is more exactly defined as the strength at which one component (known as the axial component) of the “weak current” of the standard model of particle physics couples to the neutron. The weak current is given by one of the four known fundamental forces of the universe and is responsible for radioactive beta decay—the process by which a neutron decays to a proton, an electron, and a neutrino.
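For readers who want the formulas behind this statement, the standard textbook treatment of neutron beta decay (not spelled out in the article itself, and somewhat schematic here) writes the decay and the role of gA as:

```latex
% Neutron beta decay: a neutron becomes a proton, an electron,
% and an (anti)neutrino:
n \;\to\; p + e^- + \bar{\nu}_e

% Schematically, the decay rate (the inverse lifetime) depends on
% the vector and axial couplings g_V and g_A of the weak current:
\frac{1}{\tau_n} \;\propto\; |V_{ud}|^2 \left( g_V^2 + 3\, g_A^2 \right)
```

Because gV is fixed to very near 1 by a symmetry of the standard model, a precise theoretical value of gA, combined with the measured lifetime, becomes a sensitive test for physics beyond the standard model.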

In addition to measurements of the neutron lifetime, precise measurements of neutron beta decay are also used to search for new physics beyond the standard model. Nuclear physicists seek to resolve the lifetime discrepancy and to sharpen these experimental probes by determining gA more precisely.

The researchers turned to quantum chromodynamics (QCD), a cornerstone of the standard model that describes how quarks and gluons interact with each other. Quarks and gluons are the fundamental building blocks for composite particles including neutrons and protons. The dynamics of these interactions determine the mass of the neutron and proton, and also the value of gA.

But sorting through QCD’s inherent complexity to produce these quantities requires the aid of massive supercomputers. In the latest study, researchers applied a numeric simulation known as lattice QCD, which represents QCD on a finite grid.
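As a rough illustration of why that finite grid is so demanding, here is a toy sketch (not the collaboration's actual code, and the lattice sizes below are made-up examples): spacetime is discretized into a four-dimensional grid, and merely storing one snapshot of the gluon fields already requires hundreds of millions of numbers.

```python
# Toy illustration of lattice QCD bookkeeping (not the study's real code).
# Spacetime is discretized into an L^3 x T grid; the gluon fields live on
# the links between neighboring sites, each link holding a 3x3 complex
# SU(3) matrix.

def lattice_degrees_of_freedom(L, T):
    """Count the real numbers needed to store one gluon-field snapshot."""
    sites = L**3 * T            # L sites per spatial side, T in time
    links = sites * 4           # one link per site per spacetime direction
    reals_per_link = 3 * 3 * 2  # 3x3 complex matrix -> 18 real numbers
    return links * reals_per_link

# Hypothetical example sizes: even a modest 32^3 x 64 lattice needs
# roughly 1.5e8 numbers per snapshot, and halving the lattice spacing
# at fixed physical volume multiplies the storage by 2^4 = 16 -- before
# the far steeper cost of the solvers even enters.
print(f"{lattice_degrees_of_freedom(32, 64):,}")    # 150,994,944
print(f"{lattice_degrees_of_freedom(64, 128):,}")   # 2,415,919,104
```

Thousands of such statistically sampled snapshots must then be generated and analyzed, which is why supercomputers are indispensable.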

A type of mirror-flip symmetry in particle interactions called parity (like swapping your right and left hands) is respected by the interactions of QCD, but the axial component of the weak current flips parity; parity is not respected by nature (most of us are right-handed, for example). Because nature breaks this symmetry, the value of gA can only be determined through experimental measurements or theoretical predictions with lattice QCD.

In this illustration, the grid in the background represents the computational lattice that theoretical physicists used to calculate a particle property known as nucleon axial coupling. This property determines how a W boson (white wavy line) interacts with one of the quarks in a neutron (large transparent sphere in foreground), emitting an electron (large arrow) and antineutrino (dotted arrow) in a process called beta decay. This process transforms the neutron into a proton (distant transparent sphere). Image courtesy of Evan Berkowitz/Jülich Research Center, Lawrence Livermore National Laboratory
The team’s new theoretical determination of gA is based on a simulation of a tiny piece of the universe, just a few neutrons across in each direction. They simulated a neutron transitioning to a proton inside this tiny section of the universe in order to predict what happens in nature.

The model universe contains one neutron amid a sea of quark-antiquark pairs that are bustling under the surface of the apparent emptiness of free space.

“Calculating gA was supposed to be one of the simple benchmark calculations that could be used to demonstrate that lattice QCD can be utilized for basic nuclear physics research, and for precision tests that look for new physics in nuclear physics backgrounds,” said André Walker-Loud, a staff scientist in Berkeley Lab’s Nuclear Science Division who led the new study. “It turned out to be an exceptionally difficult quantity to determine.”

This is because lattice QCD calculations are complicated by exceptionally noisy statistical results that have thwarted major progress in reducing uncertainties in previous gA calculations. Some researchers had previously estimated that it would require the next generation of the nation’s most advanced supercomputers to achieve a 2 percent precision for gA by around 2020.
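The noise problem the team confronted has a well-known analytic form, the Parisi–Lepage argument, which the article alludes to but does not state: the signal-to-noise ratio of nucleon correlation functions degrades exponentially with the time separation t between where the neutron is created and where it is measured.

```latex
% Signal-to-noise of a nucleon correlation function at source-sink
% separation t, with N statistical samples (m_N = nucleon mass,
% m_pi = pion mass):
\frac{\mathrm{Signal}}{\mathrm{Noise}}(t)
  \;\sim\; \sqrt{N}\; e^{-\left(m_N - \frac{3}{2} m_\pi\right) t}
```

Since the nucleon mass is much larger than the pion mass, the exponent is large and negative, and the signal is quickly swamped, which is why extracting gA at earlier times is so valuable.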

The team participating in the latest study developed a way to improve their calculations of gA using an unconventional approach and supercomputers at Oak Ridge National Laboratory (Oak Ridge Lab) and Lawrence Livermore National Laboratory (Livermore Lab). The study involved scientists from more than a dozen institutions, including researchers from UC Berkeley and several other Department of Energy national labs.

Chang comments, “Past calculations were all performed amidst this noisier environment, which clouded the results they were seeking.”

Walker-Loud adds, “We found a way to extract gA earlier in time, before the noise ‘explodes’ in your face.”

“We now have a purely theoretical prediction of the lifetime of the neutron, and it is the first time we can predict the lifetime of the neutron to be consistent with experiments,” asserts Chang.

“This was an intense 2 1/2-year project that only came together because of the great team of people working on it,” remarks Walker-Loud.

This latest calculation also places tighter constraints on a branch of physics theories that stretch beyond the standard model, constraints that exceed those set by powerful particle collider experiments at CERN’s Large Hadron Collider. But the calculations aren’t yet precise enough to determine whether new physics has been hiding in the gA and neutron lifetime measurements.

Chang and Walker-Loud noted that the main limitation to further improving the precision of their calculations is the availability of computing power.

“We don’t have to change the technique we’re using to get the precision necessary,” Walker-Loud states.

The latest work builds upon decades of research and computational resources by the lattice QCD community. In particular, the research team relied upon QCD data generated by the MILC Collaboration; an open source software library for lattice QCD called Chroma, developed by the USQCD collaboration; and QUDA, a highly optimized open source software library for lattice QCD calculations.

The team drew heavily upon the power of Titan, a supercomputer at Oak Ridge Lab equipped with graphics processing units, or GPUs, in addition to more conventional central processing units, or CPUs. GPUs have evolved from their early use in accelerating video game graphics to current applications in evaluating large arrays for tackling complicated algorithms pertinent to many fields of science. The axial coupling calculations used about 184 million “Titan hours” of computing power; it would take a single CPU about 75,000 years to work through the same set of calculations.

As the researchers worked through their analysis of this massive set of numerical data, they realized that more refinements were needed to reduce the uncertainty in their calculations.

The Titan supercomputer. Image courtesy of Oak Ridge National Laboratory
The Titan supercomputer. Image courtesy of Oak Ridge National Laboratory
With help from the Oak Ridge Leadership Computing Facility staff, the team efficiently used its 64 million Titan-hour allocation, and it also turned to the Multiprogrammatic and Institutional Computing program at Livermore Lab, which provided additional computing time to resolve the calculations and reduce the uncertainty margin to just under 1 percent.

“Establishing a new way to calculate gA has been a huge rollercoaster,” Walker-Loud comments.

With more statistics from more powerful supercomputers, the research team hopes to drive the uncertainty margin down to about 0.3 percent. “That’s where we can actually begin to discriminate between the results from the two different experimental methods of measuring the neutron lifetime,” Chang explains. “That’s always the most exciting part: When the theory has something to say about the experiment.”

He adds, “With improvements, we hope that we can calculate things that are difficult or even impossible to measure in experiments.”

Already, the team has applied for time on a next-generation supercomputer at Oak Ridge Lab called Summit, which would greatly speed up the calculations.

In addition to researchers at Berkeley Lab and UC Berkeley, the science team also included researchers from the University of North Carolina, the RIKEN BNL Research Center at Brookhaven National Laboratory, Lawrence Livermore National Laboratory, the Jülich Research Center in Germany, the University of Liverpool in the U.K., the College of William & Mary, Rutgers University, the University of Washington, the University of Glasgow in the U.K., NVIDIA Corp., and Thomas Jefferson National Accelerator Facility.

One of the study participants is a scientist at the National Energy Research Scientific Computing Center (NERSC). The Titan supercomputer is a part of the Oak Ridge Leadership Computing Facility (OLCF). NERSC and OLCF are DOE Office of Science User Facilities.


The work was supported by Laboratory Directed Research and Development programs at Berkeley Lab, the U.S. Department of Energy’s Office of Science, the Nuclear Physics Double Beta Decay Topical Collaboration, the DOE Early Career Award Program, the NVIDIA Corporation, the Joint Sino-German Research Projects of the German Research Foundation and National Natural Science Foundation of China, RIKEN in Japan, the Leverhulme Trust, the National Science Foundation’s Kavli Institute for Theoretical Physics, DOE’s Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, and the Lawrence Livermore National Laboratory Multiprogrammatic and Institutional Computing program through a Tier 1 Grand Challenge award.

Lawrence Berkeley National Laboratory addresses the world’s most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab’s scientific expertise has been recognized with 13 Nobel Prizes. The University of California manages Berkeley Lab for the U.S. Department of Energy’s Office of Science. For more, visit www.lbl.gov.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.
