Machine Learning Takes Hold in Nuclear Physics

As machine learning tools gain momentum, a review of machine learning projects reveals these tools are already in use throughout nuclear physics.

Image courtesy of DOE’s Thomas Jefferson National Accelerator Facility (Jefferson Lab)
Caption: The diagram emphasizes the close connections between nuclear physics theory, nuclear physics experiments, and computation (both computational science and data science as well as many elements from computer science).

The Science

Scientists have begun applying machine learning tools to nuclear physics challenges. In the past few years, a flurry of machine learning projects has come online in nuclear physics, and researchers have published many papers on the subject. Now, 18 authors from 11 institutions have summarized this artificial intelligence-aided work. The collaboration found that machine learning spans all research scales and energy ranges, from detailing the building blocks of matter to exploring the life cycles of stars. It also appears across all four subfields of nuclear physics: theory, experiment, accelerator science and operations, and data science.

The Impact

The summary’s authors predict that machine learning will accelerate progress in both nuclear physics theory and experiment. It will also better interconnect these subfields, energizing the entire loop of the scientific process. The paper gathers and summarizes major work in the field thus far. The authors intend it to serve as an educational resource and as a roadmap for future endeavors.

Summary

After attending a workshop on artificial intelligence in March 2020, three of the study’s co-authors teamed up with colleagues representing the subfields of nuclear physics to survey the state of machine learning in the field. The earliest reference they found dates back three decades: in 1992, researchers used machine learning to study nuclear properties such as atomic masses. Although this early work hinted at machine learning’s potential, its use in the field remained minimal until the last several years.

Machine learning involves building models that learn to perform tasks, including complicated calculations, from data rather than from explicit instructions. Incorporating machine learning can speed up the research process, which could reduce the time and money needed for research, computer usage, and other experimental costs. In nuclear theory, machine learning can complete advanced calculations faster. It can improve and simplify models. It can also make predictions and help theorists quantify the uncertainties of those predictions. Finally, it can be used to study phenomena that researchers cannot probe experimentally, such as supernovae, neutron stars, and superheavy nuclei and elements. The survey’s authors hope their work will serve as a comprehensive resource that bridges efforts across the subfields of nuclear physics and sparks discussion and innovation in the future.
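To make the idea of "learning from data rather than explicit instructions" concrete, here is a small illustrative sketch, not taken from the paper. It echoes the 1992-style task of learning nuclear properties such as masses: the ground truth is generated with the standard semi-empirical (liquid-drop) mass formula, and a simple least-squares model recovers the formula's coefficients purely from example nuclei, then predicts the binding energy of a nucleus it never saw. The training nuclei and coefficient values are standard textbook inputs chosen here for illustration.

```python
# Illustrative only: a model "learns" nuclear binding energies from examples.
# Ground truth comes from the well-known semi-empirical mass formula (SEMF);
# ordinary least squares recovers its coefficients from data alone.

def semf_terms(Z, N):
    """Liquid-drop feature vector for a nucleus with Z protons, N neutrons."""
    A = Z + N
    if Z % 2 == 0 and N % 2 == 0:
        pairing = 1.0       # even-even nuclei are extra bound
    elif Z % 2 == 1 and N % 2 == 1:
        pairing = -1.0      # odd-odd nuclei are less bound
    else:
        pairing = 0.0
    return [A, -A ** (2 / 3), -Z * (Z - 1) / A ** (1 / 3),
            -(N - Z) ** 2 / A, pairing / A ** 0.5]

TRUE = [15.75, 17.8, 0.711, 23.7, 11.18]   # textbook SEMF coefficients (MeV)

def binding_energy(Z, N):
    """Ground truth the model must learn (MeV)."""
    return sum(c * t for c, t in zip(TRUE, semf_terms(Z, N)))

def solve(M, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(b)
    aug = [row[:] + [b[i]] for i, row in enumerate(M)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(aug[r][col]))
        aug[col], aug[piv] = aug[piv], aug[col]
        for r in range(col + 1, n):
            f = aug[r][col] / aug[col][col]
            for c in range(col, n + 1):
                aug[r][c] -= f * aug[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (aug[i][n] - sum(aug[i][j] * x[j]
                                for j in range(i + 1, n))) / aug[i][i]
    return x

# Training nuclei (Z, N): a mix of even-even, odd-A, and odd-odd cases.
train = [(8, 8), (20, 20), (26, 30), (28, 30), (50, 70),
         (82, 126), (29, 34), (51, 70), (21, 24), (33, 42)]
X = [semf_terms(Z, N) for Z, N in train]
y = [binding_energy(Z, N) for Z, N in train]

# Fit by the normal equations: (X^T X) w = X^T y.
k = len(TRUE)
XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
w = solve(XtX, Xty)

# Predict a nucleus the model never saw: uranium-238 (Z=92, N=146).
pred = sum(c * t for c, t in zip(w, semf_terms(92, 146)))
print("learned coefficients:", [round(c, 3) for c in w])
print("predicted B(238U) = %.1f MeV" % pred)
```

Real applications replace this hand-rolled linear fit with neural networks, Gaussian processes, and other models, and train on measured rather than generated data, but the workflow (examples in, learned model out, predictions for unseen cases) is the same.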

Contact

Amber Boehnlein
Thomas Jefferson National Accelerator Facility
amber@jlab.org

Funding

This work was supported by the Department of Energy Office of Science and the National Science Foundation.

Publications

Boehnlein, A., et al., Colloquium: Machine learning in nuclear physics. Reviews of Modern Physics 94, 031003 (2022). [DOI: 10.1103/RevModPhys.94.031003]

Related Links

Machine Learning Takes Hold in Nuclear Physics, Jefferson Lab news feature

Highlight Categories

Program: NP

Performer: University, DOE Laboratory, SC User Facilities, NP User Facilities, CEBAF