AI Rebuilds Molecules From Exploding Fragments


BYLINE: Ula Chrobak


 

Newswise — Researchers at the Department of Energy’s SLAC National Accelerator Laboratory and collaborating institutions recently built a generative AI model that can recreate molecular structures from the movement of the molecule’s ions after they are blasted apart by X-rays, a technique called Coulomb explosion imaging.

The research, published in Nature Communications, is an important step toward being able to take snapshots of molecules during chemical reactions – an advance that could have important impacts in medicine and industry. The machine learning model closely predicted the geometries of a range of different molecules made of fewer than ten atoms, paving the way for applying the technique to larger molecules. “We were pretty excited about this,” said Xiang Li, an associate scientist at SLAC’s Linac Coherent Light Source (LCLS) and lead author of the study. “It is the first AI model built for molecular structure reconstruction from Coulomb explosion imaging.”

 

A new way to see molecules

Currently, there are limited options available for imaging isolated gas phase molecules. With electron microscopy, for example, subjects must be fixed in place, making it impossible to image free-floating molecules. And for diffraction-based techniques to work, the sample of molecules needs to be dense enough to generate a strong signal in the detector. The resulting image is technically an average of many molecules, restricting researchers from studying details only visible when imaging isolated molecules.

In the paper, the researchers instead focused on Coulomb explosion imaging. In this technique, an X-ray pulse hits a single molecule in a vacuum chamber, ripping off the molecule’s electrons. This leaves behind positive ions that explosively repel away from each other and smash into a detector. The detector captures their momentum, which can be used to reconstruct the structure of the molecule. “This technique has the ability to isolate minor details that are chemically relevant,” said James Cryan, LCLS interim deputy director for science, research and development, associate professor of photon science at SLAC and coauthor of the paper.

But this reconstruction process has so far been largely infeasible due to computing constraints. After the X-ray pulse strips away electrons, the remaining ions do not explode apart instantly. During this brief delay, the atoms can shift slightly, making it difficult to reconstruct the original structure using Coulomb’s law for electrostatic forces. “It will not be accurate because a simple use of that law only works if the charge-up process is instantaneous,” explained Li.
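The naive inversion Li describes can be sketched for the simplest possible case. The snippet below is a toy illustration, not the paper's method; the two-ion system, function names, and numbers are all hypothetical. It "explodes" a pair of singly charged, equal-mass ions and then recovers their initial separation from the detected momentum via energy conservation, a recovery that works only because the model assumes the charge-up is instantaneous.

```python
# Toy 1D Coulomb-explosion inversion for a diatomic ion pair.
# Assumption: instantaneous charging, so all Coulomb potential energy at the
# initial bond length r0 converts into the ions' kinetic energy, and the
# measured momenta encode r0 directly.
K = 14.3996   # Coulomb constant in eV*angstrom per elementary charge squared
M = 1.0       # ion mass in arbitrary units (two equal masses)

def final_momentum(r0, q1=1.0, q2=1.0, m=M):
    """Asymptotic momentum of each ion (center-of-mass frame) after exploding from r0."""
    potential = K * q1 * q2 / r0     # potential energy in eV
    # each equal-mass ion carries half the released energy: p = sqrt(2*m*U/2)
    return (m * potential) ** 0.5

def infer_bond_length(p, q1=1.0, q2=1.0, m=M):
    """Invert the measurement: recover r0 from the detected momentum."""
    return K * q1 * q2 * m / p**2

r0 = 1.5                     # "true" bond length in angstroms
p = final_momentum(r0)
print(infer_bond_length(p))  # ~1.5: the inversion works only under the instantaneous-charging assumption
```

If the ions drift before fully charging up, the measured momenta no longer satisfy this simple energy balance, which is the regime where the AI model takes over.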

Making things even messier, every additional atom in the molecule adds an exponential level of complexity. “It’s very challenging to work backwards to get the original structure,” said co-author Phay Ho, a physicist with DOE’s Argonne National Laboratory. “It’s kind of like breaking a glass and trying to put it back together from how the pieces flew apart. Many problems in modern physics and chemistry involve reconstructing hidden structures from indirect measurements. This work demonstrates how AI can help tackle such inverse problems.”

 

Machine learning for molecular structures

The research team set out to build a machine learning model that could overcome this computing constraint. They developed and trained the model at SLAC’s Shared Science Data Facility (S3DF). Generative AI models are well-suited for the task because they “think” differently than a standard computer simulation. Instead of working through a series of equations, they learn by finding patterns in training data. Then, they use those patterns to make statistical predictions. 

To gather training data, the team turned to a simulation built by Ho. The simulation analyzes molecular structures and calculates the momentum of their ions following a Coulomb explosion. After running for over a month, the computing-intensive simulation, using both quantum mechanics and classical physics equations, produced a dataset of 76,000 molecular samples.

Initially, the researchers trained the AI on this dataset alone, which is small by AI-training standards, and found that the model predicted inaccurate structures from explosion data. So they redid the training, adding a second dataset derived using only classical physics. The second set was less precise but about 100 times larger than the first.

This two-step training was the trick for predicting precise structures.

The researchers tested the AI model by prompting it to predict molecular structures in a portion of the simulation data it had not seen in training. The model, which the team named MOLEXA (short for “molecular structure reconstruction from Coulomb explosion imaging”), took the ion momenta and calculated the most likely structures. “We found that this two-step training process suppressed the prediction error by a factor of two,” said Li.
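A deliberately tiny stand-in can illustrate the two-step recipe: stage one fits a large, noisy dataset, and stage two refines the result on a small, precise one. This is a generic pretrain-then-fine-tune sketch, not MOLEXA itself; the one-parameter "model" and all numbers are invented for illustration.

```python
import random

rng = random.Random(0)
TRUE_W = 2.5  # the ground-truth relationship the model should learn

def make_data(n, noise):
    """Generate n (x, y) pairs with y = TRUE_W * x plus Gaussian noise."""
    xs = [rng.uniform(-1, 1) for _ in range(n)]
    return [(x, TRUE_W * x + rng.gauss(0, noise)) for x in xs]

def train(w, data, lr=0.1, epochs=50):
    """Fit a single weight by stochastic gradient descent on squared error."""
    for _ in range(epochs):
        for x, y in data:
            w -= lr * 2 * (w * x - y) * x  # gradient of (w*x - y)^2
    return w

large_cheap = make_data(1000, noise=0.5)    # stands in for the big classical-physics dataset
small_precise = make_data(20, noise=0.01)   # stands in for the small quantum-mechanical dataset

w = train(0.0, large_cheap)    # stage 1: pretrain on the large, less precise data
w = train(w, small_precise)    # stage 2: fine-tune on the small, accurate data
print(round(w, 2))
```

The pretraining stage gets the weight into the right neighborhood cheaply; the fine-tuning stage then corrects the residual bias using the scarce high-accuracy samples, mirroring the error reduction the team reports.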

The team then tested MOLEXA with experimental datasets recorded at the Small Quantum Systems (SQS) instrument of the European X-ray Free-Electron Laser facility (European XFEL) in Germany. The molecules they tested included water, tetrafluoromethane and ethanol. They entered the experimental ion momenta into the model, reconstructed the molecular structures, and then compared the reconstructions to known structures listed by the National Institute of Standards and Technology.

They found the predictions largely overlapped with the established structures. Overall, the bonds were in the right spots, with only slight variations in their angles. The errors in position were generally less than half the length of a typical chemical bond. “The model is actually, most of the time, doing better than that,” added Li. “It is only a starting point for future research, which will not only improve model accuracy but also extend its applicability to larger molecular systems.”

 

Expanding to larger molecules and chemical reactions

The paper is a major step in advancing Coulomb explosion imaging, which has long been limited by the challenge of reconstructing molecular structures from experimental measurements. In future work, the researchers plan to scale up the number of atoms the machine learning model can piece back together and apply the model to time-resolved experiments at the LCLS and European XFEL. That will help researchers to reconstruct snapshots of molecules in motion, creating flip-book-like molecular movies with insights into how chemical reactions unfold. It will also help with the interpretation of data collected at the high X-ray pulse rates delivered by SLAC’s superconducting X-ray laser, Cryan said.

The team is also now testing the model’s ability to reconstruct molecules from incomplete data. Much of the time, the detector misses an ion produced in the Coulomb explosion. Li wants to know, for example: Can the AI still reconstruct an ethanol molecule if one or more of its hydrogen ions are not registered in the detector?

If these challenges are resolved, the technique could become more applicable in biology and chemistry research. Proteins, for instance, can consist of thousands of atoms. “That’s really the goal,” said Li. “We will be able to study systems that are more biologically or industrially relevant.”

The team also included researchers from the Stanford PULSE Institute; Stanford University; Kansas State University; European XFEL, Germany; the Max Planck Institute for Nuclear Physics, Germany; Fritz Haber Institute, Germany; and Sorbonne University, France. Large parts of this work were funded by the Department of Energy’s Office of Science. LCLS is an Office of Science user facility.

 

About SLAC

SLAC National Accelerator Laboratory explores how the universe works at the biggest, smallest and fastest scales and invents powerful tools used by researchers around the globe. As world leaders in ultrafast science and bold explorers of the physics of the universe, we forge new ground in understanding our origins and building a healthier and more sustainable future. Our discovery and innovation help develop new materials and chemical processes and open unprecedented views of the cosmos and life’s most delicate machinery. Building on more than 60 years of visionary research, we help shape the future by advancing areas such as quantum technology, scientific computing and the development of next-generation accelerators.

SLAC is operated by Stanford University for the U.S. Department of Energy’s Office of Science. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time.




ATLAS: Four Decades of Nuclear Physics Innovation


Newswise — Henry Wadsworth Longfellow wrote, ​“It is difficult to know at what moment love begins; it is less difficult to know that it has begun.” If the celebrated poet were alive today, he might admit that, when it comes to vague beginnings, love is not alone.

Ask two people when the week begins, and you may get different answers. Ask scientists at the U.S. Department of Energy’s (DOE) Argonne National Laboratory when one of its user facilities got off the ground, and you’ll hear a similar story.

Officially, the Argonne Tandem Linac Accelerator System (ATLAS) was commissioned in 1985, and 2025 marked 40 years of operation. Yet working in this DOE Office of Science user facility — built to reveal the structure and properties of atomic nuclei — are staff whose work predates that milestone by years.

“I came on in 1978,” said Gary Zinkann, an ATLAS principal engineer. ​“That was 47 years ago.”

Zinkann’s long tenure illustrates how ATLAS grew from theories, ideas and technological breakthroughs that enabled its planning, construction and commissioning. It also reflects a culture of continuous improvement — expanding capabilities and generating a steady stream of scientific insights.

“ATLAS stands as a testament to decades of scientific ingenuity and dedication,” said Guy Savard, ATLAS scientific director and Argonne Distinguished Fellow. ​“Its history is one of impactful discovery and continuous renewal. At ATLAS, we are always working to improve, innovate and expand our capabilities.”

A vision takes shape

ATLAS’ origins reach back to the early 1970s, when Argonne physicists set out to push the boundaries of nuclear physics research. The community was tackling fundamental questions about the forces inside atomic nuclei — the building blocks of matter. Argonne scientists envisioned a facility that would use superconducting technology to accelerate heavy‑ion beams and provide an unprecedented tool for nuclear physics studies.

At the time, the idea of a superconducting linear accelerator (linac) for nuclear physics was new. Superconducting materials lose electrical resistance at extremely low temperatures, enabling a high accelerating field at a comparatively low input power. Applying this technology to accelerators was largely uncharted territory.

Researchers, including Lowell Bollinger at Argonne and Caltech physicist Ken Shepard, who later came to Argonne, began collaborating to explore the feasibility of this approach. Their work led to the development of niobium split‑ring resonators, first successfully tested at Argonne in 1977.

“Developing the niobium split‑ring resonator is arguably the major technological breakthrough that made ATLAS possible,” said Benjamin Kay, a group leader at ATLAS. ​“They would ultimately become the technological backbone for the entire facility.”

These resonators, cooled with liquid helium, demonstrated the potential to accelerate heavy ions with unprecedented efficiency. Building on this breakthrough, Argonne scientists constructed a prototype superconducting ​“booster” linac, consisting of 24 resonators. The booster accelerated an ion beam delivered by Argonne’s existing tandem Van de Graaff accelerator, in use since the 1960s, and its negative‑ion source.

“The booster linac was the first part of what later became ATLAS,” said Zinkann, who retired in 2016.

Beginning operation in 1978, the booster served as a testbed for the split-ring resonator technology, allowing scientists to refine designs and address technical challenges.

“It was a very active time: designing, testing, troubleshooting,” said Zinkann. ​“And in the middle of all that, researchers were doing experiments too!”

By the early 1980s, the booster linac had logged more than 10,000 hours of beam time, much of it for experiments conducted by users visiting Argonne from other institutions. These early successes demonstrated the feasibility and promise of superconducting linacs for nuclear physics research and gave Argonne the confidence to build a full‑scale facility.

With funding from the U.S. Congress, construction of ATLAS began in the early 1980s. ATLAS would combine the booster with a second linac — also using split‑ring resonators — and new ​“target areas” equipped with detectors to collect detailed experimental data on the accelerated ion beams.

In 1983, Bollinger, then director of ATLAS, wrote to the Argonne community: ​“Scientists from all over the world will use it to expand the boundaries of research into the forces that hold together atomic nuclei.”

That aspiration helped establish ATLAS as a global hub for nuclear physics research.

The final stages of ATLAS’ construction included fabrication and installation of the superconducting resonators for the new linac (dubbed the ​“ATLAS linac” to distinguish it from the older booster linac), expansion of the liquid helium refrigerator and cryogenic plumbing system, and expansion of the computer control system to manage the new linac and beamlines. The team completed the project on time and within budget.

Building a foundation

The ATLAS facility quickly became a global center for nuclear physics research, hosting a growing community of scientists and delivering high‑quality beams for studies of nuclear structure, astrophysics and fundamental interactions. By the late 1980s, ATLAS was serving hundreds of researchers each year, providing beams of stable isotopes for experiments probing the quantum structure of nuclei and the processes that forge elements in stars.

But even in the facility’s early years, ATLAS leadership was looking ahead.

“Almost immediately after the commissioning, ATLAS leaders announced plans to replace its negative-ion source with a positive-ion source,” said Kay.

Initially, ATLAS used a negative‑ion sputter source to generate ion beams, which were accelerated and stripped to positive ions in the tandem Van de Graaff accelerator for subsequent acceleration in the linacs. Installing a positive-ion source would eliminate the need for the Van de Graaff, improving performance and allowing the facility to access the heaviest elements.

As with the booster in the 1970s, Argonne scientists and engineers collaborated to design and build what they needed. Those efforts led to the development of a new generation of ​“quarter‑wave” resonators to support the positive‑ion source.

“There’s no catalog for ordering positive‑ion sources for superconducting linear accelerators,” said Zinkann.

The Positive Ion Injector (PII) was completed and brought online in 1992. Coming only seven years after ATLAS’ 1985 commissioning, PII expanded ATLAS’ capabilities by enabling beams of some of the heaviest elements, including uranium, and increased available beam currents for lighter ions. The 1960s‑era tandem Van de Graaff accelerator still served as an injector until its retirement in 2014. Its former space at ATLAS now houses stopped-beam experimental stations.

New additions

Expanding ATLAS’ capabilities widened its scientific impact. Early instruments enabled studies of nuclear reactions inside stars, shedding light on the processes that created most elements and the role of nuclear reactions in stellar evolution. Other ATLAS‑enabled efforts probed the heaviest elements and the limits of nuclear stability.

“No two days were alike,” said Zinkann. ​“A lot of times, our work was about seeing a need, finding a way to fulfill it, and then we’d see the next need and get started on that. Over time, that can make a big difference.”

Advanced instruments for nuclear structure and reaction studies were developed and deployed, including:

  • Fragment Mass Analyzer, brought online in 1992 for high‑precision measurements of nuclear masses and decay processes.
  • ATLAS Positron Experiment (APEX), commissioned in 1993 to study electrons and positrons emitted during heavy‑ion collisions.
  • Canadian Penning Trap, which began operations in 2000 for high‑precision mass measurements of exotic nuclei.

“Like all of our instruments at ATLAS, these were wise investments that continue to pay scientific dividends today for researchers, the public and the world at large,” said Walter Wittmer, ATLAS operations director.

In 1997, the ATLAS team installed and commissioned Gammasphere, one of the world’s most powerful gamma‑ray spectrometers for nuclear structure research. Gammasphere collects data on gamma‑ray emissions following heavy‑ion fusion reactions, enabling high‑precision studies of nuclear shapes, decay processes and the forces that bind protons and neutrons. Its arrival allowed scientists to explore the quantum structure of nuclei and phenomena such as nuclear superfluidity and shape coexistence.

In cooperation with DOE and other partners, ATLAS was a finalist in the 1990s to host a new facility dedicated to rare‑isotope beams. Although DOE ultimately selected Michigan State University for that facility, ATLAS expanded in complementary directions and continues to grow its role in rare‑isotope science.

Expanding capabilities

In 2009, ATLAS commissioned the Californium Rare Ion Breeder Upgrade (CARIBU) system. Led by Savard and Richard Pardo, then ATLAS’ operations manager, CARIBU harnessed the fission of californium‑252 to produce neutron‑rich rare isotopes for acceleration and experiments.

“Adding CARIBU to ATLAS enabled the production of neutron‑rich isotopes that were previously inaccessible, opening new avenues for nuclear physics research,” said Savard. ​“CARIBU was particularly valuable for studying nuclear reactions that occur during supernova explosions and neutron star mergers.”

CARIBU allowed researchers to examine nuclear reactions involved in the rapid neutron‑capture process (r‑process) — a key mechanism that creates heavy elements such as gold, platinum and uranium during supernovae and neutron star mergers.

ATLAS continued to add detectors and systems for a broader range of experiments, among them the HELIOS spectrometer.

These projects also gave ATLAS engineers opportunities to innovate. HELIOS, for example, incorporates a solenoid magnet from a hospital’s decommissioned MRI scanner.

“Never underestimate what a top‑rate engineering team can do,” said Kay.

Another major addition was the Gamma‑Ray Energy Tracking In‑beam Nuclear Array (GRETINA), a precision gamma‑ray detector for high‑resolution studies of nuclear structure. Built by the U.S. nuclear physics community, GRETINA arrived at Argonne in 2013 for the first of what would ultimately be four experimental campaigns, the last of which ended in 2025. In between campaigns at ATLAS, it was installed at other accelerator facilities. GRETINA collected detailed data on gamma rays emitted during nuclear reactions, providing insights into nuclear forces and structure. Argonne scientists were instrumental in developing GRETA (Gamma‑Ray Energy Tracking Array), a next‑generation detector that will eventually replace GRETINA at ATLAS. GRETA will provide 3D tracking of gamma‑ray paths and energies for even more precise studies.

“Throughout its recent history, ATLAS has remained at the forefront of nuclear physics research, enabling studies of rare isotopes, nuclear reactions and fundamental symmetries,” said Wittmer. ​“The facility’s research programs continue to address key questions in nuclear astrophysics, nuclear structure and the properties of exotic nuclei.”

ATLAS further expanded its capabilities in 2023 with the installation of the ATLAS Material Irradiation Station (AMIS), which is used to emulate material damage in nuclear reactors. AMIS uses some of the accelerator’s lowest energies to deliver heavy ions that quickly degrade a material’s properties — without the radioactivity associated with irradiation in a reactor — making the development of new reactor materials safer and more efficient.

Today, ATLAS hosts researchers from across the U.S. and around the world, providing more than 6,000 hours of beam time annually.

“ATLAS has maintained strong engagement with its user community, hosting workshops, meetings and collaborative research projects to ensure that its capabilities align with the needs of scientists worldwide,” said Savard. ​“We move in the directions that will allow our users to deepen the scientific questions they can answer using ATLAS.”

Innovating for the future

Forty years after commissioning, ATLAS continues its tradition of continuous improvement to stay at the forefront of rare‑isotope research.

The team is installing and commissioning nuCARIBU, an upgraded version of the original CARIBU system that will provide a reliable, on‑demand supply of radioactive isotopes for experiments while simplifying maintenance and improving operational efficiency. nuCARIBU will rely on neutron‑induced fission of uranium to produce isotopes and, for the first time, will allow the source to be turned off when not needed.

ATLAS is also preparing for the next generation of nuclear physics research through the N=126 Factory, an experimental system designed to provide beams of rare, neutron‑rich radioactive isotopes of very heavy elements. These isotopes are difficult to generate by other means and are important for understanding how the heaviest elements in the universe are made.

And to make the most efficient use of these new capabilities, ATLAS is pursuing a multi‑user upgrade that will enable the facility to deliver beams to two experimental stations simultaneously — one stable beam and one rare-isotope beam.

ATLAS’ beginning may be hard to pin down, and its history is one of continuous change. But its culture of improvement, expansion and excellence has put it on secure footing for tomorrow.

“The history of ATLAS is a story of growth, adaptation and scientific excellence. That will also be its future,” said Savard. ​“This facility’s ability to innovate and grow from its ambitious origins has allowed ATLAS to remain a vital resource for nuclear physics research, even as the field has evolved. As ATLAS looks to the future, it is well‑positioned to tackle the next generation of scientific challenges, continuing its legacy of discovery and its mission to unlock the secrets of the universe.”

Argonne Tandem Linac Accelerator System

This material is based upon work supported by the U.S. Department of Energy (DOE), Office of Science, Office of Nuclear Physics, under contract number DE‐AC02‐06CH11357. This research used resources of the Argonne Tandem Linac Accelerator System (ATLAS), a DOE Office of Science User Facility.

Argonne National Laboratory seeks solutions to pressing national problems in science and technology by conducting leading-edge basic and applied research in virtually every scientific discipline. Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science.

The U.S. Department of Energy’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.




Solving a Mystery in Dark Matter Detectors Could Improve Quantum Computers


BYLINE: Lauren Biron

Newswise — Although dark matter makes up most of the mass in our universe, it has never been directly observed. To hunt for lighter dark matter and other rare phenomena, researchers must solve a puzzle in their supersensitive detectors: an unexpected number of low-energy events, called the “low-energy excess” or LEE, that can obscure the rare signals they seek.

In a study published on Dec. 30, 2025, in Applied Physics Letters, researchers with the TESSERACT (Transition-Edge Sensors with Sub-EV Resolution And Cryogenic Targets) experiment identified one of the culprits behind the low-energy excess. They found that the noise comes not from the electronics or the surrounding environment, but from tiny bursts of vibrational energy within the silicon crystal of the detectors themselves. And the thicker the silicon, the more LEE events there are.

Since at least some LEE events come from tiny changes in the detector material itself, researchers estimate they also cause problems in superconducting qubits, the sensitive building blocks of quantum computers that are often made of silicon. The bursts of energy can create “quasiparticles” that disturb a qubit’s fragile quantum state, causing it to decohere or fail. So even in carefully shielded quantum systems, some errors could be coming from inside the house.

“Quantum computers could perform calculations our current systems can’t, but only if people can make qubits that are stable,” said Dan McKinsey, the director of TESSERACT and a scientist at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), which leads the experiment. “Because the detectors we use for our dark matter experiment have a similar backbone to what is in qubits, by understanding a problem in particle physics, we’re also getting information on how to improve the quantum computing side.”

To pinpoint where LEE events were coming from, TESSERACT collaborators fabricated superconducting phonon sensors (which pick up quantum vibrations, or phonons) on two nearly identical silicon chips that were 1 and 4 millimeters thick. In both detectors, the number of events decreased over time as they were cooled, and the thicker chip saw four times as many low-energy events — pointing to the volume of silicon itself as the source, rather than outside causes.
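The logic of that comparison can be written down in a few lines. This is illustrative only: the 1 mm and 4 mm thicknesses and the 4x ratio come from the article, but the event counts below are invented to show the reasoning.

```python
# Bulk-vs-surface inference from two chips of equal face area but
# different thickness, measured over the same exposure.
thin_mm, thick_mm = 1, 4              # chip thicknesses
events_thin, events_thick = 100, 400  # hypothetical event counts

ratio = events_thick / events_thin
if abs(ratio - thick_mm / thin_mm) < 0.5:
    origin = "bulk"      # rate scales with silicon volume
elif abs(ratio - 1) < 0.5:
    origin = "surface"   # rate scales with face area, equal for both chips
else:
    origin = "mixed"
print(origin)
```

Because the observed ratio tracked the thickness ratio rather than staying flat, the events must originate throughout the silicon volume rather than at its surfaces or in the surroundings.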

Now that the scientific community knows the number of LEE events relates to how thick the silicon is, some groups will be able to improve their sensors simply by scaling back how much silicon they use. But it’s still just the first step in understanding exactly what causes the bursts of energy and finding an engineering solution to get rid of the background noise completely.

“Superconducting qubits for computers are designed to ignore the environment so that their quantum state survives,” said Matt Pyle, a TESSERACT collaborator, associate professor at UC Berkeley, and researcher at Berkeley Lab. “In contrast, our photon and phonon sensors use similar technology, but they’re designed to be incredibly sensitive to their environment so that they can sense dark matter. That makes our detectors unique and powerful tools for diagnosing environmental sources that cause decoherence and limit quantum computers.”

During the experiment, TESSERACT’s thinner detector also achieved a world-leading energy resolution of 258.5 millielectronvolts. That means it could distinguish between two events with energies differing by only a few tenths of an electronvolt, several times smaller than the amount of energy carried by a single particle of visible light. That precision will allow scientists to distinguish extremely faint signals from background noise, essential for tracking down dark matter.
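A quick back-of-the-envelope check of that comparison: the 258.5 meV figure is from the article, while the choice of a mid-visible 550 nm photon is an assumption made here for illustration.

```python
# Photon energy from E = hc / wavelength, with hc expressed in eV*nm.
HC_EV_NM = 1239.84      # Planck constant times speed of light, in eV*nm
resolution_ev = 0.2585  # 258.5 meV, the reported detector resolution

photon_ev = HC_EV_NM / 550.0  # energy of a green (550 nm) photon, about 2.25 eV
print(round(photon_ev / resolution_ev, 1))
```

The ratio comes out to several, consistent with the article's claim that the resolution is several times smaller than a visible photon's energy.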

TESSERACT is currently in the prototype and construction phase, and will eventually be installed in France’s Modane Underground Laboratory. The TESSERACT collaboration also includes researchers at Argonne National Laboratory, Caltech, Florida State University, IJCLab (Laboratoire de Physique des 2 Infinis Irène Joliot-Curie), IP2I (Institut de Physique des 2 Infinis de Lyon), LPSC (Laboratoire de Physique Subatomique et de Cosmologie), Texas A&M University, UC Berkeley, the University of Massachusetts Amherst, the University of Zürich, and QUP (the International Center for Quantum-field Measurement Systems for Studies of the Universe and Particles).

###

Lawrence Berkeley National Laboratory (Berkeley Lab) is committed to groundbreaking research focused on discovery science and solutions for abundant and reliable energy supplies. The lab’s expertise spans materials, chemistry, physics, biology, earth and environmental science, mathematics, and computing. Researchers from around the world rely on the lab’s world-class scientific facilities for their own pioneering research. Founded in 1931 on the belief that the biggest problems are best addressed by teams, Berkeley Lab and its scientists have been recognized with 17 Nobel Prizes. Berkeley Lab is a multiprogram national laboratory managed by the University of California for the U.S. Department of Energy’s Office of Science.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.




Cracking the Code: Using AI to Solve Difficult-to-Map Proteins


BYLINE: Ashleigh Papp

Newswise — Using a tool to solve a protein’s structure, for most researchers in the world of structural biology and computational chemistry, is not unlike using the Rosetta Stone to unlock the secrets of ancient Egyptian texts. Once a protein’s structure has been discovered, or defined, one can infer crucial information about its function or, in a diseased state, its dysfunction. While researchers have been pursuing the quest of solving protein structure for decades, advancing tools and computing technologies offer a new frontier for this work.

A collaborative study recently published in Nature Communications unveiled a new computing program that offers a faster and more accurate way to determine protein structure at a new level of precision. Researchers from the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) were part of the effort, along with an international team of collaborators. This tool, dubbed AI-enabled Quantum Refinement, or AQuaRef for short, uses quantum-mechanical (QM) calculations and artificial intelligence (AI) to predict, with high accuracy, the placement of atoms and electrons that determines a protein’s molecular structure.

This program is a part of Phenix, a comprehensive software suite that generates realistic computer models used by structural biologists around the world to solve macromolecular structures. “We’re all basically a bunch of proteins,” said Nigel Moriarty, a Berkeley Lab researcher and contributor to the recent publication. “They do so much in our bodies that detail the processes of life. Understanding their structure can give us insights into the mechanisms that cause disease in humans or produce energy in plants. All of this knowledge can lead to more effective therapeutics and bioenergy production.”

The current way of mapping a protein’s structure entails bringing together two streams of information: experimental data produced through techniques like X-ray crystallography and cryogenic electron microscopy (cryo-EM), and theoretical data that exists in a library of detailed, known protein structural information. But the current options are limited, explained Moriarty, a computational research scientist in the Molecular Biophysics and Integrated Bioimaging (MBIB) Division’s Phenix group. Our understanding today is restricted to the chemical entities that have already been defined and doesn’t yet include meaningful noncovalent interactions, the type of attraction typically seen holding a protein in its structural form. “That’s where quantum and AI come in,” he said.

Nearly five years ago, members of the Phenix team began working with researchers at Carnegie Mellon University to explore how they might be able to apply their coding work to Phenix’s offerings. The collaborative approach, coupled with 15 years of incremental research, led to this breakthrough program. In addition to Moriarty, other members of the Phenix team involved in this work were Paul Adams and Billy Poon, with Pavel Afonine leading the research. AQuaRef uses machine learning (ML) tools developed at Carnegie Mellon integrated with the Phenix software to compute energy and forces for scientifically interesting proteins—making quantum-level refinement practical where it was previously impossible.
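Conceptually, refinement of this kind balances two pulls on every atom: agreement with the experimental data and the chemical energy of the model. The toy sketch below illustrates that balance with gradient descent, using a simple harmonic bond term as a stand-in for AQuaRef’s machine-learned quantum energy; every function name and number here is an illustrative assumption, not the Phenix or AQuaRef API.

```python
import numpy as np

def data_gradient(coords, target_coords, weight=1.0):
    """Toy 'fit the experiment' term: pulls atoms toward data-derived positions."""
    return weight * (coords - target_coords)

def energy_gradient(coords, bonds, ideal_length=1.5, k=10.0):
    """Toy chemical-energy term: harmonic springs along bonded pairs.
    (AQuaRef instead computes energies and forces with an ML model trained on QM data.)"""
    grad = np.zeros_like(coords)
    for i, j in bonds:
        d = coords[i] - coords[j]
        r = np.linalg.norm(d)
        g = k * (r - ideal_length) * d / r
        grad[i] += g
        grad[j] -= g
    return grad

def refine(coords, target_coords, bonds, steps=200, lr=0.01):
    """Gradient descent on the combined target: data fit plus chemical energy."""
    for _ in range(steps):
        coords = coords - lr * (data_gradient(coords, target_coords)
                                + energy_gradient(coords, bonds))
    return coords

# Three atoms placed 2.0 apart relax toward a compromise between the
# data-derived spacing (2.0) and the ideal bond length (1.5).
atoms = np.array([[0.0, 0, 0], [2.0, 0, 0], [4.0, 0, 0]])
refined = refine(atoms, atoms, bonds=[(0, 1), (1, 2)])
```

In the actual program, the stand-in energy term would be supplied by the ML model trained on quantum-mechanical data, which is what lets the refined geometry capture chemistry, including noncovalent interactions, that simple restraints miss.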

Across the 71 structures tested in this study, AQuaRef produced higher-quality structural information at a substantially lower computational cost while maintaining an equal or better fit to experimental data. Beyond these proof-of-concept results, AQuaRef also correctly determined proton positions in DJ-1, a human protein linked to some forms of Parkinson’s disease whose structure has been notoriously difficult to map. Now that the team has confirmed that quantum-level refinement of a 3D protein model is possible, they aim to broaden the scope to more diverse structures, such as those required for pharmaceutical drug design. And the potential impacts of this work reach far beyond human health, from a better understanding of the mechanisms of photosynthesis for enhanced crop productivity to the mapping of plant proteins relevant to biofuel production.

“There is a near-infinite number of things that can benefit from a detailed understanding of these mechanisms and protein structure,” said Moriarty. “I’m excited to see how the paradigm shift that AQuaRef represents impacts the field of protein structure determination.”

This international team also included collaborators from the University of Wrocław in Poland, the University of Florida and Pending.AI in Australia.

This work was funded by the National Institutes of Health as well as with support from the Phenix Industrial Consortium.

###

Lawrence Berkeley National Laboratory (Berkeley Lab) is committed to groundbreaking research focused on discovery science and solutions for abundant and reliable energy supplies. The lab’s expertise spans materials, chemistry, physics, biology, earth and environmental science, mathematics, and computing. Researchers from around the world rely on the lab’s world-class scientific facilities for their own pioneering research. Founded in 1931 on the belief that the biggest problems are best addressed by teams, Berkeley Lab and its scientists have been recognized with 17 Nobel Prizes. Berkeley Lab is a multiprogram national laboratory managed by the University of California for the U.S. Department of Energy’s Office of Science.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.




Measuring Pollutant-Destroying Molecules Used in Water Treatment


Newswise — From brightly colored textile dyes to persistent pesticides, antibiotics and bisphenol A, many modern pollutants dissolved in water resist traditional treatment methods. A promising approach uses electricity to power chemical reactions in water over an electrode surface. Much like in a battery, electrodes send and receive electrical current that drives chemical reactions.

This process, known as electrocatalysis, generates a class of highly reactive oxygen-containing compounds, known as reactive oxygen species or oxidants, at the electrode surface. These powerful oxidants, which include ozone and hydrogen peroxide, can break down even the most stubborn contaminants, producing cleaner water. However, because these oxygen species are unstable, degrade over time and exist in trace amounts — down to the parts-per-billion level — they have been notoriously difficult to detect and quantify.

In a study published in ACS Catalysis, researchers at the U.S. Department of Energy’s (DOE) Argonne National Laboratory report a new method for detecting and quantifying these short-lived oxygen species in real time with unprecedented sensitivity. Their approach revealed not only how much of each oxidant is produced, but also which specific species are formed under different treatment conditions.

“These oxygen species don’t last long, and they’re hard to detect individually,” said Argonne electrochemist Pietro Papa Lopes, who led the study. ​“But knowing which ones are present and in what quantities is essential for improving water treatment technologies.”

Importantly, the team’s findings have applications beyond water treatment. One example is fuel cells, which convert hydrogen or other chemical fuels into electricity. Another is electrolyzers, which can split water molecules to produce hydrogen fuel or convert carbon dioxide into aviation fuels.

The researchers used a method involving two electrodes to determine which oxidants were generated at the electrode surface. The first was a disk where a water oxidation reaction took place, generating the reactive oxygen species. The second, a concentric ring electrode, produced an electrical signal that could detect and quantify those species.
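In a ring-disk arrangement of this kind, the ring current can be converted into a production rate for a given oxidant once two numbers are known: the collection efficiency (the fraction of disk products that actually reach the ring) and the electrons transferred per molecule at the ring. A minimal sketch of that conversion, with hypothetical values chosen purely for illustration:

```python
# Ring-disk analysis: species generated at the disk are swept over the ring,
# where they react and produce a current proportional to their flux.
# All values below are hypothetical, for illustration only.
F = 96485.0  # Faraday constant, C/mol

def production_rate(ring_current_a, collection_efficiency, electrons_per_molecule):
    """Moles of oxidant per second arriving from the disk, inferred from ring current."""
    return ring_current_a / (collection_efficiency * electrons_per_molecule * F)

# Example: 1 microamp of ring current, 25% collection efficiency, and a
# 2-electron reaction (e.g., hydrogen peroxide) at the ring.
rate = production_rate(1.0e-6, 0.25, 2)  # ~2e-11 mol/s
```

The same relationship, run for each candidate oxidant’s characteristic ring reaction, is what lets a measurement distinguish which species are present and in what amounts.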

They tested the performance of three materials as the disk electrode: lead dioxide, platinum and iridium oxide. Lead dioxide was selected for its known ability to generate significant amounts of ozone and its relevance to pollutant degradation. Platinum and iridium oxide were included as controls, as earlier studies had suggested they do not produce measurable amounts of reactive oxygen species. But the results told a different story.

“Somewhat to our surprise, at high voltages, all three electrode materials produced measurable levels of hydrogen peroxide and ozone,” said Papa Lopes. ​“That finding matters. Those oxidants can degrade membranes and other components used in electrochemical technologies, which could impact their long-term performance.”

Another key result involved Faradaic efficiency — a measure of how much input electricity is converted into useful chemical products. The team found that lead dioxide converted up to 30% of the electrical energy into ozone. That’s a high efficiency for systems of this type and suggests strong potential for scalable pollutant breakdown technologies.
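Faradaic efficiency follows from Faraday’s law: moles of product, times electrons transferred per molecule, times the Faraday constant, divided by the total charge passed. A small sketch with made-up numbers (ozone evolution, 3H2O → O3 + 6H+ + 6e−, transfers six electrons per molecule):

```python
# Faradaic efficiency: the fraction of charge passed that ended up in the
# desired product. All numbers below are hypothetical, for illustration only.
F = 96485.0  # Faraday constant, C/mol

def faradaic_efficiency(moles_product, electrons_per_molecule, total_charge_c):
    return moles_product * electrons_per_molecule * F / total_charge_c

# Ozone evolution (3H2O -> O3 + 6H+ + 6e-) transfers 6 electrons per O3.
fe = faradaic_efficiency(moles_product=1.0e-6,   # mol of O3 detected
                         electrons_per_molecule=6,
                         total_charge_c=1.93)    # C passed through the cell
print(f"{fe:.0%}")  # prints 30%
```

The balance of the charge goes to competing reactions, chiefly ordinary oxygen evolution, which is why 30% counts as high for systems of this type.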

The study provides a new benchmark for scientists and engineers working to advance electrochemical water purification. By establishing a consistent, sensitive method for identifying and quantifying reactive oxygen species in electrochemical systems, the research enables better system design and more meaningful comparisons across experiments and technologies.

This work was conducted through the Advanced Materials for Energy-Water Systems (AMEWS) Center, an Energy Frontier Research Center led by Argonne and supported by DOE. AMEWS seeks to understand how water — and the substances it carries — interacts with solid materials at the molecular level.

In addition to Papa Lopes, contributing authors at Argonne include Igor Messias, Jacob Kupferberg, Ashley Bielinski and Alex Martinson, as well as Raphael Nagao at the Universidade Estadual de Campinas in Brazil. The research was funded by the DOE Office of Basic Energy Sciences.

Argonne National Laboratory seeks solutions to pressing national problems in science and technology by conducting leading-edge basic and applied research in virtually every scientific discipline. Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science.

The U.S. Department of Energy’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.




Two Argonne scientists receive 2025 DOE Early Career Research Awards


Newswise — Two researchers from the U.S. Department of Energy’s (DOE) Argonne National Laboratory have been named recipients of 2025 Early Career Research Program awards from the DOE Office of Science. David Kaphan and Yong Zhao will each receive $550,000 per year for five years to further their research.

This DOE Office of Science program seeks to strengthen the nation’s scientific workforce by providing support to outstanding researchers early in their careers, when many scientists make formative contributions. Awardees were selected from a large pool of applicants from universities and national labs based on peer review by scientific experts.

David Kaphan is a chemist in Argonne’s Chemical Sciences and Engineering division. His research focuses on designing a new generation of catalysts — materials that speed up chemical reactions — for chemical transformations to overcome key kinetic limitations of today’s catalysts. His project aims to explore the potential of electric field-responsive oxides, such as ferroelectrics, to actively control the surface-level electronic characteristics of catalytic active sites. This approach could enable the development of catalysts that adapt during chemical transformations, optimizing reactivity for different phases of chemical synthesis processes.

Kaphan’s project will study the complex role that external electric fields can play in the modulation of electronic surface properties during catalytic processes. He will use X-ray absorption spectroscopy techniques and other methods at the Advanced Photon Source and the Center for Nanoscale Materials — both DOE Office of Science user facilities at Argonne — to measure properties such as field responsive surface electron density and catalytic reactivity. Additionally, the project will integrate artificial intelligence and machine learning to accelerate the exploration of reaction parameters and electric field conditions. This work has the potential to revolutionize catalyst design for critical processes such as selective methane oxidation and ammonia synthesis.

“Stimulus-responsive, nonequilibrium catalysis represents an exciting opportunity to overcome the classical limitations of static processes and increase efficiency in chemical transformations,” said Kaphan. ​“This support will allow us to explore new frontiers in field-responsive dynamic catalyst design and develop new solutions to address key challenges in energy-related chemistry.”

Yong Zhao is an assistant physicist in the Physics division. His research addresses one of the most fundamental questions in nuclear physics: understanding the internal structure of protons and neutrons. That understanding is a key objective of multidimensional proton imaging efforts at DOE’s Thomas Jefferson National Accelerator Facility and the forthcoming Electron-Ion Collider at DOE’s Brookhaven National Laboratory.

Both protons and neutrons consist of different combinations of quarks and gluons. Zhao plans to develop a new theoretical approach and use lattice quantum chromodynamics (QCD) for precise calculations of the underlying multidimensional quark and gluon structures. This approach will enable high-precision imaging of the proton, as well as reveal the contributions of quark and gluon spin and orbital angular momentum to the proton’s spin.

Using the Aurora and Polaris supercomputers at the Argonne Leadership Computing Facility, a DOE Office of Science user facility, Zhao’s project aims to reduce systematic uncertainties and improve numerical precision in proton and neutron structural studies. Its insights will provide crucial theoretical guidance for experiments at Jefferson Lab, Brookhaven and other facilities.

“This award is a tremendous opportunity to push the boundaries of our understanding of the strong force and the fundamental building blocks of matter,” said Zhao. ​“I am grateful for the support that will allow us to make significant strides in this area of research.”

“David and Yong exemplify the innovative spirit and scientific excellence that are hallmarks of Argonne’s research community,” said Kawtar Hafidi, associate laboratory director for Argonne’s Physical Sciences and Engineering directorate. ​“Their groundbreaking work has the potential to transform our understanding of fundamental processes in physics and address key challenges in research and development. I look forward to seeing the impact of their efforts in the years to come.”

About Argonne’s Center for Nanoscale Materials

The Center for Nanoscale Materials is one of the five DOE Nanoscale Science Research Centers, premier national user facilities for interdisciplinary research at the nanoscale supported by the DOE Office of Science. Together the NSRCs comprise a suite of complementary facilities that provide researchers with state-of-the-art capabilities to fabricate, process, characterize and model nanoscale materials, and constitute the largest infrastructure investment of the National Nanotechnology Initiative. The NSRCs are located at DOE’s Argonne, Brookhaven, Lawrence Berkeley, Oak Ridge, Sandia and Los Alamos National Laboratories. For more information about the DOE NSRCs, please visit https://science.osti.gov/User-Facilities/User-Facilities-at-a-Glance.

The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines. Supported by the U.S. Department of Energy’s (DOE’s) Office of Science, Advanced Scientific Computing Research (ASCR) program, the ALCF is one of two DOE Leadership Computing Facilities in the nation dedicated to open science.

About the Advanced Photon Source

The U. S. Department of Energy Office of Science’s Advanced Photon Source (APS) at Argonne National Laboratory is one of the world’s most productive X-ray light source facilities. The APS provides high-brightness X-ray beams to a diverse community of researchers in materials science, chemistry, condensed matter physics, the life and environmental sciences, and applied research. These X-rays are ideally suited for explorations of materials and biological structures; elemental distribution; chemical, magnetic, electronic states; and a wide range of technologically important engineering systems from batteries to fuel injector sprays, all of which are the foundations of our nation’s economic, technological, and physical well-being. Each year, more than 5,000 researchers use the APS to produce over 2,000 publications detailing impactful discoveries, and solve more vital biological protein structures than users of any other X-ray light source research facility. APS scientists and engineers innovate technology that is at the heart of advancing accelerator and light-source operations. This includes the insertion devices that produce extreme-brightness X-rays prized by researchers, lenses that focus the X-rays down to a few nanometers, instrumentation that maximizes the way the X-rays interact with samples being studied, and software that gathers and manages the massive quantity of data resulting from discovery research at the APS.

This research used resources of the Advanced Photon Source, a U.S. DOE Office of Science User Facility operated for the DOE Office of Science by Argonne National Laboratory under Contract No. DE-AC02-06CH11357.





Feeling the Vibe


Newswise — It started with a social media post from Andrej Karpathy, one of the founders of OpenAI. Last year, he tweeted, ​“There’s a new kind of coding I call ​‘vibe coding,’ where you fully give into the vibes, embrace exponentials, and forget that the code even exists.” Karpathy said that large language models and voice-to-text programs had gotten so sophisticated that he could just ask a model to create something and then copy and paste the code it generated to build a project or create a web app from scratch. ​“I just see stuff, say stuff, run stuff, and copy-paste stuff, and it mostly works.” 

That groovy technique might be good for patching a glitchy website or building a phone app, but can it really change the way we do science? Researchers from the U.S. Department of Energy’s (DOE) Argonne National Laboratory are testing vibe coding tools and techniques to see how they stand up to data-intensive scientific challenges. At a recent hackathon, researchers from across the lab gathered to learn together and test commercially available coding tools like Cursor and Warp against scientific challenges as large and hairy as the hunt for dark matter and as pressing as the optimization of nuclear power plants. 

As a long-time leader in computational science and the home of Aurora, one of the world’s fastest and most powerful supercomputers, Argonne is no stranger to grand challenges. But to solve huge problems and to process more data than ever before, researchers are working to stay at the bleeding edge of harnessing artificial intelligence (AI) for science.

Rick Stevens, associate laboratory director for Computing, Environment and Life Sciences at Argonne, sees vibe coding as another way Argonne researchers can continue to speed up scientific innovation. Scientists, he has said, need to be able to work as fast as they can think, and he gets frustrated by the bottlenecks of current technology. Vibe coding, by contrast, is a productivity hack. ​“You’re unhobbled from your coding speed,” said Stevens.

With vibe coding, researchers can interact with large language models in real time, asking them questions by talking rather than by typing commands, and then getting usable output in seconds or minutes. Stevens compared it to having an AI co-scientist — or even a team of co-scientists — working alongside you. He challenged fellow scientists to work with the technology every day. ​“You need to get your head around how to be productive in this environment,” he said. ​“Think, play and have a blast!”

Breaking barriers between ideas and action 

Part of the excitement around vibe coding is that we don’t know how it’s going to change science. At the hackathon, the vibe in the room was playful. The group was a mix of coders and non-coders from a variety of disciplines. Instead of quietly pecking away at their keyboards, researchers were laughing, bouncing ideas off each other and confidently speaking commands to their laptops. 

The promise of AI and vibe coding isn’t just about doing science faster, Stevens explained. These tools free up scientists to be more creative, to put their energy toward things that only a human can do. ​“With these tools, you’re not bottlenecked by writing code,” he said. ​“Now, you’re focused on ideas.” 

Here are some of the ideas Argonne scientists are vibing on:

1. Prototyping software to strengthen nuclear power plants

Nuclear power plants are an integral part of America’s energy supply and a reliable source of power for the growing energy needs of AI. Nuclear engineer Yeni Li and her team are creating AI models of those power plants to help plant engineers and managers predict the best times for maintenance. That knowledge can lead to more reliable and affordable energy production. 

Li said that vibe coding will be useful for setting up the software architecture she needs to turn her ideas into prototypes. ​“These tools will help us do a few days of work in a single afternoon,” said Li. 

2. Automating workflows in bioscience

Rosemarie Wilton doesn’t do a lot of coding in her work as a molecular biologist, but she does spend a significant amount of time using software tools for data analysis. Developing Python-coded pipelines would allow her to automate her data processing workflows and integrate multiple tools seamlessly. She was delighted to see how fast vibe coding could give her the command codes she needed. ​“For a coding novice, it’s really quite amazing. It will be a time saver,” she said. 

That quick win in generating command codes led Wilton and Computational Biologist Nick Chia to think about other ways vibe coding could help. Chia mused, ​“If we have an AI agent generating hypotheses for experiments, could we create another AI agent to order the chemicals or samples needed to run those experiments?” Speeding up routine processes like these could help Wilton and her team track the spread of human pathogens with greater accuracy or engineer new enzymes and biosynthetic pathways faster than ever before. 

3. Translating coding languages in science infrastructure

Zachary Sherman is a software developer who manages open-source Python tools for the Atmospheric Radiation Measurement group. He came to the hackathon looking for ways to quickly translate other coding languages into Python, a task that could take years of tedious manual coding. 

“There are many different atmospheric tools in different coding languages and also databases with application programming interfaces for downloading and interacting with atmospheric datasets,” said Sherman. ​“Some of these tools are outdated. We think vibe coding can help us create tools in Python to interact with these interfaces to download and work with the datasets. We also think vibe coding will help us modernize these code bases so we can troubleshoot issues faster and save time and money as we maintain essential scientific infrastructure.”

4. Understanding the nature of the universe

Chiara Bissolotti is a nuclear physicist trying to understand how all known particles interact. Tim Hobbs is a theoretical particle physicist trying to identify unknown particles that can help us understand the nature of dark matter or other possible ​“new physics” in the universe. Both of their fields generate huge amounts of data from theoretical computer simulations, cosmological observations and experiments at research institutions such as CERN’s Large Hadron Collider and the planned Electron-Ion Collider at DOE’s Brookhaven National Laboratory. The information hidden where their data sets overlap could be the key to answering some of the biggest mysteries of the universe, from quarks to the cosmos. But merging those data sets is a monumental task if you’re coding and comparing them by hand. 

“Can the data sets talk to each other?” asked Hobbs. ​“Might they be hiding common patterns, or guide us toward novel theoretical predictions or the automation of burdensome calculations?” 

Bissolotti summed it up, ​“We have many, many ideas. Many more ideas than time. If vibe coding can help us build the scaffolding of the code or help us make the data comparisons more scalable and efficient, we can cut our time to solution by a huge factor.”

5. Collaborating on complex problems in national security

Jonathan Ozik is a computational scientist who uses supercomputers and simulations to understand large and complex systems across many scientific domains, such as biological systems, health care interventions and infectious diseases in urban settings. He said vibe coding can help him explain his work to the many collaborators from different backgrounds that he works with. He also sees it as a way that he can help himself switch between complex projects. ​“It could give me a two-minute reintroduction to the code and the context I’m working in,” he said. ​“There’s no reason not to try to make your daily tasks easier.” 

Ozik predicts vibe coding will open research up to ideas we can’t yet begin to imagine: ​“If you have fewer perceived barriers, you create new possibilities. Things that were previously infeasible in science will become common.”





Untangling Signals From Subatomic Particles


Newswise — Each year, the Physical Sciences and Engineering (PSE) directorate at the U.S. Department of Energy’s (DOE) Argonne National Laboratory recognizes exceptional early-career researchers breaking into their fields with the PSE Early Investigator Named Awards. In 2025, the lab announced that six awardees would be receiving support in the form of funding and mentorship to conduct groundbreaking research aligned with Argonne’s strategic mission.

One member of the 2025 cohort is Maria Żurek, an assistant physicist in Argonne’s Physics (PHY) division, who studies the fundamental structure of protons and neutrons using the Continuous Electron Beam Accelerator Facility (CEBAF) at the DOE’s Thomas Jefferson National Accelerator Facility. For the PSE Early Investigator Named Award, Żurek will work under the guidance of Sylvester Joosten, interim leader of the Medium Energy group at Argonne, on a proposal titled, ​“Seeing the Unseen: Precision Calorimetry for 3D Nucleon Imaging.” In particle physics experiments, calorimetry refers to detection and analysis methods used to calculate particle energy.

“The national lab environment allows me to lead large projects and collaborate with fantastic scientists and engineers across divisions and institutions.” — Maria Żurek, Argonne assistant physicist

Here, Żurek discusses her research and other work she supports at Argonne.

Q: What role do you play at the lab?
A: I am an experimental nuclear physicist in the Physics division’s Medium Energy group, and I am working to understand the fundamental structure of the visible matter that makes up our world.

Q: What initiatives or projects are you most excited about being involved in at Argonne?
A: The national lab environment allows me to lead large projects and collaborate with fantastic scientists and engineers across divisions and institutions. I have the opportunity to work with talented postdocs on uncovering the inner workings of protons and neutrons using data from the CLAS12 experiment at Jefferson Lab, and I co-lead the development of electromagnetic calorimetry for the ePIC detector at the future Electron-Ion Collider (EIC) at the DOE’s Brookhaven National Laboratory. I am a team player, and doing great science with great people is the best job in the world.

Q: Can you talk a bit about the research you’re conducting for your proposal for which you received the 2025 PSE Early Investigator Named Award?
A: My PSE Early Investigator Named Award project tackles a hard problem: improving calorimetry for hadrons — protons, neutrons and other similar subatomic particles — in the medium-energy range typical of experiments at Jefferson Lab. Neutral particles, like neutrons, and another subatomic particle called muons are notoriously difficult to measure in this range. I will run preliminary simulations to test a practical dual-readout approach that separates light generated by different types of subatomic interactions, with the aim of getting cleaner, more precise energy and position measurements. The goal is to open new opportunities for 3D studies of proton and neutron structure and to provide evidence that can guide the next generation of detector designs.

Q: What do you like most about your job?
A: The people I work with, the diversity of problems I get to solve and the fact that I am always learning something new.

Q: How does your work support the lab’s mission? 
A: In my work I analyze data from world-class DOE user facilities, using measurements to sharpen our most fundamental understanding of how the universe is put together. I design and test modern detector technologies that let us see proton and neutron structure with greater clarity. This work uses Argonne’s strengths in hands-on experimentation and computation, and it delivers practical capability, validated hardware, documented procedures and reconstruction tools, for national research facilities today and for the EIC tomorrow. I work with engineers, scientists and trainees across Argonne to get from concept to instrument to reliable results. That is my piece of the mission.

Q: What do you enjoy doing outside of work?
A: I love hunting for hole-in-the-wall restaurants in Chicago’s neighborhoods and suburbs with my husband, and I never tire of admiring the city’s architecture, always walking with my head up. I love going to ballet, opera, musicals, sports games and concerts. A year ago, I started aerial gymnastics, and I even appreciate the bruises because they mean I am getting better. I enjoy leaf peeping in local parks and running our annual ​“fat squirrel contest” with friends. As someone who moved here, I still carry a newcomer’s curiosity — and ope! — I’m always ready to explore one more corner of American and Midwestern culture.

Q: What other sorts of career or professional development opportunities has Argonne provided?
A: I’ve gotten a lot from Argonne’s Mentorship Program, on both sides. As a mentee, the conversations with my mentors pushed me to set clear goals and get honest feedback; they also gave me a better view of how the lab works across divisions. As a mentor, I’ve learned to give useful feedback and to connect postdocs with the right people and resources. It’s simple, but it works because it creates time for focused conversations. Beyond mentoring, I’ve benefited from proposal workshops, science communication sessions and serving on several internal review committees.

Q: What encouraged you to get involved in the scientific discipline you are in?
A: I have always been drawn to big questions. In school I loved math, physics and chemistry, but I also loved literature for the way a good story pulls you in. A great high school physics teacher showed me that science can do the same thing: It tells a story about how the world works. I thought I might become a teacher, but during university I spent undergraduate internships at Fermilab (another DOE national laboratory), where I saw how national labs ​“zoom in” on particles to understand the building blocks of matter. That experience shifted my path. I wanted to be part of that discovery process.

Since then, I have followed the thread from curiosity to experiment — first, learning how to measure, then learning how to ask better questions, until it became a career in nuclear physics.





Nuclear Waste Transformed: PNNL Scientists Solidify History With Glass


Newswise — RICHLAND, Wash.—In a historic event decades in the making, the Hanford Site recently began immobilizing low-activity, radioactive waste by converting it into glass: a process known as vitrification. The event marked the successful start-up of Hanford’s Waste Treatment and Immobilization Plant, or “Vit Plant,” which will render millions of gallons of waste—generated by plutonium production during the Manhattan Project and Cold War—into glass for safe storage for thousands of years. The milestone also represents nearly 60 years of scientific contributions made by scientists and engineers at the Department of Energy’s Pacific Northwest National Laboratory.

“PNNL is proud to have played a pivotal role in advancing modern vitrification technology,” said Deb Gracio, PNNL director. “This milestone underscores the importance of innovation, collaboration, and scientific excellence in solving some of the world’s most pressing problems. It wouldn’t have been possible without a strong partnership among PNNL, DOE’s Hanford Field Office, Bechtel National Inc., the Office of Environmental Management, Hanford Tank Waste Operations & Closure, and of course our local community and stakeholders.”

Persistent and intense efforts by PNNL researchers—chemical engineers, computational scientists, materials scientists and chemists, among others—have advanced the science of vitrification since the 1960s, making this pursuit of materials science a defining element of PNNL’s history and impact. Not only have their innovations and collaborations with staff at the Vit Plant led to this historic achievement—their work has also informed vitrification operations around the world.

Birthplace of the melter

In the 1960s, researchers at PNNL engineered a technology that even today is among the most widely used tools for nuclear waste vitrification: the liquid-fed ceramic melter, which can be found in vitrification operations on nearly every continent. Inside a melter, where temperatures can reach 2,100°F, low-activity waste is immobilized after being mixed with glass-forming chemicals—using formulas determined by a PNNL algorithm—then fed on top of a pool of molten glass. After the mixture is efficiently converted into glass, it is poured into containers and cooled to yield solid glass with radionuclides “locked” into the atomic structure of the glass. The process is simple at its core, but carrying it out in the real world can be anything but.

Each of the Hanford Site’s 177 one-million-gallon-capacity tanks contained a chemically unique, nonuniform waste. The composition of these wastes dictates both how the waste behaves and which glass-forming chemicals are needed to make an acceptable glass. The “right” glass must not only incorporate and immobilize as much waste as possible—it must also be durable and avoid pitfalls such as being difficult to transport through the plant or producing gas byproducts in quantities that are challenging to treat or damaging to the plant’s infrastructure. Historically, designing a glass that strikes a balance among these goals meant spending a great deal of time fine-tuning the recipes.

For years, this process was carried out in a methodical, back-and-forth approach between glass design and performance testing. Scientists would consider the composition of a target waste, design a type of glass for the task, test its properties and adjust its composition until successful. In most facilities, this process can take months or even years.

“For the Vit Plant here in the Tri-Cities to operate successfully, we had to make it so that process happened on the order of minutes,” said John Vienna, PNNL materials scientist and lab fellow.

The task of vitrifying Hanford Site waste is made profoundly more challenging by the waste’s chemical complexity, according to Vienna, who has led a wide variety of research efforts in waste management, including the design of glasses used at the Hanford Site today. The Hanford Site’s waste is not only the most complex waste in the world but also the largest quantity ever to be targeted for vitrification.

From conventional to computational

Vienna, alongside his fellow scientists and colleagues from the Waste Treatment and Immobilization Plant, with support from DOE glass scientist Albert Kruger and the Hanford Field Office, helped to solve the timeline challenge by innovating an entirely new approach to glass design. Instead of relying on the conventional approach carried out in a laboratory, they created a computational approach that utilizes modeling. Computer models are trained on hundreds of historical test results; fed the chemical makeup of a waste, they predict corresponding “recipes” that yield processable, economical and incredibly long-lasting glass.

Incorporating a partially computational approach has saved many years of effort and many millions of dollars for vitrification operations like those underway at DOE’s Savannah River Site in South Carolina. In the desert of southeastern Washington state, pretreated waste arrives at the beginning of the vitrification process in roughly 9,000-gallon batches. The waste is analyzed, and that information is fed into an algorithm that generates the corresponding glass design.

By comparison, a similar traditional approach was used at New York’s West Valley Demonstration Project site in the late 1990s, where glass design took roughly a decade. At the Hanford Site, this process now takes less than 120 minutes, and PNNL’s glass algorithm app is getting faster with each update.
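The data-driven approach described above can be sketched in miniature: fit a surrogate model to historical melt-test results, then use it to map an analyzed waste composition to a recipe in seconds. Everything below is illustrative only — the component names, numbers and the one-feature linear model are hypothetical stand-ins, not PNNL's actual algorithm, which balances many properties at once.

```python
# Hypothetical sketch of model-based glass design. A surrogate model is fit
# to made-up "historical" melt tests, then predicts a glass-former additive
# fraction for a new batch. All names and numbers are illustrative.

def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b (closed form)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

# Fake historical test results: sodium oxide fraction in the waste vs. the
# boron oxide additive fraction that produced an acceptable glass.
na2o_in_waste = [0.05, 0.10, 0.15, 0.20, 0.25]
b2o3_additive = [0.12, 0.14, 0.16, 0.18, 0.20]

a, b = fit_linear(na2o_in_waste, b2o3_additive)

# A new batch is analyzed; the fitted model returns a recipe instantly,
# replacing months of iterative lab testing for routine compositions.
new_batch_na2o = 0.18
recipe_b2o3 = a * new_batch_na2o + b
print(f"suggested B2O3 additive fraction: {recipe_b2o3:.3f}")
```

The real design space is multidimensional and multi-objective (waste loading, durability, viscosity, off-gas behavior), which is why the production algorithm is trained on hundreds of historical tests rather than a single property pair.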

Many current and former staff at PNNL have contributed to the design of the melters and other key equipment at the Vit Plant. The submerged bed scrubber, the air displacement slurry pump and melter technologies were all initiated and developed at PNNL. Will Eaton, a melter specialist, led portions of the Vit Plant melter designs and continues to lead research to improve melter materials and optimize melter processing. These innovations, along with PNNL contributions to designs led by other Vit Plant partners, make it possible for each melter to produce up to 15 metric tons of glass per day when operating at full capacity.

“PNNL has been an integral part of the Hanford Waste Treatment and Immobilization Plant. They have assisted in solving technical challenges and developed the vitrification glass recipes that are currently being processed in the Low-Activity Waste Facility,” said Chris Musick, general manager of the Bechtel-led Waste Treatment Completion Company LLC. “We look forward to growing our partnership with PNNL in the future as we move forward with treating tank waste and completing the high-level waste scope.”

The next generation

Today, PNNL scientists continue to support Hanford’s Waste Treatment and Immobilization Plant by analyzing pretreated and vitrified waste, as well as providing fast answers during the facility’s start-up. Seeing the first vitrified waste marked an especially satisfying career moment, said materials scientist José Marcial.

“It’s extremely exciting,” said Marcial, whose scientific career began with a vitrification-focused internship at PNNL as a high school student while studying at Kiona-Benton City High School then Columbia Basin College. “This shows that this isn’t just an academic exercise. It’s all of our effort being put to real use to benefit the country and our community. It’s truly an amazing time to be a part of this work.”

Similarly, Vienna, a mentor to Marcial, is enjoying the chance to witness the culmination of scientific effort spanning dozens of careers and thousands of scientific manuscripts and reports. “We’ve got three generations of researchers that have dedicated their careers to Hanford tank waste,” said Vienna. “Since the 1960s, there has always been a vitrification presence here at PNNL.”

Though vitrification at Hanford has begun, the work is far from over. Marcial and others are now focused on continuing near- and long-term support for the Waste Treatment and Immobilization Plant by contributing to improvements in overall efficiency, fine-tuning the glass algorithm performance and being part of the team addressing any emerging operational challenges. Additional PNNL researchers are applying their expertise to the broader cleanup mission, including grout waste form development, tank waste treatment, tank waste solids, the high-level waste facility and environmental remediation of subsurface soil and groundwater. As they look toward the future, Marcial looks toward the next generation of scientists.

“For me, it was an internship that helped me discover my passion and pursue a career that’s both rewarding and beneficial to my local community,” said Marcial. “I grew up here in the Tri-Cities, and at first my parents didn’t know anything about the work that PNNL does. They just knew I wanted to pursue a career in science, so they helped me accomplish that, and I want to do the same for others. I think it’s important to always bring up the next generation of scientists so they, too, can help to solve challenges for the benefit of the country.”

###

About PNNL

Pacific Northwest National Laboratory draws on its distinguishing strengths in chemistry, Earth sciences, biology and data science to advance scientific knowledge and address challenges in energy resiliency and national security. Founded in 1965, PNNL is operated by Battelle and supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit the DOE Office of Science website. For more information on PNNL, visit PNNL’s News Center. Follow us on Twitter, Facebook, LinkedIn and Instagram.




A Smashing Success: Relativistic Heavy Ion Collider Wraps up Final Collisions


Newswise — UPTON, N.Y. — Just after 9 a.m. on Friday, Feb. 6, 2026, final beams of oxygen ions — oxygen atoms stripped of their electrons — circulated through the twin 2.4-mile-circumference rings of the Relativistic Heavy Ion Collider (RHIC) and crashed into one another at nearly the speed of light inside the collider’s two house-sized particle detectors, STAR and sPHENIX. RHIC, a nuclear physics research facility at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory, has been smashing atoms since the summer of 2000. The final collisions cap a quarter century of remarkable experiments using 10 different atomic species colliding over a wide range of energies in different configurations. The RHIC program has produced groundbreaking discoveries about the building blocks of matter and the nature of proton spin, as well as technological advances in accelerators, detectors, and computing that have far surpassed scientists’ expectations when this discovery machine first turned on.

“RHIC has been one of the most successful user facilities operated by the DOE Office of Science, serving thousands of scientists from across the nation and around the globe,” said DOE Under Secretary for Science Darío Gil. “Supporting these one-of-a-kind research facilities pushes the limits of technology and expands our understanding of our world through transformational science — central pillars of DOE’s mission to ensure America’s security and prosperity.”

Gil was in the Main Control Room of Brookhaven Lab’s collider complex to officially end the 25th and final run at RHIC in advance of announcing the next major milestone in the construction of the Electron-Ion Collider (EIC), a state-of-the-art nuclear physics research facility that will be built by reusing major components of RHIC.

“It’s been an amazing run,” said Wolfram Fischer, chair of Brookhaven Lab’s Collider-Accelerator Department (C-AD), speaking of the entirety of the RHIC program. As head of C-AD, Fischer is responsible for the day-to-day, year-to-year operations of the collider and all its ancillary accelerator infrastructure. “Experiencing the challenges of first trying to get beams to circulate during commissioning in the fall of 1999, one could not have dreamed how far the performance of this machine would come,” he said. “We’ve pushed well beyond the original design in terms of the number of collisions we can produce, the energy range of those collisions, the variety of ions we’ve collided, and our ability to align the spins of protons and maintain a high degree of this alignment or polarization.”

The 25th and final run produced the largest-ever dataset from RHIC’s most energetic head-on smashups between two beams of gold ions, among the heaviest ions collided at RHIC. It also yielded a treasure trove of proton-proton collisions that will provide essential comparison data and insight into proton spin, a set of low-energy fixed target collisions to complete RHIC’s “beam energy scan,” and a final burst of oxygen-oxygen interactions. All this data will add to that collected previously by RHIC’s detectors — STAR, which has been running with many upgrades since RHIC’s beginning; PHENIX, another original RHIC detector that ceased operations in 2016; PHOBOS and BRAHMS, two smaller original detectors that ran from 2000 through 2005 and 2006, respectively; and sPHENIX, RHIC’s newest and most rapid-fire collision “camera,” which came online in 2023.

This final run generated the primary data set for the new sPHENIX experiment. This year, sPHENIX accumulated more than 200 petabytes of raw data — or 200 quadrillion bytes — more than all previous RHIC raw datasets combined. This massive dataset includes 40 billion snapshots of the unique form of matter generated in gold-ion collisions.

Collectively, the RHIC measurements will fill in missing details in physicists’ understanding of how a soup of fundamental particles known as quarks and gluons — which last existed in nature some 14 billion years ago, a microsecond after the Big Bang — coalesced and converged to form the more ordinary atomic particles that make up everything visible in our world today. Recreating this primordial matter, known as a quark-gluon plasma (QGP), was the primary reason for building RHIC. RHIC’s energetic collisions of heavy ions such as gold were designed to set quarks and gluons free from “confinement” within protons and neutrons by melting the boundaries of these nuclear particles.

Thanks to considerable contributions from Japan’s RIKEN institute, RHIC was also built with unique capabilities for polarizing protons so that physicists could explore the origins of proton spin. This intrinsic quantum property, somewhat analogous to a planet spinning on its axis, has been leveraged to develop powerful technologies like nuclear magnetic resonance spectroscopy and medical MRI. RHIC’s polarized proton collisions have opened a new window into the mystery of how spin arises from the proton’s quarks and gluons.

PHENIX and STAR have both collected and published results from large swaths of spin-polarized collisions using selection “triggers” to decide which events to capture and study. During Run 25, sPHENIX became the world’s first detector to record a continuous streaming dataset from RHIC’s spin-polarized proton collisions — thus eliminating the need for triggers and potentially paving the way for unanticipated discoveries.

“This final RHIC run, with its impressive dataset, is a capstone that exemplifies the success of the entire RHIC program,” said John Hill, interim director of Brookhaven Lab. “The scientists, engineers, and technicians at Brookhaven deserve huge credit for their dedication and innovation throughout the operating life of RHIC — and for continually finding new ways to maximize the scientific output of this remarkable machine. We are also extremely grateful for the continued support of the U.S. Department of Energy, and for our collaborators from other DOE labs, U.S. universities, and scientific institutions around the globe. This exploration of the matter that makes up our world and of how it came to be has been, and will continue to be, a truly international endeavor.”

Captivating discoveries

In early 2001, as the earliest RHIC data came out, some scientists were convinced that they’d seen signs of the post-Big-Bang QGP. But the data also presented puzzling surprises. Instead of the predicted uniformly expanding gas of quarks and gluons, the matter created in RHIC’s collisions seemed to flow more like a liquid — and, remarkably, one with extremely low viscosity. Additional experiments and a careful multiyear analysis led the four original RHIC collaborations to conclude in 2005 that RHIC was generating a nearly “perfect” liquid. By 2010, they had sufficient evidence to declare this liquid hot enough to be the long-sought QGP.

Since then, RHIC physicists have been making precision measurements of the QGP, including its temperature at different stages, how it swirls — it’s the swirliest matter ever! — how quarks and gluons in the primordial soup transition under various conditions of temperature and pressure to the nuclear matter that makes up atoms in our world, and how collisions of even small particles can create tiny drops of the QGP. They’ve explored exotic forms of nuclear matter such as that found in neutron stars, detected traces of the heaviest exotic antimatter ever created in a laboratory, and explored how visible matter emerges from the “nothingness” of empty space. The sPHENIX experiment has only recently published its first physics results, laying the foundation for its future of scientific insights.

“RHIC transformed nuclear physics by demonstrating the remarkable consequences of ‘boiling the vacuum,’ to paraphrase renowned physicist T. D. Lee’s description of matter governed by quantum chromodynamics (QCD),” said Brookhaven Lab theorist Raju Venugopalan. “In QCD — the theory that describes quarks and gluons and their interactions — findings from RHIC propelled the rapid development of new analytical approaches and high-performance computing. The RHIC data also sparked several unanticipated connections between the behavior of the QGP fluid and strongly correlated condensed matter systems, including ultra-cold atoms, as well as links to concepts such as quantum entanglement and the formation and evaporation of black holes.”

Advances in nuclear physics theory and the enormous RHIC datasets have also pushed the evolution of supercomputers, AI methods for analyzing “big data,” and the infrastructure needed to store and share data seamlessly with RHIC collaborators around the world. In 2024, Brookhaven’s data center — which also houses data from the ATLAS experiment at the Large Hadron Collider (LHC) at CERN, the European Organization for Nuclear Research, and other experiments — passed the milestone of storing 300 petabytes of data, the largest compilation of nuclear and particle physics data in the U.S. With the newest data from RHIC and ATLAS, the total now tops 610 petabytes.

In the proton spin program, RHIC’s measurements greatly improved the precision with which scientists could determine gluons’ contribution to proton spin, along with the contribution from quarks. This effort was motivated by surprising results from experiments elsewhere in the 1980s showing that quarks contribute only a fraction of this quantum property. Gluons were initially assumed to contribute the rest. RHIC’s measurements reveal that gluons contribute about as much as the quarks — not enough to fully solve the “spin puzzle.” A more recent analysis established that at least some of the gluons are spin aligned with the spin of the proton they are in. But there is still more to explore in this spin puzzle.

“Spin is one of the fundamental quantum numbers of every elementary particle in the universe except one, the Higgs,” said Elke Aschenauer, a Brookhaven Lab physicist who has played a pivotal role in RHIC’s spin physics program. “RHIC’s measurements have established the groundwork for understanding the complexity of proton spin. The future EIC will be a precision machine for studying proton spin.”

All Relativistic Heavy Ion Collider data is stored on tape at Brookhaven Lab’s data center. When physicists want access to a particular dataset — or multiple sets simultaneously — a robot grabs the appropriate tape(s) and mounts the desired data to disk within seconds. Collaborators around the world can tap into the data as if it were on their own desktop. (David Rahner/Brookhaven National Laboratory)

Continuing legacy

Even with so many impressive discoveries in the books, RHIC physicists say there will be many more to come for at least another decade.

“The science mission of RHIC will continue until we analyze all the data and publish all the papers,” said Abhay Deshpande, Brookhaven Lab’s associate laboratory director for nuclear and particle physics. He emphasized how important it will be to preserve RHIC’s data for future scientific analyses.

RHIC’s data will also continue to serve as an essential bridge between ongoing and planned experiments exploring nuclear matter at lower collision energies — for example at the Facility for Antiproton and Ion Research (FAIR) being built in Germany and the Super Proton Synchrotron at CERN — and at much higher energies at CERN’s LHC.

“Analyzing the latest RHIC data will also help train the next generation of physicists needed to run and analyze data from future experiments,” said Lijuan Ruan, a Brookhaven Lab physicist and co-spokesperson for the STAR Collaboration.    

A big part of that future will take place right here at Brookhaven National Laboratory, where major components of the RHIC accelerator complex will live on in a new nuclear physics research facility, the world’s only polarized Electron-Ion Collider. Engineers and technicians will remove one of RHIC’s ion storage rings and replace it with a new ring for storing accelerated electrons inside the existing accelerator tunnel. Meanwhile, the other RHIC ring, refurbished for its new mission, will receive ions accelerated by C-AD’s existing injector complex, traveling around the tunnel in the opposite direction from the electrons. Scientists will leverage the experience gained during 25 years of RHIC operations — as well as reams of RHIC accelerator physics data — to develop and train new AI algorithms designed to optimize EIC accelerator performance.

When electrons collide with ions where the two EIC rings cross, the action will be captured by a brand-new particle detector. Instead of recreating the early universe, these microscope-like interactions will enable precision measurements that reveal how quarks and gluons are organized and interact within matter as we know it in today’s world.

“We’ll learn how quarks and gluons generate mass, how their interactions contribute to proton spin, and much more that will revolutionize our understanding of matter — much as the science we’ve explored at RHIC has,” said Deshpande, who also serves as director of science for the EIC. “This is the future of Brookhaven Lab and nuclear physics in the U.S.”

Daniel Marx, one of the accelerator physicists working on the design of the EIC’s new electron storage ring, said, “It’s going to be very challenging, but also exciting. We’ll be doing things that have never been done before.”

Perhaps Marx was echoing the sentiments of the physicists who originally built RHIC, demonstrating another big part of RHIC’s legacy: an ongoing willingness to tackle unprecedented scientific and technological challenges.

“We are confident that we have the people who will make the EIC happen because of the expertise we have developed by building and running RHIC,” Deshpande said.

RHIC and the future EIC are funded primarily by the DOE Office of Science.

Brookhaven National Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit science.energy.gov.

Follow @BrookhavenLab on social media. Find us on Instagram, LinkedIn, X, and Facebook.
