Feeling the Vibe


Newswise — It started with a social media post from Andrej Karpathy, one of the founders of OpenAI. Last year, he tweeted, “There’s a new kind of coding I call ‘vibe coding,’ where you fully give in to the vibes, embrace exponentials, and forget that the code even exists.” Karpathy said that large language models and voice-to-text programs had gotten so sophisticated that he could simply ask a model to create something and then copy and paste the code it generated to build a project or create a web app from scratch. “I just see stuff, say stuff, run stuff, and copy-paste stuff, and it mostly works.”

That groovy technique might be good for patching a glitchy website or building a phone app, but can it really change the way we do science? Researchers from the U.S. Department of Energy’s (DOE) Argonne National Laboratory are testing vibe coding tools and techniques to see how they stand up to data-intensive scientific challenges. At a recent hackathon, researchers from across the lab gathered to learn together and test commercially available coding tools like Cursor and Warp against scientific challenges as large and hairy as the hunt for dark matter and as pressing as the optimization of nuclear power plants. 

As a long-time leader in computational science and the home of Aurora, one of the world’s fastest and most powerful supercomputers, Argonne is no stranger to grand challenges. But to solve huge problems and to process more data than ever before, researchers are working to stay at the bleeding edge of harnessing artificial intelligence (AI) for science.

Rick Stevens sees vibe coding as another way Argonne researchers can continue to speed up scientific innovation. Stevens is the associate laboratory director for Computing, Environment and Life Sciences at Argonne. He has said that scientists need to be able to work as fast as they can think, and he gets frustrated by the bottlenecks of current technology. Vibe coding, in his view, is a productivity hack. “You’re unhobbled from your coding speed,” said Stevens.

With vibe coding, researchers can interact with large language models in real time, asking them questions by talking rather than by typing commands, and then getting usable output in seconds or minutes. Stevens compared it to having an AI co-scientist — or even a team of co-scientists — working alongside you. He challenged fellow scientists to work with the technology every day. ​“You need to get your head around how to be productive in this environment,” he said. ​“Think, play and have a blast!”

Breaking barriers between ideas and action 

Part of the excitement around vibe coding is that we don’t know how it’s going to change science. At the hackathon, the vibe in the room was playful. The group was a mix of coders and non-coders from a variety of disciplines. Instead of quietly pecking away at their keyboards, researchers were laughing, bouncing ideas off each other and confidently speaking commands to their laptops. 

The promise of AI and vibe coding isn’t just about doing science faster, Stevens explained. These tools free up scientists to be more creative, to put their energy toward things that only a human can do. ​“With these tools, you’re not bottlenecked by writing code,” he said. ​“Now, you’re focused on ideas.” 

Here are some of the ideas Argonne scientists are vibing on:

1. Prototyping software to strengthen nuclear power plants

Nuclear power plants are an integral part of America’s energy supply and a reliable source of power for the growing energy needs of AI. Nuclear engineer Yeni Li and her team are creating AI models of those power plants to help plant engineers and managers predict the best times for maintenance. That knowledge can lead to more reliable and affordable energy production. 

Li said that vibe coding will be useful for setting up the software architecture she needs to turn her ideas into prototypes. ​“These tools will help us do a few days of work in a single afternoon,” said Li. 

2. Automating workflows in bioscience

Rosemarie Wilton doesn’t do a lot of coding in her work as a molecular biologist, but she does spend a significant amount of time using software tools for data analysis. Developing Python-coded pipelines would allow her to automate her data processing workflows and integrate multiple tools seamlessly. She was delighted to see how fast vibe coding could give her the command codes she needed. ​“For a coding novice, it’s really quite amazing. It will be a time saver,” she said. 
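
To make that concrete, here is a minimal sketch of the kind of Python pipeline a vibe-coding session might produce for a workflow like Wilton’s. Everything in it is hypothetical: the folder layout, the “run_qc” command standing in for an external analysis tool, and the report format are placeholders rather than tools from her lab.

```python
"""Minimal sketch of an automated data-processing pipeline.

All file names, commands and column names are hypothetical placeholders;
a real pipeline would swap in the lab's own tools.
"""
import csv
import subprocess
from pathlib import Path

RAW_DIR = Path("raw_data")            # hypothetical folder of instrument output
RESULTS = Path("results/summary.csv")


def run_quality_check(sample: Path) -> Path:
    """Run an external analysis tool on one sample and return its report file."""
    report = sample.with_suffix(".qc.txt")
    # "run_qc" stands in for whatever command-line tool the lab already uses.
    subprocess.run(["run_qc", str(sample), "-o", str(report)], check=True)
    return report


def summarize(report: Path) -> dict:
    """Pull key-value fields out of a plain-text report."""
    fields = {"sample": report.stem}
    for line in report.read_text().splitlines():
        if ":" in line:
            key, value = line.split(":", 1)
            fields[key.strip().lower()] = value.strip()
    return fields


def main() -> None:
    RESULTS.parent.mkdir(parents=True, exist_ok=True)
    rows = [summarize(run_quality_check(s)) for s in sorted(RAW_DIR.glob("*.fastq"))]
    if not rows:
        return
    with RESULTS.open("w", newline="") as handle:
        writer = csv.DictWriter(handle, fieldnames=sorted({k for r in rows for k in r}))
        writer.writeheader()
        writer.writerows(rows)


if __name__ == "__main__":
    main()
```

The point is the shape rather than the specifics: each routine step becomes a small function, and the whole chain runs with one command instead of a series of manual steps.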

That quick win in generating command codes led Wilton and Computational Biologist Nick Chia to think about other ways vibe coding could help. Chia mused, ​“If we have an AI agent generating hypotheses for experiments, could we create another AI agent to order the chemicals or samples needed to run those experiments?” Speeding up routine processes like these could help Wilton and her team track the spread of human pathogens with greater accuracy or engineer new enzymes and biosynthetic pathways faster than ever before. 

3. Translating coding languages in science infrastructure

Zachary Sherman is a software developer who manages open-source Python tools for the Atmospheric Radiation Measurement group. He came to the hackathon looking for ways to quickly translate other coding languages into Python, a task that could take years of tedious manual coding. 

“There are many different atmospheric tools in different coding languages and also databases with application programming interfaces for downloading and interacting with atmospheric datasets,” said Sherman. ​“Some of these tools are outdated. We think vibe coding can help us create tools in Python to interact with these interfaces to download and work with the datasets. We also think vibe coding will help us modernize these code bases so we can troubleshoot issues faster and save time and money as we maintain essential scientific infrastructure.”
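
As a rough illustration of the thin Python wrappers Sherman describes, the sketch below requests one dataset from a web service and saves it to disk. The base URL, endpoint and query parameters are invented for the example and do not correspond to any real ARM interface; a vibe-coded version would target the actual documented API.

```python
"""Hypothetical sketch of a small Python client for an atmospheric-data API.

The base URL, endpoint and parameters are placeholders, not a real service.
"""
from pathlib import Path

import requests  # assumes the widely used 'requests' HTTP library is installed

BASE_URL = "https://example.org/atmos-api"   # placeholder, not a real endpoint


def download_dataset(site: str, start: str, end: str, out_dir: str = "downloads") -> Path:
    """Request one dataset for a site and date range and write it to disk."""
    response = requests.get(
        f"{BASE_URL}/datasets",
        params={"site": site, "start": start, "end": end, "format": "netcdf"},
        timeout=60,
    )
    response.raise_for_status()
    out_path = Path(out_dir) / f"{site}_{start}_{end}.nc"
    out_path.parent.mkdir(parents=True, exist_ok=True)
    out_path.write_bytes(response.content)
    return out_path


if __name__ == "__main__":
    print(download_dataset("sgp", "2024-01-01", "2024-01-07"))
```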

4. Understanding the nature of the universe

Chiara Bissolotti is a nuclear physicist trying to understand how all known particles interact. Tim Hobbs is a theoretical particle physicist trying to identify unknown particles that can help us understand the nature of dark matter or other possible ​“new physics” in the universe. Both of their fields generate huge amounts of data from theoretical computer simulations, cosmological observations and experiments at research institutions such as CERN’s Large Hadron Collider and the planned Electron-Ion Collider at DOE’s Brookhaven National Laboratory. The information hidden where their data sets overlap could be the key to answering some of the biggest mysteries of the universe, from quarks to the cosmos. But merging those data sets is a monumental task if you’re coding and comparing them by hand. 

“Can the data sets talk to each other?” asked Hobbs. ​“Might they be hiding common patterns, or guide us toward novel theoretical predictions or the automation of burdensome calculations?” 

Bissolotti summed it up, ​“We have many, many ideas. Many more ideas than time. If vibe coding can help us build the scaffolding of the code or help us make the data comparisons more scalable and efficient, we can cut our time to solution by a huge factor.”

5. Collaborating on complex problems in national security

Jonathan Ozik is a computational scientist who uses supercomputers and simulations to understand large and complex systems across many scientific domains, such as biological systems, health care interventions and infectious diseases in urban settings. He said vibe coding can help him explain his work to the many collaborators from different backgrounds that he works with. He also sees it as a way that he can help himself switch between complex projects. ​“It could give me a two-minute reintroduction to the code and the context I’m working in,” he said. ​“There’s no reason not to try to make your daily tasks easier.” 

Ozik predicts vibe coding will open research up to ideas we can’t yet begin to imagine: ​“If you have fewer perceived barriers, you create new possibilities. Things that were previously infeasible in science will become common.”

Argonne National Laboratory seeks solutions to pressing national problems in science and technology by conducting leading-edge basic and applied research in virtually every scientific discipline. Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science.

The U.S. Department of Energy’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.




Untangling Signals From Subatomic Particles


Newswise — Each year, the Physical Sciences and Engineering (PSE) directorate at the U.S. Department of Energy’s (DOE) Argonne National Laboratory recognizes exceptional early-career researchers breaking into their fields with the PSE Early Investigator Named Awards. In 2025, the lab announced that six awardees would be receiving support in the form of funding and mentorship to conduct groundbreaking research aligned with Argonne’s strategic mission.

One member of the 2025 cohort is Maria Żurek, an assistant physicist in Argonne’s Physics (PHY) division, who studies the fundamental structure of protons and neutrons using the Continuous Electron Beam Accelerator Facility (CEBAF) at the DOE’s Thomas Jefferson National Accelerator Facility. For the PSE Early Investigator Named Award, Żurek will work under the guidance of Sylvester Joosten, interim leader of the Medium Energy group at Argonne, on a proposal titled, ​“Seeing the Unseen: Precision Calorimetry for 3D Nucleon Imaging.” In particle physics experiments, calorimetry refers to detection and analysis methods used to calculate particle energy.

“The national lab environment allows me to lead large projects and collaborate with fantastic scientists and engineers across divisions and institutions.” — Maria Żurek, Argonne assistant physicist

Here, Żurek discusses her research and other work she supports at Argonne.

Q: What role do you play at the lab?
A: I am an experimental nuclear physicist in the Physics division’s Medium Energy group, and I am working to understand the fundamental structure of the visible matter that makes up our world.

Q: What initiatives or projects are you most excited about being involved in at Argonne?
A: The national lab environment allows me to lead large projects and collaborate with fantastic scientists and engineers across divisions and institutions. I have the opportunity to work with talented postdocs on uncovering the inner workings of protons and neutrons using data from the CLAS12 experiment at Jefferson Lab, and I co-lead the development of electromagnetic calorimetry for the ePIC detector at the future Electron-Ion Collider (EIC) at the DOE’s Brookhaven National Laboratory. I am a team player, and doing great science with great people is the best job in the world.

Q: Can you talk a bit about the research you’re conducting for your proposal for which you received the 2025 PSE Early Investigator Named Award?
A: My PSE Early Investigator Named Award project tackles a hard problem: improving calorimetry for hadrons — protons, neutrons and other similar subatomic particles — in the medium-energy range typical of experiments at Jefferson Lab. Neutral particles such as neutrons, as well as other subatomic particles called muons, are notoriously difficult to measure in this range. I will run preliminary simulations to test a practical dual-readout approach that separates light generated by different types of subatomic interactions, with the aim of getting cleaner, more precise energy and position measurements. The goal is to open new opportunities for 3D studies of proton and neutron structure and to provide evidence that can guide the next generation of detector designs.

Q: What do you like most about your job?
A: The people I work with, the diversity of problems I get to solve and the fact that I am always learning something new.

Q: How does your work support the lab’s mission? 
A: In my work I analyze data from world-class DOE user facilities, using measurements to sharpen our most fundamental understanding of how the universe is put together. I design and test modern detector technologies that let us see proton and neutron structure with greater clarity. This work uses Argonne’s strengths in hands-on experimentation and computation, and it delivers practical capabilities, such as validated hardware, documented procedures and reconstruction tools, for national research facilities today and for the EIC tomorrow. I work with engineers, scientists and trainees across Argonne to get from concept to instrument to reliable results. That is my piece of the mission.

Q: What do you enjoy doing outside of work?
A: I love hunting for hole-in-the-wall restaurants in Chicago’s neighborhoods and suburbs with my husband, and I never tire of admiring the city’s architecture, always walking with my head up. I love going to ballet, opera, musicals, sports games and concerts. A year ago, I started aerial gymnastics, and I even appreciate the bruises because they mean I am getting better. I enjoy leaf peeping in local parks and running our annual ​“fat squirrel contest” with friends. As someone who moved here, I still carry a newcomer’s curiosity — and ope! — I’m always ready to explore one more corner of American and Midwestern culture.

Q: What other sorts of career or professional development opportunities has Argonne provided?
A: I’ve gotten a lot from Argonne’s Mentorship Program, on both sides. As a mentee, the conversations with my mentors pushed me to set clear goals and get honest feedback; they also gave me a better view of how the lab works across divisions. As a mentor, I’ve learned to give useful feedback and to connect postdocs with the right people and resources. It’s simple, but it works because it creates time for focused conversations. Beyond mentoring, I’ve benefited from proposal workshops, science communication sessions and serving on several internal review committees.

Q: What encouraged you to get involved in the scientific discipline you are in?
A: I have always been drawn to big questions. In school I loved math, physics and chemistry, but I also loved literature for the way a good story pulls you in. A great high school physics teacher showed me that science can do the same thing: It tells a story about how the world works. I thought I might become a teacher, but during university I spent undergraduate internships at Fermilab (another DOE national laboratory), where I saw how national labs ​“zoom in” on particles to understand the building blocks of matter. That experience shifted my path. I wanted to be part of that discovery process.

Since then, I have followed the thread from curiosity to experiment — first, learning how to measure, then learning how to ask better questions, until it became a career in nuclear physics.

Argonne National Laboratory seeks solutions to pressing national problems in science and technology by conducting leading-edge basic and applied research in virtually every scientific discipline. Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science.

The U.S. Department of Energy’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.




Nuclear Waste Transformed: PNNL Scientists Solidify History With Glass


Newswise — RICHLAND, Wash.—In a historic event decades in the making, the Hanford Site recently began immobilizing low-activity, radioactive waste by converting it into glass: a process known as vitrification. The event marked the successful start-up of Hanford’s Waste Treatment and Immobilization Plant, or “Vit Plant,” which will render millions of gallons of waste—generated by plutonium production during the Manhattan Project and Cold War—into glass for safe storage for thousands of years. The milestone also represents nearly 60 years of scientific contributions made by scientists and engineers at the Department of Energy’s Pacific Northwest National Laboratory.

“PNNL is proud to have played a pivotal role in advancing modern vitrification technology,” said Deb Gracio, PNNL director. “This milestone underscores the importance of innovation, collaboration, and scientific excellence in solving some of the world’s most pressing problems. It wouldn’t have been possible without a strong partnership among PNNL, DOE’s Hanford Field Office, Bechtel National Inc., the Office of Environmental Management, Hanford Tank Waste Operations & Closure, and of course our local community and stakeholders.”

Persistent and intense efforts by PNNL researchers—chemical engineers, computational scientists, materials scientists and chemists, among others—have advanced the science of vitrification since the 1960s, making this pursuit of materials science a defining element of PNNL’s history and impact. Not only have their innovations and collaborations with staff at the Vit Plant led to this historic achievement—their work has also informed vitrification operations around the world.

Birthplace of the melter

In the 1960s, researchers at PNNL engineered a technology that even today is among the most widely used tools for nuclear waste vitrification: the liquid-fed ceramic melter, which can be found amid vitrification operations on nearly every continent. Inside a melter, where temperatures can reach 2,100°F, low-activity waste is immobilized after being mixed with glass-forming chemicals—using formulas determined by a PNNL algorithm—then fed on top of a pool of molten glass. After the mixture is efficiently converted into glass, it is poured into containers and cooled to yield solid glass with radionuclides “locked” into the atomic structure of the glass. The process is simple at its core, but carrying it out in the real world can be anything but.

Each of the Hanford Site’s 177 underground storage tanks contained a chemically unique and nonuniform mix of waste. The composition of these wastes dictates both the behavior of the waste and which glass-forming chemicals are needed to make an acceptable glass. The “right” glass must not only incorporate and immobilize as much waste as possible; it also needs to be durable and to avoid pitfalls such as being difficult to transport through the plant or producing gas byproducts in quantities that are challenging to treat or damaging to the plant’s infrastructure. Historically, designing a glass that strikes a balance among these goals meant spending a great deal of time fine-tuning the recipes.

For years, this process was carried out in a methodical, back-and-forth approach between glass design and performance testing. Scientists would consider the composition of a target waste, design a type of glass for the task, test its properties and adjust its composition until successful. In most facilities, this process can take months or even years.

“For the Vit Plant here in the Tri-Cities to operate successfully, we had to make it so that process happened on the order of minutes,” said John Vienna, PNNL materials scientist and lab fellow.

The task of vitrifying Hanford Site waste is made profoundly more challenging by the waste’s chemical complexity, according to Vienna, who has led a wide variety of research efforts in waste management, including the design of glasses used at the Hanford Site today. The Hanford Site’s waste is not only the most complex waste in the world but also the largest quantity ever to be targeted for vitrification.

From conventional to computational

Vienna, alongside his fellow scientists and colleagues from the Waste Treatment and Immobilization Plant, with support from DOE glass scientist Albert Kruger and the Hanford Field Office, helped to solve the timeline challenge by innovating an entirely new approach to glass design. Instead of relying on the conventional approach carried out in a laboratory, they created a computational approach built on modeling. Computer models are trained on hundreds of historical testing results and then prompted to make predictions: they take in the chemical makeup of waste and generate corresponding “recipes” that yield processable, economical and incredibly long-lasting glass.
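
The article does not detail PNNL’s models, but the general shape of a data-driven glass model can be sketched in a few lines. The toy example below fits a linear model that maps composition fractions to a single glass property using made-up numbers; the component list, training data and prediction are purely illustrative and are not PNNL’s actual formulation.

```python
"""Toy sketch of a data-driven glass-property model.

Fits a linear map from composition fractions to one property by least squares.
All numbers and component names are synthetic illustrations, not PNNL data.
"""
import numpy as np

# Hypothetical training set: mass fractions of a few glass components (rows)
# and a measured property for each historical melt (e.g., a normalized
# durability score).
components = ["SiO2", "B2O3", "Na2O", "Al2O3"]
X_train = np.array([
    [0.45, 0.10, 0.20, 0.25],
    [0.50, 0.12, 0.18, 0.20],
    [0.40, 0.15, 0.25, 0.20],
    [0.48, 0.08, 0.22, 0.22],
    [0.42, 0.14, 0.21, 0.23],
])
y_train = np.array([0.82, 0.88, 0.74, 0.86, 0.78])   # synthetic property values

# Fit property ~ X @ coef by ordinary least squares.
coef, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)


def predict_property(fractions: dict) -> float:
    """Predict the property for a candidate composition (fractions sum to 1)."""
    x = np.array([fractions[c] for c in components])
    return float(x @ coef)


if __name__ == "__main__":
    candidate = {"SiO2": 0.46, "B2O3": 0.11, "Na2O": 0.21, "Al2O3": 0.22}
    print(f"Predicted property: {predict_property(candidate):.3f}")
```

PNNL’s real models are trained on hundreds of measured glasses, predict many properties at once and sit inside layers of processing constraints, but the basic idea is the same: learn the relationship between composition and properties from historical data, then use it to evaluate candidate recipes for each new batch of waste in minutes rather than months.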

Incorporating a partially computational approach has saved many years of effort and many millions of dollars for vitrification operations like those underway at DOE’s Savannah River Site in South Carolina. In the desert of southeastern Washington state, pretreated waste arrives at the beginning of the vitrification process in roughly 9,000-gallon batches. The waste is analyzed, and that information is fed into an algorithm that generates the corresponding glass design.

By comparison, a similar traditional approach was used at New York’s West Valley Demonstration Project site in the late 1990s, where glass design took roughly a decade. At the Hanford Site, this process now takes less than 120 minutes, and PNNL’s glass algorithm app is getting faster with each update.

Many current and former staff at PNNL have contributed to the design of the melters and other key equipment at the Vit Plant. The submerged bed scrubber, the air displacement slurry pump and melter technologies were all initiated and developed at PNNL. Will Eaton, a melter specialist, led portions of the Vit Plant melter designs and continues to lead research to improve melter materials and optimize melter processing. These innovations, along with PNNL contributions to designs led by other Vit Plant partners, make it possible for each melter to produce up to 15 metric tons of glass per day when operating at full capacity.

“PNNL has been an integral part of the Hanford Waste Treatment and Immobilization Plant. They have assisted in solving technical challenges and developed the vitrification glass recipes that are currently being processed in the Low-Activity Waste Facility,” said Chris Musick, general manager of the Bechtel-led Waste Treatment Completion Company LLC. “We look forward to growing our partnership with PNNL in the future as we move forward with treating tank waste and completing the high-level waste scope.”

The next generation

Today, PNNL scientists continue to support Hanford’s Waste Treatment and Immobilization Plant by analyzing pretreated and vitrified waste, as well as providing fast answers during the facility’s start-up. Seeing the first vitrified waste marked an especially satisfying career moment, said materials scientist José Marcial.

“It’s extremely exciting,” said Marcial, whose scientific career began with a vitrification-focused internship at PNNL while he was a student at Kiona-Benton City High School and, later, Columbia Basin College. “This shows that this isn’t just an academic exercise. It’s all of our effort being put to real use to benefit the country and our community. It’s truly an amazing time to be a part of this work.”

Similarly, Vienna, a mentor to Marcial, is enjoying the chance to witness the culmination of scientific effort spanning dozens of careers and thousands of scientific manuscripts and reports. “We’ve got three generations of researchers that have dedicated their careers to Hanford tank waste,” said Vienna. “Since the 1960s, there has always been a vitrification presence here at PNNL.”

Though vitrification at Hanford has begun, the work is far from over. Marcial and others are now focused on continuing near- and long-term support for the Waste Treatment and Immobilization Plant by contributing to improvements in overall efficiency, fine-tuning the glass algorithm performance and being part of the team addressing any emerging operational challenges. Additional PNNL researchers are applying their expertise to the broader cleanup mission, including grout waste form development, tank waste treatment, tank waste solids, the high-level waste facility and environmental remediation of subsurface soil and groundwater. As they look toward the future, Marcial looks toward the next generation of scientists.

“For me, it was an internship that helped me discover my passion and pursue a career that’s both rewarding and beneficial to my local community,” said Marcial. “I grew up here in the Tri-Cities, and at first my parents didn’t know anything about the work that PNNL does. They just knew I wanted to pursue a career in science, so they helped me accomplish that, and I want to do the same for others. I think it’s important to always bring up the next generation of scientists so they, too, can help to solve challenges for the benefit of the country.”

###

About PNNL

Pacific Northwest National Laboratory draws on its distinguishing strengths in chemistry, Earth sciences, biology and data science to advance scientific knowledge and address challenges in energy resiliency and national security. Founded in 1965, PNNL is operated by Battelle and supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit the DOE Office of Science website. For more information on PNNL, visit PNNL’s News Center. Follow us on Twitter, Facebook, LinkedIn and Instagram.




A Smashing Success: Relativistic Heavy Ion Collider Wraps up Final Collisions


Newswise — UPTON, N.Y. — Just after 9 a.m. on Friday, Feb. 6, 2026, final beams of oxygen ions — oxygen atoms stripped of their electrons — circulated through the twin 2.4-mile-circumference rings of the Relativistic Heavy Ion Collider (RHIC) and crashed into one another at nearly the speed of light inside the collider’s two house-sized particle detectors, STAR and sPHENIX. RHIC, a nuclear physics research facility at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory, has been smashing atoms since the summer of 2000. The final collisions cap a quarter century of remarkable experiments using 10 different atomic species colliding over a wide range of energies in different configurations. The RHIC program has produced groundbreaking discoveries about the building blocks of matter and the nature of proton spin, as well as technological advances in accelerators, detectors and computing that have far surpassed scientists’ expectations when this discovery machine first turned on.

“RHIC has been one of the most successful user facilities operated by the DOE Office of Science, serving thousands of scientists from across the nation and around the globe,” said DOE Under Secretary for Science Darío Gil. “Supporting these one-of-a-kind research facilities pushes the limits of technology and expands our understanding of our world through transformational science — central pillars of DOE’s mission to ensure America’s security and prosperity.”

Gil was in the Main Control Room of Brookhaven Lab’s collider complex to officially end the 25th and final run at RHIC in advance of announcing the next major milestone in the construction of the Electron-Ion Collider (EIC), a state-of-the-art nuclear physics research facility that will be built by reusing major components of RHIC.

“It’s been an amazing run,” said Wolfram Fischer, chair of Brookhaven Lab’s Collider-Accelerator Department (C-AD), speaking of the entirety of the RHIC program. As head of C-AD, Fischer is responsible for the day-to-day, year-to-year operations of the collider and all its ancillary accelerator infrastructure. “Experiencing the challenges of first trying to get beams to circulate during commissioning in the fall of 1999, one could not have dreamed how far the performance of this machine would come,” he said. “We’ve pushed well beyond the original design in terms of the number of collisions we can produce, the energy range of those collisions, the variety of ions we’ve collided, and our ability to align the spins of protons and maintain a high degree of this alignment or polarization.”

The 25th and final run produced the largest-ever dataset from RHIC’s most energetic head-on smashups between two beams of gold ions, among the heaviest ions collided at RHIC. It also yielded a treasure trove of proton-proton collisions that will provide essential comparison data and insight into proton spin, a set of low-energy fixed-target collisions to complete RHIC’s “beam energy scan,” and a final burst of oxygen-oxygen interactions. All this data will add to that collected previously by RHIC’s detectors — STAR, which has been running with many upgrades since RHIC’s beginning; PHENIX, another original RHIC detector that ceased operations in 2016; PHOBOS and BRAHMS, two smaller original detectors that ran from 2000 through 2005 and 2006, respectively; and sPHENIX, RHIC’s newest and most rapid-fire collision “camera,” which came online in 2023.

This final run generated the primary data set for the new sPHENIX experiment. This year, sPHENIX accumulated more than 200 petabytes of raw data — or 200 quadrillion bytes — more than all previous RHIC raw datasets combined. This massive dataset includes 40 billion snapshots of the unique form of matter generated in gold-ion collisions.

Collectively, the RHIC measurements will fill in missing details in physicists’ understanding of how a soup of fundamental particles known as quarks and gluons — which last existed in nature some 14 billion years ago, a microsecond after the Big Bang — coalesced and converged to form the more ordinary atomic particles that make up everything visible in our world today. Recreating this primordial matter, known as a quark-gluon plasma (QGP), was the primary reason for building RHIC. RHIC’s energetic collisions of heavy ions such as gold were designed to set quarks and gluons free from “confinement” within protons and neutrons by melting the boundaries of these nuclear particles.

Thanks to considerable contributions from Japan’s RIKEN institute, RHIC was also built with unique capabilities for polarizing protons so that physicists could explore the origins of proton spin. This intrinsic quantum property, somewhat analogous to a planet spinning on its axis, has been leveraged to develop powerful technologies like nuclear magnetic resonance and medical MRI. RHIC’s polarized proton collisions have opened a new window into the mystery of how spin arises from the proton’s quarks and gluons.

PHENIX and STAR have both collected and published results from large swaths of spin-polarized collisions using selection “triggers” to decide which events to capture and study. During Run 25, sPHENIX became the world’s first detector to record a continuous streaming dataset from RHIC’s spin-polarized proton collisions — thus eliminating the need for triggers and potentially paving the way for unanticipated discoveries.

“This final RHIC run, with its impressive dataset, is a capstone that exemplifies the success of the entire RHIC program,” said John Hill, interim director of Brookhaven Lab. “The scientists, engineers, and technicians at Brookhaven deserve huge credit for their dedication and innovation throughout the operating life of RHIC — and for continually finding new ways to maximize the scientific output of this remarkable machine. We are also extremely grateful for the continued support of the U.S. Department of Energy, and for our collaborators from other DOE labs, U.S. universities, and scientific institutions around the globe. This exploration of the matter that makes up our world and of how it came to be has been, and will continue to be, a truly international endeavor.”

Captivating discoveries

In early 2001, as the earliest RHIC data came out, some scientists were convinced that they’d seen signs of the post-Big-Bang QGP. But the data also presented puzzling surprises. Instead of the predicted uniformly expanding gas of quarks and gluons, the matter created in RHIC’s collisions seemed to flow more like a liquid — and, remarkably, one with extremely low viscosity. Additional experiments and a careful multiyear analysis led the four original RHIC collaborations to conclude in 2005 that RHIC was generating a nearly “perfect” liquid. By 2010, they had sufficient evidence to declare this liquid hot enough to be the long-sought QGP.

Since then, RHIC physicists have been making precision measurements of the QGP, including its temperature at different stages, how it swirls — it’s the swirliest matter ever! — how quarks and gluons in the primordial soup transition under various conditions of temperature and pressure to the nuclear matter that makes up atoms in our world, and how collisions of even small particles can create tiny drops of the QGP. They’ve explored exotic forms of nuclear matter such as that found in neutron stars, detected traces of the heaviest exotic antimatter ever created in a laboratory, and explored how visible matter emerges from the “nothingness” of empty space. The sPHENIX experiment has only recently published its first physics results, laying the foundation for its future of scientific insights.

“RHIC transformed nuclear physics by demonstrating the remarkable consequences of ‘boiling the vacuum,’ to paraphrase renowned physicist T. D. Lee’s description of matter governed by quantum chromodynamics (QCD),” said Brookhaven Lab theorist Raju Venugopalan. “In QCD — the theory that describes quarks and gluons and their interactions — findings from RHIC propelled the rapid development of new analytical approaches and high-performance computing. The RHIC data also sparked several unanticipated connections between the behavior of the QGP fluid and strongly correlated condensed matter systems, including ultra-cold atoms, as well as links to concepts such as quantum entanglement and the formation and evaporation of black holes.”

Advances in nuclear physics theory and the enormous RHIC datasets have also pushed the evolution of supercomputers, AI methods for analyzing “big data,” and the infrastructure needed to store and share data seamlessly with RHIC collaborators around the world. In 2024, Brookhaven’s data center — which also houses data from the ATLAS experiment at the Large Hadron Collider (LHC) at CERN, the European Organization for Nuclear Research, and other experiments — passed the milestone of storing 300 petabytes of data, the largest compilation of nuclear and particle physics data in the U.S. With the newest data from RHIC and ATLAS, the total now tops 610 petabytes.

In the proton spin program, RHIC’s measurements greatly improved the precision with which scientists could determine gluons’ contribution to proton spin, along with the contribution from quarks. This effort was motivated by surprising results from experiments elsewhere in the 1980s showing that quarks contribute only a fraction to this quantum property. Gluons were initially assumed to contribute the rest. RHIC’s measurements reveal that gluons contribute about as much as the quarks — not enough to fully solve the “spin puzzle.” A more recent analysis established that at least some of the gluons are spin aligned with the spin of the proton they are in. But there is still more to explore in this spin puzzle.

“Spin is one of the fundamental quantum numbers of every elementary particle in the universe except one, the Higgs,” said Elke Aschenauer, a Brookhaven Lab physicist who has played a pivotal role in RHIC’s spin physics program. “RHIC’s measurements have established the groundwork for understanding the complexity of proton spin. The future EIC will be a precision machine for studying proton spin.”

All Relativistic Heavy Ion Collider data is stored on tape at Brookhaven Lab’s data center. When physicists want access to a particular dataset — or multiple sets simultaneously — a robot grabs the appropriate tape(s) and mounts the desired data to disk within seconds. Collaborators around the world can tap into the data as if it were on their own desktop. (David Rahner/Brookhaven National Laboratory)

Continuing legacy

Even with so many impressive discoveries in the books, RHIC physicists say there will be many more to come for at least another decade.

“The science mission of RHIC will continue until we analyze all the data and publish all the papers,” said Abhay Deshpande, Brookhaven Lab’s associate laboratory director for nuclear and particle physics. He emphasized how important it will be to preserve RHIC’s data for future scientific analyses.

RHIC’s data will also continue to serve as an essential bridge between ongoing and planned experiments exploring nuclear matter at lower collision energies — for example at the Facility for Antiproton and Ion Research (FAIR) being built in Germany and the Super Proton Synchrotron at CERN — and at much higher energies at CERN’s LHC.

“Analyzing the latest RHIC data will also help train the next generation of physicists needed to run and analyze data from future experiments,” said Lijuan Ruan, a Brookhaven Lab physicist and co-spokesperson for the STAR Collaboration.    

A big part of that future will take place right here at Brookhaven National Laboratory where major components of the RHIC accelerator complex will live on in a new nuclear physics research facility, the world’s only polarized Electron-Ion Collider. Engineers and technicians will remove one of RHIC’s ion storage rings and replace it with a new ring for storing accelerated electrons inside the existing accelerator tunnel. Meanwhile, the other RHIC ring, refurbished for its new mission, will receive ions accelerated by C-AD’s existing injector complex, traveling around the tunnel in the opposite direction from the electrons. Scientists will leverage the experience gained during 25 years of RHIC operations — as well as reams of RHIC accelerator physics data — to develop and train new AI algorithms designed to optimize EIC accelerator performance.  

When electrons collide with ions where the two EIC rings cross, the action will be captured by a brand-new particle detector. Instead of recreating the early universe, these microscope-like interactions will enable precision measurements that reveal how quarks and gluons are organized and interact within matter as we know it in today’s world.

“We’ll learn how quarks and gluons generate mass, how their interactions contribute to proton spin, and much more that will revolutionize our understanding of matter — much as the science we’ve explored at RHIC has,” said Deshpande, who also serves as director of science for the EIC. “This is the future of Brookhaven Lab and nuclear physics in the U.S.”

Daniel Marx, one of the accelerator physicists working on the design of the EIC’s new electron storage ring, said, “It’s going to be very challenging, but also exciting. We’ll be doing things that have never been done before.”

Perhaps Marx was echoing the sentiments of the physicists who originally built RHIC, demonstrating another big part of RHIC’s legacy: an ongoing willingness to tackle unprecedented scientific and technological challenges.

“We are confident that we have the people who will make the EIC happen because of the expertise we have developed by building and running RHIC,” Deshpande said.

RHIC and the future EIC are funded primarily by the DOE Office of Science.

Brookhaven National Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit science.energy.gov.

Follow @BrookhavenLab on social media. Find us on Instagram, LinkedIn, X, and Facebook.





The Big Questions: Mary Bishai on Mining for Neutrinos


BYLINE: Shannon Brescher Shea, social media manager and senior writer/editor in the Office of Science’s Office of Communications and Public Affairs

Newswise — Scientists recognized by the Department of Energy Office of Science Distinguished Scientists Fellows Award are pursuing answers to science’s biggest questions. Mary Bishai is a senior physicist at DOE’s Brookhaven National Laboratory.

If it weren’t for a magazine, I might have become a completely different type of scientist.

In 1985, my uncle – who was a prominent marine biology professor – was tutoring me in high school biology. As a science lover, he had copies of National Geographic lying around. Intrigued, I convinced my parents to get me a subscription. One article caught my eye – “Worlds Within the Atom.” It described how physicists used massive particle accelerators to study the tiniest things in existence. 

Even though I was born in and living in Egypt, I was enthralled by the research in Europe and the United States. I decided I would one day work at CERN in Switzerland or the Tevatron collider at the Department of Energy’s (DOE) Fermilab.

Although my engineer parents wanted me to follow in their footsteps, I entered the American University of Cairo as a physics major instead. An exchange program later brought me to the United States.

Then, nearly 13 years after I first read about the Tevatron at Fermilab, I was there. Fulfilling my dream, I delved into the interactions of quarks and gluons, fundamental particles of the Standard Model of particle physics.

But that’s not the end of the story. Along the way, another type of physics caught my eye – neutrino physics. Since then, I’ve pursued the question – how can neutrinos help us answer some of the biggest questions about how our universe evolved?

The little neutral one

Neutrinos are a type of fundamental particle. They’re in a group called the leptons, which also includes electrons. However, neutrinos are far lighter than their familiar cousins.

Neutrinos are incredibly abundant. On the tip of your tongue right now, there are 300 neutrinos left over from the Big Bang. The sun, supernovae, cosmic rays interacting with the atmosphere, and nuclear reactors also produce neutrinos. They’re the second most abundant particle in the universe, after photons (particles of light). Neutrinos are everywhere.

Despite them being so common, neutrinos interact very little with other matter. Every second, 100 billion neutrinos produced by the sun move through your thumbnail and never leave a mark. A neutrino would have to travel 1.6 light years through lead – or 100,000 times the distance from the Earth to the sun – to interact with a single atom. Or as writer John Updike declared in the poem “Cosmic Gall,” “The earth is just a silly ball / To them, through which they simply pass, / Like dustmaids through a drafty hall / Or photons through a sheet of glass.” This lack of interaction inspired the nickname of “ghost particles.” 

Scientists are interested in neutrinos because of their ubiquity and the fact that they could hold the answers to some of physics’ biggest questions. One of those questions is the issue of why there is something in our universe rather than nothing. 

But none of that drew me to neutrino research. Wave-particle duality – or the idea that all matter can act like waves or particles – is a key concept in quantum mechanics. Scientists in the 1960s postulated that if neutrinos have non-zero mass, one type of neutrino could convert to another and then back again. This would be a direct signature of quantum interference and wave-particle duality. In the late 1990s and early 2000s, experimental results confirmed the observation of neutrino “oscillations.” Hearing about one of the experiments, I said, “Oh my God, this is wave-particle duality. It’s quantum mechanics and it’s just there. That’s cool, that’s what I want to do.”

When I joined DOE’s Brookhaven National Laboratory in 2004 to study neutrinos, I joined a history of “ghostbuster” physicists.  

A history of ghostbusters

Our story starts in the 1930s. At that point, scientists were interested in how radioactive particles fall apart. Beta decay occurs when a nucleus emits an electron or its anti-matter partner, the positron. When a nucleus undergoes beta decay, it transforms into another type of nucleus. When scientists looked at this process, they expected it to release a specific amount of energy. But it didn’t. It seemed like this result contradicted the law of energy conservation, which says energy can neither be created nor destroyed.

Enter our first ghostbuster – Wolfgang Pauli. In a letter to fellow physicists attending a workshop, he proposed the idea of a yet-unknown particle that would carry away some of the energy. It would be neutral and have extremely small mass. While he valued his research enough to write the letter, it didn’t win out over a social obligation. In the same letter, he explained that he couldn’t have traveled to the workshop “since I am indispensable here in Zurich because of a ball.” Physicists do like to party. 

Now let’s jump ahead to the 1950s at DOE’s Los Alamos National Laboratory. Determined to track down these mysterious particles, Fred Reines and Clyde Cowan pursued the “poltergeist project.” While they first proposed detecting neutrinos from nuclear bomb testing, that idea was dismissed. Instead, they placed particle detectors near the Hanford and Savannah River nuclear reactors. The detectors sensed a telltale signal: two flashes of light produced when ghost-like neutrinos emitted by the reactors interacted with the material in the detectors. By counting these flashes, the scientists could count the neutrinos being captured by the detector. Developing the first neutrino detector netted Reines the Nobel Prize in 1995.

In addition to reactors, scientists realized that they could produce neutrinos in particle accelerators. From early on, Brookhaven was a leader in neutrino research. Physicists Leon Lederman, Melvin Schwartz, and Jack Steinberger used a proton beam from Brookhaven’s Alternating Gradient Synchrotron to slam protons into a target. A type of particle called a pi meson emerged, which then decayed into a neutrino and a muon (another cousin of the electron).

The scientists wanted to know if these were the same type of neutrinos as the ones from beta decay. The tracks the neutrinos left in their detector revealed mostly muon neutrinos and not electron neutrinos, which are the type produced in beta decay. It was another Nobel Prize-winning discovery. Later experiments at Fermilab confirmed a third type of neutrino called the tau neutrino – the neutral partner of the tau lepton, the heavier sibling of electrons and muons.

But both reactors and accelerators are made by humans. What about neutrinos from the sun? That was Ray Davis’s question. A chemist and physicist from Brookhaven, Davis began a long-running physics experiment in 1967. He wanted to test the models that predicted how many solar neutrinos Earth receives.

Davis installed a particle detector with 615 tons of cleaning fluid in the Homestake gold mine in South Dakota. The solar neutrinos interacted with the chlorine in the cleaning fluid to produce a unique isotope – argon-37. To track the interactions, he painstakingly counted the atoms of argon-37. He kept this up for almost 20 years! For demonstrating how to detect solar neutrinos, he also received a Nobel Prize.

As these experiments revealed different types of neutrinos – called “flavors” – they also brought up new questions. From studying beta decay, scientists knew that neutrinos are extraordinarily light. In fact, they assumed that neutrinos didn’t have mass at all, like photons. But observations suggested that assumption was wrong. 

In the late 1950s and 1960s, scientists suggested that the different flavors of neutrinos were different mixes of quantum states. For highly relativistic particles like neutrinos, mass, energy, and momentum are all closely related. So when neutrinos act like waves and not particles, you can use their speed to understand their mass. If the different flavors had different speeds, neutrinos would have to have mass. One sign of neutrinos having mass would be one flavor of neutrino turning into another.
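
The essay keeps the math out of it, but the standard two-flavor oscillation formula makes that logic explicit: the probability of one flavor turning into another depends on the difference of the squared masses, so any observed flavor change implies that neutrinos have mass. A textbook form of the relation (a sketch of the standard result, not anything specific to the experiments described here) is:

```latex
% Two-flavor neutrino oscillation probability (standard textbook form).
% \theta is the mixing angle, \Delta m^2 the difference of squared masses,
% L the distance traveled and E the neutrino energy.
P(\nu_\alpha \to \nu_\beta)
  = \sin^2(2\theta)\,
    \sin^2\!\left(\frac{1.27\,\Delta m^{2}\,[\mathrm{eV}^2]\; L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]}\right)
```

If the squared-mass difference were zero, the second factor would vanish and no flavor change could ever be observed, which is why the flavor-changing results described next were such a milestone.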

While theory supported that idea, no one had observed that behavior – at least not until 1998 at the Super-Kamiokande (Super-K) detector. This experiment studied neutrinos created by cosmic rays smacking into the atmosphere. It identified whether they were muon or electron neutrinos, as well as the direction they came from. The number of neutrinos that came from near the experiment matched well with estimates. In contrast, the ones from far away had a major deficit. The “disappearing” neutrinos were the first observation of neutrinos changing flavor, a phenomenon called oscillation.

Later experiments confirmed the idea of neutrino oscillation. They also gave evidence of at least three different masses. The results won the leaders of the Super-K and Sudbury Neutrino Observatory experiments yet another Nobel Prize.

From not knowing that neutrinos existed to realizing that they change flavors over time, a lot changed in neutrino science in 60 years. But there was so much we still didn’t know.

Becoming a ghostbuster

This is where I come back into the story. The results from the KamLAND experiment following the Super-K project were so intriguing that I wanted to study this bizarre particle. 

One of the earliest projects I worked on was the Daya Bay experiment. It was an extremely difficult project. The experiment measured neutrinos from one of the most powerful nuclear reactor complexes in the world. We had three detectors: one close to the reactor core, one a few hundred meters away, and a third about a kilometer away. Spreading out the detectors allowed us to study the differences between them. Taking data over the course of 10 years, we detected 5 million anti-neutrino interactions! They were the most precise measurements in the world of antineutrinos from reactors.

With these results, we knew there were three mass states and three flavors of neutrino. Each mass state is a different mix of flavors. The first mass state is dominated by the electron neutrino flavor. The second mass state has almost equal amounts of all three types. The third mass state is almost all muon and tau neutrino with a tiny amount of electron neutrino. While we knew the second mass state was heavier than the first one, we didn’t know if the first mass state was heavier or lighter than the third one.
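
In the standard formalism, which Bishai describes here only in words, flavor and mass states are related by a unitary mixing matrix (the PMNS matrix, U): each flavor state is a superposition of the three mass states, and the squared entries of U give the flavor content of each mass state she lists. As a compact illustration:

```latex
% Flavor eigenstates as superpositions of mass eigenstates via the PMNS matrix U.
% \alpha runs over flavors (e, \mu, \tau); i runs over mass states (1, 2, 3).
\lvert \nu_\alpha \rangle = \sum_{i=1}^{3} U_{\alpha i}^{*}\, \lvert \nu_i \rangle ,
\qquad
\sum_{\alpha} \lvert U_{\alpha i} \rvert^{2} = 1
```

The second relation simply says that the flavor fractions of each mass state add up to one, which is the bookkeeping behind statements like “the second mass state has almost equal amounts of all three types.”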

These flavors and mass states brought up a new question – could neutrinos explain why there is something rather than nothing? There is a fundamental principle called charge-parity symmetry. It states that if a particle is swapped with its anti-particle and left and right are swapped, the laws of physics will act in an identical way. However, if this law were universally true, there would have been equal amounts of matter and anti-matter at the beginning of the universe. Because matter and anti-matter completely destroy each other, and yet the universe is dominated by matter, we know there must be an exception. If neutrinos and anti-neutrinos demonstrate different mixing of neutrino flavors, this could be the exception. But to find out, we needed to better understand how neutrinos change flavor.

The ultimate neutrino experiment

Exploring this issue was why we designed the Deep Underground Neutrino Experiment (DUNE). 

In the early 2000s, a multidisciplinary, multi-institutional team proposed the ultimate neutrino experiment. We picked two facilities with a long history of neutrino research – the former Homestake Mine and Fermilab. Where Ray Davis once studied solar neutrinos is now home to the Sanford Underground Research Facility. Fermilab has a particle accelerator that produces the most powerful neutrino beam in the world. The locations are 1,300 kilometers apart, enough space for us to capture plenty of oscillations.

Besides the sheer distance, DUNE is extremely large and complex. From the beam line to the shielding, everything must be extremely precise. The detectors use 17 kilotons of liquid argon that must be kept at -300 degrees F. Each of the two cryostats that keep the liquid cold is the size of a Boeing 787 plane. To fit the equipment, we had to massively expand the underground space of the former mine.

In addition to detecting neutrino oscillation, DUNE should also provide us with new insights into other issues. It will look for new particles, several types of proton decay, and neutrinos produced by supernovas. 

Recognizing the importance of this experiment, more partners joined the effort. Currently, we have 1,400 scientists from 209 institutions. Our international partners at CERN and elsewhere have made essential contributions to building and testing parts of the detectors.  

I have been involved with DUNE since early in its conception and served as DUNE project scientist from 2012 to 2015, leading the conceptual design of the project. I was also honored to serve as DUNE co-spokesperson from 2023 to 2025. In August 2024, we celebrated our biggest milestone yet – the ribbon cutting of the cavern expansion. The next milestone will be installing the first of four detectors underground. 

Looking forward, I hope that DUNE provides the next generation of scientists and engineers with the same opportunities I had. Working in experimental particle physics at the DOE National Labs has given me the incredible opportunity to study the fundamental science of our universe. I am lucky to study the worlds within the atom that I first read about in a magazine 40 years ago.