LLNL and Meta Co-Develop Future of Materials with Groundbreaking Polymer Chemistry Dataset for Training AI Models | Newswise


Newswise — Polymers are fundamental to our daily lives, serving as the core components of a wide array of goods, including clothing, packaging, transportation infrastructure, construction materials and electronics. Advances in polymer science open pathways for recycling and upcycling waste materials into more valuable chemical feedstocks. Polymers can also have an outsized environmental impact: many widely used polymers are per- and polyfluoroalkyl substances (PFAS), commonly known as “forever chemicals.”

In a pioneering partnership to accelerate materials discovery with artificial intelligence (AI), researchers from Lawrence Livermore National Laboratory (LLNL) and Meta have created the world’s largest open dataset of atomistic polymer chemistry — a trove of millions of quantum-accurate simulations designed to help AI model the complex behavior of plastics, films, batteries and countless everyday materials.

In a recent paper, the team details Open Polymers 2026 (OPoly26), a dataset with an unprecedented number and diversity of polymer structures, each paired with simulations performed at quantum accuracy. OPoly26 is a massive reference library that enables AI to learn patterns from millions of pre-computed polymer structures in hours or days, addressing a longstanding gap in polymer data and laying the foundation for safer, faster and more sustainable materials design. The OPoly26 paper formalizes the dataset’s release and demonstrates how the data improves the performance of machine-learned interatomic potentials (MLIPs) on polymer materials.
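The core idea behind an MLIP can be sketched in a few lines: fit a cheap surrogate energy model to expensive quantum-accurate reference calculations, then evaluate the surrogate many times at a tiny fraction of the cost. The toy example below is only an illustration of that idea — the pair-potential form, the single fitted parameter and the synthetic "DFT" data are all invented here, and bear no relation to the actual OPoly26 models.

```python
# Toy illustration of the MLIP idea (not the OPoly26 models): fit a cheap
# surrogate energy function to quantum-accurate reference energies, then
# reuse the fitted surrogate for fast simulation.

def pair_energy(distances, epsilon):
    """Lennard-Jones-style pair energy with fixed sigma=1 and tunable well depth."""
    return sum(4 * epsilon * (r**-12 - r**-6) for r in distances)

# Synthetic reference data: (pair distances, reference energy).
# The "true" well depth used to generate these references is 0.5.
reference = [
    ([1.0, 1.5], pair_energy([1.0, 1.5], 0.5)),
    ([1.1, 2.0], pair_energy([1.1, 2.0], 0.5)),
    ([0.9, 1.2, 1.8], pair_energy([0.9, 1.2, 1.8], 0.5)),
]

def loss(epsilon):
    """Sum of squared errors against the reference energies."""
    return sum((pair_energy(d, epsilon) - e) ** 2 for d, e in reference)

# "Training" = choosing the parameter that best reproduces the references,
# here by a simple grid search over candidate well depths.
best = min((loss(eps / 100), eps / 100) for eps in range(1, 101))[1]
print(best)  # recovers 0.5, the well depth used to generate the data
```

Real MLIPs replace the one-parameter pair potential with flexible models (often neural networks) fit to both energies and forces, but the train-once, evaluate-cheaply workflow is the same.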

The work builds on the Meta- and Lawrence Berkeley National Laboratory (LBNL)-led Open Molecules 2025 (OMol25) dataset, which is making waves with its sweeping collection of open molecular data aimed at advancing AI-driven chemistry. The OPoly26 dataset contains more than 6 million density functional theory (DFT) calculations on polymeric chemical systems, making it nearly ten times larger than the next largest comparable polymer dataset.

LLNL’s partnership with Meta — described by LLNL materials scientist and OPoly26 co-principal investigator (PI) Evan Antoniuk as a “natural fit” — seeks to address the shortfall of open polymer data. By generating critical missing data on polymers with the shared goals of expanding and democratizing open datasets for materials scientists, the team hopes to accelerate the pace of discovery across polymer chemistry.

“This fills a huge gap,” said Antoniuk. “We see this as a community resource, one that we hope becomes the go-to starting point for anyone interested in performing atomistic simulations of polymers.”

LLNL contributed significant computational power and polymer domain knowledge — generating a diverse set of polymer structures and running simulations to help model how these polymers behave in real-world conditions. In turn, Meta contributed vast computational resources to perform 1.2 billion core hours of DFT simulations and train state-of-the-art MLIP models, leveraging the expertise that had already been refined during their earlier molecular effort.

“Meta’s partnership with LLNL demonstrates how open science and AI can accelerate breakthroughs in materials research,” said Rob Sherman, vice president of policy at Meta. “By making this dataset publicly available, we’re giving scientists potent new tools to address critical challenges in healthcare and beyond.”

LLNL is uniquely positioned to generate the OPoly26 dataset at the scale and fidelity required. Researchers tapped into LLNL’s Tuolumne, the world’s 12th fastest supercomputer and companion to the exascale El Capitan, combining this hardware with their collective expertise to compress years of simulation work into months and enable the dataset to reach a scale unmatched in polymer science.

“Comprehensive coverage of this chemical space is essential to the success of the OPoly26 dataset,” said LLNL staff scientist Nick Liesen. “We have worked to leverage pipelines that take us from a simple text string to fully atomistic representations of polymer dynamics at scale.”

Beyond performing all the DFT calculations, researchers at Meta trained and benchmarked machine-learned interatomic potentials at scale, enabling the team to evaluate how well AI models generalize across small-molecule and polymer chemistry. The paper reports substantial improvements in model accuracy when polymer data is incorporated alongside small-molecule training sets, highlighting the importance of training AI on data that reflects real-world complexity.

Understanding why certain polymers, including PFAS-based materials, resist chemical change requires models that can accurately describe both reactive and nonreactive behavior. Capturing this behavior under realistic conditions required careful attention to reactive configurations, according to LBNL chemist and OPoly26 co-PI Sam Blau, who also previously co-led OMol25.

“Reactivity — the breakage and formation of chemical bonds — is central to polymer synthesis, manufacturing, aging and recycling, and to nanoscale patterning of polymer thin films for semiconductor manufacturing,” said Blau. “By going beyond stable structures and explicitly sampling hundreds of thousands of reactive configurations, we aim to accurately describe the reactive events that often govern polymer behavior under real-world conditions.”

Beyond outlining how the dataset was generated and performing standard tests of MLIP performance, the OPoly26 paper also introduces an initial suite of polymer-specific evaluation tasks to benchmark how effectively these models capture simulated polymer phenomena and interactions, such as polymer solvation. Future work will include evaluating the MLIP models against experimental measurements, offering a gauge of how well they can capture real-world polymer properties.

“LLNL’s significant investment in high-performance scientific computing and computational materials science capabilities has been critical to achieving the scale needed to cover many thousands of distinct chemical structures,” said LLNL Materials Science Division Leader Ibo Matthews. “That scale is essential not only for generating the data, but for rigorously evaluating how well AI models perform across the full range of polymer behaviors relevant to real-world applications.”

With a focus on open collaboration, the team is making all data publicly available to fuel polymer advancements across academia, industry and government. The authors also emphasized that OPoly26 is being released under an open license to maximize reuse and reproducibility. Through this open approach, the partnership ensures that the benefits of this public-private investment flow broadly across the entire research community.

The team includes LLNL scientists Brian Van Essen, James Diffenderfer, Helgi Ingolfsson and Supun Mohottalalage, and polymer simulation experts Amitesh Maiti and Matt Kroonblawd from the Lab’s Materials Science Division. Co-authors also included LBNL’s Nitesh Kumar and Lauren Chua. Blau and Kumar’s work was funded by the Center for High Precision Patterning Science (CHiPPS), while Chua was supported by her DOE Computational Science Graduate Fellowship. LLNL’s Laboratory Directed Research and Development program funded the LLNL researchers.

This partnership was made possible through a data transfer agreement, facilitated by LLNL’s Innovation and Partnerships Office (IPO). IPO is the Laboratory’s focal point for industry engagement and facilitates partnerships to deliver mission-driven solutions that support national security and grow the U.S. economy. To connect with LLNL on industrial partnerships in Advanced Computing, AI and Quantum technologies, contact IPO Business Development Executive Clarence Cannon.




Fentanyl or Phony? Machine Learning Algorithm Learns to Pick Out Opioid Signatures | Newswise


Newswise — New forms of fentanyl are created every day. For law enforcement, that poses a challenge: how do you identify a chemical you’ve never seen before?

Researchers at Lawrence Livermore National Laboratory (LLNL) aim to answer that question with a machine learning model that can distinguish opioids from other chemicals with over 95% accuracy in a laboratory setting. The foundation for this new technique was published in Analytical Methods.

Today, chemists identify synthetic opioids like fentanyl by trying to match their signature to a library of a few hundred known samples. But studies suggest there could be thousands of unknown forms, some more dangerous than others. Recognizing those new versions requires a reference-free identification system: a way to catch an opioid even if it does not yet exist in any chemical database.
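The library-matching approach, and why it fails on never-before-seen compounds, can be sketched with a toy similarity search. The spectra, mass bins, compound entries and match threshold below are all invented for illustration; real spectral libraries and scoring schemes are far richer.

```python
# Toy sketch of spectral library matching (invented data). A spectrum is
# represented as intensities in a few mass bins; a query is matched to the
# library entry with the highest cosine similarity.
import math

library = {
    "fentanyl": [0.9, 0.1, 0.4, 0.0],
    "morphine": [0.2, 0.8, 0.1, 0.3],
}

def cosine(a, b):
    """Cosine similarity between two intensity vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def best_match(spectrum, threshold=0.95):
    """Return the best library hit, or report that no entry matches well."""
    name, score = max(((n, cosine(spectrum, s)) for n, s in library.items()),
                      key=lambda t: t[1])
    return name if score >= threshold else "no confident match"

# A known compound matches its own library entry perfectly.
print(best_match(library["fentanyl"]))  # "fentanyl"

# A novel derivative: similar, but shifted peaks push every library score
# below the threshold -- the database comes up empty.
novel = [0.5, 0.1, 0.9, 0.2]
print(best_match(novel))  # "no confident match"
```

A reference-free classifier sidesteps this failure mode by learning what opioid spectra have in common, rather than requiring an exact entry in the library.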

“When law enforcement finds a new clandestine drug operation, those labs often produce never-before-seen fentanyl derivatives. We can’t just go check a database, and we can’t just go back to who made it and ask how they did it,” said LLNL computational mathematician and author Colin Ponce. “And law enforcement needs to identify the samples they find quickly because there’s going to be another sample tomorrow. I think that’s a little bit of a unique situation.”

Machine learning might seem like a natural fit to identify novel or unknown opioids. And it is — to an extent. The method works best with large data sets, which are difficult to generate for toxic substances like synthetic opioids. 

To even get a machine learning algorithm off the ground, the team had to create the chemical data. They did so with LLNL’s mass spectrometry capabilities coupled to an autosampler, which enabled them to measure hundreds of samples under the same experimental conditions. This minimized variables for the machine learning algorithms. 

“In the world of AI, data is gold, and if you don’t have good data, then you’re not going to generate accurate machine learning models,” said LLNL chemist and author Carolyn Fisher. “Good data is something that we can control and generate at LLNL.” 

With that data in hand, they tried different machine learning techniques as they homed in on the best method: a random forest model. 

“When a model like this eventually gets into the hands of a user, the output has to be interpretable and trustworthy,” said LLNL scientist and author Kourosh Arasteh. “We explored machine learning methods ranging from simple regression and random forests to more complex neural network approaches to balance interpretability with performance.” 

The random forest approach runs through a collection of decision trees. Each tree asks a series of questions about the data and, based on each answer, lands on a prediction: opioid or not. Together, they vote on the final classification.
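A minimal sketch of that voting logic follows. The features, thresholds and trees here are invented for illustration — the published model learns its trees from the measured spectra rather than hand-coding them.

```python
# Toy random-forest-style vote (invented features and thresholds, not the
# published model). Each "tree" asks a series of yes/no questions about a
# sample's features and casts one vote: opioid or not.

def tree_a(s):
    # e.g. "is a characteristic fragment present, and is a second one intense?"
    if s["fragment_mz_105"] > 0.2:
        return "opioid" if s["fragment_mz_188"] > 0.1 else "not opioid"
    return "not opioid"

def tree_b(s):
    return "opioid" if s["fragment_mz_188"] > 0.05 else "not opioid"

def tree_c(s):
    if s["molecular_weight"] > 300 and s["fragment_mz_105"] > 0.1:
        return "opioid"
    return "not opioid"

def forest_predict(sample, trees=(tree_a, tree_b, tree_c)):
    """Majority vote across the decision trees."""
    votes = [t(sample) for t in trees]
    return max(set(votes), key=votes.count)

sample = {"fragment_mz_105": 0.3, "fragment_mz_188": 0.15, "molecular_weight": 336}
print(forest_predict(sample))  # all three trees vote "opioid"
```

In a trained random forest, each tree is grown on a random subset of the data and features, so no single noisy measurement can dominate the final classification.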

“Our 650 samples are not the same as having 300,000 samples. On the machine learning side, we needed to make sure that we were designing techniques that were appropriate for that kind of scale,” said Ponce.

This study trained and tested the algorithm with analytically pure samples. These ideal chemicals contain no contaminants or impurities.

“The challenge is that nothing is analytically pure in the real world,” said Fisher. “The next step is to add in background noise and have the AI understand what it should care about during a classification task.”

Fisher and Ponce emphasized that this work would have been impossible without collaboration across the disciplines of data science and chemistry. The two are friends outside of work, and this study, a Laboratory Directed Research and Development project, emerged from a series of organic conversations between them.

“To me, this project really captures what LLNL does best,” said fellow author and LLNL software engineer Steven Magana-Zook. “When you get chemists and data scientists working side by side, you end up with results that neither group could get on their own. That kind of cross-disciplinary work is exactly what makes this place so strong.”

That approach, while essential to the work, initially proved to be an obstacle. The team faced rejection of this manuscript from two journals — reviewers in chemistry didn’t fully grasp the machine learning aspects and experts on the computational side felt uncertain about the chemistry.

“I don’t think people talk about failure enough. It’s so common in science. We fail so much more than we succeed,” said Fisher. “But we keep iterating and improving. I’m proud of our resilience.” 

The team’s persistence paid off. Looking ahead, they aim to further develop their algorithm using real-world samples with higher background signals. 

Other LLNL coauthors include Roald Leif, Alex Vu, Mark Dreyer, Brian Mayer and Audrey Williams.




Light-Based 3D Printing Method Lets Scientists Program Plastic Properties at the Microscale | Newswise


Newswise — Researchers at Lawrence Livermore National Laboratory (LLNL) have co-developed a new way to precisely control the internal structure of common plastics during 3D printing, allowing a single printed object to seamlessly shift from rigid to flexible using only light.

In a paper published today in Science, the researchers describe a technique called crystallinity regulation in additive fabrication of thermoplastics (CRAFT) that enables microscopic control over how plastic molecules arrange themselves as an object is printed. The work opens new possibilities for advanced manufacturing, soft robotics, national defense, energy damping and information storage, according to the researchers. The team includes collaborators from Sandia National Laboratories (SNL), the University of Texas at Austin, Oregon State University, Arizona State University and Savannah River National Laboratory.

The team demonstrated that by carefully tuning light intensity during printing, they could dictate how crystalline or amorphous a thermoplastic becomes at specific locations within a part. That molecular arrangement determines whether a material behaves as a stiffer, more rigid plastic or as a softer, more flexible one — without changing the base material. CRAFT builds on that principle by allowing researchers to control crystallinity spatially during printing, rather than uniformly throughout a part.

“A classic example of crystallinity is the difference between high-density polyethylene — picture a milk jug — and low-density polyethylene, like squeeze bottles and plastic bags. The bulk property difference in these two forms of polyethylene stems largely from differences in crystallinity,” said LLNL staff scientist Johanna Schwartz. “Our CRAFT effort is exciting in that we are controlling the crystallinity within a thermoplastic spatially with variations in light intensity, making areas of increased and decreased crystallinity to produce parts with control over material properties throughout the whole geometry.”

A key challenge, however, was translating this new materials capability into practical manufacturing instructions that could be used on real 3D printers, according to LLNL engineer Hernán Villanueva. Villanueva joined the project after early discussions with Schwartz and former SNL scientists Samuel Leguizamon and Alex Commisso identified a missing link: a way to convert any three-dimensional computer-aided design (CAD) into the detailed light patterns needed to print parts using the CRAFT method.

Villanueva said he drew on prior work in a multi-institutional team focused on lattice structures and advanced manufacturing workflows. In that effort, he developed software that rapidly converted complex, topology-optimized designs into printing instructions by parallelizing the process on LLNL’s high-performance computing (HPC) systems — reducing turnaround times from days to hours or minutes.

Applying that same computational approach to CRAFT, Villanueva adapted the workflow to encode “changes in light” rather than changes in material. He was soon able to convert 3D CAD geometries directly into CRAFT printing instructions, cutting instruction-generation time from hours — or even a full day — down to seconds, making rapid design iteration and demonstration of the method practical.

“This work is a natural extension of the Lab’s strengths in advanced manufacturing and materials by design,” Villanueva said. “As part of the CRAFT effort, we have evolved a tool that connects materials science with computational workflows and advanced printing, enabling us to move directly from a 3D design to a part with spatially varying properties.”

The team’s method relies on a light-activated polymerization process in which exposure level governs the stereochemistry of growing polymer chains, researchers said. Lower light intensities favor more ordered crystalline regions, while higher intensities suppress crystallization, yielding softer, more transparent material. By projecting grayscale patterns during printing, the team produced parts with smoothly varying mechanical and optical properties.
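The intensity-to-property relationship described above can be sketched with a toy monotone mapping. The functional forms and numbers below are invented for illustration, not the published calibration: the only property carried over from the source is the direction of the effect, with lower intensity favoring crystalline, stiffer material.

```python
# Illustrative sketch (invented mapping, not the published calibration):
# darker grayscale pixels mean lower light intensity, which favors more
# crystalline -- and stiffer -- material at that location.

def crystallinity(intensity, i_max=1.0):
    """Toy monotone map: full intensity suppresses crystallization entirely."""
    return 1.0 - intensity / i_max

def modulus(chi, soft=0.1, stiff=2.0):
    """Interpolate a toy elastic modulus (GPa) between amorphous and crystalline."""
    return soft + chi * (stiff - soft)

# A grayscale projection pattern for one layer: 0.0 = dark, 1.0 = full exposure.
layer = [0.0, 0.25, 0.5, 0.75, 1.0]
for px in layer:
    chi = crystallinity(px)
    print(f"intensity {px:.2f} -> crystallinity {chi:.2f}, "
          f"modulus {modulus(chi):.2f} GPa")
```

Projecting such a grayscale image at every layer is what lets a single vat of resin yield a part whose stiffness varies continuously from point to point.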

The demonstrated ability to tune properties by changing a light’s intensity rather than swapping materials could significantly simplify additive manufacturing (3D printing), Schwartz explained.

“If you can get many different properties from one vat of material, printing complex multi-material or multi-modulus structures becomes much easier,” she said.

The researchers demonstrated the CRAFT technique on commercial 3D printers, fabricating objects that combine multiple mechanical behaviors in a single print. Examples included bio-inspired structures that mimic bones, tendons and soft tissue; reproductions of famous paintings; and materials designed to absorb or redirect vibrational energy without adding weight or complexity. Among the most striking demonstrations was the ability to encode crystallinity through transparency differences, according to Schwartz.

“Being able to visualize the differences easily spatially, to the point of generating the Mona Lisa out of only one material, was incredibly cool,” Schwartz said.

LLNL’s Villanueva said the work reflects the Lab’s long-standing investments in HPC and in integrating modeling, design tools and novel manufacturing processes. He added that future work could integrate topology optimization directly into the CRAFT framework, enabling researchers to optimize light patterns themselves — rather than material layouts — to achieve desired performance.

Because the process works with thermoplastics — materials that can be melted and reshaped — printed parts remain recyclable and reprocessable, an important advantage for manufacturing sustainability. The findings suggest a future where 3D-printed plastic components can be tailored at the molecular level for specific functions, bridging the gap between material science and digital manufacturing.

From an applications standpoint, Schwartz said the technology could have broad and near-term impact.

“Energy dampening and metamaterial design are the most exciting use cases to me,” she said. “From space to fusion to electronics, there are so many industries that rely on energy and vibrational dampening control. This CRAFT printing process can access all of them.”