“Plastic-Eating” Fusion Enzyme Improves Polyester Textile Recycling | Newswise


Newswise — In a study published in the journal Bioresource Technology, scientists from the universities of Portsmouth and Manchester report that a specially engineered enzyme can significantly speed up the breakdown of PET – the plastic used in water bottles, food packaging and polyester clothing – when it is processed at high concentrations similar to those used in industry.

PET, short for poly(ethylene terephthalate), is cheap, durable, and widely used. But those same qualities mean it builds up in vast quantities once thrown away.

Polyester textiles are notoriously difficult to recycle. Their fibres are tightly packed into a highly ordered structure created during manufacturing, which makes them resistant to biological breakdown.


Enzymes are natural proteins that can speed up chemical reactions. The team combined two different components into one fusion enzyme. The first was a heat-tolerant cutinase, a natural enzyme that normally breaks down cutin, a protective polyester found on plant surfaces. The second was a binding module designed to help the enzyme attach more tightly to plastic.

The two components were carefully matched, so they work best at the same temperature and are suited to the same kinds of plastic structure. The aim was to make the enzyme stick to PET and ensure it could continue breaking it down efficiently under realistic recycling conditions. 

While the modified enzyme did attach more strongly to highly crystalline PET – the tough, tightly packed form found in many plastics – stronger binding did not automatically lead to faster breakdown. In fact, when the plastic structure remained highly ordered, the gain was limited.

The real progress came when the plastic was less crystalline and as a result more accessible to the enzyme. Under controlled conditions that mimic industrial recycling – including carefully managed pH and plastic concentrations of 20 per cent by weight – the fused enzyme broke down less-ordered PET much more quickly. 

The biggest improvement was seen in a pre-consumer polyester textile that had been specially treated to make it less crystalline and finely ground. In that case, the amount of useful breakdown products doubled. 

“By matching the enzyme with the right binding module and preparing the plastic in the right way, we can overcome a major bottleneck in plastic recycling,” said Professor Andrew Pickford, Director of the University of Portsmouth’s Centre for Enzyme Innovation (CEI). “This isn’t just about helping the enzyme stick to the surface – it’s about making sure the chemical reaction can run efficiently at the high plastic concentrations used in industry.” 

The findings also help explain why earlier studies of similar enzyme combinations have produced mixed results. If an enzyme binds too tightly to the surface, it can slow the reaction – a well-established concept in chemistry known as the Sabatier principle. 
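
To make the Sabatier trade-off concrete, here is a toy numerical sketch (our illustration, not the study's kinetics): if binding strength both raises how often the enzyme occupies the plastic surface and hinders its release and relocation to fresh chain ends, the overall rate peaks at intermediate affinity.

    # Toy illustration of the Sabatier principle (not the study's model).
    # Binding strength K raises surface occupancy but lowers the enzyme's
    # ability to let go and move to fresh chain ends; the product of the
    # two effects peaks at intermediate K.

    def relative_rate(K: float) -> float:
        occupancy = K / (1.0 + K)    # fraction of time bound to the surface
        mobility = 1.0 / (1.0 + K)   # ease of releasing and relocating
        return occupancy * mobility  # overall turnover; maximal at K = 1

    for K in (0.01, 0.1, 1.0, 10.0, 100.0):
        print(f"binding constant K = {K:>6}: relative rate = {relative_rate(K):.3f}")
    # Both very weak and very strong binders are slow -- a 'volcano' curve.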

The study suggests that enzyme-based recycling of PET – a promising but technically challenging solution – could become more practical at scale, but success depends on getting three factors right: the enzyme, any helper module that guides it to the plastic, and the structure of the material itself.




How Antibiotic-Degrading Bacteria Shield Microbial Communities From Collapse | Newswise


Newswise — By comparing natural microbial adaptation with targeted bioaugmentation using an antibiotic-degrading strain, a new study reveals how biodegradation capacity fundamentally reshapes microbial succession, stability, and resilience under sustained antibiotic exposure.

Environmental risk assessments often judge antibiotics solely by concentration and intrinsic toxicity, assuming uniform microbial responses. However, microbial communities actively shape contaminant fate, particularly when they include antibiotic-degrading organisms. Sulfamethoxazole (SMX), a common sulfonamide found in wastewater and surface waters, illustrates this complexity. Even at low levels, SMX can suppress sensitive taxa, disrupt community structure, and impair essential functions such as nutrient removal. Yet some bacteria possess specialized genes that enzymatically inactivate SMX, reducing antibiotic pressure for the broader community. How such biodegradation capacity governs microbial succession and community stability remains insufficiently understood.

A study (DOI: 10.48130/biocontam-0025-0016), published in Biocontaminant on 12 December 2025 by Bin Liang’s team at the Harbin Institute of Technology, demonstrates that antibiotic-degrading bacteria act as keystone protectors that mitigate antibiotic stress, stabilize microbial community succession, and enhance ecosystem resilience, highlighting biodegradation capacity as a critical determinant of environmental risk.

Using a controlled sequencing batch reactor framework, the study first isolated and characterized an SMX-degrading bacterium from activated sludge by continuous subculture with SMX as the sole carbon source, then tested how degrader-enabled biodegradation reshapes community succession by inoculating the strain under defined antibiotic stress and tracking community dynamics with SMX degradation assays, ex situ degradation tests, and 16S rRNA sequencing across multiple reactor phases.

The isolated strain, Paenarthrobacter sp. M5 (100% 16S rRNA similarity to P. ureafaciens), fully degraded 30 mg/L SMX within 10 h, producing equimolar 3-amino-5-methylisoxazole and carrying the key gene sadA. Mechanistically, a SadA/SadC two-component system drove ipso-hydroxylation and cleavage of the C–S–N bond, yielding non-antibacterial intermediates (including p-aminophenol, which could be further metabolized for growth).

Four reactor treatments were established — NN (no SMX), SN (natural adaptation with SMX), NM (M5 inoculated without SMX), and SM (pre-adaptation: M5 inoculated with SMX). SN communities acquired biodegradation gradually (over ~28 cycles at 2 mg/L SMX), whereas SM communities showed immediate, efficient degradation after inoculation; with increasing SMX loads, both SMX-exposed groups ultimately achieved complete removal, indicating inducible biodegradation under sustained selection.

When SMX exposure was paused and then reintroduced at high levels, functional recovery ranked SN > SM > NM, while NN showed ~70% degradation with high replicate variability, underscoring how evolutionary history governs resilience. Ex situ assays reinforced these trends: SN improved to 36.3%, 62.3%, and 100% removal at 2, 5, and 10 mg/L SMX; SM remained consistently complete across phases; NN stayed low (12.2%–16.6%); and NM declined (30.5% → 13.4%), highlighting antibiotics as the key driver sustaining degrader colonization.

16S/OTU analyses showed a shared core microbiome across all groups, but shared OTUs dropped sharply during restructuring (from 1,035 to ~440) before stabilizing (~533–578). α-diversity patterns revealed that slower biodegradation in SN retarded succession and preserved higher diversity during T2–T4, whereas efficient degradation in SM buffered antibiotic stress and restored “regular” successional dynamics. Multivariate statistics (ADONIS/MRPP) confirmed dose-dependent SMX-driven divergence in SN versus NN, but minimal structural differences between SM and NN through most phases, indicating that bioaugmentation-mediated biodegradation can protect community structure from antibiotic perturbation.
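
As a rough numerical illustration of the reported degradation speed (assuming simple first-order kinetics, which the paper does not claim), strain M5's removal of 30 mg/L SMX within 10 h corresponds to a decay constant of roughly 0.46 per hour:

    import math

    # Back-of-envelope sketch assuming first-order kinetics (our assumption,
    # not the paper's model), calibrated so that 30 mg/L SMX is ~99% gone at
    # the reported 10 h ("fully degraded within 10 h").
    C0 = 30.0                   # initial SMX concentration, mg/L (reported)
    k = math.log(100) / 10.0    # ~0.46 per hour, so C(10 h) = 1% of C0

    def smx_remaining(t_hours: float) -> float:
        return C0 * math.exp(-k * t_hours)   # first-order decay C(t)

    for t in (0, 2, 5, 10):
        C = smx_remaining(t)
        print(f"t = {t:>2} h: {C:5.2f} mg/L left ({100 * (1 - C / C0):5.1f}% removed)")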

These findings have direct relevance for wastewater treatment and environmental management. Antibiotic-degrading bacteria can stabilize treatment performance by protecting key microbial functions from antibiotic disruption. Targeted bioaugmentation or monitoring of native degrader populations could reduce the risk of treatment failure and limit conditions that favor the spread of antibiotic resistance.

###

References

DOI

10.48130/biocontam-0025-0016

Original Source URL

https://doi.org/10.48130/biocontam-0025-0016

Funding Information

The study was funded by the National Natural Science Foundation of China (Grant No. 52322007), the Guangdong Basic and Applied Basic Research Foundation (Grant No. 2023B1515020077), and the Shenzhen Science and Technology Program (Grant No. JCYJ20240813105125034).

About Biocontaminant

Biocontaminant is a multidisciplinary platform dedicated to advancing fundamental and applied research on biological contaminants across diverse environments and systems. The journal serves as an innovative, efficient, and professional forum for global researchers to disseminate findings in this rapidly evolving field.




Team Simulates a Living Cell That Grows and Divides | Newswise


Newswise — By simulating the life cycle of a minimal bacterial cell — from DNA replication to protein translation to metabolism and cell division — scientists have opened a new computational window into the essential processes of life.

The researchers, led by chemistry professor Zan Luthey-Schulten, present their findings in the journal Cell.

The team simulated a living cell at nanoscale resolution and recapitulated how every molecule within that cell behaved over the course of a full cell cycle. The work took many years, vast computer resources, large experimental datasets, a suite of experimental and computational techniques and an understanding of the roles, behaviors and physical interactions of thousands of molecular players. The researchers had to account for every gene, protein, RNA molecule and chemical reaction occurring within the cell to recreate the timing of cellular events. For example, their model had to accurately reflect the processes that allow the cell to double in size prior to cell division.
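
The release does not spell out the model's mathematics, but the flavor of such kinetic modeling can be shown with a minimal Gillespie-style stochastic simulation of a hypothetical two-reaction network — a toy sketch, not the Syn3A model itself, which couples thousands of reactions in three dimensions:

    import math
    import random

    # Toy Gillespie-style simulation of a hypothetical gene-expression network
    # (an illustration of kinetic modeling; the real Syn3A model tracks
    # thousands of reactions in 3D). Two reactions:
    #   (1) transcription:  DNA -> DNA + mRNA   (propensity k_tx)
    #   (2) mRNA decay:     mRNA -> nothing     (propensity k_deg * mRNA)
    random.seed(0)
    k_tx, k_deg = 0.5, 0.05          # hypothetical rate constants, per minute
    mrna, t, t_end = 0, 0.0, 105.0   # one 105-minute cell cycle

    while t < t_end:
        a1 = k_tx                    # propensity of transcription
        a2 = k_deg * mrna            # propensity of decay
        a0 = a1 + a2
        t += -math.log(1.0 - random.random()) / a0   # time to next event
        if random.random() * a0 < a1:
            mrna += 1                # a transcription event fired
        else:
            mrna -= 1                # one mRNA molecule decayed

    print(f"mRNA copies at the end of a {t_end:.0f}-minute cycle: {mrna}")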


To make the task more manageable, the team used a living “minimal cell” developed at the J. Craig Venter Institute in California. The version of the cell used in the new study, JCVI-syn3A — “Syn3A” for short — is a modified bacterium with a pared-down genome that carries only the genes needed to replicate its DNA, grow, divide and perform most of the other functions that make life possible.

“This is a three-dimensional, fully dynamic kinetic model of a living minimal cell that mimics what goes on in the actual cell,” Luthey-Schulten said. “Such a comprehensive undertaking was only possible through the combined efforts of a host of collaborators at the U of I as well as Harvard Medical School, where we systematically modeled the essential metabolism and other subcellular networks through a series of publications starting in 2018.”

The Syn3A cell has fewer than 500 genes, all of which reside on a single circular strand of DNA. The laboratories of study co-authors Angad Mehta, a professor of chemistry, and Taekjip Ha, of Boston Children’s Hospital and Harvard Medical School, generated additional experimental data that allowed the team to accurately simulate and validate numerous aspects of cell function.


“Most importantly, their work revealed the extent of DNA replication and that Syn3A’s cell division is symmetrical,” Luthey-Schulten said.

Both factors guided and validated the simulations performed by Zane Thornburg, a postdoctoral fellow at the Beckman Institute for Advanced Science and Technology and the Cancer Center at Illinois, and Andrew Maytin, a graduate student in Luthey-Schulten’s lab.

Like other bacterial cells, Syn3A has no nucleus. Every molecule that comprises and sustains it is either a component of its outer membrane, transported into the cell from outside, or assembled in the cytoplasm. The cell is so jam-packed with molecular players that, when creating high-resolution cartoons and animations of their computer simulations, the researchers had to render some of the components invisible. Making all the cellular proteins invisible, for example, allowed the scientists to see how Syn3A’s chromosome threads through the cell’s crowded interior.

Some processes were more computationally expensive than others, the team discovered. For example, Maytin realized that chromosome replication was slowing the whole simulation to a crawl, nearly doubling the time it took to capture the whole cell cycle. He determined that efficiently simulating the cell’s DNA replication process required its own dedicated graphics processing unit, while another GPU handled all other cellular dynamics. This allowed the team to simulate the full, 105-minute cell cycle in just six days of computer time.

Thornburg and Maytin struggled with the challenge of simulating cellular events occurring at the same time in various parts of the cell.

“I can’t overstate how hard it is to simulate things that are moving — and doing it in 3D for an entire cell was … triumphant,” Thornburg said. “One of the last big hurdles that Andrew and I had to solve was understanding how the membrane and the DNA talk to one another when both are moving.”

While the simulated cell cycle has its limitations — this was not an atom-by-atom simulation but instead averaged the dynamics of individual molecules — it yielded a surprisingly accurate accounting of the timing of cellular processes. In repeated simulations involving individual cells with slightly varying start conditions, the simulated cell cycle occurred, on average, within two minutes of the real-world cell cycle, Thornburg said. The work was repeatedly guided and tested against actual experimental outcomes, a process that allowed the scientists to refine their simulations.

The ability to accurately capture the ever-changing conditions within a living cell opens a new window on the foundations of living systems, Luthey-Schulten said.

“We have a whole-cell model that predicts many cellular properties simultaneously,” she said. “If you want to know what’s going on, say, in nucleotide metabolism, you can also look at what’s going on in DNA replication and the biogenesis of ribosomes. So the simulations can give you the results of hundreds of experiments simultaneously.”

Study co-authors also include Illinois chemistry alumnus Benjamin Gilbert and John Glass, who leads the J. Craig Venter Institute Synthetic Biology Group.

This work was conducted in the National Science Foundation’s Science and Technology Center for Quantitative Cell Biology at the U. of I. Luthey-Schulten also is a professor of physics and a professor in the Beckman Institute at the U. of I. The research was conducted using the Delta advanced computing and data resource, which is supported by the NSF and the state of Illinois. Delta is a joint effort of the U. of I. and its National Center for Supercomputing Applications.




Most mass spectrometers can process just a few molecules at once. A reengineered prototype does a billion simultaneously | Newswise


Newswise — Mass spectrometry is already a powerful tool for determining what kind and how many molecules are present in a given sample. But most instruments still analyze their molecules one or just a few at a time, an approach that is inefficient and costly, and in which rare but significant molecules can easily fall through the cracks.

A more powerful version of the technology could one day allow scientists to read the full molecular contents of a single cell, track thousands of chemical reactions at once, and ultimately accelerate efforts like drug development.

Now, a new study describes the first big step in that direction by producing a prototype, dubbed MultiQ-IT, that’s capable of handling vast numbers of molecules at once. The findings offer a blueprint for faster, more sensitive instruments that could position mass spectrometry for the kind of transformation that reshaped genomics and computing.

“What revolutionized DNA sequencing wasn’t any change in the underlying chemistry. That’s remained fundamentally the same,” says Brian T. Chait, head of the Laboratory of Mass Spectrometry and Gaseous Ion Chemistry at Rockefeller. “It was the ability to run so many chemical reactions in parallel, which took genome sequencing from a billion-dollar effort to something that costs around $100. The same thing happened in computing with GPUs. And that’s what we’re trying to do with mass spectrometry.”

A massive bottleneck

Mass spectrometry was invented around 1913 and has since become one of biology’s most powerful analytical tools. The technology allows scientists to identify and quantify molecules by ionizing them, or giving them an electric charge, and measuring their mass-to-charge ratio. But despite its sophistication, most mass spectrometers still do this sequentially, one or just a few ion species at a time, often lacking the exquisite sensitivity needed to identify rare molecules in complex biological samples.
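
The quantity a mass spectrometer reports follows a textbook relation (not specific to this study): a molecule of neutral mass M that picks up z protons appears at m/z = (M + z·mH)/z. A short sketch with a hypothetical 1,500-dalton peptide:

    # Textbook m/z relation for positive ions (not specific to MultiQ-IT):
    # a molecule of neutral mass M daltons carrying z extra protons appears
    # at m/z = (M + z * m_H) / z, with m_H ~ 1.00728 Da per proton.
    PROTON_MASS = 1.00728  # daltons

    def mz(neutral_mass_da: float, charge: int) -> float:
        return (neutral_mass_da + charge * PROTON_MASS) / charge

    # A hypothetical 1,500 Da peptide appears at different m/z per charge state:
    for z in (1, 2, 3):
        print(f"z = {z}: m/z = {mz(1500.0, z):.2f}")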

“It’s a wonderful technique—you can do unimaginably wonderful, analytical things with it,” Chait says. “But I was always a little frustrated by its limitations. I knew, in my heart, it could be better.”

If it were, it could transform single-cell proteomics as well as metabolomics, burgeoning fields that aim to identify and quantitate the complete set of proteins or metabolites in a single cell. Unlike DNA, these molecules cannot be amplified, and the most abundant species may be millions of times more prevalent than the rarest. Mass spectrometry is already proving useful in these applications, but without far greater ability to detect faint signals against an overwhelming background of more abundant species, it will fall well short of its full potential.

Chait and colleagues suspected that the only way to overcome this limitation would be to usher the century-old technology through the so-called “massive parallelization” that once transformed computing and genomics. In computing, researchers discovered that dividing large tasks into many smaller ones and processing them simultaneously—using graphics processing units, or GPUs—dramatically increased performance. DNA sequencing followed a similar path, resulting in relatively low-cost platforms that analyze millions of reactions at once.

“It was a very obvious idea,” says Andrew Krutchinsky, a senior research associate in the lab. “But how to do it with mass spectrometry wasn’t obvious.”

Toward massively parallel processing

The idea for the MultiQ-IT grew out of decades of research into how molecules move in and out of a cell’s nucleus through hundreds of tiny gateways called nuclear pore complexes. Chait and colleagues had observed how the cell spreads the work across many parallel openings, instead of forcing traffic through a single channel. The team wondered whether mass spectrometry could be redesigned along these lines.

The result was a new ion-trapping chamber designed to replace the core component of a conventional mass spectrometer. The cube-shaped device is lined with hundreds of small, electrically controlled openings. Inside, ions are slowed by multiple collisions with residual gas molecules and allowed to move randomly through the chamber, where the system can filter, hold, and redirect many populations at once instead of analyzing them one by one. The team scaled the design from six openings to more than 1,000, testing how efficiently ions could be confined and sorted, and demonstrated that a single incoming stream could be split into multiple parallel streams for simultaneous analysis.

Its performance was striking. At any given moment, a 486-port version of MultiQ-IT could hold up to ten billion charges, roughly a thousand times the capacity of conventional ion traps.

By allowing abundant background molecules to leak out while retaining rarer, information-rich ones, the system improved signal-to-noise ratios by as much as 100-fold, revealing proteins that had previously been undetectable. To achieve this, the researchers applied a small electrical voltage barrier across the trap’s exits: singly charged ions had enough energy to escape, while multiply charged, biologically important ions remained confined. In their 1,134-port design, just 39 open ports were enough to reach half-maximum efficiency for this depletion, echoing how cells use a limited number of pores to similar effect. The team also found that parallelization addressed a physical constraint: packing billions of like-charged particles into a small space creates intense electrical repulsion, but distributing the ions across many channels reduced this repulsion.
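
A simplified way to see why a small exit barrier separates charge states (a toy Boltzmann estimate, not the paper's analysis): crossing a barrier of V volts costs an ion of charge z an energy of z·V electron-volts, so the escape factor falls steeply with charge.

    import math

    # Toy Boltzmann estimate (not the paper's analysis) of charge-selective
    # escape: crossing a barrier of V volts costs an ion of charge z an
    # energy of z*V electron-volts, so for thermalized ions the escape
    # factor exp(-z*V / kT) drops steeply with charge.
    kT_eV = 0.025              # thermal energy near room temperature, in eV
    barrier = 0.1              # hypothetical 0.1 V exit barrier

    def escape_factor(z: int) -> float:
        return math.exp(-z * barrier / kT_eV)

    for z in (1, 2, 3, 5):
        print(f"charge {z}+: relative escape factor = {escape_factor(z):.2e}")
    # Singly charged background ions leak out orders of magnitude faster than
    # multiply charged ones -- the depletion effect described above.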

The increased sensitivity demonstrated by the prototype could, for example, lead to improved detection of low-abundance crosslinked peptides, which are proving very useful for mapping the structures of large protein complexes. “The least abundant things can be more important than the more abundant things,” Krutchinsky says.

For now, MultiQ-IT is less a finished commercial instrument than a demonstration of what is possible. The researchers see their role as establishing the physical blueprint that could one day be scaled into robust clinical and analytical tools.

“There was a lot of development between the discovery of a reaction for sequencing DNA and modern genomics; decades between the first transistor and putting a billion transistors on a chip,” Chait says. “In both cases, someone first had to show it could be done, and then industry took over. I think we’ve shown one way mass spectrometry can be done more efficiently.”




Five Georgia Tech Faculty Named to NAI Senior Members Class of 2026 | Newswise


Newswise — Five faculty members from Georgia Tech have been elected as senior members of the National Academy of Inventors (NAI). As members, they are recognized as distinguished academic inventors with a strong record of patenting technologies, licensing IP, and commercializing their research. Their innovations have made, or have the potential to make, meaningful impacts on society. 

 “The election of our faculty members to this prestigious association is a powerful affirmation of the innovative research happening at Georgia Tech,” said Raghupathy “Siva” Sivakumar, chief commercialization officer at Georgia Tech. “Their work to take research to market reflects the growing importance of invention in addressing society’s most complex challenges. This recognition signals the strength of the commercialization ecosystem at Georgia Tech to advance impactful research, encourage innovation, and prepare the next generation of inventors.” 

The 2026 Georgia Tech NAI senior members are: 

  • Jason David Azoulay, associate professor, School of Materials Science and Engineering and School of Chemistry and Biochemistry
  • Jaydev Prataprai Desai, professor and cardiovascular biomedical engineering distinguished chair, Wallace H. Coulter Department of Biomedical Engineering
  • David Frost, Elizabeth and Bill Higginbotham Professor and Regents’ Entrepreneur, School of Civil and Environmental Engineering
  • Chandra Raman, Dunn Family Professor of Physics, School of Physics
  • Aaron Young, associate professor, George W. Woodruff School of Mechanical Engineering

Jason David Azoulay

Azoulay is recognized for pioneering new classes of functional materials through innovative polymer synthesis, heterocycle chemistry, and polymerization reactions. His work spans electronic, photonic, and quantum materials, device fabrication, and chemical sensing for environmental monitoring. He has demonstrated new classes of organic semiconductors with infrared functionality and holds nine issued U.S. patents. Azoulay is the Georgia Research Alliance Vasser-Woolley Distinguished Investigator and holds a joint appointment in the School of Chemistry and Biochemistry. 

Jaydev Prataprai Desai

Desai is recognized for advancing medical robotics and translational biomedical innovation with inventions spanning robotically steerable guidewires for endovascular interventions, minimally invasive surgical tools, MEMS sensors for cancer diagnosis, and rehabilitation robotics for people with motor impairments. He is the founding editor-in-chief of the Journal of Medical Robotics Research, has authored more than 225 peer-reviewed publications, and serves as director of the Georgia Center for Medical Robotics at Georgia Tech. Desai holds 16 U.S. and international patents.

David Frost

Frost has built a career at the intersection of civil engineering research and entrepreneurship. A leader in the study of natural and human-made disasters and their impacts on infrastructure, he has founded two Georgia Tech-based software companies: Dataforensics, which offers tools for subsurface data collection and infrastructure project management, and Filio, an AI-powered mobile platform that supports visual asset management in construction and post-disaster reconnaissance. In 2023, Frost was named a Regents’ Entrepreneur by the University System of Georgia’s Board of Regents, a designation reserved for tenured faculty who have successfully taken their research into a commercial setting. He holds four U.S. patents.  

Chandra Raman

Raman is a physicist, inventor, and technology entrepreneur whose research on ultracold atoms is enabling a new generation of ultraprecise quantum sensing devices. He is the co-inventor of chip-scale atomic beam technology — a breakthrough that makes it possible to miniaturize quantum sensors for navigation and timing applications in environments where GPS fails, with uses spanning autonomous vehicles, aerospace, and national security. Raman has filed six U.S. patents, three of which have been issued and two of which have been licensed. To bring his inventions to market, he founded 8Seven8 Inc., Georgia’s first quantum hardware company. He is a fellow of the American Physical Society and an advisor to national and space-based quantum initiatives.

Aaron Young

Young directs the Exoskeleton and Prosthetic Intelligent Controls Lab, where he develops robotic exoskeletons and intelligent control systems to improve walking function and physical capability for people with mobility impairments, as well as for industrial safety applications. His research has been supported by major federal grants from the National Institutes of Health, and he holds three U.S. patents. Young works with Georgia Tech’s Office of Technology Licensing and Quadrant-i to advance promising technologies toward real-world adoption.

About Georgia Tech’s Office of Commercialization 

The Office of Commercialization is the nexus of research commercialization and entrepreneurship at Georgia Tech, bringing leading-edge research and innovation to market. It comprises six key units — ATDC, CREATE-X, VentureLab, Quadrant-i, Technology Licensing, and Velocity Startups — that empower students and faculty to launch startups, manage intellectual property, and transform research ideas into positive societal impact. Learn more at commercialization.gatech.edu

About the National Academy of Inventors 

The National Academy of Inventors is a member organization comprising U.S. and international universities, and governmental and nonprofit research institutes, with over 4,000 individual inventor members and fellows spanning more than 250 institutions worldwide. It was founded in 2010 to recognize and encourage inventors with patents issued from the U.S. Patent and Trademark Office, enhance the visibility of academic technology and innovation, and translate the inventions of its members to benefit society. Learn more at academyofinventors.org




Target the Tumor. Spare the Body | Newswise


Newswise — Exhaustion creeps in. Appetite vanishes. Hair thins. The person in the mirror looks gaunt. It’s the paradox of cancer treatment: The same drugs meant to save a life can also wear the body down.

Nick Housley, assistant professor in Georgia Tech’s School of Biological Sciences, wants to change that. He studies where cancer drugs go once they’re inside the body, including places they were never intended to reach. Some of the medicine finds the tumor. The rest interacts with healthy tissue.

This approach has saved millions of lives. It can also create punishing side effects.

“The problem isn’t that these drugs don’t work,” said Housley. “It’s that they affect far more of the body than they need to.”

When Chemistry Does the Work

Cancer cells consume oxygen and nutrients at a higher rate than healthy tissue, and that changes the environment around a tumor. In a recent Nature Communications paper, Housley and his team introduced a drug delivery system that senses those physical changes and guides medicine to the disease. The drug is released only when it encounters those tumor-specific conditions.

Housley’s system is designed to work across many cancer types. Rather than being tailored to a specific tumor type or genetic marker, the system is “cancer agnostic.”

“We don’t need to know anything about the tumor ahead of time,” said Housley. “These particles circulate through the body, but they persist where tumors create those conditions.”

The Whack-a-Mole Problem

Housley’s design sidesteps a constant challenge in cancer treatment. Tumors are not static. They change in response to pressure from therapy, creating what Housley describes as “a whack-a-mole problem,” where hitting one target can push the disease to reemerge in a different form.

“Tumors are constantly changing,” said Housley. “You hit one thing with a targeted therapy, and that pressure causes the tumor to evolve. That’s a big problem with classically targeted therapies.”

“It has the potential to be a breakthrough at the clinic…patients in early trials could benefit directly; that’s rare and exciting.” –Nick Housley

Letting the Body Lead

Housley’s drug delivery system is called SANGs, short for “self-assembling nanohydrogels.” Nanohydrogels are microscopic, gel-like particles designed to carry drugs through the bloodstream. As the nanohydrogels circulate, they keep the drug contained. The particles pass through healthy tissue without releasing medicine. When they encounter the environment created by a tumor, they linger and release the drug where it’s needed most.

In preclinical studies, the nanohydrogels did what they were designed to do: they circulated through the body without releasing the drug too early, responded to tumor-specific conditions, and concentrated treatment at the disease site.

Moving Toward Patients

Housley and his team are now planning to test SANGs with additional drugs and across a wider range of cancers, laying the groundwork for human clinical trials.

“The moment we can get our first patient in the study, the moment we can collect that first data and begin to see what this really changes, that will be a big moment,” he said.

Cancer treatment is physically taxing, but it also forces people into what Housley describes as “a constant calculus — weighing time gained against what that time will feel like.”

The goal isn’t to remove uncertainty from cancer care. It’s to narrow the impact of treatment, so patients aren’t forced to sacrifice how they feel for how long they live.




Changing the Playing Field in Nickel Catalysis | Newswise


BYLINE: Tracy Crane, Department of Chemistry

Newswise — Researchers at the University of Illinois Urbana-Champaign have reported a breakthrough in nickel catalysis that harnesses a rare oxidation state of nickel that has proved challenging to control yet is highly valued for its potential to facilitate important chemical reactions.

The researchers, led by Liviu Mirica, a professor of chemistry at Illinois, explain in a recently published paper in Nature Catalysis how they have overcome a long-standing challenge in the field of nickel catalysis by developing a new method for synthesizing thermally stable Ni(I) compounds that opens new avenues for building complex molecules.

“We have developed shelf-stable Ni(I) compounds that could dramatically change the playing field of nickel catalysis. And that’s why we have an international patent for it, and we’re working with pharmaceutical companies and chemical vendors who want to license it,” Mirica said.

Nickel-catalyzed cross-coupling reactions are widely used to form carbon–carbon and carbon–heteroatom bonds, essential steps in producing pharmaceuticals, agrochemicals, and advanced materials. Traditionally, these reactions rely on two forms of nickel – Ni(0) or Ni(II) – as catalysts. Catalytically competent Ni(I) sources have remained elusive but attractive.

“This form of nickel is highly desirable partly because it may open up new avenues of reactivity that have remained elusive with traditional sources of nickel,” said Sagnik Chakrabarti, co-author and former graduate student in the Mirica group who worked on the project with graduate students Jubyeong Chae and Katy A. Knecht.

Mirica said previous approaches by chemists have used specialized ligands that limit the generality of Ni(I), preventing it from being deployed in a reaction the way one would use Ni(II) or Ni(0) sources. By tapping into the unique properties of organic compounds called isocyanides, the Mirica group has developed a simple system that gets the chemistry to work.

In their study, they demonstrated how the commercially available isocyanides function as simple supporting ligands, which connect to the nickel atom and form stable, powerful catalysts that can be used to snap molecular pieces together with exceptional speed and precision, opening an untapped chemical space for reaction discovery.

Their Ni(I) complexes are readily available, shelf-stable, easily prepared, and easily handled catalysts that are efficient for a wide variety of chemical reactions. This is unique because most Ni(I) complexes tend to be rather unstable, which has limited their use in catalytic settings.

“We were able to put Ni(I), ‘nickel one’, in a bottle so people can use it on a wider scale for various synthetic applications,” Mirica said.

In the study, the researchers demonstrate that these new catalysts work in several of the most important reactions used to make pharmaceuticals, electronics, advanced materials, and more. They report the synthesis, characterization, and catalytic activity of two classes of Ni(I) isocyanide complexes: coordinatively saturated homoleptic compounds and coordinatively unsaturated Ni(I)-halide compounds. One is slightly more reactive than the other.

Their complexes exhibit rapid ligand substitution and demonstrate exceptional performance in Kumada, Suzuki–Miyaura, and Buchwald–Hartwig cross-coupling reactions, according to the study. Notably, they also exhibit chemoselectivity, underscoring their versatility.

According to Mirica and Chakrabarti, this new class of catalysts could be a game changer in nickel catalysis. Chakrabarti said new reactions could be discovered by directly introducing Ni(I) into reactions.

“And in fact, in the paper, we do talk about a new class of reactions that we developed and that has not been achieved with Ni catalysts before,” he said. “It’s just a snippet of reactivity, not like a full vignette in itself, but it still shows that by synthesizing something that’s different from what’s out there, we can maybe coax unique reactivity.”

The research team also found that a tiny amount goes a long way. 

“The interesting thing that we found is that we can use very, very tiny amounts of the nickel catalyst, which is unusual in Ni catalysis, which typically needs higher amounts of the catalyst,” Mirica said.

The study also highlights the structural diversity of isocyanides and their potential as spectator ligands for reaction discovery. The chemistry is not limited to the one class of isocyanide featured, tert-butyl isocyanide, but is broadly applicable to other classes of isocyanides as well.

“So, the generality in using a bunch of different isocyanides bodes well for the future development of this chemistry,” Chakrabarti said.

Future work in the Mirica group will explore the fundamental structure and bonding of these unusually stable compounds, their new reactivity, and the differences in reactivity between alkyl and aryl isocyanide-supported complexes, which, according to their study, exhibit divergent catalytic behavior.





A Smashing Success: Relativistic Heavy Ion Collider Wraps up Final Collisions


Newswise — UPTON, N.Y. — Just after 9 a.m. on Friday, Feb. 6, 2026, final beams of oxygen ions — oxygen atoms stripped of their electrons — circulated through the twin 2.4-mile-circumference rings of the Relativistic Heavy Ion Collider (RHIC) and crashed into one another at nearly the speed of light inside the collider’s two house-sized particle detectors, STAR and sPHENIX. RHIC, a nuclear physics research facility at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory, has been smashing atoms since the summer of 2000. The final collisions cap a quarter century of remarkable experiments using 10 different atomic species colliding over a wide range of energies in different configurations. The RHIC program has produced groundbreaking discoveries about the building blocks of matter and the nature of proton spin, as well as technological advances in accelerators, detectors, and computing that have far surpassed scientists’ expectations when this discovery machine first turned on.

“RHIC has been one of the most successful user facilities operated by the DOE Office of Science, serving thousands of scientists from across the nation and around the globe,” said DOE Under Secretary for Science Darío Gil. “Supporting these one-of-a-kind research facilities pushes the limits of technology and expands our understanding of our world through transformational science — central pillars of DOE’s mission to ensure America’s security and prosperity.”

Gil was in the Main Control Room of Brookhaven Lab’s collider complex to officially end the 25th and final run at RHIC in advance of announcing the next major milestone in the construction of the Electron-Ion Collider (EIC), a state-of-the-art nuclear physics research facility that will be built by reusing major components of RHIC.

“It’s been an amazing run,” said Wolfram Fischer, chair of Brookhaven Lab’s Collider-Accelerator Department (C-AD), speaking of the entirety of the RHIC program. As head of C-AD, Fischer is responsible for the day-to-day, year-to-year operations of the collider and all its ancillary accelerator infrastructure. “Experiencing the challenges of first trying to get beams to circulate during commissioning in the fall of 1999, one could not have dreamed how far the performance of this machine would come,” he said. “We’ve pushed well beyond the original design in terms of the number of collisions we can produce, the energy range of those collisions, the variety of ions we’ve collided, and our ability to align the spins of protons and maintain a high degree of this alignment or polarization.”

The 25th and final run produced the largest-ever dataset from RHIC’s most energetic head-on smashups between two beams of gold ions, among the heaviest ions collided at RHIC. It also yielded a treasure trove of proton-proton collisions that will provide essential comparison data and insight into proton spin, a set of low-energy fixed target collisions to complete RHIC’s “beam energy scan,” and a final burst of oxygen-oxygen interactions. All this data will add to that collected previously by RHIC’s detectors — STAR, which has been running with many upgrades since RHIC’s beginning; PHENIX, another original RHIC detector that ceased operations in 2016; PHOBOS and BRAHMS, two smaller original detectors that ran from 2000 through 2005 and 2006, respectively; and sPHENIX, RHIC’s newest, most rapid-fire collision “camera,” which came online in 2023.

This final run generated the primary data set for the new sPHENIX experiment. This year, sPHENIX accumulated more than 200 petabytes of raw data — or 200 quadrillion bytes — more than all previous RHIC raw datasets combined. This massive dataset includes 40 billion snapshots of the unique form of matter generated in gold-ion collisions.

Collectively, the RHIC measurements will fill in missing details in physicists’ understanding of how a soup of fundamental particles known as quarks and gluons — which last existed in nature some 14 billion years ago, a microsecond after the Big Bang — coalesced and converged to form the more ordinary atomic particles that make up everything visible in our world today. Recreating this primordial matter, known as a quark-gluon plasma (QGP), was the primary reason for building RHIC. RHIC’s energetic collisions of heavy ions such as gold were designed to set quarks and gluons free from “confinement” within protons and neutrons by melting the boundaries of these nuclear particles.

Thanks to considerable contributions from Japan’s RIKEN institute, RHIC was also built with unique capabilities for polarizing protons so that physicists could explore the origins of proton spin. This intrinsic quantum property, somewhat analogous to a planet spinning on its axis, has been leveraged to develop powerful technologies like nuclear magnetic resonance and medical MRI. RHIC’s polarized proton collisions have opened a new window into the mystery of how spin arises from the proton’s quarks and gluons.

PHENIX and STAR have both collected and published results from large swaths of spin-polarized collisions using selection “triggers” to decide which events to capture and study. During Run 25, sPHENIX became the world’s first detector to record a continuous streaming dataset from RHIC’s spin-polarized proton collisions — thus eliminating the need for triggers and potentially paving the way for unanticipated discoveries.

“This final RHIC run, with its impressive dataset, is a capstone that exemplifies the success of the entire RHIC program,” said John Hill, interim director of Brookhaven Lab. “The scientists, engineers, and technicians at Brookhaven deserve huge credit for their dedication and innovation throughout the operating life of RHIC — and for continually finding new ways to maximize the scientific output of this remarkable machine. We are also extremely grateful for the continued support of the U.S. Department of Energy, and for our collaborators from other DOE labs, U.S. universities, and scientific institutions around the globe. This exploration of the matter that makes up our world and of how it came to be has been, and will continue to be, a truly international endeavor.”

Captivating discoveries

In early 2001, as the earliest RHIC data came out, some scientists were convinced that they’d seen signs of the post-Big-Bang QGP. But the data also presented puzzling surprises. Instead of the predicted uniformly expanding gas of quarks and gluons, the matter created in RHIC’s collisions seemed to flow more like a liquid — and, remarkably, one with extremely low viscosity. Additional experiments and a careful multiyear analysis led the four original RHIC collaborations to conclude in 2005 that RHIC was generating a nearly “perfect” liquid. By 2010, they had sufficient evidence to declare this liquid hot enough to be the long-sought QGP.

Since then, RHIC physicists have been making precision measurements of the QGP, including its temperature at different stages, how it swirls — it’s the swirliest matter ever! — how quarks and gluons in the primordial soup transition under various conditions of temperature and pressure to the nuclear matter that makes up atoms in our world, and how collisions of even small particles can create tiny drops of the QGP. They’ve explored exotic forms of nuclear matter such as that found in neutron stars, detected traces of the heaviest exotic antimatter ever created in a laboratory, and explored how visible matter emerges from the “nothingness” of empty space. The sPHENIX experiment has only recently published its first physics results, laying the foundation for its future of scientific insights.

“RHIC transformed nuclear physics by demonstrating the remarkable consequences of ‘boiling the vacuum,’ to paraphrase renowned physicist T. D. Lee’s description of matter governed by quantum chromodynamics (QCD),” said Brookhaven Lab theorist Raju Venugopalan. “In QCD — the theory that describes quarks and gluons and their interactions — findings from RHIC propelled the rapid development of new analytical approaches and high-performance computing. The RHIC data also sparked several unanticipated connections between the behavior of the QGP fluid and strongly correlated condensed matter systems, including ultra-cold atoms, as well as links to concepts such as quantum entanglement and the formation and evaporation of black holes.”

Advances in nuclear physics theory and the enormous RHIC datasets have also pushed the evolution of supercomputers, AI methods for analyzing “big data,” and the infrastructure needed to store and share data seamlessly with RHIC collaborators around the world. In 2024, Brookhaven’s data center — which also houses data from the ATLAS experiment at the Large Hadron Collider (LHC) at CERN, the European Organization for Nuclear Research, and other experiments — passed the milestone of storing 300 petabytes of data, the largest compilation of nuclear and particle physics data in the U.S. With the newest data from RHIC and ATLAS, the total now tops 610 petabytes.

In the proton spin program, RHIC’s measurements greatly improved the precision with which scientists could determine gluons’ contribution to proton spin, along with the contribution from quarks. This effort was motivated by surprising results from experiments elsewhere in the 1980s showing that quarks contribute only a fraction to this quantum property. Gluons were initially assumed to contribute the rest. RHIC’s measurements reveal that gluons contribute about as much as the quarks — not enough to fully solve the “spin puzzle.” A more recent analysis established that at least some of the gluons are spin aligned with the spin of the proton they are in. But there is still more to explore in this spin puzzle.

“Spin is one of the fundamental quantum numbers of every elementary particle in the universe except one, the Higgs,” said Elke Aschenauer, a Brookhaven Lab physicist who has played a pivotal role in RHIC’s spin physics program. “RHIC’s measurements have established the groundwork for understanding the complexity of proton spin. The future EIC will be a precision machine for studying proton spin.”

All Relativistic Heavy Ion Collider data is stored on tape at Brookhaven Lab’s data center. When physicists want access to a particular dataset — or multiple sets simultaneously — a robot grabs the appropriate tape(s) and mounts the desired data to disk within seconds. Collaborators around the world can tap into the data as if it were on their own desktop. (David Rahner/Brookhaven National Laboratory)

Continuing legacy

Even with so many impressive discoveries in the books, RHIC physicists say there will be many more to come for at least another decade.

“The science mission of RHIC will continue until we analyze all the data and publish all the papers,” said Abhay Deshpande, Brookhaven Lab’s associate laboratory director for nuclear and particle physics. He emphasized how important it will be to preserve RHIC’s data for future scientific analyses.

RHIC’s data will also continue to serve as an essential bridge between ongoing and planned experiments exploring nuclear matter at lower collision energies — for example at the Facility for Antiproton and Ion Research (FAIR) being built in Germany and the Super Proton Synchrotron at CERN — and at much higher energies at CERN’s LHC.

“Analyzing the latest RHIC data will also help train the next generation of physicists needed to run and analyze data from future experiments,” said Lijuan Ruan, a Brookhaven Lab physicist and co-spokesperson for the STAR Collaboration.    

A big part of that future will take place right here at Brookhaven National Laboratory where major components of the RHIC accelerator complex will live on in a new nuclear physics research facility, the world’s only polarized Electron-Ion Collider. Engineers and technicians will remove one of RHIC’s ion storage rings and replace it with a new ring for storing accelerated electrons inside the existing accelerator tunnel. Meanwhile, the other RHIC ring, refurbished for its new mission, will receive ions accelerated by C-AD’s existing injector complex, traveling around the tunnel in the opposite direction from the electrons. Scientists will leverage the experience gained during 25 years of RHIC operations — as well as reams of RHIC accelerator physics data — to develop and train new AI algorithms designed to optimize EIC accelerator performance.  

When electrons collide with ions where the two EIC rings cross, the action will be captured by a brand-new particle detector. Instead of recreating the early universe, these microscope-like interactions will enable precision measurements that reveal how quarks and gluons are organized and interact within matter as we know it in today’s world.

“We’ll learn how quarks and gluons generate mass, how their interactions contribute to proton spin, and much more that will revolutionize our understanding of matter — much as the science we’ve explored at RHIC has,” said Deshpande, who also serves as director of science for the EIC. “This is the future of Brookhaven Lab and nuclear physics in the U.S.”

Daniel Marx, one of the accelerator physicists working on the design of the EIC’s new electron storage ring, said, “It’s going to be very challenging, but also exciting. We’ll be doing things that have never been done before.”

Perhaps Marx was echoing the sentiments of the physicists who originally built RHIC, demonstrating another big part of RHIC’s legacy: an ongoing willingness to tackle unprecedented scientific and technological challenges.

“We are confident that we have the people who will make the EIC happen because of the expertise we have developed by building and running RHIC,” Deshpande said.

RHIC and the future EIC are funded primarily by the DOE Office of Science.

Brookhaven National Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit science.energy.gov.
