Georgia Tech Names Mike Gazarik Director of Georgia Tech Research Institute | Newswise


Newswise — Georgia Institute of Technology has named Michael “Mike” Gazarik as the new director of the Georgia Tech Research Institute (GTRI) and a Georgia Tech senior vice president, effective February 16. 

A nationally respected aerospace and research leader, Gazarik has led large, complex research organizations across government, industry, and academia, shaping strategy, driving growth, and building institutions that deliver mission-critical innovation. With more than three decades of experience, his career reflects a deep ability to align technology with national priorities and guide organizations through periods of change and opportunity. 

A Georgia Tech alumnus, Gazarik currently serves as faculty director of the Engineering Management Program at the University of Colorado Boulder and as a part‑time staff member at the Johns Hopkins Applied Physics Laboratory. He previously held senior leadership roles at NASA, including director of engineering at NASA Langley Research Center and inaugural associate administrator for the Space Technology Mission Directorate (STMD). In industry, he spent eight years as vice president of engineering at Ball Aerospace, leading its growth from an elite science contractor into a strategic national security asset that doubled in size.

“Mike Gazarik brings a rare combination of technical depth, executive leadership, and deep government experience,” said Tim Lieuwen, Georgia Tech’s executive vice president for Research. “He knows large research enterprises operate within the realities of policy and budget and has a proven ability to align technology with mission priorities while earning trust across stakeholders. We are excited to welcome Mike back to Georgia Tech to lead GTRI at a pivotal moment for research and innovation.”

GTRI employs more than 3,000 people, conducting nearly $1 billion in annual research in areas such as autonomous systems, cybersecurity, electromagnetics, electronic warfare, modeling and simulation, sensors, systems engineering, and threat systems. GTRI’s renowned researchers combine science, engineering, economics, and policy to address challenges facing national security, industry, and society.

“For nearly a century, GTRI has partnered with government and industry to deliver solutions to the most mission-critical challenges facing our nation,” said Georgia Tech President Ángel Cabrera. “We are proud to welcome Mike Gazarik to lead a crown jewel of our research enterprise and a crucial component of our nation’s science and technology fabric. His experience and leadership will strengthen GTRI’s ability to deliver on its mission and help make our nation safer, healthier, and more competitive.”

Gazarik is widely recognized for leading complex research enterprises with a focus on stability, strategic alignment, and mission impact. At NASA, he helped shape the agency’s science and technology enterprise during periods of fiscal constraint and technical risk, maintaining balance across broad mission areas and forming STMD to consolidate technology development. At Ball Aerospace, he guided significant growth and aligned strategy with evolving national security and civil space needs. His academic work has focused on preparing engineering leaders for mission-driven organizations — experience that aligns closely with GTRI’s role as a trusted partner to government and industry.

He earned a B.S. in electrical engineering from the University of Pittsburgh and an M.S. and Ph.D. in electrical engineering from Georgia Tech. Gazarik is a fellow of the American Institute of Aeronautics and Astronautics (AIAA), a former chair of AIAA’s Corporate Strategic Committee, and was elected to the AIAA Board of Trustees in 2025. His honors include NASA’s Outstanding Leadership Medal, the Silver Snoopy Award, the 2023 AIAA Rocky Mountain Section Educator of the Year, and recognition as Engineering Manager of the Year by the American Society of Engineering Management.

“GTRI has a remarkable legacy of delivering solutions that matter for the nation,” said Gazarik. “I’m honored to return to Georgia Tech and lead an organization that combines deep technical expertise with a mission-driven culture. My focus will be on listening, building on GTRI’s strengths, and ensuring we continue to advance research that makes a real difference for our partners and society.”

As director, Gazarik will lead GTRI’s multidisciplinary research enterprise, advancing its mission to deliver high‑impact science and technology solutions in support of national security, space systems, and critical societal needs.




Why Amazon’s CEO is ‘confident’ with $200 billion spending plan


Andy Jassy, CEO of Amazon, speaks during an unveiling event in New York, Feb. 26, 2025.


Amazon’s stock plunged 11% in extended trading on Thursday, dragged lower by market jitters around the company’s $200 billion capex plans, the highest spending forecast among the megacap companies.

The forecast is a sharp increase from Amazon’s capital expenditures last year, and it was more than $50 billion above analysts’ expectations. The company reported spending roughly $131 billion on purchases of property and equipment in 2025, up from about $83 billion in the year prior.

Tech companies have laid out aggressive spending plans on artificial intelligence infrastructure since OpenAI ushered in the modern era of this technology with the release of ChatGPT in late 2022, but at the start of 2026, those lavish commitments have only kept growing.

Google parent Alphabet on Wednesday said it would spend up to $185 billion in 2026, while Meta last week said its capital expenditures could nearly double from last year to somewhere between $115 billion and $135 billion in 2026.

On a conference call with investors, Wall Street analysts pressed Amazon executives for more clarity around the spending blitz and when it could begin to pay off. CEO Andy Jassy said in prepared remarks at the beginning of the call that he was “confident” that the company’s cloud unit will see a “strong return on invested capital,” though he didn’t say when it could materialize.

“Help us, get to that — get to your level of confidence in having a strong long term return on that invested capital,” Mark Mahaney, Evercore ISI head of internet research, said to Jassy.

Jassy said the company needs the capital to keep pace with “very high demand” for Amazon’s AI compute, which requires more infrastructure such as data centers, chips and networking equipment.

“This isn’t some sort of quixotic, top-line grab,” Jassy said. “We have confidence that we, that these investments will yield strong returns on invested capital. We’ve done that with our core AWS business. I think that will very much be true here as well.”

Sales at Amazon Web Services grew 24% to $35.6 billion in the most recent period, beating analysts’ expectations and marking the cloud unit’s “fastest growth in 13 quarters,” Jassy said.

AWS could’ve grown faster if it had more capacity to meet demand, “so we are being incredibly scrappy around that,” he said.

The company’s cloud unit added almost 4 gigawatts of computing capacity in 2025, and AWS expects to double that power by the end of 2027, Jassy noted.

Barclays analyst Ross Sandler asked Jassy how he sees the AI market evolving from the current landscape, where it remains “a bit top-heavy with a lot of the spend clustering around a few of the AI-native labs.”

Jassy said the AI market has become more like a “barbell,” with the AI labs on one side and enterprises on the other end, looking to the technology as a “productivity and cost avoidance” tool. The middle comprises enterprises that are in various stages of building AI applications, he said.

“That middle part of the barbell very well may end up being the largest and most durable,” Jassy said.



Alphabet shares close flat after earnings beat. Here’s what’s happening



Alphabet’s shares closed largely flat on Thursday after the company beat Wall Street’s expectations on earnings and revenue, with artificial intelligence spending projected to increase hugely this year.

The Google parent closed nearly 2% lower on Wednesday. After the bell, Alphabet reported fourth-quarter revenue of $113.83 billion, above the $111.43 billion estimate from analysts polled by LSEG.

Its Google Cloud division had $17.66 billion in revenue versus a forecast of $16.18 billion, according to StreetAccount. YouTube Advertising posted $11.38 billion in revenue versus the estimated $11.84 billion.

The tech giant said it would significantly increase its 2026 capital expenditure to between $175 billion and $185 billion — more than double its 2025 spend. A significant portion of capex spending would go toward investing in AI compute capacity for Google DeepMind.

What analysts are saying

Barclays analysts said in a note Thursday that Infrastructure, DeepMind and Waymo costs “weighed on overall Alphabet profitability,” and will continue to do so in 2026.

“Cloud’s growth is astonishing, measured by any metric: revenue, backlog, API tokens inferenced, enterprise adoption of Gemini. These metrics combined with DeepMind’s progress on the model side, starts to justify the 100% increase in capex in ’26,” they said.

“The AI story is getting better while Search is accelerating – that’s the most important take for GOOG,” they added.

Deutsche Bank analysts said in a note Thursday that Alphabet has “stunned the world” with its huge capex spending plan. “With tech in a current state of flux, it’s not clear whether that’s a good or a bad thing,” they wrote.

Correction: This story has been updated to correct that Alphabet shares were down on Thursday.


AI, Automation, and Biosensors Speed the Path to Synthetic Jet Fuel | Newswise


BYLINE: Will Ferguson

Newswise — When it comes to powering aircraft, jet engines need dense, energy-packed fuels. Right now, nearly all of that fuel comes from petroleum, as batteries don’t yet deliver enough punch for most flights. Scientists have long dreamed of a synthetic alternative: teaching microbes to ferment plant material into high-performance jet fuels. But designing these microbial “mini-factories” has traditionally been slow and expensive because of the unpredictability of biological systems.

In a pair of recent studies, two teams at the Joint BioEnergy Institute (JBEI), which is managed by Lawrence Berkeley National Laboratory (Berkeley Lab), have demonstrated complementary ways to dramatically speed up this process. One combines artificial intelligence and lab automation to rapidly test and refine the genetic designs of biofuel-producing microbes. The other turns a microbe’s “bad habit” into a powerful sensing tool, uncovering hidden pathways that boost production.

Their shared target is isoprenol — a clear, volatile alcohol that can be converted into DMCO, a next-generation jet fuel with higher energy density than today’s conventional aviation fuels. Producing isoprenol efficiently has been a long-standing challenge in synthetic biology.

The two studies — one published in Nature Communications, the other in Science Advances — tackle different sides of this challenge. The first uses automation and machine learning to engineer Pseudomonas putida strains that produce five times more isoprenol than before. The second approach turns the bacterium’s natural fuel-sensing ability into an advantage. By rewiring that system into a biosensor, the team could rapidly screen millions of variants and identify strains that make up to 36 times more isoprenol.

“These are two powerful complementary strategies,” said senior author of the biosensor study Thomas Eng, JBEI deputy director of Host Engineering and a research scientist in Berkeley Lab’s Biological Systems and Engineering (BSE) Division. “One is data-driven optimization; the other is discovery. Together, they give us a way to move much faster than traditional trial-and-error.”

A new engine for strain design

The AI and automation study was led by Taek Soon Lee, director of Pathway and Metabolic Engineering at JBEI, and Héctor García Martín, director of Data Science and Modeling at JBEI, both staff scientists in Berkeley Lab’s BSE Division. They set out to accelerate one of synthetic biology’s most time-consuming steps: improving microbial production through a series of genetic tweaks to different combinations of genes. Traditionally, scientists alter a few genes at a time and test the results — a painstaking, intuition-driven process that can take months or even years to yield meaningful gains.

By contrast, the Berkeley Lab researchers built an automated pipeline that uses robotics to create and test hundreds of genetic designs in parallel. After each round, machine learning algorithms analyze the results to systematically suggest the next set of strain genetic designs. The result is a system that moves 10 to 100 times faster than conventional methods.

“Standard metabolic engineering is slow because you’re relying on human intuition and biological knowledge,” said García Martín. “Our goal was to make strain improvement systematic and fast.”

Lead author David Carruthers, a scientific engineering associate with JBEI and BSE, developed a robotic workflow that connects key lab steps into one automated system. Working with collaborators at Lawrence Livermore National Laboratory, the team introduced a custom microfluidic electroporation device that can insert genetic material into 384 Pseudomonas putida strains in under a minute — a task that typically takes hours by hand.

At the core of the system is CRISPR interference (CRISPRi), a tool that lets researchers “turn down” gene activity rather than switching genes off completely. This fine-tuning makes it possible to test subtle gene combinations that shape the cell’s metabolism and track the effects through detailed protein measurements. After each round, the machine learning model analyzes the results and recommends the next set of genes that are most likely to boost performance when dialed down.

“Traditionally, optimizing production is a kind of guess-and-check process,” Carruthers said. “You make one change, test it, and hope you’re climbing toward a higher peak. By combining automation and machine learning, we were able to climb that landscape systematically — in weeks, not years.”
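The loop Carruthers describes can be sketched in a few lines. The toy simulation below is not JBEI’s actual pipeline: the gene names, the additive titer model, and the crude surrogate predictor are all invented stand-ins, meant only to show the shape of an automated design-build-test-learn cycle in which a model ranks untested gene-knockdown combinations after each round of measurements.

```python
import itertools
import random

random.seed(0)

# Toy model: each gene knockdown has a hidden additive effect on isoprenol titer.
GENES = ["gene_" + c for c in "ABCDEFGH"]
TRUE_EFFECT = {g: random.uniform(-0.5, 1.0) for g in GENES}

def measure_titer(design):
    """Simulated 'build and test' step: titer of a strain with these genes dialed down."""
    return 1.0 + sum(TRUE_EFFECT[g] for g in design)

def predict(design, observed):
    """Crude surrogate model: average titer of previously tested designs that
    share at least one gene with this candidate (prior of 1.0 otherwise)."""
    related = [t for d, t in observed.items() if set(d) & set(design)]
    return sum(related) / len(related) if related else 1.0

candidates = list(itertools.combinations(GENES, 3))
observed = {}

for cycle in range(6):  # six engineering cycles, as in the study
    untested = [c for c in candidates if c not in observed]
    # 'Learn' step: rank untested designs by prediction, then test the top batch.
    batch = sorted(untested, key=lambda c: predict(c, observed), reverse=True)[:8]
    for design in batch:
        observed[design] = measure_titer(design)

best = max(observed, key=observed.get)
print("best design:", best, "titer:", round(observed[best], 2))
```

In the real system, `measure_titer` corresponds to robotic strain construction plus analytics, and the surrogate is a trained machine learning model rather than a neighbor average; the point is only that each cycle's measurements feed the next cycle's design choices.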

Lee, who led the metabolic engineering work, emphasized why this level of automation is so transformative for biology.

“We have been engineering Pseudomonas by hand for years, but biological experiments always come with small variations that are hard to control,” he said. “Automation gives us the ability to generate the same high-quality data every time, which is essential for machine learning to work well.”

Patrick Kinnunen, a former Berkeley Lab JBEI postdoctoral researcher who co-developed the data strategy, highlighted how crucial that reproducibility was for the algorithms. “Automation didn’t just make the experiments faster — it made the data cleaner,” he said. “That clarity is what lets it uncover non-intuitive genetic combinations that we probably would have missed by hand.”

Using their automated learning loop, the team completed six engineering cycles, each lasting just a few weeks instead of the months typical of manual workflows. They boosted isoprenol titers (the concentration of product in the culture) five-fold compared to their starting strain.

Turning a bug into a feature

Meanwhile, a second team led by Eng tackled a different but equally stubborn hurdle: how to select target genes that, when dialed down, improve isoprenol production significantly. The team’s microbe, Pseudomonas putida, posed a peculiar problem. It didn’t just make isoprenol; it also consumed the fuel molecule almost as soon as it produced it, undermining production efforts. Initially, this looked like a flaw. But during the COVID-19 pandemic, Eng and colleagues realized it might be a clue: if the microbe could sense and eat isoprenol, it likely had a built-in molecular sensor.

“There was a real ‘Aha!’ moment,” Eng said. “We had spent more than a year trying to figure out why the cells were consuming the product. One day we thought, ‘Wait, if they can sense it, there has to be a protein that detects it. Maybe we can turn that from a problem into a tool.’”

The team discovered the molecular system the microbe uses to sense isoprenol: two proteins that work together to detect the fuel and send signals inside the cell. They then rewired this system into a biosensor — a kind of biological “engine light” that turns on in proportion to how much fuel the cell produces.

Then came the clever twist: They linked the sensor to genes essential for survival, creating a system where only the microbes that make the most fuel can grow. Instead of measuring thousands of samples by hand, they let natural selection do the screening. This approach rapidly surfaced “champion” strains, including variants that produced up to 36 times more isoprenol than the original.
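The growth-coupling idea can be illustrated with a toy simulation. Nothing below reflects the actual genetic construct: it simply resamples a population with reproduction probability proportional to each strain’s production level, which is the selection principle the biosensor-to-survival-gene linkage implements.

```python
import random

random.seed(1)

# Toy population: each strain has a fixed isoprenol production level (arbitrary units).
population = [random.uniform(0.1, 1.0) for _ in range(10_000)]

def grow(pop, rounds=5):
    """Growth-coupled selection: because survival genes are tied to the biosensor,
    a strain's chance of propagating scales with how much fuel it makes."""
    for _ in range(rounds):
        # Resample the population, weighted by production level.
        pop = random.choices(pop, weights=pop, k=len(pop))
    return pop

before = sum(population) / len(population)
after_pop = grow(population)
after = sum(after_pop) / len(after_pop)
print(f"mean production before: {before:.2f}, after selection: {after:.2f}")
```

After a few rounds, the mean production of the simulated culture climbs toward the best producers, with no per-strain measurement required: the selection itself does the screening.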

“What started as a frustrating bug became our biggest asset,” Eng said. “We turned the microbe’s fuel-eating behavior into a sensor that reports and selects for the best producers automatically.”

The approach also revealed surprising biology: high-producing strains switched to feeding on their own amino acids once glucose ran out, sustaining production by rewiring their metabolism in unexpected ways. Just as importantly, the workflow can be applied to other molecules, offering a flexible new tool for rapidly engineering microbes — not just for isoprenol, but for a wide range of bio-based products.

Scaling up to industry-ready

Although developed independently, the two approaches fit together well. The AI-driven pipeline excels at rapidly optimizing combinations of a known set of gene targets, while the biosensor method is best for discovering novel gene targets, revealing genetic levers that would be difficult to predict.

“One is depth-first; the other is breadth-first,” Eng said. “Machine learning systematically optimizes combinations of annotated targets, while the biosensor approach starts fresh and lets the cells tell us which gene targets matter.”

Both teams are now working to scale their methods from lab experiments to industrially relevant fermentation systems — a critical step for producing synthetic aviation fuel at commercial levels. They’re also adapting their approaches to other microbes and target molecules, aiming to make them broadly applicable in biomanufacturing.

“If widely adopted, these approaches could reshape the industry,” García Martín said. “Instead of taking a decade and hundreds of people to develop one new bioproduct, small teams could do it in a year or less.”

Aindrila Mukhopadhyay, BSE deputy director for science, director of Host Engineering at JBEI, and a coauthor on the biosensor study, said these kinds of tools are changing how biological research gets done.

“Engineering biology is challenging due to the inherent unpredictability of metabolism and that makes the engineering slow,” Mukhopadhyay said. “By streamlining key steps — as we did through selections — and leveraging automation and AI, we’re making it a faster, more systematic process that is easier to adopt.”

JBEI is a Bioenergy Research Center funded by the Department of Energy Office of Science.

###

Lawrence Berkeley National Laboratory (Berkeley Lab) is committed to groundbreaking research focused on discovery science and solutions for abundant and reliable energy supplies. The lab’s expertise spans materials, chemistry, physics, biology, earth and environmental science, mathematics, and computing. Researchers from around the world rely on the lab’s world-class scientific facilities for their own pioneering research. Founded in 1931 on the belief that the biggest problems are best addressed by teams, Berkeley Lab and its scientists have been recognized with 17 Nobel Prizes. Berkeley Lab is a multiprogram national laboratory managed by the University of California for the U.S. Department of Energy’s Office of Science.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.