New System Designed to Protect Drones From Cyber Threats | Newswise


Newswise — Adelaide University researchers have initiated the development of a world-first cybersecurity system designed to protect drones from increasingly sophisticated cyber threats.

A new study, led by the Industrial AI Research Centre and published in the international journal Computers and Industrial Engineering, paves the way for safer and more resilient unmanned aerial systems (UAS) that are less vulnerable to hacking, signal disruption and malicious software.

Senior author Professor Javaan Chahl says the research addresses a growing but often overlooked problem: modern drones are effectively flying computers that can be attacked.

“Today’s drones are used in warfare, for emergency response, infrastructure inspections, agriculture, environmental monitoring, logistics and even medical deliveries,” Prof Chahl says.

“They collect large amounts of data, process it onboard, and communicate continuously with operators or cloud-based systems. While this makes drones powerful and versatile, it also makes them vulnerable.”

To solve this, the team has developed a new onboard security architecture based on Software-Defined Wide Area Networking, or SD-WAN, which acts as a smart traffic controller for internet connections.

“Instead of relying on a single link, the drone can use multiple communication pathways at once – such as mobile networks, Wi-Fi or other radio links – and automatically switch between them if one fails or is attacked.”
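The multi-path failover idea can be sketched in a few lines. This is an illustrative toy only, not the team's actual SD-WAN implementation; the link names and the boolean health check are hypothetical stand-ins for real latency/loss probing.

```python
# Toy sketch of SD-WAN-style link failover (illustrative; link names
# and health checks are hypothetical, not the published design).

class Link:
    def __init__(self, name, healthy=True):
        self.name = name
        self.healthy = healthy

    def is_up(self):
        # A real controller would probe latency, packet loss and jitter here.
        return self.healthy

def select_link(links):
    """Return the first healthy link, mimicking automatic failover."""
    for link in links:
        if link.is_up():
            return link
    return None  # all paths down

links = [Link("4G"), Link("Wi-Fi"), Link("satellite")]
assert select_link(links).name == "4G"
links[0].healthy = False            # simulate a jammed or attacked link
assert select_link(links).name == "Wi-Fi"
```

In a real system the selection policy would weigh link quality continuously rather than picking the first healthy path, but the failover principle is the same.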

According to first author Tom Scully, PhD candidate and cybersecurity expert, if a drone is hacked, the impact is not just digital.

“A cyber-attack can interfere with flight controls, disrupt communications, expose sensitive data, and even cause a physical accident.”

The researchers say that many drones still rely on basic communication methods that lack encryption – the digital equivalent of sending sensitive information on an open radio channel. This means that attackers could intercept data, inject false commands or overwhelm the drone’s systems.

The system also includes a next-generation firewall, which works like an advanced security gate. It monitors incoming and outgoing data in real time, blocks suspicious activity, and ensures that only authorised communications are allowed.

Importantly, this firewall runs directly on the drone, rather than relying on remote systems.
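At its simplest, "only authorised communications are allowed" is an allow-list decision. The sketch below illustrates that idea only; the addresses, ports and rule format are invented for the example and do not reflect the actual firewall's rule engine.

```python
# Toy allow-list packet filter illustrating "only authorised
# communications"; the IPs and ports are made up for this example.

ALLOWED = {("10.0.0.2", 5760), ("10.0.0.3", 443)}  # (source IP, port)

def filter_packet(src_ip, port):
    """Return True to forward the packet, False to drop it."""
    return (src_ip, port) in ALLOWED

assert filter_packet("10.0.0.2", 5760) is True      # ground-station telemetry
assert filter_packet("203.0.113.9", 5760) is False  # unknown host blocked
```

A next-generation firewall additionally inspects packet contents and application behaviour, but the allow/deny decision above is the foundation.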

One of the most innovative aspects of the research is the inclusion of malware sandboxing – a technology normally found in large corporate networks – where suspicious files can be opened and examined without risking damage. If malicious behaviour is detected, the system can block it immediately.
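The sandboxing principle, examining a suspicious payload in isolation and blocking flagged behaviour before it reaches the real system, can be caricatured as follows. The command names and detection logic are entirely hypothetical; real sandboxes execute binaries in instrumented virtual environments.

```python
# Highly simplified stand-in for malware sandboxing: interpret a toy
# command payload in isolation and flag dangerous actions instead of
# executing them. Command names are invented for illustration.

DANGEROUS = {"write_flash", "spoof_gps", "open_socket"}

def sandbox_run(payload_commands):
    """Inspect toy commands in isolation; return the list of flagged actions."""
    flagged = []
    for cmd in payload_commands:
        if cmd in DANGEROUS:
            flagged.append(cmd)   # block and record, never execute
    return flagged

assert sandbox_run(["read_imu", "spoof_gps"]) == ["spoof_gps"]
assert sandbox_run(["read_imu"]) == []
```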

The researchers have successfully demonstrated the software on a drone platform, using real-world onboard computing hardware with cloud-based control systems.

The team plans to conduct future trials to further validate the system in real time, potentially supporting its adoption in commercial, emergency and government drone operations.

“Our goal is simple,” Scully says. “As drones become part of everyday life, we need to ensure they are not only smart and autonomous, but also secure, resilient and trustworthy.”




‘The low of the low’: Montreal charity targeted by scam speaks out – Montreal | Globalnews.ca


Dan Payne struggles to raise money to feed the homeless population through his Montreal charity, DIFY.


“The first year I had to pay everything out of my pocket,” he told Global News from his home office in the city’s Notre-Dame-de-Grâce (NDG) neighbourhood.

So he was overjoyed when, a week ago, someone sent an email offering to make a $2,500 donation to the charity. With that amount, Payne said he could feed 3,000 people.

“I’m a good shopper,” he quipped.

After he replied to the would-be donor, the charity got a follow-up email from them with an offer of even more money.

“‘We’re gonna send you actually $5,000 and we’d like you to give the $2,500 to a needy mother with a sick child,’” Payne said, quoting the message.


The retiree said that’s when he became suspicious. A few days afterwards, the charity got a delivery via FedEx — a cheque for $5,000 supposedly issued by a company in Ontario. The document had the company’s name and address, but with a slightly modified logo.


A company representative refused to be interviewed.

After the cheque was delivered, Payne recalled, the scammer contacted him numerous times expressing urgency. That’s a common technique fraudsters use, according to Terry Cutler, who runs cybersecurity company Cyology Labs.


“They use a sense of urgency to make you do stuff that you’re not supposed to do,” he explained. “That’s why it’s important you take a step back, breathe.”

Refusing to take the bait, Payne said he exercised due diligence and contacted a representative at the Ontario company named on the cheque.

“They got back to me and said it was fraudulent,” Payne recalled.

Cutler explained that this kind of cheque fraud has been happening for years. The scammer issues a cheque to the target person or organization, then asks for a refund or, as in Payne’s case, asks for some of the cash to be given to a third party.

The problem is that it takes banks days to clear cheques after deposit, even if the account says the funds are available, Cutler pointed out.


“You go and refund the person, but then you realize shortly after that, that the cheque bounced,” he said.

So the victim loses the cash.

In their 2024 annual report, Montreal police state the number of fraud cases in their territory rose from 2,417 in 2019 to 11,617 in 2024.

“The increase in the number of frauds reported to the SPVM in 2024 can be attributed mainly to more incidents of card fraud (for example, bank card) and computer fraud,” the report states.

Organizations of all stripes are targeted. But community workers like David Chapman, executive director of Resilience Montreal, which serves the unhoused community, stressed that it’s their organizations that have the most to lose.

“Fraudsters picking on community organizations which are working with marginalized populations — it’s pretty much the low of the low,” he stated.

He told Global News that he’s seen a growth in the number of phishing attempts targeting his organization, which has forced staff to be more vigilant.

“You have to spend money to protect yourself from this,” he said.

Those are funds, he and others point out, that could instead be spent on potentially saving someone’s life.


Payne said he reported his incident to the Canadian Anti-Fraud Centre.

© 2026 Global News, a division of Corus Entertainment Inc.


Decoding the Shadows: Vehicle Recognition Software Uncovers Unusual Traffic Behavior | Newswise


Newswise — Researchers at the Department of Energy’s Oak Ridge National Laboratory have developed a deep learning algorithm that analyzes drone, camera and sensor data to reveal unusual vehicle patterns that may indicate illicit activity, including the movement of nuclear materials.

The software monitors routine traffic over time to establish a baseline for “patterns of life,” enabling detection of deviations that could signal something out of place. For example, a surge in overnight truck traffic at a facility that is normally visited only during the day could reveal illegal shipments.
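Baselining "patterns of life" and flagging deviations can be sketched as a simple statistical test: learn the typical count, then flag counts far above it. This is a minimal illustration, not ORNL's method; the traffic numbers and the z-score threshold are invented.

```python
# Sketch of pattern-of-life anomaly detection: compare a current
# overnight traffic count against a historical baseline via a z-score.
# All numbers are invented for illustration.
import statistics

def is_anomalous(history, current, z_threshold=3.0):
    """Flag a count far above the historical baseline."""
    mean = statistics.mean(history)
    std = statistics.stdev(history) or 1.0   # avoid division by zero
    return (current - mean) / std > z_threshold

overnight_trucks = [0, 1, 0, 2, 1, 0, 1]    # typical nights at the facility
assert not is_anomalous(overnight_trucks, 2)   # within normal variation
assert is_anomalous(overnight_trucks, 12)      # surge worth investigating
```

The deep learning system described in the article derives these baselines from recognized vehicles rather than raw counts, but the detection logic follows the same compare-to-baseline shape.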

The research builds on a previous ORNL-developed technology for recognizing specific vehicles from side views. Researchers improved the structure of this software’s deep learning network to provide much broader capabilities than any existing recognition systems, said ORNL’s Sally Ghanem, lead researcher.

“The majority of the current re-identification models require specific views of the car from the same angles. But our model does not have any of these limitations,” Ghanem said. “We can basically put in any view, from any distance, and determine if it is the same vehicle.” That means the top of a truck seen from a drone can be matched with a side view from the ground. 
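View-invariant re-identification models typically map each image to an embedding vector and declare a match when two embeddings are close. The sketch below shows that comparison step with cosine similarity; the embedding values and threshold are hypothetical, since the ORNL network's internals are not described in the article.

```python
# Sketch of cross-view matching via embedding cosine similarity.
# The embedding vectors below are fabricated stand-ins for the output
# of a trained re-identification network.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def same_vehicle(emb_a, emb_b, threshold=0.9):
    """Two views match if their embeddings point in nearly the same direction."""
    return cosine(emb_a, emb_b) >= threshold

top_view  = [0.90, 0.10, 0.40]   # e.g. drone overhead shot of a truck
side_view = [0.88, 0.12, 0.42]   # ground camera, same truck
other_car = [0.10, 0.95, 0.20]

assert same_vehicle(top_view, side_view)
assert not same_vehicle(top_view, other_car)
```

The hard part, which the ORNL work addresses, is training the network so that a top view and a side view of the same vehicle land near each other in embedding space in the first place.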

This precision in recognition was achieved by training the software on hundreds of thousands of publicly available images from surveillance cameras, ground sensors and drones, combined with computer-generated images based on vehicle specifications. ORNL researcher John Holliman built 3D digital models of many car and truck brands, varying the paint jobs, perspectives and lighting conditions to create a wide range of training scenarios. Unlike most vehicle data sets, the ORNL training images also included older vehicle models.

The image set was expanded with footage captured during six data collections around three ORNL campus intersections chosen because vehicles enter and exit by the same route. “We’re using drones to improve the training data because they are very flexible,” Ghanem said. “Drones can circle a vehicle and change their distance to get many angles, so we can simulate images collected from a satellite or at road level.”

To demonstrate that flexibility, ORNL’s Zach Ryan and Jairus Hines piloted a drone hovering 80 feet over the road to ORNL’s High Flux Isotope Reactor, rotating the drone to follow vehicles through turns for multiple perspectives. They also deliberately filmed footage of vehicles partly hidden by tree limbs or traffic lights, and even blurry shots caused by electrical or magnetic interference, since such imperfect images are valuable for training.

“The more low-resolution images we include, the more robust the model,” Ghanem said. Unclear footage and nighttime images train the software to more accurately identify vehicles even when visibility is poor, as in some satellite images.

To avoid bias, Ghanem weeded out repetitive images of the same angle or vehicle type. She also taught the algorithm with both correct and incorrect matches, making sure the correct pairs represented different perspectives. These methods prevent the algorithm from choosing based only on obvious similarities, such as front views of white sedans. “By retraining the model on challenging pairs, we make it more capable of tricky matches,” Ghanem said. 
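"Retraining the model on challenging pairs" is commonly implemented as hard-pair mining: keep the pairs the current model misclassifies or scores near the decision boundary, and emphasize them in the next training round. The sketch below illustrates the selection step only; the scores, labels and margin are fabricated for the example.

```python
# Toy illustration of hard-pair mining: select pairs the current model
# gets wrong or scores near the match/no-match boundary. Scores and
# labels are invented for this example.

def mine_hard_pairs(pairs, margin=0.1, threshold=0.5):
    """Return mislabelled or near-boundary (score, is_match) pairs."""
    hard = []
    for score, is_match in pairs:
        predicted = score >= threshold
        if predicted != is_match or abs(score - threshold) < margin:
            hard.append((score, is_match))
    return hard

pairs = [
    (0.95, True),    # confident, correct match: easy, skip
    (0.52, False),   # model says match but it isn't: hard
    (0.30, True),    # model misses a true match: hard
    (0.05, False),   # confident, correct non-match: easy, skip
]
assert mine_hard_pairs(pairs) == [(0.52, False), (0.30, True)]
```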

After training, the team tested the software against 10,000 image pairs, evenly split between correct and incorrect matches. The system proved more than 97% accurate. 

The software leverages a series of neural networks – computational models that function similarly to the brain – which can be trained to not only match different viewpoints but derive long-term patterns from the results. “The project supports nuclear nonproliferation, enabling us to identify whether shipment activities are happening at a specific place,” Ghanem said. 

But the algorithm is also precise enough to track an individual vehicle with stickers, dents or other distinguishing features across a variety of sensors, flagging repeated visits to the same location even if the vehicle takes different routes each time. Researchers are exploring possibilities for adapting the algorithm to incorporate information from non-visual sensors. It could also be applied to identifying the shipment of dangerous or illegal substances on other forms of transportation, such as ships and airplanes.

ORNL researchers and staff who contributed to the project, which was funded through ORNL’s Laboratory Directed Research and Development program, include Ghanem, John Holliman, Ryan Kerekes, Andrew Duncan, Jairus Hines, Ken Dayman, and former staff member Zach Ryan. The High Flux Isotope Reactor is a DOE Office of Science user facility.

UT-Battelle manages ORNL for the Department of Energy’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.