As drones become more sophisticated and harder to detect, scientists have turned to the hoverfly, also known as the flower fly. For the first time, Australian researchers have reverse engineered the visual system of the hoverfly to detect drones' acoustic signatures from almost four kilometres away.
Autonomous systems experts from the University of South Australia (UniSA), Flinders University and defence company Midspar Systems say that trials using bio-inspired signal processing techniques show up to a 50 per cent better detection rate than existing methods.
The findings, which could help combat the growing global threat posed by IED-carrying drones, including in Ukraine, have been reported in The Journal of the Acoustical Society of America.
UniSA Professor of Autonomous Systems, Anthony Finn, says that insect vision systems have been mapped for some time now to improve camera-based detections, but this is the first time that bio-vision has been applied to acoustic data.
“Bio-vision processing has been shown to greatly increase the detection range of drones in both visual and infrared data. However, we have now shown we can pick up clear and crisp acoustic signatures of drones, including very small and quiet ones, using an algorithm based on the hoverfly’s visual system,” Prof Finn said.
The hoverfly’s superior visual and tracking skills have been successfully modelled to detect drones in busy, complex and obscure landscapes, both for civilian and military purposes.
“Unauthorised drones pose distinctive threats to airports, individuals and military bases. It is therefore becoming ever-more critical for us to be able to detect specific locations of drones at long distances, using techniques that can pick up even the weakest signals. Our trials using the hoverfly-based algorithms show we can now do this,” Prof Finn says.
Associate Professor in Autonomous Systems at Flinders University, Dr Russell Brinkworth, says the ability to both see and hear small drones at greater distances could be hugely beneficial for aviation regulators, safety authorities and the wider public seeking to monitor ever-increasing numbers of autonomous aircraft in sensitive airspace.
“We’ve witnessed drones entering airspace where commercial airlines are landing and taking off in recent years, so developing the capacity to actually monitor small drones when they’re active near our airports or in our skies could be extremely beneficial towards improving safety.”
“The impact of UAVs in modern warfare is also becoming evident during the war in Ukraine, so keeping on top of their location is actually in the national interest. Our research aims to extend the detection range considerably as the use of drones increases in the civilian and military space.”
Compared with traditional techniques, bio-inspired processing improved detection ranges by between 30 and 49 per cent, depending on the type of drone and the conditions.
Similar conditions exist in the natural world. Dimly lit regions are visually very noisy, but insects such as the hoverfly have powerful visual systems that can extract clear signals from them, researchers say.
“We worked under the assumption that the same processes which allow small visual targets to be seen amongst visual clutter could be redeployed to extract low volume acoustic signatures from drones buried in noise,” Dr Brinkworth says.
By converting acoustic signals into two-dimensional 'images' (called spectrograms), the researchers applied a model of the hoverfly's visual neural pathways to enhance the drone signatures and suppress unrelated signals and noise, increasing the detection range for the sounds they wanted to detect.
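The hoverfly model itself is not spelled out in the article, but the first step it describes, turning sound into a two-dimensional 'image', is standard signal processing. A minimal sketch in Python follows; the `spectrogram` helper, its window and hop sizes, and the toy 'drone tone' signal are all illustrative assumptions, not the researchers' code:

```python
import numpy as np

def spectrogram(audio, frame_len=256, hop=128):
    """Convert a 1-D audio signal into a 2-D magnitude spectrogram.

    Each column is the FFT magnitude of one Hann-windowed frame, so
    the result can be treated as an 'image' (frequency x time) and fed
    to image-style processing.
    """
    window = np.hanning(frame_len)
    n_frames = 1 + (len(audio) - frame_len) // hop
    frames = np.stack([
        audio[i * hop: i * hop + frame_len] * window
        for i in range(n_frames)
    ])
    # Keep only the non-negative frequencies (real-input FFT).
    return np.abs(np.fft.rfft(frames, axis=1)).T

# Toy example: a quiet 1 kHz 'drone' tone buried in broadband noise.
fs = 8000                                   # 1 second at 8 kHz
t = np.arange(fs) / fs
audio = 0.05 * np.sin(2 * np.pi * 1000 * t) + np.random.randn(fs)
spec = spectrogram(audio)
print(spec.shape)                           # frequency bins x time frames
```

Real pipelines would typically use an overlapping short-time Fourier transform from a signal-processing library; the point here is only that sound becomes an image on which vision-inspired filtering can operate.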
Drawing on their image-processing and sensing expertise, the researchers made this bio-inspired acoustic breakthrough with Federal Government funding through the Department of Defence's Next Generation Technologies Fund.
The funding partly supports technological solutions to the weaponisation of drones, which are now among the deadliest weapons in modern warfare, having killed or injured more than 3,000 enemy combatants in Afghanistan and now being deployed in the war in Ukraine.
Researchers have created a new model for detecting drones at long range by analysing how hoverflies see the world, and they report a 50 per cent improvement in detection range. It is just the latest urgent reminder that, as drones get stealthier, drone detection technology is struggling to keep up.
Now researchers from the University of South Australia and Flinders University are reporting a marked improvement in detection ranges, thanks to an unlikely source: the hoverfly. By painstakingly measuring and modelling the neurology of the hoverfly's vision system, they have built an algorithm that extends detection ranges by up to 50 per cent, they say.
The findings were published in The Journal of the Acoustical Society of America. For well over a decade, Russell Brinkworth, an expert in autonomous systems at Flinders University, has been assembling a model of the hoverfly’s brain. But why the hoverfly?
“Flies are much smaller and less complicated than people,” he said. “Also, flies are really good at flying. Flies are so good at flying, we call them flies. So if you want to know how to fly, looking at a fly for inspiration is a good idea.”
Flies’ aeronautical flair partly rests on their rapid visual processing of the world around them. Though their brains are small, each one still has a million neurons. And these neurons are in turn tiny — smaller than can be resolved under a light microscope. “They’re smaller than you could possibly see,” Dr Brinkworth said.
To map the brain, the researchers used a probe that had to be even smaller than a neuron, and could only record the activity of one neuron at a time. “And so you have to hit an invisible target with an invisible spear tip. It’s very difficult.”
Over many years, Dr Brinkworth and his colleagues shined lights in flies’ eyes and recorded the response of individual neurons. Eventually, they had a mathematical model of how hoverfly brains process visual signals, said Anthony Finn, an expert on sensor processing at the University of South Australia.
“Essentially what we’re talking about here is an algorithm,” he said. The next step was to see if the visual model worked with acoustic signals.
“We took the signal processing concepts of the vision system and then applied them to the acoustic area,” he said.
“Instead of processing, say, an individual pixel, you are processing an individual frequency channel.”
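The per-channel idea Prof Finn describes can be illustrated with a toy sketch: treat each frequency channel of the spectrogram as a 'pixel' stream whose recent level is continually adapted away, loosely as insect photoreceptors do. The `adapt_channels` function and its divisive-normalisation scheme below are assumptions for illustration only, not the team's published algorithm:

```python
import numpy as np

def adapt_channels(spec, tau=8.0, eps=1e-6):
    """Apply photoreceptor-like temporal adaptation to each frequency
    channel of a spectrogram, treating the channel as a 'pixel' stream.

    Each channel is divided by a running (low-pass) estimate of its own
    recent level, so a steady background noise floor is flattened while
    changes -- such as a drone tone switching on -- stand out.
    """
    alpha = 1.0 / tau                       # smoothing factor of the low-pass
    level = np.full(spec.shape[0], eps)     # per-channel running level
    out = np.empty_like(spec, dtype=float)
    for t in range(spec.shape[1]):
        level = (1 - alpha) * level + alpha * spec[:, t]
        out[:, t] = spec[:, t] / (level + eps)  # divisive normalisation
    return out

# Toy input: a flat noise floor, plus a tone that appears halfway
# through in channel 2.
spec = np.ones((4, 100))
spec[2, 50:] += 5.0
enhanced = adapt_channels(spec)
# Channels carrying only the steady floor settle near 1, while the
# tone's onset produces a strong transient in its own channel.
```

The design choice this illustrates is local, self-referenced gain control: each channel is compared only with its own history, so weak signatures are not drowned out by louder activity elsewhere in the spectrum.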
– The writer is a senior journalist and media consultant. The views expressed are of the writer and do not necessarily reflect the views of Raksha Anirveda