A dragonfly's eyes and brain are the inspiration for a new machine vision system with applications in surveillance, wildlife monitoring and smart cars.
Mechanical engineering PhD student Zahra Bagheri at the University of Adelaide in South Australia says that despite having low visual acuity and brains no bigger than a grain of rice, dragonflies are remarkably good at tracking prey.
"They're not like mammals which have developed very good brains, and they have very low resolution eyes compared to other animals, but they can catch their prey more than 97 per cent of the time while they're moving at very high speeds in very cluttered environments," Bagheri says.
"That means they have adopted very efficient methods for target tracking."
Bagheri is part of a team of engineers and neuroscientists who have used those methods to develop a machine vision algorithm that can be applied in a virtual reality simulation, allowing an artificial intelligence system to 'pursue' an object.
Her project is a combination of neuroscience, mechanical engineering and computer science, building on years of research into insect vision already undertaken at the University of Adelaide.
Zahra Bagheri and Benjamin Cazzolato with the robot that will use the newly developed machine vision algorithm.
"Detecting and tracking small objects against complex backgrounds is a highly challenging task. Consider a cricket or baseball player trying to take a match-winning catch in the outfield," Bagheri explains.
"They have seconds or less to spot the ball, track it and predict its path as it comes down against the brightly coloured backdrop of excited fans in the crowd - all while running or even diving towards the point where they predict it will fall!"
This is known as selective attention. Dr Steve Wiederman is leading the dragonfly project, and conducted the original research recording the responses of neurons in the dragonfly brain.
"Selective attention is fundamental to humans' ability to select and respond to one sensory stimulus in the presence of distractions," Dr Wiederman says.
"Precisely how this works in biological brains remains poorly understood, and this has been a hot topic in neuroscience in recent years," he says.
"The dragonfly hunts for other insects, and these might be part of a swarm - they're all tiny moving objects. Once the dragonfly has selected a target, its neuron activity filters out all other potential prey."
The team has emulated that ability with their algorithm. Rather than trying to perfectly centre the target in its field of view, Bagheri says the system locks on to the background and lets the target move against it.
"This reduces distractions from the background and gives time for underlying brain-like motion processing to work. It then makes small movements of its gaze and rotates towards the target to keep the target roughly frontal," Bagheri says.
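The strategy Bagheri describes can be sketched in code. The sketch below is a minimal illustration, not the team's published model: it assumes a simple block-matching estimate of global background motion (standing in for whatever stabilisation the real system uses), treats the strongest residual difference after background cancellation as the target, and nudges the gaze a fraction of the way toward it rather than centring it perfectly. All function names and parameters here are hypothetical.

```python
import numpy as np

def estimate_background_shift(prev, curr, max_shift=3):
    """Estimate global (background) motion as the integer pixel shift
    that best aligns the previous frame with the current one."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(prev, dy, axis=0), dx, axis=1)
            err = np.mean((shifted - curr) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

def detect_target(prev, curr, bg_shift):
    """Cancel background motion, then take the strongest residual
    difference as the location of the small moving target."""
    dy, dx = bg_shift
    stabilised = np.roll(np.roll(prev, dy, axis=0), dx, axis=1)
    residual = np.abs(curr - stabilised)
    return np.unravel_index(np.argmax(residual), residual.shape)

def gaze_update(gaze, target, gain=0.3):
    """Rotate the gaze a small step toward the target, keeping it
    roughly frontal instead of perfectly centred."""
    return tuple(g + gain * (t - g) for g, t in zip(gaze, target))
```

Locking on to the background first means the target stands out as the only thing still moving, which is why even a low-resolution residual check suffices to find it.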
Because the algorithm is modelled on a dragonfly's small brain and limited vision, it can rival both the insect's own abilities and those of far more elaborate machine vision systems, all with relatively low computational complexity.
"It's shown that we can do it with very low resolution cameras and very limited computational resources. It doesn't need high-performance computers or anything like that."
This bio-inspired "active vision" system has been tested in virtual reality worlds composed of various natural scenes. The Adelaide team has found that it performs as robustly as state-of-the-art engineering target-tracking algorithms, while running up to 20 times faster.
"We are hoping to test it on a robot - we're working on that right now. It has diverse applications. It can be used in surveillance, wildlife monitoring, smart cars and even bionic vision."
Bagheri is lead author of the paper, titled "Properties of Neuronal Facilitation that Improve Target Tracking in Natural Pursuit Simulations", which was published this week in the Journal of the Royal Society Interface.