New theory allows drones to see distances with one eye

News - 07 January 2016 - Webredactie Communication

A new theory has been published that allows drones to see distances with a single camera. Assistant Professor Guido de Croon from TU Delft’s Micro Air Vehicle Laboratory has found that drones approaching an object with an insect-inspired vision strategy become unstable at a specific distance from the object. Turning this weakness into a strength, drones can actually use the timely detection of that instability to estimate distance. The new theory, published online on 7 January in the journal Bioinspiration & Biomimetics, will enable further miniaturisation of autonomous drones and provides a new hypothesis on flying insect behaviour.

In an effort to make ever-smaller drones navigate by themselves, researchers are increasingly turning to flying insects for inspiration. For example, current consumer drones for indoor flight use an insect-inspired strategy to estimate their velocity. The strategy uses a downward-looking camera to determine ‘optical flow’ – the speed at which objects move through the camera’s field of view.

Optical flow only provides information on the ratio between distance and velocity. Hence, an additional sonar is generally added to indoor flying drones. The sonar determines the drone’s distance to the ground, after which its velocity can be calculated from the optical flow. With the newly-proposed theory, the sonar can be abandoned altogether, making it possible for indoor flying drones to become even smaller.
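To make the relationship concrete: the downward camera measures the ratio of velocity to height, so an extra distance measurement is needed to turn it into an actual velocity. The sketch below uses hypothetical sensor readings, purely as illustration of that bookkeeping:

```python
# Minimal sketch with hypothetical sensor readings: the camera's optical flow
# is the ratio velocity / height, so an extra distance measurement (here from
# a sonar) is needed to recover the actual velocity.

optical_flow = 0.5   # measured flow in 1/s (velocity divided by height)
sonar_height = 2.0   # height above the ground in metres, from the sonar

velocity = optical_flow * sonar_height
print(f"Estimated velocity: {velocity:.2f} m/s")   # 1.00 m/s
```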


Soft landings

Although flying insects such as honey bees have two compound eyes, these are placed so close together that they cannot use stereo vision to estimate distances for navigation purposes. Bees therefore rely heavily on the ‘distanceless’ cue of optical flow and lack the sensors to retrieve the actual distance to objects in their environment.

So how do insects navigate successfully? Previous research has shown that simple optical flow control laws enable safe navigation. For instance, keeping the optical flow constant while going down should ensure that a drone makes a soft landing. However, implementing this strategy with real drones turned out to be very difficult.  
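The control law itself is simple: command a descent speed proportional to the remaining height so that the flow stays at its set-point. The toy simulation below, with made-up numbers rather than the researchers’ implementation, shows why this yields a soft touchdown.

```python
# Toy simulation (hypothetical parameters, not the paper's controller): a drone
# that holds the optical flow v/h at a fixed set-point descends ever more
# slowly, because the commanded velocity is proportional to the height.

flow_setpoint = 0.5   # desired optical flow in 1/s
height = 4.0          # starting height in metres
dt = 0.05             # simulation time step in seconds
t = 0.0

while height > 0.01:                      # stop one centimetre above the ground
    velocity = flow_setpoint * height     # keep v/h constant
    height -= velocity * dt
    t += dt

print(f"Reached 1 cm after {t:.1f} s at {velocity * 100:.1f} cm/s")
```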

“This research was born of frustration with being unable to recreate fast, smooth optical flow landings. The drones would always start to oscillate up and down close to the end of the landing,” says de Croon. “At first I thought it had to do with issues such as the computer vision algorithms not working well enough when close to the ground, but later I discovered that the effect is still there even when you have perfect vision.” 

Using instability

In fact, theoretical analysis of drone control laws showed that a robot that is trying to keep optical flow constant starts to oscillate at a specific distance from the landing surface. The oscillations are induced by the robot itself, because movements close to the ground have a much larger and faster effect on optical flow than at greater distances. The key idea, then, is that timely detection of such oscillations tells the drone how far it is from the surface.
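As an illustration of that mechanism – a conceptual sketch with invented gains and a deliberately crude vehicle model, not the algorithm from the article – a fixed-gain controller that regulates optical flow descends smoothly at altitude but starts to oscillate once the height drops below a gain-dependent threshold. Spotting the onset of those oscillations then gives a rough height estimate:

```python
# Conceptual sketch, not the article's algorithm: with a fixed feedback gain,
# regulating the optical flow v/h is well behaved at altitude but becomes
# oscillatory close to the ground, because the same velocity correction changes
# the flow much more strongly when the height is small. Detecting the onset of
# that oscillation therefore reveals roughly how high the drone is.

flow_setpoint = 1.0   # desired optical flow in 1/s (hypothetical)
gain = 30.0           # fixed feedback gain (hypothetical)
dt = 0.02             # control step in seconds
height, velocity = 4.0, 0.0
errors = []

for step in range(5000):
    if height <= 0.05:
        break                                  # reached the ground undetected
    error = velocity / height - flow_setpoint  # optical-flow tracking error
    velocity -= gain * error * dt              # simple proportional correction
    height -= velocity * dt
    errors.append(error)

    # Oscillation detector: many sign flips of the flow error in a short window.
    window = errors[-30:]
    flips = sum(1 for a, b in zip(window, window[1:]) if a * b < 0)
    if len(window) == 30 and flips > 10:
        print(f"Oscillation detected at roughly {height:.2f} m above the surface")
        break
```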

Image: Two landing trajectories over time, starting from 4 metres. The light blue drone has stronger reactions to optical flow deviations than the magenta drone, and hence starts to oscillate further away from the landing surface.

“What really makes me smile is that the robot exploits the impending instability of its control system to see distances so that it can, for instance, determine when to switch off its propellers. Over the last few months, I’ve been receiving strange looks in the flight arena while cheering a flying robot that seemed to be on the verge of losing control,” says de Croon.

A closer look at the biological literature showed that flying insects do exhibit behaviours that are triggered at specific distances; for example, honey bees always start to hover at a certain distance from a landing surface. The new theory provides a hypothesis as to how insects might see the distances that induce such behaviours.

More information 
Article ‘Monocular distance estimation with optical flow maneuvers and efference copies: a stability-based strategy’, Bioinspiration & Biomimetics, published online on 7 January 2016.

Background information and images 
Website of the TU Delft Micro Air Vehicle Laboratory.
Website of Guido de Croon: ‘Artificial Intelligence for Small Autonomous Robots’.

Contact 
Guido de Croon: G.C.H.E.deCroon@tudelft.nl, +31 (0)15 278 1402. 
Ilona van den Brink (science information officer): i.vandenbrink@tudelft.nl, +31 (0)15 278 4259. 
