
A visually guided flight robot inspired by the fly


responsibility for wording of article: Akira Takashima (OIST)


A great deal of research has been conducted on robots inspired by the nervous systems of arthropods, including insects. Both insects and humans can be regarded as “vehicles” capable of moving autonomously through a complex environment. One challenge in treating insects from a robotics viewpoint is to demonstrate clearly how movement is guided by vision. Insects have an extensive behavioral repertoire and, using excellent sensors and limited computational resources, can respond to an unpredictable environment. Flying insects in particular, such as flies and dragonflies, are far more agile than vertebrates or modern mobile robots; the sensory–motor control system of an insect is an engineering masterpiece. In what follows, we describe the features of visually guided flight robots inspired by the fly.

In 1991, Franceschini et al. created a fully autonomous mobile robot, inspired by the fly, that detected optic flow (OF) with a planar compound eye and an array of elementary motion detectors (EMDs). This robot, named “Fly,” senses the outside world through a ring of ommatidia. Each pair of adjacent ommatidia drives a single EMD, and the 114 EMDs together analyze the OF in the azimuthal plane. The compound eye of “Fly” has a built-in resolution gradient: the angular spacing of the ommatidia increases according to a fixed rule. This gradient, incorporated in the anatomical structure of the eye, ensures that the contrast of objects entering the robot’s field of view during translational motion is detected, because the angle between adjacent ommatidia is matched to the distance at which objects are expected to appear. “Fly” processed analog signals in parallel, just like a brain, and could weave its way around inanimate obstacles.
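The EMD principle behind such a robot is commonly modeled as a correlation-type (Reichardt) detector: each photoreceptor signal is delayed and multiplied with its neighbor’s undelayed signal, and the two mirror-symmetric half-detectors are subtracted. The sketch below is an illustration of that scheme, not the robot’s actual circuitry; the signal shapes and the discrete delay are simplifying assumptions.

```python
import math

def emd_response(left, right, delay):
    """Correlation-type (Reichardt) EMD: the delayed signal of one
    photoreceptor is multiplied with the neighbour's current signal,
    and the two mirror-symmetric half-detectors are subtracted.
    A positive mean response indicates motion from 'left' to 'right'."""
    n = len(left)
    resp = 0.0
    for t in range(delay, n):
        half1 = left[t - delay] * right[t]   # preferred direction
        half2 = right[t - delay] * left[t]   # null direction
        resp += half1 - half2
    return resp / (n - delay)

def photoreceptor(phase_offset, n=200, speed=0.1):
    # brightness seen by one receptor as a sinusoidal grating drifts past
    return [math.sin(speed * t + phase_offset) for t in range(n)]

# a grating drifting from the left receptor towards the right receptor:
# the right receptor sees the same pattern slightly later in time
left = photoreceptor(0.0)
right = photoreceptor(-0.3)
```

With these signals, `emd_response(left, right, 3)` is positive (motion in the preferred direction), and swapping the inputs flips the sign, which is exactly the directional selectivity the robot’s EMD array exploits.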

The same motion-detection principle has also been used to guide flying agents. FANIA, a semi-constrained miniature helicopter with a variable-pitch rotor, can fly over rough terrain, follow it, and land on it without any information about its ground speed or the surface contours. Its eye is equipped with 19 EMDs, whose signals it uses to avoid the ground; in trials in a circular arena it was able to fly over contrasting obstacles.

Franceschini et al. also formalized the OF perceived by flying organisms and constructed an autopilot called an OF regulator. A miniature helicopter (OCTAVE) equipped with this autopilot could take off, fly, and land smoothly in trials over various terrains in a circular arena. It also reproduced insect responses remarkably well when buffeted by wind.

The fly-inspired autopilot differs greatly from conventional autopilots designed by humans. A conventional autopilot is designed to keep an airplane at a constant speed and a constant altitude, and it requires many large, expensive sensors. In contrast, OCTAVE avoids colliding with the ground simply by regulating the optic flow it perceives, without measuring speed or altitude directly.
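The core idea of an OF regulator can be sketched as a simple feedback loop: the ventral optic flow is the ratio of ground speed to height, and the error between this measurement and a setpoint drives the climb rate. The function names, gains, and time step below are illustrative assumptions, not OCTAVE’s actual control law.

```python
def of_regulator_step(altitude, ground_speed, of_setpoint, gain=0.5, dt=0.1):
    """One step of a minimal optic-flow regulator (illustrative parameters):
    the ventral optic flow omega = v / h is compared with a setpoint, and
    the error drives the climb rate. No explicit speed or altitude sensor
    is assumed -- only the ratio v / h, as an EMD array would report it."""
    of_measured = ground_speed / altitude        # ventral optic flow (rad/s)
    climb_rate = gain * (of_measured - of_setpoint)
    return altitude + climb_rate * dt

# flying over flat ground, the loop settles at the altitude where the
# ventral optic flow equals the setpoint: h = v / of_setpoint
h = 1.0
for _ in range(500):
    h = of_regulator_step(h, ground_speed=2.0, of_setpoint=1.0)
# h converges towards 2.0 (where 2.0 / 2.0 = 1.0 rad/s)
```

Note what this loop implies: if the craft speeds up, the regulated altitude rises proportionally, and if the ground rises, the craft climbs to restore the setpoint, so terrain following emerges without any altimeter.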

The retina of a flying fly performs microscanning: it moves back and forth by a few micrometers about five times a second. Franceschini et al. hypothesized that microscanning is involved in motion detection and, on this basis, built two types of robots (SCANIA and OSCAR) whose visual sensing relies only on the OF generated by this scanning motion. SCANIA has only low-resolution eyes, yet using its vision it could avoid the walls of the rectangular arena in which it was placed. OSCAR, a robot with two propellers suspended from the ceiling of the test chamber, could move its head freely; equipped with microscanning, it could fix its line of sight on bars and other nearby objects. It could track a target through angles of up to 30°, and even when buffeted by the pendulum motion of its hanging strap, floor vibrations, or wind gusts, it could hold its gaze. This feature would be very useful for real helicopters that must detect transmission lines without relying on radar or lasers.

The bio-robot approach outlined here paves the way for designing new devices and machinery, particularly inexpensive sensorimotor control systems for micro-vehicles. It also provides useful feedback in the fields of neuroethology and neurophysiology.


Further Reading

Insect Mimetics (昆虫ミメティックス) (2008), Takahiko Hariyama and Tateo Shimozawa (eds.), pp. 866–877


Copyright (cc) 2005-2014 RIKEN BSI Neuroinformatics Japan Center. Some rights reserved.