Sensing visual motion gives a creature valuable information about its interactions with the environment. Flies in particular use visual motion information to navigate through turbulent air, avoid obstacles, and land safely. Mobile robots are ideal candidates for using this sensory modality to enhance their performance, but so far have been limited by the computational expense of processing video. Also, the complex structure of natural visual scenes poses an algorithmic challenge for extracting useful information in a robust manner. We address both issues by creating a small, low-power visual sensor with integrated analog parallel processing to extract motion in real time. Because our architecture is based on biological motion detectors, we gain the advantages of this highly evolved system: a design that robustly and continuously extracts relevant information from its visual environment. We show that this sensor is suitable for use in the real world, and demonstrate its ability to compensate for an imperfect motor system in the control of an autonomous robot. The sensor attenuates open-loop rotation by a factor of 31 with less than 1 mW power dissipation.
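The abstract does not specify which biological motion detector the architecture follows; a standard model consistent with the description is the correlation-type (Hassenstein-Reichardt) elementary motion detector. The Python sketch below is purely illustrative and not taken from the paper: it simulates such a detector in software to show how correlating a delayed photoreceptor signal with its undelayed neighbor yields a direction-selective output. All names and parameters here (low_pass, reichardt_emd, tau, the drifting-grating stimulus) are assumptions made for the demonstration.

```python
import numpy as np

def low_pass(signal, dt, tau):
    """First-order low-pass filter, a simple stand-in for the delay stage."""
    out = np.zeros_like(signal)
    alpha = dt / (tau + dt)
    for i in range(1, len(signal)):
        out[i] = out[i - 1] + alpha * (signal[i] - out[i - 1])
    return out

def reichardt_emd(left, right, dt, tau=0.05):
    """Correlation-type elementary motion detector (illustrative model):
    each photoreceptor signal is delayed and multiplied with its undelayed
    neighbor; the two mirror-symmetric products are subtracted, so the sign
    of the time-averaged output indicates the direction of motion."""
    return low_pass(left, dt, tau) * right - left * low_pass(right, dt, tau)

# Stimulus: a sinusoidal grating drifting past two adjacent photoreceptors.
dt = 1e-3                       # 1 ms time step
t = np.arange(0.0, 1.0, dt)
spatial_phase = 0.5             # phase offset between the two photoreceptors (rad)
omega = 2 * np.pi * 4           # 4 Hz temporal frequency

for direction in (+1, -1):
    left = np.sin(omega * t)
    right = np.sin(omega * t - direction * spatial_phase)
    response = reichardt_emd(left, right, dt)
    print(f"direction {direction:+d}: mean EMD output = {response.mean():+.3f}")
```

Running this prints a positive mean output for one drift direction and a negative mean for the other, which is the property the paper exploits: the sign of the pooled detector output can drive a rotation-compensating controller without any frame-based video processing.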