Hardware implementation of a visual-motion pixel using oriented spatiotemporal neural filters

Citation
R. Etienne-Cummings et al., Hardware implementation of a visual-motion pixel using oriented spatiotemporal neural filters, IEEE CIR-II, 46(9), 1999, pp. 1121-1136
Citations number
19
Subject categories
Electrical & Electronics Engineering
Journal title
IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-ANALOG AND DIGITAL SIGNAL PROCESSING
ISSN journal
1057-7130
Volume
46
Issue
9
Year of publication
1999
Pages
1121 - 1136
Database
ISI
SICI code
1057-7130(199909)46:9<1121:HIOAVP>2.0.ZU;2-A
Abstract
A pixel for measuring two-dimensional (2-D) visual motion with two one-dimensional (1-D) detectors has been implemented in very large scale integration. Based on the spatiotemporal feature extraction model of Adelson and Bergen, the pixel is realized using a general-purpose analog neural computer and a silicon retina. Because the neural computer offers only sum-and-threshold neurons, Adelson and Bergen's model is modified: the quadratic nonlinearity is replaced with full-wave rectification, while the contrast normalization is replaced with edge detection and thresholding. Motion is extracted in two dimensions by using two 1-D detectors with spatial smoothing orthogonal to the direction of motion. Analysis shows that our pixel, although it has some limitations, has much lower hardware complexity than the full 2-D model. It also produces more accurate results and has a reduced aperture problem compared to the two-1-D model with no smoothing. Real-time velocity is represented as a distribution of activity across the 18 X and 18 Y velocity-tuned neural filters.
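
The modification described in the abstract can be illustrated in software. The following is a minimal sketch, assuming a 1-D opponent motion-energy detector in the style of Adelson and Bergen in which the squaring stage is replaced by full-wave rectification, as the abstract states; the Gabor and low-pass filter shapes, the parameters sigma, freq and tau, and the names gabor_pair, temporal_pair and opponent_motion_1d are illustrative assumptions and do not reproduce the paper's analog circuitry or its edge-detection and thresholding front end.

import numpy as np

def gabor_pair(x, sigma=2.0, freq=0.25):
    # Quadrature pair of spatial filters (even- and odd-symmetric Gabors).
    env = np.exp(-x**2 / (2 * sigma**2))
    return env * np.cos(2 * np.pi * freq * x), env * np.sin(2 * np.pi * freq * x)

def temporal_pair(t, tau=1.0):
    # Fast and slow low-pass temporal impulse responses (simplified forms).
    fast = (t / tau) ** 3 * np.exp(-t / tau)
    slow = (t / tau) ** 5 * np.exp(-t / tau)
    return fast, slow

def opponent_motion_1d(stimulus, x, t):
    # Build oriented spatiotemporal filters from separable space/time pairs
    # and form an opponent response. The quadratic nonlinearity of the
    # original energy model is replaced here by full-wave rectification
    # (abs), following the modification described in the abstract.
    even, odd = gabor_pair(x)
    fast, slow = temporal_pair(t)
    a1 = np.outer(fast, even) + np.outer(slow, odd)
    a2 = np.outer(fast, odd) - np.outer(slow, even)
    b1 = np.outer(fast, even) - np.outer(slow, odd)
    b2 = np.outer(fast, odd) + np.outer(slow, even)

    def respond(filt):
        # Rectified inner product of one oriented filter with the stimulus.
        return np.abs(np.sum(filt * stimulus))

    return (respond(a1) + respond(a2)) - (respond(b1) + respond(b2))

# Toy usage: drifting sinusoids in opposite directions. The opponent output
# changes sign with the direction of drift; which sign corresponds to which
# direction depends on the filter sign conventions chosen above.
x = np.arange(-8.0, 8.0 + 0.25, 0.25)
t = np.arange(0.0, 8.0 + 0.1, 0.1)
stim_right = np.cos(2 * np.pi * 0.25 * (x[None, :] - 1.5 * t[:, None]))
stim_left  = np.cos(2 * np.pi * 0.25 * (x[None, :] + 1.5 * t[:, None]))
print(opponent_motion_1d(stim_right, x, t), opponent_motion_1d(stim_left, x, t))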