2D FEATURE TRACKING ALGORITHM FOR MOTION ANALYSIS

Citation
S. Krishnan and D. Raviv, 2D FEATURE TRACKING ALGORITHM FOR MOTION ANALYSIS, Pattern Recognition, 28(8), 1995, pp. 1103-1126
Citations number
40
Subject Categories
Computer Science, Special Topics; Engineering, Electrical & Electronic; Computer Science, Artificial Intelligence
Journal title
Pattern Recognition
ISSN journal
00313203
Volume
28
Issue
8
Year of publication
1995
Pages
1103 - 1126
Database
ISI
SICI code
0031-3203(1995)28:8<1103:2FTAFM>2.0.ZU;2-6
Abstract
In this paper, we describe a local-neighborhood, pixel-based adaptive algorithm to track image features, both spatially and temporally, over a sequence of monocular images. The algorithm assumes no a priori knowledge about the image features to be tracked, or the relative motion between the camera and the three-dimensional (3D) objects. The features to be tracked are selected by the algorithm and correspond to the peaks of a 'correlation surface' constructed from a local neighborhood in the first image of the sequence to be analysed. Any kind of motion, i.e., 6 DOF (translation and rotation), can be tolerated, keeping in mind the pixels-per-frame motion limitations. No subpixel computations are necessary. Taking into account constraints of temporal continuity, the algorithm uses simple and efficient predictive tracking over multiple frames. Trajectories of features on multiple objects can also be computed. The algorithm accepts a slow, continuous change of brightness D.C. level in the pixels of the feature. Another important aspect of the algorithm is the use of an adaptive feature matching threshold that accounts for change in relative brightness of neighboring pixels. As applications of the feature tracking algorithm, and to test the accuracy of the tracking, we show how the algorithm has been used to extract the Focus of Expansion (FOE) and to compute the time-to-contact using real image sequences of unstructured, unknown environments. In both applications information from multiple frames is used.
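The sketch below is not the authors' code; it only illustrates, under assumed parameters, the two ideas the abstract names: selecting a feature as a sharp peak of a local 'correlation surface', and predictively matching it in the next frame with a zero-mean normalized correlation score gated by an adaptive threshold (the mean subtraction is what tolerates a slow change in the D.C. brightness level). Patch and search sizes and the threshold-relaxation rule are assumptions for illustration.

import numpy as np

def patch_at(img, r, c, half):
    # Square (2*half+1) neighborhood centred at (r, c); caller keeps (r, c) away from borders.
    return img[r - half:r + half + 1, c - half:c + half + 1].astype(float)

def ncc(a, b):
    # Zero-mean normalized cross-correlation; insensitive to a D.C. brightness offset.
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12
    return float((a * b).sum() / denom)

def correlation_surface(img, r, c, half=3, search=4):
    # Correlate the patch at (r, c) against shifted copies of itself.
    # A distinctive feature shows a sharp, isolated peak at zero shift.
    ref = patch_at(img, r, c, half)
    surf = np.empty((2 * search + 1, 2 * search + 1))
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            surf[dr + search, dc + search] = ncc(ref, patch_at(img, r + dr, c + dc, half))
    return surf

def track(prev_img, next_img, pos, pred=(0, 0), half=3, search=4, thresh=0.8):
    # Predictive matching: the search window is centred on the position
    # predicted from the last displacement (temporal continuity).  The best
    # match is accepted only if its score beats the adaptive threshold,
    # which is relaxed slightly after a rejection (assumed rule).
    ref = patch_at(prev_img, pos[0], pos[1], half)
    pr, pc = pos[0] + pred[0], pos[1] + pred[1]
    best_score, best_pos = -1.0, None
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            score = ncc(ref, patch_at(next_img, pr + dr, pc + dc, half))
            if score > best_score:
                best_score, best_pos = score, (pr + dr, pc + dc)
    if best_score >= thresh:
        return best_pos, thresh                 # accepted: keep threshold
    return None, max(0.6, thresh - 0.05)        # rejected: relax threshold

In use, one would pick the strongest correlation-surface peaks in the first frame, then call track() frame by frame, feeding each accepted displacement back in as the prediction for the next frame; the resulting trajectories are the input to the FOE and time-to-contact computations mentioned in the abstract.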