WHAT CAN BE SEEN IN A NOISY OPTICAL-FLOW FIELD PROJECTED BY A MOVING PLANAR PATCH IN 3D SPACE

Authors
S.C. Pei and L.G. Liou
Citation
S.C. Pei and L.G. Liou, WHAT CAN BE SEEN IN A NOISY OPTICAL-FLOW FIELD PROJECTED BY A MOVING PLANAR PATCH IN 3D SPACE, Pattern Recognition, 30(9), 1997, pp. 1401-1413
Citations number
28
Subject Categories
Computer Sciences, Special Topics; Engineering, Electrical & Electronic; Computer Science, Artificial Intelligence
Journal title
Pattern Recognition
ISSN journal
00313203
Volume
30
Issue
9
Year of publication
1997
Pages
1401 - 1413
Database
ISI
SICI code
0031-3203(1997)30:9<1401:WCBSIA>2.0.ZU;2-#
Abstract
In this paper we propose a new interpretation of the so-called "structure-from-motion" (SFM) problem. The optical flow field projected by a moving rigid planar patch in 3D space is our main consideration. Instead of merely obtaining an explicit 3D motion/pose solution, as previous approaches did, we focus our attention on analyzing its error sensitivity, uncertainty, and ambiguity from another point of view. Our new method handles this error analysis easily. As is well known, the optical flow field projected by a 3D moving planar patch can be completely expressed by eight coefficients (two second-order, four first-order, and two zeroth-order). Based on these flow coefficients, easily determined by a linear regression method or similar approaches, the error sensitivity of the 3D estimates can be analyzed quantitatively and qualitatively in a coarse-to-fine way. The concepts of camera fixation and singular value decomposition (SVD) play important roles in our analysis. Our experiments have three goals: (1) to verify the correctness of the algorithms (simulated images); (2) to show the tendency of the error sensitivity when the 3D poses of the target planar patch are varied in a controlled manner (simulated images); (3) to show that our analysis works in real-world applications (real-world images). (C) 1997 Pattern Recognition Society. Published by Elsevier Science Ltd.
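The abstract refers to the well-known eight-coefficient parameterization of the flow field induced by a rigid planar patch and notes that the coefficients can be recovered by linear regression. The following is a minimal sketch, in Python with NumPy, of one common form of that model and a least-squares fit; the function name, coefficient ordering, and exact parameterization are illustrative assumptions rather than the paper's notation, and the singular values of the design matrix are printed only as a rough indicator of conditioning, loosely echoing the abstract's mention of SVD-based sensitivity analysis.

```python
import numpy as np

# Assumed eight-parameter quadratic flow model of a planar patch:
#   u(x, y) = a1 + a2*x + a3*y + a7*x**2 + a8*x*y
#   v(x, y) = a4 + a5*x + a6*y + a7*x*y  + a8*y**2
# i.e. two zeroth-order (a1, a4), four first-order (a2, a3, a5, a6),
# and two second-order (a7, a8) coefficients.

def fit_planar_flow(x, y, u, v):
    """Estimate (a1..a8) by least squares from flow samples (u, v) at (x, y)."""
    n = len(x)
    A = np.zeros((2 * n, 8))
    b = np.concatenate([u, v])
    # rows modeling the u-component
    A[:n, 0] = 1.0          # a1
    A[:n, 1] = x            # a2
    A[:n, 2] = y            # a3
    A[:n, 6] = x * x        # a7
    A[:n, 7] = x * y        # a8
    # rows modeling the v-component
    A[n:, 3] = 1.0          # a4
    A[n:, 4] = x            # a5
    A[n:, 5] = y            # a6
    A[n:, 6] = x * y        # a7
    A[n:, 7] = y * y        # a8
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    # Singular values of the design matrix hint at how sensitive the
    # coefficient estimates are to noise in the measured flow field.
    sigma = np.linalg.svd(A, compute_uv=False)
    return coeffs, sigma

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, 200)
    y = rng.uniform(-1, 1, 200)
    a = np.array([0.10, 0.02, -0.01, -0.05, 0.03, 0.04, 0.001, -0.002])
    u = a[0] + a[1] * x + a[2] * y + a[6] * x * x + a[7] * x * y
    v = a[3] + a[4] * x + a[5] * y + a[6] * x * y + a[7] * y * y
    u += rng.normal(0, 1e-3, x.size)   # simulated flow noise
    v += rng.normal(0, 1e-3, x.size)
    est, sigma = fit_planar_flow(x, y, u, v)
    print("estimated coefficients:", est)
    print("design-matrix singular values:", sigma)
```

Given reasonably spread sample points, the recovered coefficients stay close to the true ones under small noise, while a large ratio between the largest and smallest singular values would signal directions in coefficient space that are poorly constrained by the data.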