B. Porr et al., HOW TO HEAR VISUAL DISPARITIES - REAL-TIME STEREOSCOPIC SPATIAL DEPTH ANALYSIS USING TEMPORAL RESONANCE, Biological Cybernetics, 78(5), 1998, pp. 329-336
In a stereoscopic system, both eyes or cameras have a slightly different view. As a consequence, small variations between the projected images exist ('disparities'), which are evaluated spatially in order to retrieve depth information (Sanger 1988; Fleet et al. 1991). A strong similarity exists between the analysis of visual disparities and the determination of the azimuth of a sound source (Wagner and Frost 1993). The direction of the sound is thereby determined from the temporal delay between the left and right ear signals (Konishi and Sullivan 1986). Similarly, here we transpose the spatially defined problem of disparity analysis into the temporal domain and use two resonators, implemented in the form of causal (electronic) filters, to determine the disparity as local temporal phase differences between the left and right filter responses. This approach permits real-time analysis and can be solved analytically for a step-function contrast change, which is an important case in all real-world applications. The proposed theoretical framework for spatial depth retrieval directly utilizes a temporal algorithm borrowed from auditory signal analysis. Thus, the suggested similarity between the visual and the auditory system in the brain (Wagner and Frost 1993) finds its analogy here at the algorithmic level. We compare the results from the temporal resonance algorithm with those obtained from several other techniques, such as cross-correlation and spatial phase-based disparity estimation, and show that the novel algorithm achieves performance similar to the 'classical' approaches while using much lower computational resources.
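
The abstract only outlines the method, so the following minimal sketch (not the authors' code) illustrates the underlying idea under stated assumptions: pixel position along an image row is treated as time, a step-function contrast change with a known disparity is "scanned" through one causal complex resonator per eye, and the disparity is recovered from the local temporal phase difference of the two filter outputs divided by the resonance frequency. The first-order complex filter, the derivative pre-processing, and all parameter values are choices made for this illustration only, not taken from the paper.

```python
# Toy illustration of disparity as a temporal phase difference (assumed filter
# form and parameters; not the authors' implementation).
import numpy as np

def resonator_response(x, omega0=0.3, r=0.95):
    """Causal first-order complex resonator driven by the temporal derivative
    of the scan signal, so a contrast step yields a decaying oscillation
    r**n * exp(1j*omega0*n) starting at the edge position."""
    a = r * np.exp(1j * omega0)
    dx = np.diff(x, prepend=x[0])          # step -> impulse at the edge
    y = np.empty(len(x), dtype=complex)
    acc = 0j
    for n, d in enumerate(dx):
        acc = a * acc + d                  # y[n] = a*y[n-1] + dx[n]
        y[n] = acc
    return y

# Step-function contrast change (the case treated analytically in the paper),
# shifted by a known disparity between the two views.
N, edge, disparity = 256, 100, 3
left = np.zeros(N);  left[edge:] = 1.0
right = np.zeros(N); right[edge + disparity:] = 1.0

omega0 = 0.3
y_left = resonator_response(left, omega0)
y_right = resonator_response(right, omega0)

# Local temporal phase difference of the two resonator outputs; dividing by the
# resonance frequency converts it into a delay in scan samples, i.e. pixels.
n_eval = edge + disparity + 5              # a point where both filters are ringing
dphi = np.angle(y_left[n_eval] * np.conj(y_right[n_eval]))
print(f"estimated disparity: {dphi / omega0:.2f} px (true: {disparity} px)")
```

As with spatial phase-based disparity estimation, the measurable shift in this sketch is bounded by the period of the resonance, so the choice of omega0 fixes the range of disparities that can be read out without phase wrapping.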