We investigate the psychophysical and computational processes underlying the perception of depth from texture cues in perspective projections. In particular, we analyse the similarities between the processes associated with perspective and orthographic projections. Based on a series of psychophysical experiments using white-noise stimuli presented in perspective projection, we suggest that the visual system may characterize the spatial-frequency spectrum by the average peak frequency (APF), which is the same characteristic as that used in orthographic projections. We demonstrate that normalization of the APF yields an estimate of depth in perspective projection. Our previous studies suggest that the APF is used in orthographic projections, where the output of the normalization process represents a linear approximation of surface slant. Based on these results, together with previous psychophysical evidence, we propose a neural network model of shape-from-texture for the perspective view. Simulations of the model show qualitative agreement with human perception over a range of real and artificial stimuli.
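As a rough illustration of the kind of computation described above, the sketch below treats the APF of a local image patch as the magnitude-weighted mean radial frequency of its amplitude spectrum, and normalizes each patch's APF against a reference patch to obtain a relative depth map. The functions `average_peak_frequency` and `relative_depth_from_apf`, the patch and step sizes, and the choice of a central reference patch are illustrative assumptions, not the model actually proposed in the paper.

```python
import numpy as np

def average_peak_frequency(patch):
    """Illustrative APF: magnitude-weighted mean radial spatial frequency
    of a local patch's amplitude spectrum (the paper's exact definition
    of the APF may differ)."""
    n = patch.shape[0]
    spec = np.abs(np.fft.fftshift(np.fft.fft2(patch - patch.mean())))
    fy, fx = np.meshgrid(np.fft.fftshift(np.fft.fftfreq(n)),
                         np.fft.fftshift(np.fft.fftfreq(n)), indexing="ij")
    radial = np.hypot(fx, fy)          # radial frequency at each spectral sample
    return (radial * spec).sum() / (spec.sum() + 1e-12)

def relative_depth_from_apf(image, patch=32, step=16):
    """Hypothetical normalization step: each patch's APF is divided by the
    APF of a reference patch. Under perspective projection, surface texture
    projected from farther away is compressed, raising its local APF, so a
    larger normalized value indicates relatively greater depth."""
    h, w = image.shape
    apf = np.array([[average_peak_frequency(image[r:r + patch, c:c + patch])
                     for c in range(0, w - patch + 1, step)]
                    for r in range(0, h - patch + 1, step)])
    reference = apf[apf.shape[0] // 2, apf.shape[1] // 2]  # centre patch as reference
    return apf / (reference + 1e-12)   # unitless relative depth map

# Minimal usage example on a synthetic noise image (stand-in for a textured surface)
rng = np.random.default_rng(0)
texture = rng.standard_normal((256, 256))
depth_map = relative_depth_from_apf(texture)
print(depth_map.shape)
```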