We present a simple method, based on the quantum regression theorem, to calculate the quantum correlation spectra for two optical beams in the linearized fluctuation regime. As an application, we discuss the dynamical instability, the squeezing spectra, and the QND properties of a crossed Kerr-type dispersive model.