The signal-to-noise-ratio-maximizing approach in optimal filtering provides a robust tool for detecting signals in the presence of colored noise. The method fails, however, when the data exhibit regimelike behavior. This manuscript develops an approach to recover local (in phase space) behavior in an intermittent system with regimelike behavior. The method is first formulated in its general form within a Gaussian framework, given an estimate of the noise covariance, and requires that the signal minimize the noise probability distribution for any given value, i.e., on isosurfaces, of the data probability distribution. The extension to the non-Gaussian case is provided through the use of finite mixture models for data that show regimelike behavior. When applied in a simplified manner to synthetic time series with and without regimes, the method yields the correct signal, in comparison with the signal-to-noise-ratio approach, and it helps identify the correct frequency of the oscillation spells in the classical Lorenz system and its variants.
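In the Gaussian case, the constrained extremization described above (minimize the noise density on an isosurface of the data density) reduces, via a Lagrange-multiplier argument, to a generalized eigenvalue problem. The following is a minimal illustrative sketch, not the paper's implementation: the two-dimensional covariances, the signal vector `s`, and the additive model `Cd = Cn + s sᵀ` are all assumptions made here for illustration.

```python
import numpy as np
from scipy.linalg import eigh

# Hypothetical 2-D example (illustrative, not from the paper):
# data = signal + colored Gaussian noise, so Cd = Cn + s s^T.
Cn = np.array([[1.0, 0.3],
               [0.3, 0.5]])            # noise covariance (assumed known)
s = np.array([1.0, 2.0])               # "true" signal direction (assumed)
Cd = Cn + np.outer(s, s)               # implied data covariance

# Minimizing the Gaussian noise pdf ~ exp(-x^T Cn^{-1} x / 2) on an
# isosurface of the data pdf (x^T Cd^{-1} x = const) means maximizing
# the noise Mahalanobis form x^T Cn^{-1} x on that isosurface.
# Stationarity: Cn^{-1} x = mu Cd^{-1} x, a generalized eigenproblem.
mus, vecs = eigh(np.linalg.inv(Cn), np.linalg.inv(Cd))  # mu ascending
x = vecs[:, -1]                        # largest mu: least likely under noise
x /= np.linalg.norm(x)
if x[0] < 0:                           # resolve eigenvector sign ambiguity
    x = -x

print(x)  # aligns with s / ||s||
```

With `Cd = Cn + s sᵀ`, the matrix `Cd Cn⁻¹` has eigenvalues `1` and `1 + sᵀ Cn⁻¹ s`, and the eigenvector of the largest one is proportional to `s`, so the recovered direction coincides with the planted signal in this toy setting.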