Long-memory errors dramatically slow down the convergence of minimax risks in fixed-design nonparametric regression, and the problem becomes even more complicated for adaptive estimation. This defines the curse of long-memory errors. I show that using a random design, instead of a fixed one, overcomes this curse and makes familiar data-driven estimators robust. Moreover, the result holds for a wide class of nonstationary errors with bounded moments (including bounded deterministic errors). Possible extensions are discussed.
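
For concreteness, a standard fixed-design setup with long-range dependent errors, often used in this literature, can be sketched as follows (the symbols $f$, $\sigma$, and $\alpha$ are illustrative and not taken from the paper):
\[
Y_l = f(l/n) + \sigma\varepsilon_l, \qquad l = 1, \ldots, n, \qquad
\operatorname{Cov}(\varepsilon_l, \varepsilon_{l+k}) \asymp |k|^{-\alpha}, \quad 0 < \alpha < 1,
\]
so the autocovariances decay slowly enough that $\sum_{k}|\operatorname{Cov}(\varepsilon_l,\varepsilon_{l+k})| = \infty$, which is what "long memory" refers to; under a random design the predictors are instead drawn independently from a design density rather than placed on the grid $l/n$.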