The tails of gravitational waves result from the non-linear interaction between the usual quadrupole radiation generated by an isolated system (with total mass-energy M) and the static monopole field associated with M. Their contributions to the field at large distances from the system include a particular modulation of the phase in the Fourier domain, having M as a factor and depending on the frequency as ~ ω ln ω. In this paper we investigate the level at which this tail effect could be detected in future laser-interferometric detectors. We consider a family of matched filters for inspiralling compact-binary signals, allowing for this effect and parametrized by a set of independent 'test' parameters including M. Detecting the effect is equivalent to attributing, by optimal signal processing, a non-zero value to M. The 1-sigma error bar in the measurement of M is computed by analytical and numerical methods as a function of the optimal signal-to-noise ratio (SNR). We find that the minimal values of the SNR for detection of the tail effect at the 1-sigma level range from ~100 to ~2800 for neutron-star binaries (depending on the type of noise in the detector and on our a priori knowledge of the binary), and from ~15 to ~400 for a black-hole binary with M = 20 M_⊙. It is argued that some of these values, at least for black-hole binaries, could be achieved in future generations of detectors, following the currently planned VIRGO and LIGO detectors.
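To make the scaling concrete, the following Python sketch estimates the 1-sigma error on M via a Fisher-matrix calculation. Everything in it is an illustrative assumption rather than the paper's actual model: the stationary-phase waveform keeps only the leading quadrupole phase plus a schematic tail term proportional to M f ln f, the noise spectrum is flat, the band is 10-1000 Hz, and the function names (spa_phase, sigma_params) are invented for the example.

import numpy as np

G, C, MSUN = 6.674e-11, 2.998e8, 1.989e30   # SI units

def spa_phase(f, params):
    # Frequency-domain inspiral phase: leading quadrupole term plus a
    # schematic tail modulation ~ M * omega * ln(omega). The true tail
    # term carries a fixed post-Newtonian coefficient; here M is kept
    # as a free 'test' parameter, in the spirit of the filters above.
    tc, phic, mchirp, mtot = params
    v = (np.pi * G * mchirp * MSUN * f / C**3) ** (1.0 / 3.0)
    psi_newt = 3.0 / (128.0 * v**5)                        # quadrupole phase
    psi_tail = 2.0 * np.pi * f * (G * mtot * MSUN / C**3) * np.log(f)
    return 2.0 * np.pi * f * tc - phic + psi_newt + psi_tail

def sigma_params(params, f, Sn, snr):
    # 1-sigma errors from the Fisher matrix of a phase-only model
    # h(f) ~ f**(-7/6) * exp(i*psi), normalized to a given optimal SNR.
    integ = lambda y: np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(f))
    w = f ** (-7.0 / 3.0) / Sn              # proportional to |h|^2 / Sn
    w = w * snr**2 / (4.0 * integ(w))       # fix amplitude so rho = snr
    p0 = np.asarray(params, dtype=float)
    derivs = []
    for i in range(p0.size):                # numerical phase derivatives
        dp = p0.copy()
        eps = 1e-6 * max(abs(dp[i]), 1.0)
        dp[i] += eps
        derivs.append((spa_phase(f, dp) - spa_phase(f, p0)) / eps)
    gamma = 4.0 * np.array([[integ(w * di * dj) for dj in derivs]
                            for di in derivs])
    return np.sqrt(np.diag(np.linalg.inv(gamma)))

# Toy setup: M = 20 solar-mass binary, flat noise over a 10-1000 Hz band.
f = np.linspace(10.0, 1000.0, 4000)
Sn = np.ones_like(f)                        # placeholder noise spectrum
params = (0.0, 0.0, 8.7, 20.0)              # tc [s], phi_c, chirp mass, M
sigma = sigma_params(params, f, Sn, snr=10.0)
print("sigma(M) at SNR = 10:", sigma[3])
# Since sigma falls as 1/SNR, the minimal SNR for a 1-sigma detection of
# the tail is roughly 10 * sigma[3] / 20 under these crude assumptions.
print("rough minimal SNR for the tail:", 10.0 * sigma[3] / 20.0)

Because the Fisher matrix scales as SNR^2, the error bar on M falls as 1/SNR, which is why the detectability thresholds above are quoted directly in terms of the minimal SNR.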