K. Y. M. Wong and D. Sherrington, NEURAL NETWORKS OPTIMALLY TRAINED WITH NOISY DATA, Physical Review E 47(6), 1993, pp. 4465-4482
We study the retrieval behavior of neural networks trained to optimize their performance on an ensemble of noisy example patterns. In particular, we consider (1) the performance overlap, which reflects the performance of the network under an operating condition identical to the training condition; (2) the storage overlap, which reflects the ability of the network merely to memorize the stored information; (3) the attractor overlap, which reflects the precision of retrieval for dilute feedback networks; and (4) the boundary overlap, which defines the boundary of the basin of attraction, and hence the associative ability, for dilute feedback networks. We find that for sufficiently low training noise, the network optimizes its overall performance by sacrificing the individual performance of a minority of patterns, resulting in a two-band distribution of the aligning fields. For a narrow range of storage levels, the network loses and then regains its retrieval capability as the training noise level increases; we interpret this reentrant retrieval behavior as arising from competing tendencies in structuring the basins of attraction of the stored patterns. Reentrant behavior is also observed in the space of synaptic interactions, in which the replica-symmetric solution of the optimal network destabilizes and then restabilizes as the training noise level increases. We summarize these observations by picturing training noise as an instrument for widening the basins of attraction of the stored patterns at the expense of reducing the precision of retrieval.