Information transmission and recovery in neural communications channels

Citation
M. C. Eguia et al., Information transmission and recovery in neural communications channels, PHYS REV E, 62(5), 2000, pp. 7111-7122
Number of citations
32
Subject categories
Physics
Journal title
PHYSICAL REVIEW E
ISSN journal
1063-651X
Volume
62
Issue
5
Year of publication
2000
Part
B
Pages
7111 - 7122
Database
ISI
SICI code
1063-651X(200011)62:5<7111:ITARIN>2.0.ZU;2-G
Abstract
Biological neural communications channels transport environmental information from sensors through chains of active dynamical neurons to neural centers for decisions and actions to achieve required functions. These kinds of communications channels are able to create information and to transfer information from one time scale to the other because of the intrinsic nonlinear dynamics of the component neurons. We discuss a very simple neural information channel composed of sensory input in the form of a spike train that arrives at a model neuron, then moves through a realistic synapse to a second neuron where the information in the initial sensory signal is read. Our model neurons are four-dimensional generalizations of the Hindmarsh-Rose neuron, and we use a model of chemical synapse derived from first-order kinetics. The four-dimensional model neuron has a rich variety of dynamical behaviors, including periodic bursting, chaotic bursting, continuous spiking, and multistability. We show that, for many of these regimes, the parameters of the chemical synapse can be tuned so that information about the stimulus that is unreadable at the first neuron in the channel can be recovered by the dynamical activity of the synapse and the second neuron. Information creation by nonlinear dynamical systems that allow chaotic oscillations is familiar in their autonomous oscillations. It is associated with the instabilities that lead to positive Lyapunov exponents in their dynamical behavior. Our results indicate how nonlinear neurons acting as input/output systems along a communications channel can recover information apparently "lost" in earlier junctions on the channel.
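As an illustration of the kind of neuron model the abstract describes, the sketch below integrates the classic three-dimensional Hindmarsh-Rose equations with a simple Euler scheme; the paper's model is a four-dimensional generalization, and the parameter values and initial condition here are standard textbook choices for chaotic bursting, not the paper's exact settings.

```python
# Sketch only: the classic 3D Hindmarsh-Rose neuron (the paper uses a
# four-dimensional generalization). Parameters below are the standard
# values associated with chaotic bursting, assumed for illustration.

def hindmarsh_rose(t_max=2000.0, dt=0.01, I_ext=3.0):
    """Euler-integrate the Hindmarsh-Rose equations; return the x (membrane) trace."""
    x, y, z = -1.0, 0.0, 2.0          # arbitrary initial condition
    r, s, x_rest = 0.006, 4.0, -1.6   # slow-variable parameters
    xs = []
    for _ in range(int(t_max / dt)):
        dx = y + 3.0 * x**2 - x**3 - z + I_ext   # fast membrane variable
        dy = 1.0 - 5.0 * x**2 - y                # fast recovery variable
        dz = r * (s * (x - x_rest) - z)          # slow adaptation current
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        xs.append(x)
    return xs

trace = hindmarsh_rose()
# Upward crossings of a spike threshold indicate bursting/spiking activity.
spikes = sum(1 for a, b in zip(trace, trace[1:]) if a < 1.0 <= b)
```

The slow variable z (time scale set by the small rate r) modulates the fast spiking subsystem, which is what produces the bursting and multistable regimes the abstract lists.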
Our measure of information transmission is the average mutual information between elements, and because the channel is active and nonlinear, the average mutual information between the sensory source and the final neuron may be greater than the average mutual information at an earlier neuron in the channel. This behavior is strikingly different from the passive role communications channels usually play, and the "data processing theorem" of conventional communications theory is violated by these neural channels. Our calculations indicate that neurons can reinforce reliable transmission along a chain even when the synapses and the neurons are not completely reliable components. This phenomenon is generic in parameter space, robust in the presence of noise, and independent of the discretization process. Our results suggest a framework in which one might understand the apparent design complexity of neural information transduction networks. If networks with many dynamical neurons can recover information not apparent at various way stations in the communications channel, such networks may be more robust to noisy signals, may be more capable of communicating many types of encoded sensory neural information, and may be the appropriate design for components, neurons and synapses, which can be individually imprecise, inaccurate "devices."
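The quantity named above, average mutual information, can be estimated for two discretized signals with a plug-in (histogram) estimator, as in the sketch below; the symbol alphabet and discretization are our assumptions for illustration, not the paper's exact procedure.

```python
# Sketch of the information measure named in the abstract: a plug-in
# estimate of the average mutual information I(X;Y), in bits, between
# two equal-length discretized signals. Binning is assumed done upstream.
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """I(X;Y) = sum_{x,y} p(x,y) log2[ p(x,y) / (p(x) p(y)) ]."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))   # joint symbol counts
    px = Counter(xs)             # marginal counts for X
    py = Counter(ys)             # marginal counts for Y
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        mi += p_joint * log2(p_joint * n * n / (px[x] * py[y]))
    return mi

# Identical sequences: MI equals the source entropy (1 bit for a fair binary source).
a = [0, 1] * 500
print(round(mutual_information(a, a), 3))   # 1.0
```

In the paper's setting one would compare I(source; first neuron) with I(source; second neuron); the claim is that for an active nonlinear channel the latter can exceed the former, which a passive channel's data processing inequality would forbid.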