Information transmission and recovery in neural communications channels
Manuel C. Eguia, Misha I. Rabinovich, and Henry D. I. Abarbanel.
Physical Review E: Statistical Physics, Plasmas, Fluids, and Related Interdisciplinary Topics, vol. 62, 2000, pp. 7111-7122.
  ARK: https://n2t.net/ark:/13683/pdps/9px
Abstract
Biological neural communications channels transport environmental information from sensors through chains of active dynamical neurons to neural centers for decisions and actions to achieve required functions. These kinds of communications channels are able to create information and to transfer information from one time scale to another because of the intrinsic nonlinear dynamics of the component neurons. We discuss a very simple neural information channel composed of sensory input in the form of a spike train that arrives at a model neuron, then moves through a realistic synapse to a second neuron where the information in the initial sensory signal is read. Our model neurons are four-dimensional generalizations of the Hindmarsh-Rose neuron, and we use a model of chemical synapse derived from first-order kinetics. The four-dimensional model neuron has a rich variety of dynamical behaviors, including periodic bursting, chaotic bursting, continuous spiking, and multistability. We show that, for many of these regimes, the parameters of the chemical synapse can be tuned so that information about the stimulus that is unreadable at the first neuron in the channel can be recovered by the dynamical activity of the synapse and the second neuron. Information creation by nonlinear dynamical systems that allow chaotic oscillations is familiar in their autonomous oscillations. It is associated with the instabilities that lead to positive Lyapunov exponents in their dynamical behavior. Our results indicate how nonlinear neurons acting as input/output systems along a communications channel can recover information apparently "lost" at earlier junctions on the channel. Our measure of information transmission is the average mutual information between elements, and because the channel is active and nonlinear, the average mutual information between the sensory source and the final neuron may be greater than the average mutual information at an earlier neuron in the channel. This behavior is strikingly different from the passive role communications channels usually play, and the "data processing theorem" of conventional communications theory is violated by these neural channels. Our calculations indicate that neurons can reinforce reliable transmission along a chain even when the synapses and the neurons are not completely reliable components. This phenomenon is generic in parameter space, robust in the presence of noise, and independent of the discretization process. Our results suggest a framework in which one might understand the apparent design complexity of neural information transduction networks. If networks with many dynamical neurons can recover information not apparent at various waystations in the communications channel, such networks may be more robust to noisy signals, may be more capable of communicating many types of encoded sensory neural information, and may be the appropriate design for components, neurons and synapses, which can be individually imprecise, inaccurate "devices."
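
The abstract names three ingredients: Hindmarsh-Rose model neurons, a chemical synapse derived from first-order kinetics, and average mutual information as the measure of transmission. As a rough illustration of those ingredients only, the sketch below couples two standard three-variable Hindmarsh-Rose neurons in a chain through a generic first-order-kinetics synapse and estimates the average mutual information between their discretized membrane-potential traces with a plain histogram estimator. The paper itself uses four-dimensional neuron models and its own synapse equations and parameters, none of which are reproduced here; every numerical value below is a conventional textbook choice, not the authors'.

# Illustrative sketch, not the authors' model: two standard 3D Hindmarsh-Rose
# neurons in a chain, coupled by a generic first-order-kinetics chemical synapse,
# plus a histogram estimate of the average mutual information
#   I(X;Y) = sum_{x,y} p(x,y) * log2[ p(x,y) / (p(x) p(y)) ].
import numpy as np

def hindmarsh_rose(state, I_ext):
    # Standard three-variable Hindmarsh-Rose vector field
    # (x: membrane potential, y: fast recovery, z: slow adaptation).
    x, y, z = state
    dx = y - x**3 + 3.0 * x**2 - z + I_ext
    dy = 1.0 - 5.0 * x**2 - y
    dz = 0.006 * (4.0 * (x + 1.6) - z)
    return np.array([dx, dy, dz])

def synapse_gate_rate(s, v_pre, alpha=1.0, beta=0.2, v_th=-0.5):
    # First-order kinetics gating variable: ds/dt = alpha*S(v_pre)*(1 - s) - beta*s,
    # with a sigmoidal transmitter-release function S of the presynaptic voltage.
    S = 1.0 / (1.0 + np.exp(-10.0 * (v_pre - v_th)))
    return alpha * S * (1.0 - s) - beta * s

def simulate(steps=200000, dt=0.02, I_drive=3.1, g_syn=0.35, E_syn=2.0):
    # Euler integration of the chain: external drive -> neuron 1 -> synapse -> neuron 2.
    n1 = np.array([-1.0, 0.0, 2.0])
    n2 = np.array([-1.2, 0.0, 2.0])
    s = 0.0
    v1 = np.empty(steps)
    v2 = np.empty(steps)
    for k in range(steps):
        I_syn = -g_syn * s * (n2[0] - E_syn)   # synaptic current into neuron 2
        n1 = n1 + dt * hindmarsh_rose(n1, I_drive)
        n2 = n2 + dt * hindmarsh_rose(n2, I_syn)
        s = s + dt * synapse_gate_rate(s, n1[0])
        v1[k], v2[k] = n1[0], n2[0]
    return v1, v2

def average_mutual_information(x, y, bins=16):
    # Plain histogram estimator of the average mutual information in bits.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    joint_indep = px @ py
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / joint_indep[nz])))

if __name__ == "__main__":
    v1, v2 = simulate()
    ami = average_mutual_information(v1, v2)
    print(f"AMI between the two membrane-potential traces: {ami:.3f} bits")

Running the script prints a single average-mutual-information estimate in bits between the two membrane-potential traces; in the setting described by the abstract, the corresponding quantity is computed between the sensory source and successive stations along the channel, which is what allows the comparison showing that information apparently lost at an earlier neuron can reappear downstream.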