Until recently there have been no convincing quantitative measurements of the rates of information transmission in real neurons. Here we review the theoretical basis for making such measurements, together with data demonstrating remarkably high information rates in a variety of systems. In fact these rates are within a factor of two of the absolute physical limits set by the entropy of neural spike trains. These observations lead to sharp theoretical questions about the structure of the code and the strategy for adapting the code to different ensembles of input signals.
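To make the entropy bound concrete, here is a minimal sketch of the standard calculation for the maximum information rate of a spike train: discretize time into bins of width dt, allow at most one spike per bin, and compute the binary entropy per bin divided by the bin width. The function name, parameters, and the single-spike-per-bin assumption are illustrative, not taken from the paper.

```python
import math

def spike_train_entropy_rate(rate_hz: float, dt_s: float) -> float:
    """Upper bound (bits/s) on the information rate of a spike train
    discretized into bins of width dt_s at mean firing rate rate_hz.

    Each bin contains a spike with probability p = rate_hz * dt_s;
    the entropy rate is the per-bin binary entropy divided by dt_s.
    """
    p = rate_hz * dt_s
    if not 0.0 < p < 1.0:
        raise ValueError("rate_hz * dt_s must lie strictly between 0 and 1")
    h_bin = -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)
    return h_bin / dt_s
```

For example, a neuron firing at 40 spikes/s read out with 1 ms bins has an entropy ceiling of roughly 240 bits/s; finer time resolution raises the ceiling, which is why the measured rates being within a factor of two of this bound is striking.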