Information theory in neuroscience
Alexander G. Dimitrov
Aurel A. Lazar
Jonathan D. Victor

J. D. Victor: Division of Systems Neurology and Neuroscience, Department of Neurology and Neuroscience, Weill Cornell Medical College, 1300 York Avenue, New York, NY 10065, USA

A. A. Lazar: Department of Electrical Engineering, Columbia University, Mail Code 4712, 500 West 120th Street, New York, NY 10027, USA
Information Theory started and, according to some, ended with Shannon's seminal paper A Mathematical Theory of Communication (Shannon 1948). Because its significance and flexibility were quickly recognized, there were numerous attempts to apply it to diverse fields outside of its original scope. This prompted Shannon to write his famous essay The Bandwagon (Shannon 1956), warning against indiscriminate use of the new tool. Nevertheless, non-standard applications of Information Theory persisted. Very soon after Shannon's initial publication (Shannon 1948), several manuscripts provided the foundations of much of the current use of information theory in neuroscience. MacKay and McCulloch (1952) applied the concept of information to propose limits on the transmission capacity of a nerve cell. This work
foreshadowed future work on what can be termed
Neural Information Flow: how much information
moves through the nervous system, and the constraints
that information theory imposes on the capabilities of
neural systems for communication, computation and
behavior. A second set of manuscripts, by Attneave
(1954) and Barlow (1961), discussed information as
a constraint on neural system structure and function,
proposing that the neural structure of sensory systems is matched to the statistical structure of the sensory environment in a way that optimizes information transmission.
This is the main idea behind the Structure from
Information line of research that is still very active today. A
third thread, Reliable Computation with Noisy/Faulty
Elements, started both in the information-theoretic
community (Shannon and McCarthy 1956) and in neuroscience (Winograd and Cowan 1963). With the advent
of integrated circuits that were essentially faultless,
interest began to wane. However, as IC technology
continues to push towards smaller and faster
computational elements (even at the expense of reliability), and
as neuromorphic systems are developed with variability
designed in (Merolla and Boahen 2006), this topic is
gaining in popularity again in the electronics
community, and neuroscientists may again have something to contribute to the discussion.
1 Subsequent developments
The theme that arguably has had the widest influence
on the neuroscience community, and is most heavily
represented in the current special issue of JCNS, is
that of Neural Information Flow. The initial works of
MacKay and McCulloch (1952), McCulloch (1952) and
Rapoport and Horvath (1960) showed that neurons are
in principle able to relay large quantities of
information. This research led to the first attempts to
characterize the information flow in specific neural systems
(Werner and Mountcastle 1965), and also started the
first major controversy in the field, which still resonates
today: the debate about timing versus frequency codes
(Stein 1967; Stein et al. 1972). A steady stream of
articles followed, both discussing these hypotheses and
attempting to clarify the type of information relayed by
nerve cells (Abeles and Lass 1975; Eagles and Purple
1974; Eckhorn and Pöppel 1974; Eckhorn et al. 1976;
Harvey 1978; Lass and Abeles 1975; Norwich 1977;
Poussart 1971; Stark et al. 1969; Taylor 1975; Walløe
1970).
After the initial rise in interest, the application of
Information Theory to neuroscience was extended to
a few more systems and questions (Eckhorn and Pöppel
1981; Eckhorn and Querfurth 1985; Fuller and Williams
1983; Kjaer et al. 1994; Lestienne and Strehler 1987,
1988; Optican and Richmond 1987; Surmeier and
Weinberg 1985; Tsukuda et al. 1984; Victor and Johannesma 1986), but did not spread widely. This
was presumably because, despite strong theoretical
advances in Information Theory, its applicability was
hampered by difficulty in measuring and interpreting
information-theoretic quantities.
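To make this difficulty concrete, the sketch below (an illustrative toy calculation, not a reimplementation of any of the estimators cited here; the function names and the uniform 8-symbol source are our own choices) shows how the naive "plug-in" entropy estimate computed from a finite sample systematically underestimates the true entropy, and how the first-order correction analyzed by Miller (1955) reduces, but does not eliminate, that bias.

import numpy as np

def plugin_entropy(samples):
    """Naive ("plug-in") entropy estimate, in bits, from discrete samples."""
    _, counts = np.unique(samples, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def miller_madow_entropy(samples):
    """Plug-in estimate plus the first-order bias correction (Miller 1955)."""
    _, counts = np.unique(samples, return_counts=True)
    k, n = len(counts), counts.sum()  # occupied bins, sample size
    return plugin_entropy(samples) + (k - 1) / (2.0 * n * np.log(2))

# Toy source: 8 equiprobable symbols, so the true entropy is exactly 3 bits.
rng = np.random.default_rng(0)
for n in (20, 200, 2000):
    x = rng.integers(0, 8, size=n)
    print(f"n={n:4d}  plug-in={plugin_entropy(x):.3f}  "
          f"Miller-Madow={miller_madow_entropy(x):.3f}  true=3.000")

For small samples the plug-in value falls well below 3 bits, and even the corrected estimate remains biased; mutual information estimates, which are differences of such entropies, inherit the same problem.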
The work of de Ruyter van Steveninck and Bialek
(1988) started what could be called the modern era
of information-theoretic analysis in neuroscience, in
which Information Theory continues to see increasingly refined applications. Their work advanced the
conceptual aspects of the application of information theory to
neuroscience and, subsequently, provided a relatively
straightforward way to estimate information-theoretic
quantities (Strong et al. 1998). This work provided an
approach to removing biases in information estimates
due to finite sample size, but its scope of applicability was limited. The difficulties in
obtaining unbiased estimates of information-theoretic
quantities were noted early on by Carlton (1969) and Miller
(1955) and brought again to attention by Treves an (...truncated)