Institut de la Vision, Sorbonne Université, CNRS, INSERM
Statistical physics for the modeling of spatial information in neuronal networks
Sensory systems deal with a constant stream of information. Visual input, for example, is first perceived through the retina, then processed in the brain and (together with information from other sources) stored in areas like the hippocampus. Quantifying this process is hard because of the large number of neurons involved, so it is natural to tackle it with methods from statistical mechanics.
In the first part of the talk, we address the problem of estimating mutual information, which amounts to estimating entropies. An exact computation is impossible for the correlated activity of a large population of neurons, because the number of parameters to be estimated is prohibitively large. A popular workaround is to map the activity onto an Ising model. However, this limits every neuron to two possible activity states per time bin, so information is lost in high-activity regimes. We lift this restriction by allowing arbitrarily many states per time bin, which makes it possible to adapt the bin size to the time scales of the data.
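As a toy illustration of the two discretizations (not the estimator developed in the talk), the sketch below bins a spike train either into binary states, as the Ising mapping requires, or into full spike counts, and compares a naive plug-in entropy for the two codes. The Poisson neuron, the 50 ms bin size, and the plug-in estimator are illustrative assumptions chosen only to make the comparison concrete.

import numpy as np

rng = np.random.default_rng(0)

def bin_activity(spike_times, t_max, bin_size, binary=False):
    # Discretize a spike train into time bins.
    # binary=True caps each bin at one spike (Ising-like, two states);
    # binary=False keeps the full count (arbitrarily many states per bin).
    edges = np.arange(0.0, t_max + bin_size, bin_size)
    counts, _ = np.histogram(spike_times, bins=edges)
    return (counts > 0).astype(int) if binary else counts

def plugin_entropy(states):
    # Naive plug-in entropy (bits per bin) of the empirical distribution.
    # Rows are time bins; columns generalize to a population of neurons.
    _, freq = np.unique(states, axis=0, return_counts=True)
    p = freq / freq.sum()
    return -np.sum(p * np.log2(p))

# Toy data: one Poisson neuron firing at 30 Hz, observed for 100 s.
spikes = np.cumsum(rng.exponential(1 / 30.0, size=5000))
spikes = spikes[spikes < 100.0]
for binary in (True, False):
    x = bin_activity(spikes, t_max=100.0, bin_size=0.05, binary=binary)
    print("binary" if binary else "counts", plugin_entropy(x.reshape(-1, 1)))

At 30 Hz and 50 ms bins, many bins contain more than one spike, so the binary code collapses all counts above one into a single state and reports a lower entropy; keeping the counts preserves that information, which is what allows the bin size to be matched to the time scales of the data.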
In the second part, we discuss the problem of storing multiple spatial patterns in an attractor network, a classical paradigm for memory, in particular in the hippocampus. A common criticism of such networks is that the disorder present in the synapses degrades the system's ability to reliably preserve the information of a given pattern. To investigate whether this criticism is valid, we compute the spatial information content of a network that receives space-dependent input and whose synapses are composed of a local and a disordered component. We find that the Fisher information decays slowly as long as the disorder strength is not too large, indicating that attractor networks have a regime in which the advantageous effects of connectivity on information storage outweigh the detrimental ones.
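To make the computed quantity concrete, here is a hedged toy computation, not the talk's model: a linear rate network on a ring whose recurrent weights are the sum of a translation-invariant local kernel and a Gaussian random component of strength g. Assuming independent Gaussian noise of variance sigma^2 on the steady-state rates r(x) = (I - W)^{-1} h(x), the linear Fisher information about the stimulus position x is |dr/dx|^2 / sigma^2. All parameters, the linearity, and the independent-noise assumption are made up for illustration.

import numpy as np

rng = np.random.default_rng(1)
N, sigma = 200, 0.1
theta = np.linspace(0.0, 2 * np.pi, N, endpoint=False)  # preferred positions

def fisher_info(g, x=0.0):
    # Recurrent weights: local, distance-dependent kernel plus disorder.
    d = theta[:, None] - theta[None, :]
    W_local = 0.5 / N * np.exp(np.cos(d))
    W_rand = g / np.sqrt(N) * rng.normal(size=(N, N))  # fresh draw per call
    A = np.linalg.inv(np.eye(N) - (W_local + W_rand))  # linear response
    # Derivative of the tuned feedforward input h_i(x) = exp(cos(theta_i - x)).
    dh = np.sin(theta - x) * np.exp(np.cos(theta - x))
    dr = A @ dh                # stimulus gradient of the steady-state rates
    return dr @ dr / sigma**2  # linear Fisher information at x

for g in (0.0, 0.2, 0.4):
    print(f"g = {g:.1f}: Fisher information = {fisher_info(g):.1f}")

Sweeping the disorder strength g shows how the random component reshapes the stimulus gradient of the population response; the talk's analysis of the full nonlinear attractor network is of course more involved.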