Inferring learning rules from distributions of firing rates in cortical neurons.


Information about external stimuli is thought to be stored in cortical circuits through experience-dependent modifications of synaptic connectivity. These modifications of network connectivity should lead to changes in neuronal activity as a particular stimulus is repeatedly encountered. Here we ask what plasticity rules are consistent with the differences in the statistics of the visual response to novel and familiar stimuli in inferior temporal cortex, an area underlying visual object recognition. We introduce a method that allows one to infer the dependence of the presumptive learning rule on postsynaptic firing rate, and we show that the inferred learning rule exhibits depression for low postsynaptic rates and potentiation for high rates. The threshold separating depression from potentiation is strongly correlated with both mean and s.d. of the firing rate distribution. Finally, we show that network models implementing a rule extracted from data show stable learning dynamics and lead to sparser representations of stimuli.
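The inferred rule described in the abstract has a simple qualitative form: the change in synaptic strength depends on postsynaptic firing rate, with depression below a threshold and potentiation above it. A minimal sketch of such a threshold-separated rate rule is given below; the function name, the linear form, and the parameter values (`theta`, `eta`) are illustrative assumptions, not the rule fitted in the paper.

```python
import numpy as np

def delta_w(pre_rate, post_rate, theta=10.0, eta=0.01):
    """Illustrative rate-based plasticity rule (not the fitted rule from the paper).

    The weight change is proportional to presynaptic rate and to a function of
    postsynaptic rate that is negative (depression) below the threshold `theta`
    and positive (potentiation) above it.
    """
    return eta * pre_rate * (post_rate - theta)

# Depression at low postsynaptic rate, potentiation at high rate:
low = delta_w(pre_rate=5.0, post_rate=4.0)    # negative (depression)
high = delta_w(pre_rate=5.0, post_rate=25.0)  # positive (potentiation)
```

In this toy form the threshold `theta` plays the role of the depression/potentiation boundary that the paper finds to be correlated with the mean and s.d. of the firing-rate distribution; a fitted rule would replace the linear factor with the nonlinear dependence inferred from data.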







Publication Info

Lim, S., McKee, J. L., Woloszyn, L., Amit, Y., Freedman, D. J., Sheinberg, D. L., & Brunel, N. (2015). Inferring learning rules from distributions of firing rates in cortical neurons. Nature Neuroscience, 18(12), 1804–1810. doi:10.1038/nn.4158




Nicolas Brunel

Duke School of Medicine Distinguished Professor in Neuroscience

We use theoretical models of brain systems to investigate how they process and learn information from their inputs. Our current work focuses on the mechanisms of learning and memory, from the synapse to the network level, in collaboration with various experimental groups. Using methods from statistical physics, we have shown recently that the synaptic connectivity of a network that maximizes storage capacity reproduces two key experimentally observed features: low connection probability and strong overrepresentation of bidirectionally connected pairs of neurons. We have also inferred 'synaptic plasticity rules' (a mathematical description of how synaptic strength depends on the activity of pre- and postsynaptic neurons) from data, and shown that networks endowed with a plasticity rule inferred from data have a storage capacity that is close to the optimal bound.

Unless otherwise indicated, scholarly articles published by Duke faculty members are made available here with a CC-BY-NC (Creative Commons Attribution Non-Commercial) license, as enabled by the Duke Open Access Policy. If you wish to use the materials in ways not already permitted under CC-BY-NC, please consult the copyright owner. Other materials are made available here through the author’s grant of a non-exclusive license to make their work openly accessible.