Emergence of irregular activity in networks of strongly coupled conductance-based neurons


Abstract

Cortical neurons are characterized by irregular firing and a broad distribution of rates. The balanced state model explains these observations with a cancellation of mean excitatory and inhibitory currents, which makes fluctuations drive firing. In networks of neurons with current-based synapses, the balanced state emerges dynamically if coupling is strong, i.e., if the mean number of synapses per neuron $K$ is large and synaptic efficacy is of order $1/\sqrt{K}$. When synapses are conductance-based, current fluctuations are suppressed when coupling is strong, which calls into question the applicability of the balanced state idea to biological neural networks. We analyze networks of strongly coupled conductance-based neurons and show that asynchronous irregular activity and broad distributions of rates emerge if synaptic efficacies are of order $1/\log(K)$. In such networks, unlike in the standard balanced state model, current fluctuations are small and firing is maintained by a drift-diffusion balance. This balance emerges dynamically, without fine tuning, if inputs are smaller than a critical value that depends on synaptic time constants and coupling strength, and it is significantly more robust to connection heterogeneities than the classical balanced state model. Our analysis makes experimentally testable predictions for how the network response properties should evolve as input increases.
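The scaling contrast in the abstract can be made concrete with the standard mean-field expressions for the summed synaptic input to a neuron. The symbols below ($\mu$, $\sigma$, $\nu_E$, $\nu_I$, $J_E$, $J_I$, $\tau$) are introduced here for illustration; this is a sketch of the classical balanced-state reasoning, not the paper's own derivation:

```latex
% Mean and variance of the input to a neuron receiving K excitatory and
% K inhibitory connections with efficacies J_E, J_I, presynaptic rates
% \nu_E, \nu_I, and integration time constant \tau:
\mu      = K \tau \left( J_E \nu_E - J_I \nu_I \right), \qquad
\sigma^2 = K \tau \left( J_E^2 \nu_E + J_I^2 \nu_I \right).
% With current-based synapses and J_{E,I} \propto 1/\sqrt{K}, each term of
% \mu is O(\sqrt{K}); irregular firing requires a dynamical cancellation of
% excitation and inhibition, leaving \mu = O(1) while \sigma = O(1), so
% firing is fluctuation-driven. With conductance-based synapses, the total
% conductance grows with K and shrinks the effective membrane time constant,
% suppressing \sigma at strong coupling; the result stated above is that
% efficacies of order 1/\log(K) instead yield small fluctuations, with
% firing sustained by a drift-diffusion balance.
```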


Scholars@Duke

Nicolas Brunel

Duke School of Medicine Distinguished Professor in Neuroscience

We use theoretical models of brain systems to investigate how they process and learn information from their inputs. Our current work focuses on the mechanisms of learning and memory, from the synapse to the network level, in collaboration with various experimental groups. Using methods from statistical physics, we have shown recently that the synaptic connectivity of a network that maximizes storage capacity reproduces two key experimentally observed features: low connection probability and strong overrepresentation of bidirectionally connected pairs of neurons. We have also inferred 'synaptic plasticity rules' (a mathematical description of how synaptic strength depends on the activity of pre- and post-synaptic neurons) from data, and shown that networks endowed with a plasticity rule inferred from data have a storage capacity that is close to the optimal bound.



Unless otherwise indicated, scholarly articles published by Duke faculty members are made available here with a CC-BY-NC (Creative Commons Attribution Non-Commercial) license, as enabled by the Duke Open Access Policy. If you wish to use the materials in ways not already permitted under CC-BY-NC, please consult the copyright owner. Other materials are made available here through the author’s grant of a non-exclusive license to make their work openly accessible.