Browsing by Author "Brunel, Nicolas"
Item Open Access: A cerebellar learning model of vestibulo-ocular reflex adaptation in wild-type and mutant mice (The Journal of Neuroscience, 2014-05)
Authors: Clopath, Claudia; Badura, Aleksandra; De Zeeuw, Chris I; Brunel, Nicolas

Mechanisms of cerebellar motor learning are still poorly understood. The standard Marr-Albus-Ito theory posits that learning involves plasticity at the parallel fiber to Purkinje cell synapses under control of the climbing fiber input, which provides an error signal as in classical supervised learning paradigms. However, a growing body of evidence challenges this theory, in that additional sites of plasticity appear to contribute to motor adaptation. Here, we consider phase-reversal training of the vestibulo-ocular reflex (VOR), a simple form of motor learning for which a large body of experimental data is available in wild-type mice and in mutant mice in which the excitability of granule cells or the inhibition of Purkinje cells was affected in a cell-specific fashion. We present novel electrophysiological recordings of Purkinje cell activity measured in naive wild-type mice subjected to this VOR adaptation task. We then introduce a minimal model consisting of learning at the parallel fiber to Purkinje cell synapses under the control of the climbing fibers. Although the minimal model reproduces the behavior of the wild-type animals and is analytically tractable, it fails to reproduce the behavior of mutant mice and the electrophysiological data. We therefore build a detailed model involving plasticity at the parallel fiber to Purkinje cell synapses guided by climbing fibers, feedforward inhibition of Purkinje cells, and plasticity at the mossy fiber to vestibular nuclei neuron synapse.
The detailed model reproduces the behavioral and electrophysiological data of both the wild-type and mutant mice and allows for experimentally testable predictions.

Item Open Access: A Three-Threshold Learning Rule Approaches the Maximal Capacity of Recurrent Neural Networks (PLoS Computational Biology, 2015-08)
Authors: Alemi, Alireza; Baldassi, Carlo; Brunel, Nicolas; Zecchina, Riccardo

Understanding the theoretical foundations of how memories are encoded and retrieved in neural populations is a central challenge in neuroscience. A popular theoretical scenario for modeling memory function is the attractor neural network, whose prototype is the Hopfield model. The model's simplicity and the locality of its synaptic update rules come at the cost of a poor storage capacity compared with that achieved by perceptron learning algorithms. Here, by transforming the perceptron learning rule, we present an online learning rule for a recurrent neural network that achieves near-maximal storage capacity without an explicit supervisory error signal, relying only upon locally accessible information. The fully connected network consists of excitatory binary neurons with plastic recurrent connections and non-plastic inhibitory feedback stabilizing the network dynamics; the patterns to be memorized are presented online as strong afferent currents, producing a bimodal distribution of the neurons' synaptic inputs. Synapses corresponding to active inputs are modified as a function of the value of the local fields with respect to three thresholds. Above the highest threshold, and below the lowest threshold, no plasticity occurs. In between these two thresholds, potentiation or depression occurs when the local field is above or below an intermediate threshold, respectively. We simulated and analyzed a network of binary neurons implementing this rule and measured its storage capacity for different sizes of the basins of attraction.
The storage capacity obtained through numerical simulations is close to the value predicted by analytical calculations. We also measured the dependence of capacity on the strength of external inputs. Finally, we quantified the statistics of the resulting synaptic connectivity matrix and found that both the fraction of zero-weight synapses and the degree of symmetry of the weight matrix increase with the number of stored patterns.

Item Open Access: Acetylcholine Modulates Cerebellar Granule Cell Spiking by Regulating the Balance of Synaptic Excitation and Inhibition (The Journal of Neuroscience, 2020-04)
Authors: Fore, Taylor R; Taylor, Benjamin N; Brunel, Nicolas; Hull, Court

Sensorimotor integration in the cerebellum is essential for refining motor output, and the first stage of this processing occurs in the granule cell layer. Recent evidence suggests that granule cell layer synaptic integration can be contextually modified, although the circuit mechanisms that could mediate such modulation remain largely unknown. Here we investigate the role of ACh in regulating granule cell layer synaptic integration in male rats and mice of both sexes. We find that Golgi cells, the interneurons that provide the sole source of inhibition to the granule cell layer, express both nicotinic and muscarinic cholinergic receptors. While acute ACh application can modestly depolarize some Golgi cells, the net effect of longer, optogenetically induced ACh release is to strongly hyperpolarize them. Golgi cell hyperpolarization by ACh leads to a significant reduction in both tonic and evoked granule cell synaptic inhibition. ACh also reduces glutamate release from mossy fibers by acting on presynaptic muscarinic receptors. Surprisingly, despite these consistent effects on Golgi cells and mossy fibers, ACh can either increase or decrease the spike probability of granule cells as measured by noninvasive cell-attached recordings.
By constructing an integrate-and-fire model of granule cell layer population activity, we find that the direction of spike rate modulation can be accounted for predominantly by the initial balance of excitation and inhibition onto individual granule cells. Together, these experiments demonstrate that ACh can modulate population-level granule cell responses by altering the ratio of excitation and inhibition at the first stage of cerebellar processing.

SIGNIFICANCE STATEMENT: The cerebellum plays a key role in motor control and motor learning. While it is known that behavioral context can modify motor learning, the circuit basis of such modulation has remained unclear. Here we find that a key neuromodulator, ACh, can alter the balance of excitation and inhibition at the first stage of cerebellar processing. These results suggest that ACh could play a key role in altering cerebellar learning by modifying how sensorimotor input is represented at the input layer of the cerebellum.

Item Open Access: An electrophysiological basis for human memory (2022)
Author: Vaz, Alex Patrick

Memory is a fundamentally important process that guides our future behavior based on past experience. Its importance is underscored by the fact that memory loss is a major feature of many neurodegenerative disorders and is disabling to an increasing portion of the aging population. However, the electrophysiological processes underlying memory formation and retrieval in humans remain poorly understood, which in turn limits our ability to provide effective therapy for patients suffering from these disorders. Here, we investigated the underpinnings of human memory through intracranial recordings in epilepsy patients undergoing routine monitoring for potential resective surgery. This unprecedented access to the human brain during awake behavior allowed us to make several inroads into understanding human memory.
First, we investigated high-frequency oscillations in the brain, termed ripples, and their relevance to human episodic memory. We found that during a paired-associates verbal memory task, ripples coupled between the medial temporal lobe (MTL) memory system and the temporal association cortex, and this coupling preceded the reinstatement of memory representations from the encoding period. Next, we measured single-unit spiking activity in the anterior temporal lobe to examine whether temporal patterns of activity may serve as a general neural code that is replayed during memory retrieval. We found that verbal memories corresponded to item-specific sequences of cortical spiking activity, that these sequences replayed during memory retrieval, and that replay was preceded by ripples in the MTL. Finally, to develop a more mechanistic understanding of our findings, we used a randomly connected recurrent leaky integrate-and-fire network model to investigate the characteristics needed to generate significant spike sequences. We found that randomly connected networks can generate sequences in many parameter regimes with white-noise inputs alone, that the specific output sequence is inherently related to the connectivity of the network, and that these models can make quantitative predictions about the dynamic balance of excitation and inhibition during spiking sequences in the human data. Taken together, our results demonstrate a flexible mode of communication between the MTL and cortex in the service of episodic memory, and we provide a theoretical framework for understanding the generation of these neural patterns in the human cortex.
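The kind of network described in the thesis abstract above can be illustrated with a minimal sketch: a fixed, randomly connected network of leaky integrate-and-fire neurons driven only by white noise. This is a toy, not the thesis model; the current-based coupling, the 1/sqrt(n) weight scale, and every parameter value below are illustrative assumptions, chosen only to show that random connectivity plus noise alone suffices to produce network-shaped spiking.

```python
import numpy as np

def simulate_lif_network(n=50, t_steps=2000, dt=1e-4, tau=20e-3, seed=0):
    """Simulate a randomly connected current-based LIF network driven by noise.

    All parameters are illustrative assumptions, not values from the thesis.
    """
    rng = np.random.default_rng(seed)
    v_thresh, v_reset = 1.0, 0.0
    # Fixed random coupling, scaled by 1/sqrt(n); purely illustrative.
    w = (0.5 / np.sqrt(n)) * rng.standard_normal((n, n))
    np.fill_diagonal(w, 0.0)               # no self-connections
    v = rng.uniform(0.0, 0.5, size=n)      # random initial membrane potentials
    spikes = []                            # list of (time step, neuron index)
    for t in range(t_steps):
        noise = 2.0 * np.sqrt(dt / tau) * rng.standard_normal(n)
        v += (dt / tau) * (-v) + noise     # leak + white-noise drive
        fired = np.flatnonzero(v >= v_thresh)
        if fired.size:
            v += w[:, fired].sum(axis=1)   # recurrent kick from spiking cells
            v[fired] = v_reset
            spikes.extend((t, i) for i in fired)
    return spikes

spikes = simulate_lif_network()
print(f"{len(spikes)} spikes from {len({i for _, i in spikes})} neurons")
```

Because the weight matrix is fixed by the seed, repeated runs yield the same spike ordering, which is the intuition behind connectivity-determined sequences.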

Item Open Access: Analytical approximations of the firing rate of an adaptive exponential integrate-and-fire neuron in the presence of synaptic noise (Frontiers in Computational Neuroscience, 2014)
Authors: Hertäg, Loreen; Durstewitz, Daniel; Brunel, Nicolas

Computational models offer a unique tool for understanding the network-dynamical mechanisms that mediate between physiological and biophysical properties and behavioral function. A traditional challenge in computational neuroscience, however, is that simple neuronal models that can be studied analytically fail to reproduce the diversity of electrophysiological behaviors seen in real neurons, while detailed neuronal models that do reproduce such diversity are analytically intractable and computationally expensive. A number of intermediate models have been proposed that aim to capture the diversity of firing behaviors and spike times of real neurons while entailing the simplest possible mathematical description. One such model is the exponential integrate-and-fire neuron with spike rate adaptation (aEIF), which consists of two differential equations, for the membrane potential (V) and an adaptation current (w). Despite its simplicity, it can reproduce a wide variety of physiologically observed spiking patterns and can be fitted quantitatively to physiological recordings; once fitted, it is able to predict spike times on traces not used for model fitting. Here we compute the steady-state firing rate of the aEIF model in the presence of Gaussian synaptic noise, using two approaches. The first is based on the two-dimensional Fokker-Planck equation describing the (V,w) probability distribution, which is solved using an expansion in the ratio between the time constants of the two variables. The second is based on the firing rate of the EIF model, averaged over the distribution of the w variable.
These analytically derived closed-form expressions were tested on simulations of a large variety of model cells quantitatively fitted to in vitro electrophysiological recordings from pyramidal cells and interneurons. Theoretical predictions closely agreed with the firing rates of the simulated cells driven by in-vivo-like synaptic noise.

Item Open Access: Attractor Dynamics in Networks with Learning Rules Inferred from In Vivo Data (Neuron, 2018-07)
Authors: Pereira, Ulises; Brunel, Nicolas

The attractor neural network scenario is popular for modeling memory storage in the association cortex, but there is still a large gap between models based on this scenario and experimental data. We study a recurrent network model in which both the learning rules and the distribution of stored patterns are inferred from the distributions of visual responses to novel and familiar images in the inferior temporal cortex (ITC). Unlike classical attractor neural network models, our model exhibits graded activity in retrieval states, with distributions of firing rates that are close to lognormal. The inferred learning rules are close to maximizing the number of stored patterns within a family of unsupervised Hebbian learning rules, suggesting that learning rules in ITC are optimized to store a large number of attractor states. Finally, we show that there exist two types of retrieval states: one in which firing rates are constant in time, and another in which they fluctuate chaotically.

Item Open Access: Bayesian reconstruction of memories stored in neural networks from their connectivity
Authors: Goldt, Sebastian; Krzakala, Florent; Zdeborová, Lenka; Brunel, Nicolas

The advent of comprehensive synaptic wiring diagrams of large neural circuits has created the field of connectomics and given rise to a number of open research questions. One such question is whether it is possible to reconstruct the information stored in a recurrent network of neurons, given its synaptic connectivity matrix.
Here, we address this question by determining when such an inference problem is theoretically possible in specific attractor network models and by providing a practical algorithm for doing so. The algorithm builds on ideas from statistical physics to perform approximate Bayesian inference and is amenable to exact analysis. We study its performance on three different models and explore the limitations of reconstructing stored patterns from synaptic connectivity.

Item Open Access: Behavioral state and stimulus strength regulate the role of somatostatin interneurons in stabilizing network activity (bioRxiv, 2024-09-10)
Authors: Cammarata, Celine M; Pei, Yingming; Shields, Brenda C; Lim, Shaun SX; Hawley, Tammy; Li, Jennifer Y; St Amand, David; Brunel, Nicolas; Tadross, Michael R; Glickfeld, Lindsey L

Inhibition stabilization enables cortical circuits to encode sensory signals across diverse contexts. Somatostatin-expressing (SST) interneurons are well suited for this role through their strong recurrent connectivity with excitatory pyramidal cells. We developed a cortical circuit model predicting that SST cells become increasingly important for stabilization as sensory input strengthens. We tested this prediction in mouse primary visual cortex by manipulating excitatory input to SST cells, a key parameter for inhibition stabilization, using a novel cell-type-specific pharmacological method to selectively block glutamatergic receptors on SST cells. Consistent with our model's predictions, we find that antagonizing glutamatergic receptors drives a paradoxical facilitation of SST cells with increasing stimulus contrast. In addition, we find even stronger engagement of SST-dependent stabilization when the mice are aroused.
Thus, we reveal that the role of SST cells in cortical processing gradually switches as a function of both input strength and behavioral state.

Item Open Access: Bistability and up/down state alternations in inhibition-dominated randomly connected networks of LIF neurons (Scientific Reports, 2017-09-20)
Authors: Tartaglia, Elisa M; Brunel, Nicolas

Electrophysiological recordings in cortex in vivo have revealed a rich variety of dynamical regimes, ranging from irregular asynchronous states to a diversity of synchronized states, depending on species, anesthesia, and external stimulation. The average population firing rate in these states is typically low. We study analytically and numerically a network of sparsely connected excitatory and inhibitory integrate-and-fire neurons in the inhibition-dominated, low-firing-rate regime. For sufficiently high values of the external input, the network exhibits an asynchronous low-firing-frequency state (L). Depending on synaptic time constants, we show that two scenarios may occur as external inputs are decreased: (1) the L state can destabilize through a Hopf bifurcation, leading to synchronized oscillations spanning δ to β frequencies; (2) the network can reach a bistable region between the low-firing-frequency network state (L) and a quiescent one (Q). Adding an adaptation current to excitatory neurons leads to spontaneous alternations between the L and Q states, similar to experimental observations of alternations between UP and DOWN states.

Item Open Access: Burst-Dependent Bidirectional Plasticity in the Cerebellum Is Driven by Presynaptic NMDA Receptors (Cell Reports, 2016-04)
Authors: Bouvier, Guy; Higgins, David; Spolidoro, Maria; Carrel, Damien; Mathieu, Benjamin; Léna, Clément; Dieudonné, Stéphane; Barbour, Boris; Brunel, Nicolas; Casado, Mariano

Numerous studies have shown that cerebellar function is related to plasticity at the synapses between parallel fibers and Purkinje cells.
How specific input patterns determine plasticity outcomes, and the biophysics underlying plasticity at these synapses, remain unclear. Here, we characterize the patterns of activity that lead to postsynaptically expressed LTP using both in vivo and in vitro experiments. Similar to the requirements of LTD, we find that high-frequency bursts are necessary to trigger LTP and that this burst-dependent plasticity depends on presynaptic NMDA receptors and nitric oxide (NO) signaling. We provide direct evidence for calcium entry through presynaptic NMDA receptors in a subpopulation of parallel fiber varicosities. Finally, we develop and experimentally verify a mechanistic plasticity model based on NO and calcium signaling. The model reproduces the plasticity outcomes observed in the data and predicts the effect of arbitrary patterns of synaptic inputs on Purkinje cells, thereby providing a unified description of plasticity.

Item Open Access: Calcium-based plasticity model explains sensitivity of synaptic changes to spike pattern, rate, and dendritic location (Proceedings of the National Academy of Sciences, 2012-03)
Authors: Graupner, Michael; Brunel, Nicolas

Multiple stimulation protocols have been found to be effective in changing synaptic efficacy by inducing long-term potentiation or depression. In many of these protocols, increases in postsynaptic calcium concentration have been shown to play a crucial role. However, it is still unclear whether and how the dynamics of postsynaptic calcium alone determine the outcome of synaptic plasticity. Here, we propose a calcium-based model of a synapse in which potentiation and depression are activated above respective calcium thresholds. We show that this model gives rise to a large diversity of spike-timing-dependent plasticity curves, most of which have been observed experimentally in different systems.
It accounts quantitatively for plasticity outcomes evoked by protocols involving patterns with variable spike timing and firing rate in hippocampus and neocortex. Furthermore, it allows us to predict that differences in plasticity outcomes across studies are due to differences in the parameters defining the calcium dynamics. The model provides a mechanistic understanding of how various stimulation protocols provoke specific synaptic changes through the dynamics of the calcium concentration, with the thresholds implementing, in simplified fashion, the protein signaling cascades leading to long-term potentiation and long-term depression. The combination of biophysical realism and analytical tractability makes it an ideal candidate for studying plasticity at the synapse, neuron, and network levels.

Item Open Access: Cerebellar learning using perturbations (eLife, 2018-11-12)
Authors: Bouvier, Guy; Aljadeff, Johnatan; Clopath, Claudia; Bimbard, Célian; Ranft, Jonas; Blot, Antonin; Nadal, Jean-Pierre; Brunel, Nicolas; Hakim, Vincent; Barbour, Boris

The cerebellum aids the learning of fast, coordinated movements. According to current consensus, erroneously active parallel fibre synapses are depressed by complex spikes signalling movement errors. However, this theory cannot solve the credit assignment problem of processing a global movement evaluation into multiple cell-specific error signals. We identify a possible implementation of an algorithm solving this problem, whereby spontaneous complex spikes perturb ongoing movements, create eligibility traces, and signal error changes guiding plasticity. Error changes are extracted by adaptively cancelling the average error.
This framework, stochastic gradient descent with estimated global errors (SGDEGE), predicts synaptic plasticity rules that apparently contradict the current consensus but were supported by plasticity experiments in slices from mice under conditions designed to be physiological, highlighting the sensitivity of plasticity studies to experimental conditions. We analyse the algorithm's convergence and capacity. Finally, we suggest that SGDEGE may also operate in the basal ganglia.

Item Open Access: Characteristics of sequential activity in networks with temporally asymmetric Hebbian learning (Proceedings of the National Academy of Sciences, 2020-11-11)
Authors: Gillett, Maxwell; Pereira, Ulises; Brunel, Nicolas

Sequential activity has been observed in multiple neuronal circuits across species, neural structures, and behaviors. It has been hypothesized that sequences could arise from learning processes. However, it is still unclear whether biologically plausible synaptic plasticity rules can organize neuronal activity into sequences whose statistics match experimental observations. Here, we investigate temporally asymmetric Hebbian rules in sparsely connected recurrent rate networks and develop a theory of the transient sequential activity observed after learning. These rules transform a sequence of random input patterns into synaptic weight updates. After learning, recalled sequential activity is reflected in the transient correlation of network activity with each of the stored input patterns. Using mean-field theory, we derive a low-dimensional description of the network dynamics and compute the storage capacity of these networks. Multiple temporal characteristics of the recalled sequential activity are consistent with experimental observations. We find that the degree of sparseness of the recalled sequences can be controlled by nonlinearities in the learning rule.
Furthermore, sequences maintain robust decoding, but display highly labile dynamics, when synaptic connectivity is continuously modified by noise or by storage of other patterns, similar to recent observations in hippocampus and parietal cortex. Finally, we demonstrate that our results also hold in recurrent networks of spiking neurons with separate excitatory and inhibitory populations.

Item Open Access: Correlations between synapses in pairs of neurons slow down dynamics in randomly connected neural networks (2017-08-01)
Authors: Martí, Daniel; Brunel, Nicolas; Ostojic, Srdjan

Networks of randomly connected neurons are among the most popular models in theoretical neuroscience. The connectivity between neurons in the cortex is, however, not fully random; the simplest and most prominent deviation from randomness found in experimental data is the overrepresentation of bidirectional connections among pyramidal cells. Using numerical and analytical methods, we investigated the effects of partially symmetric connectivity on the dynamics of networks of rate units. We considered the two dynamical regimes exhibited by random neural networks: the weak-coupling regime, where the firing activity decays to a single fixed point unless the network is stimulated, and the strong-coupling or chaotic regime, characterized by internally generated fluctuating firing rates. In the weak-coupling regime, we computed analytically, for an arbitrary degree of symmetry, the autocorrelation of network activity in the presence of external noise. In the chaotic regime, we performed simulations to determine the timescale of the intrinsic fluctuations.
In both cases, symmetry increases the characteristic asymptotic decay time of the autocorrelation function and therefore slows down the dynamics of the network.

Item Open Access: Cortical dynamics during naturalistic sensory stimulations: experiments and models (Journal of Physiology, Paris, 2011-01)
Authors: Mazzoni, Alberto; Brunel, Nicolas; Cavallari, Stefano; Logothetis, Nikos K; Panzeri, Stefano

We report the results of our experimental and theoretical investigations of neural response dynamics in primary visual cortex (V1) during naturalistic visual stimulation. We recorded local field potentials (LFPs) and spiking activity from V1 of anaesthetized macaques during binocular presentation of Hollywood color movies. We analyzed these recordings with information-theoretic methods and found that visual information was encoded mainly by two bands of LFP responses: the network fluctuations measured by the phase and power of low-frequency (below 12 Hz) LFPs, and fast gamma-range (50-100 Hz) oscillations. Both the power and phase of low-frequency LFPs carried information largely complementary to that carried by spikes, whereas gamma-range oscillations carried information largely redundant with that of spikes. To interpret these results within a quantitative theoretical framework, we then simulated a sparsely connected recurrent network of excitatory and inhibitory neurons receiving slowly varying naturalistic inputs, and determined how the LFPs generated by the network encoded information about those inputs. We found that this simulated recurrent network reproduced well the experimentally observed dependency of LFP information upon frequency. The network encoded the overall strength of the input in the power of gamma-range oscillations generated by inhibitory-excitatory interactions, and encoded slow variations of the input by entraining the network LFP at the corresponding frequency.
This dynamical behavior accounted quantitatively for the independent information carried by high- and low-frequency LFPs, and for the experimentally observed cross-frequency coupling between the phase of slow LFPs and the power of gamma LFPs. We also present new results showing that the model's dynamics accounted for the extra visual information that the low-frequency LFP phase of spike firing carries beyond that carried by spike rates alone. Overall, our results suggest biological mechanisms by which cortex can multiplex information about naturalistic sensory environments.

Item Open Access: Coupled ripple oscillations between the medial temporal lobe and neocortex retrieve human memory (Science, 2019-03-01)
Authors: Vaz, Alex P; Inati, Sara K; Brunel, Nicolas; Zaghloul, Kareem A

Episodic memory retrieval relies on the recovery of neural representations of waking experience. This process is thought to involve a communication dynamic between the medial temporal lobe memory system and the neocortex. How this occurs is largely unknown, however, especially as it pertains to awake human memory retrieval. Using intracranial electroencephalographic recordings, we found that ripple oscillations were dynamically coupled between the human medial temporal lobe (MTL) and temporal association cortex. Coupled ripples were more pronounced during successful verbal memory retrieval and recovered the cortical neural representations of remembered items. Together, these data provide direct evidence that coupled ripples between the MTL and association cortex may underlie successful memory retrieval in the human brain.

Item Open Access: Dynamics of networks of excitatory and inhibitory neurons in response to time-dependent inputs (Frontiers in Computational Neuroscience, 2011)
Authors: Ledoux, Erwan; Brunel, Nicolas

We investigate the dynamics of recurrent networks of excitatory (E) and inhibitory (I) neurons in the presence of time-dependent inputs.
The dynamics are characterized by the network's dynamical transfer function, i.e., how the population firing rate is modulated by sinusoidal inputs at arbitrary frequencies. Two types of networks are studied and compared: (i) a Wilson-Cowan-type firing rate model, and (ii) a fully connected network of leaky integrate-and-fire (LIF) neurons in a strong-noise regime. We first characterize the region of stability of the "asynchronous state" (a state in which population activity is constant in time when external inputs are constant) in the space of parameters characterizing the connectivity of the network. We then systematically characterize the qualitative behaviors of the dynamical transfer function as a function of the connectivity. We find that the transfer function can be low-pass, or exhibit a single or double resonance, depending on the connection strengths and synaptic time constants. Resonances appear when the system is close to Hopf bifurcations, which can be induced by two separate mechanisms: the I-I connectivity and the E-I connectivity. Double resonances can appear when excitatory delays are larger than inhibitory delays, because two distinct instabilities then exist with a finite gap between the corresponding frequencies. In networks of LIF neurons, changes in external inputs and external noise are shown to be able to change the network transfer function qualitatively. Firing rate models are shown to exhibit the same diversity of transfer functions as the LIF network, provided delays are present. They can also exhibit input-dependent changes of the transfer function, provided a suitable static nonlinearity is incorporated.

Item Open Access: Emergence of irregular activity in networks of strongly coupled conductance-based neurons
Authors: Sanzeni, Alessandro; Histed, Mark H; Brunel, Nicolas

Cortical neurons are characterized by irregular firing and a broad distribution of rates.
The balanced state model explains these observations through a cancellation of mean excitatory and inhibitory currents, which leaves fluctuations to drive firing. In networks of neurons with current-based synapses, the balanced state emerges dynamically if coupling is strong, i.e., if the mean number of synapses per neuron $K$ is large and synaptic efficacy is of order $1/\sqrt{K}$. When synapses are conductance-based, current fluctuations are suppressed when coupling is strong, calling into question the applicability of the balanced-state idea to biological neural networks. We analyze networks of strongly coupled conductance-based neurons and show that asynchronous irregular activity and broad distributions of rates emerge if synaptic efficacy is of order $1/\log(K)$. In such networks, unlike in the standard balanced state model, current fluctuations are small and firing is maintained by a drift-diffusion balance. This balance emerges dynamically, without fine tuning, if inputs are smaller than a critical value that depends on synaptic time constants and coupling strength, and it is significantly more robust to connection heterogeneities than the classical balanced state. Our analysis makes experimentally testable predictions of how network response properties should evolve as input increases.

Item Open Access: Estimating network parameters from combined dynamics of firing rate and irregularity of single neurons (Journal of Neurophysiology, 2011-01)
Authors: Hamaguchi, Kosuke; Riehle, Alexa; Brunel, Nicolas

High firing irregularity is a hallmark of cortical neurons in vivo, and modeling studies suggest that a balance of excitation and inhibition is necessary to explain it. Such a balance must be generated, at least partly, by local interconnected networks of excitatory and inhibitory neurons, but the details of the local network structure are largely unknown.
The dynamics of neural activity depend on the local network structure, which in turn suggests the possibility of estimating network structure from the dynamics of the firing statistics. Here we report a new method to estimate properties of the local cortical network from the instantaneous firing rate and irregularity (CV2), under the assumption that recorded neurons are part of a randomly connected sparse network. The firing irregularity, measured in monkey motor cortex, exhibits two features: many neurons show relatively stable firing irregularity in time and across different task conditions, and the time-averaged CV2 is widely distributed from quasi-regular to irregular (CV2 = 0.3-1.0). For each recorded neuron, we estimate the three parameters of the local network [the balance of local excitation and inhibition, the number of recurrent connections per neuron, and the excitatory postsynaptic potential (EPSP) size] that best describe the dynamics of the measured firing rates and irregularities. Our analysis shows that the optimal parameter sets form a two-dimensional manifold in the three-dimensional parameter space that is confined, for most neurons, to the inhibition-dominated region. For a given excitatory/inhibitory balance, high-irregularity neurons tend to be more strongly connected to the local network than low-irregularity neurons, either through larger EPSP and inhibitory PSP sizes or through a larger number of recurrent connections.
Incorporating either synaptic short-term depression or conductance-based synapses leads many low-CV2 neurons to move into the excitation-dominated region, as well as to an increase in EPSP size.

Item Open Access: Firing rate of the leaky integrate-and-fire neuron with stochastic conductance-based synaptic inputs with short decay times
Authors: Oleskiw, Timothy D; Bair, Wyeth; Shea-Brown, Eric; Brunel, Nicolas

We compute the firing rate of a leaky integrate-and-fire (LIF) neuron with stochastic conductance-based inputs in the limit where synaptic decay times are much shorter than the membrane time constant. A comparison of our analytical results to numerical simulations is presented for a range of biophysically realistic parameters.
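Closed-form firing-rate results like the one summarized above are typically checked against direct numerical simulation. The sketch below illustrates the simplest form of such a check, with heavy caveats: it uses a single LIF neuron driven by Gaussian white noise in the current-based diffusion approximation, not the paper's conductance-based synapses, and all parameter values are assumptions chosen for illustration.

```python
import numpy as np

def lif_firing_rate(mu=1.2, sigma=0.3, tau_m=20e-3, v_th=1.0, v_reset=0.0,
                    dt=1e-5, t_sim=2.0, seed=1):
    """Estimate the firing rate (Hz) of a noise-driven LIF neuron by simulation.

    Illustrative parameters only; current-based diffusion approximation.
    """
    rng = np.random.default_rng(seed)
    steps = int(t_sim / dt)
    # Pre-draw the Gaussian white-noise increments for speed.
    noise = sigma * np.sqrt(dt / tau_m) * rng.standard_normal(steps)
    v, n_spikes = v_reset, 0
    for k in range(steps):
        v += (dt / tau_m) * (mu - v) + noise[k]  # Euler-Maruyama step
        if v >= v_th:                            # threshold crossing: spike
            v = v_reset                          # ...followed by reset
            n_spikes += 1
    return n_spikes / t_sim

rate = lif_firing_rate()
print(f"estimated firing rate: {rate:.1f} Hz")
```

With the suprathreshold mean drive assumed here (mu above threshold), the simulated rate lands in the tens of hertz; sweeping mu and sigma and comparing the resulting curve against an analytical expression is the standard validation pattern such papers describe.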