Robust Information Storage and Consolidation in Attractor Neural Networks

Date

2023

Authors

Feng, Yu


Abstract

Long-term memory is believed to be stored in the human brain by modifying synapses in a neuronal activity-dependent way. This idea has been implemented in attractor neural network models, where the connection strength between neurons is determined by Hebbian synaptic plasticity rules. Classical studies of memory, which modeled synapses as continuous variables in networks of binary neurons, have shown that such networks can have large storage capacities. However, a growing body of evidence suggests that synapses in brain structures involved in memory, such as the hippocampus and the neocortex, are more digital than analog. How a large amount of information can be robustly stored with discrete-like synapses in the brain remains an open question in computational neuroscience.
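As a concrete illustration of the classical setup described above, the sketch below builds a small Hopfield-style attractor network with binary neurons and continuous Hebbian synapses, then recalls a stored pattern from a corrupted cue. The network size, pattern count, and update schedule are illustrative assumptions for the demo, not the specific model studied in the dissertation.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 10                            # network size and pattern count (assumed)
xi = rng.choice([-1, 1], size=(P, N))     # random binary memory patterns

# Hebbian rule: J_ij = (1/N) * sum_mu xi[mu, i] * xi[mu, j], no self-coupling
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

def retrieve(s, steps=20):
    """Iterate deterministic sign updates; at low memory load the state
    settles on the stored pattern nearest the cue (an attractor)."""
    for _ in range(steps):
        s = np.where(J @ s >= 0, 1, -1)
    return s

# Cue the network with a corrupted copy of pattern 0 (10% of bits flipped)
cue = xi[0].copy()
flipped = rng.choice(N, size=N // 10, replace=False)
cue[flipped] *= -1
overlap = retrieve(cue) @ xi[0] / N       # 1.0 means perfect recall
print(f"overlap with stored pattern: {overlap:.2f}")
```

At low memory load (P/N well below the network's capacity), the overlap returned is close to 1: the corrupted cue falls into the basin of attraction of the stored pattern.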

In this study, we explored a series of synaptic plasticity rules for discrete-like synapses and investigated how their application in attractor neural networks affects the memory function of the system. We derived mean-field equations to calculate the storage capacity of the network. We first studied a network with a binarized Hebbian learning rule, showing that such a network achieves a near-optimal storage capacity within the space of all possible binary connectivity matrices. We then investigated a model with double-well synapses, in which each synapse is described by a continuous variable that evolves in a potential with multiple minima. We showed that this model can interpolate between models with discrete synapses and models with continuous synapses by varying the shape of the potential. Our results indicate that discrete-like synapses can benefit neural networks by increasing their robustness to noise. Finally, we combined the double-well synapse model with a memory consolidation mechanism and showed that consolidation can significantly enhance the storage capacity of the network, producing the power-law decay of the forgetting curve observed in psychological experiments.
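The two synapse variants named above can be sketched in the same style. The snippet below illustrates a binarized Hebbian rule (keeping only the sign of each Hebbian weight) and gradient dynamics in a double-well potential U(w) = a(w^2 - 1)^2, whose two minima at w = ±1 make each synapse behave discretely while the barrier parameter a interpolates toward a continuous synapse. The quartic potential, parameter values, and drive term are hypothetical choices for illustration, not the dissertation's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 200, 10
xi = rng.choice([-1, 1], size=(P, N))
H = (xi.T @ xi) / N                       # continuous Hebbian weights
np.fill_diagonal(H, 0.0)

# (1) Binarized Hebbian rule: keep only the sign of each Hebbian weight,
# giving a two-state (discrete) connectivity matrix.
J_binary = np.where(H >= 0, 1, -1)
np.fill_diagonal(J_binary, 0)

# (2) Double-well synapse: each weight w evolves in a potential
# U(w) = a * (w**2 - 1)**2 with minima at w = +/-1; the barrier height a
# controls how "discrete" the synapse is (a -> 0 approaches a continuous
# synapse). One gradient step plus a Hebbian drive term:
def double_well_step(w, drive, a=1.0, dt=0.05):
    dU = 4.0 * a * w * (w**2 - 1.0)       # U'(w)
    return w + dt * (-dU + drive)

w = rng.normal(0.0, 0.1, size=(N, N))     # start near the barrier at w = 0
for _ in range(200):
    w = double_well_step(w, drive=H)
frac = np.mean(np.abs(np.abs(w) - 1.0) < 0.1)
print(f"fraction of synapses settled near +/-1: {frac:.2f}")
```

Varying the shape of the potential (here, the single parameter a) is the knob the abstract describes for interpolating between discrete-synapse and continuous-synapse models.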

Citation

Feng, Yu (2023). Robust Information Storage and Consolidation in Attractor Neural Networks. Dissertation, Duke University. Retrieved from https://hdl.handle.net/10161/29111.

Except where otherwise noted, student scholarship that was shared on DukeSpace after 2009 is made available to the public under a Creative Commons Attribution / Non-commercial / No derivatives (CC-BY-NC-ND) license. All rights in student work shared on DukeSpace before 2009 remain with the author and/or their designee, whose permission may be required for reuse.