Repeated sequential learning increases memory capacity via effective decorrelation in a recurrent neural network
The authors show that repeated learning decorrelates memory representations, yielding higher memory performance than classical models. This enhancement depends on the interplay between neural dynamics and synaptic plasticity. The model's spontaneous activity is shaped by learning to exhibit chaotic dynamics with intermittent correlations to the learned memories, as is …
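The capacity benefit of decorrelated memory representations can be illustrated with a toy model that is not the authors' recurrent network: a Hopfield-style network storing correlated ±1 patterns. A minimal sketch, assuming a shared template induces the correlations, compares one-shot Hebbian storage against the classical projection (pseudo-inverse) rule, which explicitly decorrelates the stored patterns:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200     # neurons
P = 30      # stored patterns
CORR = 0.8  # fraction of bits copied from a shared template -> correlated memories

# Correlated +/-1 patterns sharing a common template (illustrative construction)
template = rng.choice([-1, 1], size=N)
keep = rng.random((P, N)) < CORR
patterns = np.where(keep, template, rng.choice([-1, 1], size=(P, N)))

# One-shot Hebbian storage: crosstalk grows with pattern correlation
W_hebb = patterns.T @ patterns / N
np.fill_diagonal(W_hebb, 0)

# Projection (pseudo-inverse) rule: W = P^T (P P^T / N)^{-1} P / N
# decorrelates the patterns, making each one an exact fixed point
C = patterns @ patterns.T / N
W_proj = patterns.T @ np.linalg.inv(C) @ patterns / N

def recall_overlap(W, pattern, steps=30):
    """Run synchronous sign-dynamics from the clean pattern;
    return the final overlap with it (1.0 = perfect recall)."""
    s = pattern.copy()
    for _ in range(steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1
        if np.array_equal(s_new, s):
            break
        s = s_new
    return float(np.mean(s == pattern))

hebb = np.mean([recall_overlap(W_hebb, p) for p in patterns])
proj = np.mean([recall_overlap(W_proj, p) for p in patterns])
print(f"Hebbian recall overlap:    {hebb:.3f}")
print(f"Projection recall overlap: {proj:.3f}")
```

With strongly correlated patterns the Hebbian network drifts toward the shared template, while the decorrelating rule recalls every pattern exactly; the paper's point is that repeated sequential learning achieves a comparable decorrelation through the dynamics themselves rather than through an explicit matrix inversion.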