Solvable neural network model for input-output associations: Optimal recall at the onset of chaos

Type: Article

Publication Date: 2023-12-08

Citations: 0

DOI: https://doi.org/10.1103/physrevresearch.5.043221

Abstract

In neural information processing, inputs modulate neural dynamics to generate desired outputs. To unravel the dynamics and the underlying neural connectivity that enable such input-output associations, we propose an exactly solvable neural-network model whose connectivity matrix is explicitly constructed from the inputs and required outputs. An analytic form of the response under the input is derived, and three distinct types of responses, including chaotic dynamics, are obtained through bifurcations as the input strength is varied, depending on the neural sensitivity and the number of inputs. Optimal performance is achieved at the onset of chaos. The relevance of the results to cognitive dynamics is discussed.
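The construction described in the abstract can be illustrated with a minimal toy model. This is a sketch under stated assumptions, not the paper's exact formulation: the binary patterns `eta` (inputs) and `xi` (required outputs), the Hebbian-style cross-term connectivity, the gain `beta`, and the input-strength parameter `gamma` are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 3                      # neurons, input-output pairs (assumed sizes)
eta = rng.choice([-1, 1], (P, N))  # input patterns (assumed binary)
xi = rng.choice([-1, 1], (P, N))   # required output patterns (assumed binary)

# Connectivity built explicitly from inputs and required outputs:
# an illustrative Hebbian-style cross term mapping each input toward
# its required output (the paper's exact construction may differ).
J = (xi.T @ eta + eta.T @ xi) / N

def recall_overlap(mu, gamma, beta=2.0, steps=2000, dt=0.1):
    """Relax x' = -x + tanh(beta * (J x + gamma * eta_mu)) and return
    the overlap of the final state with the required output xi_mu."""
    x = 0.1 * rng.standard_normal(N)
    for _ in range(steps):
        x += dt * (-x + np.tanh(beta * (J @ x + gamma * eta[mu])))
    return float(x @ xi[mu] / N)  # overlap in [-1, 1]; 1 = perfect recall

# Sweep the input strength gamma to probe how recall quality changes,
# mirroring the bifurcation-against-input-strength analysis in the abstract.
for gamma in (0.1, 0.5, 1.0):
    print(f"gamma={gamma}: overlap={recall_overlap(0, gamma):.2f}")
```

In this sketch, varying `gamma` plays the role of the input strength against which the paper reports distinct response regimes; the actual analytic response form and the chaotic regime require the model as defined in the paper itself.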

Locations

  • Physical Review Research
  • arXiv (Cornell University)

Similar Works

  • Solvable Neural Network Model for Input-Output Associations: Optimal Recall at the Onset of Chaos (2023) - Tomoki Kurikawa, Kunihiko Kaneko
  • Optimal Input Representation in Neural Systems at the Edge of Chaos (2021) - Guillermo B. Morales, Miguel A. Muñoz
  • Nonlinear Neural Dynamics and Classification Accuracy in Reservoir Computing (2024) - Claus Metzner, Achim Schilling, Andreas Maier, Patrick Krauß
  • An attractor neural network architecture with an ultra high information capacity: numerical results (2015) - Alireza Alemi
  • Attractor and integrator networks in the brain (2021) - Mikail Khona, Ila Fiete
  • Optimal Machine Intelligence at the Edge of Chaos (2019) - Ling Feng, Lin Zhang, Choy Heng Lai
  • Linear-Threshold Network Models for Describing and Analyzing Brain Dynamics (2024) - Michael McCreesh, Erfan Nozari, Jorge Cortés
  • Response functions improving performance in analog attractor neural networks (1994) - Nicolas Brunel, Riccardo Zecchina
  • Input-Driven Dynamics for Robust Memory Retrieval in Hopfield Networks (2024) - Simone Betteti, Giacomo Baggio, Francesco Bullo, Sandro Zampieri
  • Excitatory/inhibitory balance emerges as a key factor for RBN performance, overriding attractor dynamics (2023) - Emmanuel Calvet, Jean Rouat, Bertrand Reulet
  • The dynamics of neural codes in biological and artificial neural networks (2024) - Guillermo B. Morales
  • Continuous or discrete attractors in neural circuits? A self-organized switch at maximal entropy (2007) - Alberto Bernacchia
  • Learning continuous chaotic attractors with a reservoir computer (2022) - Lindsay M. Smith, Jason Z. Kim, Zhixin Lu, Dani S. Bassett
  • Chaos-guided Input Structuring for Improved Learning in Recurrent Neural Networks (2017) - Priyadarshini Panda, Kaushik Roy
  • Cross-Frequency Coupling Increases Memory Capacity in Oscillatory Neural Networks (2022) - Connor Bybee, Alexander Belsten, Friedrich T. Sommer
  • Charting and navigating the space of solutions for recurrent neural networks (2021) - Elia Turner, Kabir Dabholkar, Omri Barak

Works That Cite This (0)
