Stimulus-dependent Maximum Entropy Models of Neural Population Codes

Type: Article

Publication Date: 2013-03-14

Citations: 115

DOI: https://doi.org/10.1371/journal.pcbi.1002922

Abstract

Neural populations encode information about their stimulus in a collective fashion, by joint activity patterns of spiking and silence. A full account of this mapping from stimulus to neural activity is given by the conditional probability distribution over neural codewords given the sensory input. For large populations, direct sampling of these distributions is impossible, and so we must rely on constructing appropriate models. We show here that in a population of 100 retinal ganglion cells in the salamander retina responding to temporal white-noise stimuli, dependencies between cells play an important encoding role. We introduce the stimulus-dependent maximum entropy (SDME) model—a minimal extension of the canonical linear-nonlinear model of a single neuron, to a pairwise-coupled neural population. We find that the SDME model gives a more accurate account of single cell responses and in particular significantly outperforms uncoupled models in reproducing the distributions of population codewords emitted in response to a stimulus. We show how the SDME model, in conjunction with static maximum entropy models of population vocabulary, can be used to estimate information-theoretic quantities like average surprise and information transmission in a neural population.
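The abstract's stimulus-dependent maximum entropy model is commonly written as a pairwise Ising-like conditional distribution, P(σ|s) ∝ exp(Σᵢ hᵢ(s)σᵢ + Σᵢ<ⱼ Jᵢⱼσᵢσⱼ), where the local fields hᵢ(s) carry the stimulus dependence (e.g. via each cell's linear filter) and the couplings Jᵢⱼ capture the pairwise dependencies between cells. A minimal sketch of this form for a small population, assuming this standard parameterization (the field and coupling values below are hypothetical, not fitted to data):

```python
import itertools
import numpy as np

def sdme_log_weight(sigma, h, J):
    """Unnormalized log-probability of a binary codeword sigma under a
    pairwise stimulus-dependent maximum entropy (SDME) model.
    J is symmetric with zero diagonal; the 0.5 factor converts the
    full quadratic form into a sum over pairs i < j."""
    return sigma @ h + 0.5 * sigma @ J @ sigma

def sdme_distribution(h, J):
    """Exact conditional distribution P(sigma | stimulus) for a small
    population, by enumerating all 2^N spiking/silence codewords.
    (Feasible only for small N; large populations need sampling.)"""
    n = len(h)
    codewords = np.array(list(itertools.product([0, 1], repeat=n)))
    logw = np.array([sdme_log_weight(s, h, J) for s in codewords])
    p = np.exp(logw - logw.max())  # subtract max for numerical stability
    return codewords, p / p.sum()

# Hypothetical example: 3 cells. In the SDME setting, h would be
# recomputed at each time step from the stimulus (e.g. a linear
# filter output per cell), while J is stimulus-independent.
h = np.array([-1.0, 0.5, -0.2])
J = np.array([[0.0, 0.3, 0.0],
              [0.3, 0.0, -0.4],
              [0.0, -0.4, 0.0]])
codewords, p = sdme_distribution(h, J)
```

Setting J to zero recovers an uncoupled (conditionally independent) model, which is the comparison the abstract describes the SDME model outperforming.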

Locations

  • PubMed Central
  • arXiv (Cornell University)
  • Europe PMC (PubMed Central)
  • DOAJ (Directory of Open Access Journals)
  • PubMed
  • DataCite API
  • PLoS Computational Biology

Works That Cite This (48)

  • Nonlinear decoding of a complex movie from the mammalian retina (2018). Vicente Botella‐Soler, Stéphane Deny, Georg Martius, Olivier Marre, Gašper Tkačik
  • Transformation of Stimulus Correlations by the Retina (2013). Kristina D. Simmons, Jason Prentice, Gašper Tkačik, Jan Homann, Heather K. Yee, Stephanie E. Palmer, Philip Nelson, Vijay Balasubramanian
  • Separating intrinsic interactions from extrinsic correlations in a network of sensory neurons (2018). Ulisse Ferrari, Stéphane Deny, Matthew Chalk, Gašper Tkačik, Olivier Marre, Thierry Mora
  • What do we mean by the dimensionality of behavior? (2020). William Bialek
  • Time-Dependent Maximum Entropy Model for Populations of Retinal Ganglion Cells (2022). Geoffroy Delamare, Ulisse Ferrari
  • Exact computation of the maximum-entropy potential of spiking neural-network models (2014). Rodrigo Cofré, Bruno Cessac
  • Thermodynamics for a network of neurons: Signatures of criticality (2014). Gašper Tkačik, Thierry Mora, Olivier Marre, Dario Amodei, Michael J. Berry, William Bialek
  • Learning quadratic receptive fields from neural responses to natural stimuli (2012). Kanaka Rajan, Olivier Marre, Gašper Tkačik
  • Habit learning supported by efficiently controlled network dynamics in naive macaque monkeys (2020). Karol P. Szymula, Fabio Pasqualetti, Ann M. Graybiel, Theresa M. Desrochers, Danielle S. Bassett
  • Unsupervised Bayesian Ising Approximation for revealing the neural dictionary in songbirds (2019). Damián G. Hernández, Samuel J. Sober, Ilya Nemenman