Learning efficient task-dependent representations with synaptic plasticity

C Bredenberg, E P Simoncelli and C Savin

bioRxiv, Technical Report 2020.06.19.162172, Jun 2020.

DOI: 10.1101/2020.06.19.162172

This paper has been superseded by:
Learning efficient task-dependent representations with synaptic plasticity
C Bredenberg, E P Simoncelli and C Savin.
Adv. Neural Information Processing Systems (NeurIPS), vol. 33, Dec 2020.


Download:

  • Reprint (pdf)

  • Neural populations do not perfectly encode the sensory world: their capacity is limited by the number of neurons, metabolic and other biophysical resources, and intrinsic noise. The brain is presumably shaped by these limitations, improving efficiency by discarding some aspects of incoming sensory streams, while preferentially preserving commonly occurring, behaviorally relevant information. Here we construct a stochastic recurrent neural circuit model that can learn efficient, task-specific sensory codes using a novel form of reward-modulated Hebbian synaptic plasticity. We illustrate the flexibility of the model by training an initially unstructured neural network to solve two different tasks: stimulus estimation, and stimulus discrimination. The network achieves high performance in both tasks by appropriately allocating resources and using its recurrent circuitry to best compensate for different levels of noise. We also show how the interaction between stimulus priors and task structure dictates the emergent network representations. (A generic, illustrative sketch of this style of learning rule appears below.)
  • Superseded Publications: Bredenberg19a
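To make the abstract's description concrete, here is a minimal sketch of a generic reward-modulated (three-factor) Hebbian update on a toy stimulus-estimation task. It is not the rule derived in the paper: all sizes, hyperparameters, and the exact form of the update are illustrative assumptions; only the overall structure (a noisy recurrent response, a task-dependent scalar reward, and a reward-gated Hebbian weight change) reflects the class of rule the abstract refers to.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and hyperparameters (not taken from the paper).
n_in, n_hid = 1, 20
W_in = rng.normal(0.0, 0.5, (n_hid, n_in))   # feedforward weights (plastic)
W_rec = np.zeros((n_hid, n_hid))             # recurrent weights (plastic)
w_out = rng.normal(0.0, 0.1, n_hid)          # fixed linear readout
eta = 1e-4                                   # learning rate
sigma = 0.1                                  # intrinsic noise level
r_bar = 0.0                                  # running reward baseline

def respond(stim):
    """Noisy population response to a scalar stimulus (one recurrent pass)."""
    drive = W_in @ np.atleast_1d(stim)
    rate = np.tanh(drive)
    rate = np.tanh(drive + W_rec @ rate)
    return rate + sigma * rng.normal(size=n_hid)

for t in range(5000):
    s = rng.normal()                         # stimulus drawn from its prior
    x = respond(s)                           # noisy network response
    s_hat = w_out @ x                        # readout estimate of the stimulus
    reward = -(s_hat - s) ** 2               # task-dependent reward (estimation)

    # Three-factor (reward-modulated Hebbian) update: the reward signal,
    # relative to a running baseline, gates the product of pre- and
    # post-synaptic activity.
    m = reward - r_bar
    W_rec += eta * m * np.outer(x, x)
    W_in += eta * m * np.outer(x, np.atleast_1d(s))
    r_bar = 0.95 * r_bar + 0.05 * reward
```

The sketch only conveys the three-factor structure (presynaptic activity, postsynaptic activity, and a global modulatory signal); the rule in the paper is derived from a specific objective and handles resource constraints, noise, and task structure in ways this toy example does not.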