Efficiently encoding noisy inputs in a recurrent network with adaptive representational dimensionality

L Duong, T Yerxa, E P Simoncelli and D Lipshutz.

Published in Computational and Systems Neuroscience (CoSyNe), Mar 2026.

Efficient coding theory posits that sensory systems are optimized to transmit as much information as possible given available resources. The theory predicts a fundamental trade-off: high-dimensional, decorrelated codes are optimal for clean signals, whereas low-dimensional, redundant codes are superior for noisy signals. However, the circuit-level mechanisms that could flexibly modulate the dimensionality of a population response are unknown. We propose a normative circuit model in which recurrent connectivity dynamically adjusts the dimensionality of the neural population code.

We derive our model from a novel unsupervised objective that balances input reconstruction against a geometric regularizer based on the participation ratio, a measure of population response dimensionality. This objective can be optimized by a circuit whose synaptic weights are learned online via local Hebbian plasticity, with a single hyperparameter that sets the strength of recurrence and thereby shapes the dimensionality of the code. We also show that this circuit can be structured to satisfy Dale’s law, in which case the balance between recurrent excitation (E) and inhibition (I) controls dimensionality, demonstrating how E-I balance can shape the population code.
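
The abstract does not restate the definition, but the participation ratio is a standard dimensionality measure: for a population response covariance with eigenvalues λ_i, it equals (Σ_i λ_i)² / Σ_i λ_i², ranging from 1 (all variance along a single dimension) to N (variance spread equally across N neurons). The NumPy sketch below illustrates only this quantity; it is not the circuit model or learning rule described in the abstract.

```python
import numpy as np

def participation_ratio(responses):
    """Dimensionality of a population response matrix (samples x neurons):
    (sum of covariance eigenvalues)^2 / (sum of squared eigenvalues)."""
    cov = np.cov(responses, rowvar=False)   # neuron-by-neuron covariance
    eigvals = np.linalg.eigvalsh(cov)       # covariance is symmetric
    return eigvals.sum() ** 2 / np.sum(eigvals ** 2)

# Low-dimensional (redundant) responses give a small participation ratio;
# decorrelated, equal-variance responses give a value near the neuron count.
rng = np.random.default_rng(0)
low_dim = rng.normal(size=(5000, 2)) @ rng.normal(size=(2, 50))  # rank-2 responses of 50 neurons
isotropic = rng.normal(size=(5000, 50))                          # full-rank responses
print(participation_ratio(low_dim))    # at most 2
print(participation_ratio(isotropic))  # close to 50
```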
