Neuronal Computation Underlying Inferential Reasoning in Humans and Mice.

Barron HC, Reeve HM, Koolschijn RS, Perestenko PV, Shpektor A, Nili H, Rothaermel R, Campo-Urriza N, O'Reilly JX, Bannerman DM, Behrens TE, Dupret D

Every day we use our memory to guide the decisions we make. We can even infer relationships between separate life events. Yet it is not clear how the brain supports this process. Here, by conducting experiments with both mice and people, we show that brain cells in a region called the hippocampus support inference by linking memories of separate life events.

Scientific Abstract

Every day we make decisions critical for adaptation and survival. We repeat actions with known consequences. But we also draw on loosely related events to infer and imagine the outcome of entirely novel choices. These inferential decisions are thought to engage a number of brain regions; however, the underlying neuronal computation remains unknown. Here, we use a multi-day cross-species approach in humans and mice to report the functional anatomy and neuronal computation underlying inferential decisions. We show that during successful inference, the mammalian brain uses a hippocampal prospective code to forecast temporally structured learned associations. Moreover, during resting behavior, coactivation of hippocampal cells during sharp-wave/ripples represents inferred relationships that include reward, thereby "joining the dots" between events that have not been observed together but lead to profitable outcomes. Computing mnemonic links in this manner may provide an important mechanism for building a cognitive map that stretches beyond direct experience, thus supporting flexible behavior.

A cross-stimulus representational similarity matrix (RSM) comparing neuronal activity patterns evoked by visual cues (x-axis) and their paired auditory cues (y-axis). Red pixels indicate high similarity and blue pixels low similarity, showing that hippocampal activity patterns representing the visual cues are retrieved at the time of correct inference, while the subject is exposed only to the auditory cues.
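For readers unfamiliar with this kind of analysis: each entry of a cross-stimulus RSM is simply a similarity score (for example, a Pearson correlation) between the activity pattern evoked by one auditory cue and the pattern evoked by one visual cue. The sketch below is a minimal, hypothetical illustration in Python/NumPy; the function name cross_stimulus_rsm and the toy data are assumptions for demonstration, not the analysis code used in the paper.

```python
import numpy as np

def cross_stimulus_rsm(visual_patterns, auditory_patterns):
    """Correlate each visual-cue pattern with each auditory-cue pattern.

    visual_patterns:   (n_visual, n_features) array of activity patterns
    auditory_patterns: (n_auditory, n_features) array of activity patterns
    Returns an (n_auditory, n_visual) matrix of Pearson correlations;
    higher values indicate more similar neural representations.
    """
    def zscore(x):
        # Standardise each pattern across features so that the scaled
        # dot product between two patterns equals their Pearson correlation.
        return (x - x.mean(axis=1, keepdims=True)) / x.std(axis=1, keepdims=True)

    v = zscore(visual_patterns)
    a = zscore(auditory_patterns)
    n_features = v.shape[1]
    return a @ v.T / n_features

# Toy example: 3 auditory cues and 3 visual cues, 100 features (e.g. voxels or cells)
rng = np.random.default_rng(0)
rsm = cross_stimulus_rsm(rng.standard_normal((3, 100)), rng.standard_normal((3, 100)))
print(rsm.shape)  # (3, 3)
```

In a matrix like the one shown in the figure, relatively high correlations for the learned auditory-visual pairings indicate that the visual-cue representation is reinstated when only the auditory cue is presented.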
Citation
2020. Cell, 183(1):228-243.e21.
Related Content
Publication
Trouche S, Koren V, Doig NM, Ellender TJ, El-Gaby M, Lopes-Dos-Santos V, Reeve HM, Perestenko PV, Garas FN, Magill PJ, Sharott A, Dupret D
2019. Cell, 176(6):1393-1406.e16.

Publication
Koolschijn RS, Emir UE, Pantelides AC, Nili H, Behrens TE, Barron HC
2019. Neuron, 101:528-541.
Publication
Barron HC, Auksztulewicz R, Friston KJ
2020. Prog. Neurobiol., 192:101821.
Publication
Groschner LN, Chan Wah Hak L, Bogacz R, DasGupta S, Miesenböck G
2018. Cell, 173(4):894-905.
Publication
Koolschijn RS, Shpektor A, Clarke WT, Ip IB, Dupret D, Emir UE, Barron HC
2021. eLife, 10:e70071.