Generative replay for compositional visual understanding in the prefrontal-hippocampal circuit

Abstract

Understanding the visual world is a constructive and compositional process. Whilst a frontal-hippocampal circuit is known to be essential for this task, little is known about the associated neuronal computations. Although visual understanding appears superficially distinct from other known functions of this circuit, such as spatial reasoning, compositional inference, and model-based planning, recent models suggest deeper computational similarities. Here, using fMRI, we show that representations of visual objects in these brain regions are relational and compositional – key computational properties theorised to support rapid construction of hippocampal maps. Using MEG, we show that rapid sequences of representations, akin to replay in spatial navigation and planning problems, are also engaged in visual understanding. Whilst these sequences have previously been proposed as mechanisms to plan possible futures or learn from the past, here they are used to understand the present. Replay sequences change content over timescales of understanding and form constructive hypotheses about possible object configurations. These hypotheses play out in an order that supports relational inference, progressing from predictable to uncertain scene elements, gradually constraining possible configurations, and converging on the correct configuration. Together, these results suggest a computational bridge between apparently distinct functions of hippocampal-prefrontal circuitry, and a role for generative replay in compositional inference and hypothesis testing.

Authors

Zeb Kurth-Nelson, Matt Botvinick, Tim Behrens*, Ray Dolan*, Philipp Schwartenbeck*, Yunzhe Liu*, Tim Muller*, Shirley Mark*, Alon Baram*

Venue

Cell