In their recent publication in Science Advances, researchers at the Donders Institute show that the brain actively predicts how objects should appear based on the three-dimensional structure of their surroundings. When objects rotated in a way that matched the geometry of the scene, their representations in the visual cortex were stronger. Remarkably, even when an object was completely hidden from view, its expected orientation could still be read out from brain activity.
In one of the studies, participants were shown realistic rooms with a central object, such as a bed or a couch. The object was viewed from one of two possible angles, and on each trial the viewpoint changed as the room rotated in discrete steps. During the first two snapshots, the object was fully visible, allowing participants to encode its position within the room. In the following three snapshots, the object was hidden, so only the rotating room was visible. In the final snapshot, the object reappeared, either congruent with the room's rotation or incongruent, creating a mismatch between expected and actual orientation. Importantly, the total rotation (30° or 90°) varied, so participants could only infer the object's new orientation from the changing viewpoint of the room. This design allowed the researchers to measure how the brain dynamically updates object representations in response to predictable changes in the environment.