Imagine a busy street cafe. Your senses are bombarded with many different signals: a friend's juicy gossip, his facial expressions, the clinking of glasses and the roaring motor noise of a truck passing by. To make sense of this sensory cacophony, the brain needs to solve the causal inference, or binding, problem: it has to decide which signals come from common sources and should therefore be integrated, and which signals come from different sources and should be processed independently.
Noppeney: “In everyday life, the brain solves the causal inference problem seemingly effortlessly when crossing a busy street, talking to a friend in a noisy restaurant or shopping in the supermarket. Yet it is a tremendous computational challenge. How can the brain, with its very limited computational resources, accomplish this feat? The key hypothesis of my proposal is that the brain does not solve the causal inference problem exactly, but computes approximate solutions via attentional mechanisms. Rather than solving the causal inference problem simultaneously for the entire scene, it sequentially selects subsets of signals within an attentional spotlight and assesses the underlying source configurations only for those few signals. The brain successively shifts this attentional spotlight across the scene. By combining these local solutions, the brain eventually obtains a solution for the entire scene, which is not exact but approximates the exact solution sufficiently well.”
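To make the computational challenge concrete, here is a minimal sketch of the exact Bayesian causal-inference computation for just two signals, in the style of standard models of audiovisual integration. It computes the posterior probability that an auditory and a visual signal share a common source, assuming Gaussian noise and a Gaussian prior over source location; all parameter values are illustrative assumptions, not taken from the project itself.

```python
import math

def posterior_common(x_a, x_v, sigma_a=1.0, sigma_v=1.0,
                     sigma_p=10.0, p_common=0.5):
    """Posterior probability that auditory signal x_a and visual signal x_v
    come from a common source, under Gaussian likelihoods and a zero-mean
    Gaussian prior over source location (illustrative parameter values)."""
    va, vv, vp = sigma_a ** 2, sigma_v ** 2, sigma_p ** 2

    # Likelihood of the signal pair under ONE common source,
    # with the unknown source location integrated out analytically.
    var_c = va * vv + va * vp + vv * vp
    like_common = (math.exp(-0.5 * ((x_a - x_v) ** 2 * vp
                                    + x_a ** 2 * vv
                                    + x_v ** 2 * va) / var_c)
                   / (2 * math.pi * math.sqrt(var_c)))

    # Likelihood under TWO independent sources: each signal is explained
    # by its own source drawn from the prior.
    like_indep = (math.exp(-0.5 * x_a ** 2 / (va + vp))
                  / math.sqrt(2 * math.pi * (va + vp))
                  * math.exp(-0.5 * x_v ** 2 / (vv + vp))
                  / math.sqrt(2 * math.pi * (vv + vp)))

    # Bayes' rule over the two causal structures.
    return (p_common * like_common
            / (p_common * like_common + (1 - p_common) * like_indep))

# Nearby signals are probably one source; distant signals probably two.
print(posterior_common(0.0, 0.1))   # high probability of a common source
print(posterior_common(-5.0, 5.0))  # near zero
```

Even this pairwise case requires weighing every candidate causal structure; with many signals the number of possible source configurations explodes combinatorially, which is exactly why an approximation such as evaluating only a small attended subset at a time becomes attractive.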
The brain actively selects signals
This project will alter our perspective on how we perceive the world, moving from near-optimal passive perception in simple situations to active information gathering in the service of approximate solutions in more complex, dynamic multisensory environments. In a nutshell, the brain does not passively wait for signals to arrive, as studied in many typical cognitive neuroscience experiments. Instead, the brain actively selects subsets of signals for perception via attentional mechanisms, or even by directing its sensors via eye, head or hand movements, a strategy referred to as active sensing.
Noppeney: “In the longer term, the results of this project may also drive novel insights into the profound difficulties that older, sensory-impaired and neuropsychiatric patients face in everyday life. When older people say they can’t hear anymore, they may often still have near-normal audiograms, yet they fail to understand their relatives at a busy family gathering. The same goes for people with schizophrenia or autism spectrum disorder, who may experience difficulties in these complex environments with numerous signals and sources.” This project may help to tease apart their underlying failure modes. It may also have the potential to inspire synergistic interactions with artificial intelligence research.