Project
Cross-modal emotional context-dependent filtering: origin and neuronal mechanisms.
I recently demonstrated that monkeys interpret complex multimodal social stimuli and associate them based on their shared social meaning. In an emotional context established by visual stimuli, vocalisations elicit high or low brain activity depending on their relevance to the visually primed social context. Specifically, responses to affiliative vocalisations are boosted in affiliative contexts and blocked in aggressive ones. Surprisingly, the outcome of this context-dependent filtering (CDF) is opposite to the effects expected under predictive coding. We first aim to determine whether this unexpected cross-modal CDF generalises. Specifically, we will test whether CDF also occurs when contexts are created by social auditory stimuli, thereby filtering visual processing. We will identify CDF effects throughout the brain using sub-mm fMRI, and study them in detail in specific regions with functional ultrasound imaging (fUSI). We next aim to identify the areas driving CDF by i) determining candidate source areas using task-based functional connectivity analyses (sketched below) and ii) chemogenetically inactivating them. We predict that inactivating potential (prefrontal) source areas will block CDF in sensory cortex (read out with fMRI and fUSI). Finally, we will manipulate the context to determine when CDF versus predictive coding mechanisms occur (using fUSI); we predict that filtering takes more time to establish, as contexts need to be learnt, whereas predictive coding mechanisms arise swiftly.
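To make the connectivity step concrete, the following is a minimal, purely illustrative sketch of a PPI-style task-based functional connectivity analysis in Python with NumPy. The toy data, block design, and regressor names are assumptions for illustration only, not the project's actual pipeline.

```python
# Hypothetical sketch of task-based functional connectivity (PPI-style).
# All data and names below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_vols = 240                       # fMRI volumes (TRs)

# Toy task regressor: alternating affiliative (1) / aggressive (0) blocks
task = (np.arange(n_vols) // 20) % 2

# Toy ROI time series: a putative prefrontal "seed" and a sensory "target",
# coupled only during affiliative (task == 1) blocks
seed = rng.standard_normal(n_vols)
target = 0.6 * task * seed + rng.standard_normal(n_vols)

# PPI design matrix: seed, task, and their interaction (regressor of interest)
X = np.column_stack([
    np.ones(n_vols),               # intercept
    seed,                          # physiological regressor
    task,                          # psychological (context) regressor
    seed * (task - task.mean()),   # psychophysiological interaction
])
beta, *_ = np.linalg.lstsq(X, target, rcond=None)
print(f"PPI (context-dependent coupling) beta: {beta[3]:.2f}")
```

The interaction beta quantifies how seed-to-target coupling changes with the social context; in an analysis of this kind, such context-dependent coupling is what would nominate candidate source areas for subsequent inactivation.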