When basic visual and auditory features interact


Karla Evans

Kanwisher Lab, Brain & Cognitive Sciences, MIT and Visual Attention Lab, Brigham and Women's Hospital


The ability to detect relationships between different sensory events grants behavioral flexibility, permitting the substitution of one sensory channel for another as well as the integration of information from different sensory streams. The studies I will present examine the featural correspondence between basic auditory and visual features, exploring the nature of these interactions, their neural correlates, and the role of attention in them. Using both behavioral and brain imaging methods, I explored crossmodal interaction based on feature correspondence between auditory pitch and the visual features of vertical location, size, spatial frequency, and contrast. Using a speeded classification task, I compared performance between two bimodal conditions to determine which polarities of the two dimensions from different modalities mapped onto one another. For three out of four crossmodal pairings, congruent features resulted in crossmodal facilitation and incongruent features in interference, with the irrelevant dimension of one modality affecting performance on the relevant dimension of the other. Thus, for example, high pitch paired with high position, small size, or high spatial frequency gave better performance than high pitch paired with low position, large size, or low spatial frequency. Using an indirect measure, I was able to localize the interaction as occurring at least partly at the perceptual level.

EEG and fMRI studies revealed that the neuronal substrates of these interactions include the sensory-specific cortices and regions in the superior temporal sulcus (STS) and superior temporal gyrus (STG). Both the integration sites and the sensory cortices are modulated by congruency between auditory and visual features starting at about 250 ms post-stimulus. The timing of the EEG components suggests that the crossmodal interactions occur through convergence onto multimodal areas, which in turn send feedback projections to the sensory cortices.

Three further behavioral experiments show no effect of attentional load on these interactions, suggesting that they occur automatically. The congruency effect was no larger when attention was divided across both modalities than when it was focused on one, and it was equal in low- and high-load conditions.