In previous work, we showed that the visual cortex tracks acoustic amplitude modulations accompanying lip movements during silently presented visual speech. Whether similar visuo-phonological transformation processes also exist for spectral modulations is unknown; together, such processes could support the integration of auditory and visual cues for speech processing. Given the increasing hearing difficulties in elderly individuals, we were also interested in how these processes change as a function of age. Participants, seated in the MEG, watched silent videos of a speaker and paid attention to the lip movements. We found that the visual cortex tracks not only the unheard speech envelope, but also the unheard modulations of resonant frequencies (formants) and of pitch (the fundamental frequency) linked to the lip movements, a process that is generally related to speech comprehension. Interestingly, only the processing of intelligible unheard formants decreases significantly with age, in the visual and also in the cingulate cortex. This is not the case for the processing of the unheard speech envelope, the fundamental frequency, or the purely visual information carried by lip movements. These results show that not only the global unheard speech envelope, but also unheard spectral fine details, are transformed from a merely visual to a phonological representation. Aging especially affects the ability to derive spectral dynamics at formant frequencies. Since listening in noisy environments should capitalize on the ability to track spectral fine details, our results provide a novel focus on compensatory processes in such challenging situations.
bioRxiv Subject Collection: Neuroscience