Max-Planck-Institut für empirische Ästhetik
Lecture by Hyojin Park: Neural oscillatory mechanisms in dynamic
information representation during natural audiovisual speech perception
Recent evidence suggests that brain rhythms track the acoustic envelope of auditory speech (speech entrainment) and that this mechanism facilitates speech intelligibility. We recently demonstrated that this is the case for visual speech (lip movements) as well. This has led us to ask to what extent auditory and visual information are represented in brain areas, either jointly or individually. Which system conveys shared information from multisensory inputs, and which system represents the inputs synergistically? In my talk, I will present our recent work (Park et al., 2018, PLoS Biol), which shows how information in entrained auditory and visual speech interacts to facilitate speech comprehension. Here we used a novel information-theoretic approach, partial information decomposition (PID), to decompose dynamic information quantities. In addition, I will discuss our recent advances on the question of linking function to anatomy via diffusion tensor imaging, studying whether individual differences in white matter integrity differentially predict the speech entrainment and information interaction effects we observed.
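To illustrate the idea behind PID, the sketch below decomposes the mutual information that two sources carry about a target into redundant, unique, and synergistic components. It uses the simple minimal-mutual-information (MMI) redundancy measure for illustration only (the specific redundancy measure and the continuous/dynamic estimators used in Park et al., 2018 differ); the XOR example is a standard textbook case, not data from the talk.

```python
import math

def mutual_info(pxy):
    """I(X;Y) in bits from a joint distribution given as a dict {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in pxy.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

# Toy system: X1, X2 are uniform bits, target Y = X1 XOR X2.
joint = {(x1, x2, x1 ^ x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}

# Marginal joints needed for the three mutual-information terms.
p1y, p2y, p12y = {}, {}, {}
for (x1, x2, y), p in joint.items():
    p1y[(x1, y)] = p1y.get((x1, y), 0.0) + p
    p2y[(x2, y)] = p2y.get((x2, y), 0.0) + p
    p12y[((x1, x2), y)] = p12y.get(((x1, x2), y), 0.0) + p

i1 = mutual_info(p1y)    # I(X1; Y)
i2 = mutual_info(p2y)    # I(X2; Y)
i12 = mutual_info(p12y)  # I(X1, X2; Y)

# PID under the MMI redundancy measure (an illustrative choice):
redundancy = min(i1, i2)
unique1 = i1 - redundancy
unique2 = i2 - redundancy
synergy = i12 - i1 - i2 + redundancy

print(f"redundant={redundancy:.2f}  unique1={unique1:.2f}  "
      f"unique2={unique2:.2f}  synergistic={synergy:.2f} bits")
```

For XOR, neither source alone is informative about the target, yet together they determine it completely, so the full 1 bit of joint information is classified as synergy; conversely, duplicated inputs would show up entirely as redundancy.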