Foundations of speech perception and language comprehension

Projects

Neural Correlates of Affect Perception in Screams

Screaming is an ability we share with many other primates, and one we possess long before we learn to express our affective states through speech. Previous studies of fearful screams have highlighted acoustic features, such as roughness, that are left unexploited by speech (Arnal et al., 2015) and that drive activation of the amygdala and other subcortical structures critical for danger appraisal.
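
Roughness refers to fast amplitude modulations of the waveform. As a rough sketch of how such a feature can be quantified (not the analysis pipeline of Arnal et al., 2015), the Python snippet below estimates the Hilbert amplitude envelope of a recording and measures the share of its modulation-spectrum energy in the 30-150 Hz band commonly associated with rough screams; the file names and band edges are illustrative assumptions.

```python
# Sketch: quantify "roughness" as the share of amplitude-modulation energy
# in an assumed 30-150 Hz band. File names and band edges are illustrative.
import numpy as np
import soundfile as sf
from scipy.signal import hilbert

def roughness_ratio(wav_path, band=(30.0, 150.0)):
    x, sr = sf.read(wav_path)          # waveform and sampling rate
    if x.ndim > 1:
        x = x.mean(axis=1)             # collapse to mono if stereo
    envelope = np.abs(hilbert(x))      # Hilbert amplitude envelope
    envelope -= envelope.mean()        # remove DC before spectral analysis
    spectrum = np.abs(np.fft.rfft(envelope)) ** 2
    freqs = np.fft.rfftfreq(len(envelope), d=1.0 / sr)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[in_band].sum() / spectrum.sum()

# Example: a scream would be expected to yield a higher ratio than speech.
# print(roughness_ratio("scream.wav"), roughness_ratio("speech.wav"))
```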

Cognitive Processes behind Prosodic Perception

The tone of the voice carries information about a speaker's emotional state or intentions. Whereas the acoustic features that distinguish prosodic signals have attracted considerable attention over the last decades (particularly since Banse & Scherer, 1996), how emotions and intentions are actually communicated remains poorly understood.
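
To give a concrete sense of the acoustic side of this question, the sketch below extracts two elementary prosodic descriptors, the fundamental-frequency (F0) contour and frame-wise intensity, from a single utterance. It is a minimal illustration, not the feature set of Banse & Scherer (1996); the file name, pitch search range, and use of librosa are assumptions.

```python
# Sketch: extract two simple prosodic descriptors (F0 statistics and RMS
# intensity) from an utterance. The file name and the 65-600 Hz pitch
# search range are placeholder assumptions.
import numpy as np
import librosa

def prosodic_summary(wav_path):
    y, sr = librosa.load(wav_path, sr=None)
    # Probabilistic YIN pitch track; unvoiced frames come back as NaN.
    f0, _, _ = librosa.pyin(y, fmin=65, fmax=600, sr=sr)
    rms = librosa.feature.rms(y=y)[0]          # frame-wise intensity proxy
    return {
        "f0_mean_hz": float(np.nanmean(f0)),
        "f0_range_hz": float(np.nanmax(f0) - np.nanmin(f0)),
        "intensity_mean": float(rms.mean()),
    }

# Example: angry vs. neutral renditions of the same sentence typically
# differ in mean F0, F0 range, and intensity.
# print(prosodic_summary("utterance.wav"))
```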


A Psychophysical Study

In a foundational study published in 1955, Miller and Nicely measured the perceptual confusions among 16 consonants followed by the vowel [a] (ba, da, ga, etc.). The stimuli were subjected to different kinds of linear distortion, namely additive noise and variations in bandwidth.
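
As an illustration of the two kinds of distortion mentioned above, the sketch below degrades a syllable waveform with additive white noise at a chosen signal-to-noise ratio and with a low-pass filter that reduces its bandwidth. The specific SNR and cutoff values are placeholders, not the conditions of the original study.

```python
# Sketch of the two distortion types: additive noise at a chosen
# signal-to-noise ratio and band limiting via low-pass filtering.
# SNR and cutoff values are illustrative, not those of Miller & Nicely (1955).
import numpy as np
from scipy.signal import butter, sosfilt

def add_noise(x, snr_db):
    """Mix in white noise so that the signal-to-noise ratio equals snr_db."""
    signal_power = np.mean(x ** 2)
    noise_power = signal_power / (10 ** (snr_db / 10))
    noise = np.random.normal(0.0, np.sqrt(noise_power), size=x.shape)
    return x + noise

def band_limit(x, sr, cutoff_hz):
    """Reduce bandwidth by low-pass filtering at cutoff_hz."""
    sos = butter(8, cutoff_hz, btype="low", fs=sr, output="sos")
    return sosfilt(sos, x)

# Example: degrade a syllable at 0 dB SNR and a 2.5 kHz bandwidth, then
# collect listeners' consonant identifications to build a confusion matrix.
# degraded = band_limit(add_noise(syllable, snr_db=0), sr, cutoff_hz=2500)
```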


Bilingualism and the Size of Auditory Cortical Areas

Bilingualism has become increasingly common. The infant auditory cortex undergoes structural maturation during the first three years of life (Yakovlev & Lecours, 1967), during which infants develop the capacity to recognize the specific acoustic patterns of their native language (e.g., Kuhl et al., 1992).


Cortical Tracking of Complex Spoken Sentences

Humans naturally tune in to the rhythm of speech (Giraud & Poeppel, 2012). Recent work has shown that low-frequency brain rhythms concurrently track the main constituents of the linguistic hierarchy, such as phrases and sentences (Ding et al., 2016).
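
A minimal sketch of the frequency-tagging logic behind such findings, assuming a single neural channel sampled at a known rate and the syllable / phrase / sentence presentation rates of 4, 2, and 1 Hz used by Ding et al. (2016): if listeners track each linguistic level, the response spectrum should show peaks at all three rates rather than only at the acoustic syllable rate. The channel variable and sampling rate below are assumptions.

```python
# Sketch of the frequency-tagging logic: with syllables at 4 Hz, phrases at
# 2 Hz, and sentences at 1 Hz, cortical tracking of each level predicts a
# spectral peak at each rate in the neural response.
import numpy as np
from scipy.signal import welch

def peak_snr(neural_signal, sr, target_hz, neighbor_hz=0.5):
    """Power at target_hz relative to the average power of nearby bins."""
    freqs, psd = welch(neural_signal, fs=sr, nperseg=int(sr * 10))
    target = psd[np.argmin(np.abs(freqs - target_hz))]
    neighbors = psd[(np.abs(freqs - target_hz) > 0.05) &
                    (np.abs(freqs - target_hz) <= neighbor_hz)]
    return target / neighbors.mean()

# Example: peaks at 1 Hz and 2 Hz (sentence and phrase rates) would indicate
# tracking of linguistic structure beyond the 4 Hz acoustic syllable rhythm.
# for rate in (1.0, 2.0, 4.0):
#     print(rate, peak_snr(meg_channel, sr=250, target_hz=rate))
```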
