Neural oscillations in auditory cognition, music, speech, and language

Projects

Linguistic Decoding

When we listen to someone speaking, we quickly and effortlessly understand what is being said. This apparent ease, however, obscures the complexity of the neural processes that underlie comprehension.


Auditory Multi-scale Processing

Natural sounds, including music and vocalizations, have a rich temporal structure spanning multiple timescales, and behaviorally relevant acoustic information is usually carried on more than one timescale at once. Speech, for example, conveys linguistic information at several scales: roughly 20-80 ms for phonemic information, 100-300 ms for syllabic information, and more than 1000 ms for intonation.
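
A minimal sketch of the arithmetic behind these scales: each timescale corresponds to a band of modulation frequencies via f = 1/T. The scale names and boundaries come from the text above; the code and its exact band edges are illustrative, not the lab's analysis.

```python
# Convert the quoted timescales (in ms) into equivalent modulation-frequency
# bands (in Hz) via f = 1000 / T. Boundaries are approximate.
timescales_ms = {
    "phonemic": (20, 80),        # 20-80 ms
    "syllabic": (100, 300),      # 100-300 ms
    "intonation": (1000, None),  # > 1000 ms (no upper bound given)
}

for name, (t_min, t_max) in timescales_ms.items():
    f_max = 1000.0 / t_min                     # shortest period -> highest frequency
    f_min = 1000.0 / t_max if t_max else None  # open-ended timescale -> no lower bound
    if f_min is None:
        print(f"{name:>10}: below {f_max:.1f} Hz")
    else:
        print(f"{name:>10}: {f_min:.1f}-{f_max:.1f} Hz")
```

This makes explicit why the three scales map onto distinct frequency ranges: phonemic information falls around 12.5-50 Hz, syllabic information around 3.3-10 Hz, and intonation below 1 Hz.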



Speech-specific Processing

The precise role of cortical oscillations in speech processing is still under investigation. Current research suggests that the phase alignment of delta/theta-band (2-8 Hz) neural oscillations in auditory cortex to the slow fluctuations of the speech signal supports the segmentation of the speech stream.
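
A minimal sketch of the kind of analysis this implies: band-pass a signal into the 2-8 Hz range and extract its instantaneous phase with the Hilbert transform, the quantity whose alignment is taken as evidence of oscillatory involvement. The synthetic "envelope" below is a stand-in for a real speech envelope or neural recording, and the sampling rate and filter order are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 100.0                    # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)  # 10 s of data
# Stand-in signal: a ~4 Hz "syllabic" rhythm buried in noise
envelope = np.sin(2 * np.pi * 4 * t) + 0.5 * np.random.randn(t.size)

# Band-pass 2-8 Hz (the delta/theta band quoted above),
# with cutoffs given as fractions of the Nyquist frequency
b, a = butter(4, [2 / (fs / 2), 8 / (fs / 2)], btype="band")
filtered = filtfilt(b, a, envelope)

# Instantaneous phase of the band-limited signal
phase = np.angle(hilbert(filtered))
print(phase[:5])
```

In an actual experiment, one would compare such phase series across trials or between stimulus and neural response; consistent alignment at syllable boundaries is what "phase alignment" refers to here.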


Speech Chunking

In a recent study, Ding et al. (2016) showed that spectral peaks in cortical responses to speech corresponded to multiple levels of linguistic structure (e.g., peaks in the delta and theta ranges corresponded to the phrase and syllable rates, respectively).
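
A minimal sketch of the spectral-peak logic: a response that tracks both a syllable rate and a slower phrase rate shows distinct peaks at those frequencies in its power spectrum. The signal below is synthetic, and the 1 Hz phrase rate and 4 Hz syllable rate are illustrative choices, not the study's data.

```python
import numpy as np
from scipy.signal import periodogram

fs = 100.0
t = np.arange(0, 60, 1 / fs)  # 60 s of data for fine frequency resolution
response = (np.sin(2 * np.pi * 4 * t)          # theta-range peak at the "syllable" rate
            + np.sin(2 * np.pi * 1 * t)        # delta-range peak at the "phrase" rate
            + 0.5 * np.random.randn(t.size))   # background noise

freqs, power = periodogram(response, fs=fs)
for f_target in (1.0, 4.0):
    idx = np.argmin(np.abs(freqs - f_target))  # nearest frequency bin
    print(f"power at {f_target:.0f} Hz: {power[idx]:.2f}")
```

The power at 1 Hz and 4 Hz stands well above the noise floor, which is the signature used to argue that the brain tracks phrasal and syllabic structure concurrently.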
