Neural oscillations in auditory cognition, music, speech, and language
Neuronal oscillations are believed to play a role in various perceptual and cognitive tasks, including attention, navigation, memory, motor planning, and - most relevant in the context of the present work - spoken-language comprehension. The specific computational functions of neuronal oscillations remain uncertain; we aim to elucidate how these ubiquitous neurophysiological attributes may underpin speech, language, and music processing. Speech and other dynamically changing auditory signals (as well as visual stimuli, including sign language) carry the information required for successful decoding at multiple temporal scales: slow intonation-level information, intermediate syllabic information, and rapidly changing featural information. These different aspects of a signal (slow and fast temporal modulation, frequency composition) must be analyzed to achieve successful recognition. One proposed ‘mesoscopic-level’ mechanism for parsing a naturalistic input signal (e.g. speech) into elementary pieces is the application of temporal windows, implemented as low-frequency oscillations on privileged timescales.
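As a concrete illustration of what such a temporal-window analysis can look like, the sketch below bandpass-filters the amplitude envelope of a synthetic speech-like signal in the theta range and treats envelope troughs as candidate boundaries of syllable-sized chunks. The test signal, sampling rates, filter settings, and trough criterion are illustrative assumptions, not a description of a specific published pipeline.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert, resample_poly, argrelextrema

fs = 16000       # assumed audio sampling rate (Hz)
env_fs = 100     # envelope sampling rate after downsampling (Hz)
t = np.arange(0, 2.0, 1 / fs)

# Synthetic 'speech-like' signal: a noise carrier modulated at ~5 Hz (syllabic rate).
signal = 0.5 * (1 + np.sin(2 * np.pi * 5 * t)) * np.random.randn(t.size)

# Step 1: broadband amplitude envelope via the Hilbert transform, then downsample.
envelope = resample_poly(np.abs(hilbert(signal)), up=1, down=fs // env_fs)

# Step 2: restrict the envelope to the theta range (4-8 Hz), the putative
# 'temporal window' timescale for syllable-sized units.
sos = butter(4, [4.0, 8.0], btype="band", fs=env_fs, output="sos")
theta_env = sosfiltfilt(sos, envelope)

# Step 3: envelope troughs mark candidate chunk boundaries.
troughs = argrelextrema(theta_env, np.less, order=env_fs // 20)[0]
print("candidate syllable boundaries (s):", troughs / env_fs)
```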
Projects
Linguistic Decoding
When we listen to someone speaking, we quickly and effortlessly understand what is said. This ease, however, belies the complexity of the neural processes that underlie comprehension.
Auditory Multi-scale Processing
Natural sounds, including music and vocal sounds, have a rich temporal structure over multiple timescales, and behaviorally relevant acoustic information is usually carried on more than one timescale. For example, speech conveys linguistic information at several scales: 20-80 ms for phonemic information, 100-300 ms for syllabic information, and more than 1000 ms for intonation information.
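One way to make these timescales concrete is to map each to a modulation-frequency band (the reciprocal of the period) and decompose a signal's amplitude envelope accordingly. The sketch below does this with band edges derived from the timescales above; the exact mapping and filter settings are simplifying assumptions for illustration.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

# Map each linguistic timescale to a modulation-frequency band (1/period).
TIMESCALES = {
    "phonemic (20-80 ms)":   (12.5, 50.0),  # 1/0.080 Hz .. 1/0.020 Hz
    "syllabic (100-300 ms)": (3.3, 10.0),   # 1/0.300 Hz .. 1/0.100 Hz
    "intonation (>1000 ms)": (0.1, 1.0),    # slower than 1 Hz
}

def decompose_envelope(envelope, fs):
    """Split an amplitude envelope into timescale-specific components."""
    out = {}
    for name, (lo, hi) in TIMESCALES.items():
        sos = butter(3, [lo, hi], btype="band", fs=fs, output="sos")
        out[name] = sosfiltfilt(sos, envelope)
    return out

# Example: a toy envelope sampled at 200 Hz containing all three timescales.
fs = 200
t = np.arange(0, 10, 1 / fs)
envelope = (np.sin(2 * np.pi * 0.5 * t)     # intonation-scale component
            + np.sin(2 * np.pi * 5 * t)     # syllabic-scale component
            + np.sin(2 * np.pi * 30 * t))   # phonemic-scale component
for name, comp in decompose_envelope(envelope, fs).items():
    print(f"{name}: rms = {np.sqrt(np.mean(comp**2)):.2f}")
```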
When our brain rhythms change
The human brain exhibits rhythms that are characteristic of anatomical areas and presumably involved in diverse perceptual and cognitive processes. Visual deprivation results in behavioral adaptation and cortical reorganization, particularly affecting the sensory cortices.
The structure of subjective experience
Consciousness is one of the most fascinating yet least understood aspects of human nature, or perhaps of nature at large. Our lives dwell in our conscious experiences: this is where we experience love, feel the ‘chills’ from a good piece of art, enjoy the taste of a good wine, and suffer excruciating pain. Our experience is structured: it unfolds in space and time.
Speech-specific Processing
The precise role of cortical oscillations in speech processing is under investigation. According to current research, the phase alignment of delta/theta-band (2-8 Hz) neural oscillations in the auditory cortex is involved in the segmentation of speech.
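A standard way to quantify such phase alignment is sketched below: bandpass a neural time series in the delta/theta range, extract its instantaneous phase with the Hilbert transform, and compute inter-trial phase coherence (ITPC) across stimulus repetitions. The synthetic trials and all parameter values are assumptions for illustration.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def delta_theta_phase(neural, fs, band=(2.0, 8.0)):
    """Instantaneous phase of the delta/theta-band component of a signal."""
    sos = butter(4, band, btype="band", fs=fs, output="sos")
    return np.angle(hilbert(sosfiltfilt(sos, neural)))

def itpc(phases):
    """Inter-trial phase coherence: |mean resultant vector| across trials.
    1 = perfect phase alignment, 0 = uniformly scattered phases."""
    return np.abs(np.mean(np.exp(1j * phases), axis=0))

# Toy example: 50 'trials' of a 4 Hz oscillation with jittered phase plus noise.
fs, n_trials = 250, 50
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(0)
trials = np.array([np.cos(2 * np.pi * 4 * t + rng.normal(0, 0.3))
                   + rng.normal(0, 1, t.size) for _ in range(n_trials)])

phases = np.array([delta_theta_phase(tr, fs) for tr in trials])
print("mean ITPC:", itpc(phases).mean())  # high values indicate phase alignment
```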
Speech chunking
In a recent study, Ding et al. (2016) showed that spectral peaks in cortical responses corresponded to multiple levels of linguistic structure (e.g., peaks in the delta and theta ranges tracked the phrase and syllable rates, respectively).
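The sketch below illustrates the kind of frequency-domain analysis involved: a toy ‘neural response’ with energy at an assumed 4 Hz syllable rate and 1 Hz phrase rate, whose power spectrum then shows peaks at those rates. The simulated signal and the peak criterion are illustrative assumptions, not the published analysis.

```python
import numpy as np
from scipy.signal import periodogram, find_peaks

# Toy 'neural response' with energy at the syllable rate (4 Hz) and the
# phrase rate (1 Hz), loosely mimicking an isochronous-syllable paradigm.
fs = 100
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)
response = (np.sin(2 * np.pi * 4 * t)           # syllable-rate tracking
            + 0.6 * np.sin(2 * np.pi * 1 * t)   # phrase-rate tracking
            + rng.normal(0, 1, t.size))

# Power spectrum; peaks well above the noise floor mark tracked rates.
freqs, power = periodogram(response, fs=fs)
peaks, _ = find_peaks(power, height=np.median(power) * 50)
print("spectral peaks (Hz):", freqs[peaks][freqs[peaks] < 10])
```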
Probing auditory perceptual constraints across individuals
Natural sounds contain rich temporal structure over various timescales. Previous research suggests that neuronal oscillations are one critical mechanism for processing the temporal structure of sound, with particular sensitivity in the delta (0.5-3 Hz), theta (4-8 Hz), and low-gamma (25-80 Hz) ranges, compared to the alpha range (8-12 Hz).
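One common way to probe such band-specific sensitivity behaviorally is with amplitude-modulated noise, placing the modulation rate inside the band of interest. A minimal sketch, with illustrative durations, modulation depth, and band centers:

```python
import numpy as np

BANDS = {"delta": (0.5, 3), "theta": (4, 8), "alpha": (8, 12), "low gamma": (25, 80)}

def am_noise_probe(rate_hz, dur=1.0, fs=44100, depth=1.0):
    """White-noise carrier sinusoidally amplitude-modulated at rate_hz,
    a standard probe for sensitivity to a given temporal-modulation rate."""
    t = np.arange(0, dur, 1 / fs)
    modulator = 1 + depth * np.sin(2 * np.pi * rate_hz * t)
    return modulator * np.random.randn(t.size)

# One probe stimulus at the center of each band of interest.
stimuli = {name: am_noise_probe(rate_hz=(lo + hi) / 2)
           for name, (lo, hi) in BANDS.items()}
```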
Robustness of rhythmic perception in audition
Rhythms are a fundamental feature of our acoustic environment, including speech and music. Auditory perception could profit from entrainment of cortical neural oscillations to rhythmicity in the acoustic input, which would align the optimal oscillatory phase with critical auditory events. However, the extent to which oscillatory phase modulates the efficiency of auditory perception remains unclear.
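A typical analysis for this question, sketched below under simplifying assumptions: bin trials by the oscillatory phase at target onset and test whether detection accuracy varies across phase bins. The toy data build in a phasic modulation so the expected pattern is visible; all parameter values are illustrative.

```python
import numpy as np

def accuracy_by_phase(phase_at_target, correct, n_bins=6):
    """Bin trials by oscillatory phase at target onset and compute the
    hit rate per bin; phasic modulation of accuracy suggests that
    perception depends on oscillatory phase."""
    edges = np.linspace(-np.pi, np.pi, n_bins + 1)
    bins = np.digitize(phase_at_target, edges) - 1
    return np.array([correct[bins == b].mean() for b in range(n_bins)])

# Toy data: detection is more likely near phase 0 (assumed 'optimal' phase).
rng = np.random.default_rng(2)
phase = rng.uniform(-np.pi, np.pi, 2000)
p_hit = 0.6 + 0.2 * np.cos(phase)        # built-in phasic modulation
correct = rng.random(2000) < p_hit

print("hit rate per phase bin:", np.round(accuracy_by_phase(phase, correct), 2))
```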