Music and Eye-Tracking

Tracking gaze, pupil size, and blinking is important for understanding attentional processes in the visual domain. For example, it has long been known that visual-spatial attention is tightly coupled with gaze (though the two can be decoupled in the case of covert attention). There is growing evidence that measuring eye parameters can also be useful for understanding auditory processing, even though the primary receptors for audition are in the ear. Auditory information can attract the gaze, for example when salient auditory changes elicit an orienting response. Such changes occur frequently in music and often form the building blocks of musical structure. We hypothesize that the eyes respond to salient changes in music and that aspects of musical processing can be “decoded” from ocular measures. Further, beyond indexing music cognition, the eyes can reveal information about a person’s subjective state during musical engagement (see the project Aesthetic Absorption).

In this project, we follow up on our previous studies to further explore eye measures in the context of music listening and music performance. One goal is to differentiate between the factors that affect ocular data: acoustically driven ocular changes, changes related to top-down processing of musical structure, and changes induced by subjective states. Because we think of music as more than just auditory input (it is in many cases audiovisual, with a strong motor component), we are also interested in exploring how audio-visual interactions and motoric activity may interfere with or enhance musical processing and aesthetic experiences.

However, the insights one can gain from music and eye-tracking studies are limited by the contexts in which one can record data and by the analysis techniques one can apply to such high-dimensional datasets (i.e., ocular, acoustic, and behavioral measures, perhaps in combination with additional physiological measures). We therefore pursue several broad lines of research related to methods development, for example:

  • Extending sophisticated dynamic modeling techniques to ocular data (see the pre-processing sketch after this list)
  • Determining the degree to which ocular measures may or may not index different attentional or anticipatory processes
  • Testing the use of deep-learning models for webcam-based online eye-tracking studies
  • Assessing the feasibility of mobile eye-tracking in live concert settings
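
As a concrete illustration of the first line of work above: dynamic modeling of pupil data requires artifact-free input, so blinks must first be detected and removed. The Python sketch below shows one common pre-processing approach, assuming a one-dimensional pupil trace sampled at a fixed rate with blinks recorded as zeros; the padding window, filter cutoff, sampling rate, and file name are illustrative assumptions, not the pipeline from our published work.

    # Minimal pupil pre-processing sketch. Assumptions: 1-D pupil-diameter
    # trace, fixed sampling rate, blinks/dropouts recorded as zeros.
    import numpy as np
    from scipy.signal import butter, filtfilt

    def preprocess_pupil(pupil, fs, pad_ms=50.0, cutoff_hz=4.0):
        """Interpolate over blink samples and low-pass filter a pupil trace."""
        pupil = np.array(pupil, dtype=float)          # work on a copy
        invalid = pupil <= 0                          # blink/dropout samples
        pad = int(round(pad_ms / 1000.0 * fs))        # widen blink windows to
        if pad > 0:                                   # cover partial occlusions
            kernel = np.ones(2 * pad + 1)
            invalid = np.convolve(invalid.astype(float), kernel, mode="same") > 0
        t = np.arange(len(pupil))
        pupil[invalid] = np.interp(t[invalid], t[~invalid], pupil[~invalid])
        b, a = butter(3, cutoff_hz / (fs / 2.0))      # 3rd-order Butterworth
        return filtfilt(b, a, pupil)                  # zero-phase smoothing

    # Hypothetical usage: a 500 Hz trace stored one sample per line.
    pupil_clean = preprocess_pupil(np.loadtxt("pupil_trace.txt"), fs=500)

Cleaned traces of this kind can then serve as input to the dynamic models (e.g., oscillator fits or spectral analyses of the pupillary response) explored in the publications listed below.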

To promote scientific exchange among researchers using eye tracking in music and basic auditory research, we regularly organize a conference on this topic. The program of the first conference in 2017 can be found here, and the related special issue, published in the Journal of Eye Movement Research, can be found here. The next conference will take place 7-8 July 2022; see the conference website for more information.

Publications

Selected publications and conference presentations related to this project include:

Saxena, S., Fink, L. K., & Lange, E. B. (2023). Deep learning models for webcam eye tracking in online experiments. Behavior Research Methods. https://doi.org/10.3758/s13428-023-02190-6

Lange, E. B., & Fink, L. K. (2023). Eye blinking, musical processing, and subjective states—A methods account. Psychophysiology, e14350. https://doi.org/10.1111/psyp.14350

Fink, L., Simola, J., Tavano, A., Lange, E., Wallot, S., & Laeng, B. (2023). From pre-processing to advanced dynamic modeling of pupil data. Behavior Research Methods. https://doi.org/10.3758/s13428-023-02098-1

Saxena, S., Lange, E. B., & Fink, L. (2022). Towards efficient calibration for webcam eye-tracking in online experiments. In Proceedings of ETRA 2022: ACM Symposium on Eye Tracking Research and Applications (pp. 1-7). https://doi.org/10.1145/3517031.3529645

Fink, L., Janata, P., Ganapathy, S., Furukawa, S., & Lange, E. (2021, July). Spectral signatures of the pupillary response as an implicit measure of musical absorption. Talk (virtual) presented at the International Conference on Music Perception and Cognition. https://www.youtube.com/watch?v=5bpDhrxUvLg

Lange, E., & Fink, L. (2021). What is the relation between musical features and spontaneous or restricted blink activity? Talk (virtual) presented in the Blick und Bewegung symposium (organized by Jörg Mühlhans) at the DAGA 47. Deutsche Jahrestagung für Akustik, Vienna, Austria.

Fink, L., Warrenburg, L., Howlin, C., Randall, W., Hansen, N., & Wald-Fuhrmann, M. (2021). Viral tunes: Changes in musical behaviours and interest in coronamusic predict socio-emotional coping during COVID-19 lockdown. Humanities and Social Sciences Communications, 8: 180. https://doi.org/10.1057/s41599-021-00858-y

Fink, L., Lange, E., & Groner, R. (2019). The application of eye-tracking in music research. Journal of Eye Movement Research, 11(2):1. https://doi.org/10.16910/jemr.11.2.1

Fink, L., Hurley, B., Geng, J., & Janata, P. (2018). A linear oscillator model predicts dynamic temporal attention and pupillary entrainment to rhythmic musical patterns. Journal of Eye Movement Research, 11(2):12. https://doi.org/10.16910/jemr.11.2.12

Lange, E., & Fink, L. (2017). Using eye-tracking and pupillometry to study rhythmic processing in music and dance. Symposium presented at the European Conference on Eye Movements, Wuppertal, Germany.