Neural Networks for Real-Time Joint Music Performance: Piano duos in the MR-scanner

Joint music performance is a highly complex task that requires musicians to dynamically integrate self- and other-produced actions. To synchronise their actions in time, ensemble musicians constantly anticipate the sounds produced by their co-performers and flexibly adapt their own actions accordingly. Recent evidence shows that the degree of self-other integration varies with the situational and cognitive demands of the ongoing performance, suggesting a dynamic balance between self-other integration and segregation. The goal of this project is to identify the neural bases that regulate this balance.

To this end, we ask pianists to perform duets that contain various tempo changes and differ in familiarity. We record brain activity from one pianist playing an MR-compatible piano inside the MR scanner while being accompanied by their partner on a second piano outside the scanner room. Our preliminary findings highlight the involvement of cortical and cerebellar sensorimotor regions known to support motor simulation, auditory imagery and the adaptation of ongoing actions. Overall, the current study sheds light on the neural bases balancing the interaction dynamics between musicians during real-time ensemble performance.