Studying Consonance Perception in Continuous Pitch and Timbre Space

Consonance is a fundamental aspect of music perception that has been studied for centuries. Most of this work relies heavily on musical stimuli drawn from quantized musical scales (e.g., 12-tone equal temperament or the Bohlen-Pierce scale), synthesized using either idealized harmonic complex tones or Western instrument sounds (e.g., piano, clarinet). We believe there is much to be learned by relaxing these constraints and studying how consonance varies as a function of continuous pitch intervals and of timbre. However, doing so substantially increases the space of possible stimuli and hence the demands of data collection. Our response to this problem is twofold: (a) we take advantage of large-scale, highly automated online data collection using PsyNet, and (b) we use a new adaptive paradigm recently introduced by our group, ‘Gibbs Sampling with People’ (Harrison et al., 2020), which is particularly well suited to navigating high-dimensional stimulus spaces. Combined with computational modeling, the results of our experiments shed light on several important aspects of consonance perception, in particular the sense in which consonance patterns are affected by timbre, and the respective contributions of harmonicity, interference between partials, and sharpness to consonance perception.

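To illustrate the logic of the adaptive paradigm, the minimal Python sketch below shows a Gibbs-Sampling-with-People-style chain over a continuous stimulus space under simplified assumptions: the dimension names and ranges (a continuous pitch interval and a timbre parameter) are illustrative only, and get_human_rating_slider is a hypothetical placeholder for the human trial, not PsyNet's actual API.

```python
import random

# Hypothetical stimulus dimensions: a continuous pitch interval (semitones)
# and a timbre parameter (e.g., spectral roll-off of the upper partials).
DIMENSIONS = {
    "interval": (0.0, 15.0),   # pitch interval in semitones
    "roll_off": (2.0, 12.0),   # dB/octave attenuation of upper partials
}
SLIDER_STEPS = 25  # resolution of the slider shown to the participant


def get_human_rating_slider(state, dim, candidates):
    """Placeholder for the human step: the participant moves a slider along
    `dim` (all other dimensions held fixed) and selects the value that sounds
    most consonant. Here it is faked with a random choice; in a real online
    experiment this would be a participant trial."""
    return random.choice(candidates)


def gsp_chain(n_iterations=20):
    # Start from a random point in the stimulus space.
    state = {d: random.uniform(lo, hi) for d, (lo, hi) in DIMENSIONS.items()}
    history = [dict(state)]
    for _ in range(n_iterations):
        # Cycle through dimensions, updating one at a time (Gibbs-style),
        # with the participant acting as the conditional sampler.
        for dim, (lo, hi) in DIMENSIONS.items():
            candidates = [lo + (hi - lo) * k / (SLIDER_STEPS - 1)
                          for k in range(SLIDER_STEPS)]
            state[dim] = get_human_rating_slider(state, dim, candidates)
            history.append(dict(state))
    return history


if __name__ == "__main__":
    chain = gsp_chain()
    print(f"Final state after {len(chain) - 1} human updates: {chain[-1]}")
```

Because each update conditions on the current values of the other dimensions, successive responses trace out regions of the space that participants judge as consonant, which is what makes the paradigm efficient in high-dimensional stimulus spaces.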

References

Harrison*, P., Marjieh*, R., Adolfi, F., van Rijn, P., Anglada-Tort, M., Tchernichovski, O., Larrouy-Maestri, P. and Jacoby, N., 2020. Gibbs Sampling with People. Advances in Neural Information Processing Systems, 33.