Musicians and non-musicians’ consonant/dissonant perception investigated by EEG and fMRI
Jo, H., Hsieh, T.-H., Chien, W.-C., Shaw, F.-Z., Liang, S.-F., & Kung, C.-C. bioRxiv preprint (New Results), Cold Spring Harbor Laboratory, August 2021. doi: 10.1101/2021.08.15.456377
The perception of two (or more) simultaneous musical notes, depending on their pitch interval(s), can be broadly categorized as consonant or dissonant. Previous studies have suggested that musicians and non-musicians adopt different strategies when discerning musical intervals: frequency ratios (e.g., perfect fifth vs. tritone) for the former, and frequency differences (e.g., rough vs. non-rough) for the latter. To replicate and extend this finding, this follow-up study reran the electroencephalography (EEG) experiment and separately collected functional magnetic resonance imaging (fMRI) data with the same protocol. The behavioral results replicated our previous finding that musicians used pitch intervals, and non-musicians roughness, for consonance judgments. The ERP amplitude differences between groups, in both the frequency-ratio and frequency-difference conditions, emerged primarily around the N1 and P2 periods along the midline channels. The fMRI results, analyzed jointly with univariate, multivariate, and connectivity approaches, further reinforce the involvement of midline and related brain regions in consonance/dissonance judgments. An additional representational similarity analysis (RSA) and a final spatio-temporal searchlight RSA (ss-RSA) brought the fMRI and EEG data into the same representational space, providing converging support for the neural substrates of these neurophysiological signatures. Together, these analyses not only underscore the importance of replication, confirming that musicians rely more on top-down knowledge for consonance/dissonance perception, but also demonstrate the advantage of multiple analyses in mutually constraining the findings from EEG and fMRI.

Significance Statement: In this study, the neural correlates of consonance and dissonance perception were revisited with both EEG and fMRI. The behavioral results closely replicated the pattern of our earlier work (Kung et al., 2014), and the ERP results, though showing that both musicians and non-musicians processed rough vs. non-rough notes similarly, still supported top-down modulation in musicians, likely acquired through long-term practice. The fMRI results, combining univariate (GLM contrasts and functional connectivity) and multivariate (MVPA searchlight, plus RSA at the voxel, connectivity, and spatio-temporal searchlight levels) analyses, converge on lateralized and midline regions, at different time windows, as the core brain networks underpinning both musicians’ and non-musicians’ consonant/dissonant perception.
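As a reading aid (not from the paper itself), the two interval-discrimination strategies contrasted in the abstract can be made concrete with a minimal Python sketch: a frequency-ratio rule classifies a dyad by the simplicity of its interval ratio (perfect fifth 3:2 vs. tritone 45:32), while a roughness rule reacts to small frequency differences that produce audible beating. The ratio cutoff and the 30 Hz roughness threshold below are illustrative assumptions, not the authors’ stimuli or criteria.

```python
from fractions import Fraction

def ratio_rule(f1: float, f2: float, max_denominator: int = 64) -> str:
    """Frequency-ratio strategy (the one attributed to musicians): dyads
    whose fundamentals form a simple integer ratio sound consonant."""
    ratio = Fraction(f2 / f1).limit_denominator(max_denominator)
    # Simple ratios such as 3/2 (perfect fifth) pass the cutoff; complex
    # ratios such as 45/32 (tritone) do not. The cutoff of 4 is an
    # illustrative assumption, not a value from the paper.
    return "consonant" if ratio.denominator <= 4 else "dissonant"

def roughness_rule(f1: float, f2: float, beat_cutoff_hz: float = 30.0) -> str:
    """Frequency-difference strategy (the one attributed to non-musicians):
    nearby frequencies beat against each other and sound rough. Treating any
    nonzero difference below ~30 Hz as rough is a crude simplification of
    critical-band models, used here only to make the contrast concrete."""
    return "dissonant" if 0.0 < abs(f2 - f1) < beat_cutoff_hz else "consonant"

a4 = 440.0
print(ratio_rule(a4, a4 * 3 / 2))    # perfect fifth -> consonant
print(ratio_rule(a4, a4 * 45 / 32))  # tritone       -> dissonant
print(roughness_rule(a4, a4 + 15))   # 15 Hz beating -> dissonant (rough)
```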
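Likewise, the representational similarity analysis that links the fMRI and EEG results can be sketched generically. The following is a minimal sketch of standard RSA practice (NumPy/SciPy, random data, arbitrary array shapes), not the authors’ pipeline: build one representational dissimilarity matrix (RDM) per modality and correlate their upper triangles, which is what places both modalities in a shared representational space.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

def rdm(patterns: np.ndarray) -> np.ndarray:
    """Representational dissimilarity matrix: `patterns` is conditions x
    features (voxels for fMRI, electrodes for EEG). Correlation distance
    (1 - Pearson r) is a common RSA choice."""
    return squareform(pdist(patterns, metric="correlation"))

def compare_rdms(rdm_a: np.ndarray, rdm_b: np.ndarray) -> float:
    """Second-order comparison: Spearman correlation between the upper
    triangles of two RDMs from different modalities."""
    iu = np.triu_indices_from(rdm_a, k=1)
    return spearmanr(rdm_a[iu], rdm_b[iu]).correlation

rng = np.random.default_rng(0)
n_cond = 8                             # e.g., consonant and dissonant dyads
fmri = rng.normal(size=(n_cond, 500))  # conditions x voxels (one searchlight)
eeg = rng.normal(size=(n_cond, 64))    # conditions x electrodes (one time window)
# Sliding the EEG window over time and the fMRI searchlight over space
# yields the spatio-temporal variant (ss-RSA) named in the abstract.
print(compare_rdms(rdm(fmri), rdm(eeg)))
```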
@techreport{jo_musicians_2021,
	title = {Musicians and non-musicians’ consonant/dissonant perception investigated by {EEG} and {fMRI}},
	copyright = {© 2021, Posted by Cold Spring Harbor Laboratory. This pre-print is available under a Creative Commons License (Attribution-NonCommercial-NoDerivs 4.0 International), CC BY-NC-ND 4.0, as described at http://creativecommons.org/licenses/by-nc-nd/4.0/},
	url = {https://www.biorxiv.org/content/10.1101/2021.08.15.456377v1},
	language = {en},
	urldate = {2021-11-16},
	author = {Jo, HanShin and Hsieh, Tsung-Hao and Chien, Wei-Che and Shaw, Fu-Zen and Liang, Sheng-Fu and Kung, Chun-Chia},
	month = aug,
	year = {2021},
	doi = {10.1101/2021.08.15.456377},
	pages = {2021.08.15.456377},
}
