Correspondence of categorical and feature-based representations of music in the human brain. Nakai, T., Koide-Majima, N., & Nishimoto, S. Brain and Behavior, 11(1):e01936, 2021. doi: 10.1002/brb3.1936

Abstract

Introduction: Humans tend to categorize auditory stimuli into discrete classes, such as animal species, language, musical instrument, and music genre. Of these, music genre is a frequently used dimension of human music preference and is determined based on the categorization of complex auditory stimuli. Neuroimaging studies have reported that the superior temporal gyrus (STG) is involved in response to general music-related features. However, there is considerable uncertainty over how discrete music categories are represented in the brain and which acoustic features are more suited for explaining such representations.

Methods: We used a total of 540 music clips to examine comprehensive cortical representations and the functional organization of music genre categories. For this purpose, we applied a voxel-wise modeling approach to music-evoked brain activity measured using functional magnetic resonance imaging. In addition, we introduced a novel technique for feature-brain similarity analysis and assessed how discrete music categories are represented based on the cortical response pattern to acoustic features.

Results: Our findings indicated distinct cortical organizations for different music genres in the bilateral STG, and they revealed representational relationships between different music genres. On comparing different acoustic feature models, we found that these representations of music genres could be explained largely by a biologically plausible spectro-temporal modulation-transfer function model.

Conclusion: Our findings have elucidated the quantitative representation of music genres in the human cortex, indicating the possibility of modeling this categorization of complex auditory stimuli based on brain activity.
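The voxel-wise modeling approach named in the Methods amounts to fitting a regularized linear regression from stimulus features (here, spectro-temporal modulation-transfer function features) to each voxel's fMRI time course, then scoring prediction accuracy on held-out data. Below is a minimal Python sketch of that idea on synthetic data; the array shapes, ridge penalty, and use of scikit-learn are illustrative assumptions, not the paper's actual pipeline.

# Minimal sketch of voxel-wise encoding: ridge regression from
# stimulus features to fMRI responses. All data here are synthetic
# and all shapes are hypothetical stand-ins.
import numpy as np
from sklearn.linear_model import Ridge
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Hypothetical dimensions: one feature vector per fMRI time point.
n_train, n_test, n_features, n_voxels = 400, 100, 1000, 500
X_train = rng.standard_normal((n_train, n_features))  # MTF-like features
X_test = rng.standard_normal((n_test, n_features))
W_true = 0.1 * rng.standard_normal((n_features, n_voxels))
Y_train = X_train @ W_true + rng.standard_normal((n_train, n_voxels))
Y_test = X_test @ W_true + rng.standard_normal((n_test, n_voxels))

# One regularized linear model per voxel; Ridge fits all voxels
# jointly because they share the same design matrix.
model = Ridge(alpha=10.0)
model.fit(X_train, Y_train)
Y_pred = model.predict(X_test)

# Prediction accuracy per voxel: correlation between predicted and
# measured held-out responses.
r = np.array([pearsonr(Y_test[:, v], Y_pred[:, v])[0]
              for v in range(n_voxels)])
print(f"median voxel-wise prediction r = {np.median(r):.3f}")

In practice such pipelines also account for hemodynamic delay and select the regularization strength by cross-validation; those steps are omitted here for brevity.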
@article{nakai_correspondence_2021,
title = {Correspondence of categorical and feature-based representations of music in the human brain},
volume = {11},
issn = {2162-3279},
url = {https://onlinelibrary.wiley.com/doi/abs/10.1002/brb3.1936},
doi = {10.1002/brb3.1936},
abstract = {Introduction Humans tend to categorize auditory stimuli into discrete classes, such as animal species, language, musical instrument, and music genre. Of these, music genre is a frequently used dimension of human music preference and is determined based on the categorization of complex auditory stimuli. Neuroimaging studies have reported that the superior temporal gyrus (STG) is involved in response to general music-related features. However, there is considerable uncertainty over how discrete music categories are represented in the brain and which acoustic features are more suited for explaining such representations. Methods We used a total of 540 music clips to examine comprehensive cortical representations and the functional organization of music genre categories. For this purpose, we applied a voxel-wise modeling approach to music-evoked brain activity measured using functional magnetic resonance imaging. In addition, we introduced a novel technique for feature-brain similarity analysis and assessed how discrete music categories are represented based on the cortical response pattern to acoustic features. Results Our findings indicated distinct cortical organizations for different music genres in the bilateral STG, and they revealed representational relationships between different music genres. On comparing different acoustic feature models, we found that these representations of music genres could be explained largely by a biologically plausible spectro-temporal modulation-transfer function model. Conclusion Our findings have elucidated the quantitative representation of music genres in the human cortex, indicating the possibility of modeling this categorization of complex auditory stimuli based on brain activity.},
language = {en},
number = {1},
urldate = {2021-10-26},
journal = {Brain and Behavior},
author = {Nakai, Tomoya and Koide-Majima, Naoko and Nishimoto, Shinji},
year = {2021},
note = {\_eprint: https://onlinelibrary.wiley.com/doi/pdf/10.1002/brb3.1936},
keywords = {MTF model, STG, fMRI, music genre},
pages = {e01936},
}
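The abstract also mentions a novel feature-brain similarity analysis relating discrete genre categories to cortical response patterns predicted from acoustic features. The paper's exact procedure is not reproduced here; the following is a generic, hypothetical sketch of one such comparison: correlate measured genre-averaged voxel patterns with patterns predicted by a fitted feature model, and check whether each genre is best matched by its own prediction.

# Generic sketch of a feature-brain similarity comparison on
# synthetic data; not the authors' published procedure.
import numpy as np

rng = np.random.default_rng(1)
n_genres, n_voxels = 10, 500

# Hypothetical genre-averaged response patterns: measured, and
# predicted by an encoding model such as the one sketched above.
measured = rng.standard_normal((n_genres, n_voxels))
predicted = measured + 0.5 * rng.standard_normal((n_genres, n_voxels))

# Similarity matrix: correlation between the pattern predicted for
# genre i and the measured pattern of genre j. A strong diagonal
# would indicate the feature model captures genre-specific
# cortical organization.
sim = np.corrcoef(predicted, measured)[:n_genres, n_genres:]
identification = (sim.argmax(axis=1) == np.arange(n_genres)).mean()
print(f"genre identification accuracy: {identification:.2f}")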
{"_id":"5yrbXgMJ9kBoqmM8A","bibbaseid":"nakai-koidemajima-nishimoto-correspondenceofcategoricalandfeaturebasedrepresentationsofmusicinthehumanbrain-2021","author_short":["Nakai, T.","Koide-Majima, N.","Nishimoto, S."],"bibdata":{"bibtype":"article","type":"article","title":"Correspondence of categorical and feature-based representations of music in the human brain","volume":"11","issn":"2162-3279","url":"https://onlinelibrary.wiley.com/doi/abs/10.1002/brb3.1936","doi":"10.1002/brb3.1936","abstract":"Introduction Humans tend to categorize auditory stimuli into discrete classes, such as animal species, language, musical instrument, and music genre. Of these, music genre is a frequently used dimension of human music preference and is determined based on the categorization of complex auditory stimuli. Neuroimaging studies have reported that the superior temporal gyrus (STG) is involved in response to general music-related features. However, there is considerable uncertainty over how discrete music categories are represented in the brain and which acoustic features are more suited for explaining such representations. Methods We used a total of 540 music clips to examine comprehensive cortical representations and the functional organization of music genre categories. For this purpose, we applied a voxel-wise modeling approach to music-evoked brain activity measured using functional magnetic resonance imaging. In addition, we introduced a novel technique for feature-brain similarity analysis and assessed how discrete music categories are represented based on the cortical response pattern to acoustic features. Results Our findings indicated distinct cortical organizations for different music genres in the bilateral STG, and they revealed representational relationships between different music genres. On comparing different acoustic feature models, we found that these representations of music genres could be explained largely by a biologically plausible spectro-temporal modulation-transfer function model. Conclusion Our findings have elucidated the quantitative representation of music genres in the human cortex, indicating the possibility of modeling this categorization of complex auditory stimuli based on brain activity.","language":"en","number":"1","urldate":"2021-10-26","journal":"Brain and Behavior","author":[{"propositions":[],"lastnames":["Nakai"],"firstnames":["Tomoya"],"suffixes":[]},{"propositions":[],"lastnames":["Koide-Majima"],"firstnames":["Naoko"],"suffixes":[]},{"propositions":[],"lastnames":["Nishimoto"],"firstnames":["Shinji"],"suffixes":[]}],"year":"2021","note":"_eprint: https://onlinelibrary.wiley.com/doi/pdf/10.1002/brb3.1936","keywords":"MTF model, STG, fMRI, music genre","pages":"e01936","bibtex":"@article{nakai_correspondence_2021,\n\ttitle = {Correspondence of categorical and feature-based representations of music in the human brain},\n\tvolume = {11},\n\tissn = {2162-3279},\n\turl = {https://onlinelibrary.wiley.com/doi/abs/10.1002/brb3.1936},\n\tdoi = {10.1002/brb3.1936},\n\tabstract = {Introduction Humans tend to categorize auditory stimuli into discrete classes, such as animal species, language, musical instrument, and music genre. Of these, music genre is a frequently used dimension of human music preference and is determined based on the categorization of complex auditory stimuli. Neuroimaging studies have reported that the superior temporal gyrus (STG) is involved in response to general music-related features. 
However, there is considerable uncertainty over how discrete music categories are represented in the brain and which acoustic features are more suited for explaining such representations. Methods We used a total of 540 music clips to examine comprehensive cortical representations and the functional organization of music genre categories. For this purpose, we applied a voxel-wise modeling approach to music-evoked brain activity measured using functional magnetic resonance imaging. In addition, we introduced a novel technique for feature-brain similarity analysis and assessed how discrete music categories are represented based on the cortical response pattern to acoustic features. Results Our findings indicated distinct cortical organizations for different music genres in the bilateral STG, and they revealed representational relationships between different music genres. On comparing different acoustic feature models, we found that these representations of music genres could be explained largely by a biologically plausible spectro-temporal modulation-transfer function model. Conclusion Our findings have elucidated the quantitative representation of music genres in the human cortex, indicating the possibility of modeling this categorization of complex auditory stimuli based on brain activity.},\n\tlanguage = {en},\n\tnumber = {1},\n\turldate = {2021-10-26},\n\tjournal = {Brain and Behavior},\n\tauthor = {Nakai, Tomoya and Koide-Majima, Naoko and Nishimoto, Shinji},\n\tyear = {2021},\n\tnote = {\\_eprint: https://onlinelibrary.wiley.com/doi/pdf/10.1002/brb3.1936},\n\tkeywords = {MTF model, STG, fMRI, music genre},\n\tpages = {e01936},\n}\n","author_short":["Nakai, T.","Koide-Majima, N.","Nishimoto, S."],"key":"nakai_correspondence_2021","id":"nakai_correspondence_2021","bibbaseid":"nakai-koidemajima-nishimoto-correspondenceofcategoricalandfeaturebasedrepresentationsofmusicinthehumanbrain-2021","role":"author","urls":{"Paper":"https://onlinelibrary.wiley.com/doi/abs/10.1002/brb3.1936"},"keyword":["MTF model","STG","fMRI","music genre"],"metadata":{"authorlinks":{}},"downloads":1},"bibtype":"article","biburl":"https://api.zotero.org/groups/4476011/items?key=BfP7bN7FF9dJwtyiLBORewdg&format=bibtex&limit=100","dataSources":["Xr58FaQSpr3DSFeK8"],"keywords":["mtf model","stg","fmri","music genre"],"search_terms":["correspondence","categorical","feature","based","representations","music","human","brain","nakai","koide-majima","nishimoto"],"title":"Correspondence of categorical and feature-based representations of music in the human brain","year":2021,"downloads":1}