Musical Emotions: Predicting Second-by-Second Subjective Feelings of Emotion From Low-Level Psychoacoustic Features and Physiological Measurements. Coutinho, E. & Cangelosi, A. Emotion, 11(4):921-937, August 2011. doi:10.1037/a0024700

Abstract: We sustain that the structure of affect elicited by music is largely dependent on dynamic temporal patterns in low-level music structural parameters. In support of this claim, we have previously provided evidence that spatiotemporal dynamics in psychoacoustic features resonate with two psychological dimensions of affect underlying judgments of subjective feelings: arousal and valence. In this article we extend our previous investigations in two aspects. First, we focus on the emotions experienced rather than perceived while listening to music. Second, we evaluate the extent to which peripheral feedback in music can account for the predicted emotional responses, that is, the role of physiological arousal in determining the intensity and valence of musical emotions. Akin to our previous findings, we will show that a significant part of the listeners' reported emotions can be predicted from a set of six psychoacoustic features-loudness, pitch level, pitch contour, tempo, texture, and sharpness. Furthermore, the accuracy of those predictions is improved with the inclusion of physiological cues-skin conductance and heart rate. The interdisciplinary work presented here provides a new methodology to the field of music and emotion research based on the combination of computational and experimental work, which aid the analysis of the emotional responses to music, while offering a platform for the abstract representation of those complex relationships. Future developments may aid specific areas, such as, psychology and music therapy, by providing coherent descriptions of the emotional effects of specific music stimuli. © 2011 American Psychological Association.
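The abstract describes a mapping from six per-second psychoacoustic features (loudness, pitch level, pitch contour, tempo, texture, sharpness), optionally augmented with skin conductance and heart rate, to continuous ratings of arousal and valence. The Python sketch below illustrates only the shape of that pipeline on synthetic data: the ridge-regression model, the variable names, and all values are illustrative assumptions and do not reproduce the authors' actual model or results.

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical second-by-second data for one listening session (600 s).
T = 600
psychoacoustic = rng.normal(size=(T, 6))  # loudness, pitch level, pitch contour, tempo, texture, sharpness
physiological = rng.normal(size=(T, 2))   # skin conductance, heart rate
affect = rng.normal(size=(T, 2))          # continuous self-reports of arousal and valence

def fit_and_score(features, targets):
    """Fit a ridge regression from features to arousal/valence and return held-out R^2."""
    X_tr, X_te, y_tr, y_te = train_test_split(features, targets, test_size=0.3, random_state=0)
    model = Ridge(alpha=1.0).fit(X_tr, y_tr)
    return r2_score(y_te, model.predict(X_te), multioutput="uniform_average")

# Psychoacoustic features alone vs. augmented with physiological cues,
# mirroring the comparison described in the abstract.
print("audio only:     R^2 =", round(fit_and_score(psychoacoustic, affect), 3))
print("audio + physio: R^2 =", round(fit_and_score(np.hstack([psychoacoustic, physiological]), affect), 3))

With real recordings, the synthetic arrays would be replaced by extracted psychoacoustic time series, recorded physiological signals, and listeners' continuous ratings; the abstract reports that adding the physiological cues improves prediction accuracy.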
@article{Coutinho2011a,
title = {Musical Emotions: Predicting Second-by-Second Subjective Feelings of Emotion From Low-Level Psychoacoustic Features and Physiological Measurements},
type = {article},
year = {2011},
keywords = {article,journal},
pages = {921-937},
volume = {11},
websites = {http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000294594400021&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=f3ec48df247ee1138ccd8d3ba59bacc2,http://doi.apa.org/getdoi.cfm?doi=10.103},
month = {8},
abstract = {We sustain that the structure of affect elicited by music is largely dependent on dynamic temporal patterns in low-level music structural parameters. In support of this claim, we have previously provided evidence that spatiotemporal dynamics in psychoacoustic features resonate with two psychological dimensions of affect underlying judgments of subjective feelings: arousal and valence. In this article we extend our previous investigations in two aspects. First, we focus on the emotions experienced rather than perceived while listening to music. Second, we evaluate the extent to which peripheral feedback in music can account for the predicted emotional responses, that is, the role of physiological arousal in determining the intensity and valence of musical emotions. Akin to our previous findings, we will show that a significant part of the listeners' reported emotions can be predicted from a set of six psychoacoustic features-loudness, pitch level, pitch contour, tempo, texture, and sharpness. Furthermore, the accuracy of those predictions is improved with the inclusion of physiological cues-skin conductance and heart rate. The interdisciplinary work presented here provides a new methodology to the field of music and emotion research based on the combination of computational and experimental work, which aid the analysis of the emotional responses to music, while offering a platform for the abstract representation of those complex relationships. Future developments may aid specific areas, such as, psychology and music therapy, by providing coherent descriptions of the emotional effects of specific music stimuli. © 2011 American Psychological Association.},
author = {Coutinho, Eduardo and Cangelosi, Angelo},
doi = {10.1037/a0024700},
journal = {Emotion},
number = {4}
}