Whodunnit -- Searching for the most important feature types signalling emotion-related user states in speech. Batliner, A.; Steidl, S.; Seppi, D.; Vogt, T.; Wagner, J.; Devillers, L.; Vidrascu, L.; Aharonson, V.; Kessous, L.; and Amir, N. Computer Speech & Language, 25(1), 2011.
In this article, we describe and interpret a set of acoustic and linguistic features that characterise emotional/emotion-related user states -- confined to the one database processed: four classes in a German corpus of children interacting with a pet robot. To this end, we collected a very large feature vector consisting of more than 4000 features extracted at different sites. We performed extensive feature selection (Sequential Forward Floating Search) for seven acoustic and four linguistic types of features, ending up in a small number of `most important' features which we try to interpret by discussing the impact of different feature and extraction types. We establish different measures of impact and discuss the mutual influence of acoustics and linguistics.
@article{batliner_whodunnit_2011,
	Author = {Batliner, Anton and Steidl, Stefan and Seppi, Dino and Vogt, Thurid and Wagner, Johannes and Devillers, Laurence and Vidrascu, Laurence and Aharonson, Vered and Kessous, Loic and Amir, Noam},
	Date = {2011},
	Doi = {10.1016/j.csl.2009.12.003},
	Journal = {Computer Speech \& Language},
	Keywords = {dialogue systems, emotions, speaking styles, speech technology},
	Number = {1},
	Title = {Whodunnit -- Searching for the most important feature types signalling emotion-related user states in speech},
	Url = {http://www.esat.kuleuven.be/psi/spraak/cgi-bin/get_file.cgi?/dseppi/csl10/csl10.pdf},
	Volume = {25},
	Abstract = {In this article, we describe and interpret a set of acoustic and linguistic features that characterise emotional/emotion-related user states -- confined to the one database processed: four classes in a German corpus of children interacting with a pet robot. To this end, we collected a very large feature vector consisting of more than 4000 features extracted at different sites. We performed extensive feature selection (Sequential Forward Floating Search) for seven acoustic and four linguistic types of features, ending up in a small number of `most important' features which we try to interpret by discussing the impact of different feature and extraction types. We establish different measures of impact and discuss the mutual influence of acoustics and linguistics.},
}