Head movements, facial expressions and feedback in conversations: empirical evidence from Danish multimodal data. Paggio, P. & Navarretta, C. Journal on Multimodal User Interfaces, 7(1):29–37, March, 2013.
@article{paggio_head_2013,
	title = {Head movements, facial expressions and feedback in conversations: empirical evidence from {Danish} multimodal data},
	volume = {7},
	issn = {1783-8738},
	shorttitle = {Head movements, facial expressions and feedback in conversations},
	url = {https://doi.org/10.1007/s12193-012-0105-9},
	doi = {10.1007/s12193-012-0105-9},
	abstract = {This article deals with multimodal feedback in two Danish multimodal corpora, i.e., a collection of map-task dialogues and a corpus of free conversations in first encounters between pairs of subjects. Machine learning techniques are applied to both sets of data to investigate various relations between the non-verbal behaviour—more specifically head movements and facial expressions—and speech with regard to the expression of feedback. In the map-task data, we study the extent to which the dialogue act type of linguistic feedback expressions can be classified automatically based on the non-verbal features. In the conversational data, on the other hand, non-verbal and speech features are used together to distinguish feedback from other multimodal behaviours. The results of the two sets of experiments indicate in general that head movements, and to a lesser extent facial expressions, are important indicators of feedback, and that gestures and speech disambiguate each other in the machine learning process.},
	language = {en},
	number = {1},
	urldate = {2019-09-05},
	journal = {Journal on Multimodal User Interfaces},
	author = {Paggio, Patrizia and Navarretta, Costanza},
	month = mar,
	year = {2013},
	keywords = {Feedback, Gestures, Backchanneling, Facial expressions, Head movements},
	pages = {29--37},
}