Integrating PAMOCAT in the Research Cycle: Linking Motion Capturing and Conversation Analysis. Brüning, B., Schnier, C., Pitsch, K., & Wachsmuth, S. In Proceedings of the 14th ACM International Conference on Multimodal Interaction, of ICMI '12, pages 201--208, New York, NY, USA, 2012. ACM.
Abstract: In order to understand and model the non-verbal communicative conduct of humans, it seems fruitful to combine qualitative (Conversation Analysis [6] [10] [11]) and quantitative analytical (motion capturing) methods. Tools for data visualization and annotation are important as they constitute a central interface between different research approaches and methodologies. With this aim we have developed the pre-annotation tool "PAMOCAT - Pre Annotation Motion Capture Analysis Tool" that detects different phenomena. These phenomena are in the category single person and person overlapping phenomena. Included are functions for the analysis of head focused objects, hand activities, single DOF- degree of freedom activity, posture detection and intrusions into the co-participant's space. These detected phenomena related to the frames will be displayed in an overview. The phenomena can be chosen to search for a specific constellation between these different phenomena. A sophisticated user interface easily allows the annotating person to find correlations between different joints and phenomena, to analyze the corresponding 3D pose in a reconstructed virtual environment, and to export combined qualitative and quantitative annotations to standard annotation tools. Using this technique we are able to examine complex setups with three participants engaged in conversation. In this paper we propose how PAMOCAT can be integrated in the research cycle by showing a concrete PAMOCAT-based micro-analysis of a multimodal phenomenon, which deals with kinetic procedures to claim the floor.
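The "constellation" search described in the abstract — finding stretches of time where several independently detected phenomena (head focus, hand activity, posture, etc.) co-occur — can be understood as an interval-intersection query across annotation tiers. The paper does not publish PAMOCAT's API, so the following is only an illustrative Python sketch of that idea; all names here (`Event`, `find_constellations`, the tier and label strings) are hypothetical and not taken from the tool.

```python
from dataclasses import dataclass

@dataclass
class Event:
    """A detected phenomenon, annotated as a time interval on one tier."""
    tier: str    # hypothetical tier name, e.g. "head_focus", "hand_activity"
    label: str   # hypothetical label, e.g. "gaze:co-participant"
    start: float # seconds
    end: float

def overlap(a: Event, b: Event) -> bool:
    """True if the two intervals intersect."""
    return a.start < b.end and b.start < a.end

def find_constellations(events: list[Event], tiers: list[str]) -> list[tuple[Event, ...]]:
    """Return every combination of events, one per requested tier,
    whose intervals all overlap pairwise (a 'constellation')."""
    by_tier = {t: [e for e in events if e.tier == t] for t in tiers}
    results: list[tuple[Event, ...]] = []

    def extend(partial: list[Event], remaining: list[str]) -> None:
        if not remaining:
            results.append(tuple(partial))
            return
        for cand in by_tier[remaining[0]]:
            # keep the candidate only if it overlaps every event chosen so far
            if all(overlap(cand, p) for p in partial):
                extend(partial + [cand], remaining[1:])

    extend([], tiers)
    return results

# Toy data: a participant raises a hand while gazing at the co-participant.
events = [
    Event("head_focus", "gaze:co-participant", 1.0, 4.0),
    Event("hand_activity", "hand:raised", 2.5, 5.0),
    Event("hand_activity", "hand:rest", 0.0, 0.9),
]
hits = find_constellations(events, ["head_focus", "hand_activity"])
```

On this toy data the query returns the single gaze + raised-hand constellation, since the resting-hand interval ends before the gaze interval begins. A real tool would additionally report the frame range of each intersection for display in the overview the abstract mentions.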
@inproceedings{bruning_integrating_2012,
address = {New York, NY, USA},
series = {{ICMI} '12},
title = {Integrating {PAMOCAT} in the {Research} {Cycle}: {Linking} {Motion} {Capturing} and {Conversation} {Analysis}},
isbn = {978-1-4503-1467-1},
shorttitle = {Integrating {PAMOCAT} in the {Research} {Cycle}},
url = {http://doi.acm.org/10.1145/2388676.2388716},
doi = {10.1145/2388676.2388716},
abstract = {In order to understand and model the non-verbal communicative conduct of humans, it seems fruitful to combine qualitative (Conversation Analysis [6] [10] [11]) and quantitative analytical (motion capturing) methods. Tools for data visualization and annotation are important as they constitute a central interface between different research approaches and methodologies. With this aim we have developed the pre-annotation tool "PAMOCAT - Pre Annotation Motion Capture Analysis Tool" that detects different phenomena. These phenomena are in the category single person and person overlapping phenomena. Included are functions for the analysis of head focused objects, hand activities, single DOF- degree of freedom activity, posture detection and intrusions into the co-participant's space. These detected phenomena related to the frames will be displayed in an overview. The phenomena can be chosen to search for a specific constellation between these different phenomena. A sophisticated user interface easily allows the annotating person to find correlations between different joints and phenomena, to analyze the corresponding 3D pose in a reconstructed virtual environment, and to export combined qualitative and quantitative annotations to standard annotation tools. Using this technique we are able to examine complex setups with three participants engaged in conversation. In this paper we propose how PAMOCAT can be integrated in the research cycle by showing a concrete PAMOCAT-based micro-analysis of a multimodal phenomenon, which deals with kinetic procedures to claim the floor.},
urldate = {2014-06-05},
booktitle = {Proceedings of the 14th {ACM} {International} {Conference} on {Multimodal} {Interaction}},
publisher = {ACM},
author = {Brüning, Bernhard and Schnier, Christian and Pitsch, Karola and Wachsmuth, Sven},
year = {2012},
pages = {201--208}
}