{"_id":"6TzMXsZJB6Mepi3gR","bibbaseid":"guedes-extendingmultimedialanguagestosupportmultimodaluserinteractionsthesis-2017","authorIDs":[],"author_short":["Guedes, Á. L. V."],"bibdata":{"bibtype":"thesis","type":"phdthesis","location":",","title":"Extending multimedia languages to support multimodal user interactions (thesis)","abstract":"Recent advances in recognition technologies, such as speech, touch and gesture, have given rise to a new class of user interfaces that not only explores multiple modalities but also allows for multiple interacting users. The development of applications with both multimodal and multiuser interactions raises new specification and execution issues. The specification of multimodal applications is commonly the focus of multimodal interaction research, while the specification of the synchronization of audiovisual media is usually the focus of multimedia research. In this thesis, aiming to assist the specification of such applications, we propose to integrate concepts from those two research areas and to extend multimedia languages with first-class entities to support multiuser and multimodal features. Those entities were instantiated in NCL and HTML. To evaluate our approach, we performed an evaluation with NCL and HTML developers to capture evidence of their acceptance of the proposed entities and instantiations in those languages.","institution":"PUC-Rio","author":[{"propositions":[],"lastnames":["Guedes"],"firstnames":["Álan","Lívio","Vasconcelos"],"suffixes":[]}],"year":"2017","bibtex":"@thesis{guedes_extending_2017-1,\n\tlocation = {,},\n\ttitle = {Extending multimedia languages to support multimodal user interactions (thesis)},\n\tabstract = {Recent advances in recognition technologies, such as speech, touch and gesture, have given rise to a new class of user interfaces that not only explores multiple modalities but also allows for multiple interacting users. 
The development of applications with both multimodal and multiuser interactions raises new specification and execution issues. The specification of multimodal applications is commonly the focus of multimodal interaction research, while the specification of the synchronization of audiovisual media is usually the focus of multimedia research. In this thesis, aiming to assist the specification of such applications, we propose to integrate concepts from those two research areas and to extend multimedia languages with first-class entities to support multiuser and multimodal features. Those entities were instantiated in {NCL} and {HTML}. To evaluate our approach, we performed an evaluation with {NCL} and {HTML} developers to capture evidence of their acceptance of the proposed entities and instantiations in those languages.},\n\tinstitution = {{PUC}-Rio},\n\ttype = {phdthesis},\n\tauthor = {Guedes, Álan Lívio Vasconcelos},\n\tyear = {2017},\n}\n\n","author_short":["Guedes, Á. L. V."],"key":"guedes_extending_2017-1","id":"guedes_extending_2017-1","bibbaseid":"guedes-extendingmultimedialanguagestosupportmultimodaluserinteractionsthesis-2017","role":"author","urls":{},"metadata":{"authorlinks":{}},"downloads":0,"html":""},"bibtype":"thesis","biburl":"http://www.telemidia.puc-rio.br/files/biblio/all.bib","creationDate":"2020-03-03T14:08:14.761Z","downloads":0,"keywords":[],"search_terms":["extending","multimedia","languages","support","multimodal","user","interactions","thesis","guedes"],"title":"Extending multimedia languages to support multimodal user interactions (thesis)","year":2017,"dataSources":["gXSBTZhj3xCWydoZF"]}