Extending multimedia languages to support multimodal user interactions. Guedes, Á. L., Azevedo, R. G. d. A., & Barbosa, S. D. J. Multimedia Tools and Applications, 76(4):5691–5720, 2017.
Historically, the Multimedia community research has focused on output modalities, through studies on timing and multimedia processing. The Multimodal Interaction community, on the other hand, has focused on user-generated modalities, through studies on Multimodal User Interfaces (MUI). In this paper, aiming to assist the development of multimedia applications with MUIs, we propose the integration of concepts from those two communities in a unique high-level programming framework. The framework integrates user modalities —both user-generated (e.g., speech, gestures) and user-consumed (e.g., audiovisual, haptic)— in declarative programming languages for the specification of interactive multimedia applications. To illustrate our approach, we instantiate the framework in the NCL (Nested Context Language) multimedia language. NCL is the declarative language for developing interactive applications for Brazilian Digital TV and an ITU-T Recommendation for IPTV services. To help evaluate our approach, we discuss a usage scenario and implement it as an NCL application extended with the proposed multimodal features. Also, we compare the expressiveness of the multimodal NCL against existing multimedia and multimodal languages, for both input and output modalities.
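The gist of the proposal is to treat user input modalities as first-class media objects in the NCL document, wired to the rest of the presentation through the language's causal links. Below is a minimal, hypothetical sketch of what such a document might look like: a speech-input object (an SRGS grammar of accepted utterances) triggers an image overlay when an utterance is recognized. The onRecognize role, the application/srgs+xml type, and the file names are illustrative assumptions, not the paper's exact syntax.

<?xml version="1.0" encoding="UTF-8"?>
<ncl id="multimodalExample" xmlns="http://www.ncl.org.br/NCL3.0/EDTVProfile">
  <head>
    <connectorBase>
      <!-- hypothetical connector: when recognition occurs, start a target node -->
      <causalConnector id="onRecognizeStart">
        <simpleCondition role="onRecognize"/>
        <simpleAction role="start"/>
      </causalConnector>
    </connectorBase>
  </head>
  <body>
    <port id="entry" component="video"/>
    <!-- output modality: an ordinary audiovisual media object -->
    <media id="video" src="movie.mp4"/>
    <!-- input modality: user speech, modeled as a media object whose content
         is an SRGS grammar of accepted utterances (assumed media type) -->
    <media id="voiceCmd" src="commands.srgs" type="application/srgs+xml"/>
    <media id="info" src="info.png"/>
    <!-- causal link: a recognized utterance from voiceCmd starts the overlay -->
    <link xconnector="onRecognizeStart">
      <bind role="onRecognize" component="voiceCmd"/>
      <bind role="start" component="info"/>
    </link>
  </body>
</ncl>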
@article{2017_02_guedes,
  author   = "Guedes, {\'A}lan L{\'i}vio Vasconcelos and Azevedo, Roberto Gerson de Albuquerque and Barbosa, Simone Diniz Junqueira",
  title    = "Extending multimedia languages to support multimodal user interactions",
  journal  = "Multimedia Tools and Applications",
  year     = "2017",
  volume   = "76",
  number   = "4",
  pages    = "5691--5720",
  issn     = "1573-7721",
  doi      = "10.1007/s11042-016-3846-8",
  url      = "http://dx.doi.org/10.1007/s11042-016-3846-8",
  abstract = "Historically, the Multimedia community research has focused on
              output modalities, through studies on timing and multimedia
              processing. The Multimodal Interaction community, on the other
              hand, has focused on user-generated modalities, through studies
              on Multimodal User Interfaces (MUI). In this paper, aiming to
              assist the development of multimedia applications with MUIs, we
              propose the integration of concepts from those two communities
              in a unique high-level programming framework. The framework
              integrates user modalities ---both user-generated (e.g., speech,
              gestures) and user-consumed (e.g., audiovisual, haptic)--- in
              declarative programming languages for the specification of
              interactive multimedia applications. To illustrate our approach,
              we instantiate the framework in the NCL (Nested Context
              Language) multimedia language. NCL is the declarative language
              for developing interactive applications for Brazilian Digital TV
              and an ITU-T Recommendation for IPTV services. To help evaluate
              our approach, we discuss a usage scenario and implement it as an
              NCL application extended with the proposed multimodal features.
              Also, we compare the expressiveness of the multimodal NCL
              against existing multimedia and multimodal languages, for both
              input and output modalities.",
}

