Extending multimedia languages to support multimodal user interactions. Guedes, Á. L. V., Azevedo, R. G. d. A., & Barbosa, S. D. J. Multimedia Tools and Applications, 76(4):5691–5720, 2017.
Historically, research in the Multimedia community has focused on output modalities, through studies on timing and multimedia processing. The Multimodal Interaction community, on the other hand, has focused on user-generated modalities, through studies on Multimodal User Interfaces (MUI). In this paper, aiming to assist the development of multimedia applications with MUIs, we propose the integration of concepts from those two communities into a single high-level programming framework. The framework integrates user modalities, both user-generated (e.g., speech, gestures) and user-consumed (e.g., audiovisual, haptic), into declarative programming languages for the specification of interactive multimedia applications. To illustrate our approach, we instantiate the framework in the NCL (Nested Context Language) multimedia language. NCL is the declarative language for developing interactive applications for Brazilian Digital TV and an ITU-T Recommendation for IPTV services. To help evaluate our approach, we discuss a usage scenario and implement it as an NCL application extended with the proposed multimodal features. We also compare the expressiveness of the multimodal NCL against existing multimedia and multimodal languages, for both input and output modalities.
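As a rough illustration of the approach the abstract describes, the hypothetical NCL fragment below shows what treating a user-generated modality as a first-class media object might look like: a speech grammar is declared as an ordinary <media> element, and a causal link starts a video when an utterance is recognized. Element and role names such as onRecognize, the grammar file, and the media type are illustrative assumptions, not necessarily the paper's exact syntax.

<!-- Hypothetical sketch in the spirit of the paper's proposal.
     The role "onRecognize" and the grammar media type are assumptions. -->
<ncl id="multimodalExample" xmlns="http://www.ncl.org.br/NCL3.0/EDTVProfile">
  <head>
    <connectorBase>
      <!-- Causal connector: when an utterance is recognized, start a target node. -->
      <causalConnector id="onRecognizeStart">
        <simpleCondition role="onRecognize"/>
        <simpleAction role="start"/>
      </causalConnector>
    </connectorBase>
  </head>
  <body>
    <!-- User-generated modality: speech input described by an SRGS grammar,
         modeled as an ordinary media object. -->
    <media id="speech" src="commands.srgs" type="application/srgs+xml"/>
    <port id="entry" component="speech"/>
    <!-- User-consumed modality: a conventional audiovisual object. -->
    <media id="video" src="movie.mp4"/>
    <!-- Start the video when the recognizer matches the grammar. -->
    <link xconnector="onRecognizeStart">
      <bind role="onRecognize" component="speech"/>
      <bind role="start" component="video"/>
    </link>
  </body>
</ncl>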
@article{guedes_extending_2017,
	title = {Extending multimedia languages to support multimodal user interactions},
	volume = {76},
	issn = {1573-7721},
	url = {https://doi.org/10.1007/s11042-016-3846-8},
	doi = {10.1007/s11042-016-3846-8},
	abstract = {Historically, research in the Multimedia community has focused on output modalities, through studies on timing and multimedia processing. The Multimodal Interaction community, on the other hand, has focused on user-generated modalities, through studies on Multimodal User Interfaces ({MUI}). In this paper, aiming to assist the development of multimedia applications with {MUIs}, we propose the integration of concepts from those two communities into a single high-level programming framework. The framework integrates user modalities, both user-generated (e.g., speech, gestures) and user-consumed (e.g., audiovisual, haptic), into declarative programming languages for the specification of interactive multimedia applications. To illustrate our approach, we instantiate the framework in the {NCL} (Nested Context Language) multimedia language. {NCL} is the declarative language for developing interactive applications for Brazilian Digital {TV} and an {ITU}-T Recommendation for {IPTV} services. To help evaluate our approach, we discuss a usage scenario and implement it as an {NCL} application extended with the proposed multimodal features. We also compare the expressiveness of the multimodal {NCL} against existing multimedia and multimodal languages, for both input and output modalities.},
	pages = {5691--5720},
	number = {4},
	journaltitle = {Multimedia Tools and Applications},
	author = {Guedes, Álan Lívio Vasconcelos and Azevedo, Roberto Gerson de Albuquerque and Barbosa, Simone Diniz Junqueira},
	year = {2017},
}
