Modeling Sensory Effects As First-class Entities in Multimedia Applications. Josué, M., Abreu, R., Barreto, F., Mattos, D., Amorim, G., dos Santos, J., & Muchaluat-Saade, D. In Proceedings of the 9th ACM Multimedia Systems Conference, MMSys '18, pages 225–236, New York, NY, USA, 2018. ACM.
Multimedia applications are usually composed of audiovisual content. Traditional multimedia conceptual models, and consequently declarative multimedia authoring languages, do not support the definition of multiple sensory effects. Multiple sensorial media (mulsemedia) applications use sensory effects that can stimulate touch, smell, and taste, in addition to hearing and sight. Therefore, mulsemedia applications have usually been developed with general-purpose programming languages. To fill this gap, this paper proposes an approach for modeling sensory effects as first-class entities, enabling multimedia applications to synchronize sensorial media with interactive audiovisual content in a high-level specification. Thus, complete descriptions of mulsemedia applications become possible with multimedia models and languages. To validate our ideas, an interactive mulsemedia application example is presented and specified with NCL (Nested Context Language) and Lua. Lua components translate sensory effect high-level attributes into MPEG-V SEM (Sensory Effect Metadata) files. A sensory effect simulator was developed to receive SEM files and simulate mulsemedia application rendering.
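As a rough illustration of the translation step the abstract describes (high-level sensory effect attributes rendered into an MPEG-V SEM file), the Lua sketch below assembles a minimal SEM fragment for a wind effect. It is a hypothetical example, not the authors' NCL/Lua component; the SEDL/SEV element and attribute names (sedl:Effect, sev:WindType, intensity-value, si:pts) follow MPEG-V conventions but should be treated as assumptions here.

-- Hypothetical sketch: turn high-level effect attributes into an MPEG-V SEM
-- (Sensory Effect Metadata) XML fragment. Not the paper's actual component.
local function semEffect(e)
  -- e = { type = "WindType", intensity = 0.5, start = 5000, dur = 3000 } (times in ms)
  return string.format(
    '<sedl:Effect xsi:type="sev:%s" intensity-value="%.2f"' ..
    ' si:pts="%d" duration="%d" activate="true"/>',
    e.type, e.intensity, e.start, e.dur)
end

local function semDocument(effects)
  local lines = {
    '<sedl:SEM xmlns:sedl="urn:mpeg:mpeg-v:2010:01-SEDL-NS"',
    '          xmlns:sev="urn:mpeg:mpeg-v:2010:01-SEV-NS"',
    '          xmlns:si="urn:mpeg:mpeg21:2003:01-DIA-XSI-NS"',
    '          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">',
  }
  for _, e in ipairs(effects) do
    lines[#lines + 1] = "  " .. semEffect(e)
  end
  lines[#lines + 1] = '</sedl:SEM>'
  return table.concat(lines, "\n")
end

-- Usage: a wind effect starting at 5 s, lasting 3 s, at half intensity.
print(semDocument({ { type = "WindType", intensity = 0.5, start = 5000, dur = 3000 } }))
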
@inproceedings{josue_modeling_2018,
	address = {New York, NY, USA},
	series = {{MMSys} '18},
	title = {Modeling {Sensory} {Effects} {As} {First}-class {Entities} in {Multimedia} {Applications}},
	isbn = {978-1-4503-5192-8},
	url = {http://doi.acm.org/10.1145/3204949.3204967},
	doi = {10.1145/3204949.3204967},
	abstract = {Multimedia applications are usually composed of audiovisual content. Traditional multimedia conceptual models, and consequently declarative multimedia authoring languages, do not support the definition of multiple sensory effects. Multiple sensorial media (mulsemedia) applications use sensory effects that can stimulate touch, smell, and taste, in addition to hearing and sight. Therefore, mulsemedia applications have usually been developed with general-purpose programming languages. To fill this gap, this paper proposes an approach for modeling sensory effects as first-class entities, enabling multimedia applications to synchronize sensorial media with interactive audiovisual content in a high-level specification. Thus, complete descriptions of mulsemedia applications become possible with multimedia models and languages. To validate our ideas, an interactive mulsemedia application example is presented and specified with NCL (Nested Context Language) and Lua. Lua components translate sensory effect high-level attributes into MPEG-V SEM (Sensory Effect Metadata) files. A sensory effect simulator was developed to receive SEM files and simulate mulsemedia application rendering.},
	urldate = {2018-09-17},
	booktitle = {Proceedings of the 9th {ACM} {Multimedia} {Systems} {Conference}},
	publisher = {ACM},
	author = {Josué, Marina and Abreu, Raphael and Barreto, Fábio and Mattos, Douglas and Amorim, Glauco and dos Santos, Joel and Muchaluat-Saade, Débora},
	year = {2018},
	pages = {225--236}
}
