Improving the Authoring of Web-based Interactive E-books with FableJS.
Silva, A.; de Souza, W.; Moraes, D.; Azevedo, R.; and Neto, C. S.
In Anais da VII Escola Regional de Computação do Ceará, Maranhão e Piauí, pages 182–189, Porto Alegre, RS, Brasil, 2019. SBC.
@inproceedings{2019_11_silva,
  author={Alfredo Silva and Welton de Souza and Daniel Moraes and Roberto Azevedo and Carlos Soares Neto},
  title={Improving the Authoring of Web-based Interactive E-books with FableJS},
  booktitle={Anais da VII Escola Regional de Computação do Ceará, Maranhão e Piauí},
  location={São Luís},
  year={2019},
  keywords={},
  issn={0000-0000},
  pages={182--189},
  publisher={SBC},
  address={Porto Alegre, RS, Brasil},
  url={https://sol.sbc.org.br/index.php/ercemapi/article/view/8861},
}
Subjective Evaluation of 360-degree Sensory Experiences.
Guedes, Á. L. V.; Azevedo, R. G. d. A.; Frossard, P.; Barbosa, S. D. J.; and Colcher, S.
In IEEE 21st International Workshop on Multimedia Signal Processing, of MMSP'19, 6 pages, Kuala Lumpur, Malaysia, September 2019.
@inproceedings{2019_09b_guedes,
  title={Subjective Evaluation of 360-degree Sensory Experiences},
  author={Guedes, Álan Lívio Vasconcelos and Azevedo, Roberto Gerson de Albuquerque and Frossard, Pascal and Barbosa, Simone D.J. and Colcher, Sérgio},
  year={2019},
  month={9},
  pages={6},
  booktitle={IEEE 21st International Workshop on Multimedia Signal Processing},
  series={MMSP'19},
  address={Kuala Lumpur, Malaysia},
  abstract={Traditionally, most multimedia content has been developed to
    stimulate two of the human senses, i.e., sight and hearing. Due to recent
    technological advancements, however, innovative services have been
    developed that provide more realistic, immersive, and engaging experiences
    to the audience. Omnidirectional (i.e., 360-degree) video, for instance,
    is becoming increasingly popular. It allows the viewer to navigate the
    full 360-degree view of a scene from a specific point. In particular,
    when consumed through head-mounted displays, 360-degree videos provide
    increased immersion and sense of presence. The use of multi-sensory
    effects ---e.g., wind, vibration, and scent--- has also been explored by
    recent work, which allows an improved experience by stimulating other
    users' senses through sensory effects that go beyond the audiovisual
    content. Understanding how these additional multi-sensory effects affect
    the users' perceived quality of experience~(QoE) in 360-degree, however,
    is still an open research problem at large. As a step to better
    understand the QoE of immersive sensory experiences, this paper presents
    a test-bed and discusses a user-focused study on a scenario in which the
    user is immersed in the 360-degree video content and is stimulated
    through additional sensory effects. Quantitative results indicated that
    the sensorial effects can considerably increase the sense of presence of
    360-degree videos. Qualitative results provided us with a better view of
    the limitations of current technologies and interesting insights such as
    the users' sense of surprise.},
}
Traditionally, most multimedia content has been developed to stimulate two of the human senses, i.e., sight and hearing. Due to recent technological advancements, however, innovative services have been developed that provide more realistic, immersive, and engaging experiences to the audience. Omnidirectional (i.e., 360-degree) video, for instance, is becoming increasingly popular. It allows the viewer to navigate the full 360-degree view of a scene from a specific point. In particular, when consumed through head-mounted displays, 360-degree videos provide increased immersion and sense of presence. The use of multi-sensory effects —e.g., wind, vibration, and scent— has also been explored by recent work, which allows an improved experience by stimulating other users' senses through sensory effects that go beyond the audiovisual content. Understanding how these additional multi-sensory effects affect the users' perceived quality of experience (QoE) in 360-degree, however, is still an open research problem at large. As a step to better understand the QoE of immersive sensory experiences, this paper presents a test-bed and discusses a user-focused study on a scenario in which the user is immersed in the 360-degree video content and is stimulated through additional sensory effects. Quantitative results indicated that the sensorial effects can considerably increase the sense of presence of 360-degree videos. Qualitative results provided us with a better view of the limitations of current technologies and interesting insights such as the users' sense of surprise.
Graph-based detection of seams in 360-degree images.
de Simone*, F.; Azevedo*, R. G. d. A.; Kim, S.; and Frossard, P.
In 2019 IEEE International Conference on Image Processing (ICIP), of ICIP'19, pages 3776–3780, Taipei, Taiwan, September 2019.
*Equal contributions.
@inproceedings{2019_09_desimone,
  title={Graph-based detection of seams in 360-degree images},
  author={de Simone*, Francesca and Azevedo*, Roberto Gerson de Albuquerque and Kim, Sohyeong and Frossard, Pascal},
  month={Sept},
  year={2019},
  doi={10.1109/ICIP.2019.8803578},
  issn={2381-8549},
  pages={3776--3780},
  note={*Equal contributions.},
  booktitle={2019 IEEE International Conference on Image Processing (ICIP)},
  series={ICIP'19},
  address={Taipei, Taiwan},
  keywords={Omnidirectional image; cube map projection; compression; visual distortion; quality metric},
  abstract={In this paper, we propose an algorithm to detect a specific kind
    of distortions, referred to as seams, which commonly occur when a
    360-degree image is represented in planar domain, e.g., via the Cube Map
    (CM) projection, and undergoes lossy compression. The proposed algorithm
    exploits a graph-based representation to account for the actual sampling
    density of the 360-degree signal in the native spherical domain. The CM
    image is considered as a signal lying on a graph defined on the spherical
    surface. The spectra of the processed and the original signals, computed
    by applying the Graph Fourier Transform, are compared to detect the
    seams. To test our method a dataset of compressed CM 360-degree images,
    annotated by experts, has been created. The performance of the proposed
    algorithm is compared to those achieved by baseline metrics, as well as
    to the same approach based on spectral comparison but ignoring the
    spherical nature of the signal. The experimental results show that the
    proposed method has the best performance and can successfully detect up
    to approximately 90\% of visible seams on our dataset.},
}
In this paper, we propose an algorithm to detect a specific kind of distortions, referred to as seams, which commonly occur when a 360-degree image is represented in planar domain, e.g., via the Cube Map (CM) projection, and undergoes lossy compression. The proposed algorithm exploits a graph-based representation to account for the actual sampling density of the 360-degree signal in the native spherical domain. The CM image is considered as a signal lying on a graph defined on the spherical surface. The spectra of the processed and the original signals, computed by applying the Graph Fourier Transform, are compared to detect the seams. To test our method a dataset of compressed CM 360-degree images, annotated by experts, has been created. The performance of the proposed algorithm is compared to those achieved by baseline metrics, as well as to the same approach based on spectral comparison but ignoring the spherical nature of the signal. The experimental results show that the proposed method has the best performance and can successfully detect up to approximately 90% of visible seams on our dataset.
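The graph-spectral idea at the heart of this abstract can be illustrated compactly: treat the pixels as a signal on a graph built on the sphere, take the Graph Fourier Transform as the projection on the Laplacian eigenvectors, and flag a seam-like discontinuity by the energy it injects into high graph frequencies. The Python sketch below is not the authors' implementation; the Fibonacci sampling (standing in for cube-map pixel positions), the Gaussian edge weights, and the median frequency split are all illustrative assumptions.

import numpy as np

# Minimal sketch of graph-spectral seam detection; sampling, weights, and
# the decision statistic are assumptions, not the paper's actual choices.

def sphere_points(n):
    # Quasi-uniform unit-sphere samples (Fibonacci lattice), standing in
    # for cube-map pixel positions mapped back onto the sphere.
    i = np.arange(n)
    phi = np.pi * (3.0 - np.sqrt(5.0)) * i
    z = 1.0 - 2.0 * (i + 0.5) / n
    r = np.sqrt(1.0 - z * z)
    return np.stack([r * np.cos(phi), r * np.sin(phi), z], axis=1)

def spherical_laplacian(pts, sigma=0.2):
    # Local graph with Gaussian weights on geodesic distances; L = D - W.
    geo = np.arccos(np.clip(pts @ pts.T, -1.0, 1.0))
    W = np.exp(-geo**2 / (2.0 * sigma**2))
    W[geo > 2.0 * sigma] = 0.0          # keep only local edges
    np.fill_diagonal(W, 0.0)
    return np.diag(W.sum(axis=1)) - W

def gft(L, x):
    # Graph Fourier Transform: projection on the Laplacian eigenvectors;
    # the eigenvalues act as graph frequencies.
    lam, U = np.linalg.eigh(L)
    return lam, U.T @ x

pts = sphere_points(500)
L = spherical_laplacian(pts)
original = pts[:, 2]                     # a smooth signal on the sphere
processed = original.copy()
processed[pts[:, 0] > 0.95] += 0.5       # localized jump, mimicking a seam

lam, s_ref = gft(L, original)
_, s_proc = gft(L, processed)

# A seam injects energy into the high graph frequencies.
hi = lam > np.median(lam)
dev = np.sum((s_proc[hi] - s_ref[hi])**2) / (np.sum(s_ref[hi]**2) + 1e-12)
print(f"high-frequency energy deviation: {dev:.3f}")   # large value => seam

Working on the sphere rather than on the planar faces is the point: a discontinuity across a cube-map face boundary shows up as high-frequency energy in the spherical graph spectrum, which is exactly what the paper's planar baseline ("ignoring the spherical nature of the signal") misses.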
On the First JND and Break in Presence of 360-degree Content: An Exploratory Study.
Azevedo, R. G. d. A.; Birkbeck, N.; Janatra, I.; Adsumilli, B.; and Frossard, P.
In Proceedings of the 11th ACM Workshop on Immersive Mixed and Virtual Environment Systems, of MMVE '19, pages 1–3, New York, NY, USA, 2019. ACM.
@inproceedings{2019_06_azevedo,
  author={Azevedo, Roberto Gerson de Albuquerque and Birkbeck, Neil and Janatra, Ivan and Adsumilli, Balu and Frossard, Pascal},
  title={On the First JND and Break in Presence of 360-degree Content: An Exploratory Study},
  booktitle={Proceedings of the 11th ACM Workshop on Immersive Mixed and Virtual Environment Systems},
  series={MMVE '19},
  year={2019},
  isbn={978-1-4503-6299-3},
  location={Amherst, Massachusetts},
  pages={1--3},
  numpages={3},
  url={http://doi.acm.org/10.1145/3304113.3326115},
  doi={10.1145/3304113.3326115},
  acmid={3326115},
  publisher={ACM},
  address={New York, NY, USA},
  keywords={360-degree video, JND, presence, visual distortions, visual quality},
  abstract={Unlike traditional planar 2D visual content, immersive 360-degree
    images and videos undergo particular processing steps and are intended to
    be consumed via head-mounted displays (HMDs). To get a deeper
    understanding on the perception of 360-degree visual distortions when
    consumed through HMDs, we perform an exploratory task-based subjective
    study in which we have asked subjects to define the first noticeable
    difference and break-in-presence points when incrementally adding
    specific compression artifacts. The results of our study: give insights
    on the range of allowed visual distortions for 360-degree content; show
    that the added visual distortions are more tolerable in mono than in
    stereoscopic 3D; and identify issues with current 360-degree objective
    quality metrics.},
}
Unlike traditional planar 2D visual content, immersive 360-degree images and videos undergo particular processing steps and are intended to be consumed via head-mounted displays (HMDs). To get a deeper understanding on the perception of 360-degree visual distortions when consumed through HMDs, we perform an exploratory task-based subjective study in which we have asked subjects to define the first noticeable difference and break-in-presence points when incrementally adding specific compression artifacts. The results of our study: give insights on the range of allowed visual distortions for 360-degree content; show that the added visual distortions are more tolerable in mono than in stereoscopic 3D; and identify issues with current 360-degree objective quality metrics.
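The abstract describes an incremental procedure: distortion strength is raised step by step until the subject reports a first noticeable difference, and later a break in presence. The sketch below is a generic ascending method-of-limits trial of that kind, not the paper's actual protocol; the level range, step size, and stopping rule are assumptions, since the abstract does not detail them.

def run_ascending_trial(show_stimulus, noticed, presence_broken, max_level=20):
    # Ascending method of limits: raise the distortion level one step at a
    # time; record the first level the subject notices (first JND) and the
    # level at which presence breaks (BIP), then stop.
    first_jnd, bip = None, None
    for level in range(1, max_level + 1):
        show_stimulus(level)
        if first_jnd is None and noticed(level):
            first_jnd = level
        if first_jnd is not None and presence_broken(level):
            bip = level
            break
    return first_jnd, bip

# Toy run with a simulated subject whose thresholds sit at levels 5 and 12.
jnd, bip = run_ascending_trial(lambda k: None, lambda k: k >= 5, lambda k: k >= 12)
print(jnd, bip)   # -> 5 12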