Connecting users to virtual worlds within MPEG-V standardization. Han, S., Han, J., Kim, J. D. K., & Kim, C. Signal Processing: Image Communication, 28(2):97–113, February 2013.
Virtual worlds such as Second Life and 3D internet/broadcasting services have become increasingly popular. A life-scale virtual-world presentation and intuitive interaction between users and virtual worlds would provide a more natural and immersive experience. The emergence of novel interaction technologies, such as facial-expression/body-motion tracking and remote interaction for virtual object manipulation, could provide a strong connection between users in the real world and avatars in the virtual world. For virtual worlds to achieve wide acceptance and use, the various types of novel interaction devices need a unified interaction format between the real world and the virtual world. Thus, MPEG-V Media Context and Control (ISO/IEC 23005) standardizes such connecting information. The paper provides an overview and a usage example of MPEG-V from the real world to the virtual world (R2V), focusing on interfaces for controlling avatars and virtual objects in the virtual world with real-world devices. In particular, we investigate how the MPEG-V framework can be applied to facial animation and hand-based 3D manipulation using an intelligent camera. In addition, to manipulate objects intuitively in a 3D virtual environment, we present two interaction techniques using motion sensors: a two-handed spatial 3D interaction approach and a gesture-based interaction approach.
@article{han_connecting_2013,
	title = {Connecting users to virtual worlds within {MPEG}-{V} standardization},
	volume = {28},
	issn = {0923-5965},
	url = {http://www.sciencedirect.com/science/article/pii/S0923596512002020},
	doi = {10.1016/j.image.2012.10.014},
	abstract = {Virtual worlds such as Second Life and 3D internet/broadcasting services have become increasingly popular. A life-scale virtual-world presentation and intuitive interaction between users and virtual worlds would provide a more natural and immersive experience. The emergence of novel interaction technologies, such as facial-expression/body-motion tracking and remote interaction for virtual object manipulation, could provide a strong connection between users in the real world and avatars in the virtual world. For virtual worlds to achieve wide acceptance and use, the various types of novel interaction devices need a unified interaction format between the real world and the virtual world. Thus, MPEG-V Media Context and Control (ISO/IEC 23005) standardizes such connecting information. The paper provides an overview and a usage example of MPEG-V from the real world to the virtual world (R2V), focusing on interfaces for controlling avatars and virtual objects in the virtual world with real-world devices. In particular, we investigate how the MPEG-V framework can be applied to facial animation and hand-based 3D manipulation using an intelligent camera. In addition, to manipulate objects intuitively in a 3D virtual environment, we present two interaction techniques using motion sensors: a two-handed spatial 3D interaction approach and a gesture-based interaction approach.},
	number = {2},
	urldate = {2015-07-14},
	journal = {Signal Processing: Image Communication},
	author = {Han, Seungju and Han, Jae-Joon and Kim, James D. K. and Kim, Changyeong},
	month = feb,
	year = {2013},
	pages = {97--113}
}
