Experiencing Thing2Reality: Transforming 2D Content into Conditioned Multiviews and 3D Gaussian Objects for XR Communication. Hu, E., Li, M., Qian, X., Olwal, A., Kim, D., Heo, S., & Du, R. In Adjunct Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, UIST Adjunct '24, New York, NY, USA, 2024. Association for Computing Machinery.
During remote communication, participants share both digital and physical content, such as product designs, digital assets, and environments, to enhance mutual understanding. Recent advances in augmented communication have enabled users to swiftly create digital 2D copies of physical objects from video feeds and share them into a shared space. However, the conventional 2D representation of digital objects restricts users’ ability to spatially reference items in a shared immersive environment. To address these challenges, we propose Thing2Reality, an Extended Reality (XR) communication platform designed to enhance spontaneous discussions of both digital and physical items during remote sessions. With Thing2Reality, users can quickly materialize ideas or physical objects in an immersive environment and share them as conditioned multiview renderings or 3D Gaussians. Our system enables users to interact with remote objects or discuss concepts in a collaborative manner.
@inproceedings{10.1145/3672539.3686740,
author = {Hu, Erzhen and Li, Mingyi and Qian, Xun and Olwal, Alex and Kim, David and Heo, Seongkook and Du, Ruofei},
title = {Experiencing Thing2Reality: Transforming 2D Content into Conditioned Multiviews and 3D Gaussian Objects for XR Communication},
year = {2024},
isbn = {9798400707186},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3672539.3686740},
doi = {10.1145/3672539.3686740},
abstract = {During remote communication, participants share both digital and physical content, such as product designs, digital assets, and environments, to enhance mutual understanding. Recent advances in augmented communication have enabled users to swiftly create digital 2D copies of physical objects from video feeds and share them into a shared space. However, the conventional 2D representation of digital objects restricts users’ ability to spatially reference items in a shared immersive environment. To address these challenges, we propose Thing2Reality, an Extended Reality (XR) communication platform designed to enhance spontaneous discussions of both digital and physical items during remote sessions. With Thing2Reality, users can quickly materialize ideas or physical objects in an immersive environment and share them as conditioned multiview renderings or 3D Gaussians. Our system enables users to interact with remote objects or discuss concepts in a collaborative manner.},
booktitle = {Adjunct Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology},
articleno = {23},
numpages = {3},
keywords = {augmented communication, co-presence, extended reality, image-to-3D, remote collaboration, spatial referencing},
location = {Pittsburgh, PA, USA},
series = {UIST Adjunct '24}
}