Interpretation of Depth from Scaled Motion Parallax in Virtual Reality. Teng, X., Wilcox, L. M., & Allison, R. S. In Journal of Vision (Vision Sciences Society Abstracts), volume 21, page 2035. 2021. doi: 10.1167/jov.21.9.2035
Abstract: Humans use visual, vestibular, kinesthetic and other cues to effectively navigate through the world. Therefore, conflict between these sources of information has potentially significant implications for human perception of geometric layout. Previous work has found that introducing gain differences between physical and virtual head movement had little effect on distance perception. However, motion parallax is known to be a potent cue to relative depth. In the present study, we explore the impact of conflict between physical and portrayed self-motion on the perception of object shape. To do so, we varied the gain between virtual and physical head motion (ranging from a factor of 0.5 to 2) and measured the effect on depth perception. Observers viewed a 'fold' stimulus, a convex dihedral angle formed by two irregularly textured, wall-oriented planes connected at a common vertical edge. Stimuli were rendered and presented using head-mounted displays (Oculus Rift S or Quest in Rift S emulation mode). On each trial, observers adjusted the angle of the fold until the two joined planes appeared perpendicular. To assess the role of stereopsis, we tested binocularly and monocularly. To introduce motion parallax, observers swayed laterally through a distance of 30 cm at 0.5 Hz, timed to a metronome beat; this motion was multiplied by the gain to produce the virtual viewpoint. Our results showed that gain had little effect on depth perception in the binocular test conditions. Using a model incorporating self and object motion, we computed predicted perceived depths based on the adjusted angles and then compared these with each observer's input. The modelled outcomes were very consistent across visual manipulations, suggesting that observers have remarkably accurate perception of object motion under these conditions. Additional analyses predict corresponding variations in distance perception, and we will test these hypotheses in future experiments.
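The gain manipulation lends itself to a short illustration. Below is a minimal Python sketch of the small-angle motion-parallax geometry. The 30 cm sway, 0.5 Hz pacing, and 0.5–2 gain range come from the abstract; the fold depth (10 cm), viewing distance (1 m), and all function names are hypothetical stand-ins, not the authors' model or code.

```python
import numpy as np

# Minimal sketch of the viewpoint-gain manipulation described in the
# abstract. The 30 cm sway, 0.5 Hz pacing, and 0.5-2 gain range are from
# the abstract; the fold depth, viewing distance, and all function names
# are illustrative assumptions, not the authors' implementation.

SWAY_PEAK_TO_PEAK = 0.30   # m, lateral sway distance (from the abstract)
SWAY_FREQ = 0.5            # Hz, metronome-paced sway (from the abstract)

def lateral_sway(t):
    """Physical lateral head position (m) during metronome-paced sway."""
    return (SWAY_PEAK_TO_PEAK / 2) * np.sin(2 * np.pi * SWAY_FREQ * t)

def retinal_parallax(translation, depth, distance):
    """Small-angle motion parallax: dtheta ~ T * d / D**2 (radians)."""
    return translation * depth / distance ** 2

def depth_from_parallax(dtheta, assumed_translation, distance):
    """Invert the parallax relation under an assumed self-translation."""
    return dtheta * distance ** 2 / assumed_translation

# Hypothetical stimulus: a fold vertex 10 cm deep viewed at 1 m.
true_depth, view_dist = 0.10, 1.0
t = np.linspace(0.0, 2.0, 2001)          # one 2 s sway cycle
physical_sway = np.ptp(lateral_sway(t))  # peak-to-peak physical motion

for gain in (0.5, 1.0, 2.0):
    rendered_sway = gain * physical_sway  # virtual viewpoint = gain * head motion
    dtheta = retinal_parallax(rendered_sway, true_depth, view_dist)
    # If the observer scales the retinal parallax by their *physical* sway,
    # the parallax-specified depth is mis-scaled by the gain:
    naive_depth = depth_from_parallax(dtheta, physical_sway, view_dist)
    print(f"gain {gain}: parallax-specified depth = {naive_depth * 100:.0f} cm")
```

Under these assumptions, an observer who scaled retinal parallax by their physical sway would see the parallax-specified depth vary fourfold (5 cm to 20 cm) across the 0.5–2 gain range; the abstract's finding that binocular depth settings were largely unaffected by gain suggests observers did not interpret the stimulus this naively, at least when stereopsis was available.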
@incollection{Teng:2021ty,
abstract = {Humans use visual, vestibular, kinesthetic and other cues to effectively navigate through the world. Therefore, conflict between these sources of information has potentially significant implications for human perception of geometric layout. Previous work has found that introducing gain differences between physical and virtual head movement had little effect on distance perception. However, motion parallax is known to be a potent cue to relative depth. In the present study, we explore the impact of conflict between physical and portrayed self-motion on the perception of object shape. To do so, we varied the gain between virtual and physical head motion (ranging from a factor of 0.5 to 2) and measured the effect on depth perception. Observers viewed a `fold' stimulus, a convex dihedral angle formed by two irregularly textured, wall-oriented planes connected at a common vertical edge. Stimuli were rendered and presented using head-mounted displays (Oculus Rift S or Quest in Rift S emulation mode). On each trial, observers adjusted the angle of the fold until the two joined planes appeared perpendicular. To assess the role of stereopsis, we tested binocularly and monocularly. To introduce motion parallax, observers swayed laterally through a distance of 30 cm at 0.5 Hz, timed to a metronome beat; this motion was multiplied by the gain to produce the virtual viewpoint. Our results showed that gain had little effect on depth perception in the binocular test conditions. Using a model incorporating self and object motion, we computed predicted perceived depths based on the adjusted angles and then compared these with each observer's input. The modelled outcomes were very consistent across visual manipulations, suggesting that observers have remarkably accurate perception of object motion under these conditions. Additional analyses predict corresponding variations in distance perception, and we will test these hypotheses in future experiments.
},
author = {Teng, X. and Wilcox, L. M. and Allison, R. S.},
booktitle = {Journal of Vision (Vision Sciences Society Abstracts)},
date-added = {2021-09-06 09:10:13 -0400},
date-modified = {2021-09-06 09:10:13 -0400},
doi = {10.1167/jov.21.9.2035},
keywords = {Stereopsis},
pages = {2035},
title = {Interpretation of Depth from Scaled Motion Parallax in Virtual Reality},
volume = {21},
year = {2021},
url-1 = {https://doi.org/10.1167/jov.21.9.2035}}
{"_id":"zyDq6bbDr9tEC5XvE","bibbaseid":"teng-wilcox-allison-interpretationofdepthfromscaledmotionparallaxinvirtualreality-2021","author_short":["Teng, X.","Wilcox, L. M.","Allison, R. S."],"bibdata":{"bibtype":"incollection","type":"incollection","abstract":"Humans use visual, vestibular, kinesthetic and other cues to effectively navigate through the world. Therefore, conflict between these sources of information has potentially significant implications for human perception of geometric layout. Previous work has found that introducing gain differences between physical and virtual head movement had little effect on distance perception. However, motion parallax is known to be a potent cue to relative depth. In the present study, we explore the impact of conflict between physical and portrayed self-motion on perception of object shape. To do so we varied the gain between virtual and physical head motion (ranging from a factor of 0.5 to 2) and measured the effect on depth perception. Observers viewed a `fold' stimulus, a convex dihedral angle formed by two irregularly-textured, wall-oriented planes connected at a common vertical edge. Stimuli were rendered and presented using head mounted displays (Oculus Rift S or Quest in Rift S emulation mode). On each trial, observers adjusted the angle of the fold till the two joined planes appeared perpendicular. To assess the role of stereopsis we tested binocularly and monocularly. To introduced motion parallax, observers swayed laterally through a distance of 30 cm at 0.5 Hz timed to a metronome beat; this motion was multiplied by the gain to produce the virtual view-point. Our results showed that gain had little effect on depth perception in the binocular test conditions. Using a model incorporating self and object motion, we computed predicted perceived depths based on the adjusted angles and then compared these with each observer's input. The modelled outcomes were very consistent across visual manipulations, suggesting that observers have remarkably accurate perception of object motion under these conditions. Additional analyses predict corresponding variations in distance perception and we will test these hypotheses in future experiments. ","author":[{"propositions":[],"lastnames":["Teng"],"firstnames":["X."],"suffixes":[]},{"propositions":[],"lastnames":["Wilcox"],"firstnames":["L.","M."],"suffixes":[]},{"propositions":[],"lastnames":["Allison"],"firstnames":["R.","S."],"suffixes":[]}],"booktitle":"Journal of Vision (Vision Sciences Society Abstracts)","date-added":"2021-09-06 09:10:13 -0400","date-modified":"2021-09-06 09:10:13 -0400","doi":"10.1167/jov.21.9.2035","keywords":"Stereopsis","pages":"2035","title":"Interpretation of Depth from Scaled Motion Parallax in Virtual Reality","volume":"21","year":"2021","url-1":"https://doi.org/10.1167/jov.21.9.2035","bibtex":"@incollection{Teng:2021ty,\n\tabstract = {Humans use visual, vestibular, kinesthetic and other cues to effectively navigate through the world. Therefore, conflict between these sources of information has potentially significant implications for human perception of geometric layout. Previous work has found that introducing gain differences between physical and virtual head movement had little effect on distance perception. However, motion parallax is known to be a potent cue to relative depth. In the present study, we explore the impact of conflict between physical and portrayed self-motion on perception of object shape. 
To do so we varied the gain between virtual and physical head motion (ranging from a factor of 0.5 to 2) and measured the effect on depth perception. Observers viewed a `fold' stimulus, a convex dihedral angle formed by two irregularly-textured, wall-oriented planes connected at a common vertical edge. Stimuli were rendered and presented using head mounted displays (Oculus Rift S or Quest in Rift S emulation mode). On each trial, observers adjusted the angle of the fold till the two joined planes appeared perpendicular. To assess the role of stereopsis we tested binocularly and monocularly. To introduced motion parallax, observers swayed laterally through a distance of 30 cm at 0.5 Hz timed to a metronome beat; this motion was multiplied by the gain to produce the virtual view-point. Our results showed that gain had little effect on depth perception in the binocular test conditions. Using a model incorporating self and object motion, we computed predicted perceived depths based on the adjusted angles and then compared these with each observer's input. The modelled outcomes were very consistent across visual manipulations, suggesting that observers have remarkably accurate perception of object motion under these conditions. Additional analyses predict corresponding variations in distance perception and we will test these hypotheses in future experiments.\n},\n\tauthor = {Teng, X. and Wilcox, L. M. and Allison, R. S.},\n\tbooktitle = {Journal of Vision (Vision Sciences Society Abstracts)},\n\tdate-added = {2021-09-06 09:10:13 -0400},\n\tdate-modified = {2021-09-06 09:10:13 -0400},\n\tdoi = {10.1167/jov.21.9.2035},\n\tkeywords = {Stereopsis},\n\tpages = {2035},\n\ttitle = {Interpretation of Depth from Scaled Motion Parallax in Virtual Reality},\n\tvolume = {21},\n\tyear = {2021},\n\turl-1 = {https://doi.org/10.1167/jov.21.9.2035}}\n\n\n\n","author_short":["Teng, X.","Wilcox, L. M.","Allison, R. S."],"key":"Teng:2021ty","id":"Teng:2021ty","bibbaseid":"teng-wilcox-allison-interpretationofdepthfromscaledmotionparallaxinvirtualreality-2021","role":"author","urls":{"-1":"https://doi.org/10.1167/jov.21.9.2035"},"keyword":["Stereopsis"],"metadata":{"authorlinks":{}}},"bibtype":"incollection","biburl":"https://bibbase.org/network/files/ibWG96BS4w7ibooE9","dataSources":["BPKPSXjrbMGteC59J","MpMK4SvZzj5Fww5vJ","YbBWRH5Fc7xRr8ghk","szZaibkmSiiQBFQG8","DoyrDTpJ7HHCtki3q","JaoxzeTFRfvwgLoCW","XKwRm5Lx8Z9bzSzaP","AELuRZBpnp7nRDaqw"],"keywords":["stereopsis"],"search_terms":["interpretation","depth","scaled","motion","parallax","virtual","reality","teng","wilcox","allison"],"title":"Interpretation of Depth from Scaled Motion Parallax in Virtual Reality","year":2021}