Distance perception when real and virtual head motion do not match. Cutone, M., Wilcox, L. M., & Allison, R. S. In Vestibular Oriented Research Meeting, Journal of Vestibular Research, volume 30, page 139, 2020. doi: 10.3233/VES-200699

Abstract: For self-generated motion parallax, a sense of head velocity is needed to estimate distance from object motion (1). This information can be obtained from vestibular, proprioceptive, and visual sources. If the magnitude of efferent signals from the vestibular system produced by head motion does not correlate with the velocity gradient of the visible optic flow pattern, a conflict arises which leads to a breakdown of motion-distance invariance. This potentially results in distortions of perceived distances to objects, as visual and vestibular signals are non-concordant. We assessed this prediction by varying the gain between the observer's physical head motion and simulated motion. Given that the relative and absolute motion parallax would be greater than expected from head motion when gain was greater than 1.0, we anticipated that this manipulation would result in objects appearing closer to the observer. Using an HMD, we presented targets 1 to 3 meters away from the observer within a cue-rich environment with textured walls and floors. Participants stood and swayed laterally at a rate of 0.5 Hz. Lateral gain was applied by amplifying their real position by factors of 1.0 to 3.0, then using that to set the instantaneous viewpoint within the virtual environment. After presentation, the target disappeared, and the participant performed a blind walk and reached for it. Their hand position was recorded, and we computed positional errors relative to the target. We found no effect of our motion parallax gain manipulation on binocular reaching accuracy. To evaluate the role of stereopsis in counteracting the anticipated distortion in perceived space, we tested observers on the same task monocularly. In this case, distances were perceived as nearer as gain increased, but the effects were relatively small. Taken together, our results suggest that observers are flexible in their interpretation of observer-produced motion parallax during active head movement. This provides considerable tolerance of spatial perception to mismatches between physical and virtual motion in rich virtual environments.
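The paper itself includes no code; the following is a minimal sketch of how the lateral-gain manipulation described in the abstract might be implemented per rendered frame. The HeadPose type, apply_lateral_gain function, and the sway origin parameter are illustrative assumptions, not the authors' implementation or any HMD vendor's API.

    from dataclasses import dataclass

    @dataclass
    class HeadPose:
        x: float  # lateral position (m), relative to a calibrated origin
        y: float  # vertical position (m)
        z: float  # depth position (m)

    def apply_lateral_gain(real: HeadPose, gain: float, origin_x: float = 0.0) -> HeadPose:
        """Amplify lateral head displacement about the sway origin by `gain`
        (1.0 = veridical viewpoint, >1.0 = exaggerated parallax).
        Vertical and depth components are passed through unchanged."""
        virtual_x = origin_x + gain * (real.x - origin_x)
        return HeadPose(x=virtual_x, y=real.y, z=real.z)

    # Per-frame usage: the rendered viewpoint follows the gained position, so a
    # gain of 3.0 triples the optic-flow velocity for the same physical sway.
    pose = HeadPose(x=0.05, y=1.65, z=0.0)   # 5 cm of real lateral sway
    viewpoint = apply_lateral_gain(pose, gain=3.0)
    assert abs(viewpoint.x - 0.15) < 1e-9

Under this scheme a gain of 1.0 reproduces the natural coupling between head motion and optic flow, while larger gains create exactly the visual-vestibular mismatch the study probes.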
@incollection{Cutone:wb,
abstract = {For self-generated motion parallax, a sense of head velocity is needed to estimate distance from object motion (1). This information can be obtained from vestibular, proprioceptive, and visual sources. If the magnitude of efferent signals from the vestibular system produced by head motion does not correlate with the velocity gradient of the visible optic flow pattern, a conflict arises which leads to a breakdown of motion-distance invariance. This potentially results in distortions of perceived distances to objects, as visual and vestibular signals are non-concordant. We assessed this prediction by varying the gain between the observer's physical head motion and simulated motion. Given that the relative and absolute motion parallax would be greater than expected from head motion when gain was greater than 1.0, we anticipated that this manipulation would result in objects appearing closer to the observer. Using an HMD, we presented targets 1 to 3 meters away from the observer within a cue-rich environment with textured walls and floors. Participants stood and swayed laterally at a rate of 0.5 Hz. Lateral gain was applied by amplifying their real position by factors of 1.0 to 3.0, then using that to set the instantaneous viewpoint within the virtual environment. After presentation, the target disappeared, and the participant performed a blind walk and reached for it. Their hand position was recorded, and we computed positional errors relative to the target. We found no effect of our motion parallax gain manipulation on binocular reaching accuracy. To evaluate the role of stereopsis in counteracting the anticipated distortion in perceived space, we tested observers on the same task monocularly. In this case, distances were perceived as nearer as gain increased, but the effects were relatively small. Taken together, our results suggest that observers are flexible in their interpretation of observer-produced motion parallax during active head movement. This provides considerable tolerance of spatial perception to mismatches between physical and virtual motion in rich virtual environments.},
author = {Cutone, M. and Wilcox, L. M. and Allison, R. S.},
booktitle = {Vestibular Oriented Research Meeting, Journal of Vestibular Research},
date-added = {2020-05-21 13:02:13 -0400},
date-modified = {2020-07-07 13:48:40 -0400},
doi = {10.3233/VES-200699},
keywords = {Augmented \& Virtual Reality},
pages = {139},
title = {Distance perception when real and virtual head motion do not match},
volume = {30},
year = {2020},
url-1 = {https://doi.org/10.3233/VES-200699}}
{"_id":"Tfm6Y44c4WGCSJADo","bibbaseid":"cutone-wilcox-allison-distanceperceptionwhenrealandvirtualheadmotiondonotmatch-2020","author_short":["Cutone, M.","Wilcox, L. M.","Allison, R. S."],"bibdata":{"bibtype":"incollection","type":"incollection","abstract":"For self-generated motion parallax, a sense of head velocity is needed to estimate distance from object motion (1). This information can be obtained from vestibular, proprioceptive, and visual sourc-es. If the magnitude of efferent signals from the ves-tibular system produced by head motion do not correlate with the velocity gradient of the visible op-tic fl ow pattern, a confl ict arises which leads to breakdown of motion-distance invariance. This po-tentially results in distortions of perceived distances to objects as visual and vestibular signals are non-concordant. We assessed this prediction by varying the gain between the observer's physical head mo-tion and simulated motion. Given that the relative and absolute motion parallax would be greater than expected from head motion when gain was greater than 1.0, we anticipated that this manipulation would result in objects appearing closer to the observer. Using an HMD, we presented targets 1 to 3 meters away from the observer within a cue rich environ-ment with textured walls and fl oors. Participants stood and swayed laterally at a rate of 0.5 Hz. Lat-eral gain was applied by amplifying their real posi-tion by factors of 1.0 to 3.0, then using that to set the instantaneous viewpoint within the virtual environ-ment. After presentation, the target disappeared, and the participant performed a blind walk and reached for it. Their hand position was recorded, and we computed positional errors relative to the target. We found no effect of our motion parallax gain manipu-lation on binocular reaching accuracy. To evaluate the role of stereopsis in counteracting the anticipated distortion in perceived space, we tested observers on the same task monocularly. In this case, distances were perceived as nearer as gain increased, but the effects were relatively small. Taken together our re-sults suggest that observers are fl exible in their inter-pretation of observer produced motion parallax during active head movement. This provides consid-erable tolerance of spatial perception to mismatches between physical and virtual motion in rich virtual environments","author":[{"propositions":[],"lastnames":["Cutone"],"firstnames":["M."],"suffixes":[]},{"propositions":[],"lastnames":["Wilcox"],"firstnames":["L.","M."],"suffixes":[]},{"propositions":[],"lastnames":["Allison"],"firstnames":["R.","S."],"suffixes":[]}],"booktitle":"Vestibular Oriented Research Meeting, Journal of Vestibular Research","date-added":"2020-05-21 13:02:13 -0400","date-modified":"2020-07-07 13:48:40 -0400","doi":"10.3233/VES-200699","keywords":"Augmented & Virtual Reality","pages":"139","title":"Distance perception when real and virtual head motion do not match","volume":"30","year":"2020","url-1":"https://doi.org/10.3233/VES-200699","bibtex":"@incollection{Cutone:wb,\n\tabstract = {For self-generated motion parallax, a sense of head velocity is needed to estimate distance from object motion (1). This information can be obtained from vestibular, proprioceptive, and visual sourc-es. If the magnitude of efferent signals from the ves-tibular system produced by head motion do not correlate with the velocity gradient of the visible op-tic fl ow pattern, a confl ict arises which leads to breakdown of motion-distance invariance. 
This po-tentially results in distortions of perceived distances to objects as visual and vestibular signals are non-concordant. We assessed this prediction by varying the gain between the observer's physical head mo-tion and simulated motion. Given that the relative and absolute motion parallax would be greater than expected from head motion when gain was greater than 1.0, we anticipated that this manipulation would result in objects appearing closer to the observer. Using an HMD, we presented targets 1 to 3 meters away from the observer within a cue rich environ-ment with textured walls and fl oors. Participants stood and swayed laterally at a rate of 0.5 Hz. Lat-eral gain was applied by amplifying their real posi-tion by factors of 1.0 to 3.0, then using that to set the instantaneous viewpoint within the virtual environ-ment. After presentation, the target disappeared, and the participant performed a blind walk and reached for it. Their hand position was recorded, and we computed positional errors relative to the target. We found no effect of our motion parallax gain manipu-lation on binocular reaching accuracy. To evaluate the role of stereopsis in counteracting the anticipated distortion in perceived space, we tested observers on the same task monocularly. In this case, distances were perceived as nearer as gain increased, but the effects were relatively small. Taken together our re-sults suggest that observers are fl exible in their inter-pretation of observer produced motion parallax during active head movement. This provides consid-erable tolerance of spatial perception to mismatches between physical and virtual motion in rich virtual environments},\n\tauthor = {Cutone, M. and Wilcox, L. M. and Allison, R. S.},\n\tbooktitle = {Vestibular Oriented Research Meeting, Journal of Vestibular Research},\n\tdate-added = {2020-05-21 13:02:13 -0400},\n\tdate-modified = {2020-07-07 13:48:40 -0400},\n\tdoi = {10.3233/VES-200699},\n\tkeywords = {Augmented & Virtual Reality},\n\tpages = {139},\n\ttitle = {Distance perception when real and virtual head motion do not match},\n\tvolume = {30},\n\tyear = {2020},\n\turl-1 = {https://doi.org/10.3233/VES-200699}}\n\n\n\n","author_short":["Cutone, M.","Wilcox, L. M.","Allison, R. S."],"key":"Cutone:wb","id":"Cutone:wb","bibbaseid":"cutone-wilcox-allison-distanceperceptionwhenrealandvirtualheadmotiondonotmatch-2020","role":"author","urls":{"-1":"https://doi.org/10.3233/VES-200699"},"keyword":["Augmented & Virtual Reality"],"metadata":{"authorlinks":{}}},"bibtype":"incollection","biburl":"https://bibbase.org/network/files/ibWG96BS4w7ibooE9","dataSources":["BPKPSXjrbMGteC59J","MpMK4SvZzj5Fww5vJ","YbBWRH5Fc7xRr8ghk","szZaibkmSiiQBFQG8","DoyrDTpJ7HHCtki3q","JaoxzeTFRfvwgLoCW","XKwRm5Lx8Z9bzSzaP","AELuRZBpnp7nRDaqw"],"keywords":["augmented & virtual reality"],"search_terms":["distance","perception","real","virtual","head","motion","match","cutone","wilcox","allison"],"title":"Distance perception when real and virtual head motion do not match","year":2020}