Distance perception when real and virtual head motion do not match. Cutone, M., Wilcox, L. M., & Allison, R. S. In Vestibular Oriented Research Meeting, Journal of Vestibular Research, volume 30, pages 139. 2020.
For self-generated motion parallax, a sense of head velocity is needed to estimate distance from object motion (1). This information can be obtained from vestibular, proprioceptive, and visual sources. If the magnitude of efferent signals from the vestibular system produced by head motion does not correlate with the velocity gradient of the visible optic flow pattern, a conflict arises which leads to a breakdown of motion-distance invariance. This potentially results in distortions of perceived distances to objects, as visual and vestibular signals are non-concordant. We assessed this prediction by varying the gain between the observer's physical head motion and simulated motion. Given that the relative and absolute motion parallax would be greater than expected from head motion when gain was greater than 1.0, we anticipated that this manipulation would result in objects appearing closer to the observer. Using an HMD, we presented targets 1 to 3 meters away from the observer within a cue-rich environment with textured walls and floors. Participants stood and swayed laterally at a rate of 0.5 Hz. Lateral gain was applied by amplifying their real position by factors of 1.0 to 3.0, then using that to set the instantaneous viewpoint within the virtual environment. After presentation, the target disappeared, and the participant performed a blind walk and reached for it. Their hand position was recorded, and we computed positional errors relative to the target. We found no effect of our motion parallax gain manipulation on binocular reaching accuracy. To evaluate the role of stereopsis in counteracting the anticipated distortion in perceived space, we tested observers on the same task monocularly. In this case, distances were perceived as nearer as gain increased, but the effects were relatively small. Taken together, our results suggest that observers are flexible in their interpretation of observer-produced motion parallax during active head movement. This provides considerable tolerance of spatial perception to mismatches between physical and virtual motion in rich virtual environments.
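The lateral-gain manipulation described above can be sketched as a simple viewpoint mapping. This is an illustrative reconstruction, not the authors' implementation: the function name and the assumption that gain is applied about a fixed reference origin are mine.

```python
def virtual_lateral_position(real_x: float, origin_x: float, gain: float) -> float:
    """Map a tracked lateral head position (meters) to the rendered viewpoint.

    Sketch of the manipulation described in the abstract: the observer's real
    displacement from a reference origin is amplified by a gain factor
    (1.0 to 3.0 in the experiment) before setting the virtual camera.
    Names and the fixed-origin assumption are illustrative.
    """
    return origin_x + gain * (real_x - origin_x)


# With gain 1.0 the virtual viewpoint matches the real head position;
# with gain 3.0, a 5 cm sway is rendered as a 15 cm sway.
matched = virtual_lateral_position(0.05, 0.0, 1.0)
amplified = virtual_lateral_position(0.05, 0.0, 3.0)
```

At gain 1.0 real and virtual motion agree, so visual and vestibular velocity signals remain concordant; gains above 1.0 create the visual-vestibular mismatch the study probes.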
@incollection{Cutone:wb,
	abstract = {For self-generated motion parallax, a sense of head velocity is needed to estimate distance from object motion (1). This information can be obtained from vestibular, proprioceptive, and visual sources. If the magnitude of efferent signals from the vestibular system produced by head motion does not correlate with the velocity gradient of the visible optic flow pattern, a conflict arises which leads to a breakdown of motion-distance invariance. This potentially results in distortions of perceived distances to objects, as visual and vestibular signals are non-concordant. We assessed this prediction by varying the gain between the observer's physical head motion and simulated motion. Given that the relative and absolute motion parallax would be greater than expected from head motion when gain was greater than 1.0, we anticipated that this manipulation would result in objects appearing closer to the observer. Using an HMD, we presented targets 1 to 3 meters away from the observer within a cue-rich environment with textured walls and floors. Participants stood and swayed laterally at a rate of 0.5 Hz. Lateral gain was applied by amplifying their real position by factors of 1.0 to 3.0, then using that to set the instantaneous viewpoint within the virtual environment. After presentation, the target disappeared, and the participant performed a blind walk and reached for it. Their hand position was recorded, and we computed positional errors relative to the target. We found no effect of our motion parallax gain manipulation on binocular reaching accuracy. To evaluate the role of stereopsis in counteracting the anticipated distortion in perceived space, we tested observers on the same task monocularly. In this case, distances were perceived as nearer as gain increased, but the effects were relatively small. Taken together, our results suggest that observers are flexible in their interpretation of observer-produced motion parallax during active head movement. This provides considerable tolerance of spatial perception to mismatches between physical and virtual motion in rich virtual environments},
	author = {Cutone, M. and Wilcox, L. M. and Allison, R. S.},
	booktitle = {Vestibular Oriented Research Meeting, Journal of Vestibular Research},
	date-added = {2020-05-21 13:02:13 -0400},
	date-modified = {2020-07-07 13:48:40 -0400},
	doi = {10.3233/VES-200699},
	keywords = {Augmented \& Virtual Reality},
	pages = {139},
	title = {Distance perception when real and virtual head motion do not match},
	volume = {30},
	year = {2020},
	url-1 = {https://doi.org/10.3233/VES-200699}}
