Body posture affects the perception of visually simulated self-motion. Jorges, B., Bury, N., McManus, M., Allison, R. S., Jenkin, M., & Harris, L. R. In Journal of Vision (Vision Sciences Society Abstracts), volume 21, page 2301, 2021. doi: 10.1167/jov.21.9.2301

Abstract: Perceiving one's self-motion is a multisensory process that involves integrating visual, vestibular and other cues. The perception of self-motion can be elicited by visual cues alone (vection) in a stationary observer. In this case, optic flow information compatible with self-motion may be affected by conflicting vestibular cues signaling that the body is not accelerating. Since vestibular cues are less reliable when lying down (Fernandez & Goldberg, 1976), conflicting vestibular cues might bias the self-motion percept less when lying down than when upright. To test this hypothesis, we immersed 20 participants in a virtual reality hallway environment and presented targets at different distances ahead of them. The targets then disappeared, and participants experienced optic flow simulating constant-acceleration, straight-ahead self-motion. They indicated by a button press when they felt they had reached the position of the previously viewed target. Participants also performed a task that assessed biases in distance perception: we showed them virtual boxes at different simulated distances, and on each trial they judged whether the height of the box was bigger or smaller than a reference ruler held in their hands. Perceived distance can be inferred from biases in perceived size. They performed both tasks sitting upright and lying supine. Participants needed less optic flow (i.e., they perceived themselves to have travelled further) to feel they had reached the target's position when supine than when sitting (by 4.8%, bootstrapped 95% CI = [3.5%; 6.4%], determined using linear mixed modelling). Participants also judged objects as larger (compatible with closer) when upright than when supine (by 2.5%, 95% CI = [0.03%; 4.6%], as above). The bias in travelled distance therefore cannot be reduced to a bias in perceived distance. These results suggest that vestibular cues affect the perception of distance travelled during self-motion, as they do heading judgements (MacNeilage, Banks, DeAngelis & Angelaki, 2010), even when the task could be solved with visual cues alone.
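As a rough illustration of the headline effect (an illustrative calculation, not a figure reported in the abstract): a 4.8% reduction in the optic flow needed means that if a seated participant pressed the button after, say, 6.00 m of simulated travel, the same participant lying supine would be expected to respond after roughly 0.952 × 6.00 ≈ 5.71 m. Since the simulated self-motion accelerates at a constant rate a from rest, displacement grows as d(t) = a t² / 2, so that response would come at t = √(2 × 5.71 / a) rather than √(2 × 6.00 / a), i.e. about 2.4% earlier in the trial.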
@incollection{Jorges:mz,
abstract = {Perceiving one's self-motion is a multisensory process involving integrating visual, vestibular and other cues. The perception of self-motion can be elicited by visual cues alone (vection) in a stationary observer. In this case, optic flow information compatible with self-motion may be affected by conflicting vestibular cues signaling that the body is not accelerating. Since vestibular cues are less reliable when lying down (Fernandez \& Goldberg, 1976), conflicting vestibular cues might bias the self-motion percept less when lying down than when upright. To test this hypothesis, we immersed 20 participants in a virtual reality hallway environment and presented targets at different distances ahead of them. The targets then disappeared, and participants experienced optic flow simulating constant-acceleration, straight-ahead self-motion. They indicated by a button press when they felt they had reached the position of the previously-viewed target. Participants also performed a task that assessed biases in distance perception. We showed them virtual boxes at different simulated distances. On each trial, they judged if the height of the box was bigger or smaller than a reference ruler held in their hands. Perceived distance can be inferred from biases in perceived size. They performed both tasks sitting upright and lying supine. Participants needed less optic flow (perceived they had travelled further) to perceive they had reached the target's position when supine than when sitting (by 4.8\%, bootstrapped 95\% CI=[3.5\%;6.4\%], determined using Linear Mixed Modelling). Participants also judged objects as larger (compatible with closer) when upright than when supine (by 2.5\%, 95\% CI=[0.03\%;4.6\%], as above). The bias in traveled distance thus cannot be reduced to a bias in perceived distance. These results suggest that vestibular cues impact self-motion distance perception, as they do heading judgements (MacNeilage, Banks, DeAngelis \& Angelaki, 2010), even when the task could be solved with visual cues alone.},
author = {Jorges, B. and Bury, N. and McManus, M. and Allison, R. S. and Jenkin, M. and Harris, L. R.},
booktitle = {Journal of Vision (Vision Sciences Society Abstracts)},
date-added = {2021-09-06 09:10:13 -0400},
date-modified = {2021-09-11 22:21:25 -0400},
doi = {10.1167/jov.21.9.2301},
keywords = {Optic flow & Self Motion (also Locomotion & Aviation)},
pages = {2301},
title = {Body posture affects the perception of visually simulated self-motion},
volume = {21},
year = {2021},
url-1 = {https://doi.org/10.1167/jov.21.9.2301}}
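
The statistical approach named in the abstract (a linear mixed model with a bootstrapped 95% confidence interval for the posture effect) can be sketched as follows. The sketch uses synthetic data, hypothetical variable names, and an assumed model specification (a random intercept per participant); it illustrates the general technique only and is not the authors' analysis code.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Synthetic long-format data (hypothetical): one row per trial with the
# simulated distance travelled at the button press, the target distance,
# the posture condition, and a participant identifier.
n_participants, n_trials = 20, 40
df = pd.DataFrame({
    "participant": np.repeat(np.arange(n_participants), n_trials),
    "posture": np.tile(["upright", "supine"], n_participants * n_trials // 2),
    "target_distance": rng.choice([4.0, 6.0, 8.0], size=n_participants * n_trials),
})
# Fake effect: roughly 5% less simulated travel needed when supine.
df["travelled"] = (
    df["target_distance"]
    * np.where(df["posture"] == "supine", 0.95, 1.0)
    * rng.lognormal(0.0, 0.1, size=len(df))
)

def posture_effect(data):
    """Fixed-effect estimate for posture from a linear mixed model with a
    random intercept per participant (reference level is 'supine')."""
    fit = smf.mixedlm("travelled ~ posture + target_distance",
                      data, groups=data["participant"]).fit()
    return fit.params["posture[T.upright]"]

# Nonparametric bootstrap over participants (clusters); resampled
# participants are relabelled so duplicates count as distinct groups.
ids = df["participant"].unique()
boot = []
for _ in range(200):  # use more iterations in practice
    draw = rng.choice(ids, size=len(ids), replace=True)
    resampled = pd.concat(
        [df[df["participant"] == p].assign(participant=i) for i, p in enumerate(draw)],
        ignore_index=True)
    boot.append(posture_effect(resampled))

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"posture effect: {posture_effect(df):.3f} m, 95% CI [{lo:.3f}, {hi:.3f}]")

In practice the fixed-effect structure (for example, a posture-by-distance interaction) and the bootstrap scheme would follow the study's own analysis plan, which the abstract does not spell out.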