@incollection{Koppisetty:2025aa,
abstract = {The sound and appearance of interacting objects determine how we perceive their
material properties.
Extending our preliminary results reported at IMRF 2024, we investigated how observers
resolve discrepancies between a material's sound and its appearance in realistic virtual
environments. Specifically, we evaluated how audio identity, visual identity, active
engagement, and audio quality interact to determine material classification.
Participants (N = 42) experienced simulations of an object being struck by a metal rod in
a virtual reality headset. All combinations of impact sounds and visual textures for four
materials were paired for the target object, creating sixteen conditions. To examine the
effect of agency, half the trials involved an agent striking the target (agent-interaction),
while in the other half, participants struck it themselves (self-interaction). They then
classified the target object as one of the four materials. In a second experiment, for
a smaller set of conditions, three groups of 18 participants were tested with sounds
produced by metal, plastic, and glass rods, combined with varying levels of white noise.
The first experiment showed that participants primarily classified the material based
on auditory properties, agency had no effect on perception, and in four of the sixteen
conditions, potential audiovisual illusions were observed. The second experiment
showed that the material of the rod and added white noise influenced material
classification. Glass rod-generated sounds led to more consistent responses as
compared to sounds generated by other rods.
Our findings suggest that for impact events, the auditory signal is the strongest
determinant of material classification in the presence of conflicting appearance. While
agency had no effect, additional factors, such as the material of the sound-generating
rod and sound quality, did influence perception. These results highlight the importance
of sound in VR and how audio-visual cues interact in complex ways to impact material
perception.},
annote = {JUNE 17-19, 2025
SECOND STUDENT CENTRE
YORK UNIVERSITY},
author = {Harshitha Koppisetty and Robert Allison and Laurie Wilcox},
booktitle = {CVR-CIAN Conference 2025: The Brain and Integrative Vision},
date-added = {2025-07-26 06:20:14 -0400},
date-modified = {2025-07-26 06:20:14 -0400},
doi = {10.25071/10315/42927},
keywords = {Misc.},
pages = {79},
title = {Audiovisual Signals Shape Our Perception of Materials in Virtual Reality},
url-1 = {https://doi.org/10.25071/10315/42927},
year = {2025},
bdsk-url-1 = {https://doi.org/10.25071/10315/42927}}