Author ORCID Identifier
https://orcid.org/0009-0005-8782-5797
Document Type
Conference Paper
Disciplines
1.2 COMPUTER AND INFORMATION SCIENCE
Abstract
Social virtual reality (VR) applications have become increasingly ubiquitous in recent years. Central to these applications is the communication pipeline: how users perceive virtual human facial expressions and how they control them in real time, especially on VR devices without face tracking. We investigated both aspects in a set of experiments. First, we compared the perception of virtual human emotions on a traditional 2D screen and in VR. In a second experiment, we used a validated set of stimuli to compare three control methods for manipulating an avatar’s facial expressions in VR. These are non-tracking control methods: rather than relying on real-time face tracking, they use alternative inputs via the VR controller. Our analysis shows that in VR, effectiveness ratings for happy, sad, and surprise were significantly higher, and disgust was significantly more recognizable, than on the screen. These findings contribute to our understanding of virtual-human-based emotional communication in VR by demonstrating that the perception of facial expressions varies between screen and VR. Additionally, we identify raycast selection (point and click) as the most accurate control method, whereas thumbstick labeled (a controller thumbstick with UI labels for guidance) was the fastest and the method most preferred by participants.
DOI
10.1109/ICVR66534.2025.11172599
Recommended Citation
Chandran, J K Sangeeth; Salvador, Marisa Llorens; and Ennis, Cathy, "Face Off: Evaluating Virtual Human Expressions and Non-Tracking Control Methods in VR" (2025). Conference papers. 459.
https://arrow.tudublin.ie/scschcomcon/459
Funder
Research Ireland
Creative Commons License

This work is licensed under a Creative Commons Attribution-NonCommercial-Share Alike 4.0 International License.
Publication Details
Presented at the 11th International Conference on Virtual Reality (ICVR 2025). Published by IEEE. DOI: 10.1109/ICVR66534.2025.11172599