Human perception integrates many sensory inputs to build a coherent model of one's surroundings. In Virtual Reality, the system is responsible for supplying specific sensory information that immerses users in a digital environment. Among these senses, the vestibular sense of balance helps us determine our spatial orientation and position. A discrepancy between the visuals a VR system presents and our vestibular orientation in the physical world can cause motion sickness and disengage users. We are investigating how vestibular considerations apply to VR experiences, and more specifically how they affect a person's engagement in the virtual environment.
We hypothesize that when users experience a closer vestibular match with the displayed motion, they will be more engaged in the video game and report a greater sense of presence.
In the experiment we propose, a set of users would take part in a simulated environment in which they control their view by moving their head. In the first condition, the user's head motions would move the background proportionally to those movements, producing a vestibular match. In the second condition, we would introduce some form of lateral movement that the user cannot control through head movements, producing a vestibular mismatch.
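The two conditions could be sketched as a simple head-pose-to-camera mapping. This is a minimal illustrative sketch, not the project's actual implementation: the function name, the 1:1 coupling, and the sinusoidal drift parameters (0.5 m amplitude at 0.2 Hz) are all assumptions chosen for clarity.

```python
import math

def camera_pose(head_yaw, head_pitch, t, condition):
    """Map tracked head orientation to the virtual camera pose.

    condition "match":    camera follows the head 1:1 (vestibular match).
    condition "mismatch": an uncontrolled lateral drift is added on top
                          of head motion (vestibular mismatch).
    Returns (yaw, pitch, lateral_offset_m).
    """
    yaw, pitch = head_yaw, head_pitch  # proportional (here 1:1) coupling
    lateral_offset = 0.0
    if condition == "mismatch":
        # Hypothetical lateral sway the user cannot counteract by
        # moving their head: 0.5 m amplitude at 0.2 Hz.
        lateral_offset = 0.5 * math.sin(2 * math.pi * 0.2 * t)
    return yaw, pitch, lateral_offset
```

In the match condition the camera pose depends only on the head pose, whereas in the mismatch condition the lateral offset evolves with time regardless of what the user does, which is what creates the visual-vestibular conflict under study.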
Client: Celia Hodent at Epic Games
Team: Arjun Madan, Vikas Piddempally, Xavier Primus, Sakthi Thirukonda and Christopher Simmons
All project-related documentation and reports can be found in the GitHub repository.