TransCoop: The Perception of Self-Motion in Virtual Environments

Abstract
This project explores the fundamental processes that underlie our perception of self-motion and the ways in which existing technological solutions for simulating self-motion enable or disrupt self-motion perception. The aim is to identify the cues (visual, auditory, physical motion, etc.) that are relevant to the perception of movement in virtual environments such as the Immersion Square (Hochschule Bonn-Rhein-Sieg) or the IVY (York University).

Introduction
Examples of important questions within the project are:

  • What is the influence of different cues on depth perception?
  • What trade-offs do the different technologies involve?
  • Are less ‘real’ alternatives equally effective?
  • What is the impact of different ways of moving in VR?
  • Do cues for self-motion and depth perception have the same effect across different virtual environments?

Implementation
We carry out several experiments using different kinds of virtual environments: the Immersion Square and the FIVISquare at Bonn-Rhein-Sieg University of Applied Sciences, each with three projection screens; the IVY at York University, a room in which all walls, the ceiling, and the floor are display surfaces; and the so-called VR-Trike, a VR-equipped tricycle, also at York University.
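As an illustration only, the sketch below shows how the cue-by-environment conditions for such experiments might be enumerated and counterbalanced in Python. The factorial design, the cue levels, and all names are hypothetical assumptions for this sketch, not the project's actual protocol.

    # Hypothetical sketch: enumerate and counterbalance cue x environment
    # conditions. The design and all names are illustrative assumptions,
    # not the project's actual experimental protocol.
    from itertools import product
    import random

    ENVIRONMENTS = ["Immersion Square", "FIVISquare", "IVY", "VR-Trike"]
    CUES = {
        "visual": [True, False],           # optic flow present / absent
        "auditory": [True, False],         # spatialized sound present / absent
        "physical_motion": [True, False],  # actual body movement / stationary
    }

    def conditions():
        """Enumerate every environment x cue-combination condition."""
        cue_levels = product(*CUES.values())
        return [
            {"environment": env, **dict(zip(CUES.keys(), levels))}
            for env, levels in product(ENVIRONMENTS, cue_levels)
        ]

    def schedule(participant_id: int):
        """Shuffle the condition order per participant, seeded for reproducibility."""
        order = conditions()
        random.Random(participant_id).shuffle(order)
        return order

    if __name__ == "__main__":
        for trial in schedule(participant_id=1)[:3]:
            print(trial)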

Results
Results obtained from this project can be expected to have a wide range of applications, from the development of more effective teleoperation interfaces to informing the design of flight simulators, and perhaps even the development of more exciting amusement park rides.