Intuitive locomotion interfaces to support natural and effective simulated observer movement in virtual environments

Abstract
Enabling effective yet intuitive spatial orientation and locomotion in 3D environments is a highly relevant research problem. It is driven forward not only by existing research and industrial display installations, but also by recent market movement towards consumer-oriented, game-driven head-mounted displays such as the Oculus Rift. Effective locomotion interfaces can, among other benefits, drastically improve the experience of 3D spaces, increase the overall usability and user experience of a system, enhance the space perception that is important for a wide range of tasks (including design review), help avoid disorientation, and reduce motion sickness.

However, supporting effective spatial orientation and self-motion perception is difficult. Despite recent advances in Virtual Reality (VR) technology, providing a convincing and embodied sensation of being immersed in large virtual spaces is hard to achieve. Moreover, limiting excessive disorientation and reducing motion sickness remain open problems. In this project, our overall goal is to iteratively design and evaluate a novel locomotion interface that enables users to quickly, intuitively, and precisely position their virtual viewpoint in 3D space and move through the environment. In more detail, the project has the following objectives:

  1. Improve our understanding of self-motion perception and simulation, and define the role of multi-sensory factors
  2. Create a novel locomotion system based on the acquired understanding of the underlying human factors
  3. Evaluate the locomotion system in application-driven scenarios