Non-Visual Cues for View Management in Narrow Field of View Augmented Reality Displays

Alexander Marquardt, Christina Trepkowski, David Eibich, Jens Maiero, Ernst Kruijff
Accepted for publication in Proceedings of the IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2019
Download the publication: MTEMK_ismar19.pdf [2.6 MB]

Abstract

Head-worn devices with a narrow field of view are a common commodity for Augmented Reality. However, their limited screen space makes view management difficult, especially in dense information spaces, where it can lead to visual conflicts such as overlapping labels (occlusion) and visual clutter. In this paper, we look into the potential of using audio and vibrotactile feedback to guide search and information localization. Our results indicate that users can be guided with high accuracy using audio-tactile feedback, with maximum median deviations of only 2° in longitude, 3.6° in latitude, and 0.07 m in depth. For encoding latitude, we found superior performance when using audio, resulting in a 61% improvement and the fastest search times. When interpreting localization cues, the maximum median deviation was 9.9° in longitude and 18% of the encoded distance, which was reduced to 14% when using audio.
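
The abstract does not specify the exact cue mappings, so the sketch below is only one plausible encoding: longitude mapped to stereo panning, latitude to audio pitch, and depth to vibration intensity. All names here (encode_cues, max_depth_m, and so on) are hypothetical illustrations, not the authors' implementation.

import math

def encode_cues(d_lon_deg, d_lat_deg, depth_m, max_depth_m=2.0):
    """Map a target's offset from the view center to audio-tactile
    cue parameters (hypothetical mapping, not the paper's)."""
    # Longitude -> stereo pan in [-1, 1]: negative = left, positive = right.
    pan = max(-1.0, min(1.0, d_lon_deg / 90.0))
    # Latitude -> pitch in Hz: targets above the view center sound
    # higher-pitched, up to +/- one octave over 90 degrees.
    base_hz = 440.0
    pitch_hz = base_hz * 2.0 ** (d_lat_deg / 90.0)
    # Depth -> vibration amplitude in [0, 1]: nearer targets vibrate stronger.
    vib = max(0.0, 1.0 - min(depth_m, max_depth_m) / max_depth_m)
    return pan, pitch_hz, vib

if __name__ == "__main__":
    # Target 45 degrees right, 20 degrees below, 0.5 m away.
    print(encode_cues(45.0, -20.0, 0.5))

With this mapping, a target to the right of the view pans the tone rightward, one below it lowers the pitch, and a nearby one strengthens the vibration, so each spatial dimension is carried by an independent non-visual channel.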

BibTeX reference

@InProceedings{MTEMK19,
  author       = {Marquardt, Alexander and Trepkowski, Christina and Eibich, David and Maiero, Jens and Kruijff, Ernst},
  title        = {Non-Visual Cues for View Management in Narrow Field of View Augmented Reality Displays},
  booktitle    = {Proceedings of the IEEE International Symposium on Mixed and Augmented Reality (ISMAR)},
  series       = {ISMAR},
  year         = {2019},
}
