FaceHaptics: Robot Arm based Versatile Facial Haptics for Immersive Environments

Alexander Wilberz, Dominik Leschtschow, Christina Trepkowski, Jens Maiero, Ernst Kruijff, Bernhard E. Riecke
Accepted for publication in Proceedings of the 2020 ACM CHI Conference on Human Factors in Computing Systems (CHI 2020)

Abstract

This paper introduces FaceHaptics, a novel haptic display based on a robot arm attached to a head-mounted virtual reality display. It provides localized, multi-directional, and movable haptic cues — wind, warmth, moving and single-point touch events, and water spray — to parts of the face not covered by the head-mounted display. The system is easily extensible and can, in principle, mount any type of compact haptic actuator or object. Study 1 showed that users appreciate the directional resolution of cues and can judge wind direction well, especially when they move their head and wind direction is adjusted dynamically to compensate for head rotations. Study 2 showed that adding FaceHaptics cues to a VR walkthrough can significantly improve user experience, presence, and emotional responses.


BibTex references

@InProceedings{WLTMKR20,
  author       = {Wilberz, Alexander and Leschtschow, Dominik and Trepkowski, Christina and Maiero, Jens and Kruijff, Ernst and Riecke, Bernhard E.},
  title        = {FaceHaptics: Robot Arm based Versatile Facial Haptics for Immersive Environments},
  booktitle    = {Proceedings of the 2020 ACM CHI Conference on Human Factors in Computing Systems},
  series       = {CHI},
  year         = {2020},
}

Other publications in the database

» Christina Trepkowski
» Jens Maiero
» Ernst Kruijff
» Bernhard E. Riecke