FaceHaptics: Robot Arm based Versatile Facial Haptics for Immersive Environments


Beyond audiovisual cues in VR: Using an HMD-mounted robot arm for versatile facial haptics

Abstract: FaceHaptics is a novel haptic display based on a robot arm attached to a head-mounted virtual reality display. It provides localized, multi-directional and movable haptic cues in the form of wind, warmth, moving and single-point touch events, and water spray to dedicated parts of the face not covered by the head-mounted display. The system is easily extensible, however, and can in principle mount any type of compact haptic actuator or object. User study 1 showed that users appreciate the directional resolution of cues and can judge wind direction well, especially when they move their head and the wind direction is adjusted dynamically to compensate for head rotations. Study 2 showed that adding FaceHaptics cues to a VR walkthrough can significantly improve user experience, presence, and emotional responses.
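The dynamic compensation for head rotations can be pictured as a simple per-frame coordinate transform: the world-fixed wind direction is re-expressed in the HMD's local frame and the nozzle is re-aimed accordingly. The sketch below is a minimal illustration in Python, not the authors' implementation; the function names and the yaw-only angle convention are hypothetical and serve only to show the idea.

```python
import math


def wrap_angle_deg(angle):
    """Wrap an angle to the range [-180, 180) degrees."""
    return (angle + 180.0) % 360.0 - 180.0


def nozzle_yaw_for_wind(wind_yaw_world_deg, head_yaw_world_deg):
    """
    Compute the yaw at which an arm-mounted wind nozzle should point,
    expressed in the head's (HMD's) local frame, so that the wind keeps
    arriving from a fixed world direction while the user rotates their head.

    wind_yaw_world_deg: direction the wind comes from, in world coordinates.
    head_yaw_world_deg: current yaw of the HMD, in world coordinates.
    """
    return wrap_angle_deg(wind_yaw_world_deg - head_yaw_world_deg)


# Example: wind from 30 degrees in the world, head rotated to -90 degrees.
print(nozzle_yaw_for_wind(30.0, -90.0))  # -> 120.0 degrees in the head frame
```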

Brief system demo video


30 sec teaser video


Publications

Wilberz, A., Leschtschow, D., Trepkowski, C., Maiero, J., Kruijff, E., & Riecke, B. E. (2020). FaceHaptics: Robot Arm based Versatile Facial Haptics for Immersive Environments. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1–14. https://doi.org/10.1145/3313831.3376481 (Download)