VibRing: A Wearable Vibroacoustic Sensor for Single-Handed Gesture Recognition
Published in Proceedings of the ACM on Human-Computer Interaction, Volume 9, Issue 4, Article EICS006, June 2025
Recommended citation: Bu Li, Xincheng Huang, and Robert Xiao. 2025. VibRing: A Wearable Vibroacoustic Sensor for Single-Handed Gesture Recognition. Proc. ACM Hum.-Comput. Interact. 9, 4, Article EICS006 (June 2025), 25 pages. https://doi.org/10.1145/3733052
Abstract: Single-handed gestures offer rapid and intuitive interactions for input in interactive applications ranging from smartwatches and phones to augmented reality. Past research has explored using computer vision or inertial measurement units (IMUs) to sense such gestures, but these sensing modalities can be variously subject to occlusion, high power consumption, or sensitivity to random motion. In this work, we explore passively detecting the vibroacoustic signature of subtle single-handed gestures through a wearable piezoelectric sensor, providing a robust, low-power sensing modality. We present (1) a hand-gesture design framework encompassing a large set of subtle, rapid single-handed gestures that balance comfort and vibroacoustic distinguishability, (2) VibRing, a lightweight wireless hand-gesture sensing platform leveraging a single finger-worn vibroacoustic sensor, and (3) a multifaceted system evaluation in which we consider several aspects: general usability, tolerance to variance, user adaptability, and extended usage. Our results demonstrate that VibRing can support an 11-gesture set with an overall accuracy of 94.2% and low performance variance across multiple days (90.2% accuracy in cross-day validation). To support a new user, VibRing requires only 10 minutes of training data to achieve an accuracy of 92.7%. We also tested the extended use of VibRing in an office study where users performed periodic gesture inputs during typical office tasks with real-time classification, achieving a true-positive rate of 90.9%. Finally, to demonstrate the utility of VibRing, we present three example applications that benefit from our subtle gesture interactions.
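For readers curious about what the sense-and-classify pipeline the abstract describes might look like in code, below is a minimal illustrative sketch: band-pass filter a piezoelectric signal, featurize each gesture window as a log-magnitude spectrogram, and classify with an off-the-shelf model. This is not the VibRing implementation; the sampling rate, filter band, window length, feature choice, and classifier (a random forest) are all assumptions made for this example.

```python
# Illustrative sketch of a vibroacoustic gesture-classification pipeline.
# NOT the VibRing implementation -- the sample rate, filter band, feature
# representation, and classifier below are assumptions for illustration.
import numpy as np
from scipy.signal import butter, sosfiltfilt, spectrogram
from sklearn.ensemble import RandomForestClassifier

FS = 4000          # assumed piezo sampling rate (Hz)
BAND = (20, 1000)  # assumed pass band for gesture-induced vibrations (Hz)

def bandpass(x, fs=FS, band=BAND, order=4):
    """Suppress DC drift and high-frequency noise outside the gesture band."""
    sos = butter(order, band, btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

def featurize(x, fs=FS):
    """Flatten a log-magnitude spectrogram into a fixed-length feature vector.
    All windows must have the same length for the vector size to match."""
    _, _, sxx = spectrogram(x, fs=fs, nperseg=256, noverlap=128)
    return np.log1p(sxx).ravel()

def train(windows, labels):
    """windows: list of equal-length 1-D piezo sample arrays, one per gesture."""
    X = np.stack([featurize(bandpass(w)) for w in windows])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X, labels)
    return clf

def predict(clf, window):
    """Classify one pre-segmented gesture window."""
    return clf.predict(featurize(bandpass(window)).reshape(1, -1))[0]
```

In a live system one would additionally need onset detection to segment gestures out of the continuous sensor stream; the sketch assumes equal-length, pre-segmented windows.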
Paper: DOWNLOAD PDF