Publications

2023

TongueTap: Multimodal Tongue Gesture Recognition with Head-Worn Devices

Tan Gemicioglu, R. Michael Winters, Yu-Te Wang, Thomas M. Gable, Ivan J. Tashev

Published in Proceedings of the 25th International Conference on Multimodal Interaction, 2023

Abstract

Mouth-based interfaces are a promising new approach enabling silent, hands-free and eyes-free interaction with wearable devices. However, interfaces sensing mouth movements are traditionally custom-designed and placed near or within the mouth. TongueTap synchronizes multimodal EEG, PPG, IMU, eye tracking and head tracking data from two commercial headsets to facilitate tongue gesture recognition using only off-the-shelf devices on the upper face. We classified eight closed-mouth tongue gestures with 94% accuracy, offering an invisible and inaudible method for discreet control of head-worn devices. Moreover, we found that the IMU alone differentiates eight gestures with 80% accuracy and a subset of four gestures with 92% accuracy. We built a dataset of 48,000 gesture trials across 16 participants, allowing TongueTap to perform user-independent classification. Our findings suggest tongue gestures can be a viable interaction technique for VR/AR headsets and earables without requiring novel hardware.
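
A minimal sketch of the user-independent evaluation described above, assuming leave-one-subject-out cross-validation over per-trial feature vectors. The data shapes, feature extraction, and random-forest classifier below are placeholder assumptions, not TongueTap's actual pipeline.

```python
# Hypothetical sketch: user-independent gesture classification via
# leave-one-subject-out cross-validation. Data and model are placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)

# Placeholder data: 16 participants, 8 gesture classes, one feature
# vector per trial (the real dataset has 48,000 gesture trials).
X = rng.normal(size=(1600, 64))          # per-trial feature vectors
y = rng.integers(0, 8, size=1600)        # gesture labels 0..7
groups = np.repeat(np.arange(16), 100)   # participant ID per trial

# Hold out one participant per fold, so the model is never trained
# on the test user -- the "user-independent" setting.
scores = cross_val_score(
    RandomForestClassifier(n_estimators=200, random_state=0),
    X, y, groups=groups, cv=LeaveOneGroupOut(),
)
print(f"mean user-independent accuracy: {scores.mean():.2%}")
```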

Full paper · Tags: gesture, sensing, subtle-interaction · Links: DOI, paper, dataset

Gaze & Tongue: A Subtle Hands-Free Interaction for Head-worn Devices

Tan Gemicioglu, R. Michael Winters, Yu-Te Wang, Thomas M. Gable, Ann Paradiso, Ivan J. Tashev

Published in Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, 2023

Abstract

Gaze tracking allows hands-free and voice-free interaction with computers and has recently gained wider use in virtual and augmented reality headsets. However, it traditionally relies on dwell time for selection tasks, which suffers from the Midas Touch problem. Tongue gestures are subtle, accessible, and can be sensed non-intrusively using an IMU at the back of the ear, PPG, and EEG. We demonstrate a novel interaction method that combines gaze tracking with tongue gestures, making gaze-based selection faster than dwell time and offering multiple selection options. We showcase its usage as a point-and-click interface in three hands-free games and a musical instrument.
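
A hedged sketch of the point-and-click pattern the demo describes: gaze moves the pointer continuously, and a recognized tongue gesture triggers selection in place of a dwell timer. `read_gaze()`, `read_tongue_gesture()`, the `ui` object, and the gesture names are placeholder assumptions, not APIs from the paper's system.

```python
# Hypothetical gaze-and-tongue interaction loop (placeholder APIs).
import time

def read_gaze():
    """Placeholder: current (x, y) gaze point in screen coordinates."""
    return (0.5, 0.5)

def read_tongue_gesture():
    """Placeholder: classified tongue gesture label, or None."""
    return None

# Distinct gestures can map to distinct commands, giving multiple
# selection options at the same gaze target.
COMMANDS = {"tap": "select", "left": "back", "right": "menu"}

def interaction_loop(ui, hz=60):
    while True:
        x, y = read_gaze()
        ui.move_cursor(x, y)              # gaze does the pointing
        gesture = read_tongue_gesture()   # tongue does the clicking
        if gesture in COMMANDS:
            ui.dispatch(COMMANDS[gesture], target=(x, y))
        time.sleep(1 / hz)
```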

Demo · Tags: gaze, gesture, sensing, subtle-interaction · Links: DOI, paper · Best Demo Finalist

Transitional Gestures for Enhancing ITR and Accuracy in Movement-based BCIs

Tan Gemicioglu, Yuhui Zhao, Melody Jackson, Thad Starner

Published in Proceedings of the 10th International Brain-Computer Interface Meeting, 2023

Abstract

BCIs using imagined or executed movement enable subjects to communicate by performing gestures in sequential patterns. Conventional interaction methods have a one-to-one mapping between movements and commands, but newer methods such as BrainBraille instead use a pseudo-binary encoding in which multiple body parts are tensed simultaneously. However, non-invasive BCI modalities such as EEG and fNIRS have limited spatial specificity and have difficulty distinguishing simultaneous movements. We propose a new method that uses transitions in gesture sequences to combinatorially increase the number of possible commands without simultaneous movements. We demonstrate the efficacy of transitional gestures in a pilot fNIRS study where accuracy increased from 81% to 92% when distinguishing transitions between two movements rather than the two movements independently. We calculate the information transfer rate (ITR) for a potential transitional version of BrainBraille, where ITR would increase from 143 bpm to 218 bpm.
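
As a hedged illustration of the ITR comparison, the standard Wolpaw formula below relates information transfer rate to the number of commands N, classification accuracy P, and time per selection T. The k(k-1) count of ordered transitions between distinct movements is an assumption for illustration, not necessarily the paper's exact encoding.

```latex
% A standard Wolpaw ITR illustration; the transition count is an
% assumption, not the paper's exact calculation.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Bits per selection for $N$ equally likely commands classified with
accuracy $P$:
\[
B = \log_2 N + P \log_2 P + (1 - P) \log_2 \frac{1 - P}{N - 1},
\qquad
\mathrm{ITR} = B \cdot \frac{60}{T} \ \text{bits per minute},
\]
where $T$ is the time per selection in seconds. With $k$
distinguishable movements, ordered transitions between distinct
movements yield $N = k(k-1)$ commands instead of $N = k$, so the
$\log_2 N$ term grows combinatorially while $T$ is unchanged; the
higher accuracy $P$ reported for transitions raises $B$ further.
\end{document}
```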

Poster · Tags: brain-computer-interface, gesture · Links: paper

2022

Tongue Gestures for Hands-Free Interaction in Head Worn Displays

Tan Gemicioglu, Mike Winters, Yu-Te Wang, Ivan Tashev

Published in Proceedings of the 2022 ACM International Joint Conference on Pervasive and Ubiquitous Computing, 2022

Abstract

Head-worn displays are often used in situations where users’ hands may be occupied or otherwise unusable due to permanent or situational movement impairments. Hands-free interaction methods like voice recognition and gaze tracking allow accessible interaction with fewer limitations on user ability and environment. Tongue gestures offer an alternative method of private, hands-free, and accessible interaction. However, past tongue gesture interfaces come in intrusive or otherwise inconvenient form factors, preventing their implementation in head-worn displays. We present a multimodal tongue gesture interface using existing commercial headsets and sensors located only in the upper face. We consider design factors for choosing robust and usable tongue gestures, introduce eight gestures based on these criteria, and discuss early work towards tongue gesture recognition with the system.

Poster · Tags: gesture, sensing, subtle-interaction · Links: DOI, paper, poster