Sitemap

A list of all the posts and pages found on the site. For the robots out there, an XML version is also available for digesting.

Pages

Posts

news

BCI Society PSC

Joined the BCI Society Postdoc and Student Committee to organize an open science initiative for BCI learning resources.

PURA Travel Award

Earned the PURA Travel Award to support my travel to New Orleans for CHI 2022. Thank you UROP!

UROP Symposium

Presented on Passive Haptic Rehearsal at Georgia Tech’s Undergraduate Research Symposium and earned the 2nd place Oral Presentation Award!

CHI 2022

Attended CHI 2022 in New Orleans. Excellent discussions and really enjoyed meeting folks at my first in-person HCI conference!

Back to School & UbiComp

Finished my internship project on tongue gesture recognition and I'm back at Georgia Tech. Looking forward to sharing our findings at ISWC/UbiComp and beyond!

UbiComp Award

Our live demonstration of passive haptic learning earned the Best Demo Award at UbiComp 2022!

Futuring SIGCHI

Excited to be a member of the newly-formed Futuring SIGCHI committee!

BS Graduation

Graduated from my Bachelor’s in Computer Science with highest honors. Starting a Master’s degree as part of my BS/MS in January.

CHI GazeTongue

I’ll be presenting my work on combining tongue gestures with gaze tracking at CHI 2023 Interactivity in April. Looking forward to seeing everyone in Hamburg!

Georgia Tech Awards

Honored to receive the Sigma Xi Best Undergraduate Research Award and Donald V. Jackson Fellowship Award for my research involvement and leadership at Georgia Tech!

PhD at Cornell Tech

I’ll be starting my PhD at Cornell Tech in NYC in the Fall with a Cornell Fellowship. I’ll be advised by Dr. Tanzeem Choudhury and Dr. Cheng Zhang, researching closed-loop passive interventions and intent-driven interaction.

projects

Autonomous Navigation for Mobile Robots in Open Terrain


As part of the software team, I built a complete replica of the competition environment in simulation so that RoboJackets’ Intelligent Ground Vehicle Competition robots could be tested realistically. I also wrote motor control firmware and path planning algorithms to enable more accurate robot motion. Later, as project manager, I supervised the progress of the software, electrical and mechanical teams.

organizations · Tags: competition, robotics · Links: report, code · Award: 3rd place Grand Award

SilentSpeller: Silent Speech Text Entry using Electropalatography


Silent speech systems provide a means of communication for people with movement disabilities like muscular dystrophy while preserving privacy. SilentSpeller is the first silent speech system usable with a >1000-word vocabulary while in motion. We built a novel text entry system using capacitive tongue sensing from an oral wearable device, offering a privacy-preserving alternative to speech recognition.

research · Tags: assistive-technology, subtle-interaction, wearables · Links: video, CHI'22 paper, CHI'21 Interactivity paper, BuzzFeed · Award: UROP Outstanding Oral Presentation Award

Hand Pose Estimation using Convolutional Neural Networks in Stereoscopic Vision


Estimating hand poses is valuable for gesture interaction and hand tracking, but it often requires expensive depth cameras. Stereo cameras show multiple perspectives of the hand, allowing depth perception. We created a pipeline for estimating the locations of hand and finger keypoints from a stereo camera using deep convolutional neural networks.

coursework · Tags: computer-vision, machine-learning · Links: slides

BrainBraille: A Passively Learnable Brain Computer Interface using fNIRS


We developed a new brain-computer interface using fNIRS to detect attempted motor movement in different regions of the body, converting attempted motions to language to enable more versatile communication options for people with movement disabilities. For my undergraduate thesis, I explored how transitional gestures may enable higher accuracy and information transfer in brain-computer interfaces.

research · Tags: assistive-technology, brain-computer-interfaces, fnirs · Links: undergraduate thesis · Award: President's Undergraduate Research Award

Passive Haptic Learning for Accelerated Learning of Piano


Learning piano is difficult, especially for older learners with busy lives. Passive haptic learning can reduce time spent practicing piano through instructional tactile cues. We designed a custom vibrotactile haptic glove for daily wear, enabling faster learning of piano skills. I led a group of undergraduate and graduate students in manufacturing glove hardware, designing a web portal and organizing user studies to evaluate performance.

research · Tags: haptics, music, wearables · Links: video, CHI'22 IMI paper, UbiComp'22 demo · Awards: UROP Outstanding Oral Presentation Award, UbiComp'22 Best Demo Award

Horizon Worlds: A Community of Practice for Social VR Design


Despite recent attention, Horizon Worlds hasn’t been studied extensively as a social VR platform. Using ethnographic methods, my group studied the online VR community in Horizon Worlds. Based on observational reports and interviews, we found the creative community of world designers to be a prototypical community of practice for designing social VR experiences.

coursework · Tags: social-computing, virtual-reality · Links: report

Perceived Credibility of Public Health Messages


With the COVID-19 pandemic, online delivery of public health messages has become critical for public healthcare. In a controlled experiment, we examined how the credibility of public health messages about COVID-19 varies across platforms (Twitter, original website) and sources (CDC, Georgia Department of Health, independent academics).

coursework · Tags: healthcare, social-computing · Links: poster

Tongue Gestures in Head Worn Devices


Tongue gestures are an accessible and subtle method for interacting with wearables, but past studies have used custom hardware with a single sensing modality. At Microsoft Research, we used multimodal sensors in a commercial VR headset and EEG headband to build a 50,000-gesture dataset and a real-time classifier. We also invented a new interaction method combining tongue and gaze to enable faster gaze-based selection in hands-free interactions.

research · Tags: head-worn-displays, sensing, subtle-interaction · Links: UbiComp'22 poster, talk

publications

Mobile, Hands-Free, Silent Speech Texting Using SilentSpeller

Naoki Kimura, Tan Gemicioglu, Jonathan Womack, Richard Li, Yuhui Zhao, Abdelkareem Bedri, Alex Olwal, Jun Rekimoto, Thad Starner

Published in Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, 2021

Abstract

Voice control provides hands-free access to computing, but there are many situations where audible speech is not appropriate. Most unvoiced speech text entry systems cannot be used on the go due to movement artifacts. SilentSpeller enables mobile silent texting using a dental retainer with capacitive touch sensors to track tongue movement. Users type by spelling words without voicing them. In offline isolated-word testing on a 1164-word dictionary, SilentSpeller achieves an average 97% character accuracy; 97% offline accuracy is also achieved on phrases recorded while walking or seated. Live text entry reaches up to 53 words per minute at 90% accuracy, which is competitive with expert text entry on mini-QWERTY keyboards without encumbering the hands.

Tags: demo, sensing, subtle-interaction · Links: doi, paper, video, BuzzFeed

Passive Haptic Rehearsal for Accelerated Piano Skill Acquisition

Tan Gemicioglu, Noah Teuscher, Brahmi Dwivedi, Soobin Park, Emerson Miller, Celeste Mason, Caitlyn Seim, Thad Starner

Published in Intelligent Music Interfaces Workshop at the 2022 CHI Conference on Human Factors in Computing Systems, 2022

Abstract

Passive haptic learning (PHL) uses vibrotactile stimulation to train piano songs using repetition, even when the recipient of stimulation is focused on other tasks. However, many of the benefits of playing piano cannot be acquired without actively playing the instrument. In this position paper, we posit that passive haptic rehearsal, where active piano practice is assisted by separate sessions of passive stimulation, is of greater everyday use than solely PHL. We propose a study to examine the effects of passive haptic rehearsal for self-paced piano learners and consider how to incorporate passive rehearsal into everyday practice.

Tags: haptics, piano, workshop · Links: doi, paper, slides

SilentSpeller: Towards mobile, hands-free silent speech text entry using electropalatography

Naoki Kimura, Tan Gemicioglu, Jonathan Womack, Richard Li, Yuhui Zhao, Abdelkareem Bedri, Zixiong Su, Alex Olwal, Jun Rekimoto, Thad Starner

Published in Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 2022

Abstract

Speech is inappropriate in many situations, limiting when voice control can be used. Most unvoiced speech text entry systems cannot be used on the go due to movement artifacts. Using a dental retainer with capacitive touch sensors, SilentSpeller tracks tongue movement, enabling users to type by spelling words without voicing them. SilentSpeller achieves an average 97% character accuracy in offline isolated-word testing on a 1164-word dictionary. Walking has little effect on accuracy: average offline character accuracy was roughly equivalent on 107 phrases entered while walking (97.5%) or seated (96.5%). To demonstrate extensibility, the system was tested on 100 unseen words, yielding an average 94% accuracy. Live text entry speeds for seven participants averaged 37 words per minute at 87% accuracy. Comparing silent spelling to current practice suggests that SilentSpeller may be a viable alternative for silent mobile text entry.

Tags: full-paper, sensing, subtle-interaction · Links: doi, paper, video

Learning Piano Songs with Passive Haptic Training: an Interactive Lesson

Asha Bhandarkar, Tan Gemicioglu, Brahmi Dwivedi, Caitlyn Seim, Thad Starner

Published in Proceedings of the 2022 ACM International Joint Conference on Pervasive and Ubiquitous Computing, 2022

Abstract

Passive haptic learning (PHL) is a method for learning piano pieces through repetition of haptic stimuli while a user is focused on other daily tasks. In combination with active practice techniques, this method is a powerful tool for users to learn piano pieces with less time spent in active practice while also reducing cognitive effort and increasing retention. We propose a demo combining these two learning methods, in which attendees will engage in a short active practice session followed by a passive practice session using vibrotactile haptic gloves. Attendees will be able to experience the effects of passive haptic learning for themselves as well as gauge their mastery of the piece with a final performance at the end of the demo.

Tags: demo, haptics, piano · Links: doi, paper, video · Award: Best Demo Award

Tongue Gestures for Hands-Free Interaction in Head Worn Displays

Tan Gemicioglu, Mike Winters, Yu-Te Wang, Ivan Tashev

Published in Proceedings of the 2022 ACM International Joint Conference on Pervasive and Ubiquitous Computing, 2022

Abstract

Head worn displays are often used in situations where users’ hands may be occupied or otherwise unusable due to permanent or situational movement impairments. Hands-free interaction methods like voice recognition and gaze tracking allow accessible interaction with fewer limitations on user ability and environment. Tongue gestures offer an alternative method of private, hands-free and accessible interaction. However, past tongue gesture interfaces have come in intrusive or otherwise inconvenient form factors, preventing their implementation in head worn displays. We present a multimodal tongue gesture interface using existing commercial headsets and sensors located only in the upper face. We consider design factors for choosing robust and usable tongue gestures, introduce eight gestures based on these criteria, and discuss early work towards tongue gesture recognition with the system.

Tags: gesture, poster, sensing, subtle-interaction · Links: doi, paper, poster

Transitional Gestures for Enhancing ITR and Accuracy in Movement-based BCIs

Tan Gemicioglu, Yuhui Zhao, Melody Jackson, Thad Starner

Published in Proceedings of the 10th International Brain-Computer Interface Meeting, 2023

Abstract

BCIs using imagined or executed movement enable subjects to communicate by performing gestures in sequential patterns. Conventional interaction methods have a one-to-one mapping between movements and commands, but newer methods such as BrainBraille have instead used a pseudo-binary encoding where multiple body parts are tensed simultaneously. However, non-invasive BCI modalities such as EEG and fNIRS have limited spatial specificity and have difficulty distinguishing simultaneous movements. We propose a new method using transitions in gesture sequences to combinatorially increase possible commands without simultaneous movements. We demonstrate the efficacy of transitional gestures in a pilot fNIRS study where accuracy increased from 81% to 92% when distinguishing transitions between two movements rather than two movements independently. We calculate the ITR of a potential transitional version of BrainBraille, where ITR would increase from 143 bpm to 218 bpm.

Tags: brain-computer-interface, gesture, poster · Links: paper

Gaze & Tongue: A Subtle Hands-Free Interaction for Head-worn Devices

Tan Gemicioglu, R. Michael Winters, Yu-Te Wang, Thomas M. Gable, Ann Paradiso, Ivan J. Tashev

Published in Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, 2023

Abstract

Gaze tracking allows hands-free and voice-free interaction with computers, and has recently gained more use in virtual and augmented reality headsets. However, it traditionally relies on dwell time for selection tasks, which suffers from the Midas Touch problem. Tongue gestures are subtle, accessible and can be sensed non-intrusively using an IMU at the back of the ear, PPG and EEG. We demonstrate a novel interaction method combining gaze tracking with tongue gestures, enabling gaze-based selection that is faster than dwell time and offers multiple selection options. We showcase its usage as a point-and-click interface in three hands-free games and a musical instrument.

Tags: demo, gaze, gesture, sensing, subtle-interaction · Links: doi, paper

talks

teaching
