Projects


2023

BreathePulse

Whereas past guided breathing systems used visual or tactile feedback, we designed BreathePulse, an airflow-based system for reducing respiratory rate more naturally and unobtrusively. We evaluated BreathePulse during an intensive n-back task and provided guidelines for making future guided breathing devices more effortless and comfortable for users.
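The n-back evaluation task asks participants to respond whenever the current stimulus matches the one presented n trials earlier; a minimal sketch of that target logic (the function name is mine, not from the project):

```python
def nback_targets(stimuli, n=2):
    """Return the indices of 'target' trials in an n-back task:
    positions where the stimulus matches the one n steps earlier."""
    return [i for i in range(n, len(stimuli))
            if stimuli[i] == stimuli[i - n]]

# In the 2-back letter sequence A-B-A-B-C-A-C, positions 2, 3, and 6 are targets.
print(nback_targets(list("ABABCAC"), n=2))  # -> [2, 3, 6]
```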

research      entrainment, respiration, stress, ubiquitous-computing

2022

Tongue Gestures in Head Worn Devices

Tongue gestures are an accessible and subtle way to interact with wearables, but past studies have relied on custom hardware with a single sensing modality. At Microsoft Research, we used the multimodal sensors in a commercial VR headset and an EEG headband to build a 50,000-gesture dataset and a real-time classifier. We also invented a new interaction method that combines tongue and gaze to enable faster gaze-based selection in hands-free interactions.

research      head-worn-displays, sensing, subtle-interaction UbiComp'22 poster talk

Perceived Credibility of Public Health Messages

With the COVID-19 pandemic, online delivery of public health messages has taken on a critical role in public healthcare. In a controlled experiment, we examined how the credibility of public health messages about COVID-19 varies across platforms (Twitter, original website) and sources (CDC, Georgia Department of Health, independent academics).

coursework      healthcare, social-computing poster

Horizon Worlds: A Community of Practice for Social VR Design

Despite recent attention, Horizon Worlds hasn’t been studied extensively as a social VR platform. Using ethnographic methods, my group studied the online VR community in Horizon Worlds. Based on observational reports and interviews, we found the creative community of world designers to be a prototypical community of practice for designing social VR experiences.

coursework      social-computing, virtual-reality report

2021

Passive Haptic Learning for Accelerated Learning of Piano

Learning piano is difficult, especially for older learners with busy lives. Passive haptic learning can reduce the time spent practicing by delivering instructional tactile cues. We designed a custom vibrotactile haptic glove for daily wear, enabling faster learning of piano skills. I led a group of undergraduate and graduate students in manufacturing the glove hardware, designing a web portal, and organizing user studies to evaluate performance.

research      haptics, music, wearables video CHI'22 IMI paper UbiComp'22 demo UROP Outstanding Oral Presentation Award UbiComp'22 Best Demo Award

BrainBraille: A Passively Learnable Brain Computer Interface using fNIRS

We developed a new brain-computer interface that uses fNIRS to detect attempted motor movement in different regions of the body, converting attempted motions into language to enable more versatile communication options for people with movement disabilities. For my undergraduate thesis, I explored how transitional gestures may enable higher accuracy and information transfer in brain-computer interfaces.

research      assistive-technology, brain-computer-interfaces, fnirs undergraduate thesis President's Undergraduate Research Award

2020

Hand Pose Estimation using Convolutional Neural Networks in Stereoscopic Vision

Estimating hand poses is valuable for gesture interaction and hand tracking but often requires expensive depth cameras. Stereo cameras capture multiple perspectives of the hand, allowing depth to be recovered. We created a pipeline that estimates the locations of hand and finger keypoints from a stereo camera using deep convolutional neural networks.
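The depth recovery that makes a stereo pair useful here can be sketched in a few lines; the focal length, baseline, and function name below are illustrative assumptions, not details of the actual pipeline:

```python
def keypoint_depth(x_left, x_right, focal_px, baseline_m):
    """Depth (meters) of a keypoint seen at horizontal pixel positions
    x_left / x_right in a rectified stereo pair: Z = f * B / disparity."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("keypoint must have positive disparity")
    return focal_px * baseline_m / disparity

# e.g. a fingertip detected at 420 px (left) and 390 px (right) with a
# 700 px focal length and 6 cm baseline sits about 1.4 m from the camera
print(round(keypoint_depth(420.0, 390.0, focal_px=700.0, baseline_m=0.06), 2))
```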

coursework      computer-vision, machine-learning slides

SilentSpeller: Silent Speech Text Entry using Electropalatography

Silent speech systems provide a means of communication for people with movement disabilities such as muscular dystrophy while preserving privacy. SilentSpeller is the first silent speech system usable with a vocabulary of more than 1,000 words while the user is in motion. We built a novel text entry system around capacitive tongue sensing from an oral wearable device, enabling a privacy-preserving alternative to speech recognition.

research      assistive-technology, subtle-interaction, wearables video CHI'22 paper CHI'21 Interactivity paper UROP Outstanding Oral Presentation Award BuzzFeed

2019

Autonomous Navigation for Mobile Robots in Open Terrain

As part of the software team, I built a complete replica of the competition in simulation so that RoboJackets' Intelligent Ground Vehicle Competition robots could be tested realistically. I also wrote motor control firmware and path planning algorithms to enable more accurate robot motion. Later, as project manager, I supervised the progress of the software, electrical, and mechanical teams.

organizations      competition, robotics report code 3rd place Grand Award
