Jackknife: A Reliable Recognizer with Few Samples and Many Modalities

Abstract

Despite decades of research, there is yet no general rapid prototyping recognizer for dynamic gestures that can be trained with few samples, work with continuous data, and achieve high accuracy that is also modality-agnostic. To begin to solve this problem, we describe a small suite of accessible techniques that we collectively refer to as the Jackknife gesture recognizer. Our dynamic time warping based approach for both segmented and continuous data is designed to be a robust, go-to method for gesture recognition across a variety of modalities using only limited training samples. We evaluate pen and touch, Wii Remote, Kinect, Leap Motion, and sound-sensed gesture datasets as well as conduct tests with continuous data. Across all scenarios we show that our approach is able to achieve high accuracy, suggesting that Jackknife is a capable recognizer and good first choice for many endeavors.
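For readers unfamiliar with the matching strategy behind the recognizer, the following is a minimal sketch of how a dynamic time warping (DTW) template matcher can classify a segmented gesture against a handful of training samples. The function names, the Euclidean local cost, and the 1-nearest-neighbor decision rule are illustrative assumptions for this sketch, not the Jackknife implementation itself, which layers additional features and correction factors on top of DTW.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping cost between two trajectories.

    a, b: arrays of shape (n, d) and (m, d), i.e., sequences of
    d-dimensional feature vectors (e.g., resampled pen, joint,
    or controller positions).
    """
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])   # local (Euclidean) cost
            cost[i, j] = d + min(cost[i - 1, j],       # insertion
                                 cost[i, j - 1],       # deletion
                                 cost[i - 1, j - 1])   # match
    return cost[n, m]

def classify(query, templates):
    """1-nearest-neighbor classification over labeled training templates.

    templates: list of (label, trajectory) pairs, with only one or a
    few templates per gesture class.
    """
    best_label, best_cost = None, np.inf
    for label, template in templates:
        c = dtw_distance(query, template)
        if c < best_cost:
            best_label, best_cost = label, c
    return best_label, best_cost
```

In a continuous-data setting, one would additionally apply a rejection threshold to the best matching cost so that non-gesture motion is not forced into a class; the specific thresholding scheme here is left open, as the paper describes its own approach.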

Publication
In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, ACM.
Corey Pittman
Computer Science, PhD

My research interests include augmented reality, novel user interfaces, and gesture recognition.
