Most gesture recognition systems analyze gestures intended for communication (e.g., sign language) or for command (e.g., navigation in a virtual world). We attempt instead to recognize gestures made in the course of performing everyday work activities. Specifically, we examine activities in a wood shop, both in isolation and in the context of a simulated assembly task. We apply linear discriminant analysis (LDA) and hidden Markov model (HMM) techniques to features derived from body-worn accelerometers and microphones. The resulting system can successfully segment and identify most shop activities with zero false positives and 83.5% accuracy.
Paul Lukowicz, Jamie A. Ward, Holger Junker, Mathi
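To make the recognition pipeline concrete, the following is a minimal sketch of the LDA-plus-HMM approach named in the abstract. It is not the authors' implementation: the use of scikit-learn and hmmlearn, the synthetic frame features standing in for accelerometer and microphone data, and all names and parameters (frame_features, N_ACTIVITIES, the three-state HMMs) are assumptions introduced purely for illustration.

```python
# Hypothetical sketch of an LDA + HMM activity classifier, using synthetic
# data in place of real accelerometer/microphone recordings. Library choices
# (scikit-learn, hmmlearn) and all parameters are assumptions, not the
# authors' published setup.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)
N_ACTIVITIES, FRAMES, DIM = 3, 200, 12   # e.g. sawing, drilling, hammering

def frame_features(n_frames, offset):
    """Stand-in for per-frame features (e.g. acceleration statistics and
    audio band energies) computed over short sliding windows."""
    return rng.normal(loc=offset, scale=1.0, size=(n_frames, DIM))

# One synthetic training sequence per activity class.
train = {a: frame_features(FRAMES, offset=a) for a in range(N_ACTIVITIES)}

# 1) LDA projects frame features into a low-dimensional,
#    class-discriminative space.
X = np.vstack(list(train.values()))
y = np.repeat(np.arange(N_ACTIVITIES), FRAMES)
lda = LinearDiscriminantAnalysis(n_components=N_ACTIVITIES - 1).fit(X, y)

# 2) One continuous-density HMM per activity, trained on its projected frames.
models = {}
for a, frames in train.items():
    hmm = GaussianHMM(n_components=3, covariance_type="diag", n_iter=20)
    hmm.fit(lda.transform(frames))
    models[a] = hmm

# 3) Classify an unseen segment by the model with the highest log-likelihood.
test_segment = lda.transform(frame_features(50, offset=1))
scores = {a: m.score(test_segment) for a, m in models.items()}
print("predicted activity:", max(scores, key=scores.get))
```

In this reading, LDA supplies a compact, discriminative feature space, while the per-class HMMs capture the temporal structure of each activity and score candidate segments; how segmentation and the zero-false-positive rejection are actually achieved is described in the paper itself, not in this sketch.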