Detecting human arm motion in a typical classroom environment is challenging due to the noisy and highly dynamic background, varying lighting conditions, and the small size and potentially large number of candidate objects. We describe a robust vision system that detects when students raise their hands to ask questions. The system is intended to support the collaborative demands of distributed classroom lecturing and to serve as a test case for real-time gesture recognition vision systems. Various techniques, including temporal and spatial segmentation, skin color identification, and shape and feature analysis, are investigated and discussed. Limitations and problems are also analyzed, and testing results are illustrated.
Jie Yao, Jeremy R. Cooperstock
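To give a concrete sense of the skin color identification step mentioned in the abstract, the sketch below applies a well-known RGB skin-color heuristic (the explicit thresholding rule of Peer et al.) to produce a binary skin mask. This is an illustrative assumption, not the classifier actually used in the system; the function names and thresholds are hypothetical.

```python
def is_skin_rgb(r, g, b):
    """Classic explicit RGB skin-color rule (Peer et al.); shown only as
    an example of pixel-wise skin color identification, not the paper's
    actual method."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15
            and r > g and r > b)


def skin_mask(image):
    """Return a binary mask (1 = skin) for an image represented as a
    list of rows of (r, g, b) tuples."""
    return [[1 if is_skin_rgb(*px) else 0 for px in row] for row in image]
```

In a full pipeline, a mask like this would typically be combined with temporal segmentation (e.g., frame differencing) and then passed to shape analysis, so that only skin-colored regions that are also moving and arm-shaped trigger a hand-raise event.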