Abstract

Gestures are an important modality for human-machine communication. Computer vision modules that perform gesture recognition can be important components of intelligent homes, assistive environments, and human-computer interfaces. A key problem in recognizing gestures is that the appearance of a gesture can vary widely depending on variables such as the person performing the gesture or the position and orientation of the camera. This paper presents a database-based approach for addressing this problem. The large variability in appearance among different examples of the same gesture is addressed by creating large gesture databases that store enough exemplars from each gesture to capture the variability within that gesture. This database-based approach is applied to two gesture recognition problems: handshape categorization and motion-based recognition of American Sign Language (ASL) signs. A key aspect of our approach is the use of database indexing methods, in order to address ...
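The core idea of a database-based recognizer, matching a query against many stored exemplars and returning the label of the closest match, can be illustrated with a minimal nearest-neighbor sketch. The feature representation, distance function, and helper names below are hypothetical placeholders for illustration only, not the specific system described in this paper.

```python
import numpy as np

def build_database(exemplars):
    """Stack exemplar feature vectors, one per stored gesture example.

    `exemplars` is assumed to be a list of (feature_vector, class_label) pairs,
    e.g. handshape descriptors or motion trajectories reduced to fixed-length vectors.
    """
    features = np.stack([f for f, _ in exemplars])
    labels = [lbl for _, lbl in exemplars]
    return features, labels

def classify(query, features, labels):
    """Return the label of the nearest stored exemplar under Euclidean distance.

    A practical system would replace this brute-force scan with a database
    indexing method so that large exemplar sets can be searched efficiently.
    """
    dists = np.linalg.norm(features - query, axis=1)
    return labels[int(np.argmin(dists))]

# Toy usage with made-up two-dimensional "features".
db_features, db_labels = build_database([
    (np.array([0.1, 0.9]), "A"),   # exemplar of one gesture class
    (np.array([0.8, 0.2]), "B"),   # exemplar of another gesture class
])
print(classify(np.array([0.15, 0.85]), db_features, db_labels))  # -> "A"
```

Note that brute-force search costs time linear in the number of stored exemplars per query, which is why indexing methods become important once the database grows large enough to cover the variability within each gesture class.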