This paper presents a gesture-based Human-Computer Interface (HCI) to navigate a learning object repository mapped in a 3D virtual environment. With this interface, the user can ...
Qing Chen, Abu Saleh Md. Mahfujur Rahman, Xiaojun ...
This paper introduces a new sensor architecture for making interactive surfaces that are sensitive to human hand and finger gestures. This sensor recognizes multiple hand position...
In this paper we review the major approaches to multimodal human computer interaction from a computer vision perspective. In particular, we focus on body, gesture, gaze, ...
This paper describes a Mixed Reality-supported interactive museum exhibit. Using an easy and intuitive pointing gesture recognition system, the museum visitor is able to create hi...
Motion-based control is gaining popularity, and motion gestures form a complementary modality in human-computer interactions. To achieve more robust user-independent motion gestur...