Sciweavers

COLING 2010
Interpreting Pointing Gestures and Spoken Requests - A Probabilistic, Salience-based Approach
We present a probabilistic, salience-based approach to the interpretation of pointing gestures together with spoken utterances. Our mechanism models dependencies between spatial a...
Ingrid Zukerman, Gideon Kowadlo, Patrick Ye
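The entry above describes a probabilistic, salience-based fusion of pointing gestures and spoken utterances. Purely as a rough illustration of that idea (not the authors' model), the Python sketch below combines a Gaussian pointing-salience score with a crude lexical match over the spoken words to rank candidate referents; the function names, the Gaussian fall-off, and the example objects are all assumptions.

```python
# Illustrative sketch only: fusing pointing salience with a spoken description
# to rank candidate objects. All names and the Gaussian salience model are
# assumptions, not the mechanism described in the paper.
import math

def pointing_salience(angle_to_object, sigma=0.2):
    """Gaussian fall-off in angular deviation (radians) from the pointing ray."""
    return math.exp(-(angle_to_object ** 2) / (2 * sigma ** 2))

def speech_match(spoken_words, object_labels):
    """Crude lexical overlap between the utterance and an object's labels."""
    overlap = len(set(spoken_words) & set(object_labels))
    return (1 + overlap) / (1 + len(object_labels))

def interpret(candidates, spoken_words):
    """Rank candidates by the product of spatial and spoken evidence."""
    scores = {
        name: pointing_salience(angle) * speech_match(spoken_words, labels)
        for name, (angle, labels) in candidates.items()
    }
    total = sum(scores.values())
    return sorted(((score / total, name) for name, score in scores.items()),
                  reverse=True)

# Example: two mugs near the pointing ray, one described as "red".
candidates = {"red_mug": (0.05, ["red", "mug"]),
              "blue_mug": (0.10, ["blue", "mug"])}
print(interpret(candidates, ["the", "red", "mug"]))
```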
IJACTAICIT 2010
Gesture Recognition for Human-Computer Interaction (HCI)
Considerable effort has been put towards developing intelligent and natural interfaces between users and computer systems. This is done by means of a variety of modes of informati...
Jane J. Stephan, Sana'a Khudayer
ICONIP 2010
Online Gesture Recognition for User Interface on Accelerometer Built-in Mobile Phones
Recently, several smartphones have been equipped with a 3D accelerometer that can be used for a gesture-based user interface (UI). In order to utilize the gesture UI for real-time sys...
BongWhan Choe, Jun-Ki Min, Sung-Bae Cho
COMSIS 2010
An accelerometer-based gesture recognition algorithm and its application for 3D interaction
This paper proposes an accelerometer-based gesture recognition algorithm. As a pre-processing step, the raw data output by the accelerometer should be quantized, and then use d...
Jianfeng Liu, Zhigeng Pan, Xiangcheng Li
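The Liu et al. snippet mentions quantizing raw accelerometer output as a pre-processing step. The sketch below shows one common way such quantization can be done (nearest direction prototype per sample); the codebook, the thresholds, and the assumption of gravity-compensated input are illustrative choices, not the paper's actual procedure.

```python
# Illustrative quantization sketch: map each 3-axis sample to a discrete symbol
# by nearest direction prototype. Codebook and thresholds are assumptions;
# input is assumed to be gravity-compensated (linear) acceleration in g units.
import math

# Hypothetical codebook: dominant axis directions.
CODEBOOK = {
    "+X": (1, 0, 0), "-X": (-1, 0, 0),
    "+Y": (0, 1, 0), "-Y": (0, -1, 0),
    "+Z": (0, 0, 1), "-Z": (0, 0, -1),
}

def quantize(sample, rest_threshold=0.3):
    """Map one (ax, ay, az) sample to a codebook symbol or 'REST'."""
    norm = math.sqrt(sum(v * v for v in sample))
    if norm < rest_threshold:          # too little motion: treat as rest
        return "REST"
    unit = tuple(v / norm for v in sample)
    # Pick the prototype with the largest dot product (nearest direction).
    return max(CODEBOOK, key=lambda k: sum(u * p for u, p in zip(unit, CODEBOOK[k])))

stream = [(0.1, 0.0, 0.05), (0.9, 0.1, 0.0), (0.0, -1.1, 0.2)]
print([quantize(s) for s in stream])   # ['REST', '+X', '-Y']
```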
HCI 2009
An Approach to Glove-Based Gesture Recognition
Nowadays, computer interaction is mostly done using dedicated devices. However, gestures are an easy means of expression between humans that could be used to communicate with computers ...
Farid Parvini, Dennis McLeod, Cyrus Shahabi, Bahar...
HCI 2009
Gesture-Controlled User Input to Complete Questionnaires on Wrist-Worn Watches
The aim of this work was to investigate arm gestures as an alternative input modality for wrist-worn watches. In particular, we implemented a gesture recognition system and question...
Oliver Amft, Roman Amstutz, Asim Smailagic, Daniel...
HCI 2009
An Open Source Framework for Real-Time, Incremental, Static and Dynamic Hand Gesture Learning and Recognition
Real-time learning and recognition of static and dynamic hand gestures make it possible for computers to recognize hand gestures naturally. This creates endless possibilities in the...
Todd C. Alexander, Hassan S. Ahmed, Georgios C. An...
PUC 2002
SenToy in FantasyA: Designing an Affective Sympathetic Interface to a Computer Game
We describe the design process of an affective control toy, named SenToy, used to control a synthetic character in a computer game. SenToy allows players to influence the emotion...
Ana Paiva, Gerd Andersson, Kristina Höök...
AROBOTS 2000
A Gesture Based Interface for Human-Robot Interaction
Service robotics is currently a pivotal research area in robotics, with enormous societal potential. Since service robots directly interact with people, finding "natural" and ea...
Stefan Waldherr, Roseli Romero, Sebastian Thrun
BEHAVIOURIT 2005
A flick in the right direction: a case study of gestural input
This paper describes the design and evaluation of a gesture-based scheme for issuing the back and forward commands in web browsers. In designing our gesture recogniser we conducte...
Michael Moyle, Andy Cockburn
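The Moyle and Cockburn entry describes a gesture scheme for issuing back and forward commands in a web browser. Purely as an illustration of the flick idea (not their recogniser), the sketch below classifies a stroke as a left or right flick from its net horizontal displacement; the thresholds and the displacement heuristic are assumptions.

```python
# Illustrative flick classifier for back/forward navigation; thresholds (in
# pixels) and the displacement heuristic are assumptions, not the recogniser
# evaluated in the paper.
def classify_flick(points, min_distance=50, max_vertical_ratio=0.5):
    """points: list of (x, y) samples from a single stroke.
    Returns 'back', 'forward', or None for strokes that are not flicks."""
    if len(points) < 2:
        return None
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if abs(dx) < min_distance:                  # too short to be a flick
        return None
    if abs(dy) > abs(dx) * max_vertical_ratio:  # too diagonal to count as horizontal
        return None
    return "back" if dx < 0 else "forward"

print(classify_flick([(200, 100), (150, 102), (90, 105)]))   # 'back'
print(classify_flick([(100, 100), (180, 98)]))               # 'forward'
```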