Sciweavers

275 search results - page 31 / 55
Search: Bare-hand 3D gesture input to interactive systems
GW 2003, Springer
Evaluating Multimodal Interaction Patterns in Various Application Scenarios
In this work, we present the results of a comparative user study evaluating multimodal user interactions with regard to two different operation scenarios: a desktop Virtual-Reality...
Frank Althoff, Gregor McGlaun, Manfred K. Lang, Ge...
MHCI 2009, Springer
overView: physically-based vibrotactile feedback for temporal information browsing
An approach to providing tangible feedback to users of a mobile device in both highly visual touchscreen-based and eyes-free interaction scenarios and the transition between the t...
Steven Strachan, Grégoire Lefebvre, Sophie ...
FGR 2002, IEEE
Real-Time Tracking of Multiple Fingertips and Gesture Recognition for Augmented Desk Interface Systems
In this paper, we propose a fast and robust method for tracking a user's hand and multiple fingertips; we then demonstrate gesture recognition based on measured fingertip traj...
Kenji Oka, Yoichi Sato, Hideki Koike
METMBS 2004
Med-LIFE: A Diagnostic Aid for Medical Imagery
We present a system known as Med-LIFE (Medical application of Learning, Image Fusion, and Exploration) currently under development for medical image analysis. This pipelined syste...
Joshua R. New, Erion Hasanbelliu, Mario Aguilar
ACHI 2008, IEEE
Multimodal Metric Study for Human-Robot Collaboration
The aim of our research is to create a system whereby human members of a team can collaborate in a natural way with robots. In this paper we describe a Wizard of Oz (WOZ) study co...
Scott Green, Scott Richardson, Randy Stiles, Mark ...