Sciweavers

521 search results for "Affective multimodal human-computer interaction" (page 35 of 105)
ACHI 2008 (IEEE)
Multimodal Metric Study for Human-Robot Collaboration
The aim of our research is to create a system whereby human members of a team can collaborate in a natural way with robots. In this paper we describe a Wizard of Oz (WOZ) study co...
Scott Green, Scott Richardson, Randy Stiles, Mark ...
CHI 2003 (ACM)
Auditory and visual feedback during eye typing
We describe a study on how auditory and visual feedback affects eye typing. Results show that the feedback method influences both text entry speed and error rate. In addition, a p...
Anne Aula, I. Scott MacKenzie, Kari-Jouko Räi...
VIP 2003
Face and Body Gesture Recognition for a Vision-Based Multimodal Analyzer
For computers to interact intelligently with human users, they should be able to recognize emotions by analyzing the human's affective state, physiology, and behavior. I...
Hatice Gunes, Massimo Piccardi, Tony Jan
CHI 2005 (ACM)
Evaluation of multimodal input for entering mathematical equations on the computer
Current standard interfaces for entering mathematical equations on computers are arguably limited and cumbersome. Mathematics notations have evolved to aid visual thinking and yet...
Lisa Anthony, Jie Yang, Kenneth R. Koedinger
HAPTICS 2002 (IEEE)
Comparing Two Haptic Interfaces for Multimodal Graph Rendering
This paper describes the evaluation of two multimodal interfaces designed to provide visually impaired people with access to various types of graphs. The interfaces consist of aud...
Wai Yu, Stephen A. Brewster