Sciweavers

395 search results - page 4 / 79
» When do we interact multimodally
ACMDIS
2006
ACM
What do usability evaluators do in practice?: an explorative study of think-aloud testing
Think-aloud testing is a widely employed usability evaluation method, yet its use in practice is rarely studied. We report an explorative study of 14 think-aloud sessions, the aud...
Mie Nørgaard, Kasper Hornbæk
IQ
2007
When Interactive TV Meets Online Auction: A Study On Factors Affecting User Adoption
This research attempts to identify the impact of information quality (IQ) and system quality (SQ) on users' attitudes toward adopting an interactive-TV-based auction service. The pri...
Jaeheung Yoo, Imsook Ha, Junkyun Choi
HAPTICS
2007
IEEE
Where are we with Haptic Visualization?
There is growing interest in non-visual forms of data communication, driven not only by the need for accessible representations but also because researchers are realizing the ...
Jonathan C. Roberts, Sabrina A. Panëels
WEBNET
2000
MPML: A Multimodal Presentation Markup Language with Character Agent Control Functions
As a new style of effective information presentation and multimodal content production on the World Wide Web (WWW), multimodal presentation using interactive l...
Takayuki Tsutsui, Santi Saeyor, Mitsuru Ishizuka
ICMI
2009
Springer
Multimodal inference for driver-vehicle interaction
In this paper we present a novel system for driver-vehicle interaction that combines speech recognition with facial-expression recognition to increase intention-recognition accura...
Tevfik Metin Sezgin, Ian Davies, Peter Robinson