Expression constraints in multimodal human-computer interaction

Thanks to recent scientific advances, it is now possible to design multimodal interfaces allowing the use of speech and pointing gestures on a touchscreen. However, present speech recognizers and natural language interpreters cannot yet process spontaneous speech accurately. These limitations make it necessary to impose constraints on users' speech inputs. Thus, ergonomic studies are needed to provide user interface designers with effective guidelines for the definition of usable speech constraints. We have developed a method for the design of expression constraints that define tractable and usable multimodal command languages; that is, languages which can be interpreted reliably by present systems and learned easily, mostly through interaction with the system itself. We present an empirical study which attempts to assess the usability of such a command language in a realistic multimodal software environment. A comparison between the initial behaviors of the subjects involved in this s...
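The core idea of expression constraints, restricting spoken input to a small set of command templates so that present recognizers can interpret it reliably, lends itself to a brief illustration. The sketch below is not from the paper; the COMMAND_TEMPLATES set, PointingGesture type, and interpret function are hypothetical, and it assumes speech has already been transcribed into words and touchscreen taps into coordinates:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical constrained command vocabulary: each spoken command must
# match one of a small set of templates, which is what makes the language
# tractable for the recognizer while staying easy to learn.
COMMAND_TEMPLATES = {
    ("move", "that", "there"): "MOVE",    # requires two pointing gestures
    ("delete", "that"): "DELETE",         # requires one pointing gesture
    ("create", "circle", "there"): "CREATE_CIRCLE",
}

@dataclass
class PointingGesture:
    x: float
    y: float

def interpret(words: list[str], gestures: list[PointingGesture]) -> Optional[dict]:
    """Resolve a constrained spoken command against touchscreen gestures.

    Deictic words ('that', 'there') are bound to pointing gestures in
    order of occurrence; the command is rejected if the gesture count
    does not match, mirroring the reliability the constraints buy.
    """
    action = COMMAND_TEMPLATES.get(tuple(words))
    if action is None:
        return None  # utterance falls outside the constrained language
    deictics = [w for w in words if w in ("that", "there")]
    if len(deictics) != len(gestures):
        return None  # speech/gesture mismatch
    return {"action": action,
            "targets": [(g.x, g.y) for g in gestures]}

# Example: "move that there" accompanied by two taps on the touchscreen.
print(interpret(["move", "that", "there"],
                [PointingGesture(10, 20), PointingGesture(200, 150)]))
```

Rejecting any utterance outside the template set is what makes the language tractable for the system, while the small, regular vocabulary keeps it learnable, which is the usability trade-off the study evaluates.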
Sandrine Robbe-Reiter, Noelle Carbonell, Pierre Dauchy
Type: Conference
Year: 2000
Where: IUI
Authors: Sandrine Robbe-Reiter, Noelle Carbonell, Pierre Dauchy