Sciweavers

116 search results - page 20 / 24
» Designing and Prototyping Multimodal Commands
WWW 2006 (ACM)
DiTaBBu: automating the production of time-based hypermedia content
We present DiTaBBu, the Digital Talking Books Builder, a framework for the automatic production of time-based hypermedia for the Web, focusing on the Digital Talking Books domain. Deliver...
Carlos Duarte, Luís Carriço, Rui Lop...
WWW 2004 (ACM)
A generic UIML vocabulary for device- and modality-independent user interfaces
In this poster we present our work on a User Interface Markup Language (UIML) vocabulary for the specification of device- and modality-independent user interfaces. The work presen...
Rainer Simon, Michael Jank, Florian Wegscheider
CHI 2006 (ACM)
Feeling what you hear: tactile feedback for navigation of audio graphs
Access to digitally stored numerical data is currently very limited for sight-impaired people. Graphs and visualizations are often used to analyze relationships between numerical ...
Steven A. Wall, Stephen A. Brewster
MHCI 2009 (Springer)
Expectations for user experience in haptic communication with mobile devices
The haptic modality – the sense of touch – is used only to a limited extent in current human-computer interaction. Especially in mobile communication, the haptic modality could prov...
Jani Heikkinen, Thomas Olsson, Kaisa Vää...
CHI 2008 (ACM)
Cross-channel mobile social software: an empirical study
In this paper, we introduce a prototype system designed to support mobile group socializing, which has been appropriated for everyday use by 150 users over 18 months. The system sup...
Clint Heyer, Margot Brereton, Stephen Viller