Sciweavers

521 search results - page 34 / 105
» Affective multimodal human-computer interaction
OZCHI
2005
ACM
Dawn explorer: a framework for multimodal accessibility to computer systems
Technology is advancing at a rapid pace, automating many everyday chores in the process, changing the way we perform work and providing various forms of entertainment. Makers of t...
Frank Loewenich, Frédéric Maire
HCI
2007
Character Agents in E-Learning Interface Using Multimodal Real-Time Interaction
This paper describes an e-learning interface with multiple tutoring character agents. The character agents use eye movement information to facilitate empathy-relevant reasoning and...
Hua Wang, Jie Yang, Mark H. Chignell, Mitsuru Ishi...
CHI
2007
ACM
GazeTop: interaction techniques for gaze-aware tabletops
GazeTop is a tabletop system that tracks multi-user eye movement in a co-located setting. Knowledge of eye movement is highly relevant to tabletop interaction: eyes can point to d...
David Holman
CHI
2007
ACM
Devices as interactive physical containers: the shoogle system
Shoogle is a novel interface for sensing data within a mobile device, such as presence and properties of text messages or remaining resources. It is based around active exploratio...
John Williamson, Roderick Murray-Smith, Stephen Hu...
CHI
2003
ACM
Super cilia skin: an interactive membrane
In this paper we introduce Super Cilia Skin, a multi-modal interactive membrane. We conceived Super Cilia Skin as a computationally enhanced membrane coupling tactile-kinesthetic i...
Hayes Raffle, Mitchell W. Joachim, James Tichenor