Sciweavers

395 search results for "When do we interact multimodally" (page 28 / 79)
ACE
2009
ACM
Instantaneous saccade driven eye gaze interaction
In this paper, we introduce and evaluate a new Instantaneous Saccade (IS) selection scheme for eye gaze driven interfaces where the speed of the target selection is of utmost impo...
Oleg V. Komogortsev, Young Sam Ryu, Do Hyong Koh, ...
ICAT
2007
IEEE
Direct-Projected AR Based Interactive User Interface for Medical Surgery
In the field of computer-aided surgery, augmented reality (AR) technology has been successfully used to enhance surgical accuracy and improve surgeons' convenience by visually a...
Byung-Kuk Seo, Moon-Hyun Lee, Hanhoon Park, Jong-I...
CHI
1999
ACM
Touch-Sensing Input Devices
We can touch things, and our senses tell us when our hands are touching something. But most computer input devices cannot detect when the user touches or releases the device or so...
Ken Hinckley, Mike Sinclair
INTERACT
2007
ThumbSpace: Generalized One-Handed Input for Touchscreen-Based Mobile Devices
In this paper, we present ThumbSpace, a software-based interaction technique that provides general one-handed thumb operation of touchscreen-based mobile devices. Our goals are to p...
Amy K. Karlson, Benjamin B. Bederson
CAISE
2006
Springer
Mediation Patterns for Message Exchange Protocols
Abstract. Systems interact with their environment (e.g., other systems) by exchanging messages in a particular order. Interoperability problems arise when systems do not understand...
Stanislav Pokraev, Manfred Reichert