Sciweavers

424 search results - page 9 / 85
» Virtually Shared Displays and User Input Devices
IJVR 2007
Towards Sociable Virtual Humans: Multimodal Recognition of Human Input and Behavior
One of the biggest obstacles to constructing effective sociable virtual humans lies in the failure of machines to recognize the desires, feelings, and intentions of the human user...
Christian Eckes, Konstantin Biatov, Frank Hül...
MHCI 2007, Springer
Co-present photo sharing on mobile devices
The paper reports on a mobile application that allows users to share photos with other co-present users by synchronizing the displays of multiple mobile devices. Various floor control...
Leonard Martin Ah Kun, Gary Marsden
CHI 2007, ACM
An exploratory study of input configuration and group process in a negotiation task using a large display
This paper reports on an exploratory study of the effects of input configuration on group behavior and performance in a collaborative task performed by a collocated group using a ...
Jeremy P. Birnholtz, Tovi Grossman, Clarissa Mak, ...
CHI 1999, ACM
Touch-Sensing Input Devices
We can touch things, and our senses tell us when our hands are touching something. But most computer input devices cannot detect when the user touches or releases the device or so...
Ken Hinckley, Mike Sinclair
UIST 2009, ACM
Abracadabra: wireless, high-precision, and unpowered finger input for very small mobile devices
We present Abracadabra, a magnetically driven input technique that offers users wireless, unpowered, high fidelity finger input for mobile devices with very small screens. By exte...
Chris Harrison, Scott E. Hudson