Sciweavers

304 search results - page 26 / 61
» A Multi-Context System Computing Modalities
NAACL 1994
Predicting and Managing Spoken Disfluencies During Human-Computer Interaction
This research characterizes the spontaneous spoken disfluencies typical of human-computer interaction, and presents a predictive model accounting for their occurrence. Data were c...
Sharon L. Oviatt
ICMI 2004, Springer
Analysis of emotion recognition using facial expressions, speech and multimodal information
Interaction between humans and computers will be more natural if computers can perceive and respond to non-verbal human communication such as emotion. Although ...
Carlos Busso, Zhigang Deng, Serdar Yildirim, Murta...
HAPTICS 2009, IEEE
A whole-arm tactile display system
This work presents a new tactile display device for relaying contact information to locations along the human arm. The system is intended to facilitate teleoperation of whole-arm ...
Riichiro Tadakuma, Robert D. Howe
IHI 2010
Large-scale multimodal mining for healthcare with mapreduce
Recent advances in healthcare and bioscience technologies and proliferation of portable medical devices are producing massive amounts of multimodal data. The need for parallel pro...
Fei Wang, Vuk Ercegovac, Tanveer Fathima Syeda-Mah...
ICASSP 2011, IEEE
Improving kernel-energy trade-offs for machine learning in implantable and wearable biomedical applications
Emerging biomedical sensors and stimulators offer unprecedented modalities for delivering therapy and acquiring physiological signals (e.g., deep brain stimulators). Exploiting th...
Kyong-Ho Lee, Sun-Yuan Kung, Naveen Verma