Sciweavers

395 search results - page 15 / 79
Search: When do we interact multimodally
KDD
2012
ACM
Multi-source learning for joint analysis of incomplete multi-modality neuroimaging data
Incomplete data present serious problems when integrating large-scale brain imaging data sets from different imaging modalities. In the Alzheimer’s Disease Neuroimaging Initiative...
Lei Yuan, Yalin Wang, Paul M. Thompson, Vaibhav A....
CVPR
2000
IEEE
Multimodal Speaker Detection Using Error Feedback Dynamic Bayesian Networks
The design and development of novel human-computer interfaces poses a challenging problem: the actions and intentions of users have to be inferred from sequences of noisy and ambiguous mu...
Vladimir Pavlovic, James M. Rehg, Ashutosh Garg, T...
COLING
2000
Taking Account of the User's View in 3D Multimodal Instruction Dialogue
While recent advances in virtual reality technology have created a rich communication interface linking humans and computers, there has been little work on building dialogue s...
Yukiko I. Nakano, Kenji Imamura, Hisashi Ohara
AVI
2006
Mixed reality: a model of mixed interaction
Mixed reality systems seek to smoothly link the physical and digital (data-processing) environments. Although mixed reality systems are becoming more prevalent, we still do not ha...
Céline Coutrix, Laurence Nigay
PERSUASIVE
2009
Springer
Designing empathic computers: the effect of multimodal empathic feedback using animated agent
Experiencing emotional distress is the number one reason why people undergoing behaviour modification (e.g. quitting smoking, dieting) suffer relapses. Providing emot...
Hien Nguyen, Judith Masthoff