Sciweavers

335 search results - page 4 / 67
» Efficiency of multimodal metaphors in the presentation of le...
IJCV 2007
Multi-sensory and Multi-modal Fusion for Sentient Computing
This paper presents an approach to multi-sensory and multi-modal fusion in which computer vision information obtained from calibrated cameras is integrated with a large-scale sent...
Christopher Town
KDD 2012, ACM
Multi-source learning for joint analysis of incomplete multi-modality neuroimaging data
Incomplete data present serious problems when integrating large-scale brain imaging data sets from different imaging modalities. In the Alzheimer’s Disease Neuroimaging Initiativ...
Lei Yuan, Yalin Wang, Paul M. Thompson, Vaibhav A....
CHI 2003, ACM
A design tool for camera-based interaction
Cameras provide an appealing new input medium for interaction. The creation of camera-based interfaces is outside the skill-set of most programmers and completely beyond the skill...
Jerry Alan Fails, Dan R. Olsen
IUI 2005, ACM
Multimodal new vocabulary recognition through speech and handwriting in a whiteboard scheduling application
Our goal is to automatically recognize and enroll new vocabulary in a multimodal interface. To accomplish this, our technique aims to leverage the mutually disambiguating aspects o...
Edward C. Kaiser
CORR 2010, Springer
Discovering Knowledge from Multi-modal Lecture Recordings
Educational media mining is the process of converting raw media data from educational systems into useful information that can be used to design learning systems, answer research qu...
Kannan Rajkumar, Christian Guetl