Sciweavers

Multimodality and Gestures in the Teacher
IUI 2010 (ACM)
Usage patterns and latent semantic analyses for task goal inference of multimodal user interactions
This paper describes our work in usage pattern analysis and the development of a latent semantic analysis framework for interpreting multimodal user input consisting of speech and pen ge...
Pui-Yu Hui, Wai Kit Lo, Helen M. Meng
ICMI 2009 (Springer)
Salience in the generation of multimodal referring acts
Pointing combined with verbal referring is one of the most paradigmatic human multimodal behaviours. The aim of this paper is foundational: to uncover the central notions that are...
Paul Piwek
ICMI 2004 (Springer)
Multimodal model integration for sentence unit detection
In this paper, we adopt a direct modeling approach to utilize conversational gesture cues in detecting sentence boundaries, called SUs, in videotaped conversations. We treat the ...
Mary P. Harper, Elizabeth Shriberg
KI 2008 (Springer)
The Enhancement of Low-Level Classifications for Ambient Assisted Living
Assisted living means providing those being assisted with custom services, specific to their needs and capabilities. Computer monitoring can supply some of these services, be it ...
Rachel E. Goshorn, Deborah Goshorn, Mathias Kö...
HRI 2010 (ACM)
Transparent active learning for robots
This research aims to enable robots to learn from human teachers. Motivated by human social learning, we believe that a transparent learning process can help guide the human tea...
Crystal Chao, Maya Cakmak, Andrea Lockerd Thomaz