Sciweavers

194 search results - page 37 / 39
Multimodality and Gestures in the Teacher
LRE 2008
IEMOCAP: interactive emotional dyadic motion capture database
Since emotions are expressed through a combination of verbal and non-verbal channels, a joint analysis of speech and gestures is required to understand expressive human communicati...
Carlos Busso, Murtaza Bulut, Chi-Chun Lee, Abe Kaz...
JDCTA 2010
Hand Mouse: Real Time Hand Motion Detection System Based on Analysis of Finger Blobs
Hand detection is a fundamental step in many practical applications such as gesture recognition, video surveillance, and multimodal machine interfaces. The aim of this paper i...
Ibrahim Furkan Ince, Manuel Socarras-Garzon, Tae-C...
ICMI 2007, Springer
Speech-filtered bubble ray: improving target acquisition on display walls
The rapid development of large interactive wall displays has been accompanied by research on methods that allow people to interact with the display at a distance. The basic method...
Edward Tse, Mark S. Hancock, Saul Greenberg
COOPIS 2002, IEEE
An Evolvable Framework for Perceptual Collaborative Applications
The Neem Platform is a research test bed for Project Neem, concerned with the development of socially and culturally aware collaborative systems in a wide range of domains, through...
Paulo Barthelmess, Clarence A. Ellis, The Neem Pla...
CHI 2005, ACM
MusicCube: making digital music tangible
To some extent, listening to digital music via storage devices has led to a loss of part of the physical experience associated with earlier media formats such as CDs and LPs. For e...
Miguel Bruns Alonso, David V. Keyson