Sciweavers

121 search results - page 11 / 25
» Sensing and visualizing spatial relations of mobile devices
ASSETS
2008
ACM
Computer vision-based clear path guidance for blind wheelchair users
We describe a system that guides blind and visually impaired wheelchair users along a clear path, using computer vision to sense the presence of obstacles or other terrain feat...
Volodymyr Ivanchenko, James Coughlan, William Gerr...
ACMDIS
2008
ACM
TapGlance: designing a unified smartphone interface
The difference between using one mobile phone and another can feel like learning a new language. Based on our extensive experience designing mobile applications for spatial data na...
Daniel C. Robbins, Bongshin Lee, Roland Fernandez
AI
2005
Springer
Learning to talk about events from narrated video in a construction grammar framework
The current research presents a system that learns to understand object names, spatial relation terms and event descriptions from observing narrated action sequences. The system e...
Peter Ford Dominey, Jean-David Boucher
CHI
2009
ACM
A tag in the hand: supporting semantic, social, and spatial navigation in museums
Designers of mobile, social systems must carefully think about how to help their users manage spatial, semantic, and social modes of navigation. Here, we describe our deployment o...
Dan Cosley, Jonathan Baxter, Soyoung Lee, Brian Al...
CG
2007
Springer
Visual analysis of users' performance data in fitness activities
This paper presents a tool for the visual analysis of fitness performance data, such as running speed and heart rate. The tool, called MOPET Analyzer, provides a set of interacti...
Daniele Nadalutti, Luca Chittaro