Recognizing context for annotating a live life recording

Abstract
In the near future, it will be possible to continuously record and store the entire audio–visual lifetime of a person, together with all digital information that the person perceives or creates. While the storage of this data will soon be possible, retrieval and indexing into such large data sets remain unsolved challenges. Since today's retrieval cues seem insufficient, we argue that additional cues, obtained from body-worn sensors, make associative retrieval by humans possible. We present three approaches to create such cues, each along with an experimental evaluation: the user's physical activity from acceleration sensors, his social environment from audio sensors, and his interruptibility from multiple sensors.

Keywords: Context-awareness · Information retrieval · Sensing systems · Context recognition · Wearable computing
Nicky Kern, Bernt Schiele, Albrecht Schmidt
Added 27 Dec 2010
Updated 27 Dec 2010
Type Journal
Year 2007
Where PUC
Authors Nicky Kern, Bernt Schiele, Albrecht Schmidt