We consider the problem of assisting vulnerable people and their carers in reducing the occurrence, and the concomitant consequences, of accidents in the home. A wireless sensor network employing multiple sensing and event detection modalities and distributed processing is proposed for smart home monitoring applications. Distributed vision-based analysis is used to detect an occupant's posture, and features from multiple cameras are merged through a collaborative reasoning function to determine significant events. The ambient assistance provided places minimal demands on occupants to interact directly with technology. Vision-based technology is coupled with AI-based algorithms so that occupants do not have to wear sensors, other than an unobtrusive identification badge, or learn and remember to use a specific device. In addition, the system can assess situations, anticipate problems, produce alerts, advise carers and provide explanations.
Hamid K. Aghajan, Juan Carlos Augusto, Chen Wu, Pa
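As a rough illustration of the collaborative reasoning step described in the abstract, the Python sketch below fuses per-camera posture estimates by confidence-weighted voting and raises an alert when the fused result indicates an alarming posture. The names (CameraObservation, fuse_postures, assess, ALERT_POSTURES), the voting scheme, and the threshold are assumptions introduced here for exposition; they are not the interface or algorithm of the system itself.

```python
# Illustrative sketch only: class, function and constant names are
# hypothetical and do not come from the paper.
from dataclasses import dataclass
from collections import defaultdict


@dataclass
class CameraObservation:
    camera_id: str
    posture: str        # e.g. "standing", "sitting", "lying"
    confidence: float   # per-camera classifier confidence in [0, 1]


# Postures that, if agreed upon by the camera network, warrant notifying a carer.
ALERT_POSTURES = {"lying"}


def fuse_postures(observations: list[CameraObservation]) -> tuple[str, float]:
    """Confidence-weighted vote across camera nodes.

    Returns the winning posture label and its share of the total
    confidence mass, used here as a simple fusion score.
    """
    scores: dict[str, float] = defaultdict(float)
    for obs in observations:
        scores[obs.posture] += obs.confidence
    total = sum(scores.values()) or 1.0
    posture = max(scores, key=scores.get)
    return posture, scores[posture] / total


def assess(observations: list[CameraObservation], threshold: float = 0.6) -> str | None:
    """Return an alert message when the fused posture is alarming, else None."""
    posture, score = fuse_postures(observations)
    if posture in ALERT_POSTURES and score >= threshold:
        return f"ALERT: occupant appears to be {posture} (fusion score {score:.2f})"
    return None


if __name__ == "__main__":
    frame = [
        CameraObservation("cam-kitchen", "lying", 0.82),
        CameraObservation("cam-hall", "lying", 0.74),
        CameraObservation("cam-door", "sitting", 0.35),
    ]
    print(assess(frame))  # -> ALERT: occupant appears to be lying (fusion score 0.82)
```

In practice the collaborative reasoning function would combine richer features and temporal context than this single-frame vote, but the sketch shows the basic pattern: individual camera nodes report local estimates, and a fusion step decides whether an event is significant enough to alert a carer.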