Truly ubiquitous computing poses new and significant challenges. A huge number of heterogeneous devices will interact to perform complex distributed tasks. One of the key aspects that will condition the role and impact of these new technologies on developed societies is how to obtain a manageable representation of the surrounding environment starting from simple sensing capabilities. This will enable devices to adapt their computing activities to an ever-changing environment. This paper presents a framework to promote unsupervised training processes among different sensors. This technology allows different sensors to exchange the knowledge needed to create a model for classifying events happening in the environment. In particular, as a case study, we developed a multi-modal, multi-sensor classification system that combines data from a camera and a body-worn accelerometer to identify the user's motion state. The body-worn accelerometer sensor learns a Gaussian mixture model of the user beha...
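To illustrate the accelerometer side of the case study, the following is a minimal sketch of fitting a Gaussian mixture model to accelerometer magnitude data and clustering it into motion states. The synthetic data, the two-state setup, and the use of scikit-learn's `GaussianMixture` are assumptions for illustration only, not the paper's actual implementation.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic accelerometer magnitudes (in g) for two motion states.
# Hypothetical distributions: roughly 1 g while still, higher mean
# and variance while walking.
rng = np.random.default_rng(0)
still = rng.normal(1.0, 0.05, size=(200, 1))
walking = rng.normal(1.4, 0.20, size=(200, 1))
X = np.vstack([still, walking])

# Fit an unsupervised two-component GMM: each component is intended
# to capture one motion state without any labelled training data.
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)

# Assign each sample to the most likely component (a motion state).
labels = gmm.predict(X)
```

In a deployed system, the component assignments would then be exchanged with the camera-based sensor so that the two modalities can align their event models.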