In this paper, we propose a new method to obtain customized video summarization according to specific user preferences. Our approach is tailored to the Cultural Heritage scenario and is designed to identify candidate shots, select from the original streams only the scenes whose behavior patterns indicate the presence of relevant experiences, and further filter them to obtain a summary matching the requested user preferences. Our preliminary results show that the proposed approach is able to leverage the user's preferences to obtain a customized summary, so that different users may extract different summaries from the same stream.

Categories and Subject Descriptors
I.4 [Image Processing and Computer Vision]: Customized egocentric video summarization; H.3.1 [Information Systems]: Content Analysis and Indexing

General Terms
Algorithms, Design, Experimentation

Keywords
Video summarization, egocentric vision, wearable devices
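To make the pipeline described above concrete, the following is a minimal, hypothetical sketch of its last two stages: selecting candidate shots whose behavior patterns suggest a relevant experience, then filtering them against a user's preferences. All names, fields, and thresholds (Shot, behavior_score, behavior_threshold, the tag-based preference predicates) are illustrative assumptions, not the actual models or features used in the paper.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Shot:
    start: float           # shot start time (seconds)
    end: float             # shot end time (seconds)
    behavior_score: float  # hypothetical relevance of observed behavior patterns (0..1)
    tags: List[str]        # hypothetical content labels attached to the shot


def summarize(shots: List[Shot],
              prefers: Callable[[Shot], bool],
              behavior_threshold: float = 0.5) -> List[Shot]:
    """Keep shots whose behavior patterns signal a relevant experience,
    then filter them against the user's preferences."""
    candidates = [s for s in shots if s.behavior_score >= behavior_threshold]
    return [s for s in candidates if prefers(s)]


# Example: two users extract different summaries from the same stream.
stream = [
    Shot(0, 12, 0.8, ["painting"]),
    Shot(12, 30, 0.3, ["corridor"]),
    Shot(30, 55, 0.9, ["sculpture"]),
]
user_a = summarize(stream, prefers=lambda s: "painting" in s.tags)
user_b = summarize(stream, prefers=lambda s: "sculpture" in s.tags)
print(user_a)
print(user_b)
```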