We propose a novel statistical approach to detecting defects in digitized archive film that models temporal information across several consecutive frames with a hidden Markov model (HMM). The HMM is trained on normal (defect-free) observation sequences and is then used to detect defective pixels by examining each new observation sequence and its sub-sequences in a leave-one-out fashion. Comparison with state-of-the-art methods shows that the proposed approach achieves higher detection rates with fewer false alarms.
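To make the detection step concrete, the following is a minimal sketch of such a leave-one-out test, assuming hmmlearn's GaussianHMM, one-dimensional pixel intensities as observations, and a hypothetical decision threshold tau; the paper's actual model structure and decision rule may differ.

```python
# Sketch of HMM-based defect detection via leave-one-out scoring.
# Assumptions: hmmlearn's GaussianHMM; each sequence is the intensity of one
# pixel location across T frames, shaped (T, 1); tau is a hypothetical threshold.
import numpy as np
from hmmlearn.hmm import GaussianHMM

def train_normal_hmm(normal_seqs, n_states=3):
    """Fit an HMM to defect-free temporal pixel sequences."""
    X = np.concatenate(normal_seqs)           # stack all (T, 1) sequences
    lengths = [len(s) for s in normal_seqs]   # per-sequence lengths for hmmlearn
    model = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
    model.fit(X, lengths)
    return model

def detect_defective_frames(model, seq, tau=2.0):
    """Flag frame t as defective if dropping observation t raises the
    per-observation log-likelihood of the sequence by more than tau."""
    full = model.score(seq) / len(seq)        # normalized log-likelihood, full sequence
    flags = []
    for t in range(len(seq)):
        sub = np.delete(seq, t, axis=0)       # leave observation t out
        loo = model.score(sub) / len(sub)
        flags.append(loo - full > tau)        # large gain => frame t is anomalous
    return flags
```

A sequence whose likelihood improves sharply once a single observation is removed indicates that the removed frame deviates from the learned model of normal temporal behavior, which is the intuition behind flagging that pixel as defective.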