Signal processing with over-complete representations has been an active research field in recent years. In this article, we study two related problems: (1) given two wavelets and a Gaussian observation model, what is the optimal estimate of a signal corrupted by additive noise? and (2) what relationship between the phase responses of the two scaling filters minimizes the variance of this estimate? Based on the answers to these two problems, we develop a denoising algorithm. We test the proposed algorithm on image denoising and show that its performance is comparable to that of state-of-the-art methods.
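To make the variance-minimization question concrete, the following is a standard result from linear estimation, offered here only as an intuition-building sketch and not as the derivation developed in the paper. For a combined estimator $\hat{x} = w\,\hat{x}_1 + (1-w)\,\hat{x}_2$ formed from two unbiased estimates with variances $\sigma_1^2$, $\sigma_2^2$ and correlation coefficient $\rho$,
\[
\operatorname{Var}(\hat{x}) \;=\; w^2\sigma_1^2 \;+\; (1-w)^2\sigma_2^2 \;+\; 2w(1-w)\,\rho\,\sigma_1\sigma_2,
\]
which is minimized at
\[
w^\ast \;=\; \frac{\sigma_2^2 - \rho\,\sigma_1\sigma_2}{\sigma_1^2 + \sigma_2^2 - 2\rho\,\sigma_1\sigma_2}.
\]
The correlation $\rho$ between the two estimates depends on how the two representations relate to each other, which suggests where the relationship between the two wavelets, and in particular the phase responses of their scaling filters, can enter problem (2).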