We consider the problem of learning Gaussian multiresolution (MR) models in which data are available only at the finest scale, and the coarser, hidden variables serve both to capture long-distance dependencies and to provide graphical and statistical structure that leads to new, very efficient inference algorithms. These models aim to overcome the limitations of tree-structured MR models, which produce blocky artifacts: in such tree models, variables at one scale are independent of each other conditioned on the other scales. In the work presented here, we seek to develop models in which the variables at each scale have sparse conditional covariance structure when conditioned on the other scales. This leads to statistical counterparts of so-called multipole methods in physics. Our objective is to learn a sparse graphical structure (namely, an embedded tree) for the dependencies between variables across different scales, which translates into sparsity in part of the inverse of the covariance matrix.
Myung Jin Choi, Venkat Chandrasekaran, Alan S. Willsky
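
To make the structural claim in the abstract concrete, the following is a minimal sketch in standard Gaussian notation; the symbols J, x_s, and Sigma_{s|.} are our own illustrative notation (not taken from the paper), and the identity used is the standard Gaussian conditioning formula, under the assumption that the variables are partitioned by scale.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Sketch (our notation): J denotes the information (inverse covariance)
% matrix of a zero-mean Gaussian, partitioned into blocks by scale.
Let $x \sim \mathcal{N}(0, J^{-1})$, with the variables partitioned by
scale so that $J$ has blocks $J_{s,s'}$. A standard Gaussian identity
gives the distribution of scale $s$ conditioned on all other scales:
\[
  x_s \mid x_{\setminus s}
  \;\sim\; \mathcal{N}\!\left(-J_{s,s}^{-1} J_{s,\setminus s}\, x_{\setminus s},\;
  J_{s,s}^{-1}\right),
\]
so the in-scale conditional covariance is
$\Sigma_{s\mid\cdot} = J_{s,s}^{-1}$. Tree-structured MR models force
$J_{s,s}$ to be diagonal (variables within a scale are conditionally
independent), which is the source of blocky artifacts. The models
described above instead keep the cross-scale blocks $J_{s,s'}$ sparse
(an embedded tree) while asking that the conditional covariance
$\Sigma_{s\mid\cdot}$ itself be sparse.
\end{document}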