Sciweavers

ISBI 2002, IEEE

Capturing contextual dependencies in medical imagery using hierarchical multi-scale models

In this paper we summarize our results for two classes of hierarchical multi-scale models that exploit contextual information for detection of structure in mammographic imagery. The first model, the hierarchical pyramid neural network (HPNN), is a discriminative model capable of integrating information either coarse-to-fine or fine-to-coarse for microcalcification and mass detection. The second model, the hierarchical image probability (HIP) model, captures short-range and contextual dependencies through a combination of coarse-to-fine factoring and a set of hidden variables. The HIP model, being a generative model, has broad utility, and we present results for classification, synthesis, and compression of mammographic mass images. Together, the two models demonstrate the value of the hierarchical multi-scale framework for computer-assisted detection and diagnosis.
Paul Sajda, Clay Spence, Lucas C. Parra
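To make the abstract's description of coarse-to-fine context passing more concrete, the PyTorch sketch below builds a small image pyramid, runs a tiny convolutional network at each scale, and feeds each level's hidden features as extra context inputs to the next finer level before a per-pixel detection readout. The pyramid depth, channel counts, and sigmoid readout are illustrative assumptions for this sketch only, not the HPNN architecture reported in the paper.

# Minimal coarse-to-fine pyramid-network sketch (hypothetical sizes; not the authors' exact HPNN).
import torch
import torch.nn as nn
import torch.nn.functional as F

class PyramidLevelNet(nn.Module):
    """Small conv net applied at one pyramid level; accepts optional context channels."""
    def __init__(self, in_channels, context_channels, hidden_channels=8):
        super().__init__()
        self.conv1 = nn.Conv2d(in_channels + context_channels, hidden_channels, 3, padding=1)
        self.conv2 = nn.Conv2d(hidden_channels, hidden_channels, 3, padding=1)

    def forward(self, x, context=None):
        if context is not None:
            # Upsample coarse-scale hidden features and append them as contextual inputs.
            context = F.interpolate(context, size=x.shape[-2:], mode="bilinear", align_corners=False)
            x = torch.cat([x, context], dim=1)
        h = torch.relu(self.conv1(x))
        return torch.relu(self.conv2(h))

class CoarseToFinePyramidNet(nn.Module):
    """Processes an image pyramid coarse-to-fine, passing hidden features down as context."""
    def __init__(self, levels=3, hidden_channels=8):
        super().__init__()
        self.levels = levels
        self.nets = nn.ModuleList([
            PyramidLevelNet(1, 0 if i == levels - 1 else hidden_channels, hidden_channels)
            for i in range(levels)
        ])
        self.readout = nn.Conv2d(hidden_channels, 1, 1)  # per-pixel detection score

    def forward(self, image):
        # Build a simple image pyramid (finest resolution at index 0).
        pyramid = [image]
        for _ in range(self.levels - 1):
            pyramid.append(F.avg_pool2d(pyramid[-1], 2))
        # Coarse-to-fine pass: each level's hidden features become context for the next finer level.
        context = None
        for i in reversed(range(self.levels)):
            context = self.nets[i](pyramid[i], context)
        return torch.sigmoid(self.readout(context))

if __name__ == "__main__":
    model = CoarseToFinePyramidNet()
    scores = model(torch.rand(1, 1, 64, 64))  # e.g. a 64x64 mammogram patch
    print(scores.shape)  # torch.Size([1, 1, 64, 64])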
Added: 20 Nov 2009
Updated: 20 Nov 2009
Type: Conference
Year: 2002
Where: ISBI
Authors: Paul Sajda, Clay Spence, Lucas C. Parra