We introduce a new EM framework in which it is possible to optimize not only the model parameters but also the number of model components. A key feature of our approach is that we use nonparametric density estimation to improve parametric density estimation within the EM framework. While the classical EM algorithm estimates model parameters empirically from the data points themselves, we estimate them from nonparametric density estimates. Many applications require an optimal adjustment of the number of model components. We present experimental results in two domains. One is the polygonal approximation of laser range data, an active research topic in robot navigation. The other is the grouping of edge pixels into contour boundaries, which remains an unsolved problem in computer vision.

Categories and Subject Descriptors
I.5 [Pattern Recognition]: General

General Terms
Algorithms, Performance, Experimentation

Keywords
EM, Expectation Maximization, Kullback-Leibler divergence, ...
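
To make the idea stated above concrete (replacing the empirical M-step averages of classical EM with expectations taken under a nonparametric density estimate), the sketch below fits a one-dimensional Gaussian mixture by EM over a grid weighted by a kernel density estimate of the data. This is only an illustrative sketch, not the algorithm of the paper; the function name em_on_kde, the grid resolution, and all constants are assumptions made for illustration.

```python
# Illustrative sketch only (not the authors' algorithm): EM for a 1-D Gaussian
# mixture in which the M-step expectations are taken over a kernel density
# estimate (KDE) of the data, evaluated on a grid, rather than over the raw
# data points themselves.
import numpy as np
from scipy.stats import gaussian_kde

def em_on_kde(data, n_components=3, n_grid=200, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    # Nonparametric target density: KDE evaluated on a grid, normalized so the
    # grid values act as point weights (a discrete stand-in for the density).
    grid = np.linspace(data.min(), data.max(), n_grid)
    w = gaussian_kde(data)(grid)
    w /= w.sum()

    # Random initialization of the mixture parameters.
    mu = rng.choice(grid, n_components)
    var = np.full(n_components, np.var(data))
    pi = np.full(n_components, 1.0 / n_components)

    for _ in range(n_iter):
        # E-step: responsibilities of each component at each grid point.
        diff = grid[:, None] - mu[None, :]
        comp = pi * np.exp(-0.5 * diff**2 / var) / np.sqrt(2 * np.pi * var)
        resp = comp / comp.sum(axis=1, keepdims=True)

        # M-step: parameter updates use the KDE weights instead of raw counts.
        nk = (w[:, None] * resp).sum(axis=0)
        pi = nk
        mu = (w[:, None] * resp * grid[:, None]).sum(axis=0) / nk
        var = (w[:, None] * resp * (grid[:, None] - mu)**2).sum(axis=0) / nk
    return pi, mu, var

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = np.concatenate([rng.normal(-2, 0.5, 300), rng.normal(3, 1.0, 300)])
    print(em_on_kde(data, n_components=2))
```

Weighting the grid by the KDE makes the M-step minimize a discretized Kullback-Leibler divergence between the nonparametric estimate and the mixture model, which is one way to read the contrast with classical EM drawn in the abstract.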