In this work, we propose novel results for the optimization of divergences within the framework of region-based active contours. We focus on parametric statistical models in which the region descriptor is the probability density function (pdf) of an image feature (e.g. intensity) inside the region, and this pdf belongs to the exponential family. The optimization of divergences proves to be a flexible tool for segmentation both with and without an intensity prior. For segmentation without a reference, we aim to maximize the discrepancy between the pdf of the inside region and the pdf of the outside region. Moreover, since the optimization is carried out within the exponential family, we can handle difficult segmentation problems involving various noise models (Gaussian, Rayleigh, Poisson, Bernoulli, ...). We also show experimentally that the maximization of the KL divergence offers interesting properties compared to some other data terms (e.g. minimization of t...
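As a rough sketch of the reference-free criterion described above (the notation and the exact form of the functional are assumptions for illustration, not taken verbatim from this work): writing $p_{\mathrm{in}}$ and $p_{\mathrm{out}}$ for the feature pdfs estimated inside and outside the evolving contour, the segmentation may be driven by maximizing their Kullback-Leibler divergence,
\[
  \max_{\Omega_{\mathrm{in}}}\; D_{\mathrm{KL}}\!\left(p_{\mathrm{in}} \,\|\, p_{\mathrm{out}}\right)
  \;=\;
  \max_{\Omega_{\mathrm{in}}} \int p_{\mathrm{in}}(y)\,\log\frac{p_{\mathrm{in}}(y)}{p_{\mathrm{out}}(y)}\,dy ,
\]
where both pdfs are constrained to lie in the same exponential family (Gaussian, Rayleigh, Poisson, Bernoulli, ...), so the divergence typically admits a closed-form expression in terms of the natural parameters.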