The Expectation-Maximization (EM) algorithm is a popular tool for statistical estimation problems involving incomplete data, as well as for problems that can be posed in a similar form, such as mixture estimation. Because EM is a hill-climbing approach, it suffers from problems such as local maxima, plateaus, and ridges. In the case of mixture models, these problems depend on the initialization of the algorithm and the configuration of the data set. Randomization is typically an effective way to escape such situations. We propose a random swap EM algorithm (RSEM) to overcome these problems in Gaussian mixture models. Our method repeatedly performs random swaps, which can break the configuration of local maxima and other unfavorable situations. Compared with the strategies used in other methods, the proposed algorithm has its own advantages. We also show its practical application to image segmentation problems.
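
As a concrete illustration of the general idea, the following is a minimal sketch of a random-swap EM loop for a Gaussian mixture, written with NumPy/SciPy. The specific swap rule used here (re-seeding one component's mean at a randomly chosen data point) and the acceptance test based on log-likelihood are assumptions for this sketch, not necessarily the exact procedure of the proposed RSEM.

```python
# Sketch: random-swap EM for a Gaussian mixture (illustrative only; the swap
# rule and acceptance criterion below are assumptions, not the paper's method).
import numpy as np
from scipy.stats import multivariate_normal

def e_step(X, weights, means, covs):
    # Responsibilities: posterior probability of each component for each point.
    n, k = X.shape[0], len(weights)
    dens = np.zeros((n, k))
    for j in range(k):
        dens[:, j] = weights[j] * multivariate_normal.pdf(X, means[j], covs[j])
    loglik = np.log(dens.sum(axis=1)).sum()
    resp = dens / dens.sum(axis=1, keepdims=True)
    return resp, loglik

def m_step(X, resp):
    # Re-estimate weights, means, and covariances from the responsibilities.
    n, d = X.shape
    nk = resp.sum(axis=0)
    weights = nk / n
    means = (resp.T @ X) / nk[:, None]
    covs = []
    for j in range(resp.shape[1]):
        diff = X - means[j]
        cov = (resp[:, j][:, None] * diff).T @ diff / nk[j]
        covs.append(cov + 1e-6 * np.eye(d))  # small regularization for stability
    return weights, means, np.array(covs)

def em(X, weights, means, covs, iters=20):
    # Plain EM: converges to a local maximum of the likelihood.
    for _ in range(iters):
        resp, _ = e_step(X, weights, means, covs)
        weights, means, covs = m_step(X, resp)
    _, loglik = e_step(X, weights, means, covs)
    return (weights, means, covs), loglik

def random_swap_em(X, k, swaps=20, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Initial model: random means, shared full covariance, equal weights.
    means = X[rng.choice(n, k, replace=False)]
    covs = np.array([np.cov(X.T) + 1e-6 * np.eye(d)] * k)
    weights = np.full(k, 1.0 / k)
    best, best_ll = em(X, weights, means, covs)
    for _ in range(swaps):
        w, m, c = (a.copy() for a in best)
        j = rng.integers(k)            # pick one component at random
        m[j] = X[rng.integers(n)]      # re-seed its mean at a random data point
        cand, ll = em(X, w, m, c)      # re-run EM from the perturbed model
        if ll > best_ll:               # keep the swap only if it improves the fit
            best, best_ll = cand, ll
    return best, best_ll
```

The key point of the sketch is that each swap perturbs one component of the current solution and then lets EM refine the perturbed model, so the search can leave the basin of attraction of a local maximum while never discarding a better solution already found.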