Abstract. This paper proposes a general approach named Expectation-MiniMax (EMM) for clustering analysis without knowing the number of clusters in advance. It describes the contrast function of the Expectation-Maximization (EM) algorithm by an approximate one with a designable error term. By adaptively minimizing this error term while maximizing the approximate contrast function, the EMM automatically penalizes all rivals during competitive learning. Consequently, the EMM not only includes the Rival Penalized Competitive Learning algorithm (Xu et al. 1993) and its Type A form (Xu 1997), together with the new variants developed here, but also provides a better alternative for optimizing the EM contrast function with at least two advantages: (1) faster learning of the model parameters, and (2) automatic model-complexity selection. We present the general learning procedure of the EMM and demonstrate its outstanding performance in comparison with the EM.
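To make the rival-penalization mechanism mentioned above concrete, the sketch below illustrates one RPCL-style competitive update in Python. It is a minimal illustration under stated assumptions, not the EMM learning procedure of this paper: the function name `rpcl_step`, the learning rates `lr_win` and `lr_rival`, and the toy data are all hypothetical choices for demonstration.

```python
import numpy as np

def rpcl_step(x, centers, counts, lr_win=0.05, lr_rival=0.002):
    """One rival-penalized competitive update (illustrative sketch only).

    The winning unit is pulled toward the input x while the rival
    (second-best unit) is pushed away, so superfluous centers are
    gradually driven out of the data -- the mechanism credited with
    automatic model-complexity selection.
    The learning rates lr_win and lr_rival are assumed values.
    """
    # Frequency-weighted distances ("conscience" mechanism of RPCL).
    gamma = counts / counts.sum()
    d = gamma * np.sum((centers - x) ** 2, axis=1)
    winner, rival = np.argsort(d)[:2]

    centers[winner] += lr_win * (x - centers[winner])    # attract the winner
    centers[rival] -= lr_rival * (x - centers[rival])    # repel the rival
    counts[winner] += 1
    return centers, counts

# Toy usage: three initial centers on two-cluster data; the extra center
# is expected to be pushed away from the data over the epochs.
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0, 0.3, (200, 2)), rng.normal(3, 0.3, (200, 2))])
centers = rng.normal(1.5, 1.0, (3, 2))
counts = np.ones(3)
for epoch in range(20):
    for x in rng.permutation(data):
        centers, counts = rpcl_step(x, centers, counts)
print(centers)
```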