The HMM (Hidden Markov Model) is a probabilistic model of the joint probability of a collection of random variables comprising both observations and hidden states. The GMM (Gaussian Mixture Model) is a finite mixture probability distribution model. Although the two models are closely related, they are usually discussed independently and separately. The EM (Expectation-Maximization) algorithm is a general iterative method for finding maximum likelihood estimates in models with latent variables. The EM formulas for HMM and for GMM have a similar form. This paper makes two points. One is that the EM algorithm for GMM can be regarded as a special case of the EM algorithm for HMM. The other is that a symbol-based EM algorithm for GMM is faster in implementation than the traditional sample-based (observation-based) EM algorithm for GMM.
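To make the second point concrete, the sketch below is a minimal, illustrative comparison, not the paper's implementation. It assumes that "symbol-based" means grouping repeated observation values (symbols) and weighting the responsibilities by their counts, so the E-step runs over S distinct symbols instead of N samples; all function names and the 1-D Gaussian mixture setup are this sketch's own assumptions.

```python
# A minimal sketch (assumed, not the paper's code) of one EM iteration for a
# 1-D GMM: a per-sample update versus a per-symbol update in which repeated
# observation values are counted once and weighted by their multiplicity.
import numpy as np


def em_step_samples(x, pi, mu, sigma):
    """One EM iteration over every sample (the traditional formulation)."""
    # E-step: responsibility of each component for each sample, shape (N, K)
    dens = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    resp = pi * dens
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means, and standard deviations
    nk = resp.sum(axis=0)
    pi_new = nk / len(x)
    mu_new = (resp * x[:, None]).sum(axis=0) / nk
    sigma_new = np.sqrt((resp * (x[:, None] - mu_new) ** 2).sum(axis=0) / nk)
    return pi_new, mu_new, sigma_new


def em_step_symbols(symbols, counts, pi, mu, sigma):
    """The same update, taken over distinct symbols weighted by their counts."""
    dens = np.exp(-0.5 * ((symbols[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    resp = pi * dens
    resp /= resp.sum(axis=1, keepdims=True)
    w = counts[:, None] * resp                      # symbol counts fold in here
    nk = w.sum(axis=0)
    pi_new = nk / counts.sum()
    mu_new = (w * symbols[:, None]).sum(axis=0) / nk
    sigma_new = np.sqrt((w * (symbols[:, None] - mu_new) ** 2).sum(axis=0) / nk)
    return pi_new, mu_new, sigma_new


# Both routines give the same estimates (up to floating-point summation order);
# the symbol version does the E-step over S distinct values instead of N samples.
x = np.random.default_rng(0).choice(np.arange(10.0), size=10_000)
symbols, counts = np.unique(x, return_counts=True)
pi, mu, sigma = np.ones(2) / 2, np.array([2.0, 7.0]), np.ones(2)
print(em_step_samples(x, pi, mu, sigma))
print(em_step_symbols(symbols, counts, pi, mu, sigma))
```

Under this reading, the per-iteration cost of the E-step drops from O(NK) to O(SK), which is where the claimed implementation speedup comes from when the symbol alphabet is much smaller than the sample size.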