In this paper, we propose a Multi-View Expectation Maximization (EM) algorithm for finite mixture models to handle real-world learning problems that have natural feature splits. Like Co-training and Co-EM, Multi-View EM splits the features into views, but it treats multi-view learning within the EM framework. Compared with other algorithms in the Co-training setting, the proposed algorithm has several appealing advantages: it can be applied to both unsupervised and semi-supervised learning tasks; it easily handles problems with more than two views; it can simultaneously use different classifiers and different optimization criteria, such as ML and MAP, in different views; and its convergence is theoretically guaranteed. Experiments on synthetic data, the USPS data, and the WebKB data demonstrate that Multi-View EM performs well compared with Co-EM, Co-training, and standard EM.
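The abstract only summarizes the algorithm at a high level. As a rough illustration of the underlying idea of running EM per view while exchanging expectations across views, the following is a minimal sketch assuming Gaussian mixture components and exactly two views; the function names, the swap of responsibilities between views, and the consensus step are illustrative assumptions for this sketch, not the paper's exact specification.

```python
# Sketch (not the authors' reference implementation) of a two-view EM loop
# for Gaussian mixtures: each view keeps its own mixture over its feature
# split, and the M-step of one view reuses the responsibilities produced by
# the E-step of the other view. Names and details are illustrative.
import numpy as np
from scipy.stats import multivariate_normal


def e_step(X, weights, means, covs):
    """Responsibilities p(z = k | x) under one view's Gaussian mixture."""
    K = len(weights)
    dens = np.column_stack([
        weights[k] * multivariate_normal.pdf(X, means[k], covs[k])
        for k in range(K)
    ])
    return dens / dens.sum(axis=1, keepdims=True)


def m_step(X, resp):
    """Re-estimate one view's parameters from (possibly exchanged) responsibilities."""
    N, K = resp.shape
    Nk = resp.sum(axis=0)
    weights = Nk / N
    means = (resp.T @ X) / Nk[:, None]
    covs = []
    for k in range(K):
        diff = X - means[k]
        covs.append((resp[:, k, None] * diff).T @ diff / Nk[k]
                    + 1e-6 * np.eye(X.shape[1]))  # small ridge for stability
    return weights, means, covs


def multi_view_em(X1, X2, K=2, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    views = []
    for X in (X1, X2):
        idx = rng.choice(len(X), K, replace=False)
        views.append((np.full(K, 1.0 / K), X[idx].copy(),
                      [np.cov(X.T) + 1e-6 * np.eye(X.shape[1])] * K))
    for _ in range(iters):
        # E-step: compute responsibilities in each view with its own parameters.
        resp = [e_step(X, *theta) for X, theta in zip((X1, X2), views)]
        # M-step: update each view using the *other* view's responsibilities.
        views = [m_step(X1, resp[1]), m_step(X2, resp[0])]
    # Consensus cluster assignment: average the per-view responsibilities.
    resp = [e_step(X, *theta) for X, theta in zip((X1, X2), views)]
    return ((resp[0] + resp[1]) / 2).argmax(axis=1)


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Two clusters whose 4-D features are split into two 2-D views.
    X = np.vstack([rng.normal(0, 1, (100, 4)), rng.normal(4, 1, (100, 4))])
    labels = multi_view_em(X[:, :2], X[:, 2:], K=2)
    print(labels[:5], labels[-5:])
```

In this unsupervised setting each view is a full mixture model over its own feature subset; a semi-supervised variant would simply clamp the responsibilities of the labeled examples to their known labels before the M-step.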