We analyze the generalization performance of a student in a model composed of linear perceptrons: a true teacher, K ensemble teachers, and the student. Calculating the student's generalization error analytically using statistical mechanics in the framework of online learning, we prove that when the learning rate satisfies η < 1, the larger the number K of ensemble teachers and the greater their variety, the smaller the generalization error. When η > 1, these properties are completely reversed. If the variety of the ensemble teachers is rich enough, the direction cosine between the true teacher and the student approaches unity in the limits η → 0 and K → ∞.
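As a rough illustration of the setup described above, the following toy simulation (a sketch, not the paper's analytical calculation) trains a linear-perceptron student online against noisy "ensemble teachers" built around a true teacher, and measures the direction cosine between student and true teacher. The specific constants here, such as the noise amplitude 0.5, the update rule scaling, and the function name `train`, are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000          # input dimension (large-N regime)
K = 10            # number of ensemble teachers
steps = 20 * N    # number of online updates

# True teacher: random connection weights with O(1) components.
B = rng.standard_normal(N)

# Ensemble teachers: noisy copies of the true teacher.
# (Hypothetical construction; the noise level 0.5 sets their "variety".)
teachers = [B + 0.5 * rng.standard_normal(N) for _ in range(K)]

def train(eta):
    """Online gradient-descent learning of the student J with learning rate eta."""
    J = np.zeros(N)
    for t in range(steps):
        x = rng.standard_normal(N)        # fresh random input each step
        Bk = teachers[t % K]              # cycle through the ensemble teachers
        v = Bk @ x / np.sqrt(N)           # teacher output (linear perceptron)
        u = J @ x / np.sqrt(N)            # student output
        J += (eta / np.sqrt(N)) * (v - u) * x   # squared-error gradient step
    # Direction cosine between student and the TRUE teacher.
    return J @ B / (np.linalg.norm(J) * np.linalg.norm(B))

for eta in (0.3, 1.5):
    print(f"eta={eta}: direction cosine = {train(eta):.3f}")
```

With a small learning rate the student averages over the ensemble teachers' noise and aligns closely with the true teacher; larger η (still below the stability limit) leaves larger fluctuations, consistent with the η < 1 versus η > 1 contrast stated in the abstract.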