For fast classification under real-time constraints, as required in many image-based pattern recognition applications, linear discriminant functions are a good choice. Linear discriminant analysis (LDA) computes such discriminant functions in a space spanned by real-valued features extracted from the input. The accuracy of the trained classifier depends crucially on these features, and its time complexity on their number. As the number of available features is immense in most real-world problems, it becomes essential to use meta-heuristics for feature selection and/or feature optimization. These methods typically involve iterated retraining of a classifier after substitutions or modifications of features. We therefore derive an efficient incremental update formula for LDA discriminant functions under feature substitution. It scales linearly in the number of altered features and quadratically in the overall number of features, whereas complete retraining scales cubically in the number of features.
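To make the cubic baseline concrete, the following is a minimal sketch (not the paper's incremental method) of full LDA retraining for a two-class problem: the dominant cost is solving a d x d linear system in the pooled within-class scatter matrix, which is O(d^3) in the number of features d. All names here are illustrative.

```python
import numpy as np

def lda_discriminant(X, y):
    """Train a two-class LDA discriminant function by full retraining.

    The d x d solve below is the O(d^3) step that must be repeated
    after every feature substitution unless an incremental update
    formula is used.
    """
    X0, X1 = X[y == 0], X[y == 1]
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled within-class scatter matrix (d x d).
    Sw = (X0 - mu0).T @ (X0 - mu0) + (X1 - mu1).T @ (X1 - mu1)
    # Discriminant direction w = Sw^{-1} (mu1 - mu0): the O(d^3) solve.
    w = np.linalg.solve(Sw, mu1 - mu0)
    # Bias places the decision threshold at the midpoint of the
    # projected class means.
    b = -0.5 * w @ (mu0 + mu1)
    return w, b

def classify(X, w, b):
    """Linear discriminant function: sign of w.x + b decides the class."""
    return (X @ w + b > 0).astype(int)
```

A meta-heuristic for feature selection would call `lda_discriminant` anew after every candidate substitution; the incremental update replaces exactly this repeated full solve.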