Support Vector Machines (SVMs) are among the most useful
techniques for classification problems; face recognition is
one clear example. However, SVM cannot be applied
when the feature vectors defining our samples have missing
entries. This is clearly the case in face recognition when
occlusions are present in the training and/or testing sets.
When k features are missing from a sample vector of class 1,
the missing entries define a k-dimensional affine subspace of
possible completions of that sample. The goal
of the SVM is to maximize the margin between the vectors
of class 1 and class 2 on those dimensions with no missing
elements and, at the same time, maximize the margin between
the vectors in class 2 and the affine subspace of class
1. This second term of the SVM criterion will minimize the
overlap between the classification hyperplane and the subspace
of solutions in class 1, because we do not know which
values in this subspace a test vector can take. The hyperplane
minimizing this overlap is the one parallel to the affine
subspace of class 1.
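
To make the geometry behind this last claim explicit, consider the following brief sketch; the notation is illustrative and self-contained, not necessarily that used in the body of the paper. Write $O$ and $M$ for the index sets of the observed and the $k$ missing entries of a class-1 sample $\mathbf{x} \in \mathbb{R}^d$, and let $\hat{\mathbf{x}}$ agree with $\mathbf{x}$ on $O$ and be zero on $M$. The sample then defines the affine subspace

\[
\mathcal{A}(\mathbf{x}) \;=\; \Bigl\{\, \hat{\mathbf{x}} + \sum_{j \in M} t_j \mathbf{e}_j \;:\; t_j \in \mathbb{R} \,\Bigr\},
\]

where the $\mathbf{e}_j$ are the canonical basis vectors. Its distance to a hyperplane $H = \{\mathbf{z} : \mathbf{w}^\top \mathbf{z} + b = 0\}$ is

\[
d\bigl(\mathcal{A}(\mathbf{x}), H\bigr)
\;=\; \min_{\mathbf{z} \in \mathcal{A}(\mathbf{x})} \frac{\lvert \mathbf{w}^\top \mathbf{z} + b \rvert}{\lVert \mathbf{w} \rVert_2}
\;=\;
\begin{cases}
\dfrac{\bigl\lvert \sum_{i \in O} w_i x_i + b \bigr\rvert}{\lVert \mathbf{w} \rVert_2}, & \text{if } w_j = 0 \text{ for all } j \in M,\\[2ex]
0, & \text{otherwise,}
\end{cases}
\]

because any nonzero $w_j$ with $j \in M$ lets the free parameter $t_j$ drive $\mathbf{w}^\top \mathbf{z} + b$ to zero, i.e., the hyperplane intersects the subspace. A positive margin to the affine subspace therefore forces $w_j = 0$ on the missing dimensions, so the optimal hyperplane must be parallel to the affine subspace, as stated above.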
Aleix M. Martínez, Hongjun Jia