Dimensionality reduction involves mapping a set of high-dimensional input points onto a low-dimensional manifold so that "similar" points in input space are mapped to ne...
Multi-instance learning deals with problems that treat bags of instances as training examples. In single-instance learning problems, dimensionality reduction is an essential step ...
PCA-SIFT is an extension of SIFT which aims to reduce SIFT’s high dimensionality (128 dimensions) by applying PCA to the gradient image patches. However, PCA is not a discriminati...
Dimension reduction is popular for learning predictive models in high-dimensional spaces. It can highlight the relevant part of the feature space and avoid the curse of dimensiona...
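The dimension-reduction step described above can be sketched with plain PCA via the singular value decomposition; this is a minimal illustration assuming NumPy, and the function name `pca_reduce` is ours, not taken from any of the cited works:

```python
import numpy as np

def pca_reduce(X, k):
    """Project the rows of X onto the top-k principal components."""
    Xc = X - X.mean(axis=0)                      # center each feature
    # SVD of the centered data; rows of Vt are the principal axes,
    # ordered by decreasing singular value (explained variance)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                         # shape (n_samples, k)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 128))                  # e.g. 128-D descriptors
Z = pca_reduce(X, 20)
print(Z.shape)                                   # (200, 20)
```

A predictive model would then be trained on `Z` instead of `X`; because the components are variance-ordered, the first coordinate of `Z` captures at least as much variance as the last.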
Many classes of image data span a low-dimensional nonlinear space embedded in the natural high-dimensional image space. We adopt and generalize a recently proposed dimensionality ...