Sciweavers

166 search results - page 17 / 34
» Learning a kernel matrix for nonlinear dimensionality reduct...
AAAI
2012
Sparse Probabilistic Relational Projection
Probabilistic relational PCA (PRPCA) can learn a projection matrix to perform dimensionality reduction for relational data. However, the results learned by PRPCA lack interpretabi...
Wu-Jun Li, Dit-Yan Yeung
CVPR
2007
IEEE
Trace Ratio vs. Ratio Trace for Dimensionality Reduction
A large family of algorithms for dimensionality reduction ends with solving a Trace Ratio problem of the form arg max_W Tr(Wᵀ S_p W) / Tr(Wᵀ S_l W), which is generally transformed in...
Huan Wang, Shuicheng Yan, Dong Xu, Xiaoou Tang, Th...
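The Trace Ratio objective in the abstract above can be evaluated directly; a minimal NumPy sketch with synthetic scatter matrices (the names `Sp`, `Sl` follow the formula, but the data and the eigen-based projection are illustrative, not the paper's algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 5, 2                                  # ambient and target dimensions (illustrative)

# Two synthetic symmetric positive (semi)definite "scatter" matrices
A = rng.standard_normal((d, d)); Sp = A @ A.T
B = rng.standard_normal((d, d)); Sl = B @ B.T + np.eye(d)   # keep Sl invertible

def trace_ratio(W, Sp, Sl):
    """Trace Ratio objective: Tr(W^T Sp W) / Tr(W^T Sl W)."""
    return np.trace(W.T @ Sp @ W) / np.trace(W.T @ Sl @ W)

# The classical "Ratio Trace" relaxation the abstract contrasts with:
# take the top-k eigenvectors of Sl^{-1} Sp
vals, vecs = np.linalg.eig(np.linalg.solve(Sl, Sp))
W = vecs.real[:, np.argsort(-vals.real)[:k]]
print(trace_ratio(W, Sp, Sl))
```

The Ratio Trace solution is what a generalized eigendecomposition yields; the paper's point is that it does not, in general, maximize the Trace Ratio itself.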
PR
2008
A comparison of generalized linear discriminant analysis algorithms
Linear discriminant analysis (LDA) is a dimension reduction method which finds an optimal linear transformation that maximizes the class separability. However, in undersampled p...
Cheong Hee Park, Haesun Park
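The classical (non-generalized) LDA that the abstract builds on maximizes between-class scatter relative to within-class scatter; a minimal NumPy sketch on synthetic two-class data (this is textbook LDA, not the authors' generalized algorithms, and all sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
# Two Gaussian classes in 4 dimensions, separated along the first axis
X0 = rng.standard_normal((50, 4)) + np.array([2.0, 0, 0, 0])
X1 = rng.standard_normal((50, 4)) - np.array([2.0, 0, 0, 0])
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

mu = X.mean(axis=0)
Sw = np.zeros((4, 4)); Sb = np.zeros((4, 4))
for c in (0, 1):
    Xc = X[y == c]
    mc = Xc.mean(axis=0)
    Sw += (Xc - mc).T @ (Xc - mc)            # within-class scatter
    diff = (mc - mu).reshape(-1, 1)
    Sb += len(Xc) * (diff @ diff.T)          # between-class scatter

# Optimal direction: top eigenvector of Sw^{-1} Sb
vals, vecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
w = vecs.real[:, np.argmax(vals.real)]       # one discriminant for two classes
proj = X @ w
```

In the undersampled case the abstract mentions, `Sw` becomes singular and this direct inversion fails, which is exactly what the generalized LDA algorithms being compared address.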
CVPR
2008
IEEE
Parameterized Kernel Principal Component Analysis: Theory and applications to supervised and unsupervised image alignment
Parameterized Appearance Models (PAMs) (e.g. eigentracking, active appearance models, morphable models) use Principal Component Analysis (PCA) to model the shape and appearance of...
Fernando De la Torre, Minh Hoai Nguyen
CORR
2012
arXiv
Random Feature Maps for Dot Product Kernels
Approximating non-linear kernels using feature maps has gained a lot of interest in recent years due to applications in reducing training and testing times of SVM classifiers and...
Purushottam Kar, Harish Karnick
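One simple feature map for a dot product kernel, in the spirit of (though not identical to) the paper's construction: for the monomial kernel k(x, y) = (x·y)^p, products of independent Rademacher projections give an unbiased approximation. A sketch with illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(0)
d, D, p = 10, 5000, 2                        # input dim, feature count, degree

def random_monomial_features(X, D, p, rng):
    """Random features z with E[z(x) . z(y)] = (x . y)^p.

    Each feature is a product of p independent Rademacher projections,
    since E[(w . x)(w . y)] = x . y when the entries of w are +/-1.
    """
    n = X.shape[0]
    Z = np.ones((n, D))
    for _ in range(p):
        W = rng.choice([-1.0, 1.0], size=(D, X.shape[1]))  # Rademacher matrix
        Z *= X @ W.T
    return Z / np.sqrt(D)

x = rng.standard_normal(d)
y = rng.standard_normal(d)
Z = random_monomial_features(np.vstack([x, y]), D, p, rng)
exact = (x @ y) ** p                         # true kernel value
approx = Z[0] @ Z[1]                         # feature-space estimate of it
```

With such explicit features, a linear SVM trained on z(x) approximates a kernel SVM, which is the training/testing speedup the abstract refers to.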