Similarity search usually runs into a serious problem in high-dimensional spaces, known as the “curse of dimensionality”. In order to improve retrieval efficiency, p...
Dimensionality reduction plays a fundamental role in data processing, for which principal component analysis (PCA) is widely used. In this paper, we develop the Laplacian PCA (LPC...
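For reference, below is a minimal sketch of the standard PCA baseline mentioned above (plain NumPy, eigendecomposition of the sample covariance); the Laplacian-weighted variant itself is not reproduced here, and the function name pca_project is purely illustrative.

import numpy as np

def pca_project(X, k):
    # Project rows of X (n_samples x n_features) onto the top-k principal components.
    Xc = X - X.mean(axis=0)                          # center the data
    cov = Xc.T @ Xc / (len(Xc) - 1)                  # sample covariance
    eigvals, eigvecs = np.linalg.eigh(cov)           # eigenvalues in ascending order
    top = eigvecs[:, np.argsort(eigvals)[::-1][:k]]  # top-k eigenvectors
    return Xc @ top                                  # k-dimensional coordinates

# Example: reduce 50-dimensional points to 2 dimensions.
Z = pca_project(np.random.randn(200, 50), k=2)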
This paper presents a new approach to feature analysis in automatic speech recognition (ASR) based on locality preserving projections (LPP). LPP is a manifold-based dimensionality...
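A minimal sketch of the usual LPP recipe, assuming a k-nearest-neighbor adjacency graph and the standard generalized eigenproblem formulation; the function name lpp, the neighborhood size, and the regularization constant are illustrative choices, not taken from the paper.

import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import kneighbors_graph

def lpp(X, k_dim=2, n_neighbors=5):
    # Linear projection that preserves a k-nearest-neighbor adjacency graph.
    W = kneighbors_graph(X, n_neighbors, mode='connectivity', include_self=False)
    W = 0.5 * (W + W.T).toarray()                # symmetrized adjacency weights
    D = np.diag(W.sum(axis=1))                   # degree matrix
    L = D - W                                    # graph Laplacian
    A = X.T @ L @ X                              # numerator of the LPP objective
    B = X.T @ D @ X + 1e-6 * np.eye(X.shape[1])  # regularized denominator
    evals, evecs = eigh(A, B)                    # generalized eigenvalues, ascending
    return evecs[:, :k_dim]                      # projection directions

# Usage: Z = X @ lpp(X) gives the low-dimensional features.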
Nonlinear dimensionality reduction (NLDR) algorithms such as Isomap, LLE and Laplacian Eigenmaps address the problem of representing high-dimensional nonlinear data in terms of lo...
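For concreteness, the three NLDR algorithms named above have off-the-shelf implementations in scikit-learn; the snippet below only shows how their embeddings are obtained and does not reproduce whatever extension the paper itself proposes.

import numpy as np
from sklearn.manifold import Isomap, LocallyLinearEmbedding, SpectralEmbedding

X = np.random.randn(500, 20)      # stand-in for high-dimensional data

# Each call returns 2-D nonlinear coordinates for the 20-D points.
Z_isomap = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
Z_lle    = LocallyLinearEmbedding(n_neighbors=10, n_components=2).fit_transform(X)
Z_lapeig = SpectralEmbedding(n_components=2, n_neighbors=10).fit_transform(X)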
Marwan Torki is a Ph.D. student in the Department of Computer Science at Rutgers University. He received his M.Sc. and B.Sc. in computer science from the Department of Computer Science, Faculty ...
Due to the curse of dimensionality, high-dimensional data is often pre-processed with some form of dimensionality reduction for the classification task. Many common methods of su...
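One common instance of that pre-processing pattern is sketched below: a supervised linear reduction (LDA) followed by a simple classifier. This is only an illustrative pipeline built with scikit-learn, not one of the methods the abstract surveys, and the toy data shapes are arbitrary.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

# Toy data: 100-dimensional points with 3 class labels.
X = np.random.randn(300, 100)
y = np.random.randint(0, 3, size=300)

# Supervised reduction to (n_classes - 1) dimensions, then a simple classifier.
clf = make_pipeline(LinearDiscriminantAnalysis(n_components=2),
                    KNeighborsClassifier(n_neighbors=5))
clf.fit(X, y)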
We introduce a parametric version (pDRUR) of the recently proposed Dimensionality Reduction by Unsupervised Regression algorithm. pDRUR alternately minimizes reconstruction error ...
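As a rough illustration of the alternating reconstruction-error minimization described above, here is a toy linear version: latent coordinates Z, a decoder F (Z -> X), and an encoder f (X -> Z) are updated in turn. It is a sketch of the alternation only, not the pDRUR algorithm; the function name, the weight lam, and the linear mappings are assumptions made for brevity.

import numpy as np

def alternating_linear_dr(X, k=2, lam=1.0, n_iter=100, seed=0):
    # Alternately reduce ||X - Z F||^2 + lam * ||Z - X f||^2 over Z, F and f.
    rng = np.random.default_rng(seed)
    Xc = X - X.mean(axis=0)
    Z = rng.standard_normal((len(Xc), k))
    for _ in range(n_iter):
        F, *_ = np.linalg.lstsq(Z, Xc, rcond=None)    # decoder weights (k x d)
        f, *_ = np.linalg.lstsq(Xc, Z, rcond=None)    # encoder weights (d x k)
        # Closed-form update of Z with F and f held fixed.
        Z = (Xc @ F.T + lam * (Xc @ f)) @ np.linalg.inv(F @ F.T + lam * np.eye(k))
    return Z, f, F

Z, f, F = alternating_linear_dr(np.random.randn(200, 30))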
When classifying high-dimensional sequence data, traditional methods (e.g., HMMs, CRFs) may require large amounts of training data to avoid overfitting. In such cases, dimensional...
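A hedged illustration of that preprocessing pattern, assuming PCA for the per-frame reduction and the hmmlearn package for the sequence model; neither choice, nor any of the sizes below, comes from the abstract.

import numpy as np
from sklearn.decomposition import PCA
from hmmlearn.hmm import GaussianHMM

# Toy data: 10 sequences of 100 frames, each frame 200-dimensional.
frames = np.random.randn(1000, 200)
lengths = [100] * 10

# Reduce every frame to 5 dimensions so the sequence model has far
# fewer Gaussian parameters to estimate from the same training data.
reduced = PCA(n_components=5).fit_transform(frames)
hmm = GaussianHMM(n_components=3, covariance_type='diag').fit(reduced, lengths)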
The dimensionality curse has profound effects on the performance of high-dimensional similarity indexing. One of the well-known techniques for im...
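Below is a crude filter-and-refine sketch in the spirit of vector-approximation indexing: each coordinate is quantized into a few bits, the small codes are scanned to pick candidates, and exact distances are computed only for those candidates. It is illustrative only and omits the per-cell lower/upper distance bounds a real approximation file uses to guarantee exact results; all names and parameters are assumptions.

import numpy as np

def build_codes(X, bits_per_dim=4):
    # Quantize every coordinate into 2**bits_per_dim uniform cells; the compact
    # cell codes are what gets scanned at query time instead of the full vectors.
    lo, hi = X.min(axis=0), X.max(axis=0)
    cells = (2 ** bits_per_dim - 1) * (X - lo) / (hi - lo + 1e-12)
    return cells.astype(np.uint8), lo, hi

def knn_filter_and_refine(X, codes, lo, hi, q, k=5, bits_per_dim=4, n_candidates=50):
    # Filter: rank points by distance in the coarse cell grid (cheap scan of codes).
    # Refine: compute exact distances only for the surviving candidates.
    scale = (hi - lo + 1e-12) / (2 ** bits_per_dim - 1)
    q_cells = (q - lo) / scale
    approx = np.linalg.norm(codes - q_cells, axis=1)
    cand = np.argsort(approx)[:n_candidates]
    exact = np.linalg.norm(X[cand] - q, axis=1)
    return cand[np.argsort(exact)[:k]]

X = np.random.rand(10000, 32)
codes, lo, hi = build_codes(X)
neighbors = knn_filter_and_refine(X, codes, lo, hi, q=np.random.rand(32))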