In this paper, we present a novel approach to shape description based on kernel principal component analysis (KPCA). The strength of the method resides in the similarity (rotation, translation and, in particular, scale) invariance of KPCA when a family of triangular, conditionally positive definite kernels is used. Besides this invariance, the method provides an effective way to capture non-linearities in shape geometry. A given two-dimensional curve is described using the eigenvalues of the underlying manifold modeled in a high-dimensional Hilbert space. Using Fourier analysis, we show that this eigenvalue description captures shape variations ranging from low to high frequencies. Experiments conducted on standard databases, including the SQUID, Swedish, and Smithsonian leaf databases, show that the method effectively captures invariance and generalizes well for shape matching and retrieval.

Key words: Statistical Learning, Kernel Principal Component Analysis, Scale Invariance...
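As a rough illustration of the idea summarized above, the following is a minimal sketch of an eigenvalue-based contour descriptor obtained by kernel PCA. It assumes a triangular, conditionally positive definite kernel of the form k(x, y) = -||x - y||^beta; the function names, the default beta, the normalization by the number of points, and the toy circle contour are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def triangular_kernel(points, beta=1.0):
    # Pairwise Gram matrix k(x, y) = -||x - y||**beta, a conditionally
    # positive definite "triangular" kernel (beta is an assumed parameter).
    diff = points[:, None, :] - points[None, :, :]
    return -np.linalg.norm(diff, axis=2) ** beta

def kpca_eigenvalue_descriptor(points, beta=1.0, n_components=10):
    """Describe a 2D contour by the leading kernel-PCA eigenvalues."""
    n = len(points)
    K = triangular_kernel(points, beta)
    # Double-center the Gram matrix (centering in the implicit feature space).
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H
    # Sort eigenvalues in decreasing order; the spectrum serves as the descriptor.
    eigvals = np.linalg.eigvalsh(Kc)[::-1]
    return eigvals[:n_components] / n

# Toy usage: 100 points sampled on a circle.
t = np.linspace(0.0, 2.0 * np.pi, 100, endpoint=False)
circle = np.stack([np.cos(t), np.sin(t)], axis=1)
print(kpca_eigenvalue_descriptor(circle))
```

Comparing such eigenvalue vectors (e.g. with a Euclidean or chi-square distance) would be one way to use the descriptor for matching and retrieval, under the assumptions stated above.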