Abstract--Numerous methods have been proposed to discover latent variables (features) in data sets. Such approaches include principal component analysis (PCA), independent component analysis (ICA), and factor analysis (FA), to mention only a few. A recently investigated approach to decomposing a data set of a given dimensionality into a lower-dimensional space is the so-called nonnegative matrix factorization (NMF). Its only requirement is that both decomposition factors be nonnegative. To approximate the original data, the NMF objective function is minimized in Euclidean space, where the difference between the original data and the product of the factors is measured by the 2-norm. In this paper, we propose a generalization of the NMF algorithm by translating the objective function into a Hilbert space (also called feature space) under nonnegativity constraints. With the help of kernel functions, we develop an approach that allows high-order depen...
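For orientation, the baseline Euclidean NMF described above minimizes ||V - WH||^2 subject to W, H >= 0, and is commonly solved with multiplicative update rules. The sketch below illustrates only this standard formulation, not the kernel-based generalization proposed in the paper; the function name, random initialization, rank, and iteration count are illustrative assumptions.

```python
import numpy as np

def nmf_euclidean(V, r, n_iter=200, eps=1e-9):
    """Baseline NMF via multiplicative updates (Lee-Seung style),
    minimizing ||V - W H||_F^2 with nonnegative factors W and H."""
    n, m = V.shape
    rng = np.random.default_rng(0)
    W = rng.random((n, r))          # nonnegative random initialization
    H = rng.random((r, m))
    for _ in range(n_iter):
        # Multiplicative updates preserve nonnegativity at every step.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Usage: factor a small nonnegative data matrix of rank 5.
V = np.abs(np.random.default_rng(1).random((20, 30)))
W, H = nmf_euclidean(V, r=5)
print(np.linalg.norm(V - W @ H))   # reconstruction error under the 2-norm
```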