Nonnegative Matrix and Tensor Factorization (NMF/NTF) and Sparse Component Analysis (SCA) have many potential applications, especially in multi-way Blind Source Separation (BSS), multi-dimensional data analysis, model reduction, and sparse signal/image representations. In this paper we propose a family of modified Regularized Alternating Least Squares (RALS) algorithms for NMF/NTF. By incorporating regularization and penalty terms into the weighted Frobenius norm, we are able to achieve sparse and/or smooth representations of the desired solution and to alleviate the problem of getting stuck in local minima. We implemented the RALS algorithms in our NMFLAB/NTFLAB Matlab Toolboxes and compared them with standard NMF algorithms. The proposed algorithms are characterized by improved efficiency and convergence properties, especially for large-scale problems.
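To illustrate the general idea of regularized alternating least squares for NMF, the following is a minimal Python sketch, not the authors' exact RALS formulation: it assumes plain Tikhonov (Frobenius-norm) penalties on both factors, an unweighted Frobenius data-fitting term, and nonnegativity enforced by simple projection after each least-squares solve; all function and parameter names are illustrative.

```python
import numpy as np

def rals_nmf(Y, rank, alpha_A=0.1, alpha_X=0.1, n_iter=200, eps=1e-9, seed=0):
    """Illustrative regularized ALS for NMF.

    Minimizes 0.5*||Y - A X||_F^2 + 0.5*alpha_A*||A||_F^2 + 0.5*alpha_X*||X||_F^2
    with A, X >= 0 (nonnegativity imposed by projection onto the nonnegative orthant).
    This is a generic sketch, not the algorithm proposed in the paper.
    """
    rng = np.random.default_rng(seed)
    m, n = Y.shape
    A = rng.random((m, rank))
    X = rng.random((rank, n))
    I = np.eye(rank)
    for _ in range(n_iter):
        # Regularized least-squares update for X: (A^T A + alpha_X I) X = A^T Y
        X = np.linalg.solve(A.T @ A + alpha_X * I, A.T @ Y)
        X = np.maximum(X, eps)          # project onto nonnegative values
        # Regularized least-squares update for A: (X X^T + alpha_A I) A^T = X Y^T
        A = np.linalg.solve(X @ X.T + alpha_A * I, X @ Y.T).T
        A = np.maximum(A, eps)          # project onto nonnegative values
    return A, X

if __name__ == "__main__":
    # Small synthetic example: factor a random nonnegative 50x40 matrix with rank 5
    rng = np.random.default_rng(1)
    Y = rng.random((50, 5)) @ rng.random((5, 40))
    A, X = rals_nmf(Y, rank=5)
    print("relative fit error:", np.linalg.norm(Y - A @ X) / np.linalg.norm(Y))
```

The regularization terms keep the per-iteration normal equations well conditioned, which is one way regularized ALS can avoid the numerical instabilities of plain ALS; the sparsity- and smoothness-inducing penalties described in the paper would replace or augment the simple Frobenius penalties used here.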