Compared with Singular Value Decomposition (SVD), Generalized Low Rank Approximations of Matrices (GLRAM) requires less computation time, achieves a higher compression ratio, and yields competitive classification performance. GLRAM has been successfully applied in applications such as image compression and retrieval, and a number of extensions have subsequently been proposed. However, in the literature, some basic properties and crucial problems regarding GLRAM have not yet been explored or solved. To this end, we revisit GLRAM in this paper. First, we reveal a close relationship between GLRAM and SVD: GLRAM's objective function is identical to SVD's except for the imposed constraints. Second, we derive a lower bound on GLRAM's objective function, and discuss when the lower bound can be attained. Moreover, from the viewpoint of minimizing the lower bound, we answer an open problem raised by Ye (Machine Learning, 2005), i.e., a theoretical justifica...
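For concreteness, a sketch of the connection claimed above, stated in the standard GLRAM notation of Ye (2005) (data matrices $A_i$, left and right transformations $L$ and $R$, and core matrices $M_i$); the Kronecker-product rewriting is a standard vectorization identity, not a quotation from the paper:

```latex
% GLRAM (Ye, 2005): given data matrices A_i in R^{r x c}, find
% column-orthonormal L in R^{r x l1}, R in R^{c x l2}, and cores M_i
% minimizing the total reconstruction error
\min_{L^{\top}L = I,\; R^{\top}R = I,\; \{M_i\}}
  \;\sum_{i=1}^{n} \bigl\| A_i - L M_i R^{\top} \bigr\|_F^2 .

% Vectorizing via the identity vec(L M_i R^T) = (R \otimes L) vec(M_i):
\min \;\sum_{i=1}^{n}
  \bigl\| \operatorname{vec}(A_i) - (R \otimes L)\,\operatorname{vec}(M_i) \bigr\|_2^2 ,

% which is the SVD low-rank approximation objective on the vectorized
% data, with the added constraint that the projection matrix factorize
% as the Kronecker product R \otimes L.
```

In other words, both methods minimize the same Frobenius-norm reconstruction error; GLRAM restricts the projection to a Kronecker-structured subspace, which is what makes it cheaper than a full SVD on the vectorized data.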