This paper studies the performance of the linear minimum mean-square error (LMMSE) receiver for (receive) correlated multiple-input multiple-output (MIMO) systems. From random matrix theory, it is well known that the signal-to-noise ratio (SNR) at the output of this receiver behaves asymptotically like a Gaussian random variable as the numbers of receive and transmit antennas converge to +∞ at the same rate. However, this Gaussian approximation is inaccurate for estimating certain performance metrics, such as the bit error rate (BER) and the outage probability, especially for small system dimensions. Li et al. therefore proposed, convincingly, to assume that the SNR follows a generalized gamma distribution whose parameters are tuned by computing the first three asymptotic moments of the SNR. In this paper, this technique is generalized to (receive) correlated channels, and closed-form expressions for the first three asymptotic moments of the SNR are provided. To obtain these resul...
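As an illustrative sketch of the moment-matching idea described above, the snippet below Monte-Carlo simulates the LMMSE output SNR for a receive-correlated MIMO channel and then fits a generalized gamma distribution by matching the first three empirical moments. The antenna dimensions, the exponential correlation model, and the crude grid-search fit are assumptions made here for illustration; the paper itself derives closed-form asymptotic moments rather than estimating them empirically.

```python
import numpy as np
from math import lgamma, exp

rng = np.random.default_rng(0)

# Assumed system dimensions and operating point (illustrative only)
N, K = 8, 8                       # receive antennas, transmit streams
snr_db = 10.0
sigma2 = 10 ** (-snr_db / 10)     # noise variance

# Exponential receive-correlation matrix R (a common modeling assumption)
rho = 0.5
R = rho ** np.abs(np.subtract.outer(np.arange(N), np.arange(N)))
R_sqrt = np.linalg.cholesky(R)

def lmmse_snr_samples(n_trials=4000):
    """Output SNR of stream 0 under the LMMSE receiver, one sample per channel draw."""
    out = np.empty(n_trials)
    for t in range(n_trials):
        # i.i.d. CN(0, 1/K) entries, then receive correlation applied
        W = (rng.standard_normal((N, K)) + 1j * rng.standard_normal((N, K))) / np.sqrt(2 * K)
        H = R_sqrt @ W
        h0 = H[:, 0]                  # desired stream
        Hbar = H[:, 1:]               # interfering streams
        Q = Hbar @ Hbar.conj().T + sigma2 * np.eye(N)
        # SNR_0 = h0^H (Hbar Hbar^H + sigma^2 I)^{-1} h0
        out[t] = np.real(h0.conj() @ np.linalg.solve(Q, h0))
    return out

snr = lmmse_snr_samples()
m1, m2, m3 = (np.mean(snr ** k) for k in (1, 2, 3))

def gg_moment(k, a, d, p):
    """k-th moment of the generalized gamma: a^k * Gamma((d+k)/p) / Gamma(d/p)."""
    return a ** k * exp(lgamma((d + k) / p) - lgamma(d / p))

# Fit (a, d, p): a matches m1 exactly; (d, p) found by a coarse grid search
# minimizing the relative mismatch of the second and third moments.
best = None
for d in np.linspace(0.5, 30.0, 120):
    for p in np.linspace(0.3, 4.0, 75):
        a = m1 / exp(lgamma((d + 1) / p) - lgamma(d / p))
        err = (abs(gg_moment(2, a, d, p) / m2 - 1)
               + abs(gg_moment(3, a, d, p) / m3 - 1))
        if best is None or err < best[0]:
            best = (err, a, d, p)
err, a, d, p = best
```

A practical use of the fitted parameters (a, d, p) is to evaluate BER or outage probability by integrating against the generalized gamma density instead of the less accurate Gaussian approximation.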