This work presents a Log-stable model for the block variance of natural images. Exponential and half-normal distributions have previously been used to model block variance, but they were fitted to images for which the assumption of constant intra-block variance does not hold. We show that when this assumption holds, the Log-stable model yields a much better fit in the maximum-likelihood (ML) sense. We estimate the Log-stable parameters with a computationally efficient method based on the empirical Kullback-Leibler divergence, which is asymptotically optimal in the ML sense, and we show that the lognormal distribution is a valid approximation that admits closed-form formulas for ML parameter estimation.
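For reference on the last point, the closed-form lognormal ML estimates are the standard ones; a minimal sketch, with $v_1,\dots,v_n$ denoting the observed block variances (this notation is ours, not the paper's):

\[
\hat{\mu} = \frac{1}{n}\sum_{i=1}^{n} \ln v_i, \qquad
\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n} \left(\ln v_i - \hat{\mu}\right)^2 .
\]

These follow from applying the Gaussian ML estimates to the log-transformed data, since $v \sim \mathrm{Lognormal}(\mu,\sigma^2)$ if and only if $\ln v \sim \mathcal{N}(\mu,\sigma^2)$.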