In this paper we present an information-theoretic estimator for the number of mutually disjoint sources in a linear mixing model. The approach follows the Minimum Description Length (MDL) prescription: the criterion is roughly the sum of the negative normalized maximum log-likelihood and the logarithm of the number of sources. Preliminary numerical evidence supports this approach, which compares favorably to both the Akaike (AIC) and Bayesian (BIC) information criteria.

I. THE MIXING MODEL AND SIGNALS

Consider the following mixing model:

$$x_d(t) = \sum_{l=1}^{L} a_{d,l}\, s_l(t) + n_d(t), \qquad 1 \le d \le D,\quad 1 \le t \le T \tag{1}$$

This corresponds to an instantaneous linear mixing model with $L$ sources and $D$ sensors. We will frequently use the vector notation $X(t) = (x_1(t), \ldots, x_D(t))^T$ and the matrix $A = (a_{d,l})$. In this paper the following assumptions are made:

1) (H1) The noise signals $(n_d)_{1 \le d \le D}$ are Gaussian i.i.d. with zero mean and unknown variance $\sigma^2$;
2) (H2) The source signals are unknown, but at every moment $t$ at most ...
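As a minimal illustration of the mixing model in Eq. (1), the sketch below generates synthetic observations $X = AS + N$ with Gaussian i.i.d. noise per assumption (H1). The dimensions, noise level, and the use of Gaussian draws for $A$ and the sources are illustrative choices, not prescribed by the paper (in particular, no disjointness structure from (H2) is enforced here).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper):
D, L, T = 4, 3, 1000      # sensors, sources, time samples
sigma = 0.1               # noise std dev (unknown in the paper's setting)

A = rng.standard_normal((D, L))          # mixing matrix (a_{d,l})
S = rng.standard_normal((L, T))          # source signals s_l(t)
N = sigma * rng.standard_normal((D, T))  # Gaussian i.i.d. noise n_d(t), (H1)

X = A @ S + N                            # observed mixtures x_d(t), Eq. (1)

print(X.shape)  # (4, 1000): one row per sensor, one column per time sample
```

Each row of `X` is one sensor's observed signal $x_d(t)$; model-order estimation then asks how many rows of `S` (i.e., the value of $L$) best explain `X`.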