For a long time, signal processing applications, and most particularly detection and parameter estimation methods, have relied on the limiting behaviour of test statistics and estimators as the number n of observations of a population grows large relative to the population size N, i.e. n/N → ∞. Modern technological and societal advances now demand the study of sometimes extremely large populations, while simultaneously requiring fast signal processing due to accelerated system dynamics; this results in practical ratios n/N that are not so large, sometimes even smaller than one. A disruptive change in classical signal processing methods has therefore been initiated over the past ten years, spurred mostly by the field of large dimensional random matrix theory. The early literature on random matrix theory for signal processing applications is, however, scarce and highly technical. This tutorial proposes an accessible methodological