In this paper, supervised nonparametric information-theoretic classification (ITC) is introduced. Its principle relies on the likelihood that a data sample transmits its class label to data points in its vicinity. ITC's learning rule is linked to the concept of information potential, and the approach is validated on Ripley's data set. We show that ITC may outperform classical classification algorithms, such as probabilistic neural networks and support vector machines.
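
To make the label-transmission idea concrete, the following is a minimal sketch of how such a classifier could be organized, assuming a Gaussian kernel for the transmission likelihood and a class-wise information potential (the summed kernel interaction between a point and a class) as the decision rule. The function names (`transmission_probabilities`, `itc_classify`) and the bandwidth `sigma` are illustrative assumptions, not the paper's actual learning rule.

```python
import numpy as np

def transmission_probabilities(X, sigma=0.25):
    """Hypothetical transmission likelihoods: row i holds the probability
    that sample i transmits its class label to each other sample,
    modeled with a Gaussian kernel over pairwise distances (assumption)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(K, 0.0)  # a sample does not transmit to itself
    return K / K.sum(axis=1, keepdims=True)

def itc_classify(X_train, y_train, X_test, sigma=0.25):
    """Assign each test point to the class with the largest information
    potential, i.e. the summed kernel interaction with that class."""
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    classes = np.unique(y_train)
    # Information potential of each test point with respect to each class.
    potentials = np.stack(
        [K[:, y_train == c].sum(axis=1) for c in classes], axis=1
    )
    return classes[potentials.argmax(axis=1)]

# Usage on synthetic two-class data (standing in for Ripley's data set):
rng = np.random.default_rng(0)
X_train = np.vstack([rng.normal(0.0, 0.5, (50, 2)),
                     rng.normal(1.5, 0.5, (50, 2))])
y_train = np.array([0] * 50 + [1] * 50)
X_test = np.array([[0.0, 0.0], [1.5, 1.5]])
print(itc_classify(X_train, y_train, X_test))  # expected: [0 1]
```

With this reading, the learning problem reduces to choosing the kernel bandwidth so that the class-conditional information potentials separate the classes well on held-out data.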