Machine learning with few training examples often leads to over-fitting, whereas humans are frequently able to recognize difficult object categories from only a single view. It is a common belief that this ability largely stems from transferring knowledge from related classes. We therefore introduce a new hybrid classifier for learning with very few examples that exploits interclass relationships. The approach builds on randomized decision trees, which are significantly enhanced using maximum a posteriori (MAP) estimation. To this end, a constrained Gaussian is introduced as a new parametric family of prior distributions for multinomial distributions, representing shared knowledge of related categories. We show that the resulting MAP estimation leads to a simple recursive estimation technique that is applicable beyond our hybrid classifier. Experimental evaluation on two public datasets (including the very demanding Mammals database) shows the benefits of our approach.
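As a rough illustration of the idea (a sketch under assumed notation, not the paper's exact formulation), MAP estimation of a multinomial leaf distribution under a Gaussian prior restricted to the probability simplex can be written as

\[
\hat{\theta}^{\mathrm{MAP}} \;=\; \operatorname*{arg\,max}_{\theta \in \Delta} \;\sum_{i=1}^{K} n_i \log \theta_i \;-\; \frac{1}{2\sigma^2}\,\lVert \theta - \mu \rVert^2,
\qquad
\Delta = \Bigl\{\theta \,:\, \theta_i \ge 0,\ \textstyle\sum_{i=1}^{K}\theta_i = 1\Bigr\},
\]

where the symbols are illustrative assumptions: \(n_i\) denotes the class counts observed in a leaf node, \(\mu\) a prior mean transferred from related categories, and \(\sigma^2\) the strength of the transfer. The first term is the multinomial log-likelihood and the second the (constrained) Gaussian log-prior; the paper's recursive estimation technique is one way to solve this constrained optimization.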