In this work we propose an approach to binary classification based on an extension of Bayes Point Machines. In particular, we take into account the whole set of hypotheses consistent with the data (the so-called version space) and the intrinsic noise in the class labels. Following a Bayesian approach, we compute an approximate posterior distribution over the model parameters, which yields a predictive distribution for unseen data. The most compelling feature of the proposed model is that it learns the noise present in the data at no additional cost. All computations are carried out by means of Expectation Propagation, an approximate Bayesian inference algorithm. Experimental results indicate that the proposed approach outperforms Support Vector Machines on several of the classification problems studied and is competitive with other Bayesian classification algorithms based on Gaussian Processes.

Key words: Kernel Methods, Approximate Inference, Bayesian Method...