The Support Vector Machine (SVM) solution corresponds to the centre of the largest sphere inscribed in version space. Alternative approaches such as Bayesian Point Machines (BPM) and Analytic Centre Machines suggest that generalization performance can be further enhanced by using other centres of version space, such as the centroid (centre of mass) or the analytic centre. We present an algorithm that computes the centroid of higher-dimensional polyhedra exactly, and then derive approximation algorithms to build a new learning machine whose performance is comparable to that of the BPM. We also show that for regular (non-singular) kernel matrices, such as those produced by Gaussian kernels, the SVM solution can be obtained by solving a system of linear equations.
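
For reference, the sketch below shows one standard way to compute the exact centroid of a polytope when it is given by its vertices: triangulate it into simplices and take the volume-weighted average of the simplex centroids. This is an illustrative assumption, not necessarily the algorithm presented here; the function name polytope_centroid and the vertex representation are chosen only for the example.

    # Illustrative sketch: exact centroid of a bounded polytope given by its
    # vertices, via triangulation into simplices (not necessarily the paper's
    # algorithm, which targets version space).
    import math
    import numpy as np
    from scipy.spatial import Delaunay

    def polytope_centroid(vertices: np.ndarray) -> np.ndarray:
        """Exact centroid of the convex hull of `vertices` (shape n_points x dim)."""
        dim = vertices.shape[1]
        tri = Delaunay(vertices)                 # triangulate the hull into simplices
        centroid = np.zeros(dim)
        total_volume = 0.0
        for simplex in tri.simplices:
            pts = vertices[simplex]              # (dim + 1) x dim simplex vertices
            edges = pts[1:] - pts[0]             # edge vectors from the first vertex
            vol = abs(np.linalg.det(edges)) / math.factorial(dim)  # simplex volume
            centroid += vol * pts.mean(axis=0)   # volume-weighted simplex centroid
            total_volume += vol
        return centroid / total_volume

Note that the number of simplices in such a triangulation grows rapidly with dimension, which is why approximation algorithms are needed in practice for higher-dimensional version spaces.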