A new family of boosting algorithms, denoted TaylorBoost, is proposed. It supports any combination of loss function and first- or second-order optimization, and includes classical algorithms such as AdaBoost, GradientBoost, and LogitBoost as special cases. When restricted to the set of canonical losses, it yields boosting algorithms with explicit margin control. A new, large family of losses with this property, based on the set of cumulative distributions of zero-mean random variables, is then proposed. A novel loss function in this family, the Laplace loss, is finally derived. The combination of this loss and second-order TaylorBoost produces a boosting algorithm with explicit margin control.
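To make the generic recipe concrete, the following is a minimal sketch (not from the paper) of second-order Taylor-expansion boosting: the risk is expanded to second order around the current predictor, which reduces weak-learner selection to a weighted least-squares fit with per-example Newton targets. The logistic loss stands in here for an arbitrary margin loss, since its second-order instance recovers a LogitBoost-style update; the paper's Laplace loss would plug in through its own first and second derivatives. All names (`taylor_boost`, `fit_stump`, etc.) are illustrative, not taken from the paper.

```python
import numpy as np

def logistic_loss_grads(margin):
    """First/second derivatives of L(v) = log(1 + exp(-v)) w.r.t. the margin v."""
    p = 1.0 / (1.0 + np.exp(-margin))   # sigmoid(margin)
    g = p - 1.0                          # L'(v)
    h = p * (1.0 - p)                    # L''(v) > 0
    return g, h

def fit_stump(x, z, w):
    """Weighted least-squares regression stump: fit targets z with weights w."""
    best = None
    for t in np.unique(x):
        left, right = x <= t, x > t
        if not left.any() or not right.any():
            continue
        cl = np.average(z[left], weights=w[left])
        cr = np.average(z[right], weights=w[right])
        err = np.sum(w[left] * (z[left] - cl) ** 2) + \
              np.sum(w[right] * (z[right] - cr) ** 2)
        if best is None or err < best[0]:
            best = (err, t, cl, cr)
    if best is None:                     # degenerate feature: constant fit
        c = np.average(z, weights=w)
        return lambda q: np.full_like(q, c, dtype=float)
    _, t, cl, cr = best
    return lambda q: np.where(q <= t, cl, cr)

def taylor_boost(x, y, loss_grads, rounds=50):
    """Second-order (Newton-step) boosting for labels y in {-1, +1}."""
    F = np.zeros_like(x, dtype=float)    # additive predictor
    stumps = []
    for _ in range(rounds):
        g, h = loss_grads(y * F)         # derivatives at current margins
        # Per-example Newton step: dL/dF = y*L'(v), d^2L/dF^2 = L''(v)
        z = -y * g / np.maximum(h, 1e-12)
        stump = fit_stump(x, z, h)       # weighted least-squares weak learner
        F += stump(x)
        stumps.append(stump)
    return F, stumps

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=200)
    y = np.where(x + 0.3 * rng.normal(size=200) > 0, 1, -1)
    F, _ = taylor_boost(x, y, logistic_loss_grads, rounds=30)
    print("train accuracy:", np.mean(np.sign(F) == y))
```

Dropping the second-order term (setting the weights to a constant and using `z = -y * g`) gives the first-order, GradientBoost-style member of the family under the same interface.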