A new method for function estimation and variable selection, specifically designed for additive models fitted by cubic splines, is proposed. The method regularizes additive models with an l1-norm penalty, which generalizes the lasso to the nonparametric setting. As in the linear case, it shrinks coefficients and sets some of them exactly to zero, yielding parsimonious models, selecting significant variables, and revealing nonlinearities in the effects of predictors. Two strategies for computing a parsimonious additive model solution are proposed. Both algorithms rely on a fixed-point iteration combined with a singular value decomposition that considerably reduces computation. The empirical behavior of parsimonious additive models is compared with that of the adaptive backfitting BRUTO algorithm. The results characterize the domains in which our approach is effective: it performs significantly better than BRUTO when model estimation is challenging. An implem...
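
To illustrate the kind of penalized additive fit described here (not the authors' fixed-point/SVD algorithm), the following is a minimal sketch, assuming a truncated-power cubic-spline basis per predictor and a group-wise l1-type penalty solved by proximal gradient descent with block soft-thresholding; all function names, knot choices, and parameter values are hypothetical.

```python
# Hedged sketch: sparse additive model via group-wise l1 penalty on spline blocks.
import numpy as np

def cubic_spline_basis(x, knots):
    """Truncated-power cubic spline basis for one predictor (simple illustration)."""
    cols = [x, x**2, x**3] + [np.maximum(x - k, 0.0)**3 for k in knots]
    return np.column_stack(cols)

def fit_sparse_additive(X, y, lam=1.0, knots_per_var=5, n_iter=500):
    n, p = X.shape
    # One basis block per predictor; the penalty acts on whole blocks,
    # so a predictor is either dropped entirely or kept with a smooth effect.
    blocks = []
    for j in range(p):
        knots = np.quantile(X[:, j], np.linspace(0.1, 0.9, knots_per_var))
        Bj = cubic_spline_basis(X[:, j], knots)
        Bj = Bj - Bj.mean(axis=0)        # centre each block for identifiability
        blocks.append(Bj)
    B = np.hstack(blocks)
    sizes = [b.shape[1] for b in blocks]
    idx = np.cumsum([0] + sizes)

    step = 1.0 / np.linalg.norm(B, 2) ** 2   # 1/L, L = Lipschitz constant of the loss gradient
    beta = np.zeros(B.shape[1])
    intercept = y.mean()
    r = y - intercept
    for _ in range(n_iter):
        grad = B.T @ (B @ beta - r)
        z = beta - step * grad
        # Block soft-thresholding: shrink each predictor's coefficient group,
        # possibly to exactly zero (variable selection).
        for j in range(p):
            g = z[idx[j]:idx[j + 1]]
            norm = np.linalg.norm(g)
            scale = max(0.0, 1.0 - step * lam / norm) if norm > 0 else 0.0
            beta[idx[j]:idx[j + 1]] = scale * g
    selected = [j for j in range(p)
                if np.linalg.norm(beta[idx[j]:idx[j + 1]]) > 1e-8]
    return intercept, beta, selected

# Example usage on synthetic data where only the first two predictors matter.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 6))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.standard_normal(200)
_, _, selected = fit_sparse_additive(X, y, lam=5.0)
print("selected predictors:", selected)
```

The block-wise penalty is what produces coefficient groups that are exactly zero, mirroring the lasso-like selection behavior described in the abstract; the paper's own algorithms instead combine a fixed-point iteration with a singular value decomposition to reduce computation.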