Feature and structure selection is an important part of many classification problems. Previous papers have proposed an approach called basis pursuit classification, which poses feature selection as a regularization problem, using a 1-norm penalty to measure parameter complexity. In addition, a complete set of optimal parameters, here called the locus, can be calculated, containing every optimal collection of sparse features as a function of the regularization parameter. This paper considers how to iteratively calculate the parameter locus using a set of rank-1 inverse matrix updates. The algorithm is tested on both artificial and real data, and the computational cost is shown to be reduced from cubic to quadratic in the number of features.
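The following sketch is not the paper's locus algorithm itself; it only illustrates the standard rank-1 inverse identity (the Sherman-Morrison formula) that such updates rely on, which is what lets each step cost O(n^2) rather than the O(n^3) of re-inverting from scratch. The function name and the small test system are illustrative assumptions.

```python
import numpy as np

def rank1_inverse_update(A_inv, u, v):
    """Sherman-Morrison: given A^{-1}, return (A + u v^T)^{-1}
    in O(n^2) operations instead of O(n^3) re-inversion."""
    Au = A_inv @ u                     # A^{-1} u, O(n^2)
    vA = v @ A_inv                     # v^T A^{-1}, O(n^2)
    denom = 1.0 + v @ Au               # scalar; update valid only if nonzero
    return A_inv - np.outer(Au, vA) / denom

# Usage: check against direct inversion on a small random system.
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n)) + n * np.eye(n)   # well-conditioned test matrix
u, v = rng.standard_normal(n), rng.standard_normal(n)
A_inv = np.linalg.inv(A)
updated = rank1_inverse_update(A_inv, u, v)
assert np.allclose(updated, np.linalg.inv(A + np.outer(u, v)))
```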