Logistic regression is a well-known classification method that has been widely used in many applications of data mining, machine learning, computer vision, and bioinformatics. Sparse logistic regression embeds feature selection in the classification framework via ℓ1-norm regularization, and is attractive in many applications involving high-dimensional data. In this paper, we propose Lassplore for solving large-scale sparse logistic regression. Specifically, we formulate the problem as ℓ1-ball constrained smooth convex optimization, and propose to solve it using Nesterov's method, an optimal first-order black-box method for smooth convex optimization. One of the critical issues in the use of Nesterov's method is the estimation of the step size at each optimization iteration. Previous approaches either apply a constant step size, which assumes that the Lipschitz constant of the gradient is known in advance, or require a sequence of decreasing step sizes, which leads to slow convergence in practice.
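To make the formulation concrete, a minimal sketch: writing the training samples as $a_i \in \mathbb{R}^n$ with labels $b_i \in \{-1, +1\}$ (notation assumed here, not taken from the abstract), the ℓ1-ball constrained problem is

$$\min_{x,\,c} \;\; \frac{1}{m} \sum_{i=1}^{m} \log\bigl(1 + \exp\bigl(-b_i (a_i^{\top} x + c)\bigr)\bigr) \quad \text{subject to} \quad \|x\|_1 \leq z,$$

where the radius $z \geq 0$ controls the sparsity of the weight vector $x$.

The sketch below illustrates the overall approach under stated assumptions: a FISTA-style accelerated projected-gradient iteration with a standard backtracking rule, used here as a stand-in for the paper's own step-size scheme. All function names are illustrative, and the intercept $c$ is omitted for brevity.

```python
import numpy as np

def project_l1_ball(v, z):
    """Euclidean projection of v onto {x : ||x||_1 <= z},
    via the sort-based method of Duchi et al. (2008)."""
    if np.abs(v).sum() <= z:
        return v
    u = np.sort(np.abs(v))[::-1]               # magnitudes, descending
    css = np.cumsum(u)
    k = np.arange(1, len(u) + 1)
    rho = np.nonzero(u - (css - z) / k > 0)[0][-1]
    theta = (css[rho] - z) / (rho + 1.0)       # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def logistic_loss_grad(x, A, b):
    """Average logistic loss and its gradient, labels b in {-1, +1}."""
    margins = -b * (A @ x)
    loss = np.mean(np.logaddexp(0.0, margins))
    sigma = 1.0 / (1.0 + np.exp(-margins))     # sigmoid(margins)
    grad = A.T @ (-b * sigma) / len(b)
    return loss, grad

def nesterov_l1_ball(A, b, z, L=1.0, max_iter=200):
    """Accelerated projected gradient for l1-ball constrained
    logistic regression; L is the working Lipschitz estimate."""
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(max_iter):
        f_y, g_y = logistic_loss_grad(y, A, b)
        while True:                            # backtracking: grow L until the
            x_new = project_l1_ball(y - g_y / L, z)  # quadratic upper bound
            d = x_new - y                            # holds at the new point
            f_new, _ = logistic_loss_grad(x_new, A, b)
            if f_new <= f_y + g_y @ d + 0.5 * L * (d @ d):
                break
            L *= 2.0
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # extrapolation step
        x, t = x_new, t_new
    return x
```

Note that a plain backtracking rule like the one above only ever increases the Lipschitz estimate $L$ (i.e., shrinks the step size), which is precisely the kind of step-size estimation issue the abstract raises.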