We propose a framework that learns the graph structure underlying a set of smooth signals. Given X ∈ R^{m×n} whose rows reside on the vertices of an unknown graph, we learn the edge weights w ∈ R_+^{m(m−1)/2} under the smoothness assumption that tr(X⊤LX) is small. We show that the problem amounts to a weighted ℓ1 minimization that leads to naturally sparse solutions. We point out how known graph learning or construction techniques fall within our framework and propose a new model that performs better than the state of the art in many settings. We present efficient, scalable primal-dual-based algorithms for both our model and the previous state of the art, and evaluate their performance on artificial and real data.
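For intuition, the smoothness term tr(X⊤LX) with the combinatorial Laplacian L = D − W equals ½ Σ_ij W_ij ‖x_i − x_j‖², i.e. a weighted sum of squared distances between the signal values on connected vertices; this is what makes the learning problem a weighted ℓ1-type objective in the edge weights. The sketch below is a minimal numerical check of that identity, not code from the paper; the function names and the random test setup are illustrative assumptions.

```python
import numpy as np

def smoothness_laplacian(X, W):
    """tr(X^T L X) with the combinatorial Laplacian L = D - W.

    X : (m, n) array, row i is the signal on vertex i.
    W : (m, m) symmetric non-negative weight matrix with zero diagonal.
    """
    L = np.diag(W.sum(axis=1)) - W
    return np.trace(X.T @ L @ X)

def smoothness_pairwise(X, W):
    """Equivalent pairwise form: 1/2 * sum_ij W_ij * ||x_i - x_j||^2."""
    Z = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)  # squared vertex distances
    return 0.5 * np.sum(W * Z)

# Quick numerical check on a random weighted graph and random signals
# (sizes and seed are arbitrary, for illustration only).
rng = np.random.default_rng(0)
m, n = 6, 4
W = rng.random((m, m))
W = np.triu(W, 1)
W = W + W.T                      # symmetric, zero diagonal
X = rng.standard_normal((m, n))
assert np.isclose(smoothness_laplacian(X, W), smoothness_pairwise(X, W))
```

Because the pairwise form is linear in the weights W_ij (with non-negative coefficients ‖x_i − x_j‖²), penalizing it acts like a weighted ℓ1 norm on w and tends to drive many edge weights exactly to zero, which is the sparsity behavior referred to above.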