The problems of dimension reduction and inference of statistical dependence are addressed by the modeling framework of learning gradients. The models we propose hold for Euclidean spaces as well as the manifold setting. The central quantity in this approach is an estimate of the gradient and of the gradient outer product. We relate the gradient outer product to standard statistical quantities such as covariances and provide a simple and precise comparison of a variety of simultaneous regression and dimensionality reduction methods. We provide rates of convergence both for inference of informative directions and for inference of a graphical model of variable dependencies. We illustrate the efficacy of the method on simulated and real data.
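For orientation, a minimal sketch of the central quantity, using standard notation not fixed by the abstract itself: writing $f$ for the regression function and $\rho_X$ for the marginal distribution of the inputs, the gradient outer product matrix is

\[
\Gamma = \mathbb{E}_{\rho_X}\!\left[\nabla f(X)\,\nabla f(X)^{\top}\right],
\qquad
\Gamma_{ij} = \mathbb{E}_{\rho_X}\!\left[\frac{\partial f}{\partial x_i}(X)\,\frac{\partial f}{\partial x_j}(X)\right].
\]

Under this reading, the leading eigenvectors of $\Gamma$ span the informative directions for dimension reduction, while the off-diagonal entries $\Gamma_{ij}$ quantify joint relevance of pairs of variables, the quantity underlying the graphical model of variable dependencies.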