Diffusion Maps (DiffMaps) has recently provided a general framework that unites many other spectral manifold learning algorithms, including Laplacian Eigenmaps, and it has become one of the most successful and popular frameworks for manifold learning to date. However, DiffMaps often introduces unnecessary distortions into its embeddings, and its performance varies widely with the choice of parameter values. In this paper, we draw a previously unnoticed connection between DiffMaps and spring-motivated methods. We show that DiffMaps has a physical interpretation: it finds the arrangement of high-dimensional objects in low-dimensional space that minimizes the elastic energy of a particular spring network. Within this interpretation, we identify the root cause of a variety of problems commonly observed in the DiffMaps output, including sensitivity to user-specified parameters, sensitivity to sampling density, and distortion of boundaries. We then show how to exploit the connection between ...
Shannon M. Hughes, Peter J. Ramadge
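To illustrate the spring-energy view described above, here is a minimal sketch written in the standard notation of Laplacian Eigenmaps, which the abstract names as a special case; the kernel weights $W_{ij}$, bandwidth $\varepsilon$, degree matrix $D$, and normalization constraint below are illustrative assumptions, not the particular spring network constructed in the paper. Given data points $x_1, \dots, x_n$, set $W_{ij} = \exp\!\left(-\|x_i - x_j\|^2 / \varepsilon\right)$ and $D_{ii} = \sum_j W_{ij}$, and choose low-dimensional coordinates $y_1, \dots, y_n \in \mathbb{R}^d$ (rows of $Y$) to minimize
\[
E(y_1, \dots, y_n) \;=\; \tfrac{1}{2} \sum_{i,j} W_{ij}\, \|y_i - y_j\|^2
\quad \text{subject to} \quad Y^{\top} D\, Y = I .
\]
This objective is exactly the elastic energy of a network of zero-rest-length springs, with a spring of stiffness $W_{ij}$ joining points $i$ and $j$; in this reading, the bandwidth $\varepsilon$ is the user-specified parameter whose choice the abstract identifies as a source of sensitivity.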