Despite the notable successes of state-of-the-art clustering algorithms, many of them still suffer from shortcomings; in particular, they do not capture the coherence and homogeneity of clusters simultaneously. We show that some of the best-performing spectral and hierarchical clustering algorithms can produce incorrect clusterings when the data comprises clusters of different densities or contains outliers. We introduce algorithms based on variants of the geodesic distance that capture both the coherence and the homogeneity of clusters. This choice of distance measure enables simple clustering algorithms such as K-medoids to outperform spectral and hierarchical clustering algorithms. To demonstrate the theoretical merits of our approach, we present an analysis of a simplified version of the algorithm. We also provide strong experimental evidence of the performance of our algorithms on a number of challenging clustering problems.
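The core idea can be sketched as follows: approximate geodesic distances by shortest paths over a k-nearest-neighbor graph, then feed the resulting distance matrix to a standard K-medoids routine. The sketch below is illustrative only, not the paper's exact algorithm; the k-NN graph construction, the finite cap on distances between disconnected components, and the plain alternating K-medoids updates are our own assumptions, and scipy is assumed to be available.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path
from scipy.spatial.distance import cdist

def geodesic_distances(X, k=5):
    """Approximate geodesic distances as shortest paths on a k-NN graph.

    Illustrative choice: disconnected components are bridged with a large
    finite value (twice the largest finite geodesic distance).
    """
    D = cdist(X, X)                           # pairwise Euclidean distances
    knn = np.argsort(D, axis=1)[:, 1:k + 1]   # k nearest neighbors, excluding self
    G = np.full_like(D, np.inf)               # inf marks a non-edge
    rows = np.repeat(np.arange(len(X)), k)
    G[rows, knn.ravel()] = D[rows, knn.ravel()]
    G = np.minimum(G, G.T)                    # symmetrize the graph
    Dg = shortest_path(G, directed=False)
    finite_max = Dg[np.isfinite(Dg)].max()
    Dg[~np.isfinite(Dg)] = 2.0 * finite_max   # cap cross-component distances
    return Dg

def k_medoids(D, n_clusters, n_iter=100, seed=0):
    """Plain K-medoids (alternating assignment / medoid update) on a
    precomputed distance matrix D."""
    rng = np.random.default_rng(seed)
    medoids = rng.choice(len(D), n_clusters, replace=False)
    for _ in range(n_iter):
        labels = np.argmin(D[:, medoids], axis=1)   # assign to nearest medoid
        new = medoids.copy()
        for c in range(n_clusters):
            members = np.where(labels == c)[0]
            if len(members):                         # keep old medoid if empty
                within = D[np.ix_(members, members)].sum(axis=1)
                new[c] = members[np.argmin(within)]  # point minimizing in-cluster cost
        if np.array_equal(new, medoids):
            break
        medoids = new
    return labels
```

With this distance, two elongated or differently dense groups that are far apart in graph hops stay far apart geodesically, so even plain K-medoids separates them cleanly.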