In this paper, we derive bounds for the moments of nearest neighbor distributions. The bounds are formulated in the general setting and then applied specifically to noise variance estimation with the Delta test and the Gamma test. For this problem, we focus on the rate of convergence and the bias of the estimators, and we support the theoretical results with experiments.
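For orientation, the following is a minimal statement of the Delta test estimator in its standard first-nearest-neighbor form (the notation is chosen here for illustration and is not taken from the paper): given samples $(x_i, y_i)_{i=1}^{n}$ from a model $y = f(x) + \varepsilon$ with i.i.d. noise $\varepsilon$, the noise variance $\operatorname{Var}(\varepsilon)$ is estimated by
\[
\delta_n \;=\; \frac{1}{2n} \sum_{i=1}^{n} \bigl( y_{N(i)} - y_i \bigr)^2,
\qquad
N(i) \;=\; \operatorname*{arg\,min}_{j \neq i} \, \lVert x_j - x_i \rVert ,
\]
where $N(i)$ is the index of the nearest neighbor of $x_i$ in input space. The behavior of such estimators is governed by the nearest neighbor distances $\lVert x_{N(i)} - x_i \rVert$, which is why bounds on the moments of nearest neighbor distributions translate into convergence-rate and bias statements for the Delta and Gamma tests.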