Estimating divergence between two point processes, i.e., probability laws on the space of spike trains, is an essential tool in many computational neuroscience applications, such as change detection and neural coding. However, the problem of estimating divergence, although well studied in Euclidean space, has seldom been addressed in a more general setting. Since the space of spike trains can be viewed as a metric space, we address the problem of estimating Jensen-Shannon divergence in a metric space using a nearest-neighbor-based approach. We empirically demonstrate the validity of the proposed estimator and compare it against other available methods in the context of the two-sample problem.
Sohan Seth, Austin J. Brockmeier, José Carl
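The abstract does not spell out the estimator's exact form, but the key idea of a nearest-neighbor approach in a metric space can be illustrated with a minimal plug-in sketch. The sketch below relies only on pairwise distances (so any spike train metric can be supplied) and uses the identity that the Jensen-Shannon divergence between P and Q equals the mutual information between a pooled sample and its sample label: for each pooled point, the label distribution among its k nearest neighbors gives a local estimate of the conditional label entropy. The function name `knn_js_divergence` and the parameters `metric` and `k` are hypothetical; this is an illustrative sketch under these assumptions, not the authors' estimator.

```python
import numpy as np

def knn_js_divergence(X, Y, metric, k=5):
    """Illustrative k-NN plug-in estimate of the Jensen-Shannon divergence
    between the laws generating samples X and Y in a generic metric space.

    X, Y   : sequences of points (e.g. spike trains); only `metric` touches them
    metric : callable metric(a, b) -> non-negative float (e.g. a spike train metric)
    k      : number of nearest neighbors used for the local label estimate
    """
    pooled = list(X) + list(Y)
    labels = np.array([0] * len(X) + [1] * len(Y))
    n = len(pooled)

    # Pairwise distance matrix; no vector-space structure is required.
    D = np.array([[metric(a, b) for b in pooled] for a in pooled], dtype=float)
    np.fill_diagonal(D, np.inf)  # a point is not its own neighbor

    # H(Z): entropy of the sample label under the empirical mixture weights.
    w = np.bincount(labels, minlength=2) / n
    h_prior = -np.sum(w[w > 0] * np.log(w[w > 0]))

    # H(Z|X): average entropy of the label among the k nearest neighbors.
    h_cond = 0.0
    for i in range(n):
        nn = np.argsort(D[i])[:k]
        p = labels[nn].mean()  # local estimate of P(Z = 1 | x_i)
        for q in (p, 1.0 - p):
            if q > 0.0:
                h_cond -= q * np.log(q)
    h_cond /= n

    # JSD(P, Q) = I(X; Z) = H(Z) - H(Z|X), in nats, clipped at zero.
    return max(h_prior - h_cond, 0.0)
```

In a two-sample setting, the statistic above would be computed on the observed spike trains and its significance assessed, for example, by recomputing it under random permutations of the sample labels; again, this is a generic recipe rather than the evaluation protocol of the paper.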