Abstract. Mutual information (MI) is a common criterion in independent component analysis (ICA) optimization. MI is derived from probability density functions (PDFs). There are scenarios in which assuming a parametric form for the PDF leads to poor performance. Therefore, the need arises for nonparametric PDF and MI estimation. Existing nonparametric algorithms suffer from high complexity, particularly in high dimensions. To counter this obstacle, we present an ICA algorithm based on accelerated kernel entropy estimation. It achieves both high separation performance and low computational complexity. For K sources with N samples, our ICA algorithm has an iteration complexity of at most O(KN log N + K^2 N).
Sarit Shwartz, Michael Zibulevsky, Yoav Y. Schechner
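For context on the quantity the abstract refers to, the sketch below shows the standard naive Parzen-window (kernel) entropy estimator: the differential entropy is approximated by the sample mean of -log p_hat(x_i), where p_hat is a Gaussian kernel density estimate built from the same samples. This is not the paper's accelerated algorithm; it is the direct O(N^2) baseline that the accelerated method is designed to avoid. The function name kernel_entropy and the rule-of-thumb bandwidth choice are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def kernel_entropy(x, sigma=None):
    """Naive Parzen-window estimate of differential entropy (O(N^2) baseline)."""
    x = np.asarray(x, dtype=float).ravel()
    n = x.size
    if sigma is None:
        # Rule-of-thumb bandwidth for a Gaussian kernel (an assumption, not the paper's choice).
        sigma = 1.06 * x.std() * n ** (-1.0 / 5.0)
    # Pairwise differences -> Gaussian kernel matrix; this is the quadratic cost.
    diff = x[:, None] - x[None, :]
    kernels = np.exp(-0.5 * (diff / sigma) ** 2) / (np.sqrt(2.0 * np.pi) * sigma)
    p_hat = kernels.mean(axis=1)        # kernel density estimate at each sample
    return -np.mean(np.log(p_hat))      # Monte Carlo estimate of -E[log p(x)]

# Usage: the entropy of a unit-variance Gaussian is 0.5*log(2*pi*e) ~ 1.42.
rng = np.random.default_rng(0)
print(kernel_entropy(rng.standard_normal(2000)))
```

Evaluating this estimator for each of K candidate sources already costs O(KN^2) per iteration, which is what motivates the O(KN log N + K^2 N) complexity claimed in the abstract.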