Abstract. Recently, the authors developed the Minimax Mutual Information algorithm for linear ICA of real-valued mixtures, which is based on a density estimate stemming from Jaynes' maximum entropy principle. Since the entropy estimates yield an approximate upper bound on the actual mutual information of the separated outputs, minimizing this upper bound results in robust performance and good generalization. In this paper, we extend this algorithm to complex-valued mixtures. Simulations with artificial data demonstrate that the proposed algorithm outperforms FastICA.
Jian-Wu Xu, Deniz Erdogmus, Yadunandana N. Rao, Jo