The exponential embedding of two or more probability density functions (PDFs) is proposed for multimodal sensor processing. The unknown PDF is approximated by exponentially embedding the known PDFs. The resulting embedding is an exponential family indexed by a set of parameters, and hence inherits many of the attractive properties of exponential families. It is shown that the approximating PDF is asymptotically the one closest to the unknown PDF in Kullback-Leibler (KL) divergence. Applied to hypothesis testing, this approach shows improved performance over existing methods in cases of practical importance where the sensor outputs are not independent.
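To make the idea concrete, below is a minimal numerical sketch. The abstract does not state the embedding formula, so this sketch assumes one common form of exponential embedding, a geometric combination $p(x;\eta) \propto p_1(x)^{\eta_1}\,p_2(x)^{\eta_2}$ of the known PDFs with a uniform reference density on a bounded grid; the Gaussian densities, the grid, and the brute-force search over $\eta$ are all illustrative assumptions, not the paper's method. The sketch simply exhibits the claim that the best embedding parameters are those minimizing the KL divergence to the unknown PDF.

```python
import numpy as np
from scipy.stats import norm

# Grid over which densities are evaluated (bounded support, so the
# implicit uniform reference density is proper).
x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]

# Two known sensor-output PDFs (hypothetical Gaussians, for illustration).
p1 = norm.pdf(x, loc=-1.0, scale=1.0)
p2 = norm.pdf(x, loc=2.0, scale=1.5)

# The unknown PDF we wish to approximate (also hypothetical).
p_true = norm.pdf(x, loc=0.5, scale=1.2)

def embedded_pdf(eta1, eta2):
    """Assumed exponential embedding of p1 and p2:
    p(x; eta) proportional to p1(x)**eta1 * p2(x)**eta2,
    normalized numerically on the grid (the log-normalizer K(eta))."""
    log_p = eta1 * np.log(p1) + eta2 * np.log(p2)
    log_p -= log_p.max()            # guard against underflow
    p = np.exp(log_p)
    return p / (p.sum() * dx)

def kl(p, q):
    """Numerical KL divergence D(p || q) on the grid."""
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask])) * dx

# Brute-force search for the embedding parameters whose PDF is
# closest to the unknown PDF in KL divergence.
etas = np.linspace(0.0, 2.0, 41)
best = min((kl(p_true, embedded_pdf(e1, e2)), e1, e2)
           for e1 in etas for e2 in etas)
print(f"min KL = {best[0]:.4f} at eta1 = {best[1]:.2f}, eta2 = {best[2]:.2f}")
```

In practice the embedding parameters would be estimated from data rather than found by grid search against a known target; the grid search here only visualizes the asymptotic KL-optimality property stated above.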