Cross-Modal Similarity Learning via Pairs, Preferences, and Active Supervision

We present a probabilistic framework for learning pairwise similarities between objects belonging to different modalities, such as drugs and proteins, or text and images. Our framework learns a binary-code representation for objects in each modality and has the following key properties: (i) it can leverage both pairwise and easy-to-obtain relative-preference-based cross-modal constraints; (ii) the probabilistic framework naturally allows querying for the most useful/informative constraints, facilitating an active learning setting (existing methods for cross-modal similarity learning lack such a mechanism); and (iii) the binary code length is learned from the data. We demonstrate the effectiveness of the proposed approach on two problems that require computing pairwise similarities between cross-modal object pairs: cross-modal link prediction in bipartite graphs, and hashing-based cross-modal similarity search.
Yi Zhen, Piyush Rai, Hongyuan Zha, Lawrence Carin
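The abstract describes computing cross-modal similarities through binary codes learned for each modality. As an illustration only, the sketch below is not the paper's probabilistic model: it substitutes fixed random projections for the learned modality-specific hash functions and fixes the code length (which the paper learns from data). It shows the general mechanics of hashing-based cross-modal retrieval once both modalities map into a shared Hamming space: encode each modality to binary codes, then rank database items by Hamming distance. All names and dimensions are hypothetical.

# Minimal sketch of hashing-based cross-modal retrieval (illustrative only,
# not the paper's method). Random projections stand in for learned hash
# functions; n_bits is fixed here, whereas the paper learns the code length.
import numpy as np

rng = np.random.default_rng(0)
n_bits = 16                      # code length (fixed here for illustration)
d_text, d_image = 300, 512       # hypothetical feature dimensions per modality

# Hypothetical stand-ins for learned modality-specific hash functions.
W_text = rng.standard_normal((d_text, n_bits))
W_image = rng.standard_normal((d_image, n_bits))

def encode(X, W):
    """Map real-valued features to binary codes by sign thresholding."""
    return (X @ W > 0).astype(np.uint8)

def hamming(a, B):
    """Hamming distance from one code `a` to each row of code matrix `B`."""
    return np.count_nonzero(a != B, axis=1)

# Toy data: 5 text queries and 100 image database items.
texts = rng.standard_normal((5, d_text))
images = rng.standard_normal((100, d_image))

text_codes = encode(texts, W_text)
image_codes = encode(images, W_image)

# Cross-modal search: for each text query, rank images by Hamming distance.
for i, q in enumerate(text_codes):
    top5 = np.argsort(hamming(q, image_codes))[:5]
    print(f"text query {i}: nearest images {top5.tolist()}")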
Type Conference
Year 2015
Where AAAI