We present an empirical study of the applicability of Probabilistic Lexicalized Tree Insertion Grammars (PLTIG), a lexicalized counterpart to Probabilistic Context-Free Grammars (...
The authors extended the idea of training multiple tasks simultaneously on a partially shared feed-forward network. A shared input subvector was added to represent common inputs...
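The partial-sharing idea can be sketched roughly as follows; the dimensions, the single shared hidden layer, and the per-task heads are illustrative assumptions, not details taken from the snippet above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes -- the snippet does not specify any dimensions.
SHARED_DIM, TASK_DIM, HIDDEN, N_TASKS = 4, 3, 8, 2

# One hidden layer whose weights are shared across all tasks;
# each task keeps its own output head (partial sharing).
W_shared = rng.normal(size=(SHARED_DIM + TASK_DIM, HIDDEN))
heads = [rng.normal(size=(HIDDEN, 1)) for _ in range(N_TASKS)]

def forward(shared_x, task_x, task_id):
    """Concatenate the shared input subvector with the task-specific
    subvector, pass it through the shared layer, then through the
    task's own head."""
    x = np.concatenate([shared_x, task_x])
    h = np.tanh(x @ W_shared)
    return h @ heads[task_id]

shared = rng.normal(size=SHARED_DIM)          # common to all tasks
out0 = forward(shared, rng.normal(size=TASK_DIM), 0)
out1 = forward(shared, rng.normal(size=TASK_DIM), 1)
print(out0.shape, out1.shape)                 # each task yields one prediction
```

Training would alternate (or mix) examples from the different tasks, updating `W_shared` on all of them and each head only on its own task's examples.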
This paper introduces a novel method for obtaining increased predictive performance from transparent models in situations where production input vectors are available when building...
Current work in object categorization discriminates among objects that typically possess gross differences which are readily apparent. However, many applications require making ...
Andrew Moldenke, Asako Yamamuro, David A. Lytle, E...
Abstract. Multiple-instance learning (MIL) allows for training classifiers from ambiguously labeled data. In computer vision, this learning paradigm has recently been used in many ...
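The core MIL labeling convention behind such methods can be illustrated with a minimal sketch; the linear scorer `w` and the example bags are hypothetical, only the max-over-instances rule is the standard MIL assumption:

```python
import numpy as np

# In MIL, labels attach to *bags* of instances, not to instances:
# a bag is positive if at least one instance in it is positive.
# The scorer below is a hypothetical fixed linear model; MIL
# training methods learn such a scorer so that this rule matches
# the observed bag labels.
w = np.array([1.0, -0.5])

def bag_label(bag, w):
    scores = bag @ w                # one score per instance
    return int(scores.max() > 0)    # standard MIL assumption

pos_bag = np.array([[0.1, 0.2], [2.0, 0.0]])    # contains a positive instance
neg_bag = np.array([[-1.0, 0.0], [-0.2, 0.3]])  # all instances score negative
print(bag_label(pos_bag, w), bag_label(neg_bag, w))  # → 1 0
```

This ambiguity (only the bag label is known, not which instance caused it) is what distinguishes MIL from ordinary supervised learning.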