Supervised text classification is the task of automatically assigning a category label to a previously unlabeled text document. We start with a collection of pre-labeled examples whose assigned categories are used to build a predictive model for each category. In previous research, incorporating semantic features from the WordNet lexical database has been one of many approaches tried to improve the predictive accuracy of text classification models. The intuition is that the words in the training set alone may not be extensive enough to generate a general model for a category, but through WordNet expansion (i.e., incorporating words related to the training-set words by various WordNet relationships), a more accurate model may be possible. In this paper, we report preliminary results obtained from a comprehensive study in which WordNet features, part-of-speech tags, and term weighting schemes are incorporated into two-category text classification models generated by both a Naive Bayes text...
Trevor N. Mansuy, Robert J. Hilderman
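
The following is a minimal sketch of the WordNet-expansion idea described in the abstract, assuming NLTK's WordNet interface (the paper does not prescribe a particular toolkit, and the relationship types and function names here are illustrative only). Each token is augmented with synonym and hypernym lemmas, which could then be fed as additional features to a classifier such as Naive Bayes.

```python
# Hypothetical sketch: expand a token list with WordNet synonyms and hypernyms.
# Requires the WordNet corpus: nltk.download('wordnet')
from nltk.corpus import wordnet as wn

def expand_with_wordnet(tokens, pos=wn.NOUN):
    """Return the original tokens plus synonym and hypernym lemmas."""
    expanded = list(tokens)
    for token in tokens:
        for synset in wn.synsets(token, pos=pos):
            # Synonyms: other lemma names of the same synset.
            expanded.extend(l for l in synset.lemma_names() if l != token)
            # Hypernyms: lemma names of more general concepts one level up.
            for hyper in synset.hypernyms():
                expanded.extend(hyper.lemma_names())
    return expanded

# Example: ["car"] might expand to include "auto", "automobile", "motor_vehicle", ...
print(expand_with_wordnet(["car"]))
```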