
Additive regularization of topic models

Abstract. Probabilistic topic modeling of text collections is a powerful tool for statistical text analysis. Determining the optimal number of topics remains a challenging problem in topic modeling. We propose a simple entropy regularization for topic selection within Additive Regularization of Topic Models (ARTM), a multicriteria approach for combining regularizers. The entropy regularization gradually eliminates insignificant and linearly dependent topics; on semi-real data this process converges to the correct number of topics. On real text collections it can be combined with sparsing, smoothing, and decorrelation regularizers to produce a sequence of models with different numbers of well-interpretable topics.
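The abstract only sketches the ARTM mechanism: the log-likelihood is maximized together with a weighted sum of regularizers, and each regularizer enters the M-step as an additive correction to the usual EM counters that is clipped at zero, which is what allows insignificant topics to die out. The numpy sketch below illustrates that regularized EM step under stated assumptions; the matrix shapes, the function name artm_em_step, and the uniform sparsing correction tau are illustrative choices, not the paper's entropy-based topic-selection regularizer or the authors' implementation.

```python
# Minimal sketch (not the authors' code) of an ARTM-style regularized EM step:
# the PLSA counters n_wt, n_td are corrected by the regularizer's derivative and
# clipped at zero, so topics whose counters are driven to zero are eliminated.
import numpy as np

def artm_em_step(n_dw, phi, theta, tau=0.0, eps=1e-12):
    """One regularized EM iteration (illustrative shapes).
    n_dw:  (D, W) term counts per document
    phi:   (W, T) word-in-topic distributions, columns sum to 1
    theta: (T, D) topic-in-document distributions, columns sum to 1
    tau:   additive correction to the phi counters (tau < 0 sparses topics)
    """
    # E-step: p(t | d, w) is proportional to phi[w, t] * theta[t, d]
    joint = phi[None, :, :] * theta.T[:, None, :]            # (D, W, T)
    p_tdw = joint / (joint.sum(axis=2, keepdims=True) + eps)

    # Expected counters
    n_wt = np.einsum('dw,dwt->wt', n_dw, p_tdw)              # (W, T)
    n_td = np.einsum('dw,dwt->td', n_dw, p_tdw)              # (T, D)

    # M-step with an additive regularizer term, clipped at zero:
    # phi_wt ~ max(n_wt + phi_wt * dR/dphi_wt, 0); for R = tau * sum ln(phi_wt)
    # the correction phi_wt * dR/dphi_wt is just the constant tau.
    phi_new = np.maximum(n_wt + tau, 0.0)
    phi_new /= phi_new.sum(axis=0, keepdims=True) + eps      # zeroed columns = eliminated topics

    theta_new = np.maximum(n_td, 0.0)                        # Theta left unregularized in this sketch
    theta_new /= theta_new.sum(axis=0, keepdims=True) + eps
    return phi_new, theta_new

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    D, W, T = 20, 100, 10
    n_dw = rng.poisson(1.0, size=(D, W)).astype(float)
    phi = rng.random((W, T));  phi /= phi.sum(axis=0, keepdims=True)
    theta = rng.random((T, D)); theta /= theta.sum(axis=0, keepdims=True)
    for _ in range(50):
        phi, theta = artm_em_step(n_dw, phi, theta, tau=-0.5)
    # Count topics whose word distribution was not entirely zeroed out.
    print("surviving topics:", int((phi.sum(axis=0) > 1e-6).sum()))
```

A topic whose entire phi column is clipped to zero receives no probability mass in later E-steps and stays dead, which mirrors the gradual topic elimination described in the abstract; the paper achieves this selection with an entropy regularizer on Theta rather than the uniform correction used here.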
Konstantin Vorontsov, Anna Potapenko
Added 14 Apr 2016
Updated 14 Apr 2016
Type Journal
Year 2015
Where ML
Authors Konstantin Vorontsov, Anna Potapenko