A wide variety of Dirichlet-multinomial ‘topic’ models have found interesting applications in recent years. While Gibbs sampling remains an important method of inference in such models, variational techniques have certain advantages such as easy assessment of convergence, easy optimization without the need to maintain detailed balance, a bound on the marginal likelihood, and side-stepping of issues with topic identifiability. The most accurate variational technique thus far, namely collapsed variational LDA (CV-LDA) [1], addressed neither model selection nor inference for hyperparameters. We address both issues by generalizing that technique, obtaining the first variational algorithm to handle the hierarchical Dirichlet process (HDP) and to infer the hyperparameters of Dirichlet variables. Experiments show a very significant improvement in accuracy relative to CV-LDA.