Adaptor grammars (Johnson et al., 2007b) are a non-parametric Bayesian extension of Probabilistic Context-Free Grammars (PCFGs) which in effect learn the probabilities of entire subtrees.
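As a rough illustration of the caching idea behind adaptor grammars, the sketch below implements a Dirichlet-process adaptor (the Pitman-Yor special case with discount 0) over whole subtrees: a previously generated subtree is reused with probability proportional to how often it has been used, and otherwise a fresh subtree is drawn from the base distribution. The function names and the toy base distribution are illustrative assumptions, not the authors' implementation.

    import random
    from collections import Counter

    def sample_adapted_subtree(cache, base_sampler, alpha=1.0):
        # Chinese-restaurant-style choice: reuse a cached subtree with
        # probability proportional to its count, or draw a fresh subtree
        # from the base distribution with probability proportional to alpha.
        total = sum(cache.values()) + alpha
        r = random.uniform(0, total)
        for subtree, count in cache.items():
            r -= count
            if r < 0:
                cache[subtree] += 1
                return subtree
        fresh = base_sampler()      # expand via the base PCFG (toy stand-in here)
        cache[fresh] += 1
        return fresh

    def base_word():
        # Toy base distribution over whole "Word" subtrees (strings stand in for trees).
        return "".join(random.choice("ab") for _ in range(random.randint(1, 4)))

    cache = Counter()
    samples = [sample_adapted_subtree(cache, base_word) for _ in range(20)]
    print(samples)   # frequently reused subtrees end up with high probability

Because reuse probabilities grow with counts, subtrees that have been generated once tend to be generated again, which is how the adaptor in effect assigns probabilities to whole subtrees rather than only to individual rules.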
We present a structured model of context that supports an integrated approach to language acquisition and use. The model extends an existing formal notation, Embodied Construction Grammar.
In this study, a context-sensitive grammar is proposed to model various forms of RNA secondary structure, especially pseudoknots. Compared with a conventional context-free grammar...
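The contrast with a context-free grammar comes down to crossing base pairs: nested pairings are exactly what a CFG can generate, while the interleaved pairs of a pseudoknot are not, which is what motivates the extra context-sensitive power. The check below is a small illustrative aid under that standard characterization, not code from the paper.

    def is_nested(pairs):
        # True if no two base pairs (i, j) and (k, l) cross, i.e. the structure
        # is CFG-representable; crossing (interleaved) pairs form a pseudoknot.
        pairs = sorted(pairs)
        for a, (i, j) in enumerate(pairs):
            for k, l in pairs[a + 1:]:
                if i < k < j < l:       # interleaved: i < k < j < l
                    return False
        return True

    hairpin    = [(0, 7), (1, 6), (2, 5)]           # nested stem-loop
    pseudoknot = [(0, 4), (1, 5), (2, 6), (3, 7)]   # crossing pairs
    print(is_nested(hairpin), is_nested(pseudoknot))   # True False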
Inducing a grammar from text has proven to be a notoriously challenging learning task despite decades of research. The primary reason for its difficulty is that in order to induce...
This paper establishes a connection between two apparently very different kinds of probabilistic models. Latent Dirichlet Allocation (LDA) models are used as "topic models"...
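One common way to make the LDA-PCFG connection concrete is a right-branching PCFG whose nonterminals name documents and topics, so that every parse assigns each word a topic and a parse's probability is a product of document-to-topic and topic-to-word choices, mirroring LDA's bag-of-words likelihood. The sketch below builds such a rule set; the p_stop continuation parameter and the exact rule shapes are assumptions made for illustration, not necessarily the construction used in the paper.

    def lda_as_pcfg_rules(theta, phi, p_stop=0.5):
        # theta[j][i]: P(topic i | document j); phi[i][w]: P(word w | topic i).
        # p_stop is an assumed continuation parameter; for a fixed document
        # length it contributes the same factor to every parse.
        rules = []  # (lhs, rhs, probability)
        for j, doc_topics in enumerate(theta):
            for i, p_topic in enumerate(doc_topics):
                rules.append((f"Doc_{j}", (f"Topic_{i}", f"Doc_{j}"), p_topic * (1 - p_stop)))
                rules.append((f"Doc_{j}", (f"Topic_{i}",), p_topic * p_stop))
        for i, topic_words in enumerate(phi):
            for w, p_word in topic_words.items():
                rules.append((f"Topic_{i}", (w,), p_word))
        return rules

    theta = [[0.7, 0.3]]                                    # one document, two topics
    phi = [{"gene": 0.6, "protein": 0.4}, {"ball": 0.5, "goal": 0.5}]
    for rule in lda_as_pcfg_rules(theta, phi):
        print(rule)

Note that the rule probabilities for each Doc_j nonterminal sum to one, and conditioned on a fixed document length the stopping factor is identical across parses, so the posterior over topic assignments matches the corresponding bag-of-words model.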