Markov logic networks (MLNs) use first-order formulas to define features of Markov networks. Current MLN structure learners can only learn short clauses (4-5 literals) due to extre...
We introduce a natural generalization of submodular set cover and exact active learning with a finite hypothesis class (query learning). We call this new problem interactive submo...
Restricted Boltzmann machines were developed using binary stochastic hidden units. These can be generalized by replacing each binary unit by an infinite number of copies that all ...
Predicting the "Value at Risk" of a portfolio of stocks is of great significance in quantitative finance. We introduce a new class of models, "dynamical products of ex...
Multiagent Inductive Learning is the problem that groups of agents face when they want to perform inductive learning, but the data of interest is distributed among them. This pape...
When modeling high-dimensional richly structured data, it is often the case that the distribution defined by the Deep Boltzmann Machine (DBM) has a rough energy landscape with man...
We propose a general and efficient algorithm for learning low-rank matrices. The proposed algorithm converges super-linearly and can keep the matrix to be learned in a compact fac...
The hierarchical Dirichlet process (HDP) is a Bayesian nonparametric mixed-membership model: each data point is modeled with a collection of components in different proportions. T...
Sinead Williamson, Chong Wang, Katherine A. Heller...