Sciweavers

1826 search results (page 16 of 366) for "Using Random Forests in the Structured Language Model"
ICASSP 2008, IEEE
A weighted subspace approach for improving bagging performance
Bagging is an ensemble method that uses random resampling of a dataset to construct models. In classification scenarios, the random resampling procedure in bagging induces some c...
Qu-Tang Cai, Chun-Yi Peng, Chang-Shui Zhang
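
The abstract above describes plain bagging before the paper's weighted-subspace modification. As a point of reference only, here is a minimal Python sketch of standard bagging (bootstrap resampling plus majority voting); the dataset, base learner, and ensemble size are arbitrary choices, and nothing here reflects the weighting scheme proposed in the paper.

    # Minimal sketch of plain bagging: bootstrap resampling + majority vote.
    # This is NOT the paper's weighted-subspace variant, just the baseline idea.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=300, n_features=20, random_state=0)

    n_models = 25
    models = []
    for _ in range(n_models):
        # Random resampling with replacement: each model sees a bootstrap replica.
        idx = rng.integers(0, len(X), size=len(X))
        models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))

    # Aggregate the ensemble by majority vote.
    votes = np.stack([m.predict(X) for m in models])
    pred = (votes.mean(axis=0) >= 0.5).astype(int)
    print("training accuracy of the bagged ensemble:", (pred == y).mean())
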
EMNLP 2009
On the Use of Virtual Evidence in Conditional Random Fields
Virtual evidence (VE), first introduced by Pearl (1988), provides a convenient way of incorporating prior knowledge into Bayesian networks. This work generalizes the use of VE to...
Xiao Li
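
For context, virtual evidence in Pearl's sense attaches a likelihood vector (a "virtual" observation) to a variable rather than a hard observation. The sketch below shows only this basic Bayesian-network case for a single binary variable, with made-up numbers; it does not cover the paper's generalization to conditional random fields.

    # Minimal sketch of Pearl-style virtual evidence on one binary variable:
    # the soft evidence enters as a likelihood vector, and the posterior is
    # prior * likelihood, renormalized. All numbers are invented.
    import numpy as np

    prior = np.array([0.7, 0.3])             # P(X=0), P(X=1)
    virtual_evidence = np.array([0.2, 0.8])  # likelihood of the virtual observation given X

    posterior = prior * virtual_evidence
    posterior /= posterior.sum()
    print(posterior)  # belief shifted toward X=1 by the soft evidence
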
ICASSP 2011, IEEE
Structured Output Layer neural network language model
This paper introduces a new neural network language model (NNLM) based on word clustering to structure the output vocabulary: Structured Output Layer NNLM. This model is able to h...
Hai Son Le, Ilya Oparin, Alexandre Allauzen, Jean-...
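
The idea of structuring the output vocabulary can be illustrated with a class-factored output layer, where P(w | h) = P(class(w) | h) * P(w | class(w), h), so normalization runs over a small set of classes plus one within-class block instead of the full vocabulary. The PyTorch sketch below is a simplified single-level illustration with invented sizes and class assignments; the actual SOUL model uses a deeper clustering hierarchy and details not shown here.

    # Simplified class-factored output layer (single level, invented sizes).
    import torch
    import torch.nn as nn

    hidden_dim, n_classes, words_per_class = 128, 10, 50

    class ClassFactoredOutput(nn.Module):
        def __init__(self):
            super().__init__()
            self.class_layer = nn.Linear(hidden_dim, n_classes)
            # One small output sub-layer per class, over that class's words only.
            self.word_layers = nn.ModuleList(
                [nn.Linear(hidden_dim, words_per_class) for _ in range(n_classes)]
            )

        def log_prob(self, h, word_class, word_in_class):
            # log P(class | h)
            log_p_class = torch.log_softmax(self.class_layer(h), dim=-1)[word_class]
            # log P(word | class, h), restricted to that class's vocabulary
            log_p_word = torch.log_softmax(self.word_layers[word_class](h), dim=-1)[word_in_class]
            return log_p_class + log_p_word

    h = torch.randn(hidden_dim)          # stand-in for the NNLM's hidden state
    out = ClassFactoredOutput()
    print(out.log_prob(h, word_class=3, word_in_class=17))
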
JCB 2006
Protein Fold Recognition Using Segmentation Conditional Random Fields (SCRFs)
Protein fold recognition is an important step towards understanding protein three-dimensional structures and their functions. A conditional graphical model, i.e., segmentation con...
Yan Liu 0002, Jaime G. Carbonell, Peter Weigele, V...
BIRTHDAY 2009, Springer
On the Evolution of OCL for Capturing Structural Constraints in Modelling Languages
The Object Constraint Language (OCL) can be used to capture structural constraints in the context of the abstract syntax of modelling languages (metamodels) defined in the MOF me...
Dimitrios S. Kolovos, Richard F. Paige, Fiona A. C...