Lazy Bayesian Rules: A Lazy Semi-Naive Bayesian Learning Technique Competitive to Boosting Decision Trees

Lbr is a lazy semi-naive Bayesian classifier learning technique, designed to alleviate the attribute interdependence problem of naive Bayesian classification. To classify a test example, it creates a conjunctive rule that selects a most appropriate subset of training examples and induces a local naive Bayesian classifier using this subset. Lbr can significantly improve the performance of the naive Bayesian classifier. A bias and variance analysis of Lbr reveals that it significantly reduces the bias of naive Bayesian classification at a cost of a slight increase in variance. It is interesting to compare this lazy technique with boosting and bagging, two well-known state-of-the-art non-lazy learning techniques. Empirical comparison of Lbr with boosting decision trees on discrete-valued data shows that Lbr has, on average, significantly lower variance and higher bias. As a result of the interaction of these effects, the average prediction error of Lbr over a range of learning tasks is at a level...
Zijian Zheng, Geoffrey I. Webb, Kai Ming Ting
Type: Conference
Year: 1999
Where: ICML
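The abstract describes Lbr's core step: for each test example, grow a conjunctive rule whose conditions are taken from the test example's own attribute values, restrict the training data to the examples that rule covers, and classify with a naive Bayesian classifier trained only on that subset and on the remaining attributes. The sketch below illustrates that idea in Python; the function names, the support threshold, and the crude in-sample error estimate used to accept a rule condition are assumptions made for this illustration, not the published algorithm's more careful leave-one-out style evaluation.

from collections import Counter, defaultdict

def _error(predict, rows, labels):
    # Fraction of the given examples the classifier gets wrong.
    return sum(predict(r) != y for r, y in zip(rows, labels)) / len(labels)

def train_nb(rows, labels, attrs):
    # Fit a Laplace-smoothed categorical naive Bayes over `attrs`; returns a predict function.
    class_counts = Counter(labels)
    value_counts = {a: defaultdict(Counter) for a in attrs}
    values = {a: set() for a in attrs}
    for row, y in zip(rows, labels):
        for a in attrs:
            value_counts[a][y][row[a]] += 1
            values[a].add(row[a])
    n = len(labels)
    def predict(x):
        best, best_score = None, float("-inf")
        for c, nc in class_counts.items():
            score = nc / n
            for a in attrs:
                score *= (value_counts[a][c][x[a]] + 1) / (nc + len(values[a]))
            if score > best_score:
                best, best_score = c, score
        return best
    return predict

def lbr_classify(rows, labels, attrs, test, min_support=10):
    # Greedily move attributes from the naive Bayes into a conjunctive rule whose
    # conditions are copied from the test example, keeping a move only if the local
    # classifier's (in-sample) error improves.
    rule = {}                        # antecedent: attribute -> value required to match `test`
    free = list(attrs)               # attributes still modelled by the local naive Bayes
    cur_rows, cur_labels = list(rows), list(labels)
    cur_err = _error(train_nb(cur_rows, cur_labels, free), cur_rows, cur_labels)
    improved = True
    while improved and free:
        improved = False
        for a in list(free):
            covered = [(r, y) for r, y in zip(cur_rows, cur_labels) if r[a] == test[a]]
            if len(covered) < min_support:
                continue             # not enough local evidence to specialise further
            s_rows = [r for r, _ in covered]
            s_labels = [y for _, y in covered]
            rest = [b for b in free if b != a]
            err = _error(train_nb(s_rows, s_labels, rest), s_rows, s_labels)
            if err < cur_err:
                rule[a] = test[a]
                free, cur_rows, cur_labels, cur_err = rest, s_rows, s_labels, err
                improved = True
                break                # restart the scan on the reduced subset
    # Classify with the naive Bayes induced from the examples covered by `rule`.
    return train_nb(cur_rows, cur_labels, free)(test)

In a minimal usage, rows would be a list of dicts mapping attribute names to discrete values, labels a parallel list of class labels, and attrs the list of attribute names; the greedy loop then mirrors the abstract's description of selecting an appropriate subset and inducing a local naive Bayesian classifier for each test example.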