Parameterized Learning Complexity

We describe three applications in computational learning theory of techniques and ideas recently introduced in the study of parameterized computational complexity. (1) Using parameterized problem reducibilities, we show that P-sized DNF (CNF) formulas can be exactly learned in time polynomial in the number of variables by extended equivalence queries if and only if the dominating sets of a graph can be learned in polynomial time by extended equivalence queries. (That is, learning by an arbitrary hypothesis class. See Angluin [?].) Since learning dominating sets is a special case of learning monotone CNF formulas, this extends to the exact learning model a result of Kearns, Li, Pitt and Valiant in the PAC prediction model [?]. We show that P-sized DNF (CNF) formulas can be learned exactly in polynomial time by extended equivalence and membership queries if and only if there is an algorithm running in time polynomial in n and k to learn the k-element dominating sets of an n-vertex graph. We...
Rodney G. Downey, Patricia A. Evans, Michael R. Fellows
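The observation that learning dominating sets is a special case of learning monotone CNF formulas rests on a standard correspondence: a vertex set D dominates a graph G exactly when its characteristic vector satisfies the monotone CNF with one clause per vertex, namely the disjunction of the variables in the closed neighbourhood N[v]. A minimal Python sketch of that correspondence follows; the graph, function names, and test sets are illustrative and not taken from the paper.

```python
# Sketch: dominating sets of a graph correspond to satisfying assignments
# of a monotone CNF with one clause per closed neighbourhood N[v].
# All names and the example graph are hypothetical illustrations.

def closed_neighbourhood_cnf(adjacency):
    """One clause per vertex v: the set of vertices in N[v]."""
    return [set(neighbours) | {v} for v, neighbours in adjacency.items()]

def satisfies(cnf, chosen):
    """A monotone CNF is satisfied iff every clause contains a chosen vertex."""
    return all(clause & chosen for clause in cnf)

def is_dominating_set(adjacency, chosen):
    """Direct definition: every vertex is chosen or has a chosen neighbour."""
    return all(v in chosen or any(u in chosen for u in neighbours)
               for v, neighbours in adjacency.items())

if __name__ == "__main__":
    # 5-cycle: vertex i is adjacent to i-1 and i+1 (mod 5).
    cycle5 = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
    cnf = closed_neighbourhood_cnf(cycle5)
    for candidate in ({0, 2}, {0, 1}):
        # The CNF check and the direct dominating-set check always agree.
        assert satisfies(cnf, candidate) == is_dominating_set(cycle5, candidate)
        print(candidate, satisfies(cnf, candidate))  # {0, 2} True, {0, 1} False
```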
Type Conference
Year 1993
Where COLT
Publisher Springer
Authors Rodney G. Downey, Patricia A. Evans, Michael R. Fellows