We prove that the concept class of disjunctions cannot be pointwise approximated by linear combinations of any small set of arbitrary real-valued functions. That is, suppose that there exist functions $\phi_1,\dots,\phi_r\colon\{-1,1\}^n\to\mathbb{R}$ with the property that every disjunction $f$ on $n$ variables has $\bigl\|f-\sum_{i=1}^{r}\alpha_i\phi_i\bigr\|_\infty\leq 1/3$ for some reals $\alpha_1,\dots,\alpha_r$. We prove that then $r\geq\exp\{\Omega(\sqrt{n})\}$, which is tight. We prove an incomparable lower bound for the concept class of decision lists. For the concept class of majority functions, we obtain a lower bound of $\Omega(2^n/n)$, which almost meets the trivial upper bound of $2^n$ for any concept class. These lower bounds substantially strengthen and generalize the polynomial approximation lower bounds of Paturi (1992) and show that the regression-based agnostic learning algorithm of Kalai et al. (2005) is optimal.

Keywords. Agnostic learning, approximate rank, matrix analysis, communication complexity.

Subject classification. 03D15, 68Q32, 68Q17.
Adam R. Klivans, Alexander A. Sherstov
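For readability, the main lower bound of the abstract can be collected into a single displayed statement. The following minimal LaTeX sketch restates it in the abstract's own notation ($\phi_i$, $\alpha_i$, $n$, $r$); the theorem environment and its name are our additions, not the paper's numbering.

% Self-contained LaTeX sketch restating the abstract's main lower bound.
% The environment name and theorem title are assumptions; the statement
% itself follows the abstract.
\documentclass{article}
\usepackage{amsmath,amssymb,amsthm}
\newtheorem{theorem}{Theorem}

\begin{document}
\begin{theorem}[Pointwise approximation of disjunctions]
Let $\phi_1,\dots,\phi_r\colon\{-1,1\}^n\to\mathbb{R}$ be arbitrary functions
such that every disjunction $f$ on $n$ variables satisfies
\[
  \Bigl\| f - \sum_{i=1}^{r} \alpha_i \phi_i \Bigr\|_\infty \;\leq\; \frac{1}{3}
\]
for some reals $\alpha_1,\dots,\alpha_r$ (which may depend on $f$).
Then $r \geq \exp\{\Omega(\sqrt{n})\}$, and this bound is tight.
\end{theorem}
\end{document}

Writing the statement this way makes explicit that the base functions $\phi_1,\dots,\phi_r$ are fixed once and for all, while the coefficients $\alpha_1,\dots,\alpha_r$ may be chosen separately for each disjunction $f$.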