The tutorial starts with an overview of the concepts of VC dimension and structural risk minimization. We then describe linear Support Vector Machines (SVMs) for separable and non-...
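To make the separable linear case concrete, here is a minimal sketch of a hard-margin linear SVM. The tutorial itself is library-agnostic; the use of scikit-learn, the synthetic two-cloud data, and the large-C approximation of the hard-margin formulation are all my assumptions for illustration.

```python
# Sketch: linear SVM on separable data. A very large C approximates the
# hard-margin formulation used in the separable case (assumption, not the
# tutorial's own code). Data is invented for illustration.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Two well-separated point clouds around (-2,-2) and (2,2).
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.array([0] * 20 + [1] * 20)

clf = SVC(kernel="linear", C=1e6).fit(X, y)  # large C ~ hard margin
w, b = clf.coef_[0], clf.intercept_[0]
margin = 2.0 / np.linalg.norm(w)             # geometric margin width 2/||w||
print("w:", w, " b:", b, " margin:", margin)
print("support vectors:\n", clf.support_vectors_)
```

Only the points returned in `support_vectors_` determine the separating hyperplane, which is the geometric fact the tutorial's separable-case discussion builds on.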
Most of the work on the Vapnik-Chervonenkis dimension of neural networks has been focused on feedforward networks. However, recurrent networks are also widely used in learning app...
The notion of embedding a class of dichotomies in a class of linear half spaces is central to the support vector machines paradigm. We examine the question of determining the mini...
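The basic feasibility question behind such embeddings is whether a given dichotomy is realized by a linear half-space in R^d at all. The sketch below poses this as a linear program; the use of scipy's `linprog` and the XOR example are my choices, not the paper's.

```python
# Sketch: test whether a dichotomy (X, y) is realized by a half-space in R^d,
# i.e. whether some w, b satisfy y_i (w . x_i + b) >= 1 for all i.
import numpy as np
from scipy.optimize import linprog

def halfspace_realizable(X, y):
    n, d = X.shape
    signs = np.where(y > 0, 1.0, -1.0)
    # Variables (w_1..w_d, b); constraints -y_i (x_i . w + b) <= -1.
    A_ub = -signs[:, None] * np.hstack([X, np.ones((n, 1))])
    res = linprog(c=np.zeros(d + 1), A_ub=A_ub, b_ub=-np.ones(n),
                  bounds=[(None, None)] * (d + 1))
    return res.success

# XOR labels on the unit square are not half-space realizable in R^2 ...
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
print(halfspace_realizable(X, np.array([1, -1, -1, 1])))   # False
# ... but become realizable after embedding into R^3 via the product feature.
X3 = np.hstack([X, X[:, :1] * X[:, 1:]])
print(halfspace_realizable(X3, np.array([1, -1, -1, 1])))  # True
```

The dimension jump from 2 to 3 in this toy case is exactly the quantity the abstract asks about: the minimal Euclidean dimension needed to embed a class of dichotomies into half-spaces.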
We study the complexity of approximating the VC dimension of a collection of sets, when the sets are encoded succinctly by a small circuit. We show that this problem is Σ^p_3-...
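As a contrast to the succinct (circuit-encoded) setting the hardness results concern, here is a brute-force VC dimension computation for an explicitly given set system; its exhaustive search over subsets hints at why succinct inputs push the problem high into the polynomial hierarchy. The example set system (intervals) is my own.

```python
# Sketch: exact VC dimension of an explicit set system by exhaustive search.
from itertools import combinations

def shattered(A, sets):
    """A is shattered iff every subset of A arises as S & A for some S."""
    traces = {frozenset(A & S) for S in sets}
    return len(traces) == 2 ** len(A)

def vc_dimension(universe, sets):
    best = 0
    for k in range(1, len(universe) + 1):
        if any(shattered(set(A), sets) for A in combinations(universe, k)):
            best = k
        else:
            break  # subsets of shattered sets are shattered, so stop here
    return best

# Intervals over {0..5}; the class of intervals has VC dimension 2.
universe = list(range(6))
intervals = [frozenset(range(i, j + 1)) for i in range(6) for j in range(i, 6)]
print(vc_dimension(universe, intervals))  # 2
```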
We examine the so-called rigorous support vector machine (RSVM) approach proposed by Vapnik (1998). The formulation of RSVM is derived by explicitly implementing the structural ris...
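A small numeric sketch of the structural-risk-minimization step that RSVM makes explicit: over a nested sequence of classes with growing VC dimension, pick the one minimizing empirical risk plus the VC confidence term. The bound below is Vapnik's classic guaranteed-risk form; the sample size and empirical risks are invented numbers.

```python
# Sketch: SRM model selection over nested classes (illustrative figures only).
import math

def vc_bound(emp_risk, h, n, eta=0.05):
    """Vapnik's guaranteed risk: R_emp + sqrt((h(ln(2n/h)+1) + ln(4/eta))/n)."""
    return emp_risk + math.sqrt((h * (math.log(2 * n / h) + 1)
                                 + math.log(4 / eta)) / n)

n = 10_000
# Richer classes fit better (lower R_emp) but pay a larger capacity penalty.
classes = [(0.20, 5), (0.12, 20), (0.08, 80), (0.07, 320)]  # (R_emp, VC dim)
bounds = [vc_bound(r, h, n) for r, h in classes]
for (r, h), b in zip(classes, bounds):
    print(f"h={h:4d}  R_emp={r:.2f}  bound={b:.3f}")
best = min(range(len(classes)), key=bounds.__getitem__)
print("SRM picks the class with VC dimension", classes[best][1])
```

With these figures the bound is minimized at an intermediate class (h = 20), which is the trade-off between fit and capacity that SRM formalizes.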
This paper is concerned with generalization issues for a decision tree learner for structured data called Alkemy. Motivated by error bounds established in statistical lea...
We give a compression scheme for any maximum class of VC dimension d that compresses any sample consistent with a concept in the class to at most d unlabeled points from the domain...
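A toy instance of unlabeled sample compression for the simplest maximum class, thresholds h_t(x) = 1 iff x >= t on the line (VC dimension 1): a consistent sample compresses to at most one unlabeled point, matching the "at most d points" guarantee. The class and the scheme below are my illustration, not the paper's construction.

```python
# Sketch: unlabeled compression for thresholds on the real line (VC dim 1).

def compress(sample):
    """Keep only the smallest positively labeled point (or nothing)."""
    positives = [x for x, label in sample if label == 1]
    return {min(positives)} if positives else set()

def decompress(kept):
    """Reconstruct a hypothesis from the kept unlabeled point(s)."""
    if not kept:
        return lambda x: 0            # no positives kept: all-negative
    (p,) = kept
    return lambda x: int(x >= p)      # threshold at the kept point

# Sample consistent with the threshold t = 2.5.
sample = [(0.5, 0), (1.7, 0), (2.9, 1), (4.0, 1)]
kept = compress(sample)               # {2.9}: a single unlabeled point
h = decompress(kept)
assert all(h(x) == label for x, label in sample)
print("kept:", kept)
```

Note that the stored point carries no label: the reconstruction rule alone recovers a hypothesis consistent with the whole sample, which is what makes the scheme "unlabeled" in the sense of the abstract.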