ICANN 2009, Springer

Constrained Learning Vector Quantization or Relaxed k-Separability

Neural networks and other sophisticated machine learning algorithms frequently miss simple solutions that can be discovered by more constrained learning methods. The transition from a single neuron solving linearly separable problems, through a multi-threshold neuron solving k-separable problems, to neurons implementing prototypes solving q-separable problems is investigated. Using the Learning Vector Quantization (LVQ) approach, this transition is presented as going from two prototypes defining a single hyperplane, to many collinear prototypes defining parallel hyperplanes, to unconstrained prototypes defining a Voronoi tessellation. For most datasets, relaxing the collinearity condition improves accuracy at the cost of increased model complexity, but for data with an inherent logical structure, LVQ algorithms with constraints significantly outperform the original LVQ and many other algorithms.
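The abstract describes restricting LVQ prototypes to a common line, so that the decision boundaries between consecutive prototypes become parallel hyperplanes. Below is a minimal NumPy sketch of that idea, assuming a plain LVQ1 update rule plus a projection step (fit the prototypes' principal axis via SVD and project them back onto it) to enforce collinearity; the function name, parameters, and the projection trick are illustrative assumptions, not the authors' published algorithm.

```python
import numpy as np

def lvq1_constrained(X, y, prototypes, labels, lr=0.05, epochs=30, collinear=True):
    """LVQ1 with an optional collinearity constraint on the prototypes (sketch)."""
    P = np.array(prototypes, dtype=float)
    for _ in range(epochs):
        for x, t in zip(X, y):
            j = np.argmin(np.linalg.norm(P - x, axis=1))   # nearest prototype
            step = lr * (x - P[j])
            P[j] += step if labels[j] == t else -step      # attract or repel
        if collinear:
            # Project all prototypes back onto their best-fit line, so that
            # consecutive prototypes define parallel hyperplanes.
            c = P.mean(axis=0)
            w = np.linalg.svd(P - c)[2][0]                 # principal axis
            P = c + np.outer((P - c) @ w, w)
    return P

# Hypothetical usage: 3-bit parity, a classic k-separable problem.
X = np.array([[a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)], dtype=float)
y = X.sum(axis=1).astype(int) % 2
init = np.linspace(0.0, 1.0, 4)[:, None] * np.ones(3)     # 4 prototypes on the diagonal
P = lvq1_constrained(X, y, init, labels=np.array([0, 1, 0, 1]))
```

With `collinear=True`, four prototypes along the main diagonal suffice for parity, since class membership depends only on the number of set bits; with the constraint relaxed, the prototypes drift into an unconstrained Voronoi tessellation of the cube.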
Marek Grochowski, Wlodzislaw Duch
Added: 25 Jul 2010
Updated: 25 Jul 2010
Type: Conference
Year: 2009
Where: ICANN