
Evolving neural networks in compressed weight space

We propose a new indirect encoding scheme for neural networks in which the weight matrices are represented in the frequency domain by sets of Fourier coefficients. This scheme exploits spatial regularities in the matrix to reduce the dimensionality of the representation by ignoring high-frequency coefficients, as is done in lossy image compression. We compare the efficiency of searching in this “compressed” network space to searching in the space of directly encoded networks, using the CoSyNE neuroevolution algorithm on three benchmark problems: pole balancing, ball throwing, and octopus-arm control. The results show that this encoding can dramatically reduce the search space dimensionality such that solutions can be found in significantly fewer evaluations.

Categories and Subject Descriptors: I.2.6 [Artificial Intelligence]: Learning—Connectionism and neural nets

General Terms: Algorithms
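As a rough illustration of the idea (not the authors' implementation), the sketch below decodes a short genome of low-frequency coefficients into a full weight matrix with an inverse DCT, the transform used in lossy image compression. The function name, the diagonal coefficient ordering, and the use of SciPy are assumptions made for this example only.

import numpy as np
from scipy.fft import idctn

def decode_weights(genome, shape):
    # Illustrative sketch: place the evolved coefficients in the
    # low-frequency corner of a coefficient array, then inverse-transform
    # it into a full, smoothly varying weight matrix.
    coeffs = np.zeros(shape)
    # Visit positions in order of increasing frequency (diagonal order),
    # so a short genome simply omits the high-frequency coefficients.
    positions = sorted(
        ((i, j) for i in range(shape[0]) for j in range(shape[1])),
        key=lambda p: (p[0] + p[1], p[0]),
    )
    for value, (i, j) in zip(genome, positions):
        coeffs[i, j] = value
    return idctn(coeffs, norm="ortho")

# Example: 6 evolved coefficients expand into an 8x10 weight matrix,
# so search happens in 6 dimensions instead of 80.
genome = np.random.randn(6)
W = decode_weights(genome, (8, 10))
print(W.shape)  # (8, 10)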
Type Conference
Year 2010
Where GECCO
Publisher Springer
Authors Jan Koutnik, Faustino J. Gomez, Jürgen Schmidhuber