HyperNEAT represents a class of neuroevolutionary algorithms that captures some of the power of natural development with a computationally efficient high-level abstraction of development. This class of algorithms is intended to provide many of the desirable properties produced in biological phenotypes by natural developmental processes, such as regularity, modularity, and hierarchy. While it has been previously shown that HyperNEAT produces regular artificial neural network (ANN) phenotypes, in this paper we investigated the open question of whether HyperNEAT can produce modular ANNs. We conducted such research on problems where modularity should be beneficial, and found that HyperNEAT failed to generate modular ANNs. We then imposed modularity on HyperNEAT’s phenotypes and its performance improved, demonstrating that modularity increases performance on this problem. We next tested two techniques to encourage modularity in HyperNEAT, but did not observe an increase in either modularity or performance.
Jeff Clune, Benjamin E. Beckmann, Philip K. McKinley