Gary B. Parker, Matt Parker, Steven D. Johnson

Abstract- Interactive combat games are useful testbeds for learning systems based on evolutionary computation. Games that can be adjusted to differing levels of complexity are particularly valuable. In this paper, we present Xpilot as a learning environment that supports the evolution of primitive reactive behaviors, yet is complex enough to require combat strategies and team cooperation. We use this environment with a genetic algorithm to learn the weights of an artificial neural network controller that provides both offensive and defensive reactive control for an autonomous agent.
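To make the core technique concrete, the sketch below shows a genetic algorithm evolving the weights of a small feedforward neural network. This is not the paper's actual Xpilot controller: the network size, GA parameters, and the XOR fitness task are all stand-in assumptions chosen so the example is self-contained, since the Xpilot environment itself is not reproduced here.

```python
import math
import random

random.seed(0)

IN, HID = 2, 4
# Flat genome: hidden weights, hidden biases, output weights, output bias.
GENOME = IN * HID + HID + HID + 1

def forward(w, x):
    """Feedforward 2-4-1 network: tanh hidden units, sigmoid output."""
    hidden = []
    for h in range(HID):
        s = w[IN * HID + h]                      # hidden bias
        for i in range(IN):
            s += w[h * IN + i] * x[i]            # hidden weight
        hidden.append(math.tanh(s))
    base = IN * HID + HID
    s = w[base + HID]                            # output bias
    for h in range(HID):
        s += w[base + h] * hidden[h]             # output weight
    return 1.0 / (1.0 + math.exp(-s))

# Stand-in fitness task: negated squared error on XOR.
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def fitness(w):
    return -sum((forward(w, x) - t) ** 2 for x, t in XOR)

def evolve(pop_size=60, generations=200, mut_rate=0.2, mut_sigma=0.5):
    """Evolve network weights: elitism, truncation selection,
    uniform crossover, and Gaussian mutation."""
    pop = [[random.uniform(-1, 1) for _ in range(GENOME)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        next_pop = pop[:2]                       # keep the two elites
        while len(next_pop) < pop_size:
            a, b = random.sample(pop[:pop_size // 2], 2)
            child = [random.choice(pair) for pair in zip(a, b)]
            child = [g + random.gauss(0, mut_sigma)
                     if random.random() < mut_rate else g
                     for g in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

best = evolve()
```

In the paper's setting the fitness function would instead score an agent's combat performance inside Xpilot, but the chromosome encoding and the GA loop follow the same pattern.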