This paper investigates connection strategies for sparsely connected associative memory models. The question matters because real neural networks must combine efficient performance with minimal wiring length. Using a Genetic Algorithm to evolve networks, we show that connection strategies in which the number of connections falls off exponentially with distance from a unit perform efficiently while keeping wiring costs low. This implies that, when modelling brain-like abilities in artificial neural networks, good performance can be achieved with only a small number of long-range connections.
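As an illustration of the kind of strategy described above, the sketch below builds a sparse connectivity matrix on a ring of units in which each unit receives a fixed number of connections, drawn with probability decaying exponentially with distance. This is a hypothetical reconstruction, not the authors' code; the parameters `n_units`, `k_per_unit`, and the decay rate `lam` are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of a distance-dependent sparse connection strategy:
# connection probability ~ exp(-lam * d) for ring distance d.
# Assumed parameters; not taken from the paper.
import numpy as np

def exponential_connectivity(n_units, k_per_unit, lam, rng=None):
    """Return an n_units x n_units 0/1 matrix where entry [i, j] = 1
    means unit i receives a connection from unit j. Sources are sampled
    with probability decaying exponentially with ring distance."""
    rng = np.random.default_rng() if rng is None else rng
    conn = np.zeros((n_units, n_units), dtype=int)
    idx = np.arange(n_units)
    for i in range(n_units):
        # Shortest distance from unit i to every other unit on the ring.
        d = np.minimum((idx - i) % n_units, (i - idx) % n_units)
        w = np.exp(-lam * d)
        w[i] = 0.0  # no self-connection
        p = w / w.sum()
        sources = rng.choice(n_units, size=k_per_unit, replace=False, p=p)
        conn[i, sources] = 1
    return conn

# Mean ring distance per connection shrinks as lam grows, which is the
# wiring-cost side of the performance/wiring trade-off the paper examines.
C = exponential_connectivity(n_units=200, k_per_unit=10, lam=0.1)
i, j = np.nonzero(C)
d = np.minimum((j - i) % 200, (i - j) % 200)
print("mean wiring length:", d.mean())
```

In a GA setting of the kind the abstract mentions, a parameter such as `lam` (or a more general connection-probability profile) would be part of the evolved genome, with fitness balancing recall performance against total wiring length.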