The performance of sparsely connected associative memory models built from a set of perceptrons is investigated under different patterns of connectivity. Architectures in which connection distances follow Gaussian and exponential distributions are compared with networks created by progressively rewiring a locally connected network. All three architectures are found to be capable of good pattern-completion performance, but the Gaussian and exponential architectures achieve the same results with a significantly lower mean wiring length. For networks of low connectivity, relatively tight Gaussian and exponential distributions achieve the best overall performance.
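As a concrete illustration of the three connectivity patterns (this sketch is ours, not the authors' code), the following Python fragment generates Gaussian, exponential, and progressively rewired architectures on a one-dimensional ring of N perceptrons, each receiving k afferent connections, and reports the mean wiring length of each. The values of N, k, the distribution widths, and the rewiring probability p are hypothetical placeholders; duplicate and self-connections introduced by rewiring are ignored for brevity.

```python
# Illustrative sketch (assumed parameters, not the paper's code):
# three sparse connectivity patterns on a 1-D ring of N perceptrons.
import numpy as np

rng = np.random.default_rng(0)
N, k = 1000, 50  # network size and in-degree (assumed values)

def ring_distance(i, j, n=N):
    """Shortest distance between units i and j on the ring."""
    d = abs(i - j)
    return min(d, n - d)

def sample_targets(i, dist_sampler):
    """Draw k distinct afferents for unit i, with distances from dist_sampler."""
    targets = set()
    while len(targets) < k:
        d = max(1, int(round(dist_sampler())))  # non-zero ring distance
        j = (i + rng.choice([-d, d])) % N       # step left or right
        if j != i:
            targets.add(j)
    return list(targets)

# Gaussian architecture: connection distances ~ |N(0, sigma)| (sigma assumed)
gaussian = [sample_targets(i, lambda: abs(rng.normal(0, 20))) for i in range(N)]

# Exponential architecture: connection distances ~ Exp(scale) (scale assumed)
exponential = [sample_targets(i, lambda: rng.exponential(20)) for i in range(N)]

# Progressively rewired local network (Watts-Strogatz-style): start from the
# k nearest neighbours, then rewire each connection with probability p.
def rewired_local(p=0.1):
    arch = []
    for i in range(N):
        local = [(i + o) % N for o in range(-k // 2, k // 2 + 1) if o != 0][:k]
        arch.append([int(rng.integers(N)) if rng.random() < p else t
                     for t in local])
    return arch

def mean_wiring_length(arch):
    """Mean ring distance over all connections in an architecture."""
    return np.mean([ring_distance(i, j)
                    for i, ts in enumerate(arch) for j in ts])

for name, arch in [("Gaussian", gaussian), ("exponential", exponential),
                   ("rewired local", rewired_local())]:
    print(f"{name}: mean wiring length = {mean_wiring_length(arch):.1f}")
```

Under this sketch, tightening the Gaussian or exponential width, or lowering p, shortens the mean wiring length, which is the cost measure the comparison above is made against.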