We study the effects of various emergent topologies of interaction on the rate of language convergence in a population of communicating agents. The agents generate, parse, and learn the meanings of sentences from each other using recurrent neural networks. An agent chooses another agent to interact with and learn from based on the second agent's fitness. Fitness is defined to include a frequency-dependent term capturing the approximate number of interactions an agent has had with others (its "popularity" as a linguistic partner), which in turn is a measure of the proportion of the population that it has taught. This method of frequency-dependent selection is based on our earlier Noisy Preferential Attachment algorithm, which has been shown to produce various network topologies, including scale-free and small-world networks. We show that convergence occurs much more quickly under this strategy than under uniformly random interactions. In addition, this strategy more ...
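To make the partner-selection mechanism concrete, the following is a minimal sketch of frequency-dependent selection in the spirit of Noisy Preferential Attachment, as described above: with some small probability a partner is chosen uniformly at random (the "noise" component), and otherwise with probability proportional to its interaction count. The `epsilon` parameter and the `+1` smoothing for never-chosen agents are illustrative assumptions, not values taken from the paper.

```python
import random

def choose_partner(interaction_counts, self_idx, epsilon=0.1):
    """Noisy preferential attachment partner selection (illustrative sketch).

    interaction_counts[i] -- number of interactions agent i has had so far,
    serving as its "popularity" as a linguistic partner.
    epsilon               -- probability of a uniformly random choice
                             (assumed value, not from the paper).
    """
    candidates = [i for i in range(len(interaction_counts)) if i != self_idx]
    if random.random() < epsilon:
        # Noise component: uniform random choice keeps unpopular agents reachable.
        return random.choice(candidates)
    # Preferential component: weight each candidate by its popularity;
    # the +1 smoothing (an assumption) lets agents with zero interactions be picked.
    weights = [interaction_counts[i] + 1 for i in candidates]
    return random.choices(candidates, weights=weights, k=1)[0]

# After each interaction, incrementing the chosen partner's count reinforces
# popular agents, which is the rich-get-richer dynamic that can yield
# scale-free-like interaction topologies.
```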