In this paper, a fast adaptive neural network classifier named FANNC is proposed. FANNC exploits the advantages of both adaptive resonance theory and field theory. It requires only one-pass learning, and achieves not only high predictive accuracy but also fast learning speed. Moreover, FANNC has incremental learning ability: when new instances are fed in, it does not need to be retrained on the whole training set. Instead, it can learn the knowledge encoded in those instances by slightly adjusting the network topology when necessary, that is, by adaptively appending one or two hidden units and the corresponding connections to the existing network. This characteristic makes FANNC well suited to real-time online learning tasks. Furthermore, since the network architecture is set up adaptively, the need to manually determine the number of hidden units, a drawback of most feed-forward neural networks, is avoided. Benchmark tests show that FANNC is a preferable neural network classifier, which is superior to several other...
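To make the incremental-learning idea concrete, the following is a minimal sketch, not FANNC itself: it uses a simplified prototype-based hidden layer with a single distance threshold (the `vigilance` parameter, the class name `IncrementalPrototypeClassifier`, and the method names are illustrative assumptions, not the paper's definitions). It only illustrates how a network can absorb a new instance by appending a hidden unit instead of retraining on all previous data.

```python
import numpy as np


class IncrementalPrototypeClassifier:
    """Illustrative sketch of one-pass, incremental learning.

    Hypothetical simplification: each hidden unit is a prototype with a
    fixed receptive-field radius. An instance not covered by any prototype
    of its class triggers the creation of a new hidden unit, so earlier
    training data never needs to be revisited.
    """

    def __init__(self, vigilance=0.5):
        self.vigilance = vigilance   # assumed match threshold, not the paper's exact criterion
        self.prototypes = []         # list of (center, label) pairs acting as hidden units

    def partial_fit(self, x, y):
        """Learn a single instance in one pass."""
        x = np.asarray(x, dtype=float)
        # Find the closest existing hidden unit with the same label, if any.
        best, best_dist = None, np.inf
        for i, (center, label) in enumerate(self.prototypes):
            d = np.linalg.norm(x - center)
            if label == y and d < best_dist:
                best, best_dist = i, d
        if best is not None and best_dist <= self.vigilance:
            # Instance is covered: nudge the matching hidden unit toward it.
            center, label = self.prototypes[best]
            self.prototypes[best] = ((center + x) / 2.0, label)
        else:
            # Instance is not covered: append a new hidden unit for it.
            self.prototypes.append((x, y))

    def predict(self, x):
        """Classify by the nearest hidden unit (prototype)."""
        x = np.asarray(x, dtype=float)
        dists = [np.linalg.norm(x - c) for c, _ in self.prototypes]
        return self.prototypes[int(np.argmin(dists))][1]


# Usage: instances arrive one at a time; old data is never revisited.
clf = IncrementalPrototypeClassifier(vigilance=0.8)
for x, y in [([0.1, 0.2], 0), ([0.9, 0.8], 1), ([0.15, 0.25], 0)]:
    clf.partial_fit(x, y)
print(clf.predict([0.12, 0.22]))   # expected: 0
```

The design point this sketch shares with the abstract is that learning a new instance is a local operation (adjust or append one hidden unit), which is what makes one-pass, online operation possible.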