This paper presents a novel swarm-based approach to evolving an optimal set of weights and an optimal architecture of a neural network for classification in data mining. In a distributed environment, the proposed approach randomly generates multiple architectures that compete with one another while refining their architectural weaknesses, yielding an optimal model with maximal classification accuracy. Aiming at better generalization ability, we analyze the use of particle swarm optimization (PSO) to evolve an optimal architecture with high classification accuracy. Experiments on benchmark datasets show that the proposed approach achieves good classification accuracy and generalization ability. Further, the proposed model is compared with competing models to demonstrate its effectiveness in terms of classification accuracy.
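To give a concrete flavour of the underlying idea (a particle swarm searching neural-network weights to maximize classification accuracy), the following minimal Python sketch is included. It is an illustration only, not the authors' implementation: the toy dataset, the fixed single-hidden-layer architecture, and the PSO constants (W_INERTIA, C1, C2) are all assumptions made here, and the proposed approach additionally evolves the architecture itself in a distributed setting, which this sketch omits.

```python
import numpy as np

# Toy two-class dataset (XOR-like decision boundary), purely illustrative.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)

# Assumed fixed single-hidden-layer MLP; PSO searches its flattened weight vector.
N_IN, N_HID, N_OUT = 2, 6, 1
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # weights + biases

def unpack(w):
    """Split a flat particle position into the MLP's weight matrices and biases."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = w[i:i + N_OUT]
    return W1, b1, W2, b2

def accuracy(w):
    """Fitness of a particle: training classification accuracy of the encoded MLP."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output
    return np.mean((p[:, 0] > 0.5) == y)

# Standard global-best PSO update (inertia + cognitive + social terms);
# the coefficients below are common textbook values, assumed for illustration.
SWARM, ITERS = 30, 200
W_INERTIA, C1, C2 = 0.72, 1.49, 1.49

pos = rng.uniform(-1, 1, size=(SWARM, DIM))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_fit = np.array([accuracy(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()
gbest_fit = pbest_fit.max()

for _ in range(ITERS):
    r1, r2 = rng.random((SWARM, DIM)), rng.random((SWARM, DIM))
    vel = W_INERTIA * vel + C1 * r1 * (pbest - pos) + C2 * r2 * (gbest - pos)
    pos = pos + vel
    fit = np.array([accuracy(p) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    if fit.max() > gbest_fit:
        gbest, gbest_fit = pos[fit.argmax()].copy(), fit.max()

print(f"best training accuracy found by PSO: {gbest_fit:.3f}")
```

In this sketch each particle encodes one candidate set of network weights and the swarm's fitness function is simply classification accuracy; extending the encoding to cover architectural choices (e.g., the number of hidden units) would move it closer to the architecture-evolving scheme described in the abstract.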