We present an evolving neural network model in which synapses appear and disappear stochastically according to bio-inspired probabilities. These are in general nonlinear functions of the local fields felt by neurons—akin to electrical stimulation—and of the global average field—representing total energy consumption. We find that initial degree distributions then evolve towards stationary states which can either be fairly homogeneous or highly heterogeneous, depending on parameters. The critical cases—which can result in scale-free distributions—are shown to correspond, under a mean-field approximation, to nonlinear drift-diffusion equations. We show how appropriate choices of parameters yield good quantitative agreement with published experimental data concerning synaptic densities during brain development (synaptic pruning).
Samuel Johnson, Joaquín Marro, Jorge F. Mejías
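
The sketch below is an illustrative toy simulation of the kind of evolving network the abstract describes, not the authors' actual model: it assumes a node's degree stands in for the local field it feels, takes the probability of gaining a synapse to be a power law in degree with an assumed exponent `alpha`, and removes existing synapses uniformly at random. All of these functional forms, and the parameter values, are assumptions made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200        # number of neurons (assumed)
k0 = 20        # initial mean degree (assumed)
alpha = 1.5    # nonlinearity of the synaptic-growth probability (assumed)
steps = 5000   # number of add/remove events

# Random initial adjacency matrix with mean degree ~ k0
A = (rng.random((N, N)) < k0 / N).astype(int)
A = np.triu(A, 1)
A = A + A.T

for t in range(steps):
    k = A.sum(axis=1).astype(float)   # node degrees, used as a proxy for local fields

    # Probability of gaining a synapse grows nonlinearly with degree
    p_gain = (k + 1.0) ** alpha
    p_gain /= p_gain.sum()

    # Attach a new synapse to a node chosen with the nonlinear probability
    i = rng.choice(N, p=p_gain)
    j = rng.integers(N)
    if i != j:
        A[i, j] = A[j, i] = 1

    # Remove one existing synapse chosen uniformly at random
    edges = np.argwhere(np.triu(A, 1))
    if len(edges) > 0:
        r, s = edges[rng.integers(len(edges))]
        A[r, s] = A[s, r] = 0

k_final = A.sum(axis=1)
print("mean degree:", k_final.mean(), "max degree:", k_final.max())
```

With `alpha` close to 1 the stationary degree distribution stays fairly homogeneous, while larger values concentrate synapses on a few highly connected nodes; this is only meant to convey the qualitative homogeneous-versus-heterogeneous behaviour mentioned in the abstract, not to reproduce the paper's results.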