In this paper, a learning algorithm for a single multiplicative spiking neuron (MSN) is proposed and tested on various applications where a multilayer perceptron (MLP) neural network is conventionally used. It is found that a single MSN suffices for applications that otherwise require a number of neurons distributed across the hidden layers of a conventional neural network. Several benchmark and real-life classification and function-approximation problems are illustrated. It is observed that by incorporating nonlinear synaptic interaction, threshold variability, and spiking phenomena, learning in artificial neural networks can be made more efficient.
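To make the idea of multiplicative synaptic aggregation concrete, the following is a minimal sketch of a forward pass for a single multiplicative neuron with a spike threshold. It is an illustrative assumption, not the paper's exact MSN model: the function name `msn_forward`, the aggregation `v = prod(w_i * x_i + b_i)`, the sigmoid squashing, and the fixed threshold `theta` are all hypothetical choices for exposition.

```python
import math

def msn_forward(x, w, b, theta=0.5):
    """Hypothetical multiplicative spiking neuron forward pass.

    Inputs are aggregated multiplicatively, v = prod(w_i * x_i + b_i),
    rather than by the additive weighted sum of an MLP unit. The
    squashed potential is compared with a threshold theta, which in a
    full MSN model would itself be adaptable (threshold variability).
    """
    v = 1.0
    for xi, wi, bi in zip(x, w, b):
        v *= wi * xi + bi            # multiplicative synaptic interaction
    y = 1.0 / (1.0 + math.exp(-v))   # squash the aggregate potential
    return 1 if y >= theta else 0    # spike (1) or stay silent (0)
```

Because every synapse multiplies into the potential, a single such unit can realize input interactions that an additive neuron needs a hidden layer to capture, which is the intuition behind replacing an MLP with one MSN.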