The choice of transfer functions may strongly influence the complexity and performance of neural networks used in classification and approximation tasks. A taxonomy of activation and output functions is proposed, allowing many transfer functions to be generated. Several lesser-known types of transfer functions and new combinations of activation and output functions are described. Functions that are parameterized to change from a localized to a delocalized type, functions with activations based on non-Euclidean distance measures, and bicentral functions formed from pairs of sigmoids are discussed.
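To make the last point concrete, a bicentral function can be built in each input dimension from a pair of sigmoids sharing a center but shifted in opposite directions; a minimal sketch of one common form (the symbols $t_i$, $b_i$, $s_i$ for centers, biases, and slopes are illustrative notation, not taken from the text above) is
\[
\mathrm{Bi}(\mathbf{x};\mathbf{t},\mathbf{b},\mathbf{s}) \;=\; \prod_{i=1}^{N} \sigma\!\bigl(s_i (x_i - t_i + b_i)\bigr)\,\Bigl[\,1 - \sigma\!\bigl(s_i (x_i - t_i - b_i)\bigr)\Bigr],
\qquad \sigma(u) = \frac{1}{1 + e^{-u}} .
\]
For $s_i, b_i > 0$ each factor acts as a soft window, close to $1$ inside the interval $[t_i - b_i,\, t_i + b_i]$ and decaying toward $0$ outside it, so the product gives a localized, separable response constructed entirely from sigmoidal pairs.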