We discuss the properties of high-order neurons in competitive learning. In such neurons, geometric shapes assume the role of the classic `point' neurons of neural networks. Complex analytical shapes are modeled by replacing the classic synaptic weight of the neuron with a high-order tensor in homogeneous coordinates. Such neurons permit not only mapping of the data domain but also decomposition of some of its topological properties, which may reveal the symbolic structure of the data. Moreover, the eigentensors of the synaptic tensors reveal the coefficients of the polynomial rules that the network is essentially carrying out. We show how such neurons can be formulated to follow the maximum-correlation activation principle and to permit simple local Hebbian learning. We demonstrate decomposition of spatial arrangements of data clusters, including very close and partially overlapping clusters, which are difficult to separate with classic neurons.
Hod Lipson, Hava T. Siegelmann
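The ingredients named in the abstract (a synaptic tensor in homogeneous coordinates, competitive winner selection, a local Hebbian update, and eigendecomposition of the converged tensor) can be illustrated with a minimal sketch. This is a toy second-order variant, not the paper's exact formulation: each neuron stores a symmetric tensor built from outer products of homogeneously lifted inputs, the winner is selected by a quadratic-form activation, and only the winner takes a Hebbian step. All names (`SecondOrderNeuron`, `hebbian_update`, the seed-from-one-sample initialisation) are illustrative assumptions.

```python
# Illustrative sketch only; names and the exact update rule are assumptions,
# not the formulation of the Lipson-Siegelmann paper.
import numpy as np

rng = np.random.default_rng(0)

def homogeneous(x):
    """Lift a point x in R^d to homogeneous coordinates z = (x, 1)."""
    return np.append(x, 1.0)

class SecondOrderNeuron:
    """Toy second-order neuron: a symmetric tensor over homogeneous coordinates."""
    def __init__(self, z, eps=0.1):
        # Seed the synaptic tensor from one sample's outer product,
        # regularised so that it is invertible.
        self.C = np.outer(z, z) + eps * np.eye(len(z))

    def distance(self, z):
        # Quadratic-form activation: small when z lies within the shape
        # the tensor has learned, large otherwise.
        return float(z @ np.linalg.pinv(self.C) @ z)

    def hebbian_update(self, z, eta=0.02):
        # Local Hebbian step: pull the tensor toward the input's outer product.
        self.C = (1 - eta) * self.C + eta * np.outer(z, z)

# Two well-separated 2-D Gaussian clusters as synthetic data.
data = np.vstack([rng.normal([0.0, 0.0], 0.2, (100, 2)),
                  rng.normal([3.0, 3.0], 0.2, (100, 2))])

# One neuron seeded from a sample of each cluster.
neurons = [SecondOrderNeuron(homogeneous(data[0])),
           SecondOrderNeuron(homogeneous(data[100]))]

def winner(x):
    z = homogeneous(x)
    return int(np.argmin([n.distance(z) for n in neurons]))

# Competitive learning: only the winning neuron updates its tensor.
for x in rng.permutation(data):
    neurons[winner(x)].hebbian_update(homogeneous(x))

labels = np.array([winner(x) for x in data])
# Eigenvectors of a converged tensor hold the coefficients of the quadric
# (polynomial) surface that the neuron represents.
eigvals, eigvecs = np.linalg.eigh(neurons[0].C)
```

After training, the two clusters map to different winning neurons, and `eigvecs` exposes the polynomial structure the abstract alludes to.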