We propose a learning model for coupled oscillators. The learning rule takes a simple form by which the intrinsic frequencies of the component oscillators and the coupli...
We consider the origin of the high-dimensional input space as a variable which can be optimized before or during neuronal learning. This set of variables acts as a translation on ...
Daniel Remondini, Nathan Intrator, Gastone C. Cast...
Hebbian learning has been a staple of neural-network models for many years. It is well known that the most straightforward implementations of this popular learning rule lead to u...
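The unbounded-growth problem this abstract alludes to is easy to reproduce. Below is a minimal sketch (not this paper's method) contrasting the plain Hebbian update with Oja's normalized variant, whose extra decay term is one standard way to bound the weight norm; the data and learning rate are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 3))          # zero-mean random input patterns

eta = 0.01
w_hebb = rng.normal(size=3) * 0.1       # weights under the plain Hebbian rule
w_oja = w_hebb.copy()                   # weights under Oja's stabilized rule

for x in X:
    y = w_hebb @ x
    w_hebb += eta * y * x               # plain Hebb: dw = eta*y*x, norm grows without bound
    y = w_oja @ x
    w_oja += eta * y * (x - y * w_oja)  # Oja: the -y^2*w decay term keeps ||w|| near 1

print(np.linalg.norm(w_hebb))           # very large and still growing
print(np.linalg.norm(w_oja))            # close to 1
```

Running the sketch shows the Hebbian weight norm diverging geometrically while Oja's rule settles onto a unit-norm vector along a principal direction of the input covariance.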
The paradigm of Hebbian learning has recently received a novel interpretation with the discovery of synaptic plasticity that depends on the relative timing of pre- and postsynapti...
Areas of the brain involved in various forms of memory exhibit patterns of neural activity quite unlike those in canonical computational models. We show how to use well-founded Ba...
Independent component analysis (ICA) is a powerful method for separating mixed signals. Most ICA algorithms do not consider the temporal correlations of the signal, but o...
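For contrast with temporally blind ICA, a classical second-order method that does exploit temporal correlations is AMUSE: whiten the mixtures, then diagonalize a symmetrized time-lagged covariance. The sketch below (toy sources and mixing matrix are assumptions for illustration, not necessarily the algorithm this paper proposes) recovers two sources with distinct autocorrelation structure.

```python
import numpy as np

n = 2000
t = np.arange(n)
# two toy sources with different temporal structure (sine vs. square wave)
s = np.vstack([np.sin(0.05 * t), np.sign(np.sin(0.11 * t))])
A = np.array([[1.0, 0.6], [0.4, 1.0]])          # assumed mixing matrix
x = A @ s                                        # observed mixtures

# whiten the mixtures to identity covariance
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(x @ x.T / n)
W = E @ np.diag(d ** -0.5) @ E.T
z = W @ x

# diagonalize the symmetrized lag-1 covariance (AMUSE step)
C1 = z[:, :-1] @ z[:, 1:].T / (n - 1)
C1 = (C1 + C1.T) / 2
_, U = np.linalg.eigh(C1)
y = U.T @ z                                      # recovered sources (up to sign/order)
```

Because the two sources have different lag-1 autocorrelations, the eigenvectors of the lagged covariance separate them; a purely instantaneous (temporally blind) method cannot use this cue.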
Reward-modulated spike-timing-dependent plasticity (STDP) has recently emerged as a candidate for a learning rule that could explain how local learning rules at single synapses su...
Robert A. Legenstein, Dejan Pecevski, Wolfgang Maa...
In this paper a new learning algorithm is proposed for texture segmentation. The algorithm is a competitive clustering scheme with two specific features: elliptic...
Supervised learning rules for spiking neural networks are currently only able to use time-to-first-spike coding and are plagued by very irregular learning curves due t...