We have implemented an aspect of learning and memory in the nervous system using analog electronics. Using a simple synaptic circuit, we realize networks with Hebbian-type adaptation rules: with increased synaptic activity, the synaptic weights are increased or decreased, and that change continues with subsequent synaptic activity. This paper explores the relationship between synaptic activity and weight for various inputs. We will use our relatively simple network to bootstrap into larger, more complex systems. This system helps to provide insight into intricate natural designs, such as cerebellar cortex. Using the physical properties of our floating-gate pFET device, we are able to re-establish properties seen previously and build upon these first steps [1], [2]. We can modify our learning-rule rates and dynamics through capacitively coupled input voltages. Our learning rule has connections to reinforcement learning, and therefore may find useful engineering applications [3].
Christal Gordon, Paul E. Hasler
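
To make the qualitative description of the adaptation rule concrete, the following is a minimal software sketch of a Hebbian-type update in which correlated pre- and postsynaptic activity repeatedly pushes a weight in the same direction. It is an illustrative assumption, not the authors' floating-gate circuit model; the function name hebbian_step, the outer-product form, and the rate parameter (standing in for the adaptation rate set through capacitively coupled input voltages) are all choices made for this sketch.

```python
import numpy as np

def hebbian_step(weights, pre, post, rate=0.01):
    """One update of a simple Hebbian-type rule (illustrative only).

    weights : (n_post, n_pre) array of synaptic weights
    pre     : (n_pre,) presynaptic activity
    post    : (n_post,) postsynaptic activity
    rate    : learning rate; a stand-in for the adaptation rate that the
              chip would set through capacitively coupled input voltages
    """
    # Weight change is proportional to the pre/post activity correlation,
    # so repeated activity keeps moving the weight in the same direction.
    return weights + rate * np.outer(post, pre)

# Toy usage: repeated correlated activity strengthens only the active synapse.
w = np.zeros((1, 2))
pre = np.array([1.0, 0.0])          # only the first input is active
for _ in range(100):
    post = w @ pre + pre[:1]        # toy postsynaptic response
    w = hebbian_step(w, pre, post)
print(w)                            # weight of the active synapse has grown
```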