The Convergence-Zone model shows how sparse, random memory patterns can lead to one-shot storage and high capacity in the hippocampal component of the episodic memory system. This paper presents a more biologically realistic version of the model, with continuous-valued connection weights and storage through Hebbian learning and normalization. In contrast to the gradual weight adaptation in many neural network models, episodic memory turns out to require high learning rates. Normalization allows earlier patterns to be overwritten, introducing time-dependent forgetting similar to that observed in the hippocampus.

Keywords: Convergence-zone; Episodic memory; Neural network