Context trees are a popular and effective tool for tasks such as compression, sequential prediction, and language modeling. We present an algebraic perspective on context trees for the task of individual sequence prediction. Our approach stems from a generalization of the notion of margin used for linear predictors. By exporting the concept of margin to context trees, we are able to cast the individual sequence prediction problem as the task of finding a linear separator in a Hilbert space, and to apply techniques from machine learning and online optimization to this problem. Our main contribution is a memory-efficient adaptation of the Perceptron algorithm for individual sequence prediction. We name our algorithm the Shallow Perceptron and prove a shifting mistake bound, which relates its performance to the performance of any sequence of context trees. We also prove that the Shallow Perceptron grows its context tree at a rate that is upper-bounded by its mistake rate, which imposes an ...
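As background for the margin notion the abstract refers to, the sketch below recalls the classical margin and Perceptron update for linear predictors in binary classification; the symbols $w_t$, $x_t$, $y_t$ are generic notation and are not taken from the paper, and the update shown is the standard Perceptron, not the Shallow Perceptron itself.

% Background only: classical margin and Perceptron update (generic notation,
% not the paper's). A linear predictor $w \in \mathbb{R}^d$ predicts
% $\hat{y}_t = \mathrm{sign}(\langle w, x_t \rangle)$ for an instance $x_t$
% with label $y_t \in \{-1,+1\}$; its (unnormalized) margin on $(x_t, y_t)$ is
\[
  y_t \, \langle w, x_t \rangle ,
\]
% and the Perceptron updates its weight vector only on rounds where this
% margin is non-positive, i.e. on prediction mistakes:
\[
  w_{t+1} \;=\;
  \begin{cases}
    w_t + y_t\, x_t & \text{if } y_t \,\langle w_t, x_t \rangle \le 0,\\[2pt]
    w_t             & \text{otherwise.}
  \end{cases}
\]

The paper's contribution, as stated above, is to generalize this margin notion from vectors in $\mathbb{R}^d$ to context trees, so that an analogous mistake-driven update can be analyzed in a Hilbert space.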