This paper considers the least-squares online gradient descent algorithm in a reproducing kernel Hilbert space (RKHS) without explicit regularization. We present a novel capacity-independent approach to deriving error bounds and convergence results for this algorithm. We show that, although the algorithm does not involve an explicit RKHS regularization term, choosing the step sizes appropriately yields error rates competitive with those of both offline and online regularization algorithms in the literature.

Short Title: Online gradient descent learning

Keywords and Phrases: Online learning, reproducing kernel Hilbert space, gradient descent, error analysis.

AMS Subject Classification Numbers: 68Q32, 68T05, 62J02, 62L20.

Contact author: Yiming Ying. Telephone: +44 (0)20 7387 0374; Fax: +44 (0)20 7387 1397
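For concreteness, the unregularized update studied here can be sketched as follows; this is a minimal sketch assuming samples $(x_t, y_t)$, step sizes $\eta_t > 0$, kernel sections $K_{x_t} := K(x_t, \cdot)$, and initialization $f_1 = 0$, with the precise scheme specified in the body of the paper:
\[
  f_{t+1} \;=\; f_t \;-\; \eta_t \,\bigl(f_t(x_t) - y_t\bigr)\, K_{x_t}, \qquad t = 1, 2, \ldots
\]
The absence of a regularization term of the form $-\eta_t \lambda f_t$ is what distinguishes this scheme from its regularized counterpart; the decay of the step sizes $\eta_t$ alone controls the complexity of the iterates.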