Abstract. Semantic vector models are traditionally used to model concepts derived from discrete input such as tokenized text. This paper describes a technique for addressing continuous and graded quantities using such models. The method presented here grows out of earlier work on modelling orthography, or letter-by-letter word encoding, in which a graded vector is used to model character-positions within a word. We extend this idea to use a graded vector for a position along any scale. The technique is applied to modelling time-periods in an example dataset of Presidents of the United States. Initial examples demonstrate that encoding the time-periods using graded semantic vectors gives an improvement over modelling the dates in question as distinct strings. This work is significant because it fills a surprising technical gap: though vector spaces over a continuous ground field seem a natural choice for representing graded quantities, this capability has been hitherto lacking, and is a n...
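The core idea of a graded vector for "a position along any scale" can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: it assumes two random high-dimensional endpoint vectors for the extremes of the scale, and represents an intermediate position as a normalized linear blend of the two. The position values and the mapping of years onto [0, 1] are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 1000  # high dimension keeps random endpoint vectors nearly orthogonal

# Hypothetical endpoint vectors marking the two ends of the scale.
start = rng.standard_normal(dim)
end = rng.standard_normal(dim)

def graded_vector(p: float) -> np.ndarray:
    """Vector for relative position p in [0, 1]: blend of the endpoints."""
    v = (1.0 - p) * start + p * end
    return v / np.linalg.norm(v)

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity (inputs are already unit vectors)."""
    return float(np.dot(a, b))

# Positions close together on the scale receive more similar vectors
# than positions far apart, so graded comparison falls out of the
# ordinary vector similarity measure.
v_near_a, v_near_b, v_far = (graded_vector(p) for p in (0.60, 0.65, 0.95))
assert similarity(v_near_a, v_near_b) > similarity(v_near_a, v_far)
```

Under this sketch, similarity between two encoded positions degrades smoothly with their distance along the scale, which is the graded behaviour that modelling dates as distinct strings cannot provide.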