One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of the input. We describe a system that derives such time bounds automatically using abstract interpretation. The semantics-based setting makes it possible to prove the correctness of the time bound function. The system analyses programs in a first-order subset of Lisp, and we show how it can also be used to analyse programs in other languages.
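As a minimal illustration of the idea (not an example from the paper itself), the sketch below shows a first-order Lisp function together with a hand-written time bound for it under an assumed unit-cost step model; the names len and t-len are chosen here for illustration only.

```lisp
;; Computes the length of a list.
(defun len (xs)
  (if (null xs)
      0
      (+ 1 (len (cdr xs)))))

;; A time bound function for LEN: if the input size n is the number of
;; cons cells in XS, then LEN makes at most n + 1 calls, so
;;   T_len(n) = n + 1
;; is an upper bound on its computation time, counting one step per call.
(defun t-len (n)
  (+ n 1))
```

A system of the kind described in the abstract would derive such a bound automatically from the program text, rather than requiring it to be written by hand.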