In this paper, we propose an extension of sequence kernels to the case where the symbols that define the sequences have multiple representations. This configuration arises, for instance, in natural language processing, where words can be characterized along different linguistic dimensions. The core of our contribution is to integrate the different representations early in the kernel computation, in a way that generates rich composite features defined across the various symbol dimensions.
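As a rough illustration of the general idea (a toy sketch, not the kernel actually proposed in the paper), consider an n-gram kernel over sequences whose symbols carry several dimensions, e.g. a hypothetical (word, part-of-speech) pair. The per-position symbol similarity sums over dimensions, and multiplying these sums across the positions of an n-gram implicitly enumerates composite features that mix dimensions across positions — an early-integration scheme in the sense described above:

```python
def symbol_kernel(a, b):
    # Early integration: compare two symbols across all of their
    # dimensions (here, surface form and part-of-speech tag).
    return sum(x == y for x, y in zip(a, b))

def ngram_kernel(s, t, n=2):
    # Sum, over all pairs of n-grams from s and t, of the product of
    # per-position symbol similarities. Expanding the product of sums
    # yields one term per combination of matched dimensions, i.e. the
    # composite cross-dimensional features.
    total = 0
    for i in range(len(s) - n + 1):
        for j in range(len(t) - n + 1):
            prod = 1
            for k in range(n):
                prod *= symbol_kernel(s[i + k], t[j + k])
            total += prod
    return total

# Hypothetical toy data: each symbol is a (word, POS-tag) pair.
s = [("the", "DT"), ("cat", "NN"), ("sleeps", "VB")]
t = [("the", "DT"), ("dog", "NN"), ("sleeps", "VB")]
print(ngram_kernel(s, t, n=2))  # → 4
```

Here the bigram ("the", "cat") partially matches ("the", "dog") because both dimensions of "the" agree and the POS tags of "cat"/"dog" agree, even though the words differ; integrating the dimensions after the sequence comparison (late integration) could not produce such mixed matches.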