Abstract—Realization of all-digital baseband receiver processing for multi-gigabit communication requires analog-to-digital converters (ADCs) of sufficient rate and output resolution. A promising architecture for this purpose is the time-interleaved ADC (TI-ADC), in which several "sub-ADCs" are employed in parallel. However, the timing mismatch between the sub-ADCs, if left uncompensated, leads to error floors in receiver performance. Standard linear digital mismatch compensation (e.g., based on the zero-forcing criterion) requires a number of taps that increases with the desired resolution. In this paper, we show that oversampling provides a scalable (in the number of sub-ADCs and in the desired resolution) approach to mismatch compensation, allowing elimination of mismatch-induced error floors at reasonable complexity. While the structure of the interference due to mismatch is different from that due to a dispersive channel, there is a strong analogy between the role of oversampling...
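To make the source of the mismatch-induced error floor concrete, consider the standard TI-ADC signal model (a minimal sketch; the symbols $M$, $T_s$, and $\delta_m$ are notation introduced here for illustration, not drawn from the paper's own equations). An $M$-way TI-ADC realizes an aggregate sampling rate of $1/T_s$ by interleaving $M$ sub-ADCs, each running at rate $1/(M T_s)$. With a timing skew $\delta_m$ on the $m$-th sub-ADC, its output samples of the analog input $x(t)$ are
\[
  y_m[n] \;=\; x\bigl((nM + m)\,T_s + \delta_m\bigr),
  \qquad m = 0, \dots, M-1,
\]
so the composite sampling grid is periodically nonuniform with period $M$. This periodic nonuniformity generates attenuated, frequency-shifted images of the input spectrum at offsets that are multiples of $1/(M T_s)$; it is these mismatch images that linear digital compensation must suppress, and suppressing them to a finer resolution is what drives up the required number of taps in a zero-forcing-style compensator.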