Channel Linearity Mismatch Effects in Time-Interleaved ADC Systems

Naoki KUROSAWA, Haruo KOBAYASHI, Kensuke KOBAYASHI

Publication
IEICE TRANSACTIONS on Fundamentals of Electronics, Communications and Computer Sciences, Vol. E85-A, No. 4, pp. 749-756
Publication Date: 2002/04/01
Print ISSN: 0916-8508
Type of Manuscript: Special Section PAPER (Special Section of Selected Papers from the 14th Workshop on Circuits and Systems in Karuizawa)
Keywords: ADC, interleave, channel mismatch, DNL, INL


Summary:
A time-interleaved ADC system is an effective way to implement a high-sampling-rate ADC with relatively slow circuits. In such a system, several channel ADCs operate at interleaved sampling times as if they were effectively a single ADC operating at a much higher sampling rate. Mismatches among the channel ADCs degrade the SNR and SFDR of the ADC system as a whole; the effects of offset, gain, and bandwidth mismatches, as well as timing skew of the clocks distributed to the channels, have been well investigated. This paper investigates channel linearity mismatch effects in the time-interleaved ADC system, which are very important in practice but have not been investigated previously. We consider two cases: differential nonlinearity (DNL) mismatch and integral nonlinearity (INL) mismatch. Our numerical simulations show distinct features of such mismatches, especially in the frequency domain. These results can be useful for developing calibration algorithms that compensate for the channel mismatch effects.
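The frequency-domain effect described in the summary can be illustrated with a short numerical sketch. The following Python snippet is not from the paper: it uses a hypothetical per-channel polynomial nonlinearity (coefficients a2, a3 chosen arbitrarily) as a crude stand-in for channel-dependent INL, and shows that because the mismatch pattern repeats every M samples, harmonic distortion products appear as image tones around multiples of fs/M in the interleaved output.

import numpy as np

M = 4                       # number of interleaved channel ADCs
fs = 1.0                    # overall sampling rate (normalized)
N = 4096                    # number of interleaved output samples
fin = 101.0 / N * fs        # coherent input frequency (bin-centered)

# Hypothetical channel-dependent static nonlinearities, a polynomial
# stand-in for per-channel INL: y = x + a2*x^2 + a3*x^3.
a2 = np.array([0.000,  0.002, -0.001,  0.001])
a3 = np.array([0.001, -0.002,  0.002, -0.001])

t = np.arange(N) / fs
x = np.sin(2 * np.pi * fin * t)   # ideal sampled input

# Channel k converts samples k, k+M, k+2M, ... through its own
# nonlinearity; the interleaved output recombines them in time order.
y = np.empty(N)
for k in range(M):
    xk = x[k::M]
    y[k::M] = xk + a2[k] * xk**2 + a3[k] * xk**3

# Spectrum of the interleaved output: since the nonlinearity varies
# periodically with period M samples, distortion products are
# modulated to image tones around multiples of fs/M.
spec = np.abs(np.fft.rfft(y * np.hanning(N))) / N
for b in np.sort(np.argsort(spec)[-8:]):
    print(f"bin {b:4d}  f = {b / N * fs:.4f}*fs  |Y| = {spec[b]:.2e}")

Running the sketch lists the largest spectral bins: besides the input tone at fin and its harmonics, mismatch spurs show up offset from multiples of fs/M, which is the distinct frequency-domain signature that the paper analyzes for DNL and INL mismatch.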