Digital Calibration and Correction Methods for CMOS Analog-to-Digital Converters

Shiro DOSHO  

IEICE TRANSACTIONS on Electronics   Vol.E95-C   No.4   pp.421-431
Publication Date: 2012/04/01
Online ISSN: 1745-1353
DOI: 10.1587/transele.E95.C.421
Print ISSN: 0916-8516
Type of Manuscript: INVITED PAPER (Special Section on Solid-State Circuit Design – Architecture, Circuit, Device and Design Methodology)
Keywords: analog circuits, Moore's law, high performance, system LSIs, miniaturization, digital calibration, correction


Along with the miniaturization of CMOS LSIs, control methods for LSIs have been extensively developed. The predominant approach is to digitize observed quantities as early as possible and to perform control in the digital domain. Accordingly, many types of analog-to-digital converters (ADCs) have been developed, including temperature, time, delay, and frequency converters. Because their outputs are digital, ADCs are the easiest circuits into which digital correction methods can be introduced. Various calibration methods have been developed, which have markedly improved figures of merit by reducing the design margins reserved for device variations. These calibration and correction methods not only overcome a circuit's weak points but also open the way to entirely new circuit topologies and systems. In this paper, several digital calibration and correction methods for major ADC architectures are described, including pipelined ADCs, delta-sigma ADCs, and successive-approximation ADCs.
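To make the idea of digital correction concrete, the sketch below (not taken from the paper; all names and error values are illustrative assumptions) models an ADC whose transfer curve has gain and offset errors, estimates those errors in the digital domain from two known reference inputs (a simple foreground calibration), and corrects the raw output codes accordingly:

```python
import numpy as np

# Illustrative sketch: foreground digital calibration of an ADC's
# gain and offset errors. The error magnitudes below are hypothetical.

BITS = 10
LSB = 1.0 / (2**BITS - 1)

def adc(v, gain=1.03, offset=0.012):
    """Model a 10-bit ADC whose transfer curve has gain/offset error."""
    code = np.round((gain * v + offset) * (2**BITS - 1))
    return np.clip(code, 0, 2**BITS - 1)

# Foreground calibration: apply two known reference voltages and
# solve the two-point linear error model in the digital domain.
v_lo, v_hi = 0.1, 0.9
c_lo, c_hi = adc(v_lo), adc(v_hi)
g_est = (c_hi - c_lo) * LSB / (v_hi - v_lo)   # estimated gain
o_est = c_lo * LSB - g_est * v_lo             # estimated offset

def corrected(v):
    """Digitally correct raw codes using the estimated error model."""
    return (adc(v) * LSB - o_est) / g_est

v = 0.5
raw_err = abs(adc(v) * LSB - v)   # error of the uncalibrated converter
cal_err = abs(corrected(v) - v)   # error after digital correction
print(cal_err < raw_err)
```

Because the correction runs entirely on the digital codes, the analog stage needs no trimming; the same structure extends to background calibration, where the error estimate is updated continuously (e.g., by an LMS loop) instead of from dedicated reference measurements.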