This paper presents a perturbation-based background calibration scheme for gain and nonlinearity errors in high-resolution pipelined analog-to-digital converters (ADCs). Two uncorrelated pseudo-random sequences inject a perturbation signal into the pipeline stages, from which the gain and nonlinearity errors of the multiplying digital-to-analog converters (MDACs) are estimated. These errors are then corrected in the digital domain to achieve high-resolution performance. A 14-bit pipelined ADC is simulated to verify the proposed calibration scheme: the SNDR improves from 45 dB to 80 dB, and the simulated SFDR exceeds 99 dB, demonstrating the linearity improvement.
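The core idea of perturbation-based background calibration can be illustrated with a minimal sketch. The model below is an assumption for illustration only, not the paper's implementation: a single pipeline stage with a gain error, a single ±1 pseudo-random sequence modulating a small injected perturbation, and a correlation estimate of the actual gain (the paper additionally uses a second uncorrelated sequence and estimates MDAC nonlinearity, which this sketch omits). All numeric values (ideal gain, gain error, perturbation amplitude) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1_000_000          # number of samples averaged (illustrative)
G_ideal = 4.0          # ideal interstage gain (assumed stage model)
G_actual = 3.96        # actual gain with ~1% error (assumed value)
delta = 0.1            # injected perturbation amplitude (illustrative)

vin = rng.uniform(-0.5, 0.5, N)   # input signal, uncorrelated with the PN sequence
pn = rng.choice([-1.0, 1.0], N)   # pseudo-random +/-1 perturbation sequence

# Analog path: the perturbation is added before the (imperfect) stage gain
residue = G_actual * (vin + pn * delta)

# Background estimation: correlate the digitized residue with the PN sequence.
# Since vin and pn are uncorrelated, E[residue * pn] = G_actual * delta.
G_est = np.mean(residue * pn) / delta

# Digital correction: rescale by the estimated gain so the effective
# interstage gain matches the ideal value assumed by the digital encoder
corrected = residue / G_est * G_ideal
```

After correction, correlating `corrected` with `pn` recovers `G_ideal` exactly by construction, which is how the calibration loop verifies convergence in this simplified model.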