Timing mismatch calibration circuit for high-speed time-interleaved ADCs

Bibliographic Details
Main Author: Liu, Yifei
Other Authors: Chang, Joseph Sylvester
Format: Theses and Dissertations
Language: English
Published: 2018
Online Access:http://hdl.handle.net/10356/76014
Institution: Nanyang Technological University
Description
Summary: The concept of a time-interleaved analog-to-digital converter (TI ADC), which comprises several sub-ADCs (channels), is proposed as a means of increasing the speed of analog-to-digital converters (ADCs), albeit with a power and area penalty. During the alternate sampling process, timing mismatch between the sub-ADCs degrades the overall performance of the TI ADC. The timing mismatch must first be detected and subsequently mitigated to improve the TI ADC performance. The detection and correction of the timing mismatch can be done by means of fully-digital approaches (mathematical algorithms) or hardware approaches (dedicated analog circuitry). The fully-digital approaches are preferable as they are impervious to process variations, and can be easily configured and implemented using computer programs, FPGAs, microcontrollers, or DSPs. This Master of Science dissertation pertains to the implementation of a combined timing-mismatch detection and correction algorithm (a fully-digital approach) for a 2 GHz 4-channel 14-bit TI ADC. The detection algorithm is based on the average difference between samples. Simulation results show that when the mismatch is varied linearly (-5% to 5% of the channel sampling period), the error detected by the algorithm also varies linearly (-0.025 to 0.025, normalized); the mismatch and the detected error therefore have an unambiguous one-to-one correspondence. The correction algorithm is based on Lagrange polynomial interpolation, which estimates the signal shape by interpolating between the samples. Computer simulations of the TI ADC with the combined detection and correction algorithms show that the signal-to-noise-and-distortion ratio (SNDR) is ~90 dB on average for input frequencies ≤600 MHz and timing mismatches of -5% to 5%, a 45 dB SNDR improvement over the TI ADC without the algorithms.
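
To illustrate the detection principle, the following is a minimal Python/NumPy sketch, not the dissertation's implementation. The summary describes the detector only as "based on the average difference between samples"; one plausible reading, used here, is that for each channel the mean absolute difference to the preceding sample is compared against that to the following sample. With zero skew the two means coincide, while a late-sampling channel lengthens the preceding gap and shortens the following one. All function and variable names are illustrative.

    import numpy as np

    def detect_timing_skew(x, M=4):
        # x: interleaved TI-ADC output (1-D full-rate stream);
        # M: number of channels. Returns one normalized skew
        # indicator per channel.
        x = np.asarray(x, dtype=float)
        d = np.abs(np.diff(x))                 # d[n] = |x[n+1] - x[n]|
        err = np.empty(M)
        for c in range(M):
            before = d[(c - 1) % M::M].mean()  # gaps ending at a channel-c sample
            after = d[c::M].mean()             # gaps starting at a channel-c sample
            # Zero skew -> before == after; a late channel lengthens
            # the preceding gap and shortens the following one.
            err[c] = (before - after) / (before + after)
        return err

    # Demo: 4-channel, 2 GS/s stream of a 300 MHz sine; channel 2
    # samples 2% of the sampling period late.
    fs, f_in, N, M = 2e9, 300e6, 4096, 4
    n = np.arange(N)
    skew = np.zeros(M)
    skew[2] = 0.02
    x = np.sin(2 * np.pi * f_in * (n + skew[n % M]) / fs)
    print(detect_timing_skew(x, M))            # channel 2's entry dominates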
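
The correction step can likewise be sketched as a Lagrange fractional-delay FIR filter that re-interpolates each skewed channel's samples onto the nominal uniform grid. This sketch assumes the detector's normalized output has already been mapped onto a per-channel skew estimate in units of the full-rate sample period, and it treats neighboring channels as (approximately) on-grid while correcting one channel; the filter order, the convolution-based resampling, and all names are illustrative assumptions, not the dissertation's implementation.

    import numpy as np

    def lagrange_coeffs(d, order=4):
        # FIR taps h[k] = prod_{j != k} (d - j) / (k - j); the filter
        # y[n] = sum_k h[k] * x[n - k] approximates x(n - d), i.e. a
        # fractional delay of d samples (for an integer d the taps
        # reduce to a pure delay, leaving the samples unchanged).
        h = np.ones(order + 1)
        for i in range(order + 1):
            for j in range(order + 1):
                if j != i:
                    h[i] *= (d - j) / (i - j)
        return h

    def correct_skew(x, skew, M=4, order=4):
        # x:    interleaved TI-ADC output (full-rate stream)
        # skew: per-channel timing error in sample periods; a channel
        #       with skew s sampled at (n + s)*Ts, so its value is
        #       interpolated back to the nominal instant n*Ts.
        x = np.asarray(x, dtype=float)
        y = x.copy()
        half = order // 2
        for c in range(M):
            # Total delay = integer half (to center the filter)
            # plus the channel's fractional skew.
            h = lagrange_coeffs(half + skew[c], order)
            z = np.convolve(x, h)              # full convolution
            # z[half + n] ~ x(n - skew[c]); keep only channel-c slots.
            y[c::M] = z[half + np.arange(len(x))][c::M]
        return y

In a complete calibration loop one would expect the detector's normalized error to be scaled onto a skew estimate, the corrected stream to be re-examined by the detector, and the estimates to be refined until they converge; the summary above reports the performance of such a combined detection-and-correction scheme.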