Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/76014
Full metadata record
DC Field: Value (Language)
dc.contributor.author: Liu, Yifei
dc.date.accessioned: 2018-09-18T04:45:16Z
dc.date.available: 2018-09-18T04:45:16Z
dc.date.issued: 2018
dc.identifier.uri: http://hdl.handle.net/10356/76014
dc.description.abstract: The Time-Interleaved analog-to-digital converter (TI ADC), which comprises multiple sub-ADCs (channels), is a means of increasing the speed of analog-to-digital converters (ADCs), albeit at a power and area penalty. During the alternate sampling process, timing mismatch between the sub-ADCs degrades the overall performance of the TI ADC; the mismatch must first be detected and subsequently mitigated to improve the TI ADC performance. Detection and correction of the timing mismatch can be performed by fully-digital approaches (mathematical algorithms) or by hardware approaches (dedicated analog circuitry). Fully-digital approaches are preferable because they are impervious to process variations and can be easily configured and implemented using computer programs, FPGAs, microcontrollers, or DSPs. This Master of Science dissertation pertains to the implementation of a combined timing-mismatch detection and correction algorithm (a fully-digital approach) for a 2 GHz 4-channel 14-bit TI ADC. The detection algorithm is based on the average difference between samples. Simulation results show that when the mismatch is varied linearly (-5% to 5% of the channel sampling period), the error detected by the algorithm also varies linearly (-0.025 to 0.025, normalized); the mismatch and the detected error therefore have an unambiguous one-to-one correspondence. The correction algorithm is based on Lagrange polynomial interpolation, which estimates the signal shape by interpolating the samples. Computer simulations of the combined detection and correction algorithms with the TI ADC show a Signal-to-Noise-and-Distortion Ratio (SNDR) of ~90 dB on average for input frequencies ≤600 MHz and -5% to 5% timing mismatch, a 45 dB SNDR improvement over the TI ADC without the algorithms. (en_US)
dc.format.extent: 63 p. (en_US)
dc.language.iso: en (en_US)
dc.subject: DRNTU::Engineering::Electrical and electronic engineering (en_US)
dc.title: Timing mismatch calibration circuit for high-speed time-interleaved ADCs (en_US)
dc.type: Thesis
dc.contributor.supervisor: Chang, Joseph Sylvester (en_US)
dc.contributor.school: School of Electrical and Electronic Engineering (en_US)
dc.description.degree: Master of Science (Electronics) (en_US)
item.fulltext: With Fulltext
item.grantfulltext: restricted
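The abstract states that correction is based on Lagrange polynomial interpolation, which resamples a skewed channel at its ideal sampling instants. A minimal sketch of that idea is shown below as a fractional-delay FIR filter whose taps are Lagrange basis polynomials; the filter order, the `delta` skew convention (a fraction of the sampling period), and both function names are illustrative assumptions, not details taken from the thesis.

```python
# Hedged sketch: timing-skew correction of one TI ADC sub-channel via a
# Lagrange-interpolation fractional-delay FIR filter. Names and parameters
# are assumptions for illustration, not the thesis's actual implementation.

def lagrange_fd_coeffs(order, delta):
    """FIR taps h[0..order] that delay a signal by d = order/2 + delta
    samples, where delta is the detected timing skew as a fraction of
    the channel sampling period. Each tap is a Lagrange basis polynomial
    evaluated at the fractional point d."""
    d = order / 2.0 + delta
    h = []
    for k in range(order + 1):
        c = 1.0
        for n in range(order + 1):
            if n != k:
                c *= (d - n) / (k - n)
        h.append(c)
    return h

def correct_channel(samples, delta, order=4):
    """Resample one sub-ADC channel at the ideal instants by convolving
    with the fractional-delay filter (the first `order` edge samples,
    where the filter lacks full support, are dropped)."""
    h = lagrange_fd_coeffs(order, delta)
    return [sum(h[k] * samples[n - k] for k in range(order + 1))
            for n in range(order, len(samples))]
```

With zero detected skew the even-order filter reduces to a pure integer delay (a single unit tap), and for a slowly varying input the output closely tracks the signal value at the corrected instant, which is what makes this family of filters attractive for fully-digital mismatch correction.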
Appears in Collections:EEE Theses
Files in This Item:
File: LiuYifei_2018.pdf (Restricted Access)
Description: Main article
Size: 2.26 MB
Format: Adobe PDF

Page view(s): 234 (updated on Dec 2, 2023)
Download(s): 9 (updated on Dec 2, 2023)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.