Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/148431
Title: Design of digital-intensive time-mode data converters based on delay correction utilising a time-to-digital converter
Authors: Kong, Junjie
Keywords: Engineering::Electrical and electronic engineering::Integrated circuits
Issue Date: 2021
Publisher: Nanyang Technological University
Source: Kong, J. (2021). Design of digital-intensive time-mode data converters based on delay correction utilising a time-to-digital converter. Doctoral thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/148431
Abstract: Traditional methods of analogue-to-digital conversion quantise data in the voltage domain. Device scaling makes it harder to resolve signals in the voltage domain at reasonable power consumption: supply voltages scale down while noise levels stagnate, reducing the Signal-to-Noise Ratio. Many designs rely on bulky calibration circuits (front-end and back-end) to improve analogue performance, often in the form of digital calibration. On the other hand, the switching time of a MOSFET has improved with advanced nodes, giving greater accuracy in the time-domain resolution of digital signal edge transitions. This opens up the possibility of implementing ADCs in time mode to capitalise on this advantage. Unconventional methods of quantising in the time domain are thus gaining popularity in the form of time-mode ADCs. In such approaches, analogue information is represented by a time delay through a voltage-to-time converter (VTC) and processed by a time-to-digital converter (TDC), which is largely digital and can exploit the digital advantages of newer technologies. Because the switching speed of transistors increases at smaller technology nodes, the resolution of TDCs can be expected to head into the femtosecond range. Many well-known TDC architectures aim at this target: amplifying time, interpolating between time pulses, or subtracting the delay between two cells (the Vernier principle), to name a few reported techniques. In addition, manipulating the bulk potential of a transistor varies its threshold voltage depending on the polarity of the source-bulk potential; this, coupled with the Vernier principle, is the method used in the proposed TDCs. However, high-resolution TDCs, especially Vernier TDCs, suffer from latency issues and therefore require techniques to shorten the quantisation time. 
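The Vernier principle mentioned above can be sketched as a first-order behavioural model: a start edge propagates through a slow delay line and a stop edge through a fast one, and the stage at which the stop edge catches up encodes the interval with a resolution equal to the per-stage delay difference. The sketch below is illustrative only (the delay values and function name are assumptions, not the thesis circuit); it also makes the latency issue visible, since the output code equals the number of stages traversed.

```python
def vernier_quantise(t_in, tau_slow, tau_fast, max_stages=4096):
    """Return the digital code for an input interval t_in (seconds).

    tau_slow, tau_fast: per-stage delays of the two lines; the
    effective resolution (LSB) is their difference.
    """
    lsb = tau_slow - tau_fast   # Vernier resolution: difference of delays
    gap = t_in                  # initial lead of the start edge
    for code in range(max_stages):
        if gap <= 0:            # stop edge has caught the start edge
            return code
        gap -= lsb              # each stage closes the gap by one LSB
    raise ValueError("input interval exceeds TDC range")

# Example with an assumed 1.08 ps resolution (10.00 ps vs 8.92 ps stages),
# matching the resolution reported for the Coarse-Fine TDC:
code = vernier_quantise(25e-12, 10.00e-12, 8.92e-12)
```

Note that quantising a 25 ps interval takes about 24 stage traversals here, which is why high-resolution Vernier TDCs need coarse-fine or similar tricks to keep conversion latency short.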
Of the two TDCs, the Coarse-Fine TDC achieves a high resolution of 1.08 ps and a sampling frequency of 278 MHz while consuming a relatively low power of 1.8 mW. Its non-linearity parameters are about 1 LSB, comparable to other TDC architectures, and the area could be kept to just 0.01 mm2 thanks to the compact design made possible by the digital-intensive nature of the TDC. Although time-domain quantisation leverages the faster switching speed of transistors, voltage-to-time conversion in its most primitive form, capacitive charging and discharging, is very time-consuming and becomes a bottleneck in the quantisation speed of a time-mode ADC. Moreover, the non-linear and inverted voltage-delay relationship limits most time-mode ADCs to a very small input voltage range. Unlike analogue and digital data, time data cannot be stored: its one-use-only characteristic requires it to be regenerated for repeated use, which lengthens the quantisation cycle. This research work first explores a new method of linearising the voltage-to-time relationship by using linear converters to correct the generated time output. The proposed concept of implementing a time quantiser in a feedback loop linearises output delays over an input range of up to 700 mV, compared with a linear range originally limited to just 500 mV, or 100 mV in one of the reported works. Moreover, the inverse voltage-to-time relationship could be avoided with a special delay-cell structure first proposed in the TDC-based VTC. To further increase this linear input range, a novel idea of mapping data from one domain to another was proposed as a new methodology for voltage-to-time conversion. In this new concept, the input voltage is instead mapped onto a representative output delay, and with the TDC as a feedback element for delay correction, the same input-to-full-scale ratio is maintained in both the voltage and time domains for linear V-T conversion. 
This output delay is then quantised by the same TDC for a digital representation of the input voltage, i.e. two output modes in a single device. This data converter therefore leverages the digital-intensive nature of the TDC and uses very few analogue blocks to convert the input voltage into output data in two domains. With the proposed data-mapping technique, it was possible to achieve an input voltage range that is 100% of the supply voltage. However, the presence of a feedback loop increases the number of iterations and therefore reduces the sampling frequency of the proposed VTC. Nevertheless, a small area could be achieved thanks to the rather digital-intensive design, which allows various techniques used in digital designs, such as pipelining and time-interleaving, to be applied to increase the sampling frequency.
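The ratio-preserving mapping described above can be modelled at a very high level: the input voltage is mapped to a target delay such that d_out / T_FS = v_in / V_FS, and a TDC in a feedback loop trims the generated delay toward that target in quantised steps, yielding both a corrected delay and a digital code from the same loop. Everything in this sketch (full-scale values, iteration count, function names) is an assumption for illustration, not the thesis implementation.

```python
V_FS = 1.0       # assumed full-scale input voltage (V)
T_FS = 100e-9    # assumed full-scale output delay (s)
LSB = 1.08e-12   # TDC resolution, taken from the reported Coarse-Fine TDC

def convert(v_in, n_iter=8):
    """Return (corrected delay, digital code) for input voltage v_in.

    The same input-to-full-scale ratio is preserved across domains,
    and the TDC feedback applies quantised delay corrections.
    """
    target = (v_in / V_FS) * T_FS   # ratio-preserving voltage-to-delay map
    delay = 0.0                      # uncorrected generated delay
    code = 0                         # accumulated digital output
    for _ in range(n_iter):          # feedback iterations (assumed count)
        step = round((target - delay) / LSB)  # TDC reading of the residual
        code += step
        delay += step * LSB          # delay line trimmed per TDC reading
    return delay, code

# Mid-scale input: the corrected delay settles within one LSB of 50 ns,
# and the code is the same TDC's digital representation of that delay.
d, c = convert(0.5)
```

The loop here converges quickly only because the model is ideal; in a real circuit each iteration costs a conversion cycle, which is the sampling-frequency penalty of the feedback approach noted above.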
URI: https://hdl.handle.net/10356/148431
DOI: 10.32657/10356/148431
Rights: This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0).
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections:EEE Theses

Files in This Item:
File: Thesis_Kong_Junjie.pdf | Description: Thesis | Size: 8.93 MB | Format: Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.