Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/146221
Title: Temporal phase unwrapping using deep learning
Authors: Yin, Wei; Chen, Qian; Feng, Shijie; Tao, Tianyang; Huang, Lei; Trusiak, Maciej; Asundi, Anand Krishna; Zuo, Chao
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2019
Source: Yin, W., Chen, Q., Feng, S., Tao, T., Huang, L., Trusiak, M., . . . Zuo, C. (2019). Temporal phase unwrapping using deep learning. Scientific Reports, 9(1), 20175. doi:10.1038/s41598-019-56222-3
Journal: Scientific Reports
Abstract: The multi-frequency temporal phase unwrapping (MF-TPU) method, a classical phase unwrapping algorithm for fringe projection techniques, can eliminate phase ambiguities even when measuring spatially isolated scenes or objects with discontinuous surfaces. In the simplest and most efficient case of MF-TPU, two groups of phase-shifting fringe patterns with different frequencies are used: the high-frequency patterns are applied for 3D reconstruction of the tested object, and the unit-frequency patterns assist in unwrapping the high-frequency wrapped phase. The final measurement precision or sensitivity is determined by the number of fringes used within the high-frequency pattern, provided that its absolute phase can be recovered without any fringe order errors. However, owing to non-negligible noise and other error sources in actual measurement, the frequency of the high-frequency fringes is generally restricted to about 16, resulting in limited measurement accuracy. On the other hand, using additional intermediate sets of fringe patterns can unwrap a phase with higher frequency, but at the expense of a prolonged pattern sequence. With recent developments and advancements of machine learning for computer vision and computational imaging, this work demonstrates that deep learning techniques can automatically realize TPU through supervised learning, termed deep learning-based temporal phase unwrapping (DL-TPU), which can substantially improve the unwrapping reliability compared with MF-TPU even under different types of error sources, e.g., intensity noise, low fringe modulation, projector nonlinearity, and motion artifacts. Furthermore, to the best of our knowledge, we demonstrate experimentally that the high-frequency phase with 64 periods can be directly and reliably unwrapped from one unit-frequency phase using DL-TPU. These results highlight that challenging issues in optical metrology can potentially be overcome through machine learning, opening new avenues to design powerful and extremely accurate high-speed 3D imaging systems that are ubiquitous in today's science, industry, and multimedia.
URI: https://hdl.handle.net/10356/146221
ISSN: 2045-2322
DOI: 10.1038/s41598-019-56222-3
Schools: School of Mechanical and Aerospace Engineering
Research Centres: Centre for Optical and Laser Engineering
Rights: © 2019 The Author(s). This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections: MAE Journal Articles
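The abstract above describes the classical baseline that DL-TPU improves upon: the unit-frequency absolute phase is used to compute the fringe order of the high-frequency wrapped phase. The following is a minimal sketch of that two-frequency MF-TPU scheme on a simulated 1-D fringe signal; the parameters (16 fringes, 4-step phase shifting, Gaussian intensity noise) and the helper names `wrapped_phase`, `mf_tpu`, and `capture` are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of classical two-frequency temporal phase unwrapping (MF-TPU)
# on a simulated 1-D fringe signal. NumPy only; all parameters are illustrative.
import numpy as np

def wrapped_phase(frames):
    """N-step phase shifting: I_n = A + B*cos(phi - 2*pi*n/N)  ->  wrapped phi."""
    n_steps = len(frames)
    deltas = 2 * np.pi * np.arange(n_steps) / n_steps
    num = sum(I * np.sin(d) for I, d in zip(frames, deltas))
    den = sum(I * np.cos(d) for I, d in zip(frames, deltas))
    return np.arctan2(num, den)                        # wrapped to (-pi, pi]

def mf_tpu(phi_high, phi_unit_abs, f_high):
    """Fringe order k = round((f_high*Phi_1 - phi_high)/(2*pi)); Phi = phi + 2*pi*k."""
    k = np.round((f_high * phi_unit_abs - phi_high) / (2 * np.pi))
    return phi_high + 2 * np.pi * k

# ---- simulate 4-step fringe patterns at unit and high frequency -------------
rng = np.random.default_rng(0)
x = np.linspace(0.02, 0.98, 1000)                      # stay off the exact field edges
f_high = 16                                            # number of high-frequency fringes

def capture(freq):
    """Four phase-shifted cosine fringes with a little additive intensity noise."""
    return [0.5 + 0.45 * np.cos(2 * np.pi * freq * x - 2 * np.pi * n / 4)
            + rng.normal(0.0, 0.01, x.shape) for n in range(4)]

phi_unit = np.mod(wrapped_phase(capture(1)), 2 * np.pi)  # unit frequency: absolute in [0, 2*pi)
phi_high = wrapped_phase(capture(f_high))                # high frequency: wrapped, ambiguous

phi_abs = mf_tpu(phi_high, phi_unit, f_high)
print("max |error| vs. ground truth 2*pi*f_high*x:",
      np.max(np.abs(phi_abs - 2 * np.pi * f_high * x)))
```

In DL-TPU, as described in the abstract, a trained network replaces the fringe-order computation performed here by `mf_tpu`, predicting the fringe order from the two wrapped phases; this is what reportedly allows a 64-period phase to be unwrapped reliably from a single unit-frequency reference despite noise, low modulation, projector nonlinearity, and motion artifacts.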
Files in This Item:
File | Description | Size | Format
---|---|---|---
s41598-019-56222-3.pdf | | 2.43 MB | Adobe PDF