Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/175172
Title: Vortex-SegFormer: a transformer-based vortex detection method
Authors: Lim, Nicky
Keywords: Computer and Information Science
Issue Date: 2024
Publisher: Nanyang Technological University
Source: Lim, N. (2024). Vortex-SegFormer: a transformer-based vortex detection method. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/175172
Project: SCSE23-0405 
Abstract: The identification of vortices plays a critical role in domains such as fluid dynamics, weather forecasting, and engineering systems. Existing vortex detection methods are typically categorized as global, local, or machine learning-based. Global methods are accurate but computationally expensive, while local methods produce results quickly at the expense of reliability. Machine learning-based methods aim to combine speed and accuracy, but they typically operate on sampled input patches, which discards global information. To mitigate these drawbacks, we propose Vortex-SegFormer, a Transformer-based model that reliably detects vortices in 2D flows without patch sampling. It also generalizes well to unseen data, even at resolutions that differ from the training data. The proposed method is compared against existing vortex detection methods on simulated flow fields; the results show that Vortex-SegFormer outperforms existing methods and captures more vortices. (An illustrative sketch of such a segmentation pipeline follows the item metadata below.)
URI: https://hdl.handle.net/10356/175172
Schools: School of Computer Science and Engineering 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections: SCSE Student Reports (FYP/IA/PA/PI)
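
Illustrative sketch: the full text is restricted, so the exact Vortex-SegFormer architecture cannot be reproduced here. The code below is only a minimal sketch of the kind of pipeline the abstract describes: a SegFormer-style Transformer segmenter, assumed here to be implemented in PyTorch, that consumes a full 2D velocity field (u, v) as a two-channel image rather than sampled patches and predicts a per-pixel vortex mask. The class name VortexSegFormerSketch, all hyperparameters, and the decoder design are illustrative assumptions, not the author's implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class VortexSegFormerSketch(nn.Module):
    """Illustrative SegFormer-style segmenter for 2D flow fields.
    An assumption-laden sketch, not the report's actual model."""

    def __init__(self, in_channels=2, embed_dim=64, depth=4, num_heads=4,
                 patch_stride=4, num_classes=2):
        super().__init__()
        # Overlapping patch embedding: a strided conv turns the whole flow
        # field into a token grid, so no patch sampling is needed and any
        # resolution divisible by the stride is accepted.
        self.patch_embed = nn.Conv2d(in_channels, embed_dim, kernel_size=7,
                                     stride=patch_stride, padding=3)
        layer = nn.TransformerEncoderLayer(d_model=embed_dim, nhead=num_heads,
                                           dim_feedforward=4 * embed_dim,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)
        # Lightweight decoder: a per-token linear classifier whose logits
        # are upsampled back to the input resolution.
        self.classifier = nn.Linear(embed_dim, num_classes)

    def forward(self, x):                        # x: (B, 2, H, W) = (u, v)
        B, _, H, W = x.shape
        tokens = self.patch_embed(x)             # (B, C, H/s, W/s)
        _, C, h, w = tokens.shape
        tokens = tokens.flatten(2).transpose(1, 2)   # (B, h*w, C)
        tokens = self.encoder(tokens)            # global self-attention
        logits = self.classifier(tokens)         # (B, h*w, num_classes)
        logits = logits.transpose(1, 2).reshape(B, -1, h, w)
        # Dense per-pixel vortex logits at the original resolution.
        return F.interpolate(logits, size=(H, W), mode="bilinear",
                             align_corners=False)

if __name__ == "__main__":
    model = VortexSegFormerSketch()
    flow = torch.randn(1, 2, 128, 128)           # simulated velocity field
    mask = model(flow).argmax(dim=1)             # 0 = background, 1 = vortex
    print(mask.shape)                            # torch.Size([1, 128, 128])

Because the sketch uses no fixed positional encodings and is otherwise resolution-agnostic, the same weights can be applied to flow fields of other sizes, which is consistent with the generalization claim in the abstract.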

Files in This Item:
File: Vortex-SegFormer (Nicky Lim).pdf (Restricted Access)
Size: 916.57 kB
Format: Adobe PDF

Page view(s): 106 (updated on May 7, 2025)

Download(s): 5 (updated on May 7, 2025)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.