Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/164765
Title: A semi-supervised machine learning approach for in-process monitoring of laser powder bed fusion
Authors: Nguyen, Ngoc Vu
Hum, Allen Jun Wee
Tran, Tuan
Keywords: Engineering::Mechanical engineering
Issue Date: 2022
Source: Nguyen, N. V., Hum, A. J. W. & Tran, T. (2022). A semi-supervised machine learning approach for in-process monitoring of laser powder bed fusion. Materials Today: Proceedings, 70, 583-586. https://dx.doi.org/10.1016/j.matpr.2022.09.607
Journal: Materials Today: Proceedings
Abstract: Laser powder bed fusion (L-PBF), despite its tremendous potential in metal additive manufacturing, still faces a significant barrier to wider adoption due to the current lack of quality assurance. Notable efforts aiming at effective quality control of L-PBF products rely on applying machine learning (ML) to monitoring data to either identify possible defects or predict product quality. In this study, we propose a semi-supervised ML approach using layerwise monitoring images. We train the ML model on reference monitoring images to classify the surface appearances of samples printed without defects and with a common type of defect in L-PBF, i.e., overheating. The trained ML model enables determination of overheated regions in L-PBF products during the printing process. We then demonstrate the model's capability by performing prediction on a test sample having overhanging structures.
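Note: the full text is not available in this repository, so the following is only a rough, illustrative sketch of the kind of semi-supervised classification described in the abstract, not the authors' implementation. It assumes scikit-learn's self-training classifier and uses synthetic two-value feature vectors standing in for patches of layerwise monitoring images; all names, feature choices, and parameters here are assumptions.

    # Illustrative only -- not the published method. Assumes scikit-learn is installed.
    # Each feature vector stands in for a patch of a layerwise monitoring image;
    # labels: 0 = normal surface, 1 = overheated, -1 = unlabeled (semi-supervised).
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.semi_supervised import SelfTrainingClassifier

    rng = np.random.default_rng(0)

    # Hypothetical patch features: mean intensity and intensity variance per patch.
    normal = rng.normal(loc=[0.4, 0.05], scale=0.05, size=(200, 2))
    overheated = rng.normal(loc=[0.8, 0.15], scale=0.05, size=(200, 2))
    X = np.vstack([normal, overheated])
    y = np.array([0] * 200 + [1] * 200)

    # Keep only a small fraction of reference labels; mark the rest as unlabeled (-1).
    y_semi = y.copy()
    unlabeled = rng.random(len(y)) > 0.1
    y_semi[unlabeled] = -1

    # Self-training wraps a base classifier and pseudo-labels high-confidence samples.
    model = SelfTrainingClassifier(LogisticRegression(), threshold=0.9)
    model.fit(X, y_semi)

    # Flag overheated regions in patches extracted from a new layer image.
    test_patches = np.array([[0.42, 0.06], [0.83, 0.14]])
    print(model.predict(test_patches))  # expected: [0 1]

In such a setup, only a small set of reference images needs manual labels; the remaining monitoring data can be folded in as unlabeled samples, which is the practical appeal of a semi-supervised approach for in-process monitoring.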
URI: https://hdl.handle.net/10356/164765
ISSN: 2214-7853
DOI: 10.1016/j.matpr.2022.09.607
Schools: School of Mechanical and Aerospace Engineering 
Research Centres: Singapore Centre for 3D Printing 
Rights: © 2022 Elsevier Ltd. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:MAE Journal Articles
SC3DP Journal Articles

