Title: Automatic body measurement by neural networks
Authors: Zhao, Jingyi
Keywords: DRNTU::Engineering::Computer science and engineering::Computing methodologies::Simulation and modeling
Issue Date: 2019
Abstract: Size prediction and garment customization are the two main goals of body measurement for garment design. Traditional body measurement, which involves manual measuring and trying on clothes in person, is time-consuming and costly. With the help of a 3D body scanner and neural networks, body measurement can be made fast and precise, reducing this cost. This project introduces neural network models that predict body sizes and the measurements used to customize clothes from various kinds of body data. Three kinds of input data are used: raw 3D point clouds of human bodies, key body locations, and estimated body measurements. The raw point clouds are collected by scanning the participants' bodies, while the key body locations and estimated measurements are computed automatically by existing software. Manual measurements are then taken on the participants to obtain size labels and the measurements needed for garment customization, which serve as the ground-truth output values. Different network structures are used for the different kinds of input data. The results show that neural networks can achieve decent performance in predicting measurements for making clothes, and that the choice of input data affects prediction accuracy. The models could be further improved with a larger amount of data, in order to make them production-ready.
URI: http://hdl.handle.net/10356/77220
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
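The report itself is not available here, but the kind of model the abstract describes for the key-body-location input can be sketched as a small regression network mapping flattened 3D landmark coordinates to a vector of garment measurements. All sizes, layer widths, and names below are illustrative assumptions, not taken from the report:

```python
import numpy as np

# Hypothetical sizes: 20 key body locations (x, y, z) -> 8 garment measurements.
N_LOCATIONS, N_MEASUREMENTS = 20, 8
rng = np.random.default_rng(0)

def init_mlp(n_in, n_hidden, n_out, rng):
    """Random parameters for a one-hidden-layer regression MLP."""
    return {
        "W1": rng.normal(0.0, 0.1, (n_in, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(0.0, 0.1, (n_hidden, n_out)),
        "b2": np.zeros(n_out),
    }

def predict(params, x):
    """Forward pass: ReLU hidden layer, linear output head (regression)."""
    h = np.maximum(0.0, x @ params["W1"] + params["b1"])
    return h @ params["W2"] + params["b2"]

params = init_mlp(N_LOCATIONS * 3, 64, N_MEASUREMENTS, rng)
key_locations = rng.normal(size=(N_LOCATIONS, 3)).ravel()  # flattened landmarks
measurements = predict(params, key_locations)
print(measurements.shape)  # one predicted value per target measurement
```

A point-cloud input would instead need an order-invariant architecture (e.g. shared per-point layers with a pooling step), which is presumably why the abstract notes that different network structures were used for the different input types.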
Appears in Collections: SCSE Student Reports (FYP/IA/PA/PI)
Files in This Item: Automatic Body Measurement by Neural Networks (Amendment Report).pdf (833.99 kB, Adobe PDF)
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.