Title: Machine learning for object identification using Lidar point cloud data
Authors: Chen, Xiaoxin
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Chen, X. (2022). Machine learning for object identification using Lidar point cloud data. Final Year Project (FYP), Nanyang Technological University, Singapore.
Project: B1092-211
Abstract: Due to the growing number of point cloud applications in computer vision and autonomous driving, 3D point cloud learning has attracted increasing research attention. Deep learning, the dominant approach to 2D image problems, is also the most frequently used family of models for 3D point cloud processing. However, deep learning on point clouds is still in its infancy because of the specific characteristics of point clouds, such as permutation invariance. Numerous deep learning methods for point clouds have been proposed to address these difficulties. This study provides a detailed and comprehensive analysis of recent developments in deep learning methods for 3D point cloud object classification in order to motivate future research. It also includes standardized and integrated practical code with validation and visualization, making it convenient for researchers to understand and evaluate the frameworks. Insightful discussion of comparative experimental results on benchmark and real-life LiDAR datasets may further inspire future research directions.
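The permutation invariance mentioned in the abstract can be illustrated with a minimal PointNet-style sketch (an assumed illustration, not the model used in this report): applying a shared per-point transform and then a symmetric aggregation such as max-pooling yields a feature vector that does not depend on the order in which the points are listed.

```python
import numpy as np

def cloud_features(points):
    """Toy permutation-invariant feature extractor (illustrative sketch only):
    a shared per-point linear map + ReLU, followed by a symmetric max-pool
    over points, so the result is independent of point ordering."""
    # Hypothetical fixed per-point "MLP": a seeded random linear map.
    rng = np.random.default_rng(0)
    weights = rng.standard_normal((points.shape[1], 8))
    per_point = np.maximum(points @ weights, 0.0)  # per-point ReLU features
    return per_point.max(axis=0)                   # symmetric aggregation

cloud = np.array([[0.0, 0.0, 1.0],
                  [1.0, 0.5, 0.2],
                  [0.3, 0.9, 0.4]])
shuffled = cloud[[2, 0, 1]]  # same points, different order
# Identical features regardless of point ordering:
assert np.allclose(cloud_features(cloud), cloud_features(shuffled))
```

A per-point sum or average would work equally well as the symmetric aggregation; the key design point is that the pooling operator must be order-independent.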
Schools: School of Electrical and Electronic Engineering 
Organisations: Institute of High Performance Computing
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: FYP_Final_Report_Chen Xiaoxin.pdf (Restricted Access)
Size: 4.18 MB
Format: Adobe PDF

Page view(s): updated on Feb 21, 2024




Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.