Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/158861
Title: System integration of a vision-based robot system for the food industry
Authors: CHIA, JING CHENG
Keywords: Engineering::Mechanical engineering::Robots
Issue Date: 2022
Publisher: Nanyang Technological University
Source: CHIA, J. C. (2022). System integration of a vision-based robot system for the food industry. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/158861
Project: A029
Abstract: As technology advances, so too does the field of robotics and automation. Many robots can be found assembling vehicle parts in automotive factories; these consist mainly of mechanical arms programmed to weld and fasten parts of cars. Nowadays, the definition of robotics has evolved and expanded to include the development, innovation, and use of robots for surveillance in harsh environments, robots that assist in many aspects of healthcare, and even autonomous vehicles being deployed in many places in Singapore for a future intelligent traffic system. This is especially true with the development of Artificial Intelligence in the robotics industry, which makes a high level of robot autonomy possible in complicated environments. Deep learning approaches are widely utilised in the robotics field, for example in object detection, robot navigation, natural language processing, and point cloud registration. The purpose of this Final Year Project is to integrate a point cloud registration method into a vision-based food assembly robot. The main objective is to match two point clouds collected from two depth cameras into a single point cloud with higher-quality depth information, a wider perspective, and fewer blind spots and missing areas. Recent deep point cloud matching methods mostly focus on standard point cloud data with a high overlapping ratio and are rarely deployed in practical applications. Therefore, this project focuses on comparing different point cloud registration approaches on both standard data and real-world data. To compare the quality of data collected at different camera tilt angles, the author designed a tilt module for the depth cameras.
URI: https://hdl.handle.net/10356/158861
Schools: School of Mechanical and Aerospace Engineering
Research Centres: Robotics Research Centre
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:MAE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: FYP_Report_updated_21042022.pdf (Restricted Access), 2.07 MB, Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.