Full metadata record
DC Field | Value | Language
dc.contributor.author | Zia, Saeedeh | en_US
dc.description.abstract | Autonomous warehousing operations are important target elements of Industry 4.0. A major component of autonomous warehousing is the automated cargo handling system, which enables high-velocity distribution of goods and thereby contributes substantially to cost reduction in warehousing procedures. Robotic cargo handling is a highly flexible way to realize an automated cargo handling system. One of its major bottlenecks is the detection of elements in the warehouse environment using visual perception sensors. This thesis reports research on the detection of pallets and cargo in the warehouse environment using deep learning techniques, and explains the entire process from data generation to network training in detail. The work is part of ongoing research on robotic warehousing that our research group started about three years ago and that is expected to continue for several years after the submission of this thesis. The motivation of this research is to study the possibility of enhancing a 3D pallet detection and localization system at an outdoor cargo and pallet handling site. When my research task was planned, classical methods were used for pallet detection. The current research aims to open a new research track focused on deep learning object detection techniques, which have the potential to improve pallet detection performance using 3D LiDAR as the primary long-range visual perception sensor, one that is not sensitive to outdoor lighting conditions.
The main objectives of this thesis are to a) design a simulated environment similar to an outdoor cargo handling site; b) design a simulated robot and sensory system for object detection in Gazebo; c) write a program that communicates with the simulated robot and world to receive sensory data for further processing in the Robot Operating System (ROS); d) write a ground-truth generation program to assist in building a database of observations with precise 3D labelling; e) generate a KITTI-structured database of many observations of pallets and cargo in different placements, together with their respective 3D bounding-box labels; f) train several deep learning architectures on the generated database to analyse their learning performance; and g) suggest guidelines for improving the detection results in future work. All of these objectives were addressed; the results are reported and the observations are discussed. Moreover, the database of pallet point-cloud images is now available for download for any future work in this direction. I believe that the outcome of this research plays a significant role in the formation of a modern object detection technique that could replace the currently used classical method, or complement it to enhance the pallet detection system. | en_US
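Objectives (d) and (e) centre on emitting 3D bounding-box annotations in the KITTI object-label layout. As a rough illustration only (the function name and the pallet dimensions below are hypothetical, not taken from the thesis tooling), one KITTI label line can be formatted like this:

```python
# Sketch of writing one KITTI-format object-label line for a pallet.
# Standard KITTI field order: type, truncated, occluded, alpha,
# 2D bbox (left, top, right, bottom), dimensions h w l,
# location x y z, rotation_y.
def kitti_label_line(obj_type, h, w, l, x, y, z, ry,
                     truncated=0.0, occluded=0, alpha=0.0,
                     bbox=(0.0, 0.0, 0.0, 0.0)):
    fields = [obj_type,
              f"{truncated:.2f}", str(occluded), f"{alpha:.2f}",
              *(f"{v:.2f}" for v in bbox),
              f"{h:.2f}", f"{w:.2f}", f"{l:.2f}",
              f"{x:.2f}", f"{y:.2f}", f"{z:.2f}",
              f"{ry:.2f}"]
    return " ".join(fields)

# Hypothetical annotation: a EUR-pallet-sized box 5 m ahead of the sensor.
line = kitti_label_line("Pallet", h=0.14, w=0.80, l=1.20,
                        x=0.0, y=0.0, z=5.0, ry=0.0)
print(line)
```

One such line per object, per frame, yields the KITTI-structured label files that pair with the point-cloud observations described above.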
dc.publisher | Nanyang Technological University | en_US
dc.subject | Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence | en_US
dc.subject | Engineering::Computer science and engineering::Information systems | en_US
dc.title | Pallet detection and localization in warehouse environment | en_US
dc.type | Thesis-Master by Coursework | en_US
dc.contributor.supervisor | L. G. Pee | en_US
dc.contributor.school | Wee Kim Wee School of Communication and Information | en_US
dc.description.degree | Master of Science (Information Systems) | en_US
item.fulltext | With Fulltext | -
Appears in Collections:WKWSCI Theses
Files in This Item:
File | Description | Size | Format
Pallet Detection and Localization in Warehouse Environment -Thesis dissertation statement.pdf (Restricted Access) | Thesis and statement | 7.03 MB | Adobe PDF

Page view(s): 20

Updated on Jul 23, 2024



Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.