Title: Autonomous navigation system
Authors: Lai, Ming Hui
Keywords: Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Lai, M. H. (2022). Autonomous navigation system. Final Year Project (FYP), Nanyang Technological University, Singapore.
Project: SCSE21-0707
Abstract: Visual SLAM has gained traction in the field of autonomous navigation systems in recent years. In the past, LiDAR SLAM reigned as the most popular SLAM method for autonomous navigation due to its high accuracy and fast performance. However, the high cost of LiDAR sensors has created demand for lower-cost but effective alternatives. Visual SLAM relies mainly on cameras, which are cheaper than LiDAR, and has therefore gained popularity as a cost-effective alternative to LiDAR SLAM. The community has since proposed many visual SLAM methods demonstrated to be capable of accurate and robust SLAM, such as ORB-SLAM 3. In this project, we aim to create an autonomous navigation system using ORB-SLAM 3. We leveraged pre-existing robotic solutions provided in the Robot Operating System (ROS) framework and combined them with ORB-SLAM 3 to develop a navigation stack capable of autonomously navigating a robot in an indoor environment.
Schools: School of Computer Science and Engineering 
Research Centres: Hardware & Embedded Systems Lab (HESL) 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File Description: Restricted Access
Size: 1.46 MB
Format: Adobe PDF



Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.