Title: Visual inertial SLAM based autonomous navigation system
Authors: Chua, Eng Soon
Keywords: Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
Issue Date: 2023
Publisher: Nanyang Technological University
Source: Chua, E. S. (2023). Visual inertial SLAM based autonomous navigation system. Final Year Project (FYP), Nanyang Technological University, Singapore.
Project: SCSE22-0775 
Abstract: The development and use of Visual and Visual Inertial SLAM for autonomous navigation have been prevalent over the past decade. Autonomous navigation systems traditionally used SLAM with LiDAR sensors, which provide accurate depth ranging of the environment. Visual SLAM, which uses cameras as sensors, was developed as a lower-cost alternative to the more expensive LiDAR for autonomous navigation. Today, newer Visual SLAM methods that incorporate Inertial Measurement Units (IMUs), referred to as Visual Inertial SLAM, and that support different camera types such as stereo and RGB-D cameras have been proposed and shared with the public. In this project, we aim to implement an autonomous navigation system using COVINS and Multi-Robot Coordination. The navigation stack of Multi-Robot Coordination is integrated with COVINS, and the integrated system is then deployed on an Unmanned Ground Vehicle (UGV) to perform autonomous navigation in an indoor environment.
Schools: School of Computer Science and Engineering 
Organisations: Hardware & Embedded Systems Lab (HESL) 
Fulltext Permission: embargo_restricted_20241118
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: FYP Final Report SCSE22-0775 Chua Eng Soon.pdf
Description: Undergraduate project report
Size: 1.88 MB
Format: Adobe PDF
Availability: Under embargo until Nov 18, 2024



Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.