Title: Real-time visual localization system on an embedded platform
Authors: Do, Anh Tu
Keywords: Engineering::Computer science and engineering::Computer systems organization::Special-purpose and application-based systems
Issue Date: 2021
Publisher: Nanyang Technological University
Source: Do, A. T. (2021). Real-time visual localization system on an embedded platform. Final Year Project (FYP), Nanyang Technological University, Singapore.
Project: SCSE20-0144
Abstract: Simultaneous localization and mapping (SLAM) of an unknown environment is an important task for mobile autonomous robots and navigation systems. Indoor SLAM, however, cannot rely on GPS signals, and existing solutions often assume a static scene, a condition that rarely holds in real-life scenarios. In addition, the maps generated by typical SLAM systems carry no semantic meaning of the environment and are of limited use for higher-level missions such as path planning and object interaction. Computation cost is another concern, as SLAM systems are often implemented on embedded platforms. In this project, two real-time visual SLAM systems are designed for the Jetson Xavier NX embedded platform that are capable of working in dynamic environments and generating 3D semantic global maps for context-aware tasks. Both employ a deep neural network to obtain the semantic information used to build the global maps. A distributed computing setup is also considered in order to achieve real-time 3D semantic map generation. Quantitative and qualitative results of the proposed semantic SLAM systems are presented for comparison and evaluation.
Fulltext Permission: embargo_restricted_20220731
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: FYP Report
Size: 6.66 MB
Format: Adobe PDF
Availability: Under embargo until Jul 31, 2022


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.