Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/184346
Title: Object-oriented autonomous navigation system for mobile robots
Authors: Cai, Haonan
Keywords: Engineering
Issue Date: 2025
Publisher: Nanyang Technological University
Source: Cai, H. (2025). Object-oriented autonomous navigation system for mobile robots. Master's thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/184346
Abstract: Autonomous navigation is a key capability for mobile robots, enabling exploration and target-based movement. This dissertation integrates SLAM, semantic segmentation, and the Nav2 framework in ROS 2 to enhance autonomous exploration and navigation. The system employs a frontier-based exploration strategy for mapping while using SegFormer, a transformer-based model, for human detection. A depth camera estimates the 3D position of detected objects, enabling navigation toward them. The implementation is tested in a Gazebo simulation environment to evaluate perception, mapping, and navigation performance. Nav2's SMAC 2D planner and MPPI controller parameters are tuned to improve trajectory quality and obstacle avoidance. Experimental results indicate a balance between exploration and target-driven navigation under simulation conditions. Future work includes real-world deployment, better adaptability to dynamic obstacles, and refinement of the semantic segmentation model. Multi-robot collaboration and automated parameter optimization may further improve efficiency and scalability.
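The abstract notes that a depth camera estimates the 3D position of detected objects to produce navigation goals. A minimal sketch of that back-projection step under a standard pinhole camera model; the intrinsics (fx, fy, cx, cy) and pixel values below are illustrative assumptions, not the calibration used in the thesis:

```python
def pixel_to_camera_frame(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with a measured depth (metres) into a
    3D point in the camera frame, using the pinhole model:
    X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy,  Z = depth."""
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return (x, y, z)

# Example: centroid of a detected person at pixel (400, 260), 2.5 m away,
# with hypothetical intrinsics for a 640x480 depth image.
fx = fy = 525.0          # focal lengths in pixels (assumed)
cx, cy = 320.0, 240.0    # principal point (assumed)
point = pixel_to_camera_frame(400, 260, 2.5, fx, fy, cx, cy)
```

In a full system, the resulting camera-frame point would then be transformed into the map frame (e.g. via ROS 2 tf2) before being sent to Nav2 as a goal pose.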
URI: https://hdl.handle.net/10356/184346
Schools: School of Mechanical and Aerospace Engineering 
Research Centres: Robotics Research Centre 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:MAE Theses

Files in This Item:
File: Dissertation.pdf (Restricted Access), 9.34 MB, Adobe PDF

Page view(s): 19 (updated on May 7, 2025)

Download(s): 1 (updated on May 7, 2025)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.