Title: Vision-based multi-sensor fusion for robust unmanned aerial vehicles autonomous navigation
Authors: Yuan, Shenghai
Keywords: Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision; Engineering::Electrical and electronic engineering::Control and instrumentation::Robotics
Issue Date: 2019
Source: Yuan, S. (2019). Vision-based multi-sensor fusion for robust unmanned aerial vehicles autonomous navigation. Doctoral thesis, Nanyang Technological University, Singapore.
Abstract: The primary objective of this dissertation is to propose a robust vision-based multi-sensor fusion navigation system that enables lightweight unmanned aerial vehicles to navigate and perform complex tasks in challenging environments. In recent years, the demand for autonomous drones has grown exponentially, with applications ranging from routine goods delivery to inspection work in areas that humans cannot reach. The proposed navigation system runs in real time, at low cost, and with a high level of robustness. The proposed methods use the camera as the primary sensor for both odometry and mapping; multi-sensor fusion is then performed to enhance the odometry reliability of the UAV system.
URI: https://hdl.handle.net/10356/85185
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections: EEE Theses
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.