Title: Object recognition and tracking for quadrotor in a dynamic environment
Authors: Lim, Benjamin Zhen Wei
Keywords: Engineering::Mechanical engineering::Control engineering
Issue Date: 2020
Publisher: Nanyang Technological University
Project: C091
Abstract: Drones have seen a significant increase in usage over the years, with companies developing cutting-edge vision algorithms and deploying them on their drones, providing a high level of autonomy. However, the majority of these algorithms struggle to perform in a dynamic environment because of the limited computational resources a drone possesses. Modern research has largely focused on improving either tracking algorithms or detection algorithms, but few have tried to merge the two. Since the flaws of one algorithm can be covered by the other, it is highly beneficial to combine tracking and detection, optimizing the overall computational footprint. It was found that tracking uses substantially fewer computational resources than detection. As such, by allowing the tracker and detector to work in succession, the majority of the tracking operation involves only the tracker. Different OpenCV trackers were evaluated, and the Kernelized Correlation Filter (KCF) tracker was ultimately chosen for its strong ability to report tracking losses. Region-Based Detectors (RBDs) were also compared against Single Shot Detectors (SSDs); SSDs were found to perform better on lightweight devices, and YOLOv3 was chosen as the detector. The hand-over process performed well in flight tests, where the drone was able to maintain its position above the target object in a dynamic environment with occasional overshoots. Since the entire experiment ran on the Robot Operating System (ROS), the tracking and detection algorithms are modular and can easily be swapped out for any algorithms deemed suitable. This leaves room for future development, bringing drones a step closer to full autonomy with real-time tracking operations.
URI: https://hdl.handle.net/10356/141115
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
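The tracker-detector hand-over described in the abstract can be sketched as a simple control loop: a cheap tracker runs every frame, and the expensive detector is invoked only on start-up and whenever the tracker reports a loss. The sketch below is an illustration only, with hypothetical StubTracker/StubDetector stand-ins (frames are modeled as sets of visible labels) rather than the actual KCF and YOLOv3 components used in the report.

```python
# Minimal sketch of a tracker/detector hand-over loop (assumption: frames
# are sets of visible object labels; StubTracker/StubDetector are
# hypothetical stand-ins for KCF and YOLOv3).

class StubTracker:
    """Cheap per-frame tracker; reports a loss when the target vanishes."""
    def __init__(self, bbox):
        self.bbox = bbox

    def update(self, frame):
        if self.bbox in frame:      # target still visible in this frame
            return True, self.bbox
        return False, None          # tracking loss reported


class StubDetector:
    """Expensive detector; scans the whole frame for the target class."""
    def __init__(self, target):
        self.target = target

    def detect(self, frame):
        return self.target if self.target in frame else None


def handover_loop(frames, target):
    """Run the detector only on start-up and after a tracking loss."""
    detector = StubDetector(target)
    tracker = None
    detector_calls = 0
    results = []
    for frame in frames:
        if tracker is not None:
            ok, bbox = tracker.update(frame)
            if ok:
                results.append(bbox)
                continue
            tracker = None          # loss -> hand back to the detector
        detector_calls += 1
        bbox = detector.detect(frame)
        if bbox is not None:
            tracker = StubTracker(bbox)  # re-initialise the cheap tracker
        results.append(bbox)
    return results, detector_calls


# The target disappears in the third frame, forcing a re-detection;
# the remaining frames are handled by the tracker alone.
frames = [{"car"}, {"car"}, set(), {"car"}, {"car"}]
results, calls = handover_loop(frames, "car")
print(results)  # ['car', 'car', None, 'car', 'car']
print(calls)    # 3 detector invocations over 5 frames
```

Because detection dominates the computational cost, the savings grow with how long each tracked segment lasts; in a real system the stubs would be replaced by OpenCV's KCF tracker and a YOLOv3 inference call.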
Appears in Collections: MAE Student Reports (FYP/IA/PA/PI)
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.