Title: Gesture control for indoor navigation and maze exploration for drones
Authors: Sazali Mohammed Ali
Keywords: DRNTU::Engineering::Computer science and engineering
Issue Date: 2017
Abstract: As drones become more popular and commercialized around the globe, new ways of controlling them are being explored. The purpose of this study is to investigate one such method of control that feels second nature to the user and provides a natural flight experience. In this study, a motion controller is used to control the motion of a drone via human hand gestures, with the LEAP Motion sensor serving as the motion controller and the Parrot AR Drone 2.0 as the aircraft. The Parrot AR Drone is a commercial quadrotor with a built-in Wi-Fi system. The drone is connected to the laptop via Wi-Fi, and the LEAP sensor is connected to the laptop via a USB port. The LEAP Motion sensor recognizes the hand gestures and relays them to the laptop. The laptop, acting as the server, runs the program that serves as the platform for this implementation. JavaScript embedded in HTML is the programming language used to interact with the AR Drone and convey the simple hand gestures via a web browser. In this implementation, we wrote JavaScript code to interpret the hand gestures captured by the LEAP sensor and transmit them as motion commands to control the AR Drone.
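The pipeline the abstract describes — Leap Motion hand data in, AR.Drone motion commands out — can be sketched in JavaScript as a small gesture-to-command mapping. The tilt threshold, the command names, and the `applyGesture` helper below are illustrative assumptions, not code from the report; the command names happen to match the `front`/`back`/`left`/`right`/`stop` methods exposed by the open-source `ar-drone` Node.js client, and palm pitch/roll angles are what the Leap JavaScript API reports per hand.

```javascript
// Minimal sketch of mapping Leap Motion palm angles to AR Drone commands.
// The 0.3 rad threshold is an assumed dead zone so that small hand tremors
// keep the drone hovering instead of drifting.
const TILT_THRESHOLD = 0.3; // radians

function mapGestureToCommand(pitchRad, rollRad) {
  // Pitch (tilt forward/back) takes priority over roll (tilt left/right).
  if (pitchRad < -TILT_THRESHOLD) return 'front';
  if (pitchRad > TILT_THRESHOLD) return 'back';
  if (rollRad > TILT_THRESHOLD) return 'left';
  if (rollRad < -TILT_THRESHOLD) return 'right';
  return 'hover';
}

// Hypothetical helper: translate a gesture into a call on a drone client
// object. With the `ar-drone` library the client would come from
// `require('ar-drone').createClient()`; here it is injected so the logic
// can be exercised without hardware.
function applyGesture(client, pitchRad, rollRad, speed) {
  const cmd = mapGestureToCommand(pitchRad, rollRad);
  if (cmd === 'hover') {
    client.stop(); // hover in place
  } else {
    client[cmd](speed); // e.g. client.front(0.3)
  }
  return cmd;
}
```

In a real setup, a Leap frame loop (e.g. `Leap.loop`) would read each hand's pitch and roll every frame and call `applyGesture` with the connected drone client.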
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
Restricted Access — 1.61 MB, Adobe PDF



Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.