Title: Natural gesture based control for quadcopter
Authors: Chang, Yi Wei
Keywords: DRNTU::Engineering::Computer science and engineering
Issue Date: 2015
Abstract: Human-Computer Interaction has progressed rapidly over the years, and the opportunities it offers are vast. Through a Natural User Interface, natural gestures can be used to control a quadcopter: the user becomes the controller and sends commands to the quadcopter simply by changing body posture. Flying a quadcopter differs from driving a remote-controlled car because, unlike a car, a quadcopter moves along three axes. The Microsoft Kinect is therefore used, as it can capture depth information from the surroundings. The Kinect tracks the user's skeleton data to determine the current posture; each posture is classified as a specific command, which is then sent to the quadcopter, in this case the Parrot AR.Drone 2.0. Thresholds were set for each posture to provide a reasonable amount of leeway, ensuring that the user does not accidentally issue an unintended command. Commands are sent to the quadcopter over Wi-Fi using the User Datagram Protocol to reduce bandwidth overhead and latency, enabling the quadcopter to respond promptly to the user's commands.
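The pipeline the abstract describes — skeleton joints in, a thresholded posture classifier, and UDP delivery of AT commands — can be sketched as follows. This is a minimal illustration, not the report's implementation: the joint coordinates, the 0.25 m dead zone, and the posture-to-command mapping are assumptions, while the AR.Drone 2.0's use of ASCII AT commands over UDP port 5556 at 192.168.1.1 comes from Parrot's published SDK conventions.

```python
import socket

# Constant bits of the AT*REF argument per the AR.Drone developer guide;
# bit 9 set on top of them requests take-off. AT commands end with '\r'.
TAKEOFF_CMD = "AT*REF={seq},290718208\r"

def classify_posture(left_hand_y, right_hand_y, shoulder_y, threshold=0.25):
    """Map a simple two-arm posture to a flight command.

    Joint heights are in metres, as a Kinect skeleton stream would give
    them. The `threshold` dead zone around shoulder height is the
    "leeway" the abstract mentions: small, unintentional movements fall
    inside it and do not trigger a command.
    """
    if (left_hand_y > shoulder_y + threshold and
            right_hand_y > shoulder_y + threshold):
        return "takeoff"          # both hands raised well above shoulders
    if (left_hand_y < shoulder_y - threshold and
            right_hand_y < shoulder_y - threshold):
        return "land"             # both hands lowered well below shoulders
    return "hover"                # inside the dead zone: no new command

def send_at_command(sock, seq, template, host="192.168.1.1", port=5556):
    """Send one AT command datagram; `seq` must increase each time."""
    sock.sendto(template.format(seq=seq).encode("ascii"), (host, port))

if __name__ == "__main__":
    # Hypothetical frame: hands at 1.6 m, shoulders at 1.3 m.
    command = classify_posture(1.6, 1.6, 1.3)
    if command == "takeoff":
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        send_at_command(sock, 1, TAKEOFF_CMD)
        sock.close()
```

UDP is a natural fit here: each datagram carries one self-contained command, and a lost frame is harmless because the next Kinect frame re-issues the current posture anyway, so retransmission latency would only hurt responsiveness.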
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: Natural gesture based control for Quadcopter.pdf (Restricted Access)
Size: 1.04 MB
Format: Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.