Title: Interactive gesture-based music on a mobile phone
Authors: Ng, Joseph Heng Qi
Keywords: DRNTU::Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
DRNTU::Visual arts and music::Performing arts
Issue Date: 2019
Abstract: This project concerns using hand gestures to produce musical sounds with a mobile phone and its camera. The end goal is for the user, with the aid of additional props, to generate musical sounds from their mobile phone through hand-gesture movements alone. It is important to note that the purpose of the project is not to create a musical instrument that can replace traditional instruments such as the guitar or the piano, but to serve as a new form of musical expression, one that has not been explored deeply before. The end user needs only a basic understanding of music theory to use the application fully; no prior technical skill is required, so all users should be able to use the application without difficulty. The project uses the Android platform, targeting API level 21. The OpenCV library is used for image processing, and the JSyn library is used for audio synthesis.
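A system of this kind typically maps a detected hand position to a pitch before handing the value to the synthesis library. The report's actual mapping is not stated here; the sketch below is a hypothetical illustration in Java (the project's platform language), assuming a normalized vertical hand position is quantized to one octave of equal-tempered semitones. The names `positionToMidiNote` and `midiToFrequency` are illustrative, not from the report.

```java
public class GesturePitch {
    // Map a normalized vertical position (0.0 = bottom, 1.0 = top) onto one
    // octave starting at middle C (MIDI notes 60-72), clamping to that range.
    static int positionToMidiNote(double y) {
        int offset = (int) Math.round(y * 12);
        return 60 + Math.max(0, Math.min(12, offset));
    }

    // Standard equal-temperament conversion: A4 (MIDI note 69) = 440 Hz.
    static double midiToFrequency(int note) {
        return 440.0 * Math.pow(2.0, (note - 69) / 12.0);
    }

    public static void main(String[] args) {
        // A hand three-quarters of the way up the frame lands on A4.
        System.out.println(midiToFrequency(positionToMidiNote(0.75)));
    }
}
```

A frequency produced this way could then be fed to a JSyn unit generator (e.g. a sine oscillator's frequency port) on each processed camera frame.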
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: FYP ReportFinal.pdf (Restricted Access)
Description: FYP Report
Size: 3.27 MB
Format: Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.