Title: Compact gesture recognition algorithm using tiny machine learning
Authors: Li, Junying
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2023
Publisher: Nanyang Technological University
Source: Li, J. (2023). Compact gesture recognition algorithm using tiny machine learning. Master's thesis, Nanyang Technological University, Singapore.
Project: D-256-21221-04426
Abstract: For gesture recognition based on convolutional neural networks (CNNs), general-purpose processors are inefficient and cannot meet performance requirements. Much work on implementing convolutional neural networks on FPGAs has been carried out in recent years, but it remains a difficult task due to the computational complexity of convolution, limited hardware resources, and high-speed requirements. In this project, a convolutional neural network with the LeNet-5 architecture for gesture recognition is implemented on an FPGA. Four convolution structures are implemented and compared on the FPGA to find the best compromise between parallelism, speed, and resource utilization. The hardware description language is Verilog, and the hardware design and simulation toolchain is Vivado. Data are represented as IEEE 754 floating-point numbers, with both half-precision and full-precision implementations. The network, built with Python, achieves an accuracy of 97.5% and a loss of 1.047512 on the six-gesture classification task. The hardware resource utilization, power consumption, and clock cycles of each layer of the CNN implemented on the FPGA are estimated and evaluated.
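The abstract mentions that the FPGA datapath uses IEEE 754 floating point in both half-precision and full-precision variants. As a minimal illustrative sketch (not the author's code), the precision trade-off between the two formats can be observed in Python via the standard `struct` module, whose `'e'` and `'f'` format codes pack IEEE 754 half- and single-precision values respectively:

```python
import struct

def round_trip_half(x: float) -> float:
    """Round-trip a value through IEEE 754 half precision (16-bit)."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

def round_trip_single(x: float) -> float:
    """Round-trip a value through IEEE 754 single (full) precision (32-bit)."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

# A typical CNN weight magnitude: half precision keeps ~3 decimal digits,
# single precision keeps ~7, which drives the accuracy/resource trade-off
# when choosing a datapath width on the FPGA.
w = 0.1
half_err = abs(round_trip_half(w) - w)
single_err = abs(round_trip_single(w) - w)
print(f"half-precision error:   {half_err:.2e}")
print(f"single-precision error: {single_err:.2e}")
```

Half precision roughly halves storage and arithmetic width, at the cost of several decimal digits of precision; this is the compromise the thesis evaluates against hardware resource utilization.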
Schools: School of Electrical and Electronic Engineering 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Theses

Files in This Item:
File: Restricted Access, 3.7 MB, Adobe PDF

Page view(s): updated on Apr 15, 2024

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.