Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/162785
Title: Towards efficient and effective face forgery detection
Authors: Peng, Weixing
Keywords: Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Peng, W. (2022). Towards efficient and effective face forgery detection. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/162785
Project: SCSE21-0635
Abstract: Rapid progress in face forgery algorithms is becoming an increasingly relevant threat in the modern day. To address this issue, multiple state-of-the-art forgery detection algorithms have been proposed. However, these models rely on deep and complex Convolutional Neural Networks (CNNs) as their backbones, making them impractical to deploy in a production environment due to their high computational requirements. Fortunately, research on network pruning techniques has made tremendous progress in recent years, making it possible to compress CNN models by up to 90% while retaining comparable accuracy. In this project, we experimented with and evaluated multiple pruning strategies and techniques. We then applied our findings by pruning a baseline face forgery detection model, achieving a 2x speedup in model inference time and a 70% reduction in model size. When run on CPU-only hardware, the pruned model achieved a 30x speedup.
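The abstract refers to magnitude-based network pruning of a CNN backbone. The following is a minimal, hedged sketch of global magnitude pruning using PyTorch's torch.nn.utils.prune; it is not the author's pipeline, and the ResNet-18 stand-in, the 90% sparsity target, and the binary-classification head are illustrative assumptions only.

```python
# Minimal sketch of global magnitude pruning with PyTorch (assumption: not the
# author's actual method or backbone; ResNet-18 and 90% sparsity are illustrative).
import torch
import torch.nn.utils.prune as prune
from torchvision.models import resnet18

# Hypothetical stand-in for a face forgery detection backbone (real vs. fake).
model = resnet18(num_classes=2)

# Gather every Conv2d weight tensor and prune the smallest 90% by global L1 magnitude.
params_to_prune = [
    (module, "weight")
    for module in model.modules()
    if isinstance(module, torch.nn.Conv2d)
]
prune.global_unstructured(
    params_to_prune,
    pruning_method=prune.L1Unstructured,
    amount=0.9,  # fraction of weights zeroed; echoes the ~90% compression cited above
)

# Make the pruning permanent by removing the reparameterization masks.
for module, name in params_to_prune:
    prune.remove(module, name)

# Report the resulting global sparsity.
total = sum(p.numel() for p in model.parameters())
zeros = sum((p == 0).sum().item() for p in model.parameters())
print(f"Global sparsity: {zeros / total:.1%}")
```

Note that unstructured pruning of this kind zeroes individual weights; realizing wall-clock speedups such as those reported in the abstract typically also requires structured pruning or sparse-aware inference runtimes.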
URI: https://hdl.handle.net/10356/162785
Schools: School of Computer Science and Engineering 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections: SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: FYP_final1_Peng_weixing.pdf (Restricted Access)
Size: 1.04 MB
Format: Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.