Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/162531
Title: Object counting using machine learning
Authors: Tang, Haihan
Keywords: Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Tang, H. (2022). Object counting using machine learning. Master's thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/162531
Abstract: In this thesis, to improve the accuracy of multi-modal crowd count estimation, a three-stream adaptive fusion network (TAFNet) and a scale-aware self-differential attention network (SDANet) are proposed. The proposed TAFNet adaptively extracts and fuses optical information with thermal information, increasing the effectiveness of multi-modal information fusion. The proposed SDANet utilizes multi-scale features to estimate the density map and predict the crowd count, which addresses the scale variation problem of crowds. Several novel modules are proposed to highlight scale information and avoid information redundancy. Experiments on the RGBT-CC benchmark show the effectiveness of the proposed methods for RGB-T crowd counting compared with state-of-the-art methods. Experiments on the ShanghaitechRGBD benchmark demonstrate that the proposed networks are also capable of RGB-D crowd counting. In addition, the estimated density maps are of high quality and are close to the ground-truth density maps.
URI: https://hdl.handle.net/10356/162531
DOI: 10.32657/10356/162531
Rights: This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0).
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections:EEE Theses

Files in This Item:
File: thesis_tanghaihan.pdf (32.19 MB, Adobe PDF)



Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.