Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/152852
Title: ACSL: adaptive correlation-driven sparsity learning for deep neural network compression
Authors: He, Wei
Wu, Meiqing
Lam, Siew-Kei
Keywords: Engineering::Computer science and engineering
Issue Date: 2021
Source: He, W., Wu, M. & Lam, S. (2021). ACSL: adaptive correlation-driven sparsity learning for deep neural network compression. Neural Networks, 144, 465-477. https://dx.doi.org/10.1016/j.neunet.2021.09.012
Journal: Neural Networks 
Abstract: Deep convolutional neural network compression has attracted considerable attention due to the need to deploy accurate models on resource-constrained edge devices. Existing techniques mostly focus on compressing networks for image-level classification, and it is unclear whether they generalize well to network architectures for more challenging pixel-level tasks, e.g., dense crowd counting or semantic segmentation. In this paper, we propose an adaptive correlation-driven sparsity learning (ACSL) framework for channel pruning that outperforms state-of-the-art methods on both image-level and pixel-level tasks. In our ACSL framework, we first quantify the data-dependent channel correlation information with a channel affinity matrix. Next, we leverage these inter-dependencies to induce sparsity in the channels through an adaptive penalty strength. After removing the redundant channels, we obtain compact and efficient models that have significantly fewer parameters while maintaining performance comparable to the original models. We demonstrate the advantages of our proposed approach on three popular vision tasks, i.e., dense crowd counting, semantic segmentation, and image-level classification. The experimental results demonstrate the superiority of our framework. In particular, for crowd counting on the Mall dataset, the proposed ACSL framework reduces parameters by up to 94% (VGG16-Decoder) and FLOPs by up to 84% (ResNet101), while matching (and at times outperforming) the performance of the original model.
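
The abstract outlines a two-step procedure: build a data-dependent channel affinity matrix, then use it to set per-channel penalty strengths that push redundant channels toward zero before pruning. Since the full text is not available here, the following is only a minimal PyTorch-style sketch under stated assumptions: affinity is taken as the absolute Pearson correlation between flattened channel activations, the adaptive penalty scales a base L1 strength by each channel's mean affinity to the others, and the penalty is applied to BatchNorm scale factors, as is common in channel-pruning methods. The function names and formulas below are hypothetical illustrations, not ACSL's published definitions.

import torch

def channel_affinity(features: torch.Tensor) -> torch.Tensor:
    # features: (N, C, H, W) activations from one layer.
    # Affinity here = absolute Pearson correlation between channels
    # (an assumption; the paper's exact affinity measure is not given in this record).
    n, c, h, w = features.shape
    x = features.permute(1, 0, 2, 3).reshape(c, -1)  # one row of activations per channel
    x = x - x.mean(dim=1, keepdim=True)              # center each channel
    x = x / (x.norm(dim=1, keepdim=True) + 1e-8)     # normalize to unit length
    return (x @ x.t()).abs()                         # (C, C) affinity matrix, values in [0, 1]

def adaptive_penalty(affinity: torch.Tensor, base: float = 1e-4) -> torch.Tensor:
    # Hypothetical scaling rule: channels that correlate strongly with the
    # rest (i.e., carry redundant information) receive a larger penalty.
    c = affinity.size(0)
    redundancy = (affinity.sum(dim=1) - affinity.diagonal()) / (c - 1)
    return base * (1.0 + redundancy)                 # (C,) per-channel penalty strengths

def sparsity_loss(bn_scales: torch.Tensor, penalty: torch.Tensor) -> torch.Tensor:
    # Weighted L1 on BatchNorm scale factors, a common proxy for channel
    # importance in sparsity-learning pruning approaches.
    return (penalty * bn_scales.abs()).sum()

In training, such a loss would be added to the task loss; channels whose scale factors fall below a threshold are then removed, yielding the compact model the abstract describes.
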
URI: https://hdl.handle.net/10356/152852
ISSN: 0893-6080
DOI: 10.1016/j.neunet.2021.09.012
Rights: © 2021 Elsevier Ltd. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections: SCSE Journal Articles
