Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/155808
Title: HACScale: hardware-aware compound scaling for resource-efficient DNNs
Authors: Kong, Hao
Liu, Di
Luo, Xiangzhong
Liu, Weichen
Subramaniam, Ravi
Keywords: Engineering::Computer science and engineering
Issue Date: 2022
Source: Kong, H., Liu, D., Luo, X., Liu, W. & Subramaniam, R. (2022). HACScale : hardware-aware compound scaling for resource-efficient DNNs. 2022 27th Asia and South Pacific Design Automation Conference (ASP-DAC), 708-713. https://dx.doi.org/10.1109/ASP-DAC52403.2022.9712593
Project: M4082282
M4082087
Abstract: Model scaling is an effective way to improve the accuracy of deep neural networks (DNNs) by increasing the model capacity. However, existing approaches seldom consider the underlying hardware, causing inefficient utilization of hardware resources and consequently high inference latency. In this paper, we propose HACScale, a hardware-aware model scaling strategy that fully exploits hardware resources for higher accuracy. In HACScale, different dimensions of DNNs are jointly scaled with consideration of their contributions to hardware utilization and accuracy. To improve the efficiency of width scaling, we introduce importance-aware width scaling in HACScale, which computes the importance of each layer to the accuracy and scales each layer accordingly to optimize the trade-off between accuracy and model parameters. Experiments show that HACScale improves hardware utilization by 1.92× on ImageNet; as a result, it achieves a 2.41% accuracy improvement with a negligible latency increase of 0.6%. On CIFAR-10, HACScale improves the accuracy by 2.23% with only 6.5% latency growth.
URI: https://hdl.handle.net/10356/155808
ISBN: 9781665421355
DOI: 10.1109/ASP-DAC52403.2022.9712593
Rights: © 2022 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: https://doi.org/10.1109/ASP-DAC52403.2022.9712593.
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Conference Papers

Files in This Item:
File: asp_dac2022.pdf | Size: 359.43 kB | Format: Adobe PDF

Page view(s): 27 (updated on May 16, 2022)
Download(s): 5 (updated on May 16, 2022)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.