Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/172038
Title: Computation-efficient knowledge distillation via uncertainty-aware mixup
Authors: Xu, Guodong; Liu, Ziwei; Loy, Chen Change
Keywords: Engineering::Computer science and engineering
Issue Date: 2023
Source: Xu, G., Liu, Z. & Loy, C. C. (2023). Computation-efficient knowledge distillation via uncertainty-aware mixup. Pattern Recognition, 138, 109338. https://dx.doi.org/10.1016/j.patcog.2023.109338
Journal: Pattern Recognition
Abstract: Knowledge distillation (KD) has emerged as an essential technique not only for model compression, but also for other learning tasks such as continual learning. Given the richer application spectrum and potential online usage of KD, knowledge distillation efficiency becomes a pivotal component. In this work, we study this little-explored but important topic. Unlike previous works that focus solely on the accuracy of the student network, we attempt to achieve a harder goal: obtaining performance comparable to conventional KD at a lower computation cost during the transfer. To this end, we present UNcertainty-aware mIXup (UNIX), an effective approach that reduces transfer cost by 20% to 30% while maintaining comparable, or achieving even better, student performance than conventional KD. This is made possible via effective uncertainty sampling and a novel adaptive mixup approach that selects informative samples dynamically from ample data and compacts the knowledge in these samples. We show that our approach inherently performs hard sample mining. We demonstrate the applicability of our approach to improving various existing KD approaches by reducing their queries to a teacher network. Extensive experiments are performed on CIFAR100 and ImageNet. Code and model are available at https://github.com/xuguodong03/UNIXKD.
URI: https://hdl.handle.net/10356/172038
ISSN: 0031-3203
DOI: 10.1016/j.patcog.2023.109338
Schools: School of Computer Science and Engineering
Rights: © 2023 Elsevier Ltd. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections: SCSE Journal Articles
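The abstract describes two ingredients: uncertainty sampling (query the teacher only on samples the student is unsure about) and an adaptive mixup that folds the remaining, more certain samples into the selected ones. The following is a minimal illustrative sketch of that idea in NumPy; it is not the authors' implementation, and the entropy-based uncertainty measure, the function names, and the mixing-coefficient range are all assumptions made here for illustration.

```python
import numpy as np

def uncertainty_scores(student_probs):
    # Predictive entropy of the student's softmax outputs, one common
    # choice of uncertainty measure (an assumption in this sketch).
    eps = 1e-12
    return -np.sum(student_probs * np.log(student_probs + eps), axis=1)

def uncertainty_aware_mixup_batch(x, student_probs, keep_ratio=0.75, rng=None):
    """Keep the most uncertain samples and mix each discarded (certain)
    sample into a kept one, so the teacher is queried on a smaller batch."""
    rng = np.random.default_rng(0) if rng is None else rng
    n = x.shape[0]
    k = max(1, int(round(keep_ratio * n)))
    order = np.argsort(-uncertainty_scores(student_probs))  # most uncertain first
    keep, drop = order[:k], order[k:]
    mixed = x[keep].copy()
    lam = np.ones(k)  # mixing weight of the kept (uncertain) sample
    for i, j in enumerate(drop):
        # Pair each dropped sample with a kept one; the kept, uncertain
        # sample dominates the mix (lambda in [0.5, 1.0] is an assumption).
        l = rng.uniform(0.5, 1.0)
        mixed[i % k] = l * mixed[i % k] + (1 - l) * x[j]
        lam[i % k] = l
    return mixed, keep, lam
```

With `keep_ratio=0.75`, a batch of 8 inputs yields 6 teacher queries, a 25% reduction in transfer cost in the spirit of the 20% to 30% savings reported in the abstract.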
SCOPUS Citations: 20 (updated Mar 8, 2025)
Page view(s): 120 (updated Mar 14, 2025)
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.