Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/169051
Title: SSD-KD: a self-supervised diverse knowledge distillation method for lightweight skin lesion classification using dermoscopic images
Authors: Wang, Yongwei
Wang, Yuheng
Cai, Jiayue
Lee, Tim K.
Miao, Chunyan
Wang, Jane Z.
Keywords: Engineering::Computer science and engineering
Issue Date: 2023
Source: Wang, Y., Wang, Y., Cai, J., Lee, T. K., Miao, C. & Wang, J. Z. (2023). SSD-KD: a self-supervised diverse knowledge distillation method for lightweight skin lesion classification using dermoscopic images. Medical Image Analysis, 84, 102693. https://dx.doi.org/10.1016/j.media.2022.102693
Journal: Medical Image Analysis 
Abstract: Skin cancer is one of the most common types of malignancy, affecting a large population and causing a heavy economic burden worldwide. Over the last few years, computer-aided diagnosis has developed rapidly and made great progress in healthcare and medical practice, owing to advances in artificial intelligence, particularly the adoption of convolutional neural networks. However, most studies on skin cancer detection pursue high prediction accuracy without considering the limited computing resources of portable devices. Knowledge distillation (KD) has proven to be an efficient tool for improving the adaptability of lightweight models under limited resources while retaining a high-level representation capability. To bridge this gap, this study proposes a novel method, termed SSD-KD, that unifies diverse knowledge into a generic KD framework for skin disease classification. Our method models an intra-instance relational feature representation and integrates it with existing KD research. A dual relational knowledge distillation architecture is trained in a self-supervised manner, while weighted softened outputs are also exploited so that the student model can capture richer knowledge from the teacher model. To demonstrate the effectiveness of our method, we conduct experiments on ISIC 2019, a large-scale open-access benchmark of dermoscopic images of skin diseases. Experiments show that our distilled MobileNetV2 achieves an accuracy as high as 85% on the classification task of 8 skin diseases with minimal parameters and computing requirements. Ablation studies confirm the effectiveness of our intra- and inter-instance relational knowledge integration strategy. Compared with state-of-the-art knowledge distillation techniques, the proposed method demonstrates improved performance. To the best of our knowledge, this is the first deep knowledge distillation application for multi-disease classification on a large-scale dermoscopy database. Our code and models are available at https://github.com/enkiwang/Portable-Skin-Lesion-Diagnosis.
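For readers unfamiliar with the "weighted softened outputs" mentioned in the abstract, the sketch below shows a temperature-scaled soft-target knowledge distillation loss in PyTorch. This is only the standard Hinton-style KD term, not the full SSD-KD objective (which additionally includes self-supervised intra- and inter-instance relational knowledge); the function name and the parameters T and alpha are illustrative assumptions, not taken from the paper or its repository.

```python
# Minimal sketch of a temperature-scaled soft-target KD loss (Hinton-style).
# Illustrates the "weighted softened outputs" idea only; the actual SSD-KD
# objective also contains self-supervised relational distillation terms.
import torch
import torch.nn.functional as F

def soft_target_kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # Softened teacher distribution and student log-distribution at temperature T.
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    log_soft_student = F.log_softmax(student_logits / T, dim=1)
    # KL divergence between softened outputs, scaled by T^2 (standard practice).
    kd = F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * (T * T)
    # Ordinary cross-entropy on the hard ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    # Weighted combination: alpha controls how much softened teacher knowledge is used.
    return alpha * kd + (1.0 - alpha) * ce
```

In practice the teacher logits would come from a large frozen network and the student from a lightweight model such as MobileNetV2, with the weighted loss backpropagated through the student only.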
URI: https://hdl.handle.net/10356/169051
ISSN: 1361-8415
DOI: 10.1016/j.media.2022.102693
Schools: School of Computer Science and Engineering 
Research Centres: Joint NTU-UBC Research Centre of Excellence in Active Living for the Elderly (LILY) 
Rights: © 2022 Elsevier B.V. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:SCSE Journal Articles

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.