Please use this identifier to cite or link to this item:
Title: PNPDet: Efficient Few-Shot Detection Without Forgetting via Plug-and-Play Sub-Networks
Authors: Zhang, Gongjie
Keywords: Engineering::Computer science and engineering
Issue Date: 2021
Source: Zhang, G., Cui, K., Wu, R., Lu, S., & Tian, Y. (2021). PNPDet: Efficient few-shot detection without forgetting via plug-and-play sub-networks. Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, 3823-3832.
Abstract: The human visual system can detect objects of unseen categories from merely a few examples, yet such capability remains absent in state-of-the-art detectors. To bridge this gap, several approaches perform few-shot detection by incorporating meta-learning techniques. Such methods improve detection performance on unseen categories, but they also incur a heavy computational burden and usually degrade detection performance on seen categories. In this paper, we present PNPDet, a novel Plug-and-Play Detector for efficient few-shot detection without forgetting. It introduces a simple yet effective architecture with separate sub-networks that disentangles the recognition of base and novel categories, preventing performance loss on known categories while new concepts are learned. Distance metric learning is further incorporated into the sub-networks, consistently boosting detection performance for both base and novel categories. Experiments show that the proposed PNPDet achieves competitive few-shot detection performance on unseen categories without losing accuracy on seen categories, while remaining efficient and flexible.
URI: https://hdl.handle.net/10356/146204
Rights: © 2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections: EEE Conference Papers
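The abstract's central idea, separate sub-networks so that adding novel-category recognition cannot disturb base-category recognition, combined with a distance-metric (cosine-similarity) classification head, can be illustrated with a minimal sketch. This is not the authors' code; the class and function names (`PlugAndPlayDetectorSketch`, `cosine_classifier`, `plug_in_novel`) and the prototype-based formulation are illustrative assumptions.

```python
import numpy as np


def cosine_classifier(features, prototypes, scale=20.0):
    """Distance-metric head (an assumption of the sketch): class scores are
    scaled cosine similarities between embeddings and per-class weight vectors."""
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    return scale * (f @ w.T)


class PlugAndPlayDetectorSketch:
    """Illustrative sketch of the disentangled design: a frozen base-class
    head plus a detachable novel-class sub-network sharing one feature space."""

    def __init__(self, base_prototypes):
        self.base_prototypes = base_prototypes  # fixed after base training
        self.novel_prototypes = None            # plugged in for few-shot classes

    def plug_in_novel(self, novel_prototypes):
        """Attach the novel-category sub-network without touching the base head."""
        self.novel_prototypes = novel_prototypes

    def classify(self, features):
        base_scores = cosine_classifier(features, self.base_prototypes)
        if self.novel_prototypes is None:
            return base_scores
        novel_scores = cosine_classifier(features, self.novel_prototypes)
        # Base and novel predictions are computed by separate heads, so
        # plugging in novel classes leaves base-class scores unchanged.
        return np.concatenate([base_scores, novel_scores], axis=1)
```

Because the base head's weights are never revisited when the novel sub-network is attached, the base-class scores before and after plugging in the novel head are identical, which is the "without forgetting" property the abstract claims.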
Updated on Apr 22, 2021
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.