Title: Fast OWDETR: transformer for open world object detection
Authors: Chen, Xuanying
Keywords: Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Chen, X. (2022). Fast OWDETR: transformer for open world object detection. Master's thesis, Nanyang Technological University, Singapore.
Abstract: Object detection is one of the basic computer vision tasks. Recently, a more challenging task, open world object detection, has been proposed: it aims to identify novel unknown objects and to incrementally learn to classify them once labels become available. Open World Object Detector (ORE) and Open-world Detection Transformer (OWDETR) are two methodologies proposed to address the open world task, but both are time-consuming to train and have other shortcomings. Aiming to improve training speed and detection performance, we propose Fast OWDETR, a transformer-based approach built on OWDETR. Specifically, we replace the attention-driven pseudo-labeling mechanism in OWDETR with a logits-based one, and change the standard Deformable DETR into Deformable DETR with box refinement. To shorten the transfer time between tasks, we present an incremental learning approach that dynamically reduces the number of trainable parameters in the classification head while keeping the backbone frozen after initial training. Our extensive experiments show that Fast OWDETR achieves detection performance comparable to OWDETR while using less training time both within and between tasks.
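The abstract's logits-based pseudo-labeling idea can be sketched roughly as follows: among query proposals not matched to any ground-truth box, the ones with the highest classification logits are relabeled as "unknown" objects. This is a minimal illustrative sketch; the function name, inputs, and top-k selection rule are assumptions, not the thesis's exact scheme.

```python
# Hypothetical sketch of logits-based pseudo-labeling for unknown objects.
# All names and the top-k rule here are illustrative assumptions; the
# thesis's actual mechanism may differ in its details.

def select_unknown_pseudo_labels(class_logits, matched, top_k=5):
    """Pick the top_k unmatched proposals with the highest maximum
    class logit and treat them as 'unknown' pseudo-labels.

    class_logits: per-query lists of logits over the known classes
    matched: per-query bools, True if matched to a ground-truth box
    Returns the indices of queries selected as unknown objects.
    """
    # Only unmatched queries are candidates for the unknown class.
    candidates = [
        (max(logits), i)
        for i, (logits, m) in enumerate(zip(class_logits, matched))
        if not m
    ]
    # Highest-confidence (largest max logit) candidates first.
    candidates.sort(reverse=True)
    return [i for _, i in candidates[:top_k]]
```

For example, with four queries where only the first is matched, the two unmatched queries with the largest peak logits would be selected.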
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Theses

Files in This Item:
Restricted Access (17.73 MB, Adobe PDF)

Page view(s): updated on Dec 8, 2022

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.