Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/149071
Full metadata record
DC Field | Value | Language
dc.contributor.author | Zhang, Gongjie | en_US
dc.contributor.author | Lu, Shijian | en_US
dc.contributor.author | Zhang, Wei | en_US
dc.date.accessioned | 2021-05-28T05:43:02Z | -
dc.date.available | 2021-05-28T05:43:02Z | -
dc.date.issued | 2019 | -
dc.identifier.citation | Zhang, G., Lu, S. & Zhang, W. (2019). CAD-Net: a context-aware detection network for objects in remote sensing imagery. IEEE Transactions on Geoscience and Remote Sensing, 57(12), 10015-10024. https://dx.doi.org/10.1109/TGRS.2019.2930982 | en_US
dc.identifier.issn | 0196-2892 | en_US
dc.identifier.uri | https://hdl.handle.net/10356/149071 | -
dc.description.abstract | Accurate and robust detection of multi-class objects in optical remote sensing images is essential to many real-world applications such as urban planning, traffic control, and search and rescue. However, state-of-the-art object detection techniques designed for images captured by ground-level sensors usually experience a sharp performance drop when directly applied to remote sensing images, largely due to differences in object appearance in remote sensing images in terms of sparse texture, low contrast, arbitrary orientations, large scale variations, etc. This paper presents a novel object detection network (CAD-Net) that exploits attention-modulated features as well as global and local contexts to address the new challenges in detecting objects from remote sensing images. The proposed CAD-Net learns global and local contexts of objects by capturing their correlations with the global scene (at scene level) and with neighboring objects or features (at object level), respectively. In addition, it introduces a spatial-and-scale-aware attention module that guides the network to focus on more informative regions and features as well as more appropriate feature scales. Experiments over two publicly available object detection datasets for remote sensing images demonstrate that the proposed CAD-Net achieves superior detection performance. The implementation code will be made publicly available to facilitate future research. | en_US
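The abstract describes an attention module that modulates features so the network emphasises informative regions. The paper's actual design is not reproduced in this record; the following is only a minimal illustrative sketch of the general spatial-attention idea, with all names (`spatial_attention`, `w`, `b`) hypothetical and a per-pixel channel projection standing in for a learned 1x1 convolution.

```python
import numpy as np

def sigmoid(x):
    """Logistic squashing function, mapping logits to (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def spatial_attention(features, w, b=0.0):
    """Modulate a (C, H, W) feature map with a single-channel spatial
    attention map.  The map is a per-pixel linear projection of the
    channels (the 1x1-convolution analogue), squashed to (0, 1) by a
    sigmoid, so high-scoring regions are emphasised and the rest are
    suppressed.  `w` (shape (C,)) and `b` are assumed learned weights."""
    # Contract the channel axis: (C,) . (C, H, W) -> (H, W) logits.
    logits = np.tensordot(w, features, axes=([0], [0])) + b
    attn = sigmoid(logits)                    # attention map in (0, 1)
    return features * attn[None, :, :], attn  # broadcast over channels

# Toy usage: 8 channels on a 4x4 spatial grid.
rng = np.random.default_rng(0)
feats = rng.standard_normal((8, 4, 4))
w = rng.standard_normal(8)
out, attn = spatial_attention(feats, w)
```

In the paper the attention is additionally scale-aware (applied across feature scales); this sketch shows only the spatial modulation at a single scale.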
dc.description.sponsorship | Nanyang Technological University | en_US
dc.language.iso | en | en_US
dc.relation.ispartof | IEEE Transactions on Geoscience and Remote Sensing | en_US
dc.rights | © 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: https://doi.org/10.1109/TGRS.2019.2930982. | en_US
dc.subject | Engineering::Computer science and engineering | en_US
dc.title | CAD-Net: a context-aware detection network for objects in remote sensing imagery | en_US
dc.type | Journal Article | en
dc.contributor.school | School of Computer Science and Engineering | en_US
dc.identifier.doi | 10.1109/TGRS.2019.2930982 | -
dc.description.version | Accepted version | en_US
dc.identifier.arxiv | 1903.00857 | -
dc.identifier.issue | 12 | en_US
dc.identifier.volume | 57 | en_US
dc.identifier.spage | 10015 | en_US
dc.identifier.epage | 10024 | en_US
dc.subject.keywords | Optical Remote Sensing Images | en_US
dc.subject.keywords | Object Detection | en_US
dc.subject.keywords | Deep Learning | en_US
dc.subject.keywords | Convolutional Neural Networks (CNNs) | en_US
dc.description.acknowledgement | This work was supported in part by the Nanyang Technological University under Start-Up Grant, in part by the National Key Research and Development Plan of China under Grant 2017YFB1300205, in part by the National Natural Science Foundation of China (NSFC) under Grant 61573222, and in part by the Major Research Program of Shandong Province under Grant 2018CXGC1503. | en_US
item.fulltext | With Fulltext | -
item.grantfulltext | open | -
Appears in Collections:SCSE Journal Articles
Files in This Item:
File | Description | Size | Format
cad-net.pdf |  | 5.08 MB | Adobe PDF

SCOPUS™ citations: 320 (updated on Mar 22, 2024)
Web of Science™ citations: 154 (updated on Oct 26, 2023)
Page view(s): 305 (updated on Mar 27, 2024)
Download(s): 320 (updated on Mar 27, 2024)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.