Title: Lightweight salient object detection in optical remote sensing images via feature correlation
Authors: Li, Gongyang
Liu, Zhi
Bai, Zhen
Lin, Weisi
Ling, Haibin
Keywords: Engineering::Computer science and engineering
Issue Date: 2022
Source: Li, G., Liu, Z., Bai, Z., Lin, W. & Ling, H. (2022). Lightweight salient object detection in optical remote sensing images via feature correlation. IEEE Transactions on Geoscience and Remote Sensing, 60.
Project: MOE2016-T2-2-057(S)
Journal: IEEE Transactions on Geoscience and Remote Sensing
Abstract: Salient object detection in optical remote sensing images (ORSI-SOD) has been widely explored for understanding ORSIs. However, previous methods focus mainly on improving detection accuracy while neglecting memory and computation costs, which may hinder their real-world applications. In this paper, we propose a novel lightweight ORSI-SOD solution, named CorrNet, to address these issues. In CorrNet, we first lighten the backbone (VGG-16) and build a lightweight subnet for feature extraction. Then, following the coarse-to-fine strategy, we generate an initial coarse saliency map from high-level semantic features in a Correlation Module (CorrM). The coarse saliency map serves as the location guidance for low-level features. In CorrM, we mine the object location information between high-level semantic features through the cross-layer correlation operation. Finally, based on low-level detailed features, we refine the coarse saliency map in the refinement subnet equipped with Dense Lightweight Refinement Blocks, and produce the final fine saliency map. By reducing the parameters and computations of each component, CorrNet ends up with only 4.09M parameters and runs at 21.09G FLOPs. Experimental results on two public datasets demonstrate that our lightweight CorrNet achieves competitive or even better performance compared with 26 state-of-the-art methods (including 16 large CNN-based methods and 2 lightweight methods), while enjoying clear memory and run-time efficiency. The code and results of our method are available at
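The cross-layer correlation described in the abstract can be loosely illustrated as a channel-wise dot product between two same-shaped high-level feature maps, squashed into a coarse saliency map. This is a minimal sketch under that assumption; the function names, toy shapes, and the sigmoid squashing step are illustrative, not the paper's actual implementation.

```python
# Sketch (assumed): cross-layer correlation between two high-level
# feature maps, reduced to a single location map via a channel-wise
# dot product. Feature maps are nested lists shaped [C][H][W].
from math import exp


def cross_layer_correlation(feat_a, feat_b):
    """Correlate two same-shaped feature maps channel-wise.

    Returns an H x W map where each position holds the dot product of
    the two C-dimensional feature vectors at that position.
    """
    channels = len(feat_a)
    height, width = len(feat_a[0]), len(feat_a[0][0])
    corr = [[0.0] * width for _ in range(height)]
    for h in range(height):
        for w in range(width):
            corr[h][w] = sum(feat_a[c][h][w] * feat_b[c][h][w]
                             for c in range(channels))
    return corr


def coarse_saliency(corr):
    """Squash the correlation map to (0, 1) as a coarse saliency map."""
    return [[1.0 / (1.0 + exp(-v)) for v in row] for row in corr]


# Toy example: two 2-channel 2x2 feature maps that agree on the diagonal.
fa = [[[1.0, 0.0], [0.0, 1.0]], [[0.5, 0.0], [0.0, 0.5]]]
fb = [[[1.0, 0.0], [0.0, 1.0]], [[1.0, 0.0], [0.0, 1.0]]]
coarse = cross_layer_correlation(fa, fb)
# coarse[0][0] = 1*1 + 0.5*1 = 1.5; off-diagonal positions stay 0.0
salient = coarse_saliency(coarse)
```

In the paper this coarse map then guides the refinement subnet over low-level detailed features; the sketch stops at the coarse-map stage.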
ISSN: 0196-2892
DOI: 10.1109/TGRS.2022.3145483
Rights: © 2022 IEEE. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:SCSE Journal Articles

Citations: 20 (updated Feb 4, 2023)

Web of Science™ citations: 20 (updated Feb 4, 2023)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.