Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/155736
Title: Progressive self-guided loss for salient object detection
Authors: Yang, Sheng
Lin, Weisi
Lin, Guosheng
Jiang, Qiuping
Liu, Zichuan
Keywords: Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
Issue Date: 2021
Source: Yang, S., Lin, W., Lin, G., Jiang, Q. & Liu, Z. (2021). Progressive self-guided loss for salient object detection. IEEE Transactions on Image Processing, 30, 8426-8438. https://dx.doi.org/10.1109/TIP.2021.3113794
Project: MOE2016-T2-2-057(S) 
Journal: IEEE Transactions on Image Processing 
Abstract: We present a simple yet effective progressive self-guided loss function to facilitate deep learning-based salient object detection (SOD) in images. The saliency maps produced by the most relevant works still suffer from incomplete predictions due to the internal complexity of salient objects. Our proposed progressive self-guided loss simulates a morphological closing operation on the model predictions for progressively creating auxiliary training supervisions to step-wisely guide the training process. We demonstrate that this new loss function can guide the SOD model to highlight more complete salient objects step-by-step and meanwhile help to uncover the spatial dependencies of the salient object pixels in a region growing manner. Moreover, a new feature aggregation module is proposed to capture multi-scale features and aggregate them adaptively by a branch-wise attention mechanism. Benefiting from this module, our SOD framework takes advantage of adaptively aggregated multi-scale features to locate and detect salient objects effectively. Experimental results on several benchmark datasets show that our loss function not only advances the performance of existing SOD models without architecture modification but also helps our proposed framework to achieve state-of-the-art performance.
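Illustrative sketch (not the authors' published implementation): the snippet below shows one plausible way to realise the idea summarised in the abstract in PyTorch. It assumes the morphological closing is approximated by max-pooling (dilation) followed by min-pooling (erosion), that the auxiliary supervision is the closed prediction restricted to the ground-truth region, and that the two terms are simply summed; the function names, kernel size, and weighting are all assumptions.

# Minimal sketch, assuming closing ~ max-pool (dilation) then min-pool (erosion).
# All names, the kernel size, and the GT gating are illustrative assumptions,
# not the published method.
import torch
import torch.nn.functional as F

def morphological_closing(pred, kernel_size=5):
    """Approximate a closing on a (N, 1, H, W) map in [0, 1]."""
    pad = kernel_size // 2
    dilated = F.max_pool2d(pred, kernel_size, stride=1, padding=pad)
    eroded = -F.max_pool2d(-dilated, kernel_size, stride=1, padding=pad)
    return eroded

def progressive_self_guided_loss(pred, gt, kernel_size=5):
    """BCE against the ground truth plus BCE against an auxiliary map
    built by closing the current prediction (hypothetical weighting)."""
    base = F.binary_cross_entropy(pred, gt)
    with torch.no_grad():
        # Auxiliary supervision: closed prediction kept inside the GT region,
        # so the guidance grows toward the complete object without leaking out.
        aux = (morphological_closing(pred, kernel_size) * gt).clamp(0, 1)
    guided = F.binary_cross_entropy(pred, aux)
    return base + guided

# Usage: pred and gt are (N, 1, H, W) tensors with values in [0, 1].
pred = torch.sigmoid(torch.randn(2, 1, 64, 64))
gt = (torch.rand(2, 1, 64, 64) > 0.5).float()
loss = progressive_self_guided_loss(pred, gt)

In this reading, the auxiliary target is recomputed from the model's own prediction at each step, which is what lets the supervision grow progressively as training proceeds; the branch-wise attention aggregation module described in the abstract is not sketched here.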
URI: https://hdl.handle.net/10356/155736
ISSN: 1057-7149
DOI: 10.1109/TIP.2021.3113794
Rights: © 2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: https://doi.org/10.1109/TIP.2021.3113794.
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections: EEE Journal Articles
SCSE Journal Articles

Files in This Item:
File: ys_tipfinal.pdf
Size: 3.71 MB
Format: Adobe PDF

Citations (updated on Mar 20, 2023): Scopus: 18; Web of Science: 17
Page view(s) (updated on Mar 20, 2023): 117
Download(s) (updated on Mar 20, 2023): 53

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.