Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/170688
Title: Edge detection guide network for semantic segmentation of remote-sensing images
Authors: Jin, Jianhui
Zhou, Wujie
Yang, Rongwang
Ye, Lv
Yu, Lu
Keywords: Engineering::Computer science and engineering
Issue Date: 2023
Source: Jin, J., Zhou, W., Yang, R., Ye, L. & Yu, L. (2023). Edge detection guide network for semantic segmentation of remote-sensing images. IEEE Geoscience and Remote Sensing Letters, 20, 5000505-. https://dx.doi.org/10.1109/LGRS.2023.3234257
Journal: IEEE Geoscience and Remote Sensing Letters
Abstract: The acquisition of high-resolution satellite and airborne remote sensing images has been significantly simplified by the rapid development of sensor technology. Several practical applications of high-resolution remote sensing images (HRRSIs) are based on semantic segmentation. However, single-modal HRRSIs are difficult to classify accurately when scene objects are complex; therefore, semantic segmentation based on multi-source information fusion is gaining popularity. The inherent differences between multimodal features and the semantic gap between multi-level features typically limit the performance of existing multimodal fusion methods. We propose a multimodal fusion network based on edge detection to address these issues. This method aids multimodal information fusion by utilizing the spatial information contained in object boundaries. An edge detection guide module is included in the feature extraction stage to derive boundary information by fusing the details of low-level features with the semantics of high-level features. The boundary information is then passed to a well-designed multimodal adaptive fusion block (MAFB) to obtain fused multimodal features. Furthermore, a residual adaptive fusion block (RAFB) and a spatial position module (SPM) are designed in the feature decoding stage to fuse multi-level features from the standpoint of local and global dependencies. We compared our method with several state-of-the-art (SOTA) methods on the International Society for Photogrammetry and Remote Sensing (ISPRS) Vaihingen and Potsdam datasets. The results demonstrate that our method achieves excellent performance.
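The sketch below is a minimal, hypothetical illustration (not the authors' released code) of the edge-guided multimodal fusion idea summarized in the abstract: two modality feature maps (e.g., optical imagery and a DSM) are weighted adaptively and a predicted boundary map is used as a spatial prior. It assumes PyTorch; the class name, tensor shapes, and the specific gating design are assumptions for illustration only.

```python
# Hypothetical sketch of an edge-guided multimodal fusion block, in the spirit
# of the MAFB described in the abstract. Not the authors' implementation.
import torch
import torch.nn as nn


class EdgeGuidedFusionBlock(nn.Module):
    """Fuse an optical feature map with an auxiliary-modality feature map,
    using a predicted boundary map as a spatial attention prior."""

    def __init__(self, channels: int):
        super().__init__()
        # Channel-wise gate that adaptively weights the two modalities.
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(2 * channels, channels, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, 2, kernel_size=1),
            nn.Softmax(dim=1),
        )
        self.fuse = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, rgb_feat, aux_feat, edge_map):
        # edge_map: (B, 1, H, W) boundary probability used as spatial guidance.
        w = self.gate(torch.cat([rgb_feat, aux_feat], dim=1))   # (B, 2, 1, 1)
        fused = w[:, 0:1] * rgb_feat + w[:, 1:2] * aux_feat     # adaptive mix
        fused = fused + fused * torch.sigmoid(edge_map)         # emphasize boundary regions
        return self.fuse(fused)


if __name__ == "__main__":
    block = EdgeGuidedFusionBlock(channels=64)
    rgb = torch.randn(2, 64, 32, 32)   # optical-branch features
    dsm = torch.randn(2, 64, 32, 32)   # auxiliary-branch features (e.g., DSM)
    edge = torch.randn(2, 1, 32, 32)   # boundary logits from an edge-guide module
    print(block(rgb, dsm, edge).shape)  # torch.Size([2, 64, 32, 32])
```

The residual boundary term (`fused + fused * sigmoid(edge)`) is one plausible way to realize "boundary-guided" fusion; the paper's actual MAFB, RAFB, and SPM designs should be taken from the published article.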
URI: https://hdl.handle.net/10356/170688
ISSN: 1545-598X
DOI: 10.1109/LGRS.2023.3234257
Schools: School of Computer Science and Engineering 
Rights: © 2023 IEEE. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:SCSE Journal Articles
