Title: Simple global thresholding neural network for shadow detection
Authors: Li, Guiyuan
Zong, Changfu
Zhang, Dong
Zhu, Tianjun
Li, Jianying
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2021
Source: Li, G., Zong, C., Zhang, D., Zhu, T. & Li, J. (2021). Simple global thresholding neural network for shadow detection. Sensors and Materials, 33(9), 3307-3316.
Journal: Sensors and Materials
Abstract: Shadow detection based on vision sensors is widely used in image processing. Because of variability in illumination and in the color of the projection surface, shadow detection from a color image is a challenging problem. To resolve the conflict between the complexity and robustness of current shadow detection algorithms, we established a new shadow detection network that combines the global thresholding method with a neural network, decoupling the global threshold from binary fusion. Three public shadow detection datasets were used for verification: the large-scale shadow dataset of Stony Brook University (SBU), the large-scale dataset with image shadow triplets (ISTD), and the shadow detection for mobile robots features evaluation and datasets (SDMR). Experimental results show that the proposed network approaches the performance of previous deep learning methods, both visually and in terms of objective indicators, while having the advantages of a simple structure and good robustness.
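For context, the "global thresholding method" the abstract refers to is, in its classical form, the selection of a single intensity cutoff for the whole image. The sketch below illustrates that baseline idea with Otsu's method in pure Python; the function names are hypothetical and this is not the paper's network, only the conventional component it builds on.

```python
# Illustrative sketch only: classical global (Otsu) thresholding, the
# conventional baseline that the paper couples with a neural network.
# Function names are hypothetical, not taken from the paper.

def otsu_threshold(pixels):
    """Return the global threshold (0-255) maximizing between-class variance."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))

    w0 = cum = 0            # running count and intensity sum of class 0
    best_t, best_var = 0, -1.0
    for t in range(256):
        w0 += hist[t]
        cum += t * hist[t]
        w1 = total - w0
        if w0 == 0 or w1 == 0:
            continue
        mu0 = cum / w0                      # mean of pixels <= t
        mu1 = (total_sum - cum) / w1        # mean of pixels > t
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def shadow_mask(pixels, t):
    """Binary mask: 1 where the pixel is darker than (or at) the threshold."""
    return [1 if p <= t else 0 for p in pixels]

# Example: a bimodal image with dark (shadow-like) and bright pixels.
px = [30] * 50 + [200] * 50
t = otsu_threshold(px)
mask = shadow_mask(px, t)
```

A single global cutoff like this is what makes the classical method simple but fragile under varying illumination, which is the limitation the paper's network addresses by learning the thresholding and fusion steps.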
ISSN: 0914-4935
DOI: 10.18494/SAM.2021.3398
Rights: © 2021 MYU K.K. All rights reserved. This paper was published in Sensors and Materials and is made available with permission of MYU K.K.
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections:EEE Journal Articles

Files in This Item:
File: SM2690.pdf (1.8 MB, Adobe PDF)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.