Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/162754
Title: Attention-guided progressive neural texture fusion for high dynamic range image restoration
Authors: Chen, Jie
Yang, Zaifeng
Chan, Tsz Nam
Li, Hui
Hou, Junhui
Chau, Lap-Pui
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2022
Source: Chen, J., Yang, Z., Chan, T. N., Li, H., Hou, J. & Chau, L. (2022). Attention-guided progressive neural texture fusion for high dynamic range image restoration. IEEE Transactions On Image Processing, 31, 2661-2672. https://dx.doi.org/10.1109/TIP.2022.3160070
Journal: IEEE Transactions on Image Processing 
Abstract: High Dynamic Range (HDR) imaging via multi-exposure fusion is an important task for most modern imaging platforms. Despite recent hardware and algorithmic innovations, challenges remain over content association ambiguities caused by saturation, motion, and various artifacts introduced during multi-exposure fusion such as ghosting, noise, and blur. In this work, we propose an Attention-guided Progressive Neural Texture Fusion (APNT-Fusion) HDR restoration model which aims to address these issues within one framework. An efficient two-stream structure is proposed which separately focuses on texture feature transfer over saturated regions and multi-exposure tonal and texture feature fusion. A neural feature transfer mechanism is proposed which establishes spatial correspondence between different exposures based on multi-scale VGG features in the masked saturated HDR domain for discriminative contextual clues over the ambiguous image areas. A progressive texture blending module is designed to blend the encoded two-stream features in a multi-scale and progressive manner. In addition, we introduce several novel attention mechanisms, i.e., the motion attention module detects and suppresses the content discrepancies among the reference images; the saturation attention module facilitates differentiating the misalignment caused by saturation from that caused by motion; and the scale attention module ensures texture blending consistency between different encoder/decoder scales. We carry out comprehensive qualitative and quantitative evaluations and ablation studies, which validate that these novel modules work coherently under the same framework and outperform state-of-the-art methods.
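
The attention mechanisms described in the abstract gate how much texture is transferred from an auxiliary exposure into the reference. The toy sketch below is NOT the authors' APNT-Fusion implementation — it is a minimal, hypothetical illustration of the general idea of attention-weighted feature fusion, assuming a simple similarity-based attention map; all function and variable names are invented for illustration.

```python
import numpy as np

def attention_guided_fusion(ref_feat, aux_feat):
    """Toy sketch of attention-guided feature fusion (hypothetical,
    not the paper's model): an attention map down-weights auxiliary
    features where they disagree with the reference, loosely analogous
    to suppressing motion/saturation discrepancies before blending."""
    # Attention in (0, 1]: 1 where the features agree, smaller as they differ.
    diff = np.abs(ref_feat - aux_feat)
    attention = np.exp(-diff)  # simple similarity-based gating (assumed form)
    # Blend: keep the reference, transfer auxiliary detail where trusted.
    fused = ref_feat + 0.5 * attention * (aux_feat - ref_feat)
    return fused, attention

# Example: the middle/last entries disagree, so transfer there is damped.
ref = np.array([0.2, 0.5, 0.9])
aux = np.array([0.2, 0.7, 0.1])
fused, att = attention_guided_fusion(ref, aux)
```

In the real model, such weighting is learned per pixel and per scale rather than computed from a fixed similarity kernel as above.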
URI: https://hdl.handle.net/10356/162754
ISSN: 1057-7149
DOI: 10.1109/TIP.2022.3160070
Rights: © 2022 IEEE. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:EEE Journal Articles


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.