Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/157035
Full metadata record
dc.contributor.author: Zhan, Fangneng (en_US)
dc.contributor.author: Yu, Yingchen (en_US)
dc.contributor.author: Zhang, Changgong (en_US)
dc.contributor.author: Wu, Rongliang (en_US)
dc.contributor.author: Hu, Wenbo (en_US)
dc.contributor.author: Lu, Shijian (en_US)
dc.contributor.author: Ma, Feiying (en_US)
dc.contributor.author: Xie, Xuansong (en_US)
dc.contributor.author: Shao, Ling (en_US)
dc.date.accessioned: 2022-04-30T14:38:37Z
dc.date.available: 2022-04-30T14:38:37Z
dc.date.issued: 2022
dc.identifier.citation: Zhan, F., Yu, Y., Zhang, C., Wu, R., Hu, W., Lu, S., Ma, F., Xie, X. & Shao, L. (2022). GMLight: lighting estimation via geometric distribution approximation. IEEE Transactions on Image Processing, 31, 2268-2278. https://dx.doi.org/10.1109/TIP.2022.3151997 (en_US)
dc.identifier.issn: 1057-7149 (en_US)
dc.identifier.uri: https://hdl.handle.net/10356/157035
dc.description.abstract: Inferring the scene illumination from a single image is an essential yet challenging task in computer vision and computer graphics. Existing works estimate lighting by regressing representative illumination parameters or generating illumination maps directly. However, these methods often suffer from poor accuracy and generalization. This paper presents Geometric Mover's Light (GMLight), a lighting estimation framework that employs a regression network and a generative projector for effective illumination estimation. We parameterize illumination scenes in terms of the geometric light distribution, light intensity, ambient term, and auxiliary depth, which can be estimated by a regression network. Inspired by the earth mover's distance, we design a novel geometric mover's loss to guide the accurate regression of light distribution parameters. With the estimated light parameters, the generative projector synthesizes panoramic illumination maps with realistic appearance and high-frequency details. Extensive experiments show that GMLight achieves accurate illumination estimation and superior fidelity in relighting for 3D object insertion. The code is available at https://github.com/fnzhan/Illumination-Estimation. (en_US)
dc.language.iso: en (en_US)
dc.relation.ispartof: IEEE Transactions on Image Processing (en_US)
dc.rights: © 2022 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: https://doi.org/10.1109/TIP.2022.3151997. (en_US)
dc.subject: Engineering::Computer science and engineering (en_US)
dc.title: GMLight: lighting estimation via geometric distribution approximation (en_US)
dc.type: Journal Article (en)
dc.contributor.school: School of Computer Science and Engineering (en_US)
dc.identifier.doi: 10.1109/TIP.2022.3151997
dc.description.version: Submitted/Accepted version (en_US)
dc.identifier.pmid: 35235508
dc.identifier.scopus: 2-s2.0-85125701304
dc.identifier.volume: 31 (en_US)
dc.identifier.spage: 2268 (en_US)
dc.identifier.epage: 2278 (en_US)
dc.subject.keywords: Lighting Estimation (en_US)
dc.subject.keywords: Image Synthesis (en_US)
dc.description.acknowledgement: This work was supported by the RIE2020 Industry Alignment Fund - Industry Collaboration Projects (IAF-ICP) Funding Initiative, as well as cash and in-kind contributions from the industry partner(s). (en_US)
item.fulltext: With Fulltext
item.grantfulltext: open
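
The abstract describes a geometric mover's loss inspired by the earth mover's distance (EMD) for matching light distributions. As a minimal, hypothetical illustration only (not the paper's implementation, which operates on light parameters over scene geometry), the EMD between two discrete 1-D distributions over the same ordered, unit-spaced bins reduces to the sum of absolute differences of their cumulative sums:

```python
def emd_1d(p, q):
    """Earth mover's distance between two 1-D discrete distributions.

    p and q are sequences of non-negative weights over the same ordered,
    unit-spaced bins, each summing to 1. The EMD then equals the sum of
    absolute differences between the two cumulative distributions.
    """
    assert len(p) == len(q), "distributions must share the same bins"
    emd, cum_p, cum_q = 0.0, 0.0, 0.0
    for wp, wq in zip(p, q):
        cum_p += wp
        cum_q += wq
        emd += abs(cum_p - cum_q)  # mass still to be moved across this bin boundary
    return emd

# Moving 0.5 of the mass two bins to the right costs 0.5 * 2 = 1.0.
print(emd_1d([0.5, 0.5, 0.0], [0.0, 0.5, 0.5]))  # -> 1.0
```

Unlike a per-bin L1 or L2 loss, this distance accounts for how far mass must travel between bins, which is why an EMD-style objective suits regressing a spatial light distribution.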
Appears in Collections: SCSE Journal Articles
Files in This Item:
File: GMLight Lighting Estimation via Geometric Distribution Approximation.pdf (3.46 MB, Adobe PDF)

Scopus™ citations: 9 (updated on Mar 22, 2024)
Web of Science™ citations: 4 (updated on Oct 24, 2023)
Page view(s): 108 (updated on Mar 28, 2024)
Download(s): 84 (updated on Mar 28, 2024)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.