Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/156089
Full metadata record
DC Field: Value (Language)
dc.contributor.author: Wang, Hao (en_US)
dc.contributor.author: Lin, Guosheng (en_US)
dc.contributor.author: Hoi, Steven C. H. (en_US)
dc.contributor.author: Miao, Chunyan (en_US)
dc.date.accessioned: 2022-04-06T08:57:15Z
dc.date.available: 2022-04-06T08:57:15Z
dc.date.issued: 2022
dc.identifier.citation: Wang, H., Lin, G., Hoi, S. C. H. & Miao, C. (2022). Decomposing generation networks with structure prediction for recipe generation. Pattern Recognition, 126, 108578. https://dx.doi.org/10.1016/j.patcog.2022.108578 (en_US)
dc.identifier.issn: 0031-3203 (en_US)
dc.identifier.uri: https://hdl.handle.net/10356/156089
dc.description.abstract: Recipe generation from food images and ingredients is a challenging task that requires interpreting information from another modality. Unlike image captioning, where a caption is usually a single sentence, cooking instructions contain multiple sentences and exhibit clear structure. To help the model capture recipe structure and avoid omitting cooking details, we propose a novel framework, Decomposing Generation Networks (DGN) with structure prediction, to produce more structured and complete recipe generation outputs. Specifically, we split each cooking instruction into several phases and assign a different sub-generator to each phase. Our approach includes two novel ideas: (i) learning recipe structures with the global structure prediction component, and (ii) producing recipe phases in the sub-generator output component based on the predicted structure. Extensive experiments on the challenging large-scale Recipe1M dataset validate the effectiveness of our proposed model, which improves over the state-of-the-art results. (en_US)
dc.description.sponsorship: AI Singapore (en_US)
dc.description.sponsorship: Ministry of Education (MOE) (en_US)
dc.description.sponsorship: Ministry of Health (MOH) (en_US)
dc.description.sponsorship: National Research Foundation (NRF) (en_US)
dc.language.iso: en (en_US)
dc.relation: AISG-GC-2019-003 (en_US)
dc.relation: NRF-NRFI05-2019-0002 (en_US)
dc.relation: MOH/NIC/COG04/2017 (en_US)
dc.relation: MOH/NIC/HAIG03/2017 (en_US)
dc.relation: RG28/18 (S) (en_US)
dc.relation: RG22/19 (S) (en_US)
dc.relation.ispartof: Pattern Recognition (en_US)
dc.rights: © 2022 Elsevier Ltd. All rights reserved. This paper was published in Pattern Recognition and is made available with permission of Elsevier Ltd. (en_US)
dc.subject: Engineering::Computer science and engineering (en_US)
dc.title: Decomposing generation networks with structure prediction for recipe generation (en_US)
dc.type: Journal Article (en)
dc.contributor.school: School of Computer Science and Engineering (en_US)
dc.contributor.research: Joint NTU-UBC Research Centre of Excellence in Active Living for the Elderly (LILY) (en_US)
dc.identifier.doi: 10.1016/j.patcog.2022.108578
dc.description.version: Submitted/Accepted version (en_US)
dc.identifier.scopus: 2-s2.0-85124796277
dc.identifier.volume: 126 (en_US)
dc.identifier.spage: 108578 (en_US)
dc.subject.keywords: Text Generation (en_US)
dc.subject.keywords: Vision-and-Language (en_US)
dc.description.acknowledgement: This research is supported, in part, by the National Research Foundation (NRF), Singapore under its AI Singapore Programme (AISG Award No: AISG-GC-2019-003) and under its NRF Investigatorship Programme (NRFI Award No. NRF-NRFI05-2019-0002). This research is also supported, in part, by the Singapore Ministry of Health under its National Innovation Challenge on Active and Confident Ageing (NIC Project No. MOH/NIC/COG04/2017 and MOH/NIC/HAIG03/2017), and the MOE Tier-1 research grants: RG28/18 (S) and RG22/19 (S). (en_US)
item.grantfulltext: embargo_20240707
item.fulltext: With Fulltext
Appears in Collections: SCSE Journal Articles
Files in This Item:
File: PR_DGN.pdf
Size: 3.72 MB
Format: Adobe PDF
Availability: Under embargo until Jul 07, 2024


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.