Title: Grounding referring expressions in images with neural module tree network
Authors: Tan, Kuan Yeow
Keywords: Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Tan, K. Y. (2022). Grounding referring expressions in images with neural module tree network. Final Year Project (FYP), Nanyang Technological University, Singapore.
Project: SCSE21-0519
Abstract: Grounding referring expressions in images, or visual grounding for short, is an Artificial Intelligence (AI) task that locates and identifies a target object in an image by localizing a natural language expression. This complex task requires compositional visual reasoning that better mimics the human logical thought process. However, existing methods do not capture the compositional structure of natural language, over-simplifying it into either a monolithic sentence embedding or a coarse subject-predicate-object triple. To better exploit this structure, a Neural Module Tree network (NMTree) is applied to the dependency parse tree of the referring expression during visual grounding. Each node of the parse tree acts as a neural module that computes visual attention, and grounding scores are accumulated bottom-up to the root of the tree. A Gumbel-Softmax approximation is used to train the modules and their assembly end-to-end, reducing parsing errors. NMTree decouples the compositional reasoning from the visual grounding itself, providing more interpretable localization. The inclusion of NMTree provides better explanations of how natural language is grounded and outperforms state-of-the-art methods on several benchmarks.
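The bottom-up accumulation described in the abstract can be illustrated with a minimal toy sketch. This is not the NMTree implementation: the tree, the region names, and the fixed word-region scores below are all hypothetical stand-ins for the learned attention modules, chosen only to show how per-node scores flow up a dependency parse tree to the root.

```python
# Toy sketch (hypothetical, not the actual NMTree code): each node of a
# dependency parse tree acts as a "module" that scores candidate image
# regions, and scores are accumulated bottom-up to the root node.

class Node:
    def __init__(self, word, children=None):
        self.word = word
        self.children = children or []

def word_region_score(word, region):
    # Placeholder for a learned visual-attention module; here a fixed
    # toy lookup table of word/region affinities.
    toy = {("dog", "r1"): 2.0, ("red", "r1"): 1.5, ("dog", "r2"): 0.3}
    return toy.get((word, region), 0.0)

def ground(node, regions):
    """Return per-region grounding scores, accumulated bottom-up."""
    scores = {r: word_region_score(node.word, r) for r in regions}
    for child in node.children:
        child_scores = ground(child, regions)
        for r in regions:
            scores[r] += child_scores[r]  # child scores flow to the parent
    return scores

# Dependency tree for the expression "red dog": "red" modifies "dog".
tree = Node("dog", [Node("red")])
regions = ["r1", "r2"]
scores = ground(tree, regions)          # r1: 2.0 + 1.5, r2: 0.3
best = max(scores, key=scores.get)      # the grounded region at the root
```

In the actual model, each node's scoring function is a neural module over image features rather than a lookup table, and the Gumbel-Softmax approximation lets the discrete module-assembly choices be trained end-to-end with the rest of the network.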
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: Restricted Access (1.21 MB, Adobe PDF)

Page view(s)

Updated on Jun 29, 2022

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.