Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/156616
Title: CausalQA: a causal framework for question answering
Authors: Dutta, Angshuk
Keywords: Engineering::Computer science and engineering
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Dutta, A. (2022). CausalQA: a causal framework for question answering. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/156616
Project: SCSE21-0526
Abstract: Neural networks have proven successful in fundamental applications such as object detection, image segmentation, image and text generation, and several NLP tasks. That said, neural networks are black-box function approximators whose strong approximation capability is characterised by the universal approximation theorem. This black-box nature prevents the use of neural networks in high-risk areas such as healthcare, creating a need for explainable AI. The paradigm we use to explore explainability is causality. In this work, we introduce a novel question answering algorithm, dubbed CausalQA, which learns several subtasks such as causal structure learning. Furthermore, we introduce an interventional training paradigm based on previous theoretical work, including recent results linking graph neural networks to Structural Causal Models. We show proof of concept on a toy dataset, then evaluate the model on question answering datasets, achieving performance comparable to state-of-the-art models and empirically demonstrating its ability to learn causal structure.
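The interventional training paradigm mentioned in the abstract rests on the do-operator of a Structural Causal Model: intervening on a variable replaces its structural equation while leaving the other mechanisms intact. As a minimal illustrative sketch only (the graph Z -> X -> Y with confounder Z, the linear coefficients, and the function name sample_scm are assumptions for exposition, not the thesis's actual model), the following Python snippet contrasts observational sampling with sampling under do(X = 1):

    import numpy as np

    rng = np.random.default_rng(0)

    def sample_scm(n, do_x=None):
        # Toy linear SCM: Z -> X, Z -> Y, X -> Y (Z confounds X and Y).
        # If do_x is given, apply do(X = do_x): replace X's structural
        # equation with a constant, severing the incoming edge Z -> X.
        z = rng.normal(size=n)
        if do_x is None:
            x = 2.0 * z + rng.normal(size=n)   # observational mechanism
        else:
            x = np.full(n, float(do_x))        # interventional mechanism
        y = 0.5 * x + 1.5 * z + rng.normal(size=n)
        return z, x, y

    # Observational vs. interventional samples of Y:
    _, _, y_obs = sample_scm(100_000)
    _, _, y_do = sample_scm(100_000, do_x=1.0)
    print("mean Y (observational):", y_obs.mean())
    print("mean Y under do(X=1):  ", y_do.mean())

Because do(X) cuts the confounding path through Z, the interventional mean of Y differs from what conditioning on X = 1 in observational data would suggest; this distinction between observing and intervening is what interventional training data encodes.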
URI: https://hdl.handle.net/10356/156616
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections: SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
FYP_ANGSHUK_DUTTA_U1822048A.pdf (Restricted Access), 15.82 MB, Adobe PDF

Page view(s): 17 (updated on May 19, 2022)
Download(s): 5 (updated on May 19, 2022)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.