Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/75959
Title: Relation identification for reasoning
Authors: Li, Linjie
Keywords: DRNTU::Engineering::Electrical and electronic engineering
Issue Date: 2018
Abstract: Relation extraction is a very important research area in Natural Language Processing. This thesis mainly concentrates on identifying cause-effect relations, which can be used in various fields such as question answering and medical science. A relation classification system is built in the thesis to achieve this target. The whole system consists of two parts. The first is text representation; an accurate text representation is key to the performance of the whole classification system. Two methods are used in this part: the traditional Bag of Words and word embedding. Different types of word embedding methods are also compared. The second part is classification: the results of word embedding can be further used to extract features and perform classification based on neural networks. Two popular structures, the Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM), are implemented and compared. Experiments show that the combination of word embedding and neural-network-based classification performs much better than using the traditional Bag of Words to represent text and classifying directly. The distinguished performance of the CNN in solving relation classification problems is shown by experiments. Some methods are also applied to improve the performance of the CNN-based structure in order to achieve the best classification results.
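The abstract contrasts two text representations: Bag of Words, which discards word order, and word embeddings fed through a CNN. A minimal sketch of that contrast, using a hypothetical toy sentence, random stand-in embeddings, and a single random convolution filter (the thesis's actual embeddings, filter sizes, and training procedure are not specified here):

```python
# Toy comparison of the two representations described in the abstract.
# All data, dimensions, and weights below are hypothetical illustrations,
# not the thesis's actual configuration.

import random

random.seed(0)

sentence = "the earthquake caused severe damage".split()
vocab = sorted(set(sentence) | {"rain", "flood"})

# --- Representation 1: Bag of Words -----------------------------------------
# One count per vocabulary word; word order, which carries the direction
# of a cause-effect relation, is lost.
bow = [sentence.count(w) for w in vocab]

# --- Representation 2: word embeddings + 1-D convolution --------------------
# Random vectors stand in for trained embeddings (e.g. word2vec).
dim = 4
embed = {w: [random.uniform(-1, 1) for _ in range(dim)] for w in vocab}

def conv1d_max(tokens, window=3):
    """Slide a window over token embeddings, apply one (random) filter,
    and max-pool -- the core feature-extraction step of a text CNN."""
    filt = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(window)]
    feats = []
    for i in range(len(tokens) - window + 1):
        s = sum(filt[j][d] * embed[tokens[i + j]][d]
                for j in range(window) for d in range(dim))
        feats.append(s)
    return max(feats)  # max-over-time pooling keeps the strongest response

feature = conv1d_max(sentence)
print(len(bow), dim)
```

Unlike the BoW vector, the convolution sees each window of adjacent words together, which is why the abstract reports better results from the embedding-plus-CNN pipeline; a full classifier would learn many such filters and feed the pooled features to a softmax layer.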
URI: http://hdl.handle.net/10356/75959
Schools: School of Electrical and Electronic Engineering 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Theses

Files in This Item:
File: LiLinJie_2018.pdf (Restricted Access) | Size: 1.92 MB | Format: Adobe PDF

Page view(s): 281 (updated on Jun 19, 2024)
Download(s): 10 (updated on Jun 19, 2024)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.