Title: Information extraction of hazard events
Authors: Lim, Joanna Jia Yi
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2021
Publisher: Nanyang Technological University
Source: Lim, J. J. Y. (2021). Information extraction of hazard events. Final Year Project (FYP), Nanyang Technological University, Singapore.
Project: A1104-201
Abstract: Information Extraction (IE) is the process of extracting structured information from unstructured text. Since news articles on hazard events contain useful information and are usually reported in real time, identifying and extracting such information would allow governments and emergency response teams to better allocate resources and support to affected areas. In this project, several deep learning models were explored to identify occurrences of information such as Deaths, Injury, Location, Date and Time in sentences from news articles on hazard events. News articles covering hazard events such as attacks, earthquakes, typhoons, hurricanes and road accidents were first identified and filtered using keywords. Next, sentences of interest from these articles were isolated and labelled to form the hazard events database. The labelled training data was then used to train deep neural network models. Two schemes were explored: in the first, a single model was trained to handle multi-class samples, while in the second, multiple binary classifiers were trained. The results of the two schemes were discussed and compared. Finally, information such as location, date and time was extracted using spaCy’s named entity recognition.
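The data-preparation steps described in the abstract (keyword filtering of hazard sentences, then deriving per-class training data for the multiple-binary-classifier scheme) can be sketched in plain Python. This is a minimal, stdlib-only illustration, not the project's actual implementation: the keyword list, the assumption of one binary dataset per information class, and the example sentences are all illustrative assumptions.

```python
# Illustrative sketch of the pipeline in the abstract (not the report's code):
# 1) keyword-filter sentences from hazard-event news,
# 2) turn multi-label annotations into one binary dataset per class
#    (scheme 2: one binary classifier per information type, assumed here).

HAZARD_KEYWORDS = {"earthquake", "typhoon", "hurricane", "attack", "accident"}
INFO_CLASSES = ["Deaths", "Injury", "Location", "Date", "Time"]

def filter_hazard_sentences(sentences):
    """Keep sentences that mention at least one hazard keyword."""
    return [s for s in sentences
            if any(k in s.lower() for k in HAZARD_KEYWORDS)]

def to_binary_datasets(labelled):
    """Convert (sentence, set-of-classes) pairs into one binary
    (sentence, 0/1) dataset per information class."""
    datasets = {c: [] for c in INFO_CLASSES}
    for sentence, classes in labelled:
        for c in INFO_CLASSES:
            datasets[c].append((sentence, int(c in classes)))
    return datasets

sentences = [
    "A magnitude 6.1 earthquake struck on Monday, killing 12 people.",
    "The concert drew a large crowd downtown.",
]
hazard = filter_hazard_sentences(sentences)          # keeps only sentence 1
labelled = [(hazard[0], {"Deaths", "Date"})]         # hand-labelled example
binary = to_binary_datasets(labelled)                # per-class 0/1 labels
```

Each per-class dataset would then feed one binary classifier; the single multi-class model of scheme 1 would instead consume the multi-label annotations directly.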
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Student Reports (FYP/IA/PA/PI)

Files in This Item:
Joanna Lim Jia Yi_Final Final Report.pdf (Restricted Access, 3.64 MB, Adobe PDF)

Updated on May 19, 2022


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.