Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/141313
Title: Machine learning models for patent examination
Authors: Wang, Yanqing
Keywords: Engineering::Computer science and engineering::Computer systems organization::Computer-communication networks
Issue Date: 2020
Publisher: Nanyang Technological University
Abstract: In recent years, text classification has remained one of the most active research topics in NLP (Natural Language Processing). Because patent classification is complex, patent examination still depends on human examiners, which makes the process inefficient and time-consuming. This is an urgent problem that needs to be addressed. Since Google reported BERT's outstanding performance on 11 NLP tasks at the end of October 2018, BERT (Bidirectional Encoder Representations from Transformers) has attracted intense interest in the NLP field. This project applies the BERT model to build a model for patent examination tasks. It comprehensively reviews and implements several patent examination tasks based on text classification and studies their performance on large datasets. Starting from the classification results on patent datasets, we attempt to add a "summary" mechanism to improve the accuracy of patent classification. The work comprises two main tasks: (a) build a text classification model based on BERT and train an optimized model; (b) add a "summary" mechanism to the model to improve classification accuracy and verify the results.
URI: https://hdl.handle.net/10356/141313
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Theses

Files in This Item:
File: Final Dissertation_G1901469K.pdf (Restricted Access)
Size: 2.37 MB
Format: Adobe PDF


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.