Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/78408
Full metadata record
DC Field | Value | Language
dc.contributor.author | Luo, Wenhao | -
dc.date.accessioned | 2019-06-19T12:43:27Z | -
dc.date.available | 2019-06-19T12:43:27Z | -
dc.date.issued | 2019 | -
dc.identifier.uri | http://hdl.handle.net/10356/78408 | -
dc.description.abstract | With economic globalization and the rapid development of the Internet, the connections between countries and languages have grown ever closer, sharply increasing the demand for cross-language communication. Traditional human translation has many weaknesses and cannot meet the needs of translation at such scale. Artificial intelligence has been applied to this problem: machine translation is an effective way to realize automatic translation and to address increasingly common cross-language communication. Statistical machine translation previously satisfied the minimum requirements of translation, but it leaves much room for improvement. This dissertation explores the application of deep neural networks, combined with the currently popular Recurrent Neural Network (RNN) framework, to achieve high-performance machine translation. After comparing the advantages and disadvantages of different RNN variants, Long Short-Term Memory (LSTM) [27], a more mature model for the encoding and decoding process, is adopted in this dissertation. Two kinds of Neural Machine Translation (NMT) models are reviewed and explored in this study: the classical NMT model with greedy decoding, and the NMT model with an attention mechanism [32]. The BLEU evaluation method is then used to measure the performance of the two models; the results verify that the NMT model with the attention mechanism scores 1.9 BLEU higher than the greedy-decoding NMT model in training and 2.3 BLEU higher in testing, which demonstrates that the attention mechanism improves the performance of neural machine translation. The translation results are also compared across three sentence lengths, from which it is concluded that the plain NMT model performs well only on short sentences and loses its power on medium and long sentences, while the NMT model with the attention mechanism performs well on all three. However, some problems remain, such as over-translation, which call for future exploration. | en_US
dc.format.extent | 82 p. | en_US
dc.language.iso | en | en_US
dc.subject | DRNTU::Engineering::Electrical and electronic engineering | en_US
dc.title | Encoder-decoder based neural machine translation | en_US
dc.type | Thesis | -
dc.contributor.supervisor | Goh Wang Ling | en_US
dc.contributor.school | School of Electrical and Electronic Engineering | en_US
dc.description.degree | Master of Science (Electronics) | en_US
item.grantfulltext | restricted | -
item.fulltext | With Fulltext | -
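
The attention mechanism [32] credited in the abstract above for the BLEU gains is, in broad strokes, a learned weighting over the encoder's hidden states recomputed at every decoding step. The following is a minimal sketch of Bahdanau-style additive attention; the framework (PyTorch), class name, and dimensions are illustrative assumptions made for this record, not the thesis's own implementation.

import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    """Scores each source position against the current decoder state.
    Illustrative sketch only; not taken from the thesis."""
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.W_dec = nn.Linear(hidden_dim, hidden_dim, bias=False)  # projects decoder state
        self.W_enc = nn.Linear(hidden_dim, hidden_dim, bias=False)  # projects encoder states
        self.v = nn.Linear(hidden_dim, 1, bias=False)               # reduces to one score per position

    def forward(self, dec_state, enc_states):
        # dec_state:  (batch, hidden)          current decoder hidden state
        # enc_states: (batch, src_len, hidden) all encoder hidden states
        scores = self.v(torch.tanh(
            self.W_dec(dec_state).unsqueeze(1) + self.W_enc(enc_states)
        )).squeeze(-1)                           # (batch, src_len)
        weights = torch.softmax(scores, dim=-1)  # attention distribution over source positions
        context = torch.bmm(weights.unsqueeze(1), enc_states).squeeze(1)  # weighted sum: (batch, hidden)
        return context, weights

# Illustrative usage: batch of 2, source length 5, hidden size 8.
attn = AdditiveAttention(8)
context, weights = attn(torch.zeros(2, 8), torch.randn(2, 5, 8))

At each step the decoder conditions on this recomputed context vector rather than on a single fixed encoder summary, which is consistent with the abstract's finding that the attention-based model degrades less on medium and long sentences.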
Appears in Collections: EEE Theses
Files in This Item:
File | Description | Size | Format
Luo_wenhao,final-version1.pdf | Restricted Access | 10.87 MB | Adobe PDF

Page view(s): 287 (updated on Jul 20, 2024)
Download(s): 11 (updated on Jul 20, 2024)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.