Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/181468
Title: Converse attention knowledge transfer for low-resource named entity recognition
Authors: Lyu, Shengfei; Sun, Linghao; Yi, Huixiong; Liu, Yong; Chen, Huanhuan; Miao, Chunyan
Keywords: Computer and Information Science
Issue Date: 2024
Source: Lyu, S., Sun, L., Yi, H., Liu, Y., Chen, H. & Miao, C. (2024). Converse attention knowledge transfer for low-resource named entity recognition. International Journal of Crowd Science, 8(3), 140-148. https://dx.doi.org/10.26599/IJCS.2023.9100014
Journal: International Journal of Crowd Science
Abstract: In recent years, great success has been achieved in many natural language processing (NLP) tasks, e.g., named entity recognition (NER), especially in the high-resource language English, thanks in part to the considerable amount of labeled resources: more labeled resources lead to better word representations. However, most low-resource languages lack such an abundance of labeled data, and NER in these languages performs poorly as a result of poor word representations. In this paper, we propose the converse attention network (CAN) to augment word representations in low-resource languages with knowledge learned in the high-resource language, thereby improving NER performance in low-resource languages. CAN first translates sentences in low-resource languages into English using an attention-based translation module. During translation, CAN obtains attention matrices that align word representations in the high-resource and low-resource language spaces. CAN then uses these attention matrices to augment the word representations learned in the low-resource language space with those learned in the high-resource language space. Experiments on four low-resource NER datasets show that CAN achieves consistent and significant performance improvements, which demonstrates its effectiveness.
URI: https://hdl.handle.net/10356/181468
ISSN: 2398-7294
DOI: 10.26599/IJCS.2023.9100014
Schools: School of Computer Science and Engineering
Rights: © The author(s) 2024. The articles published in this open access journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections: SCSE Journal Articles
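The abstract describes CAN at a high level: attention matrices from a translation module align low-resource word representations with their English counterparts, and those matrices are then used to mix the two representation spaces. The sketch below is a minimal, hypothetical illustration of that augmentation step in NumPy; the function name, the mixing weight `alpha`, and the row normalization are assumptions made for illustration and are not taken from the paper or its released code.

```python
import numpy as np

def augment_with_attention(low_res_reps, high_res_reps, attention, alpha=0.5):
    """Hypothetical sketch of attention-based representation augmentation.

    low_res_reps : (n_src, d)   word representations of the low-resource sentence
    high_res_reps: (n_tgt, d)   word representations of its English translation
    attention    : (n_src, n_tgt) alignment weights from the translation module
    alpha        : assumed mixing weight (not specified in the abstract)
    """
    # Normalize each source word's attention weights so they sum to 1.
    attn = attention / (attention.sum(axis=1, keepdims=True) + 1e-9)
    # Project the English representations onto the source positions via the alignment.
    transferred = attn @ high_res_reps            # shape (n_src, d)
    # Blend the original low-resource representations with the transferred ones.
    return (1 - alpha) * low_res_reps + alpha * transferred

# Toy usage with random vectors.
rng = np.random.default_rng(0)
src = rng.normal(size=(5, 8))     # 5 low-resource tokens, dimension 8
tgt = rng.normal(size=(6, 8))     # 6 English tokens, dimension 8
attn = rng.random(size=(5, 6))    # attention matrix from the translation module
augmented = augment_with_attention(src, tgt, attn)
print(augmented.shape)            # (5, 8)
```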
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| Converse_Attention_Knowledge_Transfer_for_Low-Resource_Named_Entity_Recognition.pdf | | 625.81 kB | Adobe PDF |