Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/163145
Title: Meta-based self-training and re-weighting for aspect-based sentiment analysis
Authors: He, Kai
Mao, Rui
Gong, Tieliang
Li, Chen
Cambria, Erik
Keywords: Engineering::Computer science and engineering
Issue Date: 2022
Source: He, K., Mao, R., Gong, T., Li, C. & Cambria, E. (2022). Meta-based self-training and re-weighting for aspect-based sentiment analysis. IEEE Transactions on Affective Computing, 3202831-. https://dx.doi.org/10.1109/TAFFC.2022.3202831
Journal: IEEE Transactions on Affective Computing
Abstract: Aspect-based sentiment analysis (ABSA) aims to identify fine-grained aspects, opinions, and sentiment polarities. Recent ABSA research focuses on utilizing multi-task learning (MTL) to achieve lower computational costs and better performance. However, MTL-based ABSA has certain limitations. For example, unbalanced labels and differing sub-task learning difficulties may bias training so that some labels and sub-tasks overfit while others underfit. To address these issues, inspired by neuro-symbolic learning systems, we propose a meta-based self-training method with a meta-weighter (MSM). We believe that a generalizable model can be achieved by appropriate symbolic representation selection (in-domain knowledge) and effective learning control (regulation) in a neural system. Thus, MSM trains a teacher model to generate in-domain knowledge (e.g., unlabeled data selection and pseudo-label generation), and the generated pseudo-labels are used by a student model for supervised learning. The meta-weighter of MSM is then jointly trained with the student model to provide each instance with sub-task-specific weights that coordinate sub-task convergence rates, balance class labels, and alleviate the noise introduced by self-training. Our experiments indicate that MSM can use 50% of the labeled data to achieve results comparable to state-of-the-art ABSA models and outperform them when all labeled data are used.
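Note: the full text is not available through this record, so the following is only a minimal illustrative sketch of the kind of pipeline the abstract describes: a teacher producing pseudo-labels on confident unlabeled instances, and a student trained with per-instance, per-sub-task loss weights from a meta-weighter. All model sizes, class names, and the single joint update are hypothetical simplifications and not the authors' implementation; in particular, the bi-level meta objective used to train the actual meta-weighter is omitted.

    # Illustrative sketch only, not the paper's code: toy teacher-student
    # self-training for multi-task ABSA with a per-instance, per-sub-task
    # loss re-weighter. All sizes and names below are assumptions.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    NUM_TAGS, NUM_POLARITIES, HIDDEN = 5, 3, 64   # assumed label-set sizes


    class MultiTaskABSA(nn.Module):
        """Toy multi-task model: shared encoder with a token-level
        aspect/opinion tagging head and a sentence-level polarity head."""
        def __init__(self, vocab=1000, dim=32):
            super().__init__()
            self.emb = nn.Embedding(vocab, dim)
            self.enc = nn.GRU(dim, HIDDEN, batch_first=True)
            self.tag_head = nn.Linear(HIDDEN, NUM_TAGS)
            self.pol_head = nn.Linear(HIDDEN, NUM_POLARITIES)

        def forward(self, tokens):
            h, _ = self.enc(self.emb(tokens))                 # [B, T, HIDDEN]
            return self.tag_head(h), self.pol_head(h.mean(1))


    class MetaWeighter(nn.Module):
        """Maps each instance's per-sub-task losses to per-sub-task weights,
        so that sub-tasks can converge at coordinated rates."""
        def __init__(self, num_tasks=2):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(num_tasks, 16), nn.ReLU(),
                                     nn.Linear(16, num_tasks), nn.Sigmoid())

        def forward(self, per_task_losses):                   # [B, num_tasks]
            return self.net(per_task_losses)


    def pseudo_label(teacher, unlabeled, threshold=0.9):
        """Teacher keeps confident unlabeled instances and emits pseudo-labels."""
        with torch.no_grad():
            tag_logits, pol_logits = teacher(unlabeled)
            tag_prob, tag_lab = tag_logits.softmax(-1).max(-1)
            pol_prob, pol_lab = pol_logits.softmax(-1).max(-1)
            keep = (tag_prob.mean(1) > threshold) & (pol_prob > threshold)
        return unlabeled[keep], tag_lab[keep], pol_lab[keep]


    def weighted_student_step(student, weighter, opt, tokens, tag_y, pol_y):
        """One supervised step on labeled + pseudo-labeled data, with
        instance- and sub-task-specific loss weights from the meta-weighter."""
        tag_logits, pol_logits = student(tokens)
        tag_loss = F.cross_entropy(tag_logits.transpose(1, 2), tag_y,
                                   reduction="none").mean(1)          # [B]
        pol_loss = F.cross_entropy(pol_logits, pol_y, reduction="none")  # [B]
        per_task = torch.stack([tag_loss, pol_loss], dim=1)           # [B, 2]
        weights = weighter(per_task.detach())                         # [B, 2]
        loss = (weights * per_task).sum(1).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
        return loss.item()


    if __name__ == "__main__":
        teacher, student, weighter = MultiTaskABSA(), MultiTaskABSA(), MetaWeighter()
        opt = torch.optim.Adam(student.parameters(), lr=1e-3)
        labeled = torch.randint(0, 1000, (8, 12))             # dummy labeled batch
        tag_y = torch.randint(0, NUM_TAGS, (8, 12))
        pol_y = torch.randint(0, NUM_POLARITIES, (8,))
        unlabeled = torch.randint(0, 1000, (32, 12))          # dummy unlabeled pool
        u_x, u_tag, u_pol = pseudo_label(teacher, unlabeled, threshold=0.2)
        x = torch.cat([labeled, u_x])
        ty, py = torch.cat([tag_y, u_tag]), torch.cat([pol_y, u_pol])
        print("weighted loss:", weighted_student_step(student, weighter, opt, x, ty, py))

In the method described in the abstract, the meta-weighter is jointly optimized with the student so that the learned weights also balance class labels and suppress noisy pseudo-labels; the frozen-weighter step above only illustrates the data flow.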
URI: https://hdl.handle.net/10356/163145
ISSN: 1949-3045
DOI: 10.1109/TAFFC.2022.3202831
Rights: © 2022 IEEE. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:SCSE Journal Articles
