Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/147806
Title: Multi-task gradient descent for multi-task learning
Authors: Bai, Lu
Ong, Yew-Soon
He, Tiantian
Gupta, Abhishek
Keywords: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
Issue Date: 2020
Source: Bai, L., Ong, Y., He, T. & Gupta, A. (2020). Multi-task gradient descent for multi-task learning. Memetic Computing, 12(4), 355-369. https://dx.doi.org/10.1007/s12293-020-00316-3
Project: AISG-RP-2018-004 
A19C1a0018 
Journal: Memetic Computing 
Abstract: Multi-Task Learning (MTL) aims to simultaneously solve a group of related learning tasks by leveraging the salutary knowledge memes contained in the multiple tasks to improve the generalization performance. Many prevalent approaches focus on designing a sophisticated cost function, which integrates all the learning tasks and explores the task-task relationship in a predefined manner. Different from previous approaches, in this paper, we propose a novel Multi-task Gradient Descent (MGD) framework, which improves the generalization performance of multiple tasks through knowledge transfer. The uniqueness of MGD lies in assuming individual task-specific learning objectives at the start, but with the cost functions implicitly changing during the course of parameter optimization based on task-task relationships. Specifically, MGD optimizes the individual cost function of each task using a reformative gradient descent iteration, where relations to other tasks are facilitated through effectively transferring parameter values (serving as the computational representations of memes) from other tasks. Theoretical analysis shows that the proposed framework is convergent under any appropriate transfer mechanism. Compared with existing MTL approaches, MGD provides a novel easy-to-implement framework for MTL, which can mitigate negative transfer in the learning procedure by asymmetric transfer. The proposed MGD has been compared with both classical and state-of-the-art approaches on multiple MTL datasets. The competitive experimental results validate the effectiveness of the proposed algorithm.
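The abstract describes MGD as per-task gradient descent in which each task's parameters are mixed with parameter values transferred from related tasks. The record does not reproduce the actual update rule, so the following is an illustrative sketch only: the row-stochastic transfer matrix `M`, the quadratic task costs, and all constants are assumptions for demonstration, not the paper's method.

```python
import numpy as np

def mgd(grads, thetas, M, lr=0.1, iters=200):
    """Sketch of a multi-task gradient descent style loop (assumed form).

    grads:  list of gradient functions, one per task
    thetas: list of initial parameter vectors, one per task
    M:      (K, K) transfer matrix; M[k, j] weights how much task k
            borrows from task j (rows sum to 1, M[k, k] is self-weight)
    """
    thetas = [t.astype(float).copy() for t in thetas]
    K = len(thetas)
    for _ in range(iters):
        # Transfer step: each task's parameters become a convex
        # combination of all tasks' current parameters.
        mixed = [sum(M[k, j] * thetas[j] for j in range(K)) for k in range(K)]
        # Gradient step: each task then descends its OWN cost function.
        thetas = [mixed[k] - lr * grads[k](mixed[k]) for k in range(K)]
    return thetas

# Two related quadratic tasks with nearby optima at 2.0 and 2.2.
grads = [lambda t: 2 * (t - 2.0), lambda t: 2 * (t - 2.2)]
M = np.array([[0.9, 0.1],
              [0.1, 0.9]])  # mostly self-update, small mutual transfer
out = mgd(grads, [np.array([0.0]), np.array([5.0])], M)
```

With this assumed update, each task converges near its own optimum but is pulled slightly toward the other (here roughly 2.04 and 2.16), which mirrors the abstract's point that per-task objectives are implicitly reshaped by task-task transfer; making `M` asymmetric would let a weak task borrow from a strong one without the reverse, the kind of asymmetric transfer the abstract credits with mitigating negative transfer.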
URI: https://hdl.handle.net/10356/147806
ISSN: 1865-9292
DOI: 10.1007/s12293-020-00316-3
Schools: School of Computer Science and Engineering 
Research Centres: Data Science and Artificial Intelligence Research Centre 
Singapore Institute of Manufacturing Technology 
Rights: © 2020 Springer-Verlag Berlin Heidelberg. This is a post-peer-review, pre-copyedit version of an article published in Memetic Computing. The final authenticated version is available online at: http://dx.doi.org/10.1007/s12293-020-00316-3
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections: SCSE Journal Articles

Files in This Item:
File: MGD_MCJ.pdf (424.37 kB, Adobe PDF)

Scopus Citations: 18 (updated May 5, 2025)
Web of Science Citations: 9 (updated Oct 27, 2023)
Page view(s): 415 (updated May 4, 2025)
Download(s): 376 (updated May 4, 2025)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.