Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/141973
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Weng, Junwu | en_US |
dc.contributor.author | Jiang, Xudong | en_US |
dc.contributor.author | Zheng, Wei-Long | en_US |
dc.contributor.author | Yuan, Junsong | en_US |
dc.date.accessioned | 2020-06-12T06:50:39Z | - |
dc.date.available | 2020-06-12T06:50:39Z | - |
dc.date.issued | 2020 | - |
dc.identifier.citation | Weng, J., Jiang, X., Zheng, W.-L., & Yuan, J. (2020). Early action recognition with category exclusion using policy-based reinforcement learning. IEEE Transactions on Circuits and Systems for Video Technology, in-press. doi:10.1109/TCSVT.2020.2976789 | en_US |
dc.identifier.issn | 1051-8215 | en_US |
dc.identifier.uri | https://hdl.handle.net/10356/141973 | - |
dc.description.abstract | The goal of early action recognition is to predict the action label when a sequence is only partially observed. Existing methods treat early action recognition as a sequential classification problem over different observation ratios of an action sequence. Since these models are trained to differentiate the positive category from all negative categories, the diverse information carried by the different negative categories is ignored, although we believe it can be exploited to improve recognition performance. In this paper, we take a step in a new direction by introducing category exclusion into early action recognition. We model the exclusion as a mask operation on the classification probability output of a pre-trained early action recognition classifier. Specifically, we use policy-based reinforcement learning to train an agent that generates a series of binary masks to exclude interfering negative categories during action execution, thereby helping to improve recognition accuracy. The proposed method is evaluated on three benchmark recognition datasets: NTU-RGBD, First-Person Hand Action, and UCF-101. It improves recognition accuracy consistently across all observation ratios on the three datasets, with especially significant gains at the early stages. | en_US |
dc.description.sponsorship | NRF (Natl Research Foundation, S’pore) | en_US |
dc.language.iso | en | en_US |
dc.relation.ispartof | IEEE Transactions on Circuits and Systems for Video Technology | en_US |
dc.rights | © 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: https://doi.org/10.1109/TCSVT.2020.2976789. | en_US |
dc.subject | Engineering::Electrical and electronic engineering | en_US |
dc.title | Early action recognition with category exclusion using policy-based reinforcement learning | en_US |
dc.type | Journal Article | en |
dc.contributor.school | School of Electrical and Electronic Engineering | en_US |
dc.contributor.research | Institute for Media Innovation (IMI) | en_US |
dc.identifier.doi | 10.1109/TCSVT.2020.2976789 | - |
dc.description.version | Accepted version | en_US |
dc.subject.keywords | Category Exclusion | en_US |
dc.subject.keywords | Early Action Recognition | en_US |
item.fulltext | With Fulltext | - |
item.grantfulltext | open | - |
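As a rough illustration of the masking idea described in the abstract (a minimal sketch, not the authors' implementation — the mask here is given by hand, whereas in the paper it is produced by a reinforcement-learning agent), category exclusion can be viewed as zeroing out the probabilities of excluded categories and renormalizing the remainder:

```python
def apply_category_mask(probs, mask):
    """Zero out excluded categories and renormalize the rest.

    probs: classifier output probabilities, one per action category.
    mask:  binary values (1 keeps a category, 0 excludes it as
           interfering); in the paper these come from an RL agent.
    """
    masked = [p * m for p, m in zip(probs, mask)]
    total = sum(masked)
    if total == 0:  # degenerate case: everything excluded, fall back
        return list(probs)
    return [p / total for p in masked]

# Toy example with four categories; the mask excludes the last two,
# so the remaining probability mass is redistributed over the first two.
probs = [0.4, 0.3, 0.2, 0.1]
mask = [1, 1, 0, 0]
print(apply_category_mask(probs, mask))  # roughly [0.571, 0.429, 0.0, 0.0]
```

The predicted label is then the arg-max of the masked distribution, so excluding a strong but incorrect negative category early in the sequence can change the prediction to the correct class.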
Appears in Collections: | IMI Journal Articles |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
Early Action Recognition with Category Exclusion using Policy-based Reinforcement Learning.pdf | | 3.42 MB | Adobe PDF | View/Open |
SCOPUS™ Citations: 23 (updated on Mar 27, 2024)
Web of Science™ Citations: 18 (updated on Oct 25, 2023)
Page view(s): 374 (updated on Mar 27, 2024)
Download(s): 211 (updated on Mar 27, 2024)
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.