Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/140529
Title: Robust 3D hand pose estimation from single depth images using multi-view CNNs
Authors: Ge, Liuhao; Liang, Hui; Yuan, Junsong; Thalmann, Daniel
Keywords: Engineering::Computer science and engineering
Issue Date: 2018
Source: Ge, L., Liang, H., Yuan, J., & Thalmann, D. (2018). Robust 3D hand pose estimation from single depth images using multi-view CNNs. IEEE Transactions on Image Processing, 27(9), 4422-4436. doi:10.1109/TIP.2018.2834824
Project: MOE2015-T2-2-114
Journal: IEEE Transactions on Image Processing
Abstract: Articulated hand pose estimation is one of the core technologies in human-computer interaction. Despite recent progress, most existing methods still cannot achieve satisfactory performance, partly due to the difficulty of the embedded high-dimensional nonlinear regression problem. Most existing data-driven methods directly regress the 3D hand pose from the 2D depth image, which cannot fully utilize the depth information. In this paper, we propose a novel multi-view convolutional neural network (CNN)-based approach for 3D hand pose estimation. To better exploit the 3D information in the depth image, we project the point cloud generated from the query depth image onto multiple views under two projection settings and integrate them for more robust estimation. Multi-view CNNs are trained to learn the mapping from projected images to heat-maps, which reflect the probability distributions of joints on each view. These multi-view heat-maps are then fused to estimate the optimal 3D hand pose with learned pose priors, and unreliable information in the multi-view heat-maps is suppressed using a view selection method. Experimental results show that the proposed method is superior to state-of-the-art methods on two challenging data sets. Furthermore, a cross-data-set experiment also validates that our proposed approach has good generalization ability.
URI: https://hdl.handle.net/10356/140529
ISSN: 1057-7149
DOI: 10.1109/TIP.2018.2834824
Rights: © 2018 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: https://doi.org/10.1109/TIP.2018.2834824
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections: EEE Journal Articles
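The abstract describes a multi-view pipeline: the point cloud recovered from a single depth image is projected onto several view planes, a CNN per view regresses joint heat-maps, and the heat-maps are fused under learned pose priors while unreliable views are suppressed. As a rough illustration of the projection step only, the sketch below back-projects a depth map with assumed camera intrinsics and renders the resulting points onto three orthogonal planes. It is not the authors' implementation; the function names, intrinsics, and 96 x 96 projection resolution are illustrative assumptions.

```python
import numpy as np

def depth_to_point_cloud(depth, fx=588.0, fy=587.0, cx=320.0, cy=240.0):
    """Back-project a depth map (values in mm) into an N x 3 point cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    valid = depth > 0                                 # skip missing-depth pixels
    z = depth[valid]
    x = (u[valid] - cx) * z / fx
    y = (v[valid] - cy) * z / fy
    return np.stack([x, y, z], axis=1)

def project_to_views(points, res=96):
    """Render the point cloud onto the x-y, y-z and z-x planes.

    Each res x res image stores, per pixel, the normalized coordinate along
    the dropped axis, so every view keeps some depth information.
    """
    views = []
    for keep, drop in [((0, 1), 2), ((1, 2), 0), ((2, 0), 1)]:
        img = np.zeros((res, res), dtype=np.float32)
        ab = points[:, keep]                          # in-plane coordinates
        c = points[:, drop]                           # coordinate encoded as intensity
        mins = ab.min(axis=0)
        scale = (ab.max(axis=0) - mins).max() + 1e-6
        ij = ((ab - mins) / scale * (res - 1)).astype(int)
        cval = (c - c.min()) / (c.max() - c.min() + 1e-6)
        np.maximum.at(img, (ij[:, 1], ij[:, 0]), cval)  # keep the max per pixel
        views.append(img)
    return views

# Toy example: a flat 480 x 640 depth map with a slanted foreground patch.
depth = np.zeros((480, 640), dtype=np.float32)
depth[200:280, 280:360] = 400.0 + np.linspace(0.0, 50.0, 80)[None, :]
views = project_to_views(depth_to_point_cloud(depth))
print([v.shape for v in views])  # three (96, 96) projection images
```

Encoding the dropped axis as pixel intensity is merely one plausible way to retain per-view depth information; the paper's actual projection settings, heat-map regression, and fusion with pose priors are described in the full text.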
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| 08357595.pdf | | 5.11 MB | Adobe PDF |
Scopus Citations: 44 (updated on Mar 20, 2023)
Web of Science Citations: 34 (updated on Mar 13, 2023)
Page view(s): 199 (updated on Mar 21, 2023)
Download(s): 98 (updated on Mar 21, 2023)
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.