Title: Real-time 3D hand pose estimation from depth images
Authors: Ge, Liuhao
Keywords: DRNTU::Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
Issue Date: 3-Dec-2018
Source: Ge, L. (2018). Real-time 3D hand pose estimation from depth images. Doctoral thesis, Nanyang Technological University, Singapore.
Abstract: Accurate, real-time 3D hand pose estimation is a core technology for human-computer interaction in virtual reality and augmented reality applications, since it provides a natural way for users to interact with virtual environments and virtual objects. Despite previous work in this field, achieving efficient and robust hand pose estimation remains challenging because of large variations in hand pose, the high dimensionality of hand motion, severe self-occlusion, and the self-similarity of fingers. This thesis focuses on the problem of 3D hand pose estimation from single depth images. To achieve accurate results in real time, four methods are proposed: a multi-view CNN-based method, a 3D CNN-based method, a point set-based holistic regression method, and a point set-based point-wise regression method. Unlike conventional 2D convolutional neural networks, the proposed methods represent the input and output in different forms and take advantage of 3D deep learning, which effectively exploits the 3D spatial information in the depth image to accurately estimate 3D hand joint locations. Experimental results on public hand pose datasets show that the proposed methods achieve superior accuracy and run in real time on a GPU. Several hand pose estimation applications in virtual reality environments are also developed in this thesis.
URI: https://hdl.handle.net/10356/87652
DOI: https://doi.org/10.32657/10220/46781
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections: IGS Theses
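The point set-based methods described in the abstract operate on 3D points recovered from the depth image rather than on the raw 2D depth map. As a generic illustration only (this is not the thesis's own code, and the intrinsic parameters below are hypothetical), the following sketch back-projects a depth image into a 3D point cloud using the standard pinhole-camera model:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (in metres) into an N x 3 point cloud.

    fx, fy are the focal lengths and cx, cy the principal point of an
    assumed pinhole camera; pixels with zero depth (no sensor return)
    are discarded.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    valid = depth > 0
    # Inverse pinhole projection: x = (u - cx) * z / fx, y = (v - cy) * z / fy
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x[valid], y[valid], depth[valid]], axis=-1)

# Hypothetical example: a 4x4 depth patch, every pixel 1 m from the camera
depth = np.ones((4, 4), dtype=np.float32)
pts = depth_to_point_cloud(depth, fx=475.0, fy=475.0, cx=2.0, cy=2.0)
print(pts.shape)  # (16, 3)
```

A point cloud in this form is the typical input to point set networks such as those the thesis builds on, which regress hand joint locations directly in 3D space instead of in the image plane.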
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.