Title: Spatial feature mapping for 6DoF object pose estimation
Authors: Mei, Jianhan
Jiang, Xudong
Ding, Henghui
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2022
Source: Mei, J., Jiang, X. & Ding, H. (2022). Spatial feature mapping for 6DoF object pose estimation. Pattern Recognition, 131, 108835.
Journal: Pattern Recognition
Abstract: This work aims to estimate 6DoF (6D) object pose in background clutter. Considering the strong occlusion and background noise, we propose to utilize the spatial structure to better tackle this challenging task. Observing that a 3D mesh can be naturally abstracted as a graph, we build the graph using 3D points as vertices and mesh connections as edges. We construct the corresponding mapping from 2D image features to 3D points to fill the graph and fuse the 2D and 3D features. Afterward, a Graph Convolutional Network (GCN) is applied to facilitate feature exchange among the objects’ points in 3D space. To address the rotation symmetry ambiguity of objects, a spherical convolution is utilized, and the spherical features are combined with the convolutional features mapped to the graph. Predefined 3D keypoints are voted for, and the 6DoF pose is obtained via fitting optimization. Two inference scenarios, one with depth information and the other without it, are discussed. Tested on the YCB-Video and LINEMOD datasets, the experiments demonstrate the effectiveness of our proposed method.
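The mesh-to-graph construction and GCN propagation described in the abstract can be sketched roughly as follows. This is a toy NumPy illustration under assumed conventions (triangle faces as vertex-index triples, a standard symmetrically normalized GCN update), not the paper's actual implementation; the function names and feature dimensions are hypothetical.

```python
import numpy as np

def mesh_to_adjacency(num_vertices, faces):
    """Build a symmetric adjacency matrix (with self-loops) from triangle faces,
    treating 3D points as vertices and mesh connections as edges."""
    A = np.eye(num_vertices)
    for i, j, k in faces:
        for a, b in ((i, j), (j, k), (k, i)):
            A[a, b] = A[b, a] = 1.0
    return A

def gcn_layer(H, A, W):
    """One GCN propagation step: H' = ReLU(D^-1/2 A D^-1/2 H W)."""
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return np.maximum(D_inv_sqrt @ A @ D_inv_sqrt @ H @ W, 0.0)

# Toy mesh: a tetrahedron (4 vertices, 4 triangular faces).
faces = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
A = mesh_to_adjacency(4, faces)

# Per-vertex features, standing in for 2D image features mapped onto 3D points.
H = np.random.default_rng(0).normal(size=(4, 16))
W = np.random.default_rng(1).normal(size=(16, 8))  # learnable weights
H_out = gcn_layer(H, A, W)
print(H_out.shape)  # one feature vector per mesh vertex
```

In the actual method the per-vertex features would come from a 2D CNN via the 2D-to-3D mapping, and several such layers would precede the keypoint voting; the sketch only shows the graph filling and a single feature-exchange step.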
ISSN: 0031-3203
DOI: 10.1016/j.patcog.2022.108835
Schools: School of Electrical and Electronic Engineering 
Rights: © 2022 Elsevier Ltd. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:EEE Journal Articles

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.