Please use this identifier to cite or link to this item:
Title: OriNet: Robust 3-D orientation estimation with a single particular IMU
Authors: Esfahani, Mahdi Abolfazli
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2020
Source: Esfahani, M. A., Wang, H., Wu, K. & Yuan, S. (2020). OriNet: Robust 3-D orientation estimation with a single particular IMU. IEEE Robotics and Automation Letters, 5(2), 399-406. https://dx.doi.org/10.1109/LRA.2019.2959507
Journal: IEEE Robotics and Automation Letters
Abstract: Estimating a robot's heading is a crucial requirement in odometry systems, which attempt to estimate the robot's movement trajectory. Small errors in orientation estimation produce a significant difference between the estimated and real trajectories and lead to failure of the odometry system. The odometry problem becomes much more complicated for micro flying robots, since they cannot carry heavy sensors and must instead rely on small, low-cost sensors such as an IMU; industry actively seeks such solutions. However, an IMU suffers from bias and measurement noise, which makes position and orientation estimation from a single IMU challenging. While there are numerous studies on fusing an IMU with other sensors, this study presents the first deep learning framework for accurately estimating the full 3-D orientation of a flying robot (yaw, pitch, and roll, expressed as a quaternion) from a single IMU. The same particular IMU must be used during both training and testing of the proposed system. In addition, a method based on the Genetic Algorithm is introduced to measure the IMU bias in each execution. The results show that the proposed method improves a flying robot's ability to estimate its orientation displacement by approximately 80% with a single particular IMU. The proposed approach also outperforms existing solutions that use a monocular camera and an IMU simultaneously by approximately 30%.
URI: https://hdl.handle.net/10356/154652
ISSN: 2377-3766
DOI: 10.1109/LRA.2019.2959507
Rights: © 2019 IEEE. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
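The abstract's premise, that small orientation errors break an odometry system, can be made concrete with a short sketch. This is not the paper's method; the bias value, sample rate, and duration below are assumptions chosen for illustration. Integrating gyroscope rates into a quaternion shows how an uncorrected constant bias alone produces tens of degrees of heading drift within a minute:

```python
import numpy as np

def quat_mul(q, r):
    """Hamilton product of two quaternions in [w, x, y, z] order."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def integrate_gyro(omegas, dt):
    """Dead-reckon orientation by integrating angular rates (rad/s)."""
    q = np.array([1.0, 0.0, 0.0, 0.0])  # identity quaternion
    for w in omegas:
        angle = np.linalg.norm(w) * dt
        if angle > 0:
            axis = w / np.linalg.norm(w)
            dq = np.concatenate(([np.cos(angle / 2)], np.sin(angle / 2) * axis))
        else:
            dq = np.array([1.0, 0.0, 0.0, 0.0])
        q = quat_mul(q, dq)
        q /= np.linalg.norm(q)  # renormalize against numerical drift
    return q

# A stationary IMU: the true angular rate is zero, but a hypothetical
# constant gyro bias of 0.01 rad/s about z is integrated for 60 s at 100 Hz.
bias = np.array([0.0, 0.0, 0.01])
readings = np.tile(bias, (6000, 1))
q = integrate_gyro(readings, dt=0.01)
yaw_drift = 2 * np.arctan2(q[3], q[0])  # heading error from bias alone
print(np.degrees(yaw_drift))  # ≈ 34 degrees of yaw drift in one minute
```

The drift grows linearly with time, which is why the abstract treats bias handling as central to single-IMU orientation estimation.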
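The record does not detail the paper's Genetic Algorithm for measuring IMU bias. As a purely illustrative sketch under assumed conditions (a stationary recording, a made-up bias, and a toy fitness function that rewards a small residual rotation), a minimal GA can recover a constant gyro bias by selecting and mutating candidate offsets:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stationary recording: true rate is zero, so each reading
# is the (assumed) constant bias plus Gaussian measurement noise.
true_bias = np.array([0.02, -0.01, 0.005])             # rad/s, assumed values
readings = true_bias + rng.normal(0, 0.05, (2000, 3))  # 2000 samples at 100 Hz

def fitness(b):
    # Residual rotation accumulated after subtracting the candidate bias;
    # a good bias estimate drives this toward the noise floor.
    return -np.linalg.norm((readings - b).sum(axis=0) * 0.01)

pop = rng.normal(0, 0.1, (50, 3))                      # initial candidate biases
for _ in range(100):
    scores = np.array([fitness(b) for b in pop])
    parents = pop[np.argsort(scores)[-10:]]            # keep the 10 fittest
    children = parents[rng.integers(0, 10, 40)] + rng.normal(0, 0.005, (40, 3))
    pop = np.vstack([parents, children])               # elitism + mutated offspring

best = pop[np.argmax([fitness(b) for b in pop])]
print(best)  # close to true_bias
```

This only shows the general shape of a GA (selection, elitism, mutation); the paper's actual fitness function and encoding would come from the full text.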
Appears in Collections: EEE Journal Articles
Updated on May 26, 2022
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.