Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/154652
Full metadata record
DC Field | Value | Language
dc.contributor.author | Esfahani, Mahdi Abolfazli | en_US
dc.contributor.author | Wang, Han | en_US
dc.contributor.author | Wu, Keyu | en_US
dc.contributor.author | Yuan, Shenghai | en_US
dc.date.accessioned | 2021-12-30T06:52:30Z | -
dc.date.available | 2021-12-30T06:52:30Z | -
dc.date.issued | 2020 | -
dc.identifier.citation | Esfahani, M. A., Wang, H., Wu, K. & Yuan, S. (2020). OriNet: Robust 3-D orientation estimation with a single particular IMU. IEEE Robotics and Automation Letters, 5(2), 399-406. https://dx.doi.org/10.1109/LRA.2019.2959507 | en_US
dc.identifier.issn | 2377-3766 | en_US
dc.identifier.uri | https://hdl.handle.net/10356/154652 | -
dc.description.abstract | Estimating a robot's heading is a crucial requirement in odometry systems, which attempt to estimate the robot's movement trajectory. Small errors in orientation estimation produce a significant difference between the estimated and true trajectories, and cause the odometry system to fail. The odometry problem becomes much more complicated for micro flying robots, since they cannot carry heavy sensors and must instead rely on small, low-cost sensors such as an IMU; industry actively seeks such solutions. However, an IMU suffers from bias and measurement noise, which makes position and orientation estimation from a single IMU challenging. While there are numerous studies on fusing an IMU with other sensors, this study presents the first deep learning framework for accurately estimating the full 3D orientation of a flying robot (yaw, pitch, and roll, expressed in quaternion coordinates) from a single IMU. The same particular IMU must be used during both training and testing of the proposed system. In addition, a method based on the Genetic Algorithm is introduced to estimate the IMU bias in each execution. The results show that the proposed method improves a flying robot's estimation of its orientation displacement by approximately 80% using a single particular IMU. The proposed approach also outperforms existing solutions that use a monocular camera and an IMU simultaneously by approximately 30%. | en_US
dc.language.iso | en | en_US
dc.relation.ispartof | IEEE Robotics and Automation Letters | en_US
dc.rights | © 2019 IEEE. All rights reserved. | en_US
dc.subject | Engineering::Electrical and electronic engineering | en_US
dc.title | OriNet: Robust 3-D orientation estimation with a single particular IMU | en_US
dc.type | Journal Article | en
dc.contributor.school | School of Electrical and Electronic Engineering | en_US
dc.identifier.doi | 10.1109/LRA.2019.2959507 | -
dc.identifier.scopus | 2-s2.0-85078016601 | -
dc.identifier.issue | 2 | en_US
dc.identifier.volume | 5 | en_US
dc.identifier.spage | 399 | en_US
dc.identifier.epage | 406 | en_US
dc.subject.keywords | Localization | en_US
dc.subject.keywords | SLAM | en_US
item.fulltext | No Fulltext | -
item.grantfulltext | none | -
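The abstract's central problem — that uncorrected gyro bias makes a single-IMU orientation estimate drift — can be illustrated with a minimal quaternion dead-reckoning sketch. This is not the paper's OriNet model or its Genetic Algorithm; all rates, biases, and durations below are invented for the demonstration.

```python
import math

def quat_mul(q, r):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def integrate_gyro(rates, dt, bias=(0.0, 0.0, 0.0)):
    """Dead-reckon an orientation quaternion from body angular rates
    (rad/s), subtracting an estimated gyro bias before integrating."""
    q = (1.0, 0.0, 0.0, 0.0)  # identity orientation
    for wx, wy, wz in rates:
        wx, wy, wz = wx - bias[0], wy - bias[1], wz - bias[2]
        norm = math.sqrt(wx*wx + wy*wy + wz*wz)
        if norm < 1e-12:
            continue
        half = 0.5 * norm * dt  # half the rotation angle for this step
        s = math.sin(half) / norm
        q = quat_mul(q, (math.cos(half), wx*s, wy*s, wz*s))
    return q

def yaw_of(q):
    """Yaw angle of a pure z-axis rotation quaternion."""
    return 2.0 * math.atan2(q[3], q[0])

# A true turn of 0.1 rad/s about z for 10 s (final yaw = 1.0 rad), as seen
# by a gyro with a constant +0.01 rad/s z-bias (values made up for the demo):
measured = [(0.0, 0.0, 0.11)] * 100
drifted = yaw_of(integrate_gyro(measured, 0.1))
corrected = yaw_of(integrate_gyro(measured, 0.1, bias=(0.0, 0.0, 0.01)))
print(f"uncorrected yaw error: {abs(drifted - 1.0):.3f} rad")
print(f"bias-corrected error:  {abs(corrected - 1.0):.3f} rad")
```

Even this tiny constant bias accumulates into a 0.1 rad heading error after only 10 s, while subtracting a correct bias estimate (the role of the paper's Genetic Algorithm step) removes the drift — which is why per-execution bias estimation matters for single-IMU odometry.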
Appears in Collections: EEE Journal Articles
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.