Title: 3-D localization of human based on an inertial capture system
Authors: Yuan, Qilong
Keywords: DRNTU::Engineering::Mechanical engineering::Robots
Issue Date: 2013
Source: Yuan, Q., & Chen, I. M. (2013). 3-D localization of human based on an inertial capture system. IEEE Transactions on Robotics, 29(3), 806-812.
Series/Report no.: IEEE Transactions on Robotics
Abstract: This paper introduces a method to track the spatial location and movement of a human using wearable inertial sensors, without additional external global positioning devices. Starting from the lower-limb kinematics of the human, the method uses multiple wearable inertial sensors to determine the orientation of the body segments and the lower-limb joint motions. At the same time, based on human kinematics and locomotion phase detection, the spatial position and trajectory of a reference point on the body can be determined. An experimental study has shown that the position error can be kept within 1-2% of the total travelled distance in both indoor and outdoor environments. The system is also capable of localization on irregular terrain (e.g., uphill/downhill), and the ground shape and height information can be recovered from the localization results. A benchmark study on the accuracy of this method was carried out against a camera-based motion analysis system to validate the system; the localization data obtained from the proposed method match well with those from the commercial system. Since the sensors can be worn on the human at any time and in any place, the method is not restricted to either indoor or outdoor applications.
URI: https://hdl.handle.net/10356/101463
ISSN: 1552-3098
DOI: 10.1109/TRO.2013.2248535
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections: MAE Journal Articles
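The abstract describes localizing a person by combining segment orientations from wearable inertial sensors with locomotion phase detection. The sketch below is a minimal, hypothetical illustration of that general idea, not the paper's actual kinematic algorithm: a stance phase is flagged when the foot's acceleration magnitude is near gravity (the `STANCE_THRESH` tolerance is an assumed value), and a 2-D trajectory is accumulated from per-step length and heading.

```python
import math

GRAVITY = 9.81          # m/s^2
STANCE_THRESH = 0.4     # assumed tolerance around gravity for "foot at rest"

def detect_stance(accel_norms, thresh=STANCE_THRESH):
    """Flag samples whose acceleration magnitude is close to gravity,
    i.e., the foot is likely planted (a crude locomotion phase detector)."""
    return [abs(a - GRAVITY) < thresh for a in accel_norms]

def dead_reckon(step_lengths, headings_rad, start=(0.0, 0.0)):
    """Accumulate a 2-D trajectory of a body reference point from
    per-step length (m) and heading (rad), as in pedestrian dead reckoning."""
    x, y = start
    path = [(x, y)]
    for length, heading in zip(step_lengths, headings_rad):
        x += length * math.cos(heading)
        y += length * math.sin(heading)
        path.append((x, y))
    return path

# Toy usage: two 0.7 m steps heading east, then two heading north.
path = dead_reckon([0.7, 0.7, 0.7, 0.7],
                   [0.0, 0.0, math.pi / 2, math.pi / 2])
```

In the paper's method the step displacement comes from lower-limb joint kinematics rather than an assumed step length, which is what keeps the drift within the reported 1-2% of travelled distance.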