Title: Object grasping of humanoid robot based on YOLO
Authors: Tian, Li; Thalmann, Nadia Magnenat
Keywords: Engineering::Computer science and engineering
Issue Date: 2019
Source: Tian, L., Thalmann, N. M., Thalmann, D., Fang, Z., & Zheng, J. (2019). Object grasping of humanoid robot based on YOLO. 36th Computer Graphics International Conference, 476-482. doi:10.1007/978-3-030-22514-8_47
Abstract: This paper presents a system that aims to achieve autonomous grasping for microcontroller-based humanoid robots such as the Inmoov robot. The system consists of a visual sensor, a central controller and a manipulator. We modify the open-source object detection software YOLO (You Only Look Once) v2 and couple it with the visual sensor so that, with the help of a depth camera, the sensor detects not only the category of the target object but also its location. We also estimate the dimensions (i.e., the height and width) of the target from its bounding box (Fig. 1). We then send this information to the central controller (a humanoid robot), which uses inverse kinematics to drive the manipulator (a customised robotic hand) to grasp the object. We conduct experiments to test our method with the Inmoov robot; they show that our method is capable of detecting the object and driving the robotic hand to grasp the target object.
URI: https://hdl.handle.net/10356/138923
ISBN: 9783030225131
DOI: 10.1007/978-3-030-22514-8_47
Rights: © 2019 Springer Nature Switzerland AG. All rights reserved. This paper was published in 36th Computer Graphics International Conference and is made available with permission of Springer Nature Switzerland AG.
Fulltext Permission: open
Fulltext Availability: With Fulltext
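The abstract's localization step (a YOLO bounding box combined with a depth reading to give the object's 3D position and physical size) can be sketched with the standard pinhole camera model. This is a minimal illustration, not the paper's implementation: the intrinsics `fx`, `fy`, `cx`, `cy` and the function name are assumed placeholder values, and the paper's own calibration and frame conventions may differ.

```python
# Hypothetical sketch: recover a target's 3D position and physical size
# from a 2D bounding box plus a depth reading, via the pinhole camera model.
# The intrinsics below are illustrative defaults, not the paper's calibration.

def locate_and_size(bbox, depth_m, fx=615.0, fy=615.0, cx=320.0, cy=240.0):
    """bbox = (x_min, y_min, x_max, y_max) in pixels;
    depth_m = depth-camera distance to the object in metres.
    Returns the bbox centre back-projected to camera coordinates (metres)
    and the approximate physical (width, height) of the box at that depth."""
    x_min, y_min, x_max, y_max = bbox
    u = (x_min + x_max) / 2.0          # bbox centre, pixel coordinates
    v = (y_min + y_max) / 2.0
    # Back-project the centre pixel into the camera frame.
    X = (u - cx) * depth_m / fx
    Y = (v - cy) * depth_m / fy
    Z = depth_m
    # Scale the pixel extent of the box by depth to estimate physical size.
    width_m = (x_max - x_min) * depth_m / fx
    height_m = (y_max - y_min) * depth_m / fy
    return (X, Y, Z), (width_m, height_m)

# Example: a box centred on the principal point at 0.5 m depth sits on the
# optical axis, so X and Y come out as zero.
pos, size = locate_and_size((300, 220, 340, 260), 0.5)
```
The resulting 3D position and dimensions are what a central controller would pass to an inverse-kinematics solver to pose the hand for the grasp.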
Appears in Collections: IMI Conference Papers
Files in This Item:
Object Grasping of Humanoid Robot Based on YOLO.pdf (454.45 kB, Adobe PDF)
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.