Please use this identifier to cite or link to this item:
Title: A robust robot design for item picking

Authors: Causo, Albert; Kok, Yuan Yik; Teoh, Yee Seng; Tju, Hendra Suratno

Keywords: Engineering::Mechanical engineering::Robots

Issue Date: 2018

Source: Causo, A., Chong, Z.-H., Luxman, R., Kok, Y. Y., Yi, Z., Pang, W.-C., . . . Chen, I.-M. (2018). A robust robot design for item picking. Proceedings of 2018 IEEE International Conference on Robotics and Automation (ICRA), 7421-7426. doi:10.1109/ICRA.2018.8461057

Abstract: In order to build a stable and reliable system for the Amazon Robotics Challenge, we carried out a detailed study of the performance and system requirements based on the rules and on our past experience of the challenge. The challenge was to build a robot that integrates grasping, vision, and motion planning, among other capabilities, to pick items from a shelf into specific order boxes. This paper presents the development process, including component selection, module design, and deployment. The resulting robot system has dual 6-degree-of-freedom industrial arms mounted on fixed bases, which in turn are mounted on a calibrated table. The robot works with a custom-designed, top-open, extendable shelf. The vision system uses multiple stereo cameras mounted on a fixed calibrated frame. Feature-based comparison and machine-learning-based matching are used to identify items and determine their pose. The gripper system uses suction cups, and the grasping strategy is to pick from the top. Error recovery strategies were also implemented to ensure robust performance. During the competition, the robot was able to pick all target items in the shortest amount of time.

URI: https://hdl.handle.net/10356/142666

ISBN: 978-1-5386-3082-2

DOI: 10.1109/ICRA.2018.8461057

Rights: © 2018 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: https://doi.org/10.1109/ICRA.2018.8461057.

Fulltext Permission: open

Fulltext Availability: With Fulltext
Appears in Collections: MAE Conference Papers
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.