dc.contributor.author: Xiao, Yang
dc.contributor.author: Zhang, Zhijun
dc.contributor.author: Beck, Aryel
dc.contributor.author: Yuan, Junsong
dc.contributor.author: Thalmann, Daniel
dc.date.accessioned: 2014-10-28T07:06:58Z
dc.date.available: 2014-10-28T07:06:58Z
dc.date.copyright: 2014
dc.date.issued: 2014
dc.identifier.citation: Xiao, Y., Zhang, Z., Beck, A., Yuan, J., & Thalmann, D. (2014). Human robot interaction by understanding upper body gestures. Presence: Teleoperators and Virtual Environments, 23(2), 133-154.
dc.identifier.uri: http://hdl.handle.net/10220/24134
dc.description.abstract: In this paper, a human–robot interaction system based on a novel combination of sensors is proposed. It allows one person to interact with a humanoid social robot using natural body language. The robot understands the meaning of human upper body gestures and expresses itself through a combination of body movements, facial expressions, and verbal language. A set of 12 upper body gestures, including gestures that involve human–object interactions, is used for communication. The gestures are characterized by head, arm, and hand posture information. The wearable Immersion CyberGlove II captures the hand posture, and this information is combined with the head and arm posture captured by the Microsoft Kinect. This is a new sensor solution for human-gesture capture. Based on the posture data from the CyberGlove II and the Kinect, an effective, real-time human gesture recognition method is proposed. The gesture understanding approach based on this innovative combination of sensors is the main contribution of this paper. To verify the effectiveness of the proposed gesture recognition method, a human body gesture data set is built. The experimental results demonstrate that our approach can recognize the upper body gestures with high accuracy in real time. In addition, for robot motion generation and control, a novel online motion planning method is proposed. To generate appropriate dynamic motion, a quadratic programming (QP)-based dual-arm kinematic motion generation scheme is proposed, and a simplified recurrent neural network is employed to solve the QP problem. The integration of a handshake within the HRI system illustrates the effectiveness of the proposed online generation method. (Illustrative sketches of a fusion pipeline and a QP scheme of this kind, under stated assumptions, follow this record.)
dc.language.iso: en
dc.relation.ispartofseries: Presence: Teleoperators and Virtual Environments
dc.rights: © 2014 Massachusetts Institute of Technology Press. This paper was published in Presence: Teleoperators and Virtual Environments and is made available as an electronic reprint (preprint) with permission of Massachusetts Institute of Technology Press. The paper can be found at the following official DOI: [http://dx.doi.org/10.1162/PRES_a_00176]. One print or electronic copy may be made for personal use only. Systematic or multiple reproduction, distribution to multiple locations via electronic or other means, duplication of any material in this paper for a fee or for commercial purposes, or modification of the content of the paper is prohibited and is subject to penalties under law.
dc.subject: DRNTU::Engineering::Mechanical engineering::Robots
dc.title: Human robot interaction by understanding upper body gestures
dc.type: Journal Article
dc.contributor.school: School of Electrical and Electronic Engineering
dc.identifier.doi: http://dx.doi.org/10.1162/PRES_a_00176
dc.description.version: Published version
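
The abstract characterizes each gesture by head and arm posture from the Kinect combined with hand posture from the CyberGlove II. The following is a minimal Python sketch of such feature-level fusion, assuming the 22-sensor CyberGlove II model; the joint names, normalization, and nearest-centroid placeholder are illustrative assumptions, not the paper's method.

import numpy as np

NUM_GLOVE_SENSORS = 22  # assumes the 22-sensor CyberGlove II model

# Kinect upper-body joints used for the head/arm posture (illustrative names)
KINECT_JOINTS = ("head", "shoulder_left", "elbow_left", "hand_left",
                 "shoulder_right", "elbow_right", "hand_right")

def fuse_frame(glove_angles, kinect_xyz):
    """Build one fused feature vector from a synchronized sensor frame.

    glove_angles : (22,) array of CyberGlove joint angles in degrees.
    kinect_xyz   : dict mapping joint name -> (x, y, z) Kinect position;
                   must also contain a "torso" entry.
    """
    glove_angles = np.asarray(glove_angles, dtype=float)
    assert glove_angles.shape == (NUM_GLOVE_SENSORS,)
    torso = np.asarray(kinect_xyz["torso"], dtype=float)
    # Torso-centred coordinates make the head/arm features invariant to
    # where the user stands in the Kinect frame.
    posture = np.concatenate(
        [np.asarray(kinect_xyz[j], dtype=float) - torso for j in KINECT_JOINTS])
    # Rough scaling so glove features (degrees) and Kinect features (metres)
    # live on comparable ranges before classification.
    return np.concatenate([glove_angles / 90.0, posture])

def classify(feature, centroids):
    """Nearest-centroid placeholder for the 12-gesture recognizer."""
    names = list(centroids)
    dists = [np.linalg.norm(feature - centroids[n]) for n in names]
    return names[int(np.argmin(dists))]

For the motion planner, the abstract names a QP-based dual-arm kinematic motion generation scheme solved by a simplified recurrent neural network. A standard velocity-level formulation of this kind, with the cost, weights, and constraint set assumed for illustration rather than taken from the paper, is:

\begin{align}
  \min_{\dot{\theta}}\ & \tfrac{1}{2}\,\dot{\theta}^{\mathsf T} W \dot{\theta} \\
  \text{s.t.}\ & J(\theta)\,\dot{\theta} = \dot{r}_d, \qquad
    \dot{\theta}^{-} \le \dot{\theta} \le \dot{\theta}^{+},
\end{align}

where \(\theta\) stacks the joint angles of both arms, \(J(\theta)=\operatorname{diag}(J_L, J_R)\) is the block-diagonal dual-arm Jacobian, and \(\dot{r}_d\) is the desired end-effector velocity (e.g., of the handshaking hand). A projection-type recurrent network can then solve the QP online through the dynamics

\begin{equation}
  \varepsilon\,\dot{u} = P_{\Omega}\bigl(u - (M u + q)\bigr) - u,
\end{equation}

where \(u\) is the primal-dual state, \(P_{\Omega}\) projects onto the feasible set \(\Omega\), and \(M\), \(q\) are assembled from \(W\), \(J\), and \(\dot{r}_d\). The network's equilibrium coincides with the QP optimum, which is what makes real-time online motion generation possible.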

