Title: Contributions to robotic manipulation under uncertainty: calibration, estimation, motion planning
Authors: Nguyen, Huy
Keywords: DRNTU::Engineering::Electrical and electronic engineering::Control and instrumentation::Robotics
Issue Date: 2019
Source: Nguyen, H. (2019). Contributions to robotic manipulation under uncertainty: calibration, estimation, motion planning. Doctoral thesis, Nanyang Technological University, Singapore.

Abstract:
Robotics has played a considerable role in increasing industrial productivity across the globe over the past few decades. However, the automation of complex manipulation tasks, such as robotic assembly, is still beyond the capabilities of today's industrial robots. One of the main obstacles to this level of automation is uncertainty in the positions and orientations of objects in real-world environments. To tackle this problem, one of our key ideas is to explicitly represent and estimate uncertainties using the calculus of probability theory, assigning probabilities to all potential poses of the object. As a result, ambiguity and degree of certainty can be described mathematically, allowing a deeper understanding of these uncertainties and better solutions to the associated problems. In line with this idea, this thesis presents a number of contributions to robotic manipulation under uncertainty, spanning calibration, estimation, and motion planning.

First, we propose a probabilistic framework to precisely track uncertainties throughout the entire manipulation process. In agreement with common manipulation pipelines, we decompose the process into two consecutive stages, perception and physical interaction. Each stage is associated with different sources and types of uncertainties, requiring different techniques. We discuss which representation of uncertainties is most appropriate for each stage (e.g. probability distributions in SE(3) during perception, weighted particles during physical interactions), how to convert from one representation to another, and how to initialize or update the uncertainties at each step of the process (camera calibration, image processing, pushing, grasping, etc.). Finally, we demonstrate the benefit of this fine-grained knowledge of uncertainties in an actual assembly task.

Second, we present an approach to estimate the uncertainties of the hand-eye transformation. Although hand-eye calibration has been a fundamental problem in robot vision for several decades, no existing method derives the covariance of the hand-eye transformation, which is the most generic and relevant quantification of the uncertainties in the calibration process. After obtaining this covariance, we also discuss a propagation method to compute the covariance of the object pose estimate in a real setting.

Next, we present contributions that reduce the running time of touch-based localization in cluttered environments and handle outlier measurements, which often cause a significant loss of precision in existing approaches. Experiments show that our algorithm provides timely, accurate, and reliable localization in cluttered environments, even in the presence of outliers.

Finally, while the main contributions of this thesis lie in handling uncertainties in manipulation tasks, we also make a contribution toward solving the problem of trajectory planning with kinodynamic constraints in the space of rigid-body motions SE(3). Looking forward, it is our hope that this thesis will serve as a starting point for further development of the robotic automation of manipulation tasks.

URI: https://hdl.handle.net/10356/83537
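As a hedged illustration of the weighted-particle representation mentioned in the abstract, the sketch below shows one Bayesian measurement update over a set of pose hypotheses. This is not the thesis's actual implementation: the Gaussian measurement model, the function names, and the 1-D toy example are all assumptions for illustration.

```python
import numpy as np

def particle_update(particles, weights, measurement, measure_fn, sigma):
    """One Bayesian measurement update for a weighted-particle belief.

    particles   : (N, d) array of pose hypotheses
    weights     : (N,) normalized particle weights
    measurement : observed value
    measure_fn  : maps a pose hypothesis to a predicted measurement
    sigma       : measurement noise standard deviation (assumed Gaussian)
    """
    predicted = np.array([measure_fn(p) for p in particles])
    residual = np.linalg.norm(np.atleast_2d(predicted - measurement), axis=1)
    # Weight each hypothesis by the Gaussian likelihood of the measurement
    likelihood = np.exp(-0.5 * (residual / sigma) ** 2)
    new_w = weights * likelihood
    return new_w / new_w.sum()  # renormalize so the belief sums to 1

# Toy 1-D example: belief over an object's x-position along a line
particles = np.linspace(0.0, 1.0, 101).reshape(-1, 1)
weights = np.full(101, 1.0 / 101)                 # uniform prior
weights = particle_update(particles, weights, 0.4,
                          lambda p: p, sigma=0.05)
print(particles[np.argmax(weights)])              # belief peaks near 0.4
```

In the thesis's setting the hypotheses would be full SE(3) poses and the measurements would come from cameras or touch probes, but the update structure (predict, weight by likelihood, renormalize) is the same.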
DOI: 10.32657/10220/48014
Schools: School of Mechanical and Aerospace Engineering
Research Centres: Robotics Research Centre
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections: MAE Theses
Updated on Dec 8, 2023
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.