Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/69033
Title: Real-time high performance displacement sensing in handheld instrument for microsurgery
Authors: Aye, Yan Naing
Keywords: DRNTU::Engineering::Electrical and electronic engineering::Control and instrumentation::Robotics
Issue Date: 2016
Source: Aye, Y. N. (2016). Real-time high performance displacement sensing in handheld instrument for microsurgery. Doctoral thesis, Nanyang Technological University, Singapore.
Abstract: The main focus of this research is to achieve real-time, high-resolution, high-accuracy displacement sensing in a vision-aided intelligent handheld instrument for microsurgery. The intelligent handheld instrument, ITrem2, enhances human manual positioning accuracy by cancelling erroneous hand movements and, at the same time, provides automatic micromanipulation functions. Visual data are acquired from a high-speed mono-vision camera attached to the optical surgical microscope, and acceleration measurements are acquired from the inertial measurement unit on board ITrem2. These complementary sensing sources are fused to achieve enhanced position sensing accuracy in real time. A new accelerometer placement design is implemented to achieve better tool-tip acceleration sensing in ITrem2. The proposed design deploys four low-cost dual-axis digital MEMS accelerometers on board ITrem2. Besides taking into account the geometry of the available space and the position of interest in ITrem2, the design uses redundant accelerometers for higher sensing resolution. To accurately estimate accelerometer parameters such as bias, scale factor, and cross-axis effect in a single experiment, an accelerometer calibration method is proposed. An algorithm is proposed and implemented to fuse the visual and inertial sensors for real-time visual servo control of the ITrem2 tool tip. The proposed method utilizes accurate time stamps of the vision and inertial measurements provided by the real-time computer. By including the sensing time as a new dimension in the fusion process, the time-aware integration of the inertial measurements achieves an accuracy of 4.6 µm at the nominal frequency even with significant delay and jitter in the vision measurements.
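The calibration step described in the abstract estimates bias, scale factor, and cross-axis effects from measured data. Below is a minimal sketch of one common way to fit such an affine accelerometer error model by linear least squares; the function names, the generic 3-axis model, and the availability of known reference accelerations (e.g. gravity in known orientations) are illustrative assumptions and do not reproduce the thesis's actual single-experiment procedure or its dual-axis sensor configuration.

import numpy as np

def calibrate_accelerometer(a_true, a_meas):
    """Fit an affine accelerometer error model
        a_meas ~= M @ a_true + b
    where M (3x3) lumps scale factors and cross-axis coupling and
    b (3,) is the bias, using ordinary linear least squares.

    a_true : (N, 3) known reference accelerations
    a_meas : (N, 3) corresponding raw accelerometer readings
    """
    n = a_true.shape[0]
    # Append a column of ones so the bias is estimated jointly with M.
    A = np.hstack([a_true, np.ones((n, 1))])            # (N, 4)
    # Solve A @ X ~= a_meas in the least-squares sense, X = [M^T; b].
    X, *_ = np.linalg.lstsq(A, a_meas, rcond=None)
    M = X[:3, :].T                                       # (3, 3)
    b = X[3, :]                                          # (3,)
    return M, b

def apply_calibration(a_raw, M, b):
    """Invert the fitted error model to recover calibrated acceleration."""
    return np.linalg.solve(M, (np.atleast_2d(a_raw) - b).T).T

if __name__ == "__main__":
    # Synthetic self-check: recover a known model from noiseless data.
    rng = np.random.default_rng(0)
    M_true = np.eye(3) + 0.01 * rng.standard_normal((3, 3))  # small cross-axis terms
    b_true = np.array([0.05, -0.02, 0.03])
    a_ref = rng.standard_normal((200, 3)) * 9.81
    a_obs = a_ref @ M_true.T + b_true
    M_est, b_est = calibrate_accelerometer(a_ref, a_obs)
    print(np.allclose(M_est, M_true), np.allclose(b_est, b_true))

Because the bias column is solved jointly with the scale/cross-axis matrix, a single least-squares pass over one data set suffices, which is in the spirit of the single-experiment calibration mentioned in the abstract.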
URI: http://hdl.handle.net/10356/69033
Schools: School of Mechanical and Aerospace Engineering 
Research Centres: Robotics Research Centre 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:MAE Theses

Files in This Item:
File: Aye_Thesis_HB2P.pdf (Restricted Access)
Description: Thesis
Size: 2.7 MB
Format: Adobe PDF

Page view(s): 478 (updated on May 7, 2025)

Download(s): 25 (updated on May 7, 2025)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.