Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/75884
Title: Image-inspired haptic interaction
Authors: Zhang, Xingzi
Keywords: DRNTU::Engineering::Computer science and engineering::Software
Issue Date: 2018
Source: Zhang, X. (2018). Image-inspired haptic interaction. Doctoral thesis, Nanyang Technological University, Singapore.
Abstract: In this research, an image-inspired haptic interaction approach is proposed that addresses the problem of augmenting images with 3D haptic models systematically. It is hypothesized that, with the proposed approach, images of real scenes can be converted into a tangible form so that one can interact with them using desktop haptic devices as if they were real 3D scenes. To achieve this goal, two scenarios are considered when augmenting images with haptic models. First, when no haptic model is available for the image, a method of sketching and augmenting the image with perspectively distorted haptic models is proposed. The novelty of this method lies in the way the distorted haptic models are created and aligned with the images. In contrast to conventional 3D modelling, the haptic models are defined in a perspectively distorted modelling space, so that, when rendered with an orthographic projection, their contours match the respective parts of the image. The proposed method mostly uses function representations (FReps) to define the augmented haptic models, which makes it highly extensible: any function-based model that matches the image content can be added. A further novelty is the haptic rendering algorithm proposed for function-based models, which ensures smooth interaction with combinations of models defined as FReps. The use of FReps minimizes the model size while preserving rendering precision, making the method particularly suitable for web-based real-time applications. Next, the case where the image scene can be reconstructed is considered: given multiple images of a scene, a 3D model can be obtained with computer vision techniques.
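As a rough illustration of the FRep idea referenced above (not the thesis implementation): an FRep model is a function that is positive inside the shape, zero on its surface, and negative outside, and models can be combined with simple set operations such as min/max. The sphere primitive, the union operator, and the numerical gradient below are generic textbook constructions chosen for illustration.

```python
import numpy as np

# Minimal FRep sketch: a model is a function f(p) that is >= 0 inside
# the shape, 0 on its surface, and < 0 outside. Set operations can be
# composed with min/max (the simplest R-functions).

def sphere(p, center, radius):
    # FRep of a sphere: positive inside, zero on the surface.
    return radius**2 - np.sum((p - center)**2)

def union(f1, f2):
    # Union of two FRep values: inside if inside either shape.
    return max(f1, f2)

def gradient(f, p, eps=1e-5):
    # Numerical gradient of the defining function; its direction gives
    # the surface normal needed to compute a haptic reaction force.
    g = np.zeros(3)
    for i in range(3):
        d = np.zeros(3)
        d[i] = eps
        g[i] = (f(p + d) - f(p - d)) / (2 * eps)
    return g

# A composite model: the union of two overlapping unit spheres.
model = lambda p: union(sphere(p, np.array([0.0, 0.0, 0.0]), 1.0),
                        sphere(p, np.array([1.0, 0.0, 0.0]), 1.0))

p = np.array([0.5, 0.0, 0.0])
print(model(p) >= 0)  # True: the point lies inside the union

n = gradient(model, np.array([2.0, 0.0, 0.0]))  # normal direction at a boundary point
```

Because the whole model is a handful of closed-form functions rather than a mesh, its description stays tiny while evaluation remains exact at any point, which is the property the abstract attributes to FReps for web-based real-time use.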
For this scenario, a way of haptic interaction with real-life scenes is proposed in which multiple original images of the scene are augmented with the reconstructed polygon mesh. The novelty here lies in the solution to the problem of interactive haptic rendering of large polygon meshes containing reconstruction artifacts; in particular, the presented collision detection algorithm is robust enough to support smooth interaction with millions of polygons. To test the hypothesis and investigate the feasibility and acceptance of the proposed approach, a usability study of the tangible image interface was conducted. In addition, a potential application of image-inspired haptic interaction, a tangible online shopping interface, was investigated. The results show that users' attitudes towards the proposed ideas range from neutral to largely positive, leading to the conclusion that the hypothesis can be accepted in both cases. It can thus be concluded that the proposed image-inspired haptic interaction approach opens new possibilities for interacting with images and provides a way to enrich them with a touch interface and haptic feedback.
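The constraint-based rendering commonly used with desktop haptic devices (and with mesh collision detection of the kind the abstract describes) can be sketched as a proxy, or "god-object", scheme: the device tip may penetrate the surface, but a proxy point is held on the surface and the feedback force is a spring pulling the device toward it. The plane constraint and the stiffness value below are illustrative assumptions, not details from the thesis, which applies the idea to large reconstructed meshes.

```python
import numpy as np

# Proxy-based haptic force sketch, shown for a ground plane z = 0.
# The proxy stays on the constraint surface; the rendered force is a
# spring between the proxy and the (possibly penetrating) device tip.

STIFFNESS = 800.0  # N/m, an assumed value typical of desktop devices


def update_proxy(device_pos):
    # Project the device position back onto the constraint surface.
    proxy = device_pos.copy()
    if proxy[2] < 0.0:   # device has penetrated the plane
        proxy[2] = 0.0
    return proxy


def feedback_force(device_pos):
    proxy = update_proxy(device_pos)
    return STIFFNESS * (proxy - device_pos)


f = feedback_force(np.array([0.1, 0.2, -0.005]))
print(f)  # a force along +z, pushing the device out of the surface
```

For a mesh, the projection step is where collision detection does its work: the proxy must be moved over millions of triangles without snagging on reconstruction artifacts, which is the robustness problem the abstract highlights.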
URI: http://hdl.handle.net/10356/75884
DOI: 10.32657/10356/75884
Schools: School of Computer Science and Engineering 
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Theses

Files in This Item:
File: G1300010E_ZHANG XINGZI_thesis.pdf (Main article, 6.69 MB, Adobe PDF)

Page view(s): 425 (updated on Jun 19, 2024)
Download(s): 239 (updated on Jun 19, 2024)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.