Title: Towards efficient 3D calibration for different types of multi-view autostereoscopic 3D displays
Authors: Xia, Xinxing
Keywords: Engineering::Computer science and engineering
Issue Date: 2018
Source: Xia, X., Guan, Y., State, A., Cham, T.-J., & Fuchs, H. (2018). Towards efficient 3D calibration for different types of multi-view autostereoscopic 3D displays. Proceedings of Computer Graphics International 2018 (CGI 2018), 169-174. doi:10.1145/3208159.3208190
Abstract: A novel and efficient 3D calibration method for different types of autostereoscopic multi-view 3D displays is presented in this paper. In our method, a camera is placed at different locations within the viewing volume of a 3D display to capture a series of images corresponding to the subset of light rays that the display emits toward each camera position. Gray code patterns modulate the images shown on the 3D display, significantly reducing the number of images the camera must capture and thereby accelerating the computation of the correspondence between display pixels and camera locations. The proposed calibration method has been successfully tested on two different types of multi-view 3D displays and can be easily generalized to other such displays. The experimental results show that this calibration method can also improve image quality by reducing the crosstalk frequently observed when multiple users simultaneously view multi-view 3D displays from a range of viewing positions.
URI: https://hdl.handle.net/10356/138273
ISBN: 978-1-4503-6401-0
DOI: 10.1145/3208159.3208190
Rights: © 2018 Association for Computing Machinery. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
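The abstract's key efficiency claim — that Gray code patterns reduce the number of captures needed to establish pixel-to-camera correspondences — follows from standard binary-reflected Gray coding: N display columns can be distinguished with only ceil(log2 N) on/off patterns instead of lighting each column individually. The sketch below (not the authors' code; all function names are hypothetical illustrations) shows the encode/decode round trip, assuming one camera observation per pattern:

```python
import math

def gray_encode(n):
    """Binary-reflected Gray code of integer n."""
    return n ^ (n >> 1)

def gray_decode(g):
    """Invert the Gray code back to a plain binary index."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

def patterns_for(num_columns):
    """Build ceil(log2(num_columns)) on/off patterns: pattern[b][c] is True
    when bit b of the Gray code of display column c is lit."""
    bits = max(1, math.ceil(math.log2(num_columns)))
    return [[(gray_encode(c) >> b) & 1 == 1 for c in range(num_columns)]
            for b in range(bits)]

def decode_observation(bits_seen):
    """Recover a display-column index from the on/off sequence a camera
    pixel observed across the patterns (bit 0 first)."""
    g = sum(1 << b for b, on in enumerate(bits_seen) if on)
    return gray_decode(g)
```

For example, a display with 1024 addressable columns needs only 10 pattern captures per camera position; each camera pixel's observed on/off sequence decodes directly to the display column it sees, which is the correspondence the calibration needs. Gray codes (rather than plain binary) are the usual choice in structured light because adjacent columns differ in exactly one bit, limiting the impact of a single misclassified observation.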
|Appears in Collections:||IMI Conference Papers|
Updated on May 10, 2021
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.