Towards efficient 3D calibration for different types of multi-view autostereoscopic 3D displays

A novel and efficient 3D calibration method for different types of autostereoscopic multi-view 3D displays is presented in this paper. In our method, a camera is placed at different locations within the viewing volume of a 3D display to capture a series of images; each image corresponds to the subset of light rays emitted by the display that arrive at that camera position. Gray code patterns modulate the images shown on the 3D display, which significantly reduces the number of images the camera must capture and thereby accelerates the computation of the correspondence between pixels on the 3D display and the locations of the capturing camera. The proposed calibration method has been successfully tested on two different types of multi-view 3D displays and can easily be generalized to other such displays. The experimental results show that this calibration method can also improve image quality by reducing the crosstalk frequently observed when multiple users simultaneously view multi-view 3D displays from a range of viewing positions.
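
The Gray-code step described in the abstract is a structured-light encoding: rather than lighting display pixels one at a time, the display shows roughly log2(N) bit-plane patterns, and each camera position decodes which display column it is seeing. The following is a minimal sketch of that encode/decode idea, assuming a simple column-wise encoding in NumPy; the function names, the thresholded-capture input, and the sanity check are illustrative assumptions, not the authors' implementation, which additionally relates the decoded pixels to measured camera positions.

```python
import numpy as np

def gray_code_patterns(width):
    """Return (n_bits, width) Gray-code bit planes for `width` display columns.

    Showing these ceil(log2(width)) patterns (plus their inverses, in practice,
    for robust thresholding) replaces `width` single-column captures, which is
    what makes the calibration capture fast.
    """
    n_bits = int(np.ceil(np.log2(width)))
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)                                  # binary -> Gray code
    bits = (gray[None, :] >> np.arange(n_bits)[:, None]) & 1   # LSB plane first
    return bits[::-1]                                          # MSB plane first

def decode_columns(bit_stack):
    """Decode thresholded captures (n_bits, H, W) of 0/1 back to column indices."""
    gray = np.zeros(bit_stack.shape[1:], dtype=np.int64)
    for plane in bit_stack:                                    # MSB plane first
        gray = (gray << 1) | plane.astype(np.int64)
    binary, shift = gray.copy(), gray >> 1                     # Gray -> binary
    while shift.any():
        binary ^= shift
        shift >>= 1
    return binary

# Sanity check: encode 1920 columns, "capture" them perfectly, and decode.
patterns = gray_code_patterns(1920)                            # shape (11, 1920)
captured = patterns[:, None, :]                                # fake (n_bits, 1, W) capture
assert np.array_equal(decode_columns(captured), np.arange(1920)[None, :])
```

With this encoding, calibrating a 1920-column panel needs on the order of 11 captures per camera position instead of 1920, which is the source of the speed-up claimed in the abstract.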


Bibliographic Details
Main Authors: Xia, Xinxing, Guan, Yunqing, State, Andrei, Cham, Tat-Jen, Fuchs, Henry
Other Authors: School of Computer Science and Engineering
Format: Conference or Workshop Item
Language: English
Published: 2018
Subjects: Engineering::Computer science and engineering; Autostereoscopic Displays; 3D Calibration
Online Access:https://hdl.handle.net/10356/138273
Institution: Nanyang Technological University
Record ID: sg-ntu-dr.10356-138273
Conference: Computer Graphics International 2018 (CGI 2018)
Organisations: School of Computer Science and Engineering; Institute for Media Innovation (IMI)
Citation: Xia, X., Guan, Y., State, A., Cham, T.-J., & Fuchs, H. (2018). Towards efficient 3D calibration for different types of multi-view autostereoscopic 3D displays. Proceedings of Computer Graphics International 2018 (CGI 2018), 169-174. doi:10.1145/3208159.3208190
DOI: 10.1145/3208159.3208190
ISBN: 978-1-4503-6401-0
Scopus ID: 2-s2.0-85062854269
Pages: 169-174
Funding: NRF (National Research Foundation, Singapore)
Rights: © 2018 Association for Computing Machinery. All rights reserved.
Deposited: 2020-04-30
Building: NTU Library
Country: Singapore
Collection: DR-NTU