Pixel-wise structured light calibration method with a color calibration target (2022)

Abstract

We propose to use a calibration target with a narrow spectral color range for the background (e.g., blue) and a broader spectral color range for the feature points (e.g., blue + red circles), along with fringe patterns matching the background color for accurate phase extraction. Since the captured fringe patterns are not affected by the high contrast of the calibration target, phase information can be accurately extracted without edge artifacts. The feature points can be clearly “seen” by the camera when the ambient light matches the feature color or excludes the background color. For each calibration pose, we determine the three-dimensional coordinates of each pixel and then establish a pixel-wise relationship between coordinate and phase. Compared with our previously published method, this method fundamentally simplifies and improves the algorithm by eliminating the computational framework that estimates a smooth phase near high-contrast feature edges. Experimental results demonstrated the success of our proposed calibration method.
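The abstract does not spell out the form of the pixel-wise coordinate-phase relationship; a common choice is a low-order polynomial fitted independently at every camera pixel, mapping absolute phase to a coordinate such as z. The sketch below (numpy, hypothetical function names, z only) illustrates that idea under this assumption rather than the authors' exact formulation:

import numpy as np

def fit_pixelwise_phase_to_z(phases, zs, order=3):
    # phases, zs: (num_poses, H, W) absolute phase and z coordinate per calibration pose.
    # Returns per-pixel polynomial coefficients of shape (order + 1, H, W).
    num_poses, H, W = phases.shape
    coeffs = np.empty((order + 1, H, W))
    for i in range(H):
        for j in range(W):
            # Independent least-squares fit for this single pixel across all poses.
            coeffs[:, i, j] = np.polyfit(phases[:, i, j], zs[:, i, j], order)
    return coeffs

def phase_to_z(coeffs, phase_map):
    # Evaluate the per-pixel polynomial on a new absolute phase map.
    order = coeffs.shape[0] - 1
    z = np.zeros_like(phase_map, dtype=float)
    for p in range(order + 1):
        # np.polyfit stores the highest-degree coefficient first.
        z += coeffs[p] * phase_map ** (order - p)
    return z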

Digital image correlation assisted absolute phase unwrapping (2022)

Y-H. Liao, M. Xu, and S. Zhang,  "Digital image correlation assisted absolute phase unwrapping," Optics Express 30(18), 33022-33034 (2022); doi: 10.1364/OE.470704

Abstract

This paper presents an absolute phase unwrapping method for high-speed three-dimensional (3D) shape measurement. The method uses three phase-shifted patterns and one binary random pattern on a single-camera, single-projector structured light system. We calculate the wrapped phase from the phase-shifted images and determine the coarse correspondence through digital image correlation (DIC) between the captured binary random pattern of the object and the pre-captured binary random pattern of a flat surface. We then develop a computational framework to determine the fringe order pixel by pixel using the coarse correspondence information. Since only one additional pattern is required, the proposed method is suitable for high-speed 3D shape measurement. Experimental results successfully demonstrated that the proposed method can achieve high-speed and high-quality measurement of complex scenes.
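As a concrete illustration of the two core steps, the sketch below assumes a standard three-step phase-shifting algorithm with phase shifts of -2π/3, 0, +2π/3 and a coarse absolute phase derived from the DIC correspondence against the flat-surface reference; the function names are hypothetical and the paper's exact formulation may differ:

import numpy as np

def wrapped_phase(I1, I2, I3):
    # Wrapped phase in (-pi, pi] from three fringe images shifted by 2*pi/3.
    I1, I2, I3 = (np.asarray(I, dtype=float) for I in (I1, I2, I3))
    return np.arctan2(np.sqrt(3.0) * (I1 - I3), 2.0 * I2 - I1 - I3)

def unwrap_with_coarse(phi_wrapped, phi_coarse):
    # phi_coarse: coarse absolute phase estimated from the DIC correspondence;
    # it only needs to be accurate to within half a fringe period.
    k = np.round((phi_coarse - phi_wrapped) / (2.0 * np.pi))  # pixel-wise fringe order
    return phi_wrapped + 2.0 * np.pi * k                      # absolute phase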

High-speed 3D optical sensing and information processing for automotive industry (2021)

S. Zhang, "High-speed 3d optical sensing and information processing for automotive industry," SAE International Journal of Advances and Current Practices in Mobility, 2021-01-030 (2021); doi:10.4271/2021-01-0303. (Selected as one of the best papers for 2021 WCX)

Abstract

This paper explains the basic principles behind two platform technologies that my research team has developed in the field of optical metrology and optical information processing: 1) high-speed 3D optical sensing; and 2) real-time 3D video compression and streaming. This paper will discuss how such platform technologies could benefit the automotive industry, including in-situ quality control for additive manufacturing and autonomous vehicle systems. We will also discuss some other applications that we have been working on, such as crime scene capture in forensics.

Calibration method for an extended depth-of-field microscopic structured light system (2022)

L. Chen, X. Hu, and S. Zhang, "Calibration method for an extended depth-of-field microscopic structured light system," Optics Express, 30(1), 166-178 (2022); doi: 10.1364/OE.448019

Abstract

This paper presents a calibration method for a microscopic structured light system with an extended depth of field (DOF). We first employed the focal sweep technique to achieve a large enough depth measurement range, and then developed a computational framework to alleviate the impact of phase errors caused by the standard off-the-shelf calibration target (black circles on a white background). Specifically, we developed a polynomial interpolation algorithm to correct phase errors near the black circles and obtain more accurate phase maps for projector feature point determination. Experimental results indicate that the proposed method can achieve a measurement accuracy of approximately 1.0 𝜇m for a measurement volume of approximately 2,500 𝜇m (W) × 2,000 𝜇m (H) × 500 𝜇m (D).
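The abstract does not give the interpolation details; one simple reading is to fit a low-order 2D polynomial surface to the reliable phase of the smooth calibration plane and replace the values inside the circle regions. The sketch below (numpy, hypothetical names and parameters) illustrates that assumption:

import numpy as np

def correct_phase_near_circles(phase, circle_mask, order=2):
    # phase: (H, W) phase map of the calibration plane; circle_mask: True where
    # the phase is unreliable (on or near a black circle).
    H, W = phase.shape
    yy, xx = np.mgrid[0:H, 0:W]
    xn, yn = xx / max(W - 1, 1), yy / max(H - 1, 1)  # normalize for conditioning
    # Design matrix of monomials x^i * y^j with i + j <= order.
    terms = [(xn ** i) * (yn ** j) for i in range(order + 1)
             for j in range(order + 1 - i)]
    A = np.stack([t.ravel() for t in terms], axis=1)
    valid = ~circle_mask.ravel()
    coef, *_ = np.linalg.lstsq(A[valid], phase.ravel()[valid], rcond=None)
    fitted = (A @ coef).reshape(H, W)
    corrected = phase.copy()
    corrected[circle_mask] = fitted[circle_mask]  # fill only the unreliable pixels
    return corrected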

"Comparative study on 3D optical sensors for short range applications," Optics and Lasers in Engineering (2022)

R. Chen, J. Xu, and S. Zhang, "Comparative study on 3D optical sensors for short range applications," Optics and Lasers in Engineering 149, 106763 (2022); doi: 10.1016/j.optlaseng.2021.106763

Abstract

The increasing availability of commercial 3D optical sensors drastically benefits application fields, including mechatronics, where affordable sensing for perception and control is vital. Yet, to our knowledge, there is no thorough comparative study of state-of-the-art 3D optical sensors, making it difficult for users to select one for their specific applications. This paper evaluates the performance of each sensor for short range applications (i.e., ≤ 1 m). Specifically, we present our findings on the measurement accuracy of each sensor under "ideal" situations, and compare the influence of various lighting conditions, object surface properties (e.g., transparency, shininess, contrast), and object locations. In addition, we developed software APIs and user instructions that are available for the community to easily use each of the evaluated commercially available 3D optical sensors.
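The abstract does not state the exact accuracy metric; a common choice for short-range sensors is the RMS residual of a least-squares plane fit to a captured flat target. The sketch below illustrates that metric only as an assumption, not necessarily the protocol used in the paper:

import numpy as np

def plane_fit_rms(points):
    # points: (N, 3) array of x, y, z from a scan of a flat reference target.
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    normal = vt[-1]                      # direction of least variance = plane normal
    dist = (points - centroid) @ normal  # signed point-to-plane distances
    return np.sqrt(np.mean(dist ** 2))   # RMS flatness error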