"Guest Editorial Focused Section on sensing and perception systems for intelligent manufacturing (SPIM)," IEEE/ASME Trans. Mechatronics (2018) 

[115] X. Chen, S. Zhang, and J. M. P. Geraedts, "Guest Editorial Focused Section on sensing and perception systems for intelligent manufacturing (SPIM)," IEEE/ASME Trans. Mechatronics, 23(3), 983-984 (2018); doi:10.1109/TMECH.2018.2837008

Abstract

This Focused Section provides a state-of-the-art update on research fronts in field sensing and perception as well as their applications to intelligent manufacturing problems. The topics covered in the collected papers include: optimization of field sensing networks and systems, intelligent sensing and perception for robotic manufacturing processes, SLAM of indoor environments, and model-guided measurement of various field effects. Problems related to field sensing and perception are challenging and spread across many areas of practice. To the best of our knowledge, this Focused Section is the first effort to collectively present research results related to sensing and perception of ‘field effects’ that exist in various manufacturing processes. We intend this Focused Section to serve as an efficient highlight of these challenging and important problems, to attract intensive research interest in them, and, arguably, to stake out an effective research area for field sensing and perception.

"Pixel-by-pixel absolute phase retrieval using three phase-shifted fringe patterns without markers," Opt. Laser Eng., (2017)

C. Jiang, B. Li, and S. Zhang, "Pixel-by-pixel absolute phase retrieval using three phase-shifted fringe patterns without markers," Opt. Laser Eng., 91, 232-241 (2017); doi:10.1016/j.optlaseng.2016.12.002

This paper presents a method that can recover absolute phase pixel by pixel without embedding markers in the three phase-shifted fringe patterns, acquiring additional images, or introducing additional hardware components. The proposed three-dimensional (3D) absolute shape measurement technique includes the following major steps: 1) segment the measured object into different regions using rough a priori knowledge of the surface geometry; 2) artificially create phase maps at different z planes using the geometric constraints of the structured light system; 3) unwrap the phase pixel by pixel for each region by properly referring to its artificially created phase map; and 4) merge the unwrapped phases from all regions into a complete absolute phase map for 3D reconstruction. We demonstrate that conventional three-step phase-shifted fringe patterns can be used to create an absolute phase map pixel by pixel even for objects with a large depth range. We have successfully implemented our proposed computational framework to achieve absolute 3D shape measurement at 40 Hz.
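
As a rough illustration of steps 3 and 4 (per-region, pixel-by-pixel unwrapping against artificially created reference phase maps, followed by a merge), the following Python/NumPy sketch uses hypothetical inputs (region_masks, reference_phase_maps) and one common geometric-constraint form of the fringe-order selection; it is not the paper's exact formulation.

    import numpy as np

    def unwrap_by_regions(phi_wrapped, region_masks, reference_phase_maps):
        """Steps 3-4 (illustrative): unwrap each segmented region against the
        artificial phase map created for its assumed z plane, then merge the
        regions into one absolute phase map."""
        phi_abs = np.full(phi_wrapped.shape, np.nan)
        for mask, phi_ref in zip(region_masks, reference_phase_maps):
            # Fringe order that lifts the wrapped phase above the reference plane
            # (an assumed common form, not necessarily the paper's exact rule).
            k = np.ceil((phi_ref - phi_wrapped) / (2.0 * np.pi))
            phi_abs[mask] = (phi_wrapped + 2.0 * np.pi * k)[mask]
        return phi_abs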

"Development of a mobile tool mark characterization/comparison system," J. Forensic Sci., (2017)

L. S. Chumbley, M. Morris, R. Spotts, and C. Macziewski, "Development of a mobile tool mark characterization/comparison system," J. Forensic Sci., 62(1), 83-91 (2017); doi: 10.1111/1556-4029.13233

Since the development of the striagraph, various attempts have been made to enhance forensic investigation through the use of measuring and imaging equipment. This study describes the development of a prototype system employing an easy-to-use software interface designed to provide forensic examiners with the ability to measure topography of a toolmarked surface and then conduct various comparisons using a statistical algorithm. Acquisition of the data is carried out using a portable 3D optical profilometer, and comparison of the resulting data files is made using software named “MANTIS” (Mark and Tool Inspection Suite). The system has been tested on laboratory-produced markings that include fully striated marks (e.g., screwdriver markings), quasistriated markings produced by shear-cut pliers, impression marks left by chisels, rifling marks on bullets, and cut marks produced by knives. Using the system, an examiner has the potential to (i) visually compare two toolmarked surfaces in a manner similar to a comparison microscope and (ii) use the quantitative information embedded within the acquired data to obtain an objective statistical comparison of the data files. This study shows that, based on the results from laboratory samples, the system has great potential for aiding examiners in conducting comparisons of toolmarks.

"Evaluation of pixel-wise geometric constraints based phase unwrapping method for low signal-to-noise-ratio (SNR) phase," Advanced Optical Technologies, (2016)

[91] Y. An, Z. Liu, and S. Zhang, "Evaluation of pixel-wise geometric constraints based phase unwrapping method for low signal-to-noise-ratio (SNR) phase," Advanced Optical Technologies, 5(5-6), 423-432 (2016); doi: 10.1515/aot-2016-0048

This paper evaluates the robustness of our recently proposed geometric constraints based phase unwrapping method for unwrapping low signal-to-noise ratio (SNR) phase. Instead of capturing additional images for absolute phase unwrapping, the new phase unwrapping algorithm uses the geometric constraints of the digital fringe projection (DFP) system to create a virtual reference phase map that unwraps the phase pixel by pixel. Both simulation and experimental results demonstrate that this new phase unwrapping method can successfully unwrap even low-SNR phase maps that bring difficulties for conventional multi-frequency phase unwrapping methods.

"Method for large-range structured light system calibration," Appl. Opt., (2016)

[91] Y. An, T. Bell, B. Li, J. Xu, and S. Zhang, "Method for large-range structured light system calibration," Appl. Opt., 55(33), 9563-9572 (2016); doi:10.1364/AO.55.009563

Structured light system calibration often requires a calibration target of a size similar to the field of view (FOV), which makes large-range structured light system calibration challenging because fabricating large calibration targets is difficult and expensive. This paper presents a large-range system calibration method that does not need a large calibration target. The proposed method includes two stages: 1) accurately calibrate the intrinsic parameters (i.e., focal lengths and principal points) at a near range where both the camera and projector are out of focus; and 2) calibrate the extrinsic parameters (translation and rotation) from camera to projector with the assistance of a low-accuracy, large-range 3D sensor (e.g., Microsoft Kinect). We have developed a large-scale 3D shape measurement system with a FOV of (1120 × 1900 × 1000) mm^3. Experiments demonstrate that our system can achieve a measurement accuracy as high as 0.07 mm with a standard deviation of 0.80 mm when measuring a 304.8 mm diameter sphere. By comparison, the Kinect V2 only achieved a mean error of 0.80 mm with a standard deviation of 3.41 mm over the same measurement FOV.
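
As a hedged sketch of stage 1 (intrinsic calibration at near range), the snippet below simply wraps OpenCV's standard calibrateCamera call; the function name calibrate_intrinsics and its inputs are illustrative, and the projector would be handled analogously by treating it as an inverse camera.

    import cv2

    def calibrate_intrinsics(object_points, image_points, image_size):
        """Stage 1 (illustrative): estimate focal lengths, principal point, and
        distortion from checkerboard views captured at near range, where a small
        target can fill the field of view.  Inputs follow OpenCV's
        calibrateCamera convention (per-view 3D corners, detected 2D corners,
        and the image size)."""
        rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
            object_points, image_points, image_size, None, None)
        return K, dist, rms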

"High-accuracy, high-speed 3D structured light imaging techniques and potential applications to intelligent robotics," Int. J. Intell. Robot. Applic. (2016)

[90] B. Li, Y. An, D. Cappelleri, J. Xu, and S. Zhang, "High-accuracy, high-speed 3D structured light imaging techniques and potential applications to intelligent robotics," Int. J. Intell. Robot. Applic., 1(1), 86-103 (2016).

Abstract

This paper presents some of the high-accuracy and high-speed structured light 3D imaging methods developed in the optical metrology community. These advanced 3D optical imaging technologies could substantially benefit the intelligent robotics community as another sensing tool. This paper mainly focuses on one particular 3D imaging technique, the digital fringe projection (DFP) method, because of its numerous advantageous features compared to other 3D optical imaging methods in terms of accuracy, resolution, speed, and flexibility. We discuss technologies that enable 3D data acquisition, reconstruction, and display at 30 Hz or higher with over 300,000 measurement points per frame. This paper intends to introduce DFP technologies to the intelligent robotics community and to offer our perspectives on potential applications where such sensing methods could be of value.
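
For readers outside the optical metrology community, the core of a DFP system is a phase-shifting step such as the standard three-step algorithm sketched below (a minimal NumPy sketch assuming phase shifts of -2π/3, 0, and +2π/3); the wrapped phase it returns is what the unwrapping and calibration machinery discussed in the paper turns into 3D coordinates.

    import numpy as np

    def three_step_phase(I1, I2, I3):
        """Standard three-step phase-shifting algorithm (illustrative).
        Assumes I_k = A + B*cos(phi + delta_k) with delta = -2*pi/3, 0, +2*pi/3;
        returns the wrapped phase phi in (-pi, pi]."""
        return np.arctan2(np.sqrt(3.0) * (I1 - I3), 2.0 * I2 - I1 - I3)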

"Motion induced error reduction by combining Fourier transform profilometry with phase-shifting profilometry," Opt. Express, (2016)

[88] B. Li, Z. Liu, and S. Zhang, "Motion induced error reduction by combining Fourier transform profilometry with phase-shifting profilometry," Opt. Express, 24(20), 23289-23303 (2016); doi: 10.1364/OE.24.023289

We propose a hybrid computational framework to reduce motion induced measurement error by combining the Fourier transform profilometry (FTP) and phase-shifting profilometry (PSP). The proposed method is composed of three major steps: Step 1 is to extract continuous relative phase maps for each isolated object with single-shot FTP method and spatial phase unwrapping; Step 2 is to obtain an absolute phase map of the entire scene using PSP method, albeit motion induced errors exist on the extracted absolute phase map; and Step 3 is to shift the continuous relative phase maps from Step 1 to generate final absolute phase maps for each isolated object by referring to the absolute phase map with error from Step 2. Experiments demonstrate the success of the proposed computational framework for measuring multiple isolated rapidly moving objects.
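
Step 1 relies on single-shot FTP; as a minimal sketch (with hypothetical parameters carrier_col and half_width locating the carrier lobe, and ignoring windowing and filtering refinements), the wrapped phase can be extracted roughly as follows:

    import numpy as np

    def ftp_wrapped_phase(fringe, carrier_col, half_width):
        """Single-shot Fourier transform profilometry (illustrative): isolate
        the +1 carrier lobe in the 2D spectrum and take the angle of the
        inverse transform to obtain the wrapped phase."""
        F = np.fft.fftshift(np.fft.fft2(fringe))
        mask = np.zeros_like(F)
        mask[:, carrier_col - half_width:carrier_col + half_width] = 1.0
        analytic = np.fft.ifft2(np.fft.ifftshift(F * mask))
        return np.angle(analytic)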

"Pixel-wise absolute phase unwrapping using geometric constraints of structured light system," Opt. Express, (2016)

[87] Y. An, J.-S. Hyun, and S. Zhang, "Pixel-wise absolute phase unwrapping using geometric constraints of structured light system," Opt. Express, 24(15), 18445-18459 (2016); doi: 10.1364/OE.24.018445

This paper presents a method to unwrap phase pixel by pixel by solely using geometric constraints of the structured light system without requiring additional image acquisition or another camera. Specifically, an artificial absolute phase map, Φ_{min}, at a given virtual depth plane z = z_{min}, is created from geometric constraints of the calibrated structured light system; the wrapped phase is pixel-by-pixel unwrapped by referring to Φ_{min}. Since Φ_{min} is defined in the projector space, the unwrapped phase obtained from this method is absolute for each pixel. Experimental results demonstrate the success of this proposed novel absolute phase unwrapping method.
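
A minimal sketch of the idea, assuming 3x4 camera and projector projection matrices from calibration (P_cam and P_prj are hypothetical names) and fringes varying along projector columns: the camera ray of a pixel is intersected with the plane z = z_min, the intersection is projected into the projector to obtain Φ_min, and the wrapped phase is then lifted by an integer number of periods. The ceil-based fringe-order rule is one common form and may differ in detail from the paper.

    import numpy as np

    def phi_min_at_pixel(u, v, z_min, P_cam, P_prj, fringe_period):
        """Artificial minimum phase for one camera pixel (illustrative)."""
        p, q = P_cam, P_prj
        # Intersect the camera ray of pixel (u, v) with the plane z = z_min.
        A = np.array([[p[0, 0] - u * p[2, 0], p[0, 1] - u * p[2, 1]],
                      [p[1, 0] - v * p[2, 0], p[1, 1] - v * p[2, 1]]])
        b = np.array([u * (p[2, 2] * z_min + p[2, 3]) - (p[0, 2] * z_min + p[0, 3]),
                      v * (p[2, 2] * z_min + p[2, 3]) - (p[1, 2] * z_min + p[1, 3])])
        x, y = np.linalg.solve(A, b)
        xw = np.array([x, y, z_min, 1.0])
        u_p = (q[0] @ xw) / (q[2] @ xw)            # projector column
        return 2.0 * np.pi * u_p / fringe_period   # phase of the z_min plane

    def pixelwise_unwrap(phi_wrapped, phi_min):
        """Lift the wrapped phase above Phi_min by whole periods, pixel by pixel."""
        k = np.ceil((phi_min - phi_wrapped) / (2.0 * np.pi))
        return phi_wrapped + 2.0 * np.pi * k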

“Fast registration methodology for fastener assembly of large-scale structure,” IEEE Trans Industrial Electronics (2017)

[86] J. Xu, R. Chen, H. Chen, S. Zhang, and K. Chen, "Fast registration methodology for fastener assembly of large-scale structure," IEEE Trans. Industrial Electronics, 64(1), 717-726 (2017); doi:10.1109/TIE.2016.2599140

Abstract

Fastener assembly is tedious and time-consuming because operators have to check assembly manuals and find the right fastener for each hole. Hence, this article develops a 3D projection system that projects assembly instructions directly onto the workpiece surface to guide operators during assembly. However, to project the instructions accurately, the part of the CAD model corresponding to the scanned physical area must be identified through rapid and accurate registration. To achieve this goal, first, a high-accuracy and rapid 3D measurement system is developed; second, a fast registration method based on a local multi-scale geometric feature vector is proposed to accelerate registration and improve its reliability. Experimental results demonstrate the measurement accuracy of the developed system and verify the feasibility of the proposed registration method. The proposed method can therefore improve assembly efficiency and decrease the probability of error, contributing to large-scale structure assembly.
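
Once feature correspondences between the scan and the CAD model are found, the alignment itself reduces to a least-squares rigid transform; the sketch below shows only that standard SVD (Kabsch/Procrustes) step and is not the paper's full multi-scale feature pipeline.

    import numpy as np

    def rigid_transform(src, dst):
        """Least-squares rigid transform (R, t) mapping matched points src -> dst,
        both given as (N, 3) arrays (illustrative Kabsch/Procrustes step)."""
        c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
        H = (src - c_src).T @ (dst - c_dst)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:      # guard against a reflection solution
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = c_dst - R @ c_src
        return R, t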

"High-resolution, real-time simultaneous 3D surface geometry and temperature measurement," Opt. Express, (2016);

[85] Y. An and S. Zhang, "High-resolution, real-time simultaneous 3D surface geometry and temperature measurement," Opt. Express, 24(13), 14552-14563 (2016); doi: 10.1364/OE.24.014552

This paper presents a method to simultaneously measure three-dimensional (3D) surface geometry and temperature in real time. Specifically, we developed 1) a holistic approach to calibrate both a structured light system and a thermal camera under exactly the same world coordinate system, even though these two sensors do not share the same wavelength; and 2) a computational framework to determine the sub-pixel corresponding temperature for each 3D point and to discard occluded points. Since the 2D thermal imaging and 3D visible-light imaging systems do not share the same spectrum of light, they can perform sensing simultaneously in real time: we developed a hardware system that achieves real-time 3D geometry and temperature measurement at 26 Hz with 768 × 960 points per frame.
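
The second contribution can be pictured as projecting each reconstructed 3D point into the thermal camera and sampling the temperature image at the resulting sub-pixel location; the sketch below (hypothetical variable names, occlusion and bounds checks omitted) illustrates that mapping under the assumption of a pinhole thermal camera calibrated in the same world frame.

    import numpy as np

    def sample_temperature(points_xyz, K_th, R_th, t_th, thermal_image):
        """Project (N, 3) world points into the thermal camera (intrinsics K_th,
        extrinsics R_th, t_th) and bilinearly sample the temperature image at
        the sub-pixel projections (illustrative; occlusion handling omitted)."""
        pc = R_th @ points_xyz.T + t_th.reshape(3, 1)     # camera coordinates
        uvw = K_th @ pc
        u, v = uvw[0] / uvw[2], uvw[1] / uvw[2]
        u0, v0 = np.floor(u).astype(int), np.floor(v).astype(int)
        du, dv = u - u0, v - v0
        T = thermal_image
        return ((1 - du) * (1 - dv) * T[v0, u0] + du * (1 - dv) * T[v0, u0 + 1]
                + (1 - du) * dv * T[v0 + 1, u0] + du * dv * T[v0 + 1, u0 + 1])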

"Microscopic structured light 3D profilometry: binary defocusing technique VS sinusoidal fringe projection," Opt. Laser Eng., (2016)

[92] B. Li and S. Zhang, "Microscopic structured light 3D profilometry: binary defocusing technique vs. sinusoidal fringe projection," Opt. Laser Eng., 96, 117-123 (2017); doi: 10.1016/j.optlaseng.2016.06.009

Abstract

This paper compares the binary defocusing technique with conventional sinusoidal fringe projection under two different 3D microscopic profilometry systems: 1) both the camera and projector use telecentric lenses, and 2) only the camera uses a telecentric lens. Our simulations and experiments found that the binary defocusing technique is superior to the traditional sinusoidal fringe projection method, improving the measurement resolution by approximately 19%. Finally, by taking advantage of the speed of the binary defocusing technique, we present a high-speed (500 Hz), high-resolution (1600 × 1200) 3D microscopic profilometry system whose speed could be further increased to the kHz range.
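
The binary defocusing idea itself is easy to picture: a 1-bit square-wave pattern becomes quasi-sinusoidal once the projector lens is defocused. The toy sketch below emulates defocus with a Gaussian blur (arbitrary example parameters); it is only an illustration of the principle, not of the microscopic systems compared in the paper.

    import numpy as np
    from scipy.ndimage import gaussian_filter1d

    def binary_defocused_profile(width=1024, period=32, blur_sigma=6.0):
        """Generate one row of a 1-bit square-wave pattern and emulate projector
        defocus with a Gaussian blur, yielding a quasi-sinusoidal profile."""
        x = np.arange(width)
        binary = (np.sin(2.0 * np.pi * x / period) >= 0).astype(float)
        return gaussian_filter1d(binary, blur_sigma)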

"Single-shot absolute 3D shape measurement with Fourier transform profilometry", Appl. Opt., (2016)

[83] B. Li, Y. An and S. Zhang, "Single-shot absolute 3D shape measurement with Fourier transform profilometry," Appl. Opt., 2016; (accepted)

Abstract

Fourier transform profilometry (FTP) is one of the most frequently adopted three-dimensional (3D) shape measurement methods owing to its single-shot nature of 3D shape recovery, yet it is challenging to retrieve the absolute phase map solely from a single grayscale fringe image. This paper presents a computational framework that overcomes this limitation of FTP with digital fringe projection (DFP). By using geometric constraints, an absolute phase map can be retrieved point by point from a single grayscale fringe image. Experiments demonstrate the success of our proposed framework in achieving single-shot absolute 3D shape measurement.

"Enhanced two-frequency phase-shifting method," Appl. Opt. (2016)

[82] J.-S. Hyun and S. Zhang, "Enhanced two-frequency phase-shifting method," Appl. Opt., 55(16), 4395-4401 (2016); doi: 10.1364/AO.55.004395

Abstract

One of the major challenges of employing a two-frequency (or two-wavelength) phase-shifting algorithm for absolute three-dimensional (3D) shape measurement is its sensitivity to noise. Therefore, three- or more-frequency phase-shifting algorithms are often used in lieu of a two-frequency phase-shifting algorithm for applications where the noise is severe. This paper proposes a method that uses the geometric constraints of the digital fringe projection (DFP) system to substantially reduce the noise impact by allowing the use of more than one period of the equivalent phase map for temporal phase unwrapping. Experiments successfully verified the enhanced performance of the proposed method without increasing the number of patterns.
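
For context, the classical two-frequency scheme that the paper improves upon unwraps the fine phase with the beat (equivalent) phase of two fringe wavelengths; the rounding step below is what amplifies noise when the equivalent wavelength must span the whole depth range, which is the sensitivity the proposed geometric-constraint method relaxes. The sketch assumes lam1 > lam2 and a measurement range within one equivalent wavelength; it is background, not the proposed algorithm.

    import numpy as np

    def two_frequency_unwrap(phi1, phi2, lam1, lam2):
        """Classical two-frequency temporal unwrapping (illustrative).
        phi1/phi2 are wrapped phases of the coarse (lam1) and fine (lam2)
        fringe wavelengths; the beat phase selects the fringe order of phi2."""
        phi_eq = np.mod(phi2 - phi1, 2.0 * np.pi)           # equivalent (beat) phase
        lam_eq = lam1 * lam2 / (lam1 - lam2)                # equivalent wavelength
        k = np.round((phi_eq * lam_eq / lam2 - phi2) / (2.0 * np.pi))
        return phi2 + 2.0 * np.pi * k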

"High quality 3D shape measurement using saturated fringe patterns,'' Opt. Laser Eng. (2016)

[81] B. Chen and S. Zhang, "High quality 3D shape measurement using saturated fringe patterns," Opt. Laser Eng. (2016); doi:10.1016/j.optlaseng.2016.04.012

Abstract

This paper proposes a method to potentially conquer one of the challenges in the optical metrology community: optically measuring three-dimensional (3D) objects with high surface contrast. We discover that, for digital, equally phase-shifted fringe patterns, if the fringe period P is an even number, an N-step algorithm with N = (P/2) × k (k = 1, 2, 3, ...) can accurately recover the phase even if the fringe patterns are saturated; and if P is an odd number, an N-step algorithm with N = P × k can do so as well. This finding leads to a novel method to optically measure shiny surfaces, where the saturation due to surface shininess can be substantially alleviated. Both simulations and experiments successfully verified the proposed method.
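
The finding concerns the standard N-step least-squares phase estimator applied to clipped (saturated) fringe images; the sketch below shows that estimator (assuming equally spaced phase shifts of 2πn/N), which, per the paper, remains accurate when N is chosen as (P/2)k for an even fringe period P or Pk for an odd one.

    import numpy as np

    def n_step_phase(images):
        """Standard N-step least-squares wrapped phase from a list of equally
        phase-shifted fringe images I_n = A + B*cos(phi + 2*pi*n/N)
        (illustrative; N is simply the number of images supplied)."""
        N = len(images)
        s = sum(I * np.sin(2.0 * np.pi * n / N) for n, I in enumerate(images))
        c = sum(I * np.cos(2.0 * np.pi * n / N) for n, I in enumerate(images))
        return np.arctan2(-s, c)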

"High dynamic range real-time 3D shape measurement," Opt. Express, (2016)

[80] C. Jiang, T. Bell, and S. Zhang, "High dynamic range real-time 3D shape measurement," Opt. Express, 24(7), 7337-7346 (2016) (cover feature); doi: 10.1364/OE.24.00733

Abstract

This paper proposes a method that can measure high-contrast surfaces in real time without changing camera exposures. We propose to use 180-degree phase-shifted (or inverted) fringe patterns to complement the regular fringe patterns. If not all of the regular patterns are saturated, the inverted fringe patterns are used in lieu of the saturated regular patterns for phase retrieval; if all of the regular fringe patterns are saturated, both the original and inverted fringe patterns are used together for phase computation to reduce phase error. Experimental results demonstrate that three-dimensional (3D) shape measurement can be achieved in real time by adopting the proposed high dynamic range method.
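
The per-pixel selection behind the method can be sketched as follows (hypothetical function, saturation threshold as an example value): wherever a regular fringe image is saturated, the corresponding inverted image is substituted, with the 180-degree shift of those samples accounted for later in the phase computation.

    import numpy as np

    def fuse_regular_inverted(regular, inverted, sat_level=255):
        """Per-pixel substitution of saturated samples (illustrative).
        `regular` and `inverted` are matching lists of phase-shifted fringe
        images; substituted samples carry a pi phase offset that the
        subsequent phase retrieval must account for."""
        fused = []
        for I, I_inv in zip(regular, inverted):
            fused.append(np.where(I >= sat_level, I_inv, I))
        return fused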

“Method for out-of-focus camera calibration,” Appl. Opt., (2016)

[79] T. Bell, J. Xu, and S. Zhang, "Method for out-of-focus camera calibration," Appl. Opt., 55(9), 2346-2352 (2016); doi: 10.1364/AO.55.002346

Abstract

State-of-the-art camera calibration methods assume that the camera is at least nearly in focus and thus fail if the camera is substantially defocused. This paper presents a method that enables accurate calibration of an out-of-focus camera. Specifically, the proposed method uses a digital display (e.g., a liquid crystal display monitor) to generate fringe patterns that encode feature points into the carrier phase; these feature points can be accurately recovered even if the fringe patterns are substantially blurred (i.e., the camera is substantially defocused). Experiments demonstrate that the proposed method can accurately calibrate a camera regardless of the amount of defocusing: the calibrated focal length differs by approximately 0.2% between the in-focus and substantially defocused cases.
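
The reason phase-encoded feature points survive defocus is that a symmetric blur attenuates a sinusoidal fringe but does not shift its phase. The short numerical check below illustrates that property with arbitrary example values; it is not the calibration procedure itself.

    import numpy as np
    from scipy.ndimage import gaussian_filter1d

    # A blurred sinusoid keeps its carrier phase: encode a phase of 1.0 rad,
    # blur heavily, and read the phase back from the carrier bin of the spectrum.
    n, period = 512, 32.0
    x = np.arange(n)
    fringe = 0.5 + 0.5 * np.cos(2.0 * np.pi * x / period + 1.0)
    blurred = gaussian_filter1d(fringe, sigma=10.0)          # emulate defocus
    spectrum = np.fft.rfft(blurred - blurred.mean())
    carrier_bin = int(round(n / period))
    print(np.angle(spectrum[carrier_bin]))                   # approximately 1.0 rad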
