Preprint (Review). This version is not peer-reviewed; a peer-reviewed article of this preprint also exists.

Optical Fringe Projection Driving 3D Metrology Uncomplicated

Submitted: 02 January 2025. Posted: 04 January 2025.


Abstract

Optical fringe projection is an outstanding technology that significantly enhances three-dimensional (3D) metrology in numerous applications in science and engineering. Although the complexity of fringe projection systems may be overwhelming, current scientific advances bring improved models and methods that simplify the design and calibration of these systems, making 3D metrology uncomplicated. This paper provides an overview of the fundamentals of fringe projection profilometry, including imaging, stereo systems, phase demodulation, triangulation, and calibration. Some applications are described to highlight the usefulness and accuracy of modern optical fringe projection profilometers, impacting 3D metrology in different fields of science and engineering.


1. Introduction

Optical fringe projection is an essential technology driving important applications in science, engineering, medicine, entertainment, and many other fields [1,2,3]. This technology, also known as fringe projection profilometry, is a contactless metrological tool suitable for accurate and high-resolution 3D object reconstruction [4,5]. Moreover, modern digital cameras, projectors, and computers enable fringe projection systems to operate at high rates [6,7,8], allowing the sensing of fast shape changes in dynamic phenomena [9,10]. The development of portable profilometers [11] has expanded the usefulness of fringe projection systems for in-situ studies in archaeological [12] and forensic applications [13].
Despite progress in hardware and theoretical models, fringe projection profilometry is an active research area within the scientific community worldwide [14,15]. Researchers are working to overcome new challenges in fringe projection technology [16]. Current investigations include miniaturization of electronic devices [17], multimodal imaging [18,19], self-calibration [20,21], setup optimization for full-view object reconstruction [22,23,24], dynamic range improvement for reconstructing reflective objects [25,26], and enabling automatic and adaptive operation [27,28,29]. For this reason, the specialization of students and professionals in this research area is crucial.
Inexperienced readers of fringe projection profilometry may feel overwhelmed by the abundance of specialized literature available today [30,31]. This technology encompasses various topics, from phase demodulation to triangulation and calibration, with several books and scientific articles dedicated to each subject [32,33]. Months of frustrating literature review can pass without acquiring enough background to construct and operate a profilometer. This paper presents essential concepts and helpful insights into the working principles of optical fringe projection. Elementary models for setting up and operating a 3D profilometer are provided, and illustrative applications of this technology are described. This study offers a concise guide through the fundamentals of fringe projection profilometry, making this technology an uncomplicated tool for driving 3D metrology toward new frontiers.

2. Theoretical Preliminaries

2.1. Camera Imaging

A digital camera is a sophisticated device consisting mainly of a compound lens and a photosensitive sensor [34]. Cameras are designed to collect light rays traveling from the 3D space and record their intensities. The resultant intensity map is the camera output, known as the image. Although the physical imaging process is a complicated phenomenon, it can be modeled with a good approximation using the pinhole model [35,36]. Let p be the vector of a point in the 3D space, and let μ be the pixel where p was registered by the camera, as shown in Figure 1(a). Considering the pinhole model, the vectors p and μ are related as
$$\boldsymbol{\mu} = \mathcal{H}^{-1}\big[\, C\, \mathcal{H}[\mathbf{p}] \,\big], \tag{1}$$
where $\mathcal{H}$ is the homogeneous coordinate operator [37], $[\,\cdot\,]^{-1}$ denotes the inverse, and
$$C = K\,[\,R^{T},\; -R^{T}\mathbf{t}\,] \tag{2}$$
is a $3 \times 4$ matrix, known as the camera matrix, $K$ is the intrinsic parameter matrix (non-singular upper triangular), $R$ is a rotation matrix defining the camera orientation, $[\,\cdot\,]^{T}$ denotes the transpose, and $\mathbf{t}$ is a translation vector specifying the camera position.

2.2. Cameras as Direction Sensors

Although digital cameras are well-known for their ability to capture photographs, optical fringe projection systems employ cameras for an additional purpose. Note that a digital image is an array of pixels, and each pixel detects a light ray coming from the 3D space in a specific direction, as shown in Figure 1(b). Thus, every pixel is associated with a particular light ray that has a unique direction. In this context, a helpful insight is that cameras are direction sensors [38]. A more formal analysis is performed by reversing the imaging process to return a space point p when the pixel μ is given. For this, Eqs. (1) and (2) can be rewritten as
$$\lambda\,\mathcal{H}[\boldsymbol{\mu}] = K\,[\,R^{T},\; -R^{T}\mathbf{t}\,]\begin{bmatrix}\mathbf{p}\\ 1\end{bmatrix}, \tag{3}$$
where $\lambda \neq 0$ is an arbitrary real-valued variable. After a few algebraic manipulations of Eq. (3), the following equation is reached:
$$\mathbf{p} = \mathbf{t} + \lambda\,\mathbf{d}, \tag{4}$$
which describes a line in the 3D space passing through the camera position t , with direction determined by the pixel μ as
$$\mathbf{d} = R\, K^{-1}\, \mathcal{H}[\boldsymbol{\mu}]. \tag{5}$$
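The back-projection described above, with ray direction $\mathbf{d} = R\,K^{-1}\,\mathcal{H}[\boldsymbol{\mu}]$, can be sketched in a few lines of NumPy; the camera parameters below are illustrative values, not those of any particular device:

```python
import numpy as np

def backproject(mu, K, R, t):
    """Return the ray p(lam) = t + lam * d captured at pixel mu,
    with direction d = R K^{-1} H[mu] (pinhole model)."""
    mu_h = np.array([mu[0], mu[1], 1.0])     # homogeneous pixel H[mu]
    d = R @ np.linalg.inv(K) @ mu_h          # ray direction in world frame
    return t, d

# Illustrative (assumed) parameters of a generic camera:
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)        # camera aligned with the world axes
t = np.zeros(3)      # camera placed at the origin
_, d = backproject((320.0, 240.0), K, R, t)  # principal point -> optical axis
```

For the principal point, the recovered direction is the optical axis, which serves as a quick sanity check of the model.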

2.3. Stereo Camera Systems and Triangulation

How can direction sensors be employed for 3D metrology? Consider two cameras forming a stereo system capturing an object from two different viewpoints, as shown in Figure 2(a). Let us assume the parameters of both cameras are known, say K 1 , R 1 , and t 1 for the first camera and K 2 , R 2 , and t 2 for the second camera. Let p be a point on the object surface captured by the two cameras at the pixels μ 1 and μ 2 , respectively. The lines representing the captured light rays can be reproduced using the available camera parameters and the given pixel points, as shown in Figure 2(b), namely
$$\mathbf{p} = \mathbf{t}_1 + \lambda_1\,\mathbf{d}_1, \quad \text{(Line 1)} \tag{6}$$
$$\mathbf{p} = \mathbf{t}_2 + \lambda_2\,\mathbf{d}_2. \quad \text{(Line 2)} \tag{7}$$
This reasoning leads to computing the captured point $\mathbf{p}$ as the intersection of line 1 and line 2. This computation is generically known as triangulation, although the term also comprises more general constructions for determining points, such as the intersection of multiple lines, planes, and other geometrical objects, even in combination [39].
Figure 2. (a) Two cameras capturing a point p from different viewpoints. (b) The captured point is determined as the intersection of the lines defined by μ 1 and μ 2 . (c) Experimental noise may cause skewed lines; therefore, p is determined as the mean point between the solution points p 1 and p 2 .
In particular, triangulation in a stereo system can be performed as follows. Since $\mathbf{p}$ represents a common point of the intersecting lines, Eqs. (6) and (7) can be equated as $\mathbf{t}_1 + \lambda_1\mathbf{d}_1 = \mathbf{t}_2 + \lambda_2\mathbf{d}_2$, which can be solved for the unknowns $\lambda_1$ and $\lambda_2$ by the least-squares method as
$$\begin{bmatrix}\lambda_1\\ \lambda_2\end{bmatrix} = (D^{T}D)^{-1}D^{T}(\mathbf{t}_2 - \mathbf{t}_1), \tag{8}$$
where $D = [\mathbf{d}_1, -\mathbf{d}_2]$ is the regression matrix. The computed values of $\lambda_1$ and $\lambda_2$ allow determining the vector of the captured point as $\mathbf{p}_1$ using Eq. (6) or $\mathbf{p}_2$ using Eq. (7). Ideally, $\mathbf{p}_1$ and $\mathbf{p}_2$ are equal due to the intersection assumption. Nevertheless, slight deviations caused by experimental errors may result in skewed lines, as shown in Figure 2(c). For this reason, the vector $\mathbf{p}$ is defined as the average $(\mathbf{p}_1 + \mathbf{p}_2)/2$; i.e.,
$$\mathbf{p} = \tfrac{1}{2}(\mathbf{t}_1 + \mathbf{t}_2) + \tfrac{1}{2}\,[\mathbf{d}_1, \mathbf{d}_2]\begin{bmatrix}\lambda_1\\ \lambda_2\end{bmatrix}. \tag{9}$$
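The least-squares triangulation above translates directly into code; this sketch assumes the two rays are given by their origins and direction vectors, and returns the mean of the two per-line solution points:

```python
import numpy as np

def triangulate(t1, d1, t2, d2):
    """Least-squares triangulation of two (possibly skew) rays.

    Solves t1 + lam1*d1 = t2 + lam2*d2 for lam1, lam2 in the
    least-squares sense and returns the mean of the two line points."""
    D = np.column_stack([d1, -d2])                     # regression matrix
    lam, *_ = np.linalg.lstsq(D, t2 - t1, rcond=None)  # (D^T D)^-1 D^T (t2 - t1)
    p1 = t1 + lam[0] * d1                              # point on line 1
    p2 = t2 + lam[1] * d2                              # point on line 2
    return 0.5 * (p1 + p2)                             # mean point

# Two illustrative rays that intersect exactly at [0, 0, 5]:
t1, d1 = np.array([-1.0, 0.0, 0.0]), np.array([ 1.0, 0.0, 5.0])
t2, d2 = np.array([ 1.0, 0.0, 0.0]), np.array([-1.0, 0.0, 5.0])
p = triangulate(t1, d1, t2, d2)   # -> [0, 0, 5]
```

With noisy directions the two rays become skew, and the returned mean point is the natural compromise between them.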

2.4. The Corresponding Point Problem

It is noteworthy that a 3D reconstruction requires the pair of pixels μ 1 and μ 2 where the captured point p was imaged, see Figure 2. The pixels μ 1 and μ 2 , related by a common point p , are known as a corresponding point, denoted by
$$\boldsymbol{\mu}_1 \leftrightarrow \boldsymbol{\mu}_2. \tag{10}$$
Obtaining corresponding points is a challenging task because it depends on the object’s texture [40,41]. For example, no corresponding points can be established for a white object on a white background under homogeneous illumination because of the lack of feature points [42]. Even for objects with abundant texture, such as the human face in biometric applications, the resolution is low because more than one pixel is required to detect a feature, and not all image regions contain reliable features. As a result, stereo camera systems tend to produce low-resolution 3D reconstructions, and their accuracy depends on the texture of the object under study.

2.5. Equivalence Between Cameras and Projectors

Physically, the difference between a camera and a projector is that the light rays propagate in opposite directions, as shown in Figure 3. While a camera captures light rays traveling from space to its image pixels, a projector emits light rays from its slide pixels to space. However, if the sign of the direction vector of the light rays is omitted, cameras and projectors are identical. For this reason, cameras and projectors are mathematically equivalent, and both can be described by the pinhole model given in Eq. (1). Therefore, in addition to cameras being direction sensors, another valuable insight is that projectors are direction-controlled ray generators.

2.6. Camera-Projector Systems

The equivalence between cameras and projectors can be exploited to modify a stereo camera system by replacing one of the cameras with a projector. The resultant camera-projector system has the advantage of not having the corresponding point problem. This advantage is inferred from the following simplified description of the operation of a camera-projector system.
A computer-generated slide controls the brightness of every pixel of the projector. A black slide will turn off all the projector pixels. Suppose this slide is modified by setting the pixel μ 2 to white; then, a light ray will be emitted to the 3D space, illuminating the object at point p , as shown in Figure 4. In the absence of additional light sources, the camera will capture a dark image except for a bright point at the image pixel μ 1 . In this way, the corresponding point μ 1 μ 2 is known by simply reading the coordinates of the bright image pixel μ 1 and taking the coordinates of the white slide pixel μ 2 . Therefore, sophisticated algorithms to search corresponding points are unnecessary. For this reason, camera-projector systems can reconstruct even objects without texture, achieving high accuracy and resolution.
Although illustrative, the described working principle is impractical because one image is required to obtain only one corresponding point. Since modern digital projectors have millions of pixels, millions of images will be required for a single 3D reconstruction. Fortunately, efficient illumination techniques have been proposed to significantly reduce the number of required images.

2.7. Structured Illumination

The primary advantage of camera-projector systems is the absence of the corresponding point problem. Instead, slides are designed to “mark” the projector pixels such that they are recognized by the camera and paired with the image pixels, producing corresponding points. The different techniques for marking and recognizing projector pixels are known generically as structured illumination [43].
Typical structured illumination techniques include projecting dots, stripes, grids, codewords, rainbows, and fringes. These techniques can even be combined to produce hybrid structured illumination techniques. Each technique has different advantages and disadvantages regarding accuracy, resolution, number of images, noise robustness, object color sensitivity, and other criteria. Depending on the application, one technique will be more appropriate. In particular, the fringe projection technique is recommended for applications requiring higher accuracy and resolution.

2.8. Fringe Projection

The “mark-based” approach helps to understand the different structured illumination techniques intuitively. Alternatively, a powerful insight is gained by considering a camera-projector setup as a telecommunication system. Remember that the projector slide coordinates ( u , v ) must be registered on the camera image plane ( r , s ) to produce corresponding points. In this context, the projector would be considered a transmitter emitting the signals u and v, while the camera is a receiver detecting u and v to produce corresponding points.
The projector slide coordinates can be transmitted using phase modulation. For example, let us encode the values of u as the phase of a 2D cosine signal, known as a grating, of the form
$$G_k(u,v) = \tfrac{1}{2} + \tfrac{1}{2}\cos(\omega u + \delta_k), \quad k = 1, 2, \ldots, n, \tag{11}$$
where $\omega$ is the grating spatial frequency, $\delta_k$ is a reference phase known as the phase shift or grating displacement, and $n$ is the number of images required to perform phase demodulation. Figure 5 shows two gratings, with low and high frequencies, respectively, and four phase shifts. When the grating $G_k(u,v)$ is used as a slide, the object under study is illuminated with fringes, as shown in Figure 6. Consequently, this particular structured illumination technique is known as fringe projection. The image captured by the camera is known as a fringe pattern, modeled as
$$I_k(r,s) = a(r,s) + b(r,s)\cos\!\big(\phi(r,s) + \delta_k\big), \tag{12}$$
where $a(r,s)$ is the background light, $b(r,s)$ is the fringe amplitude, and $\phi(r,s)$ is the phase containing the encoded signal $u$. Indeed, comparing the arguments of the cosine functions in Eqs. (11) and (12), the projector slide coordinate $u$ is read in the camera image plane from the demodulated phase as
$$u(r,s) = \frac{1}{\omega}\,\phi(r,s). \tag{13}$$
The fringe projection process can be repeated for the projector axis v. Namely, assuming the gratings were created using the angular frequency w and the recovered phase was φ ( r , s ) , then the slide projector coordinate v is available in the camera image plane as
$$v(r,s) = \frac{1}{w}\,\varphi(r,s). \tag{14}$$
As a result, for every camera image pixel ( r , s ) , the corresponding projector slide coordinates ( u ( r , s ) , v ( r , s ) ) are determined.
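The grating generation of Eq. (11), with the equally spaced phase shifts used later in Eq. (15), can be sketched as follows; the slide resolution and the four-step sequence are illustrative choices:

```python
import numpy as np

def gratings(width, height, omega, n):
    """Return n phase-shifted gratings G_k(u, v) with u normalized to
    (-1, 1] and equally spaced phase shifts delta_k = 2*pi*(k-1)/n."""
    u = np.linspace(-1.0, 1.0, width)            # normalized slide u-axis
    U = np.tile(u, (height, 1))                  # constant along the v-axis
    deltas = 2.0 * np.pi * np.arange(n) / n      # phase shifts
    slides = [0.5 + 0.5 * np.cos(omega * U + d) for d in deltas]
    return slides, deltas

# Illustrative slide resolution; each slide is an intensity map in [0, 1]
# ready to be sent to the projector:
slides, deltas = gratings(width=1024, height=768, omega=np.pi, n=4)
```

With $\omega = \pi$ a single fringe spans the slide, as in Figure 5(a)-(d); raising `omega` produces the denser gratings of Figure 5(e)-(h).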
Figure 5. (a)-(d) A grating G k ( u , v ) with the frequency ω = π (one fringe) and four phase shifts. (e)-(h) A grating with a higher frequency, ω = 3.7 π , and four phase shifts.
Figure 6. Fringe patterns I k ( r , s ) obtained by displaying the gratings shown in Figure 5 on a 3D object.
The phase retrieval algorithms used in fringe projection systems are inherited from optical metrology. Some adaptations are included considering the convenience of using digital camera-projector systems. For instance, the frequency of the gratings and the phase shifts are controlled precisely by the computer. Section 3 presents a phase retrieval algorithm suitable for fringe projection 3D metrology systems.

2.9. Phase and Object Profile Misconception

It is worth remarking on the frequent confusion occurring when the phase extracted from a fringe projection experiment is plotted, as shown in Figure 7. Since the phase looks like the object profile, inexperienced practitioners may conclude that elementary transformations on the phase, such as rotation and scaling, are sufficient to achieve metric 3D object reconstruction. Unfortunately, this misconception leads to the formulation of a transformation that, in addition to being excessively complicated, is unnecessary. It is important to remember that the phase simply provides the projector slide coordinates required to establish corresponding points. Subsequently, the corresponding points are used to perform object reconstruction by triangulation.

3. Phase Demodulation Fringe Pattern Processing

The optical metrology community has developed a wide variety of fringe pattern processing methods for different applications and requirements [44,45]. For instance, Fourier fringe analysis allows phase recovery from a single fringe pattern [46], but the intrinsic spectrum filtering limits the spatial resolution. On the other hand, the phase-shifting method achieves the highest (pixel-wise) spatial resolution [47], but multiple fringe patterns with prefixed phase shifts are required.
Nowadays, digital computer-controlled cameras and projectors allow the capture of multiple fringe patterns at high speed and precise control of the grating frequency and phase shift [48]. For this reason, phase demodulation by phase-shifting and multi-frequency phase unwrapping is the preferred choice for fringe projection profilometry [49].

3.1. Phase-Shifting Wrapped Phase Extraction

The design of phase-shifting algorithms depends mainly on the distribution of the phase shifts and whether they are known or unknown. Exploiting the fact that phase shifts can be controlled with high precision, they are required to be
$$\delta_k = \frac{2\pi}{n}(k-1). \tag{15}$$
For this particular case, the set of n fringe patterns given by Eq. (12) can be processed to estimate the background light, the fringe amplitude, and the encoded phase using the Bruning method [47,50], as
$$a(r,s) = \frac{1}{n}\sum_{k=1}^{n} I_k(r,s), \tag{16}$$
$$b(r,s) = \frac{2}{n}\sqrt{J_1^2(r,s) + J_2^2(r,s)}, \tag{17}$$
$$\psi(r,s) = \tan^{-1}\!\left[\frac{-J_1(r,s)}{J_2(r,s)}\right], \tag{18}$$
where $\tan^{-1}$ represents the four-quadrant arctangent function, with the auxiliary functions defined as
$$J_1(r,s) = \sum_{k=1}^{n} I_k(r,s)\sin\delta_k, \quad\text{and} \tag{19}$$
$$J_2(r,s) = \sum_{k=1}^{n} I_k(r,s)\cos\delta_k. \tag{20}$$
As an illustration, the fringe patterns shown in Figure 6(e)-(h) were processed by Eqs. (16)-(18), obtaining the results presented in Figure 8. It is worth noting that the retrieved phase is ψ ( r , s ) , shown in Figure 8(c), while the required phase is ϕ ( r , s ) , shown in Figure 8(d). These phases are displayed using 3D plots in Figure 8(e) and Figure 8(f) for better visualization. The function ψ ( r , s ) is known as the wrapped phase due to its distinctive sawtooth-like shape. The phases ϕ ( r , s ) and ψ ( r , s ) are equivalent since
$$\cos\phi(r,s) = \cos\psi(r,s). \tag{21}$$
Unfortunately, $\phi \in (-\infty, \infty)$ can take any real value, while $\psi \in (-\pi, \pi]$ is always constrained to the left-open interval known as the principal values. Adding any multiple of $2\pi$ to the wrapped phase is also equivalent to $\phi(r,s)$; i.e.,
$$\cos\phi(r,s) = \cos\!\big[\psi(r,s) + 2\pi h(r,s)\big], \tag{22}$$
which leads to the general relationship between ϕ ( r , s ) and ψ ( r , s ) as
$$\phi(r,s) = \psi(r,s) + 2\pi h(r,s), \tag{23}$$
where h ( r , s ) is an integer-valued function, known as fringe order. Obtaining the actual phase ϕ ( r , s ) from the available wrapped phase ψ ( r , s ) is a process known as phase unwrapping.
Figure 8. Results of processing the fringe patterns shown in Figure 6(e)-(h). (a) Background light. (b) Fringe amplitude. (c) Extracted wrapped phase ψ ( r , s ) . (d) Required phase ϕ ( r , s ) . (e) and (f) 3D plots of the phases shown in (c) and (d), respectively.

3.2. Hierarchical Multi-Frequency Phase Unwrapping

Since the wrapping phenomenon appears when the encoded phase exceeds the principal values, the straightforward way to recover the required phase is to prevent it from exceeding the principal values. For this, the frequency of the grating should be chosen appropriately. For instance, assuming the projector slide axis $u$ is normalized to the interval $(-1, 1]$, the angular frequency
$$\omega_1 = \pi \tag{24}$$
will limit the fringe pattern phase $\phi_1 = \omega_1 u$ within the principal values. Therefore, the required phase will coincide with that retrieved by Eq. (18), i.e.,
$$\phi_1(r,s) = \omega_1\, u(r,s) = \psi_1(r,s). \tag{25}$$
Note that the frequency $\omega_1$ is so low that only one fringe is displayed, as shown in Figure 6(a)-(d). However, low-frequency gratings cannot reveal the fine details of the object. On the other hand, high-frequency gratings highlight shape details, as shown in Figure 6(e)-(h), but the wrapping phenomenon appears.
The hierarchical multi-frequency phase unwrapping employs both low- and high-frequency gratings. The phase retrieved from low-frequency gratings assists in solving the wrapping problem, while the phase from high-frequency gratings permits achieving high fidelity. For example, let us consider a second grating with a frequency ω 2 higher than ω 1 ,
$$\omega_2 > \omega_1. \tag{26}$$
Since the encoded phase exceeds the principal values, the required phase ϕ and the retrieved phase ψ are related as in Eq. (23), namely
$$\phi_2(r,s) = \omega_2\, u(r,s) = \psi_2(r,s) + 2\pi h_2(r,s). \tag{27}$$
The unknown function h 2 ( r , s ) can be determined using the previous phase ϕ 1 ( r , s ) given by Eq. (25). Specifically, substituting u = ϕ 1 / ω 1 in Eq. (27), we obtain
$$\frac{\omega_2}{\omega_1}\,\phi_1(r,s) = \psi_2(r,s) + 2\pi h_2(r,s), \tag{28}$$
which leads to
$$h_2(r,s) = \left\lfloor \frac{(\omega_2/\omega_1)\,\phi_1(r,s) - \psi_2(r,s)}{2\pi} \right\rceil, \tag{29}$$
where $\lfloor\cdot\rceil$ is the rounding operator ensuring $h_2$ takes integer values. This process can be repeated for a third frequency $\omega_3 > \omega_2$, and so on, until the desired resolution is reached. In general, the $k$-th retrieved phase $\psi_k(r,s)$ can be unwrapped using the previous phase $\phi_{k-1}(r,s)$ as
$$\phi_k(r,s) = \psi_k(r,s) + 2\pi h_k(r,s), \tag{30}$$
$$h_k(r,s) = \left\lfloor \frac{\alpha_k\,\phi_{k-1}(r,s) - \psi_k(r,s)}{2\pi} \right\rceil, \tag{31}$$
where $\alpha_k$ is the amplification between the adjacent phases $\phi_{k-1}$ and $\phi_k$, defined as
$$\alpha_k = \omega_k / \omega_{k-1}. \tag{32}$$
This phase unwrapping method is a recursive process that works hierarchically. It starts with the lowest (single-fringe) grating frequency and ends with the highest grating frequency supported by the projector.
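The hierarchical recursion above admits a compact implementation; this sketch assumes the wrapped phases are supplied from the lowest to the highest frequency, with the first one already unwrapped (single-fringe grating):

```python
import numpy as np

def hierarchical_unwrap(psis, omegas):
    """Multi-frequency temporal phase unwrapping.

    psis:   wrapped phases psi_1..psi_m, lowest to highest frequency,
            with psi_1 already unwrapped (single-fringe grating).
    omegas: the corresponding grating frequencies omega_1..omega_m.
    Returns the unwrapped phase of the highest-frequency grating."""
    phi = psis[0]                                    # phi_1 = psi_1
    for k in range(1, len(psis)):
        alpha = omegas[k] / omegas[k - 1]            # amplification
        h = np.round((alpha * phi - psis[k]) / (2.0 * np.pi))  # fringe order
        phi = psis[k] + 2.0 * np.pi * h
    return phi

# Illustrative check with a linear phase encoded at pi and 4*pi rad:
u = np.linspace(-0.9, 0.9, 5)
psis = [np.angle(np.exp(1j * w * u)) for w in (np.pi, 4.0 * np.pi)]
phi = hierarchical_unwrap(psis, [np.pi, 4.0 * np.pi])   # recovers 4*pi*u
```

The rounding step tolerates phase noise up to roughly $\pi/\alpha_k$, which is why moderate amplification steps are preferred in practice.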

3.3. Choosing Grating Frequencies

The operation of the hierarchical phase unwrapping method depends on how many frequencies can be used. Ideally, two frequencies, the lowest and highest, should be sufficient to attain a high-fidelity phase. However, in practice, using two frequencies often fails due to the excessive difference between the phases, as illustrated in Figure 9(a). For this reason, intermediate frequencies are employed to produce phases with less drastic changes, as shown in Figure 9(b)-(i).
Although any set of increasing frequencies could be used, employing the same amplification between adjacent phases is recommended [38]; i.e.,
$$\alpha_2 = \alpha_3 = \cdots = \alpha_m = \alpha, \tag{33}$$
where m is the number of frequencies to be used. The grating frequencies fulfilling the constant amplification requirement are
$$\omega_1 = \pi, \quad \omega_2 = \pi\alpha, \quad \omega_3 = \pi\alpha^2, \quad \ldots, \quad \omega_m = \pi\alpha^{m-1}. \tag{34}$$
The amplification coefficient α is determined based on the projector resolution, which limits the supported maximum grating frequency. For instance, assuming the projector has N pixels along the u-axis, the amplification coefficient is given as
$$\alpha = \left(\frac{N}{\xi}\right)^{1/(m-1)}, \tag{35}$$
where ξ is the number of pixels per fringe at maximum frequency (usually between 10 and 20 pixels). The frequencies for the gratings encoding the projector v-axis are determined analogously.
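The frequency-selection rule above can be sketched as follows; the projector resolution N and the value of ξ are illustrative:

```python
import numpy as np

def grating_frequencies(N, m, xi=16):
    """Frequencies omega_1..omega_m with constant amplification.

    N:  projector pixels along the u-axis
    m:  number of frequencies
    xi: pixels per fringe at the maximum frequency (typically 10-20)."""
    alpha = (N / xi) ** (1.0 / (m - 1))          # constant amplification
    return [np.pi * alpha**k for k in range(m)]  # pi, pi*alpha, ..., pi*alpha^(m-1)

# Illustrative 1024-pixel projector with four frequencies:
omegas = grating_frequencies(N=1024, m=4, xi=16)
# -> [pi, 4*pi, 16*pi, 64*pi]: from a single fringe up to 64 fringes,
#    i.e., 16 pixels per fringe at the highest frequency.
```

Here the amplification works out to exactly 4, so each phase is four times finer than the previous one.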

4. System Calibration

In the theoretical preliminaries presented in Section 2, the camera and projector parameters were assumed to be known. In practice, these parameters need to be estimated through the process known as system calibration. Earlier methods to calibrate a camera and a projector together employed cumbersome procedures. Nowadays, more practical alternatives to calibrate a camera-projector pair are available. In this section, a simple and flexible camera-projector calibration method is presented. First, the calibration of a single camera is explained to gain background. Then, the methodology is extended to projectors by exploiting the equivalence between cameras and projectors. Finally, the procedure for simultaneous camera and projector calibration is described.

4.1. Camera Calibration

Consider the pinhole model given by Eq. (1), rewritten here as
$$\boldsymbol{\mu} = \mathcal{H}^{-1}\!\left[ K\,[\,\bar{\mathbf{r}}_1,\, \bar{\mathbf{r}}_2,\, \bar{\mathbf{r}}_3,\, -R^{T}\mathbf{t}\,] \begin{bmatrix} x\\ y\\ z\\ 1 \end{bmatrix} \right], \tag{36}$$
where the rotation matrix was partitioned as $R^{T} = [\bar{\mathbf{r}}_1, \bar{\mathbf{r}}_2, \bar{\mathbf{r}}_3]$, with $\bar{\mathbf{r}}_i$ denoting the $i$-th column of $R^{T}$. The camera can be seen as a black box that receives a point $\mathbf{p} = [x, y, z]^{T}$ as input and returns an image pixel point $\boldsymbol{\mu}$ as output. Therefore, a 3D target can be used to obtain a set of input-output pairs, as shown in Figure 10(a), and estimate the camera matrix $C$ by minimizing the output error [35], as outlined in Figure 10(c). The parameters $K$, $R$, and $\mathbf{t}$ can be extracted from $C$ by a triangular-orthogonal matrix decomposition. Nonetheless, despite the single-shot calibration feature, this approach is impractical because high-precision multi-point 3D targets are bulky and expensive.
Alternatively, instead of capturing a single image of a 3D target, the camera can be calibrated using multiple images of a 2D target [51], as shown in Figure 10(b). We refer to the plane where the calibration target is located as the reference plane. Without loss of generality, the reference plane is assumed to be the x y -plane. Therefore, since z is always zero, the third entry of p and the vector r ¯ 3 in Eq. (36) can be removed, simplifying the imaging process to
$$\boldsymbol{\mu} = \mathcal{H}^{-1}\big[\, G\, \mathcal{H}[\boldsymbol{\rho}] \,\big], \tag{37}$$
where $\boldsymbol{\rho} = [x, y]^{T}$ represents points on the calibration target, and $G$ is a $3 \times 3$ non-singular matrix known as a homography, defined as
$$G = K\,[\,\bar{\mathbf{r}}_1,\, \bar{\mathbf{r}}_2,\, -R^{T}\mathbf{t}\,]. \tag{38}$$
An image of the 2D calibration target establishes a set of input-output pairs, allowing a homography to be estimated by minimizing the output error, as outlined in Figure 10(d). Although a single homography is insufficient for camera calibration, multiple homographies provide enough information to recover the required intrinsic and extrinsic parameters. For this, the upper-triangular shape of K and the orthogonality property of rotation matrices are exploited.
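A homography can be estimated from point pairs by the standard direct linear transform (DLT); the following is a minimal, unnormalized sketch for illustration, not necessarily the exact estimator used in the cited methods:

```python
import numpy as np

def estimate_homography(rho, mu):
    """DLT estimate of the 3x3 homography G with H[mu] ~ G H[rho].

    rho, mu: (n, 2) arrays of matched points, n >= 4."""
    A = []
    for (x, y), (r, s) in zip(rho, mu):
        # Each pair contributes two linear constraints on the entries of G:
        A.append([x, y, 1, 0, 0, 0, -r * x, -r * y, -r])
        A.append([0, 0, 0, x, y, 1, -s * x, -s * y, -s])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    G = Vt[-1].reshape(3, 3)        # right singular vector of the smallest
    return G / G[2, 2]              # singular value; fix the scale ambiguity

# Illustrative check: synthesize points with a known homography.
G_true = np.array([[1.0, 0.2, 10.0],
                   [0.1, 0.9, -5.0],
                   [1e-4, 2e-4, 1.0]])
rho = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 2.0]])
P = np.column_stack([rho, np.ones(len(rho))]) @ G_true.T
mu = P[:, :2] / P[:, 2:3]
G_est = estimate_homography(rho, mu)    # recovers G_true up to rounding
```

Since a homography is defined only up to scale, dividing by its last entry gives a convenient normalization for comparison.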

4.2. Projector Calibration

The data required for homography-based calibration are input-output pairs consisting of points from the reference plane matched with points from the device plane. This requirement is independent of whether the device to be calibrated is a camera or a projector due to their mathematical equivalence. Accordingly, the method for calibrating a camera or a projector is the same [52]; they only differ in how the required input-output pairs are acquired.
For camera calibration, input-output pairs are acquired by placing the calibration target on the reference plane and capturing photographs, as shown in Figure 10(b). The target provides known points on the reference plane, while the images provide the corresponding pixel points using automatic feature point detection [53]. Note that this input-output acquisition strategy is unsuitable for projectors because they cannot take photographs of the reference plane.
For projector calibration, input-output pairs are acquired by placing the calibration target on the device plane as a slide and illuminating the reference plane, as shown in Figure 11(a). The target provides known points on the projector plane, and the corresponding points are measured on the reference plane. Unlike camera calibration, where the input-output acquisition is automatic using dedicated image processing routines, projector calibration requires a manual measurement of feature point coordinates on the reference plane.

4.3. Camera-Projector Calibration

Calibrating a fringe projection system requires estimating the parameters of a camera and a projector that are working together. Although cameras and projectors can be calibrated independently, simultaneous calibration is advantageous because the camera assists the projector calibration.
Remember that a physical target on the reference plane is required for camera calibration. In addition, a virtual target on the slide plane is necessary for projector calibration. Although both targets are superimposed on the reference plane, they can be distinguished using targets of different colors [54]: for instance, a yellow target on the reference plane, see Figure 11(b), and a cyan target on the projector plane, see Figure 11(c). In this manner, the two superposed targets are retrieved separately from the captured image through its red and blue channels, as shown in Figure 11(d)-(f).
The image of the physical and virtual targets on the camera plane is opportune because manual measurements on the reference plane are avoided. First, the image of the physical target is used to estimate the camera homography G c from the relation
$$\boldsymbol{\mu}_p = \mathcal{H}^{-1}\big[\, G_c\, \mathcal{H}[\boldsymbol{\rho}_p] \,\big], \tag{39}$$
where $\boldsymbol{\rho}_p$ is a feature point of the physical target (yellow), and $\boldsymbol{\mu}_p$ is the corresponding point on the image (blue channel). In addition to using $G_c$ for camera calibration, this homography avoids manually measuring points on the reference plane for projector calibration. Namely, let $\boldsymbol{\mu}_v$ be an image point of the captured virtual target (red channel). The corresponding point $\boldsymbol{\rho}_v$ on the reference plane (cyan) can be computed using the inverse of $G_c$ as
$$\boldsymbol{\rho}_v = \mathcal{H}^{-1}\big[\, G_c^{-1}\, \mathcal{H}[\boldsymbol{\mu}_v] \,\big]. \tag{40}$$
As a result, since the feature points $\boldsymbol{\mu}_s$ on the projector slide plane are known (the slide contains the virtual calibration target), the projector homography $G_p$ can be estimated from the relation
$$\boldsymbol{\mu}_s = \mathcal{H}^{-1}\big[\, G_p\, \mathcal{H}[\boldsymbol{\rho}_v] \,\big]. \tag{41}$$
This procedure is repeated for three or more poses, with the devices or the reference plane moved freely. Finally, the camera and projector parameters are estimated from the computed homographies. This methodology, known as camera-projector calibration, remains valid even when more advanced imaging models are employed [36].
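The key camera-assisted step, mapping an image point of the virtual target back to the reference plane through the inverse camera homography, can be sketched as follows; the homography entries are illustrative values:

```python
import numpy as np

def map_points(G, pts):
    """Apply a homography in inhomogeneous form: H^{-1}[ G H[p] ]."""
    P = np.column_stack([pts, np.ones(len(pts))])   # homogeneous coordinates
    Q = P @ G.T                                     # apply G
    return Q[:, :2] / Q[:, 2:3]                     # back to 2D points

# Illustrative (assumed) camera homography and a virtual-target image point;
# its reference-plane location follows from the inverse mapping:
G_c = np.array([[1.2, 0.1,  5.0],
                [0.0, 1.1, -3.0],
                [0.0, 0.0,  1.0]])
mu_v = np.array([[100.0, 50.0]])
rho_v = map_points(np.linalg.inv(G_c), mu_v)        # point on reference plane
```

Mapping `rho_v` forward through `G_c` returns the original image point, confirming the round trip is consistent.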
For illustration purposes, the camera-projector calibration method was employed to calibrate the experimental system shown in Figure 12(a) using the software available in Ref. [52]. Then, the 50 × 100 × 50 millimeter pyramid on the reference plane was reconstructed. Figure 6 and Figure 9(i) show, respectively, some of the captured fringe patterns and the demodulated phase encoding the projector u-axis. Additional fringe patterns were captured to demodulate the phase encoding the v-axis. Equations (13) and (14) were used to obtain the projector slide coordinates u ( r , s ) and v ( r , s ) from the demodulated phase. Finally, Eq. (9) was used to compute an object point for each established corresponding point, resulting in the 3D reconstruction shown in Figure 12(b).

5. Optical 3D Metrology

Nowadays, fringe projection profilometry has been successfully implemented in several fields of science and engineering. Its valuable features, such as accuracy and high resolution, make it an outstanding 3D metrology tool. This section describes a few illustrative applications in which optical fringe projection has been employed.
Non-contact and high-accuracy features have motivated the implementation of fringe projection profilometry in medical applications [55]. Some representative applications include human respiration rate measurement [56], diagnosis of tympanic-membrane and middle-ear disease [57], skin assessment [58], tympanic membrane characterization [59], body measurement for guided radiotherapy [60], assistance in laparoscopic procedures [61], biomechanics studies [62], and intraoral measurement for orthodontic treatment [63]. Figure 13 shows a dental teeth model reconstruction illustrating an intraoral measurement.
Automatic and uninterrupted inspection of items on production lines is essential to ensure high-quality products [64]. For this purpose, fringe projection profilometry is a valuable tool because of its capacity for visual inspection [65]. Other tasks where fringe projection profilometry is useful are metal corrosion monitoring [66], crack recognition assistance [67], welding surface inspection [68], micro-scale component testing [64], turbine blade wear and damage characterization [69], aircraft surface defect detection [70], and electrical overload fault detection [18]. Figure 14 shows a visible-thermal profilometer for multimodal 3D reconstruction, detecting a heat source caused by a simulated fault.
Conventional biometric-based security systems employ gray-scale images to perform user authentication. However, these systems lose valuable information by omitting important human biometric features such as the color and 3D shape of faces, palms, and fingers [71]. Fringe projection profilometry has been employed in security systems for face recognition [72], 3D palmprint and hand imaging [73,74], 3D fingerprint imaging [75], and ear recognition [76].
Other recent applications using fringe projection profilometry include robot vision for object detection and navigation [77], footwear and tire impression in forensic science [13], shape and strain measurements for mechanical studies [78], plant phenotyping and leafy green evaluation in agriculture [79], high-precision panel telescope alignment [80], whole-body scanning for animation and entertainment [81], airbag inflation for car safety studies [82,83], ancient coin and sculpture imaging for heritage preservation [84,85], fast prototyping and reverse engineering [86], three-dimensional color mural capturing for cultural heritage documentation [87], and many others [14].
This section is not an exhaustive review but rather a description of the state-of-the-art of fringe projection profilometry. Readers are welcome to contribute fresh ideas to drive 3D metrology with even more efficient, accurate, practical, and affordable optical profilometers.

6. Conclusions

The theoretical and experimental fundamentals of fringe projection profilometry have been reviewed. Helpful insights explaining challenging and confusing concepts were given, making this technology an uncomplicated tool. Simple yet powerful methods for fringe pattern processing, triangulation, and calibration were presented. The studied methods provide the reader with an adequate background to understand and operate a fringe projection profilometer. Some applications were described to expand the reader’s scope and stimulate research on further developments, driving 3D metrology toward new frontiers.

Author Contributions

Conceptualization, R.J.; methodology, R.J.; software, R.J.; visualization, R.J.; validation, R.J. and V.H.D.; formal analysis, V.H.D.; investigation, S.E.; data curation, R.J.; writing—original draft preparation, R.J.; writing—review and editing, S.E. and V.H.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Secretaría de Ciencia, Humanidades, Tecnología e Innovación (Cátedras CONACYT 880).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Grambow, N.; Hinz, L.; Bonk, C.; Krüger, J.; Reithmeier, E. Creepage Distance Estimation of Hairpin Stators Using 3D Feature Extraction. Metrology 2023, 3, 169–185. [Google Scholar] [CrossRef]
  2. Zhang, S. (Ed.) Handbook of 3D Machine Vision: Optical Metrology and Imaging; Series in Optics and Optoelectronics; CRC Press: Boca Raton, 2013. [Google Scholar]
  3. Marrugo, A.G.; Gao, F.; Zhang, S. State-of-the-art active optical techniques for three-dimensional surface metrology: a review [Invited]. J. Opt. Soc. Am. A 2020, 37, B60–B77. [Google Scholar] [CrossRef]
  4. Jiang, C.; Li, Y.; Feng, S.; Hu, Y.; Yin, W.; Qian, J.; Zuo, C.; Liang, J. Fringe Projection Profilometry. In Coded Optical Imaging; Springer International Publishing: Cham, 2024; chapter 14; pp. 241–286. [Google Scholar]
  5. Harding, K. (Ed.) Handbook of Optical Dimensional Metrology; Series in Optics and Optoelectronics; CRC Press: Boca Raton, 2013. [Google Scholar]
  6. Zhang, Q.; Su, X. High-speed optical measurement for the drumhead vibration. Opt. Express 2005, 13, 3110–3116. [Google Scholar] [CrossRef] [PubMed]
  7. Zhang, S. High-Speed 3D Imaging with Digital Fringe Projection Techniques; CRC Press: Boca Raton, 2016. [Google Scholar]
  8. Zhang, S. High-speed 3D shape measurement with structured light methods: A review. Optics and Lasers in Engineering 2018, 106, 119–131. [Google Scholar] [CrossRef]
  9. Liu, Y.; Zhang, Q.; Liu, Y.; Yu, X.; Hou, Y.; Chen, W. High-speed 3D shape measurement using a rotary mechanical projector. Opt. Express 2021, 29, 7885–7903. [Google Scholar] [CrossRef] [PubMed]
  10. Takeda, M. Fourier fringe analysis and its application to metrology of extreme physical phenomena: a review [Invited]. Appl. Opt. 2013, 52, 20–29. [Google Scholar] [CrossRef] [PubMed]
  11. Munkelt, C.; Schmidt, I.; Brauer-Burchardt, C.; Kuhmstedt, P.; Notni, G. Cordless portable multi-view fringe projection system for 3D reconstruction. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, June 2007; pp. 1–2. [Google Scholar]
  12. Zaman, T.; Jonker, P.; Lenseigne, B.; Dik, J. Simultaneous capture of the color and topography of paintings using fringe encoded stereo vision. Heritage Science 2014, 2, 23. [Google Scholar] [CrossRef]
  13. Liao, Y.H.; Hyun, J.S.; Feller, M.; Bell, T.; Bortins, I.; Wolfe, J.; Baldwin, D.; Zhang, S. Portable high-resolution automated 3D imaging for footwear and tire impression capture. Journal of Forensic Sciences 2021, 66, 112–128. [Google Scholar] [CrossRef] [PubMed]
  14. Gorthi, S.S.; Rastogi, P.K. Fringe projection techniques: Whither we are? Optics and Lasers in Engineering 2010, 48, 133–140. [Google Scholar] [CrossRef]
  15. Kulkarni, R.; Rastogi, P. Optical measurement techniques - A push for digitization. Optics and Lasers in Engineering 2016, 87, 1–17. [Google Scholar] [CrossRef]
  16. Xu, J.; Zhang, S. Status, challenges, and future perspectives of fringe projection profilometry. Optics and Lasers in Engineering 2020, 135, 106193. [Google Scholar] [CrossRef]
  17. Peng, R.; Zhou, G.; Zhang, C.; Wei, C.; Wang, X.; Chen, X.; Yang, L.; Yue, H.; Liu, Y. Ultra-small, low-cost, and simple-to-control PSP projector based on SLCD technology. Opt. Express 2024, 32, 1878–1889. [Google Scholar] [CrossRef] [PubMed]
  18. Juarez-Salazar, R.; Benjumea, E.; Marrugo, A.G.; Diaz-Ramirez, V.H. Three-dimensional object texturing for visible-thermal fringe projection profilometers. In Proceedings of the Optics and Photonics for Information Processing XVIII, 2024; Volume 13136, p. 131360E. [Google Scholar]
  19. Benjumea, E.; Vargas, R.; Juarez-Salazar, R.; Marrugo, A.G. Toward a target-free calibration of a multimodal structured light and thermal imaging system. In Proceedings of the Dimensional Optical Metrology and Inspection for Practical Applications XIII, 2024; Volume 13038, p. 1303808. [Google Scholar]
  20. Feng, S.; Zuo, C.; Zhang, L.; Tao, T.; Hu, Y.; Yin, W.; Qian, J.; Chen, Q. Calibration of fringe projection profilometry: A comparative review. Optics and Lasers in Engineering 2021, 143, 106622. [Google Scholar] [CrossRef]
  21. Chen, R.; Xu, J.; Zhang, S.; Chen, H.; Guan, Y.; Chen, K. A self-recalibration method based on scale-invariant registration for structured light measurement systems. Optics and Lasers in Engineering 2017, 88, 75–81. [Google Scholar] [CrossRef]
  22. Juarez-Salazar, R. Flat mirrors, virtual rear-view cameras, and camera-mirror calibration. Optik 2024, 317, 172067. [Google Scholar] [CrossRef]
  23. Almaraz-Cabral, C.C.; Gonzalez-Barbosa, J.J.; Villa, J.; Hurtado-Ramos, J.B.; Ornelas-Rodriguez, F.J.; Córdova-Esparza, D.M. Fringe projection profilometry for panoramic 3D reconstruction. Optics and Lasers in Engineering 2016, 78, 106–112. [Google Scholar] [CrossRef]
  24. Flores, V.; Casaletto, L.; Genovese, K.; Martinez, A.; Montes, A.; Rayas, J. A Panoramic Fringe Projection system. Optics and Lasers in Engineering 2014, 58, 80–84. [Google Scholar] [CrossRef]
  25. Feng, S.; Zhang, L.; Zuo, C.; Tao, T.; Chen, Q.; Gu, G. High dynamic range 3D measurements with fringe projection profilometry: a review. Measurement Science and Technology 2018, 29, 122001. [Google Scholar] [CrossRef]
  26. Zhao, X.; Yu, T.; Liang, D.; He, Z. A review on 3D measurement of highly reflective objects using structured light projection. The International Journal of Advanced Manufacturing Technology 2024, 132, 4205–4222. [Google Scholar] [CrossRef]
  27. Duan, M.; Jin, Y.; Chen, H.; Zheng, J.; Zhu, C.; Chen, E. Automatic 3-D Measurement Method for Nonuniform Moving Objects. IEEE Transactions on Instrumentation and Measurement 2021, 70, 1–11. [Google Scholar] [CrossRef]
  28. Zhang, S. Rapid and automatic optimal exposure control for digital fringe projection technique. Optics and Lasers in Engineering 2020, 128, 106029. [Google Scholar] [CrossRef]
  29. Chen, R.; Xu, J.; Zhang, S. Digital fringe projection profilometry. In Advances in Optical Form and Coordinate Metrology; 2053-2563; IOP Publishing, 2020; pp. 1–28. [Google Scholar]
  30. Leach, R. (Ed.) Advances in optical form and coordinate metrology; Emerging Technologies in Optics and Photonics, IOP Publishing Ltd: Bristol, UK, 2020. [Google Scholar]
  31. Yoshizawa, T. (Ed.) Handbook of Optical Metrology: Principles and Applications, 2nd ed.; CRC Press: Boca Raton, 2015. [Google Scholar]
  32. Lv, S.; Tang, D.; Zhang, X.; Yang, D.; Deng, W.; Kemao, Q. Fringe projection profilometry method with high efficiency, precision, and convenience: theoretical analysis and development. Opt. Express 2022, 30, 33515–33537. [Google Scholar] [CrossRef] [PubMed]
  33. Rastogi, P.K. (Ed.) Digital Optical Measurement Techniques and Applications; Artech House applied photonics series; Artech House, 2015. [Google Scholar]
  34. Ray, S. Applied Photographic Optics, 3rd ed.; Taylor & Francis, 2002. [Google Scholar]
  35. Hartley, R.; Zisserman, A. Multiple View Geometry in Computer Vision, 2nd ed.; Cambridge University Press, 2003. [Google Scholar]
  36. Juarez-Salazar, R.; Zheng, J.; Diaz-Ramirez, V.H. Distorted pinhole camera modeling and calibration. Applied Optics 2020, 59, 11310–11318. [Google Scholar] [CrossRef]
  37. Juarez-Salazar, R.; Diaz-Ramirez, V.H. Operator-based homogeneous coordinates: application in camera document scanning. Optical Engineering 2017, 56, 070801. [Google Scholar] [CrossRef]
  38. Juarez-Salazar, R.; Giron, A.; Zheng, J.; Diaz-Ramirez, V.H. Key concepts for phase-to-coordinate conversion in fringe projection systems. Applied Optics 2019, 58, 4828–4834. [Google Scholar] [CrossRef] [PubMed]
  39. Juarez-Salazar, R.; Rodriguez-Reveles, G.A.; Esquivel-Hernandez, S.; Diaz-Ramirez, V.H. Three-dimensional spatial point computation in fringe projection profilometry. Optics and Lasers in Engineering 2023, 164, 107482. [Google Scholar] [CrossRef]
  40. Maciel, J.; Costeira, J. A global solution to sparse correspondence problems. IEEE Transactions on Pattern Analysis and Machine Intelligence 2003, 25, 187–199. [Google Scholar] [CrossRef]
  41. Mouaddib, E.; Batlle, J.; Salvi, J. Recent progress in structured light in order to solve the correspondence problem in stereovision. In Proceedings of the International Conference on Robotics and Automation, 1997; Volume 1, pp. 130–136. [Google Scholar]
  42. Juarez-Salazar, R.; Rios-Orellana, O.I.; Diaz-Ramirez, V.H. Stereo-phase rectification for metric profilometry with two calibrated cameras and one uncalibrated projector. Applied Optics 2022, 61, 6097–6109. [Google Scholar] [CrossRef] [PubMed]
  43. Geng, J. Structured-light 3D surface imaging: a tutorial. Adv. Opt. Photon. 2011, 3, 128–160. [Google Scholar] [CrossRef]
  44. Rastogi, P.; Hack, E. Phase Estimation in Optical Interferometry; Taylor & Francis, 2014. [Google Scholar]
  45. Servin, M.; Quiroga, J.A.; Padilla, M. Fringe Pattern Analysis for Optical Metrology: Theory, Algorithms, and Applications; Wiley, 2014. [Google Scholar]
  46. Takeda, M.; Ina, H.; Kobayashi, S. Fourier-transform method of fringe-pattern analysis for computer-based topography and interferometry. J. Opt. Soc. Am. 1982, 72, 156–160. [Google Scholar] [CrossRef]
  47. Juarez-Salazar, R.; Mendoza-Rodriguez, C.; Hernandez-Beltran, J.E.; Robledo-Sanchez, C. How do phase-shifting algorithms work? European Journal of Physics 2018, 39, 065302. [Google Scholar] [CrossRef]
  48. Zhang, S. High-Speed 3D Imaging with Digital Fringe Projection Techniques; CRC Press: Boca Raton, 2016. [Google Scholar]
  49. Zuo, C.; Feng, S.; Huang, L.; Tao, T.; Yin, W.; Chen, Q. Phase shifting algorithms for fringe projection profilometry: A review. Optics and Lasers in Engineering 2018, 109, 23–59. [Google Scholar] [CrossRef]
  50. Bruning, J.H.; Herriott, D.R.; Gallagher, J.E.; Rosenfeld, D.P.; White, A.D.; Brangaccio, D.J. Digital Wavefront Measuring Interferometer for Testing Optical Surfaces and Lenses. Appl. Opt. 1974, 13, 2693–2703. [Google Scholar] [CrossRef] [PubMed]
  51. Zhang, Z. A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence 2000, 22, 1330–1334. [Google Scholar] [CrossRef]
  52. Juarez-Salazar, R.; Esquivel-Hernandez, S.; Diaz-Ramirez, V.H. Are camera, projector, and camera-projector calibrations different? Applied Optics 2023, 62, 5999–6006. [Google Scholar] [CrossRef]
  53. Geiger, A.; Moosmann, F.; Car, O.; Schuster, B. Automatic camera and range sensor calibration using a single shot. In Proceedings of the IEEE International Conference on Robotics and Automation; 2012; pp. 3936–3943. [Google Scholar]
  54. Juarez-Salazar, R.; Diaz-Ramirez, V.H. Flexible camera-projector calibration using superposed color checkerboards. Optics and Lasers in Engineering 2019, 120, 59–65. [Google Scholar] [CrossRef]
  55. Norouzi, M.; Shirin, M.B. Investigating the precision of 3D scanner systems based on digital fringe projection method for biomedical engineering applications. In Proceedings of the 28th National and 6th International Iranian Conference on Biomedical Engineering, 2021; pp. 58–64. [Google Scholar]
  56. Lorenz, A.L.; Zhang, S. Human Respiration Rate Measurement with High-Speed Digital Fringe Projection Technique. Sensors 2023, 23. [Google Scholar] [CrossRef]
  57. Muyshondt, P.G.; Van der Jeught, S.; Dirckx, J.J. A calibrated 3D dual-barrel otoendoscope based on fringe-projection profilometry. Optics and Lasers in Engineering 2022, 149, 106795. [Google Scholar] [CrossRef]
  58. Bielfeldt, S.; Springmann, G.; Seise, M.; Wilhelm, K.P.; Callaghan, T. An updated review of clinical methods in the assessment of ageing skin – New perspectives and evaluation for claims support. International Journal of Cosmetic Science 2018, 40, 348–355. [Google Scholar] [CrossRef] [PubMed]
  59. Liang, J.; Luo, H.; Yokell, Z.; Nakmali, D.U.; Gan, R.Z.; Lu, H. Characterization of the nonlinear elastic behavior of chinchilla tympanic membrane using micro-fringe projection. Hearing Research 2016, 339, 1–11. [Google Scholar] [CrossRef]
  60. Price, G.J.; Parkhurst, J.M.; Sharrock, P.J.; Moore, C.J. Real-time optical measurement of the dynamic body surface for use in guided radiotherapy. Physics in Medicine & Biology 2011, 57, 415. [Google Scholar]
  61. Le, H.N.D.; Nguyen, H.; Wang, Z.; Opfermann, J.; Leonard, S.; Krieger, A.; Kang, J.U. Demonstration of a laparoscopic structured-illumination three-dimensional imaging system for guiding reconstructive bowel anastomosis. Journal of Biomedical Optics 2018, 23, 056009. [Google Scholar] [CrossRef] [PubMed]
  62. Genovese, K.; Humphrey, J.D. Multimodal optical measurement in vitro of surface deformations and wall thickness of the pressurized aortic arch. Journal of Biomedical Optics 2015, 20, 046005. [Google Scholar] [CrossRef] [PubMed]
  63. Chen, S. Intraoral 3-D Measurement by Means of Group Coding Combined With Consistent Enhancement for Fringe Projection Pattern. IEEE Transactions on Instrumentation and Measurement 2022, 71, 1–12. [Google Scholar] [CrossRef]
  64. Hu, Y.; Chen, Q.; Feng, S.; Zuo, C. Microscopic fringe projection profilometry: A review. Optics and Lasers in Engineering 2020, 135, 106192. [Google Scholar] [CrossRef]
  65. Qian, J.; Feng, S.; Xu, M.; Tao, T.; Shang, Y.; Chen, Q.; Zuo, C. High-resolution real-time 360° 3D surface defect inspection with fringe projection profilometry. Optics and Lasers in Engineering 2021, 137, 106382. [Google Scholar] [CrossRef]
  66. Casavola, C.; Pappalardi, P.; Pappalettera, G.; Renna, G. A Fringe Projection Based Approach for Corrosion Monitoring in Metals. Experimental Techniques 2018, 42, 291–297. [Google Scholar] [CrossRef]
  67. Ma, H.; Wang, J.; Shao, M. Crack recognition approach assisted by three-dimensional measurement technique. Journal of Optics 2024, 53, 4981–4987. [Google Scholar] [CrossRef]
  68. Li, B.; Xu, Z.; Gao, F.; Cao, Y.; Dong, Q. 3D Reconstruction of High Reflective Welding Surface Based on Binocular Structured Light Stereo Vision. Machines 2022, 10. [Google Scholar] [CrossRef]
  69. Schlobohm, J.; Bruchwald, O.; Frąckowiak, W.; Li, Y.; Kästner, M.; Pösch, A.; Reimche, W.; Maier, H.J.; Reithmeier, E. Advanced Characterization Techniques for Turbine Blade Wear and Damage. Procedia CIRP 2017, 59, 83–88. [Google Scholar] [CrossRef]
  70. Xia, R.; Zhao, J.; Zhang, T.; Su, R.; Chen, Y.; Fu, S. Detection method of manufacturing defects on aircraft surface based on fringe projection. Optik 2020, 208, 164332. [Google Scholar] [CrossRef]
  71. Ou, P.; Li, B.; Wang, Y.; Zhang, S. Flexible real-time natural 2D color and 3D shape measurement. Opt. Express 2013, 21, 16736–16741. [Google Scholar] [CrossRef]
  72. Guo, H.; Huang, P.S. Face recognition based on fringe pattern analysis. Optical Engineering 2010, 49, 037201. [Google Scholar] [CrossRef]
  73. Zhang, Z.; Huang, S.; Xu, Y.; Chen, C.; Zhao, Y.; Gao, N.; Xiao, Y. 3D palmprint and hand imaging system based on full-field composite color sinusoidal fringe projection technique. Appl. Opt. 2013, 52, 6138–6145. [Google Scholar] [CrossRef] [PubMed]
  74. Bai, X.; Gao, N.; Zhang, Z.; Zhang, D. Person Recognition Using 3-D Palmprint Data Based on Full-Field Sinusoidal Fringe Projection. IEEE Transactions on Instrumentation and Measurement 2019, 68, 3287–3298. [Google Scholar] [CrossRef]
  75. Huang, S.; Zhang, Z.; Zhao, Y.; Dai, J.; Chen, C.; Xu, Y.; Zhang, E.; Xie, L. 3D fingerprint imaging system based on full-field fringe projection profilometry. Optics and Lasers in Engineering 2014, 52, 123–130. [Google Scholar] [CrossRef]
  76. Chatterjee, A.; Singh, P.; Bhatia, V.; Prakash, S. Ear biometrics recognition using laser biospeckled fringe projection profilometry. Optics & Laser Technology 2019, 112, 368–378. [Google Scholar]
  77. Zhang, H.; Lee, S. Robot Bionic Vision Technologies: A Review. Applied Sciences 2022, 12. [Google Scholar] [CrossRef]
  78. de Jesus Ortiz-Gonzalez, A.; Martinez-Garcia, A.; Pascual-Francisco, J.B.; Rayas-Alvarez, J.A.; de Jesus Flores-Garcia, A. 3D shape and strain measurement of a thin-walled elastic cylinder using fringe projection profilometry. Appl. Opt. 2021, 60, 1349–1356. [Google Scholar] [CrossRef]
  79. Balasubramaniam, B.; Li, J.; Liu, L.; Li, B. 3D Imaging with Fringe Projection for Food and Agricultural Applications—A Tutorial. Electronics 2023, 12. [Google Scholar] [CrossRef]
  80. Berkson, J.; Hyatt, J.; Julicher, N.; Jeong, B.; Pimienta, I.; Ball, R.; Ellis, W.; Voris, J.; Torres-Barajas, D.; Kim, D. Systematic Radio Telescope Alignment Using Portable Fringe Projection Profilometry. Nanomanufacturing and Metrology 2024, 7, 6. [Google Scholar] [CrossRef]
  81. Liberadzki, P.; Adamczyk, M.; Witkowski, M.; Sitnik, R. Structured-Light-Based System for Shape Measurement of the Human Body in Motion. Sensors 2018, 18. [Google Scholar] [CrossRef] [PubMed]
  82. Heist, S.; Lutzke, P.; Schmidt, I.; Dietrich, P.; Kühmstedt, P.; Tünnermann, A.; Notni, G. High-speed three-dimensional shape measurement using GOBO projection. Optics and Lasers in Engineering 2016, 87, 90–96. [Google Scholar] [CrossRef]
  83. Heist, S.; Dietrich, P.; Landmann, M.; Kühmstedt, P.; Notni, G.; Tünnermann, A. GOBO projection for 3D measurements at highest frame rates: a performance analysis. Light: Science & Applications 2018, 7, 71. [Google Scholar]
  84. Spagnolo, G.; Ambrosini, D.; Paoletti, D. Low-cost optoelectronic system for three-dimensional artwork texture measurement. IEEE Transactions on Image Processing 2004, 13, 390–396. [Google Scholar] [CrossRef] [PubMed]
  85. Sansoni, G.; Docchio, F. 3-D optical measurements in the field of cultural heritage: the case of the Vittoria Alata of Brescia. IEEE Transactions on Instrumentation and Measurement 2005, 54, 359–368. [Google Scholar] [CrossRef]
  86. Burke, J.; Bothe, T.; Osten, W.; Hess, C.F. Reverse engineering by fringe projection. In Proceedings of the Interferometry XI: Applications; Osten, W., Ed.; 2002; Volume 4778, pp. 312–324. [Google Scholar]
  87. Hu, C.; Wang, Y.; Xia, G.; Han, Y.; Ma, X.; Jing, G. The generation method of orthophoto expansion map of arched dome mural based on three-dimensional fine color model. Heritage Science 2024, 12, 408. [Google Scholar] [CrossRef]
Figure 1. (a) Camera pinhole model with intrinsic parameters given by the upper-triangular matrix K and the extrinsic parameters (pose) consisting of the rotation matrix R and the translation vector t . (b) Every image pixel is associated with a light ray from the 3D space with a unique direction.
Figure 3. Cameras and projectors differ only in that light rays travel in opposite directions. If the sign of the direction vectors is omitted, cameras and projectors are mathematically equivalent (referred to generically as devices).
Figure 4. Camera-projector systems lack the corresponding point problem because μ 1 is directly identified as the image’s bright pixel, and μ 2 is known from the slide design.
Figure 7. (a) Phase ϕ ( r , s ) demodulated from the fringe patterns shown in Figure 6. (b) 3D plot of the phase shown in (a). It is worth emphasizing that the phase provides the projector slide coordinates, and the direct association with the object profile must be avoided.
Figure 9. (a) Phase obtained using only the lowest and highest wrapped phases (b) and (e). The phase artifacts are avoided by including the wrapped phases (c) and (d) with intermediate frequencies. (f)-(i) Unwrapped phases obtained recursively by the hierarchical multi-frequency method.
Figure 10. (a) Camera calibration using a single image of a 3D target. (b) Camera calibration using multiple images of a 2D target captured from different viewpoints. (c) A set of 3D input-output pairs is processed to estimate C and extract the camera parameters by a matrix decomposition. (d) Multiple sets of 2D input-output pairs are processed for homography estimation and further extraction of the camera parameters by exploiting the shape of K and the orthogonality of rotation matrices.
Figure 11. (a) Projector calibration by displaying a 2D target on the reference plane covered with grid-ruled paper for manual feature point coordinate measurement. (b) Yellow target on the reference plane for camera calibration. (c) Camera-projector calibration by displaying a cyan 2D target on the reference yellow target. (d) Color image captured by the camera. The captured image provides the displayed and reference calibration targets through its red (e) and blue (f) channels.
Figure 12. (a) Calibrated camera-projector system and an object on the reference plane. (b) Metric object reconstruction using fringe projection profilometry.
Figure 13. Reconstruction of a dental teeth model using the calibrated camera-projector system shown in Figure 12. (a)-(d) Fringe patterns of four gratings with eight phase shifts encoding the projector u-axis. (e)-(h) Fringe patterns encoding the projector v-axis. (i) 3D object reconstruction.
Figure 14. (a) Visible-thermal fringe projection profilometry reconstructing a remote controller with a heated battery compartment simulating a failure [18]. (b) and (c) Two of forty-eight fringe patterns used for 3D reconstruction. (d) and (e) Visible and thermal images to provide multimodal texture on the object surface. (f) Multimodal (visible-thermal) 3D object reconstruction.