Three-dimensional shape measurement of object in water using fringe projection and phase value tracking Qican Zhang∗, Qingfeng Wang, Zhiling Hou, Yuankun Liu, Xianyu Su (Department of Opto-Electronics, Sichuan University, Chengdu Sichuan 610065, China) ABSTRACT In this paper, a practical method combining a phase tracking algorithm and a ray tracing algorithm is proposed for measuring the 3D shape of an underwater object. A sinusoidal fringe is projected through the water so that it is modulated by the 3D shape of the tested object. Firstly, according to the binocular stereovision principle and regarding the projector as a special camera, the phase tracking algorithm identifies the homologous points between the phase distribution of the deformed fringe captured by the camera and that of the fringe pattern projected by the projector; this determines the correspondence between the two coordinate systems. Secondly, the ray tracing algorithm traces the propagation path of each ray and determines the 3D coordinates of each point on the tested object's surface. Finally, the whole shape of the tested object is reconstructed. Requiring no scanning device, the method proposed in this paper speeds up the measurement process and reduces the cost. Actual measurements of a model and a hand in water were performed, and the quality of the reconstructed shapes demonstrates the correctness and feasibility of this method. Keywords: 3D measurement; fringe projection; object in water; phase tracking; ray tracing.
1. Introduction Active optical 3D shape measurement, which has been widely used in machine vision, industrial monitoring, biomedicine, antique protection, virtual reality, etc., is preferred as a non-contact and nondestructive method. In some special measurement tasks, such as measuring the shape of organs preserved in formalin, or of cultural relics stored in a particular liquid, the shape of the object must be measured while the given liquid remains present. At the interfaces between different media, the rays are bent, i.e., refracted, so traditional measurement algorithms generate considerable errors. For example, when phase-measuring profilometry1 is used, precise results can be expected under normal conditions (without liquid), but the reconstructed shape will differ greatly from the real object, because the direction of each ray changes every time it passes between media with different refractive indices. To obtain correct results, Atsushi Yamashita2~3, Ding Wanshan4, et al. proposed ray tracing algorithms based on a laser dot or line for measuring the 3D shape of an object under water. Those algorithms resolve the distortion problem; however, a 2D or 1D scan is needed when a laser dot or line is used, which increases the complexity and cost of the devices. In this paper, a digital sinusoidal fringe is introduced to record and transmit the 3D shape information of a tested object under water, and the surface data are then obtained by a simple calculation process.
2. Measurement Principle 2.1 Brief introduction According to the binocular stereovision principle, a spatial position can be determined from its coordinates in two cameras at different viewing angles. If one of the cameras is replaced by a projector, the system becomes a traditional active optical 3D shape measurement system. The algorithm proposed here adopts this principle. A digital projector projects a sinusoidal fringe onto the object, and a camera records the distorted fringe modulated by the shape of the tested object. Employing the phase relation between the projected fringe and the distorted fringe, the correspondence between point coordinates in the projector coordinate system and in the camera coordinate system is established. Then, using the tracing algorithm to follow each ray's propagation path, the 3D coordinates of each point can be worked out, and thus the whole shape of the object. ∗
[email protected]; phone 86 28 85463879; fax 86 28 85464568. Fourth International Conference on Experimental Mechanics, edited by C. Quan, K. Qian, A. Asundi, F. S. Chau, Proc. of SPIE Vol. 7522, 752258 · © 2010 SPIE · CCC code: 0277-786X/10/$18 · doi: 10.1117/12.851571
Proc. of SPIE Vol. 7522 752258-1 Downloaded from SPIE Digital Library on 12 Jun 2010 to 58.251.166.14. Terms of Use: http://spiedl.org/terms
2.2 Measurement principle As Fig. 1 shows, the measurement system consists of a digital projector and a camera placed in air; the tested object is placed under water, with a refracting surface between the two media. Although, in fact, the light emitted from the projector is reflected by the object and ends at the CCD lens of the camera, by the principle of path reversibility both the CCD camera and the projector can be regarded as emitting light. The intersection point of the two rays is the point to be measured. Take the CCD optical center O as the origin of the world coordinate system, with the Z axis along the optical axis; a right-handed coordinate system is used, as shown. Consider one ray of light, drawn in Fig. 1, emitted from the projector and ending at the CCD camera. K(xg,yg,zg) is the intersection of the ray from the CCD with the refracting plane, and M(xg′,yg′,zg′) is the intersection of the ray from the projector with the refracting plane. O′(x0,y0,z0) is the optical center of the projector, expressed in the world coordinate system whose origin is the optical center of the camera lens. Q(u0,v0) is the image point of the ray arriving at the CCD. The angles of incidence and refraction at the refracting surface for the camera ray are i and r, respectively; for the projector ray they are i′ and r′, respectively.
Fig. 1 The principle of 3D shape measurement for an object in water
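In the geometry of Fig. 1, the points K and M are ordinary ray–plane intersections between the lines of sight and the interface (λ,μ,ν)T·X = h. A minimal sketch in Python (function and variable names are illustrative, not from the paper):

```python
import numpy as np

def ray_plane_intersection(origin, direction, normal, h):
    """Intersect the ray origin + t*direction with the plane normal . X = h.

    Assumes the ray is not parallel to the plane (normal . direction != 0).
    """
    direction = direction / np.linalg.norm(direction)
    t = (h - np.dot(normal, origin)) / np.dot(normal, direction)
    return origin + t * direction

# A camera ray from the world origin O hitting a horizontal interface at depth h = 2
K = ray_plane_intersection(np.zeros(3),
                           np.array([0.6, 0.0, 0.8]),   # unit sight direction
                           np.array([0.0, 0.0, 1.0]),   # interface normal (lambda, mu, nu)
                           2.0)                          # interface distance h
```

The same call with the projector's optical center O′ and its sight direction gives M.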
(λ,μ,ν)T is the unit normal vector of the refracting interface, and h is the distance from the origin of the world coordinate system to the interface between the media. (α,β,γ)T and (αg,βg,γg)T are the unit direction vectors of the line of sight before and after refraction, respectively. Parameters such as (λ,μ,ν)T and h are easily worked out once the test system has been calibrated; the surface of the object can then be obtained by the ray tracing algorithm. 2.3 Phase tracking In the method proposed in this paper, the phase tracking algorithm identifies the correspondence between the phase distribution of the deformed fringe captured by the camera and that of the fringe pattern projected by the projector. Thus the homologous points, (u0,v0) in the projector pixel coordinate system and (u,v) in the camera pixel coordinate system, are identified as well. The sinusoidal fringe projected by the projector is expressed as: I0(u0,v0) = a0(u0,v0) + b0(u0,v0)sinΦ0(u0,v0) (1) where b0(u0,v0)/a0(u0,v0) is the contrast of the fringes and Φ0(u0,v0) is the phase distribution of the projected fringe. After being distorted by the object's surface, the fringe captured by the camera can be expressed as: I(u,v) = r(u,v)[a(u,v) + b(u,v)sinΦ(u,v)] (2) where r(u,v) is the reflectivity distribution of the object surface, b(u,v)/a(u,v) is the contrast of the distorted fringes, and Φ(u,v) is the value corresponding to Φ0(u0,v0). Four-step phase shifting5-7 is adopted here to calculate Φ(u,v), the phase distribution at point (u,v) in the distorted fringe. According to the correspondence Φ(u,v) = Φ0(u0,v0), the point pairs (u0,v0) and (u,v) can be matched. The spatial coordinates (xp,yp,zp) can then be determined by the ray tracing algorithm, and from them the 3D surface of the object.
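The four-step phase shifting mentioned above recovers the wrapped phase from four fringe images shifted by π/2; the background a(u,v) and modulation b(u,v) cancel in the differences. A hedged sketch in Python (names are illustrative; the paper does not give its implementation):

```python
import numpy as np

def four_step_phase(i0, i1, i2, i3):
    """Wrapped phase from four fringe images I_k = a + b*sin(phi + k*pi/2).

    i0 - i2 = 2b*sin(phi) and i1 - i3 = 2b*cos(phi), so a and b drop out.
    """
    return np.arctan2(i0 - i2, i1 - i3)

# Synthetic check: recover a known phase value from four shifted intensities
phi_true = 0.7
a, b = 5.0, 2.0
frames = [a + b * np.sin(phi_true + k * np.pi / 2) for k in range(4)]
phi = four_step_phase(*frames)
```

Applied pixel-wise to full images, this yields Φ(u,v) wrapped to (−π, π]; phase unwrapping5-7 then gives the continuous distribution used for point matching.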
2.4 Ray tracing principle The Heikkilä distortion model8 is introduced into this work. Using the law of refraction and the related mathematical equations, together with the parameters (λ,μ,ν)T, (α,β,γ)T, (αg,βg,γg)T, (α′,β′,γ′)T and (αg′,βg′,γg′)T, the direction vectors of the refracted camera ray and projector ray can be expressed as:
(αg, βg, γg)T = (n1/n2)(α, β, γ)T + [cos r − (n1/n2)cos i](λ, μ, ν)T (3)
(αg′, βg′, γg′)T = (n1/n2)(α′, β′, γ′)T + [cos r′ − (n1/n2)cos i′](λ, μ, ν)T (4)
where n1 is the refractive index of air and n2 that of water. The straight-line equations of the rays from the CCD and the projector after they pass through the interface can then be written from the two equations above. Tracing the camera ray and the projector ray separately, the intersection point of the two rays is, in theory, exactly the point on the object's surface. In practice, however, two such rays rarely intersect, so for general use the midpoint of the common perpendicular of the two skew lines is taken as that point: the point P, which is the midpoint between P1 and P2, as shown in Fig. 2:
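Equations (3) and (4) are the vector form of Snell's law, written with the interface normal oriented along the direction of propagation so that cos i = (α,β,γ)T·(λ,μ,ν)T. A sketch in Python under that convention (illustrative names; total internal reflection is not handled):

```python
import numpy as np

def refract(d, normal, n1, n2):
    """Vector-form Snell refraction in the style of Eqs. (3) and (4).

    normal points toward the side the ray travels into, so cos(i) = d . normal;
    cos(r) follows from sin(r) = (n1/n2)*sin(i). Assumes no total internal
    reflection.
    """
    d = d / np.linalg.norm(d)
    cos_i = np.dot(d, normal)
    cos_r = np.sqrt(1.0 - (n1 / n2) ** 2 * (1.0 - cos_i ** 2))
    return (n1 / n2) * d + (cos_r - (n1 / n2) * cos_i) * normal

# 45-degree incidence from air (n1 = 1.0) into water (n2 = 1.33)
t = refract(np.array([np.sin(np.pi / 4), 0.0, np.cos(np.pi / 4)]),
            np.array([0.0, 0.0, 1.0]), 1.0, 1.33)
```

The tangential component of the result equals (n1/n2)·sin i, which is exactly Snell's law, and the output is again a unit vector.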
(xp, yp, zp)T = ½ (x1 + x2, y1 + y2, z1 + z2)T (5)
The straight-line equations of the two rays are obtained as described above; the point on the object's surface is then identified from the feet of the common perpendicular, P1(x1,y1,z1) and P2(x2,y2,z2), of the two rays.
Fig. 2 Determination of the 3D position of point P
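The feet of the common perpendicular used in Eq. (5) can be found in closed form from the two orthogonality conditions. A minimal sketch in Python (illustrative names, assuming the rays are not parallel):

```python
import numpy as np

def midpoint_of_skew_rays(a1, u, a2, v):
    """Midpoint P of the common perpendicular between lines a1 + s*u and a2 + t*v.

    Solves (w0 + s*u - t*v).u = 0 and (w0 + s*u - t*v).v = 0 for s and t;
    assumes the lines are not parallel (u x v != 0).
    """
    w0 = a1 - a2
    A, B, C = np.dot(u, u), np.dot(u, v), np.dot(v, v)
    D, E = np.dot(u, w0), np.dot(v, w0)
    denom = A * C - B * B
    s = (B * E - C * D) / denom
    t = (A * E - B * D) / denom
    p1 = a1 + s * u   # foot P1 on the camera ray
    p2 = a2 + t * v   # foot P2 on the projector ray
    return 0.5 * (p1 + p2)

# Two skew rays; their common perpendicular has midpoint (0, 0.5, 0)
P = midpoint_of_skew_rays(np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]),
                          np.array([0.0, 1.0, 1.0]), np.array([0.0, 0.0, 1.0]))
```

The distance |P1 − P2| also serves as a useful per-point consistency check on the calibration.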
3. Experimental Results and Analysis 3.1 Practical measurement To verify the method proposed in this paper, practical measurements were performed. A Canon E503 SLR camera and a Sanyo LCD projector were used. The refractive index of air is set to 1, and that of water to 1.33. The measured objects are a semi-cylinder and a human hand, dipped into a cuboid container filled with water. In the measurement, the ray that illuminates the objects must pass through the side window of the container, i.e., it passes through three media before reaching the objects. The thickness of the glass is 0.5 cm and its refractive index is 1.5. The ray tracing principle discussed in section 2.4 successfully handles this three-media condition. 3.2 System calibration First of all, the projector and the camera must be calibrated10-12. Considering lens distortion, we improved the Matlab toolbox introduced by Bouguet for the calibration in this paper. It adopts the Heikkilä distortion model8, which mainly accounts for radial and eccentric distortion. All the intrinsic parameters of the camera and the projector, together with the position parameters12, are calibrated with this improved tool. Then, an image with a regular distribution of cross marks is projected onto the media surface to calibrate the plane equation of the media surface. After the system calibration, the centers of all the marks, determined by the binocular vision principle, are fitted to the media-surface plane using the least-squares method. 3.3 Measurement and results analysis A sinusoidal fringe is projected onto the surface of the tested object, as shown in Fig. 3, and the distorted image is grabbed by the digital camera. The object's arris is 68.00 mm long and its diameter is 68.55 mm. The projected image is 768×1024 pixels (16 pixels per period); the grabbed image is 1152×1728 pixels. For a more credible proof, the object was measured three times under different conditions, with and without water, and the shapes of the same tested object were reconstructed by different algorithms: the binocular vision algorithm and the ray tracing algorithm. Fig. 4 gives the 3D shape of the tested object without water restored by the binocular vision algorithm; Fig. 5 shows the 3D shape of the tested object without water restored by the ray tracing algorithm.
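The least-squares plane fit described in section 3.2 can be sketched as an SVD fit of the recovered mark centers; a hedged Python example (illustrative names, not the authors' code):

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane normal . X = h through a cloud of 3D points.

    The unit normal is the right singular vector of the centered cloud
    with the smallest singular value; h is the signed distance from the origin.
    """
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return normal, float(np.dot(normal, centroid))

# Noise-free marker centers lying on the plane z = 2
pts = np.array([[0.0, 0.0, 2.0], [1.0, 0.0, 2.0],
                [0.0, 1.0, 2.0], [1.0, 2.0, 2.0]])
normal, h = fit_plane(pts)
```

The sign of the normal (and hence of h) is arbitrary; in the measurement system it is chosen to point along the propagation direction, consistent with Eqs. (3) and (4).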
Fig. 3 The measured object: (a) the captured image with fringe, (b) the captured image without fringe
Fig. 4 3D restored shape of the object without water by the binocular vision algorithm: (a) the elevation, (b) the lateral view
Fig. 5 3D restored shape of the object without water by the ray tracing algorithm: (a) the elevation, (b) the lateral view
Fig. 6 3D restored shape of the object in water by the ray tracing algorithm: (a) the elevation, (b) the lateral view
Fig. 7 Comparison of the reconstructed profiles in different scenes
After these two measurements, water is poured into the container and the tested object is measured again using the proposed method. Fig. 6 shows the 3D shape of the tested object in water restored by the ray tracing algorithm. Fig. 7 compares the three reconstructed profiles in the different scenes. For the three conditions above, the diameter and the arris of the semi-cylinder were each measured five times; the results are shown in Table 1. Considering the edge effects caused by the image processing, and compared with the real values, the algorithm proposed here yields a well-reconstructed shape. Table 1 The measured results
Unit: mm
Condition                               Quantity          1       2       3       4       5       Average
Binocular stereo vision without water   Diameter          67.283  68.201  67.979  67.822  68.016  67.860
                                        Length of arris   67.112  67.532  66.808  67.055  67.511  67.204
Ray tracing without water               Diameter          67.013  67.788  67.176  67.986  67.868  67.566
                                        Length of arris   66.494  67.421  66.456  66.435  66.531  66.667
Ray tracing with water                  Diameter          67.977  67.109  67.964  67.787  67.658  67.699
                                        Length of arris   67.322  66.499  66.925  67.100  66.992  66.968
In addition, a second experiment was carried out to measure a human hand in water. Fig. 8 shows the reconstructed hand shape.
Fig. 8 The 3D shape of the hand in water calculated by ray tracing: (a) the measured hand, (b) the projected fringe, (c) the elevation, (d) the lateral view
4. Conclusion In this paper, a new algorithm based on phase tracking and ray tracing has been proposed to measure the shape of an underwater object. It speeds up the measurement and reduces its cost. The proposed algorithm has been evaluated on real objects, and the results show that it is feasible. Although the experiment in this paper uses a digital projector to project the fringe pattern, the same goal can be achieved with a homogeneous laser beam transmitted through a grating, in order to obtain a single refractive index and a higher transmittance. To eliminate the influence of waves and flow in the water, a simulation of this condition will be considered in future research. In addition, air bubbles, especially those adhering to the edges of the tested object, should be removed before measuring. Requiring no scanning device, the algorithm proposed here reduces the cost and speeds up the measurement process. Moreover, with only a few improvements, this system will be able to measure the surface of slowly moving underwater objects. The algorithm has bright prospects in medical research and teaching, and in the digital archiving of underwater relics.
Acknowledgments This project was supported by the National Natural Science Foundation of China (No. 60807006).
REFERENCES
[1] Li Wansong, Su Xianyu, Liu Zhongbao, "Large-scale three-dimensional object measurement: a practical coordinate mapping and image data-patching method," Applied Optics, 40(20), 3326-3333 (2001).
[2] Atsushi Yamashita, Hirokazu Higuchi, Toru Kaneko, "Three dimensional measurement of object's surface in water using the light stripe projection method," IEEE International Conference on Robotics & Automation, 2736-2741 (2004).
[3] Atsushi Yamashita, Etsukazu Hayashimoto, Toru Kaneko, "3-D measurement of objects in a cylindrical glass water tank with a laser range finder," IEEE Conference on Intelligent Robots and Systems, 1578-1583 (2003).
[4] Ding Wanshang, Liu Yan, "Optical measurement of object's surface three-dimensional shape in water," Acta Optica Sinica, 27(1), 58-62 (2007) (in Chinese).
[5] Anand Asundi, Zhou Wensen, "Fast phase-unwrapping algorithm based on a gray-scale mask and flood fill," Applied Optics, 37(23), 5417-5420 (1998).
[6] Kang Xin, He Xiaoyuan, Quan C, "3-D sensing using sinusoidal fringe projection and phase unwrapping," Acta Optica Sinica, 21(12), 1444-1447 (2001) (in Chinese).
[7] Janne Heikkilä, Olli Silvén, "A four-step camera calibration procedure with implicit image correction," IEEE Computer Vision and Pattern Recognition, 1106-1112 (1997).
[8] Ma Songde, Zhang Zhengyou, [Computer Vision], Beijing: Science Press, 52-93 (1999) (in Chinese).
[9] Min Zhu, Yunjian Ge, Shangfeng Huang, "Stereo vision rectification based on epipolar lines match and three variables projective matrix," IEEE International Conference on Integration, 133-138 (2007).
[10] Liu Yuankun, Su Xianyu, "New camera calibration technique based on phase measurement," Opto-Electronic Engineering, 34(11), 65-69 (2007) (in Chinese).