Automatic focusing in digital holography and its application to stretched holograms

P. Memmolo,1,2,* C. Distante,3 M. Paturzo,1 A. Finizio,1 P. Ferraro,1 and B. Javidi4

1CNR-Istituto Nazionale di Ottica, via Campi Flegrei 34, 80078 Pozzuoli (NA), Italy
2DIBET, Università degli Studi di Napoli “Federico II”, via Claudio 21, 80125 Napoli, Italy
3CNR-Istituto Nazionale di Ottica, viale della Libertà 3, 73010 Arnesano (LE), Italy
4Electrical and Computer Engineering Department, University of Connecticut, U-2157 Storrs, Connecticut 06269-2157, USA
*Corresponding author: [email protected]

Received January 19, 2011; revised March 23, 2011; accepted April 16, 2011; posted April 18, 2011 (Doc. ID 141393); published May 13, 2011

Searching for and recovering the correct reconstruction distance in digital holography (DH) can be a cumbersome and subjective procedure. Here we report on an algorithm for automatically estimating the in-focus image and recovering the correct reconstruction distance for speckle holograms. We have tested the approach in determining the reconstruction distances of stretched digital holograms. Stretching a hologram with a variable elongation parameter makes it possible to change the in-focus distance of the reconstructed image. In this way, the proposed algorithm can be verified at different distances while dispensing with the recording of different holograms. Experimental results are shown with the aim of demonstrating the usefulness of the proposed method, and a comparative analysis has been performed with respect to other existing algorithms developed for DH. © 2011 Optical Society of America

OCIS codes: 090.1995, 100.2000.
In the last decade digital holography (DH) has been applied in many fields of science and technology as an inspection and characterization tool, e.g., for testing microelectromechanical systems microstructures [1] and liquid lens arrays [2], for tracking bubbles in liquids [3], for analysis in microfluidic systems [4], for cell analysis [5], and for the display of three-dimensional dynamic scenes with real-world objects [6]. One of the most attractive features of DH is its intrinsic flexibility due to its numerical focusing capability, since the focus can be found and adjusted a posteriori. However, searching for and recovering the correct image plane can be cumbersome and time-consuming for dynamic measurements or scenes in which hundreds of holograms are recorded and in which the focus can change in each holographic exposure. Furthermore, the evaluation of the correct in-focus reconstruction distance is subjective, as it is usually judged by the observer.

Different strategies are necessary to detect the correct focal plane according to the kind of object under investigation and the adopted configuration, e.g., pure phase objects [7], in-line holography [8], scanning holography [9], or the detection of the depth of objects in multiple planes [10]. Moreover, an algorithm that maximizes a sharpness metric related to the sparsity of the signal's expansion in distance-dependent wavelet-like Fresnelet bases has been developed for DH [11]. The final aim of most of these research activities has been to find an automatic way of obtaining the most focused image, called autofocusing. Autofocus approaches have been developed in optical scanning holography [12] as well as in DH phase-contrast microscopy on pure phase objects for live-cell imaging [5,7]. Several algorithms have been adopted to estimate the in-focus distance [12–18]. They exploit cumulated edge detection to quantify image sharpness: to this aim, the total sum of the gradient [14,16], the Laplacian [14,15], or the variance [14,17] of the gray-value image distribution is calculated.
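For reference, these three classical sharpness metrics can be sketched in a few lines of NumPy; the exact normalizations and discretizations used in [12–18] may differ, so the following is only an illustrative approximation:

```python
import numpy as np

def gradient_metric(img):
    """Total sum of the squared gray-level gradient."""
    gy, gx = np.gradient(img.astype(float))
    return np.sum(gx**2 + gy**2)

def laplacian_metric(img):
    """Total sum of the squared discrete Laplacian response."""
    f = img.astype(float)
    lap = (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
           np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4.0 * f)
    return np.sum(lap**2)

def variance_metric(img):
    """Variance of the gray-value distribution."""
    return np.var(img.astype(float))
```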
In contrast, threshold-based algorithms [14,18] have been found not to be applicable to autofocusing in DH microscopy, as the peak position of the focus function and its width depend on the threshold value, which inhibits robust focusing.

In this Letter, we propose an autofocusing approach for speckle holograms, i.e., for highly scattering objects. The proposed estimation algorithm is based on the contrast texture measure model [19]. We compare the proposed algorithm to those of [12–18] and demonstrate that the algorithm presented here works properly for speckle holograms and has some advantages with respect to the existing methods adopted up to now to estimate the in-focus distance in DH.

In order to detect the degree of focus/defocus of a hologram reconstruction, we adopt the contrast texture measure commonly used in content-based image retrieval and remote sensing [20]. The contrast is a texture measure that describes the gray-level variation and its distribution toward the bright and dark values of an image I. As a measure of contrast, we use an approximation of the Tamura coefficient [21],

C = \sqrt{\sigma(I) / \langle I \rangle},    (1)

where σ(I) and ⟨I⟩ represent the image gray-level standard deviation and mean, respectively.

The algorithm that recognizes the in-focus distance for the digital reconstruction of a generic hologram of a real-world object recorded at distance d is composed of three steps. In the first step, a range of reconstruction distances [d_a, d_b] is defined and the hologram is reconstructed at each distance in this range. The second step consists of computing the Fourier transform of the hologram, defining a spatial binary mask that selects one of the two diffraction orders (the hologram has been recorded in an off-axis configuration), and then computing the coefficient of Eq. (1) on each filtered reconstruction. The third step selects, within the range, the reconstruction distance that maximizes the coefficient as the in-focus distance.
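A minimal NumPy sketch of this three-step procedure is given below; the helper names, the single-FFT Fresnel kernel, and the way the binary mask is applied are our own illustrative assumptions and do not reproduce the authors' implementation:

```python
import numpy as np

def tamura_coefficient(field):
    """Contrast measure of Eq. (1), computed on the reconstruction amplitude."""
    amp = np.abs(field).astype(float)
    return np.sqrt(np.std(amp) / np.mean(amp))

def fresnel_reconstruct(field, d, wavelength, dx):
    """Single-FFT Fresnel propagation to distance d (constant factors omitted)."""
    ny, nx = field.shape
    x = (np.arange(nx) - nx / 2.0) * dx
    y = (np.arange(ny) - ny / 2.0) * dx
    X, Y = np.meshgrid(x, y)
    chirp = np.exp(1j * np.pi * (X**2 + Y**2) / (wavelength * d))
    return np.fft.fftshift(np.fft.fft2(field * chirp))

def autofocus(hologram, distances, wavelength, dx, mask):
    """Three-step search: select one diffraction order with a binary mask in the
    Fourier domain, reconstruct over the distance range, and return the distance
    that maximizes the Tamura coefficient."""
    spectrum = np.fft.fftshift(np.fft.fft2(hologram))
    filtered = np.fft.ifft2(np.fft.ifftshift(spectrum * mask))
    scores = np.array([tamura_coefficient(fresnel_reconstruct(filtered, d, wavelength, dx))
                       for d in distances])
    return distances[int(np.argmax(scores))], scores
```

For instance, for the experiments described next, `distances` would be `np.arange(400, 1401, 1)`, with all lengths expressed in the same unit (e.g., millimeters).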
With the aim of testing the method, two digital holograms of 1024 × 1024 pixels of the same object were recorded at wavelength λ = 532 nm using a CCD camera with a pixel size of 4.4 μm × 4.4 μm. The two holograms were recorded with the object at two different distances, d1 = 900 mm and d2 = 950 mm, respectively. Supposing that we have no a priori information on the recording distance, we choose a distance-searching range of [400, 1400] mm, explored in 1000 steps of 1 mm. The results obtained with the algorithm proposed here were compared with those of the Laplacian, gradient, and variance models [12–18]. Figure 1 shows the comparison between our method and those reported in [12–18]. It is interesting to note that all of the other methods exhibit several local maxima. This results in wrong convergence during the optimization stage when the global maximum is sought with a classical gradient-ascent search. In contrast, the shape of the coefficient of Eq. (1) as a function of the distance d assures that only one maximum exists, so that no mistake can be made in the in-focus distance estimation. In addition, if the searching range is incorrect, the behavior of the coefficient indicates how to shift it: if the coefficient is still increasing at the right extreme, the right extreme must be extended, while if it is already decreasing at the left extreme, the left extreme must be reduced. In particular, without any knowledge of the recording distance, our method recovers good in-focus distances of d1 = 902 mm and d2 = 950 mm, respectively, whereas the other methods give d1 = 887 mm and d2 = 949 mm. Moreover, it is interesting to note that, for the other approaches, the difference between the real and the estimated distance is not the same in the two cases. In fact, in addition to the distance variation, the experimental conditions (i.e., speckle, efficiency, recording noise) differ slightly between the two recordings.
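This range check can be automated in a simple way: if the maximum of the coefficient falls on a boundary of the explored interval, the interval is shifted in that direction before accepting the estimate. A minimal sketch, in which the step size is an arbitrary assumption:

```python
import numpy as np

def adjust_search_range(da, db, scores, step=100.0):
    """Shift the searching range [da, db] when the maximum of the Tamura coefficient
    falls on a boundary: a maximum at the right extreme means the peak lies beyond db,
    one at the left extreme means it lies before da. The step size (mm) is illustrative."""
    scores = np.asarray(scores)
    if np.argmax(scores) == len(scores) - 1:   # coefficient still increasing at db
        db += step
    elif np.argmax(scores) == 0:               # coefficient already decreasing at da
        da -= step
    return da, db
```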
Fig. 1. (Color online) Comparison between different in-focus estimation algorithms; (a) and (b) are digital reconstructions of the holograms at the estimated distance.
Therefore, in order to test the autofocusing algorithms at different distances without changing other parameters, we exploit the numerical technique reported in [22]. This approach allows us to dispense with the recording of many holograms for all the considered distances. In fact, by stretching the same hologram with a variable elongation parameter, it is possible to change only the in-focus distance of the reconstructed image. The stretching technique is based on an affine transformation of the recorded hologram, which produces an elongation of the size of the original hologram [6,22]. This is equivalent to recording the same object while changing only the recording distance. The in-focus distance d before the stretching and the in-focus distance D after the stretching are related by [6,22]

D = \alpha^2 d,    (2)
where α is the elongation factor. The numerical procedure described above for a single hologram is repeated for each stretched hologram with α ∈ [α1, α2]. For this test, we consider another hologram, recorded in the same configuration as the previous ones but at a distance d = 790 mm. We compute the synthetic holograms at different distances by varying α in the range [1, 1.85], corresponding to distances ranging from 790 mm to 2703 mm. The binary mask for each deformed hologram is the one defined for the original hologram, stretched by a factor of 1/α, and the searching range is adapted to [α²d − 500, α²d + 500] mm.
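A possible driver for this test, reusing the hypothetical autofocus() helper sketched earlier, could look as follows; the isotropic rescaling about the array center is only a stand-in for the affine stretching of [22], and the arrays `hologram` and `mask` are assumed to be already available:

```python
import numpy as np
from scipy.ndimage import affine_transform

def rescale_about_center(img, scale, order=1):
    """Resample img onto its own grid, magnifying the content by `scale` about the
    array center (a simple stand-in for the affine stretching of [22])."""
    n = np.array(img.shape, dtype=float)
    center = (n - 1) / 2.0
    matrix = np.eye(2) / scale              # maps output coordinates to input coordinates
    offset = center - matrix @ center
    return affine_transform(img, matrix, offset=offset, order=order)

# Assumes `hologram` (N x N array) and `mask` (N x N boolean array selecting one
# diffraction order) are already defined, and reuses autofocus() from the sketch above.
d0, wavelength, dx = 790.0, 532e-6, 4.4e-3  # all lengths in mm
for alpha in np.linspace(1.0, 1.85, 10):
    stretched = rescale_about_center(hologram, alpha)                       # hologram elongated by alpha
    mask_a = rescale_about_center(mask.astype(float), 1.0 / alpha) > 0.5    # mask scaled by 1/alpha
    d_theory = alpha**2 * d0                                                # Eq. (2)
    distances = np.arange(d_theory - 500.0, d_theory + 500.0, 1.0)
    d_est, _ = autofocus(stretched, distances, wavelength, dx, mask_a)
    print(f"alpha = {alpha:.2f}: theoretical {d_theory:.0f} mm, estimated {d_est:.0f} mm")
```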
Fig. 2. (Color online) Different reconstructions of the same hologram with stretching factors (a) α = 1, (b) α = 1.45, and (c) α = 1.85.
Fig. 3. (Color online) Comparison between the theoretical in-focus distance and the one estimated by the proposed algorithm.
Then we compare the in-focus distance d found for each α with the theoretical behavior given by Eq. (2). Figure 2 shows the numerical reconstructions of the same hologram with stretching factors (a) α = 1.0, (b) α = 1.45, and (c) α = 1.85, which correspond to distances of 790 mm, 1660 mm, and 2703 mm, respectively. The autofocus distances recovered by the proposed method give the correct values of 790 mm, 1660 mm, and 2703 mm. In contrast, the Laplacian, gradient, and variance models give a small estimation error for the case in Fig. 2(b) and a large underestimation error for the case in Fig. 2(c). The holographic images reconstructed at the distances found by the proposed autofocus approach appear in focus, demonstrating that the approach works properly. Finally, to quantify the performance of the proposed algorithm, we show in Fig. 3 the comparison between the theoretical in-focus distance given by Eq. (2) and the in-focus distance estimated by the proposed algorithm. The reliability of the proposed algorithm is demonstrated by the percentage mean square error of the in-focus distance estimation, which is E ≈ 2.5% for these experiments.

In conclusion, we have demonstrated a novel approach for automatic focus searching in DH and tested it on digital holograms affected by speckle. The method has been tested with stretched holograms and compared to a number of autofocusing approaches developed recently for DH. The results show that the proposed method has the intrinsic advantage of finding a single focus value, without ambiguity, in the entire reconstruction volume.
References
1. P. Ferraro, S. De Nicola, A. Finizio, G. Pierattini, and G. Coppola, Appl. Phys. Lett. 85, 2709 (2004).
2. S. Grilli, L. Miccio, V. Vespini, A. Finizio, S. De Nicola, and P. Ferraro, Opt. Express 16, 8084 (2008).
3. L. Tian, N. Loomis, J. A. Domínguez-Caballero, and G. Barbastathis, Appl. Opt. 49, 1549 (2010).
4. W. Bishara, H. Zhu, and A. Ozcan, Opt. Express 18, 27499 (2010).
5. P. Langehanenberg, B. Kemper, D. Dirksen, and G. von Bally, Appl. Opt. 47, D176 (2008).
6. M. Paturzo, P. Memmolo, A. Finizio, R. Näsänen, T. J. Naughton, and P. Ferraro, Opt. Express 18, 8806 (2010).
7. F. Dubois, C. Schockaert, N. Callens, and C. Yourassowsky, Opt. Express 14, 5895 (2006).
8. W. Li, N. C. Loomis, Q. Hu, and C. S. Davis, J. Opt. Soc. Am. A 24, 3054 (2007).
9. T. G. Kim and Y. S. Kim, J. Opt. Soc. Korea 14, 104 (2010).
10. M. L. Tachiki, M. Itoh, and T. Yatagai, Appl. Opt. 47, D144 (2008).
11. M. Liebling and M. Unser, J. Opt. Soc. Am. A 21, 2424 (2004).
12. T. Kim and T. C. Poon, Appl. Opt. 48, H153 (2009).
13. Y. Yang, B. S. Kang, and Y. J. Choo, Proc. SPIE 6723, 672365 (2007), doi:10.1117/12.787618.
14. F. C. Groen, I. T. Young, and G. Ligthart, Cytometry 6, 81 (1985).
15. Y. Sun, S. Duthaler, and B. J. Nelson, Microsc. Res. Tech. 65, 139 (2004).
16. J.-M. Geusebroek, F. Cornelissen, A. W. M. Smeulders, and H. Geerts, Cytometry 39, 1 (2000).
17. M. T. Özgen and T. E. Tuncer, Opt. Eng. 43, 1300 (2004).
18. S. Lee, J. Y. Lee, W. Yang, and D. Y. Kim, Opt. Express 17, 6476 (2009).
19. H. Tamura, S. Mori, and T. Yamawaki, IEEE Trans. Syst. Man Cybern. 8, 460 (1978).
20. V. Castelli and L. D. Bergman, eds., Image Databases: Search and Retrieval of Digital Imagery (Wiley, 2002).
21. Y.-L. Qi, in Second International Symposium on Knowledge Acquisition and Modeling (IEEE, 2009), Vol. 3, pp. 174–177.
22. P. Ferraro, M. Paturzo, P. Memmolo, and A. Finizio, Opt. Lett. 34, 2787 (2009).