
OPTICS LETTERS / Vol. 28, No. 9 / May 1, 2003

Optical sectioning in wide-field microscopy obtained by dynamic structured light illumination and detection based on a smart pixel detector array

Jelena Mitić, Tiemo Anhut, Matthias Meier, Mathieu Ducros,* Alexander Serov, and Theo Lasser

Institute of Imaging and Applied Optics, Swiss Federal Institute of Technology, CH-1015 Lausanne, Switzerland

Received October 30, 2002

Optical sectioning in wide-field microscopy is achieved by illuminating the object with a continuously moving single-spatial-frequency pattern and detecting the image with a smart pixel detector array. This detector performs on-chip electronic signal processing that extracts the optically sectioned image. The optically sectioned image is observed directly in real time without any additional postprocessing. © 2003 Optical Society of America

OCIS codes: 180.6900, 110.0180, 120.4640, 040.1240.

The confocal microscope has evolved into an indispensable tool in many different fields, such as materials science, semiconductor testing, and the life sciences. The key advantage of this instrument lies in its so-called optical sectioning property: the acquired images contain only the in-focus information, while the out-of-focus contributions are strongly suppressed by the confocal detector pinhole.1 This allows high-resolution visualization of three-dimensional structures by taking a series of through-focus images and recombining them with adapted image-processing software.2 However, complete three-dimensional image acquisition requires lateral as well as axial object scanning.

Neil et al.3,4 proposed an alternative way to obtain optical sectioning with a conventional wide-field microscope and structured illumination of the sample. A single-spatial-frequency pattern, obtained either by projection of a grid onto the specimen3 or by interference of two laser beams,4 is used to illuminate the specimen. Only those parts of the object in which the grid is in focus are imaged efficiently onto the detector. The optically sectioned image, with the illumination pattern removed, is obtained by means of a three-phase-step algorithm; therefore, three images at three different phase lags have to be taken.

We propose an alternative for real-time acquisition and imaging of an optically sectioned sample in a wide-field microscope, using a time-dependent structured light illumination in combination with a smart pixel detector array (SPDA).5 The setup consists of a standard wide-field microscope in which a continuously moving grid in plane S [with a corresponding intensity distribution G(x_0, y_0; t)] is incoherently illuminated and imaged onto the object in the conjugated object plane V. This illumination of a reflecting object leads, in the detector plane D, to the time-dependent intensity distribution (up to a multiplicative constant)

I(x, y; t) = \int_S dx_0\,dy_0\, G(x_0, y_0; t) \left| \int_V d\xi\,d\eta\; h_2(x - \xi, y - \eta)\, t(\xi, \eta)\, h_1(\xi - x_0, \eta - y_0) \right|^2 .   (1)
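The three-phase-step reconstruction of Neil et al. mentioned above is easy to sketch numerically. A minimal sketch with synthetic signals; the /√2 normalization is one common convention, not necessarily the one used in Ref. 3:

```python
import numpy as np

def three_phase_section(i1, i2, i3):
    """Sectioned image from three phase-stepped images (phases 0, 2pi/3,
    4pi/3): square-root-of-differences formula in the spirit of Ref. 3."""
    return np.sqrt((i1 - i2) ** 2 + (i2 - i3) ** 2 + (i3 - i1) ** 2) / np.sqrt(2.0)

# Synthetic 1-D test: a fully modulated in-focus signal of strength `obj`
# on top of an unmodulated out-of-focus background `bg`.
x = np.linspace(0.0, 10.0 * np.pi, 256)
obj, bg = 1.0, 0.5
imgs = [bg + obj * (1.0 + np.cos(x + p)) for p in (0.0, 2 * np.pi / 3, 4 * np.pi / 3)]

section = three_phase_section(*imgs)
# With this normalization the result is (3/2)*obj everywhere: the modulated
# in-focus part is recovered and the background bg drops out exactly.
```

The point of the example is the background rejection: any unmodulated (out-of-focus) offset cancels in the pairwise differences, which is what the moving-grid scheme below achieves continuously in time instead of in three discrete steps.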

In Eq. (1), normalized lateral optical coordinates1 (x, y) = kNA(x, y) are introduced. NA = n sin α is the numerical aperture of the system, and k = 2π/λ denotes the wave number at the center wavelength λ. The amplitude point-spread functions of the illumination and observation paths are denoted h_1(x, y) and h_2(x, y), and the amplitude reflectance or transmittance of the object is represented by t(x, y). Assuming identical point-spread functions for illumination and observation, a |h(x, y)|^4 object–image relation follows immediately from Eq. (1).1 This relation is identical to the well-known confocal microscopy relation, which indicates that our wide-field sectioning will have a depth resolution comparable to that of the confocal microscope. In the case of constant illumination, which corresponds to G(x_0, y_0; t) = 1 in Eq. (1), the system is identical to a conventional microscope.1

The single-spatial-frequency illumination pattern, moving continuously with the angular frequency ω in the x direction, can be represented as

G(x_0, y_0; t) = 1 + \mathrm{Re}[\exp(i\tilde{\nu} x_0 - i\omega t)] .   (2)
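Equation (2) is easy to visualize numerically; a sketch with illustrative grid size, frame count, and temporal frequency (none of these numbers are from the paper):

```python
import numpy as np

nu_t = 0.078            # normalized spatial frequency of the pattern
omega = 2.0 * np.pi     # temporal angular frequency (one period per second)
x0 = np.arange(256)     # grid-plane coordinate samples
times = np.linspace(0.0, 1.0, 64, endpoint=False)   # one full period

# G(x0; t) = 1 + Re[exp(i*nu_t*x0 - i*omega*t)] = 1 + cos(nu_t*x0 - omega*t)
frames = np.stack([1.0 + np.cos(nu_t * x0 - omega * t) for t in times])

# Each grid point is modulated sinusoidally in time: temporal mean 1,
# modulation amplitude 1 (read off the first temporal Fourier bin).
mean_image = frames.mean(axis=0)
mod_amp = 2.0 * np.abs(np.fft.fft(frames, axis=0)[1]) / len(times)
```

Every point of the grid plane thus carries the same temporal modulation frequency ω; only the modulation amplitude, which depends on local grid contrast after defocus, varies across the field.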

In Eq. (2) we introduce a normalized spatial frequency ν̃ related to the mask frequency ν by ν̃ = M_ill λν/NA, where M_ill is the magnification between the grid plane S and the conjugated object plane V. Substituting Eq. (2) into Eq. (1) leads to the decomposition

I_{\tilde{\nu}}(x, y; t) = I_{\mathrm{conv}}(x, y) + I_{\mathrm{osec}}(x, y; t) .   (3)
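The origin of the two terms in Eq. (3) can be made explicit by carrying out the substitution; a short sketch in the notation above (the sign convention for the Fourier transform is my choice):

```latex
% Insert G = 1 + Re[exp(i \tilde\nu x_0 - i \omega t)] into Eq. (1):
I_{\tilde\nu}(x,y;t)
  = \int_S \! dx_0\,dy_0 \,\bigl|W(x,y;x_0,y_0)\bigr|^2
  + \mathrm{Re}\!\left[ e^{-i\omega t}\!\int_S \! dx_0\,dy_0\,
      e^{i\tilde\nu x_0}\,\bigl|W(x,y;x_0,y_0)\bigr|^2 \right],
\qquad
W = \int_V \! d\xi\,d\eta\; h_2(x-\xi,y-\eta)\,t(\xi,\eta)\,h_1(\xi-x_0,\eta-y_0).
% The first term is the conventional image I_conv [Eq. (4a)]; in the second,
% the x_0 integral of exp(i \tilde\nu x_0)|W|^2 is a Fourier transform with
% respect to x_0, giving the modulated term I_osec [Eq. (4b)].
```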

Here

I_{\mathrm{conv}}(x, y) = \int_S dx_0\,dy_0 \left| \int_V d\xi\,d\eta\; h_2(x - \xi, y - \eta)\, t(\xi, \eta)\, h_1(\xi - x_0, \eta - y_0) \right|^2   (4a)

and

I_{\mathrm{osec}}(x, y; t) = \mathrm{Re}\left\{ \exp(-i\omega t) \left[ \int_S dy_0\, \mathcal{F}_{x_0} \left| \int_V d\xi\,d\eta\; h_2(x - \xi, y - \eta)\, t(\xi, \eta)\, h_1(\xi - x_0, \eta - y_0) \right|^2 \right] \right\} ,   (4b)

showing a time-independent contribution I_conv corresponding to the conventional image and a time-dependent contribution I_osec containing all image information of the optically sectioned object close to the focal plane. \mathcal{F}_{x_0} denotes the Fourier transform with respect to x_0. A mirror in the object plane is described by t(ξ, η) ≡ 1 in Eq. (4b). Simplifying our considerations to one dimension (as a one-dimensional grid is moving in plane S), we can easily calculate the Fourier transform to obtain

\tilde{I}_{\mathrm{osec}}(\tilde{\nu}; t) = \mathrm{Re}\left[ \exp(-i\omega t) \int dm\, P_2(m) P_1(m) P_2^{*}(m - \tilde{\nu}) P_1^{*}(m - \tilde{\nu}) \right] ,   (5)

where the pupil functions P_1 and P_2 are introduced as the Fourier transforms of the amplitude transfer functions. Defocusing can now be described by a quadratic phase factor of the pupil function.6 If we assume the pupil functions of the illumination and observation paths to be identical and use the normalized axial coordinate u = 4zkn sin²(α/2) (z is the mirror's axial distance from the focus), we find for the intensity distribution in the detector plane

\tilde{I}_{\mathrm{osec}}(\tilde{\nu}; u, t) = \mathrm{Re}[\exp(-i\omega t)\, \mathrm{OTF}(\tilde{\nu}; 2u)] ,   (6)

which shows no explicit transversal dependence, as it should be for an unstructured object. OTF(ν̃; 2u) denotes the two-dimensional optical transfer function parameterized by the normalized defocus parameter u. An analytic expression for the OTF, including defocus, can be derived on the basis of the Stokseth approximation7 [compare also Eq. (10) in Ref. 3]. Figure 1 illustrates the OTF as a function of the normalized spatial frequency, calculated for different defocus values. The OTF decreases with defocus for all frequencies except the zero spatial frequency. Out-of-focus regions of a three-dimensional object are therefore illuminated with a diminished grid contrast, corresponding to a lower modulation depth of the illumination intensity and resulting in a rapidly vanishing signal at the SPDA.

We use an SPDA that responds to the time-varying part of the intensity distribution; i.e., only I_osec(x, y; t) contributes to the SPDA output signal. The array performs on-chip filtering and demodulation of the input signal.

Figure 2 illustrates the optical system used to demonstrate the optical sectioning property. A fiberized Xe lamp (λ_c = 550 nm, Δλ = 220 nm) served as an incoherent light source. The mask, a rotating grid, was integrated into the illumination path and sized to ensure linear movement with an almost constant spatial frequency for the structured illumination over the imaged object area. A classical Köhler illumination was realized with the mask (plane S), object (V), and detector (D) in conjugated positions. A 40×/0.60 objective (Zeiss LD-Achroplan) was used, together with two different tube lenses (L_1 and L_2), to project an image of the grid onto the object and to image the object onto the detector (SPDA). The normalized spatial frequency in the object plane was ν̃_c ≈ 0.078, calculated for λ_c. The field of view was 160 μm × 160 μm, and the total image acquisition time was 1.3 ms.

The image of the object is captured by the SPDA, which demodulates the optical signal through AC amplification, rectification, and low-pass filtering. The SPDA used here is composed of 58 × 58 pixels; this prototype detector consists of a silicon photodiode array with a fill factor of 10%. The output analog voltage of every pixel corresponds to the amplitude of the time-modulated optical signal. The pixels are read out sequentially at a maximal pixel clock rate of 2.5 MHz. The output analog voltage was digitized and simultaneously displayed on the computer screen. Here we demonstrate, for what is believed to be the first time, the use of an SPDA for wide-field optically sectioned microscopy; the two-dimensional output distribution corresponds to the depth-discriminated image of the sample. This SPDA, initially conceived for a parallel optical coherence tomography approach, was recently used by Laubscher and co-workers8-10 for three-dimensional optical coherence tomography at video rate.

Fig. 1. OTF of a defocused system as a function of the normalized spatial frequency ν̃ for different values of the normalized defocus u, obtained with the Stokseth analytic formula.7 Vertical dashed line, spatial frequency ν̃_c of our illumination pattern.
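Curves of the kind shown in Fig. 1 can be reproduced numerically. A sketch assuming one commonly quoted Stokseth-based form, an in-focus polynomial times a defocus factor 2J₁(w)/w with w = uν̃(1 − ν̃/2) [cf. Eq. (10) of Ref. 3]; this is my reading of Refs. 3 and 7, not code from the paper, and the defocus values are illustrative:

```python
import numpy as np
from scipy.special import j1

def jinc(w):
    """2*J1(w)/w with the w -> 0 limit set to 1."""
    w = np.asarray(w, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(np.abs(w) < 1e-9, 1.0, 2.0 * j1(w) / w)

def otf_defocused(nu, u):
    """Approximate OTF at normalized frequency nu (0 <= nu <= 2) and
    normalized defocus u: in-focus polynomial times the defocus factor."""
    infocus = 1.0 - 0.69 * nu + 0.0076 * nu**2 + 0.043 * nu**3
    return infocus * jinc(u * nu * (1.0 - nu / 2.0))

# Fig. 1-style family of curves: OTF versus nu for a few defocus values.
nu = np.linspace(0.0, 2.0, 401)
curves = {u: otf_defocused(nu, u) for u in (0.0, 4.0, 8.0, 16.0)}
# Only the zero spatial frequency is immune to defocus; every nu > 0 is
# attenuated as u grows, which is what produces the sectioning.
```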

Fig. 2. Schematic optical setup: Lcol, collector; L1, L2, tube lenses; Obj, objective lens; FM, flip mirror; BS, beam splitter; AS, aperture stop. Dashed lines, illumination path; solid lines, imaging path.
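The SPDA's per-pixel processing chain (AC amplification, rectification, low-pass filtering) can be mimicked in software to see how it extracts the modulation amplitude. A minimal single-pixel sketch; the sample rate, modulation frequency, and signal levels are illustrative and not taken from the detector:

```python
import numpy as np

fs = 100_000.0                       # sample rate, Hz (illustrative)
f_mod = 1_000.0                      # modulation frequency, Hz (illustrative)
t = np.arange(0, 0.05, 1.0 / fs)     # 50 modulation periods

# One pixel: DC background (out-of-focus light) plus modulated in-focus part.
dc, amp = 2.0, 0.4
signal = dc + amp * np.cos(2.0 * np.pi * f_mod * t)

ac = signal - signal.mean()          # AC coupling removes the DC background
rectified = np.abs(ac)               # full-wave rectification
# Low-pass stage: moving average over an integer number of periods.
n = int(fs / f_mod) * 10
lowpass = np.convolve(rectified, np.ones(n) / n, mode="valid")

# The mean of |A*cos| is 2A/pi, so the output is proportional to the
# modulation amplitude and independent of the unmodulated background.
estimate = lowpass.mean() * np.pi / 2.0
```

The design point mirrors the text: only the time-varying part I_osec survives the chain, so the out-of-focus (DC) light never reaches the output, whatever its level.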


Fig. 3. Measured axial response (squares), corresponding fit (solid curve), and calculated curve for ν̃_c = 0.078 (dashed curve).
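The theoretical axial response can also be evaluated numerically. A sketch assuming the form |2J₁(w)/w| with w = 2uν̃(1 − ν̃/2) [Eq. (10) of Ref. 3, with the doubled defocus argument suggested by OTF(ν̃; 2u) in Eq. (6)]; NA and λ_c are taken from the text, while n = 1 (air objective) is my assumption:

```python
import numpy as np
from scipy.special import j1

nu = 0.078                          # normalized pattern frequency (from text)
NA, lam, n = 0.60, 550e-9, 1.0      # objective NA, center wavelength (m), n assumed

def axial_response(u):
    """|2 J1(w)/w| with w = 2*u*nu*(1 - nu/2); equals 1 in focus (u = 0)."""
    w = np.asarray(2.0 * u * nu * (1.0 - nu / 2.0), dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.abs(np.where(np.abs(w) < 1e-9, 1.0, 2.0 * j1(w) / w))

u = np.linspace(0.0, 60.0, 120001)
resp = axial_response(u)
u_half = u[np.argmax(resp <= 0.5)]   # first crossing of half-maximum

# Convert the normalized defocus u = 4*z*k*n*sin^2(alpha/2) back to z:
alpha = np.arcsin(NA / n)
k = 2.0 * np.pi / lam
z_half = u_half / (4.0 * k * n * np.sin(alpha / 2.0) ** 2)
fwhm = 2.0 * z_half                  # full width at half-maximum, in meters
```

Under these assumptions the calculated FWHM for ν̃_c = 0.078 comes out at roughly 6.5 μm, somewhat narrower than the measured 7.8 μm, which fits the experimental broadening that the text attributes to longitudinal chromatic aberration.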

To quantify the optical sectioning strength, we displaced a planar mirror axially through the objective lens focus in increments of Δz = 0.25 μm and measured the variation in the detected signal. The experimentally obtained curve is shown in Fig. 3; its FWHM is 7.8 μm. The measured axial response corresponds to the spatial frequency ν̃ = 0.068, obtained by fitting Eq. (10) of Ref. 3 to the experimental data. Using the same formula, we calculated the theoretical curve for ν̃_c = 0.078. This rather low sectioning strength is due to the low spatial frequency of the mask used here, which is limited by our fabrication procedure (laser cutting). The asymmetry and broadening of the experimentally obtained curve in comparison with the (monochromatic) theoretical calculation can be explained by longitudinal chromatic aberration due to the broadband spectrum of the incoherent illumination source and the objective used (an Achroplan).

To compare the optically sectioned images obtained by the SPDA with bright-field images of the same object, we acquired in parallel a conventional image with a CCD camera (752 × 582 pixels). The resized wide-field images are shown in Fig. 4 and compared with the SPDA images. Figures 4(a) and 4(b) show conventional images of an electrical wire bonding obtained with the CCD camera, and Figs. 4(c) and 4(d) show the corresponding optically sectioned images obtained by the SPDA. The images were taken at focal distances Δz = 13 μm apart. Figures 4(a) and 4(c) show images with the circuit in focus, and Figs. 4(b) and 4(d) show the wire bonding in focus.

Fig. 4. Bonding wire of an integrated circuit: (a), (b) conventional images taken with a CCD; (c), (d) corresponding optically sectioned images taken with the SPDA. The object is moved 13 μm axially through the focus between (a) and (b). The scale bars are 50 μm.

In conclusion, we have demonstrated a new technique for the acquisition of optically sectioned images in real time. The system performance, as measured by detector pixel fill factor and acquisition speed, will be improved with the next SPDA generation. Because of its real-time character, the system presented here can be used to study dynamic phenomena in biology. We plan to investigate fluorescent samples in the future.

This work is financially supported by the Swiss National Foundation under subsidy FN 21.57215.99. The authors thank T. Sidler and C. Amendola for discussions and the fabrication of the masks. J. Mitić's e-mail address is jelena.mitic@epfl.ch.

*Present address, Department of Ophthalmology, University of British Columbia, 2550 Willow Street, Vancouver, British Columbia V5Z 3N9, Canada.

References

1. T. Wilson and C. J. R. Sheppard, Theory and Practice of Scanning Optical Microscopy (Academic, London, 1984).
2. N. S. White, in Handbook of Biological Confocal Microscopy, J. B. Pawley, ed. (Plenum, New York, 1990), pp. 211-254.
3. M. A. A. Neil, R. Juškaitis, and T. Wilson, Opt. Lett. 22, 1905 (1997).
4. M. A. A. Neil, R. Juškaitis, and T. Wilson, Opt. Commun. 153, 1 (1998).
5. S. Bourquin, P. Seitz, and R. P. Salathé, Electron. Lett. 37, 975 (2001).
6. J. W. Goodman, Introduction to Fourier Optics (McGraw-Hill, Boston, 1996).
7. P. A. Stokseth, J. Opt. Soc. Am. 59, 1314 (1969).
8. S. Bourquin, P. Seitz, and R. P. Salathé, Opt. Lett. 26, 512 (2001).
9. M. Ducros, M. Laubscher, B. Karamata, S. Bourquin, T. Lasser, and R. P. Salathé, Opt. Commun. 202, 29 (2002).
10. M. Laubscher, M. Ducros, B. Karamata, T. Lasser, and R. P. Salathé, Opt. Express 9, 429 (2002), http://www.opticsexpress.org.