The modulation function and realizing method of holographic functional screen

Chongxiu Yu,1,* Jinhui Yuan,1 Frank C. Fan,1,2 C. C. Jiang,2 Sam Choi,2 Xinzhu Sang,1 Chang Lin,1 and Daxiong Xu1

1 Key Laboratory of Information Photonics and Optical Communications, Beijing University of Posts and Telecommunications, Ministry of Education, P.O. Box 72 (BUPT), Beijing 100876, China
2 AFC Technology Co. Ltd., Shenzhen 518104, China
*[email protected]

Abstract: The modulation function of the holographic functional screen (HFS) in the real-time, large-size, full-color (RLF) three-dimensional (3D) display system is derived from angular spectrum analysis. The directional laser speckle (DLS) method to realize the HFS is proposed. An HFS fabricated by the DLS method was used in the experiment. Experimental results show that the HFS is valid for the RLF 3D display and that the derived modulation function is valuable for the design of the HFS. These results are important for realizing the RLF 3D display system, which will find many applications such as holographic video.

©2010 Optical Society of America

OCIS codes: (110.2990) Image formation theory; (120.2040) Displays; (090.2870) Holographic display.

References and links
1. S. A. Benton, ed., Selected Papers on Three-Dimensional Displays (SPIE Int'l Soc. for Optical Eng., 2001).
2. C. B. Burckhardt and E. T. Doherty, “Beaded plate recording of integral photographs,” Appl. Opt. 8(11), 2329–2331 (1969).
3. M. Lucente, “Interactive three-dimensional holographic displays,” Comput. Graph. 31(2), 63–67 (1997).
4. L. Huff and R. L. Fusek, “Color holographic stereograms,” Opt. Eng. 19, 691–695 (1980).
5. D. L. Marks and D. J. Brady, “Three-dimensional source reconstruction with a scanned pinhole camera,” Opt. Lett. 23(11), 820–822 (1998).
6. N. T. Shaked and J. Rosen, “Multiple-viewpoint projection holograms synthesized by spatially incoherent correlation with broadband functions,” J. Opt. Soc. Am. A 25(8), 2129–2138 (2008).
7. M. L. Huebschman, B. Munjuluri, and H. R. Garner, “Dynamic holographic 3-D image projection,” Opt. Express 11(5), 437–445 (2003).
8. http://www.holografika.com/Company/Awards.html
9. D. Abookasis and J. Rosen, “Computer-generated holograms of three-dimensional objects synthesized from their multiple angular viewpoints,” J. Opt. Soc. Am. A 20(8), 1537–1545 (2003).
10. C. W. Slinger, C. D. Cameron, S. D. Coomber, R. J. Miller, D. A. Payne, A. P. Smith, M. G. Smith, M. Stanley, and P. J. Watson, “Recent development in computer generated holography: toward a practical electroholography system for interactive 3D visualization,” Proc. SPIE 5290, 27–41 (2004).
11. Zebra Imaging, Inc. (http://www.zebraimaging.com), 2008.
12. M. Alcaraz-Rivera, J. J. Báez-Rojas, and K. Der-Kuan, “Development of a fully functioning digital hologram system,” Proc. SPIE 6912, 69120S (2008).
13. V. M. Bove, “Progress in holographic video displays based on guided-wave acousto-optical devices,” Proc. SPIE 6912, 69120H (2008).
14. H. D. Zheng, Y. J. Yu, and C. X. Dai, “A novel three-dimensional holographic display system based on LCR2500 spatial light modulator,” J. Light Electron Opt. 120(9), 431–436 (2009).
15. P. St. Hilaire, S. A. Benton, M. Lucente, J. Underkoffler, and H. Yoshikawa, “Real-time holographic display: improvements using a multichannel acousto-optic modulator and holographic optical elements,” Proc. SPIE 1461, 254–261 (1991).
16. N. T. Shaked, J. Rosen, and A. Stern, “Integral holography: white-light single-shot hologram acquisition,” Opt. Express 15(9), 5754–5760 (2007).
17. M. A. Klug, C. Newswanger, Q. Huang, and E. Holzbach, “Active digital hologram display,” U.S. Patent 6,859,293 (Feb. 22, 2005).
18. S. Tay, P. A. Blanche, R. Voorakaranam, A. V. Tunç, W. Lin, S. Rokutanda, T. Gu, D. Flores, P. Wang, G. Li, P. St Hilaire, J. Thomas, R. A. Norwood, M. Yamamoto, and N. Peyghambarian, “An updatable holographic three-dimensional display,” Nature 451(7179), 694–698 (2008).

#135927 - $15.00 USD

(C) 2010 OSA

Received 1 Oct 2010; revised 12 Nov 2010; accepted 6 Dec 2010; published 17 Dec 2010

20 December 2010 / Vol. 18, No. 26 / OPTICS EXPRESS 27820

19. S. Reichelt, H. Sahm, N. Leister, and A. Schwerdtner, “Capability of diffractive optical elements for real-time holographic displays,” Proc. SPIE 6912, 69120P (2008).
20. X. Z. Sang, F. C. Fan, C. C. Jiang, S. Choi, W. H. Dou, C. X. Yu, and D. X. Xu, “Demonstration of a large-size real-time full-color three-dimensional display,” Opt. Lett. 34(24), 3803–3805 (2009).
21. F. C. Fan, S. Choi, and C. C. Jiang, “Use of spatial spectrum of light to recover three-dimensional holographic nature,” Appl. Opt. 49(14), 2676–2685 (2010).
22. F. C. Fan and S. Choi, “Equipment and method to make digital speckle hologram,” Chinese patent ZL200410022193.7.

1. Introduction

Three-dimensional (3D) display has attracted much attention because of its many applications in scientific research, industry, medical operations, the military, architectural design, and virtual reality. Several 3D display techniques have been demonstrated, such as binocular parallax (including the parallax barrier) [1], integral photography (including multi-view imaging) with a lens board [2], and holography [3–8]. Binocular parallax and integral photography belong to parallax imaging: their image resolution and brightness are relatively low, and there are blind areas between the vision regions of the eyes. Recently, digital-technology-based holography has attracted wide interest, including computer-generated holography [9–11], 3D holographic video with a spatial light modulator (SLM) [12–14], and others [15–19]. However, image qualities such as resolution, image depth, chromatic aberration, and fidelity still need to be improved. For conventional holography, it is difficult to realize a real-time, full-color, large-size 3D display because of the limitations of recording materials and techniques. Volume displays realize 3D images by rotating mechanical components and exploiting persistence of vision, but they cannot provide a fully convincing 3D experience because of their limited color reproduction and small display volume. In order to overcome the above shortcomings and reduce system complexity, we proposed and developed a practical real-time, large-size, full-color (RLF) 3D display system [20,21]. However, the problems associated with the RLF 3D display system are not fully solved, especially the operating principle of the holographic functional screen (HFS). To further investigate the proposed technique, we study the modulation function of the HFS and its realizing method.

2. Modulation function of the HFS

As shown in Fig. 1, the RLF 3D display system is mainly composed of a color digital camera array (CA), a video server (VS), a color projector array (PA), and an HFS. The 3D information of the object is picked up and recorded by the CA. The VS connects the CA and the PA through conventional AV cables. The information transmitted by the AV cables is processed in the VS and then projected by the PA. The 3D information is modulated and displayed by the HFS. Here we analyze both the original and the recovered 3D information of the object with the angular spectrum, which is a representation in terms of spatial frequency. The light waves scattered from the object are given by Eq. (1):

Fig. 1. Configuration of 3D scene pickup and display.


 ( x, y, z;  l; t )    0(

      , ; t ) exp[i2 ( x  y)]d ( )d ( ) l l l l l l

(l  1,..., k ) (1)

where

    are the continuous angular spectrum components.  0( , ; t ) is the angular , l l l l

spectrum distribution. These waves spread abroad and are respectively picked up by each digital camera in the CA, according to different vision angles [αmn, βmn], and can be represented as Eq. (2) MN

 s ( x, y, z;  l; t )     mn( m n

where exp[i

2

l

12  mn ,  mn ; t ) exp[i 2  mn x   mn y )] (1   2mn   2mn) z ]exp[i 2 ( l l l l l ( mn,  mn  0) (2)

2 2 (1   mn  mn ) z] is an additional phase because of different vision angles 12

 and propagating distance z. The  mn , mn are the separate angular spectrum components and l

the  mn( mn ,

l

l

 mn ; t ) is the angular spectrum distribution at the vision angle [αmn, βmn]. The l

relation between them can be described as

 0(

MN     , ; t )     mn( mn , mn ; t ) l l l l m n

(3)
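Numerically, the angular-spectrum relation underlying Eqs. (1)–(3) is just a Fourier-transform pair: the spectrum is the 2D FFT of the field in the z = 0 plane, and Eq. (1) is the inverse transform. The following is a minimal illustration with arbitrary, assumed parameters (grid size, wavelength, toy field), not code or data from this work:

```python
import numpy as np

# A toy scattered field at z = 0: a Gaussian spot with a tilted phase.
# All numbers below (pitch, sizes, tilt) are assumptions for illustration.
n = 256                        # samples per side
dx = 10e-6                     # sample pitch, m
x = (np.arange(n) - n // 2) * dx
X, Y = np.meshgrid(x, x)
field = np.exp(-(X**2 + Y**2) / (50e-6) ** 2) * np.exp(1j * 2e4 * X)

# Angular spectrum Psi_0(alpha/lambda, beta/lambda): forward 2D FFT,
# shifted so zero spatial frequency sits at the array center.
spectrum = np.fft.fftshift(np.fft.fft2(field))

# Eq. (1) evaluated at z = 0: re-summing the plane-wave components
# (inverse FFT) must reproduce the original field.
rebuilt = np.fft.ifft2(np.fft.ifftshift(spectrum))
print(np.allclose(rebuilt, field))  # True
```

Each camera of the CA effectively keeps only the spectrum components around its own direction \([\alpha_{mn},\beta_{mn}]\); summing those patches over all cameras, as in Eq. (3), rebuilds the full spectrum.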

The intensities and colors of the waves picked up by each digital camera are recorded and form a 2D image \(f_{mn}(x',y';\lambda_l;t)\), where \(x',y'\) are the coordinates of the camera image plane. Essentially \(f_{mn}(x',y';\lambda_l;t)\) contains partial 3D information of the object, similar to the 2D image used by a computer-generated hologram [8]. \(\Psi_{mn}(\alpha_{mn}/\lambda_l,\beta_{mn}/\lambda_l;t)\) is the Fourier transform of \(f_{mn}(x',y';\lambda_l;t)\), which is transmitted to the processing unit in the VS. The information processing includes coding, distortion correction, and synchronization of the image frames from the different cameras. The processed information is then projected by the PA; i.e., the projected waves carrying the information \(f_{mn}(x',y';\lambda_l;t)\) spread out to the point \(R'\) and can also be represented by the angular spectrum as Eq. (4):

\[
\Psi'_s(x,y,z;\lambda_l;t)=\sum_m^M\sum_n^N \Psi_{mn}\!\left(\frac{\alpha_{mn}}{\lambda_l},\frac{\beta_{mn}}{\lambda_l};t\right)\exp\!\left[i\,\frac{2\pi}{\lambda_l}\left(1-\alpha_{mn}^2-\beta_{mn}^2\right)^{1/2}z\right]\exp\!\left[i\,\frac{2\pi}{\lambda_l}\left(\alpha_{mn}x+\beta_{mn}y\right)\right] \tag{4}
\]

Here \(x, y, z\) simply denote the coordinates of the image space. The angular spectrum information travels along the direction \([\alpha_{mn},\beta_{mn}]\), and each carried \(f_{mn}(x',y';\lambda_l;t)\) can be observed separately with a diffusion screen. However, the superposed 3D image cannot be recognized with the same screen. Thus it is necessary to expand each angular spectrum component and to synthesize all the angular spectra into a continuous spectrum, that is, to recover the original spatial angular spectrum distribution \(\Psi_0(\alpha/\lambda_l,\beta/\lambda_l;t)\). We designed an HFS that has a special modulation function \(T(x,y,0)\). The angular spectrum expansion within a small solid angle \(\omega_{mn}\) can be represented by the integral \(\iint_{\omega_{mn}}\exp[i\,2\pi(\alpha x+\beta y)/\lambda_l]\,d(\alpha/\lambda_l)\,d(\beta/\lambda_l)\). The convolution (\(*\)) of this integral with the \(\delta\!\left((\alpha-\alpha_{mn})/\lambda_l,(\beta-\beta_{mn})/\lambda_l\right)\) function can represent all the expanded angular spectra in the reconstructed image. So \(T(x,y,0)\) expressed in terms of the angular spectrum is shown in Eq. (5):

\[
T(x,y,0)=\sum_m^M\sum_n^N\left[\delta\!\left(\frac{\alpha-\alpha_{mn}}{\lambda_l},\frac{\beta-\beta_{mn}}{\lambda_l}\right)*\iint_{\omega_{mn}}\exp\!\left[i\,\frac{2\pi}{\lambda_l}\left(\alpha x+\beta y\right)\right]d\!\left(\frac{\alpha}{\lambda_l}\right)d\!\left(\frac{\beta}{\lambda_l}\right)\right]
=\sum_m^M\sum_n^N\iint_{\omega_{mn}}\exp\!\left\{i\,\frac{2\pi}{\lambda_l}\left[\left(\alpha-\alpha_{mn}\right)x+\left(\beta-\beta_{mn}\right)y\right]\right\}d\!\left(\frac{\alpha}{\lambda_l}\right)d\!\left(\frac{\beta}{\lambda_l}\right) \tag{5}
\]

All the \(\omega_{mn}\) combine into a bigger solid visual angle \(\Omega\), which is the sum of the \(M\times N\) angles \(\omega_{mn}\):

\[
\Omega=\sum_m^M\sum_n^N\omega_{mn} \tag{6}
\]

Here \(\omega_{mn}\) is determined by the distance \(z_0\) from the recorded object to the CA and the spacing \(\Lambda\) between two neighboring color digital cameras in the system; it can be approximately described as \(\omega_{mn}=\arctan(\Lambda/z_0)\). The light waves modulated by the HFS are given by Eq. (7):

\[
\begin{aligned}
\Psi_r(x,y,z;\lambda_l;t)&=\Psi'_s(x,y,z;\lambda_l;t)\cdot T(x,y,0)\\
&=\left\{\sum_m^M\sum_n^N \Psi_{mn}\!\left(\frac{\alpha_{mn}}{\lambda_l},\frac{\beta_{mn}}{\lambda_l};t\right)\exp\!\left[i\,\frac{2\pi}{\lambda_l}\left(1-\alpha_{mn}^2-\beta_{mn}^2\right)^{1/2}z\right]\exp\!\left[i\,\frac{2\pi}{\lambda_l}\left(\alpha_{mn}x+\beta_{mn}y\right)\right]\right\}\\
&\quad\cdot\left\{\sum_m^M\sum_n^N\iint_{\omega_{mn}}\exp\!\left\{i\,\frac{2\pi}{\lambda_l}\left[\left(\alpha-\alpha_{mn}\right)x+\left(\beta-\beta_{mn}\right)y\right]\right\}d\!\left(\frac{\alpha}{\lambda_l}\right)d\!\left(\frac{\beta}{\lambda_l}\right)\right\}\\
&=\sum_m^M\sum_n^N \Psi_{mn}\!\left(\frac{\alpha_{mn}}{\lambda_l},\frac{\beta_{mn}}{\lambda_l};t\right)\exp\!\left[i\,\frac{2\pi}{\lambda_l}\left(1-\alpha_{mn}^2-\beta_{mn}^2\right)^{1/2}z\right]\iint_{\omega_{mn}}\exp\!\left[i\,\frac{2\pi}{\lambda_l}\left(\alpha x+\beta y\right)\right]d\!\left(\frac{\alpha}{\lambda_l}\right)d\!\left(\frac{\beta}{\lambda_l}\right)
\end{aligned} \tag{7}
\]

where the multiplication symbol \(\cdot\) means that \(\Psi'_s(x,y,z;\lambda_l;t)\) is modulated by \(T(x,y,0)\). Substituting Eqs. (3) and (6) into Eq. (7), we obtain Eq. (8):

\[
\Psi_r(x,y,z;\lambda_l;t)=\iint_{\Omega}\Psi_0\!\left(\frac{\alpha}{\lambda_l},\frac{\beta}{\lambda_l};t\right)\exp\!\left[i\,\frac{2\pi}{\lambda_l}\left(1-\alpha_{mn}^2-\beta_{mn}^2\right)^{1/2}z\right]\exp\!\left[i\,\frac{2\pi}{\lambda_l}\left(\alpha x+\beta y\right)\right]d\!\left(\frac{\alpha}{\lambda_l}\right)d\!\left(\frac{\beta}{\lambda_l}\right) \tag{8}
\]

Compared with Eq. (1), only the additional phase factor \(\exp[i(2\pi/\lambda_l)(1-\alpha_{mn}^2-\beta_{mn}^2)^{1/2}z]\) appears in Eq. (8); i.e., the modulation and the synthesis of the angular spectrum information of Eq. (4) are realized by the HFS. Thus we can observe the faithfully recovered 3D information of the recorded object within the visual angle \(\Omega\).
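As a quick numerical illustration of Eq. (6) and the relation \(\omega_{mn}=\arctan(\Lambda/z_0)\), the snippet below uses assumed values for the camera spacing and object distance (the paper does not give these numbers); 64 is the horizontal camera count used later in Sec. 4:

```python
import math

# Illustrative check of Eq. (6): omega = arctan(Lambda / z0) per camera,
# summed over the cameras along one direction.  Lambda = 2.5 cm and
# z0 = 2 m are ASSUMED values, chosen so 64 cameras span roughly 45 deg.
Lambda = 0.025   # camera spacing, m (assumed)
z0 = 2.0         # object-to-CA distance, m (assumed)
M = 64           # cameras along the horizontal direction

omega = math.atan(Lambda / z0)        # one camera's angular step, rad
Omega = M * omega                     # total horizontal visual angle, rad
print(round(math.degrees(Omega), 1))  # → 45.8
```

With these assumed numbers the total viewing zone comes out close to the 45° of the screen actually fabricated in Sec. 4.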


From the above, the HFS is one of the key devices for realizing the RLF 3D display, and Eq. (5) presents the HFS functions of spreading beams (or transferring wavefronts) and synthesizing light waves.

3. Implementation of the HFS

Since human eyes readily accommodate a horizontal-parallax-only (HPO) 3D display, we study an HFS suited for this case. A method to realize the HFS, called the directional laser speckle (DLS) method, is given as follows. When a diffusing plate of size a × b with diffusing grain sizes δu and δv, as shown in Fig. 2(a), is illuminated by a laser, a laser speckle field appears behind the diffusing plate. The average speckle size in the X–Y plane is δx = λz0/a, δy = λz0/b. Assuming a ≪ b, and hence δx ≫ δy, a narrow strip-shaped speckle pattern in the X–Y plane is obtained. We record the speckle pattern at a distance z = z0, the overlay area of the speckles being Δx × Δy, where Δx = λz0/δu and Δy = λz0/δv. When the recorded speckle pattern is illuminated by a light wave, it diffuses and confines the light within the angles θx = λ/δx = a/z0 and θy = λ/δy = b/z0, as shown in Fig. 2(b). Evidently the diffused angle θy is much bigger than θx, so the diffused light wave can be considered to be scattered only along the direction perpendicular to the strip speckles. The speckle screen thus provides direction selectivity and a definite diffusing angle for the light waves; it is therefore called the DLS screen, and it can spread beams and synthesize light waves within its diffusing angle.

Fig. 2. The speckle generation (a) and the diffusing of strip-shaped speckle (b).
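The strip-speckle design rules above can be checked numerically. The plate dimensions, wavelength, and recording distance below are assumed, illustrative values, not the parameters of the fabricated screen:

```python
# Sketch of the DLS design rules with ASSUMED numbers.  A plate of size
# a x b (a << b) gives strip-shaped speckles of average size
# dx = lam*z0/a >> dy = lam*z0/b, and the recorded pattern diffuses
# light into the angles theta_x = a/z0 << theta_y = b/z0.
lam = 532e-9         # laser wavelength, m (assumed)
a, b = 2e-3, 0.2     # diffusing-plate size, m: a << b
z0 = 0.5             # recording distance, m (assumed)

dx, dy = lam * z0 / a, lam * z0 / b   # average speckle sizes, m
theta_x, theta_y = a / z0, b / z0     # diffusing angles, rad

# Both ratios equal b/a = 100: the screen scatters ~100x more strongly
# perpendicular to the strips than along them.
print(dx / dy, theta_y / theta_x)
```

This anisotropy is exactly the direction selectivity the HPO screen needs: wide diffusion vertically for the eyes, narrow diffusion horizontally to keep the views separated.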

With the above analysis, we propose the DLS method to realize the HFS. According to Eq. (5), the designed HFS is composed of J × K volume pixels (VPs), as shown in Fig. 3(a). Each VP is a small square DLS screen with side Dn and horizontal diffusing angle ωn/2, as shown in Fig. 3(b) (here ωn is the 1D form of ωmn). Its vertical diffusing angle satisfies the requirements of viewing by human eyes. When each VP receives the light waves from all projectors, it spreads them over the spatial angle Ω = Nωn in the horizontal direction. Thus the intensity and color information from all projectors can be modulated and recovered by the HFS. In addition, the holographic micro-lens (HML) and the holographic micro-cylinder lens (HMCL) can also spread beams and synthesize light waves, so we considered implementing the HFS with them. Through theoretical analysis and preliminary experiments, we found that the HML method is suitable for the full-parallax 3D display, while the HMCL method can be applied to the HPO 3D display.
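For the HPO design, the per-VP horizontal diffusing angle follows directly from Ω = Nωn. A small sketch with assumed design inputs (Ω = 45° and N = 64 mirror the system of Sec. 4, but are treated here as hypothetical targets):

```python
import math

# Given a target total horizontal viewing zone Omega and N projector
# directions, each volume pixel must diffuse horizontally over
# omega_n = Omega / N (the half-angle quoted in Fig. 3(b) is omega_n/2).
# Omega and N below are ASSUMED design inputs.
N = 64
Omega = math.radians(45.0)
omega_n = Omega / N                     # per-VP horizontal diffusing angle
print(round(math.degrees(omega_n), 3))  # → 0.703
```

A sub-degree diffusing angle per VP is what keeps neighboring projector views from blurring together while still tiling the full 45° zone.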


Fig. 3. Diagram of the HFS modulation light beams (a) and realizing sub-HML (b).

4. Experiment and analysis

In order to demonstrate the proposed scheme and to simplify the experimental system, a 3D HPO display system with 64 color digital cameras and 64 color projectors was constructed. Each digital camera or projector has a resolution of 480 × 640 pixels. An HFS was designed according to Eqs. (5) and (6) and realized with the DLS method, owing to its advantages of simple design and fabrication over the HML and HMCL methods. The setup for manufacturing the VPs is based on digital holographic technology and is shown in Fig. 4 [22]. The VP parameters (Dn, ωn/2, f′) are realized by selecting a proper diaphragm aperture at D and adjusting the devices in the setup. Under computer control, a VP master on a photoresist recording medium is first fabricated, and then the HFS is produced by embossing holography. HFSs of different sizes can also be realized according to the requirements of the reconstructed 3D display.

Fig. 4. Diagram of the setup to fabricate the DLS screen, where Li (i = 1, 2, 3)-Lens, S-Shutter, DP-Diffusion plate, D-Diaphragm, RP-Recording plate, BF-Brace flat, SD-Shutter drive, DPD-Diffusion plate drive, DD-Diaphragm drive, BFSD-Brace flat shifting drive, CH-Control switch, 1-Optics system, and 2-Service system.

The fabricated HFS has a total horizontal visual angle of 45° and a size of 1.8 × 1.3 m². When it is placed in the plane near the point R′, two groups of pictures of the displayed 3D images, a kangaroo and a recorded object, are taken as shown in Fig. 5. Pictures (a) and (b) are each composed of five photographs, taken separately at five visual angles from left to right. The recovered 3D images have high definition, rich color, high brightness, and high diffraction efficiency, and may be seen as a reasonable approximation to a full-parallax 3D display. The experimental results show that the DLS method can effectively implement a large-size HFS and that the established display system is effective for the RLF 3D display.


Fig. 5. (a) and (b) correspond to the pictures of the 3D objects displayed by the HFS (color online).

5. Conclusions

In summary, the HFS in the RLF 3D display system is analyzed in detail. The modulation function of the HFS is derived for the first time, revealing the basic principle by which it modulates light waves. The DLS method to realize the HFS is proposed and used to design and fabricate the HFS. 3D display experiments with the fabricated HFS are successfully demonstrated, and the experimental results show that the 3D information of the recorded object can be well recovered. Therefore, the methods of analyzing and realizing the HFS are effective, and the derived modulation function, Eq. (5), is valuable for the design of the HFS. Moreover, the presented results help to implement the RLF 3D display system, which will find many applications such as holographic video.

Acknowledgments

The research was supported by the Fundamental Research Funds for the Central Universities under no. 2009RC0414. The authors thank all the staff from AFC Technology Co. Ltd. for their support.

