High resolution imaging and wavefront aberration correction in plenoptic systems

J. M. Trujillo-Sevilla 1,*, L. F. Rodríguez-Ramos 2, I. Montilla 2, J. M. Rodríguez-Ramos 3

1 University of La Laguna, Spain
2 Institute of Astrophysics of the Canary Islands, Spain
3 Center for Biomedical Research of the Canary Islands, University of La Laguna, Spain
*Corresponding author: [email protected]

Received Month X, XXXX; revised Month X, XXXX; accepted Month X, XXXX; posted Month X, XXXX (Doc. ID XXXXX); published Month X, XXXX

Plenoptic imaging systems are becoming more common, since they provide capabilities unattainable in conventional imaging systems, but one of their main limitations is their poor two-dimensional resolution. Combining wavefront phase measurement with plenoptic image deconvolution, we propose a system capable of improving the resolution when a wavefront aberration is present and the image is blurred. In this work, a plenoptic system is simulated using Fourier optics, and the results show that an improved resolution is achieved even in the presence of strong wavefront aberrations.

OCIS codes: (110.0115) Imaging through turbulent media; (100.1830) Deconvolution; (120.5050) Phase measurement
http://dx.doi.org/10.1364/OL.99.099999

Plenoptic cameras are devices intended to capture the light field, or plenoptic function, defined by Adelson [1] as the intensity of every light ray in a volume as a function of wavelength, time, and position. The first light-field capturing device was proposed in 1908 by G. Lippmann, who had the idea of covering a photographic film with an array of spherical lenslets with the goal of rendering several points of view in a single image [3]. A modern plenoptic camera comprises a main lens with a fixed flange focal distance, a microlens array, and an image sensor. The microlens array is placed in the focal plane of the main lens, and the sensor is placed in the focal plane of the microlenses (Fig. 1); this scheme was proposed by Ives [2] in 1930. The images formed by the microlenses would overlap if the f-numbers of the main lens and of the microlenses did not match.
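As a worked illustration of this matching condition, take the 400 mm, f/62.5 main lens used later in the simulations and a hypothetical microlens pitch of 100 µm; the condition then fixes the microlens focal length:

N_{ml} = \frac{z_{ml}}{D} = \frac{400\ \mathrm{mm}}{6.4\ \mathrm{mm}} = 62.5, \qquad N_{\mu l} = \frac{z_{\mu l}}{d_{\mu l}} = 62.5 \;\Rightarrow\; z_{\mu l} = 62.5 \times 100\ \mu\mathrm{m} = 6.25\ \mathrm{mm}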

Fig. 1. Schematic of a basic plenoptic sensor.

Recent research has expanded the capabilities of plenoptic cameras. In computational photography, they make a posteriori refocusing [4,5] and 3D reconstruction [6] possible. Plenoptic sensors can also work as wavefront sensors, using either point sources [7] or extended objects [8]. Wavefront recovery is not limited to a single layer; it is also possible to recover tomographic information using a plenoptic sensor [9]. One of the main limitations of plenoptic cameras is the final resolution of the images, typically limited to one pixel per microlens. Several methods to improve the final image resolution have been proposed [6,10]. One of the most promising can be found in [11]; it recovers the full resolution of the original object, with the drawback of being very computationally intensive.

In this work we propose a similar method that not only recovers the full object resolution but also removes any wavefront aberration (blurring) from the final image. The plenoptic images used to test this technique have been simulated using Fourier optics. Depending on the parameters of the system (f-number and resolution), either Fraunhofer or Fresnel propagation can be used, as both are computationally efficient. In this work we assume Fresnel propagation, as the microlenses commonly used in imaging systems have a Fresnel number near 10 or lower; in other cases, good approximate results can be obtained using Fraunhofer propagation [12].

Let U_i(x, y) be the input field, affected by a phase screen of amplitude Φ placed just before the main lens (of focal length z_ml) that aberrates the wavefront (Fig. 2). The field at the microlens array plane can be written as

U_1(x, y) = \mathcal{F}^{-1} \left\{ \mathcal{F} \left\{ P(x, y) \, U_i(x, y) \exp \left[ -jk \left( \frac{x^2 + y^2}{2 z_{ml}} + \Phi(x, y) \right) \right] \right\} \exp \left[ -j \pi \lambda z_{ml} \left( f_x^2 + f_y^2 \right) \right] \right\} \qquad (1)

where ℱ and ℱ⁻¹ denote the direct and inverse Fourier transforms, respectively, P(x, y) is the pupil function, k is the wavenumber, and λ is the wavelength; f_x and f_y are the frequency-domain variables associated with the coordinates x and y, respectively. For a microlens array comprising M × N lenslets (of focal length z_µl) with indices (m, n), each one subtending P × Q pixels on the sensor with coordinates (ξ, η), the field produced at the sensor by the (m, n)-th microlens can be calculated with Eq. (2), where f_ξ and f_η are the frequency-domain variables associated with the coordinates ξ and η, respectively.



U_{m,n}(\xi, \eta) = \mathcal{F}^{-1} \left\{ \mathcal{F} \left\{ U_1(mM + \xi, \, nN + \eta) \exp \left[ -\frac{jk}{2 z_{\mu l}} \left( x^2 + y^2 \right) \right] \right\} \exp \left[ -j \pi \lambda z_{\mu l} \left( f_\xi^2 + f_\eta^2 \right) \right] \right\} \qquad (2)
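The two propagation steps of Eqs. (1) and (2) can be sketched numerically as follows. This is a minimal illustration in Python/NumPy, not the code used for the simulations in this work; the grid size, wavelength, sampling, and flat phase screen are assumed placeholder values.

    import numpy as np

    def fresnel_propagate(u, wavelength, z, dx):
        """Propagate field u over distance z using the Fresnel transfer
        function exp(-j*pi*lambda*z*(fx^2 + fy^2)) in the frequency domain."""
        n = u.shape[0]
        fx = np.fft.fftfreq(n, d=dx)              # spatial frequencies (1/m), FFT order
        FX, FY = np.meshgrid(fx, fx)
        H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
        return np.fft.ifft2(np.fft.fft2(u) * H)

    def lens_phase(n, dx, wavelength, f):
        """Quadratic phase factor exp(-j*k*(x^2 + y^2)/(2f)) of a thin lens."""
        k = 2 * np.pi / wavelength
        x = (np.arange(n) - n // 2) * dx
        X, Y = np.meshgrid(x, x)
        return np.exp(-1j * k * (X**2 + Y**2) / (2 * f))

    # Illustrative parameters (assumed, not the paper's values).
    n, dx, wavelength = 512, 10e-6, 633e-9
    z_ml = 0.4                                    # main lens focal length (m)
    k = 2 * np.pi / wavelength
    pupil = np.ones((n, n))                       # P(x, y), fully open
    phase_screen = np.zeros((n, n))               # Phi(x, y) as optical path; flat here

    u_i = np.ones((n, n), dtype=complex)          # input field U_i(x, y)
    u_after_lens = pupil * u_i * lens_phase(n, dx, wavelength, z_ml) \
                   * np.exp(-1j * k * phase_screen)
    u1 = fresnel_propagate(u_after_lens, wavelength, z_ml, dx)    # Eq. (1)

The same pair of operations (lens phase plus transfer-function propagation over z_µl) applied to each P × Q subfield of u1 reproduces the per-microlens step of Eq. (2).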

Fig. 2. Schematic of the simulated system. The phase screen is placed at the main lens pupil; the object is in the conjugate plane of the main lens.

The final plenoptic image can be obtained by arranging the fields formed by each lenslet and calculating the squared modulus:

 x y  I  x, y   U  x   y   x    P,y   Q  P  P , Q   Q     

2

(3)

where ⌊x⌋ denotes the integer part of x. We used this method to generate a simulated plenoptic image. This image contains information about the phase screen, and that information can be recovered because the plenoptic sensor can also work as a wavefront sensor; it has been proposed in [8] for measuring wavefront phases using extended objects as a reference. With this method, the starting point is to generate synthetic aperture images by recomposition of the plenoptic image. Every point of view across the pupil can be synthesized (or re-imaged) by rearranging pixels of the plenoptic image: the image formed by putting together one pixel from every microlens, with the same position relative to each microlens center, is a synthetic aperture image of the pupil (Fig. 3).

Fig. 3. Left, aperture of the main lens. Right, sample plenoptic image. Creating an image by rearranging the pixels marked with "*" would synthesize a 4 × 4 pixel image with the point of view from point "a" in the main lens.
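A minimal sketch of this recomposition, assuming the plenoptic image is arranged as in Eq. (3) with each microlens occupying a P × Q block of pixels; the function names are illustrative:

    import numpy as np

    def synthetic_aperture_image(pleno, P, Q, xi, eta):
        """Collect the pixel at offset (xi, eta) under every microlens,
        yielding an M-by-N view of the scene from one pupil position."""
        H, W = pleno.shape
        M, N = H // P, W // Q
        return pleno[xi::P, eta::Q][:M, :N]

    def all_views(pleno, P, Q):
        """All P*Q synthetic aperture images, one per pupil position (xi, eta)."""
        return [[synthetic_aperture_image(pleno, P, Q, i, j)
                 for j in range(Q)] for i in range(P)]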

From the synthesized aperture images, the wavefront phase gradients can be obtained by cross-correlating every image with one of them used as a reference. The position of the correlation peak is proportional to the relative average tilt of the wavefront over the area subtended by the synthetic aperture. The cross correlation may not work well on images lacking enough information to accurately calculate the displacements between them; clearly, the contrast and the spatial frequency content of the object intensity are the predominant factors in the quality of the recovered phase. Bigger microlenses lead to a greater number of synthetic apertures, but the aperture images have lower resolution, so the object must be limited to lower spatial frequencies. Smaller microlenses lead to higher-resolution synthetic aperture images, but since the number of possible synthetic apertures is smaller, the final resolution of the recovered wavefront will also be smaller. Research on this topic can be found in [8], where the size of the microlenses is constrained with respect to the frequency of the object intensity irregularities, resulting in a constraint on the microlens diameter (d_µl):

d_{\mu l} \leq \frac{1}{2 \rho_c} \qquad (4)

where ρ_c is the maximum measurable object frequency. As seen above, the quality of the recovered wavefront phase depends on the irregularities of the object intensity; for this reason, in this work the recovered phase is taken to be a resampled version of the original phase screen, matched to the resolution of the microlenses (P × Q).
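The gradient measurement described above can be sketched as follows; here the correlation peak is located with the phase-correlation routine of scikit-image rather than a plain cross correlation, and the sub-pixel upsampling factor is an arbitrary choice:

    import numpy as np
    from skimage.registration import phase_cross_correlation

    def wavefront_slopes(views, ref_idx=0):
        """Estimate the per-aperture wavefront tilt from image shifts.
        `views` is a flat list of synthetic aperture images; the shift of
        each one against the reference is proportional to the mean local
        tilt over that synthetic aperture."""
        ref = views[ref_idx]
        slopes = []
        for v in views:
            shift, _, _ = phase_cross_correlation(ref, v, upsample_factor=10)
            slopes.append(shift)      # (dy, dx) displacement in pixels
        return np.array(slopes)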

According to the Huygens-Fresnel principle, the object can be seen as a collection of point sources, each producing a spherical wave associated with its position (x, y). The system can therefore be characterized by its impulse response for each point in the object plane. Let IR_{u,v}(x, y) be the impulse response image for the position (u, v) in the object plane, and I_pleno(x, y) a recorded plenoptic image of an unknown object I_object(x, y) affected by an unknown wavefront aberration. Note that the object resolution and the plenoptic image resolution must match. Here we consider an object resolution of R × S pixels, so there is a total of R·S impulse response images. The deconvolution can be accomplished by solving the following linear inverse problem:

[I_{pleno}] = [IR] \, [I_{object}] \qquad (5)


where [I_pleno] is the data vector, created by reorganizing the plenoptic image into a vector of R·S elements, and [IR] is the observation matrix, obtained by reorganizing each IR_{u,v}(x, y) in the same way as [I_pleno] and concatenating the results into a large R·S × R·S matrix. This is the same method that can be found in [11], with the addition that the phase is measured using only the information in the plenoptic image, so that not only the plenoptic image but also the wavefront aberration is deconvolved.
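The assembly of the observation matrix could look like the following sketch, where impulse_response(u, v) stands for a hypothetical routine that simulates the plenoptic image of a single point source at (u, v):

    import numpy as np

    def build_observation_matrix(impulse_response, R, S):
        """Stack each impulse-response plenoptic image, flattened to a
        column, into the R*S-by-R*S observation matrix [IR] of Eq. (5)."""
        IR = np.zeros((R * S, R * S))
        for u in range(R):
            for v in range(S):
                ir_image = impulse_response(u, v)    # plenoptic image of point (u, v)
                IR[:, u * S + v] = ir_image.ravel()  # one column per object point
        return IR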

In the ideal case, when the phase screen Φ can be obtained without error and there is no noise or quantization error in the sensor, the system can be solved directly with Eq. (6):

[I_{object}] = [IR]^{-1} \, [I_{pleno}] \qquad (6)

In that case, the object intensity values can be completely restored, since the system has a single solution (Fig. 4); i.e., knowing the exact phase screen leads to a full reconstruction of the object. In a more realistic case, not only does the recovered phase screen differ from the real one, but other effects such as quantization error and sensor noise are also present. For this reason it is no longer possible to solve the problem by simply inverting [IR]; iterative methods are required, such as the LSQR algorithm we have used in this work [13].
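A minimal sketch of the iterative solution using the LSQR implementation available in SciPy [13]; the matrix and vector names follow Eq. (5), and the iteration limit is an arbitrary choice:

    from scipy.sparse.linalg import lsqr

    def deconvolve(IR, pleno, R, S, n_iter=200):
        """Solve [I_pleno] = [IR][I_object] in the least-squares sense with
        LSQR, which tolerates the noise and quantization error that make a
        direct inversion of [IR] impractical."""
        b = pleno.ravel()
        result = lsqr(IR, b, iter_lim=n_iter)
        return result[0].reshape(R, S)    # recovered object intensity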

To show the feasibility of this technique, several simulations have been performed varying the resolution of the recovered phase screen. The system comprises a 400 mm focal length main lens (f-number 62.5) and a Kolmogorov phase screen with a Fried parameter of 0.4 mm. The original resolution of the object, the phase screen, and the plenoptic image is 512 × 512 pixels (Fig. 5). Every microlens image has a resolution of 32 × 32 pixels, this being the maximum resolution of the recovered phase screen. The plenoptic image is simulated and sampled to eight bits, and the image is deconvolved using the phase screen resampled at different resolutions.

The results of the simulation can be seen in Fig. 8. Note that even using the full resolution of the phase screen, the full resolution of the original object is not achievable, because the simulated plenoptic sensor uses only eight-bit color and introduces quantization error. However, the final resolution is improved with respect to the focal-plane image even when the resolution of the recovered phase screen is 16 × 16 pixels. Figure 6 shows the quality of the deconvolved image versus the resolution of the recovered phase screen. The metric used is the structural similarity (SSIM), as it is better suited to measuring image degradation [14]; it ranges from 1 (perfect reconstruction of the image) to 0 (no reconstruction at all). This result shows that even when the phase screen is sampled at the microlens resolution, the quality of the deconvolution is near its maximum.

Fig. 6. Structural similarity (SSIM) versus recovered phase screen resolution.
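The SSIM values of Fig. 6 can be reproduced with the reference implementation in scikit-image [14]; the data range is set here assuming eight-bit images:

    from skimage.metrics import structural_similarity

    def reconstruction_quality(original, deconvolved):
        """SSIM between the original object and the deconvolved image:
        1.0 means a perfect reconstruction, values near 0 mean none."""
        return structural_similarity(original, deconvolved, data_range=255)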


The immunity of the method to noise has been analyzed by deconvolving the plenoptic images after adding Gaussian noise at SNRs from 0 dB to 60 dB (Fig. 7). We found that at SNRs lower than 30 dB the image degradation is noticeable.
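A sketch of the noise model used in this test, assuming the SNR is defined from the ratio of signal power to noise power:

    import numpy as np

    def add_gaussian_noise(image, snr_db, rng=None):
        """Add zero-mean Gaussian noise scaled so that
        10*log10(P_signal / P_noise) equals snr_db."""
        rng = np.random.default_rng() if rng is None else rng
        signal_power = np.mean(image.astype(float) ** 2)
        noise_power = signal_power / (10 ** (snr_db / 10))
        noise = rng.normal(0.0, np.sqrt(noise_power), image.shape)
        return image + noise

    # Example: degrade a simulated plenoptic image to 30 dB SNR.
    # noisy = add_gaussian_noise(pleno, snr_db=30)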

Fig. 4. Upper left, original image. Upper right, image blurred by the phase screen. Bottom left, plenoptic image. Bottom right, deconvolved image.

Fig. 5. Left, original object used for testing, based on the 1951 USAF chart. Center, image formed at the microlens plane. Right, plenoptic image to be deconvolved.

Fig. 7. Structural similarity (SSIM) between the object and the deconvolved image versus the SNR of the plenoptic image.

Fig. 8. Deconvolved images and detail of the central groups. The resolution of the recovered phase screen varies between 256 × 256 and 8 × 8 pixels.

In this work, a deconvolution method has been proposed for a scenario in which a strong wavefront aberration is induced by a phase screen. The result is obtained using a single sensor, employed simultaneously for both imaging and wavefront phase sensing. In contrast with the method proposed in [11], a similar result cannot be obtained by superresolution methods; this is the only method that improves the resolution of a plenoptic image when the image has been blurred by a wavefront aberration. To test this technique, a laboratory experiment has been designed (with the same optical parameters used in these simulations); its most difficult part is the calculation of the observation matrix, which implies a calibration of the system. This will be the main objective of our future work.

This work was supported by the National R&D Program (Project AYA2012-32079) of the Ministry of Economy and Competitiveness, the European Regional Development Fund (ERDF), and the European Project FP7-REGPOT2012-CT2012-31637-IMBRAIN.

References
1. E. H. Adelson and J. R. Bergen, Computational Models of Visual Processing 1.2 (1991).
2. H. E. Ives, JOSA 20.6 (1930): 332-340.
3. G. Lippmann, Comptes-Rendus Academie des Sciences 146 (1908): 446-451.

4. J. G. Marichal-Hernández, J. P. Lüke, F. Rosa, F. Perez, and J. M. Rodriguez-Ramos, 3DTV Conference: The True Vision - Capture, Transmission and Display of 3D Video, IEEE, 2009.
5. R. Ng, ACM Transactions on Graphics (TOG) 24.3, ACM, 2005.
6. J. P. Lüke, F. Perez, J. G. Marichal-Hernandez, J. M. Rodriguez-Ramos, and F. Rosa, International Journal of Digital Multimedia Broadcasting 2010 (2009).
7. R. M. Clare and R. G. Lane, JOSA A 22.1 (2005): 117-125.
8. L. F. Rodríguez-Ramos, Y. Martin, J. J. Diaz, J. Piqueras, and J. M. Rodriguez-Ramos, SPIE Optical Engineering + Applications, International Society for Optics and Photonics, 2009.
9. J. M. Rodriguez-Ramos, B. Femenia, I. Montilla, L. F. Rodriguez-Ramos, J. G. Marichal-Hernandez, J. P. Lüke, R. Lopez, J. J. Diaz, and Y. Martin (2009).
10. A. Lumsdaine and T. Georgiev, Indiana University and Adobe Systems, Tech. Rep. (2008).
11. S. A. Shroff and K. Berkner, Applied Optics 52.10 (2013): D22-D31.
12. D. G. Voelz, SPIE, 2011.
13. C. C. Paige and M. A. Saunders, ACM Transactions on Mathematical Software (TOMS) 8.1 (1982): 43-71.
14. Z. Wang, A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli, IEEE Transactions on Image Processing 13.4 (2004): 600-612.

Full references

1. E. H. Adelson and J. R. Bergen, "The plenoptic function and the elements of early vision," Computational Models of Visual Processing 1.2 (1991).
2. H. E. Ives, "Parallax panoramagrams made with a large diameter lens," JOSA 20.6 (1930): 332-340.
3. G. Lippmann, "Epreuves reversibles. Photographies integrales," Comptes-Rendus Academie des Sciences 146 (1908): 446-451.
4. J. G. Marichal-Hernández, J. P. Lüke, F. Rosa, F. Perez, and J. M. Rodriguez-Ramos, "Fast approximate focal stack transform," 3DTV Conference: The True Vision - Capture, Transmission and Display of 3D Video, IEEE, 2009.
5. R. Ng, "Fourier slice photography," ACM Transactions on Graphics (TOG) 24.3, ACM, 2005.
6. J. P. Lüke, F. Perez, J. G. Marichal-Hernandez, J. M. Rodriguez-Ramos, and F. Rosa, "Near real-time estimation of super-resolved depth and all-in-focus images from a plenoptic camera using graphics processing units," International Journal of Digital Multimedia Broadcasting 2010 (2009).
7. R. M. Clare and R. G. Lane, "Wave-front sensing from subdivision of the focal plane with a lenslet array," JOSA A 22.1 (2005): 117-125.
8. L. F. Rodríguez-Ramos, Y. Martin, J. J. Diaz, J. Piqueras, and J. M. Rodriguez-Ramos, "The Plenoptic Camera as a wavefront sensor for the European Solar Telescope (EST)," SPIE Optical Engineering + Applications, International Society for Optics and Photonics, 2009.
9. J. M. Rodriguez-Ramos, B. Femenia, I. Montilla, L. F. Rodriguez-Ramos, J. G. Marichal-Hernandez, J. P. Lüke, R. Lopez, J. J. Diaz, and Y. Martin, "The CAFADIS camera: a new tomographic wavefront sensor for Adaptive Optics" (2009).
10. A. Lumsdaine and T. Georgiev, "Full resolution lightfield rendering," Indiana University and Adobe Systems, Tech. Rep. (2008).
11. S. A. Shroff and K. Berkner, "Image formation analysis and high resolution image reconstruction for plenoptic imaging systems," Applied Optics 52.10 (2013): D22-D31.
12. D. G. Voelz, "Computational Fourier Optics: A MATLAB Tutorial," SPIE, 2011.
13. C. C. Paige and M. A. Saunders, "LSQR: an algorithm for sparse linear equations and sparse least squares," ACM Transactions on Mathematical Software (TOMS) 8.1 (1982): 43-71.
14. Z. Wang, A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli, "Image quality assessment: from error visibility to structural similarity," IEEE Transactions on Image Processing 13.4 (2004): 600-612.