Use Bionic Microlens Array and CMOS Image Sensor for Three-Dimensional Motion Detection

Chung-You Liu, Jun-Fu Chuang, Ting-Chieh Yu, Kerwin Wang
Department of Mechatronics Engineering, National Changhua University of Education, Taiwan
[email protected]

Abstract—This paper proposes a novel three-dimensional motion detection design. The design consists of a bionic microlens array, integrated with an aberration-limited four-lens system and a CMOS image sensor, to mimic insect compound-eye vision. The microlens array, made by time-multiplexed SF6/O2 plasma etching and PDMS (polydimethylsiloxane) molding, is placed directly on top of the CMOS image sensor. An LED testing setup projects reference rays through the artificial ommatidium onto the image sensor for 3D motion analysis and characterization. A computational method has also been established to convert the sensor image into displacement. Both analysis and experimental results demonstrate that the design can predict three-dimensional positions with an average error of less than 2.41%.

Keywords—bionic; microlens; motion detection; 3D positioning; insect vision; compound eyes.

I. INTRODUCTION

Three-dimensional motion detection (TDMD) is one of the most important cornerstones of advanced autonomous robots. With accurate position sensing or object tracking, such robots can move independently without continuous operator guidance. Several three-dimensional motion detection technologies have been proposed in the past decade, including gyroscopic sensing, magnetic sensing, and video motion detection. Video motion detection [1-4] is an ideal approach because it can provide significant information without reengineering the environment. However, computational cost and response speed become major concerns when detecting a fast-moving object. In natural insect motion vision, a compound eye, consisting of a large number of individual photoreceptor units (ommatidia), can detect fast movement by combining the optical information from each unit. This study presents the design, process, configuration, experiment setup, and testing and analysis results of a bionic compound-eye system for 3D motion positioning [4]. Compared with vision-recognition-based detection methods, the bionic microlens array allows simplified motion-detection strategies [5].

II. SYSTEM DESIGN

The novel 3D motion detection architecture and image recognition methodology are developed from biologically inspired artificial compound eyes, without involving complex lens configurations or an expensive camera array.

Fig. 1. (a) Moth compound eyes; (b) a PDMS bionic microlens array (artificial ommatidia) and a silicon lens mold (microlens pitch: 125 µm).

The artificial compound eye system consists of a microlens array, a CMOS image sensor, and an aberration-limited four-lens system (Fig. 1). The microlens molds are made by time-multiplexed plasma etching [6] and PDMS micromolding. After integration with the image sensor, the bionic microlens array can project reference light onto the CMOS image sensor for image processing. The configuration of the three-dimensional motion detector is shown in Fig. 2. The parameters of the microlens array are listed in Table 1.

TABLE 1. THE PARAMETERS OF THE MICROLENS ARRAY AND THE IMAGE SENSOR

Microlens array
  Material:          PDMS
  Optical index:     1.5
  Pitch:             200 µm
  Focal length:      83.3 µm

Image sensor
  Image resolution:  VGA 640×480
  Size:              1/4 inch VGA sensor
  Frame rate:        30 FPS
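For a quick consistency check of the parameters above, a thin plano-convex lenslet obeys the lensmaker's relation f = R/(n - 1). A minimal sketch of this cross-check in Python (the plano-convex profile is an assumption, not stated in the paper; the actual profile produced by the time-multiplexed etch may be paraboloidal, cf. [6]):

    # Thin plano-convex lens: f = R / (n - 1).
    # The plano-convex profile is an assumption for illustration.
    n_pdms = 1.5      # optical index of PDMS (Table 1)
    f_m = 83.3e-6     # focal length in meters (Table 1)

    R_m = f_m * (n_pdms - 1)  # implied radius of curvature
    print(f"Implied radius of curvature: {R_m * 1e6:.1f} um")  # ~41.7 um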


Fig. 2. The 3D motion detector. (a) A schematic diagram of the cross-section view of the artificial ommatidium, which consists of a bionic microlens array placed directly on a CMOS image sensor (labeled dimensions: 200 µm, 30 µm, 200 µm; 1900 µm × 1250 µm). (b) The configuration of the 3D motion detector, which combines the microlens array, an aberration-limited four-lens system, and a 1.3-million-pixel CMOS image sensor.


Without using a pinhole array [7] or gradient-index microlenses, this design integrates the flat artificial compound eyes with an aberration-limited four-lens system. The four-lens system has an adjustable focus, which can project the reference light spot onto the artificial ommatidium.

III. EXPERIMENT SETUP

Fig. 3 shows the experiment setup of the bionic detector. The detection area of the microlens array is exposed to a reference light spot. Each CMOS pixel receives rays from the reference light sources; the point light sources are located at predetermined positions. A set of LEDs projects reference rays through the artificial ommatidium to the CMOS image sensor. The optical reference center and the optic axis of the lens system are aligned with each other. The bionic 3D motion detector is mounted on a fine-tuning stage, which can move along a rail toward or away from the reference light for detector characterization. Three major modes, including one single-reference mode and two dual-reference modes, are tested in the experiments. In the single-reference mode, only one LED is illuminated. In the dual-reference modes, two sets of LEDs are arranged along the x and y axes, respectively (Fig. 4). Over 64 pictures were taken and processed in series for image analysis. One can move the bionic detector around the reference LED; only one mode is tested at a time.

Fig. 3. The experiment setup of the bionic 3D motion detector and the reference light sources. One can move the detector around the reference LED; only one or two LEDs are powered in each test mode.
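As an illustration of how such a frame series might be processed, the sketch below reads a sequence of grabbed frames and records the peak location of the projected reference spot in each; the file naming and the use of Python/OpenCV are assumptions for illustration (the authors' analysis was performed in MATLAB):

    import cv2

    peak_locations = []
    for i in range(64):  # over 64 pictures are processed in series
        # Hypothetical file naming; the paper does not specify one.
        frame = cv2.imread(f"frame_{i:03d}.png", cv2.IMREAD_GRAYSCALE)
        if frame is None:
            continue  # skip missing frames
        # Locate the brightest pixel of the projected reference spot.
        _, _, _, max_loc = cv2.minMaxLoc(frame)
        peak_locations.append(max_loc)  # (x, y) in pixel coordinates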

Fig. 4. The test images grabbed in the three major test modes: (a) single reference mode, with a single LED light source; (b) dual reference mode, with two light sources set along the x axis; (c) dual reference mode, with two light sources set along the y axis. In each mode, the camera lens was kept level with the vertical LED array.

Fig. 5. MATLAB analysis of the light intensity in the grayscale images for the three modes (single mode, dual mode in the x direction, dual mode in the y direction), converting the image information into 2D and 3D coordinate systems.

Fig. 6. Optical illumination distributions after MATLAB post-image processing: (a) single source with the 200 µm microlens array; (b) double source with the 200 µm microlens array (vertical).

In the testing setup, a fine-tuning knob moves the compound-eye module toward or away from the reference light spot for the three-dimensional motion detection experiments. Two types of light source, single and double, are tested with the bionic microlens array in the photographic image tests. The frame rate is 30 FPS, and the VGA resolution is 640 × 480 pixels.

IV. EXPERIMENT RESULTS

The fully filled, densely packed bionic microlenses help the sensor gather enough light (Fig. 4). To determine the relative orientation and distance of the reference ray, the three-dimensional relative positions are extracted by image analysis (Fig. 5). After the analysis, three-dimensional position sensing is achieved by evaluating the threshold value and the weight distribution of the ray-intensity spread along the x, y, and z directions in the images (Fig. 6(a, b)).
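A minimal sketch of the thresholding and intensity-weighted evaluation described above, written in Python/NumPy for illustration (the authors' processing was done in MATLAB; the threshold value here is a placeholder):

    import numpy as np

    def weighted_centroid(img, threshold=64):
        """Threshold a grayscale image, then return the intensity-weighted
        centroid (x, y) of the remaining ray-intensity spread."""
        w = np.where(img >= threshold, img.astype(float), 0.0)
        total = w.sum()
        if total == 0.0:
            return None  # no pixel exceeds the threshold
        ys, xs = np.indices(w.shape)
        return (xs * w).sum() / total, (ys * w).sum() / total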

V. ANALYSIS AND 3D MOTION DETECTION

The projected images show angular and distance sensitivity for a given test. One can use a still image or consecutive moving pictures to determine the location or the movement of the reference light, respectively. The angular acceptance function of the artificial compound eye is shown in Fig. 7. The aberration-limited four-lens system is placed in front of the microlens array; it also determines the sensitivity of the measurement, defined in (1).

Sensitivity = pixel shift / displacement    (1)
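In practice, the sensitivity in (1) can be estimated as the slope of a least-squares fit of pixel shift against displacement. A sketch with illustrative values chosen to match the reported axial sensitivity of 0.286 pixel/mm (these are not the measured data, which appear in Fig. 8):

    import numpy as np

    # Illustrative pairs only; the measured curve is shown in Fig. 8.
    displacement_mm = np.array([0, 50, 100, 150, 200, 250, 300, 350, 400])
    pixel_shift = np.array([0.0, 14.3, 28.6, 42.9, 57.2,
                            71.5, 85.8, 100.1, 114.4])

    # Least-squares slope = sensitivity in pixel/mm.
    slope, intercept = np.polyfit(displacement_mm, pixel_shift, 1)
    print(f"Sensitivity: {slope:.3f} pixel/mm")  # ~0.286 for these values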

However, the sensitivity and coupling efficiency show a nonlinear dependency on the distance and incident angle between the reference rays and the motion detector. The results of pixel shift versus light-source displacement are shown in Fig. 8. The distance between the reference light source and the detector ranges from 155 mm to 355 mm. The optical characteristics of the artificial compound eyes, the sensor images, and the real relative displacements are investigated by both analysis and experiment.

Fig. 7. The maximum viewing angle (degrees) of the reference light spot versus relative moving distance (mm) along Z; the maximum viewing angle of this system is 23.6°.

Fig. 8. The pixel shift (pixel) versus light-source displacement (mm).

A post-processing method has also been established to convert the sensor image into space geometry. After post-image processing and evaluation by the computational method, the simulated position of the light source can be obtained. Fig. 9 shows the comparison between the simulated positions and the actual positions.
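The paper does not spell out the geometric conversion; one plausible sketch, assuming an ideal pinhole-style projection in the dual-reference mode, recovers depth from the separation of the two reference spots (the baseline and focal-length values below are hypothetical, not taken from the paper):

    def estimate_depth(spot_sep_px, baseline_mm, focal_px):
        """Similar-triangles depth estimate for the dual-reference mode:
        two LEDs separated by baseline_mm project spots separated by
        spot_sep_px pixels, so an ideal projection gives
        z = focal_px * baseline_mm / spot_sep_px (in mm)."""
        if spot_sep_px <= 0:
            raise ValueError("spot separation must be positive")
        return focal_px * baseline_mm / spot_sep_px

    # Hypothetical numbers, for illustration only:
    # z_mm = estimate_depth(spot_sep_px=40.0, baseline_mm=30.0, focal_px=300.0)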

Fig. 9. After post-image processing, the optical weight (intensity) distribution function is obtained and converted to estimate the real position and the detected position of the reference target.

VI. CONCLUSIONS AND DISCUSSIONS

In contrast to artificial compound eyes made by 3D microfabrication methods, which locate thousands of artificial ommatidia on a convex surface to increase the viewing angle, this work puts significant effort into integrating flat artificial compound eyes with an aberration-limited four-lens system and a CMOS image sensor, without using a pinhole array, gradient-index microlenses, or a 3D microlens array. This research may provide significant impact in fast 3D motion detection, self-guided robot development, and studies of insect vision and insect path-finding behavior. Both theoretical analysis and experimental results demonstrate that the design can achieve a sensitivity of 0.286 pixel/mm along the optic axis.

ACKNOWLEDGMENT

The authors gratefully acknowledge Prof. C.-C. Lin for valuable discussions.

REFERENCES

[1] K. M. Lee and S. Foong, "Lateral optical sensor with slip detection for locating live products on moving conveyor," IEEE Transactions on Automation Science and Engineering, vol. 7, no. 1, pp. 123-132, January 2010.
[2] P.-L. Chang, F.-H. Hsieh, W.-L. Hsu, and H.-L. Shieh, "An efficient approach for motion detecting and tracking," 2010 International Symposium on Computer Communication Control and Automation (3CA), vol. 2, pp. 330-333, 2010.
[3] A. Bak, S. Bouchafa, and D. Aubert, "Detection of independently moving objects through stereo vision and ego-motion extraction," 2010 IEEE Intelligent Vehicles Symposium (IV), pp. 863-870, 2010.
[4] C. L. Nelson, "3-dimensional video motion detection and assessment," 38th Annual 2004 International Carnahan Conference on Security Technology, pp. 295-302, October 2004.
[5] K.-H. Jeong, J. Kim, and L. P. Lee, "Polymeric synthesis of biomimetic artificial compound eyes," The 13th International Conference on Solid-State Sensors, Actuators and Microsystems, IEEE, 2005.
[6] K. Wang and K. F. Böhringer, "Time-multiplexed plasma-etching of high numerical aperture paraboloidal micromirror arrays," IQEC/CLEO-PR, July 11-15, 2005.
[7] M. Sekine and K. Umeda, "Thin compound eye camera," Applied Optics, vol. 44, no. 15, pp. 2949-2956, 2005.