sensors Article

Multiband Imaging CMOS Image Sensor with Multi-Storied Photodiode Structure †

Yoshiaki Takemoto *, Mitsuhiro Tsukimura, Hideki Kato, Shunsuke Suzuki, Jun Aoki, Toru Kondo, Haruhisa Saito, Yuichi Gomi, Seisuke Matsuda and Yoshitaka Tadaki

Olympus Corporation, Tokyo 163-0914, Japan; [email protected] (M.T.); [email protected] (H.K.); [email protected] (S.S.); [email protected] (J.A.); [email protected] (T.K.); [email protected] (H.S.); [email protected] (Y.G.); [email protected] (S.M.); [email protected] (Y.T.)

* Correspondence: [email protected]; Tel.: +81-42-691-7398

† This paper is an expanded version of our published paper: Takemoto, Y.; Kobayashi, K.; Tsukimura, M.; Takazawa, N.; Kato, H.; Suzuki, S.; Aoki, J.; Kondo, T.; Saito, H.; Gomi, Y.; Matsuda, S.; Tadaki, Y. Optical characteristics of multi-storied photodiode CMOS image sensor with 3D stacking technology. In Proceedings of the 2017 International Image Sensor Workshop, Hiroshima, Japan, 30 May–2 June 2017.

Received: 24 February 2018; Accepted: 22 May 2018; Published: 24 May 2018

Abstract: We developed a multiband imaging CMOS image sensor (CIS) with a multi-storied photodiode structure, which comprises two photodiode (PD) arrays that capture two different images, visible red, green, and blue (RGB) and near-infrared (NIR), at the same time. The sensor enables us to capture a wide variety of multiband images, not limited to conventional visible RGB images taken with a Bayer filter or to invisible NIR images. The wiring layers between the two PD arrays can act as an optimized optical filter for the bottom PD array when their material and thickness are modified. The incident light angle on the bottom PDs depends on the thickness and structure of the wiring and bonding layers, so the structure itself can act as an optical filter. This wide-range sensitivity and optimized optical filtering structure let us create images of specific wavelength bands in addition to visible RGB images, without dedicating pixels to IR within the pixel array and without additional optical components. Our sensor will push the envelope of capturing a wide variety of multiband images.

Keywords: CMOS image sensor; 3D stacked; near infrared; multiband imaging

1. Introduction

There has been demand for a CMOS image sensor (CIS) that captures not only visible red, green, and blue (RGB) images but also invisible infrared (IR) images [1,2]. In optical cameras, IR light is intentionally eliminated by inserting an IR cut filter in front of the image sensor to avoid color degradation. An IR signal, however, is valuable for obtaining additional information, such as the veins lying beneath the skin, because IR penetrates deep into skin where visible light does not [3,4]. Conventionally, there are three methods to capture RGB and NIR mixed images at the same time. The first is to combine two different images taken by two CISs with a dichroic mirror, which separates visible RGB light from NIR light [5]. This combination method requires additional optical components and precise placement adjustment to register the two images in exactly the same frame; the additional components increase the cost and difficulty of assembly. The second is to insert extra NIR color filters among the RGB color filter array, which means that the pixels with the NIR filter become defective pixels for RGB images [6]. This method can get RGB and NIR mixed images at the same time; however, pixel interpolation is needed to construct the RGB image.

Sensors 2018, 18, 1688; doi:10.3390/s18061688


Those pixels for RGB and NIR are on the same chip, which means both kinds of pixels cannot satisfy optimized optical and electrical characteristics at the same time. The third option is to split incident light into three or more different wavelength regions by using a silicon substrate as a kind of filter. Foveon demonstrated a CMOS image sensor with multi-layered photodiodes at different depths in one silicon substrate [7]. This technology may make it possible to capture RGB and NIR images with one sensor, but it causes color degradation because of signal mixing among the layered photodiodes, which need a complicated device structure.

In this paper, we show the basic characteristics of the multi-storied photodiode CMOS image sensor.
As a sophisticated and effective way to capture a wide variety of multiband images, in this case RGB and NIR images, at the same time without any image degradation or additional optical components, we proposed and demonstrated a multi-storied photodiode CIS with 3D stacking technology.

Section 2 describes the concept and structure of the multi-storied photodiode CMOS image sensor [8]. Section 3 presents and discusses measurement results of the basic characteristics [9]. Conclusions are presented in Section 4.

2. Multi-Storied Photodiode Concept and Structure

Figure 1 shows a conceptual diagram of the multi-storied photodiode CMOS image sensor. The top substrate has a pixel array mainly for visible RGB light, and the bottom substrate has a pixel array for IR light that passes through the top substrate. This means that our CIS splits incident light into six kinds of signals: the RGB signals in the top substrate and, in the bottom substrate, the other three optical signals that pass through one of the three color filters and the top semiconductor layer. This structure enables us to select specific multiband imaging by modifying the top semiconductor layer thickness or by changing the characteristics of the optical filters, such as the color of the filters and a multi-layered dielectric filter, on or between the two substrates. Additionally, both the top and bottom substrates have their own driver circuits to adjust the driving speed and electronic shutter speed and to select readout pixels between the two photodiode (PD) arrays. This function improves the dynamic range for incident light by adapting the exposure time to the light intensity in different wavelength regions. Our novel 3D stacking technology [8,9] enables such functions and lets us align the top pixels exactly with the bottom pixels to avoid aberrations between images obtained from the top and bottom substrates. As a result, no extra PDs designated for IR are needed on the same substrate; such dedicated pixels would complicate the fabrication process and could not be used for visual signals. 3D stacking technology is also useful for achieving other functions, like distance measurement or phase difference detection for lens focusing, using the pixels in the bottom substrate.

Figure 1. Concept of multi-storied photodiode CMOS image sensor based on 3D silicon stacked technology. The sensor comprises two layers of PD arrays, one in the top and the other in the bottom semiconductor. The top PD array converts a part of incident light into corresponding signals and works as an optical filter for the bottom PD array. The bottom PD array converts light that penetrates through the top substrate into signals, which means the top substrate mainly acts as a visible light sensor and the bottom one as an invisible IR light sensor.


Figure SEM image of the image sensor. There are micro lenseslenses and color Figure 22shows showsa across-sectional cross-sectional SEM image of the image sensor. There are micro and filters on the top PD array in the top semiconductor layer, which is 3 µm in this case. Both the top color filters on the top PD array in the top semiconductor layer, which is 3 μm in this case. Both the and substrates have their multilayer wiring layers to function Substrates top bottom and bottom substrates haveown their own multilayer wiring layers toindividually. function individually. were bonded by bonded our 3D stacked technology, which connects two substrates electrically Substrates were by our 3D stacked technology, which connects twophysically substratesand physically and without any harm to the PDs. We measured the image sensor specification with 0.2 µm or less electrically without any harm to the PDs. We measured the image sensor specification with 0.2 μm alignment accuracy.accuracy. A part of A thepart incident light, a longer wavelength, penetrates thepenetrates top substrate or less alignment of the incident light, a longer wavelength, the and top is absorbed by the bottom substrate. Incident light is absorbed by the two substrates in accordance substrate and is absorbed by the bottom substrate. Incident light is absorbed by the two substrates in with the wavelength that we can broader range light information andinformation manipulate and the accordance with the so wavelength sohave that awe can have a of broader range of light obtained images by using two devices, e.g., we can extract IR signals from the top substrate signals manipulate the obtained images by using two devices, e.g., we can extract IR signals from the top with an IRsignals reduction and IR signals from theIR bottom show only a visible light substrate withalgorithm an IR reduction algorithm and signalssubstrate from thetobottom substrate to show image. We canlight also image. 
show IR images from the IR bottom substrate at bottom the same time. at the same time. only a visible We can also show images from the substrate

Figure 2. Cross-sectional SEM image of image sensor. Two substrates are bonded, and both have a semiconductor layer. Micro lenses and color filters lie on the top substrate.

3. Measurement Results and Discussion

3.1. Photoelectric Conversion Characteristics

Figure 3 shows the photoelectric conversion characteristics of the PDs in the top and bottom substrates under a halogen lamp, which emits a broad range of light including visible RGB light and invisible IR light. Each PD array on the substrates captures incoming light as designed and shows linear photoelectric conversion characteristics. These sensitivities show the light absorption efficiency of each pixel. All PDs in the bottom substrate, under both the color filters and the top substrate, show linear photoelectric conversion characteristics, which means enough light passes through to be detected by the PDs in the bottom substrate.


Figure 3. Photoelectric conversion characteristics of PDs in top and bottom substrates. Each PD array on the substrates captures incoming light properly and shows linear photoelectric conversion characteristics.

3.2. Spectral Response Characteristics

Figure 4 shows the measured and calculated normalized quantum efficiencies of the PDs in the bottom substrate; the measured data are dots, and the calculated data are dashed lines. Three types of color filters are arranged on the top substrate, and the incident light penetrates the filters and the top substrate. The blue, green, and red pixels in the bottom substrate showed different spectral characteristics, which means our multi-storied photodiode CMOS image sensor can obtain six kinds of spectral information. The pixels in the bottom substrate are aligned directly below the pixels in the top substrate, which means that incoming light for the pixels in the bottom substrate passes through the color filters and the top semiconductor layer. We calculated the spectral response characteristics of the bottom PDs from these optical characteristics. The measured normalized quantum efficiencies show the same tendency as our calculations with the optical constants of the filters and the substrate materials.

Figure 4. Measured and calculated normalized quantum efficiency of bottom PD array. NIR signals are detected by the bottom PD array, which comprises three types of pixels corresponding to the color filters on the top substrate.

3.3. RGB and NIR Images Taken by the Multi-Storied Photodiode CMOS Image Sensor


image obtained with the the signals fromfrom both both substrates of the of image sensor Figure 5a 5a shows showsan anRGB RGB image obtained with signals substrates the image by using an IR reduction algorithm. The image does not show any color distortion without an an IR sensor by using an IR reduction algorithm. The image does not show any color distortion without filter since IR signals are are subtracted adequately by the Figure 5b shows an IRan image from IR filter since IR signals subtracted adequately by algorithm. the algorithm. Figure 5b shows IR image the bottom substrate signalssignals taken by the by same the same time. In time. the image, intensity from the bottom substrate taken thesensor same at sensor at the same In thethe image, the difference between pixels was adjusted by averaging the blue, green, and red pixels’ signals. These intensity difference between pixels was adjusted by averaging the blue, green, and red pixels’ signals. RGB and IRand images, which were taken without additional optical components or additional dedicated These RGB IR images, which were taken without additional optical components or additional pixels, showed no interference between the RGB image and IR signals. dedicated pixels, showed no interference between the RGB image and IR signals.


Figure 5. (a) RGB and (b) NIR images taken by the image sensor. The RGB image was reconstructed by using both signals from PD arrays in the top and bottom substrates with the IR reduction algorithm. The NIR image was obtained by using only signals from the bottom PD array.

Figure 6 shows the effect of IR subtraction much more clearly. We put an additional IR light emitting diode (LED), whose wavelength and bandwidth were 900 and 60 nm, respectively, as a light source to create a bright spot that degrades images. The IR subtraction was performed based on Equation (1). Here Signal_RGB, Signal_Top, and Signal_Bottom are the RGB signal after the subtraction, the signal from the top photodiodes, and the signal from the bottom photodiodes, respectively. The coefficient A is based on the quantum efficiencies and exposure times of the top and bottom photodiodes. The noise characteristics are described in Equation (2), which means that a bigger bottom signal decreases the signal-to-noise ratio of the RGB signal after IR reduction. Figure 6a shows an RGB image obtained with only the top substrate signals of the sensor; there is a bright spot caused by the extra IR light source. Figure 6b shows an RGB image with IR reduction using the bottom signals and the IR reduction algorithm. Neither color degradation nor abnormal brightness in the RGB image was caused by the extra IR light source, which means that the interference to the RGB image by the extra IR was eliminated from the top substrate signals. The extra IR light can be identified as a bright spot in the image in Figure 6c.

Table 1 shows the design specifications of the image sensor. The image sensor was fabricated by using a 0.18-µm 1P6M process, and the top substrate has a back-illuminated structure. The pixel size and pixel area of the multi-storied photodiodes are 3.8 µm × 3.8 µm and 16.1 mm × 0.9 mm, respectively. The photodiodes are arrayed as 4224 × 240 pixels by using our original stacking process. The top and bottom substrates were designed with the same design rule and architecture.

(1) (1)

q 2 2 Noise RGB == NoiseTop ++A × × ((Noise Bottom ))

(2) (2)
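Equations (1) and (2) can be written out directly. This NumPy sketch assumes registered top and bottom signal values and a scalar coefficient A; the example numbers are illustrative, not measured values.

```python
import numpy as np

def ir_reduction(signal_top, signal_bottom, a):
    """Equation (1): subtract the scaled bottom (IR) signal from the top signal."""
    return signal_top - a * signal_bottom

def noise_after_reduction(noise_top, noise_bottom, a):
    """Equation (2): the two noise terms add in quadrature, so a larger bottom
    signal (hence larger noise_bottom) lowers the SNR of the corrected signal."""
    return np.sqrt(noise_top**2 + (a * noise_bottom)**2)

# Example: a pixel with top signal 100, bottom (IR) signal 20, and A = 1.5.
rgb = ir_reduction(100.0, 20.0, 1.5)          # 70.0
noise = noise_after_reduction(3.0, 4.0, 1.0)  # 5.0 (3-4-5 quadrature sum)
```

The same functions apply elementwise to full image arrays, since the subtraction is performed per pixel on the aligned PD arrays.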

Table 1. Design specifications of the demonstrated multi-storied photodiode CMOS image sensor.

Items                                  Specifications
Fabrication process                    0.18-µm 1P6M
Chip size                              20.1 mm × 19.7 mm
Pixel size                             3.8 µm × 3.8 µm
Multi-storied photodiode pixel area    16.1 mm × 0.9 mm
Number of multi-storied photodiodes    4224 × 240



Figure 6. Effect of IR subtraction. (a) RGB image without IR subtraction. (b) RGB image with IR subtraction. (c) NIR image. The object is radiated by an NIR LED, whose wavelength and bandwidth are 900 nm and 60 nm, respectively.


Figure 7. Angle dependence of normalized sensitivity of bottom PDs with (a) 560-, (b) 640-, and (c) 800-nm wavelength light.

3.4. Incident Angle Dependence of the Bottom PDs

Figure 7 shows the incident angle dependence of the normalized sensitivities of the PDs in the bottom substrate, measured with 560-, 640-, and 800-nm wavelength light, respectively. The two substrates were aligned precisely within an accuracy of 0.2 µm, and both pixel sizes were 3.8 µm as mentioned. Their sensitivity strongly depends on the incident angle of light because the wiring layers between the two substrates also work as a kind of filter, and the total thickness of the wiring layers and the bonding layer was 15 µm. Figure 8 shows a cross-sectional schematic diagram of the multi-storied PD CIS. The incident light reaches the bottom PDs after penetrating the top semiconductor layer, two wiring layers, and the bonding layer. The incident light angle for the bottom PDs is less than 14 degrees because of the geometries of the wiring layouts. The measurement results show a relatively low sensitivity at angles over 10 degrees because the incident light is screened by the wiring layers. This means that we can design and control the incident angle characteristics of the bottom PDs by modifying the wiring layer layouts and structure along with the optical design to meet specifications.


In this case, our CIS has as many as six layers of metal wiring, which limit the incident light angle as shown and degrade the sensitivity of the bottom PDs. The quantum efficiency ratio between the top and bottom photodiodes at 800 nm was 8.8% in the case of vertical incident light. Twenty percent of the incident light is lost in the 3-µm thick silicon substrate at 800 nm, and 71.2% of the incident light is blocked or scattered by the wiring layers and the bonding layer. The aperture ratio of the wiring layers between the substrates is approximately 30%. This means incident light is refracted by the microlens, and more than the aperture ratio of incident light is blocked by the wiring layers. It is possible to improve the sensitivity by as much as twice by reducing the number of wiring layers to three in each CIS and broadening the incident light angle limit. This also suggests that modifying the material and thickness of the wiring layers between the two PD arrays can have an optically optimized effect by working as a filter for the bottom PD array.
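The light budget quoted above is self-consistent and can be checked with simple arithmetic; the variable names below are just labels for the figures given in the text.

```python
# Fraction of vertically incident 800-nm light reaching the bottom PDs.
lost_in_top_silicon = 0.20   # absorbed in the 3-um top silicon substrate
blocked_by_wiring = 0.712    # blocked or scattered by wiring and bonding layers

qe_ratio = 1.0 - lost_in_top_silicon - blocked_by_wiring
print(f"{qe_ratio:.1%}")  # 8.8%, matching the quoted top-to-bottom QE ratio

# Halving the wiring (six metal layers down to three per CIS) could roughly
# double this fraction, consistent with the "as much as twice" estimate above.
improved = 2 * qe_ratio
```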

Figure 8. Cross-sectional schematic diagram of the multi-storied PD CIS. Incident light penetrates the top semiconductor layer, the two wiring layers, and the bonding layer and reaches the bottom PDs. Incident light is limited by the wiring layers.

4. Conclusions

We demonstrated multiband imaging and showed the characteristics of a multi-storied PD CIS which comprises two individually functioning layered devices in different substrates. The sensor was confirmed to capture a wide variety of multiband images at the same time, not limited to conventional visible RGB images taken with a Bayer filter or to invisible IR images. Our device can make the conventional combination of IR and RGB imaging sensors smaller and provide a variety of multiband images. This cutting-edge multi-storied photodiode concept achieves not only multiband imaging but also other functions, like distance measurement with phase difference detection for lens focusing, and much more with specific pixels and circuits in both substrates. Our sensor will push the envelope of capturing a wide variety of multiband and time-divided multi-images. This article demonstrates the two-layered PD CIS; however, we still need to identify the absorption, blocking, scattering, and crosstalk between the substrates. An optical simulation should be performed to categorize these attenuations.

Author Contributions: Y.T. designed fabrication processes, measured characteristics and wrote the paper. M.T. designed analog circuits in the chip and a board to read out signals. H.K. designed analog circuits in the chip. S.S. designed digital architecture. J.A. handled pixel design. T.K. led overall analog circuits in the chip. H.S. handled wafer processes. Y.G. managed process integration. S.M. managed the overall project. Y.T. supervised overall device structure and processes.

Sensors 2018, 18, 1688


Conflicts of Interest: The authors declare no conflict of interest.

