
IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, VOL. 6, NO. 4, DECEMBER 2005


Technical Correspondence

Detection of Pedestrian Crossing Using Bipolarity Feature—An Image-Based Technique

Mohammad Shorif Uddin and Tadayoshi Shioyama

Abstract—This paper presents a novel approach for detecting pedestrian crossings to enhance the safety and mobility of blind people while crossing a road. It is extremely important for a blind person to know whether the area ahead is a crossing or not. In a crossing, the usual black road surface is painted with constant-width periodic white bands. An image-based technique has been developed to detect this bipolar pattern of pedestrian crossings. The presence of a pedestrian crossing is inferred by careful analysis of crossing width, crossing direction, number of crossing bands, and bandwidth trend. Experimental evaluation of the proposed approach was conducted on 100 real images with and without crossings. The proposed technique performed with 95% accuracy and produced no false positives.

Index Terms—Bipolarity feature, image analysis and computer vision, pedestrian crossing, travel aid for blind people.

I. INTRODUCTION

According to World Health Organization statistics, approximately 40 million people worldwide are blind [1], [2]. Mobility, which has been defined as "the ability to travel safely, comfortably, gracefully, and independently through the environment" [3], is the main barrier for these vision-disabled people. The most widely used navigational aids for blind people are the white cane and the guide dog. However, these have many limitations: the range over which a cane can detect special patterns or obstacles is very narrow, and a guide dog requires extensive training and is not suitable for people who are not physically fit or cannot maintain a dog [4]. To improve the versatility of the white cane, various devices have been developed, such as the SONICGUIDE [5], the Mowat sensor [6], the Laser cane [7], and the Navbelt [8]. A pedestrian crossing is a dangerous place for a blind person to cross safely. Besides blind navigation, pedestrian crossings are important landmarks for outdoor mobile robots, for example in map-building applications. However, the above devices cannot assist the blind in detecting the location of a pedestrian crossing, its length, or the state of the traffic lights. Some traffic lights have beepers that prompt the blind person to cross the road when it is safe to do so. However, such equipment is not available at every crossing; indeed, it would take too long for such equipment to be installed

and maintained at every crossing. Blind people obviously cannot see, but they can hear. The arrival of fast and cheap portable laptop computers with multimedia capabilities for converting audio–video streams in real time opens new avenues for developing intelligent navigation systems for blind people. This paper discusses an application of computer vision to improve the mobility of the millions of blind people all over the world. With a view to developing an image-based device by which blind persons can autonomously obtain the information needed to safely negotiate a crossing, crossing-length-measurement techniques were developed [9]–[11]. These length-measurement techniques are based on the assumption that the image contains a crossing. Before measuring the length, however, a blind person needs to know whether the area in front of him is a crossing or not. Therefore, detecting the existence of a crossing is a preprocessing step, to be followed by length measurement and traffic-light state detection. Previously, Se [12] proposed pedestrian-crossing detection by grouping lines and checking for concurrence using the vanishing-point constraint. However, a thorough evaluation of this technique has not yet been performed, and the technique is slow and far from real time. Meijer's [13], [14] "vOICe," consisting of a head-mounted camera, stereo headphones, and a laptop, is the only commercially available vision-based travel aid; it uses a 1-to-1 image-to-sound mapping. Though it can recognize walls, doors, etc., it still lacks a pedestrian-crossing-detection capability. In addition, many researchers have reported image-based pedestrian-detection techniques for the safety of pedestrians [15]–[19]; however, these are not aimed at crossing detection. A good review of image-processing constraints for blind mobility is given in [20]. In this paper, we aim to detect the existence of a crossing in road scenes.
In a crossing, the usual black road surface is painted with constant-width periodic white bands. The existence of a crossing is detected by careful evaluation of this bipolar pattern in an image on the basis of crossing width, crossing direction, number of crossing bands, as well as bandwidth trend. To confirm the effectiveness of the proposed method, an experiment is performed using 100 real images with and without crossings. The rest of this paper is organized as follows. Section II provides a description of the principle of the proposed technique. A method for the detection of the existence of a crossing is presented in Section III. Section IV discusses the experimental results with real scenes, and finally, the conclusions are drawn in Section V.

Manuscript received June 21, 2004; revised April 24, 2005. This work was supported by the Japan Society for the Promotion of Science under Grant-in-Aid for Scientific Research (Nos. 16500110 and 03232). This paper was recommended by Associate Editor H. Chen. M. S. Uddin is with the Department of Computer Science and Engineering, Jahangirnagar University, Savar, Dhaka 1342, Bangladesh. He is also with the Department of Mechanical and System Engineering, Kyoto Institute of Technology, Matsugasaki, Sakyo-ku, Kyoto 606-8585, Japan (e-mail: [email protected]). T. Shioyama is with the Department of Mechanical and System Engineering, Kyoto Institute of Technology, Matsugasaki, Sakyo-ku, Kyoto 606-8585, Japan (e-mail: [email protected]). Digital Object Identifier 10.1109/TITS.2005.858787

II. PRINCIPLE OF THE DETECTION OF A CROSSING

In a crossing, the usual black road surface is painted with constant-width periodic white stripes. A model of a crossing is shown in Fig. 1. In Japan, the width of each white or black band is 45 cm. An image of a real road scene containing a pedestrian crossing is shown in Fig. 2. The crossing pattern can be treated as a bipolar region. First, candidate image regions for a crossing are identified based on the bipolar nature of their image intensities. These regions are then corrected for viewing angle. This correction allows a final labeling based on the number and nature of the black–white and white–black transitions within a region. Therefore, we can detect the existence of

1524-9050/$20.00 © 2005 IEEE


a crossing based on the strength of bipolarity in an image that matches the crossing pattern. In a perfect bipolar image, the intensity distribution is concentrated at two points, i.e., the distribution is a sum of two isolated delta functions. Since bipolarity is the main feature used to detect a crossing, Sections II-A and II-B first define bipolarity and then show how to estimate it. The edges of the crossing bands can be extracted through projection along the crossing direction. Moreover, a blind person is interested in detecting the frontal crossing, not one off to the side, so estimation of the crossing direction is important; its principle is presented in Section II-C.

Fig. 1. Model of a crossing.

Fig. 2. Real crossing image.

A. Definition of Bipolarity

We denote the intensity distribution of an image block by $p_0(x)$. If the block contains only black and white pixels, then $p_0(x)$ can be written as $p_0(x) = \alpha p_1(x) + (1 - \alpha) p_2(x)$, where $0 \le \alpha \le 1$, $p_1(x)$ is the intensity distribution of the black pixels, and $p_2(x)$ is that of the white pixels. Let us define

$$\mu_i = \int_{-\infty}^{\infty} x\,p_i(x)\,dx, \qquad \sigma_i^2 = \int_{-\infty}^{\infty} (x - \mu_i)^2 p_i(x)\,dx, \qquad (i = 0, 1, 2) \tag{1}$$

where $\mu_i$ and $\sigma_i^2$ denote the mean and variance, respectively. In the following, we derive relations among these variables:

$$\mu_0 = \int_{-\infty}^{\infty} x\,p_0(x)\,dx = \int_{-\infty}^{\infty} x\,\{\alpha p_1(x) + (1 - \alpha) p_2(x)\}\,dx = \alpha\mu_1 + (1 - \alpha)\mu_2 \tag{2}$$

$$\begin{aligned}
\sigma_0^2 &= \int_{-\infty}^{\infty} (x - \mu_0)^2 p_0(x)\,dx \\
&= \int_{-\infty}^{\infty} \{x - \alpha\mu_1 - (1 - \alpha)\mu_2\}^2 \{\alpha p_1(x) + (1 - \alpha) p_2(x)\}\,dx \\
&= \int_{-\infty}^{\infty} \{(x - \mu_1) + (1 - \alpha)(\mu_1 - \mu_2)\}^2 \alpha p_1(x)\,dx + \int_{-\infty}^{\infty} \{(x - \mu_2) - \alpha(\mu_1 - \mu_2)\}^2 (1 - \alpha) p_2(x)\,dx \\
&= \alpha\sigma_1^2 + \alpha(1 - \alpha)^2(\mu_1 - \mu_2)^2 + (1 - \alpha)\sigma_2^2 + \alpha^2(1 - \alpha)(\mu_1 - \mu_2)^2 \\
&= \alpha\sigma_1^2 + (1 - \alpha)\sigma_2^2 + \alpha(1 - \alpha)(\mu_1 - \mu_2)^2.
\end{aligned} \tag{3}$$

Equation (3) shows that the total variance consists of the weighted sum of the class variances plus a term in the difference of the class means. If $\sigma_0^2 \approx \alpha(1 - \alpha)(\mu_1 - \mu_2)^2$, then $p_0(x)$ can be said to be almost bipolar, so we define the bipolarity $\gamma$ as

$$\gamma \equiv \frac{\alpha(1 - \alpha)(\mu_1 - \mu_2)^2}{\sigma_0^2}. \tag{4}$$

Equation (4) implies that $0 \le \gamma \le 1$. If $\gamma = 1$, there exist $\alpha$, $p_1$, and $p_2$ such that $\sigma_1 = \sigma_2 = 0$, which means $p_1(x) = \delta(x - \mu_1)$ and $p_2(x) = \delta(x - \mu_2)$. Thus $\gamma = 1$ corresponds to perfect bipolarity and $\gamma = 0$ to the absence of bipolarity. Fig. 3 shows a typical bipolar intensity distribution of an image block.

B. Estimation of Bipolarity

We can estimate $\gamma$ in the following way. Let

$$d = |\mu_1 - \mu_2|, \qquad \sigma_1 \le \frac{|\mu_1 - \mu_2|}{n}, \qquad \sigma_2 \le \frac{|\mu_1 - \mu_2|}{n}.$$

Then

$$\sigma_0^2 = \alpha\sigma_1^2 + (1 - \alpha)\sigma_2^2 + \alpha(1 - \alpha)(\mu_1 - \mu_2)^2 \le \frac{\alpha d^2}{n^2} + \frac{(1 - \alpha)d^2}{n^2} + \alpha(1 - \alpha)d^2 = d^2\left(\frac{1}{n^2} + \alpha(1 - \alpha)\right). \tag{5}$$
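The bipolarity of (4) is straightforward to compute for a discrete image block once the two intensity classes are separated. The following NumPy sketch splits the pixels at the block mean; this split rule is our assumption for illustration, since the correspondence defines γ on the class distributions p1 and p2 but does not spell out how the classes are separated in practice.

```python
import numpy as np

def bipolarity(block):
    """Estimate the bipolarity gamma of an image block per eq. (4).

    Pixels are split into a 'dark' and a 'bright' class at the block
    mean (our assumption); gamma = alpha*(1-alpha)*(mu1-mu2)^2 / var0.
    """
    x = np.asarray(block, dtype=float).ravel()
    var0 = x.var()                         # sigma_0^2 of the whole block
    if var0 == 0.0:                        # constant block: no bipolarity
        return 0.0
    dark = x[x <= x.mean()]
    bright = x[x > x.mean()]
    if dark.size == 0 or bright.size == 0:
        return 0.0
    alpha = dark.size / x.size             # weight of the dark class
    mu1, mu2 = dark.mean(), bright.mean()  # class means
    return alpha * (1.0 - alpha) * (mu1 - mu2) ** 2 / var0

# A perfect black/white stripe block scores gamma = 1 ...
stripes = np.kron([[0.0, 255.0]] * 8, np.ones((2, 8)))
print(bipolarity(stripes))        # ≈ 1.0

# ... while Gaussian noise stays well below the 0.8 threshold of the paper.
noise = np.random.default_rng(0).normal(128.0, 30.0, size=(16, 16))
print(bipolarity(noise) < 0.8)
```

For pure two-level blocks the value is exactly 1 because, with σ1 = σ2 = 0, the law of total variance reduces (3) to the numerator of (4).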

Fig. 3. Bipolar intensity distribution of an image block.

Equation (4) then implies

$$\gamma = \frac{\alpha(1 - \alpha)(\mu_1 - \mu_2)^2}{\sigma_0^2} \ge \frac{\alpha(1 - \alpha)d^2}{d^2\left(\frac{1}{n^2} + \alpha(1 - \alpha)\right)} = \frac{\alpha(1 - \alpha)n^2}{1 + \alpha(1 - \alpha)n^2} = 1 - \frac{1}{1 + \alpha(1 - \alpha)n^2}. \tag{6}$$

Therefore, using (6), one can bound $\gamma$ from below, taking $\alpha$ as a parameter. For example, $\gamma$ must be greater than 0.8 whenever there are $p_1$ and $p_2$ such that $\alpha = 0.5$ and $n \ge 4$.

C. Principle of the Extraction of Crossing Direction

Let $i(x, y)$ denote the intensity of an image, where $x$ and $y$ are the horizontal and vertical spatial coordinates, respectively. If the image is rotated by an angle $\theta$, and $(u, v)$ are the rotated coordinates corresponding to $(x, y)$, then

$$u = x\cos\theta + y\sin\theta, \qquad v = -x\sin\theta + y\cos\theta. \tag{7}$$

Fig. 4. Relationship between the x, y and u, v coordinates.

Fig. 4 shows the relation between the $(x, y)$ and $(u, v)$ coordinates. If there is a pedestrian crossing in the image, then $\theta$ will closely correspond to the direction of the crossing bands. The derivative of $i(x, y)$ along the $v$ direction has alternating peaks and valleys at the edges of the crossing bands (i.e., at the black-to-white and white-to-black transitions); therefore, $\partial i/\partial v$ has local extremes along the direction $v$ at the band edges (for simplicity, we write $i$ instead of $i(x, y)$). Since the band edges are straight lines, integration along the $u$ direction emphasizes these local extremes. Consequently, one can find crossing bands by analyzing the projection

$$\int_{-\infty}^{\infty} \frac{\partial i}{\partial v}\,du \tag{8}$$

which is a one-dimensional function of $v$. As mentioned above, (8) has alternating prominent peaks and valleys when crossing bands exist in the image, so the integral of the square of (8) is a good measure of closeness to the true crossing direction. Accordingly, we use the $\theta$ that maximizes

$$\int_{-\infty}^{\infty} \left( \int_{-\infty}^{\infty} \frac{\partial i}{\partial v}\,du \right)^2 dv. \tag{9}$$

Using Parseval's formula, one can derive

$$\int_{-\infty}^{\infty} \left( \int_{-\infty}^{\infty} \frac{\partial i}{\partial v}\,du \right)^2 dv = \frac{1}{2\pi} \int_{-\infty}^{\infty} \zeta^2 \left| I(-\zeta\sin\theta, \zeta\cos\theta) \right|^2 d\zeta \tag{10}$$

where $I(\xi, \eta)$ is the Fourier transform of $i(x, y)$, given by

$$I(\xi, \eta) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} i(x, y)\,e^{-j(\xi x + \eta y)}\,dx\,dy \tag{11}$$

$j$ is the imaginary unit, and $\zeta = -\xi\sin\theta + \eta\cos\theta$. Although the left-hand side of (10) involves two integrations that depend on $\theta$, the right-hand side involves only one; therefore, we find the $\theta$ that maximizes the right-hand side of (10).
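The maximization of the right-hand side of (10) can be sketched numerically with a 2-D FFT. The discretization below (a one-degree angle grid and nearest-neighbour sampling of the spectrum along the ray (−ζ sin θ, ζ cos θ)) is our own simplification for illustration, not the authors' implementation.

```python
import numpy as np

def crossing_direction(img, angles_deg=np.arange(-45, 46)):
    """Estimate the band direction theta (degrees) by maximising the
    zeta^2-weighted spectral energy along the ray
    (xi, eta) = zeta * (-sin theta, cos theta), per eq. (10).
    Nearest-neighbour sampling of the FFT grid is a simplification."""
    F = np.fft.fftshift(np.fft.fft2(img - img.mean()))
    h, w = img.shape
    cy, cx = h // 2, w // 2                 # DC bin after fftshift
    zeta = np.arange(1, min(cx, cy) - 1)    # radial samples, skipping DC
    best, best_t = -1.0, 0.0
    for t_deg in angles_deg:
        t = np.deg2rad(t_deg)
        xi = np.rint(-zeta * np.sin(t)).astype(int) + cx
        eta = np.rint(zeta * np.cos(t)).astype(int) + cy
        energy = np.sum(zeta ** 2 * np.abs(F[eta, xi]) ** 2)
        if energy > best:
            best, best_t = energy, t_deg
    return float(best_t)

# Horizontal bands (intensity varies only with y) should give theta near 0.
y = np.arange(64)
stripes = np.tile(((y // 8) % 2) * 255.0, (64, 1)).T
print(crossing_direction(stripes))   # close to 0 degrees
```

With discrete bins, angles within a degree of the truth can tie, so the estimate is accurate only to the grid resolution; the paper's 15° acceptance threshold is far coarser than that.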

III. METHOD FOR DETECTION OF THE EXISTENCE OF A CROSSING

The images used in this paper were taken by a commercial digital camera; the image size is (width × height) = 640 × 480 pixels. The origin of the image coordinates is chosen at the upper left corner of the image. First, the color image is converted to a grayscale image using the following transformation:

Y = 0.299R + 0.587G + 0.114B  (12)

where Y denotes the image intensity (or brightness) corresponding to a color coordinate (R, G, B). Then, the image is partitioned into equal-size rectangular blocks of 16 × 16 pixels, and the bipolarity of each block is calculated using a 32 × 32 pixel window. Detection of the existence of a crossing is performed in two stages; flow diagrams of these stages are given in Fig. 5.

Fig. 5. Flow diagrams of the detection stages.

In the first stage, we follow two steps to find the crossing-region candidates.

1) Identify homogeneous bipolar regions. Segmentation is carried out by merging neighboring blocks of similar pattern on the basis of their intensity distributions. Assign a label to a block found by scanning. Join the block with its neighbor if the distance ratio given in (13) is greater than 0.85. If the neighbors join, assign the same label; otherwise, assign a different label to the neighbor. Iterate in this way until all blocks are labeled. The distance ratio is calculated as

$$\text{distance ratio} = \frac{\text{internal distance}}{\text{external distance}} = \frac{\min(\mu_2, \tilde\mu_2) - \max(\mu_1, \tilde\mu_1)}{\max(\mu_2, \tilde\mu_2) - \min(\mu_1, \tilde\mu_1)} \tag{13}$$

where $\mu_1$ and $\mu_2$ are the means of the bipolar distribution of the current block, and $\tilde\mu_1$ and $\tilde\mu_2$ are those of the neighboring block. Fig. 6 shows typical intensity distributions of a bipolar image block and its neighbor.

Fig. 6. Typical intensity distributions of a bipolar image block and its neighbor.

2) Keep only regions that are sufficiently bipolar. Calculate the bipolarity of each segmented region. First, determine the largest region that has a bipolarity higher than 0.80. Then, extract the candidate regions that have bipolarity greater than 0.80 and areas of more than 50% of the largest region's area. In the estimation of bipolarity (Section II-B), we showed that γ > 0.80 is a reasonable threshold when α = 0.50; the 50%-of-largest-area criterion is based on experimental data.

In the second stage, we check each candidate region with the following steps.

3) Refine the segmentation. First, check the largest candidate region. If a small region of a different label exists within this candidate region, fill it with the same label if its area is less than 5% of the candidate area. Then, refine the bottom boundary of the crossing area. The refinement is done by assigning the same label to the pixels on a differently labeled horizontal line if its leftmost and rightmost pixels lie in the candidate region.

4) Make sure the region is suitably positioned in the image. Check the location of the candidate region (whether it is in an appropriate position, too far left, too far right, or too far from the observer). If the entire candidate region lies within 20% of the image width from the left or right boundary, its position is treated as too far left or too far right, respectively. If the y-coordinate of the bottommost point of the candidate region lies more than 40% of the image height from the bottom boundary, the region is treated as too far from the observer. If the candidate region is not in an appropriate location, decide that there is no crossing for this candidate and go on to examine the next candidate region (if any).

5) Correct for viewing angle. Calculate the crossing direction from the original image content at the location of the candidate region using (10), computed with the two-dimensional fast Fourier transform. If the crossing direction exceeds a threshold (15° is used here, on the basis of experimental data), decide that there is no crossing for this candidate, as the crossing direction is too steep, and go on to examine the next candidate region (if any).

6) Remove regions with too few crossing points. Binarize (0, 1) the original image content at the location of the candidate region using its mean. Apply median filtering with a 5 × 5 pixel window to the binarized image to eliminate sporadic noise. Take the mean value of the binarized pixels of the candidate region along the u-axis (i.e., along the crossing direction) for each v-axis position. From this mean-integration result, determine the number of crossing points where the mean integration value crosses 0.5. If the number of crossing points (i.e., the number of black-band and white-band edges) is less than 8, decide that there is no crossing for this candidate and go on to examine the next candidate region (if any).

7) Remove regions with too few white bands. Calculate the bandwidth pattern of the existing bands. Usually, the bandwidth decreases in the image plane from nearer to farther locations (with respect to the observer). Let $d_0, d_1, d_2, \ldots, d_n$ be the widths of the zeroth, first, second, ..., nth bands, respectively. Starting from the nearest location, determine for each location the largest continuous set of $n$ bands where

$$d_{n-1} + 4 \ge d_n \quad \text{and} \quad d_{n-2} + 2 \ge d_n. \tag{14}$$

In real road scenes there is noise, and the white paint on a crossing is not perfect, so disturbances may occur in the bandwidths; to cope with these, 4 and 2 pixels are added in the first and second conditions of (14), respectively. Choose the maximum number of bands that satisfy the bandwidth pattern among all locations. Then count the number of white bands among them; a band is taken as white if the integration result at the center of two successive crossing points is greater than 0.5. If at least four successive white bands satisfy the bandwidth pattern of a crossing, decide that there is a crossing in the considered image; otherwise, decide that there is no crossing for this candidate and go on to check the next candidate region (if any).
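The projection analysis of steps 6) and 7) can be sketched as follows. The fragment assumes the candidate region has already been binarized and rotated so that its rows run along the band direction (the u-axis), and it omits the perspective tolerance of (14); function and variable names are illustrative.

```python
import numpy as np

def count_bands(binary_region):
    """Sketch of steps 6)-7): project a binarized candidate region along
    the band direction and analyse the 0.5-crossings of the profile.

    Rows of `binary_region` are assumed aligned with the u-axis.
    Returns (number of crossing points, number of white bands).
    """
    profile = binary_region.mean(axis=1)     # mean integration along u
    above = profile > 0.5
    # indices where the profile crosses the 0.5 level
    crossings = np.flatnonzero(above[1:] != above[:-1]) + 1
    # a band between two successive crossing points is white if the
    # profile there stays above 0.5
    white = sum(
        profile[a:b].mean() > 0.5
        for a, b in zip(crossings[:-1], crossings[1:])
    )
    return len(crossings), int(white)

# Synthetic candidate: alternating white/black bands of 8 rows each.
rows = (np.arange(80) // 8) % 2 == 0
region = np.repeat(rows[:, None], 40, axis=1).astype(float)
n_cross, n_white = count_bands(region)
print(n_cross, n_white)   # -> 9 4: passes both thresholds (>= 8, >= 4)
```

On this synthetic region the rule of the paper would accept the candidate, since there are at least eight crossing points and at least four white bands between them.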

IV. EXPERIMENTAL RESULTS

To evaluate the performance of the proposed method for the detection of the existence of a crossing, we used 100 real images, with and without crossings, under different backgrounds. The images were taken by a commercial digital camera. Fig. 7 shows some samples of the experimental images. Figs. 8 and 9 present the results obtained at different steps in detecting the existence of a crossing for the images of Figs. 2 and 7(c), respectively. The existence of a crossing is detected in the images of Figs. 2, 7(a), (d), and (g) on the basis that at least four white bands exist in the candidate crossing area of the respective images. Nonexistence of a crossing is detected in the images of Fig. 7(f), (i), (k), and (l) on the basis that fewer than eight crossing points exist; in Fig. 7(b), (e), and (j) on the basis that fewer than four white bands exist in the candidate area; and in Fig. 7(c) and (h) on the basis that no highly bipolar candidate area is extracted in the respective images. Detections are false negative in the following images: in Fig. 7(b), only a few white bands (fewer than four) exist in the image, and in Fig. 7(h), a highly bipolar area is not extracted because the white paint is imperfect, having been applied on an unusual pattern of road surface. The complete detection results are summarized in Table I. From this table, we see that the proposed algorithm is quite successful in detecting the existence of crossings in real road images. The method made no dangerous (false positive) errors, i.e., it never decided that a crossing exists in a scene without one. For a scene containing a crossing where the white paint is damaged or too few crossing bands are visible, it decides that no crossing exists (a false negative). This happened in only five cases among the 100 experimental images.

Fig. 7. Some experimental images.

Fig. 8. Experimental results for the image of Fig. 2. (a) Bipolarity of each segmented region. (b) Candidate regions using bipolarity. (c) Original image (crossing area) at the location of the candidate region. (d) Binarized crossing area. (e) Mean integration along the crossing direction of the binarized image. (f) Bandwidths of crossing bands obtained from the integration results.

Fig. 9. Experimental results for the image of Fig. 7(c). (a) Bipolarity of each segmented region. (b) Candidate regions using bipolarity.

TABLE I
DETECTION-RESULT SUMMARY

V. CONCLUSION

In this paper, an effective image-based method for the detection of the existence of a crossing has been proposed. The method detects the existence of a crossing by careful evaluation of the bipolar (i.e., periodic black and white) crossing pattern in an image on the basis of crossing width, crossing direction, number of crossing bands, and bandwidth trend. Experimental results show that the method is quite successful in detecting the existence of crossings in real road images. The method made no dangerous (false positive) errors, i.e., it never decided that a crossing exists in a scene without one. For a crossing scene where the white paint is damaged or too few crossing bands are visible, it decides that no crossing exists (a false negative); this happened in only five cases among the 100 experimental images. We hope this technique will help increase the mobility of blind people. We have not yet tested the algorithm under varied illumination conditions; in future work, we intend to add an illumination-invariant strategy to make the method robust to illumination changes.

ACKNOWLEDGMENT

The authors gratefully acknowledge the suggestions and comments of Prof. Y. Yoshida and Mr. T. Matsuo of Kyoto Institute of Technology, Japan, and of Dr. M. Yeasin of the University of Memphis, TN. The authors thank the anonymous reviewers for their valuable comments in improving the manuscript.

REFERENCES

[1] B. Thylefors, A. D. Negrel, R. Pararajasegaram, and K. Y. Dadzie, "Global data on blindness," Bull. WHO, vol. 73, no. 1, pp. 115–121, Jan. 1995.
[2] "Blindness and visual disability: Seeing ahead—Projections into the next century," WHO Fact Sheet No. 146, 1997.
[3] C. A. Shingledecker and E. Foulke, "A human factors approach to the assessment of the mobility of blind pedestrians," Hum. Factors, vol. 20, no. 3, pp. 273–286, Jun. 1978.
[4] R. H. Whitestock, L. Frank, and R. Haneline, "Dog guides," in Foundations of Orientation and Mobility, B. B. Blasch and W. R. Weiner, Eds. New York: Amer. Foundation for the Blind, 1997.
[5] L. Kay, "A sonar aid to enhance spatial perception of the blind: Engineering design and evaluation," Radio Electron. Eng., vol. 44, no. 11, pp. 605–629, Nov. 1974.
[6] D. L. Morrissette, G. L. Goodrich, and J. J. Hennessey, "A follow-up study of the Mowat sensor's applications, frequency of use, and maintenance reliability," J. Vis. Impair. Blind., vol. 75, no. 6, pp. 244–247, Jun. 1981.
[7] J. M. Benjamin, "The new C-5 laser cane for the blind," in Proc. Carnahan Conf. Electronic Prosthetics, Lexington, KY, 1973, pp. 77–82.
[8] S. Shoval, J. Borenstein, and Y. Koren, "Auditory guidance with the Navbelt—A computerized travel aid for the blind," IEEE Trans. Syst., Man, Cybern. C, Appl. Rev., vol. 28, no. 3, pp. 459–467, Aug. 1998.
[9] T. Shioyama, H. Wu, N. Nakamura, and S. Kitawaki, "Measurement of the length of pedestrian crossings and detection of traffic lights from image data," Meas. Sci. Technol., vol. 13, no. 9, pp. 1450–1457, Sep. 2002.
[10] M. S. Uddin and T. Shioyama, "Measurement of pedestrian crossing length using vector geometry—An image based technique," in Proc. IEEE Int. Midwest Symp. Circuits and Systems, Hiroshima, Japan, Jul. 2004, vol. I, pp. 229–232.
[11] ——, "Measurement of the length of pedestrian crossings—A navigational aid for blind people," in Proc. IEEE Int. Conf. Intelligent Transportation Systems (ITSC), Washington, DC, Oct. 2004, pp. 690–695.
[12] S. Se, "Zebra-crossing detection for the partially sighted," in Proc. IEEE Computer Society Conf. Computer Vision and Pattern Recognition (CVPR), Hilton Head, SC, Jun. 2000, vol. 2, pp. 211–217.
[13] P. B. L. Meijer. (2003). Vision Technology for the Totally Blind. [Online]. Available: http://www.seeingwithsound.com
[14] ——, "An experimental system for auditory image representations," IEEE Trans. Biomed. Eng., vol. 39, no. 2, pp. 112–121, Feb. 1992.
[15] A. Broggi, M. Bertozzi, A. Fascioli, and M. Sechi, "Shape-based pedestrian detection," in Proc. IEEE Intelligent Vehicles Symp., Dearborn, MI, Oct. 2000, pp. 215–220.
[16] L. Zhao and C. E. Thorpe, "Stereo- and neural network-based pedestrian detection," IEEE Trans. Intell. Transp. Syst., vol. 1, no. 3, pp. 148–154, Sep. 2000.
[17] C. Curio, J. Edelbrunner, T. Kalinke, C. Tzomakas, and W. V. Seelen, "Walking pedestrian recognition," IEEE Trans. Intell. Transp. Syst., vol. 1, no. 3, pp. 155–163, Sep. 2000.
[18] U. Franke and S. Heinrich, "Fast obstacle detection for urban traffic situations," IEEE Trans. Intell. Transp. Syst., vol. 3, no. 3, pp. 173–181, Sep. 2002.
[19] T. Tsuji, H. Hattori, M. Watanabe, and N. Nagaoka, "Development of night vision system," IEEE Trans. Intell. Transp. Syst., vol. 3, no. 3, pp. 203–209, Sep. 2002.
[20] J. Dowling, A. Maeder, and W. Boles, "Intelligent image processing constraints for blind mobility facilitated through artificial vision," in Proc. Australian and New Zealand Conf. Intelligent Information Systems, Sydney, Australia, 2003, pp. 109–114.