Proceedings of International Conference. Transport Means. 2008.

Extraction and Investigation of Lane Marker Parameters for Steering Signal Prediction

A. Vidugirienė*, A. Demčenko**, M. Tamošiūnaitė***

* Vytautas Magnus University, Vileikos 8, LT-, Kaunas, Lithuania, E-mail: [email protected]
** Vytautas Magnus University, Vileikos 8, LT-, Kaunas, Lithuania, E-mail: [email protected]
*** Vytautas Magnus University, Vileikos 8, LT-, Kaunas, Lithuania, E-mail: [email protected]

Abstract

Several lane marker parameters extracted from visual data that can be used for steering prediction in intelligent self-learning driver's assistance systems are presented. During the research, stable parameters suitable for steering prediction were estimated from the lane marker. The presented parameters show high correlation with the car steering angle. Lane marker detection and extraction methods for mono-camera images are also considered in the paper.

KEY WORDS: driver's assistance system, intelligent vehicle, lane marker, steering signal

1. Introduction

Lane marker extraction is considered an almost solved problem in the field of intelligent transportation systems [3, 7], but the extracted lane markers are usually noisy [5]. Several pre-processing steps are required before the extracted lane markers can be used in steering assistance systems [4, 5, 6]. Curvature, which would otherwise be a natural feature, includes the first and second derivatives of the cubic spline [9]:

c = \frac{y''}{\left(1 + (y')^2\right)^{3/2}} ,    (1)

where y is the cubic spline. From Eq. (1) it is clearly seen that the curvature is numerically unstable; consequently, more stable features have to be constructed. The question of which road curve parameters are most useful when a car is steered autonomously has not been answered yet. There are various disturbances: the lane position varies from frame to frame because of car vibration when driving on a road, and a driver does not necessarily keep the car in the same position on the lane throughout the drive. This also adds noise to the extracted lane parameters. Parameters have to be extracted that are as resistant as possible to this type of noise, and the selection of parameters should be considered in the context of these uncertainties. Defining stable features that correlate with the steering signal is therefore important.

In this study lane marker extraction is performed and the parameters of the right lane marker are analyzed. Correlations between the extracted parameters and the steering signal are compared.

2. Data

A Volkswagen Passat was used as a test car for recording driving data. The traffic scenario was recorded on a country road in Germany during daytime. The test car was equipped with two high resolution video cameras for recording the traffic scenarios, which were stored in a personal computer for later analysis. The images were captured with a 0.04 s sampling time. In this work only the images from the right camera were analyzed. The test car control data (including the steering signal) was recorded via CAN-bus with a sampling interval of 0.06 s. Due to the different sampling intervals of the captured images and the car control data, the car control data was resampled at the same time interval as the images.

3. Methods

3.1. Lane extraction algorithm

Among the lane markers of a country road, the right lane marker is usually best visible in camera images and can be most readily extracted. The extraction can be performed with many techniques, from the simplest to very complex ones [2, 8]. In some cases, due to the complexity of the proposed algorithms, it is difficult to apply them in real time when high resolution video cameras (1280×1024 pixels) with a relatively high sampling frequency (25 Hz) are used in the cars.

However, the time consuming processing can be handled when the video camera resolution is not high (640×480 pixels or less) and the sampling frequency is relatively low (e.g. 15 Hz or less). In automated driving systems the curvature parameter is usually employed [9], but the curvature calculation is not a numerically stable procedure. In this work a simple lane marker detection and extraction algorithm that does not require complex image processing was proposed; therefore it can be used in real time applications.

The proposed algorithm includes several image pre-processing steps. As it is more convenient to use greyscale visual data for estimation of the lane marker, the raw RGB road scenario images recorded during the drive (Fig. 1) were converted to the greyscale domain. After that, the standard Matlab "average" filter was used for noise reduction and the edges of the picture were extracted (Fig. 2) using the "Sobel" edge extraction method. Edge extraction was also tested without filtering the images; the results showed that it can be performed without filtering, but in that case the level of random noise in the images increases.

Fig. 1. The raw visual data and the extracted lane marker (red curve)

Fig. 2. The extracted edges in the image and the detection of the lane marker.
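
As an illustration of the pre-processing chain described above, a minimal Python sketch is given below. It approximates the Matlab "average" filter and "Sobel" edge extraction with OpenCV calls; the file name, kernel size and edge threshold are assumptions, not values taken from the paper.

import cv2
import numpy as np

def preprocess_frame(path, kernel_size=3):
    # Load the raw road scenario image (OpenCV loads it in BGR order)
    bgr = cv2.imread(path)
    # Colour -> greyscale conversion
    grey = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    # Averaging filter for noise reduction (stands in for Matlab's "average" filter)
    smoothed = cv2.blur(grey, (kernel_size, kernel_size))
    # Sobel gradients and a simple threshold to obtain a binary edge map
    gx = cv2.Sobel(smoothed, cv2.CV_64F, 1, 0)
    gy = cv2.Sobel(smoothed, cv2.CV_64F, 0, 1)
    magnitude = np.hypot(gx, gy)
    edges = (magnitude > magnitude.mean() + 2.0 * magnitude.std()).astype(np.uint8)
    return edges

# edges = preprocess_frame("frame_0001.png")  # hypothetical file name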

Next, lane marker detection was performed. This task is solved in different ways [2, 7, 8], because there are plenty of disturbing factors, such as a car in front, shadows on the lane marker, faded lane markers, etc. With our lane marker detection and extraction algorithm, in most cases these circumstances do not prevent finding and following a continuous lane marker on a road. The lane marker detection and extraction strategy is presented below.

A search for lane marker edges is performed at a few different fixed coordinates Y (Fig. 2), where Y = y1, y2, ..., yn is a vector of arbitrarily selected y coordinates given in pixels. It is necessary to search for the lane marker at a few different y coordinates because the lane marker edge may be missing at a particular coordinate y. Knowing that the presence of an edge corresponds to the value 1 and its absence to the value 0, it is possible to estimate the vector X = x1, x2, ..., xn of the lane marker edge coordinates. The origin coordinate for lane marker tracing in an image can be selected in different ways; in this work it was selected by minimizing the vector X:

x_0 = \min(X) .    (2)
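
A minimal sketch of this detection step, assuming a binary edge map and a set of fixed y coordinates are already available, might look as follows; the choice of the right-most edge pixel in each row and the example row values are assumptions.

import numpy as np

def detect_lane_origin(edges, y_rows):
    # Search a few fixed rows for lane marker edge pixels and build the vector X
    X, Y = [], []
    for y in y_rows:
        xs = np.flatnonzero(edges[y, :])  # columns where an edge pixel (value 1) is present
        if xs.size:                       # the lane marker edge may be missing at this row
            X.append(int(xs[-1]))         # assumption: take the right-most edge pixel in the row
            Y.append(y)
    if not X:
        return None                       # no lane marker edge found at the selected rows
    i = int(np.argmin(X))                 # Eq. (2): the origin x coordinate is min(X)
    return X[i], Y[i]

# origin = detect_lane_origin(edges, y_rows=[500, 550, 600])  # illustrative row values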

Having both origin coordinates (x0 and y0), the tracing of the lane marker begins. In this work the lane marker was traced at each y coordinate of the image in the following way. The neighbouring yi coordinate of the lane marker is given as:

y_i = y_{i-1} + 1 .    (3)

In the last expression the sign "+" can be replaced by the sign "-" depending on the lane marker tracing direction: upward or downward, respectively. The xi coordinate at the yi coordinate is estimated from the following interval:

x_i = \max(x_{i-1} - w : 1 : x_{i-1} + w) ,    (4)

where w is the x coordinate window in pixels. In this work the window width was selected to be 8 pixels. The lane marker tracing procedure is finished when no lane marker pixel is found in two consecutive rows. An illustration of the extracted lane marker is presented in Fig. 1. A sketch of the tracing loop is given below; the full detection and extraction procedure is summarized in Table 1.
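
A minimal sketch of the tracing loop described by Eqs. (3) and (4), under the assumptions stated in the comments, is given below (one tracing direction only).

import numpy as np

def trace_lane_marker(edges, x0, y0, w=8):
    # Trace the lane marker from the origin toward the top of the image
    # (decreasing y in standard image coordinates); the opposite direction
    # is analogous. At each new row the x coordinate is searched inside the
    # window [x_{i-1} - w, x_{i-1} + w] (Eq. 4); tracing stops when no edge
    # pixel is found in two consecutive rows (assumed stop criterion).
    lane = [(x0, y0)]
    x, y = x0, y0
    misses = 0
    while y > 0 and misses < 2:
        y -= 1
        lo = max(x - w, 0)
        hi = min(x + w, edges.shape[1] - 1)
        window = np.flatnonzero(edges[y, lo:hi + 1])
        if window.size:
            x = lo + int(window.max())   # largest x inside the window, as in Eq. (4)
            lane.append((x, y))
            misses = 0
        else:
            misses += 1
    return lane

# lane = trace_lane_marker(edges, x0=origin[0], y0=origin[1], w=8)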

Table 1
Lane marker extraction algorithm

1. Pre-processing: RGB → greyscale conversion; filtering; edge extraction.
2. Lane detection: search for the lane marker X coordinates at the arbitrarily selected Y coordinates; estimation of the origin coordinate x0 = min(X).
3. Lane marker tracing: cycle through the y coordinates, searching for the xi coordinate in the interval [xi-1 - w : xi-1 + w].

3.2. Extraction of lane marker parameters

In this work the following lane marker parameters were estimated:
- the x coordinate of the lane marker at a fixed y coordinate (Fig. 3); the x coordinates were estimated by solving a line-curve intersection equation;
- the area below the lane marker.

Due to mono-camera image processing, all measurements are made in pixels. It would be possible to use a stereo camera system and estimate true distances or coordinates in meters, but a stereo camera system greatly increases the price of the driver's assistance system. Moreover, a stereo camera system needs calibration from time to time and its image processing is much more complicated than in the mono-camera case, so real-time operation becomes more difficult as well.
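
As a rough illustration, the two parameters could be estimated from the traced pixel list as sketched below. Working on the pixel list with linear interpolation, rather than on a fitted curve with a line-curve intersection, and integrating the x coordinate over y for the area, are simplifying assumptions.

import numpy as np

def lane_parameters(lane, y_fixed=(500, 600)):
    # Sort the traced points by their y coordinate
    pts = sorted(lane, key=lambda p: p[1])
    xs = np.array([p[0] for p in pts], dtype=float)
    ys = np.array([p[1] for p in pts], dtype=float)
    # x coordinate at each fixed y, by linear interpolation between traced points
    x_at_y = {y: float(np.interp(y, ys, xs)) for y in y_fixed}
    # Area below the lane marker, approximated with the trapezoidal rule
    # (integrating the x coordinate over y, in pixels)
    area = float(np.sum((xs[1:] + xs[:-1]) * np.diff(ys)) / 2.0)
    return x_at_y, area

# x_at_y, area = lane_parameters(lane)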

Fig. 3. The estimation of the lane marker parameters: the solid curve is the lane marker, the two points denote the estimated coordinates x1 and x2 at the fixed coordinates y1 and y2, respectively

4. Results

The estimated lane marker parameters are shown in Figs. 4 and 5. Variations of the lane marker x coordinate at two different fixed y coordinates (500 and 600 pixels) are presented in Fig. 4. The results clearly show that the time delay between the x coordinate variation and the steering signal depends on the chosen y coordinate. An increase of the y coordinate (see Fig. 3 for the coordinate system) reduces the time delay between the steering signal and the x coordinate variation, but increases the local peak amplitudes compared with the x coordinate variation at y = 500 pixels. It is useful to notice that the peak amplitude of the steering signal is 22°, which shows that there were no sharp turns on the road during the drive.

Variations of the areas below the lane marker are shown in Fig. 5. Area s1 (green curve) was calculated from the start to the middle point of the lane marker (see Fig. 3 for the coordinate system). Area s2 (red curve) was calculated between the end of the lane marker and y = 500 pixels.

To determine the quality and stability of the parameters, the chosen features were evaluated using cross-correlation coefficients between the parameters and the steering signal. The estimated cross-correlation coefficients are listed in Table 2. The results show that a high correlation (0.93) between the car steering signal and the lane marker parameter (the x coordinate at a fixed y coordinate) can be obtained.

Fig. 4. The variations of the lane marker x coordinates at the two different y coordinates. The green curve marks the x coordinate variation when y = 600 pixels. The red curve denotes the x coordinate variation when y = 500 pixels. The blue curve is the steering signal.

Fig. 5. The variations of the estimated areas below the lane marker. The blue curve corresponds to the steering signal, the green and red curves are the calculated areas s1 and s2, respectively.
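
For completeness, a minimal sketch of how such a coefficient can be computed between a parameter time series and the steering signal is given below; the zero-lag Pearson correlation is used here, which is an assumption about the exact statistic reported in Table 2.

import numpy as np

def correlation_with_steering(parameter, steering):
    # Zero-lag normalized cross-correlation (Pearson coefficient) between a
    # lane marker parameter time series and the steering signal; both series
    # are assumed to be resampled to the 0.04 s image sampling interval and
    # to have equal length.
    p = np.asarray(parameter, dtype=float)
    s = np.asarray(steering, dtype=float)
    p = (p - p.mean()) / p.std()
    s = (s - s.mean()) / s.std()
    return float(np.mean(p * s))

# r = correlation_with_steering(x_at_y500_series, steering_series)  # hypothetical series names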

Table 2
The cross-correlation coefficients between the steering signal and the lane marker parameters

Parameter                                                      Correlation coefficient
Variation of the lane marker x coordinate at y = 500 pixels    0.93
Variation of the lane marker x coordinate at y = 600 pixels    0.64
Variation of the integrated area s1                            0.74
Variation of the integrated area s2                            0.86

5. Conclusions

A simple and efficient lane marker detection and extraction algorithm has been developed for a continuous lane marker on a road. Two parameters were tested in order to find the feature that correlates best with the steering signal for steering prediction in a self-learning driver assistance system: a) the x coordinate variation through the frames at a fixed y value and b) the area below the lane marker. The best correlation was found for the x coordinate variation at y = 500 pixels; its cross-correlation coefficient of 0.93 is high and makes this parameter suitable for steering signal prediction from visual data. The cross-correlation coefficient at y = 600 pixels was rather low (0.64). The area-based parameter s2 showed a fairly high correlation with the steering signal (0.86), but not as high as in the x coordinate variation case.

6. Acknowledgement

This work was supported in part by the European Commission project "Learning to Emulate Perception – Action Cycles in a Driving School Scenario" (DRIVSCO), FP6-IST-FET, contract No. 016276-2.

References

1. Ahrholdt M., Beutner A. Advanced Driver Assistance for Trucks by Lane Observation. 3rd International Workshop on Intelligent Transportation (WIT 2006), Hamburg, Germany, 14-15 March 2006.
2. Aly M. Real Time Detection of Lane Markers in Urban Streets. Proc. of Intelligent Vehicles Symposium, Eindhoven, the Netherlands, June 4-6, 2008, p. 7-12.
3. Apostoloff N., Zelinsky A. Robust Vision Based Lane Tracking Using Multiple Cues and Particle Filtering. Proc. of Intelligent Vehicles Symposium, Columbus, OH, USA, 2003, p. 558-563.
4. Mammar S., Glaser S., Netto M. Time to Line Crossing for Lane Departure Avoidance: A Theoretical Study and an Experimental Setting. IEEE Transactions on Intelligent Transportation Systems, vol. 7, no. 2, June 2006, p. 226-241.
5. Oussalah M., Zaatri A., van Brussel H. Kalman Filter Approach for Lane Extraction and Following. Journal of Intelligent and Robotic Systems, vol. 34, 2002, p. 195-218.
6. Polychronopoulos A., Tsogas M., Amditis A., Etemad A. Extended Path Prediction Using Camera and Map Data for Lane Keeping Support. Proc. of IEEE Intelligent Transportation Systems Conference, 2005.
7. Taylor C. J., Malik J., Weber J. A Real-Time Approach to Stereopsis and Lane-Finding. Proc. of Intelligent Vehicles Symposium, Tokyo, Japan, 1996, p. 207-212.
8. Tsai L. W., Hsieh J. W., Chuang C. H., Fan K. C. Lane Detection Using Directional Random Walks. Proc. of Intelligent Vehicles Symposium, Eindhoven, the Netherlands, June 4-6, 2008, p. 303-306.
9. Zhang X. F., Yang Y., Xie M., Chen H., Yu Z. P. Perception, Planning and Supervisory Control of Unmanned Vehicle for 2010 World Expo. Proc. of IEEE Intelligent Vehicles Symposium, Eindhoven, the Netherlands, 2008.