Cell Recognition by Image Processing and Nonlinear Cross Correlation

E. Pérez-Careta (a), M. Torres-Cisneros (a), J. J. Sanchez-Mondragón (b), O. Vázquez Buenos Aires (c), O. G. Ibarra Manzano (a), L. A. Aguilera-Cortés (a)

(a) Nanobiophotonics Group, DICIS, Salamanca, Guanajuato, México
(b) INAOE/Photonics and Physical Optics Lab., Puebla, México
(c) Universidad del Papaloapan, Loma Bonita, Oaxaca, México

ABSTRACT

This work implements a novel hybrid method for the detection and tracking of biological cells in in vitro samples (Goobic,1 2005). The method detects and tracks cells using image processing, nonlinear filters and normalized cross correlation (ncc), and it is tested on a full sequence of 1080 images of cell cultures. Beyond cell speed, cell tracking differs from other kinds of tracking because cells exhibit mitosis, apoptosis, overlapping and migration (Liao,2 1995). Image processing provides an excellent tool to improve cell recognition and background elimination, set as an a priori task in this work and conveniently implemented through Fourier analysis. The normalized cross correlation was computed in Fourier space to reduce processing time. The problem of target detection was formulated as a nonlinear joint detection/estimation problem on the position parameters. A bank of spatially and temporally localized nonlinear filters is used to estimate the a posteriori likelihood of the existence of the target in a given space-time resolution cell. The shapes of the targets are random and, over the sequence, the targets change shape almost every frame. However, the cross correlation result is based on matching the target shape, not its position, and the system is invariant to rotation. The nonlinear filter makes the cell tracking robust by producing a sharper correlation peak and reducing false positives in the correlation; these false positives may also be reduced by image preprocessing. The combined Fourier and nonlinear filtering implementation gave the best results for the proposed cell tracking method, with the lowest processing time and the best cell localization.

Keywords: Nonlinear filters, image processing, cell tracking, ncc

1. INTRODUCTION

There are several tracking techniques; most of them include object recognition and object segmentation before the cell tracking (Zimmer,3 2002). After cell recognition, the best tracking method is the nonlinear cross correlation (ncc). Image acquisition is fundamental in this work, so this section begins by describing the acquisition method. Images were acquired in JPEG format and are 500 × 700 pixels in size, with a compression rate of 3:1. Since the JPEG format unfortunately has poor resolution, all of them were converted to 24-bit BMP images. The images were acquired at the Laboratory of Image Synthesis and Analysis (LISA), under the following conditions (Debeir,4 2005):

• Pixel resolution of 0.92 μm
• 32-bit PC-BRAB-GI card
• CCD Hitachi Denshi KP-MIE/K-S10 camera

Further author information: Eduardo Pérez-Careta, E-mail: [email protected], Telephone: +52 044 464 108 001 6; Miguel Torres-Cisneros, E-mail: [email protected], Telephone: +52 (0)1 464 64 7 99 40


• Olympus IX50 microscope, 10:1 magnification
• Cell cultures in Falcon plates at 37 °C

The image preprocessing techniques applied consist of:

• Image Normalization
• Image Equalization
• Image Segmentation
• Image Dilation
• Image Erosion

All processing was performed on an HP Pavilion dv6220la notebook PC with an AMD Turion 64 X2 TL-50 processor at 1.61 GHz and 992 MB of RAM. A minimal sketch of these preprocessing steps is shown below.
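A minimal Python/OpenCV sketch of this preprocessing chain is given below. The use of OpenCV, the 3 × 3 kernel and the function structure are assumptions for illustration; only the binarization threshold of 205 comes from the paper (Sec. 4.1).

```python
# Minimal sketch of the preprocessing chain described above (normalization,
# equalization, segmentation by thresholding, dilation and erosion).
# Kernel size and function layout are illustrative assumptions.
import cv2
import numpy as np

def preprocess_frame(frame_gray: np.ndarray, threshold: int = 205) -> np.ndarray:
    """Return a binary mask of candidate cell regions for one grayscale frame."""
    # Normalization: stretch intensities to the full 8-bit range.
    norm = cv2.normalize(frame_gray, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

    # Equalization: flatten the cumulative histogram to enhance contrast.
    eq = cv2.equalizeHist(norm)

    # Segmentation: simple global threshold (the paper reports 205 for binarization).
    _, binary = cv2.threshold(eq, threshold, 255, cv2.THRESH_BINARY)

    # Dilation then erosion: close small gaps on cell borders and remove
    # isolated "noise" pixels produced by the binarization.
    kernel = np.ones((3, 3), np.uint8)
    dilated = cv2.dilate(binary, kernel, iterations=1)
    cleaned = cv2.erode(dilated, kernel, iterations=1)
    return cleaned
```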

1.1 Segmentation

In 1978, Xu5 introduced the concept of segmentation in image processing. He defined segmentation as the classification of pixels according to some features of the image; in his work, he tried to classify pixels based on border information. To complement the definition made by Panda, image segmentation can also be seen as the partitioning of pixels into different homogeneous groups, where all the pixels in the same region have the same intensity value. The homogeneous groups are characterized by a number of features gathered from the image. A year later, Kuroyanagi6 classified homogeneous regions (introduced in the last paragraph) using a mathematical pattern-recognition model. His technique did not require training prototypes but operated in an "unsupervised" mode. The basic procedure is a K-means∗ clustering algorithm which converges to a local minimum in the average squared distance for a specified number of clusters. The K-means algorithm has been widely used for many different applications since then; a minimal clustering sketch is given below.
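To illustrate the K-means clustering referenced above, the sketch below partitions pixel intensities into k homogeneous groups. The use of scikit-learn and the choice k = 3 are assumptions; the underlying algorithm is the same iterative minimization of the intra-cluster squared error described in the text.

```python
# Intensity-based K-means segmentation sketch (library choice and k are assumptions).
import numpy as np
from sklearn.cluster import KMeans

def kmeans_segment(frame_gray: np.ndarray, k: int = 3) -> np.ndarray:
    """Partition pixels into k homogeneous intensity groups and return a label map."""
    pixels = frame_gray.reshape(-1, 1).astype(np.float32)   # one feature: intensity
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(pixels)
    return labels.reshape(frame_gray.shape)                  # per-pixel cluster index
```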

2. NORMALIZED CROSS CORRELATION

The result of this algorithm is a maximum peak at the position of the best match between the two signals, in this case the scene and the target; Fig. 1 shows this result. The maximum correlation of the target and the scene is marked in Fig. 1 by a white dot at the maximum intensity. From the image it is also possible to infer that several targets in the scene can be confused with the original target; those "fake" targets are called false positives. For the 2D correlation, in our case, we define the functions as matrix arrays instead of vectors as in the 1D cross correlation. For the continuous case, we define the correlation of two functions f(x, y) and g(x, y) as

f(x, y) \circ g(x, y) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f^{*}(\alpha, \beta)\, g(x + \alpha, y + \beta)\, d\alpha\, d\beta    (1)

For the discrete case, as in the 1D case, extended functions are required. Let f(x, y) and g(x, y) be two functions of size A × B and C × D respectively; the extended functions fe(x, y) and ge(x, y) have periods M = A + C − 1 and N = B + D − 1, giving the 2D correlation shown in Eq. 2:

f_{e}(x, y) \circ g_{e}(x, y) = \sum_{m=0}^{M-1} \sum_{n=0}^{N-1} f_{e}^{*}(m, n)\, g_{e}(x + m, y + n)    (2)

for x = 0, 1, 2, ..., M − 1 and y = 0, 1, 2, ..., N − 1. It is important to remember that the function ge(x, y) is the target and fe(x, y) is the scene.∗ A direct spatial-domain sketch of Eq. 2 is given below.

∗ K-means is an algorithm to cluster n objects into k partitions, k < n, based on their attributes (Ray,7 2002). It assumes that the object attributes form a vector space (Ray,? 2000). The objective it tries to achieve is to minimize the total intra-cluster variance, i.e., the squared error function (Liao,2 2002).
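A literal spatial-domain sketch of Eq. 2 is shown below, mainly to make the notation concrete. The periodic wrap-around of the extended functions is an interpretation of the definition above; Sec. 4 reports that this spatial evaluation is much slower than the frequency-domain route of Sec. 3.1, which is what the method actually uses.

```python
# Direct (spatial-domain) evaluation of the discrete correlation of Eq. 2,
# using zero-padded extended functions of size M x N. The double loop mirrors
# the equation literally and is intentionally naive.
import numpy as np

def correlate_spatial(scene: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Evaluate Eq. 2: c(x, y) = sum_m sum_n fe*(m, n) ge(x+m, y+n)."""
    A, B = scene.shape
    C, D = target.shape
    M, N = A + C - 1, B + D - 1                      # extended-function periods

    fe = np.zeros((M, N), dtype=complex); fe[:A, :B] = scene    # extended scene
    ge = np.zeros((M, N), dtype=complex); ge[:C, :D] = target   # extended target

    c = np.zeros((M, N), dtype=complex)
    for x in range(M):
        for y in range(N):
            # ge(x+m, y+n): indices wrap with period (M, N) for the extended functions.
            shifted = np.roll(np.roll(ge, -x, axis=0), -y, axis=1)
            c[x, y] = np.sum(np.conj(fe) * shifted)
    return np.real(c)
```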


Figure 1. The complete method in the frequency domain (image processing and FFT stages) and its output, including the detection of false positives and the best correlation match.

3. NONLINEAR FILTERS

In the literature there are many methods to evaluate the performance of the correlation between two signals. To evaluate it in our experiment, the criterion that Javidi used in his 2002 work8 was taken as reference: the peak-to-correlation energy (PCE), given in Eq. 3,

PCE = \frac{|c(0,0)|^{2}}{\int\!\int |c(x,y)|^{2}\, dx\, dy}    (3)

This parameter measures the ratio between the intensity of the correlation peak and the energy spread over the output plane. Every time a correlation between the target and the scene is performed, a sharp, high peak is obtained. PCE values above the threshold indicate that a similar object is present in the scene, while values below it indicate no clear recognition of the object. The output of the nonlinear filter is given by Eq. 4,

H_{\upsilon}[R(\alpha,\beta), S(\alpha,\beta)] = \frac{\upsilon}{2\pi} \int G(\omega)\, \exp\!\left[i\omega\!\left(R^{2}(\alpha,\beta) + S^{2}(\alpha,\beta)\right)\right] J_{\upsilon}\!\left[2\omega R(\alpha,\beta) S(\alpha,\beta)\right] d\omega    (4)

where J_{\upsilon} is the Bessel function of the first kind of order \upsilon. When \upsilon = 1 the phase is preserved and only the amplitude is affected. The Fourier transform of the kth-law nonlinearity is given by Eq. 5,

G(\omega) = \frac{2\,\Gamma(k+1)}{(i\omega)^{k+1}}, \quad k \leq 1    (5)

where \Gamma(\cdot) is the gamma function and k is the severity of the nonlinearity: k = 1 corresponds to a linear device and k = 0 corresponds to a hard-clipping nonlinearity. Linear filtering techniques have serious limitations when dealing with signals that have been created or processed by a system exhibiting some degree of nonlinearity; in general, in situations where the relevant information cannot be specified in the frequency domain. In image processing many of these characteristics are often present, and it is no wonder that image processing is the field where nonlinear filtering techniques first showed clear superiority over linear filters (Peltonen,9 2001).
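A discrete version of the PCE of Eq. 3 can be computed directly on the correlation plane, as sketched below. Taking the global maximum as the correlation peak and the example threshold value are assumptions of this sketch.

```python
# Discrete peak-to-correlation energy (PCE) of Eq. 3: peak intensity divided
# by the total energy of the correlation plane.
import numpy as np

def peak_to_correlation_energy(corr_plane: np.ndarray) -> float:
    """Return PCE = |c_peak|^2 / sum(|c(x, y)|^2) for a 2D correlation plane."""
    peak_intensity = np.max(np.abs(corr_plane)) ** 2
    total_energy = np.sum(np.abs(corr_plane) ** 2)
    return float(peak_intensity / total_energy)

# Example decision rule: accept the detection when PCE exceeds a chosen threshold.
# detected = peak_to_correlation_energy(c) > 0.01   # threshold value is illustrative
```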


The blurring caused by linear filtering is handled by nonlinear filter techniques such as the median filter. With linear filtering, a substantial part of the edge information is lost because of the blurring effect. After this analysis we can say that median filters effectively reduce spike-like noise, while smoothing filters effectively reduce high spatial-frequency noise (Itagaki?). It is easier to improve a nonlinear filter than a linear one: using a nonlinear filter with a specific constraint we may obtain the desired effect without introducing any noise disturbance (Liptser,? 1995). Nonlinear filtering yields a method tolerant to rotation and size. Nonlinear filters discriminate information, reduce noise and sharpen the correlation peak. The kth law applies a nonlinear operator in a symmetric way, where k is the parameter that controls the nonlinearity of the filter. Two values of k define the limiting cases: k = 0 gives the binarizing nonlinearities and k = 1 gives linear filtering, so to obtain a nonlinear system the range of k must lie between these values, 0 < k < 1. This work presents the results and the analysis of the experiments for those values of the k parameter. Depending on the different problems of the image (illumination, noise, scale, rotation), the parameter k can be adjusted. Nonlinear filtering depends on the nonlinearity constraint value k, but it is difficult to find an optimal value for the image to be filtered; depending on the value of the constraint, the effect of the nonlinearity changes. A particular contribution that best describes the kth law, and that was the basis of the nonlinear implementation of this work, was proposed by Javidi10 in 1989. His work performed several correlations to achieve sign recognition using nonlinear filtering: his processor correlated the image with several targets, each matching at least one feature of the image, and used a bank of single filters to build the composite filter. A minimal frequency-domain sketch of the kth-law operation is given after this paragraph.
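As an illustration of the kth-law operation, the sketch below keeps the phase of the joint spectrum and raises its magnitude to the power k (k = 1 linear, k = 0 hard clipping, 0 < k < 1 nonlinear). This is the usual form of the kth law in correlation work and is offered as a simplified stand-in for Eq. 4, not the authors' exact implementation.

```python
# kth-law nonlinearity applied in the Fourier domain: the spectral phase is
# preserved and the magnitude is compressed by the exponent k.
import numpy as np

def kth_law_correlation(scene: np.ndarray, target: np.ndarray, k: float = 0.4) -> np.ndarray:
    """Nonlinear (kth-law) cross correlation of target against scene."""
    F = np.fft.fft2(scene)
    G = np.fft.fft2(target, s=scene.shape)

    # Joint spectrum of the correlation, then kth-law compression of its magnitude.
    joint = np.conj(F) * G
    nonlinear = (np.abs(joint) ** k) * np.exp(1j * np.angle(joint))

    return np.real(np.fft.ifft2(nonlinear))

# The paper reports that k = 0.4 gave the best correlation performance and
# k = 0.1 produced a finer detection peak (see Sec. 5).
```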

3.1 Nonlinear Cross-Correlation

Correlation performs better in the Fourier frequency domain, where \hat{s}(u, v) and \hat{r}(u, v) are the Fourier transforms of s(x, y) and r(x, y) respectively. The Fourier transform is defined by Eq. 6:

\hat{s}(u, v) = TF\{s(x, y)\} = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} s(\xi, \eta)\, \exp[-i 2\pi (u\xi + v\eta)]\, d\xi\, d\eta    (6)

The correlation is then expressed through the inverse transform in Eq. 7:

c(x, y) = TF^{-1}\{\hat{s}(u, v)\, \hat{r}^{*}(u, v)\}    (7)

where TF^{-1} is the inverse Fourier transform. Eq. 7 shows that the correlation is obtained by a pixel-wise multiplication of the two transformed images. The systems that perform such a correlation are called correlators; these systems implement Eq. 7 optically in real-time applications. As in Sec. 3, the filter performance is evaluated with the peak-to-correlation energy (PCE),8 repeated here as Eq. 8:

PCE = \frac{|c(0,0)|^{2}}{\int\!\int |c(x,y)|^{2}\, dx\, dy}    (8)

This parameter measures the ratio between the intensity of the correlation peak and the energy spread over the output plane, so every correlation between the target and the scene that produces a sharp, high peak yields a high PCE value. PCE values above the threshold indicate that a similar object is present in the scene, and values below it indicate no clear recognition of the object. Implementing a robust system with the fewest possible losses depends on the design of the filter: if the system has a dedicated filter for each problem, it is able to filter all the images. A sketch of this frequency-domain correlator pipeline is given below.
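Putting Eqs. 6-8 together, a frequency-domain correlator can be sketched as follows: transform both images, multiply by the conjugate target spectrum, invert, and score the peak with the PCE. The zero-padding of the target to the scene size and the example threshold are assumptions.

```python
# End-to-end sketch of the frequency-domain correlator of Eqs. 6-8.
import numpy as np

def detect_target(scene: np.ndarray, target: np.ndarray, pce_threshold: float = 0.01):
    """Return (row, col) of the best match and whether the PCE exceeds the threshold."""
    S = np.fft.fft2(scene)
    R = np.fft.fft2(target, s=scene.shape)          # pad target to scene size

    c = np.real(np.fft.ifft2(S * np.conj(R)))       # Eq. 7: c = TF^-1{ S * conj(R) }

    pce = (np.max(np.abs(c)) ** 2) / np.sum(np.abs(c) ** 2)   # Eq. 8
    peak_pos = np.unravel_index(np.argmax(c), c.shape)
    return peak_pos, pce > pce_threshold
```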


FRAME   EQUALIZATION   EROSION   DILATION   FILTERING    NCC
1       0.345 s        0.024 s   0.025 s    0.1 s        0.3 s
100     32.03 s        2.3 s     2.4 s      20 s         25.2 s
200     73.43 s        5.03 s    5.13 s     66.4 s       70.35 s
300     110.33 s       7.21 s    7.31 s     94.34 s      106.34 s
400     145.35 s       9.53 s    9.23 s     123.23 s     150.23 s
500     182.36 s       12.33 s   12.53 s    167.34 s     183.98 s
600     215.65 s       15.03 s   16.63 s    197.45 s     205.43 s
700     253.23 s       18.00 s   19.30 s    239.23 s     269.34 s
800     284.32 s       20.4 s    21.3 s     270.56 s     295.76 s
900     311.35 s       22.56 s   23.46 s    314.325 s    356.54 s
1000    345.36 s       25.34 s   25.54 s    344.34 s     400.34 s
1080    374.38 s       27.34 s   29.04 s    388.67 s     453.43 s

Figure 2. Time analysis of the complete algorithm, broken down by processing step, including the ncc results for the 1080 images. The ncc step took the longest time for each single frame.

The composite filter used in this approach is built from several positions and sizes of the target. Correlating this filter with the scene is time consuming, so it is advisable to minimize the number of images used to build the filter.

4. RESULTS

Results of the time analysis and the improvements made to reduce the computational cost are explained in this section. An analysis and comparison of the normalized cross correlation (ncc), implemented in both the spatial and the frequency domain, is also presented. The following figures describe several results of the nonlinear filters using the kth law.

4.1 Image Processing

Image equalization takes a little longer because it computes the cumulative histogram of the image. The equalization of each frame takes between 0.001 and 0.005 s depending on the frequency content of the image, which in turn depends directly on the quantity of cells present: if there are many cells, the change in frequency is large. Fig. 3 shows the complete processing time, including image normalization and equalization; the complete processing time was 3.954 s. The plot shows a roughly linear behaviour, but it changes at times when the cell reduces its size, which reduces the processing time. Around frame 900 the target reduces its size and the curve presents a discontinuity in the average processing time; this happens when mitosis takes place. Morphological operators help to better define the contours in the image, and even though morphological gradient operators are area operators, this processing step takes a short time. Image dilation was applied to detect cell borders (5 pixels). Using the intensity of the Halo and the detected borders, an erosion process was applied to eliminate the "noise" generated by the image binarization. "Noise" in this case is any region that contains a cell mixed with background, as shown in Fig. 4, where most of the region that contains the cell is well defined. A sketch of this morphological border extraction is shown below.
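A possible sketch of the morphological gradient used here for border extraction is shown below: the binary frame is dilated and the original is subtracted, leaving only the border pixels. The use of OpenCV and the kernel width are assumptions; the approximately 5-pixel border width comes from the text.

```python
# Morphological gradient sketch for cell border extraction.
import cv2
import numpy as np

def morphological_border(binary_frame: np.ndarray, width: int = 5) -> np.ndarray:
    """Return an image containing only the cell borders of a binary frame."""
    kernel = np.ones((width, width), np.uint8)
    dilated = cv2.dilate(binary_frame, kernel, iterations=1)
    return cv2.subtract(dilated, binary_frame)      # border = dilated minus original
```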

4.2 Localization Algorithm

This algorithm is based on the Euclidean distance. When the target is located inside the area defined as cell, the cell position for the following frame is the same as in the current one. If the position of the Dirac delta falls in the area corresponding to the Halo or a boundary, the target is also kept in the same position for the next frame. However, if the Dirac delta lies in an area that does not belong to the cell and is not close to it (within 5 pixels), the minimum distance to the closest pixel with maximum intensity is calculated; a pixel with maximum intensity belongs to a boundary or to the Halo of that cell. After finding this minimum distance, we double it in the same direction so as to place the Dirac delta inside the region defined as cell. This ensures that the Dirac delta is always on the cell, or at least near it, in a region considered good for tracking. A sketch of this relocation rule is shown below.
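The relocation rule can be sketched as follows. The masks for the cell and its Halo/boundary, the helper name and the handling of the 5-pixel margin are assumptions made for illustration; only the doubling of the displacement toward the nearest maximum-intensity pixel follows the text directly.

```python
# Sketch of the relocation rule: if the correlation peak (the "Dirac delta")
# falls outside the cell and more than 5 pixels away from its halo/boundary,
# move it toward the nearest maximum-intensity pixel and double that
# displacement so it lands inside the cell region.
import numpy as np

def relocate_peak(peak: tuple, cell_mask: np.ndarray, border_mask: np.ndarray,
                  margin: int = 5) -> tuple:
    """Return the (row, col) position to use for the next frame."""
    r, c = peak
    if cell_mask[r, c] or border_mask[r, c]:
        return peak                                   # already on the cell or its halo

    br, bc = np.nonzero(border_mask)                  # boundary / halo pixels
    if br.size == 0:
        return peak
    dists = np.hypot(br - r, bc - c)
    j = int(np.argmin(dists))
    if dists[j] <= margin:                            # within the 5-pixel margin: keep
        return peak

    # Double the displacement toward the nearest border pixel to land on the cell.
    new_r = int(np.clip(round(r + 2 * (br[j] - r)), 0, cell_mask.shape[0] - 1))
    new_c = int(np.clip(round(c + 2 * (bc[j] - c)), 0, cell_mask.shape[1] - 1))
    return (new_r, new_c)
```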


Figure 3. Processing time of normalization and equalization versus frame number.

Figure 4. Image binarization of a single frame, using a threshold of 205.


Figure 5. Different frames with the tracking of all the cells. Results include phenomena such as mitosis, apoptosis, overlapping and migration of cells over the complete sequence.

To handle the mitosis problem, the properties of the cell were analyzed. During mitosis the cell reduces its size considerably, in most cases its shape becomes a semi-circle, and its whole area turns white. Therefore, using a circular window with a radius of 15 pixels around the Dirac delta, we are able to identify when mitosis is taking place: when most of the area inside the circle is white, a new rectangle with the same features is created. There are then two rectangles in the same area, so we shift them 15 pixels above and 15 pixels below the original position to make sure that each one follows a different target and not the same one. A sketch of this mitosis check is given after this paragraph. Fig. 5 summarizes the methods applied in this work for each single frame, including the analysis of, and solution for, the different natural phenomena of the cell. To detect the target in the scene, both images (scene and target) are first correlated. This correlation took 5 s in the spatial domain and 2.547 s in the frequency domain, including the transformations to the Fourier plane. A cross correlation value of 1 means the target was detected correctly; this held for about 120 frames, after which the cell began to leave the inspection window and the correlation was no longer perfect. After 200 frames the correlation dropped to zero, meaning that no cells were detected in the window, and from then on it varied between 0.0 and 0.4.
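The mitosis check can be sketched as below: a circular window of radius 15 pixels is inspected around the tracked position, and mitosis is flagged when most of the enclosed area is white. The "mostly white" fraction of 0.5 and the use of a binary frame are assumptions.

```python
# Mitosis check: is the circular window around the tracked position mostly white?
import numpy as np

def mitosis_detected(binary_frame: np.ndarray, center: tuple,
                     radius: int = 15, white_fraction: float = 0.5) -> bool:
    """Return True when the circular window around `center` is mostly white."""
    rows, cols = np.ogrid[:binary_frame.shape[0], :binary_frame.shape[1]]
    circle = (rows - center[0]) ** 2 + (cols - center[1]) ** 2 <= radius ** 2
    window = binary_frame[circle]
    return window.size > 0 and np.mean(window > 0) > white_fraction

# On detection, the tracker duplicates the tracking rectangle and offsets the
# two copies 15 pixels above and below the original position (see text).
```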

5. CONCLUSIONS

Binarization is the fastest way to discard the background and consider only the cell. The morphological gradient is the fastest method to detect the borders, obtained by computing the dilation and then a subtraction between the original image and the dilated one. The morphological gradient also works as a filter for "garbage", i.e. small objects present in the blood stream. Dilation of the image does not give good results by itself because it enlarges the areas considered as background and makes the localization of the cell more difficult. Our experience showed that it is not advisable to apply only one of the morphological operators; we recommend using at least both erosion and dilation. Without preprocessing the method does not perform well after 120 frames; using normalization, the algorithm tracks the cell for 170 frames. The detection of the cell does not depend directly on the position of the Dirac delta obtained; it depends on the definition of the border of the cell and of the Halo. The image processing used in this work (morphological gradient) better defines the borders of the cell.


After image processing, the correlation in the frequency domain showed better time performance than the spatial one. The Fourier transform also improves background elimination through frequency comparison. The Fourier transform not only provides information about the frequency content of the objects, but also about their distribution, useful information that complements what is acquired in the spatial domain; using both domains, a robust method is developed. Nonlinear filters provide faster performance because they are smaller in size than linear filters, which reduces the processing time. Nonlinear filtering provides a reliable detection method because most of the time it centers the Dirac delta in the area defined as cell, so the algorithm does not lose the cell. The combined Fourier and nonlinear filtering implementation showed the best results for the proposed cell tracking method, with the lowest processing time and the best cell localization. It is easier to improve a nonlinear filter than a linear one. Our results showed that the performance changes with the kth law: k = 0.4 gave the best correlation performance of the filter, and k = 0.1 improved the detection of the cell with a finer peak. Nonlinear filtering also helps in the resolution of some targets during the process.

REFERENCES

[1] Goobic, A. P., Tang, J., and Acton, S. T., "Image stabilization and registration for tracking cells in the microvasculature," IEEE Transactions on Biomedical Engineering 52, 287-299 (February 2005).
[2] Liao, G., Nagasaki, T., and Gundersen, G., "Low concentrations of nocodazole interfere with fibroblast locomotion without significantly affecting microtubule level: Implications for the role of dynamic microtubules in cell locomotion," J. Cell Sci. 108, 3473-3483 (1995).
[3] Zimmer, C., Labruyere, E., Meas-Yedid, V., Guillen, N., and Olivo-Marin, J.-C., "Improving active contours for segmentation and tracking of motile cells in video microscopy," Proceedings of the 16th International Conference on Pattern Recognition 2, 286-289 (2002).
[4] Debeir, O., Ham, P. V., Kiss, R., and Decaesteker, C., "Tracking of migrating cells under phase-contrast video microscopy with combined mean-shift processes," IEEE Transactions on Medical Imaging 24, 697-711 (June 2005).
[5] Xu, C. and Prince, J. L., "Snakes, shapes, and gradient vector flow," IEEE Transactions on Image Processing 7, 359-369 (March 1998).
[6] Kuroyanagi, N., Guo, L., and Suehiro, N., "Proposal of a novel signal separation principle based on DFT with extended frame Fourier analysis," Global Telecommunications Conference (GLOBECOM '95), Communication Theory Mini-Conference, IEEE, 111-116 (13-17 Nov 1995).
[7] Ray, N. and Acton, S. T., "Motion gradient vector flow: An external force for tracking rolling leukocytes with shape and size constrained active contours," IEEE Transactions on Medical Imaging 23, 1466-1478 (December 2004).
[8] Javidi, B., Castro, M.-A., Kishk, S., and Pérez, E., "Automated detection and analysis of speed limit signs," final report, University of Connecticut, Storrs, CT 06269-5202 (February 2002).
[9] Peltonen, S., Gabbouj, M., and Astola, J., "Nonlinear filter design: methodologies and challenges," Proceedings of the 2nd International Symposium on Image and Signal Processing and Analysis (ISPA 2001), 102-107 (2001).
[10] Javidi, B., "Nonlinear joint power spectrum based optical correlation," Applied Optics 28, 2358-2367 (June 1989).

