Principle of Maximum Entropy for Histogram Transformation and Image Enhancement

Gilson A. Giraldi
National Laboratory for Scientific Computing
Petrópolis, RJ, Brazil
[email protected]

Paulo S.S. Rodrigues
Computer Science and Electrical Engineering Departments of the FEI Technologic University
São Bernardo do Campo, SP, Brazil
[email protected]

Abstract

In this paper, we present a histogram transformation technique which can be used for image enhancement in 2D images. It is based on the application of the Principle of Maximum Entropy (PME) for histogram modification. Firstly, a PME problem is proposed in the context of the nonextensive entropy and its implicit solution is presented. Then, an iterative scheme is used to obtain the solution with a desired precision. Finally, we perform a transformation of the intensity values of the input image which attempts to alter its spatial histogram to match the PME distribution. In the case study we present some examples in order to demonstrate the advantages of the technique as a preprocessing step in an image segmentation pipeline.
1. Introduction

Histogram modeling and transformations are important tools for image processing [5]. Point operations, such as contrast stretching, equalization and thresholding, are based upon the manipulation of the image histogram. In many applications involving image acquisition, such as medical imaging, the targets are often characterized by low contrast or non-uniform intensity patterns in the regions of interest. Therefore, enhancement algorithms are generally required as a pre-processing step for image analysis [6]. In the literature, many histogram-based techniques have been proposed for image enhancement [1]. The simplest method consists of stretching the original histogram linearly so that it occupies the full available intensity range [5]. Histogram equalization is another well-known contrast-enhancement technique, which tries to keep the transformed histogram as uniformly distributed as possible over the entire intensity range [9]. However, the main disadvantage of these operations is that the frontiers between the original histogram modes are not well preserved, which decreases the homogeneity inside the objects of interest. Such an effect is undesirable mainly for image segmentation techniques based only on the gray-level information in the image. For example, local minima of the gray-level histogram can be used to segment the image by thresholding [8]. The threshold level can also be obtained by optimizing some associated information measure, such as entropy. For instance, in [4] a generalization of the classical entropy, called the Tsallis entropy, is applied in a general formalism for image thresholding.

In the present paper, we propose a new histogram modification technique for image enhancement. We consider the application of the Principle of Maximum Entropy (PME) for histogram transformation, inspired by an analogous formulation in statistical physics [10]. The solution of a PME problem is a probability distribution (histogram) that maximizes the entropy subject to some constraints. Firstly, a PME problem is proposed and its implicit solution is presented. We focus on the nonextensive Tsallis entropy in this discussion due to its generality and its capability to cover a larger range of applications. Then, a numerical scheme is designed based on the observation that the obtained expressions can be written as a global mapping in the histogram space. Finally, we transform the intensity values of the input so that the histogram of the output image approximately matches the PME solution.

Although there are other works using entropy in image enhancement (enhancement measures for parameter determination, fuzzy entropy approaches, entropy conservation techniques, among others) [1, 3, 7], the novelty of our work is the application of a nonextensive PME for histogram modification. In the case study, we present some examples in order to show the advantages of the proposed technique. We observe an improvement in the homogeneity inside the regions of interest in the output image. It is important to emphasize that, traditionally, enhancement is accomplished by histogram transformations that preserve the entropy as much as possible, in order to avoid artifact generation [7]. However, although we do observe artifacts in the output, the results show that such artifacts can be easily removed by simple morphological operations, or they are simply cut off by the thresholding technique described next. Besides, we get a considerable improvement in the final segmentation.
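As a point of reference for the simplest baseline mentioned above, linear contrast stretching can be written in a few lines. The following Python/NumPy sketch is not part of the original formulation; the function name and the 8-bit output range are our own assumptions:

    import numpy as np

    def stretch_contrast(image, out_min=0, out_max=255):
        # Linearly map the observed intensity range of the image onto [out_min, out_max].
        img = image.astype(np.float64)
        lo, hi = img.min(), img.max()
        if hi == lo:  # flat image: nothing to stretch
            return np.full(image.shape, out_min, dtype=np.uint8)
        stretched = (img - lo) / (hi - lo) * (out_max - out_min) + out_min
        return stretched.astype(np.uint8)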
2. PME and Image Processing

In the last decade, Tsallis [10] has proposed the following generalized nonextensive entropic form:

S_q = k \, \frac{1 - \sum_{i=1}^{W} p_i^q}{q - 1},    (1)

where k is a normalization factor, p_i is a probability distribution and q \in \mathbb{R} is called the entropic index. This expression recovers the Shannon entropy in the limit q \to 1. The Tsallis entropy offers a new formalism in which the real parameter q quantifies the level of nonextensivity of a physical system [10]. In particular, a general PME has been considered to find distributions that describe such systems. In this PME, the goal is to find the maximum of S_q subject to:

\sum_{i=1}^{W} p_i = 1,    (2)

\frac{\sum_{i=1}^{W} e_i p_i^q}{\sum_{i=1}^{W} p_i^q} = U_q,    (3)

where U_q is a known application-dependent value and the e_i represent the possible states of the system (in our case, the gray-level intensities). Expression (2) just imposes that p_i is a probability, and equation (3) is a generalized expectation value of the e_i (if q = 1 we recover the usual mean value).
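To make the quantities above concrete, the following minimal Python/NumPy sketch evaluates S_q of expression (1) and the generalized expectation of expression (3) for a normalized histogram p over the states e_i; the function names and the default k = 1 are our own choices:

    import numpy as np

    def tsallis_entropy(p, q, k=1.0):
        # S_q = k (1 - sum_i p_i^q) / (q - 1); recovers the Shannon entropy as q -> 1.
        p = np.asarray(p, dtype=np.float64)
        p = p[p > 0]  # empty histogram bins do not contribute
        if abs(q - 1.0) < 1e-12:
            return -k * np.sum(p * np.log(p))  # Shannon limit
        return k * (1.0 - np.sum(p ** q)) / (q - 1.0)

    def generalized_expectation(p, e, q):
        # U_q = sum_i e_i p_i^q / sum_i p_i^q, the q-expectation of the states e_i (expression (3)).
        p = np.asarray(p, dtype=np.float64)
        e = np.asarray(e, dtype=np.float64)
        return np.sum(e * p ** q) / np.sum(p ** q)

For a 256-level gray-scale image, e would simply be np.arange(256) and p the normalized bin counts of the input histogram.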
The proposed PME can be solved using Lagrange multipliers \alpha and \beta:

F = k \, \frac{1 - \sum_{i=1}^{W} p_i^q}{q - 1} + \alpha \left( \sum_{i=1}^{W} p_i - 1 \right) + \beta \left( \frac{\sum_{i=1}^{W} e_i p_i^q}{\sum_{i=1}^{W} p_i^q} - U_q \right).    (4)

Therefore, we have to solve the following equations:

\frac{\partial F}{\partial p_j} = -\frac{kq}{q-1} \, p_j^{q-1} + \alpha + \beta \, \frac{q e_j p_j^{q-1} \sum_{i=1}^{W} p_i^q - q p_j^{q-1} \sum_{i=1}^{W} e_i p_i^q}{\left( \sum_{i=1}^{W} p_i^q \right)^2} = 0,    (5)

\frac{\partial F}{\partial q} = k \, \frac{(1-q) \sum_{i=1}^{W} p_i^q \ln p_i - \left( 1 - \sum_{i=1}^{W} p_i^q \right)}{(q-1)^2} + \beta \, \frac{\sum_{i=1}^{W} e_i p_i^q \ln p_i}{\sum_{i=1}^{W} p_i^q} - \beta \, \frac{\left( \sum_{i=1}^{W} e_i p_i^q \right) \left( \sum_{i=1}^{W} p_i^q \ln p_i \right)}{\left( \sum_{i=1}^{W} p_i^q \right)^2} = 0,    (6)

subject to the constraints given by expressions (2) and (3). Equation (6) gives:

\beta = -k \, \frac{\left[ (1-q) \sum_{i=1}^{W} p_i^q \ln p_i - \left( 1 - \sum_{i=1}^{W} p_i^q \right) \right] \big/ (q-1)^2}{\left[ \left( \sum_{i=1}^{W} p_i^q \right) \left( \sum_{i=1}^{W} e_i p_i^q \ln p_i \right) - \left( \sum_{i=1}^{W} e_i p_i^q \right) \left( \sum_{i=1}^{W} p_i^q \ln p_i \right) \right] \big/ \left( \sum_{i=1}^{W} p_i^q \right)^2}.    (7)

If we multiply expression (5) by p_j and sum the result over j, then simple algebra gives:

\alpha = \frac{kq}{q-1} \sum_{j=1}^{W} p_j^q.    (8)

If we substitute this expression into (5) and multiply by p_j^{1-q}, we get:

-\frac{kq}{q-1} + \frac{kq}{q-1} \, p_j^{1-q} \sum_{i=1}^{W} p_i^q + \beta \, \frac{q \left( e_j \sum_{i=1}^{W} p_i^q - \sum_{i=1}^{W} e_i p_i^q \right)}{\left( \sum_{i=1}^{W} p_i^q \right)^2} = 0.    (9)

We can cancel the factor q in all terms, multiply by (q - 1) and simplify the last term using the definition of U_q in expression (3) to obtain:

-k + k \, p_j^{1-q} \sum_{i=1}^{W} p_i^q + (q-1) \beta \left( \frac{e_j - U_q}{\sum_{i=1}^{W} p_i^q} \right) = 0.    (10)

From this expression, we can isolate p_j^{1-q}, which gives:

p_j^{1-q} = \frac{1}{\sum_{i=1}^{W} p_i^q} \left( 1 - \frac{(q-1)}{k} \, \beta \, \frac{e_j - U_q}{\sum_{i=1}^{W} p_i^q} \right).    (11)

By using this equation and a normalization, in order to guarantee that condition (2) is satisfied, we finally get:

p_j = \frac{\left[ 1 - \frac{(q-1)}{k} \, \beta \, \frac{e_j - U_q}{\sum_{i=1}^{W} p_i^q} \right]^{\frac{1}{1-q}}}{\sum_{m=1}^{W} \left[ 1 - \frac{(q-1)}{k} \, \beta \, \frac{e_m - U_q}{\sum_{i=1}^{W} p_i^q} \right]^{\frac{1}{1-q}}},    (12)

with \beta defined by equation (7). Expression (12) is hard to solve because its right-hand side also depends on the p_j. However, if the right-hand side works as a contraction map F (that is, \|F(x) - F(y)\| \le \alpha \|x - y\| with \alpha \in [0, 1)), then we can obtain a solution through the recursive procedure [2]:

p_1^{n+1} = F_1\left(p_1^n, p_2^n, \ldots, p_W^n\right),
p_2^{n+1} = F_2\left(p_1^n, p_2^n, \ldots, p_W^n\right),
\vdots
p_{W-1}^{n+1} = F_{W-1}\left(p_1^n, p_2^n, \ldots, p_W^n\right),
p_W^{n+1} = F_W\left(p_1^n, p_2^n, \ldots, p_W^n\right),    (13)

where:

F_j\left(p_1^n, \ldots, p_W^n\right) = \frac{\left[ 1 - \frac{(q-1)}{k} \, \beta \, \frac{e_j - U_q}{\sum_{i=1}^{W} (p_i^n)^q} \right]^{\frac{1}{1-q}}}{\sum_{m=1}^{W} \left[ 1 - \frac{(q-1)}{k} \, \beta \, \frac{e_m - U_q}{\sum_{i=1}^{W} (p_i^n)^q} \right]^{\frac{1}{1-q}}}.    (14)

We stop the iteration when:

D\left(p^{n+1}, p^n\right) \equiv \max_i \left| p_i^{n+1} - p_i^n \right| < \delta, \quad i = 1, 2, \ldots, 256,    (15)

for some pre-defined \delta \in \mathbb{R}^{+}.
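Assuming the map defined by expression (14) is indeed a contraction for the chosen q and U_q, the recursion (13)-(15) can be transcribed directly. The Python/NumPy sketch below is ours and makes a few extra choices that the text leaves implicit: \beta is re-evaluated from the current iterate through expression (7), k is fixed to 1, and the bracketed base is clipped to stay positive so that the fractional power remains well defined:

    import numpy as np

    def beta_from_p(p, e, q, k=1.0):
        # Evaluate beta through expression (7) for the current distribution p.
        pq = p ** q
        S, Se = np.sum(pq), np.sum(e * pq)
        num = ((1.0 - q) * np.sum(pq * np.log(p)) - (1.0 - S)) / (q - 1.0) ** 2
        den = (S * np.sum(e * pq * np.log(p)) - Se * np.sum(pq * np.log(p))) / S ** 2
        return -k * num / den

    def pme_histogram(p0, e, q, Uq, k=1.0, delta=1e-6, max_iter=1000):
        # Iterate the map of expressions (13)-(14) until the stopping rule (15) is met.
        p = np.clip(np.asarray(p0, dtype=np.float64), 1e-12, None)  # avoid log(0) in (7)
        p = p / p.sum()
        e = np.asarray(e, dtype=np.float64)
        for _ in range(max_iter):
            beta = beta_from_p(p, e, q, k)            # one possible update rule for beta
            S = np.sum(p ** q)
            base = 1.0 - ((q - 1.0) / k) * beta * (e - Uq) / S
            base = np.clip(base, 1e-12, None)         # keep the fractional power real
            p_new = base ** (1.0 / (1.0 - q))         # numerator of expression (14)
            p_new = p_new / p_new.sum()               # normalization (denominator of (14))
            if np.max(np.abs(p_new - p)) < delta:     # stopping rule (15)
                return p_new
            p = p_new
        return p

In practice, p0 would be the normalized histogram of the input image, e = np.arange(256), and U_q an application-dependent target, for instance the q-expectation of the input histogram computed through expression (3).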
Once the PME distribution is obtained, we use it for histogram modification of the input image. So, we transform the intensity values of the input such that the histogram of the output image matches the PME solution. The usual procedure to perform this task works as follows [5]. Suppose a random variable u \ge 0 with probability density p_1(u) given by the histogram of the input image. The idea is to transform the variable u into another random variable v \ge 0 such that its probability density p_2(v) is given by the solution of the PME. To perform this task, it is just a matter of defining the random variables:

F_1(u) = \int_{0}^{u} p_1(x)\, dx, \qquad F_2(v) = \int_{0}^{v} p_2(y)\, dy,    (16)

and imposing that the value v must satisfy the constraint F_2(v) = F_1(u), which gives:

v(u) = F_2^{-1}\left( F_1(u) \right).    (17)

The random variable v so obtained is a function of the variable u and its probability density is given by p_2(v), according to the definition of F_2(v) in expressions (16). Therefore it is the desired random variable, which must be re-scaled to the range [0, 255] in order to properly set the intensity range of the output image. Once the image enhancement is performed, we can apply a segmentation technique. The whole pipeline is summarized in Figure 1.
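In the discrete, 8-bit setting, expressions (16)-(17) amount to composing the cumulative histogram of the input with the inverse cumulative distribution of the PME solution. The Python/NumPy sketch below is one possible implementation of this matching step; the use of np.searchsorted to invert F_2 and the final linear re-scaling are our own implementation choices:

    import numpy as np

    def match_to_pme(image, p_pme):
        # image: 2-D array of integer gray levels in [0, W-1]; p_pme: PME histogram of length W.
        W = len(p_pme)
        hist, _ = np.histogram(image, bins=W, range=(0, W))
        F1 = np.cumsum(hist / hist.sum())                    # F_1(u) of expression (16)
        F2 = np.cumsum(np.asarray(p_pme, dtype=np.float64))  # F_2(v) of expression (16)
        F2 = F2 / F2[-1]
        # v(u) = F_2^{-1}(F_1(u)): smallest level v with F_2(v) >= F_1(u), expression (17)
        lut = np.clip(np.searchsorted(F2, F1), 0, W - 1).astype(np.float64)
        # re-scale to the full [0, 255] range, as described in the text
        if lut.max() > lut.min():
            lut = (lut - lut.min()) * 255.0 / (lut.max() - lut.min())
        lut = np.round(lut).astype(np.uint8)
        return lut[image]

The distribution returned by the iterative scheme sketched earlier would be passed here as p_pme, completing the enhancement step of the pipeline summarized in Figure 1.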