Investigation of Genetic Algorithms with Self-adaptive Crossover, Mutation, and Selection

Magdalena Smętek and Bogdan Trawiński

Wrocław University of Technology, Institute of Informatics,
Wybrzeże Wyspiańskiego 27, 50-370 Wrocław, Poland
{magdalena.smetek,bogdan.trawinski}@pwr.wroc.pl

Abstract. A method of self-adaptive mutation, crossover, and selection was implemented and applied in four genetic algorithms. The resulting self-adaptive algorithms were then compared, with respect to convergence, with a traditional genetic algorithm containing constant rates of mutation and crossover. The experiments were conducted on six benchmark functions: two unimodal functions, three multimodal functions with many local minima, and one multimodal function with a few local minima. The analysis of the results was supported by statistical nonparametric Wilcoxon signed-rank tests. The algorithm employing self-adaptive selection revealed the best performance.

Keywords: self-adaptive GA, self-adaptive mutation, self-adaptive crossover, self-adaptive selection, benchmark functions.

E. Corchado, M. Kurzyński, M. Woźniak (Eds.): HAIS 2011, Part I, LNAI 6678, pp. 116–123, 2011. © Springer-Verlag Berlin Heidelberg 2011

1 Introduction

The problem of adapting the values of various parameters to optimize processes in Evolutionary Computation (EC) has been studied extensively for the last two decades. The issue of adjusting genetic algorithms (GA) or evolutionary algorithms (EA) to the problem while solving it still seems to be a promising area of research. The probability of mutation and crossover, the size of the selection tournament, and the population size are among the most commonly set parameters of GA/EA. A few taxonomies of parameter setting in EC have been proposed [1], [7], [13]. Angeline [1] distinguishes three adaptation levels of GA/EA parameters: population-level, where parameters global to the population are adjusted; individual-level, where changes affect each member of the population separately; and component-level, where each component of each member may be modified individually. The classification worked out by Smith and Fogarty [13] is based on three division criteria: what is being adapted, the scope of the adaptation, and the basis for change. The latter is further split into two categories: the evidence upon which the change is carried out and the rule or algorithm that executes the change. Eiben, Hinterding, and Michalewicz [7] devised a general taxonomy distinguishing two major forms of parameter value setting, i.e. parameter tuning and parameter control. The first consists in determining good values for the parameters before running GA/EA and then running the algorithm without changing these values (in contradiction to the dynamic nature of GA/EA). The second form lies in dynamically adjusting the parameter values during the execution. The latter can be categorized into three classes: deterministic (parameters are modified according to some deterministic rules without using any feedback from the optimization process), adaptive (some feedback is used to modify parameters), and self-adaptive parameter control (parameters are encoded into the chromosomes and undergo mutation and recombination). A series of parameter control methods have been proposed in the literature [3], [9], [12]. Several mechanisms of mutation and crossover adaptation and self-adaptation have been developed and experimentally tested [2], [4], [5]. Benchmark functions are very often employed in experiments to validate the effectiveness and convergence of novel techniques and to compare them with other methods [6], [14]. Yao [15] categorized them into three groups: unimodal functions, multimodal functions with many local minima, and multimodal functions with a few local minima.

Our former investigations on the use of evolutionary algorithms to learn the rule base and membership functions of fuzzy systems devoted to aid in real estate appraisal showed that it is a laborious and time-consuming process [10]. Therefore, we intend to examine the usefulness of incorporating self-adaptive techniques into our genetic fuzzy systems aimed at generating models for property valuation [8]. The goal of the present paper was to implement a self-adaptive method of mutation, crossover, and selection setting in GAs based on an idea developed by Maruo et al. [11] and to test it using six benchmark functions belonging to three classes, i.e. unimodal (U), multimodal with many local minima (MM), and multimodal with a few local minima (MF).

2 Self-adapting Method of Mutation, Crossover, and Selection Setting

A self-adaptive method was implemented employing a binary encoding of a chromosome which, besides the solution, i.e. an argument or arguments of a given benchmark function, comprises the mutation and crossover rates, thereby making them subject to evolution. The solution is represented with an accuracy of six decimal places, whereas the rates of both mutation and crossover with an accuracy of two decimal places. The mutation and crossover rates can take real values from the ranges [0.0, 0.3] and [0.0, 1.0], respectively. To encode a real value X in a binary chromosome Y, formula (1) was used:

Y = [X · 10^d]_2    (1)

where d denotes the accuracy (the number of decimal places) and [Q]_2 means the conversion of Q to the binary system. In turn, to obtain a real value from the chromosome, formula (2) was applied:

X = [Y]_10 / 10^d    (2)

where [Q]_10 denotes the conversion of Q to the decimal system. According to the above rules, 5 genes are required for the mutation rate and 7 genes for the crossover rate. The chromosome with two self-adaptive parameters is shown in Fig. 1.

Fig. 1. Chromosome with self-adaptive parameters
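The encoding given by formulas (1) and (2) can be sketched in Python as follows; the function names are illustrative, while the bit widths of 5 and 7 and the two-decimal accuracy follow the text:

```python
def encode(x, d, n_bits):
    """Encode real value x, given to d decimal places, as an n_bits-wide binary string."""
    return format(round(x * 10 ** d), f"0{n_bits}b")

def decode(bits, d):
    """Recover the real value from its binary representation."""
    return int(bits, 2) / 10 ** d

# A mutation rate of 0.17 fits in the 5 genes reserved for it (0.30 -> 30 < 2^5),
# and a crossover rate of 0.80 in its 7 genes (1.00 -> 100 < 2^7).
print(encode(0.17, 2, 5))   # -> 10001
print(decode("10001", 2))   # -> 0.17
```

This also shows why 5 and 7 genes suffice: with two decimal places the mutation rate maps to the integers 0..30 and the crossover rate to 0..100.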


Mutation. The self-adaptive mutation is illustrated in Fig. 2. It differs from a standard GA mutation, whose rate remains constant during the run. Each chromosome in the population can be subject to mutation. An N × M matrix with real values randomly drawn from the range [0.0, 0.3] is created, where N is the number of chromosomes in the population and M stands for the number of genes in a chromosome. Each gene in each chromosome is associated with one real value in the matrix. The self-adaptation of the mutation proceeds as follows. For each chromosome in the population:

• extract the genes representing the mutation rate from the chromosome,
• calculate the value of the mutation rate from the extracted genes,
• if the value from the matrix is lower than the mutation rate taken from the chromosome, then the chromosome mutates in the traditional way,
• the matrix remains unchanged during the run.

Fig. 2. Self-adaptive mutation
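A minimal sketch of this procedure, assuming the comparison with the matrix is made gene by gene and that the mutation-rate genes sit at the front of the chromosome (the layout and names are illustrative):

```python
import random

MUT_BITS, D = 5, 2  # genes encoding the mutation rate and its decimal accuracy

def mutation_rate(chrom):
    """Decode the mutation rate stored in the first MUT_BITS genes (assumed layout)."""
    return int("".join(map(str, chrom[:MUT_BITS])), 2) / 10 ** D

def self_adaptive_mutation(population, matrix):
    """Flip gene j of chromosome i when matrix[i][j] is below that chromosome's own rate."""
    for i, chrom in enumerate(population):
        rate = mutation_rate(chrom)
        for j in range(len(chrom)):
            if matrix[i][j] < rate:
                chrom[j] ^= 1  # classic bit-flip mutation

random.seed(0)
pop = [[random.randint(0, 1) for _ in range(12)] for _ in range(4)]
# The N x M matrix of values from [0.0, 0.3] is drawn once and never changes.
matrix = [[random.uniform(0.0, 0.3) for _ in range(12)] for _ in range(4)]
self_adaptive_mutation(pop, matrix)
```

Because the rate genes themselves can be flipped, the mutation rate evolves along with the solution.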

Crossover. The self-adaptive crossover depicted in Fig. 3 also differs from a traditional GA crossover. An N × 1 table with real values randomly drawn from the range [0.5, 1.0] is created, where N is the number of chromosomes in the population. Each chromosome is associated with one real value in the table. The self-adaptation of the crossover proceeds in the following way. For each chromosome in the population:

• extract the genes representing the crossover rate from the chromosome,
• calculate the value of the crossover rate from the extracted genes,
• if the value from the table is lower than the crossover rate from the chromosome, then the chromosome is selected for the classic crossover process,
• the table remains unchanged during the run.

Fig. 3. Self-adaptive crossover
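Under the same assumptions as for the mutation sketch (the crossover-rate genes are taken to follow the mutation-rate genes; names and layout are illustrative), the selection of parents for crossover can be sketched as:

```python
import random

def crossover_rate(chrom, start=5, n_bits=7, d=2):
    """Decode the crossover rate stored in genes [start, start + n_bits) (assumed layout)."""
    return int("".join(map(str, chrom[start:start + n_bits])), 2) / 10 ** d

def select_for_crossover(population, table):
    """Keep the chromosomes whose own crossover rate exceeds their fixed table value."""
    return [c for i, c in enumerate(population) if table[i] < crossover_rate(c)]

random.seed(1)
pop = [[random.randint(0, 1) for _ in range(20)] for _ in range(6)]
# The N x 1 table of values from [0.5, 1.0] is drawn once and never changes.
table = [random.uniform(0.5, 1.0) for _ in range(6)]
parents = select_for_crossover(pop, table)  # these then undergo classic two-point crossover
```

A chromosome whose encoded crossover rate drifts below 0.5 can never beat its table value, so low-crossover individuals gradually exclude themselves from recombination.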


Selection. The self-adaptive selection relies on controlling the population size. Each chromosome is associated with one integer value, which expresses the aging level of the chromosome. At the beginning this value is set to 3 for each chromosome. The self-adaptation of the selection proceeds as follows. For each chromosome in the population:

• subtract 1 from the aging level,
• add 1 to the aging level if the value of the chromosome's fitness function is lower than the median of the values of all chromosomes, or subtract 1 from the aging level in the opposite case,
• subtract 2 from the aging level if the population size has increased 10 times and the fitness function of the chromosome is not in the top 1000 values of the fitness function in the whole population,
• remove from the population all chromosomes with an aging level lower than or equal to zero.
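The aging rules above can be sketched as one step of a Python function; minimisation is assumed (a lower fitness is better), and the function name and return shape are illustrative:

```python
import statistics

def age_population(pop, fitness, ages, initial_size):
    """Apply one aging step of the self-adaptive selection and drop expired chromosomes."""
    med = statistics.median(fitness)
    grown = len(pop) > 10 * initial_size  # population has increased 10 times
    top = set(sorted(range(len(pop)), key=lambda i: fitness[i])[:1000])
    survivors, surviving_ages = [], []
    for i, chrom in enumerate(pop):
        age = ages[i] - 1                      # every chromosome ages by one
        age += 1 if fitness[i] < med else -1   # reward better-than-median fitness
        if grown and i not in top:
            age -= 2                           # extra penalty once the population blew up
        if age > 0:                            # aging level <= 0 means removal
            survivors.append(chrom)
            surviving_ages.append(age)
    return survivors, surviving_ages
```

Better-than-median chromosomes keep their aging level, worse ones lose two per step, so the population size regulates itself without a fixed survivor count.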

3 Plan of Experiments

The main goal of our experiment was to compare, with respect to convergence, a classic GA with four self-adaptive GAs in which the mutation rate, crossover rate, or population size were subject to adaptation. The following notation is used in the remaining part of the paper: GA – a classic genetic algorithm, SAM – a genetic algorithm with self-adaptive mutation rate, SAC – a genetic algorithm with self-adaptive crossover rate, SAMC – a genetic algorithm with both self-adaptive mutation and crossover rates, SAS – a genetic algorithm with self-adaptive selection (i.e. population size).

Table 2. Benchmark functions used in experiments

Type  Function                                                                   n    Domain          fmin
U     f1(x) = Σ_{i=1..n} x_i^2                                                   30   [-100, 100]     0
U     f2(x1, x2) = -cos(x1) cos(x2) e^{-((x1-π)^2 + (x2-π)^2)}                   2    [-100, 100]     -1
MM    f3(x) = -20 e^{-0.2 √((1/n) Σ_{i=1..n} x_i^2)} - e^{(1/n) Σ_{i=1..n} cos(2πx_i)} + 20 + e      30   [-1, 1]         0
MM    f4(x) = 10n + Σ_{i=1..n} (x_i^2 - 10 cos(2πx_i))                           30   [-5.12, 5.12]   0
MM    f5(x) = Σ_{i=1..n} x_i^2 / 4000 - Π_{i=1..n} cos(x_i / √i) + 1             30   [-600, 600]     0
MF    f6(x1, x2) = (1 + (x1+x2+1)^2 (19 - 14x1 + 3x1^2 - 14x2 + 6x1x2 + 3x2^2)) · (30 + (2x1-3x2)^2 (18 - 32x1 + 12x1^2 + 48x2 - 36x1x2 + 27x2^2))      2    [-2, 2]         3

The algorithms were used to find the minimal values (fmin) of six benchmark functions: f1 – De Jong's function, f2 – Easom function, f3 – Ackley's Path function, f4 – Rastrigin's function, f5 – Griewangk's function, f6 – Goldstein-Price function. They were arranged in three groups: unimodal (U), multimodal with many local minima (MM), and multimodal with a few local minima (MF). The functions employed are listed in Table 2.
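For illustration, two of the six benchmarks can be coded directly from their standard definitions; this is a sketch showing only f1 and f4:

```python
import math

def de_jong(x):
    """f1: the sphere function, unimodal, f_min = 0 at x = (0, ..., 0)."""
    return sum(xi ** 2 for xi in x)

def rastrigin(x):
    """f4: highly multimodal, f_min = 0 at x = (0, ..., 0)."""
    n = len(x)
    return 10 * n + sum(xi ** 2 - 10 * math.cos(2 * math.pi * xi) for xi in x)

print(de_jong([0.0] * 30))    # -> 0.0
print(rastrigin([0.0] * 30))  # -> 0.0
```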


The parameters of the classic GA were as follows: the mutation rate was equal to 0.15, the crossover rate to 0.80, and the population size was set to 100. A seven-chromosome tournament method was applied in the selection process. The chromosome length depends on the accuracy of the solution. One third of the genes in a chromosome undergo mutation. A two-point crossover with randomly selected break positions was applied. The initial aging value was 3.

One fitness function was used, based on the commonly known mean absolute error (MAE) measure expressed by formula (3), where y_i denotes the actual value and ŷ_i the value predicted by a model for the i-th case:

MAE = (1/N) Σ_{i=1..N} |y_i - ŷ_i|    (3)

The fitness function, denoted by MAEy, was calculated for the output value of a given benchmark function. It determined how near the optimum the output value of the function was. In this case N was always equal to 1.
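With N = 1, the fitness reduces to the absolute distance of the function output from its known minimum. A sketch, using the Easom function from Table 2 as an assumed example:

```python
import math

def mae(actual, predicted):
    """Mean absolute error over N cases; in the experiments N = 1."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def easom(x1, x2):
    """f2 from Table 2, with the known minimum f_min = -1 at (pi, pi)."""
    return -math.cos(x1) * math.cos(x2) * math.exp(-((x1 - math.pi) ** 2 + (x2 - math.pi) ** 2))

# MAEy of a candidate (x1, x2) = (3.0, 3.0): distance of f2's output from -1.
fitness = mae([-1.0], [easom(3.0, 3.0)])
```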

All five algorithms, i.e. GA, SAM, SAC, SAMC, and SAS, were executed independently 50 times, and the final values of MAEy were calculated as an average over the 50 runs for the best individuals found by the respective algorithms. 50 initial populations composed of 100 chromosomes were randomly created; they were the same for all algorithms in each run. In order to investigate the convergence of the individual algorithms, 100 generations were carried out in each run and the values of MAEy were recorded after every five generations. Moreover, nonparametric Wilcoxon signed-rank tests were carried out on the MAEy values produced by the algorithms by the 100th generation over the 50 independent runs for the individual benchmark functions.
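The experimental protocol can be summarised in a sketch; all callables passed in are hypothetical placeholders for the five algorithms and their helpers, while the constants follow the text:

```python
N_RUNS, N_GEN, POP_SIZE, RECORD_EVERY = 50, 100, 100, 5

def experiment(algorithms, make_initial_population, mae_of_best):
    """Run every algorithm from the same 50 seed populations, recording MAEy every
    5 generations and averaging each checkpoint over the 50 runs."""
    n_checkpoints = N_GEN // RECORD_EVERY
    history = {name: [[] for _ in range(n_checkpoints)] for name in algorithms}
    for run in range(N_RUNS):
        seed_pop = make_initial_population(POP_SIZE, seed=run)  # shared by all algorithms
        for name, step in algorithms.items():
            pop = [c[:] for c in seed_pop]
            for gen in range(1, N_GEN + 1):
                pop = step(pop)  # one generation of GA / SAM / SAC / SAMC / SAS
                if gen % RECORD_EVERY == 0:
                    history[name][gen // RECORD_EVERY - 1].append(mae_of_best(pop))
    return {name: [sum(v) / len(v) for v in recs] for name, recs in history.items()}
```

Sharing the seed populations across algorithms makes the later Wilcoxon tests paired comparisons, run by run.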

4 Results of Experiments

The performance of the GA, SAM, SAC, SAMC, and SAS algorithms on the respective benchmark functions in terms of the MAEy measure is shown in Figs. 4–9. In each case SAS, SAC, and SAMC revealed better convergence than GA, whereas SAM achieved results similar to GA. Moreover, SAS, SAC, and SAMC produced lower values of MAEy than SAM. SAS achieved the best results for all functions and in each case. The advantage of the SAM and SAMC algorithms over GA is particularly apparent on De Jong's, Ackley's Path, and the Goldstein-Price functions.

Fig. 4. Performance of algorithms on De Jong's function in terms of MAEy


Fig. 5. Performance of algorithms on Easom function in terms of MAEy

Fig. 6. Performance of algorithms on Ackley's Path function in terms of MAEy

Fig. 7. Performance of algorithms on Rastrigin's function in terms of MAEy

Fig. 8. Performance of algorithms on Griewangk's function in terms of MAEy


Fig. 9. Performance of algorithms on Goldstein-Price function in terms of MAEy

Table 3. Results of Wilcoxon tests for GA, SAM, SAC, SAMC, SAS algorithms

Alg vs Alg     f1   f2   f3   f4   f5   f6
GA vs SAM      ≈    ≈    +    ≈    +    ≈
GA vs SAC      –    –    –    –    –    –
GA vs SAMC     –    –    –    –    ≈    –
GA vs SAS      –    –    –    –    –    –
SAM vs SAC     ≈    –    –    –    –    –
SAM vs SAMC    –    –    –    –    –    –
SAM vs SAS     –    –    –    –    –    –
SAC vs SAMC    –    ≈    –    ≈    ≈    ≈
SAC vs SAS     –    –    –    –    –    –
SAMC vs SAS    –    –    –    –    –    –

The results of the Wilcoxon tests are given in Table 3, where +, –, and ≈ denote that the first algorithm in a pair performed significantly better than, significantly worse than, or statistically equivalent to the second algorithm, respectively. The main outcome is as follows: GA was significantly worse than SAS, SAC, and SAMC for each benchmark function, except for one case. The differences between SAC and SAMC, and between GA and SAM, are not so clear. Only SAS performed significantly better than all other algorithms for all functions.

5 Conclusions and Future Work

The experiments aimed to compare the convergence of a classic genetic algorithm (GA) with four self-adaptive genetic algorithms (SAM, SAC, SAMC, SAS) in which the rates of mutation and crossover or the population size were dynamically evolved. Six benchmark functions were employed, including two unimodal functions, three multimodal functions with many local minima, and one multimodal function with a few local minima. The results showed that almost all self-adaptive algorithms revealed better convergence than the traditional genetic algorithm. Moreover, SAS, SAC, and SAMC produced lower values of the fitness function than SAM. SAS was the best algorithm for all functions and in each case. The advantage of the SAM and SAMC algorithms over GA became particularly apparent on De Jong's, Ackley's Path, and the Goldstein-Price functions. Statistical nonparametric Wilcoxon signed-rank tests allowed for the analysis of the behaviour of the algorithms on the individual benchmark functions.


Further research is planned to extend the self-adaptive parameters to include the tournament size of the selection and the number of crossover points. More criteria of algorithm assessment will be taken into account. The possible application of the self-adaptive techniques to create genetic fuzzy models assisting with real estate appraisal will also be considered.

Acknowledgments. This paper was partially supported by the Polish National Science Centre under grant no. N N516 483840.

References

1. Angeline, P.J.: Adaptive and self-adaptive evolutionary computations. In: Palaniswami, M., Attikiouzel, Y. (eds.) Computational Intelligence: A Dynamic Systems Perspective, pp. 152–163. IEEE Press, New York (1995)
2. Bäck, T.: Self-adaptation in genetic algorithms. In: Varela, F.J., Bourgine, P. (eds.) Proc. First European Conference on Artificial Life, Toward a Practice of Autonomous Systems, pp. 263–271. MIT Press, Cambridge (1992)
3. Bäck, T., Schwefel, H.-P.: An Overview of Evolutionary Algorithms for Parameter Optimization. Evolutionary Computation 1(1), 1–23 (1993)
4. Cervantes, J., Stephens, C.S.: Limitations of Existing Mutation Rate Heuristics and How a Rank GA Overcomes Them. IEEE Trans. Evolutionary Computation 13(2) (2009)
5. Deb, K., Beyer, H.-G.: Self-adaptive genetic algorithms with simulated binary crossover. Evolutionary Computation 9(2), 197–221 (2001)
6. Digalakis, J.G., Margaritis, K.G.: An Experimental Study of Benchmarking Functions for Genetic Algorithms. Int. J. Computer Math. 79(4), 403–416 (2002)
7. Eiben, A.E., Hinterding, R., Michalewicz, Z.: Parameter control in evolutionary algorithms. IEEE Transactions on Evolutionary Computation 3(2), 124–141 (1999)
8. Herrera, F., Lozano, M.: Fuzzy adaptive genetic algorithms: design, taxonomy, and future directions. Soft Computing 7(8), 545–562 (2003)
9. Hinterding, R., Michalewicz, Z., Eiben, A.E.: Adaptation in Evolutionary Computation: A Survey. In: Proceedings of the Fourth International Conference on Evolutionary Computation (ICEC 1997), pp. 65–69. IEEE Press, New York (1997)
10. Król, D., Lasota, T., Trawiński, B., Trawiński, K.: Investigation of evolutionary optimization methods of TSK fuzzy model for real estate appraisal. International Journal of Hybrid Intelligent Systems 5(3), 111–128 (2008)
11. Maruo, M.H., Lopes, H.S., Delgado, M.R.: Self-Adapting Evolutionary Parameters: Encoding Aspects for Combinatorial Optimization Problems. In: Raidl, G.R., Gottlieb, J. (eds.) EvoCOP 2005. LNCS, vol. 3448, pp. 154–165. Springer, Heidelberg (2005)
12. Meyer-Nieberg, S., Beyer, H.-G.: Self-Adaptation in Evolutionary Algorithms. In: Lobo, F.G., Lima, C.F., Michalewicz, Z. (eds.) SCI, vol. 54, pp. 47–75. Springer, Heidelberg (2007)
13. Smith, J.E., Fogarty, T.C.: Operator and parameter adaptation in genetic algorithms. Soft Computing 1(2), 81–87 (1997)
14. Tang, K., Li, X., Suganthan, P.N., Yang, Z., Weise, T.: Benchmark Functions for the CEC 2010 Special Session and Competition on Large Scale Global Optimization. Technical Report, Nature Inspired Computation and Applications Laboratory, USTC, China (2009), http://nical.ustc.edu.cn/cec10ss.php
15. Yao, X., Liu, Y.: Fast evolution strategies. Contr. Cybern. 26(3), 467–496 (1997)