Performance Assessment of Generalized Differential Evolution 3 (GDE3) with a Given Set of Problems

Saku Kukkonen, Student Member, IEEE, and Jouni Lampinen
Abstract— This paper presents results for the CEC 2007 Special Session on Performance Assessment of Multi-Objective Optimization Algorithms, where Generalized Differential Evolution 3 (GDE3) has been used to solve a given set of test problems. The set consists of 19 problems having two, three, or five objectives. The problems have different properties in the sense of separability, modality, and geometry of the Pareto-front. According to the results, a near-optimal set of solutions was found for the majority of the problems. The rotated problems caused more difficulty than the other problems. The performance metrics indicate that the obtained approximation sets were even better than the provided reference sets for many problems.
I. INTRODUCTION

In this paper, a general-purpose Evolutionary Algorithm (EA) called Generalized Differential Evolution 3 (GDE3) [1], [2] with a diversity maintenance technique suited for many-objective problems [3] has been used to solve multi-objective problems defined for the CEC 2007 Special Session on Performance Assessment of Multi-Objective Optimization Algorithms. The problems have been defined in [4], where also the evaluation criteria are given. The problems have two, three, or five objectives, and the number of decision variables varies from 3 to 30. The difficulty of the functions varies in terms of separability, modality, and geometry of the Pareto-front. Some of the problems also pose the difficulty of losing diversity among the decision variables.

The rest of this paper is organized as follows: multi-objective optimization with constraints is briefly defined in Section II. Section III describes the multi-objective optimization method used to solve the given set of problems. Section IV describes the experiments, and finally conclusions are given in Section V.

II. MULTI-OBJECTIVE OPTIMIZATION WITH CONSTRAINTS

A multi-objective optimization problem (MOOP) with constraints can be presented in the form [5, p. 37]:

    minimize   {f1(x), f2(x), ..., fM(x)}
    subject to (g1(x), g2(x), ..., gK(x))^T ≤ 0.
Thus, there are M functions to be optimized and K constraint functions. The objective of Pareto-optimization is to find an approximation of the Pareto-front, i.e., to find a set of solutions that are not dominated by any other solution. The weak dominance relation ⪯ between two vectors is defined in such a way that x1 weakly dominates x2, i.e., x1 ⪯ x2 iff ∀i: fi(x1) ≤ fi(x2). The dominance relation ≺ between two vectors is defined in such a way that x1 dominates x2, i.e., x1 ≺ x2 iff x1 ⪯ x2 ∧ ∃i: fi(x1) < fi(x2).

The dominance relation can be extended to take constraint values and objective values into consideration at the same time. Constraint-domination ≺c is defined in this paper so that x1 constraint-dominates x2, i.e., x1 ≺c x2, iff any of the following conditions is true [6]:
• x1 and x2 are infeasible and x1 dominates x2 in constraint function violation space.
• x1 is feasible and x2 is not.
• x1 and x2 are feasible and x1 dominates x2 in objective function space.
The definition of weak constraint-domination ⪯c is analogous, with the dominance relation changed to weak dominance in the definition above. This constraint-domination is a special case of the more general concept of goals and priorities presented in [7].

III. OPTIMIZATION METHOD

A. Differential Evolution

The Differential Evolution (DE) algorithm [8], [9] was introduced by Storn and Price in 1995. The design principles of DE are simplicity, efficiency, and the use of floating-point encoding instead of binary numbers. As a typical EA, DE has a random initial population that is then improved using selection, mutation, and crossover operations. Several ways exist to determine a stopping criterion for EAs, but usually a predefined upper limit Gmax for the number of generations to be computed provides an appropriate stopping condition. Other control parameters of DE are the crossover control parameter CR, the mutation factor F, and the population size NP.

In each generation G, DE goes through each D-dimensional decision vector x_{i,G} of the population and creates the corresponding trial vector u_{i,G} as follows [10]:

    r1, r2, r3 ∈ {1, 2, ..., NP}, randomly selected, except mutually
        different and different from i
    jrand = floor(rand_i[0, 1) · D) + 1
    for (j = 1; j ≤ D; j = j + 1) {
        if (rand_j[0, 1) < CR ∨ j = jrand)
            u_{j,i,G} = x_{j,r3,G} + F · (x_{j,r1,G} − x_{j,r2,G})
        else
            u_{j,i,G} = x_{j,i,G}
    }

The authors are with the Department of Information Technology, Lappeenranta University of Technology, P.O. Box 20, FIN-53851 Lappeenranta, Finland; email: [email protected].

1-4244-1340-0/07/$25.00 © 2007 IEEE
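The trial-vector generation above can be sketched in Python (an illustrative sketch, not the authors' ANSI-C implementation; the default values F = 0.5 and CR = 0.1 are the settings used later in this paper):

```python
import random

def de_rand_1_bin(pop, i, F=0.5, CR=0.1):
    """Create the trial vector u_{i,G} for population member i (DE/rand/1/bin)."""
    NP, D = len(pop), len(pop[0])
    # r1, r2, r3: mutually different indices, all different from i
    r1, r2, r3 = random.sample([r for r in range(NP) if r != i], 3)
    jrand = random.randrange(D)  # this index is always taken from the mutant
    trial = list(pop[i])
    for j in range(D):
        if random.random() < CR or j == jrand:
            trial[j] = pop[r3][j] + F * (pop[r1][j] - pop[r2][j])
    return trial
```

With F = 0 and CR = 1 the trial vector degenerates to a copy of some x_{r3}, and with CR = 0 only the jrand element is mutated, which mirrors the role of the "j = jrand" condition in the pseudocode.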
This is the most common DE version, DE/rand/1/bin. Both CR and F remain fixed during the entire execution of the algorithm. The parameter CR ∈ [0, 1], which controls the crossover operation, represents the probability that an element of the trial vector is chosen from a linear combination of three randomly chosen vectors and not from the old vector x_{i,G}. The condition "j = jrand" ensures that at least one element of the trial vector differs from the elements of the old vector. The parameter F is a scaling factor for mutation, and its value range is (0, 1+]¹. In practice, CR controls the rotational invariance of the search: a small value (e.g., 0.1) is practicable with separable problems, while larger values (e.g., 0.9) are for non-separable problems. The parameter F controls the speed and robustness of the search, i.e., a lower value for F increases the convergence rate but also increases the risk of getting stuck in a local optimum. The parameters CR and NP have a similar effect on the convergence rate as F has.

After the mutation and crossover operations, the trial vector u_{i,G} is compared to the old vector x_{i,G}. If the trial vector has an equal or better objective value, it replaces the old vector in the next generation. In the case of minimization of an objective, this can be presented as follows [10]:

    x_{i,G+1} = u_{i,G}   if f(u_{i,G}) ≤ f(x_{i,G})
    x_{i,G+1} = x_{i,G}   otherwise.

DE is an elitist method since the best population member is always preserved and the average objective value of the population never deteriorates.

B. Generalized Differential Evolution

The first version of Generalized Differential Evolution (GDE) extended DE to constrained multi-objective optimization, modifying only the selection rule of basic DE [6]. The basic idea in the selection rule of GDE is that the trial vector replaces the old vector in the next generation if it weakly constraint-dominates the old vector.
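The constraint-domination relation of Section II, on which GDE's selection rule is based, can be written as a predicate. This is a sketch; treating max(gk, 0) as the constraint violation is an assumption consistent with the formulation g(x) ≤ 0, not a detail stated in this paper:

```python
def dominates(a, b):
    """a dominates b: no worse in every component, strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def constraint_dominates(fa, ga, fb, gb):
    """Constraint-domination of Section II.

    fa, fb: objective vectors; ga, gb: constraint values (feasible iff g <= 0).
    The violation of constraint k is taken as max(g_k, 0) (assumption).
    """
    va = [max(g, 0.0) for g in ga]
    vb = [max(g, 0.0) for g in gb]
    a_feasible, b_feasible = sum(va) == 0.0, sum(vb) == 0.0
    if not a_feasible and not b_feasible:
        return dominates(va, vb)   # both infeasible: compare violations
    if a_feasible and not b_feasible:
        return True                # feasible beats infeasible
    if a_feasible and b_feasible:
        return dominates(fa, fb)   # both feasible: compare objectives
    return False
```

GDE's selection then amounts to replacing the old vector whenever the trial vector weakly constraint-dominates it.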
There was no explicit sorting of non-dominated solutions [11, pp. 33–44] during the optimization process, nor any mechanism for maintaining the distribution and extent of solutions. Also, there was no extra repository for non-dominated solutions. The second version, GDE2, made the selection based on crowdedness when the trial and old vectors were feasible and non-dominating each other in the objective function space [12]. This improved the extent and distribution of the obtained set of solutions but slowed down the convergence of the overall population, because it favored isolated solutions far from the Pareto-front until all the solutions had converged near the Pareto-front.

The third and latest version is GDE3 [1], [2]. Besides the selection, another part of basic DE has also been modified: in the case of feasible and non-dominating solutions, both vectors are saved for the population of the next generation. Before continuing to the next generation, the size of the population is reduced using non-dominated sorting and pruning based on diversity preservation. The pruning technique used in the original GDE3 is based on crowding distance, which provides a good crowding estimation in the case of two objectives. However, crowding distance fails to approximate the crowdedness of solutions when the number of objectives is more than two [2]. Since the problem set provided in [4] contains problems with more than two objectives, the more general diversity maintenance technique proposed in [3] was used. The technique is based on a crowding estimation using the nearest neighbors of solutions in the Euclidean sense, and on an efficient nearest-neighbors search technique.

All the GDE versions handle any number M of objectives and any number K of constraints, including the cases M = 0 (constraint satisfaction problem) and K = 0 (unconstrained problem). When M = 1 and K = 0, the versions are identical to the original DE, which is why they are referred to as Generalized DEs.

¹ The notation means that F is larger than 0 and the upper limit is in practice around 1, although there is no hard upper limit.

IV. EXPERIMENTS

A. Configuration

GDE3 and the given problems were implemented in the ANSI-C programming language and compiled with the GCC compiler. The hardware was an ordinary PC with a 1.2 GHz CPU and 1 GB RAM, and the operating system was Linux. In the case of boundary constraint violations, violating variable values were reflected back from the violated boundary using the following rule before the selection operation of GDE3:

    u_{j,i,G} = 2·x_j^(lo) − u_{j,i,G}   if u_{j,i,G} < x_j^(lo)
    u_{j,i,G} = 2·x_j^(up) − u_{j,i,G}   if u_{j,i,G} > x_j^(up)
    u_{j,i,G} = u_{j,i,G}                otherwise,

where x_j^(lo) and x_j^(up) are the lower and upper limits, respectively, for a decision variable x_j.

B. Parameter Setting

Along with the stopping criterion and the size of the population NP, GDE3 has two control parameters, CR and F, described in Section III-A, where their effect and ranges are also given. As a rule of thumb in single-objective optimization, if nothing is known about the problem at hand, suitable initial control parameter values are CR = 0.9, F = 0.9, and NP = 5·D ... 30·D, where D is the number of decision variables of the problem [13]. For an easy problem (e.g., moderately multimodal and low-dimensional), a small value of NP is sufficient, but with difficult problems a large value of NP is recommended in order to avoid stagnation in a local optimum. In general, an increase of the control parameter values will also increase the number of function evaluations (FES) needed. The dependency between NP and the FES needed is linear, while the FES needed increases more rapidly along with CR and F [14]. If the values of F and/or NP are too low, the search is prone to stagnate in a local optimum (with very low control parameter values the search converges without selection pressure).

In the case of multi-objective optimization and conflicting objectives, lower control parameter values (e.g., 0.2) for CR and F can be used than in single-objective optimization, because conflicting objectives already maintain diversity and restrain the search speed. This has been noted in [15], [16], where the effect of the control parameters has been studied empirically. Also, if the complexity of the objectives differs (e.g., as in the case of the ZDT problems [11, pp. 356–360]), then a high value of CR might lead to premature convergence with respect to one objective compared to another. The value of NP can be selected in the same way as in single-objective optimization, or it can be selected according to a desired approximation set size of the Pareto-optimal front.

To keep the setup as simple as possible, the same set of control parameter values was used for all the problems. It would also have been possible to apply some kind of dependency on the number of objectives and/or the number of decision variables, as well as to use some control parameter adaptation strategy as in [17]–[19]. However, these were not applied, because then it would have been unclear how parameter adjustment vs. the optimization algorithm itself contributes to the results.

The control parameter values used were CR = 0.1, F = 0.5, NP = 100, and Gmax = 4999. The first two values were obtained by a couple of preliminary tests with the problems. The value of CR was relatively low but suitable, because most of the problems are separable and because the complexity of the objectives differs for the problems modified from the ZDT problems [4]. The value of F is a compromise between speed and robustness; F = 0.5 has been noted to be especially suitable for the ZDT4 problem because of its equally spaced local optima [15], [16].
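The boundary reflection rule of Section IV-A can be sketched as follows (illustrative Python, not the authors' ANSI-C implementation):

```python
def reflect(u, lo, hi):
    """Reflect boundary-violating components back from the violated limit.

    u: trial vector; lo, hi: per-variable lower and upper limits.
    """
    out = []
    for uj, lj, hj in zip(u, lo, hi):
        if uj < lj:
            uj = 2.0 * lj - uj   # reflect back from the lower limit
        elif uj > hj:
            uj = 2.0 * hj - uj   # reflect back from the upper limit
        out.append(uj)
    return out
```

Note that a single reflection assumes the violation is smaller than the box width; a component that overshoots by more than the box width would need the rule applied repeatedly.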
The size of the population was set according to the smallest desired approximation set size given in [4]. With the chosen NP and Gmax values, the number of function evaluations is exactly 500 000, which was the upper limit given in [4].

In [4], approximation sets of different sizes were demanded for different problems. In GDE3, the size of the approximation set is usually the same as NP. Here, NP was kept fixed for all the problems, and solutions for the approximation set were collected during the generations: the populations of the last 200 generations (the last 49 generations in the case of 5 000 function evaluations) were merged together, and the non-dominated solutions were selected from this merged set. If the size of the non-dominated set was larger than the desired approximation set size, the set was reduced to the desired size using the pruning technique described in [3].

C. Results of Experiments

The problems given in [4] were solved 25 times, and the achieved results are presented in Tables I–VII and Figs. 1–3. Tables II–VII show the best, worst, median, mean, and standard deviation values of two performance indicators after different numbers of FES. The binary indicators used were the R indicator [20] and the Hypervolume indicator (I_H̄) [21]. When these are used to
compare against a reference set, a smaller indicator value expresses better performance (a value of 0 indicates performance equal to the reference set). According to the R indicator values in Tables II–IV, a better approximation set than the given reference set (i.e., a negative indicator value) was found for the problems OKA2, S_ZDT4, S_ZDT6, WFG1 (M=3), WFG8 (M=3,5), and WFG9 (M=3). According to the Hypervolume indicator values in Tables V–VII, a better approximation set than the given reference set was found for the problems OKA2, S_ZDT6, S_DTLZ2 (M=3), WFG1 (M=3), WFG8 (M=3,5), WFG9 (M=3,5), and R_DTLZ2 (M=5). The Covered Sets (CS) metric [22] measures the number of covered Pareto-subsets in the decision variable space. The SYMPART problem has been designed to have several Pareto-subsets in the decision variable space mapping into a single Pareto-set in the objective space. The CS values for the SYMPART problem are shown in Table I, and they indicate that the solutions converged into a single Pareto-subset in the decision variable space. This is not surprising, since GDE3 does not explicitly maintain diversity in the decision variable space but in the objective space, according to the common goals of multi-objective optimization [11].

TABLE I
THE RESULTS FOR COVERED SETS (CS) FOR TEST FUNCTION SYMPART
FES Best Median Worst Mean Std
5e+3 2.0000e+00 1.0000e+00 0.0000e+00 1.0000e+00 0.5000e+00
5e+4 1.0000e+00 1.0000e+00 1.0000e+00 1.0000e+00 0.0000e+00
5e+5 1.0000e+00 1.0000e+00 1.0000e+00 1.0000e+00 0.0000e+00
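The approximation-set collection of Section IV-B (merging the final populations, filtering the non-dominated solutions, and pruning to the desired size) can be sketched in Python. The pruning shown here iteratively removes the point closest to its nearest neighbour in objective space; it is a simplification of the efficient nearest-neighbour technique of [3], not that method itself:

```python
def nondominated(points):
    """Keep the points not dominated by any other point (minimization)."""
    def dom(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
    return [p for p in points if not any(dom(q, p) for q in points)]

def collect_approximation_set(merged_objectives, target_size):
    """Merged final populations -> non-dominated subset -> prune to target size."""
    front = nondominated(merged_objectives)
    while len(front) > target_size:
        # squared distance of a point to its nearest neighbour in objective space
        def nn_dist(p):
            return min(sum((a - b) ** 2 for a, b in zip(p, q))
                       for q in front if q is not p)
        # drop the most crowded point (smallest nearest-neighbour distance)
        front.remove(min(front, key=nn_dist))
    return front
```

Recomputing all nearest-neighbour distances after every removal makes this sketch O(N²) per removal, which is exactly the cost the efficient search technique of [3] avoids.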
Attainment surfaces [23] in the case of two and three objectives are shown in Figs. 1 and 2. The results in Fig. 1 indicate that the Pareto-optimal front is found reliably in all the two-objective cases except the ZDT4 modifications and S_ZDT2. According to Fig. 1, the rotated version of ZDT4 (R_ZDT4) is harder to solve than the unrotated version (S_ZDT4). The 100% attainment surface for S_ZDT4 is a single point (1, 2); thus, the result for this problem can sometimes be a single solution. Different control parameter values would give different results.

The attainment surfaces for the three-objective problems in Fig. 2 indicate good performance in most cases. The rotated DTLZ2 problem (R_DTLZ2) seems to be the most difficult one, and S_DTLZ3 also causes some difficulty. Figure 3 illustrates the pairwise interaction of the objective functions in the case of the five-objective WFG8 and WFG9 problems. The results indicate good coverage of the Pareto-optimal front, since the approximation sets appear to be close to the true Pareto-front and the area of the Pareto-front is well covered.

D. Algorithm Complexity

GDE3 is a scalable algorithm with the simple genetic operations of DE. Therefore, also problems with a large
TABLE II
THE RESULTS FOR R INDICATOR ON TEST FUNCTIONS 1–7

FES   Stat    1. OKA2      2. SYMPART   3. S_ZDT1    4. S_ZDT2    5. S_ZDT4    6. R_ZDT4    7. S_ZDT6
5e+3  Best    -9.1667e-04  1.9591e-02   3.1691e-02   8.5101e-02   5.9457e-02   9.4119e-03   1.3016e-01
      Median  -3.2720e-04  2.7242e-02   4.4954e-02   9.8426e-02   7.5723e-02   1.7792e-02   1.3353e-01
      Worst   8.7530e-03   3.8250e-02   5.2071e-02   1.0832e-01   8.4380e-02   2.5286e-02   1.3880e-01
      Mean    1.6276e-04   2.8114e-02   4.4531e-02   9.8161e-02   7.5424e-02   1.7979e-02   1.3425e-01
      Std     1.9278e-03   5.1089e-03   5.2328e-03   6.0500e-03   5.1217e-03   3.5124e-03   2.7636e-03
5e+4  Best    -1.0604e-03  1.5354e-05   6.1755e-05   1.1911e-04   5.6863e-03   3.0167e-04   1.7994e-02
      Median  -9.8426e-04  2.2455e-05   1.0604e-04   1.8268e-04   6.8634e-03   7.7423e-04   1.9935e-02
      Worst   -8.6300e-04  4.8377e-05   1.9624e-04   4.0119e-02   9.1519e-03   2.2171e-03   2.1445e-02
      Mean    -9.8540e-04  2.4138e-05   1.1611e-04   3.3835e-03   6.9527e-03   8.6134e-04   1.9906e-02
      Std     5.2305e-05   7.0431e-06   3.5321e-05   1.1056e-02   8.8275e-04   4.6544e-04   7.2638e-04
5e+5  Best    -1.0646e-03  1.1824e-06   2.9681e-08   4.2118e-07   -5.6401e-09  1.7335e-04   -1.0788e-06
      Median  -1.0593e-03  1.6037e-06   3.7213e-06   7.6896e-06   -3.6526e-09  5.7908e-04   -1.0788e-06
      Worst   -1.0401e-03  1.9349e-06   1.5023e-05   4.0053e-02   7.1672e-05   1.9619e-03   -1.0788e-06
      Mean    -1.0583e-03  1.5629e-06   4.5328e-06   3.2126e-03   1.1464e-05   6.7173e-04   -1.0788e-06
      Std     5.4274e-06   2.0334e-07   3.6209e-06   1.1088e-02   2.6818e-05   4.2043e-04   6.4837e-22
TABLE III
THE RESULTS FOR R INDICATOR ON TEST FUNCTIONS 8–13 WHEN M = 3

FES   Stat    8. S_DTLZ2   9. R_DTLZ2   10. S_DTLZ3  11. WFG1     12. WFG8     13. WFG9
5e+3  Best    1.2102e-04   3.3374e-04   3.5014e-04   6.5228e-02   -1.2425e-02  -6.4830e-03
      Median  1.9031e-04   4.3908e-04   4.8860e-04   8.0660e-02   -1.0336e-02  -3.1983e-03
      Worst   3.5465e-04   4.9278e-04   7.4403e-04   8.2120e-02   -7.6820e-03  2.5751e-04
      Mean    2.0985e-04   4.3003e-04   4.8803e-04   8.0179e-02   -9.9919e-03  -3.3098e-03
      Std     6.1425e-05   4.5257e-05   1.0472e-04   3.2198e-03   1.4138e-03   2.2460e-03
5e+4  Best    2.3955e-05   9.9901e-05   1.0047e-05   4.5607e-02   -2.7319e-02  -1.1960e-02
      Median  4.7156e-05   1.2175e-04   1.7569e-05   5.8939e-02   -2.5931e-02  -8.9520e-03
      Worst   9.4718e-05   1.7084e-04   2.4134e-05   6.2100e-02   -2.4959e-02  -5.6245e-03
      Mean    5.0877e-05   1.2273e-04   1.7951e-05   5.5117e-02   -2.5956e-02  -8.9123e-03
      Std     2.0136e-05   1.6097e-05   2.7603e-06   6.0483e-03   6.2452e-04   1.2524e-03
5e+5  Best    1.7526e-06   1.2872e-05   2.8851e-10   -2.9818e-04  -2.8890e-02  -1.5883e-02
      Median  5.1877e-06   2.0366e-05   1.7172e-07   5.2467e-04   -2.8532e-02  -8.9031e-03
      Worst   1.9259e-05   3.8971e-05   1.4113e-06   6.9913e-03   -2.7996e-02  -5.4468e-03
      Mean    6.0540e-06   2.1897e-05   3.4950e-07   1.6775e-03   -2.8504e-02  -9.2222e-03
      Std     3.9601e-06   7.1629e-06   3.5416e-07   2.2177e-03   1.6545e-04   1.9794e-03
TABLE IV
THE RESULTS FOR R INDICATOR ON TEST FUNCTIONS 8–13 WHEN M = 5

FES   Stat    8. S_DTLZ2   9. R_DTLZ2   10. S_DTLZ3  11. WFG1     12. WFG8     13. WFG9
5e+3  Best    1.1874e-04   1.4489e-04   1.8060e-04   5.5501e-02   2.5957e-03   3.4959e-03
      Median  2.0113e-04   1.9262e-04   2.0902e-04   5.6547e-02   6.3762e-03   6.5687e-03
      Worst   3.0906e-04   2.4053e-04   3.7126e-04   5.7383e-02   8.4641e-03   1.2315e-02
      Mean    2.0537e-04   1.9155e-04   2.2656e-04   5.6468e-02   5.8783e-03   7.0251e-03
      Std     5.4517e-05   2.3081e-05   4.6965e-05   4.5748e-04   1.6508e-03   2.4993e-03
5e+4  Best    2.5119e-05   3.8708e-05   8.2593e-06   4.3338e-02   -8.7779e-03  1.2384e-03
      Median  3.2789e-05   5.4383e-05   1.3803e-05   4.4686e-02   -7.5197e-03  2.5167e-03
      Worst   5.6378e-05   7.1649e-05   2.2322e-05   4.8814e-02   -5.6231e-03  3.9287e-03
      Mean    3.5491e-05   5.4718e-05   1.3649e-05   4.5536e-02   -7.5371e-03  2.4667e-03
      Std     7.2419e-06   8.8530e-06   3.3250e-06   1.7500e-03   8.5325e-04   7.4804e-04
5e+5  Best    1.1620e-05   2.1338e-05   8.6601e-09   2.5696e-03   -1.1963e-02  5.4102e-04
      Median  2.2048e-05   3.2824e-05   1.7526e-07   4.0640e-03   -1.1466e-02  2.2168e-03
      Worst   2.9654e-05   4.7707e-05   1.1207e-06   8.3991e-03   -1.0254e-02  3.2987e-03
      Mean    2.1511e-05   3.3115e-05   2.5426e-07   4.6089e-03   -1.1345e-02  1.9398e-03
      Std     4.5056e-06   5.1085e-06   2.7395e-07   1.5691e-03   3.9922e-04   7.0266e-04
TABLE V
THE RESULTS FOR HYPERVOLUME INDICATOR I_H̄ ON TEST FUNCTIONS 1–7

FES   Stat    1. OKA2      2. SYMPART   3. S_ZDT1    4. S_ZDT2    5. S_ZDT4    6. R_ZDT4    7. S_ZDT6
5e+3  Best    -8.1822e-04  5.6037e-02   1.4412e-01   2.0390e-01   1.7806e-01   2.9857e-02   3.3013e-01
      Median  -1.2067e-04  7.7552e-02   1.5541e-01   2.3654e-01   2.2822e-01   5.5198e-02   3.3673e-01
      Worst   1.1253e-02   1.0814e-01   1.7080e-01   2.6279e-01   2.5523e-01   7.8043e-02   3.5134e-01
      Mean    5.7619e-04   7.9939e-02   1.5566e-01   2.3674e-01   2.2904e-01   5.5075e-02   3.3957e-01
      Std     2.3879e-03   1.4267e-02   6.4482e-03   1.5411e-02   1.5735e-02   1.0553e-02   6.3348e-03
5e+4  Best    -1.2271e-03  4.5698e-05   4.5187e-04   5.3297e-04   1.7081e-02   1.3164e-03   4.1598e-02
      Median  -1.1387e-03  6.7513e-05   5.4824e-04   6.3449e-04   2.0508e-02   2.9080e-03   4.4862e-02
      Worst   -9.9987e-04  1.4421e-04   7.9815e-04   4.8012e-02   2.7175e-02   6.8484e-03   4.8148e-02
      Mean    -1.1430e-03  7.2177e-05   5.7459e-04   4.4286e-03   2.0748e-02   3.0337e-03   4.4900e-02
      Std     5.7123e-05   2.1111e-05   9.1660e-05   1.3115e-02   2.5913e-03   1.3294e-03   1.5190e-03
5e+5  Best    -1.2359e-03  3.6903e-06   1.5672e-04   1.9244e-04   8.5011e-07   7.0434e-04   -2.3166e-04
      Median  -1.2280e-03  4.6079e-06   1.7207e-04   2.1011e-04   8.8127e-07   2.1165e-03   -2.3057e-04
      Worst   -1.2108e-03  5.9303e-06   2.0140e-04   4.7812e-02   1.6596e-04   6.0925e-03   -2.2998e-04
      Mean    -1.2271e-03  4.6477e-06   1.7484e-04   4.0254e-03   2.7290e-05   2.2690e-03   -2.3067e-04
      Std     5.6516e-06   6.1853e-07   1.0255e-05   1.3178e-02   6.1760e-05   1.2492e-03   4.6401e-07
TABLE VI
THE RESULTS FOR HYPERVOLUME INDICATOR I_H̄ ON TEST FUNCTIONS 8–13 WHEN M = 3

FES   Stat    8. S_DTLZ2   9. R_DTLZ2   10. S_DTLZ3  11. WFG1     12. WFG8     13. WFG9
5e+3  Best    1.1903e-03   7.0009e-03   5.8468e-03   3.3514e-01   -8.4283e-02  -3.9985e-02
      Median  1.7331e-03   1.0490e-02   7.3463e-03   4.0795e-01   -7.1249e-02  -1.9643e-02
      Worst   2.7208e-03   1.6287e-02   1.0130e-02   4.1498e-01   -5.7687e-02  4.6558e-03
      Mean    1.8318e-03   1.0864e-02   7.4275e-03   4.0540e-01   -7.1250e-02  -1.9533e-02
      Std     3.9030e-04   2.4615e-03   1.0603e-03   1.5231e-02   7.1969e-03   1.4889e-02
5e+4  Best    1.9213e-05   3.6020e-04   1.0362e-06   2.3857e-01   -1.6707e-01  -7.0592e-02
      Median  1.0317e-04   5.4497e-04   2.4363e-06   3.0322e-01   -1.6076e-01  -5.6592e-02
      Worst   3.0643e-04   8.6963e-04   7.7858e-06   3.1826e-01   -1.5562e-01  -3.8630e-02
      Mean    1.1763e-04   5.5892e-04   2.9691e-06   2.8495e-01   -1.6072e-01  -5.6463e-02
      Std     8.1453e-05   1.4571e-04   1.6691e-06   2.9185e-02   3.0068e-03   6.0416e-03
5e+5  Best    -1.1926e-05  1.8820e-06   2.9421e-13   -7.7648e-03  -1.7596e-01  -9.1893e-02
      Median  -7.2509e-06  7.1121e-06   7.7444e-11   -7.3850e-04  -1.7490e-01  -5.7558e-02
      Worst   5.9468e-06   2.7589e-05   2.0865e-08   3.4607e-02   -1.7279e-01  -3.8588e-02
      Mean    -6.8285e-06  8.5274e-06   2.2984e-09   5.5177e-03   -1.7461e-01  -5.9748e-02
      Std     4.1187e-06   6.5709e-06   5.0220e-09   1.2774e-02   9.6867e-04   9.8129e-03
TABLE VII
THE RESULTS FOR HYPERVOLUME INDICATOR I_H̄ ON TEST FUNCTIONS 8–13 WHEN M = 5

FES   Stat    8. S_DTLZ2   9. R_DTLZ2   10. S_DTLZ3  11. WFG1     12. WFG8     13. WFG9
5e+3  Best    8.2136e-04   2.1075e-03   1.8676e-03   6.2449e-01   -7.0962e-02  7.1520e-03
      Median  1.6175e-03   3.0403e-03   2.4715e-03   6.3553e-01   -4.2623e-02  5.9194e-02
      Worst   3.4713e-03   4.5852e-03   4.5449e-03   6.4442e-01   -1.6277e-02  1.2018e-01
      Mean    1.7376e-03   3.1397e-03   2.6169e-03   6.3496e-01   -4.2629e-02  5.6182e-02
      Std     6.6919e-04   8.1164e-04   6.3012e-04   4.8418e-03   1.6789e-02   2.7955e-02
5e+4  Best    1.4208e-05   -8.1147e-05  1.9484e-06   4.9639e-01   -2.8735e-01  -1.1565e-01
      Median  3.0415e-05   1.0186e-05   7.4486e-06   5.1423e-01   -2.6913e-01  -1.0207e-01
      Worst   1.0524e-04   8.0324e-05   1.7884e-05   5.5708e-01   -2.4381e-01  -7.7629e-02
      Mean    3.7278e-05   -1.7555e-06  7.6879e-06   5.2167e-01   -2.6760e-01  -1.0011e-01
      Std     2.1466e-05   4.0644e-05   3.8216e-06   1.9071e-02   1.2959e-02   9.4746e-03
5e+5  Best    2.3565e-06   -1.5004e-04  4.8850e-15   2.6997e-02   -3.5647e-01  -1.2640e-01
      Median  9.5935e-06   -1.3870e-04  1.9213e-11   4.5399e-02   -3.5044e-01  -1.1320e-01
      Worst   2.0628e-05   -9.8051e-05  3.8028e-09   1.0187e-01   -3.1668e-01  -1.0204e-01
      Mean    1.0020e-05   -1.3436e-04  4.9525e-10   5.2961e-02   -3.4706e-01  -1.1315e-01
      Std     4.8347e-06   1.3372e-05   9.0507e-10   2.0532e-02   9.3962e-03   6.2539e-03
Fig. 1. Pareto-optimal front and 0%, 50%, 100% attainment surfaces after 5e+5 FES on test functions 1–7. [Figure: plots of f2 vs. f1 for OKA2, SYMPART, S_ZDT1, S_ZDT2, S_ZDT4, R_ZDT4, and S_ZDT6; each panel shows the 0%, 50%, and 100% attainment surfaces together with the Pareto-optimal front.]
Fig. 2. 50% attainment surfaces after 5e+5 FES on test functions 8–13 (M = 3). [Figure: three-dimensional plots of f1, f2, f3 for S_DTLZ2, R_DTLZ2, S_DTLZ3, WFG1, WFG8, and WFG9.]
Fig. 3. Approximation sets for the five-objective problems. [Figure: matrix of pairwise plots of objectives f1–f5; upper diagonal plots are for WFG8 (M = 5) and lower diagonal plots are for WFG9 (M = 5).]
number of decision variables and/or a large population size, as in [24], are solvable in a reasonable time. The most complex operation in GDE3 is non-dominated sorting, whose time complexity is O(N log^(M−1) N) [25]². The time complexity of GDE3 was measured according to the instructions given in [4], and the observed CPU times are reported in Table VIII.

TABLE VIII
COMPUTATIONAL COMPLEXITY: T1 IS THE MEAN TIME FOR EVALUATING ALL THE PROBLEMS 10000 TIMES, AND T2 IS THE MEAN TIME FOR SOLVING ALL THE PROBLEMS USING GDE3 AND 10000 FUNCTION EVALUATIONS

T1         T2         (T2 − T1)/T1
1.4563 s   1.7284 s   0.1868
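A naive non-dominated sorting of the kind footnote 2 refers to can be sketched in Python; this is an illustrative sketch that repeatedly peels off the current non-dominated front, not the authors' ANSI-C code and not the faster approach of [25]:

```python
def naive_nondominated_sort(points):
    """Sort objective vectors into successive non-dominated fronts (minimization)."""
    def dom(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
    remaining = list(points)
    fronts = []
    while remaining:
        # current front: points not dominated by any remaining point
        front = [p for p in remaining if not any(dom(q, p) for q in remaining)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts
```

Each peeling pass compares all remaining pairs, which is the source of the extra computation time mentioned in footnote 2 compared to the O(N log^(M−1) N) algorithm of [25].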
V. CONCLUSIONS

Results of Generalized Differential Evolution 3 (GDE3) for the CEC 2007 Special Session on Performance Assessment of Multi-Objective Optimization Algorithms have been reported. The given problems were solved with the same fixed control parameter settings, i.e., there was no parameter adaptation based on problem characteristics or other criteria. According to the attainment surfaces obtained, GDE3 performed well, especially on the problems without artificial rotation; the worse performance on the rotated problems was probably due to the selected control parameter values. In several cases, the performance metrics indicate that the obtained approximation sets were even better than the given reference sets.

REFERENCES

[1] S. Kukkonen and J. Lampinen, "GDE3: The third evolution step of Generalized Differential Evolution," in Proceedings of the 2005 Congress on Evolutionary Computation (CEC 2005), Edinburgh, Scotland, Sept 2005, pp. 443–450. [Online]. Available: http://www.iitk.ac.in/kangal/papers/k2005013.pdf, 6.11.2006
[2] S. Kukkonen and K. Deb, "Improved pruning of non-dominated solutions based on crowding distance for bi-objective optimization problems," in Proceedings of the 2006 Congress on Evolutionary Computation (CEC 2006), Vancouver, BC, Canada, July 2006, pp. 3995–4002.
[3] ——, "A fast and effective method for pruning of non-dominated solutions in many-objective problems," in Proceedings of the 9th International Conference on Parallel Problem Solving from Nature (PPSN IX), Reykjavik, Iceland, Sept 2006, pp. 553–562.
[4] V. L. Huang, A. K. Qin, K. Deb, E. Zitzler, P. N. Suganthan, J. J. Liang, M. Preuss, and S. Huband, "Problem definitions for performance assessment on multi-objective optimization algorithms," School of EEE, Nanyang Technological University, Singapore, 639798, Technical Report, January 2007. [Online]. Available: http://www.ntu.edu.sg/home/EPNSugan/index files/CEC07/CEC-07-TR-13-Feb.pdf, 30.3.2007
[5] K. Miettinen, Nonlinear Multiobjective Optimization. Boston: Kluwer Academic Publishers, 1998.
[6] J. Lampinen, "DE's selection rule for multiobjective optimization," Lappeenranta University of Technology, Department of Information Technology, Tech. Rep., 2001. [Online]. Available: http://www.it.lut.fi/kurssit/03-04/010778000/MODE.pdf, 15.2.2006
[7] C. M. Fonseca and P. J. Fleming, "Multiobjective optimization and multiple constraint handling with evolutionary algorithms–part I: A unified formulation," IEEE Transactions on Systems, Man, and Cybernetics–Part A: Systems and Humans, vol. 28, no. 1, pp. 26–37, Jan 1998.
[8] R. Storn and K. V. Price, "Differential Evolution – a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, Dec 1997.
[9] K. V. Price, R. Storn, and J. Lampinen, Differential Evolution: A Practical Approach to Global Optimization. Berlin: Springer-Verlag, 2005.
[10] K. V. Price, New Ideas in Optimization. London: McGraw-Hill, 1999, ch. An Introduction to Differential Evolution, pp. 79–108.
[11] K. Deb, Multi-Objective Optimization using Evolutionary Algorithms. Chichester, England: John Wiley & Sons, 2001.
[12] S. Kukkonen and J. Lampinen, "An extension of Generalized Differential Evolution for multi-objective optimization with constraints," in Proceedings of the 8th International Conference on Parallel Problem Solving from Nature (PPSN VIII), Birmingham, England, Sept 2004, pp. 752–761.
[13] J. Lampinen and R. Storn, New Optimization Techniques in Engineering. Springer, 2004, ch. Differential Evolution, pp. 123–166.
[14] S. Kukkonen and J. Lampinen, "Constrained real-parameter optimization with Generalized Differential Evolution," in Proceedings of the 2006 Congress on Evolutionary Computation (CEC 2006), Vancouver, BC, Canada, July 2006, pp. 911–918.
[15] ——, "An empirical study of control parameters for Generalized Differential Evolution," in Proceedings of the Sixth Conference on Evolutionary and Deterministic Methods for Design, Optimization and Control with Applications to Industrial and Societal Problems (EUROGEN 2005), Munich, Germany, Sept 2005, 12 pages.
[16] ——, "An empirical study of control parameters for the third version of Generalized Differential Evolution (GDE3)," in Proceedings of the 2006 Congress on Evolutionary Computation (CEC 2006), Vancouver, BC, Canada, July 2006, pp. 7355–7362.
[17] H. A. Abbass, "The self-adaptive Pareto Differential Evolution algorithm," in Proceedings of the 2002 Congress on Evolutionary Computation (CEC 2002), Honolulu, Hawaii, May 2002, pp. 831–836.
[18] D. Zaharie, "Control of population diversity and adaptation in Differential Evolution algorithms," in Proceedings of Mendel 2003, 9th International Conference on Soft Computing, Brno, Czech Republic, June 2003, pp. 41–46.
[19] V. L. Huang, A. K. Qin, and P. N. Suganthan, "Self-adaptive Differential Evolution algorithm for constrained real-parameter optimization," in Proceedings of the 2006 Congress on Evolutionary Computation (CEC 2006), Vancouver, BC, Canada, July 2006, pp. 324–331.
[20] M. P. Hansen and A. Jaszkiewicz, "Evaluating the quality of approximations of the non-dominated set," Institute of Mathematical Modeling, Technical University of Denmark, IMM Technical Report IMM-REP-1998-7, 1998.
[21] E. Zitzler, "Evolutionary algorithms for multiobjective optimization: Methods and applications," Ph.D. dissertation, Swiss Federal Institute of Technology (ETH) Zurich, TIK-Schriftenreihe Nr. 30, Diss ETH No. 13398, Shaker Verlag, Germany, Dec 1999.
[22] G. Rudolph, B. Naujoks, and M. Preuss, "Capabilities of EMOA to detect and preserve equivalent Pareto subsets," in Proceedings of the 4th International Conference on Evolutionary Multi-Criterion Optimization (EMO 2007), Matsushima, Japan, March 2007, pp. 36–50.
[23] V. Grunert da Fonseca, C. M. Fonseca, and A. O. Hall, "Inferential performance assessment of stochastic optimizers and the attainment function," in Proceedings of the First International Conference on Evolutionary Multi-Criterion Optimization (EMO 2001), Zurich, Switzerland, March 2001, pp. 213–225.
[24] S. Kukkonen, S. R. Jangam, and N. Chakraborti, "Solving the molecular sequence alignment problem with Generalized Differential Evolution 3 (GDE3)," in Proceedings of the 2007 IEEE Symposium on Computational Intelligence in Multi-Criteria Decision-Making (MCDM 2007), Honolulu, HI, USA, April 2007, pp. 302–309.
[25] M. T. Jensen, "Reducing the run-time complexity of multiobjective EAs: The NSGA-II and other algorithms," IEEE Transactions on Evolutionary Computation, vol. 7, no. 5, pp. 503–515, Oct 2003.
² The actual non-dominated sorting implementation in GDE3 is naive, which increased the computation time.