The Hybrid Framework for Multi-objective Evolutionary Optimization Based on Harmony Search Algorithm

Iyad Abu Doush (0000-0001-7200-0032)^1*, Mohammad Qasem Bataineh^2, and Mohammed El-Abd^3

1 Computer Science and Information Systems, American University of Kuwait, Salmiya, Kuwait ([email protected])
2 Computer Science Department, Yarmouk University, Irbid, Jordan ([email protected])
3 Electrical and Computer Engineering, American University of Kuwait, Salmiya, Kuwait ([email protected])

Abstract. In evolutionary multi-objective optimization, an evolutionary algorithm is invoked to solve an optimization problem involving the concurrent optimization of multiple objective functions. Many techniques have been proposed in the literature to solve multi-objective optimization problems, including the NSGA-II, MOEA/D, and MOPSO algorithms. Harmony Search (HS), a relatively new heuristic algorithm, has been successfully used for solving multi-objective problems when combined with non-dominated sorting (NSHS) or with the decomposition of the multiple objectives into scalar sub-problems (MOHS/D). In this paper, the performance of NSHS and MOHS/D is enhanced using a previously proposed hybrid framework. In this framework, the diversity of the population is measured every predetermined number of iterations; based on the measured diversity, either local search or a diversity enhancement mechanism is invoked. The efficiency of the hybrid framework when adopting HS is investigated using the ZDT, DTLZ, and CEC2009 benchmarks. Experimental results confirm the improved performance of the hybrid framework when incorporating HS as the main algorithm.

Keywords: Multi-objective Optimization, Harmony Search, Multi-objective Optimization Evolutionary Algorithms

1 Introduction

In Multi-objective Optimization Problems (MOPs), the aim is to find an optimal solution to a problem having multiple objectives [4]. Almost all engineering problems are multi-objective. Moreover, these objectives are often conflicting (e.g., maximizing a performance function while minimizing a cost function and maximizing a reliability function). Hence, an optimal solution for one objective function will not provide the best solution for the other objective(s). Therefore, different solutions offer trade-offs between the objectives, and a set of solutions is maintained [10, 12].

(* Corresponding author: Dr. Iyad Abu Doush, Department of Computer Science and Information Systems, American University of Kuwait, Salmiya, Kuwait.)

Population-based meta-heuristics have proven to be very effective in handling MOPs. This class covers a wide range of algorithms, including EAs, swarm intelligence, and foraging algorithms. These algorithms update a population of solutions in each iteration and hence can maintain a set of non-dominated solutions. They are generally characterized by lower sensitivity to the shape of the Pareto front, greater robustness, and ease of parallelization [1]. Well-known algorithms applied to MOPs include NSGA-II [5], MOEA/D [18], and MOPSO [15].

The work in [17] proposed a hybrid evolutionary multi-objective optimization framework with a modular structure. Sample implementations of this framework used NSGA-II and MOEA/D (referred to from now on as Hybrid NSGA-II and Hybrid MOEA/D). In this work, the same hybrid framework is implemented using the Harmony Search (HS) algorithm, and performance is compared against Hybrid NSGA-II and Hybrid MOEA/D.

HS [11] is a meta-heuristic algorithm simulating a group of musicians searching for the right harmony. A single problem variable is a pitch of a different musical instrument, and a complete solution is a harmony vector. If a specific pitch results in an improved harmony, it is stored in the Harmony Memory (HM). Initially, HM is filled with randomly initialized harmonies. In every iteration, the newly generated harmony replaces the worst harmony in HM if it is better [7, 8]. In HS, new harmonies are generated by developing a sequence of new pitches. A pitch is produced by playing a pitch from memory, picking and then perturbing a pitch from memory, or playing a random pitch. Which operation is applied is governed by two parameters: the Harmony Memory Considering Rate (HMCR) and the Pitch Adjusting Rate (PAR). HS and its variants have been successfully applied to single-objective continuous optimization, as in [3, 9], and previous work has tackled MOPs using HS [2, 14, 16].

The rest of the paper is organized as follows: the implemented hybrid framework is introduced in Section 2, experimental results are presented in Section 3, and the paper is concluded in Section 4.
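As an illustrative sketch (not the authors' exact implementation), the improvisation step described above can be written as follows. The parameter values mirror those used later in the experiments (HMCR = 0.95, PAR = 0.4, bw = 0.01); the unit variable bounds are an assumption for the example.

```python
import random

def improvise(hm, hmcr=0.95, par=0.4, bw=0.01, bounds=(0.0, 1.0)):
    """Generate one new harmony from harmony memory `hm` (list of vectors).

    Sketch of the standard HS improvisation step: each pitch is taken
    from memory with probability HMCR (and perturbed within bandwidth
    bw with probability PAR), otherwise played at random.
    """
    lo, hi = bounds
    new = []
    for d in range(len(hm[0])):
        if random.random() < hmcr:
            # memory consideration: reuse a stored pitch
            pitch = random.choice(hm)[d]
            if random.random() < par:
                # pitch adjustment: perturb within the bandwidth
                pitch += random.uniform(-bw, bw)
        else:
            # random consideration: play a random pitch
            pitch = random.uniform(lo, hi)
        new.append(min(hi, max(lo, pitch)))
    return new
```

In the full algorithm, the improvised harmony would then replace the worst member of HM whenever it is better.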

2 Methodology

In this section, the previously proposed NSHS and MOHS/D [2] are briefly explained, followed by a brief explanation of the hybrid framework.

NSHS and MOHS/D: In [2], Abu Doush and Bataineh followed frameworks similar to [5, 13, 17]. For NSHS, HM is randomly initialized with HMS solutions. Non-dominated sorting is used to sort the population based on Pareto optimality, where the best solutions are ranked 1. HS operators are then used to create HMS additional solutions. The two populations are combined into a single population of size 2 × HMS, and the sorting procedure ranks the combined population. An elitism process selects the best HMS solutions for the next generation: if the number of rank-1 solutions is less than HMS, all of them are inserted into the new HM; individuals from the set of rank-2 solutions are selected next, and so on, until no more complete fronts can be accommodated. From the last front, which contains more solutions than needed, the remaining slots are filled using a crowding comparator. These NSHS steps are repeated until the stopping criterion is reached.

For MOHS/D, the parameters to be tuned are: N (the number of sub-problems of the MOP), the N uniformly generated weight vectors, and the neighborhood size T of each weight vector. An empty External Population (EP) is created to hold the set of non-dominated solutions. MOHS/D then finds, for each weight vector, the indexes of the closest T weight vectors, forming the B sets. An initial population HM is randomly generated. For all solutions in HM, F(x) is computed, a vector of m objective functions. In the last step of the initialization, MOHS/D assigns a target point Z in the objective space using a problem-specific method in order to guide the solutions towards the Pareto front; Z is subsequently updated in every iteration. For each weight vector, HS operators create a new harmony y by randomly selecting two indexes from the B set associated with that weight vector. MOHS/D then applies a problem-specific repair on y to produce y′. The objective values of y′ are compared to the target vector Z: if f_i(y′) is better than z_i for i = 1, . . . , m, then z_i = f_i(y′). After that, the sub-problem updates its neighborhood: if the Tchebycheff value of y′ is better than that of a neighbor, the neighbor is replaced by y′. Finally, MOHS/D removes from EP all vectors dominated by F(y′) and adds F(y′) to EP if no vector in EP dominates it. This process is repeated until the stopping criterion is satisfied.

Hybrid Framework: In this framework, a clustering process is run on the population every predetermined number of iterations. The process clusters the population into k clusters, and Q_current is then computed as follows:

Q_current = Σ_{i=1}^{k} (1/s_i) Σ_{j=1}^{s_i} D(c_i^j, σ_i)    (1)

where k is the number of clusters, σ_i is the centroid of cluster i, c_i^j is point j in cluster i, D(c_i^j, σ_i) is the Euclidean distance of point j in cluster i to its centroid, and s_i is the number of individuals in cluster i. The equation evaluates the diversity of a specific generation (in our case, every 5 generations). The calculated value of Q_current becomes the new lower bound Q_bound for the following generations. Starting from the next generation, Q_current is compared against Q_bound until the next 5th generation is reached. If Q_current falls below Q_bound, the population has bad diversity; otherwise, it has good diversity. When the diversity is good, local search (LS) is triggered; the local search used in this step is the one already implemented in the jMetal framework. When the diversity is bad, the diversity enhancement module is triggered. In this work, this hybrid framework is implemented using NSHS and MOHS/D (referred to from now on as Hybrid NSHS and Hybrid MOHS/D) and compared against Hybrid NSGA-II and Hybrid MOEA/D.
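The two pieces just described can be sketched as follows: the first function computes Q_current of Eq. (1) from a given clustering, and the second applies the periodic diversity check. This is an illustrative sketch under our own naming; the clustering itself (e.g., k-means into k clusters) and the LS and diversity-enhancement modules are assumed to exist elsewhere and are passed in as callables.

```python
import math

def diversity(clusters, centroids):
    """Q_current of Eq. (1): for each cluster, the mean Euclidean distance
    of its points to the centroid, summed over the k clusters."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return sum(sum(dist(p, c) for p in pts) / len(pts)
               for pts, c in zip(clusters, centroids))

def hybrid_step(q_current, state, generation, local_search, enhance_diversity,
                period=5):
    """Decision logic of the hybrid framework (sketch, our interface).

    Every `period` generations, the measured Q_current becomes the new
    lower bound Q_bound. In between, falling below the bound signals bad
    diversity and triggers the diversity-enhancement module; otherwise
    local search runs.
    """
    if generation % period == 0:
        state["q_bound"] = q_current   # reset the bound for the coming generations
        return "rebound"
    if q_current < state["q_bound"]:   # bad diversity
        return enhance_diversity()
    return local_search()              # good diversity
```

The callable-based interface is only for self-containment; in the actual framework, LS and diversity enhancement operate on the population in place.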

3 Experimental Procedure

To evaluate the Hybrid NSHS and Hybrid MOHS/D algorithms, experiments are conducted using the ZDT [20], DTLZ [6], and CEC09 [19] benchmarks. The performance is compared against Hybrid NSGA-II and Hybrid MOEA/D. Comparisons are based on minimizing the Inverted Generational Distance (IGD) measure [17]: the smaller the IGD value, the closer the generated solutions are to the true PF. IGD is calculated as follows:

IGD(P, P*) = ( Σ_{v ∈ P*} d(v, P) ) / |P*|

where P is the set of obtained approximate solutions in objective space, P* is the true PF in objective space, d(v, P) is the minimum Euclidean distance between v and the points in P, and |P*| is the number of points in P*.

The maximum number of function evaluations (FE) differs per test instance. As in [17], FE is set to 20,000 for the ZDT family plus DTLZ1 and DTLZ7; 5,000 for DTLZ2, DTLZ4, and DTLZ5; 15,000 for DTLZ6; and 50,000 for DTLZ3 and the UF family. Results are reported over 15 independent runs. For HS, HMS (i.e., the population size) is 100 for bi-objective problems and 200 for three objectives, HMCR = 0.95, PAR = 0.4, and bw = 0.01. Finally, Sequential Quadratic Programming (SQP) is used as the LS, with n_Plocal = 3 for Hybrid NSHS and n_Plocal = 20 for Hybrid MOHS/D.
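For reference, the IGD formula above can be computed directly. This is a minimal pure-Python sketch (Euclidean distance over objective vectors), not the implementation used in the experiments.

```python
import math

def igd(approx, true_front):
    """Inverted Generational Distance: the mean, over the true front P*,
    of the minimum Euclidean distance to the approximation set P.
    Smaller values mean a closer approximation."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    total = sum(min(dist(v, p) for p in approx) for v in true_front)
    return total / len(true_front)
```

Because the average is taken over the true front, IGD penalizes both poor convergence and poor coverage of the front.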

3.1 Results and Discussion

Hybrid NSHS and Hybrid NSGA-II: Table 1 shows the minimum, median, and maximum of the IGD-metric values for the Hybrid NSHS and Hybrid NSGA-II algorithms; bold font marks the algorithm with the best result. Figure 1 presents a random sample of the PF approximations of some test problems for both algorithms. In a previous study [17], Hybrid NSGA-II was tested on ZDT4, DTLZ1, DTLZ2, DTLZ3, DTLZ4, DTLZ5, and DTLZ6 only, and the experimental results showed that Hybrid NSGA-II outperformed NSGA-II on these test problems. From Table 1, Hybrid NSHS outperforms Hybrid NSGA-II in 14 out of 22 problems. Hybrid NSHS outperformed Hybrid NSGA-II in solving DTLZ1 and DTLZ3, which suggests that the Harmony operators have an advantage on problems with multiple local fronts. However, Hybrid NSGA-II showed lower IGD values on ZDT4, where Hybrid NSHS gets stuck in local minima.

Hybrid NSGA-II is also better at solving DTLZ2, DTLZ4, and DTLZ5. The reason for the higher median IGD values of Hybrid NSHS could be excessive diversity caused by an inappropriate pitch adjustment rate. In contrast, Hybrid NSHS produces better results on DTLZ6. Figure 1 provides visual evidence of how the two algorithms approximate the PF of sample functions. Hybrid NSHS approximates the DTLZ6 and UF5 PFs slightly better than Hybrid NSGA-II, while Hybrid NSGA-II approximates the DTLZ1 PF slightly better. In Figure 1, we can also see that Hybrid NSHS traces ZDT6 and UF5 better than Hybrid NSGA-II, since the Harmony operators are better suited to non-convex and discontinuous problems. Figure 1 shows that Hybrid NSGA-II was slightly more successful on ZDT1, although both algorithms achieve roughly the same approximation of the ZDT1 PF.

[Figure 1 omitted: per-panel plots of the Pareto fronts of ZDT1, DTLZ4, ZDT6, DTLZ5, and UF5 for Hybrid NSHS (Harmony-HybridNSGAII) versus Hybrid NSGA-II.]

Fig. 1. Pareto fronts for the lowest IGD values of the Hybrid NSHS and Hybrid NSGA-II algorithms.

Hybrid MOHS/D and Hybrid MOEA/D: Table 2 shows the minimum, median, and maximum of the IGD-metric values for Hybrid MOHS/D and Hybrid MOEA/D; bold font marks the algorithm with the best result. Figure 2 presents a random sample of the PF approximations of some test problems for both algorithms. In the same study [17], Hybrid MOEA/D was tested on ZDT4, DTLZ1, DTLZ2, DTLZ3, DTLZ4, DTLZ5, and DTLZ6 only, and the experimental results showed that Hybrid MOEA/D outperformed MOEA/D on these test problems except DTLZ3 and DTLZ6. Hybrid MOHS/D outperformed Hybrid MOEA/D in 14 out of 22 problems, including DTLZ1 and DTLZ3, again because the Harmony operators are better suited to problems with multiple local fronts. On ZDT4, Hybrid MOHS/D showed a lower median IGD value, as Hybrid MOEA/D gets stuck in local minima.

[Figure 2 omitted: per-panel plots of the Pareto fronts of UF6, DTLZ7, ZDT2, UF9, UF8, and UF4 for Hybrid MOHS/D (Harmony-HybridMOEAD) versus Hybrid MOEA/D.]

Fig. 2. Pareto fronts for the lowest IGD values of the Hybrid MOHS/D and Hybrid MOEA/D algorithms.

Figure 2 provides visual evidence of how the two algorithms approximate the PF of sample functions. Hybrid MOHS/D approximates the PFs of DTLZ7 and UF8 significantly better than Hybrid MOEA/D. The figure also shows that Hybrid MOHS/D traces ZDT2 and UF6 better than Hybrid MOEA/D. The Friedman test is the non-parametric alternative to the one-way ANOVA with repeated measures; it tests for differences between groups when the dependent variable being measured is ordinal. Table 3 shows the Friedman mean ranks of the compared algorithms in terms of the quality indicator, where the lowest mean rank is the best (bold font). From Table 3 we can see that the Harmony Hybrid MOEAD algorithm outperformed all the other algorithms.
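Mean ranks of the kind reported in Table 3 can be computed from per-problem scores as sketched below. This is an illustrative sketch only: ties are ignored for simplicity, and a full Friedman test would additionally compute the test statistic and its p-value.

```python
def mean_ranks(scores):
    """Friedman-style mean ranks.

    `scores[p]` holds one IGD value per algorithm for problem p; the
    lowest IGD on a problem receives rank 1. Returns the per-algorithm
    mean rank across all problems (lower is better).
    """
    n_alg = len(scores[0])
    totals = [0.0] * n_alg
    for row in scores:
        # rank algorithms on this problem by ascending IGD
        order = sorted(range(n_alg), key=lambda i: row[i])
        for rank, i in enumerate(order, start=1):
            totals[i] += rank
    return [t / len(scores) for t in totals]
```

For example, an algorithm that wins on every problem would receive a mean rank of exactly 1.0.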

4 Conclusions

In this paper, we present enhanced versions of the multi-objective harmony search algorithms NSHS and MOHS/D using a previously proposed hybrid framework. The two new algorithms are called Hybrid NSHS and Hybrid MOHS/D. The applied framework inspects the diversity of the solutions every predetermined number of iterations. The two proposed algorithms are compared with Hybrid NSGA-II and Hybrid MOEA/D using well-known benchmarks from the literature. The experimental results show that Hybrid NSHS and Hybrid MOHS/D outperform Hybrid NSGA-II and Hybrid MOEA/D, respectively, in terms of IGD. The proposed algorithms are a better vehicle for solving problems with multiple local fronts. However, on a few benchmark functions, the proposed algorithms show shortcomings because of excessive diversity caused by the high exploration induced by the pitch adjustment rate. As future work, we will study the effect of the different operators on the performance of the proposed algorithms.

References

1. Abraham, A., Jain, L.: Evolutionary Multiobjective Optimization. Springer (2005)
2. Abu Doush, I., Bataineh, M.Q.: Hybridized NSGA-II and MOEA/D with Harmony Search Algorithm to Solve Multi-objective Optimization Problems, pp. 606–614. Springer (2015)
3. Al-Betar, M.A., Doush, I.A., Khader, A.T., Awadallah, M.A.: Novel selection schemes for harmony search. Applied Mathematics and Computation 218(10), 6095–6117 (2012)
4. Deb, K.: Multi-objective Optimization Using Evolutionary Algorithms. John Wiley & Sons, Chichester (2001)
5. Deb, K., Pratap, A., Agarwal, S., Meyarivan, T.: A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation 6(2), 182–197 (2002)
6. Deb, K., Thiele, L., Laumanns, M., Zitzler, E.: Scalable multi-objective optimization test problems. In: Proceedings of the Congress on Evolutionary Computation (CEC 2002), Honolulu, USA, pp. 825–830 (2002)
7. Doush, I.A.: Harmony Search with Multi-Parent Crossover for Solving IEEE-CEC2011 Competition Problems, pp. 108–114. Springer, Berlin, Heidelberg (2012)
8. Doush, I.A., Al-Betar, M.A., Khader, A.T., Awadallah, M.A., Mohammed, A.B.: Analysis of takeover time and convergence rate for harmony search with novel selection methods. International Journal of Mathematical Modelling and Numerical Optimisation 4(4) (2013)
9. El-Abd, M.: An improved global-best harmony search algorithm. Applied Mathematics and Computation 222, 94–106 (2013)
10. Esfe, M.H., Hajmohammad, H., Toghraie, D., Rostamian, H., Mahian, O., Wongwises, S.: Multi-objective optimization of nanofluid flow in double tube heat exchangers for applications in energy systems. Energy (2017)
11. Geem, Z.W., Kim, J.H., Loganathan, G.: A new heuristic optimization algorithm: harmony search. Simulation 76(2), 60–68 (2001)
12. Gutjahr, W.J., Pichler, A.: Stochastic multi-objective optimization: a survey on non-scalarizing methods. Annals of Operations Research 236(2), 475–499 (2016)
13. Li, H., Zhang, Q.: Multiobjective optimization problems with complicated Pareto sets, MOEA/D and NSGA-II. IEEE Transactions on Evolutionary Computation 13(2), 284–302 (2009)
14. Pavelski, L.M., Almeida, C.P., Gonçalves, R.A.: Harmony search for multi-objective optimization. In: 2012 Brazilian Symposium on Neural Networks (SBRN), pp. 220–225. IEEE (2012)
15. Reyes Sierra, M., Coello Coello, C.A.: Multi-objective particle swarm optimizers: a survey of the state-of-the-art. International Journal of Computational Intelligence Research 2(3), 287–308 (2006)
16. Ricart, J., Hüttemann, G., Lima, J., Barán, B.: Multiobjective harmony search algorithm proposals. Electronic Notes in Theoretical Computer Science 281, 51–67 (2011)
17. Sindhya, K., Miettinen, K., Deb, K.: A hybrid framework for evolutionary multi-objective optimization. IEEE Transactions on Evolutionary Computation 17(4), 495–511 (2013)
18. Zhang, Q., Li, H.: MOEA/D: a multiobjective evolutionary algorithm based on decomposition. IEEE Transactions on Evolutionary Computation 11(6), 712–731 (2007)
19. Zhang, Q., Zhou, A., Zhao, S., Suganthan, P.N., Liu, W., Tiwari, S.: Multiobjective optimization test instances for the CEC 2009 special session and competition. Technical report, University of Essex, Colchester, UK and Nanyang Technological University, Singapore (2008)
20. Zitzler, E., Deb, K., Thiele, L.: Comparison of multiobjective evolutionary algorithms: empirical results. Evolutionary Computation 8(2), 173–195 (2000)

Table 1. The minimum, median, and maximum of the IGD-metric values for Hybrid NSHS and Hybrid NSGA-II.

Test Problem  Type    Hybrid NSHS  Hybrid NSGA-II
ZDT1          Best    1.42e-03     6.98e-04
              Median  4.13e-03     7.33e-03
              Worst   9.19e-03     8.18e-03
ZDT2          Best    8.83e-04     5.39e-04
              Median  1.15e-03     2.62e-03
              Worst   1.26e-03     7.09e-03
ZDT3          Best    5.88e-04     5.30e-04
              Median  5.11e-04     2.65e-03
              Worst   1.03e-03     3.06e-03
ZDT4          Best    2.55e-02     4.89e-04
              Median  4.42e-02     1.90e-03
              Worst   1.32e-01     1.06e-02
ZDT6          Best    6.04e-04     1.02e-03
              Median  2.65e-03     9.44e-03
              Worst   8.27e-03     2.50e-02
DTLZ1         Best    8.82e-04     1.14e-02
              Median  6.24e-03     2.37e-02
              Worst   1.24e-02     4.47e-02
DTLZ2         Best    5.93e-04     6.29e-04
              Median  8.31e-04     5.04e-04
              Worst   9.39e-04     8.08e-04
DTLZ3         Best    1.38e-02     1.37e-01
              Median  6.74e-02     2.87e-01
              Worst   1.31e-01     5.82e-01
DTLZ4         Best    8.49e-04     8.53e-04
              Median  1.05e-03     9.58e-04
              Worst   1.34e-03     2.32e-03
DTLZ5         Best    3.25e-05     3.19e-05
              Median  2.46e-04     4.79e-05
              Worst   6.25e-04     6.14e-05
DTLZ6         Best    4.65e-04     2.11e-02
              Median  8.34e-04     1.23e-02
              Worst   5.63e-03     2.40e-02
DTLZ7         Best    2.86e-03     1.68e-03
              Median  3.42e-03     1.87e-03
              Worst   3.72e-03     2.00e-03
UF1           Best    3.05e-03     3.98e-03
              Median  3.93e-03     7.07e-03
              Worst   5.25e-03     2.57e-02
UF2           Best    4.33e-03     6.54e-03
              Median  5.58e-03     9.56e-03
              Worst   7.75e-03     1.18e-02
UF3           Best    7.82e-03     4.60e-03
              Median  8.69e-03     8.33e-03
              Worst   1.01e-02     1.18e-02
UF4           Best    3.02e-03     3.01e-03
              Median  3.22e-03     7.44e-03
              Worst   3.62e-03     1.75e-02
UF5           Best    4.33e-02     1.12e-01
              Median  1.49e-01     8.15e-01
              Worst   9.05e-01     1.08e+00
UF6           Best    6.13e-03     6.20e-03
              Median  8.80e-03     9.10e-03
              Worst   1.31e-02     1.34e-02
UF7           Best    2.15e-03     3.88e-03
              Median  3.48e-03     1.34e-02
              Worst   1.70e-02     1.89e-02
UF8           Best    2.76e-03     2.16e-03
              Median  2.91e-03     2.81e-03
              Worst   3.19e-03     2.92e-03
UF9           Best    3.59e-03     1.27e-03
              Median  4.31e-03     2.43e-03
              Worst   4.84e-03     3.65e-03
UF10          Best    2.85e-03     2.87e-03
              Median  3.46e-03     3.56e-03
              Worst   3.83e-03     5.10e-03

Table 2. The minimum, median, and maximum of the IGD-metric values for Hybrid MOHS/D and Hybrid MOEA/D.

Test Problem  Type    Hybrid MOHS/D  Hybrid MOEA/D
ZDT1          Best    1.49e-03       2.67e-04
              Median  1.97e-03       3.28e-04
              Worst   2.70e-03       4.43e-04
ZDT2          Best    3.30e-04       3.37e-02
              Median  2.85e-04       2.96e-03
              Worst   4.82e-04       3.58e-03
ZDT3          Best    9.41e-04       3.00e-04
              Median  1.27e-03       3.51e-04
              Worst   1.49e-03       4.28e-04
ZDT4          Best    8.70e-03       4.92e-02
              Median  2.45e-02       1.39e-01
              Worst   5.89e-02       3.57e-01
ZDT6          Best    1.56e-04       1.40e-04
              Median  1.73e-04       1.42e-04
              Worst   7.87e-03       6.08e-04
DTLZ1         Best    3.82e-03       4.56e-03
              Median  1.82e-02       3.00e-02
              Worst   4.73e-02       9.20e-02
DTLZ2         Best    3.36e-03       3.36e-03
              Median  4.37e-03       3.75e-03
              Worst   4.66e-03       4.63e-03
DTLZ3         Best    1.03e-01       1.63e-01
              Median  1.39e-01       4.71e-01
              Worst   1.88e-01       1.17e+00
DTLZ4         Best    8.41e-03       8.89e-03
              Median  1.00e-02       1.08e-02
              Worst   1.09e-02       1.27e-02
DTLZ5         Best    1.36e-03       1.42e-03
              Median  1.41e-03       1.43e-03
              Worst   1.46e-03       1.48e-03
DTLZ6         Best    7.96e-03       6.84e-03
              Median  2.08e-02       1.03e-02
              Worst   2.86e-02       3.15e-02
DTLZ7         Best    2.04e-02       2.79e-01
              Median  8.98e-02       1.80e+00
              Worst   3.09e-01       5.90e+00
UF1           Best    1.65e-03       1.22e-03
              Median  2.77e-03       2.68e-03
              Worst   3.80e-03       7.30e-03
UF2           Best    1.40e-03       8.00e-04
              Median  1.79e-03       1.30e-03
              Worst   2.74e-03       6.99e-03
UF3           Best    5.54e-03       4.73e-03
              Median  5.53e-03       8.38e-03
              Worst   8.59e-03       1.14e-02
UF4           Best    1.91e-03       1.79e-03
              Median  2.01e-03       2.09e-03
              Worst   3.37e-03       2.55e-03
UF5           Best    3.86e-02       8.25e-02
              Median  4.89e-02       1.35e-01
              Worst   7.00e-02       1.74e-01
UF6           Best    6.92e-03       3.46e-03
              Median  1.03e-02       1.14e-02
              Worst   1.25e-02       2.54e-02
UF7           Best    1.48e-03       9.28e-04
              Median  1.22e-02       2.33e-03
              Worst   1.56e-02       2.17e-02
UF8           Best    5.35e-03       8.52e-03
              Median  5.35e-03       9.57e-03
              Worst   8.91e-03       9.93e-03
UF9           Best    6.36e-03       5.23e-03
              Median  8.99e-03       9.22e-03
              Worst   6.69e-02       6.78e-02
UF10          Best    6.57e-03       8.10e-03
              Median  8.33e-03       1.04e-02
              Worst   9.57e-03       1.32e-02

Table 3. Friedman mean ranks for the compared algorithms in terms of the quality indicator.

Indicator  Harmony NSGA-II  Harmony MOEAD  Harmony Hybrid NSGA-II  Harmony Hybrid MOEAD
IGD        2.64             3.00           2.36                    2.00
