Development of self-consistent multi-objective harmony search algorithm

Siddharth Jain, Jaydev Kalivarapu, Swarup Bag
Department of Mechanical Engineering, Indian Institute of Technology Guwahati, Guwahati 781039, Assam, India
E-mail: [email protected], [email protected], [email protected]
Abstract—This work presents the development of a multi-objective harmony search (MOHS) algorithm for optimization problems using the self-adaptive improved harmony search (SIHS) algorithm, a variant of the recently developed harmony search (HS) algorithm for single-objective optimization. The approach decomposes the multiple objectives into several single-objective functions which are optimized simultaneously in such a way that a nearly uniform distribution of solutions along the Pareto front is maintained. The algorithm has shown promising results on standard test functions.
Keywords—Harmony search, multi-objective optimization, Pareto front
Introduction
Although most real-world optimization problems can be formulated as single-objective, conflicting objective functions give rise to the requirement of a Pareto-optimal front. A multi-objective optimization problem (MOOP) consists of more than one objective function. In the past a MOOP has generally been solved as a single-objective problem, yielding only one result. A general multi-objective problem can be summarized as:

min/max f_m(x),          m = 1, 2, …, M
subject to g_j(x) ≥ 0,   j = 1, 2, …, J        (1)
           h_k(x) = 0,   k = 1, 2, …, K
           x_i^L ≤ x_i ≤ x_i^U,   i = 1, 2, …, n
It is possible that a host of conflicting objectives need to be fulfilled at a particular point of time. The presence of multiple objectives gives rise to a set of optimal solutions known as Pareto-optimal solutions, or a Pareto front (PF). No other point dominates (in terms of optimality) the points on this Pareto-optimal front. Generally, a slight improvement of a Pareto-optimal point in terms of one objective leads to deterioration in another objective. A Pareto front of two minimization functions f1 and f2 is shown in Fig. 1. Evolutionary algorithms (EAs) are nature-inspired stochastic search techniques that mimic natural evolution mechanisms. These algorithms have been a valuable tool in the field of multi-objective optimization, mostly because of their population-based approach and flexibility. Although several techniques have been developed for multi-objective optimization, the non-dominated sorting genetic algorithm (NSGA-II) has shown promising results. This algorithm uses Pareto-dominance and crowding distance concepts to find near-approximate solutions along the Pareto front.
Fig. 1 Pareto optimal front
This work implements the self-adaptive improved harmony search (SIHS) algorithm, a variant of the recently developed harmony search (HS) algorithm, for multi-objective optimization problems. The HS algorithm was originally developed by Geem et al. [1] in 2001 and has since been improved significantly by several researchers. The working principles and further developments of the HS algorithm are explained in later sections.
Literature Review
One of the cornerstones of multi-objective evolutionary algorithms was the presentation of the multi-objective genetic algorithm (MOGA) by Fonseca et al. [2]. The authors used a rank-based fitness assignment method to optimize various problems. The genetic algorithm (GA) is seen here as an element of a multi-objective optimization loop which uses inputs from the decision maker (DM); it also uses the concept of a sharing function to obtain better solutions. The ability of MOGA to solve various optimization problems is demonstrated. Horn et al. [3] proposed another non-elitist algorithm, the niched Pareto genetic algorithm (NPGA), to find a diverse "Pareto optimal population". The algorithm is
substantiated by its application to two artificial problems and one open problem in hydrosystems. Zitzler et al. [4] proposed the strength Pareto evolutionary algorithm (SPEA), which they further improved in another paper as SPEA-2 [5]. The authors also presented a comparative study of various multi-objective optimization techniques that are capable of searching for multiple solutions in a single run. One of the popular and classic benchmark algorithms in multi-objective optimization was proposed by Deb et al. [6] and is famously known in research circles as NSGA-II. NSGA-II is a fast and elitist method which considerably improved on the performance of NSGA and the other MOEAs available at the time. It was able to find a diverse Pareto-optimal front in spite of a drastic reduction in the time complexity of the algorithm. The authors have since moved a step further and are proposing NSGA-III. Zhang et al. [7] proposed a multi-objective evolutionary algorithm based on decomposition (MOEA/D). This algorithm is particularly well suited to optimizing problems with two or more objectives in relatively little time. It first decomposes a multi-objective optimization problem into various scalar optimization sub-problems and then optimizes them simultaneously. This algorithm has lower computational complexity than multi-objective genetic local search (MOGLS) and NSGA-II [6, 7]. Ricart et al. [8] used the harmony search algorithm as a base and proposed two variants of modified harmony search to test the ZDT family of functions, concluding that such an EA is competitive with the general HS algorithm. Sivasubramani et al. [9] proposed a multi-objective harmony search algorithm (MOHSA) to find optimality in the optimal power flow problem. Fast elitist non-dominated sorting and crowding distances were used to find the Pareto-optimal front.
It was observed that a clear and well-distributed Pareto-optimal solution set can produce good results. Pavelski et al. [10] investigated the efficiency of HS by adapting it within the NSGA-II framework; the proposed methods were then tested against each other on a set of benchmark instances. A multi-objective binary harmony search algorithm (MBHS) was proposed by Wang et al. [11] to solve binary-coded multi-objective optimization problems. A modified pitch adjustment operator was used to improve the search capability of MBHS. They also used non-dominated sorting based on crowding distance and adopted the evaluated solutions to update the harmony memory and hence maintain the diversity of the algorithm. Promising results were found when the algorithm was compared with NSGA-II.
Background of Harmony Search algorithm
Harmony search is a population-based metaheuristic algorithm inspired by a musician's process of finding a perfect state of harmony. Just as a musician intends to compose music with perfect harmony and reach a perfect state, the algorithm endeavours to optimize the given objective function and reach the best available solution to the problem. 3.1 Analogies – Music Improvisation and Optimization For every musical instrument the pitch generally determines the aesthetic quality of music, and hence is analogous to the value of a decision variable, which determines the fitness value. Also, in a composition process a variation in pitch changes the harmony; likewise, a variation in the value of a single decision variable changes the objective function value in the HS algorithm. Therefore the pitch of a musical instrument can be considered analogous to the value of a decision variable. Now consider an orchestra. The final music that comes out of the orchestra is a combination of the contributions of all the musicians; similarly, the objective function value is an output of all the decision variables. Therefore a musician is analogous to a decision variable. In the same way there is an analogy between the whole orchestra playing different harmonies and the harmony memory. 3.2 Improvisation process The following are the three major actions that a skilled musician undertakes to improve a tune or harmony: (a) the musician uses repetition and plays a pitch from his/her memory, which is analogous to choosing a value from the HS memory; (b) the musician plays a slight variation of a remembered pitch, which is analogous to choosing a slight variation of one value from the HS memory; (c) the musician plays an altogether new tone, which is analogous to choosing a totally random value from the possible value range.
The following are the basic steps of the HS algorithm:
Step 1: Define the objective function and initialize the algorithm parameters, namely the harmony memory considering rate (HMCR), pitch adjusting rate (PAR) and band width (BW).
Step 2: Randomly initialize the harmony memory (HM).
Step 3: Improvise a new harmony from the HM with probability HMCR, or generate a new harmony randomly with probability 1-HMCR.
Step 4: Update the harmony memory (HM) based on the survival-of-the-fittest principle.
Step 5: Check the convergence criteria and go to Step 3 if they are not met.
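As a sketch, the five steps above can be put into code as follows. This is a minimal single-objective HS in Python; the sphere function, bounds and parameter values are illustrative assumptions, not the paper's settings:

```python
import random

def harmony_search(f, dim, lb, ub, hms=10, hmcr=0.9, par=0.5, bw=0.01, iters=5000):
    # Step 2: randomly initialize the harmony memory (HM)
    hm = [[random.uniform(lb, ub) for _ in range(dim)] for _ in range(hms)]
    cost = [f(x) for x in hm]
    for _ in range(iters):
        # Step 3: improvise a new harmony, dimension by dimension
        new = []
        for d in range(dim):
            if random.random() < hmcr:            # take a value from memory
                v = hm[random.randrange(hms)][d]
                if random.random() < par:         # pitch adjustment (Eq. 6)
                    v += bw * random.uniform(-1, 1)
            else:                                 # totally random value
                v = random.uniform(lb, ub)
            new.append(min(ub, max(lb, v)))
        # Step 4: survival of the fittest — replace the worst member if better
        worst = max(range(hms), key=lambda i: cost[i])
        c = f(new)
        if c < cost[worst]:
            hm[worst], cost[worst] = new, c
    best = min(range(hms), key=lambda i: cost[i])
    return hm[best], cost[best]

# illustrative usage: minimize the sphere function in 5 dimensions
x, fx = harmony_search(lambda x: sum(v * v for v in x), dim=5, lb=-5, ub=5)
```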
3.3 Algorithm parameters
The harmony memory stores HMS candidate solution vectors and is of the form

HM = [x^1; x^2; … ; x^HMS]    (2)

A very large PAR (>0.9) may cause the solution to scatter around potential optima as in a random search. Thus PAR ∈ [0.4, 0.75] is usually used in most applications. The actual probability of pitch adjustment is a combination of both HMCR and PAR and equals HMCR × PAR. The pitch adjustment operation essentially produces a new solution around an existing quality solution by varying the pitch slightly by a small random amount. The following equation gives the linear relation governing pitch adjustment:

x_i,new = x_i,old + BW × rand    (6)

x_i,new is the new harmony, which is now stored in the harmony memory. Here rand is a random number in the range [-1, 1]. This essentially corresponds to a local search operation.
Development of Multi-Objective Harmony Search (MOHS) Algorithm
The aim of MOHS is to find a set of solutions of two or more objectives with the following characteristics: a. the points should be close to or on the Pareto-optimal front and form a set of non-dominated solutions; b. the points should form as diverse a Pareto front as possible. A modification of the robust SIHS algorithm is used to find the Pareto-optimal front; the success of SIHS on single-objective functions is detailed in the results and discussion section. The algorithm drives the points as close as possible to the Pareto-optimal front, and constraints related to the Euclidean distances within the set have been incorporated to maintain a diverse population.
4.1 Steps in MOHS
The following steps are carried out as part of the MOHS algorithm:
Step 1: Define the objective functions and initialize the algorithm parameters, namely the harmony memory considering rate (HMCR), pitch adjusting rate (PAR), band width (BW), number of objective functions (N) and population size (NPOP).
Step 2: Based on the population size (NPOP) and the number of objective functions (N), assign a uniformly distributed weight matrix along N dimensions.
Step 3: For each member of NPOP, assign the weighted objective function and formulate it as a single-objective optimization problem.
Step 4: Randomly initialize the harmony memory (HM) for each member of the population (NPOP).
Step 5: Improvise the harmony and update based on the survival-of-the-fittest principle.
Step 6: Check the convergence criteria and repeat if they are not met.
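The steps above can be sketched structurally as follows. This is a hypothetical skeleton assuming N = 2 objectives, with a plain weighted sum and a random-search inner loop standing in for the paper's actual scalarization (Section 4.4) and the SIHS solver:

```python
import random

def mohs_skeleton(objs, dim, lb, ub, npop=10, iters=200):
    # Step 2: uniformly spaced weight vectors (N = 2 assumed here)
    weights = [[i / (npop - 1), 1 - i / (npop - 1)] for i in range(npop)]
    front = []
    for w in weights:
        # Step 3: scalarize this member's subproblem (weighted-sum stand-in)
        def scalarize(x, w=w):
            return sum(wi * f(x) for wi, f in zip(w, objs))
        # Steps 4-6: a simple random-search stand-in for the SIHS inner loop
        best = [random.uniform(lb, ub) for _ in range(dim)]
        for _ in range(iters):
            cand = [random.uniform(lb, ub) for _ in range(dim)]
            if scalarize(cand) < scalarize(best):
                best = cand
        front.append(tuple(f(best) for f in objs))
    return front
```

Each weight vector yields one point of the approximated front, which is how the decomposition turns one MOOP into NPOP single-objective searches.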
4.2 Pseudo code of MOHS
The pseudo code in Fig. 3 illustrates the governing algorithm which forms the backbone of MOHS. The weight matrix is decided in such a way that a near-uniform distribution of solutions is maintained throughout.

Fig. 3 Pseudo code for the multi-objective optimization algorithm

4.3 Flowchart

Fig. 4 Flow chart for MOHS
4.4 Objective function formulation
The weights need to be assigned at equal intervals to maintain the diversity of the population. The following equation gives the weight assignment criterion for N objective functions:

Σ_{i=1}^{N} w_i = 1    (7)
Fig. 5 Weight assignment for N=2 (the weights satisfy w1 + w2 = 1)

Here all the w_i are required to be distributed uniformly. Fig. 6 illustrates the weight distribution for 21 solutions and two objective functions. Consider an ideal point z*, whose coordinates are the minima of all the individual objective functions:

z* = min(f1, f2, … , f_m)    (8)

Under ideal conditions we would want our optimal point to be z*, but due to the constraints of the problem this might not always be possible. Hence we drive all our points in the direction of z*.
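The uniform weight assignment of equation (7) can be sketched as follows. The function name simplex_weights and the divisions parameter h are illustrative; for two objectives h = 20 yields 21 vectors, and for three objectives h = 6 yields 28 vectors, matching the NPOP values used in the figures:

```python
from itertools import product

def simplex_weights(n_obj, h):
    """All weight vectors with components i/h that sum to 1 (Eq. 7)."""
    weights = []
    for combo in product(range(h + 1), repeat=n_obj - 1):
        if sum(combo) <= h:               # stay on the unit simplex
            w = [c / h for c in combo]
            w.append(1 - sum(w))          # last weight closes the sum to 1
            weights.append(w)
    return weights
```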
Figure 8 shows a point z* which lies beyond the boundary of the Pareto front and hence is unreachable. Now, with the dual objective of minimizing the distance from z* (d1) and minimizing the distance from the weight line (d2), we formulate an objective function. It can be written as below, where the first part represents the distance d1 and the second part represents a constraint that minimizes the distance d2:

min f = Σ_{i=1}^{N} (f_i − z_i*)² + Σ_{i=1}^{N−1} c_i (f_{i+1} w_i − f_i w_{i+1})²    (9)

where c_i is a penalty parameter, generally taken between 10 and 1000, so as to force the solution to lie along the corresponding line.
Fig. 6 Weights for N=2 & NPOP=21

Figure 7 illustrates the weight distribution for 28 solutions and three objective functions.

Fig. 7 Weights for N=3 & NPOP=28

A simple procedure is thus followed to bring the set of population points close to the lines passing through the weighted points.

Fig. 8 Ideal point z*, distances d1 and d2

For two conflicting normalized objective functions both having their minima at 0, the following equation and Fig. 9 summarize the scenario:

min f = f1² + f2² + c1 (f2 w1 − f1 w2)²    (10)

Fig. 9 Distance minimization for N=2 and normalized functions
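The penalty scalarization of equations (9) and (10) can be sketched as below, assuming a squared line-distance penalty and a single common penalty constant c (the paper allows a per-term c_i between 10 and 1000):

```python
def scalarized(fvals, w, zstar, c=100):
    """Eq. (9): squared distance to the ideal point z* plus a penalty
    that keeps the objective vector on its weight line."""
    n = len(fvals)
    dist = sum((fi - zi) ** 2 for fi, zi in zip(fvals, zstar))       # d1 term
    line = sum(c * (fvals[i + 1] * w[i] - fvals[i] * w[i + 1]) ** 2  # d2 term
               for i in range(n - 1))
    return dist + line
```

For two normalized objectives with z* = (0, 0) this reduces to equation (10): a point on its weight line pays no penalty, while an off-line point is pushed back by the c-weighted term.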
In the MOHS algorithm each member of the NPOP is represented and backed by a harmony memory which converges to the same line. So in the actual case the number of members present in the population is NPOP × HMS; for the sake of convenience we refer to the population size as NPOP and do not multiply it by HMS. A sample run illustrates how MOHS finds the Pareto front with NPOP = 30 and N = 2. Initially the population is assigned randomly to each line.

Fig. 10 Initial random population

After some iterations improvisation takes place. The points begin to form clusters along their respective lines and also begin to move towards the Pareto optimal front.

Fig. 11 Improvisation

As MOHS progresses further, the points form even closer clusters. The clusters are now very near the Pareto optimal front and about to converge.

Fig. 12 Further improvement and close clustering

After MOHS has finished running, the points finally converge and form a set of non-dominated solutions on the Pareto optimal front.

Fig. 13 Final output

Similar methods have been replicated for N = 3 and tested on various objective functions; the results are summarized in the results and discussion section.

4.5 Proof of concept of MOHS
The MOHS method is inspired by the paper on a multi-objective evolutionary algorithm based on decomposition (MOEA/D) [7]. Similar to MOEA/D, MOHS decomposes the multi-objective problem into a number of single-objective optimization functions and solves them simultaneously. The weighted sum approach for any algorithm requires the optimization of the following function:

min Σ_{i=1}^{N} w_i f_i    (11)

This approach considers a convex combination of the different objectives, and different weight vectors are generally used to obtain different optimal points. However, with the above approach not all Pareto-optimal points can be reached, especially if the Pareto front is non-convex. MOHS has no such disability on concave PFs. Another interesting approach to solving MOOPs is the Tchebycheff approach, in which the scalar optimization problem is formulated as:

min f = max_{1≤i≤m} w_i (f_i − z_i*)    (12)

where z* is the reference point, similar to that in MOHS, and w_i is the weight for each optimal solution. Many optimal solutions can be found by varying the weight vector, and this approach has also been quite successful in the literature. Having applied various methods within MOHS, we found the objective function formulation of the section above to be the most suitable for solving multi-objective optimization problems.
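The Tchebycheff scalarization of equation (12) is compact enough to sketch directly (a minimal illustration, not the paper's implementation):

```python
def tchebycheff(fvals, w, zstar):
    """Eq. (12): worst weighted deviation from the reference point z*."""
    return max(wi * (fi - zi) for wi, fi, zi in zip(w, fvals, zstar))
```

Minimizing this maximum deviation over x, for varying w, traces out different Pareto-optimal points, including ones on concave fronts that the weighted sum of equation (11) cannot reach.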
Multi-Objective Optimization

5.1 Test functions
The following test functions have been used to test the robustness of the multi-objective harmony search algorithm.

5.1.1 ZDT1
min f1(x) = x1
min f2(x) = g(x)(1 − √(f1(x)/g(x)))    (13)
where g(x) = 1 + 9(Σ_{i=2}^{n} x_i)/(n − 1) and 0 ≤ x_i ≤ 1.
The Pareto front of ZDT1 is convex and satisfies f2 = 1 − √f1. The test function is solved for n = 30.

5.1.2 ZDT2
min f1(x) = x1
min f2(x) = g(x)(1 − (f1(x)/g(x))²)    (14)
where g(x) = 1 + 9(Σ_{i=2}^{n} x_i)/(n − 1) and 0 ≤ x_i ≤ 1.
The Pareto front of ZDT2 is concave and satisfies f2 = 1 − f1². The test function is solved for n = 30.

5.1.3 ZDT3
min f1(x) = x1
min f2(x) = g(x)(1 − √(f1(x)/g(x)) − (f1(x)/g(x)) sin(10π f1(x)))    (15)
where g(x) = 1 + 9(Σ_{i=2}^{n} x_i)/(n − 1) and 0 ≤ x_i ≤ 1.
The Pareto front of ZDT3 is disconnected and of varying shape, satisfying f2 = 1 − √f1 − f1 sin(10π f1) over a limited domain of f1. The test function is solved for n = 30.

5.1.4 ZDT4
min f1(x) = x1
min f2(x) = g(x)(1 − √(f1(x)/g(x)))    (16)
where g(x) = 1 + 10(n − 1) + Σ_{i=2}^{n} (x_i² − 10 cos(4π x_i)) and 0 ≤ x1 ≤ 1, −5 ≤ x_i ≤ 5 for i = 2, …, n.
The test function has many local Pareto fronts in the feasible range. It is solved for n = 10.

5.1.5 DTLZ1
min f1(x) = (1 + g(x)) x1 x2
min f2(x) = (1 + g(x)) x1 (1 − x2)    (17)
min f3(x) = (1 + g(x)) (1 − x1)
where g(x) = 100(n − 2) + 100 Σ_{i=3}^{n} ((x_i − 0.5)² − cos(20π(x_i − 0.5))) and 0 ≤ x_i ≤ 1.
Its PF is non-convex and the objective values on it satisfy Σ_{i=1}^{3} f_i = 1 with f_i ≥ 0. The test function is solved for n = 10.

5.1.6 DTLZ2
min f1(x) = (1 + g(x)) cos(πx1/2) cos(πx2/2)
min f2(x) = (1 + g(x)) cos(πx1/2) sin(πx2/2)    (18)
min f3(x) = (1 + g(x)) sin(πx1/2)
where g(x) = Σ_{i=3}^{n} x_i², 0 ≤ x1, x2 ≤ 1 and −1 ≤ x_i ≤ 1 for i = 3, …, n.
Its PF is non-convex and the objective values on it satisfy Σ_{i=1}^{3} f_i² = 1 with f_i ≥ 0. The test function is solved for n = 10.
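Two of the benchmarks above, written as plain functions for concreteness (a sketch; the decision vector x is a Python list):

```python
import math

def zdt1(x):
    """ZDT1 (Eq. 13): n = 30, 0 <= x_i <= 1; Pareto front f2 = 1 - sqrt(f1)."""
    f1 = x[0]
    g = 1 + 9 * sum(x[1:]) / (len(x) - 1)
    f2 = g * (1 - math.sqrt(f1 / g))
    return f1, f2

def dtlz2(x):
    """DTLZ2 (Eq. 18): n = 10; Pareto front satisfies f1^2 + f2^2 + f3^2 = 1."""
    g = sum(v ** 2 for v in x[2:])
    f1 = (1 + g) * math.cos(math.pi * x[0] / 2) * math.cos(math.pi * x[1] / 2)
    f2 = (1 + g) * math.cos(math.pi * x[0] / 2) * math.sin(math.pi * x[1] / 2)
    f3 = (1 + g) * math.sin(math.pi * x[0] / 2)
    return f1, f2, f3
```

On the Pareto sets (x_i = 0 for i ≥ 2 in ZDT1, x_i = 0 for i ≥ 3 in DTLZ2) the stated front equations can be verified directly.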
5.2 Results tabulation and discussion
All the parameters used are the same as those in the SIHS-II implementation of harmony search, except BW_max = 0.5. The other parameter values are HMCR = 0.9, PAR = 0.9, HMS = 10 and BW_min = 0.0002. The value of the penalty parameter ranges from 10 to 1000 depending on the test function, but 100 works well for all of them. The number of iterations NI has been kept fixed at 5000.

5.2.1 ZDT1
Fig. 14, Fig. 15 and Fig. 16 demonstrate the results of optimizing the test function ZDT1. The solutions obtained are very close to the Pareto optimal front and are also uniformly distributed.

Fig. 14 ZDT1 for NPOP=4

Fig. 15 ZDT1 for NPOP=20

Fig. 16 ZDT1 for NPOP=100

5.2.2 ZDT2

Fig. 17 ZDT2 for NPOP=4

Fig. 18 ZDT2 for NPOP=20

Fig. 19 ZDT2 for NPOP=100

5.2.3 ZDT3
As stated before, the Pareto front of ZDT3 is discontinuous in nature. Since the algorithm does not adopt the concept of non-dominance implicitly, the dominated solutions have to be removed explicitly from the obtained front.

Fig. 20 ZDT3 for NPOP=10 before removal of dominated solutions

Fig. 21 ZDT3 for NPOP=10 after removing dominated solutions

5.2.4 ZDT4
The initial front created before the removal of dominated solutions is shown in Fig. 22. Fig. 23 shows the front after their removal.

Fig. 22 ZDT4 for NPOP=100 before removal of dominated solutions

Fig. 23 ZDT4 for NPOP=100 after removing dominated solutions

5.2.5 DTLZ1
DTLZ1 and DTLZ2 are optimization problems with three objective functions. The following graphs demonstrate the results of optimizing these test functions. It can be seen that the solutions are uniformly distributed along the Pareto front.

Fig. 24 DTLZ1 for NPOP=28

Fig. 25 DTLZ1 for NPOP=190

5.2.6 DTLZ2

Fig. 26 DTLZ2 for NPOP=28

Fig. 27 DTLZ2 for NPOP=190

Abbreviations and Acronyms
BW: Band Width
DE: Differential Evolution
DV: Decision Variables
EA: Evolutionary Algorithm
GA: Genetic Algorithm
HMCR: Harmony Memory Consideration Rate
HS: Harmony Search
HSA: Harmony Search Algorithm
IHS: Improved Harmony Search
IHSA: Improved Harmony Search Algorithm
MOEA: Multi-Objective Evolutionary Algorithm
MOGA: Multi-Objective Genetic Algorithm
MOHS: Multi-Objective Harmony Search
MOOP: Multi-Objective Optimization Problem
NOI: Number of Iterations
NPGA: Niched Pareto Genetic Algorithm
NSGA: Non-dominated Sorting Genetic Algorithm
PAR: Pitch Adjustment Rate
PF: Pareto Front
SIHS: Self-adaptive Improved Harmony Search
SPEA: Strength Pareto Evolutionary Algorithm
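The explicit removal of dominated solutions used for ZDT3 and ZDT4 can be sketched as a simple pairwise dominance filter (for minimization; a minimal illustration, not the paper's implementation):

```python
def remove_dominated(points):
    """Keep only non-dominated points: a point is dropped if some other
    point is no worse in every objective and strictly better in one."""
    def dominates(a, b):
        return (all(ai <= bi for ai, bi in zip(a, b))
                and any(ai < bi for ai, bi in zip(a, b)))
    return [p for p in points if not any(dominates(q, p) for q in points)]
```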
Conclusion
A multi-objective harmony search (MOHS) algorithm based on a decomposition approach has been proposed. MOHS formulates a host of single-objective optimization problems using a weight generator; these objective functions are then solved simultaneously using the self-adaptive improved harmony search (SIHS) algorithm. On six difficult benchmark test problems borrowed from the literature, the proposed MOHS was able to maintain a nearly uniform spread of solutions and converged to the Pareto front.
References
[1] Z. W. Geem and J. H. Kim: A New Heuristic Optimization Algorithm: Harmony Search, Simulation, 76(2), 60–68, 2001.
[2] C. M. Fonseca and P. J. Fleming: Genetic Algorithms for Multiobjective Optimization: Formulation, Discussion and Generalization, Proceedings of the Fifth International Conference on Genetic Algorithms (S. Forrest, ed.), San Mateo, CA: Morgan Kaufmann, July 1993.
[3] J. Horn, N. Nafpliotis and D. E. Goldberg: A Niched Pareto Genetic Algorithm for Multiobjective Optimization, Proceedings of the First IEEE Conference on Evolutionary Computation, 1, 82–87, 1994.
[4] E. Zitzler and L. Thiele: Multiobjective Evolutionary Algorithms: A Comparative Case Study and the Strength Pareto Approach, IEEE Transactions on Evolutionary Computation, 3(4), 257–271, 1999.
[5] E. Zitzler, M. Laumanns and L. Thiele: SPEA2: Improving the Strength Pareto Evolutionary Algorithm, TIK-Report 103, 1–21, 2001.
[6] K. Deb, A. Pratap, S. Agarwal and T. Meyarivan: A Fast and Elitist Multiobjective Genetic Algorithm: NSGA-II, IEEE Transactions on Evolutionary Computation, 6(2), 182–197, 2002.
[7] Q. Zhang and H. Li: MOEA/D: A Multiobjective Evolutionary Algorithm Based on Decomposition, IEEE Transactions on Evolutionary Computation, 11(6), 712–731, 2007.
[8] J. Ricart, G. Hüttemann, J. Lima and B. Barán: Multiobjective Harmony Search Algorithm Proposals, Electronic Notes in Theoretical Computer Science, 281, 51–67, 2011.
[9] S. Sivasubramani and K. S. Swarup: Multi-objective harmony search algorithm for optimal power flow problem, International Journal of Electrical Power & Energy Systems, 33(3), 745–752, 2011.
[10] L. M. Pavelski, C. P. Almeida and R. A. Gonçalves: Harmony Search for Multi-objective Optimization, 2012 Brazilian Symposium on Neural Networks, 220–225, 2012.
[11] L. Wang, Y. Mao, Q. Niu and M. Fei: A Multi-Objective Binary Harmony Search Algorithm, Lecture Notes in Computer Science, 6729, 74–81, 2011.
[12] X.-S. Yang: Harmony Search as a Metaheuristic Algorithm, Studies in Computational Intelligence, 191, 1–18, 2009.