Information — Article

Apple Image Recognition Multi-Objective Method Based on the Adaptive Harmony Search Algorithm with Simulation and Creation

Liqun Liu 1,* and Jiuyuan Huo 2

1 College of Information Science and Technology, Gansu Agricultural University, Lanzhou 730070, China
2 School of Electronic and Information Engineering, Lanzhou Jiaotong University, Lanzhou 730070, China; [email protected]
* Correspondence: [email protected]; Tel.: +86-0931-7631-865

Received: 29 June 2018; Accepted: 18 July 2018; Published: 20 July 2018


Abstract: Aiming at the poor recognition of apple images captured in natural scenes, and at the problems of the OTSU algorithm (a single threshold, lack of adaptability, susceptibility to noise interference, and over-segmentation), an apple image recognition multi-objective method based on the adaptive harmony search algorithm with simulation and creation is proposed in this paper. The new adaptive harmony search algorithm with simulation and creation expands the search space to maintain the diversity of solutions and accelerates the convergence of the algorithm. During the search, the harmony tone simulation operator makes each harmony tone evolve toward the optimal harmony individual to ensure the global search ability of the algorithm. When the evolution brings no improvement, the harmony tone creation operator moves each harmony tone away from the current optimal harmony individual, extending the search space to maintain the diversity of solutions. An adaptive factor of the harmony tone restrains the randomness of the two operators to accelerate convergence. The multi-objective optimization recognition method transforms the recognition of apple images collected in natural scenes into a multi-objective optimization problem and uses the new adaptive harmony search algorithm with simulation and creation as the image threshold search strategy. The maximum between-class variance and maximum entropy are chosen as the objective functions of the multi-objective optimization problem. Compared with the HS, IHS, GHS, and SGHS algorithms, the experimental results showed that the improved algorithm has higher convergence speed and accuracy, and maintains optimal performance with high-dimensional, large-scale harmony memories.
The proposed multi-objective optimization recognition method obtains a set of non-dominated threshold solutions, which makes the choice of threshold more flexible than in the OTSU algorithm. The selected thresholds have better adaptive characteristics and yield good image segmentation results.

Keywords: harmony search algorithm; adaptive factor; simulation operator; creation operator; multi-objective optimization; apple image recognition

1. Introduction

The harmony search (HS) algorithm [1] is a swarm intelligence algorithm that simulates how music players repeatedly tune the pitch of each instrument from memory to achieve a wonderful harmony. It has been successfully applied to many engineering problems. Like other swarm intelligence algorithms, however, the HS algorithm suffers from premature convergence and stagnant convergence [2,3]. In recent years, scholars have made many

Information 2018, 9, 180; doi:10.3390/info9070180



improvements to the optimization performance of the HS algorithm. In [2], the probabilistic idea of the estimation of distribution algorithm was applied to design an adaptive adjustment strategy and improve the search ability of the algorithm. In [3], the HS algorithm and the genetic algorithm (GA) were combined: a better initial solution is obtained from multiple iterations of the genetic algorithm, and the harmony search algorithm then searches further for possible solutions in the adjacent region. Zhu et al. linearly combined the optimal harmony vector with two harmony vectors randomly selected from the population to generate a new harmony and expand the local search area [4]. Wu et al. used a competitive elimination mechanism to update the harmony memory and accelerate the process of survival of the fittest [5]. Some studies redesigned the two key parameters of the HS algorithm to overcome the shortcomings of fixed parameters; these two parameters were also improved from different perspectives, and different improved harmony search algorithms were proposed in [6–8]. The studies in [9,10] applied the improved HS algorithm to multi-objective optimization problems. Relevant scholars have thus conducted a large number of basic studies on the HS algorithm and proposed improvements that achieve good results in application. Simulated annealing and a genetic algorithm were used to optimally adjust the weights of an aggregation mechanism [11]. An innovative tuning approach for fuzzy control systems (CSs) with reduced parametric sensitivity using the grey wolf optimizer (GWO) algorithm was proposed in [12]; there, the GWO algorithm solves optimization problems whose objective functions include the output sensitivity functions. Echo state networks (ESN) are a special form of recurrent neural networks (RNNs) that allow black-box modeling of nonlinear dynamical systems, and an HS-based approach with good performance was proposed for training an echo state network in an online manner [13]. The design of Takagi–Sugeno fuzzy controllers in state-feedback form using swarm intelligence optimization algorithms was proposed in [14], adopting particle swarm optimization, simulated annealing, and gravitational search algorithms.

In terms of image applications of HS, the following studies have been conducted. A novel approach to visual tracking called the harmony filter models the target as a color histogram and searches for the best estimated target location using the Bhattacharyya coefficient as a fitness metric [15]. In [16], a dynamic clustering algorithm based on the hybridization of harmony search (HS) and fuzzy c-means was presented to automatically segment MRI brain images in an intelligent manner. Improvements to HS for circle detection were proposed that enable the algorithm to robustly find multiple circles in larger data sets and still work on realistic images heavily corrupted by noisy edges [17]. A new approach to estimating the vanishing point using an HS algorithm was proposed in [18]. On the theoretical background of HS, the following studies have been conducted. To consider realistic design situations, a novel derivative for discrete design variables based on a harmony search algorithm was proposed in [19]. A simple mathematical analysis of the explorative search behavior of the HS algorithm is presented in [20]. The final designs of conventional methods and of metaheuristic-based structural optimization methods were compared in [21]. On the improvement of adaptive methods of HS, the following studies have been conducted.
An adaptive harmony search algorithm for structural optimization problems was proposed that adjusts these parameters automatically during the search for the most efficient optimization process [22]. A novel technique to eliminate tedious and experience-requiring parameter assignment was proposed for the harmony search algorithm in [23]. Pan et al. presented a self-adaptive global best harmony search (SGHS) algorithm for solving continuous optimization problems [24].

Recognition of apple images is one of the most basic aspects of computer-based apple grading [25]. Apple image recognition refers to the separation of apple fruit from branches, leaves, soil, and sky, i.e., image segmentation [26]. Threshold segmentation is an extremely important and widely used image segmentation method [27]. Among such methods, the maximum between-class variance method (OTSU) [28] and the maximum entropy method [29] are the two most commonly used. Both types of segmentation method have different degrees of


defects [30]. The classical OTSU algorithm cannot obtain an ideal segmentation effect for complex images of natural scenes [30]. Although the OTSU algorithm solves the problem of threshold selection, it lacks adaptability, is easily affected by noise interference and over-segmentation, and requires a great deal of computing time. Therefore, the classical OTSU algorithm still needs further improvement. In order to comprehensively utilize the advantages of the two segmentation methods, a multi-objective optimization formulation is introduced. In recent years, the HS algorithm has made some progress in solving multi-objective optimization problems, such as the stand-alone scheduling problem [10], the wireless sensor network routing problem [31], the urban one-way traffic organization optimization problem [32], and the distribution network model problem [33]. In [34], a phenomenon-mimicking algorithm, harmony search, was employed to perform a bi-objective trade-off; the harmony search algorithm explored only a small fraction of the total solution space to solve this combinatorial optimization problem. A multi-objective optimization model based on the harmony search algorithm (HS) was presented to minimize the life cycle cost (LCC) and carbon dioxide equivalent (CO2-eq) emissions of buildings [35]. A novel fuzzy-based velocity reliability index using fuzzy theory and harmony search was proposed, to be maximized while the design cost is simultaneously minimized [36]. These studies are closely integrated with specific application issues and cannot easily be extended to other application areas.

This paper proposes an apple image recognition multi-objective method based on the adaptive harmony search algorithm with simulation and creation (ARMM-AHS). This method transforms the image segmentation problem of natural scenes into a multi-objective optimization problem.
It takes the maximum between-class variance and maximum entropy as the two objective functions of the optimization problem. A new adaptive harmony search algorithm with simulation and creation (SC-AHS) is proposed as the strategy for the image threshold search to achieve natural-scene apple image segmentation. The results showed that the SC-AHS algorithm improves the global search ability and solution diversity of the HS algorithm, and has an obvious advantage in convergence speed. The ARMM-AHS method can effectively segment the apple image.

2. Theory Basis

2.1. Threshold Segmentation Principle of the OTSU Algorithm

The OTSU algorithm is a global threshold selection method. Its basic idea is to use the grayscale histogram of the image to dynamically determine the segmentation threshold that maximizes the variance between target and background. Suppose that the gray level of an apple image is L, so the gray value of each pixel is 0, 1, · · · , L − 1; the number of pixels with gray value i is Wi; the total number of pixels in the image is W = ∑_{i=0}^{L−1} Wi; and the probability of gray value i is pi = Wi/W. Let the threshold t divide the image into two classes C0 and C1 (target and background), i.e., C0 and C1 correspond to the pixels with gray levels {0, 1, · · · , t} and {t + 1, t + 2, · · · , L − 1}, respectively. The variance between the two classes is then given by Equation (1):

σB² = ω0(µ0 − µT)² + ω1(µ1 − µT)²    (1)

The optimal threshold t* should maximize the variance between classes, which is an optimization problem, as shown in Equation (2):

t* = arg max_{0 ≤ t ≤ L−1} {σB²(t)}    (2)

Here, ω0 and ω1 represent the probabilities of occurrence of classes C0 and C1, respectively; µ0 and µ1 represent the average gray values of classes C0 and C1, respectively; and µT represents the average gray level of the entire image. The larger the value of σB², the better the selected threshold.
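The exhaustive search of Equations (1) and (2) can be computed directly from the grayscale histogram. A minimal sketch (function and variable names are ours, not from the paper):

```python
import numpy as np

def otsu_threshold(hist):
    """Exhaustive OTSU search: maximize the between-class variance of
    Equations (1)-(2). hist holds the pixel counts per gray level."""
    p = hist / hist.sum()                     # gray-level probabilities p_i
    levels = np.arange(len(p))
    mu_T = (levels * p).sum()                 # mean gray level of the image
    best_t, best_var = 0, -1.0
    for t in range(len(p) - 1):
        w0 = p[:t + 1].sum()                  # omega_0, probability of C0
        w1 = 1.0 - w0                         # omega_1, probability of C1
        if w0 == 0 or w1 == 0:
            continue                          # degenerate split, skip
        mu0 = (levels[:t + 1] * p[:t + 1]).sum() / w0
        mu1 = (levels[t + 1:] * p[t + 1:]).sum() / w1
        var = w0 * (mu0 - mu_T) ** 2 + w1 * (mu1 - mu_T) ** 2
        if var > best_var:
            best_t, best_var = t, var
    return best_t, float(best_var)
```

On a bimodal histogram the maximizer lies between the two modes; since σB²(t) is unchanged wherever the split between modes is unchanged, the first maximizer is returned.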


2.2. Maximum Entropy Segmentation Principle

For the same image, assume that the threshold t divides the image into two classes C0 and C1. The probability distribution of class C0 is {p0/Pt, p1/Pt, · · · , pt/Pt}, and the probability distribution of class C1 is {p_{t+1}/(1 − Pt), p_{t+2}/(1 − Pt), · · · , p_{L−1}/(1 − Pt)}, where Pt = ∑_{i=0}^{t} pi, 0 ≤ t ≤ L − 1. The entropy is given by Equation (3):

H(t) = − ∑_{i=0}^{t} (pi/Pt) ln(pi/Pt) − ∑_{i=t+1}^{L−1} (pi/(1 − Pt)) ln(pi/(1 − Pt))    (3)

The optimal threshold should maximize the entropy, which is also an optimization problem, as shown in Equation (4):

t* = arg max_{0 ≤ t ≤ L−1} {H(t)}    (4)
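Equations (3) and (4) admit the same exhaustive search over t as the OTSU criterion. A minimal sketch (illustrative names; zero-probability bins are skipped so the logarithm stays defined):

```python
import numpy as np

def max_entropy_threshold(hist):
    """Exhaustive maximum-entropy search: maximize H(t) of Eqs. (3)-(4).
    hist holds the pixel counts per gray level."""
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(len(p) - 1):
        Pt = p[:t + 1].sum()                  # P_t = sum_{i<=t} p_i
        if Pt == 0 or Pt == 1:
            continue                          # degenerate split, skip
        q0 = p[:t + 1] / Pt                   # class C0 distribution
        q1 = p[t + 1:] / (1 - Pt)             # class C1 distribution
        h = -(q0[q0 > 0] * np.log(q0[q0 > 0])).sum() \
            - (q1[q1 > 0] * np.log(q1[q1 > 0])).sum()
        if h > best_h:
            best_t, best_h = t, h
    return best_t, float(best_h)
```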

3. Adaptive Harmony Search Algorithm with Simulation and Creation

Inertial particles and fuzzy reasoning were introduced into the bird swarm optimization algorithm (BSA) in [37] to make foraging birds jump out of local optimal solutions and enhance the global optimization ability. The fast shuffled frog leaping algorithm (FSFLA) proposed in [38] makes each frog individual evolve toward the best individual, which accelerates individual evolution and improves the search speed of the algorithm. Following the no-free-lunch theorem [39], this paper fuses the mechanisms of these two different types of optimization algorithm and proposes an adaptive harmony search algorithm with simulation and creation (SC-AHS) to improve the optimization performance of the HS algorithm.

3.1. Theory Basis

3.1.1. Adaptive Factor of Harmony Tone

Definition 1 (Adaptive factor of harmony tone). Randomly generate a harmony memory of size HMS and, in each creation of the harmony memory, find the global optimal harmony x_i^g = (x_1^g, x_2^g, · · · , x_N^g), the current optimal harmony x_i^now = (x_1^now, x_2^now, · · · , x_N^now), and the global worst harmony x_i^w = (x_1^w, x_2^w, · · · , x_N^w). The adaptive factor of the harmony tone ξ_i, used to suppress randomness during the simulation and creation of harmony tones, is defined as:

ξ_i = (∑_{i=1}^{N} r_i x_i^now) / (∑_{i=1}^{N} r_i), i = 1, 2, · · · , N
r_i = (f(x_i^g) − f(x_i^w)) / f(x_i^now)    (5)

3.1.2. Harmony Tone Simulation Operator

Definition 2 (Harmony tone simulation operator). In one harmony evolution, each tone component x_i^j, i = 1, 2, · · · , N of each harmony x^j, j = 1, 2, · · · , HMS in the harmony memory adaptively evolves through the harmony tone simulation operator, as follows, adjusting the harmony individuals to evolve toward the optimal harmony individual:

x_i^{j,new} = x_i^{j,old} + ξ_i (x_i^g − x_i^{j,old})    (6)


3.1.3. Harmony Tone Creation Operator

Definition 3 (Harmony tone creation operator). In one harmony evolution, each tone component x_i^j, i = 1, 2, · · · , N of each harmony x^j, j = 1, 2, · · · , HMS in the harmony memory adaptively evolves through the harmony tone creation operator, as follows, adjusting the harmony individuals away from the current optimal harmony individual:

x_i^{j,new} = x_i^{j,old} − ξ_i (x_i^now − x_i^{j,old})    (7)

3.2. Adaptive Evolution Theorem with Simulation and Creation

Theorem 1 (Adaptive evolution theorem with simulation and creation). In one harmony evolution, each tone component x_i^j, i = 1, 2, · · · , N of each harmony x^j, j = 1, 2, · · · , HMS in the harmony memory adaptively evolves through the harmony tone simulation operator according to Equation (6). If f(x^{j,new}) < f(x^{j,old}), replace x_i^{j,old} with x_i^{j,new}. Otherwise, it evolves through the harmony tone creation operator according to Equation (7). If f(x^{j,new}) ≥ f(x^{j,old}) still holds, use Equation (8) to randomly generate the harmony tone:

x_i^{j,new} = LB_i + r (UB_i − LB_i)    (8)

3.3. SC-AHS Algorithm Process

Algorithm 1. SC-AHS
Input: the maximum number of evolutions Tmax, the size of the harmony memory HMS, the number of tone components N.
Output: the global optimal harmony x^g.
Step 1: Randomly generate HMS initial harmony individuals, marked as X_HMS = {x^j | x^j = (x_1^j, x_2^j, · · · , x_N^j), j = 1, 2, · · · , HMS}, to initialize the harmony memory. The fitness of x^j is f(x^j).
Step 2: Each tone component x_i^j, i = 1, 2, · · · , N of each harmony x^j, j = 1, 2, · · · , HMS is updated.
Begin
  Sort the harmonies by fitness f(x^j) in descending order to determine x_i^g, x_i^now, and x_i^w.
  For (x^j, j = 1, 2, · · · , HMS)
  Begin
    Calculate ξ_i according to Equation (5)
    For (x_i^j, i = 1, 2, · · · , N)
      Update x_i^{j,old} according to Equation (6) to perform harmony tone simulation
    End
    If (f(x^{j,new}) < f(x^{j,old}))
      For (x_i^j, i = 1, 2, · · · , N)
        x_i^{j,old} = x_i^{j,new}
      End
    Else
      For (x_i^j, i = 1, 2, · · · , N)
        Update x_i^{j,old} according to Equation (7) to perform harmony tone creation
      End
      If (f(x^{j,new}) < f(x^{j,old}))
        For (x_i^j, i = 1, 2, · · · , N)
          x_i^{j,old} = x_i^{j,new}
        End
      Else
        For (x_i^j, i = 1, 2, · · · , N)
          Randomly generate the harmony tone x_i^{j,new} according to Equation (8)
        End
      End
  End
End
Step 3: Determine whether the number of evolutions has reached Tmax. If not, go to Step 2. Otherwise, the algorithm stops and outputs x^g.
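For a minimization problem, the loop of Algorithm 1 can be sketched as follows. The adaptive factor of Equation (5) is simplified here to a fixed scalar `xi`, and we additionally keep a best-so-far harmony (a simple elitism tweak of ours, not part of Algorithm 1) so that random regeneration cannot discard it; all names are illustrative:

```python
import numpy as np

def sc_ahs(f, lb, ub, hms=20, n=10, t_max=1000, xi=0.5, seed=0):
    """Sketch of the SC-AHS loop of Algorithm 1, minimizing f.

    Per harmony: simulation (Eq. 6); if no improvement, creation
    (Eq. 7); if still no improvement, random regeneration (Eq. 8).
    """
    rng = np.random.default_rng(seed)
    hm = lb + rng.random((hms, n)) * (ub - lb)        # initial harmony memory
    fit = np.array([f(x) for x in hm])
    g_best = hm[np.argmin(fit)].copy()                # global optimal harmony x^g
    g_val = float(fit.min())
    for _ in range(t_max):
        x_now = hm[np.argmin(fit)].copy()             # current optimal harmony x^now
        for j in range(hms):
            trial = hm[j] + xi * (g_best - hm[j])     # simulation operator, Eq. (6)
            if f(trial) >= fit[j]:
                # no improvement: creation operator, Eq. (7), clipped to bounds
                trial = np.clip(hm[j] - xi * (x_now - hm[j]), lb, ub)
            if f(trial) < fit[j]:
                hm[j], fit[j] = trial, f(trial)
            else:                                     # random regeneration, Eq. (8)
                hm[j] = lb + rng.random(n) * (ub - lb)
                fit[j] = f(hm[j])
            if fit[j] < g_val:                        # elitism bookkeeping (ours)
                g_best, g_val = hm[j].copy(), float(fit[j])
    return g_best, g_val
```

On the Sphere function, for example, `sc_ahs(lambda x: float((x ** 2).sum()), -5.0, 5.0)` drives the best fitness toward 0.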

3.4. Efficiency Analysis of SC-AHS

The efficiency analysis of the algorithm shows that the time complexity of the HS algorithm is O(Tmax × N) and its space complexity is O(HMS × N), while the time complexity of the SC-AHS algorithm is O(Tmax × N × HMS) and its space complexity is O(HMS × N). Since Tmax ≫ HMS, the execution time and storage space of the SC-AHS algorithm and the HS algorithm are of the same order of magnitude, so the SC-AHS algorithm does not add significant computational overhead or space.

4. Apple Image Recognition Multi-Objective Method Based on an Adaptive Harmony Search Algorithm with Simulation and Creation

4.1. Mathematical Problem Description of the Multi-Objective Optimization Recognition Method

Suppose that the gray level of an apple image is L and the gray value of each pixel is 0, 1, · · · , L − 1. There are two classes to be distinguished in the apple image, and the candidate harmony threshold is written as x^j, j = 1, 2, · · · , HMS, 0 ≤ x^j ≤ L − 1. The optimization process of seeking thresholds for segmenting apple images can be regarded as a multi-objective optimization problem, in which the maximum between-class variance and maximum entropy are taken as the two objective functions and the proposed SC-AHS algorithm is used as the image threshold search strategy. Taking minimization as the canonical form, the multi-objective optimization problem can be defined as follows:

min f(x) = [f1(x), f2(x)]
s.t. x^j = (x_1^j, x_2^j, · · · , x_N^j) ∈ X ⊆ R^N, N = 8
X = {x^j, j = 1, 2, · · · , HMS}, 0 ≤ x^j ≤ L − 1
f1(x*) = min_{0 ≤ x^j ≤ L−1} (−σB²(x^j))
f2(x*) = min_{0 ≤ x^j ≤ L−1} (−H(x^j))    (9)
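Under Equation (9), each candidate threshold is scored by the two negated objectives so that both are minimized jointly. A minimal sketch that evaluates a single threshold t against the gray-level histogram (names are ours):

```python
import numpy as np

def objectives(hist, t):
    """Evaluate threshold t under Eq. (9): return (f1, f2) =
    (-sigma_B^2(t), -H(t)), so that both objectives are minimized."""
    p = hist / hist.sum()
    levels = np.arange(len(p))
    w0 = p[:t + 1].sum()                      # P_t, equal to omega_0
    w1 = 1.0 - w0
    if w0 == 0 or w1 == 0:
        return np.inf, np.inf                 # degenerate split: worst score
    mu_T = (levels * p).sum()
    mu0 = (levels[:t + 1] * p[:t + 1]).sum() / w0
    mu1 = (levels[t + 1:] * p[t + 1:]).sum() / w1
    f1 = -(w0 * (mu0 - mu_T) ** 2 + w1 * (mu1 - mu_T) ** 2)
    q0 = p[:t + 1] / w0                       # class C0 distribution
    q1 = p[t + 1:] / w1                       # class C1 distribution
    h = -(q0[q0 > 0] * np.log(q0[q0 > 0])).sum() \
        - (q1[q1 > 0] * np.log(q1[q1 > 0])).sum()
    return float(f1), float(-h)
```

A threshold between well-separated modes scores lower (better) on both components than a threshold inside a mode.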

4.2. Process of the Apple Image Recognition Multi-Objective Method Based on an Adaptive Harmony Search Algorithm with Simulation and Creation

Step 1: Read and preprocess the apple image. Count the gray level L of the image and the number of corresponding pixels Wi to calculate the probability pi of the occurrence of each gray value.
Step 2: Enter the maximum number of evolutions Tmax, the size of the harmony memory HMS, and the number of tone components N.
Step 2.1: Generate HMS initial harmony thresholds randomly, denoted as X = {x^j | x^j = (x_1^j, x_2^j, · · · , x_N^j), j = 1, 2, · · · , HMS}. Each harmony threshold is binary coded, and f1(X) and f2(X) are calculated according to Equation (9) to initialize the harmony memory. The fitness of the harmony threshold x^j is given by f1(x^j) and f2(x^j).
Step 2.2: Construct a Pareto optimal solution set, and denote the harmony threshold deleted from the optimal solution set as x^{w,old}.
Step 2.3: Each tone component x_i^{w,old}, i = 1, 2, · · · , N of the harmony threshold x^{w,old} is updated according to the SC-AHS algorithm steps.
Step 2.4: Determine whether the number of evolutions has reached Tmax. If not, go to Step 2.2. Otherwise, the algorithm stops and the optimal harmony threshold solution set, denoted x*, is output.
Step 3: Use the optimal harmony threshold solution set x* to divide the apple image and output the segmented image.

The process of the apple image recognition multi-objective method based on an adaptive harmony search algorithm with simulation and creation is shown in Figure 1.

Figure 1. Flowchart of the apple image recognition multi-objective method based on an adaptive harmony search algorithm with simulation and creation.
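Step 2.2 above relies on a non-domination filter over (f1, f2) pairs. A minimal sketch of such a filter (the function name is ours; the paper does not spell out this routine):

```python
def pareto_front(points):
    """Keep only non-dominated (f1, f2) pairs: a point is dominated
    if another point is no worse in both objectives and differs,
    i.e., is strictly better in at least one objective."""
    front = []
    for i, p in enumerate(points):
        dominated = any(
            q[0] <= p[0] and q[1] <= p[1] and q != p
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(p)
    return front
```

The thresholds whose objective pairs survive this filter form the non-dominated threshold solution set from which the final segmentation threshold is selected.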


5. Experiment Design and Results Analysis

5.1. Experiment Design

The experimental samples were mature HuaNiu apples. A series of mature HuaNiu apple image segmentation experiments were conducted under six natural-scene conditions: direct sunlight with strong, medium, and weak illumination, and backlighting with strong, medium, and weak illumination. The experiment contains two parts. First, benchmark functions are used to test the optimization performance of the SC-AHS algorithm. Second, the ARMM-AHS method is used to segment and identify the apple images. The experiments ran on a Lenovo (Beijing, China) notebook computer with an Intel Core i5 CPU at 2.6 GHz and 4.0 GB of memory. The test platform was Microsoft Visual C++ 6.0 and Matlab 2012.

5.2. Benchmark Function Comparison Experiment and Result Analysis

Five benchmark functions, Sphere, Rastrigrin, Griewank, Ackley, and Rosenbrock, were used to simulate the individual fitness of harmonies. The parameters of each function are shown in Table 1 [40].

f1(x) = ∑_{i=1}^{N} x_i²    (10)

f2(x) = ∑_{i=1}^{N} [x_i² − 10 cos(2πx_i) + 10]    (11)

f3(x) = (1/4000) ∑_{i=1}^{N} x_i² − ∏_{i=1}^{N} cos(x_i/√i) + 1    (12)

f4(x) = −20 exp(−0.2 √((1/N) ∑_{i=1}^{N} x_i²)) − exp((1/N) ∑_{i=1}^{N} cos(2πx_i)) + 20 + e    (13)

f5(x) = ∑_{i=1}^{N−1} [100(x_{i+1} − x_i²)² + (1 − x_i)²]    (14)
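The five benchmarks of Equations (10)–(14) translate directly into vectorized form (a sketch; note the Rosenbrock sum runs to N − 1 so that x_{i+1} is defined):

```python
import numpy as np

def sphere(x):                                   # Eq. (10)
    return float((x ** 2).sum())

def rastrigin(x):                                # Eq. (11)
    return float((x ** 2 - 10 * np.cos(2 * np.pi * x) + 10).sum())

def griewank(x):                                 # Eq. (12)
    i = np.arange(1, len(x) + 1)
    return float((x ** 2).sum() / 4000 - np.prod(np.cos(x / np.sqrt(i))) + 1)

def ackley(x):                                   # Eq. (13)
    n = len(x)
    return float(-20 * np.exp(-0.2 * np.sqrt((x ** 2).sum() / n))
                 - np.exp(np.cos(2 * np.pi * x).sum() / n) + 20 + np.e)

def rosenbrock(x):                               # Eq. (14)
    return float((100 * (x[1:] - x[:-1] ** 2) ** 2 + (1 - x[:-1]) ** 2).sum())
```

All five reach their theoretical optimum of 0, at x = (0, · · · , 0) except Rosenbrock, whose optimum is at x = (1, · · · , 1).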

Table 1. Parameters of five benchmark functions.

| Function | Name | Peak Value | Dimension | Search Scope | Theoretical Optimal Value | Target Accuracy |
| f1 | Sphere | unimodal | 30 | [−100, 100] | 0 | 10⁻¹⁵ |
| f2 | Rastrigrin | multimodal | 30 | [−5.12, 5.12] | 0 | 10⁻² |
| f3 | Griewank | multimodal | 30 | [−600, 600] | 0 | 10⁻² |
| f4 | Ackley | multimodal | 30 | [−32, 32] | 0 | 10⁻⁶ |
| f5 | Rosenbrock | unimodal | 30 | [−30, 30] | 0 | 10⁻¹⁵ |

5.2.1. Fixed Parameters Experiment

The fixed parameters experiment uses minimum-value optimization to compare the optimization performance of the HS, Improved Harmony Search (IHS) [7], Global-best Harmony Search (GHS) [41], Self-adaptive Global best Harmony Search (SGHS) [24], and SC-AHS algorithms. The experimental parameters were set as follows: HMS = 200, HMCR = 0.9, PAR = 0.3, BW = 0.01, Tmax = 20,000, N = 30, where HMCR is the harmony memory considering rate, PAR is the pitch adjusting rate, BW is the pitch adjusting bandwidth, and Tmax is the number of evolutions. In the IHS algorithm, the minimum pitch adjusting rate PARmin = 0.01, the maximum pitch adjusting rate PARmax = 0.99, the minimum bandwidth BWmin = 0.0001, and the maximum bandwidth BWmax = 1/(20 × (UB − LB)) [7]. In the GHS algorithm, PARmin = 0.01 and PARmax = 0.99 [41]. In the SGHS algorithm, the initial mean of the harmony memory considering rate is HMCRm = 0.98 with standard deviation HMCRs = 0.01, HMCR being normally distributed within [0.9, 1.0]; the initial mean of the pitch adjusting rate is PARm = 0.9 with standard deviation PARs = 0.05, PAR being normally distributed within [0.0, 1.0]; the learning period LP = 100 [24]; and BWmin = 0.0001 and BWmax = 1/(20 × (UB − LB)) [7]. Upper Bound (UB) and Lower Bound (LB) are the maximum and minimum values of the algorithm's search range. In the SC-AHS algorithm, αinit = 0.1 and αfinal = 1.2. The final test results were averaged over 30 independent runs.

The performance of the algorithms is evaluated by: (1) analyzing the convergence accuracy and speed under a fixed number of evolutions; and (2) comparing the evolution curves of the average optimal value of each function. The convergence accuracy of each algorithm is shown in Table 2. For the unimodal Sphere (f1) and Rosenbrock (f5) functions, the data in Table 2 show that the SC-AHS algorithm reached the theoretical minimum of 0 on both functions, indicating very high convergence accuracy; the convergence speed and accuracy of the other algorithms are lower than those of the SC-AHS algorithm. For the multimodal Rastrigrin (f2) function, Table 2 shows that the SC-AHS algorithm reached the theoretical minimum of 0, again indicating high convergence accuracy, while the other algorithms did not reach the theoretical minimum.
For the multimodal Griewank (f3) and Ackley (f4) functions, the data in Table 2 show that the GHS algorithm reached the theoretical minimum value of 0 within 5000 iterations on the Griewank (f3) function, and the convergence accuracy of the GHS algorithm on the Ackley (f4) function is also better than that of the other algorithms. For these two functions, the SC-AHS algorithm converges faster than, and with better accuracy than, all the other algorithms except GHS.

Table 2. Comparison of the results of fixed global evolution times.

| Algorithm | Optimization Performance | f1 | f2 | f3 | f4 | f5 |
| HS | Optimum | 5.07 × 10² | 1.76 × 10¹ | 4.47 × 10⁰ | 5.18 × 10⁰ | 1.13 × 10⁴ |
| HS | Standard deviation | 1.36 × 10⁰ | 9.17 × 10⁻² | 4.89 × 10⁻³ | 1.88 × 10⁻² | 3.06 × 10² |
| IHS | Optimum | 3.80 × 10² | 2.83 × 10¹ | 7.62 × 10⁰ | 6.10 × 10⁰ | 4.49 × 10⁴ |
| IHS | Standard deviation | 6.12 × 10⁻³ | 6.81 × 10⁻¹ | 8.60 × 10⁻⁶ | 1.76 × 10⁻³ | 2.26 × 10¹ |
| GHS | Optimum | 1.20 × 10² | 3.62 × 10⁻⁴ | 0.00 × 10⁰ | 5.89 × 10⁻¹⁶ | 3.00 × 10¹ |
| GHS | Standard deviation | 9.19 × 10⁻² | 4.70 × 10⁻² | 6.11 × 10⁻⁴ | 4.90 × 10⁻²⁹ | 3.33 × 10⁰ |
| SGHS | Optimum | 2.47 × 10² | 2.98 × 10¹ | 5.92 × 10⁰ | 5.62 × 10⁰ | 1.89 × 10⁴ |
| SGHS | Standard deviation | 8.50 × 10⁻¹² | 1.97 × 10⁻¹ | 2.67 × 10⁻⁵ | 6.89 × 10⁻⁴ | 1.27 × 10¹ |
| SC-AHS | Optimum | 0.00 × 10⁰ | 0.00 × 10⁰ | 1.97 × 10⁻² | 3.27 × 10⁻⁶ | 0.00 × 10⁰ |
| SC-AHS | Standard deviation | 0.00 × 10⁰ | 4.27 × 10⁻¹ | 1.02 × 10⁻¹⁵ | 2.72 × 10⁻⁵ | 0.00 × 10⁰ |
Comparing the minimum values of the HS, IHS [7], GHS [41], SGHS [24], and SC-AHS algorithms with fixed parameters shows that, for the unimodal Sphere (f1) and Rosenbrock (f5) functions and the multimodal Rastrigrin (f2) function, the SC-AHS algorithm performs better than the other algorithms. For the multimodal Griewank (f3) and Ackley (f4) functions, the SC-AHS algorithm converges faster than the other algorithms and, with the exception of the GHS algorithm, has better convergence accuracy. For every function, the convergence accuracy and speed of the SC-AHS algorithm are significantly improved compared with the HS algorithm. Compared with the other algorithms, the SC-AHS algorithm has the advantage of fast convergence.


5.2.2. Dynamic Parameters Experiment

The dynamic parameters experiment uses two test methods: varying the number of harmony tone components, and varying the harmony memory size HMS. Under dynamic parameters, the optimization performance of the HS, IHS [7], GHS [41], SGHS [24], and SC-AHS algorithms is compared. The average optimal value of each function, the relative rate of change [42], and the average change were used as performance measures.

Test of Average Optimum with Different Harmony Tone Components of Five Functions

The number of harmony tone components corresponds to the dimension of the function. High-dimensional, multimodal, complex optimization problems can easily cause local optima and premature convergence. It is therefore necessary to adjust the dimension of the benchmark functions (i.e., the number of tone components) and test the optimization performance of each algorithm in high dimensions. The number of tone components ranges from 50 to 100, and the other parameters are unchanged. The effect of the number of tone components on the optimization performance of each algorithm is shown in Table 3, and the average optimal value of each function as the number of tone components varies is shown in Figures 2–6. In Table 3:

Relative rate of change = |average optimal value with 50 tone components − average optimal value with 100 tone components| / (average optimal value with 50 tone components) × 100%    (15)

Average change = (∑_{i=1}^{6} average optimal value at the i-th number of tone components) / 6    (16)

It can be seen that for the unimodal Sphere (f1) and Rosenbrock (f5) functions, the average optimal value of each algorithm tends to increase as the number of tone components increases, whereas the SC-AHS algorithm shows a declining trend, and its relative rate of change in Table 3 is also the lowest. For the multimodal Rastrigrin (f2), Griewank (f3), and Ackley (f4) functions, the average optimal value of the SC-AHS algorithm varies unstably, but its average change in Table 3 is the lowest among the algorithms. Therefore, the convergence accuracy of the SC-AHS algorithm does not decrease significantly as the number of tone components increases; that is, the algorithm retains good convergence performance in high-dimensional cases.
Table 3. Comparison of the optimal performance with different harmony tone components of five functions.

Functions | N | Average Optimal Value of HS | Average Optimal Value of IHS | Average Optimal Value of GHS | Average Optimal Value of SGHS | Average Optimal Value of SC-AHS
f1 | 50 | 1.29 × 10^3 | 9.48 × 10^2 | 2.00 × 10^2 | 1.17 × 10^3 | 9.62 × 10^−19
f1 | 60 | 1.35 × 10^3 | 1.96 × 10^3 | 2.40 × 10^2 | 1.09 × 10^3 | 2.49 × 10^−21
f1 | 70 | 1.63 × 10^3 | 1.92 × 10^3 | 1.12 × 10^3 | 3.59 × 10^3 | 3.34 × 10^−17
f1 | 80 | 2.67 × 10^3 | 2.82 × 10^3 | 3.20 × 10^2 | 1.82 × 10^4 | 1.00 × 10^−18
f1 | 90 | 4.04 × 10^3 | 4.49 × 10^3 | 3.20 × 10^1 | 2.33 × 10^4 | 5.78 × 10^−17
f1 | 100 | 5.83 × 10^3 | 6.08 × 10^3 | 3.60 × 10^3 | 3.59 × 10^4 | 2.57 × 10^−25
f1 | Relative rate of change/% | 3.52 × 10^2 | 5.42 × 10^2 | 1.70 × 10^3 | 2.97 × 10^3 | 1.00 × 10^2
f1 | Average change | 2.80 × 10^3 | 3.04 × 10^3 | 9.19 × 10^2 | 1.39 × 10^4 | 1.55 × 10^−17
f2 | 50 | 3.50 × 10^1 | 5.73 × 10^1 | 5.81 × 10^1 | 7.38 × 10^2 | 1.29 × 10^1
f2 | 60 | 4.76 × 10^1 | 7.81 × 10^1 | 4.36 × 10^0 | 7.09 × 10^2 | 1.69 × 10^1
f2 | 70 | 6.69 × 10^1 | 1.18 × 10^2 | 7.24 × 10^0 | 9.06 × 10^2 | 2.29 × 10^1
f2 | 80 | 6.75 × 10^1 | 1.13 × 10^2 | 9.30 × 10^1 | 1.05 × 10^2 | 3.78 × 10^1
f2 | 90 | 1.09 × 10^2 | 1.47 × 10^2 | 2.01 × 10^0 | 1.19 × 10^3 | 4.38 × 10^1
f2 | 100 | 1.13 × 10^2 | 1.75 × 10^2 | 2.01 × 10^0 | 1.62 × 10^2 | 6.07 × 10^1
f2 | Relative rate of change/% | 2.22 × 10^2 | 2.06 × 10^2 | 9.65 × 10^1 | 7.80 × 10^1 | 3.69 × 10^2
f2 | Average change | 7.31 × 10^1 | 1.15 × 10^2 | 2.78 × 10^1 | 6.35 × 10^2 | 3.25 × 10^1
f3 | 50 | 1.03 × 10^1 | 8.13 × 10^0 | 2.80 × 10^0 | 1.03 × 10^3 | 4.93 × 10^−3
f3 | 60 | 1.59 × 10^1 | 1.66 × 10^1 | 1.92 × 10^−1 | 2.00 × 10^2 | 9.86 × 10^−3
f3 | 70 | 1.77 × 10^1 | 1.99 × 10^1 | 3.52 × 10^0 | 1.66 × 10^2 | 1.11 × 10^−16
f3 | 80 | 2.94 × 10^1 | 3.22 × 10^1 | 3.88 × 10^0 | 1.52 × 10^1 | 1.23 × 10^−2
f3 | 90 | 4.81 × 10^1 | 3.76 × 10^1 | 1.40 × 10^1 | 2.78 × 10^2 | 4.93 × 10^−3
f3 | 100 | 4.67 × 10^1 | 4.96 × 10^1 | 3.34 × 10^1 | 3.52 × 10^1 | 1.72 × 10^−2
f3 | Relative rate of change/% | 3.56 × 10^2 | 5.10 × 10^2 | 1.09 × 10^3 | 9.66 × 10^1 | 2.49 × 10^2
f3 | Average change | 2.80 × 10^1 | 2.73 × 10^1 | 9.63 × 10^0 | 2.88 × 10^2 | 8.21 × 10^−3
f4 | 50 | 1.26 × 10^1 | 1.22 × 10^1 | 1.12 × 10^1 | 8.42 × 10^0 | 7.10 × 10^−5
f4 | 60 | 1.31 × 10^1 | 1.25 × 10^1 | 1.44 × 10^0 | 1.07 × 10^1 | 7.73 × 10^−6
f4 | 70 | 1.11 × 10^1 | 1.25 × 10^1 | 3.87 × 10^0 | 1.61 × 10^1 | 8.51 × 10^0
f4 | 80 | 8.45 × 10^0 | 8.93 × 10^0 | 2.11 × 10^0 | 2.00 × 10^1 | 1.88 × 10^1
f4 | 90 | 2.79 × 10^0 | 3.42 × 10^0 | 7.55 × 10^−1 | 2.03 × 10^1 | 1.56 × 10^1
f4 | 100 | 6.41 × 10^−1 | 1.96 × 10^−1 | 4.75 × 10^−2 | 2.05 × 10^1 | 2.08 × 10^1
f4 | Relative rate of change/% | 9.49 × 10^1 | 9.84 × 10^1 | 9.96 × 10^1 | 1.43 × 10^2 | 2.93 × 10^7
f4 | Average change | 8.11 × 10^0 | 8.30 × 10^0 | 3.24 × 10^0 | 1.60 × 10^1 | 1.06 × 10^1
f5 | 50 | 2.62 × 10^5 | 1.53 × 10^5 | 3.03 × 10^2 | 4.33 × 10^8 | 1.00 × 10^−20
f5 | 60 | 5.24 × 10^5 | 7.42 × 10^5 | 5.50 × 10^2 | 1.16 × 10^7 | 1.00 × 10^−20
f5 | 70 | 4.95 × 10^5 | 9.88 × 10^5 | 1.09 × 10^3 | 1.08 × 10^7 | 1.00 × 10^−20
f5 | 80 | 1.26 × 10^6 | 1.14 × 10^6 | 8.46 × 10^4 | 2.29 × 10^5 | 1.00 × 10^−20
f5 | 90 | 2.04 × 10^6 | 1.15 × 10^6 | 7.97 × 10^5 | 3.56 × 10^7 | 1.00 × 10^−20
f5 | 100 | 3.37 × 10^6 | 1.46 × 10^6 | 1.00 × 10^2 | 2.30 × 10^5 | 1.00 × 10^−20
f5 | Relative rate of change/% | 1.19 × 10^3 | 8.55 × 10^2 | 6.70 × 10^1 | 9.99 × 10^1 | 0.00 × 10^0
f5 | Average change | 1.32 × 10^6 | 9.39 × 10^5 | 1.47 × 10^5 | 8.18 × 10^7 | 1.00 × 10^−20

Information 2018, 9, 180
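For reference, the five benchmark functions f1–f5 compared in Tables 3 and 4 can be written in their standard forms. This is a sketch: the exact search domains and any shifts used in the paper are specified earlier in the paper and are assumed here to follow the usual conventions.

```python
import math

def sphere(x):      # f1: single-peak, minimum 0 at the origin
    return sum(v * v for v in x)

def rastrigin(x):   # f2: multi-peak, minimum 0 at the origin
    return sum(v * v - 10 * math.cos(2 * math.pi * v) + 10 for v in x)

def griewank(x):    # f3: multi-peak, minimum 0 at the origin
    s = sum(v * v for v in x) / 4000.0
    p = math.prod(math.cos(v / math.sqrt(i + 1)) for i, v in enumerate(x))
    return 1.0 + s - p

def ackley(x):      # f4: multi-peak, minimum 0 at the origin
    n = len(x)
    a = -20 * math.exp(-0.2 * math.sqrt(sum(v * v for v in x) / n))
    b = -math.exp(sum(math.cos(2 * math.pi * v) for v in x) / n)
    return a + b + 20 + math.e

def rosenbrock(x):  # f5: single-peak valley, minimum 0 at x = (1, ..., 1)
    return sum(100 * (x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1) ** 2
               for i in range(len(x) - 1))
```

The number of tone components in the experiment corresponds to the length of the input vector `x`, so N = 50 to 100 means optimizing these functions in 50 to 100 dimensions.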

Figure 2. Comparison of the average optimum with different harmony tone components of the Sphere function.

Figure 3. Comparison of the average optimum with different harmony tone components of the Rastrigrin function.

Figure 4. Comparison of the average optimum with different harmony tone components of the Griewank function.

Figure 5. Comparison of the average optimum with different harmony tone components of the Ackley function.

Figure 6. Comparison of the average optimum with different harmony tone components of the Rosenbrock function.

Test of the Average Optimum with Different Harmony Memory Sizes of Five Functions

The population diversity can be affected by changes in the size of the harmony memory, HMS. The HMS parameter takes the values 20, 50, 100, 150, and 200; the other parameters are unchanged. The effect of the change in the size of the harmony memory on the optimization performance of each algorithm is shown in Table 4. The comparison of the average optimum value of each function as it changes with the size of the harmony memory is shown in Figures 7–11.

Table 4. Comparison of the optimal performance with different harmony memory sizes of five functions.

Functions | HMS | Average Optimal Value of HS | Average Optimal Value of IHS | Average Optimal Value of GHS | Average Optimal Value of SGHS | Average Optimal Value of SC-AHS
f1 | 20 | 2.09 × 10^4 | 1.38 × 10^4 | 1.60 × 10^1 | 4.65 × 10^3 | 8.23 × 10^−20
f1 | 50 | 2.59 × 10^3 | 5.33 × 10^3 | 4.00 × 10^0 | 3.07 × 10^3 | 2.14 × 10^−27
f1 | 100 | 1.02 × 10^3 | 2.33 × 10^3 | 1.20 × 10^2 | 1.68 × 10^3 | 7.58 × 10^−17
f1 | 150 | 8.41 × 10^2 | 1.04 × 10^3 | 4.80 × 10^2 | 6.58 × 10^2 | 4.29 × 10^−25
f1 | 200 | 5.05 × 10^2 | 6.20 × 10^2 | 1.20 × 10^2 | 4.31 × 10^2 | 1.00 × 10^−30
f1 | Relative rate of change/% | 9.76 × 10^1 | 9.55 × 10^1 | 6.50 × 10^2 | 9.07 × 10^1 | 1.00 × 10^2
f1 | Average change | 5.16 × 10^3 | 4.63 × 10^3 | 1.48 × 10^2 | 2.10 × 10^3 | 1.52 × 10^−17
f2 | 20 | 1.60 × 10^2 | 1.82 × 10^2 | 3.49 × 10^1 | 1.22 × 10^2 | 1.59 × 10^1
f2 | 50 | 7.11 × 10^1 | 9.51 × 10^1 | 1.16 × 10^0 | 1.72 × 10^2 | 7.96 × 10^0
f2 | 100 | 3.23 × 10^1 | 5.28 × 10^1 | 1.16 × 10^0 | 5.40 × 10^1 | 1.99 × 10^0
f2 | 150 | 2.70 × 10^1 | 2.87 × 10^1 | 2.04 × 10^0 | 2.73 × 10^2 | 9.95 × 10^−1
f2 | 200 | 1.76 × 10^1 | 2.68 × 10^1 | 4.02 × 10^0 | 2.25 × 10^1 | 1.78 × 10^−15
f2 | Relative rate of change/% | 8.91 × 10^1 | 8.53 × 10^1 | 8.85 × 10^1 | 8.15 × 10^1 | 1.00 × 10^2
f2 | Average change | 6.17 × 10^1 | 7.71 × 10^1 | 8.65 × 10^0 | 1.29 × 10^2 | 5.37 × 10^0
f3 | 20 | 1.77 × 10^2 | 1.25 × 10^2 | 1.12 × 10^0 | 7.10 × 10^1 | 6.14 × 10^−1
f3 | 50 | 4.32 × 10^1 | 5.19 × 10^1 | 1.60 × 10^0 | 3.30 × 10^1 | 2.46 × 10^−2
f3 | 100 | 1.70 × 10^1 | 1.29 × 10^1 | 7.20 × 10^−1 | 6.35 × 10^0 | 1.11 × 10^−16
f3 | 150 | 6.50 × 10^0 | 1.01 × 10^1 | 1.92 × 10^−1 | 5.19 × 10^1 | 2.22 × 10^−2
f3 | 200 | 4.46 × 10^0 | 5.21 × 10^0 | 2.08 × 10^0 | 6.63 × 10^0 | 1.97 × 10^−2
f3 | Relative rate of change/% | 9.75 × 10^1 | 9.58 × 10^1 | 8.57 × 10^1 | 9.07 × 10^1 | 9.68 × 10^1
f3 | Average change | 4.97 × 10^1 | 4.11 × 10^1 | 1.14 × 10^0 | 3.38 × 10^1 | 1.36 × 10^−1
f4 | 20 | 1.54 × 10^1 | 1.48 × 10^1 | 5.89 × 10^−16 | 1.43 × 10^1 | 8.97 × 10^−1
f4 | 50 | 1.30 × 10^1 | 1.24 × 10^1 | 5.89 × 10^−16 | 9.69 × 10^0 | 8.96 × 10^−1
f4 | 100 | 9.06 × 10^0 | 9.64 × 10^0 | 5.89 × 10^−16 | 5.89 × 10^0 | 8.96 × 10^−1
f4 | 150 | 6.05 × 10^0 | 6.22 × 10^0 | 5.89 × 10^−16 | 5.89 × 10^0 | 4.16 × 10^−5
f4 | 200 | 5.15 × 10^0 | 6.81 × 10^0 | 4.59 × 10^0 | 5.54 × 10^0 | 8.48 × 10^−7
f4 | Relative rate of change/% | 6.66 × 10^1 | 5.41 × 10^1 | 7.80 × 10^17 | 6.13 × 10^1 | 1.00 × 10^2
f4 | Average change | 9.73 × 10^0 | 9.98 × 10^0 | 9.19 × 10^−1 | 8.27 × 10^0 | 5.38 × 10^−1
f5 | 20 | 1.57 × 10^7 | 4.96 × 10^7 | 1.09 × 10^6 | 3.00 × 10^1 | 4.36 × 10^−2
f5 | 50 | 4.17 × 10^6 | 3.71 × 10^6 | 9.13 × 10^4 | 4.66 × 10^2 | 7.72 × 10^−16
f5 | 100 | 2.14 × 10^5 | 5.21 × 10^5 | 7.31 × 10^4 | 3.76 × 10^2 | 1.00 × 10^−20
f5 | 150 | 1.85 × 10^4 | 5.58 × 10^5 | 1.47 × 10^6 | 3.00 × 10^1 | 1.00 × 10^−20
f5 | 200 | 1.08 × 10^4 | 1.05 × 10^5 | 1.67 × 10^4 | 4.66 × 10^2 | 1.00 × 10^−20
f5 | Relative rate of change/% | 9.99 × 10^1 | 9.98 × 10^1 | 9.85 × 10^1 | 1.45 × 10^3 | 1.00 × 10^2
f5 | Average change | 4.01 × 10^6 | 1.09 × 10^7 | 5.49 × 10^5 | 2.73 × 10^2 | 8.72 × 10^−3

Figure 7. Comparison of the average optimum with different harmony memory sizes of the Sphere function.

Figure 8. Comparison of the average optimum with different harmony memory sizes of the Rastrigrin function.

Figure 9. Comparison of the average optimum with different harmony memory sizes of the Griewank function.

Figure 10. Comparison of the average optimum with different harmony memory sizes of the Ackley function.

Figure 11. Comparison of the average optimum with different harmony memory sizes of the Rosenbrock function.

In Table 4, there is:

Relative rate of change = |average optimal value for the harmony memory of 20 − average optimal value for the harmony memory of 200| / average optimal value for the harmony memory of 20 × 100% (17)

Average change = (∑_{i=1}^{5} average optimal value of the different sizes of harmony memories) / 5 (18)

It can be seen that for the single-peak Sphere (f1) and Rosenbrock (f5) functions and the multi-peak Rastrigrin (f2) function, the average optimal value of each algorithm tends to decrease as the size of the harmony memory increases, with the SC-AHS algorithm showing the most significant downward trend. The relative rate of change of the SC-AHS algorithm in Table 4 is higher than that of the other algorithms, indicating that as the size of the harmony memory increases, the diversity of the population is maintained and the convergence accuracy is improved. For the multi-peak Griewank (f3) and Ackley (f4) functions, the average optimal value of the SC-AHS algorithm is not stable; its average change value in Table 4 is the lowest among the algorithms, and its relative rate of change is higher than that of the other algorithms. Therefore, the convergence accuracy of the SC-AHS algorithm increases with the size of the harmony memory, and it converges better than the other algorithms.
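Equations (17) and (18) follow the same pattern as Equations (15) and (16), only with the HMS = 20 and HMS = 200 settings as endpoints. A minimal check (Python for illustration), applied to the SC-AHS column for the Sphere function f1 from Table 4:

```python
def relative_rate_of_change(avg_optima):
    # Equation (17): endpoints are the HMS = 20 and HMS = 200 settings.
    return abs(avg_optima[0] - avg_optima[-1]) / avg_optima[0] * 100.0

def average_change(avg_optima):
    # Equation (18): mean over the five HMS settings.
    return sum(avg_optima) / len(avg_optima)

# SC-AHS on the Sphere function f1, HMS = 20, 50, 100, 150, 200 (Table 4)
sc_ahs_f1 = [8.23e-20, 2.14e-27, 7.58e-17, 4.29e-25, 1.00e-30]
print(round(relative_rate_of_change(sc_ahs_f1), 1))  # 100.0 (%)
print(average_change(sc_ahs_f1))                     # about 1.52e-17
```

The results match the 1.00 × 10^2 relative rate of change and 1.52 × 10^−17 average change reported for SC-AHS on f1 in Table 4.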
5.2.3. Test of SC-AHS Algorithm with Dynamic Parameters Experiment

The experiment mainly verifies that the SC-AHS algorithm has fast convergence performance. It again adopts the changing-parameter conditions and the same two test methods: varying the number of harmony tone components and varying the harmony memory size, HMS. The effect of the changed parameters on the convergence speed of the SC-AHS algorithm is studied. The average optimal value of each function and the number of evolutions are used as measures of the performance of the SC-AHS algorithm.

Test of Average Optimum with Different Harmony Tone Components of the SC-AHS Algorithm

The range of the number of harmony tone components, N, is from 50 to 100. The other parameters are unchanged. The effect of the change in the number of tone components on the convergence rate of the SC-AHS algorithm is shown in Table 5. The comparison of the change of the average optimum value of each function with the number of tone components is shown in Figure 12. It can be seen that, with the increase in the number of tone components, there is no apparent upward trend in the average optimal value of the functions. Table 5 shows that the number of evolutions of the SC-AHS algorithm


is less than 20,000 for the single-peak Sphere (f1) and Rosenbrock (f5) functions, as well as for the multi-peak Griewank function (f3), across different numbers of tone components, indicating that the SC-AHS algorithm still has a high convergence speed in the high-dimensional case.


Table 5. Influences of convergence performance with different harmony tone components of the SC-AHS algorithm.

Function | Optimize Performance | N = 50 | N = 60 | N = 70 | N = 80 | N = 90 | N = 100
f1 | Average optimal value | 9.62 × 10^−19 | 2.49 × 10^−21 | 3.34 × 10^−17 | 1.00 × 10^−18 | 5.78 × 10^−17 | 2.57 × 10^−25
f1 | Evolution times/times | 309 | 414 | 480 | 548 | 620 | 765
f2 | Average optimal value | 1.29 × 10^1 | 1.69 × 10^1 | 2.29 × 10^1 | 3.78 × 10^1 | 4.38 × 10^1 | 6.07 × 10^1
f2 | Evolution times/times | 20,000 | 20,000 | 20,000 | 20,000 | 20,000 | 20,000
f3 | Average optimal value | 4.93 × 10^−3 | 9.86 × 10^−3 | 1.11 × 10^−16 | 1.23 × 10^−2 | 4.93 × 10^−3 | 1.72 × 10^−2
f3 | Evolution times/times | 20,000 | 20,000 | 19,763 | 20,000 | 20,000 | 20,000
f4 | Average optimal value | 7.10 × 10^−5 | 7.73 × 10^−6 | 8.51 × 10^0 | 1.88 × 10^1 | 1.56 × 10^1 | 2.08 × 10^1
f4 | Evolution times/times | 20,000 | 20,000 | 20,000 | 20,000 | 20,000 | 20,000
f5 | Average optimal value | 1.00 × 10^−20 | 1.00 × 10^−20 | 1.00 × 10^−20 | 1.00 × 10^−20 | 1.00 × 10^−20 | 1.00 × 10^−20
f5 | Evolution times/times | 5292 | 6771 | 7802 | 9855 | 10,441 | 10,890

Figure 12. Comparison of the average optimum with different harmony tone components of five functions.

Test of Average Optimum with Different Harmony Memory Sizes of the SC-AHS Algorithm

The value range of the HMS memory size is also 20, 50, 100, 150, and 200, respectively, and the other parameter settings are unchanged. The effect of the change in the size of the harmony memory on the convergence rate of the SC-AHS algorithm is shown in Table 6. The comparison of the average optimum value of each function with the change of the size of the harmony memory is shown in Figure 13. It can be seen that, as the size of the harmony memory increases, the average optimal value of each function has a more obvious downward trend. This is because increasing the size of the harmony memory increases the diversity of the population and the search performance of the algorithm. Table 6 shows that the number of SC-AHS evolutions is lower than 20,000 for the single-peak Sphere (f1) and Rosenbrock (f5) and the multi-peak Rastrigrin (f2) and Griewank (f3) functions.
It shows that the SC-AHS algorithm has increased convergence speed when the harmony memory is large.

Table 6. Comparisons of the convergence performance with different harmony memory sizes of the SC-AHS algorithm.

Function | Optimize Performance | HMS = 20 | HMS = 50 | HMS = 100 | HMS = 150 | HMS = 200
f1 | Average optimal value | 8.23 × 10^−20 | 2.14 × 10^−27 | 7.58 × 10^−17 | 4.29 × 10^−25 | 1.00 × 10^−30
f1 | Evolution times/times | 1606 | 280 | 203 | 219 | 209
f2 | Average optimal value | 1.59 × 10^1 | 7.96 × 10^0 | 1.99 × 10^0 | 9.95 × 10^−1 | 1.78 × 10^−15
f2 | Evolution times/times | 20,000 | 20,000 | 20,000 | 20,000 | 18,361
f3 | Average optimal value | 3.06 × 10^−2 | 2.46 × 10^−2 | 1.11 × 10^−16 | 2.22 × 10^−2 | 1.97 × 10^−2
f3 | Evolution times/times | 20,000 | 20,000 | 11,936 | 20,000 | 20,000
f4 | Average optimal value | 2.27 × 10^−5 | 1.60 × 10^−8 | 3.93 × 10^−9 | 2.87 × 10^−10 | 1.31 × 10^−11
f4 | Evolution times/times | 20,000 | 20,000 | 20,000 | 20,000 | 20,000
f5 | Average optimal value | 2.43 × 10^−3 | 4.33 × 10^−17 | 1.00 × 10^−20 | 1.00 × 10^−20 | 1.00 × 10^−20
f5 | Evolution times/times | 20,000 | 20,000 | 4296 | 3932 | 3346


Figure 13. Comparison of the average optimum with different harmony memory sizes of five functions.

5.3. Apple Image Segmentation Experiment

The experiment uses a single apple image as input, and takes the ARMM-AHS method proposed in this paper as the threshold search strategy to perform apple image segmentation experiments. The maximum class variance function and the maximum entropy function are used as the two objective fitness functions of the multi-objective optimization method. The main interface of the apple image segmentation program, developed with the Matlab2012 GUI, is shown in Figure 14.

Five segmentation indicators based on the image recognition rate are defined in this paper: the segmentation rate (RSeg), fruit segmentation rate (RApp), branch segmentation rate (RBra), segmentation success rate (RSuc), and missing rate (RMis). The five indicators are defined in Equations (19)–(23). Among them, Nto is the total number of segmentation targets (including the number of fruits and the number of branches), Npo is the total number of targets obtained by manual visualization (including the number of fruits and the number of branches), Na is the total number of segmentation fruits, Npa is the total number of fruits obtained by manual visualization, and Nl is the total number of segmentation branches. Due to the effects of the illumination conditions and interference factors, the experiment counts a fruit larger than 1/4 of a complete fruit as a fruit in the statistics, counts a branch longer than 1/4 of a complete branch as a branch in the statistics, and fruits that are too small in the perspective of the image are not counted.

RSeg = Nto / Npo × 100% (19)

RApp = Na / Nto × 100% (20)

RBra = Nl / Nto × 100% = (Nto − Na) / Nto × 100% (21)

RSuc = Na / Npa × 100% (22)

RMis = (Npa − Na) / Npa × 100% (23)

Figure 14. Main interface of the apple image segmentation program.
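The five indicators reduce to simple ratios of target counts; the sketch below (Python for illustration) evaluates Equations (19)–(23) on hypothetical counts, which are not from the paper's data:

```python
def segmentation_indicators(n_to, n_po, n_a, n_pa, n_l):
    """Equations (19)-(23); all rates are percentages.
    n_to: segmented targets, n_po: manually counted targets,
    n_a: segmented fruits, n_pa: manually counted fruits,
    n_l: segmented branches."""
    return {
        "R_Seg": n_to / n_po * 100.0,          # segmentation rate, Eq. (19)
        "R_App": n_a / n_to * 100.0,           # fruit segmentation rate, Eq. (20)
        "R_Bra": n_l / n_to * 100.0,           # branch segmentation rate, Eq. (21)
        "R_Suc": n_a / n_pa * 100.0,           # segmentation success rate, Eq. (22)
        "R_Mis": (n_pa - n_a) / n_pa * 100.0,  # missing rate, Eq. (23)
    }

# Hypothetical image: 10 of 10 targets segmented (8 fruits + 2 branches),
# while manual inspection finds 9 fruits.
rates = segmentation_indicators(n_to=10, n_po=10, n_a=8, n_pa=9, n_l=2)
```

Note that Nl / Nto in Equation (21) equals (Nto − Na) / Nto whenever every segmented target is either a fruit or a branch, which is what makes the two forms of RBra interchangeable.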

Figures 15–17 show the comparison of the results of image segmentation under three illumination conditions: direct sunlight with strong illumination, backlighting with strong illumination, and backlighting with medium illumination. The figures show the original image, the preprocessed image, the Pareto optimal frontier fitness curve obtained with the SC-AHS optimization algorithm, the ARMM-AHS method segmentation result, and the OTSU algorithm segmentation result. Figures 15–17 show that the multi-objective optimization method can be used to obtain a set of non-dominated threshold solutions, which provides more opportunities to select thresholds. The thresholds of the non-dominated solution sets under six illumination conditions using the ARMM-AHS method are shown in Table 7. The number of non-dominated solutions differs among the six illumination conditions.
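How a candidate threshold is scored against the two objectives, and how the non-dominated set is filtered, can be sketched as follows. This is Python rather than the paper's Matlab, the histogram is synthetic, and the SC-AHS search itself is replaced by exhaustive enumeration for clarity:

```python
import math

def objectives(hist, t):
    """Score threshold t on a grayscale histogram: returns
    (between-class variance, sum of within-class entropies), both maximized."""
    total = float(sum(hist))
    p = [h / total for h in hist]
    w0 = sum(p[:t]); w1 = 1.0 - w0
    if w0 <= 0.0 or w1 <= 0.0:
        return 0.0, 0.0
    mu0 = sum(i * pi for i, pi in enumerate(p[:t])) / w0
    mu1 = sum(i * p[i] for i in range(t, len(p))) / w1
    mu_t = sum(i * pi for i, pi in enumerate(p))
    var_b = w0 * (mu0 - mu_t) ** 2 + w1 * (mu1 - mu_t) ** 2  # OTSU criterion
    def entropy(q, w):  # entropy of one class, normalized by its weight
        return -sum((qi / w) * math.log(qi / w) for qi in q if qi > 0.0)
    return var_b, entropy(p[:t], w0) + entropy(p[t:], w1)

def non_dominated(scored):
    """Keep (t, f1, f2) triples not dominated when maximizing both objectives."""
    return [a for a in scored
            if not any(b[1] >= a[1] and b[2] >= a[2] and
                       (b[1] > a[1] or b[2] > a[2]) for b in scored)]

# Synthetic bimodal histogram: a dark background peak and a brighter fruit peak.
hist = [0] * 256
for i in range(40, 90):
    hist[i] = 100
for i in range(160, 220):
    hist[i] = 80
scored = [(t,) + objectives(hist, t) for t in range(1, 256)]
front = non_dominated(scored)  # the Pareto-optimal threshold set
```

The two objectives need not agree: a threshold that maximizes the between-class variance may not maximize the entropy, and every threshold not beaten on both criteria survives, which is what produces a spread of candidate thresholds like those listed in Table 7.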


Figure 15. Comparison of the results of image segmentation under direct sunlight under the strong illumination condition.

Figure 16. Comparison of the results of image segmentation under backlighting under the strong illumination condition.


Figure 17. Comparison of the results of image segmentation under backlighting under the medium illumination condition.

Table 7. Thresholds of the non-dominated solution set under six illumination conditions using the ARMM-AHS method.

Direct Sunlight with Strong | Direct Sunlight with Medium | Direct Sunlight with Weak | Backlighting with Strong | Backlighting with Medium | Backlighting with Weak
Non-Dominated Solution Number: 54 | Non-Dominated Solution Number: 36 | Non-Dominated Solution Number: 28 | Non-Dominated Solution Number: 18 | Non-Dominated Solution Number: 54 | Non-Dominated Solution Number: 62
[Each column lists the threshold values of the corresponding non-dominated solution set. The thresholds cluster mainly around 77 and 116 (direct sunlight, strong), 114–131 (direct sunlight, medium), 130 (direct sunlight, weak), 131 (backlighting, strong), 99–126 (backlighting, medium), and 113–152 (backlighting, weak).]


Table 7. Cont.

Direct Sunlight with Strong (Non-Dominated Solution Number: 54):
77 77 77 77 77 77 77 116 116 116 116 116 77 116 77 77 77 77 77 116 116 153 69 231 205 146 36 33 221 21 163 203

Direct Sunlight with Medium (Non-Dominated Solution Number: 36):
114 121 121 121 114 149 114 178 243 39 198 66 166 97

Direct Sunlight with Weak (Non-Dominated Solution Number: 28):
133 85 72 146 55 45

Backlighting with Medium (Non-Dominated Solution Number: 54):
106 106 126 107 126 107 99 126 99 99 126 126 126 126 107 126 107 106 106 126 106 168 70 230 98 106 159 163 70 10 208 91

Backlighting with Weak (Non-Dominated Solution Number: 62):
146 129 120 120 152 113 152 129 146 115 122 120 146 122 122 120 152 122 120 122 146 120 146 120 146 122 113 122 202 58 47 122 214 28 129 209 181 204 248 172

Table 8 compares the thresholds selected from the non-dominated solution set under the six illumination conditions with the OTSU thresholds, and Table 9 compares the apple image segmentation results under the same conditions. The data in Table 9 show that the segmentation rate and the segmentation success rate both reach 100% for direct sunlight with strong illumination and for backlighting with strong illumination. When the segmentation result includes branches, the branch segmentation rate and the missing rate are comparable, which indicates that the branches are well segmented and the desired segmentation effect is achieved.
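The two objective functions of the multi-objective method are the maximum between-class variance (OTSU) and the maximum entropy (Kapur), and the thresholds in Table 7 are those that no other threshold beats on both objectives at once. A minimal brute-force sketch of this selection over a 256-bin grayscale histogram is given below; the `hist` input and the exhaustive dominance filter are illustrative assumptions, not the ARMM-AHS search strategy itself:

```python
import numpy as np

def objectives(hist, t):
    """Between-class variance and Kapur entropy for a threshold t in [1, 254]."""
    p = hist / hist.sum()
    levels = np.arange(256)
    w0, w1 = p[:t + 1].sum(), p[t + 1:].sum()
    if w0 == 0 or w1 == 0:
        return -np.inf, -np.inf  # degenerate split: never non-dominated
    mu0 = (levels[:t + 1] * p[:t + 1]).sum() / w0
    mu1 = (levels[t + 1:] * p[t + 1:]).sum() / w1
    mu_t = (levels * p).sum()
    variance = w0 * (mu0 - mu_t) ** 2 + w1 * (mu1 - mu_t) ** 2
    # Kapur entropy: sum of the entropies of the two class distributions
    p0, p1 = p[:t + 1] / w0, p[t + 1:] / w1
    entropy = (-(p0[p0 > 0] * np.log(p0[p0 > 0])).sum()
               - (p1[p1 > 0] * np.log(p1[p1 > 0])).sum())
    return variance, entropy

def non_dominated_thresholds(hist):
    """Thresholds whose (variance, entropy) pair is not dominated (both maximized)."""
    pts = {t: objectives(hist, t) for t in range(1, 255)}
    return [t for t, (v, e) in pts.items()
            if not any(v2 >= v and e2 >= e and (v2 > v or e2 > e)
                       for t2, (v2, e2) in pts.items() if t2 != t)]
```

Choosing one working threshold from the returned set (e.g., 116 for direct sunlight with strong illumination in Table 8) is then a separate decision step.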


Table 8. Comparison of thresholds selected from the non-dominated solution set under six illumination conditions with OTSU thresholds.

Illumination Condition          Select Threshold by SC-AHS    OTSU Threshold
Direct sunlight with strong     116                           129
Direct sunlight with medium     121                           129
Direct sunlight with weak       130                           133
Backlighting with strong        131                           144
Backlighting with medium        99                            130
Backlighting with weak          113                           130
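The OTSU column in Table 8 comes from maximizing the between-class variance with a single exhaustive scan over the gray levels. A minimal sketch of that baseline follows; the `image` input is assumed to be an 8-bit grayscale array:

```python
import numpy as np

def otsu_threshold(image):
    """Scan t in [1, 254] and keep the t maximizing the between-class
    variance w0*(mu0 - muT)^2 + w1*(mu1 - muT)^2 (OTSU criterion)."""
    hist = np.bincount(image.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    levels = np.arange(256)
    mu_t = (levels * p).sum()  # global mean gray level
    best_t, best_var = 1, -1.0
    for t in range(1, 255):
        w0 = p[:t + 1].sum()
        w1 = p[t + 1:].sum()
        if w0 == 0.0 or w1 == 0.0:
            continue  # all pixels fall on one side: no usable split
        mu0 = (levels[:t + 1] * p[:t + 1]).sum() / w0
        mu1 = (levels[t + 1:] * p[t + 1:]).sum() / w1
        var = w0 * (mu0 - mu_t) ** 2 + w1 * (mu1 - mu_t) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

Because this returns exactly one threshold per image, it cannot trade variance against entropy, which is the inflexibility the multi-objective method is designed to address.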

Table 9. Apple image segmentation effect comparison under six illumination conditions.

Illumination Condition          RSeg (%)   RApp (%)   RBra (%)   RSuc (%)   RMis (%)
Direct sunlight with strong     100.0      85.7       14.3       100.0      0.0
Direct sunlight with medium     80.0       75.0       25.0       75.0       25.0
Direct sunlight with weak       80.0       100.0      0.0        80.0       20.0
Backlighting with strong        100.0      83.3       16.7       100.0      0.0
Backlighting with medium        85.7       83.3       16.7       83.3       16.7
Backlighting with weak          66.7       100.0      0.0        66.7       33.3

6. Conclusions

This paper proposes a new adaptive harmony search algorithm with simulation and creation. Experimental comparisons with the HS, HIS, GHS, and SGHS algorithms show that the new algorithm expands the search space to maintain the diversity of solutions and accelerates convergence. On this basis, an apple image recognition multi-objective method based on the adaptive harmony search algorithm with simulation and creation is proposed. The method transforms the recognition of apple images collected in natural scenes into a multi-objective optimization problem and uses the new adaptive harmony search algorithm as the image threshold search strategy. It obtains a set of non-dominated threshold solutions, which makes threshold selection more flexible than the single-threshold OTSU algorithm; the selected thresholds adapt better to the illumination conditions and give good image segmentation results.

Author Contributions: L.L. conceived and designed the multi-objective method and the experiments, and wrote the paper; J.H. performed the experiments and analyzed the results. Both authors have read and approved the final manuscript.

Funding: This work is supported by the National Natural Science Foundation of China (grant Nos. 61741201 and 61462058).

Conflicts of Interest: The authors declare no conflict of interest.

References

1. Zong, W.G.; Kim, J.H.; Loganathan, G.V. A new heuristic optimization algorithm: Harmony search. Simulation 2001, 76, 60–68. [CrossRef]
2. Ouyang, H.B.; Xia, H.G.; Wang, Q.; Ma, G. Discrete global harmony search algorithm for 0–1 knapsack problems. J. Guangzhou Univ. (Nat. Sci. Ed.) 2018, 17, 64–70.
3. Xie, L.Z.; Chen, Y.J.; Kang, L.; Zhang, Q.; Liang, X.J. Design of MIMO radar orthogonal polyphase code based on genetic harmony algorithm. Electron. Opt. Control 2018, 1, 1–7.
4. Zhu, F.; Liu, J.S.; Xie, L.L. Local Search Technique Fusion of Harmony Search Algorithm. Comput. Eng. Des. 2017, 38, 1541–1546.
5. Wu, C.L.; Huang, S.; Wang, Y.; Ji, Z.C. Improved Harmony Search Algorithm in Application of Vulcanization Workshop Scheduling. J. Syst. Simul. 2017, 29, 630–638.

6. Peng, H.; Wang, Z.X. An Improved Adaptive Parameters Harmony Search Algorithm. Microelectron. Comput. 2016, 33, 38–41.
7. Mahdavi, M.; Fesanghary, M.; Damangir, E. An improved harmony search algorithm for solving optimization problems. Appl. Math. Comput. 2007, 188, 1567–1579. [CrossRef]
8. Han, H.Y.; Pan, Q.K.; Liang, J. Application of improved harmony search algorithm in function optimization. Comput. Eng. 2010, 36, 245–247.
9. Zhang, T.; Xu, X.Q.; Ran, H.J. Multi-objective Optimal Allocation of FACTS Devices Based on Improved Differential Evolution Harmony Search Algorithm. Proc. CSEE 2018, 38, 727–734.
10. Liu, L.; Liu, X.B. Improved Multi-objective Harmony Search Algorithm for Single Machine Scheduling in Sequence Dependent Setup Environment. Comput. Eng. Appl. 2013, 49, 25–29.
11. Hosen, M.A.; Khosravi, A.; Nahavandi, S.; Creighton, D. Improving the quality of prediction intervals through optimal aggregation. IEEE Trans. Ind. Electron. 2015, 62, 4420–4429. [CrossRef]
12. Precup, R.E.; David, R.C.; Petriu, E.M. Grey wolf optimizer algorithm-based tuning of fuzzy control systems with reduced parametric sensitivity. IEEE Trans. Ind. Electron. 2017, 64, 527–534. [CrossRef]
13. Saadat, J.; Moallem, P.; Koofigar, H. Training echo state neural network using harmony search algorithm. Int. J. Artif. Intell. 2017, 15, 163–179.
14. Vrkalovic, S.; Teban, T.A.; Borlea, L.D. Stable Takagi-Sugeno fuzzy control designed by optimization. Int. J. Artif. Intell. 2017, 15, 17–29.
15. Fourie, J.; Mills, S.; Green, R. Harmony filter: A robust visual tracking system using the improved harmony search algorithm. Image Vis. Comput. 2010, 28, 1702–1716. [CrossRef]
16. Alia, O.M.; Mandava, R.; Aziz, M.E. A hybrid harmony search algorithm for MRI brain segmentation. Evol. Intell. 2011, 4, 31–49. [CrossRef]
17. Fourie, J. Robust circle detection using Harmony Search. J. Optim. 2017, 2017, 1–11. [CrossRef]
18. Moon, Y.Y.; Zong, W.G.; Han, G.T. Vanishing point detection for self-driving car using harmony search algorithm. Swarm Evol. Comput. 2018, 41, 111–119. [CrossRef]
19. Zong, W.G. Novel Derivative of Harmony Search Algorithm for Discrete Design Variables. Appl. Math. Comput. 2008, 199, 223–230.
20. Das, S.; Mukhopadhyay, A.; Roy, A.; Abraham, A.; Panigrahi, B.K. Exploratory Power of the Harmony Search Algorithm: Analysis and Improvements for Global Numerical Optimization. IEEE Trans. Syst. Man Cybern. 2011, 41, 89–106. [CrossRef] [PubMed]
21. Saka, M.P.; Hasançebi, O.; Zong, W.G. Metaheuristics in Structural Optimization and Discussions on Harmony Search Algorithm. Swarm Evol. Comput. 2016, 28, 88–97. [CrossRef]
22. Hasançebi, O.; Erdal, F.; Saka, M.P. Adaptive Harmony Search Method for Structural Optimization. J. Struct. Eng. 2010, 136, 419–431. [CrossRef]
23. Zong, W.G.; Sim, K.B. Parameter-Setting-Free Harmony Search Algorithm. Appl. Math. Comput. 2010, 217, 3881–3889.
24. Pan, Q.K.; Suganthan, P.N.; Tasgetiren, M.F.; Liang, J.J. A self-adaptive global best harmony search algorithm for continuous optimization problems. Appl. Math. Comput. 2010, 216, 830–848. [CrossRef]
25. Bao, X.M.; Wang, Y.M. Apple image segmentation based on the minimum error Bayes decision. Trans. Chin. Soc. Agric. Eng. 2006, 22, 122–124.
26. Qian, J.P.; Yang, X.T.; Wu, X.M.; Chen, M.X.; Wu, B.G. Mature apple recognition based on hybrid color space in natural scene. Trans. Chin. Soc. Agric. Eng. 2012, 28, 137–142.
27. Lu, B.B.; Jia, Z.H.; He, D. Remote-Sensing Image Segmentation Method based on Improved OTSU and Shuffled Frog-Leaping Algorithm. Comput. Appl. Softw. 2011, 28, 77–79.
28. Otsu, N. A Threshold Selection Method from Gray-Level Histograms. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66. [CrossRef]
29. Kapur, J.N.; Sahoo, P.K.; Wong, A.K.C. A new method for gray-level picture thresholding using the entropy of the histogram. Comput. Vis. Graph. Image Process. 1985, 29, 273–285. [CrossRef]
30. Zhao, F.; Zheng, Y.; Liu, H.Q.; Wang, J. Multi-population cooperation-based multi-objective evolutionary algorithm for adaptive thresholding image segmentation. Appl. Res. Comput. 2018, 35, 1858–1862.
31. Li, M.; Cao, X.L.; Hu, W.J. Optimal multi-objective clustering routing protocol based on harmony search algorithm for wireless sensor networks. Chin. J. Sci. Instrum. 2014, 35, 162–168.

32. Dong, J.S.; Wang, W.Z.; Liu, W.W. Urban one-way traffic optimization based on multi-objective harmony search approach. J. Univ. Shanghai Sci. Technol. 2014, 36, 141–146.
33. Li, Z.; Zhang, Y.; Bao, Y.K.; Guo, C.X.; Wang, W.; Xie, Y.Z. Multi-objective distribution network reconfiguration based on system homogeneity. Power Syst. Prot. Control 2016, 44, 69–75.
34. Zong, W.G. Multiobjective Optimization of Time-Cost Trade-Off Using Harmony Search. J. Constr. Eng. Manag. 2010, 136, 711–716.
35. Fesanghary, M.; Asadi, S.; Zong, W.G. Design of low-emission and energy-efficient residential buildings using a multi-objective optimization algorithm. Build. Environ. 2012, 49, 245–250. [CrossRef]
36. Zong, W.G. Multiobjective optimization of water distribution networks using fuzzy theory and harmony search. Water 2015, 7, 3613–3625.
37. Shi, X.D.; Gao, Y.L. Improved Bird Swarm Optimization Algorithm. J. Chongqing Univ. Technol. (Nat. Sci.) 2018, 4, 177–185.
38. Wang, L.G.; Gong, Y.X. A Fast Shuffled Frog Leaping Algorithm. In Proceedings of the 9th International Conference on Natural Computation (ICNC 2013), Shenyang, China, 23–25 July 2013; pp. 369–373.
39. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [CrossRef]
40. Wang, L. Intelligent Optimization Algorithms and Applications; Tsinghua University Press: Beijing, China, 2004; pp. 2–6. ISBN 9787302044994.
41. Omran, M.G.H.; Mahdavi, M. Global-best harmony search. Appl. Math. Comput. 2008, 198, 634–656. [CrossRef]
42. Han, J.Y.; Liu, C.Z.; Wang, L.G. Dynamic Double Subgroups Cooperative Fruit Fly Optimization Algorithm. Pattern Recognit. Artif. Intell. 2013, 26, 1057–1067.

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
