[Downloaded from www.aece.ro on Monday, December 18, 2017 at 11:20:02 (UTC) by 191.101.90.26. Redistribution subject to AECE license or copyright.]
Advances in Electrical and Computer Engineering
Volume 17, Number 2, 2017
Golden Sine Algorithm: A Novel Math-Inspired Algorithm
Erkan TANYILDIZI, Gokhan DEMIR
Department of Software Engineering, Firat University, 23119, Elazig, Turkey
[email protected]

Abstract—In this study, the Golden Sine Algorithm (Gold-SA) is presented as a new metaheuristic method for solving optimization problems. Gold-SA has been developed as a new population-based search algorithm. This math-based algorithm is inspired by the sine function from trigonometry. The algorithm creates as many random individuals as there are search agents, drawn from a uniform distribution in each dimension. In each iteration, the Gold-SA operator searches for a better solution by moving the current solution closer to the target value. The solution space is narrowed by the golden section, so that only the regions expected to give good results are scanned instead of the whole solution space. In the tests performed, Gold-SA obtains better results than other population-based methods. In addition, Gold-SA has fewer algorithm-dependent parameters and operators than other metaheuristic methods and converges faster, which increases the importance of this new method.

Index Terms—artificial intelligence, computational intelligence, evolutionary computation, heuristic algorithms, optimization.
I. INTRODUCTION
Metaheuristic algorithms have been used frequently in recent years to find optimal solutions to real engineering problems. The widespread use of general-purpose metaheuristics can be attributed to four main reasons: simplicity, flexibility, derivation-free operation, and local optimum avoidance [1]. These methods are used regularly in sectors such as trade, industry, and engineering [2]. Metaheuristics are a class of algorithms that provide a good-enough solution in an acceptable amount of time, using intuition-guided stochastic methods to solve a problem with incomplete or limited information. These algorithms are typically iterative and rely on stochastic optimization techniques. Another advantage of metaheuristic algorithms is that they are not problem-dependent: they are general-purpose methods that can be applied to almost all kinds of problems. In return, solution quality is only guaranteed up to some minimum error value [3]. Metaheuristic algorithms are inspired by sources such as nature, biology, physics, chemistry, human behavior, and mathematics. Nature, a rich source for researchers, has provided inspiration in many respects, and most new algorithms today are developed from nature. Nature-inspired algorithms are based on successful properties of biological systems; therefore, the greatest part of nature-inspired algorithms draw on biology [4]. The Genetic Algorithm [5] is inspired by the natural selection process. Biogeography-Based Optimization
[6] is based on biogeography, the geographical distribution of biological organisms. Brain Storm Optimization [7] simulates the creative problem-solving process of people. Dolphin Echolocation Optimization [8] is inspired by echolocation, the biological sonar that dolphins use for navigation and hunting in a variety of environments. The Flower Pollination Algorithm [9] models the pollination process of flowers. Among the biology-inspired algorithms, a special class has been developed that is inspired by swarm intelligence; for this reason, some biology-based algorithms are classed under swarm intelligence. Swarm intelligence deals with behaviors that result in the coordinated movement of multiple agents following certain rules. Particle Swarm Optimization [10] is inspired by the movements of bird flocks and fish schools. The Ant Colony Algorithm [11] simulates the shortest-path-finding behavior of ants searching for food. The Bacterial Foraging Optimization Algorithm [12] is based on the social foraging behavior of Escherichia coli. The Bat Algorithm [13] is inspired by the ability of bats to find their prey and distinguish different insect species even in complete darkness. The Firefly Algorithm [14] is inspired by the flashing behavior of fireflies. The Ant Lion Optimizer [15] simulates the hunting mechanism of antlions in nature. The Whale Optimization Algorithm [16] is based on the bubble-net hunting strategy of humpback whales. Not all metaheuristic algorithms are inspired by biology; some have been developed with inspiration from physics and chemistry. These algorithms simulate physical and chemical laws involving electric charges, gravity, river systems, etc. The Black Hole Algorithm [17] is inspired by the observable properties of the black hole phenomenon. Charged System Search [18] is based on physical and mechanical rules.
ElectroMagnetism Optimization [19] is based on electromagnetism principles. The Gravitational Search Algorithm [20] is based on the laws of gravity and mass interaction. The Harmony Search Algorithm [21] is based on a musician's search for a perfect state of harmony. There are also metaheuristic algorithms inspired by human behavior, as well as by other living things in the natural world. Anarchic Society Optimization [22] is inspired by a social group whose members behave anarchically to improve their situation. The Imperialist Competitive Algorithm [23] is inspired by the socio-political evolution process of peoples. The Social-Based Algorithm [24] suggests a new approach by combining the Evolutionary Algorithm and the Imperialist Competitive Algorithm. Algorithms inspired by mathematics, as well as all these
Digital Object Identifier 10.4316/AECE.2017.02010 1582-7445 © 2017 AECE
algorithms are extremely important. A heuristic developed by combining metaheuristic and mathematical programming (MP) techniques is called a matheuristic. However, the number of mathematically inspired algorithms in the literature is very small. The Base Optimization Algorithm [25] uses a combination of basic arithmetic operators with a displacement parameter that directs the search toward the optimum point. The Sine Cosine Algorithm [26] uses a mathematical model based on the sine and cosine functions. Designing, developing, and implementing new techniques is an important task under the scientific philosophy of continuous improvement and the search for ever-better methods [27]. For this purpose, Gold-SA, a new general-purpose matheuristic algorithm, has been developed for solving optimization problems.
II. GOLDEN SINE ALGORITHM
Metaheuristic methods use stochastic operators, unlike deterministic methods. The creation of these operators is inspired by different sources (biology, physics, music, mathematics, etc.). The inspiration for the proposed mathematics-based algorithm is the sine function, as shown in Fig. 1. Sine, a trigonometric function, is abbreviated sin. The sine of an angle is the y-coordinate of the corresponding point on the unit circle centered at the origin; equivalently, in a right triangle it is the ratio of the side opposite the angle to the hypotenuse. The range of the function is [-1, 1]. The sine function is periodic, repeating its values at regular intervals; its period is 2π, so the whole unit circle, and with it all sine values, can be scanned, as shown in Fig. 1.

(a) (b)
Figure 1. Change of the sine period on the unit circle

The scan of all values of the sine function over the unit circle is similar to the scan of the search space in optimization problems. This similarity has inspired the development of Gold-SA. The operator used in the algorithm shown in Fig. 2 is given in equation (1):

V(i, j) = V(i, j) * |sin(r1)| - r2 * sin(r1) * |x1 * D(j) - x2 * V(i, j)|   (1)

V(i, j) is the value of the current solution of the i-th agent in the j-th dimension, and D is the determined target value. r1 is a random number in the range [0, 2π], and r2 is a random number in the range [0, π]. x1 and x2 are coefficients obtained by the golden section method; they narrow the search space and let the current value approach the target value. As in other population-based algorithms, Gold-SA starts with a randomly generated population, and a good initial population is very important for population-based algorithms. As shown in equation (2), the Gold-SA initial population aims to scan the search space well by drawing each dimension randomly from a uniform distribution:

V = rand(agent_no, dimension) * (up_bound - low_bound) + low_bound   (2)
Figure 2. Golden Sine Algorithm
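The initialization in equation (2) and the update operator in equation (1) can be sketched in Python as follows (an illustrative sketch, not the authors' reference implementation; the values passed for the golden section coefficients x1 and x2 are placeholders, since their exact update rule is given by the golden section procedure of Fig. 4):

```python
import math
import random

def initialize(agent_no, dimension, low_bound, up_bound):
    """Equation (2): uniform random population within the bounds."""
    return [[low_bound + random.random() * (up_bound - low_bound)
             for _ in range(dimension)]
            for _ in range(agent_no)]

def gold_sa_update(V, D, x1, x2):
    """Equation (1): move each agent toward the target (best-so-far) D.

    x1 and x2 are the coefficients produced by the golden section
    method over [a, b] = [-pi, pi]; here they are simply parameters.
    """
    for i, agent in enumerate(V):
        r1 = random.uniform(0.0, 2.0 * math.pi)  # r1 in [0, 2*pi]
        r2 = random.uniform(0.0, math.pi)        # r2 in [0, pi]
        V[i] = [v * abs(math.sin(r1))
                - r2 * math.sin(r1) * abs(x1 * d - x2 * v)
                for v, d in zip(agent, D)]
    return V
```

A full iteration of Gold-SA would evaluate the fitness of every agent, keep the best agent found so far as the target D, and apply this update until the iteration budget is exhausted.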
The main goal of metaheuristic methods is to explore the regions of the search space thought to contain the best solutions and to ensure that these regions are scanned as thoroughly as possible. The size of the search space is a major difficulty in solving problems, and narrowing it significantly affects the results. Gold-SA uses the golden section method to do this in the best possible way. Golden section search is an optimization technique that can be used to find the maximum or minimum of a unimodal function. Its name comes from the golden ratio, a geometric and numerical proportion observed between the parts of a whole in mathematics and art, believed to give the most harmonious dimensions [28]. Two numbers p and q are in the golden ratio if

(p + q) / p = p / q   (3)

Equation (3) can be rewritten as equation (4) or, with τ = q/p, as equation (5):

1 + q/p = p/q   (4)

1 + τ = 1/τ   (5)

Solving the resulting quadratic equation τ² + τ - 1 = 0 gives equation (6):

τ = (√5 - 1) / 2 = 0.618033...   (positive root)   (6)

τ is called the golden number and has long been significant in aesthetics (e.g., the Egyptian pyramids). The golden section method does not require gradient information. This method has two important advantages over others [28]: a) only one new function evaluation is required at each step;
b) there is a constant reduction factor at each step. The pseudocode of the golden section search method over the interval [a, b] is shown in Fig. 3.
Figure 3. Golden section method
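Equations (5) and (6) can be checked numerically:

```python
import math

# Golden number: positive root of t**2 + t - 1 = 0, equation (6)
tau = (math.sqrt(5) - 1) / 2

print(round(tau, 6))  # 0.618034
# Equation (5): 1 + tau equals 1 / tau (residual is at machine precision)
print(abs((1 + tau) - 1 / tau) < 1e-12)  # True
```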
In Gold-SA, the initial default values of a and b are taken as -π and π, respectively. These two coefficients are applied to the current and target values in the first iteration. Then the coefficients x1 and x2 are updated as the target value changes. The use of the golden section method in Gold-SA is shown in Fig. 4.
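The procedure of Fig. 3 can be sketched as a generic golden section search for the minimum of a unimodal function (an illustrative sketch; variable names are not taken from the paper's pseudocode):

```python
import math

TAU = (math.sqrt(5) - 1) / 2  # golden number, equation (6)

def golden_section_search(f, a, b, tol=1e-8):
    """Minimize a unimodal f on [a, b].

    Advantage (a): only one new f-evaluation per step.
    Advantage (b): the interval shrinks by the constant factor TAU.
    """
    x1 = b - TAU * (b - a)
    x2 = a + TAU * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:              # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - TAU * (b - a)
            f1 = f(x1)
        else:                    # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + TAU * (b - a)
            f2 = f(x2)
    return (a + b) / 2
```

For example, minimizing (x - 2)² over [-π, 10] returns a point within the tolerance of x = 2.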
Figure 4. Use of the golden section method in Gold-SA

Figure 5. Pseudo code of Gold-SA
Since equality of x1 and x2 is undesirable, an equality check is performed, as shown in Step 2 in Fig. 3, to avoid this situation. If the two values are equal to each other, the random numbers random1 and random2 are generated in the range [0, π], and the x1 and x2 values are recalculated. The Gold-SA pseudocode is shown in Fig. 5.

III. RESULTS AND DISCUSSION
A. Benchmark Functions
Twenty-three benchmark functions (F1 - F23) commonly used in the literature [29] have been selected to evaluate and test the performance of Gold-SA. These functions fall into three categories: unimodal functions, multimodal functions, and fixed-dimension multimodal functions. The functions F1 - F7 are unimodal and have a single global optimum. These functions are designed to test the convergence rate of search algorithms.
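For instance, F1 in this widely used suite [29] is the sphere function, a typical unimodal benchmark (shown here only as an illustration of the category):

```python
def sphere(x):
    """F1 (sphere function): unimodal, global minimum 0 at x = (0, ..., 0)."""
    return sum(v * v for v in x)
```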
The multimodal functions are F8 - F13, which have more than one local minimum and are thus difficult to optimize. In multimodal functions, as the problem dimension increases, the number of local optima also increases; for this reason, such test problems are very important for evaluating the search capabilities of optimization algorithms. The fixed-dimension multimodal functions F14 - F23 differ from the multimodal functions only in that they have few local minima, owing to their low dimensionality. Gold-SA was also used to solve the pressure vessel problem and the tension / compression spring design problem, so its effectiveness on real constrained engineering problems was tested as well. Gold-SA is compared with two well-known swarm-based algorithms, Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO), and three recent algorithms, the Whale Optimization Algorithm (WOA), the Gravitational Search Algorithm (GSA), and the Sine Cosine Algorithm (SCA). The population size of the algorithms is 30 and the number of iterations is 1000. The algorithms were run 30 times for each benchmark function, with the dimensions given in the function definitions. The obtained mean, standard deviation, best result, and worst result are shown in Table I. The parameters used in the comparison algorithms are as follows:
1) PSO: inertia weight = 1, inertia weight damping ratio = 0.99, personal learning coefficient = 1.5, global learning coefficient = 2.0
2) ACO: sample size = 40, intensification factor = 0.5, deviation-distance ratio = 1
3) WOA: b = 1
4) GSA: r_norm = 2, r_power = 1, elitist_check = 1
5) SCA: a = 2
TABLE I. RESULTS OF BENCHMARK FUNCTIONS (MEAN: MEAN SOLUTION, STD: STANDARD DEVIATION, BEST: THE BEST SOLUTION, WORST: THE WORST SOLUTION).
F    Statistic  Gold-SA       PSO           SCA          WOA           ACO           GSA
F1   mean       0             6.5085e-16    0.0310       1.4109e-154   1.2766        1.0976e-16
F1   std        0             1.1800e-15    0.0866       7.0872e-154   0.8429        3.5117e-17
F1   best       0             1.0232e-17    2.4792e-06   3.3634e-166   0.4405        6.6897e-17
F1   worst      0             5.4858e-15    0.3945       3.8875e-153   4.4251        2.1405e-16
F2   mean       5.4526e-214   2.1071e-05    2.1026e-05   5.0254e-102   3.6260e+03    5.2733e-08
F2   std        0             1.0554e-04    4.3726e-05   1.9523e-101   1.9618e+04    1.6126e-08
F2   best       0             6.9921e-10    2.9768e-09   1.4491e-114   1.1190        2.7875e-08
F2   worst      1.6358e-212   5.7914e-04    2.1087e-04   9.2622e-101   1.0750e+05    1.0028e-07
F3   mean       0             7.6568        4.0282e+03   1.9940e+04    1.0596e+05    432.9601
F3   std        0             5.5415        3.2698e+03   1.1221e+04    2.9680e+04    155.3030
F3   best       0             1.0370        109.8140     3.6202e+03    4.7676e+04    217.7493
F3   worst      0             22.0031       1.4489e+04   4.3344e+04    1.6262e+05    780.2057
F4   mean       1.0387e-271   0.6941        19.9222      40.8308       78.1506       1.5515
F4   std        0             0.3446        11.2356      32.4037       9.6495        1.5699
F4   best       0             0.2272        1.5395       0.0519        42.8106       1.0044e-08
F4   worst      2.7315e-270   1.3777        42.7265      92.0102       92.1774       5.3571
F5   mean       0.0011        41.3337       1.0127e+03   27.2594       5.0124e+04    36.3915
F5   std        0.0015        32.4499       2.8445e+03   0.6228        4.5712e+04    53.9666
F5   best       6.1189e-08    2.3651        28.3801      26.5891       6.4021e+03    24.5371
F5   worst      0.0050        110.3879      1.2227e+04   28.7495       1.9159e+05    322.0843
F6   mean       6.0554e-05    2.7841e-15    4.5788       0.1062        1.0898        0.2000
F6   std        1.2696e-04    8.7185e-15    0.5852       0.1165        0.5781        0.7611
F6   best       1.5292e-08    1.7238e-17    3.8463       0.0094        0.4107        0
F6   worst      6.6993e-04    4.7394e-14    6.7212       0.4399        3.1241        4.0000
F7   mean       7.7789e-05    0.0148        0.0421       0.0024        0.1854        0.0647
F7   std        7.4299e-05    0.0059        0.0528       0.0023        0.0757        0.0254
F7   best       5.7196e-06    0.0063        0.0047       5.6430e-05    0.0546        0.0094
F7   worst      3.3952e-04    0.0282        0.2775       0.0090        0.4039        0.1131
F8   mean       -1.2569e+04   -6.3686e+03   -3.9367e+03  -1.1224e+04   -4.4590e+149  -2.4485e+03
F8   std        0.0632        867.5216      258.2417     1.6018e+03    2.4407e+150   425.4308
F8   best       -1.2569e+04   -8.0276e+03   -4.5524e+03  -1.2569e+04   -1.3368e+151  -3.5570e+03
F8   worst      -1.2569e+04   -4.3764e+03   -3.5344e+03  -7.8490e+03   -3.0059e+98   -1.7879e+03
F9   mean       0             44.7399       23.4859      0             252.9477      27.0629
F9   std        0             13.7505       32.5145      0             18.7779       6.2785
F9   best       0             23.8790       6.9014e-06   0             193.7615      17.9093
F9   worst      0             81.5864       168.7336     0             276.6462      41.7882
F10  mean       8.8818e-16    1.0915        12.6852      4.0856e-15    0.6876        7.8281e-09
F10  std        0             0.7594        9.5190       2.3511e-15    0.3804        1.6719e-09
F10  best       8.8818e-16    2.7719e-09    2.5556e-04   8.8818e-16    0.1548        5.7326e-09
F10  worst      8.8818e-16    2.3162        20.3227      7.9936e-15    1.7909        1.3408e-08
F11  mean       0             0.0240        0.3685       0.0050        0.9188        8.2013
F11  std        0             0.0246        0.3339       0.0193        0.0798        3.2014
F11  best       0             0             7.5275e-04   0             0.6492        2.6444
F11  worst      0             0.0860        0.9431       0.0875        1.0283        14.4065
F12  mean       2.6557e-06    0.2319        3.1533       0.0110        3.2754e+04    0.1608
F12  std        7.0487e-06    0.5471        6.2240       0.0177        7.9615e+04    0.2849
F12  best       1.0482e-10    1.0560e-18    0.3258       0.0012        18.9615       3.5333e-19
F12  worst      3.4562e-05    2.7038        33.4485      0.0956        3.2040e+05    1.4847
F13  mean       5.4864e-06    0.1020        18.5548      0.1962        8.1417e+04    0.0033
F13  std        8.9948e-06    0.4602        65.8292      0.1581        1.1285e+05    0.0104
F13  best       1.4681e-08    5.5146e-18    2.2029       0.0398        961.1513      4.1030e-18
F13  worst      4.0901e-05    2.5085        365.0753     0.7707        3.9818e+05    0.0548
F14  mean       1.0311        4.4098        1.5275       2.7961        1.3235        3.4221
F14  std        0.1815        2.9700        0.8922       3.2852        1.7829        2.6992
F14  best       0.9980        0.9980        0.9980       0.9980        0.9980        0.9980
F14  worst      1.9920        11.7187       2.9821       10.7632       10.7632       13.8192
F15  mean       3.6152e-04    3.4190e-04    9.7180e-04   5.6780e-04    0.0011        0.0023
F15  std        4.5117e-05    1.6780e-04    3.8543e-04   2.7009e-04    3.1570e-04    8.7184e-04
F15  best       3.0868e-04    3.0749e-04    3.4077e-04   3.0836e-04    8.8731e-04    6.2116e-04
F15  worst      5.1038e-04    0.0012        0.0015       0.0015        0.0019        0.0052
F16  mean       -1.0303       -1.0316       -1.0316      -1.0316       -1.0316       -1.0316
F16  std        0.0025        6.7752e-16    2.5344e-05   1.6070e-10    6.7752e-16    4.8787e-16
F16  best       -1.0316       -1.0316       -1.0316      -1.0316       -1.0316       -1.0316
F16  worst      -1.0184       -1.0316       -1.0315      -1.0316       -1.0316       -1.0316
F17  mean       0.3980        0.3979        0.3985       0.3979        0.3979        0.3979
F17  std        1.9567e-04    0             6.0681e-04   3.9088e-06    0             0
F17  best       0.3979        0.3979        0.3979       0.3979        0.3979        0.3979
F17  worst      0.3988        0.3979        0.4003       0.3979        0.3979        0.3979
TABLE I (CONTINUED).
F    Statistic  Gold-SA       PSO           SCA          WOA           ACO           GSA
F18  mean       3.0005        3.0000        3.0000       3.0000        3.0000        3.0000
F18  std        5.0420e-04    1.7954e-15    3.2611e-05   9.2119e-05    1.3194e-15    3.1250e-15
F18  best       3.0000        3.0000        3.0000       3.0000        3.0000        3.0000
F18  worst      3.0018        3.0000        3.0001       3.0005        3.0000        3.0000
F19  mean       -3.8148       -3.8370       -3.8551      -3.8609       -3.8628       -3.8628
F19  std        0.0674        0.1411        0.0024       0.0022        2.7101e-15    2.2913e-15
F19  best       -3.8628       -3.8628       -3.8618      -3.8628       -3.8628       -3.8628
F19  worst      -3.6041       -3.0898       -3.8519      -3.8549       -3.8628       -3.8628
F20  mean       -3.0407       -3.2863       -2.8866      -3.2429       -3.2665       -3.3220
F20  std        0.2347        0.0554        0.4145       0.1113        0.0603        1.4889e-15
F20  best       -3.3011       -3.3220       -3.2333      -3.3219       -3.3220       -3.3220
F20  worst      -2.0149       -3.2031       -1.2291      -2.9868       -3.2031       -3.3220
F21  mean       -10.1528      -5.4786       -2.2850      -9.0471       -5.6870       -6.4134
F21  std        5.3227e-04    3.4558        1.8892       2.2807        3.7094        3.6846
F21  best       -10.1532      -10.1532      -5.9900      -10.1530      -10.1532      -10.1532
F21  worst      -10.1514      -2.6305       -0.4965      -2.6303       -2.6829       -2.6305
F22  mean       -10.4016      -6.9147       -3.5487      -7.7342       -6.8594       -10.4029
F22  std        0.0028        3.6103        1.6587       2.9214        2.5485        1.4378e-15
F22  best       -10.4028      -10.4029      -5.2185      -10.4027      -10.4029      -10.4029
F22  worst      -10.3902      -2.7519       -0.9071      -3.7235       -5.0877       -10.4029
F23  mean       -10.5360      -5.9754       -4.0940      -8.5166       -7.2484       -10.2897
F23  std        5.9386e-04    3.6315        1.7448       2.9532        3.0040        1.3511
F23  best       -10.5363      -10.5363      -7.9749      -10.5363      -10.5363      -10.5363
F23  worst      -10.5336      -2.4217       -0.9460      -2.4217       -2.4217       -3.1359

B. Nonparametric Statistical Analysis Results
Statistical tests are used to obtain more reliable results, given the stochastic nature of metaheuristic algorithms. Methods for statistical analysis are of two types: parametric and non-parametric tests. Parametric tests are generally used in the analysis of computational intelligence experiments. However, parametric tests rest on conditions, known as independence, normality, and homoscedasticity, that are violated when analyzing the performance of stochastic algorithms. Non-parametric tests are used to overcome this problem [30]. The Wilcoxon signed rank test, a non-parametric test, is generally used to compare the performance of proposed algorithms for solving numerical optimization problems. The Wilcoxon signed rank test was performed between Gold-SA and each of the other metaheuristic algorithms at significance level α = 0.05. The statistical results are shown in
Table II. A p-value < 0.05 indicates a statistically significant difference between the results of the compared algorithms. R+ represents the sum of the ranks for which Gold-SA achieves results superior to the comparison algorithm, and R- represents the sum of the ranks for which the compared algorithm obtains results superior to Gold-SA. When the benchmark results in Table I are examined, it is clear that Gold-SA converges to the optimum much faster than all the other algorithms (PSO, SCA, WOA, ACO, GSA). On the unimodal functions F1 - F7 and the multimodal functions F8 - F13, Gold-SA is significantly superior to the compared algorithms. The proposed algorithm has converged to the optimal result on the unimodal functions F1, F2, F3, and F4; the multimodal functions F9 and F11; and the fixed-dimension multimodal functions F16, F17, F18, F19, F21, F22, and F23.
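The R+ and R- statistics of the test can be computed from the 30 paired runs as in the following pure-Python sketch (the p-values themselves would normally come from a statistics package such as scipy.stats.wilcoxon; this helper is illustrative):

```python
def signed_rank_sums(gold_sa, other):
    """Return (R+, R-): rank sums where Gold-SA beats / loses to the
    other algorithm over paired runs (minimization: smaller is better)."""
    diffs = [o - g for g, o in zip(gold_sa, other) if o != g]  # drop ties
    diffs.sort(key=abs)
    r_plus = r_minus = 0.0
    i = 0
    while i < len(diffs):
        j = i
        while j < len(diffs) and abs(diffs[j]) == abs(diffs[i]):
            j += 1                     # group equal |differences|
        avg_rank = (i + 1 + j) / 2.0   # average of ranks i+1 .. j
        for d in diffs[i:j]:
            if d > 0:                  # other algorithm worse
                r_plus += avg_rank
            else:
                r_minus += avg_rank
        i = j
    return r_plus, r_minus
```

With 30 runs and no ties, R+ + R- = 30 * 31 / 2 = 465, which is why the rank sums in Table II add up to 465.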
TABLE II. THE RESULTS OF THE WILCOXON SIGNED RANK TEST
PSO vs Gold-SA
F    p-value    R+   R-   Win
F1   1.73e-06   465  0    +
F2   1.73e-06   465  0    +
F3   1.73e-06   465  0    +
F4   1.73e-06   465  0    +
F5   1.73e-06   465  0    +
F6   1.73e-06   0    465  -
F7   1.73e-06   465  0    +
F8   1.73e-06   465  0    +
F9   1.73e-06   465  0    +
F10  1.72e-06   465  0    +
F11  2.56e-06   435  30   +
F12  0.6435     255  210  +
F13  0.3820     275  190  +
F14  3.18e-06   459  6    +
F15  3.06e-04   57   408  -
F16  1.73e-06   0    465  -
F17  1.73e-06   0    465  -
F18  1.73e-06   0    465  -
F19  3.11e-05   30   435  -
F20  3.51e-06   7    458  -
F21  2.61e-04   410  55   +
F22  0.0207     345  120  +
F23  6.15e-04   399  66   +

SCA vs Gold-SA
F    p-value    R+   R-   Win
F1   1.73e-06   465  0    +
F2   1.73e-06   465  0    +
F3   1.73e-06   465  0    +
F4   1.73e-06   465  0    +
F5   1.73e-06   465  0    +
F6   1.73e-06   465  0    +
F7   1.73e-06   465  0    +
F8   1.73e-06   465  0    +
F9   1.73e-06   465  0    +
F10  1.73e-06   465  0    +
F11  1.73e-06   465  0    +
F12  1.73e-06   465  0    +
F13  1.73e-06   465  0    +
F14  1.49e-05   443  22   +
F15  3.18e-06   459  6    +
F16  2.60e-06   4    461  -
F17  6.15e-04   399  66   +
F18  1.92e-06   8    457  -
F19  0.0093     106  359  -
F20  0.0786     318  147  +
F21  1.73e-06   465  0    +
F22  1.73e-06   465  0    +
F23  1.73e-06   465  0    +

WOA vs Gold-SA
F    p-value    R+   R-   Win
F1   1.73e-06   465  0    +
F2   1.73e-06   465  0    +
F3   1.73e-06   465  0    +
F4   1.73e-06   465  0    +
F5   1.73e-06   465  0    +
F6   1.73e-06   465  0    +
F7   1.92e-06   464  1    +
F8   1.73e-06   465  0    +
F9   1          0    0    =
F10  1.33e-05   253  212  +
F11  0.5000     3    0    +
F12  1.73e-06   465  0    +
F13  1.73e-06   465  0    +
F14  0.0687     321  144  +
F15  4.07e-05   432  35   +
F16  1.73e-06   0    465  -
F17  1.73e-06   0    465  -
F18  1.73e-06   20   445  -
F19  8.18e-05   41   424  -
F20  1.02e-05   18   447  -
F21  5.75e-06   453  12   +
F22  2.16e-05   439  26   +
F23  1.97e-05   440  25   +

ACO vs Gold-SA
F    p-value    R+   R-   Win
F1   1.73e-06   465  0    +
F2   1.73e-06   465  0    +
F3   1.73e-06   465  0    +
F4   1.73e-06   465  0    +
F5   1.73e-06   465  0    +
F6   1.73e-06   465  0    +
F7   1.73e-06   465  0    +
F8   1.73e-06   0    465  -
F9   1.73e-06   465  0    +
F10  1.73e-06   465  0    +
F11  1.73e-06   465  0    +
F12  1.73e-06   465  0    +
F13  1.73e-06   465  0    +
F14  3.11e-05   30   435  -
F15  1.73e-06   465  0    +
F16  1.73e-06   0    465  -
F17  1.73e-06   0    465  -
F18  1.73e-06   0    465  -
F19  1.73e-06   0    465  -
F20  2.87e-06   5    460  -
F21  0.0015     387  78   +
F22  0.6733     410  55   +
F23  0.0148     374  91   +

GSA vs Gold-SA
F    p-value    R+   R-   Win
F1   1.73e-06   465  0    +
F2   1.73e-06   465  0    +
F3   1.73e-06   465  0    +
F4   1.73e-06   465  0    +
F5   1.73e-06   465  0    +
F6   0.0028     87   378  -
F7   1.73e-06   465  0    +
F8   1.73e-06   465  0    +
F9   1.73e-06   465  0    +
F10  1.73e-06   465  0    +
F11  1.73e-06   465  0    +
F12  0.0093     359  106  +
F13  0.0571     140  325  -
F14  5.75e-06   453  12   +
F15  1.73e-06   465  0    +
F16  1.73e-06   0    465  -
F17  1.73e-06   0    465  -
F18  1.73e-06   0    465  -
F19  1.73e-06   0    465  -
F20  1.73e-06   0    465  -
F21  0.0087     360  105  +
F22  1.73e-06   0    465  -
F23  3.11e-05   30   435  -

According to the Wilcoxon signed rank test comparison results shown in Table II, over the 23 functions Gold-SA has been successful 16/7 against PSO, 20/3 against SCA, 17/5 against WOA, 17/6 against ACO, and 14/9 against GSA.
On the fixed-dimension multimodal benchmark functions F15 - F20, the compared algorithms have been more successful than Gold-SA.

C. Sensitivity Analysis on the Control Parameters of Gold-SA
A weak sensitivity of an algorithm's results to changes in its control parameters is an important indicator of the robustness of the algorithm. A sensitivity analysis was performed by varying the initial population size, one of the control parameters, to test the robustness of the proposed new algorithm.

TABLE III. THE RESULTS OBTAINED BY CHANGING THE POPULATION SIZE (RUNTIME: AVERAGE RUNNING TIME IN SECONDS)
F    Statistic  size = 10    size = 30    size = 50
F5   mean       0.0101       0.0011       6.7071e-04
F5   std        0.0262       0.0021       0.0016
F5   best       1.8910e-06   1.6302e-07   4.9791e-08
F5   worst      0.1344       0.0089       0.0078
F5   runtime    0.6857       1.5172       2.6075
F9   mean       0            0            0
F9   std        0            0            0
F9   best       0            0            0
F9   worst      0            0            0
F9   runtime    0.5947       1.3725       2.2918
F13  mean       7.3204e-05   1.4030e-05   4.3964e-06
F13  std        1.6166e-04   2.3757e-05   8.7681e-06
F13  best       3.8238e-08   1.5544e-08   1.8025e-09
F13  worst      7.6928e-04   1.1967e-04   4.4376e-05
F13  runtime    0.5362       3.2876       6.1802
Table III shows the results obtained for the F5, F9, and F13 functions with initial population sizes of 10, 30, and 50. The results are obtained by running the algorithm for 1000 iterations, 30 times for each function. The convergence graphs of the F5, F9, and F13 functions are shown in Fig. 6, Fig. 7, and Fig. 8, respectively. The statistical results in Table III show that increasing the population size also increases the average running time for the F5, F9, and F13 functions. Increasing the population size slightly improves convergence to the optimal result for the F5 and F13 functions, but does not affect the solution of the F9 function.
Figure 6. Convergence graphs of F5 for population size = 30
Figure 7. Convergence graphs of F9 for population size = 30
Figure 8. Convergence graphs of F13 for population size = 30
D. Gold-SA for Engineering Problems
The proposed Gold-SA was also used to solve engineering design problems, namely the pressure vessel problem and the tension / compression spring design problem, to test its effectiveness on real constrained engineering problems. The most important point when optimizing these constrained engineering problems is that the constraints must not be violated. To achieve this, a constraint-handling method must work in an integrated manner with the algorithm. Available methods include penalty methods, special operators, repair algorithms, separation of objectives and constraints, and hybrid methods [31]. Here, the penalty method is used for constraint handling. While testing Gold-SA and the comparison algorithms (PSO, SCA, WOA, ACO, GSA) on the pressure vessel problem and the tension / compression spring design problem, the population size was set to 200, the number of iterations to 10000, and the algorithms were run 30 times. The obtained results are shown in Table IV and Table V, and the statistical results of the Wilcoxon signed rank test are shown in Table VI and Table VII. In the pressure vessel problem, a cylindrical vessel is capped at both ends by hemispherical heads, as shown in Fig. 9. The objective is to minimize the total cost, including the cost of material, forming, and welding. There are four design variables: Ts (thickness of the shell, x1), Th (thickness of the head, x2), R
(inner radius, x3), and L (length of the cylindrical section of the vessel, not including the head, x4) [32]. This problem has four constraints. The objective and constraints are given in equation (7):

min f(x) = 0.6224 x1 x3 x4 + 1.7781 x2 x3^2 + 3.1661 x1^2 x4 + 19.84 x1^2 x3
s.t.
g1(x) = -x1 + 0.0193 x3 <= 0
g2(x) = -x2 + 0.00954 x3 <= 0
g3(x) = -π x3^2 x4 - (4/3) π x3^3 + 1296000 <= 0
g4(x) = x4 - 240 <= 0
variable ranges: 0 <= x1 <= 99, 0 <= x2 <= 99, 10 <= x3 <= 200, 10 <= x4 <= 200   (7)

Figure 9. Schematic of the pressure vessel

It is very impressive that Gold-SA solves the pressure vessel problem in a very short time of 58.97245 seconds and finds the minimum cost as 5895.98803. Gold-SA is the best among the compared algorithms in terms of both runtime and minimum cost. The results obtained by the algorithms for the pressure vessel design problem are shown in Table IV. The main goal in solving the tension / compression spring design problem, shown in Fig. 10, is to design a spring of minimum weight by obtaining optimum values of the variables [33].

Figure 10. Schematic of the tension / compression spring design problem

The problem consists of four inequality constraints, three of them nonlinear, and three continuous variables: the wire diameter w (x1), the mean coil diameter d (x2), and the number of active coils L (x3). The objective and constraints are given in equation (8):

min f(x) = (x3 + 2) x2 x1^2
s.t.
g1(x) = 1 - (x2^3 x3) / (71785 x1^4) <= 0
g2(x) = (4 x2^2 - x1 x2) / (12566 (x2 x1^3 - x1^4)) + 1 / (5108 x1^2) - 1 <= 0
g3(x) = 1 - 140.45 x1 / (x2^2 x3) <= 0
g4(x) = 2 (x1 + x2) / 3 - 1 <= 0
variable ranges: 0.05 <= x1 <= 2, 0.25 <= x2 <= 1.3, 2.0 <= x3 <= 15.0   (8)
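The Gold-SA spring design reported in Table V can be checked against equation (8), together with a static-penalty objective of the kind mentioned above (the quadratic penalty form and its coefficient are illustrative assumptions; the paper does not specify its penalty parameters):

```python
def spring_cost(w, d, L):
    """Objective of equation (8): spring weight (x3 + 2) * x2 * x1^2."""
    return (L + 2) * d * w ** 2

def spring_constraints(w, d, L):
    """Constraints g1..g4 of equation (8); feasibility requires g_i <= 0."""
    g1 = 1 - d ** 3 * L / (71785 * w ** 4)
    g2 = ((4 * d ** 2 - w * d) / (12566 * (d * w ** 3 - w ** 4))
          + 1 / (5108 * w ** 2) - 1)
    g3 = 1 - 140.45 * w / (d ** 2 * L)
    g4 = 2 * (w + d) / 3 - 1
    return (g1, g2, g3, g4)

def penalized_cost(w, d, L, coeff=1e6):
    """Static-penalty objective (the coefficient 1e6 is an assumed value)."""
    return spring_cost(w, d, L) + coeff * sum(
        max(0.0, g) ** 2 for g in spring_constraints(w, d, L))

# Gold-SA design from Table V
w, d, L = 0.051797597638, 0.359216890016, 11.15048159186
print(round(spring_cost(w, d, L), 6))                     # 0.012674
print(all(g <= 0 for g in spring_constraints(w, d, L)))   # True
```

The design satisfies all four constraints, so the penalized cost equals the plain cost and reproduces the optimum of 0.012674114949 reported in Table V.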
As shown in Table V, Gold-SA solves the tension / compression spring design problem in 25.49906 seconds, the shortest runtime among the compared algorithms. In addition, its optimum cost of 0.01267 is the second best, after PSO. According to the Wilcoxon signed rank test comparison results given in Table VI and Table VII, it is clear that Gold-SA is superior to its competitors.
TABLE IV. COMPARISON OF THE BEST SOLUTIONS FOR THE PRESSURE VESSEL DESIGN PROBLEM FOUND BY DIFFERENT ALGORITHMS
Algorithm  Ts              Th              R                L                Optimum Cost  Runtime
Gold-SA    0.779350523979  0.385169061742  40.323163712897  200              5895.98803    58.97245
PSO        0.828797029813  0.409674804493  42.942851305476  166.44667183143  5977.62589    517.86163
SCA        0.780408762923  0.386271554329  40.406252027563  200              5920.53556    59.77302
WOA        0.778189825887  0.390165645138  40.319639981412  199.99970408291  5901.42948    87.78191
ACO        0.921103032395  0.455301706112  47.725545735306  117.48048122090  6177.27956    498.07486
GSA        0.896775543832  0.443276615916  46.465054100846  129.12087575297  6120.54193    2907.64910

TABLE V. COMPARISON OF THE BEST SOLUTIONS FOR THE TENSION / COMPRESSION SPRING DESIGN PROBLEM FOUND BY DIFFERENT ALGORITHMS
Algorithm  w               d               L                Optimum Cost    Runtime
Gold-SA    0.051797597638  0.359216890016  11.15048159186   0.012674114949  25.49906
PSO        0.051580471304  0.354110970914  11.44344489883   0.012665448278  517.93111
SCA        0.051108490847  0.342889153904  12.15302541048   0.012676201923  71.69195
WOA        0.052614959700  0.379403901376  10.07314988814   0.012680631117  89.987781
ACO        0.057606226982  0.516451973173  5.739265672904   0.013263818150  511.46341
GSA        0.053462849967  0.395370059884  9.564393481895   0.013068653733  2463.76132

TABLE VI. THE RESULTS OF THE WILCOXON SIGNED RANKS TEST BASED ON THE BEST SOLUTION FOR THE PRESSURE VESSEL DESIGN PROBLEM
Comparison      p-value     R+   R-   Win
PSO vs Gold-SA  0.1846      297  168  +
SCA vs Gold-SA  0.6288      256  209  +
WOA vs Gold-SA  3.8811e-04  405  60   +
ACO vs Gold-SA  0.0012      390  75   +
GSA vs Gold-SA  0.1650      300  165  +
TABLE VII. THE RESULTS OF THE WILCOXON SIGNED RANKS TEST BASED ON THE BEST SOLUTION FOR THE TENSION / COMPRESSION SPRING DESIGN PROBLEM
Comparison      p-value     R+   R-   Win
PSO vs Gold-SA  0.1254      307  158  +
SCA vs Gold-SA  0.0030      377  88   +
WOA vs Gold-SA  4.7292e-06  455  10   +
ACO vs Gold-SA  1.7344e-06  465  0    +
GSA vs Gold-SA  1.7344e-06  465  0    +
IV. CONCLUSION Many metaheuristic algorithms have been developed, inspired by different sources, from the past to the present. In this study, a novel mathematical search algorithm called Golden Sine Algorithm (Gold-SA) is proposed. The proposed algorithm is inspired by a trigonometric function, sine. Gold-SA's narrowing of the solution space with the golden section avoids local optimum solutions and provides convergence to the global optimum in a short time. The new algorithm has been tested on 23 benchmark functions and 2 real constrained engineering problems namely, pressure vessel problem and tension / compression spring design problem. The results have been compared to those of other metaheuristic algorithms such as PSO, SCA, WOA, ACO, and GSA. In addition, the results obtained from the algorithms were statistically compared using the nonparametric Wilcoxon signed rank test that allowed to determine the performance of the stochastic methods more reliably. Since Gold-SA is a new algorithm and it contains few parameters and operators that depend on the algorithm, so it is possible to develop the algorithm in future studies and to use it as a hybrid with other methods. REFERENCES [1]
S. Mirjalili, S. M. Mirjalili, A. Lewis, “Grey wolf optimizer”, Advances in Engineering Software, vol. 69, pp. 46-61, 2014. doi:10.1016/j.advengsoft.2013.12.007
[2] G. Demir, B. Alataş, “Lig şampiyonası algoritması ile gezgin satıcı probleminin çözümü” [Solution of the traveling salesman problem with the league championship algorithm], 1st International Conference on Engineering Technology and Applied Sciences (ICETAS), Afyon, Turkey, pp. 793-800, 2016.
[3] A. Prakasam, N. Savarimuthu, “Metaheuristic algorithms and polynomial Turing reductions: a case study based on ant colony optimization”, Procedia Computer Science, vol. 46, pp. 388-395, 2015. doi:10.1016/j.procs.2015.02.035
[4] I. Fister Jr., X. S. Yang, D. Fister, I. Fister, “A brief review of nature-inspired algorithms for optimization”, Elektrotehniski Vestnik, vol. 80, no. 3, pp. 1-7, 2013.
[5] J. H. Holland, “Genetic algorithms”, Scientific American, vol. 267, pp. 66-72, 1992. doi:10.1038/scientificamerican0792-66
[6] D. Simon, “Biogeography-based optimization”, IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702-713, 2008. doi:10.1109/TEVC.2008.919004
[7] Y. Shi, “An optimization algorithm based on brainstorming process”, International Journal of Swarm Intelligence Research (IJSIR), vol. 2, no. 4, pp. 35-62, 2011. doi:10.4018/ijsir.2011100103
[8] A. Kaveh, N. Farhoudi, “A new optimization method: Dolphin echolocation”, Advances in Engineering Software, vol. 59, pp. 53-70, 2013. doi:10.1016/j.advengsoft.2013.03.004
[9] X. S. Yang, “Flower pollination algorithm for global optimization”, Unconventional Computation and Natural Computation, pp. 240-249, 2012. doi:10.1007/978-3-642-32894-7_27
[10] J. Kennedy, R. Eberhart, “Particle swarm optimization”, Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942-1948, IEEE, 1995. doi:10.1109/ICNN.1995.488968
[11] M. Dorigo, “Optimization, learning and natural algorithms”, Ph.D. Thesis, Politecnico di Milano, Italy, 1992.
[12] K. M. Passino, “Biomimicry of bacterial foraging for distributed optimization and control”, IEEE Control Systems, vol. 22, no. 3, pp. 52-67, 2002. doi:10.1109/MCS.2002.1004010
[13] X. S. Yang, “A new metaheuristic bat-inspired algorithm”, Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), vol. 284, pp. 65-74, 2010. doi:10.1007/978-3-642-12538-6_6
[14] X. S. Yang, “Firefly algorithm, stochastic test functions and design optimization”, International Journal of Bio-Inspired Computation, vol. 2, no. 2, pp. 78-84, 2010. doi:10.1504/IJBIC.2010.032124
[15] S. Mirjalili, “The ant lion optimizer”, Advances in Engineering Software, vol. 83, pp. 80-98, 2015. doi:10.1016/j.advengsoft.2015.01.010
[16] S. Mirjalili, S. M. Mirjalili, “The whale optimization algorithm”, Advances in Engineering Software, vol. 95, pp. 51-67, 2016. doi:10.1016/j.advengsoft.2016.01.008
[17] A. Hatamlou, “Black hole: A new heuristic optimization approach for data clustering”, Information Sciences, vol. 222, pp. 175-184, 2013. doi:10.1016/j.ins.2012.08.023
[18] A. Kaveh, S. Talatahari, “A novel heuristic optimization method: charged system search”, Acta Mechanica, vol. 213, no. 3, pp. 267-289, 2010. doi:10.1007/s00707-009-0270-4
[19] E. Cuevas, D. Oliva, D. Zaldivar, M. Perez, R. Rojas, “Circle detection algorithm based on electromagnetism-like optimization”, vol. 38, pp. 907-934, 2013. doi:10.1007/978-3-642-30504-7_36
[20] E. Rashedi, H. N. Pour, S. Saryazdi, “GSA: a gravitational search algorithm”, Information Sciences, vol. 179, no. 13, pp. 2232-2248, 2009. doi:10.1016/j.ins.2009.03.004
[21] Z. W. Geem, J. H. Kim, G. V. Loganathan, “A new heuristic optimization algorithm: harmony search”, Simulation, vol. 76, no. 2, pp. 60-68, 2001.
[22] H. Shayeghi, J. Dadashpour, “Anarchic society optimization based PID control of an automatic voltage regulator (AVR) system”, Electrical and Electronic Engineering, vol. 2, no. 4, pp. 199-207, 2012. doi:10.5923/j.eee.20120204.05
[23] E. A. Gargari, C. Lucas, “Imperialist competitive algorithm: an algorithm for optimization inspired by imperialistic competition”, IEEE Congress on Evolutionary Computation (CEC 2007), pp. 4661-4667, IEEE, 2007. doi:10.1109/CEC.2007.4425083
[24] F. Ramezani, S. Lotfi, “Social-based algorithm”, Applied Soft Computing, vol. 13, pp. 2837-2856, 2013. doi:10.1016/j.asoc.2012.05.018
[25] S. A. Salem, “BOA: A novel optimization algorithm”, International Conference on Engineering and Technology (ICET), pp. 1-5, Egypt, IEEE, 2012. doi:10.1109/ICEngTechnol.2012.6396156
[26] S. Mirjalili, “SCA: A Sine Cosine Algorithm for solving optimization problems”, Knowledge-Based Systems, vol. 96, pp. 120-133, 2016. doi:10.1016/j.knosys.2015.12.022
[27] F. Altunbey, B. Alataş, “Sosyal ağ analizi için sosyal tabanlı yapay zekâ optimizasyon algoritmalarının incelenmesi” [A review of social-based artificial intelligence optimization algorithms for social network analysis], Int. J. Pure Appl. Sci., vol. 1, pp. 33-52, 2015.
[28] R. K. Arora, Optimization Algorithms and Applications, ISBN-13: 978-1-4987-2115-8, pp. 46-47, 2015.
[29] P. N. Suganthan, N. Hansen, J. J. Liang, K. Deb, Y. Chen, A. Auger, “Problem definitions and evaluation criteria for the CEC 2005 special session on real-parameter optimization”, KanGAL report, vol. 2005005, 2005.
[30] J. Derrac, S. García, D. Molina, F. Herrera, “A practical tutorial on the use of non-parametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms”, Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3-18, 2011. doi:10.1016/j.swevo.2011.02.002
[31] C. A. C. Coello, “Theoretical and numerical constraint-handling techniques used with evolutionary algorithms: a survey of the state of the art”, Computer Methods in Applied Mechanics and Engineering, vol. 191, no. 11-12, pp. 1245-1287, 2002. doi:10.1016/S0045-7825(01)00323-1
[32] S. H. Nasseri, Z. Alizadeh, F. Taleshian, “Optimized solution of pressure vessel design using geometric programming”, The Journal of Mathematics and Computer Science, vol. 4, no. 3, pp. 344-349, 2012.
[33] M. Li, H. Zhao, X. Weng, T. Han, “Cognitive behavior optimization algorithm for solving optimization problems”, Applied Soft Computing, vol. 39, pp. 199-222, 2016.