Firefly Algorithm for Continuous Constrained Optimization Tasks

Szymon Łukasik and Sławomir Żak
Systems Research Institute, Polish Academy of Sciences
[email protected] [email protected]

Abstract. The paper provides an insight into the Firefly Algorithm, a novel metaheuristic for constrained continuous optimization tasks, together with proposed improvements. The technique is inspired by the social behavior of fireflies and the phenomenon of bioluminescent communication. The first part of the paper is devoted to a detailed description of the existing algorithm. Then some suggestions for extending its simple scheme are presented. Subsequent sections concentrate on the performed experimental parameter studies and a comparison with the well-established Particle Swarm Optimization strategy on a set of benchmark instances. Finally, some concluding remarks on possible algorithm extensions are given, together with observations on the properties of the presented approach and comments on its performance in constrained continuous optimization tasks.
Key words: firefly algorithm, constrained continuous optimization, swarm intelligence, metaheuristics

1 Introduction

Fireflies, also called lightning bugs, are among the most special and fascinating creatures in nature. These nocturnal luminous insects of the beetle family Lampyridae (order Coleoptera) inhabit mainly tropical and temperate regions, and their population is estimated at around 1900 species [1]. They are capable of producing light thanks to special photogenic organs situated very close to the body surface behind a window of translucent cuticle [2]. Bioluminescent signals are known to serve as elements of courtship rituals, methods of prey attraction, social orientation or as a warning signal to predators (in the case of immature firefly forms commonly referred to as glowworms). The phenomenon of firefly glowing is an area of continuous research considering both its biochemical [3] and social aspects [4]. Mechanisms of firefly communication via luminescent flashes and their synchronization have been imitated effectively in various techniques of wireless network design [5], dynamic market pricing [6] and mobile robotics [7]. The Firefly Algorithm, developed recently by Xin-She Yang at Cambridge University and presented in Chapter 8 of the monograph [8], follows this approach. This swarm


intelligence optimization technique is based on the assumption that a solution of an optimization problem can be perceived as an agent (firefly) which “glows” proportionally to its quality in the considered problem setting. Consequently, each brighter firefly attracts its partners (regardless of their sex), which allows the search space to be explored more efficiently. Similar nature-inspired metaheuristics include Particle Swarm Optimization (PSO) [9] and the Artificial Bee Colony optimization technique (ABC) [10]. This paper is devoted to a detailed study of the Firefly Algorithm (FA), its experimental evaluation and possible improvements. It is organized as follows. In the next Section a comprehensive review of the existing FA scheme based on the monograph [8] is given, as well as some proposals for its extension. The subsequent part of the paper contains the results of parameter studies and some guidelines for proper parameter assignment. Next, a comparison with the Particle Swarm Optimization technique is performed. Ultimately, the final part of the paper presents some concluding remarks and suggestions for future work on the subject.

2 Firefly Algorithm in Practice

2.1 FA Scheme

Let us consider the continuous constrained optimization problem where the task is to minimize the cost function f(x) for x ∈ S ⊂ R^n, i.e. to find x* such that:

f(x*) = min_{x ∈ S} f(x) .   (1)

Assume that there exists a swarm of m agents (fireflies) solving the above-mentioned problem iteratively, that xi represents a solution for firefly i in the algorithm's iteration k, and that f(xi) denotes its cost. Initially all fireflies are dislocated in S (randomly or employing some deterministic strategy). Each firefly has its distinctive attractiveness β which determines how strongly it attracts other members of the swarm. As the firefly attractiveness one should select any monotonically decreasing function of the distance rj = d(xi, xj) to the chosen firefly j, e.g. the exponential function:

β = β0 e^(−γ rj)   (2)

where β0 and γ are predetermined algorithm parameters: the maximum attractiveness value and the absorption coefficient, respectively [8]. Furthermore, every member of the swarm is characterized by its light intensity Ii which can be directly expressed as the inverse of the cost function f(xi). To effectively explore the considered search space S it is assumed that each firefly i changes its position iteratively taking into account two factors: the attractiveness of other swarm members with higher light intensity, i.e. Ij > Ii, ∀j = 1, ..., m, j ≠ i, which varies with distance, and a fixed random step vector ui. It should be noted as well that if no brighter firefly can be found, only such a randomized step is used [8].
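As an illustration, the attractiveness function (2) can be sketched in a few lines of Python. This is only a minimal example; the function name is ours, not part of the original implementation:

```python
import math

def attractiveness(beta0, gamma, r):
    """Attractiveness of a firefly seen from distance r (Eq. 2)."""
    return beta0 * math.exp(-gamma * r)

# At r = 0 the attractiveness equals beta0; it decays monotonically with r.
```

For β0 = 1 and γ = 1, for instance, the attractiveness drops from 1 at r = 0 to e^(−2) ≈ 0.135 at r = 2.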


To summarize, taking into consideration all the above statements, the algorithm scheme established in [8] can be presented in the following pseudo-code form:

Firefly Algorithm for Constrained Optimization
Input:
  f(z), z = [z1, z2, ..., zn]^T            {cost function}
  S = [ak, bk], ∀k = 1, ..., n             {given constraints}
  m, β0, γ, min ui, max ui                 {algorithm's parameters}
Output:
  x_imin                                   {obtained minimum location}

begin
  for i = 1 to m do
    xi ← Generate_Initial_Solution()
  end
  repeat
    imin ← arg min_i f(xi)
    x_imin ← arg min_xi f(xi)
    for i = 1 to m do
      for j = 1 to m do
        if f(xj) < f(xi) then              {move firefly i towards j}
          rj ← Calculate_Distance(xi, xj)
          β ← β0 e^(−γ rj)                 {obtain attractiveness}
          ui ← Generate_Random_Vector(min ui, max ui)
          for k = 1 to n do
            xi,k ← (1 − β) xi,k + β xj,k + ui,k
          end
        end
      end
    end
    u_imin ← Generate_Random_Vector(min ui, max ui)
    for k = 1 to n do
      x_imin,k ← x_imin,k + u_imin,k       {best firefly should move randomly}
    end
  until stop condition true
end
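The pseudo-code above can be rendered as a compact, runnable Python sketch. This is an illustrative reimplementation under the paper's assumptions (random steps as in Eq. (6), exponential attractiveness as in Eq. (2)), not the authors' MATLAB code, and the helper names are ours:

```python
import math
import random

def firefly_minimize(f, bounds, m=40, iters=250,
                     beta0=1.0, gamma=1.0, alpha=0.01):
    """Minimal sketch of the FA pseudo-code.

    f      -- cost function taking a list of n coordinates
    bounds -- list of (a_k, b_k) pairs defining the box S
    """
    n = len(bounds)
    # Random step vector per dimension, Eq. (6): u_k = alpha * (rand - 1/2)
    step = lambda: [alpha * (random.random() - 0.5) for _ in range(n)]
    # Random initial dislocation of the swarm in S
    x = [[random.uniform(a, b) for a, b in bounds] for _ in range(m)]
    for _ in range(iters):
        for i in range(m):
            for j in range(m):
                if f(x[j]) < f(x[i]):                  # firefly j is brighter
                    r = math.dist(x[i], x[j])
                    beta = beta0 * math.exp(-gamma * r)  # Eq. (2)
                    u = step()
                    x[i] = [(1 - beta) * x[i][k] + beta * x[j][k] + u[k]
                            for k in range(n)]
        # The current best firefly only moves randomly
        best = min(range(m), key=lambda i: f(x[i]))
        u = step()
        x[best] = [x[best][k] + u[k] for k in range(n)]
    return min(x, key=f)
```

The doubly nested loop over fireflies makes the O(m²) per-iteration complexity noted later in the paper directly visible.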

In the next part of the paper some technical details of the algorithm will be considered. A closer look will be taken at such important issues as the sensitivity of the parameters, their influence on the convergence rate of the algorithm, potential improvements and further development. For more detailed theoretical considerations, MATLAB code and two-dimensional demonstrations of the algorithm's performance, one can refer to the pioneering publication mentioned above [8].

2.2 Technical Details

The algorithm presented here makes use of a synergic local search. Each member of the swarm explores the problem space taking into account results obtained by others, while still applying its own randomized moves. The influence of other solutions is controlled by the value of attractiveness (2). It can be adjusted by modifying two parameters: its maximum value β0 and the absorption coefficient γ. The first parameter describes attractiveness at rj = 0, i.e. when two fireflies are found at the same point of the search space S. In general β0 ∈ [0, 1] should be used, and two limiting cases can be distinguished: β0 = 0, where only a noncooperative distributed random search is applied, and β0 = 1, which is equivalent to a scheme of cooperative local search with the brightest firefly strongly determining the other fireflies' positions, especially in its neighborhood [8]. On the other hand, the value of γ determines the variation of attractiveness with increasing distance from the communicating firefly. Using γ = 0 corresponds to no variation, i.e. constant attractiveness, while setting γ → ∞ results in attractiveness close to zero, which is again equivalent to a complete random search. In general γ ∈ [0, 10] can be suggested [8]. It is more convenient, however, to derive the γ value specifically for the considered problem. Such a customized absorption coefficient should be based on the “characteristic length” of the optimized search space. It is proposed here to use:

γ = γ0 / rmax   (3)

or:

γ = γ0 / rmax²   (4)

where γ0 ∈ [0, 1] and:

rmax = max d(xi, xj), ∀xi, xj ∈ S .   (5)
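For box constraints S = [a1, b1] × ... × [an, bn], the rmax of (5) is simply the diagonal of the search hyper-rectangle, so the scaled absorption coefficients (3)-(4) can be computed as in the following sketch (an illustrative helper, name ours):

```python
import math

def gamma_scaled(gamma0, bounds, squared=False):
    """Absorption coefficient scaled by the characteristic length of S.

    Implements Eq. (3) (gamma0 / r_max) or Eq. (4) (gamma0 / r_max^2),
    with r_max from Eq. (5) computed as the box diagonal.
    """
    r_max = math.sqrt(sum((b - a) ** 2 for a, b in bounds))
    return gamma0 / r_max ** 2 if squared else gamma0 / r_max
```

For example, for problem no 1 (Himmelblau, S = (-6, 6)²) one gets rmax = 12√2 ≈ 16.97, so (3) with γ0 = 0.8 yields γ ≈ 0.047.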

Efficiency of both techniques introduced here will be experimentally evaluated in the next Section. Finally, one has to set the random step size, i.e. its lower and upper bounds (min ui, max ui), and define the method of its generation. In [8] it was proposed to use min ui = −0.5α and max ui = 0.5α, with α ∈ [0, 1] being an algorithm parameter. In consequence, ui for each search space dimension k is supposed to be generated according to:

ui,k = α (rand − 1/2) ,   (6)

with rand ∼ U(0, 1) being a random number obtained from the uniform distribution. Here it is suggested to use an alternative approach, i.e. to define the random vector as a fraction of the firefly's distance to the search space boundaries:

ui,k = α rand2 (bk − xi,k)    if sgn(rand1 − 0.5) < 0 ,
ui,k = −α rand2 (xi,k − ak)   if sgn(rand1 − 0.5) ≥ 0 ,   (7)

with two uniform random numbers rand1, rand2 obtained similarly as above. Finally, it should be noted that the computational complexity of the algorithm under consideration is O(m²), so using a larger population size leads to a substantial increase in calculation time. It can, however, bring significant benefits in terms of the algorithm's performance, especially when some deterministic technique of initial swarm displacement is employed. In this paper a simple random dislocation of fireflies in S is assumed instead of such a strategy.
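The boundary-aware random step of Eq. (7) can be sketched as below (an illustrative helper, name ours). A practical side effect of this formulation is that for α ∈ [0, 1] the step never carries a firefly outside S, unlike the fixed step of Eq. (6):

```python
import random

def random_step(x, bounds, alpha):
    """Random step as a fraction of the distance to the boundaries (Eq. 7)."""
    u = []
    for x_k, (a_k, b_k) in zip(x, bounds):
        r1, r2 = random.random(), random.random()
        if r1 < 0.5:                       # sgn(rand1 - 0.5) < 0: towards b_k
            u.append(alpha * r2 * (b_k - x_k))
        else:                              # otherwise: towards a_k
            u.append(-alpha * r2 * (x_k - a_k))
    return u
```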

3 Numerical Experiments and Parameter Studies

The performance of the presented technique was verified experimentally using its MATLAB implementation and a set of 14 benchmark problems (for a detailed list please refer to the Appendix). All tests were conducted for a fixed number of algorithm iterations l and repeated in 100 independent trials with different random number generator seeds. As the problems are characterized by different scales of the cost function, it was more convenient to use a ranking of the different algorithm variants instead of a direct analysis of the quality indices |fmin − f(ximin)|. It means that each problem was considered separately, with the tested configurations being ranked by their performance. The final comparison was then carried out using the medians of the obtained ranks. Due to space limitations only the most representative results are presented in the paper. The full set of simulation data can be found on the first author's web site (http://www.ibspan.waw.pl/~slukasik).
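The rank-based aggregation described above can be sketched as follows (an illustrative helper, assuming for simplicity that error values within a problem are distinct, so no tie handling is needed):

```python
from statistics import median

def rank_variants(errors_per_problem):
    """Rank algorithm variants per problem by |f_min - f(x_imin)| and
    summarize each variant by the median of its ranks over all problems.

    errors_per_problem -- dict mapping problem name to a list of error
                          values, one per variant (same order everywhere)
    """
    n_variants = len(next(iter(errors_per_problem.values())))
    ranks = {v: [] for v in range(n_variants)}
    for errors in errors_per_problem.values():
        order = sorted(range(n_variants), key=lambda v: errors[v])
        for rank, v in enumerate(order, start=1):
            ranks[v].append(rank)
    return {v: median(r) for v, r in ranks.items()}
```

For two problems and three variants, e.g. errors {p1: [0.1, 0.2, 0.05], p2: [0.3, 0.1, 0.2]}, the medians of ranks come out as {0: 2.5, 1: 2.0, 2: 1.5}, so variant 2 wins.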

3.1 Population Size

Firstly, the influence of swarm size on the algorithm's efficiency was analyzed. For this purpose a fixed number of cost function evaluations (2000) was assumed, and FA variants with m = {6, 8, 10, ..., 200} and suitably decreasing numbers of iterations l were compared. To make the comparison more representative, variants for which rounding of the product m·l introduced a significant error, i.e. |⌊l⌋·m − l·m| > 20, were rejected. All tests were conducted for β0 = 1, α = 0.01 and fixed γ = 1. Although general remarks on the optimal number of fireflies cannot be made, two tendencies can be observed. For difficult optimization tasks, such as instances no 2, 3, 4, 8, 9, 13, 14, it is always a better option to use the maximum number of fireflies. This observation is based on the medians of ranks for those problems depicted in Fig. 1. Still, there exists a set of relatively easy optimization problems like 1, 5, 6, 7, 10, 11, 12, 13 for which an optimal number of fireflies can be found, e.g. the problem of Sphere function minimization is solved most effectively by a set of 28 fireflies, as shown in Fig. 2. The aforementioned remarks should be accompanied by an additional look at the associated calculation time which, as noted before, increases significantly with the size of the swarm. Taking this into account, it is advisable to use a reasonable population of 40-50 fireflies and depart from this rule only for more complicated optimization tasks. It is worth noting that similar remarks have been made in [9] with reference to the Particle Swarm Optimization technique.

Fig. 1. Median of performance ranks for varying population size (problems no 2, 3, 4, 8, 9, 13, 14)

Fig. 2. Median of performance ranks for varying population size (problem: 12)

3.2 Maximum of Attractiveness Function

In the second series of computational experiments the influence of the β0 value on the algorithm's performance was studied. Test runs were conducted for β0 = {0, 0.1, 0.2, ..., 1.0} with the other parameters fixed, i.e. m = 40, l = 250, α = 0.01 and γ = 1. Again, each of the tested configurations was ranked by its performance on the considered problems and the median of such ranks is reported (see Fig. 3). It is observed that the best option is to use the maximum attractiveness value β0 = 1, which implies the strongest dependence of the fireflies' positions on the locations of their brighter neighbors.

3.3 Absorption Coefficient and Random Step Size

Finally, changes in the algorithm's performance with varying absorption coefficient γ and random step size α were investigated. The maximum attractiveness β0 = 1 was used, with population size m = 40 and iteration number l = 250.

Fig. 3. Median of performance ranks with varying maximum of attractiveness function β

Firefly Algorithm variants with α = {0.001, 0.01, 0.1} and γ = {0.1, 1.0, 10.0} were tested. Additionally, the two problem-related techniques of obtaining the absorption coefficient (Eq. (3) and (4)) were considered (with γ0 = {0.1, 0.2, ..., 1.0}), so the overall number of examined configurations reached 75. The obtained results indicate that for the examined optimization problems the variants of the algorithm with α = 0.01 are the best in terms of performance. Furthermore, it is advisable to use the adaptive absorption coefficient according to (3) with γ0 = 0.8, as this configuration achieved the best results in the course of the executed test runs. Although the proposed technique of γ adaptation often performs worse in individual cases than fixed γ values, it has the advantage of being automatic and “tailored” to the considered problem.

4 Comparison with Particle Swarm Optimization

Particle Swarm Optimization is a swarm-based technique introduced by Kennedy and Eberhart [11]. It has been intensively developed recently, with research studies resulting in numerous interesting theoretical and practical contributions [9]. The Particle Swarm Optimizer was studied in the continuous optimization context in [12], with suggested variants and recommended parameter values given explicitly. The experiments reported here involved a performance comparison of the Firefly Algorithm with such an advanced PSO algorithm, defined with a constriction factor and the best parameter set suggested in the conclusion of [12]. Both algorithms were executed with the same population size m = 40 and iteration number l = 250, and the test was repeated 100 times for its results to be representative. The obtained results are presented in Tab. 1. It contains the algorithms' performance indices given as the average difference between the result obtained by each technique, f(ximin), and the actual minimum of the cost function, f(x*), with standard deviations given for reference. For the Firefly Algorithm, results obtained by the best configuration

selected in Section 3.3 are presented, as well as the best result obtained for each test problem by one of the 75 algorithm variants considered in the same Section.

Table 1. Performance comparison of the Firefly Algorithm and the Particle Swarm Optimization technique: avg.|f(x*) − f(ximin)| ± std.dev.|f(x*) − f(ximin)|

Problem  PSO                    FA (γ0 = 0.8)          FA (best)
1        5.75E-21 ± 5.10E-02    2.35E-04 ± 3.76E-04    1.01E-04 ± 1.78E-04
2        4.98E+02 ± 1.59E+02    6.59E+02 ± 2.25E+02    4.14E+02 ± 2.98E+02
3        0.00E+00 ± 0.00E+00    1.42E-01 ± 3.48E-01    3.05E-02 ± 1.71E-01
4        3.04E+01 ± 1.09E+01    1.63E+01 ± 5.78E+00    2.78E+00 ± 5.39E-01
5        4.63E-02 ± 2.59E-02    1.56E-01 ± 4.56E-02    1.48E-01 ± 4.17E-02
6        2.31E-01 ± 6.17E-01    1.10E+00 ± 6.97E-01    5.66E-01 ± 3.36E-01
7        1.16E-17 ± 6.71E-17    7.90E-05 ± 6.82E-05    4.14E-05 ± 2.17E-05
8        2.15E-06 ± 6.51E-15    2.80E-02 ± 7.78E-02    2.21E-03 ± 1.11E-03
9        5.84E-02 ± 5.99E-02    2.18E-01 ± 1.67E-01    2.18E-02 ± 2.33E-02
10       2.86E+00 ± 3.52E+00    3.09E+00 ± 3.72E+00    1.69E-01 ± 1.08E+00
11       4.00E-18 ± 9.15E-18    1.63E-06 ± 1.60E-06    1.44E-06 ± 1.47E-06
12       7.31E-22 ± 1.61E-21    1.59E-06 ± 1.73E-06    6.17E-07 ± 6.23E-07
13       4.61E+00 ± 1.09E-01    6.46E-03 ± 6.46E-02    2.75E-06 ± 5.24E-06
14       3.63E-03 ± 6.76E-03    2.59E-02 ± 3.96E-02    1.00E-02 ± 1.28E-02

It is noticeable that the Firefly Algorithm is frequently outperformed by the Particle Swarm Optimizer (PSO was found to perform better for 11 of the 14 benchmark instances used). It is also found to be less stable in terms of standard deviation. It is important to observe, though, that the advantage of PSO diminishes significantly (to 8 instances for which PSO performed better) when the comparison is made against the best configuration of the firefly-inspired heuristic. Consequently, such a comparison should be repeated after further steps in the development of the algorithm described here are made. Some possible improvements contributing to the Firefly Algorithm's performance will be presented in the final Section of the paper.

5 Conclusion

The Firefly Algorithm described here can be considered an unconventional swarm-based heuristic algorithm for constrained optimization tasks. The algorithm constitutes a population-based iterative procedure with numerous agents (perceived as fireflies) concurrently solving a considered optimization problem. Agents communicate with each other via bioluminescent glowing, which enables them to explore the cost function space more effectively than a standard distributed random search. Most heuristic algorithms face the problem of inconclusive parameter settings. As shown in the paper, coherent suggestions concerning population size


and the maximum of the attractiveness function can be derived for the Firefly Algorithm. Still, the algorithm could benefit from additional research on the adaptive establishment of the absorption coefficient and random step size. Furthermore, some additional features like a decreasing random step size and a more sophisticated procedure of initial solution generation could bring further improvements in the algorithm's performance. The algorithm could also be hybridized with other local-search-based heuristic techniques like Adaptive Simulated Annealing [13]. The firefly communication scheme should then be exploited at the higher level of the optimization procedure.

Acknowledgments. The authors would like to express their gratitude to Dr Xin-She Yang of Cambridge University for his suggestions and comments on the initial manuscript.

Appendix: Benchmark Instances

No.  Name              n   S               f(x*)    Remarks
1    Himmelblau [14]   2   (-6,6)          0        four identical local minima
2    Schwefel [15]     10  (-500,500)      0        several local minima
3    Easom [16]        2   (-100,100)      -1       a singleton minimum in a horizontal valley
4    Rastrigin [17]    20  (-5.12,5.12)    0        highly multimodal and difficult
5    Griewank [18]     5   (-600,600)      0        several local minima
6    Rosenbrock [19]   4   (-2.048,2.048)  0        long curved, only slightly decreasing valley; unimodal
7    Permutation [20]  2   (-2,2)          0        function parameter β = 10
8    Hartman3 [21]     3   (0,1)           -3.862   4 local minima
9    Hartman6 [21]     6   (0,1)           -3.322   6 local minima
10   Shekel [22]       4   (0,10)          -10.536  10 local minima
11   Levy 10 [23]      5   (-10,10)        0        10^5 local minima
12   Sphere [24]       3   (-5.12,5.12)    0        unimodal
13   Michalewicz [25]  5   (0,π)           -4.687   n! local minima
14   Powersum [26]     4   (0,2)           0        singular minimum among very flat valleys

References

1. Encyclopædia Britannica: Firefly. In: Encyclopædia Britannica. Ultimate Reference Suite. Chicago: Encyclopædia Britannica (2009)
2. Babu, B.G., Kannan, M.: Lightning bugs. Resonance 7(9) (2002) 49–55
3. Fraga, H.: Firefly luminescence: A historical perspective and recent developments. Journal of Photochemical & Photobiological Sciences 7 (2008) 146–158
4. Lewis, S., Cratsley, C.: Flash signal evolution, mate choice, and predation in fireflies. Annual Review of Entomology 53 (2008) 293–321


5. Leidenfrost, R., Elmenreich, W.: Establishing wireless time-triggered communication using a firefly clock synchronization approach. In: Proceedings of the 2008 International Workshop on Intelligent Solutions in Embedded Systems. (2008) 1–18
6. Jumadinova, J., Dasgupta, P.: Firefly-inspired synchronization for improved dynamic pricing in online markets. In: Proceedings of the 2008 Second IEEE International Conference on Self-Adaptive and Self-Organizing Systems. (2008) 403–412
7. Krishnanand, K., Ghose, D.: Glowworm swarm based optimization algorithm for multimodal functions with collective robotics applications. Multiagent and Grid Systems 2(3) (2006) 209–222
8. Yang, X.S.: Nature-Inspired Metaheuristic Algorithms. Luniver Press (2008)
9. Eberhart, R.C., Shi, Y.: Computational Intelligence: Concepts to Implementations. Morgan Kaufmann (2007)
10. Karaboga, D., Basturk, B.: A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm. Journal of Global Optimization 39(3) (2007) 459–471
11. Kennedy, J., Eberhart, R.: Particle swarm optimization. In: Neural Networks, 1995. Proceedings., IEEE International Conference on. (1995) 1942–1948 vol. 4
12. Schutte, J.F., Groenwold, A.A.: A study of global optimization using particle swarms. Journal of Global Optimization 31(1) (2005) 93–108
13. Ingber, L.: Adaptive simulated annealing (ASA): lessons learned. Control & Cybernetics 25(1) (1996) 33–55
14. Himmelblau, D.M.: Applied Nonlinear Programming. McGraw-Hill (1972)
15. Schwefel, H.P.: Numerical Optimization of Computer Models. John Wiley & Sons, Inc. (1981)
16. Easom, E.: A survey of global optimization techniques. Master's thesis, University of Louisville (1990)
17. Mühlenbein, H., Schomisch, D., Born, J.: The Parallel Genetic Algorithm as Function Optimizer. Parallel Computing 17(6-7) (1991) 619–632
18. Griewank, A.: Generalized descent for global optimization. Journal of Optimization Theory and Applications 34 (1981) 11–39
19. Rosenbrock, H.H.: State-Space and Multivariable Theory. Thomas Nelson & Sons Ltd (1970)
20. Neumaier, A.: Permutation function. http://www.mat.univie.ac.at/~neum/glopt/my_problems.html
21. Törn, A., Žilinskas, A.: Global Optimization. Springer (1989)
22. Shekel, J.: Test functions for multimodal search techniques. In: Proceedings of the 5th Princeton Conference on Information Science and Systems. (1971) 354–359
23. Jansson, C., Knüppel, O.: Numerical results for a self-validating global optimization method. Technical Report 94.1, Technical University of Hamburg-Harburg (1994)
24. Bilchev, G., Parmee, I.: Inductive search. In: Proceedings of IEEE International Conference on Evolutionary Computation. (1996) 832–836
25. Michalewicz, Z.: Genetic Algorithms + Data Structures = Evolution Programs. Springer (1998)
26. Neumaier, A.: Powersum function. http://www.mat.univie.ac.at/~neum/glopt/my_problems.html
